Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.


I like the legend


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Might as well have called it "The Battlefield 3 Patch" from what I can tell. Minor-to-marginal improvements in some other games, big big big jump in BF3 performance. Which I guess is significant, since they were advertising how well it ought to do with the engine BF3 is based on - not a good look to have nVidia eating their lunch on it. But "Never Settle" is some pretty silly marketing terminology, and they've already got most of what they were going to get out of the drivers' overall performance - that's pretty clear from the aggregate benches. Battlefield 3 sees a big improvement, the rest are a few FPS up (or at least a few minimum FPS up) from 12.7, and that's it.

Edit: Badass games bundle, though, nice job on the value-add side of things.

I still don't think this is going to mean gamers trend meaningfully toward the 7970, even though it's probably the best value card right now. AMD has a trust issue with their products. This is a nice outreach to the gaming community, phrased in terms that people who don't read sites like this will get, but nVidia starts ahead in opinion by a significant enough margin that it's difficult to overcome just by having equal or often better performance. Performance and pack-ins are difficult to market as effectively as they should be when "TEAM GREEN, YEEEEAH" is a selling point of its own, y'know?

Plus PhysX is actually kind of a thing right now thanks to Borderlands showing that if you keep it calculation-simple and well-optimized, you can have some pretty nice GPU PhysX and still render without needing a coprocessor. If you want to dick around in the config and force higher PhysX settings than the game would normally allow, it's not such a good option, but for the first time it's more than just a feature people have to leave off if they want very smooth gameplay.

AMD/ATI's technological wins are less substantial and more edge-case. Eyefinity is on the extreme outside edge of how people are likely to actually set up their systems, so while they take home the crown there, it's not worth a ton in the general market.

Well, regardless, seeing improvements in the 5% to 10% range across many games and seeing Battlefield 3 fully back in play for each company's top end products isn't a bad thing. I just don't think it's a good enough thing to change hearts and minds at this point, and AMD can't really wait around on that.

Agreed fucked around with this message at 15:09 on Oct 22, 2012

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Driver seems cool, but now I'm really hoping I magically get a 7850 back for my RMA'd 6950. Or just say screw everything and buy a 7950 and sell the old one.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

Agreed posted:

Might as well have called it "The Battlefield 3 Patch" from what I can tell. Minor-to-marginal improvements in some other games, big big big jump in BF3 performance. Which I guess is significant, since they were advertising how well it ought to do with the engine BF3 is based on - not a good look to have nVidia eating their lunch on it. But "Never Settle" is some pretty silly marketing terminology, and they've already got most of what they were going to get out of the drivers' overall performance - that's pretty clear from the aggregate benches. Battlefield 3 sees a big improvement, the rest are a few FPS up (or at least a few minimum FPS up) from 12.7, and that's it.

Edit: Badass games bundle, though, nice job on the value-add side of things.

I still don't think this is going to mean gamers trend meaningfully toward the 7970, even though it's probably the best value card right now. AMD has a trust issue with their products. This is a nice outreach to the gaming community, phrased in terms that people who don't read sites like this will get, but nVidia starts ahead in opinion by a significant enough margin that it's difficult to overcome just by having equal or often better performance. Performance and pack-ins are difficult to market as effectively as they should be when "TEAM GREEN, YEEEEAH" is a selling point of its own, y'know?

Plus PhysX is actually kind of a thing right now thanks to Borderlands showing that if you keep it calculation-simple and well-optimized, you can have some pretty nice GPU PhysX and still render without needing a coprocessor. If you want to dick around in the config and force higher PhysX settings than the game would normally allow, it's not such a good option, but for the first time it's more than just a feature people have to leave off if they want very smooth gameplay.

AMD/ATI's technological wins are less substantial and more edge-case. Eyefinity is on the extreme outside edge of how people are likely to actually set up their systems, so while they take home the crown there, it's not worth a ton in the general market.

Well, regardless, seeing improvements in the 5% to 10% range across many games and seeing Battlefield 3 fully back in play for each company's top end products isn't a bad thing. I just don't think it's a good enough thing to change hearts and minds at this point, and AMD can't really wait around on that.

Much of it has to do with the marketing prowess of NV, but the "enthusiast" market being filled with clueless fools also plays a major role. There's one guy I knew who was even considering going from a GTS 450 to a 550 Ti when both were terrible buys IIRC since like forever, and another who flat-out refuses AMD GPUs because "their drivers suck," as if it's still the year 2001. I can imagine the same poo poo Intel is going to face when people still want a discrete laptop GPU even when it's barely faster than the Haswell iGPU.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Palladium posted:

Much of it has to do with the marketing prowess of NV, but the "enthusiast" market being filled with clueless fools also plays a major role. There's one guy I knew who was even considering going from a GTS 450 to a 550 Ti when both were terrible buys IIRC since like forever, and another who flat-out refuses AMD GPUs because "their drivers suck," as if it's still the year 2001. I can imagine the same poo poo Intel is going to face when people still want a discrete laptop GPU even when it's barely faster than the Haswell iGPU.

Quick note, I'll comment further later, but Intel and AMD do share an unfortunate truth in that particular comparison; check the reviews and note that the driver breaks a lighting pass in Skyrim, a major game that a lot of people are playing and buying and will be playing and buying for some time yet, heretofore pretty much fine on AMD Radeon hardware (and requiring a solid 60% of catch-up from nVidia's drivers to be competitive on their end).

I predict that if it becomes noticed as an issue, AMD will be well remembered for "that driver that broke Skyrim" while nobody will have much to say about "those drivers during which nVidia's hardware played Skyrim like rear end." Haven't heard anyone complain about it so far, even though that kind of performance improvement doesn't just come from optimizations, but from finding and fixing some deeper-level incompatibility or error in how the hardware was rendering the game (especially with interior lighting, iirc).

Intel also has a very much less than sterling reputation for IGPU drivers. Pushing boundaries significantly when it comes to hardware, but their software support team seems to have difficulty keeping up.

Anyway, I figure AMD will fix the problem, probably quickly, but if anything catches from it besides the big BF3 performance boost, it'll be that. Nobody likes an update that breaks their game, and you know how people get invested in the Elder Scrolls games; having their tweaked-to-the-nines modded Skyrim install look a bit like rear end until a fix arrives will stick in their minds and be another example of "AMD drivers SUCK," while nVidia is made of teflon when it comes to driver issues. Seems like people don't even really remember the whole 560 Ti and BF3 debacle, where the game had major rendering errors that, to my knowledge, have not yet been entirely fixed.

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online

You seem to be forgetting that it's a beta driver.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Goon Matchmaker posted:

You seem to be forgetting that it's a beta driver.

Not sure that's a meaningful distinction when it's just the expected thing to use beta drivers these days. nVidia has run huge series of drivers that were beta drivers to meet certain game needs in between official releases and caught no flak for it. (And they're the guys who have great drivers, remember?)

cancelope
Sep 23, 2010

The cops want to search the train
I just picked up a Zotac GTX 660 yesterday to replace my ailing 460. It's absolutely tiny compared to the old one! And it takes just one power cable and seems to produce less heat. I'm pretty sure it has brought my processor temps down a little as well. I'm impressed.

Mierdaan
Sep 14, 2004

Pillbug
Apologies if this has been covered before.

We have a structural biologist starting at my work soon who will be doing 2D/3D molecular rendering on a CentOS workstation. From my own personal life I'm really familiar with the GeForce line of GPUs, but it seems that Dell pushes the Quadro line in rendering workstation setups. What's the feeling on the two lines?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Mierdaan posted:

Apologies if this has been covered before.

We have a structural biologist starting at my work soon who will be doing 2D/3D molecular rendering on a CentOS workstation. From my own personal life I'm really familiar with the GeForce line of GPUs, but it seems that Dell pushes the Quadro line in rendering workstation setups. What's the feeling on the two lines?

Quadro and GeForce are based on the same silicon. The difference comes in firmware and drivers.

Quadros are professional cards, outfitted with ECC VRAM (and more of it than GeForces, to support GPGPU calculations). The drivers are tuned for stability and precision rather than speed-at-all-costs, so Quadros are suitable for scientifically accurate rendering. They are also not artificially limited (or less so) on FP64 CUDA/GPGPU operations.

There are more differentiating features, but that's the gist of it.
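
If you want a rough way to see some of this on an actual box, you can query the device properties through the CUDA runtime - minimal sketch below, with a hedge: ECCEnabled only tells you whether ECC is switched on for that particular device, and the FP64 throttling isn't exposed as a simple flag, so treat it as a sanity check rather than a definitive Quadro-vs-GeForce test.

code:
// Build with: nvcc -o devquery devquery.cu
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA devices found.\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        std::printf("Device %d: %s\n", dev, prop.name);
        std::printf("  Total VRAM:         %zu MiB\n", (size_t)(prop.totalGlobalMem >> 20));
        std::printf("  Compute capability: %d.%d\n", prop.major, prop.minor);
        // On GeForce boards this reads 0; professional parts with ECC
        // turned on report 1 (and give up some usable memory for it).
        std::printf("  ECC enabled:        %s\n", prop.ECCEnabled ? "yes" : "no");
    }
    return 0;
}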

Mierdaan
Sep 14, 2004

Pillbug

Factory Factory posted:

Quadro and GeForce are based on the same silicon. The difference comes in firmware and drivers.

Quadros are professional cards, outfitted with ECC VRAM (and more of it than GeForces, to support GPGPU calculations). The drivers are tuned for stability and precision rather than speed-at-all-costs, so Quadros are suitable for scientifically accurate rendering. They are also not artificially limited (or less so) on FP64 CUDA/GPGPU operations.

There are more differentiating features, but that's the gist of it.

Excellent, thanks. I also found this old brief describing the differences but it looks like it's from 2003. Gives me a good idea of how they treat the two different lines, though.

movax
Aug 30, 2008

Mierdaan posted:

Excellent, thanks. I also found this old brief describing the differences but it looks like it's from 2003. Gives me a good idea of how they treat the two different lines, though.

Basically, when you pay for the Quadro, you get the ISV certification and ECC memory, and you know that software vendors have qualified their software against your particular card. Should cut down on compatibility issues as well.

It's somewhat an example of executing the whole "we can sell the [essentially] same product to one group of people for $X and to another group for $Y, so let's do both!" play.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

Quadros are professional cards, outfitted with ECC VRAM (and more of it than GeForces, to support GPGPU calculations).
Pedantic note: My understanding is that ECC is not supported on current-gen Quadros because hardware support was removed in the Kepler GPU. Additionally, I don't think the older Quadros used ECC RAM, per se. Rather, they did ECC calculations on the GPU, meaning available memory and memory bandwidth were reduced to make room for the ECC data. So, for example, if your card had 3GB of RAM and you enabled ECC, you'd then have 2457MB remaining. Another complicating factor is that GDDR5 also supports link ECC, which is implemented on nearly all cards (I seem to recall the first GeForces to support it didn't make use of it, but everything else does). This is why, when overclocking graphics memory, you'll see performance start to go back down right before you encounter errors if you overclock too high.
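
(Back-of-the-envelope on that example, with the caveat that the reservation fraction is just inferred from those numbers rather than anything official: 3072 MB x (1 - 0.20) = 2457.6 MB, i.e. roughly a fifth of the framebuffer appears to go to the soft-ECC data.)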

norg
Jul 5, 2006
Agreed, do you have more info on this Skyrim issue with 12.11? I can't say I noticed anything amiss after installing the driver (unlike 12.9, which gave me loads of flickery glitching).

I saw something on Anandtech that very briefly mentioned they were missing a lighting pass on their 7970 but not their 7870, but that was about it. How does the problem manifest itself? I'm using an ENB for the game which obviously totally changes the lighting anyway, so I dunno if that affects the issue?

Dominoes
Sep 20, 2007

Dominoes posted:

Hey, I use Eyefinity/CrossFire with 3 portrait monitors. I get pretty bad screen tearing because vsync doesn't work. As far as I can tell, there is no way to get Eyefinity to work with vsync. Searching shows a few cases of this with no solution, although I haven't found solid information, i.e. an article from AMD explaining that there are technical issues preventing vsync from working with Eyefinity.

Have any of you found a solution? Or know that there will or won't be one?

Does Nvidia surround work with vsync?

Running games in a window is a workaround; vsync works there, although crossfire doesn't.
Seems like no one's gotten vsync working with eyefinity. What about Nvidia surround? If you use it, let us know if vsync works.

Also, let me know if HD audio works properly on the current gen of Nvidia cards. I'm probably going to upgrade to Nvidia since I've heard (one reply on the overclocker forum) that Surround does work with vsync, but there's no word on the status/quirks of HDMI audio, or anything else confirming this. Before I drop $700+ on a pair of cards, I'd like to know that they won't gently caress up my setup. The fact that they only have a single DisplayPort connector is worrying. The ATI setup has some quirks that aren't really published (e.g. phantom desktop space with HDMI audio, the vsync issue with Eyefinity, and, until a driver update a few months ago, quirky behavior when turning the audio receiver on/off - flashing screens etc. for a few seconds).

Dominoes fucked around with this message at 00:55 on Nov 2, 2012

norg
Jul 5, 2006

norg posted:

Agreed, do you have more info on this Skyrim issue with 12.11? I can't say I noticed anything amiss after installing the driver (unlike 12.9, which gave me loads of flickery glitching).

I saw something on Anandtech that very briefly mentioned they were missing a lighting pass on their 7970 but not their 7870, but that was about it. How does the problem manifest itself? I'm using an ENB for the game which obviously totally changes the lighting anyway, so I dunno if that affects the issue?

Welp, I was wrong. Actually at night everything is too dark and unless it's bright sunshine it all looks a bit washed out too. Back to 12.8 again then I guess. :unsmith:

Chuu
Sep 11, 2004

Grimey Drawer
Just got a new factory-overclocked GTX 680 (I know I really should have gotten a 670, but I just wanted to splurge and get the best for once) and want to stress it to make sure it's completely stable.

Is FurMark still a video card killer, or, assuming this card is stable, should it be safe to run overnight as long as the temps plateau at a reasonable level? Also, what is a reasonable temperature for a GTX 680?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
FurMark doesn't work well for stress-testing GTX 600-series cards because they spend all of their time at the TDP cap, well below max clock speeds. So far I've had the best luck with looping the Metro 2033 benchmark, but there may be better options. Try to keep the card between 65C and 69C for maximum boost clocks. The first thing you should do is max out the TDP slider and then go from there.

Blame Pyrrhus
May 6, 2003

Me reaping: Well this fucking sucks. What the fuck.
Pillbug
So I'm currently running 2x 570s in SLI pushing a 2560x1600 display. Is there a single card I can replace them with and get the same or better performance?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Linux Nazi posted:

So I'm currently running 2x 570s in SLI pushing a 2560x1600 display. Is there a single card I can replace them with and get the same or better performance?
No, only dual-GPU cards like the Geforce GTX 690 and Radeon HD 7990.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
If the next GPU architectures bring as big a performance jump as this generation's did, you can probably do an equal-performance swap to a single card around March of 2014, the current rumored release date for Nvidia's Maxwell. AMD is hush-hush about their follow-up to Sea Islands, the 2013 optimization pass of Southern Islands (Radeon HD 7000), so no clue when a Team Red equivalent would show up other than "probably around the same time, maybe???"

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Factory Factory posted:

If the next GPU architectures bring as big a performance jump as this generation's did, you can probably do an equal-performance swap to a single card around March of 2014, the current rumored release date for Nvidia's Maxwell. AMD is hush-hush about their follow-up to Sea Islands, the 2013 optimization pass of Southern Islands (Radeon HD 7000), so no clue when a Team Red equivalent would show up other than "probably around the same time, maybe???"

Is big Kepler going to get a consumer release? That might change the game a bit.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
:iiam: It's only now even hit the HPC market, its target, and yields are apparently pretty bad - only 13/15 SMXs are enabled. Everybody has their own favorite rumor for a "GeForce 685" or "780," and none of it is sourced.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games.

An Unoriginal Name
Jul 11, 2011

My favorite touhou is my beloved Nitori.
:swoon:

Endymion FRS MK1 posted:

Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games.

Check here: http://blogs.amd.com/play/this-holiday-never-settle/

Amazon is not listed among the participating retail partners, so I would assume not.

Chuu
Sep 11, 2004

Grimey Drawer

printf posted:

Is big Kepler going to get a consumer release? That might change the game a bit.

nVidia has stated that the rumors that the 7xx series were going to be based on the GK110 are false. There's nothing solid on exactly what the 7xx series is going to look like.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

An Unoriginal Name posted:

Check here: http://blogs.amd.com/play/this-holiday-never-settle/

Amazon is not listed among the participating retail partners, so I would assume not.

Well that's a bummer, I had $37 in Amazon gift cards to help out with it.

KillHour
Oct 28, 2007


Endymion FRS MK1 posted:

Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games.

Until I clicked that link, I was wondering why you were buying a graphics card from 2006.

http://www.nvidia.com/page/geforce_7950.html

forbidden dialectics
Jul 26, 2005





KillHour posted:

Until I clicked that link, I was wondering why you were buying a graphics card from 2006.

http://www.nvidia.com/page/geforce_7950.html

http://en.wikipedia.org/wiki/Radeon_R100

Or from 2000.

Charles Martel
Mar 7, 2007

"The Hero of the Age..."

The hero of all ages
Wow, what an OP. I didn't see this thread before, and it would be better suited for the question I posted in the parts-picking megathread earlier:

Does anyone know of or can point me in the direction of the minor differences between Radeon and GeForce cards? I have been searching in vain in Google, YouTube, Wikipedia, etc and cannot find a concrete pros and cons list between the two.

I've read accounts where GeForce cards are more geared toward 3D gaming whereas Radeon cards work better for older PC games during the early 3D days. Is this just from people talking out of their asses or is any of this true?

I'm genuinely interested in the minute differences between the two.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Charles Martel posted:

Wow, what an OP. I didn't see this thread before, and it would be better suited for the question I posted in the parts-picking megathread earlier:

Does anyone know of or can point me in the direction of the minor differences between Radeon and GeForce cards? I have been searching in vain in Google, YouTube, Wikipedia, etc and cannot find a concrete pros and cons list between the two.

I've read accounts where GeForce cards are more geared toward 3D gaming whereas Radeon cards work better for older PC games during the early 3D days. Is this just from people talking out of their asses or is any of this true?

I'm genuinely interested in the minute differences between the two.

They have completely different architectures, but those details aren't important for a general overview.

There are differences in performance, of course, varying across the ranges, but this isn't the thread for posting lots of graphs - try AnandTech Bench or some other reputable site, since there's no point repeating all the game performance differences here. If you're looking at the low-to-mid range of card prices, AMD is a good bet right now, and at the top end most would say NVIDIA has an edge - but recently AMD has boosted the clocks of its top-end cards, which helps in some situations.

In a very basic sense, the difference is that NVIDIA has CUDA exclusively, plus hardware-accelerated PhysX, but if you're into compute that runs under OpenCL, the newest Radeon generation is generally faster at compute than the current NVIDIA cards at the same price.
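
To make the CUDA vs. OpenCL point a bit more concrete: OpenCL is the vendor-neutral route, so the same enumeration code sees Radeons and GeForces (and Intel GPUs) alike, whereas CUDA only targets NVIDIA hardware. Rough sketch below, assuming you have some vendor's OpenCL SDK/ICD installed; it only lists GPU devices and says nothing about which is faster at compute.

code:
// Build with something like: g++ list_gpus.cpp -lOpenCL
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char pname[256] = {0};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(pname), pname, nullptr);

        cl_device_id devices[8];
        cl_uint num_devices = 0;
        // A platform may expose no GPUs (e.g. a CPU-only runtime), so skip those.
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char dname[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(dname), dname, nullptr);
            std::printf("%s: %s\n", pname, dname);
        }
    }
    return 0;
}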

HalloKitty fucked around with this message at 13:32 on Nov 9, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The GK110-based Tesla K20 and K20X have been officially released. They are monstrous.

AMD also released a monster: the FirePro S10000, a dual-GPU Tahiti board (akin to a Radeon 7990) for density when power limits allow.

Can't wait to see benchmarks on these things. Between K20, GCN-based FirePro, and Intel's Xeon Phi, there are three strong contenders for the HPC crown. Nvidia has an early lead, what with K20 being used in the current #1 fastest supercomputer, but ultimately the game is still afoot.

coffeetable
Feb 5, 2006

TELL ME AGAIN HOW GREAT BRITAIN WOULD BE IF IT WAS RULED BY THE MERCILESS JACKBOOT OF PRINCE CHARLES

YES I DO TALK TO PLANTS ACTUALLY
This seems as good a place as any to ask: I find the design and manufacture of modern microprocessors fascinating. What books should I look for if I want to learn more? For reference, I've a background in mathematics and computer science, so technical content isn't going to put me off.

coffeetable fucked around with this message at 01:03 on Nov 14, 2012

canyoneer
Sep 13, 2005


I only have canyoneyes for you

coffeetable posted:

This seems as good a place as any to ask: I find the design and manufacture of modern microprocessors fascinating. What books should I look for if I want to learn more? For reference, I've a background in mathematics and computer science, so technical content isn't going to put me off.

I don't know of a good book, but do a google search for (in my opinion) the more interesting parts.
"immersion lithography","multiple patterning", "high k metal gate", and "atomic layer deposition" should get you started. There are a score of fantastic semiconductor manufacturing videos on youtube as well that I've used to help explain the process to people.

Due to IP and trade secrets, almost all information you'll find is going to be either light on details, 5 years obsolete or both.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

coffeetable posted:

This seems as good a place as any to ask: I find the design and manufacture of modern microprocessors fascinating. What books should I look for if I want to learn more? For reference, I've a background in mathematics and computer science, so technical content isn't going to put me off.

Intel occasionally puts out "how this poo poo done get made" documents [PDF, YouTube without text] that show up either in PDF form or repackaged by news websites. AMD apparently made a video in 2009, too, but I'm too lazy to find a new version.

That doesn't cover design, though. I'm not really qualified to tell you about that, as my understanding is cobbled together from pre-filtered sources, like Wikipedia and newsmedia articles. You can pick up a hell of a lot that way, but it relies on other people having done the really hard work for you. The more you understand, the deeper and more abstractly you can appreciate the differences in architectures.

If I were to just splat out some good articles, I'd basically be regurgitating the AnandTech CPU category. So let's do that.

Intel Historical
Other stuff:

Googling "fundamentals of CPU design" and hitting "I feel lucky" gave me a PDF textbook from a technical school's CS Computer Architecture course. That would be the bottom-up method; the more you understand, the closer you get to the feel for performance that benchmarks and practical experience give.

One detail that sits between a high-level understanding and a theoretical one, and that it might help to know now, is that the CISC vs. RISC debate from the mid-90s has disappeared. CISC x86 CPUs now use RISC-like, pipelined execution cores, and the x86 instruction set is translated by an instruction decoder that splits a CISC operation into multiple RISC micro-ops.
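
If a tiny illustration helps (the specific instruction and micro-op split here are just typical compiler/decoder behavior I'm assuming, not pulled from any particular optimization manual):

code:
// One C++ statement that a compiler will usually turn into a single
// x86 read-modify-write instruction, e.g.:
//   add DWORD PTR [rdi], esi
// The CPU front-end then cracks that one "CISC" instruction into
// RISC-like micro-ops, roughly:
//   load   tmp   <- [rdi]
//   add    tmp   <- tmp + esi
//   store  [rdi] <- tmp
// which is what actually flows down the pipelined execution core.
void bump(int* counter, int delta) {
    *counter += delta;
}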

jink
May 8, 2002

Drop it like it's Hot.
Taco Defender

Factory Factory posted:

Intel occasionally puts out "how this poo poo done get made" documents [PDF, YouTube without text] that show up either in PDF form or repackaged by news websites. AMD apparently made a video in 2009, too, but I'm too lazy to find a new version.

That doesn't cover design, though. I'm not really qualified to tell you about that, as my understanding is cobbled together from pre-filtered sources, like Wikipedia and newsmedia articles. You can pick up a hell of a lot that way, but it relies on other people having done the really hard work for you. The more you understand, the deeper and more abstractly you can appreciate the differences in architectures.

If I were to just splat out some good articles, I'd basically be regurgitating the AnandTech CPU category. So let's do that.

Hilarious. I thought suggesting Anandtech articles would be a bad idea for some reason and deleted my post.

I've learned so much from that site about details I never thought mattered in CPU design. I haven't seen another site that dives into processor architecture like Anandtech does. Great stuff.

Richard M Nixon
Apr 26, 2009

"The greatest honor history can bestow is the title of peacemaker."
Edit: I'm a stupid gently caress who can't use Google.

Richard M Nixon fucked around with this message at 22:12 on Nov 14, 2012

lkz
May 1, 2009
Soiled Meat

jink posted:

Hilarious. I thought suggesting Anandtech articles would be a bad idea for some reason and deleted my post.

I've learned so much from that site about details I never thought mattered in CPU design. I haven't seen another site that dives into processor architecture like Anandtech does. Great stuff.

Real World Tech is also a pretty good site for some of those nitty-gritty details on CPU/GPU architectures. This is getting a little off topic but here's a pretty recent article from them on Haswell architecture.

jink
May 8, 2002

Drop it like it's Hot.
Taco Defender

lkz posted:

Real World Tech is also a pretty good site for some of those nitty-gritty details on CPU/GPU architectures. This is getting a little off topic but here's a pretty recent article from them on Haswell architecture.

They sure do get nitty-gritty. I am pretending to understand half of what they are talking about in this article.


In GPU news, a new BETA driver 310.54 came out from nVidia. Improvements across the board:


http://www.geforce.com/whats-new/articles/nvidia-geforce-310-54-beta-drivers-released


chippy
Aug 16, 2006

OK I DON'T GET IT
I didn't realise Sniper Elite v2 was such a demanding game. Is it really nice looking?
