|
wipeout posted:Upcoming 12.11 drivers have been previewed by a few sites, and look like they'll be pretty decent performance increases - if you're on a 7 series card. I like the legend
|
# ? Oct 22, 2012 11:28 |
|
|
Might as well have called it "The Battlefield 3 Patch" from what I can tell. Minor-to-marginal improvements in some other games, big big big jump in BF3 performance. Which I guess is significant, since nVidia was advertising how well their cards ought to do with the engine BF3 is based on - not a good look to have their lunch eaten on it. But "Never Settle" is some pretty silly marketing terminology, and it's pretty clear from the aggregate benches that they've already got most of what they were going to get out of the drivers' overall performance. Battlefield 3 sees a big improvement, the rest are a few FPS up (or at least a few minimum FPS up) from 12.7, and that's it.

Edit: Badass games bundle, though - nice job on the value-add side of things.

I still don't think this is going to mean gamers trend meaningfully toward the 7970, even though it's probably the best value card right now. AMD has a trust issue with their products. This is a nice outreach to the gaming community, phrased in terms that people who don't read sites like this will get, but nVidia starts ahead in opinion by a significant enough margin that it's difficult to overcome just by having equal-to or often better-than performance. Performance and pack-ins are difficult to market as well as they should be when "TEAM GREEN, YEEEEAH" is a selling point of its own, y'know?

Plus PhysX is actually kind of a thing right now, thanks to Borderlands showing that if you keep it calculation-simple and well-optimized, you can have some pretty nice GPU PhysX and still render without needing a coprocessor. If you want to dick around in the config and force higher PhysX than the game would normally allow, it's not so good an option, but for the first time it's more than just something people can't turn on if they want very smooth gameplay. AMD/ATI's technological wins, meanwhile, are less substantial and more edge-case: Eyefinity is on the extreme outside edge of how people are likely to actually set up their systems, so while they take home the crown there, it's not worth a ton in the general market.

Well, regardless, seeing improvements in the 5% to 10% range across many games and seeing Battlefield 3 fully back in play for each company's top-end products isn't a bad thing. I just don't think it's a good enough thing to change hearts and minds at this point, and AMD can't really wait around on that.

Agreed fucked around with this message at 15:09 on Oct 22, 2012 |
# ? Oct 22, 2012 14:58 |
|
Driver seems cool, but now I'm really hoping I magically get a 7850 back for my RMA'd 6950. Or just say screw everything and buy a 7950 and sell the old one.
|
# ? Oct 22, 2012 21:03 |
|
Agreed posted:Might as well have called it "The Battlefield 3 Patch" from what I can tell. Minor-to-marginal improvements in some other games, big big big jump in BF3 performance. Which I guess is significant since they were advertising how well it ought to do with the engine BF3 is based on, not a good look for nVidia to eat their lunch on it. But "Never Settle" is some pretty silly marketing terminology and they've already got most of what they were going to get out of the drivers' overall performance, that's pretty clear from the aggregate benches. Battlefield 3 sees a big improvement, the rest are a few FPS up (or at least a few minimum FPS up) from 12.7, and that's it. Much of it has to do with the marketing prowess of NV, but the "enthusiast" market filled with clueless fools also plays a major role. There's one guy I knew who was considering going from a GTS 450 to a 550 Ti when both were terrible buys since, like, forever, and another who flat out refuses AMD GPUs because "their drivers suck," as if it were still 2001. I can imagine the same poo poo Intel is going to face when people still want a discrete laptop GPU even when it's barely faster than the Haswell iGPU.
|
# ? Oct 23, 2012 11:19 |
|
Palladium posted:Much have to do with the marketing prowess of NV but the "enthusiast" market filled with clueless fools also play a major role. There is one guy I knew who was even considering GTS450 to a 550 Ti when both were terrible buys IIRC since like forever and the other flat out refuses AMD GPUs because their drivers suck as if it is still year 2001. I can imagine the same poo poo Intel is going to face when people will still want a discrete laptop GPU when it is barely faster than the Haswell iGPU. Quick note, I'll comment further later, but Intel and AMD do share an unfortunate truth in that particular comparison; check the reviews and note that the driver breaks a lighting pass in Skyrim, a major game that a lot of people are playing and buying and will be playing and buying for some time yet, heretofore pretty much fine on AMD Radeon hardware (and requiring a solid 60% of catch-up from nVidia's drivers to be competitive on their end).

I predict that if it gets noticed as an issue, AMD will be well remembered for "that driver that broke Skyrim," while nobody will have much to say about "those drivers during which nVidia's hardware played Skyrim like rear end." Haven't heard anyone complain about it so far, even though that kind of performance improvement doesn't just come from optimizations, but from finding and fixing some deeper-level incompatibility or error in how the hardware was rendering the game (especially with interior lighting, iirc).

Intel also has a very much less-than-sterling reputation for iGPU drivers. They're pushing boundaries significantly when it comes to hardware, but their software support team seems to have difficulty keeping up.

Anyway, I figure AMD will fix the problem, probably quickly, but if anything catches from it besides the big BF3 performance boost, it'll be that. Nobody likes an update that breaks their game, and you know how people get invested in the Elder Scrolls games; having their tweaked-to-the-nines modded Skyrim install look a bit like rear end until the fix will stick in their minds and be another example of "AMD drivers SUCK," while nVidia is made of teflon when it comes to driver issues. Seems like people don't even really remember the whole 560 Ti and BF3 debacle, where the game had major rendering errors that, to my knowledge, have not yet been entirely fixed.
|
# ? Oct 23, 2012 11:44 |
|
You seem to be forgetting that it's a beta driver.
|
# ? Oct 23, 2012 13:37 |
|
Goon Matchmaker posted:You seem to be forgetting that it's a beta driver. Not sure that's a meaningful distinction when using beta drivers is just the expected thing these days. nVidia has run long series of beta drivers to meet particular games' needs in between official releases and caught no flak for it. (And they're the guys who have great drivers, remember?)
|
# ? Oct 23, 2012 14:19 |
|
I just picked up a Zotac GTX 660 yesterday to replace my ailing 460. It's absolutely tiny compared to the old one! And it takes just one power cable and seems to produce less heat. I'm pretty sure it has brought my processor temps down a little as well. I'm impressed.
|
# ? Oct 23, 2012 17:04 |
|
Apologies if this has been covered before. We have a structural biologist starting at my work soon who will be doing 2/3d molecular rendering on a CentOS workstation. From my own personal life I'm real familiar with the GeForce line of GPUs but it seems that Dell pushes the Quadro line in rendering workstation setups. What's the feeling on the two lines?
|
# ? Oct 24, 2012 15:52 |
|
Mierdaan posted:Apologies if this has been covered before. Quadro and GeForce are based on the same silicon. The difference comes in firmware and drivers. Quadros are professional cards, outfitted with ECC VRAM (and more of it than GeForces, to support GPGPU calculations). The drivers are tuned for stability and precision rather than speed-at-all-costs, so Quadros are suitable for scientifically accurate rendering. They are also not artificially limited (or less so) on FP64 CUDA/GPGPU operations. There are more differentiating features, but that's the gist of it.
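Factory Factory's FP64 point can be put in rough numbers. A minimal back-of-the-envelope sketch, assuming approximate Kepler-era figures (GTX 680 running FP64 at 1/24 its FP32 rate, Tesla K20 at 1/3) - the core counts, clocks, and ratios below are illustrative approximations, not vendor-guaranteed specs:

```python
# Rough peak-throughput sketch for the FP64 segmentation described above.
# Figures are approximate Kepler-era specs, used only for illustration.

def peak_gflops(cores, clock_mhz, fp64_ratio):
    """Theoretical peak (FP32, FP64) GFLOPS: cores * clock * 2 * rate."""
    fp32 = cores * clock_mhz * 2 / 1000.0  # a fused multiply-add counts as 2 flops
    return fp32, fp32 * fp64_ratio

# GTX 680 (GK104): 1536 cores @ ~1006 MHz, FP64 at 1/24 the FP32 rate
gtx680 = peak_gflops(1536, 1006, 1 / 24)
# Tesla K20 (GK110): 2496 cores @ ~706 MHz, FP64 at 1/3 the FP32 rate
k20 = peak_gflops(2496, 706, 1 / 3)

print(f"GTX 680: {gtx680[0]:.0f} GFLOPS FP32, {gtx680[1]:.0f} GFLOPS FP64")
print(f"K20:     {k20[0]:.0f} GFLOPS FP32, {k20[1]:.0f} GFLOPS FP64")
```

The takeaway: despite similar FP32 throughput, the consumer part lands nearly an order of magnitude behind on double precision, which is exactly the gap the professional cards are priced around.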
|
# ? Oct 24, 2012 16:09 |
|
Factory Factory posted:Quadro and GeForce are based on the same silicon. The difference comes in firmware and drivers. Excellent, thanks. I also found this old brief describing the differences but it looks like it's from 2003. Gives me a good idea of how they treat the two different lines, though.
|
# ? Oct 24, 2012 17:00 |
|
Mierdaan posted:Excellent, thanks. I also found this old brief describing the differences but it looks like it's from 2003. Gives me a good idea of how they treat the two different lines, though. Basically, when you pay for the Quadro, you get the ISV certification and ECC memory, and you know that software vendors qualified against your particular card. Should cut down on compatibility issues as well. It's an example of executing the whole "we can sell the [essentially] same product to w people for $x, and y people for $z, so let's do both!" play.
|
# ? Oct 24, 2012 17:14 |
|
Factory Factory posted:Quadros are professional cards, outfitted with ECC VRAM (and more of it than GeForces, to support GPGPU calculations).
|
# ? Oct 24, 2012 19:27 |
|
Agreed, do you have more info on this Skyrim issue with 12.11? I can't say I noticed anything amiss after installing the driver (unlike 12.9, which gave me loads of flickery glitching). I saw something on Anandtech that very briefly mentioned they were missing a lighting pass on their 7970 but not their 7870, but that was about it. How does the problem manifest itself? I'm using an ENB for the game which obviously totally changes the lighting anyway, so I dunno if that affects the issue?
|
# ? Oct 26, 2012 11:05 |
|
Dominoes posted:Hey, I use eyefinity/crossfire with 3 portrait monitors. I get pretty bad screen tearing because vsync doesn't work. As far as I can tell, there is no way to get Eyefinity to work with vsync. Searching shows a few cases of this with no solution, although I haven't found solid information, i.e. an article from AMD explaining that there are technical issues preventing vsync from working with Eyefinity. Also, let me know if HD audio works properly on the current gen of Nvidia cards. I'm probably going to upgrade to Nvidia, since I've heard (one reply on the overclocker forum) that Surround does work with vsync. No word on the status/quirks of HDMI audio, or anything confirming this. Before I drop $700+ on a pair of cards, I'd like to know that they won't gently caress up my setup. The fact that they only have a single DisplayPort connector is worrying. The ATI setup has some quirks that aren't really published (e.g. phantom desktop space with HDMI audio, the vsync issue with Eyefinity, and until a driver update a few months ago, quirky behavior when turning the audio receiver on/off - flashing screens etc. for a few seconds) Dominoes fucked around with this message at 00:55 on Nov 2, 2012 |
# ? Oct 29, 2012 02:08 |
|
norg posted:Agreed, do you have more info on this Skyrim issue with 12.11? I can't say I noticed anything amiss after installing the driver (unlike 12.9, which gave me loads of flickery glitching). Welp, I was wrong. Actually at night everything is too dark and unless it's bright sunshine it all looks a bit washed out too. Back to 12.8 again then I guess.
|
# ? Oct 29, 2012 10:29 |
|
Just got a new factory overclocked GTX 680 (I know I really should have gotten a 670 but I just wanted to splurge and get the best for once) and want to stress it to make sure it's completely stable. Is furmark still a video card killer, or assuming this card is stable, should it be safe to run overnight assuming the temps plateau at a reasonable temperature? Also, what is a reasonable temperature for the GTX 680?
|
# ? Oct 30, 2012 05:15 |
|
Furmark doesn't work well for stress-testing GTX 600-series cards because they spend all of their time at the TDP cap, well below max clock speeds. So far I've had the best luck with looping the Metro 2033 benchmark, but there may be better options. Try to keep the card between 65C to 69C for maximum boost clocks. First thing you should do is max out the TDP slider and then go from there.
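The 65C-to-69C advice above amounts to checking how much of a stress run stays in the boost-friendly band. A toy sketch of that bookkeeping - the band limits are this thread's rule of thumb rather than an NVIDIA spec, and the sample log is invented:

```python
# Toy helper: given temperature samples (in C) logged during a stress run,
# report what fraction stayed inside the ~65-69 C band where Kepler's GPU
# Boost tends to hold its highest clock bins (per the rule of thumb above).

def boost_band_fraction(temps_c, low=65, high=69):
    """Fraction of samples inside the boost-friendly temperature band."""
    if not temps_c:
        return 0.0
    in_band = sum(1 for t in temps_c if low <= t <= high)
    return in_band / len(temps_c)

samples = [62, 66, 67, 68, 68, 70, 71, 69]  # hypothetical log from a looped benchmark
print(f"{boost_band_fraction(samples):.0%} of samples in the boost sweet spot")
```

If most of a long run sits above the band, that's the cue to raise fan speed or back off the overclock before calling the card stable.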
|
# ? Oct 30, 2012 08:15 |
|
So I'm currently running 2x 570s in SLI pushing a 2560x1600 display. Is there a single card that I can replace them with and get same or better performance?
|
# ? Nov 1, 2012 22:03 |
|
Linux Nazi posted:So I'm currently running 2x 570s in SLI pushing a 2560x1600 display. Is there a single card that I can replace them with and get same or better performance?
|
# ? Nov 1, 2012 23:43 |
|
If the next GPU architectures bring as big a performance jump as this generation's did, you can probably get a single-card, equal-performance swap around March 2014, the current rumored release date for Nvidia's Maxwell. AMD is hush-hush about their follow-up to Sea Islands, the 2013 optimization pass of Southern Islands (Radeon HD 7000), so no clue when a Team Red equivalent would show up other than "Probably around the same time, maybe???"
|
# ? Nov 2, 2012 06:53 |
|
Factory Factory posted:If the next GPU architectures are as big performance changes as this generation's were, you can probably get a single card equal performance swap around March of 2014, the current rumored release date for Nvidia's Maxwell. AMD is hush about their follow-up to Sea Islands, the 2013 optimization pass of Southern Islands (Radeon HD 7000), so no clue when a Team Red equivalent would show up other than "Probably around the same time maybe???" Is big Kepler going to get a consumer release? That might change the game a bit.
|
# ? Nov 3, 2012 01:26 |
|
It's only now even hit the HPC market, its target, and yields are apparently pretty bad - only 13/15 SMXs are enabled. Everybody has their own favorite rumor for a "GeForce 685" or "780," and none of it is sourced.
|
# ? Nov 3, 2012 03:12 |
|
Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games.
|
# ? Nov 4, 2012 05:18 |
|
Endymion FRS MK1 posted:Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games. Check here: http://blogs.amd.com/play/this-holiday-never-settle/ Amazon is not listed among the participating retail partners, so I would assume not.
|
# ? Nov 4, 2012 05:38 |
|
printf posted:Is big Kepler going to get a consumer release? That might change the game a bit. nVidia has stated that the rumors that the 7xx series were going to be based on the GK110 are false. There's nothing solid on exactly what the 7xx series is going to look like.
|
# ? Nov 4, 2012 05:39 |
|
An Unoriginal Name posted:Check here: http://blogs.amd.com/play/this-holiday-never-settle/ Well that's a bummer, I had $37 in Amazon gift cards to help out with it.
|
# ? Nov 4, 2012 05:47 |
|
Endymion FRS MK1 posted:Do all 7950s come with the never settle bundle? Or just ones from Newegg? I want this MSI 7950 TF, but it says nothing about the free games. Until I clicked that link, I was wondering why you were buying a graphics card from 2006. http://www.nvidia.com/page/geforce_7950.html
|
# ? Nov 4, 2012 06:16 |
|
KillHour posted:Until I clicked that link, I was wondering why you were buying a graphics card from 2006. http://en.wikipedia.org/wiki/Radeon_R100 Or from 2000.
|
# ? Nov 4, 2012 07:30 |
|
Wow, what an OP. I didn't see this thread before, and it's better suited for the question I posted in the parts-picking megathread earlier: does anyone know of, or can anyone point me toward, the minor differences between Radeon and GeForce cards? I have been searching in vain on Google, YouTube, Wikipedia, etc. and cannot find a concrete pros-and-cons list between the two. I've read accounts where GeForce cards are more geared toward 3D gaming, whereas Radeon cards work better for older PC games from the early 3D days. Is this just people talking out of their asses, or is any of it true? I'm genuinely interested in the minute differences between the two.
|
# ? Nov 9, 2012 08:36 |
|
Charles Martel posted:Wow, what an OP. I didn't see this thread before, and it would be better suited for the question I posted in the parts-picking megathread earlier: They have completely different architectures, but those details aren't important for a general overview. There are differences in performance, of course, varying across the ranges, but this isn't the thread for posting many graphs - try AnandTech Bench or some other reputable site; no point repeating all the game performance differences here. If you're looking at the low-mid range of card prices, AMD is a good bet right now, and at the top end most would say NVIDIA has an edge - though recently AMD boosted the clocks of their top-end parts, which helps in some situations. In a very basic sense, the difference is that NVIDIA exclusively has CUDA and hardware-accelerated PhysX, but if you're into compute that runs under OpenCL, the newest Radeon generation is in general faster at compute than the current NVIDIA cards at the same price. HalloKitty fucked around with this message at 13:32 on Nov 9, 2012 |
# ? Nov 9, 2012 13:28 |
|
The GK110-based Tesla K20 and K20X have been officially released. They are monstrous. AMD also released a monster: the FirePro S10000, a dual-GPU Tahiti board (akin to a Radeon 7990) for density when power limits allow. Can't wait to see benchmarks on these things. Between K20, GCN-based FirePro, and Intel's Xeon Phi, there are three strong contenders for the HPC crown. Nvidia has an early lead, what with K20 being used in the current #1 fastest supercomputer, but ultimately the game is still afoot.
|
# ? Nov 12, 2012 16:10 |
|
This seems as good a place as any to ask: I find the design and manufacture of modern microprocessors fascinating. What books should I look for if I want to learn more? For reference, I've a background in mathematics and computer science, so technical content isn't going to put me off.
coffeetable fucked around with this message at 01:03 on Nov 14, 2012 |
# ? Nov 14, 2012 00:46 |
|
coffeetable posted:This seems as good a place as any to ask: I find the design and manufacture of modern microprocessors fascinating. What books should I look for if I want to learn more? For reference, I've a background in mathematics and computer science, so technical content isn't going to put me off. I don't know of a good book, but do a google search for (in my opinion) the more interesting parts. "immersion lithography","multiple patterning", "high k metal gate", and "atomic layer deposition" should get you started. There are a score of fantastic semiconductor manufacturing videos on youtube as well that I've used to help explain the process to people. Due to IP and trade secrets, almost all information you'll find is going to be either light on details, 5 years obsolete or both.
|
# ? Nov 14, 2012 04:05 |
|
coffeetable posted:This seems as good a place as any to ask: I find the design and manufacture of modern microprocessors fascinating. What books should I look for if I want to learn more? For reference, I've a background in mathematics and computer science, so technical content isn't going to put me off. Intel occasionally puts out "how this poo poo done get made" documents [PDF, YouTube without text] that show up either in PDF form or repackaged by news websites. AMD apparently made a video in 2009, too, but I'm too lazy to find a new version. That doesn't cover design, though. I'm not really qualified to tell you about that, as my understanding is cobbled together from pre-filtered sources, like Wikipedia and newsmedia articles. You can pick up a hell of a lot that way, but it relies on other people having done the really hard work for you. The more you understand, the deeper and more abstractly you can appreciate the differences in architectures. If I were to just splat out some good articles, I'd basically be regurgitating the AnandTech CPU category. So let's do that. Intel Historical
Googling "fundamentals of CPU design" and hitting "I feel lucky" gave me a PDF textbook from a technical school's CS Computer Architecture course. That would be the bottom-up method; the more you understand, the closer you get to the feel for performance that benchmarks and practical experience give. One of the details between high-level understanding and theoretical understanding that it might help to know now is that the CISC vs. RISC debate from the mid-90s has disappeared. CISC x86 CPUs now use RISC-based, pipelined execution cores, and the x86 instruction set is translated using an instruction decoder that splits a CISC operation into multiple RISC micro-ops.
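The decode step described in the last paragraph can be caricatured in a few lines. A toy sketch only - the instruction syntax and micro-op names here are invented for the example, and real x86 front ends are vastly more involved:

```python
# Toy CISC-to-micro-op "decoder": a memory-destination instruction gets
# split into load / execute / store micro-ops, the way a modern x86 front
# end feeds its RISC-like pipelined core. Syntax and names are made up.

def decode(insn):
    """Split a toy 'op dest, src' instruction into RISC-like micro-ops."""
    op, operands = insn.split(None, 1)
    dest, src = [s.strip() for s in operands.split(",")]
    if dest.startswith("["):  # memory destination: read-modify-write
        addr = dest.strip("[]")
        return [f"load tmp, {addr}", f"{op} tmp, {src}", f"store {addr}, tmp"]
    return [f"{op} {dest}, {src}"]  # register-only form is already RISC-like

print(decode("add [mem0], eax"))  # memory form splits into three micro-ops
print(decode("add ebx, eax"))     # register form passes through as one
```

The point is just that the "CISC vs. RISC" war ended in a merger: the complex instruction survives as an interface, while the hardware underneath executes simple fixed-function micro-ops.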
|
# ? Nov 14, 2012 04:28 |
|
Factory Factory posted:Intel occasionally puts out "how this poo poo done get made" documents [PDF, YouTube without text] that show up either in PDF form or repackaged by news websites. AMD apparently made a video in 2009, too, but I'm too lazy to find a new version. Hilarious. I thought suggesting Anandtech articles would be a bad idea for some reason and deleted my post. I've learned so much from that site about details I never thought mattered in CPU design. I haven't seen another site that dives into processor architecture like Anandtech does. Great stuff.
|
# ? Nov 14, 2012 04:32 |
|
Edit: I'm a stupid gently caress who can't use Google.
Richard M Nixon fucked around with this message at 22:12 on Nov 14, 2012 |
# ? Nov 14, 2012 21:57 |
|
jink posted:Hilarious. I thought suggesting Anandtech articles would be a bad idea for some reason and deleted my post. Real World Tech is also a pretty good site for some of those nitty-gritty details on CPU/GPU architectures. This is getting a little off topic but here's a pretty recent article from them on Haswell architecture.
|
# ? Nov 14, 2012 22:22 |
|
lkz posted:Real World Tech is also a pretty good site for some of those nitty-gritty details on CPU/GPU architectures. This is getting a little off topic but here's a pretty recent article from them on Haswell architecture. They sure do get nitty-gritty. I am pretending to understand half of what they are talking about in this article. In GPU news, a new BETA driver, 310.54, came out from nVidia. Improvements across the board: http://www.geforce.com/whats-new/articles/nvidia-geforce-310-54-beta-drivers-released
|
# ? Nov 15, 2012 06:57 |
|
|
I didn't realise Sniper Elite v2 was such a demanding game. Is it really nice looking?
|
# ? Nov 15, 2012 15:01 |