|
Zero VGS posted:Not that I want to give NVidia a pass for the 970 debacle, but it is also kind of telling when people are like "This is an outrage" and Newegg is like "Okay, send it back, full refund", and everyone is like "Oh it's still the best overall card in existence, I just want free monies". I can't think of a Radeon card that has had enough issues that they needed to change specs/recall them. Nvidia had problems with both the 8800 series and mobile GPUs for years that they failed to address until they got hit with lawsuits.
|
# ? Feb 20, 2015 01:27 |
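For context on the debacle being discussed: a back-of-envelope sketch of the 970's segmented memory, using the corrected figures NVIDIA disclosed after the story broke. The bus widths and 7.0 Gbps data rate below are assumptions based on that public reporting, not anything stated in this thread.

```python
# Hedged sketch: the GTX 970's 4 GB is split into a 3.5 GB segment
# behind seven 32-bit memory controllers and a 0.5 GB segment behind
# a single one, which is why the last half-gigabyte is so much slower.

EFFECTIVE_GBPS = 7.0  # assumed GDDR5 effective data rate per pin


def bandwidth_gb_s(bus_width_bits):
    # bandwidth (GB/s) = data rate (Gb/s per pin) * bus width / 8 bits per byte
    return EFFECTIVE_GBPS * bus_width_bits / 8


fast_segment = bandwidth_gb_s(7 * 32)  # 3.5 GB segment
slow_segment = bandwidth_gb_s(1 * 32)  # 0.5 GB segment

print(f"3.5 GB segment: {fast_segment:.0f} GB/s")  # 196 GB/s
print(f"0.5 GB segment: {slow_segment:.0f} GB/s")  # 28 GB/s
```

Which is roughly a 7:1 bandwidth gap, and why games that spill past 3.5 GB can stutter even though the card nominally has 4 GB.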
|
I can! Apple just started an out-of-warranty repair program for all Macs with AMD GPUs from 2011 through 2013 (though most of the problems are with 2011 models).
|
# ? Feb 20, 2015 01:36 |
|
Swartz posted:Just a note: this no longer works. Sucks that it doesn't work anymore. I've seen multiple people get 20-30% back from Amazon (posted a pic of their chat) just by asking even today. I guess there was only a certain amount of money to go around. I legitimately was planning on sending the card back if it was my only option. I wasn't going to get an AMD though, just the 980.
|
# ? Feb 20, 2015 03:49 |
|
Party Plane Jones posted:I can't think of a Radeon card that has had enough issues that they needed to change specs/recall them. Nvidia had problems with both 8800s series and mobile GPUs for years that they failed to address until they got hit with lawsuits. Yeah, but a purchase based on that alone is spite. It's still the best card overall... for now. I had higher hopes for the 3 series, the leaks kind of have ups and downs imo, but it's going to come down to the pricing in the end. AMD doesn't seem to have any issue with pricing competitively but I'll just hold out for more info rather than guess at this point.
|
# ? Feb 20, 2015 07:34 |
|
Zero VGS posted:Not that I want to give NVidia a pass for the 970 debacle, but it is also kind of telling when people are like "This is an outrage" and Newegg is like "Okay, send it back, full refund", and everyone is like "Oh it's still the best overall card in existence, I just want free monies". Plenty of people who returned it and then got a 980, though E: Speak of the devil: suddenlyissoon posted:[...] I legitimately was planning on sending the card back if it was my only option. I wasn't going to get an AMD though, just the 980.
|
# ? Feb 20, 2015 09:25 |
|
Instant Grat posted:Plenty of people who returned it and then got a 980, though Hey, I only buy video cards once every 3 or 4 years. If I've got the money at the moment, I need to jump to something that will be better under 1440p!
|
# ? Feb 20, 2015 12:04 |
|
Yeah, I was gonna get 2 970s to run my 30", and now I don't think I will over this. 3.5gb might be just right for 1440p, but people with 2560x1600 say it can bog down in some games. I think I'll just wait for R300 and see what happens to 980 prices or if a r300 is a better deal.
|
# ? Feb 20, 2015 12:24 |
|
nVidia is bowing to pressure; future drivers will once again allow overclocking on mobile GPUs.
|
# ? Feb 20, 2015 16:39 |
|
I give it three weeks before angry retards who bought "gaming laptops" and OC'd the very specifically-cooled Nvidia GPUs past the breaking point start a witch hunt.
|
# ? Feb 20, 2015 16:44 |
|
Factory Factory posted:I can! Apple just started an out-of-warranty repair program for all Macs with AMD GPUs from 2011 through 2013 (though most of the problems are with 2011 models). Only the 2011 models use an AMD GPU; the affected late 2012 and early 2013 Retina Pros used a GeForce 650m.
|
# ? Feb 20, 2015 16:49 |
|
Truga posted:Yeah, I was gonna get 2 970s to run my 30", and now I don't think I will over this. 3.5gb might be just right for 1440p, but people with 2560x1600 say it can bog down in some games. I think I'll just wait for R300 and see what happens to 980 prices or if a r300 is a better deal. I have a single 980 for a 2560 * 1600 monitor, I find it plenty for everything I play atm. I think it gets 75 ish fps in Shadow of Mordor. It might not cut it for a future consumer Oculus, but that's about the only downside I can think of. I wasn't really interested in experimenting with an SLI setup either.
|
# ? Feb 20, 2015 17:04 |
|
I'm looking out for something that will deliver ~100fps in most games so I can just buy and plug a consumer Rift in when it arrives. A single 980 doesn't quite cut it, but two 970s do, IIRC. I'm definitely going to wait a bit longer now though, just to see what happens. The last couple months or so I haven't been playing anything graphically heavy enough to warrant an upgrade anyway.
|
# ? Feb 20, 2015 18:02 |
|
Truga posted:Yeah, I was gonna get 2 970s to run my 30", and now I don't think I will over this. 3.5gb might be just right for 1440p, but people with 2560x1600 say it can bog down in some games. I think I'll just wait for R300 and see what happens to 980 prices or if a r300 is a better deal. I have a 970 and I play at 2560x1600, though I'm mainly playing Blizzard games. Some people are saying that the Ubisoft ones tend to be quite slow at these resolutions on a 970. It's all up to whatever you're playing, I guess. Since none of the Ubisoft games ever hooked me, I have a really easy time saying: screw you. YMMV.
|
# ? Feb 20, 2015 19:42 |
|
Truga posted:Yeah, I was gonna get 2 970s to run my 30", and now I don't think I will over this. 3.5gb might be just right for 1440p, but people with 2560x1600 say it can bog down in some games. I think I'll just wait for R300 and see what happens to 980 prices or if a r300 is a better deal.
|
# ? Feb 21, 2015 02:28 |
|
I think I might be CPU bound. I managed to get the memory and core clocks up to +300 each and while the numbers went up the performance did not. Also I was getting frame drops in game despite the GPU being less than 100%. Then I tried goosing the CPU up a little faster and couldn't POST anymore. welp
|
# ? Feb 21, 2015 05:22 |
|
Panty Saluter posted:I think I might be CPU bound. I managed to get the memory and core clocks up to +300 each and while the numbers went up the performance did not. Also I was getting frame drops in game despite the GPU being less than 100%. Then I tried goosing the CPU up a little faster and couldn't POST anymore. Check your thermal throttling, numb nuts.
|
# ? Feb 21, 2015 05:43 |
|
Truga posted:I'm looking out for something that will deliver ~100fps in most games so I can just buy and plug a consumer rift in when it arrives. A single 980 doesn't quite cut it, but two 970 do IIRC. I'm definitely going to wait a bit longer now though, just to see what happens. Last couple months or so I haven't been playing anything graphically heavy enough to warrant an upgrade anyway. Hey SLI is bad for VR and probably will still be when the consumer rift arrives. Source - That guy who works there who posts here a lot.
|
# ? Feb 21, 2015 13:12 |
|
r0ck0 posted:Check your thermal throttling, numb nuts. I looked for it but didn't see anything, so f u. It does top out at 80C, which does sound right. Would the throttling be something in Afterburner or the Nvidia panel?
|
# ? Feb 21, 2015 17:26 |
|
Panty Saluter posted:I think I might be CPU bound. I managed to get the memory and core clocks up to +300 each and while the numbers went up the performance did not. Also I was getting frame drops in game despite the GPU being less than 100%. Then I tried goosing the CPU up a little faster and couldn't POST anymore. Yes it sounds like it
|
# ? Feb 21, 2015 18:50 |
|
Fajita Fiesta posted:Hey SLI is bad for VR and probably will still be when the consumer rift arrives. Source - That guy who works there who posts here a lot. Nvidia was claiming that they had a way to make it so that one card rendered for one eye and the other for the other eye in a two-card SLI setup, but I don't think that has been enabled for consumers yet.
|
# ? Feb 21, 2015 20:59 |
|
The future of VR rendering is still uncertain. Even if I could tell you everything I know, I couldn't give you a confident prediction of how SLI/etc. will benefit VR applications in specific cases or in general, at the point we ship CV1. I don't recommend making assumptions about what configurations will work best for the consumer Rift until we release more information; I don't have a timeline to share on that, unfortunately.
|
# ? Feb 21, 2015 23:14 |
|
Just buy your card when the CV1 is out if that's all you're planning around, guys; there might be some awesome 20nm or 16nm cards by then, who can say. Also, just got my 970 EVGA reference blower, much quieter acoustics than my PNY 970 reference blower, if anyone was curious. Same coil whine on both though. Edit: It begins: http://wccftech.com/nvidia-face-lawsuit-gtx-970-false-advertising/ Zero VGS fucked around with this message at 00:30 on Feb 22, 2015 |
# ? Feb 22, 2015 00:27 |
|
Zero VGS posted:Edit: It begins: http://wccftech.com/nvidia-face-lawsuit-gtx-970-false-advertising/ Is release of specifications to press considered marketing? This will be interesting. I don't recall NVIDIA marketing the number of ROPs on their site when I was comparing cards, and I'm pretty sure it wasn't listed anywhere in MSI's marketing or packaging materials.
|
# ? Feb 22, 2015 00:34 |
|
Zero VGS posted:Not that I want to give NVidia a pass for the 970 debacle, but it is also kind of telling when people are like "This is an outrage" and Newegg is like "Okay, send it back, full refund", and everyone is like "Oh it's still the best overall card in existence, I just want free monies". Not quite the same, but I was looking to get a new video card, and a 970 was at the top of my list. With the information that Nvidia lied, I got a 290X instead. It helped that 290Xs were heavily discounted; after everything, I got mine for $220.
|
# ? Feb 22, 2015 02:04 |
|
Zero VGS posted:Edit: It begins: http://wccftech.com/nvidia-face-lawsuit-gtx-970-false-advertising/ They're being way too nice to NVIDIA in that article. The incorrect specs stayed up for months on review sites; it's inconceivable that none of NVIDIA's techs read those sites, or that all of them missed the errors. If NVIDIA had corrected the specs in a reasonable timeframe I'd have no issues with them, mistakes do happen, but if not for the fact that AMD has burned me too many times in the past I would have returned my 970 over this.
|
# ? Feb 22, 2015 04:13 |
|
I spent the last couple hours learning something really interesting about PCI Riser cards, those flexible ribbon cables. I got a 12-inch cable in the mail and got no image when I plugged my 970 into it. I plugged in a shorter 8-inch cable and it worked perfectly fine, even overclocked. Then I read online from one person selling riser cables that they recommend 8-inch and shorter for PCI-E Gen 3. So I went into my BIOS and forced PCI-E Gen 2. With Gen 2, the 12-inch cable worked, but running benchmarks caused the Nvidia driver to gracefully crash, similar to when I overclock too far. gently caress this I said, and I set the BIOS to PCI-E Gen 1. Now the card runs rock-solid with the 12-inch riser on Gen 1. Finally I re-ran the benchmark program with identical settings. PCI-E Gen 3 was 124fps average, Gen 1 was 124fps average. Hmm. Going to Wikipedia, it says that x16 Gen 1 is still 32 gigabits a second, that's a loving lot, isn't it? In what situation would I wind up maxing that out?
|
# ? Feb 22, 2015 04:21 |
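The surprising benchmark result above checks out arithmetically. A sketch of theoretical x16 link bandwidth per PCIe generation, with the encoding overheads assumed from the spec (8b/10b for Gen 1 and 2, 128b/130b for Gen 3):

```python
# Hedged sketch: theoretical x16 bandwidth per PCIe generation, which
# is why even a Gen 1 link can still feed a single GPU adequately.


def x16_bandwidth_gb_s(gt_per_s, payload_bits, line_bits):
    # per-lane payload rate (GB/s) = transfer rate * encoding efficiency / 8,
    # then multiplied across the 16 lanes of an x16 slot
    per_lane = gt_per_s * (payload_bits / line_bits) / 8
    return per_lane * 16


gens = {
    "Gen 1": x16_bandwidth_gb_s(2.5, 8, 10),     # ~4 GB/s (32 Gb/s)
    "Gen 2": x16_bandwidth_gb_s(5.0, 8, 10),     # ~8 GB/s
    "Gen 3": x16_bandwidth_gb_s(8.0, 128, 130),  # ~15.75 GB/s
}
for gen, bw in gens.items():
    print(f"{gen} x16: {bw:.2f} GB/s")
```

The Gen 1 figure matches the "32 gigabits a second" number from Wikipedia quoted above, and a single card rarely streams anywhere near that much over the bus mid-frame.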
|
Zero VGS posted:I spent the last couple hours learning something really interesting about PCI Riser cards, those flexible ribbon cables. Guru3D actually just ran an article seeing if there is a significant difference in performance between PCI-E 1.1 - 3.0 with GTX 980s. For single-GPU solutions, the answer was that there was no significant difference in performance. There was a more noticeable difference in SLI, and a dual-GPU card like an R9 295X likely saturates PCI-E 1.1, but it still looks to be enough for now. http://www.guru3d.com/articles_pages/pci_express_scaling_game_performance_analysis_review,1.html
|
# ? Feb 22, 2015 04:43 |
|
Thanks for posting that, that's excellent. I'm developing a case that requires a riser so this is all good to learn. I think I might try making a custom riser, replacing the ribbon cables with something like twisted-pair ethernet to see if I can get more length without the signal degrading.
|
# ? Feb 22, 2015 05:02 |
|
Zero VGS posted:Just buy your card when the CV1 is out if that's all you're planning around guys, there might be some awesome 20nm or 16nm cards or by then, who can say. Oh boy, I can't wait to get my $3.65!
|
# ? Feb 22, 2015 05:06 |
|
TerminalSaint posted:Oh boy, I can't wait to get my $3.65! Class action lawsuits are less about awarding the plaintiffs and more about punishing the defendant. That's $3.65 over hundreds of thousands of customers.
|
# ? Feb 22, 2015 05:59 |
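The arithmetic behind that point, with made-up numbers: the per-member check is tiny, but the defendant's side of the ledger scales with the class. The class size and fee share below are hypothetical, and the $3.65 is just the joke figure from the post above.

```python
# Hedged illustration with hypothetical numbers: why a $3.65 payout
# per class member can still sting the defendant.

payout_per_member = 3.65
class_size = 300_000     # "hundreds of thousands of customers" (assumed)
lawyer_fee_share = 0.25  # typical contingency cut (assumed)

# the defendant funds both the member payouts and the plaintiff fees
total = payout_per_member * class_size / (1 - lawyer_fee_share)
print(f"Defendant pays roughly ${total:,.0f}")
```

Under those assumptions the bill comes to about $1.5 million before the defendant's own legal costs, which is the actual punishment.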
|
dpbjinc posted:Class action lawsuits are less about awarding the plaintiffs and more about punishing the defendant. That's $3.65 over hundreds of thousands of customers. It'd be hilarious if a retailer actually joined in on the action, I doubt any would damage their partnership for a small payout but it'd be fun to watch.
|
# ? Feb 22, 2015 07:03 |
|
Today I got a second EVGA 780ti with the intention of setting up SLI on my desktop. Individually both cards work fine in either PCI x16 slot. The issue that I'm seeing is that when I have both plugged in and connected with the SLI bridge cable, the fans on just one of the cards go full blast all the time, starting during bootup when the user profile is loading. Both cards are detected and I can enable SLI in the NVIDIA control panel, but performance in games is well below what I was getting with just one card. Does this sound more like a driver issue or is there something else I should be checking first? GeForce Experience is showing that I have the latest drivers installed and I'm using an 850W power supply. Tomorrow I'll check for BIOS updates and try reinstalling drivers. Thanks for any direction and advice!
|
# ? Feb 22, 2015 08:39 |
|
Factory Factory posted:I can! Apple just started an out-of-warranty repair program for all Macs with AMD GPUs from 2011 through 2013 (though most of the problems are with 2011 models). I read about this problem, though, and it's not specific to the AMD GPU; it occurs on MacBooks with NVIDIA GPUs too. As far as I understand it, it comes down to the solder attaching the GPU package to the motherboard.
|
# ? Feb 22, 2015 14:29 |
|
Desuwa posted:it's inconceivable that none of NVIDIA's techs read those sites, or that all of them missed the errors. If NVIDIA had corrected the specs in a reasonable timeframe I'd have no issues with them, mistakes do happen, but if not for the fact that AMD has burned me too many times in the past I would have returned my 970 over this. I disagree. I'm not looking to give Nvidia an easy pass here, but it's completely conceivable that in a large, highly technical, multi-discipline working environment, two separate groups (marketing guys and engineer guys) hosed up in communicating technical specifications and a fairly obscure, esoteric piece of information was missed. It's not the job of Engineer Guys to go on websites in the months following the product release and check every marketing chart. Checking should have happened before the information was released to marketing (though I don't know how strict their checking/approval QA processes would be - these things get very lax in the absence of audits even if they do have procedures. We can expect they're much stricter now), but even if it was, "4 GB VRAM, xxx-bit bus width" is still superficially correct and I can see how it might not be caught as an error. Nvidia deserves a bit of suffering for this gently caress-up, but my interpretation of much of the rage I've read is that it is artificial anger and stems from the notion that making enough noise about the affair would force Nvidia to offer free upgrades to 980s.
|
# ? Feb 22, 2015 17:38 |
|
Daviclond posted:I disagree. I'm not looking to give Nvidia an easy pass here but it's completely conceivable that in a large, highly technical and multi-discipline working environment, two separate groups (marketing guys and engineer guys) hosed up in communicating technical specifications and a fairly obscure, esoteric piece of information was missed out. It's not the job of Engineer Guys to go on websites in the months following the product release and check every marketing chart. Checking should have happened before the information was released to marketing (though I don't know how strict their checking/approval QA processes would be - these things get very lax in the absence of audits even if they do have procedures. We can expect they're much stricter now ) but even if it was, "4 GB VRAM, xxx-bit bus width" is still superficially correct and I can see how it might not be caught as an error. If NVIDIA was quick-witted enough, they'd have copied Ubisoft, giving a free game to every 970 owner, but by accepting the game "gift", you throw away your right to sue them.
|
# ? Feb 22, 2015 18:55 |
|
Daviclond posted:I disagree. I'm not looking to give Nvidia an easy pass here but it's completely conceivable that in a large, highly technical and multi-discipline working environment, two separate groups (marketing guys and engineer guys) hosed up in communicating technical specifications and a fairly obscure, esoteric piece of information was missed out. It's not the job of Engineer Guys to go on websites in the months following the product release and check every marketing chart. Checking should have happened before the information was released to marketing (though I don't know how strict their checking/approval QA processes would be - these things get very lax in the absence of audits even if they do have procedures. We can expect they're much stricter now ) but even if it was, "4 GB VRAM, xxx-bit bus width" is still superficially correct and I can see how it might not be caught as an error. The suit is almost definitely a settlement grab. Absolute best case assuming no settlement is that the suit survives motion to dismiss (where the standard is "Is there any actionable claim alleged under any half-plausible set of facts whatsoever?") and then the plaintiff lawyers get to rummage through Nvidia's communications during discovery. That would be annoying enough that Nvidia would settle for significant "nuisance money" just to be rid of them. Even that level of success is pretty optimistic, considering the level of intentionality necessary in the claims they're alleging.
|
# ? Feb 22, 2015 21:45 |
|
Man, I just read that the 300 series is going to be a bunch of refreshed old stuff. Seeing how the 200 series was already refreshed old stuff... I don't know, I guess I figured it wouldn't be this time. (not including 390x, 290x, etc) Time will tell I guess, but it will be ironic if it's just overclocked versions of the cards they are selling so cheap now suddenly at full price, creating an awkward situation about which to buy.
|
# ? Feb 22, 2015 23:55 |
|
This is a gross generalization, but it can be said that only the people in the R9 390X's target market care about whether or not the latest generation of GPUs is using brand-new silicon anyway. With that kind of mindset, it's a viable strategy to keep touting oneself as the best with a halo product while 1) saving money on not having to develop cut-downs of the new silicon and 2) getting rid of old silicon stock. Both AMD and nVidia have been doing it; it's fresh in my mind since that was very soon after I started following new GPU releases. The HD 7970 and GTX 680 have been around in some form for a fairly long time. double edit: and only recently (last month) is the GTX 680 starting to be replaced in parity by the GTX 960 Sidesaddle Cavalry fucked around with this message at 00:15 on Feb 23, 2015 |
# ? Feb 23, 2015 00:06 |
|
Panty Saluter posted:I looked for it but didnt see anything so f u t Nvidia cards start throttling themselves at 70C. If you're hitting 80C while being throttled, you should probably look into fixing your case airflow. Or maybe you got one of the MSI 4Gs with bad fans that sometimes just fail to spin up.
|
# ? Feb 23, 2015 13:18 |
|
Hamburger Test posted:Nvidia cards start throttling themselves at 70C. If you're hitting 80C while being throttled, you should probably look into fixing your case airflow. Nvidia cards definitely don't throttle at 70c. Default is 80c, can be raised to 95c with a simple slider in Afterburner. Even then, that is a "temp target", in that the card will still generally run at its max boost clock while at those temps, but it won't have the dynamic boost beyond that.
|
# ? Feb 23, 2015 13:38 |
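The temp-target behavior described in the last post can be sketched as a toy model. This is my own simplification under stated assumptions, not NVIDIA's actual GPU Boost algorithm: below the target the card holds its maximum boost clock, and past the target it sheds ~13 MHz boost bins rather than hard-throttling below base. The clock figures are reference GTX 970 numbers.

```python
# Hedged toy model of GPU Boost 2.0's temperature target (assumed
# behavior; the real algorithm also weighs power limits and voltage).


def boost_clock_mhz(temp_c, base=1050, max_boost=1178,
                    temp_target=80, bin_mhz=13):
    """Hold max boost below the temp target; over it, drop one
    ~13 MHz boost bin per degree, never going below the base clock."""
    if temp_c < temp_target:
        return max_boost
    over = temp_c - temp_target + 1
    return max(base, max_boost - bin_mhz * over)


print(boost_clock_mhz(70))  # 1178 - full boost, well under target
print(boost_clock_mhz(80))  # 1165 - at target, first bin shed
print(boost_clock_mhz(95))  # 1050 - clamped to the advertised base
```

Which matches the post's point: hitting the 80C target trims the dynamic boost headroom, but the card still runs at or above its advertised clocks rather than "throttling" in the classic sense.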