|
sauer kraut posted:I hope some idiot who's panic selling his MSI 970 buys a reference blower R9 290 just like that to stick it to the man. I continue to believe that AMD is shooting themselves in the foot with regard to user experience by sticking with their two-DVI-and-some-DP-maybe-a-mini-DP-oh-and-you've-got-to-have-an-HDMI-out! I/O. When the first 200-series cards came out, some people did mods to their retention brackets, cutting out almost everything except for a thin border. This did not reduce temperatures significantly. What it *did* do was reduce the backpressure caused by the restrictive bracket, removing a lot of what was obstructing airflow and shifting the frequency of the air being pushed by the blower to one less irritating to the human ear, making it subjectively quieter. TL;DR: "EEEEEEEEEEEE" to "whooooooosh". I think that if AMD wants to stop screwing themselves over with their own I/O, they should abandon all DVI ports and just have quad DisplayPort, or triple DP and one HDMI, on one layer, with every other bit of the retention bracket an open grid to allow maximum airflow. And if someone still needs DVI, throw in a passive adapter or two. (But really, Intel, AMD, Dell, Lenovo, Samsung and LG have all been committed to phasing out DVI since 2010 anyway. Except for Nvidia, who still haven't committed because they're stubborn fucks that don't play nice with ANYONE, especially not AMD.) It would have the side effect of making sure that people use the correct connection to take advantage of FreeSync/Adaptive-Sync as well, something I'm sure AMD really, really, really wants people to adopt as soon as possible. Bonus! SwissArmyDruid fucked around with this message at 12:57 on Jan 29, 2015
# ? Jan 29, 2015 12:45 |
|
|
|
I wonder how the issue affects the secondary market. Are private sellers obligated to disclose the memory/bandwidth/ROP count problem to potential buyers? And if they don't, are they making themselves vulnerable to litigation? There are enough assholes out there, after all...
|
# ? Jan 29, 2015 13:28 |
|
mcbexx posted:I wonder how the issue affects the secondary market. Are private sellers obligated to disclose the memory/bandwidth/ROP count problem to potential buyers? And if they don't, are they making themselves vulnerable to litigation? There are enough assholes out there, after all... Unless they display the incorrect details from the original Nvidia marketing info, I'm going to put on my internet lawyer hat and say there's no legal vulnerability there at all.
|
# ? Jan 29, 2015 13:55 |
|
If you put up a craigslist ad saying "I am selling a GTX 970" and someone searching for a GTX 970 finds your ad and buys your GTX 970, how on earth would that leave you "vulnerable to litigation"
|
# ? Jan 29, 2015 14:03 |
|
I currently have the option to send back my Asus GTX 970 to get my money back. I'm thinking about actually doing it and buying a 980 instead. Should I just stick with the 970 despite the flaw and wait for the next generation of graphics cards? I'm running a Core i5-2500K overclocked at 4.2 GHz with 16 GB RAM; just wondering if the CPU would bottleneck a 980 too much.
|
# ? Jan 29, 2015 14:17 |
|
It's not a flaw, it's an intended consequence of the way the card was designed. Did Nvidia realize their marketing department had sent out inflated ROP/L2 cache numbers to reviewers and were just hoping no one would notice? Probably; I find it very hard to believe that no one on their technical team realized that every single tech site was listing incorrect specs for the card. Is that shady as fuck? Absolutely it is. Does this negate the real-world benchmarks showing how well the card performs? No it does not. Because of architectural nonsense, the card will have substantially slower memory transfer speeds if it needs to access the last 512MB of VRAM (still faster than if it had to go to system memory, but not drastically so), so just treat it like a 3.5GB card and then decide whether you'd consider that a good deal.
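To put rough numbers on the slowdown, here's a back-of-the-envelope sketch using the publicly reported 970 figures (7 Gbps effective GDDR5, with the 256-bit bus split 224-bit/32-bit between the two partitions); treat it as illustration, not a benchmark:

```python
# Back-of-the-envelope peak bandwidth for the GTX 970's segmented VRAM.
# Assumes the publicly reported figures: 7 Gbps effective GDDR5 data rate
# and a 256-bit bus split 224-bit / 32-bit between the two partitions.

GDDR5_GBPS = 7.0  # effective data rate per pin, Gbit/s

def bandwidth_gb_s(bus_width_bits: float, data_rate_gbps: float = GDDR5_GBPS) -> float:
    """Peak memory bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"advertised (256-bit): {bandwidth_gb_s(256):.0f} GB/s")   # 224 GB/s
print(f"3.5 GB partition:     {bandwidth_gb_s(224):.0f} GB/s")   # 196 GB/s
print(f"512 MB partition:     {bandwidth_gb_s(32):.0f} GB/s")    # 28 GB/s
```

Even the slow 512 MB segment at ~28 GB/s beats going over PCIe to system memory (~16 GB/s on a 3.0 x16 link), which matches the "faster than system memory, but not drastically so" point above.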
|
# ? Jan 29, 2015 14:51 |
|
Haerc posted:Same, the only person selling any 970s near me wants $325, which is a joke.
|
# ? Jan 29, 2015 15:31 |
|
FrickenMoron posted:I currently have the option to send back my Asus GTX 970 to get my money back. Thinking about actually doing it and buying a 980 instead. Should I just stick with the 970 despite the flaw and wait for the next generation of graphics cards? The important question is what resolution you're gaming at. If it's anything over 1080p then it's not tooooo hard to talk yourself into a 980 (I certainly did), but if you're just at 1080p, don't bother. A 2500K isn't going to bottleneck you at resolutions above 1080p; I'm less sure about 1080p itself. You could probably OC it a bit more though, 4.2 GHz is really low for a 2500K.
|
# ? Jan 29, 2015 16:14 |
|
I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card, but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra.
|
# ? Jan 29, 2015 16:24 |
|
sauer kraut posted:I hope some idiot who's panic selling his MSI 970 buys a reference blower R9 290 just like that to stick it to the man. Or if you plan to watercool.
|
# ? Jan 29, 2015 16:52 |
|
FrickenMoron posted:I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra. DA:I pushed my 4670K @ 4.2GHz to 100%.
|
# ? Jan 29, 2015 16:57 |
|
FrickenMoron posted:I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra. It's not the CPU from what I can tell. Here are some relevant benchmarks. At 1280×720 with a 980, there's a small difference between the Intel CPUs. Of course, the AMD CPUs are absolutely in the shitter, but if you check closer to the bottom of the page, the Intel and AMD CPUs end up performing pretty much the same at Ultra settings - clearly not a CPU bottleneck, as the AMD ones were demonstrably weaker when the GPU bottleneck was removed. At 1920×1200, the Geforce 970 is at 49 FPS average on Ultra with 2×MSAA. This is with a 4770K at stock, as noted on the first page. tl;dr - the 970 not doing a smooth 60 FPS in this game at Ultra is expected and normal. HalloKitty fucked around with this message at 17:41 on Jan 29, 2015
# ? Jan 29, 2015 17:33 |
|
FrickenMoron posted:I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra. A benchmarked 980 did 55 fps at 1080p in DA:I on the Ultra preset, so don't worry about it. ^^ yeah
|
# ? Jan 29, 2015 18:30 |
|
Thanks guys, I'll stick with my 970 and hope for some free stuff from Nvidia. You guys have no idea what kind of uproar this issue is currently causing on all major gaming/hardware sites here. I'll check how much I can OC my CPU before it crashes and fiddle with the voltage a bit, I guess.
|
# ? Jan 29, 2015 19:43 |
|
People keep complaining about the 970's VRAM issue, but isn't the 660/660 Ti like that too, with the last 512MB of VRAM being much slower? It's the main reason I want to upgrade from my 660 Ti, because VRAM overclocking only does so much, especially in newer games. Where's the $249 960 Ti, Nvidia?
|
# ? Jan 29, 2015 22:11 |
|
HalloKitty posted:It's not the CPU from what I can tell. Here are some relevant benchmarks. I play 1440p but keep everything ultra except turn off MSAA. Can't tell a difference at all but maybe it's more noticeable at lower rez.
|
# ? Jan 29, 2015 23:19 |
|
Bleh Maestro posted:I play 1440p but keep everything ultra except turn off MSAA. Can't tell a difference at all but maybe it's more noticeable at lower rez. A huge part of the performance hit is probably from the 8x Ultra AA.
|
# ? Jan 29, 2015 23:53 |
|
spasticColon posted:People keep complaining about the 970's VRAM issue but isn't the 660/660ti like that too with the last 512MB of VRAM being much slower? It's the main reason I want to upgrade from my 660ti because VRAM overclocking only does so much especially in newer games. Yes, but it was advertised as such (pretty much). That's the real problem here. That AMD ad made me . I knew it was coming, but I figured they'd use it for 390/380 ads. I also thought it'd be... slightly... more muted than that. Something like "R9 390X, 4 (actual) GB." But hey. Though a 290 reference would be a solid sidestep at best; that last 512MB is real loud. Can get an Asus 290X for like $270 after rebates today on Newegg. Edit: wow, those are some insanely bad reviews on Newegg. 1gnoirents fucked around with this message at 00:08 on Jan 30, 2015
# ? Jan 30, 2015 00:03 |
|
It would be illuminating if Nvidia manufactured a 970 to the originally advertised specs and let the tech press benchmark it, to show once and for all what the performance difference is.
|
# ? Jan 30, 2015 00:53 |
|
1gnoirents posted:Can get an asus 290x for like $270 after rebates today on newegg quote:Load up DA Inquisition and a minute into it what's that sound? Is someone cooking hot pockets? Smells like someone is burning styrofoam popcorn. Alright new card I read the reviews and knew this card can get hot but didn't think it would be a problem on a new card. Checked the temps and it was at 92c. Felt my exhaust and it was a heater blowing hot air
|
# ? Jan 30, 2015 01:11 |
|
I'm super confused. Can some r9 290's be used in crossfire without a bridge? Is this a thing? This card to be specific Betty fucked around with this message at 01:18 on Jan 30, 2015 |
# ? Jan 30, 2015 01:13 |
|
Yeah, the 290 and 290X don't need bridges. They do their CrossFire communication directly over the PCIe bus, which is upwards of 16x faster than an external CrossFire bridge.
|
# ? Jan 30, 2015 01:23 |
How come nvidia cards don't do that?
|
|
# ? Jan 30, 2015 01:25 |
|
fletcher posted:How come nvidia cards don't do that? They can't charge you for their new LED bridges if they do that!
|
# ? Jan 30, 2015 01:29 |
|
My guess? SLI has been around since 1998 as a feature on high-end 3dfx Voodoo cards. That's where Nvidia acquired the technology and acronym from. It was designed for PCI, and is a relic of the era where the entire system bus peaked at 533 MB/s. Crossfire was designed for PCIe, and even PCIe 1.x had a 16-lane slot speed of 4 GB/s. Crossfire's come a fairly long way since it first came out -- originally you needed a more expensive "master" card as well as the regular card, and the bridge was a Y-connector DVI dongle. There was communication using the PCIe bus, but all the actual image data was transmitted from the secondary card to the primary one via the DVI dongle. Crossfire 2 ran straight over the PCIe bus. CrossFireX switched to bridge chaining so they wouldn't have to basically freeze the entire system to take control of the PCIe bus while they coordinated four cards.
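For anyone wondering where the 4 GB/s figure above comes from, here's a quick sketch using the standard PCIe encoding numbers (these are general spec figures, not anything specific to the cards in this thread):

```python
# Per-direction PCIe payload bandwidth. PCIe 1.x runs each lane at
# 2.5 GT/s with 8b/10b encoding, so a lane carries 250 MB/s of payload;
# PCIe 3.0 runs 8 GT/s with the much leaner 128b/130b encoding.

def pcie_lane_mb_s(gt_per_s: float, enc_payload: int, enc_total: int) -> float:
    """Usable payload MB/s per lane: transfer rate * encoding efficiency / 8 bits."""
    return gt_per_s * 1000 * enc_payload / enc_total / 8

gen1_x16 = pcie_lane_mb_s(2.5, 8, 10) * 16      # 4000 MB/s = the 4 GB/s above
gen3_x16 = pcie_lane_mb_s(8.0, 128, 130) * 16   # ~15.75 GB/s, the XDMA-era bus

print(f"PCIe 1.x x16: {gen1_x16 / 1000:.2f} GB/s")
print(f"PCIe 3.0 x16: {gen3_x16 / 1000:.2f} GB/s")
```

That PCIe 3.0 number is also what makes the bridgeless 290/290X CrossFire mentioned earlier viable: a full x16 link has far more headroom than any external bridge cable.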
|
# ? Jan 30, 2015 01:40 |
|
spasticColon posted:People keep complaining about the 970's VRAM issue but isn't the 660/660ti like that too with the last 512MB of VRAM being much slower? It's the main reason I want to upgrade from my 660ti because VRAM overclocking only does so much especially in newer games. Since the 960 is a full GM206, is a 960 Ti going to be possible on GM204 without gimping it even worse than the 970?
|
# ? Jan 30, 2015 01:59 |
|
Bleh Maestro posted:Since the 960 is a full GM206, is a 960 Ti going to be possible on GM204 without gimping it even worse than the 970? It'll likely just be cut-down GM204, with 3/4 of a 980 instead of the roughly 7/8 of a 980 that the 970 is. Maybe 5/8? Either way it will fit nicely above a 960 and below a 970.
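To put hypothetical core counts on those fractions (a sketch assuming GM204's known layout of 16 SMMs at 128 CUDA cores each; the 970 actually ships with 13 of 16 SMMs enabled, a bit under the 7/8 mentioned above):

```python
# Speculative CUDA core counts for cut-down GM204 parts, assuming the
# known GM204 layout: 16 SMMs x 128 cores (GTX 980 = 16, GTX 970 = 13).

CORES_PER_SMM = 128

def cuda_cores(smm_count: int) -> int:
    """CUDA cores for a GM204 part with the given number of active SMMs."""
    return smm_count * CORES_PER_SMM

print(cuda_cores(16))  # 2048 -> GTX 980
print(cuda_cores(13))  # 1664 -> GTX 970
print(cuda_cores(12))  # 1536 -> a hypothetical "3/4 of a 980" 960 Ti
print(cuda_cores(10))  # 1280 -> a hypothetical "5/8 of a 980" 960 Ti
```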
|
# ? Jan 30, 2015 02:08 |
|
How many harvested GM204s that don't make the 970 cut can there be in a mature 28nm process? Aren't those already being shipped to partners as the 970M? I don't think there'll be anything beyond a pointless 4GB model until 970 sales take a nosedive.
|
# ? Jan 30, 2015 02:37 |
|
fletcher posted:Your post convinced me to proceed with my plan to upgrade my 4GB 770 to 970 SLI. I'm gaming at 1080p and I definitely want that sweet sweet FPS for my shiny new G-Sync monitor. Do it! It is just an amazing gaming experience! So Nvidia fucked the specs up. They lied. This is infuriating. We all agree here. But suddenly we should pretend the 970s are bad, because everyone plays at 4K and is playing Far Cry 4 all day with FPS drops to 0, or everyone plays other crappy ports and games with poor memory allocation and VRAM management. So we forget that overclocking 970s allows benchmark results that even pass stock 980 results, even with only 56 ROPs, 1792 KB of L2 cache, and only a 224-bit bus for the 3.5 GByte VRAM partition. Meanwhile I am wondering where I am affected while rocking Shadow of Mordor (a game where the 290X outperforms the 970) @ 2560×1440 on Ultra with avg 98 frames (with "drops" to 49 frames acc. to the benchmark). Nvidia may suffer, but the 970s are still a thing of beauty @ Full HD 1080p and 1440p. Stay classy, AMD, and release something to compete. Mr.PayDay fucked around with this message at 03:05 on Jan 30, 2015
# ? Jan 30, 2015 02:57 |
|
Kazinsal posted:My guess? SLI has been around since 1998 as a feature on high-end 3dfx Voodoo cards. That's where Nvidia acquired the technology and acronym from. It was designed for PCI, and is a relic of the era where the entire system bus peaked at 533 MB/s. Crossfire was designed for PCIe, and even PCIe 1.x had a 16-lane slot speed of 4 GB/s. Nvidia's Scalable Link Interface has nearly no relation to 3dfx's Scan Line Interleave beyond the acronym and being a way to use multiple GPUs, and it was most definitely not designed for PCI. It was introduced in 2004, for PCIe cards.
|
# ? Jan 30, 2015 03:26 |
|
BurritoJustice posted:It'll likely just be cut down GM204, with 3/4 of a 980 instead of 7/8 a 980 that the 970 is. Maybe 5/8? Either way will fit nicely above a 960 and below a 970. That's what I'm hoping for, but that probably means the 960 Ti will only be a 192-bit card with 3GB of VRAM.
|
# ? Jan 30, 2015 03:41 |
|
spasticColon posted:That's what I'm hoping for but that probably means the 960Ti will only be a 192-bit 3GB VRAM card though. As long as there is no ROP fuckery, that'll do. The GTX 960 clearly beats the GTX 660 even though the latter theoretically has about 25% more memory bandwidth. A 960 Ti 4 GB with 3 GB through 192-bit mode and the last 1 GB through 64-bit should still be enough for 1080p.
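A quick sanity check on that bandwidth comparison, using the reference specs (GTX 660 = 192-bit at 6 Gbps GDDR5, GTX 960 = 128-bit at 7 Gbps); the 660's raw advantage works out closer to 29% than 25%, which Maxwell's delta color compression more than makes up for:

```python
# Peak memory bandwidth comparison from the reference board specs:
# GTX 660 = 192-bit bus @ 6 Gbps GDDR5, GTX 960 = 128-bit bus @ 7 Gbps.

def bandwidth_gb_s(bus_bits: int, gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes * data rate."""
    return bus_bits / 8 * gbps

gtx660 = bandwidth_gb_s(192, 6.0)  # 144 GB/s
gtx960 = bandwidth_gb_s(128, 7.0)  # 112 GB/s

print(f"GTX 660: {gtx660:.0f} GB/s, GTX 960: {gtx960:.0f} GB/s")
print(f"660 raw-bandwidth advantage: {(gtx660 / gtx960 - 1) * 100:.0f}%")
```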
|
# ? Jan 30, 2015 05:58 |
|
I wasn't planning on upgrading to a GTX 970 anytime soon, but I'm so tempted to exploit all the rich kids around here who have more money than sense. Sure, they'll still be richer than me, but I'll have a great card for maybe $40 more than the eBay price of my GTX 680, despite that card being 2 generations old. It just means I'll hold off an upgrade for another 3 years instead of 1.
|
# ? Jan 30, 2015 13:58 |
|
https://www.youtube.com/watch?v=spZJrsssPA0
|
# ? Jan 30, 2015 15:48 |
|
Is this GTX970 business even a thing? I've been checking out YouTube videos of people recording their card pushing 4GB and I've barely seen anything amiss.
|
# ? Jan 30, 2015 15:48 |
|
That was legitimately hysterical.
|
# ? Jan 30, 2015 15:52 |
|
HoboWithAShotgun posted:Is this GTX970 business even a thing? I've been checking out YouTube videos of people recording their card pushing 4GB and I've barely seen anything amiss. Other than an advertising thing, not really. Unless for some reason your 970 spontaneously started running slower once you learned your card has fewer ROPs than you thought. It's still an excellent piece of hardware.
|
# ? Jan 30, 2015 15:59 |
|
That clip was better than Downfall.
|
# ? Jan 30, 2015 16:04 |
|
I'm dying. I can't breathe.
|
# ? Jan 30, 2015 16:22 |
|
|
|
This is amazing, I am crying
|
# ? Jan 30, 2015 17:53 |