SwissArmyDruid
Feb 14, 2014

by sebmojo

sauer kraut posted:

I hope some idiot who's panic-selling his MSI 970 buys a reference blower R9 290 just like that, to stick it to the man.
Don't buy that card unless you really like the smell of melting plastic and having a miniature jet engine sitting next to you.

I continue to believe that AMD is shooting themselves in the foot with regard to user experience by sticking with their two-DVI-and-some-DP-maybe-a-mini-DP-oh-and-you've-got-to-have-an-HDMI-out! I/O. When the first 200-series cards came out, some people did mods to their retention brackets, cutting out almost everything except for a thin border. This did not reduce temperatures significantly. However, what it *did* do was reduce the backpressure caused by the restrictive bracket, removing a lot of what was obstructing airflow, and changing the frequency of the air being pushed by the blower to one less irritating to the human ear, with the effect of making it subjectively quieter. TL;DR, "EEEEEEEEEEEE" to "whooooooosh".

I think that if AMD wants to stop fucking themselves over with their own I/O, they should abandon all DVI ports and just have quad DisplayPort (or triple DP and one HDMI) on one layer, with every other bit of the retention bracket an open grid to allow maximum airflow. And if someone still needs DVI, throw in a passive adapter or two. (But really, Intel, AMD, Dell, Lenovo, Samsung and LG have all been committed to phasing out DVI since 2010 anyway. Except for Nvidia, who still haven't committed because they're stubborn fucks that don't play nice with ANYONE, especially not AMD. :barf:)

It would have the side effect of making sure that people use the correct connection to take advantage of FreeSync/Adaptive-Sync as well, something I'm sure AMD really, really, really wants people to adopt as widely and as quickly as possible. Bonus!

SwissArmyDruid fucked around with this message at 12:57 on Jan 29, 2015


mcbexx
Jul 4, 2004

British dentistry is
not on trial here!



I wonder how the issue affects the secondary market. Are private sellers obligated to disclose the memory/bandwidth/ROP count problem to potential buyers? And if they don't, are they making themselves vulnerable to litigation? There are enough assholes out there, after all...

Daviclond
May 20, 2006

Bad post sighted! Firing.

mcbexx posted:

I wonder how the issue affects the secondary market. Are private sellers obligated to disclose the memory/bandwidth/ROP count problem to potential buyers? And if they don't, are they making themselves vulnerable to litigation? There are enough assholes out there, after all...

Unless they display the incorrect details from the original Nvidia marketing info, I'm going to put on my internet lawyer hat and say there's no legal vulnerability there at all.

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE
If you put up a craigslist ad saying "I am selling a GTX 970" and someone searching for a GTX 970 finds your ad and buys your GTX 970, how on earth would that leave you "vulnerable to litigation"?

FrickenMoron
May 6, 2009

Good game!
I currently have the option to send back my Asus GTX 970 and get my money back. I'm thinking about actually doing it and buying a 980 instead. Should I just stick with the 970 despite the flaw and wait for the next generation of graphics cards?
I'm running a Core i5-2500K overclocked to 4.2GHz with 16GB of RAM; just wondering if the CPU would bottleneck a 980 too much.

Instant Grat
Jul 31, 2009

Just add
NERD RAAAAAAGE
It's not a flaw, it's an intended consequence of the way the card was designed. Did Nvidia realize their marketing department had sent out inflated ROP/L2 cache numbers to reviewers and were just hoping no one would notice? Probably; I find it very hard to believe that no one on their technical team realized that every single tech site was listing incorrect specs for the card. Is that shady as fuck? Absolutely it is.

Does this negate the real-world benchmarks showing how well the card performs? No it does not.

Because of architectural nonsense, the card has substantially slower memory transfer speeds when it needs to access the last 512MB of VRAM (still faster than if it had to go to system memory, but not drastically so), so just treat it like a 3.5GB card and then decide whether you'd consider that a good deal.
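The arithmetic behind that split is easy to sketch: peak bandwidth is just bus width times effective data rate. A rough back-of-envelope in Python, using the commonly reported 970 figures (7 Gbps effective GDDR5, a 224-bit path for the 3.5GB pool, a single 32-bit channel for the last 512MB); these are press numbers, not measurements:

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate.
# Bus widths and the 7 Gbps GDDR5 rate are the commonly reported
# GTX 970 figures, used here for illustration only.

def peak_bandwidth_gbs(bus_width_bits, effective_rate_gbps=7.0):
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_gbps

fast_pool = peak_bandwidth_gbs(224)   # 3.5GB pool -> 196.0 GB/s
slow_pool = peak_bandwidth_gbs(32)    # 0.5GB pool -> 28.0 GB/s
advertised = peak_bandwidth_gbs(256)  # as marketed -> 224.0 GB/s

print(fast_pool, slow_pool, advertised)
```

Even the slow pool's 28 GB/s is still roughly double what a PCIe 3.0 x16 link to system memory can deliver, which lines up with the "faster than system memory, but not drastically so" point.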

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

Haerc posted:

Same, the only person selling any 970s near me wants $325, which is a joke.
I'd give it a little bit and see if second-hand prices fall in a week or so. Kinda doubt it, since it's not like the mining cards, where they had no choice but to dump them (thanks, miners) due to market saturation. The guy I bought mine from moved to a 970 - wonder if he's regretting that decision now.

Gwaihir
Dec 8, 2009
Hair Elf

FrickenMoron posted:

I currently have the option to send back my Asus GTX 970 and get my money back. I'm thinking about actually doing it and buying a 980 instead. Should I just stick with the 970 despite the flaw and wait for the next generation of graphics cards?
I'm running a Core i5-2500K overclocked to 4.2GHz with 16GB of RAM; just wondering if the CPU would bottleneck a 980 too much.

The important question is what resolution you're gaming at. If it's anything over 1080p then it's not tooooo hard to talk yourself into a 980 (I certainly did :v:), but if you're just at 1080p, don't bother.

A 2500K isn't going to bottleneck at resolutions higher than 1080p; I'm less sure about 1080p itself. You could probably OC it a bit more, though; 4.2GHz is really low for a 2500K.

FrickenMoron
May 6, 2009

Good game!
I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card, but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra.

veedubfreak
Apr 2, 2005

by Smythe

sauer kraut posted:

I hope some idiot who's panic selling his MSI 970 buys a referebce blower R9 290 just like that to stick it to the man.
Don't buy that card unless you really like the smell of melting plastic and having a miniature jet engine sitting next to you.

Or you plan to watercool :)

Hamburger Test
Jul 2, 2007

Sure hope this works!

FrickenMoron posted:

I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card, but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra.

DA:I pushed my 4670K @ 4.2GHz to 100%.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

FrickenMoron posted:

I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card, but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra.

It's not the CPU from what I can tell. Here are some relevant benchmarks.

So at 1280×720 with a 980, there's a small difference between the Intel CPUs. Of course, the AMD CPUs are absolutely in the shitter, but if you check closer to the bottom of the page, the Intel and AMD CPUs end up performing pretty much the same at Ultra settings - clearly not a CPU bottleneck, as the AMD ones were demonstrably weaker when the GPU bottleneck was removed.

At 1920×1200, the Geforce 970 is at 49 FPS average on Ultra with 2×MSAA. This is with a 4770K at stock, as noted on the first page.

tl;dr - the 970 not doing a smooth 60 FPS in this game at Ultra is expected and normal

HalloKitty fucked around with this message at 17:41 on Jan 29, 2015

sauer kraut
Oct 2, 2004

FrickenMoron posted:

I'm playing at 1080p, so yeah. Maybe I'm just expecting a bit too much from my card, but I certainly can't get a steady 60 fps in DA Inquisition with everything on ultra.

In benchmarks a 980 did 55 fps at 1080p in DA:I on the Ultra preset, so don't worry about it.

^^ yeah

FrickenMoron
May 6, 2009

Good game!
Thanks guys, I'll stick with my 970 and hope for some free stuff from Nvidia. You guys have no idea what kind of uproar this issue is currently causing on all the major gaming/hardware sites here. I'll check how much I can OC my CPU before it crashes and fiddle with the voltage a bit, I guess.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
People keep complaining about the 970's VRAM issue, but isn't the 660/660ti like that too, with the last 512MB of VRAM being much slower? It's the main reason I want to upgrade from my 660ti: VRAM overclocking only does so much, especially in newer games.

Where's the $249 960Ti at Nvidia? :f5:

Bleh Maestro
Aug 30, 2003

HalloKitty posted:

It's not the CPU from what I can tell. Here are some relevant benchmarks.

So at 1280×720 with a 980, there's a small difference between the Intel CPUs. Of course, the AMD CPUs are absolutely in the shitter, but if you check closer to the bottom of the page, the Intel and AMD CPUs end up performing pretty much the same at Ultra settings - clearly not a CPU bottleneck, as the AMD ones were demonstrably weaker when the GPU bottleneck was removed.

At 1920×1200, the Geforce 970 is at 49 FPS average on Ultra with 2×MSAA. This is with a 4770K at stock, as noted on the first page.

tl;dr - the 970 not doing a smooth 60 FPS in this game at Ultra is expected and normal

I play 1440p but keep everything ultra except turn off MSAA. Can't tell a difference at all but maybe it's more noticeable at lower rez.

sauer kraut
Oct 2, 2004

Bleh Maestro posted:

I play 1440p but keep everything ultra except turn off MSAA. Can't tell a difference at all but maybe it's more noticeable at lower rez.

A huge part of the performance hit is probably from the 8x Ultra AA.

1gnoirents
Jun 28, 2014

hello :)

spasticColon posted:

People keep complaining about the 970's VRAM issue, but isn't the 660/660ti like that too, with the last 512MB of VRAM being much slower? It's the main reason I want to upgrade from my 660ti: VRAM overclocking only does so much, especially in newer games.

Where's the $249 960Ti at Nvidia? :f5:

Yes, but it was advertised as such (pretty much). That's the real problem here.

That AMD ad made me :lol:. I knew it was coming, but I figured they'd use it for 390/380 ads. I also thought it'd be... slightly... more muted than that. Something like "R9 390X, 4 (actual) GB." But hey.

Though a reference 290 would be a solid sidestep at best. That last 512MB is real loud.

You can get an Asus 290X for like $270 after rebates today on Newegg :wow:
edit: wow, those are some insanely bad reviews on Newegg

1gnoirents fucked around with this message at 00:08 on Jan 30, 2015

HERAK
Dec 1, 2004
It would be illuminating if Nvidia manufactured a 970 to the originally advertised specs and let the tech press benchmark it, to show once and for all what the performance difference is.

sauer kraut
Oct 2, 2004

1gnoirents posted:

You can get an Asus 290X for like $270 after rebates today on Newegg :wow:
edit: wow, those are some insanely bad reviews on Newegg

quote:

Load up DA Inquisition and a minute into it what's that sound? Is someone cooking hot pockets? Smells like someone is burning styrofoam popcorn. Alright new card I read the reviews and knew this card can get hot but didn't think it would be a problem on a new card. Checked the temps and it was at 92c. Felt my exhaust and it was a heater blowing hot air
There's a reason AMD slashed the prices for Hawaii down to the bone.

Betty
Apr 14, 2008
I'm super confused. Can some R9 290s be used in CrossFire without a bridge? Is this a thing?
This card, to be specific.

Betty fucked around with this message at 01:18 on Jan 30, 2015

Kazinsal
Dec 13, 2011



Yeah, the 290 and 290X don't need bridges. They do their CrossFire talk over the PCIe bus, which is upwards of 16x faster than an external CrossFire bridge.

fletcher
Jun 27, 2003

ken park is my favorite movie

Cybernetic Crumb
How come nvidia cards don't do that?

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

fletcher posted:

How come nvidia cards don't do that?

They can't charge you for their new LED bridges if they do that!

Kazinsal
Dec 13, 2011



My guess? SLI has been around since 1998 as a feature on high-end 3dfx Voodoo cards. That's where Nvidia acquired the technology and acronym from. It was designed for PCI, and is a relic of the era where the entire system bus peaked at 533 MB/s. Crossfire was designed for PCIe, and even PCIe 1.x had a 16-lane slot speed of 4 GB/s.

Crossfire's come a fairly long way since it first came out -- originally you needed a more expensive "master" card as well as the regular card, and the bridge was a Y-connector DVI dongle. There was communication using the PCIe bus, but all the actual image data was transmitted from the secondary card to the primary one via the DVI dongle. Crossfire 2 ran straight over the PCIe bus. CrossFireX switched to bridge chaining so they wouldn't have to basically freeze the entire system to take control of the PCIe bus while they coordinated four cards.
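For reference, the bus speeds in question work out like this: per-direction PCIe bandwidth is lanes times per-lane transfer rate times encoding efficiency (8b/10b for gen 1/2, 128b/130b for gen 3). A quick sketch; the ~0.9 GB/s figure for an external CrossFire bridge is an assumption from period reporting, not something stated above:

```python
# Per-direction PCIe bandwidth in GB/s:
# lanes * (GT/s per lane) * encoding efficiency / 8 bits-per-byte.

def pcie_bandwidth_gbs(lanes, gt_per_s, encoding_efficiency):
    return lanes * gt_per_s * encoding_efficiency / 8

gen1_x16 = pcie_bandwidth_gbs(16, 2.5, 8 / 10)     # 4.0 GB/s, as noted above
gen3_x16 = pcie_bandwidth_gbs(16, 8.0, 128 / 130)  # ~15.75 GB/s

# vs. an external CrossFire bridge, commonly cited around 0.9 GB/s:
speedup = gen3_x16 / 0.9  # roughly 17x, in line with the "upwards of 16x" claim
```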

Bleh Maestro
Aug 30, 2003

spasticColon posted:

People keep complaining about the 970's VRAM issue, but isn't the 660/660ti like that too, with the last 512MB of VRAM being much slower? It's the main reason I want to upgrade from my 660ti: VRAM overclocking only does so much, especially in newer games.

Where's the $249 960Ti at Nvidia? :f5:

Since the 960 is a full GM206, is a 960 Ti going to be possible on GM204 without gimping it even worse than the 970?

BurritoJustice
Oct 9, 2012

Bleh Maestro posted:

Since the 960 is a full GM206, is a 960 Ti going to be possible on GM204 without gimping it even worse than the 970?

It'll likely just be a cut-down GM204, with 3/4 of a 980's shaders instead of the 13/16 that the 970 has. Maybe 5/8? Either way it will fit nicely above a 960 and below a 970.

sauer kraut
Oct 2, 2004
How many harvested GM204s that don't make the 970 cut can there be in a mature 28nm process? Aren't those already being shipped to partners as the 970M?
I don't think there'll be anything beyond a pointless 4GB model until 970 sales take a nosedive.

Mr.PayDay
Jan 2, 2004
life is short - play hard

fletcher posted:

Your post convinced me to proceed with my plan to upgrade my 4GB 770 to 970 SLI. I'm gaming at 1080p and I definitely want that sweet sweet FPS for my shiny new G-Sync monitor.

Do it! It is just an amazing gaming experience!

So Nvidia fucked the specs up. They lied. This is infuriating.
We all agree here.
But suddenly we should pretend the 970s are bad, because everyone plays at 4K and is playing Far Cry 4 all day with FPS drops to 0, or everyone plays other crappy ports and games with poor memory allocation and VRAM management.
So we forget that overclocking a 970 allows benchmark results that even pass stock 980 results, even with only 56 ROPs, 1792 KB of L2 cache, and only a 224-bit bus for the 3.5 GB VRAM partition.
Meanwhile I am wondering where I am affected while rocking Shadow of Mordor (a game where the 290X outperforms the 970) at 2560×1440 on Ultra with an average of 98 frames (with "drops" to 49 frames according to the benchmark).

Nvidia may suffer, but the 970 is still a thing of beauty at Full HD 1080p and at 1440p.
Stay classy, AMD, and release something to compete.
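To put a rough number on the ROP deficit itself: theoretical pixel fillrate is just ROP count times core clock. A back-of-envelope sketch; the 1178 MHz reference boost clock is assumed here for illustration:

```python
# Theoretical pixel fillrate = ROPs * core clock.
# 1178 MHz is the GTX 970's reference boost clock, assumed for illustration.

def fillrate_gpix_s(rops, clock_mhz):
    return rops * clock_mhz / 1000  # GPixel/s

actual = fillrate_gpix_s(56, 1178)      # ~66 GPixel/s with the real ROP count
advertised = fillrate_gpix_s(64, 1178)  # ~75 GPixel/s as originally listed
```

Real workloads rarely saturate the ROPs, which is consistent with the corrected spec barely moving the benchmark numbers.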

Mr.PayDay fucked around with this message at 03:05 on Jan 30, 2015

GokieKS
Dec 15, 2012

Mostly Harmless.

Kazinsal posted:

My guess? SLI has been around since 1998 as a feature on high-end 3dfx Voodoo cards. That's where Nvidia acquired the technology and acronym from. It was designed for PCI, and is a relic of the era where the entire system bus peaked at 533 MB/s. Crossfire was designed for PCIe, and even PCIe 1.x had a 16-lane slot speed of 4 GB/s.

Crossfire's come a fairly long way since it first came out -- originally you needed a more expensive "master" card as well as the regular card, and the bridge was a Y-connector DVI dongle. There was communication using the PCIe bus, but all the actual image data was transmitted from the secondary card to the primary one via the DVI dongle. Crossfire 2 ran straight over the PCIe bus. CrossFireX switched to bridge chaining so they wouldn't have to basically freeze the entire system to take control of the PCIe bus while they coordinated four cards.

nVidia Scalable Link Interface has nearly no relation to 3dfx Scan Line Interleave beyond the acronym and being a way to use multiple GPUs, and was most definitely not designed for PCI. It was introduced in 2004, for PCIe cards.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

BurritoJustice posted:

It'll likely just be a cut-down GM204, with 3/4 of a 980's shaders instead of the 13/16 that the 970 has. Maybe 5/8? Either way it will fit nicely above a 960 and below a 970.

That's what I'm hoping for, but that probably means the 960Ti will only be a 192-bit 3GB VRAM card.

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

spasticColon posted:

That's what I'm hoping for, but that probably means the 960Ti will only be a 192-bit 3GB VRAM card.

As long as there is no ROP fuckery, that'll do. The GTX 960 clearly beats the GTX 660 even though the latter theoretically has 25% more memory bandwidth. A 960Ti 4GB with 3GB through a 192-bit mode and the last 1GB through 64-bit should still be enough for 1080p.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I wasn't planning on upgrading to a GTX 970 anytime soon, but I'm so tempted to exploit all the rich kids around here that have more money than sense. Sure, they'll still be richer than me, but I'll have a great card for maybe $40 more than the eBay price of my GTX 680, despite that card being two generations old. It just means I'll hold off on an upgrade for another 3 years instead of 1.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
https://www.youtube.com/watch?v=spZJrsssPA0

rednecked_crake
Mar 17, 2012

srsly who wants to play this lamer?
Is this GTX 970 business even a thing? I've been checking out YouTube videos of people recording their cards pushing 4GB and I've barely seen anything amiss.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

That was legitimately hysterical.

UHD
Nov 11, 2006


HoboWithAShotgun posted:

Is this GTX 970 business even a thing? I've been checking out YouTube videos of people recording their cards pushing 4GB and I've barely seen anything amiss.

Other than an advertising thing, not really. Unless for some reason your 970 spontaneously started running slower once you learned your card has fewer ROPs than you thought. It's still an excellent piece of hardware.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
That clip was better than Downfall.

Floor is lava
May 14, 2007

Fallen Rib

I'm dying. I can't breathe.


Mr.PayDay
Jan 2, 2004
life is short - play hard

This is amazing, I am crying :lol:
