sauer kraut
Oct 2, 2004
No regrets about my 190€ GTX 960. I play old steam stuff at vsynced 60 fps and new stuff at 30fps (1080p obviously)
It's... perfectly adequate :confused:
As soon as GTA5 and the fixed GOTY version of Batman are on sale for 3.99 I'll buy a Pascal or whatever without thinking twice. Maybe I'll get a free Witcher game again.

Riso
Oct 11, 2008

by merry exmarx
There's nothing wrong per se; it's just that buying a 2GB card is not a good idea anymore, and the 4GB 960s cost like 250€ while 280X/380s start at 235€.

BurritoJustice
Oct 9, 2012

Riso posted:

http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-960-g1-gaming-4gb-review,1.html shows a massive difference of one fps in most games.
Your link has like six fps at best.

It's mostly margin of error. I am not sure I'd bother, with other more powerful cards around the same price point.

You're comparing average framerates, where the difference is minor. The improvements in minimum framerates and frametimes are large.

Base model 380s are also 2GB; for both cards you need to pay more for 4GB. In Australia at least, 960s are cheaper than 380s.
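The average-vs-minimum point is easy to show with a toy script. The frametime numbers below are made up, but they illustrate how two traces with near-identical average FPS can have wildly different worst-case behaviour:

```python
# Two hypothetical frametime traces (milliseconds per frame).
# Same average FPS, very different worst-case behaviour.
smooth = [16.7] * 100
stutter = [14.0] * 95 + [70.0] * 5  # mostly fast, with occasional long frames

def avg_fps(frametimes_ms):
    # Average FPS = total frames / total seconds rendered.
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

def worst_percentile_fps(frametimes_ms, pct=99):
    # FPS implied by the pct-th percentile (slowest) frametime.
    ordered = sorted(frametimes_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return 1000.0 / ordered[idx]

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    print(name, round(avg_fps(trace), 1), round(worst_percentile_fps(trace), 1))
```

Both traces average roughly 60 FPS, but the stuttery one drops to ~14 FPS at its 99th-percentile frametime, which is exactly the kind of difference an average-FPS bar chart hides.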

Riso
Oct 11, 2008

by merry exmarx
I am comparing 3/4GB pricing in my post. Must be the first time something down under's cheaper than in Europe.

1K Dollar Chair
Nov 20, 2014
yeah well one 960 might be bad but two of them are rad, sli those thangs #frametimings

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

Manifest posted:

I need a new card and am on a roughly $200 budget.
My GTX 760 is dying.

Is this the best buck for my bang at that price?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487133
Anyone have negative experiences with heat due to the size?
I could sell you my old GTX 680 4GB that's actually faster than the GTX 760 in performance for maybe $140 + shipping. It was previously goon-owned and I got it at a great price so I figure I should pay it forward somehow.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

1K Dollar Chair posted:

yeah well one 960 might be bad but two of them are rad, sli those thangs #frametimings

Don't those only sometimes beat a 290?

Also SLI frametimes are still pretty bad, they need something to catch up to PCIe CF in that regard.

Truga
May 4, 2014
Lipstick Apathy

xthetenth posted:

Also SLI frametimes are still pretty bad, they need something to catch up to PCIe CF in that regard.

Isn't that what NVLink is supposed to fix, for only a $200 premium? :v:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Truga posted:

Isn't that what NVLink is supposed to fix, for only a $200 premium? :v:

Yaaaaaaaaay.

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

1K Dollar Chair posted:

yeah well one 960 might be bad but two of them are rad, sli those thangs #frametimings

Don't SLI 960s barely beat a 970 for a good $100 more in price?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

The Iron Rose posted:

Don't SLI 960s barely beat a 970 for a good $100 more in price?

Pretty sure he was joking

penus penus penus
Nov 9, 2014

by piss__donald
Pretty interesting to see how doubling the reference ram actually made a difference under normal use for once, for some card, finally.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Makes sense, though; everyone was expecting higher VRAM requirements after seeing the console specs. I was surprised they went whole hog right away, but I don't know why that was surprising; there's no good reason they shouldn't have. nVidia was miserly compared to AMD (hence the less qualified longevity of the 7970 GHz edition vs. the old 2GB 680s/770s), but at this point it's not really hard for them to add more memory if needed, so I'm kinda glad the envelope got pushed early and maybe we don't have to deal with that for a few generations, at least.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

HalloKitty posted:

Pretty sure he was joking

Oh yeah he was.


THE DOG HOUSE posted:

Pretty interesting to see how doubling the reference ram actually made a difference under normal use for once, for some card, finally.

It made a difference for a few other cards, like the 8800 GT 256 MB and 8800 GTS 320 MB.

penus penus penus
Nov 9, 2014

by piss__donald

xthetenth posted:

Oh yeah he was.


It made a difference for a few other cards, like the 8800 GT 256 MB and 8800 GTS 320 MB.

Ah yeah I only know about stuff ~600 series and on

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

xthetenth posted:

Oh yeah he was.


It made a difference for a few other cards, like the 8800 GT 256 MB and 8800 GTS 320 MB.

IIRC, there was a 2900XT with 1GB GDDR4 that got a performance boost, and the HD 3000/4000 series were much better performers with 1GB as well. I think the 7800/7900 geforce cards also enjoyed 512MB over 256MB and the 9800XT was potent enough that 256MB made a difference. Something like the HD6450 could use 1GB but only if it was GDDR5, and Fermi got some performance gains moving from 896MB to 1.25GB. I think in general AMD, due to bus width and memory advantages, also got the best use out of larger amounts of memory. This might also be why AMD fans think AMD caters more to customers over the long term, the high bus width and higher memory capacity age fairly well but are more artifacts of AMD specific uarch design than any concern towards customers.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

FaustianQ posted:

IIRC, there was a 2900XT with 1GB GDDR4 that got a performance boost, and the HD 3000/4000 series were much better performers with 1GB as well. I think the 7800/7900 geforce cards also enjoyed 512MB over 256MB and the 9800XT was potent enough that 256MB made a difference. Something like the HD6450 could use 1GB but only if it was GDDR5, and Fermi got some performance gains moving from 896MB to 1.25GB. I think in general AMD, due to bus width and memory advantages, also got the best use out of larger amounts of memory. This might also be why AMD fans think AMD caters more to customers over the long term, the high bus width and higher memory capacity age fairly well but are more artifacts of AMD specific uarch design than any concern towards customers.

Generally tending towards higher memory capacity for a wide bus and not leaving old generations to rot as far as drivers go are at the heart of things, and the past few generations of AMD cards have been fantastic in the long run because of it.

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

xthetenth posted:

Generally tending towards higher memory capacity for a wide bus and not leaving old generations to rot as far as drivers go are at the heart of things, and the past few generations of AMD cards have been fantastic in the long run because of it.

I got the opposite impression, at least with my 4870 which is the single card I kept using the longest.

Two years out, every new driver would break something, but I'd still need to upgrade for certain games. One driver introduced some minor rendering glitches into BF2; those were never fixed, but they were minor enough I didn't downgrade. At some point it started rapidly cycling between 2D and 3D clocks, which would cause artifacts in video and in some games that weren't strenuous enough; I had to manually set profiles from then on.

The physical cards themselves, ignoring drivers, do seem fine though, even in the long run.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Desuwa posted:

I got the opposite impression, at least with my 4870 which is the single card I kept using the longest.

Two years out, every new driver would break something, but I'd still need to upgrade for certain games. One driver introduced some minor rendering glitches into BF2; those were never fixed, but they were minor enough I didn't downgrade. At some point it started rapidly cycling between 2D and 3D clocks, which would cause artifacts in video and in some games that weren't strenuous enough; I had to manually set profiles from then on.

The physical cards themselves, ignoring drivers, do seem fine though, even in the long run.

That one's VLIW, and that reputation seems really to be established on the 7000 series and later GCN chips.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Manifest posted:

I need a new card and am on a roughly $200 budget.
My GTX 760 is dying.

Is this the best buck for my bang at that price?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487133
Anyone have negative experiences with heat due to the size?

Check EVGA's website for their B-Stock stuff - they frequently stock some newer generation cards and you might be able to score a 780 3GB Classified for like $250, I think even 770 2GB models in the $180-220 range. I've got it set to notify me for a couple 970 models, one being the SC for $249, since I'd like to upgrade and EVGA has been pretty great over the years.

funkymonks
Aug 31, 2004

Pillbug
Welp I couldn't handle the coil whine on my evga 980ti so I had to send it back. I couldn't hear it on air since the fan was defective and very noisy but once I got a waterblock on it the whine was very annoying. At least there was an actual defect with the fan so I feel less bad about returning it.

Can anyone anecdotally suggest a card less likely to have the issue?

The Iron Rose
May 12, 2012

:minnie: Cat Army :minnie:

Ozz81 posted:

Check EVGA's website for their B-Stock stuff - they frequently stock some newer generation cards and you might be able to score a 780 3GB Classified for like $250, I think even 770 2GB models in the $180-220 range. I've got it set to notify me for a couple 970 models, one being the SC for $249, since I'd like to upgrade and EVGA has been pretty great over the years.

Gotta be honest EVGA's b-stock supply is hella tempting. Trouble is I already have a 970, and SLI is a pain in the rear end. I'm pretty sure I can wait another 12 months for pascal... but drat that's a fantastic price for a 970.

At $200 it'd be a no brainer though.

Filthy Monkey
Jun 25, 2007

funkymonks posted:

Can anyone anecdotally suggest a card less likely to have the issue?
Coil whine can happen to cards from any manufacturer. My old Asus 570 would sing every time the menu on a stalker game came up. Pretty much luck of the draw.

Bleh Maestro
Aug 30, 2003
Coil whine is weird. My 970 sings on the witcher 3 menus for some reason but that's about it.

E: Kitguru got the Fury non-X and Fury Nano specs. Looks like the nano is full fat Fiji except maybe lower clock speed. Might be pretty nice if it's like $500ish.

Bleh Maestro fucked around with this message at 08:28 on Jul 5, 2015

Truga
May 4, 2014
Lipstick Apathy
Uhhh. I'm reading rumours of nvidia lowering image quality through their default driver settings to get ~10% higher FPS in games.

Example: https://youtu.be/a2IIM9fncqc

Anyone know anything about that? Cause that is hilariously lovely, if true, since I'm guessing that's what gets used in benchmarks?

Unfortunately, I don't have an nvidia on me at the moment, so I can't test myself :<


edit: and it'd account for people seeing better image quality I imagine?

Truga fucked around with this message at 12:37 on Jul 5, 2015

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Truga posted:

Uhhh. I'm reading rumours of nvidia lowering image quality through their default driver settings to get ~10% higher FPS in games.

Example: https://youtu.be/a2IIM9fncqc

Anyone know anything about that? Cause that is hilariously lovely, if true, since I'm guessing that's what gets used in benchmarks?

Unfortunately, I don't have an nvidia on me at the moment, so I can't test myself :<


edit: and it'd account for people seeing better image quality I imagine?

Wouldn't be the first time drivers have been "optimised" in some slightly suspicious way.

HalloKitty fucked around with this message at 14:05 on Jul 5, 2015

Anime Schoolgirl
Nov 28, 2002

Bleh Maestro posted:

Coil whine is weird. My 970 sings on the witcher 3 menus for some reason but that's about it.

E: Kitguru got the Fury non-X and Fury Nano specs. Looks like the nano is full fat Fiji except maybe lower clock speed. Might be pretty nice if it's like $500ish.
I was hoping the Nano would be an even lower binned part :ohdear:

repiv
Aug 13, 2009


Tested it myself, albeit on GM204 instead of GM200:

https://a.pomf.cat/yauvks.png
https://a.pomf.cat/inymul.png
https://a.pomf.cat/lnwcng.png

:shrug:

Truga
May 4, 2014
Lipstick Apathy

Thanks, looks like it's just a dumb guy who can't set poo poo up.

Good.


vvv: This makes sense, thanks.

Truga fucked around with this message at 13:52 on Jul 5, 2015

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Truga posted:

Uhhh. I'm reading rumours of nvidia lowering image quality through their default driver settings to get ~10% higher FPS in games.

Example: https://youtu.be/a2IIM9fncqc

Anyone know anything about that? Cause that is hilariously lovely, if true, since I'm guessing that's what gets used in benchmarks?

Unfortunately, I don't have an nvidia on me at the moment, so I can't test myself :<


edit: and it'd account for people seeing better image quality I imagine?
nVidia's drivers have a Texture Filtering Quality setting that ranges from High Performance to High Quality, and it controls the number of optimizations applied and potentially their strength. I think the artifacts people are noticing are related to the Anisotropic Sample Optimization feature, which is applied when Texture Filtering Quality is set to Performance or High Performance. I suspect all this hullabaloo is because of this setting unexpectedly changing when people manipulate overall graphics performance settings via the "EZ Mode" interface.

Edit: The more I read, it seems like after the video fuckery and some driver bugs with AF being stuck at 0x after changing card brands, this all comes down to people being upset that driver optimizations are applied when the "EZ Mode" interface is set to "Let the 3D application decide." Honestly this seems silly to me, driver optimizations should always be applied unless you specifically disable them, and if you care about your settings then use the Advanced mode to set them to what you want. When I switched to a Geforce I felt that default quality settings were kind of ugly, but I didn't bitch because you can just go into the drivers and force it to High Quality if you want.

Alereon fucked around with this message at 14:02 on Jul 5, 2015
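As a rough sketch of what that slider does (the optimization names below are taken from the control panel labels mentioned above; the exact mapping is a guess for illustration, not actual driver logic):

```python
# Rough reconstruction of how the Texture Filtering Quality slider
# might map to the individual optimizations named in the control panel.
# This is an illustration only, NOT NVIDIA's actual driver behaviour.
SLIDER_TO_OPTS = {
    "High Quality":     set(),                                  # everything off
    "Quality":          {"trilinear_opt"},
    "Performance":      {"trilinear_opt", "aniso_sample_opt"},
    "High Performance": {"trilinear_opt", "aniso_sample_opt",
                         "aniso_mip_filter_opt"},
}

def optimizations_for(slider):
    # Return the set of filtering shortcuts active at a slider position.
    return SLIDER_TO_OPTS[slider]

print(sorted(optimizations_for("Performance")))
```

The point being: the anisotropic sample optimization people are noticing only kicks in at the Performance/High Performance end, which is why an "EZ Mode" preset silently moving that slider would explain the artifacts.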

BurritoJustice
Oct 9, 2012

Anisotropic filtering is one of the least intensive image enhancements for modern GPUs, so cutting corners there would be incredibly stupid.

E: Also what Alereon said, the basic "High quality ----- High performance slider" changes a whole load of universal settings that override what you set in game.

BurritoJustice fucked around with this message at 13:53 on Jul 5, 2015

pr0zac
Jan 18, 2004

~*lukecagefan69*~


Pillbug
Ed: awful.app pocket post

Truga
May 4, 2014
Lipstick Apathy
So this showed up: http://geizhals.eu/asus-strix-r9fury-dc3-4g-gaming-90yv08k0-m0nm00-a1291570.html


Fake edit:
http://www.kitguru.net/components/graphic-cards/anton-shilov/asus-readies-strix-radeon-r9-fury-with-directcu-iii-cooler/

Anime Schoolgirl
Nov 28, 2002

Sapphire's GPU tweaker will have Fury X/Fury voltage modification in the next version, so that's something to look out for at least

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Bleh Maestro posted:

Kitguru got the Fury non-X and Fury Nano specs. Looks like the nano is full fat Fiji except maybe lower clock speed. Might be pretty nice if it's like $500ish.

If the Nano is full Fiji and is priced considerably lower @ ~450-500$, it might make a weird situation where no one buys the Fury X and everyone just buys Nanos with an aftermarket waterblock, unlocks the voltage, and overclocks like hell for something 50-75$ cheaper.

Anime Schoolgirl
Nov 28, 2002

FaustianQ posted:

If the Nano is full Fiji and is priced considerably lower @ ~450-500$, it might make a weird situation where no one buys the Fury X and everyone just buys Nanos with an aftermarket waterblock, unlocks the voltage, and overclocks like hell for something 50-75$ cheaper.
This situation sounds eerily familiar somehow

I'm hoping it's a further cut-down Fiji or simply low-grade cards that don't overclock well mostly because they'd be priced at around ~400 instead of ~500-550

Anti-Hero
Feb 26, 2004
I'm not really satisfied with the performance of my GTX980 (oc'ed to ~1450 boost) at 1440P, especially in Witcher 3 where I have to really tweak too many sliders to achieve a consistent 60FPS. I end up spending more time fiddling than playing. I have a couple of questions/points to consider:

1) For games, is there any reason to buy a Titan-X over a 980Ti?

2) If noise isn't a factor is it OK to just stick with a reference design? I use headphones (rarely hear fan noise) and live in a cold state so my loaded GPU temps rarely get above 70C.

3) Would prefer not to SLI but if grabbing another GTX980 and going that route is better I would consider it, it might require a PSU upgrade as I'm guessing my 660W unit might be pushing it.

Budget isn't a huge concern; my last GPU upgrade pre-980 was SLI 580s that lasted me close to 3 years. While I would prefer to stick with single-card solutions, am I a case where SLI is really the only way forward? The only way I see a single card upgrade worth it is if I OC the poo poo out of a Ti/Titan-X.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

FaustianQ posted:

If the Nano is full Fiji and is priced considerably lower @ ~450-500$, it might make a weird situation where no one buys the Fury X and everyone just buys Nanos with an aftermarket waterblock, unlocks the voltage, and overclocks like hell for something 50-75$ cheaper.

Would that work with the Nano only having an 8-pin connector to draw on? I'm thinking it might work with some PSUs and not with others. Also I wonder about price points: if we have Fury X at $650, Fury at $550 and Nano at $500, don't they seem way too close together? So what would Nano have to be at its highest? $450? $400? But if you can overclock the Nano to nearly Fury X levels then no one in their right mind will buy a plain old Fury.

It will be interesting to see how this shakes out.

Truga
May 4, 2014
Lipstick Apathy

AVeryLargeRadish posted:

Would that work with the Nano only having an 8-pin connector to draw on?

295x2 can draw like 650W from 2x8pin, so I don't see the issue :v:
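For reference, the nominal numbers work out like this (75W from the slot, 75W per 6-pin, 150W per 8-pin are the PCIe spec ratings; as the 295X2 shows, real cards can pull well beyond them):

```python
# Nominal PCIe power budget in watts, per the spec ratings.
# Real cards can and do exceed these, as the 295X2 example shows.
SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def nominal_budget(connectors):
    # Slot power plus each auxiliary connector's rated draw.
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(nominal_budget(["8pin"]))          # Nano-style single 8-pin
print(nominal_budget(["8pin", "8pin"]))  # 295X2-style dual 8-pin
```

So a single-8-pin Nano nominally has 225W to work with, while a dual-8-pin 295X2 is rated for 375W yet draws far more in practice, which is why a heavily overvolted Nano would depend on how far out of spec a given PSU tolerates.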


AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Anti-Hero posted:

I'm not really satisfied with the performance of my GTX980 (oc'ed to ~1450 boost) at 1440P, especially in Witcher 3 where I have to really tweak too many sliders to achieve a consistent 60FPS. I end up spending more time fiddling than playing. I have a couple of questions/points to consider:

1) For games, is there any reason to buy a Titan-X over a 980Ti?

2) If noise isn't a factor is it OK to just stick with a reference design? I use headphones (rarely hear fan noise) and live in a cold state so my loaded GPU temps rarely get above 70C.

3) Would prefer not to SLI but if grabbing another GTX980 and going that route is better I would consider it, it might require a PSU upgrade as I'm guessing my 660W unit might be pushing it.

Budget isn't a huge concern; my last GPU upgrade pre-980 was SLI 580s that lasted me close to 3 years. While I would prefer to stick with single-card solutions, am I a case where SLI is really the only way forward? The only way I see a single card upgrade worth it is if I OC the poo poo out of a Ti/Titan-X.

1) No, it's like 5% faster on average.

2) You should be OK, might not be able to OC quite as high as the non-reference cards but it would be a pretty small difference.

3) Almost certainly the best route is to sell the 980 and buy a 980 Ti, SLI is a PITA. Going by benchmarks you should see 80-ish FPS with a 980 Ti at 1440p with everything cranked.

Truga posted:

295x2 can draw like 650W from 2x8pin, so I don't see the issue :v:

Ahhh, I was not sure how far out of spec you could go before things start getting weird, apparently waaaay out there. :v:
