Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD
I like the difference in the reviews.

Overclock3D: AMD pulled a rabbit out of a hat.

Guru3D: Meh, look for cheap 290/Xs.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Yeah, both are true in a way. The 390X is a high-performance card positioned well for that performance. The thing is, so is the 290X, and it's positioned for the performance of a throttling, loud mess of a card.

Sucks for AMD they had so much inventory in the channel they couldn't do the rebrand when the 980 came out to dampen the enthusiasm for it.

Oh, and Moral_Hazard, in case you missed my edit on the last post:
If you want to check out a noise comparison video, computerbase.de has one: search for "Elf Nvidia GeForce GTX 970 im Vergleich", go down to the nav bar, click Seite 1/6, and then Lautstärke & Temperatur. (Sorry, at work and posting from my phone.)

xthetenth fucked around with this message at 16:17 on Jun 18, 2015

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
I really don't get this; these benchmarks/comparisons are a good bit faster than the 290/290X reviews were at launch. Is that all thermal throttling? Is my MSI 290 that overclocks to 1100 actually comparable to a 970?

Truga
May 4, 2014
Lipstick Apathy
Slightly worse at 1080p, slightly better above.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Twerk from Home posted:

I really don't get this; these benchmarks/comparisons are a good bit faster than the 290/290X reviews were at launch. Is that all thermal throttling? Is my MSI 290 that overclocks to 1100 actually comparable to a 970?

Higher frame rate than a reference-clocked 970, probably evenish with, say, a G1 Gaming 970. The 970 will most likely have slightly smoother frames.

For reference, hardware.fr got a Tri-X 290 that held a steady 1000 MHz; their reference one went as low as 845 and probably averaged around 900. The Tri-X is 5% slower than a 970 G1 Gaming and 4% faster than a reference-clocked 970. You've got up to 10% more clock than that Tri-X. Numbers are for 1440p; the 290 probably does a bit worse in comparison at 1080p.
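If you want to sanity-check that against an 1100 MHz card, here's a rough back-of-envelope (a minimal Python sketch, assuming performance scales roughly linearly with core clock, which it doesn't quite):

```python
# Back-of-envelope for the hardware.fr numbers above (1440p), assuming
# performance scales roughly linearly with core clock -- an approximation,
# not a prediction.

trix_290_mhz = 1000          # hardware.fr's Tri-X 290 held this steady
my_290_mhz = 1100            # the overclocked MSI 290 in question

# Performance relative to the Tri-X 290 (= 1.00), from the review:
# the Tri-X is 4% faster than a reference-clocked 970, 5% slower than a 970 G1.
rel_perf = {
    "reference-clocked 970": 1.00 / 1.04,
    "970 G1 Gaming":         1.00 / 0.95,
}

my_perf = my_290_mhz / trix_290_mhz   # ~1.10x the Tri-X, naively

for card, perf in rel_perf.items():
    print(f"1100 MHz 290 vs {card}: ~{my_perf / perf:.2f}x")
# ~1.14x a reference 970, ~1.05x a G1 -- faster than reference, evenish with a G1.
```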

AMD shot themselves in the foot with the 290(X) blower. It literally never got above 947 MHz in that test.

xthetenth fucked around with this message at 16:26 on Jun 18, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo

THE DOG HOUSE posted:

They use it for other games. Very weird to me they'd have it off for BF4, since that was the headline Mantle game.

Not using Mantle sets a level playing field between the two cards. Takes the "but they're using Mantle" card away from the negative nancies. Also, Mantle is being deprecated, so....

repiv
Aug 13, 2009

SwissArmyDruid posted:

Not using Mantle sets a level playing field between the two cards. Takes the "but they're using Mantle" card away from the negative nancies. Also, Mantle is being deprecated, so....

But they did use Mantle in the Sniper Elite test, so it feels more like cherry picking the render path which works best for them in that test.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Running Mantle in Sniper Elite 3 but not BF4 makes me think that they wanted to avoid using it if they still won outright, but weren't willing to turn it off if it meant a loss (also, Sniper Elite is newer, so it may be a real mess for them in DX). Also, Sniper Elite has many more options for Mantle to hide in. Could also be they wanted one game where they win with it off and one where they show it making a huge difference.

I'd say cherry picking, but I'm pretty sure BF4 is faster in Mantle.

xthetenth fucked around with this message at 17:14 on Jun 18, 2015

repiv
Aug 13, 2009

xthetenth posted:

I'd say cherry picking, but I'm pretty sure BF4 is faster in Mantle.

There are a lot of anecdotes out there about Frostbite's Mantle path performing worse than the DX11 path on fast CPUs; nobody seems to know what causes it. Maybe AMD got bit by that issue :shrug:

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

repiv posted:

There are a lot of anecdotes out there about Frostbite's Mantle path performing worse than the DX11 path on fast CPUs; nobody seems to know what causes it. Maybe AMD got bit by that issue :shrug:

That might be it, and it would make sense, since cherry picking and using every advantage is expected from manufacturer benchmarks. I wouldn't know about Mantle render path issues; I've never owned an AMD card.

eggyolk
Nov 8, 2007


R9 390X reviews went up a few hours ago. Doesn't seem to be worth the $50 premium at all, although it uses 30 watts less than the equivalent 290X according to one benchmark.

http://www.guru3d.com/articles-pages/msi-radeon-r9-390x-gaming-8g-oc-review,1.html

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

eggyolk posted:

R9 390X reviews went up a few hours ago. Doesn't seem to be worth the $50 premium at all, although it uses 30 watts less than the equivalent 290X according to one benchmark.

http://www.guru3d.com/articles-pages/msi-radeon-r9-390x-gaming-8g-oc-review,1.html

Pretty much. Aftermarket 290s are priced according to the performance of much worse reference 290s, and it looks like the end of the fire sale on Hawaii cards is coming soon. Heck, I think some 290s are going up in price because the 390 reviews are good coverage of what they can really do.

Also, the AnandTech review makes the 8 GB 390 change make sense. Apparently that lets them move from 2 Gb chips to the more common 4 Gb chips, which also helps with the memory speed boost because the newer chips have better timings.
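If you want the chip math, here's a quick sketch (assuming the usual one-32-bit-GDDR5-chip-per-channel layout on Hawaii/Grenada's 512-bit bus; the layout isn't spelled out in the review itself):

```python
# GDDR5 chip count on a 512-bit Hawaii/Grenada board (a sketch, assuming one
# 32-bit chip per channel, which is the usual layout).

bus_width_bits = 512
chip_width_bits = 32
chips = bus_width_bits // chip_width_bits          # 16 chips

for density_gbit in (2, 4):
    capacity_gbyte = chips * density_gbit // 8     # gigabit chips -> gigabytes
    print(f"{chips} x {density_gbit} Gb chips = {capacity_gbyte} GB")

# 16 x 2 Gb = 4 GB (290/290X), 16 x 4 Gb = 8 GB (390/390X): same chip count,
# double the capacity, just denser chips.
```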

xthetenth fucked around with this message at 18:43 on Jun 18, 2015

Stanley Pain
Jun 16, 2001

by Fluffdaddy
It's almost like AMD threw a bunch of engineers at a problem or two :q:

Tanreall
Apr 27, 2004

Did I mention I was gay for pirate ducks?

~SMcD
I'm surprised that the 300 series lacks bundled games.

Bleh Maestro
Aug 30, 2003
So I was a little bit confused by the announcement and forgot to ask here: are they just putting the Fury X2 inside their little VR box, or will they be selling it separately?

A Fury X2 card would be pretty badass.

repiv
Aug 13, 2009

They're selling it but not until the end of the year.

Bleh Maestro
Aug 30, 2003
Second question I've been trying to figure out: is the 295X2 just two 290Xs, or is there some difference that makes it a "295"?

penus penus penus
Nov 9, 2014

by piss__donald

SwissArmyDruid posted:

Not using Mantle sets a level playing field between the two cards. Takes the "but they're using Mantle" card away from the negative nancies. Also, Mantle is being deprecated, so....

They used it for other games in that same benchmark. Or at least another game; I worded it poorly. Plus, I get what you're saying, but it'd be just as valid to use Mantle for the benchmark imo, since that's what would really be used IRL, and I'd care more about the Mantle results than an improbable comparison.

I'm glad the 300 series is getting somewhat positive words, but I'm taking a very glass-half-empty stance on all of it. Can't get over the 3-year-old release, 10 months late... but anyway, here's hoping the Fury stuff is great.

Moral_Hazard
Aug 21, 2012

Rich Kid of Insurancegram

xthetenth posted:

I'm pretty sure that one's excellent and one of the best 970s. Basically, EVGA half-assed their cooler on release, so their stuff is a mixed bag, with some early coolers having one heatpipe that wasn't actually functional, and so on, and getting beaten by everyone else's coolers. I'd wait for someone who kept paying attention to 970 coolers to be sure of it, though.

If you want to check out a noise comparison video, computerbase.de has one: search for "Elf Nvidia GeForce GTX 970 im Vergleich", go down to the nav bar, click Seite 1/6, and then Lautstärke & Temperatur. (Sorry, at work and posting from my phone.)

Thanks for the site. Google translated the German for me. I'm going to go with the MSI 970. I just want to measure dimensions and whatnot and check the connectors before ordering. But I have a tall tower, so I don't think there will be issues.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

Bleh Maestro posted:

Second question I've been trying to figure out: is the 295X2 just two 290Xs, or is there some difference that makes it a "295"?

Dual 290X card, no special sauce other than a stock cooler that keeps it from throttling all the time.

^^^ Glad I could help. Computerbase is great for those roundups with video of the coolers in action.

xthetenth fucked around with this message at 20:04 on Jun 18, 2015

SwissArmyDruid
Feb 14, 2014

by sebmojo
So much for hoping that AMD can actually beat Nvidia cleanly.

Does anyone know where the gently caress WCCFT is getting their info from? They've got specs on the Fury Nano, saying it's got a 175W TDP?

http://wccftech.com/amd-radeon-r9-nano-detailed-features-fiji-gpu-175w-tdp-single-8pin-connector-sff-design-faster-hawaii/

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Bleh Maestro posted:

Second question I've been trying to figure out: is the 295X2 just two 290Xs, or is there some difference that makes it a "295"?

It does have slightly higher clocks, but there's nothing significant. That liquid cooler does an excellent job of keeping the noise levels down and the cores cool, though.

penus penus penus
Nov 9, 2014

by piss__donald

SwissArmyDruid posted:

So much for hoping that AMD can actually beat Nvidia cleanly.

Does anyone know where the gently caress WCCFT is getting their info from? They've got specs on the Fury Nano, saying it's got a 175W TDP?

http://wccftech.com/amd-radeon-r9-nano-detailed-features-fiji-gpu-175w-tdp-single-8pin-connector-sff-design-faster-hawaii/

I think that's based on the plugs. There was a little quip during the press release about it being twice as efficient per watt (presumably compared to their very own 390X), so it's not a stretch to imagine. Not sure there is any hard data, though.


edit: should have looked at the link but it says it right there

penus penus penus fucked around with this message at 20:24 on Jun 18, 2015

Wistful of Dollars
Aug 25, 2009

The thought of a Fury X2 in an ITX machine is tempting...

I really need to stop being a lazy rear end and sell the cards I have lying around.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

THE DOG HOUSE posted:

I think that's based on the plugs. There was a little quip during the press release about it being twice as efficient per watt (presumably compared to their very own 390X), so it's not a stretch to imagine. Not sure there is any hard data, though.


edit: should have looked at the link but it says it right there

Yeah, it was based on the PCIe spec with regard to the connectors on the card; if I recall, there was a slide from AMD about it.
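The connector arithmetic behind that is simple enough (a quick sketch; the 175 W figure itself is still just WCCFT's claim):

```python
# PCIe power budget for a card with a single 8-pin connector.
# The 175 W TDP is WCCFT's number, not something derivable from the spec.

slot_power_w = 75        # PCIe x16 slot allowance, per spec
eight_pin_w = 150        # one 8-pin PEG connector, per spec

ceiling_w = slot_power_w + eight_pin_w
print(f"Max board power with one 8-pin: {ceiling_w} W")                 # 225 W

claimed_tdp_w = 175
print(f"Headroom if 175 W is accurate: {ceiling_w - claimed_tdp_w} W")  # 50 W
```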

repiv
Aug 13, 2009

Huh, seems Grenada isn't a straight Hawaii rebrand after all. Witcher 3 with Hairworks on runs dramatically faster on 390X than 290X, so at the very least they've beefed up the tessellation engine.

http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/3

EDIT: Actually maybe not, they're using different drivers for the 290X and 390X. Maybe just optimization or a Witcher 3 profile that forces the tess scale down by default.

repiv fucked around with this message at 21:58 on Jun 18, 2015

Gwaihir
Dec 8, 2009
Hair Elf

Ak Gara posted:

I'm assuming you can also overclock past the G1's factory OC? I wonder if the G1's binning makes them better for further user overclocking, or would they be no better or worse due to the silicon lottery?

I have a G1 980 that sits at about 1550 boost. I initially thought it wasn't stable there, but it turned out something was up with the Shadowplay streaming service (nvstream or something like that) that was crashing the gently caress out of games even when I wasn't using Shadowplay. I manually disabled that service and it's been smooth sailing ever since.

Anecdata go!

penus penus penus
Nov 9, 2014

by piss__donald

Gwaihir posted:

I have a G1 980 that sits at about 1550 boost. I initially thought it wasn't stable there, but it turned out something was up with the Shadowplay streaming service (nvstream or something like that) that was crashing the gently caress out of games even when I wasn't using Shadowplay. I manually disabled that service and it's been smooth sailing ever since.

Anecdata go!

Shadowplay is rough on OC in my experience. Same with streaming. If I'm on the edge of stability, turning on Shadowplay will crash it for me.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

repiv posted:

Huh, seems Grenada isn't a straight Hawaii rebrand after all. Witcher 3 with Hairworks on runs dramatically faster on 390X than 290X, so at the very least they've beefed up the tessellation engine.

http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/3

EDIT: Actually maybe not, they're using different drivers for the 290X and 390X. Maybe just optimization or a Witcher 3 profile that forces the tess scale down by default.

Huh, so maybe new TSMC silicon they've been holding onto for the 300 release? At least the 390 has a legitimate performance advantage over a 290.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
It annoys the gently caress out of me that hardocp doesn't use the same settings to compare graphics cards.

edit: ohh further down they do do that.

Ham Sandwiches
Jul 7, 2000

Don Lapre posted:

It annoys the gently caress out of me that hardocp doesn't use the same settings to compare graphics cards.

edit: ohh further down they do do that.

And there are dozens of other sites that do use the same settings, so who cares?

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

It'd be super cool if they did a test of an 8 GB 290X against a 390X and a BIOS-flashed 8 GB 290X.

And honestly this is the same [H] that managed to conclude that a 295X and 980 SLI beating a 980 Ti without framerate dips meant that 6 GB was a "MINIMUM" for 4K gaming, so not the highest standards there.

repiv
Aug 13, 2009

xthetenth posted:

And honestly this is the same [H] that managed to conclude that a 295X and 980 SLI beating a 980 Ti without framerate dips meant that 6 GB was a "MINIMUM" for 4K gaming, so not the highest standards there.

You do have to wonder how they'd notice that discrepancy and then not verify it in a vacuum using TessMark :downs:

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Don Lapre posted:

It annoys the gently caress out of me that hardocp doesn't use the same settings to compare graphics cards.

edit: ohh further down they do do that.

Actually, it's kind of nice to get a different perspective, to see performance at settings you might actually use. They do also have 'apples to apples' comparisons if you want the same settings, yeah.

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

repiv posted:

You do have to wonder how they'd notice that discrepancy and then not verify it in a vacuum using TessMark :downs:

Yeah, that's really promising data; it's like the guy talking about how he didn't follow up on the 970 giving different performance from the 980 in a synthetic. That sort of thing is huge: having an explanation for something nobody else does.

On the subject of things I really want to see regarding the 290/390s: 290X CF vs 290X 8 GB CF. Really see what it takes to make them diverge.

KS
Jun 10, 2003
Outrageous Lumpwad
Gigabyte G1 980 Ti vs Gigabyte G1 970s in SLI.

Happy to not deal with SLI for a while, but I dunno, I kinda expected better.



The 980 Ti only hit 62 C during the test though.

KS fucked around with this message at 03:21 on Jun 19, 2015

penus penus penus
Nov 9, 2014

by piss__donald

KS posted:

Gigabyte G1 980 Ti vs Gigabyte G1 970s in SLI.

Happy to not deal with SLI for a while, but I dunno, I kinda expected better.



Time to put one of those 970s back in and crush the PhysX test?

You expected a 980 Ti to, like, crush SLI 970s in a synthetic?

I think it's crazy that a card as good as the 970, in SLI, is almost dead even with a 980 Ti, a card that "only" costs twice as much. That's like an even performance:dollar ratio all the way to the top, which is unheard of. And in reality the 980 Ti is better in so many practical ways that you could even say the 980 Ti is actually a better value choice than the 970.
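A rough sketch of that perf-per-dollar point, using approximate 2015 MSRPs (assumed: ~$329 for a 970, ~$649 for a 980 Ti) and treating 970 SLI as roughly even with one 980 Ti in synthetics:

```python
# Rough perf/$ comparison: 2x GTX 970 in SLI vs one 980 Ti.
# Prices are approximate 2015 MSRPs; performance parity in synthetics is
# assumed from the discussion above, not measured here.

price_970 = 329
price_980ti = 649

sli_cost = 2 * price_970             # ~$658
relative_perf = 1.0                  # ~dead even in a synthetic

ratio = (relative_perf / sli_cost) / (relative_perf / price_980ti)
print(f"970 SLI: ${sli_cost}, 980 Ti: ${price_980ti}")
print(f"Perf/$ of SLI relative to the 980 Ti: {ratio:.2f}")   # ~0.99
# Essentially flat perf/$ from the midrange to the flagship, which is unusual.
```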

Also, a little jealous you snagged that card. I haven't seen any aftermarket 980 Tis in stock, period.

penus penus penus fucked around with this message at 03:30 on Jun 19, 2015

xthetenth
Dec 30, 2012

Mario wasn't sure if this Jeb guy was a good influence on Yoshi.

It really is a better choice. First, it's a single card, so you don't have any of the frame time issues with SLI (if you thought the 290X was bad, wait till you see two-card graphs). Second, no profiles. Third, you don't have to worry about VRAM size or that weird partition. Fourth, you only have to pay for the aftermarket model once if you want a fancy high-performance cooler and a quieter part. Fifth is much better power use. All that for the same performance and a similar price? There is no dual-card solution that makes sense other than multiple 980 Tis or maybe four Titan Xs, and that includes the 295X2, even when it does higher framerates than the 980 Ti and already has a very good cooler.

KS
Jun 10, 2003
Outrageous Lumpwad

THE DOG HOUSE posted:

That's like an even performance:dollar ratio all the way to the top, which is unheard of. And in reality the 980 Ti is better in so many practical ways that you could even say the 980 Ti is actually a better value choice than the 970.

Also, a little jealous you snagged that card. I haven't seen any aftermarket 980 Tis in stock, period.

Had not thought of it that way. Actually, that's pretty nuts that you can get nearly equal FPS/$, but not need an SLI motherboard, beefy PSU, etc. Had I not already bought those things I would have come out well ahead.


penus penus penus
Nov 9, 2014

by piss__donald

KS posted:

Had not thought of it that way. Actually, that's pretty nuts that you can get nearly equal FPS/$, but not need an SLI motherboard, beefy PSU, etc. Had I not already bought those things I would have come out well ahead.

Plus it is just plain better, you will see.
