Shrimp or Shrimps
Feb 14, 2012


I mean, if it's priced low enough, it's not an unattractive solution for consumers. If it's priced too close to the 1080, then the OC headroom of the 1080 really leaves it in the dust since I'm assuming AMD is following tradition and basically having this thing clocked to max levels out of the box with no headroom left. And that's before the power consumption / heat stuff that nerds like us care about.

At what price, if you were looking for a new card in the 1080 perf range, would this card be a compelling choice over the 1080? $350?


1gnoirents
Jun 28, 2014

hello :)
Yeah, AMD has had no problem pricing things where they need to be to sell. However, this all but guarantees a total dogshit cooler again.

AVeryLargeRadish
Aug 19, 2011

I LITERALLY DON'T KNOW HOW TO NOT BE A WEIRD SEXUAL CREEP ABOUT PREPUBESCENT ANIME GIRLS, READ ALL ABOUT IT HERE!!!

Shrimp or Shrimps posted:

I mean, if it's priced low enough, it's not an unattractive solution for consumers. If it's priced too close to the 1080, then the OC headroom of the 1080 really leaves it in the dust since I'm assuming AMD is following tradition and basically having this thing clocked to max levels out of the box with no headroom left. And that's before the power consumption / heat stuff that nerds like us care about.

At what price, if you were looking for a new card in the 1080 perf range, would this card be a compelling choice over the 1080? $350?

TBH I don't think I would ever get one at any price, mostly because I don't want it pouring 300W+ of heat into my already hard to cool room.

For everyone else I expect it to be compelling at the $350-$400 mark.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
If the custom cooler versions are priced near the 1080 then I'm sure people with FreeSync monitors will pick them up but I'm gonna sit this round out since I've gone 4-liter SFF.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

OhFunny posted:

Vega FE is going to be crushed in performance and price by the 1180 isn't it?

The real beauty of this card is that if it launches at any kind of market-relevant price, it'll never see a motherboard that isn't inside a milk crate.

$400 for 1.5x a Fury at the same TDP? And more VRAM?

edit: cryptobutt thread says they're hashing the same as a 580 right now. However, that's obviously low so something may need to be tuned.

but yes, the 1170 will probably smoke this

Paul MaudDib fucked around with this message at 04:17 on Jun 29, 2017

1gnoirents
Jun 28, 2014

hello :)

Zero VGS posted:

If the custom cooler versions are priced near the 1080 then I'm sure people with FreeSync monitors will pick them up but I'm gonna sit this round out since I've gone 4-liter SFF.

Nah, just drill little eye holes in the side of your case and hang an AIO on it, clearly!!

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

1gnoirents posted:

Yeah, AMD has had no problem pricing things where they need to be to sell. However, this all but guarantees a total dogshit cooler again.

600mm² die (IINW?) and HBM2 at $300-400. It's totally profitable.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Does anyone know how the costs of HBM2 compare to GDDR5X? I'm curious as to what their margins would be like selling this at $400.

MaxxBot fucked around with this message at 04:22 on Jun 29, 2017

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

OhFunny posted:

Vega FE is going to be crushed in performance and price by the 1180 isn't it?

Considering nVidia has nine months to a year to crush however they like and even potentially refine their crushing technique, it's a pretty safe bet.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
The pascal refresh is going to be a tock cycle, gonna crush it imo

Cygni
Nov 12, 2005

raring to post

MaxxBot posted:

Does anyone know how the costs of HBM2 compare to GDDR5X? I'm curious as to what their margins would be like selling this at $400.

HBM is way, way more expensive per GB than GDDR5X, especially when you count in the interposer and packaging. I've seen estimates as high as $50 a card. That's why Micron developed GDDR5X in the first place: they gambled that HBM wouldn't be cheap enough, fast enough, for prime time.
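
Back-of-the-envelope version of that gap, just as a sketch (every dollar figure below is an illustrative assumption, not a sourced number):

```python
# Rough sketch of the memory-subsystem cost gap described above.
# All dollar figures are illustrative assumptions, not sourced data.

def memory_subsystem_cost(capacity_gb, cost_per_gb, packaging_extra=0.0):
    """DRAM cost plus any interposer/2.5D packaging adder."""
    return capacity_gb * cost_per_gb + packaging_extra

# Hypothetical inputs: GDDR5X on a conventional PCB at ~$5/GB,
# HBM2 at ~$8/GB plus ~$25 for the interposer and assembly.
gddr5x = memory_subsystem_cost(capacity_gb=8, cost_per_gb=5.0)
hbm2 = memory_subsystem_cost(capacity_gb=8, cost_per_gb=8.0, packaging_extra=25.0)

print(f"8GB GDDR5X subsystem: ~${gddr5x:.0f}")
print(f"8GB HBM2 subsystem:   ~${hbm2:.0f}")
print(f"Delta per card:       ~${hbm2 - gddr5x:.0f}")  # roughly the $50-a-card figure quoted above
```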

You have to wonder if AMD wasn't well aware that Vega was a flop the moment the 1080 launched and gave its production an extreme backseat to Ryzen, since they share a process (Samsung's 14nm, licensed by GloFo) and manufacturing facilities. Might explain the delay. Also might explain Volta's slip, because Nvidia is looking at another year of pure profit.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

MaxxBot posted:

Does anyone know how the costs of HBM2 compare to GDDR5X? I'm curious as to what their margins would be like selling this at $400.

They are putting the same memory (and interposer/chip assembly process) that NVIDIA puts on a $7,000 Tesla GP100 card onto something that performs like a 1080 at a 375W TDP. What do you think margins look like for them? Forget the flagship, most chips won't be perfect. What do the die harvests cost?

Even if NVIDIA totally has no uarch / power tricks up their sleeve (they do), let's even say they have to go to a 512-bit bus + GDDR5X on a super-sized GV102 to make a release in the next 6 months, NVIDIA is still going to be laughing all the way to the loving bank.

Paul MaudDib fucked around with this message at 04:53 on Jun 29, 2017

Anime Schoolgirl
Nov 28, 2002

would it have killed them to make a gddr5 version lmao

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Because RTG isn't in the GPU business, they are in the VRAM early adopter business.

Cygni
Nov 12, 2005

raring to post

Anime Schoolgirl posted:

would it have killed them to make a gddr5 version lmao

i think Micron/Nvidia have an exclusive deal on GDDR5X. you can probably get GDDR5 up to like 300GB/s in bandwidth, but probably couldn't get near the 1080 Ti's 484GB/s without HBM.

but there is also the possibility that without HBM2, this thing is even more of a dog. considering it's got a ~475mm² die size, which makes it nearly the same size as GP102 (Titan X/Xp, 1080 Ti) and 44% BIGGER than GP104 (1080/1070), they likely REALLY want to be able to charge a lot for it. it's gotta be expensive, which means they gotta sink the cost and go HBM2 to get the bandwidth to perform.

the thing is nearly the size of 2.5 Ryzen dies. and they will have to sell it, HBM2 and all, for less than the cost of a 1080 to move any. lordy.
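
If you want to sanity-check the bandwidth and die-size argument, the arithmetic is just bus width times per-pin data rate; here's a quick sketch (the data rates and the ~475/~192 mm² die areas are the rough ballpark figures floating around, treat them as assumptions):

```python
# Quick arithmetic behind the bandwidth comparison above.
# Bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps).

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 8.0))    # plain GDDR5, 256-bit at 8 Gbps  -> 256 GB/s
print(bandwidth_gbs(384, 8.0))    # plain GDDR5, 384-bit at 8 Gbps  -> 384 GB/s, roughly the practical ceiling
print(bandwidth_gbs(352, 11.0))   # 1080 Ti: 352-bit GDDR5X at 11 Gbps -> 484 GB/s
print(bandwidth_gbs(2048, 1.89))  # Vega FE: 2048-bit HBM2 at ~1.89 Gbps -> ~484 GB/s

# Die-area comparison: ~475 mm^2 for Vega vs ~192 mm^2 for a Ryzen (Zeppelin) die,
# both assumed figures, gives the "2.5 Ryzen dies" ratio.
print(475 / 192)  # ~2.47
```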

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
To be very honest I don't see how this chip can enter volume production in the state it's in. They can make money on them as FE cards, but even if they're sole-sourcing them 3DFX-style (that's right, I'm going there) there's not going to be a lot of room for profit at $400 for the perfect chips, let alone $250 for die harvests. Also, how do AIB partners feel about their margins in this?

Unless there's some secret VEGA MODE ACTIVATE thing that hasn't been turned on I just don't see how this is a saleable product at those margins right now let alone going against Volta. Is this thing going to compete well against the 1160?

I guess miners will buy whatever but if this bubble pops it's all over for AMD as they try to move industrial-scale quantities of lovely superhot cards on the consumer market, against a massive supply of used cards

Probably why there's been zero fanfare and literally only one rando with a card in hand.

Paul MaudDib fucked around with this message at 05:10 on Jun 29, 2017

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I thought my 1080 Ti STRIX was comically large, can't wait to see the cooling solutions they come up with for this.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
If the Vega numbers are to be believed, it's only hashing about 40 MH/s for 300+W. You can get mid-30s with Hawaii by comparison.
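
Quick perf-per-watt sketch of those mining numbers (the hashrates and board power are the approximate figures being tossed around in the thread, not measurements):

```python
# Rough efficiency comparison for the Ethereum mining numbers quoted above.
# Hashrate (MH/s) and board power (W) are approximate thread figures, not measured values.

cards = {
    "Vega FE (early reports)": (40, 300),  # ~40 MH/s at 300+ W
    "Hawaii (tuned)": (35, 250),           # "mid 30s" per the post above
}

for name, (mhs, watts) in cards.items():
    print(f"{name:24s} {mhs} MH/s / {watts} W = {mhs / watts:.3f} MH/s per W")
```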

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Yeah Vega definitely sucks for Ethereum but I'd like to see some other crypto benches; dumb miners will probably buy it either way ofc.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

MaxxBot posted:

Yeah Vega definitely sucks

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
None of this makes any sense:

- 14nm process advantage
- Clock speed advantage
- Architecture advantage
- 1070-1080 level performance

Conspiracy theory level bullshit:

At some point late last year RTG realised that they had a serious issue with their silicon in gaming workloads, one that meant they were not going to be able to meet their deadline without a fixed stepping. To avoid this they chose to release a 'prosumer' card with the bad silicon that nobody would actually want, and then hope that they'd have fixed silicon at some reasonable point later in the year.

This would explain the otherwise inexplicable delay for RX Vega, at least if you dismiss HBM2 supply as the probable cause, and why the Vega FE is simply unremarkable in every respect, considering that it is beaten in gaming and professional work by much older hardware.

Aesculus
Mar 22, 2013

Yeah how the actual gently caress does the Vega get worse performance than a Fury X with a die shrink, better memory, and higher clocks with more power? :psyboom:

Fauxtool
Oct 21, 2008

by Jeffrey of YOSPOS
amd ages like fine wine and the fury X is older.

Did they stop making the fury X because it was too expensive to make or something? It was a legitimately good card, just not at its price point.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Fauxtool posted:

amd ages like fine wine and the fury X is older.

Did they stop making the fury X because it was too expensive to make or something? It was a legitimately good card, just not at its price point.

How was the Fury X a good card? It was literally in the same boat this monstrosity is in now: Fury X to the 980, Vega to the 1080. The situation is nearly 100% comparable.

Jesus, AMD would have been better served with a 3328SP version of Polaris instead of the dumb money sunk into Lexa for a 1070-1080 competitor. It'd have run cooler, had similar performance (on a 384-bit bus with 9GHz GDDR5), been smaller by a large margin (~350mm²), been more profitable, and come sooner (more late 2016 than late 2017). Then a 2816SP cut that competes with a bone-stock 1070 or sits just under it, and just have a properly cut P11 at 768SP or 640SP for the lowest end (based on Mac performance, this would have been superior to all the time and money invested in loving Lexa).
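
Rough scaling sketch of that hypothetical big Polaris, starting from the RX 480 (the 2304 SP / ~232mm² baseline is an assumed figure, and linear scaling of the shader array is a simplification):

```python
# Back-of-the-envelope scaling for the hypothetical big-Polaris part above.
# Baseline (assumed): Polaris 10 / RX 480 with 2304 SPs at ~232 mm^2.
# Assumes ~60% of the die scales linearly with shader count; the rest
# (front end, memory PHYs, display, etc.) is treated as fixed.

base_sps, base_area = 2304, 232.0
target_sps = 3328
scaling_fraction = 0.6

scaled_area = base_area * ((1 - scaling_fraction) + scaling_fraction * target_sps / base_sps)
print(f"Estimated die area: ~{scaled_area:.0f} mm^2")  # ~294 mm^2, low 300s once the wider 384-bit PHYs are added

# Memory bandwidth on the proposed 384-bit bus with 9 Gbps GDDR5:
print(f"Bandwidth: {384 / 8 * 9:.0f} GB/s")  # 432 GB/s, between a 1070 (256 GB/s) and a 1080 Ti (484 GB/s)
```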

AMD would have been a bit late, but none of this would have been catastrophic, the long wait for Vega wouldn't ever be an issue, and they could have taken their time getting it right for early 2018. Heads need to loving roll at AMD right now, and this feels like it's all on Raja at this point. Lisa gave him full run of the GPU division, gave him RTG, and this is his output. This is such an unmitigated disaster that RTG is likely only to squeak by because of Apple, and this may even hurt Zen's chances due to brand association and net negative revenue.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
At least Fury X won a Pyrrhic victory against 980 non-Ti. This stinker is pretty much worse than the notorious 2900XT launch in every aspect.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
I just took a look back: the Fury X beat up the 970 pretty badly at the time of release and almost matched the 980 Ti, granted it would get creamed when both were OC'd. This time the relative performance is even worse unless the actual RX Vega with the final drivers is better; in its current state it trades blows with my OC'd 1070.

MaxxBot fucked around with this message at 11:23 on Jun 29, 2017

Setzer Gabbiani
Oct 13, 2004

While this is probably a hot take, considering people genuinely believed the RX series was finished at launch due to a handful of shitbox motherboards being fried, we should probably wait until there's more than a not-for-gaming prosumer hybrid thing made for early adopters/miners with mombux before labeling the entire Vega lineup, now and forever, as GeForce FX part deux.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

wargames posted:

X299 sees this and looks dejected; it wanted to be the worst launch this year.

Ah yes, but you see X299 still had this trick up its sleeve:

https://www.youtube.com/watch?v=f7BqAjC4ZCc

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Palladium posted:

At least Fury X won a Pyrrhic victory against 980 non-Ti. This stinker is pretty much worse than the notorious 2900XT launch in every aspect.

At least the 2900XT had the excuse of being on a much worse node, the move to 55nm and the 3850/70 actually gave AMD something tremendously reasonable, and AMD only had to suffer the 2900XT for like 6 months. That's not going to happen for Vega, even if you consider the 14nm LPP node inferior, because this is entirely a design issue now; AMD owns this, not GloFo, because GP108/107 and Ryzen are not disasters. God, we'll be lucky to even have an inkling of what Navi is going to be, because this is basically going to kill RTG on the dGPU end and relegate them forever to iGPUs alongside AMD's CPUs.

Measly Twerp posted:

Ah yes, but you see X299 still had this trick up its sleeve:

https://www.youtube.com/watch?v=f7BqAjC4ZCc

Lmao, after a quick view it sounds like Intel stared long and hard at the Ryzen launch and said "AMD will never beat us at anything! Including lovely launches!" X299 is the Ryzen launch V2, amped to 11, because if they're having issues with the current crop of released CPUs, holy poo poo, how are the 12, 14, and 18 core products supposed to run?

EmpyreanFlux fucked around with this message at 14:30 on Jun 29, 2017

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
Supposedly the gaming mode driver is older than the pro driver:

DavidGraham posted:

Maybe @Rys could shine a light on the matter, just one question: is the Vega FE gaming driver so far gimped compared to the Vega RX driver coming a month later?

Rys posted:

It's not gimped (that would be completely ridiculous), but it is older.

Which has led to this speculation:

Glo. posted:

It appears we have more answers.

GPU-Z is recognizing the driver as 17.1.1 because the gaming driver for this GPU may actually be from that period.

Check out what the AMD rep has said about the Game driver:
https://forum.beyond3d.com/posts/1989522/

That also explains why there is no difference between Game and Pro mode when running gaming apps, and might explain why the GPU is recognized as Greenland.

What was rumored at the time of the Vega presentation (January; the 17.1.1 driver is also from January) is that at that point there were ONLY Fury X drivers, tuned just enough to work with the Vega architecture.

This may answer a few questions.

:salt:

NewFatMike
Jun 11, 2015

I was going to say, the gaming and pro drivers offering the same performance makes me wonder if the pro drivers took priority and the current gaming drivers merely work. Or...

Super comedy option: gaming drivers are where they should be and pro are going to get the most performance increase.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Measly Twerp posted:

Supposedly the gaming mode driver is older than the pro driver:



Which has led to this speculation:


:salt:

Oh god, if Vega turns out to be some 20-30% better because AMD cannot loving get drivers right, and they screwed themselves because they overpromised to shareholders and had to do this dumb FE poo poo. I don't loving even know anymore, August cannot come soon enough. I want off Raja's wild ride!

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

FaustianQ posted:

Oh god, if Vega turns out to be some 20-30% better because AMD cannot loving get drivers right, and they screwed themselves because they overpromised to shareholders and had to do this dumb FE poo poo. I don't loving even know anymore, August cannot come soon enough. I want off Raja's wild ride!

It's both captivating and extremely nauseating.

repiv
Aug 13, 2009

I wouldn't read too much into anything GPU-Z reports about Vega FE currently, given it claims the card is using Micron memory.

(Micron doesn't even manufacture HBM2)

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

NewFatMike posted:

I was going to say, the gaming and pro drivers offering the same performance makes me wonder if the pro drivers took priority and the current gaming drivers merely work. Or...

Super comedy option: gaming drivers are where they should be and pro are going to get the most performance increase.

This wouldn't be comedy, it would be shrewd business to focus on the target market.

1gnoirents
Jun 28, 2014

hello :)
I suddenly just got sad for that one youtuber with the heavy Scottish accent.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

1gnoirents posted:

I suddenly just got sad for that one youtuber with the heavy Scottish accent.

Nah, he'll be vindicated if this really is the performance because he called roughly 1080 performance for it based on available information months ago.

1gnoirents
Jun 28, 2014

hello :)

FaustianQ posted:

Nah, he'll be vindicated if this really is the performance because he called roughly 1080 performance for it based on available information months ago.

But that made him sad

Wistful of Dollars
Aug 25, 2009

:shrug:


Cygni
Nov 12, 2005

raring to post

ah, so I see we are already into the "just wait, maybe we will get actual working drivers and performance will improve!" stage of bad AMD product adoption.

e: btw, i say that as someone who was an AMD loyalist for a looong time. used an old Phenom II as my regular computer until like a year ago
