K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Spoiler : they won't compete.

AMD is not going to compete above the $200 mark at best until they rework literally everything from the ground up. We've seen them fail enough times to know the same old revise and re-release strategy only works at the very low end (and barely there, considering the used market and how modern GPUs essentially last forever).

Intel's failure to scale up 14nm production to compensate for their 10nm/EUV failures is leaving a huge opportunity for AMD in the CPU market; hopefully they're already planning to sink some of their inevitable windfall into completely overhauling the GPU group and building a serious R&D team that can compete with Nvidia.


Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Riflen posted:

If there were, you can bet AMD would be promoting it. RTX 2070 + 10% in best case game is what you should be expecting for the foreseeable. Radeon VII is what they're selling for more performance than that right now.
Meh then. NVidia should get their 7nm shrink of Turing done then, ideally before Cyberpunk 2077.

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Statutory Ape posted:

What's likely price point for them? $400? Less on sale/bundled/with games etc?

If they can't compete in value with used EVGA/etc Nvidia cards they'll get 0 consideration from me

As usual, this is a tease. They may not be quite ready to talk about pricing. I believe there will be more information at E3 on 10th June.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

Combat Pretzel posted:

Meh then. NVidia should get their 7nm shrink of Turing done then, ideally before Cyberpunk 2077.

The fact that Nvidia is teasing Turing refresh with GDDR6 means that they probably won't hit 7nm before Q1 2020 at the earliest. Unfortunately the drought of worthwhile GPUs is going to continue.

eames
May 9, 2009

IIRC Dr. Su claimed 1.25x higher IPC and 1.5x perf/watt compared to Vega at the presentation, so I roughly expect parity with Pascal. That's not really great considering the process advantage.
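Those two claims compose in a simple way, and it's worth doing the back-of-envelope math. The baseline figures below are hypothetical placeholders (a normalized perf index and the Vega 64's 295W board power), not benchmarks:

```python
# Sketch of what AMD's claimed Navi gains over Vega would imply.
# Baseline numbers are illustrative placeholders, not measured data.

def projected_perf(base_perf: float, ipc_gain: float = 1.25) -> float:
    """Scale a baseline performance figure by the claimed IPC uplift."""
    return base_perf * ipc_gain

def projected_power(base_power: float, ipc_gain: float = 1.25,
                    perf_per_watt_gain: float = 1.5) -> float:
    """Power needed to deliver the scaled performance, given the
    claimed perf/watt improvement."""
    return base_power * ipc_gain / perf_per_watt_gain

vega_perf = 100.0   # normalized Vega performance index (placeholder)
vega_power = 295.0  # Vega 64 board power in watts

print(projected_perf(vega_perf))           # 125.0
print(round(projected_power(vega_power)))  # 246
```

So taking the claims at face value: 25% more performance at roughly 246W instead of 295W, which is indeed about where Pascal landed two years earlier.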

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Statutory Ape posted:

What's likely price point for them? $400? Less on sale/bundled/with games etc?

If they can't compete in value with used EVGA/etc Nvidia cards they'll get 0 consideration from me

Even if AMD dropped cards at aggressive price points for the performance, it's very, very difficult to have a new card compete with the value proposition of a used one like that, and their recent pricing decisions don't make it seem like AMD is too interested in cutting prices to begin with.

The most realistic scenario is AMD releases their new stuff at similar price:performance points to nVidia 20xx cards, nVidia releases the 21xx series (or whatever they decide to call it) shortly thereafter and makes AMD look overpriced above the $200-$300 range again.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

As somebody that doesn't care for red GPUs, I think catching up to Pascal is pretty good tbh. Next series off this process should bring further refinement I would imagine


E: yes it's hard to compete in that market ofc, but that's also their competition so :shrug:

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

K8.0 posted:

The fact that Nvidia is teasing Turing refresh with GDDR6 means that they probably won't hit 7nm before Q1 2020 at the earliest. Unfortunately the drought of worthwhile GPUs is going to continue.
Seems a reasonable timeframe, since I don't expect any major AAA titles that interest me and warrant upgrading until past that quarter. The only wrench in my scheme is that dumb Index I've ordered.

repiv
Aug 13, 2009

RTX confirmed for the new Wolfenstein spinoff: https://www.nvidia.com/en-us/geforce/news/wolfenstein-youngblood-bundle-ray-tracing-nas/

It's got adaptive shading and some kind of raytraced effect(s), but they're not going into specifics yet.

Surprise Giraffe
Apr 30, 2007
1 Lunar Road
Moon crater
The Moon

K8.0 posted:

The fact that Nvidia is teasing Turing refresh with GDDR6 means that they probably won't hit 7nm before Q1 2020 at the earliest. Unfortunately the drought of worthwhile GPUs is going to continue.

Not true, I saw a 2080 Ti sell for £680 on eBay the other day! That's a relatively-acceptable-universe MSRP!

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

repiv posted:

RTX confirmed for the new Wolfenstein spinoff: https://www.nvidia.com/en-us/geforce/news/wolfenstein-youngblood-bundle-ray-tracing-nas/

It's got adaptive shading and some kind of raytraced effect(s), but they're not going into specifics yet.

Awesome. If any studio can do a new tech justice it’ll be them.

Ragingsheep
Nov 7, 2009
So was NVIDIA's Super just the Studio brand launch?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Zedsdeadbaby posted:

I'm honestly pretty shocked they stopped beating the GCN dead horse and moved on from it.
My only disappointment is the total lack of 2.1 HDMI functionality, it wasn't mentioned at all.

It's GCN with the shader engine limit removed and a reworked cache. "New RDNA architecture" is marketing fluff, they then go on to say "redesigned compute unit" which wouldn't be a thing you'd say about a completely new architecture.

As far as marketing is concerned, Vega had a "new compute unit" (Next Compute Unit) too, and every generation has been a "new architecture". It's all GCN underneath.

Remember that they really can't abandon GCN as an architecture because of consoles. Even if they came up with something using the GCN ISA (which is pointless since that's an implementation detail: game developers write shaders, not GCN instructions, and they are compiled at runtime by the driver) then all the same GCN optimizations and code patterns would have to work on the new architecture too. So it would have to have all the same performance characteristics, and then what exactly did you change?

25% more perf and 50% more efficiency is good, and will hopefully allow GCN to scale into higher-end parts a little better. But a 2070-tier part isn't really exploiting that very well. If Big Navi is waiting for next year, they'll be going up against Ampere, not Turing. They needed Big Navi like now, this whole "big chips will come a year behind all the other parts" has to stop.

The question they didn't answer is of course performance-per-unit-area or performance-per-transistor though, since that's the major driver of costs. Perf-per-clock is meaningless in GPUs, since they're a throughput-oriented architecture. Fermi had higher perf-per-cycle than Kepler and Maxwell... but Kepler and Maxwell both shrunk the cores enough that they performed higher per unit-area despite that.

That's the catch with Turing too - perf-per-SM went up quite a lot, but the cache/etc ate a lot of space too. In practice it's only like a 10-15% improvement per unit area on older titles. We're definitely starting to hit an asymptotic limit beyond which perf-per-area just can't be substantially increased.
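The Fermi-vs-Kepler point is easy to sketch with toy numbers. These figures are made up purely to show the arithmetic (a "big cores" design with 15% higher perf per unit on a large die, versus smaller, weaker units that pack more densely):

```python
# Toy illustration of perf-per-unit vs perf-per-area. All numbers are
# placeholders chosen to demonstrate the tradeoff, not real GPU data.

def perf_per_area(relative_perf: float, die_area_mm2: float) -> float:
    """Performance delivered per square millimeter of die."""
    return relative_perf / die_area_mm2

# Hypothetical arch A: stronger individual units, but a much larger die.
arch_a = perf_per_area(relative_perf=1.15, die_area_mm2=545)
# Hypothetical arch B: weaker units that shrink and pack more densely.
arch_b = perf_per_area(relative_perf=1.00, die_area_mm2=294)

print(arch_b > arch_a)  # True
```

Even with a 15% perf-per-unit deficit, the smaller design wins nearly 2x on perf/mm², which is why perf-per-clock alone tells you nothing about cost.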

Paul MaudDib fucked around with this message at 00:54 on May 28, 2019

Purgatory Glory
Feb 20, 2005

Riflen posted:

Nvidia are releasing their Quake II RTX project on June the 6th.

EDIT: Trailer
https://www.youtube.com/watch?v=unGtBbhaPeU

Is Nvidia giving proper credit to the guys at http://brechpunkt.de/q2vkpt/?

Riflen
Mar 13, 2009

"Cheating bitch"
Bleak Gremlin

Purgatory Glory posted:

Is Nvidia giving proper credit to the guys at http://brechpunkt.de/q2vkpt/?

Yes. The author Christoph Schied had an internship at Nvidia too. They also credit the guy who created a lot of the textures Nvidia used.

Arzachel
May 12, 2012

Paul MaudDib posted:

Remember that they really can't abandon GCN as an architecture because of consoles

it's fuckin' stupid.

Arzachel
May 12, 2012
https://wccftech.com/amd-radeon-rx-5000-navi-gpu-7nm-asrock-two-variants-report/

A 180W and a 150W variant in the works; I'm guessing the RX 5700 is the former. That's honestly better than I was expecting?

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!
I'll be in the market for a cheap RX580 when these new cards drop. Which is gonna be best for staying quiet in a small case? I've heard the MSI Armor is noisy. Is the Gaming X the one to go for, or maybe an EVGA FTW?

alex314
Nov 22, 2007

Sapphire's Nitro is usually the most highly regarded series. EVGA only does nVidia cards.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
I don't expect huge price drops for the 580/570, these new cards are clearly replacements for the Vega shitpiles and those are the cards that will drop hard (and STILL not be worth buying). If you really care about keeping noise down, grab a used 1060. It'll be a bit faster, use a ton less power and thus generate a lot less heat/noise, and will both last longer and resell better (largely for the same reason).

The other thing is if the Navi cards turn out to be even slightly good, there will be a spike in used GPU availability as the hordes of people who have been waiting to upgrade finally do so.

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!
OK. Cheers.

wargames
Mar 16, 2008

official yospos cat censor

K8.0 posted:

these new cards are clearly replacements for the Vega shitpiles and those are the cards that will drop hard (and STILL not be worth buying).

But the Vega 56 I have isn't terrible and is as good as a 1070/Ti, has a fanless mode, and was 300 bucks with 3 good games. Also the driver stack has been amazing. Radeon Chill is the best thing btw.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

If you don't already have the 3 games and actually want them, and they're expensive games, those bundles have been pretty good.

From what I can see the problem is that, in price, a Vega 64 is less than $50 from an RTX 2070... which should be preferable in every way I can think of at this point.

wargames
Mar 16, 2008

official yospos cat censor

Statutory Ape posted:

If you don't already have the 3 games and actually want them, and they're expensive games, those bundles have been pretty good.

From what I can see the problem is that, in price, a Vega 64 is less than $50 from an RTX 2070... which should be preferable in every way I can think of at this point.

Agreed, the 64 is overpriced most of the time, but a $270-300 56 runs the same as a 1070 Ti/1660 Ti.

sauer kraut
Oct 2, 2004

Arzachel posted:

https://wccftech.com/amd-radeon-rx-5000-navi-gpu-7nm-asrock-two-variants-report/

180w and a 150w variant in the works, I'm guessing rx5700 is the former. That's honestly better than I was expecting?

Videocardz.com (:unsmith:) has the 5700 as a slightly larger die than Polaris, available in 180W (same as 580) or 225W (same as 590, Jesus F Christ) configurations.
If accurate, that is what I was afraid of: overjuiced to hell for what, the sixth time in a row, to eke out a meagre lead over a reference 1660 Ti or 1070 Ti in the fps graphs.

sauer kraut fucked around with this message at 14:52 on May 29, 2019

pofcorn
May 30, 2011

K8.0 posted:

I don't expect huge price drops for the 580/570, these new cards are clearly replacements for the Vega shitpiles and those are the cards that will drop hard (and STILL not be worth buying). If you really care about keeping noise down, grab a used 1060. It'll be a bit faster, use a ton less power and thus generate a lot less heat/noise, and will both last longer and resell better (largely for the same reason).

My PNY 1060 was loud as gently caress, and it went up to 80c under load. Maybe lovely cooler design, but still.

wargames posted:

But the Vega 56 I have isn't terrible and is as good as a 1070/Ti, has a fanless mode, and was 300 bucks with 3 good games. Also the driver stack has been amazing. Radeon Chill is the best thing btw.

Chill is the best feature by far. I had no idea it existed when I bought my Vega 56, and now I can't live without it.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

What's the chill thing do

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Murders your frame pacing in a vain attempt to compensate for AMD pushing their cards to the absolute limit and still failing to compete. It caps your framerate based on how active the driver judges the game to be.

pofcorn posted:

My PNY 1060 was loud as gently caress, and it went up to 80c under load. Maybe lovely cooler design, but still.

Definitely a lovely cooler design and probably terrible case airflow, but even with a good cooler a 580 would have been worse because it uses almost twice as much power. There's no way to cheat an extra 100+ watts out of a case.

wargames
Mar 16, 2008

official yospos cat censor

K8.0 posted:

Murders your frame pacing in a vain attempt to compensate for AMD pushing their cards to the absolute limit and still failing to compete. It caps your framerate based on how active the driver judges the game to be.


It doesn't, because it's not a frame limiter like Nvidia's; it's an fps target.


Statutory Ape posted:

What's the chill thing do


What Radeon Chill does is you set an fps you want the game to run at, and the driver will either overclock (rarely) or underclock the hell out of your card till it meets that target. Do note I have a FreeSync monitor, so I never notice tearing or weird fps hang-ups.
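The general shape of a Chill-style dynamic target is easy to sketch. This is a guess at the idea being described (scale the framerate cap with input activity), not AMD's actual driver logic; the min/max values and the activity normalization are made up:

```python
# Rough sketch of a Chill-style dynamic framerate target: idle scenes
# get the floor, frantic input gets the ceiling. Purely illustrative.

def chill_target(input_events_per_sec: float,
                 fps_min: int = 40, fps_max: int = 144) -> int:
    """Pick a framerate cap by interpolating between a user-set floor
    and ceiling based on how busy the input stream is."""
    busy = min(input_events_per_sec / 100.0, 1.0)  # normalize activity
    return round(fps_min + busy * (fps_max - fps_min))

print(chill_target(0))    # 40  (camera idle -> clocks can drop)
print(chill_target(100))  # 144 (heavy mouse movement -> full target)
```

The card then only has to clock high enough to hit whatever cap comes out, which is where the power savings come from.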

wargames fucked around with this message at 17:38 on May 29, 2019

sauer kraut
Oct 2, 2004

Statutory Ape posted:

What's the chill thing do

It's a pretty effective framerate limiter if nothing else, and afaik it also tries to perform some magic like Nvidia's 'Optimal' power setting, by not redrawing frames if nothing has changed from the previous one.
Since I use vsync or borderless mode on everything, I haven't tried it much to see if it works reliably or messes up things that it shouldn't.

Arzachel
May 12, 2012

sauer kraut posted:

Videocardz.com (:unsmith:) has the 5700 as a slightly larger die than Polaris, available in 180W (same as 580) or 225W (same as 590 Jesus F Christ) configurations.
If accurate that is what I was afraid of, overjuiced to hell for what the sixth time in a row to eek out a meagre lead over a reference 1660ti or 1070ti in the fps graphs.

That's total board power, which includes memory and the type-c port, which I'm pretty sure Nvidia doesn't include in their TDP numbers.

Cygni
Nov 12, 2005

raring to post

Arzachel posted:

That's total board power, which includes memory and the type-c port, which I'm pretty sure Nvidia doesn't include in their TDP numbers.

Yeah AMD does indeed do "TBP", but both sides are using dumb marketing horseshit as best they can with their power calcs, so they are really only useful for comparing against parts in their own stack. AMD lists the higher end Navi at the 590's TBP, so we can look at the 590 to compare:

Seems like it will be slightly more power hungry than the 2070 it's competing against. IF this is the high end Navi, I guess. And if the leaks are true. Who knows yet!

Arzachel
May 12, 2012

Cygni posted:

Yeah AMD does indeed do "TBP", but both sides are using dumb marketing horseshit as best they can with their power calcs, so they are really only useful for comparing against parts in their own stack. AMD lists the higher end Navi at the 590's TBP, so we can look at the 590 to compare:

Seems like it will be slightly more power hungry than the 2070 it's competing against. IF this is the high end Navi, I guess. And if the leaks are true. Who knows yet!

The TDP delta between the variants is 30W while the TBP delta is 45W, which either means more memory or a powered type-c connector that wouldn't pull extra power unless something's hooked up.

Although the RX 590 is rated for a TDP of 175W, so who the gently caress knows.
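The TBP-vs-TDP accounting being argued over is just addition, but it's worth making the bookkeeping explicit. The component figures here are illustrative guesses, not leaked specs:

```python
# Sketch of total-board-power accounting: board power is GPU core TDP
# plus everything else on the card. Component values are hypothetical.

def total_board_power(gpu_tdp: float, memory_w: float, vrm_loss_w: float,
                      usb_c_budget_w: float = 0.0) -> float:
    """Sum the GPU core budget with memory, regulator losses, and any
    reserved budget for a powered type-c port."""
    return gpu_tdp + memory_w + vrm_loss_w + usb_c_budget_w

# The rumored 45 W TBP delta vs 30 W TDP delta between variants could be
# explained by a ~15 W reserve on the bigger card, e.g. a type-c port:
delta_tbp = total_board_power(30.0, 0.0, 0.0, usb_c_budget_w=15.0)
print(delta_tbp)  # 45.0
```

Which is exactly why cross-vendor comparisons break down: the two sums don't cover the same components.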

Arzachel fucked around with this message at 07:26 on May 30, 2019

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Navi "RDNA" confirmed to be "GCN hybrid".

I'm sure post-Navi will definitely be a "pure" RDNA though, right after AMD sold the console guys on another generation of GCN parts.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
It used to be that either ATI or Nvidia had a good GPU and the other one would be telling you about how good their next GPU would be. Now we're on like 7 years of AMD telling us "yeah, our current GPUs are poo poo, and our next GPU is going to be poo poo, but just wait until the one after that!"

repiv
Aug 13, 2009

RTX confirmed for the new call of doody game

https://twitter.com/NVIDIAGeForce/status/1134171617519497216?s=19

SwissArmyDruid
Feb 14, 2014

by sebmojo
That may be the most realistic night vision I've ever seen.

repiv
Aug 13, 2009

I couldn't care less about COD, but their new engine does sound pretty cool; apparently they're rendering beyond the visible spectrum, so the IR and NV views are the real deal rather than cheap filters.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

repiv posted:

I couldn't care less about COD, but their new engine does sound pretty cool; apparently they're rendering beyond the visible spectrum, so the IR and NV views are the real deal rather than cheap filters.

I mean... CoD. But holy poo poo if that isn’t the coolest loving thing.

Also, it’s a reboot of the best CoD, CoD4:Modern Warfare.

So yeah, good news all around.


SwissArmyDruid
Feb 14, 2014

by sebmojo
On the other hand, a very clever and directed use of raytracing.

You're not raytracing a global illumination source with bajillions of rays, just a few very specific IR sources: anything beyond a certain intensity threshold you can just dummy out with an un-raytraced illumination source, and everything has extreme falloff.
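The selection step being described can be sketched as a simple partition. This is a guess at the general idea, not the actual engine's heuristic; the emitter names, the threshold, and the intensity values are all made up:

```python
# Sketch of "only raytrace a few IR sources": emitters past an intensity
# threshold get a cheap analytic (un-raytraced) light, and only the
# remaining dim sources get rays. Purely illustrative.

def partition_emitters(emitters, threshold=10.0):
    """Split IR emitters into cheap analytic lights vs raytraced ones."""
    analytic = [e for e in emitters if e["intensity"] >= threshold]
    raytraced = [e for e in emitters if e["intensity"] < threshold]
    return analytic, raytraced

scene = [{"name": "muzzle_flash", "intensity": 50.0},
         {"name": "body_heat", "intensity": 2.5},
         {"name": "warm_engine", "intensity": 8.0}]

analytic, raytraced = partition_emitters(scene)
print(len(analytic), len(raytraced))  # 1 2
```

With extreme falloff on top of that, the ray budget per frame stays tiny compared to full raytraced GI.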
