|
Spoiler: they won't compete. AMD is not going to compete above the $200 mark at best until they rework literally everything from the ground up. We've seen them fail enough times to know the same old revise-and-re-release strategy only works at the very low end (and barely there, considering the used market and how modern GPUs essentially last forever). Intel's failure to scale up 14nm production to compensate for their 10nm/EUV failures is leaving a huge opportunity for AMD in the CPU market. Hopefully they're already planning on sinking some of that inevitable windfall into completely overhauling the GPU group and building a serious R&D team that can compete with Nvidia.
|
# ? May 27, 2019 12:45 |
|
Riflen posted:If there were, you can bet AMD would be promoting it. RTX 2070 + 10% in best case game is what you should be expecting for the foreseeable. Radeon VII is what they're selling for more performance than that right now.
|
# ? May 27, 2019 12:45 |
|
Statutory Ape posted:What's likely price point for them? $400? Less on sale/bundled/with games etc? As usual, this is a tease. They may not be quite ready to talk about pricing. There will be more information at E3 on 10th June, I believe.
|
# ? May 27, 2019 12:47 |
|
Combat Pretzel posted:Meh then. NVidia should get their 7nm shrink of Turing done then, ideally before Cyberpunk 2077. The fact that Nvidia is teasing Turing refresh with GDDR6 means that they probably won't hit 7nm before Q1 2020 at the earliest. Unfortunately the drought of worthwhile GPUs is going to continue.
|
# ? May 27, 2019 12:51 |
|
IIRC Dr. Su claimed 1.25x higher IPC and 1.5x perf/watt compared to Vega at the presentation, so I roughly expect parity with Pascal. That's not really great considering the process advantage.
|
# ? May 27, 2019 13:29 |
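A quick back-of-the-envelope check of that parity claim. The board-power figures here are rough assumptions for illustration (Vega 64 around 295W, GTX 1080 around 180W, treating them as roughly equal performers), not official apples-to-apples specs:

```python
# Rough perf/watt comparison. All numbers are assumptions for illustration:
# Vega 64 ~295 W board power, GTX 1080 ~180 W, similar gaming performance.
vega64_power = 295.0
gtx1080_power = 180.0

# Normalize performance to 1.0 for both cards.
vega_perf_per_watt = 1.0 / vega64_power
pascal_perf_per_watt = 1.0 / gtx1080_power

# Apply AMD's claimed 1.5x perf/watt uplift over Vega.
navi_perf_per_watt = vega_perf_per_watt * 1.5

# Ratio vs Pascal comes out a bit under 1.0 -- roughly parity, slightly
# behind, which is underwhelming given Navi's 7nm node vs 16nm Pascal.
ratio = navi_perf_per_watt / pascal_perf_per_watt
print(f"Navi vs Pascal perf/watt: {ratio:.2f}x")
```

Under these assumed numbers the ratio lands around 0.9x, which is why "parity with Pascal despite a full node advantage" reads as disappointing.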
|
Statutory Ape posted:What's likely price point for them? $400? Less on sale/bundled/with games etc? Even if AMD dropped cards at aggressive price points for the performance, it's very, very difficult to have a new card compete with the value proposition of a used one like that, and their recent pricing decisions don't make it seem like AMD is too interested in cutting prices to begin with. The most realistic scenario is AMD releases their new stuff at similar price:performance points to nVidia 20xx cards, nVidia releases the 21xx series (or whatever they decide to call it) shortly thereafter and makes AMD look overpriced above the $200-$300 range again.
|
# ? May 27, 2019 13:32 |
|
As somebody that doesn't care for red GPUs, I think catching up to Pascal is pretty good tbh. Next series off this process should bring further refinement, I would imagine. E: yes it's hard to compete in that market ofc, but that's also their competition so
|
# ? May 27, 2019 13:33 |
|
K8.0 posted:The fact that Nvidia is teasing Turing refresh with GDDR6 means that they probably won't hit 7nm before Q1 2020 at the earliest. Unfortunately the drought of worthwhile GPUs is going to continue.
|
# ? May 27, 2019 13:41 |
|
RTX confirmed for the new Wolfenstein spinoff: https://www.nvidia.com/en-us/geforce/news/wolfenstein-youngblood-bundle-ray-tracing-nas/ It's got adaptive shading and some kind of raytraced effect(s), but they're not going into specifics yet.
|
# ? May 27, 2019 17:36 |
|
K8.0 posted:The fact that Nvidia is teasing Turing refresh with GDDR6 means that they probably won't hit 7nm before Q1 2020 at the earliest. Unfortunately the drought of worthwhile GPUs is going to continue. Not true, I saw a 2080 Ti sell for £680 on eBay the other day! That's relatively-acceptable-universe MSRP!
|
# ? May 27, 2019 22:58 |
|
repiv posted:RTX confirmed for the new Wolfenstein spinoff: https://www.nvidia.com/en-us/geforce/news/wolfenstein-youngblood-bundle-ray-tracing-nas/ Awesome. If any studio can do a new tech justice it’ll be them.
|
# ? May 28, 2019 00:06 |
|
So was NVIDIA's Super just the Studio brand launch?
|
# ? May 28, 2019 00:15 |
|
Zedsdeadbaby posted:I'm honestly pretty shocked they stopped beating the GCN dead horse and moved on from it.

It's GCN with the shader engine limit removed and a reworked cache. "New RDNA architecture" is marketing fluff; they then go on to say "redesigned compute unit", which isn't a thing you'd say about a completely new architecture. As far as marketing is concerned, Vega had a "new compute unit" (Next Compute Unit) too, and every generation has been a "new architecture". It's all GCN underneath.

Remember that they really can't abandon GCN as an architecture because of consoles. Even if they came up with something new using the GCN ISA (which is pointless since that's an implementation detail: game developers write shaders, not GCN instructions, and they are compiled at runtime by the driver), all the same GCN optimizations and code patterns would have to work on the new architecture too. So it would have to have all the same performance characteristics, and then what exactly did you change?

25% more perf and 50% more efficiency is good, and will hopefully allow GCN to scale into higher-end parts a little better. But a 2070-tier part isn't really exploiting that very well. If Big Navi is waiting for next year, they'll be going up against Ampere, not Turing. They needed Big Navi like now; this whole "big chips will come a year behind all the other parts" has to stop.

The question they didn't answer is of course performance-per-unit-area or performance-per-transistor, since that's the major driver of costs. Perf-per-clock is meaningless in GPUs, since they're a throughput-oriented architecture. Fermi had higher perf-per-cycle than Kepler and Maxwell... but Kepler and Maxwell both shrunk the cores enough that they performed higher per unit area despite that. That's the catch with Turing too: perf-per-SM went up quite a lot, but the cache/etc ate a lot of space too. In practice it's only like a 10-15% improvement per unit area on older titles.

We're definitely starting to hit an asymptotic limit beyond which perf-per-area just can't be substantially increased.

Paul MaudDib fucked around with this message at 00:54 on May 28, 2019
# ? May 28, 2019 00:38 |
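A toy calculation of the Fermi-vs-Kepler point above, with entirely made-up numbers: an architecture can lose on perf-per-core yet win on perf-per-area, and the per-area figure is what actually sets cost:

```python
# Why perf-per-area, not perf-per-clock/core, drives GPU costs.
# All numbers below are invented purely for illustration.
die_area = 500.0  # mm^2 budget, same for both hypothetical designs

# Architecture A: high per-core throughput, big cores (Fermi-style).
a_perf_per_core, a_core_area = 1.0, 5.0   # arbitrary perf units, mm^2
# Architecture B: lower per-core throughput, much smaller cores (Kepler-style).
b_perf_per_core, b_core_area = 0.8, 3.0

a_total = (die_area / a_core_area) * a_perf_per_core  # 100 cores
b_total = (die_area / b_core_area) * b_perf_per_core  # ~167 cores

# B "loses" per core but wins per die -- more perf from the same silicon.
print(a_total, b_total)
```

Same die budget, 20% worse per-core throughput, yet the small-core design comes out roughly a third faster overall.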
|
Riflen posted:Nvidia are releasing their Quake II RTX project on June the 6th. Is nvidia giving the guys at http://brechpunkt.de/q2vkpt/ proper credit?
|
# ? May 28, 2019 05:51 |
|
Purgatory Glory posted:Is nvidia giving proper credit to the guys at http://brechpunkt.de/q2vkpt/ Yes. The author Christoph Schied had an internship at Nvidia too. They also credit the guy who created a lot of the textures Nvidia used.
|
# ? May 28, 2019 08:49 |
|
Paul MaudDib posted:Remember that they really can't abandon GCN as an architecture because
|
# ? May 28, 2019 10:52 |
|
https://wccftech.com/amd-radeon-rx-5000-navi-gpu-7nm-asrock-two-variants-report/ A 180W and a 150W variant are in the works; I'm guessing the RX 5700 is the former. That's honestly better than I was expecting?
|
# ? May 29, 2019 11:13 |
|
I'll be in the market for a cheap RX580 when these new cards drop. Which is gonna be best for staying quiet in a small case? I've heard the MSI Armor is noisy. Is the Gaming X the one to go for, or maybe an EVGA FTW?
|
# ? May 29, 2019 11:41 |
|
Sapphire Nitro is usually the highly regarded series. EVGA only does nVidia cards.
|
# ? May 29, 2019 12:12 |
|
I don't expect huge price drops for the 580/570, these new cards are clearly replacements for the Vega shitpiles and those are the cards that will drop hard (and STILL not be worth buying). If you really care about keeping noise down, grab a used 1060. It'll be a bit faster, use a ton less power and thus generate a lot less heat/noise, and will both last longer and resell better (largely for the same reason). The other thing is if the Navi cards turn out to be even slightly good, there will be a spike in used GPU availability as the hordes of people who have been waiting to upgrade finally do so.
|
# ? May 29, 2019 12:34 |
|
OK. Cheers.
|
# ? May 29, 2019 13:39 |
|
K8.0 posted:these new cards are clearly replacements for the Vega shitpiles and those are the cards that will drop hard (and STILL not be worth buying). But the vega 56 i have isn't terrible and is as good as a 1070/ti, has fanless mode and was 300 bucks with 3 good games. also the driver stack has been amazing. Radeon chill is the best thing btw.
|
# ? May 29, 2019 13:45 |
|
If you actually don't have and want the 3 games, and they're expensive games, those bundles have been pretty good. From what I can see the problem is that in price, a Vega 64 is less than $50 from an RTX 2070... which should be preferable in every way I can think of at this point
|
# ? May 29, 2019 13:54 |
|
Statutory Ape posted:If you actually don't have and want the 3 games and they're expensive games, those bundles have been pretty good. Agreed, the 64 is overpriced most of the time, but a 270-300 dollar 56 runs the same as a 1070 Ti/1660 Ti.
|
# ? May 29, 2019 14:06 |
|
Arzachel posted:https://wccftech.com/amd-radeon-rx-5000-navi-gpu-7nm-asrock-two-variants-report/ Videocardz.com has the 5700 as a slightly larger die than Polaris, available in 180W (same as the 580) or 225W (same as the 590, Jesus F Christ) configurations. If accurate, that is what I was afraid of: overjuiced to hell for what, the sixth time in a row, to eke out a meagre lead over a reference 1660 Ti or 1070 Ti in the fps graphs. sauer kraut fucked around with this message at 14:52 on May 29, 2019
# ? May 29, 2019 14:49 |
|
K8.0 posted:I don't expect huge price drops for the 580/570, these new cards are clearly replacements for the Vega shitpiles and those are the cards that will drop hard (and STILL not be worth buying). If you really care about keeping noise down, grab a used 1060. It'll be a bit faster, use a ton less power and thus generate a lot less heat/noise, and will both last longer and resell better (largely for the same reason). My PNY 1060 was loud as gently caress, and it went up to 80c under load. Maybe lovely cooler design, but still. wargames posted:But the vega 56 i have isn't terrible and is as good as a 1070/ti, has fanless mode and was 300 bucks with 3 good games. also the driver stack has been amazing. Radeon chill is the best thing btw. Chill is the best feature by far. I had no idea it existed when I bought my Vega 56, and now I can't live without it.
|
# ? May 29, 2019 16:32 |
|
What's the chill thing do
|
# ? May 29, 2019 16:39 |
|
Murders your frame pacing in a vain attempt to compensate for AMD pushing their cards to the absolute limit and still failing to compete. It caps your framerate based on how active the driver judges the game to be.pofcorn posted:My PNY 1060 was loud as gently caress, and it went up to 80c under load. Maybe lovely cooler design, but still. Definitely a lovely cooler design and probably terrible case airflow, but even with a good cooler a 580 would have been worse because it uses almost twice as much power. There's no way to cheat an extra 100+ watts out of a case.
|
# ? May 29, 2019 16:58 |
|
K8.0 posted:Murders your frame pacing in a vain attempt to compensate for AMD pushing their cards to the absolute limit and still failing to compete. It caps your framerate based on how active the driver judges the game to be. It doesn't, because it's not a frame limiter like Nvidia's; it's an fps target. Statutory Ape posted:What's the chill thing do What Radeon Chill does is you set an fps you want the game to run at, and the card will either overclock (rarely) or underclock the hell out of itself until it meets that target. Do note I have a FreeSync monitor, so I never notice tearing or weird fps hang-ups. wargames fucked around with this message at 17:38 on May 29, 2019
# ? May 29, 2019 17:31 |
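For the thread's sake, a hedged sketch of the behavior being argued about above — an fps target that slides with input activity. This is loosely in the spirit of what's described, not AMD's actual Chill implementation; the min/max values and the linear blend are assumptions:

```python
# Hedged sketch of an activity-driven fps target, NOT AMD's real algorithm.
# Idea: interpolate the frame cap between a min and max fps based on how
# much mouse/keyboard activity the driver observes, so quiet scenes let
# the GPU clock down while busy ones run at full rate.
def target_fps(activity, fps_min=40.0, fps_max=90.0):
    """activity in [0, 1]: 0 = idle scene, 1 = heavy input."""
    activity = max(0.0, min(1.0, activity))  # clamp out-of-range samples
    return fps_min + (fps_max - fps_min) * activity

# Idle menu -> cap near fps_min; frantic firefight -> cap near fps_max.
print(target_fps(0.0), target_fps(1.0), target_fps(0.5))
```

With a variable-refresh monitor (as wargames notes), the sliding cap is mostly invisible; without one it's easier to perceive as pacing changes.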
|
Statutory Ape posted:What's the chill thing do It's a pretty effective framerate limiter if nothing else, and afaik it also tries to perform some magic like Nvidia's 'Optimal' power setting by not redrawing frames if nothing has changed from the previous one. Since I use vsync or borderless mode on everything, I haven't tried it very much to see if it works reliably, or messes up things that it shouldn't.
|
# ? May 29, 2019 17:38 |
|
sauer kraut posted:Videocardz.com () has the 5700 as a slightly larger die than Polaris, available in 180W (same as 580) or 225W (same as 590 Jesus F Christ) configurations. That's total board power which includes memory and type-c which I'm pretty sure Nvidia doesn't include in their TDP numbers.
|
# ? May 29, 2019 21:54 |
|
Arzachel posted:That's total board power which includes memory and type-c which I'm pretty sure Nvidia doesn't include in their TDP numbers. Yeah, AMD does indeed do "TBP", but both sides are using dumb marketing horseshit as best they can with their power calcs, so they are really only useful for comparing against parts in their own stack. AMD lists the higher-end Navi at the 590's TBP, so we can look at the 590 to compare: seems like it will be slightly more power hungry than the 2070 it's competing against. IF this is the high end Navi, I guess. And if the leaks are true. Who knows yet!
|
# ? May 29, 2019 23:56 |
|
Cygni posted:Yeah AMD does indeed do "TBP", but both sides are using dumb marketing horseshit as best they can with their power calcs, so they are really only useful for comparing against parts in their own stack. AMD lists the higher end Navi at the 590's TBP, so we can look at the 590 to compare: The TDP delta between the variants is 30W while the TBP delta is 45W, which means either more memory or a powered type-c connector that wouldn't pull extra power unless something is hooked up. Although the rx590 is rated for a TDP of 175W, so who the gently caress knows. Arzachel fucked around with this message at 07:26 on May 30, 2019
# ? May 30, 2019 07:00 |
|
Navi "RDNA" confirmed to be "GCN hybrid". I'm sure post-Navi will definitely be a "pure" RDNA though, right after AMD sold the console guys on another generation of GCN parts.
|
# ? May 30, 2019 19:01 |
|
It used to be that either ATI or Nvidia had a good GPU and the other one would be telling you about how good their next GPU would be. Now we're on like 7 years of AMD telling us "yeah, our current GPUs are poo poo, and our next GPU is going to be poo poo, but just wait until the one after that!"
|
# ? May 30, 2019 19:15 |
|
RTX confirmed for the new call of doody game https://twitter.com/NVIDIAGeForce/status/1134171617519497216?s=19
|
# ? May 30, 2019 20:01 |
|
That may be the most realistic night vision I've ever seen.
|
# ? May 30, 2019 20:43 |
|
I couldn't care less about COD, but their new engine does sound pretty cool; apparently they're rendering beyond the visible spectrum, so the IR and NV views are the real deal rather than cheap filters.
|
# ? May 30, 2019 20:53 |
|
repiv posted:I couldn't care less about COD but their new engine does sound pretty cool, apparently they're rendering beyond the visible spectrum so the IR and NV views are the real deal rather than cheap filters. I mean... CoD. But holy poo poo if that isn’t the coolest loving thing. Also, it’s a reboot of the best CoD, CoD4:Modern Warfare. So yeah, good news all around.
|
# ? May 31, 2019 00:54 |
|
On the other hand, a very clever and directed use of raytracing. You're not raytracing a global illumination source with bajillions of rays, just a few very specific IR sources, because anything beyond a certain intensity threshold can just be dummied out with an un-raytraced illumination source, and everything has extreme falloff.
|
# ? May 31, 2019 01:01 |
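A hedged sketch of that thresholding idea — trace only the dim, discrete IR emitters and hand the bright ones to a cheap analytic light. Purely illustrative; the threshold value and inverse-square model are assumptions, not anything from the engine:

```python
# Sketch of "directed" IR lighting: a handful of point emitters, extreme
# (inverse-square) falloff, and a brightness threshold above which we skip
# raytracing and substitute a cheap non-raytraced light. Illustrative only.
def ir_intensity_at(point, emitter_pos, emitter_power):
    """Inverse-square falloff from a point emitter."""
    d2 = sum((p - e) ** 2 for p, e in zip(point, emitter_pos))
    return emitter_power / max(d2, 1e-6)  # avoid divide-by-zero at emitter

def shade_ir(point, emitters, threshold=10.0):
    traced, cheap = 0.0, 0.0
    for pos, power in emitters:
        i = ir_intensity_at(point, pos, power)
        if i >= threshold:
            cheap += i    # bright: dummy it out with an analytic light
        else:
            traced += i   # dim: worth spending actual rays on
    return traced, cheap

emitters = [((0.0, 0.0, 0.0), 100.0), ((10.0, 0.0, 0.0), 50.0)]
print(shade_ir((1.0, 0.0, 0.0), emitters))
```

The nearby emitter blows past the threshold and gets the cheap path; only the distant, faint one would cost rays — which is why the ray budget stays tiny.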