Begall
Jul 28, 2008
https://youtu.be/xBDFCoGhZ4g

Turns out that no, modern games are not bottlenecked by 4C/8T 🤔

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
in the final minutes of Hardware Unboxed's i5-12400 review, Steve mentioned something about how he was working on a different video, so he paused testing for the Intel CPU, and then some shenanigans happened, and then he had to stop working on that video and come back to the i5, which is why there were some results that he had to truncate and tack on at the very end

I didn't think much of it, until this came across my feed

https://twitter.com/HardwareUnboxed/status/1480803296524718081

now, I know we're not too cool with MLID, and I will grant that the information he imparts in his video is just stuff you could get from videocardz if you were paying enough attention (AFAIK MLID's entire schtick is repackaging leaks from everyone else and passing it off as his own sources), but HWUB's reaction does seem to imply that he's right: NVidia is lifting pre-orders and the review embargo on the same day, AND they're holding out on driver support until the last possible minute, so that reviewers can't do a proper review of the 3080 12GB until it's "too late" for people who want a shot at buying it

EDIT:

Begall posted:

https://youtu.be/xBDFCoGhZ4g

Turns out that no, modern games are not bottlenecked by 4C/8T 🤔

wow that is a shockingly good result

gradenko_2000 fucked around with this message at 09:32 on Jan 11, 2022

Dr. Video Games 0031
Jul 17, 2004

MLID has had unique information about Arc that has turned out to be correct (such as the appearance of the reference GPUs), so he actually does have some legit sources at Intel. I'll believe that he has legit sources elsewhere too, but he's also leaked some total BS in the past, and he also misinterprets some of his own leaks and says some wild poo poo from time to time. So you have to take everything MLID says with a huge grain of salt. He's not a total rumor regurgitator/BS spewer, but he's still plain wrong a fair percentage of the time.

That said, nothing in that video is hard to believe.

Begall posted:

https://youtu.be/xBDFCoGhZ4g

Turns out that no, modern games are not bottlenecked by 4C/8T 🤔

I mean, it's clearly weaker than most of the six-core parts, so I don't know if this is the takeaway here. They also didn't test many (any?) CPU-bound games that are heavily multi-threaded, like Halo Infinite or BF2042. I still see this more as a general-purpose productivity CPU for cheap workstations, but it looks like it will also be decent as a budget gaming CPU. Though I feel like even budget gamers should pay the extra $50 for a 12400.

Dr. Video Games 0031 fucked around with this message at 11:04 on Jan 11, 2022

Begall
Jul 28, 2008

Dr. Video Games 0031 posted:

I mean, it's clearly weaker than most of the six-core parts, so I don't know if this is the takeaway here. They also didn't test many (any?) CPU-bound games that are heavily multi-threaded, like Halo Infinite or BF2042. I still see this more as a general-purpose productivity CPU for cheap workstations, but it looks like it will also be decent as a budget gaming CPU. Though I feel like even budget gamers should pay the extra $50 for a 12400.

It finishes above the 3600 in most/all benchmarks and is competitive with the 11400F, 3900x and 9900k in many benchmarks. And this is using a 3080Ti as the GPU (something I think we can agree is not exactly a budget gamer GPU), deliberately emphasising the differences between the CPUs. I think this absolutely demonstrates the flaw in thinking 6C+ for gaming is mandatory - I doubt on a right-sized GPU that you could find any gaming differences between it and a 12400.

Dr. Video Games 0031
Jul 17, 2004

Begall posted:

It finishes above the 3600 in most/all benchmarks and is competitive with the 11400F, 3900x and 9900k in many benchmarks. And this is using a 3080Ti as the GPU (something I think we can agree is not exactly a budget gamer GPU), deliberately emphasising the differences between the CPUs. I think this absolutely demonstrates the flaw in thinking 6C+ for gaming is mandatory - I doubt on a right-sized GPU that you could find any gaming differences between it and a 12400.

I mean, saying the bottleneck disappears with a weak enough GPU isn't the same as saying there's no bottleneck. And these are old parts you're comparing it against. Compared to its contemporaries, it's definitely weaker than the 6 core CPUs. The 12400 reviews that are out so far have it as very competitive if not a little better than the 5600X in gaming, so it has a clear advantage over the 12100. That's not entirely down to core count, but it's a part of the picture. Ultimately, core count isn't what makes a CPU good or bad—it's its overall power level, and core counts are simply one part of that equation. The 12100 is weaker than the 12400 because it has a lower clock speed and less cache, but also because it has fewer cores. It seems fairly plausible to me that it would be scoring better on some of these benchmarks if it had six.

Don't get me wrong, it's good that it's able to perform so well compared to higher end parts from just a few years ago, but that's also sort of what I expect from a new CPU architecture.

njsykora
Jan 23, 2012

Robots confuse squirrels.


I was definitely watching that video and thinking about using the 12100 in a media server PC. I do think in a build primarily meant for gaming you'd still want a 12400, though it depends on where the actual prices for these fall. In the UK at least the 12400 is only about £50-60 more in general than the 12100, and the 12400 is actually in stock most places.

Pilfered Pallbearers
Aug 2, 2007

gradenko_2000 posted:

in the final minutes of Hardware Unboxed's i5-12400 review, Steve mentioned something about how he was working on a different video, so he paused testing for the Intel CPU, and then some shenanigans happened, and then he had to stop working on that video and come back to the i5, which is why there were some results that he had to truncate and tack on at the very end

I didn't think much of it, until this came across my feed

https://twitter.com/HardwareUnboxed/status/1480803296524718081

now, I know we're not too cool with MLID, and I will grant that the information he imparts in his video is just stuff you could get from videocardz if you were paying enough attention (AFAIK MLID's entire schtick is repackaging leaks from everyone else and passing it off as his own sources), but HWUB's reaction does seem to imply that he's right: NVidia is lifting pre-orders and the review embargo on the same day, AND they're holding out on driver support until the last possible minute, so that reviewers can't do a proper review of the 3080 12GB until it's "too late" for people who want a shot at buying it

EDIT:

wow that is a shockingly good result

Nvidia was doing this exact poo poo last cycle and got called on it, so I can’t say I’m surprised.

Was it them that threatened to pull review samples from HWUB?

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
yeah, it was NVidia that threatened to pull review samples from HWUB

that said, the last company I remember doing the "embargo and pre-order" trap was Intel when they released their last round of HEDT chips, because they knew Threadripper was going to poo poo all over it

njsykora
Jan 23, 2012

Robots confuse squirrels.


There is now a comment, straight up saying the drivers weren't given to reviewers like normal in order to prevent anyone having a review out on day 1. Though he also does this weird shuffle, saying he doesn't believe reviewers "deserve" advance access after saying how important it is for reviews to be out before a product launch.

njsykora fucked around with this message at 15:59 on Jan 11, 2022

Klyith
Aug 3, 2007

GBS Pledge Week

Begall posted:

It finishes above the 3600 in most/all benchmarks and is competitive with the 11400F, 3900x and 9900k in many benchmarks. And this is using a 3080Ti as the GPU (something I think we can agree is not exactly a budget gamer GPU), deliberately emphasising the differences between the CPUs. I think this absolutely demonstrates the flaw in thinking 6C+ for gaming is mandatory - I doubt on a right-sized GPU that you could find any gaming differences between it and a 12400.

This is not exactly news: the Ryzen 3300X was 4C/8T and at the time was a really good budget gaming CPU that competed well with the 3600 & 3700 in most games. Now the 12100F is even better.

The idea that you probably want a 6C CPU for a gaming machine is not that it's mandatory now, but that games will continue to become more multithreaded over time. Currently a 4C/4T CPU is a huge limiter in a decent number of games. I haven't seen anything that kills a 6C/6T 8600K yet, but I'd give decent odds it'll happen this year. The Steam hardware survey puts 6 & 8 cores at 50% of the market, so for AAA games targeting the high end it's safe to say most of your customers have 6+.

But all this depends on how far in the future you care about keeping your CPU.



OTOH I think the real question is: will the 12100F be available in quantity? Last go-around the 11400F was also a really good budget gaming CPU. But it was pretty much a paper product when it came to actually buying them. Intel clearly only sells as many F CPUs as they have dies with failed GPUs, and that's not a high number. So IMO the Fs are pretty much a marketing gimmick, a thing Intel gives to reviewers and puts a spectacular price on to get amazing reviews. Looking at a 12300 at $140, it's not so amazing.

kliras
Mar 27, 2021
A neat new update:

https://twitter.com/VideoCardz/status/1480921263107461130

Basically an AI-enhanced DSR, plus some new Freestyle filters developed with ReShade that add SSAO, GI, and dynamic DOF. So only fiddlers will know about it and apply it on a per-game basis.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

kliras posted:

A neat new update:

if I understand this correctly:

regular DSR is when you have a 1080p monitor, but you render the image at 4k, and shrink it down to 1080p, and that improves image quality

this new DLDSR is when you have a 1080p monitor, and you render the image at (in their example) 1620p, then there's an AI algorithm that further improves image quality, then shrinks it down to 1080p
because the render resolution is 1620p, then the performance hit is smaller than a render resolution of 4k
but because of the special sauce AI layer, the image quality is still comparable to regular DSR with 4k
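
Quick sanity check on the pixel counts involved (my own arithmetic, using 2880x1620 for the 2.25x mode as in Nvidia's example):

code:
# pixel counts relative to native 1080p
native    = 1920 * 1080   # 2,073,600 px
dsr_4x    = 3840 * 2160   # 8,294,400 px, regular DSR 4x
dldsr_225 = 2880 * 1620   # 4,665,600 px, DLDSR 2.25x

print(dsr_4x / native)      # 4.0   -> four times the shading work of native
print(dldsr_225 / native)   # 2.25  -> just over half the work of DSR 4x
print(dldsr_225 / dsr_4x)   # 0.5625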

kliras
Mar 27, 2021
yep, that's what they're claiming

wonder if anyone will bother benchmarking to test the theory. most likely they'll do it as one of those "MAKE YOUR GAMES LOOK AMAZING WITH THIS ONE WEIRD TRICK" videos

EngineerJoe
Aug 8, 2004
-=whore=-



Will DLDSR work in Overwatch? I started on a 4k screen and then downgraded to a 1440p screen (but 165hz) and I miss the perfect sharpness of 4k... too many jaggies at 1440p.

kliras
Mar 27, 2021
I don't see why not, as long as you have a 20-series or later.

I tried downloading the latest driver today, but something weird's going on, maybe wait a day to see if something gets pulled and reuploaded.

Pilfered Pallbearers
Aug 2, 2007

kliras posted:

yep, that's what they're claiming

wonder if anyone will bother benchmarking to test the theory. most likely they'll do it as one of those "MAKE YOUR GAMES LOOK AMAZING WITH THIS ONE WEIRD TRICK" videos

If it’s a reasonable upgrade, GN probably will. They did plenty of testing for DLSS and AMD's version or whatever.

repiv
Aug 13, 2009

gamersnexus rarely delves into image quality analysis, digitalfoundry are the main guys for that

i'm not sure what AI really brings to the table for a downscaler (it's not like they need to infer any missing information) but we'll see how it goes

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
When a game does MSAA, what happens, anyway? Does the GPU do the oversampled rendering of the scene and send the downsampled result to the game? Or does the game render at higher resolution into an offscreen buffer and do the resampling on its own? Just asking, because overriding MSAA or similar tomfoolery sounds like an idea (i.e. applying DL-assisted downsampling to the MSAA result).

repiv
Aug 13, 2009

MSAA can be summarised as "supersampling but only on triangle edges": the amount of supersampling varies pixel by pixel, with pixels wholly inside a triangle not supersampled at all

Almost no modern engines support MSAA anymore (except for VR stuff) so hooking into it wouldn't work for a generic solution like DLDSR

Nvidia does do some driver-level MSAA shenanigans to implement VRSS, but that requires the game to support MSAA in the first place (hence the focus on VR)

https://developer.nvidia.com/blog/nvidia-vrss-a-zero-effort-way-to-improve-your-vr-image-quality/
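
A toy sketch of the supersampling-only-on-edges idea above, purely illustrative and nothing like a real GPU resolve: every pixel stores N colour samples, but the pixel shader only ran once for pixels fully inside a triangle, so their samples are identical and only edge pixels actually get blended.

code:
def resolve_msaa(sample_buffer):
    # sample_buffer: one entry per pixel, each a list of N (r, g, b) samples
    resolved = []
    for samples in sample_buffer:
        n = len(samples)
        resolved.append(tuple(sum(s[i] for s in samples) / n for i in range(3)))
    return resolved

interior = [(0.2, 0.4, 0.6)] * 4                           # shaded once, sample replicated 4x
edge     = [(0.2, 0.4, 0.6)] * 2 + [(1.0, 1.0, 1.0)] * 2   # two triangles meet in this pixel
print(resolve_msaa([interior, edge]))
# roughly [(0.2, 0.4, 0.6), (0.6, 0.7, 0.8)]: only the edge pixel gets any blending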

repiv fucked around with this message at 19:31 on Jan 11, 2022

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

gamersnexus rarely delves into image quality analysis, digitalfoundry are the main guys for that

i'm not sure what AI really brings to the table for a downscaler (it's not like they need to infer any missing information) but we'll see how it goes

Choosing which pixels to toss and which to keep seems important for having the best image quality.

This is going to come with a heavy performance penalty in any game that isn't CPU bottlenecked. Nvidia is talking like you're saving so much FPS over regular DSR, and I guess you are, but you can still probably expect to see your FPS drop in half in games that aren't CPU limited, right?

repiv
Aug 13, 2009

Why toss away any pixels though? It's all good data

The only thing I can think of is maybe they're using AI to drive the strength of the filter kernel at each pixel, to balance out the aliasing/blurriness tradeoff when doing non-integer downscales. OG DSR just has one big DSR Smoothness knob to dial that in across the whole image.
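
Rough sketch of what that single knob amounts to (toy NumPy code, definitely not Nvidia's actual filter; the "smoothness" parameter here just stands in for the DSR Smoothness slider, and a per-pixel AI-chosen value would replace that one constant):

code:
import numpy as np

def downscale_1d(row, out_len, smoothness):
    # resample a 1D scanline to out_len samples with a Gaussian-weighted kernel;
    # wider kernel = blurrier, narrower = more aliasing on non-integer scales
    scale = len(row) / out_len
    sigma = max(smoothness * scale, 1e-3)
    src = np.arange(len(row))
    out = np.empty(out_len)
    for i in range(out_len):
        center = (i + 0.5) * scale - 0.5            # where output sample i lands in the source
        w = np.exp(-((src - center) ** 2) / (2 * sigma ** 2))
        out[i] = np.dot(w / w.sum(), row)
    return out

row = np.linspace(0.0, 1.0, 2880)                   # pretend 1620p-wide scanline
print(downscale_1d(row, 1920, smoothness=0.5)[:4])  # 1.5x per-axis (i.e. 2.25x) downscale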

Dr. Video Games 0031
Jul 17, 2004

repiv posted:

Why toss away any pixels though? It's all good data

To make the final image? The goal is to downscale a high-res image into a low-res one. That requires throwing away pixels.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
If you're throwing away pixels, there's no benefit to rendering at a higher resolution to begin with. I think repiv is right: they're probably trying to use AI to reduce the inherent jankiness of scaling down at a low, non-integer sample rate. I have no particular expectation of how well it will work.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

To make the final image? The goal is to downscale a high-res image into a low-res one. That requires throwing away pixels.

When you supersample you're not throwing any pixels away, you're combining multiple source pixels into one output pixel, all the source pixels get used one way or another

That's straightforward for integer scales (just average 2x2 pixels into 1 pixel) but for non-integer scales the weighting needs to be tuned (which is what DSR Smoothness does, and maybe what this AI model does)

Fixing the non-integer jank would be a welcome improvement, since integer scale is mostly impractical on 1440p monitors. 5K is tough to render even in older games; at some point memory bandwidth just kills you.
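
For reference, the integer case really is just this (toy NumPy sketch): every source pixel contributes with equal weight and nothing gets discarded. The non-integer scales are where the weighting choices above come in.

code:
import numpy as np

def box_downscale_2x(img):
    # integer-factor downscale: average each 2x2 block into one output pixel
    h, w = img.shape[:2]
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

frame = np.random.rand(2160, 3840, 3)     # e.g. a DSR 4x render for a 1080p display
print(box_downscale_2x(frame).shape)      # (1080, 1920, 3)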

repiv fucked around with this message at 20:06 on Jan 11, 2022

Dr. Video Games 0031
Jul 17, 2004

I must be phrasing this in a bad way because nothing you guys are saying is contradicting what I'm trying to say. What you are describing is a fancy way of throwing away pixels. That is what I meant.

EngineerJoe
Aug 8, 2004
-=whore=-



Dr. Video Games 0031 posted:

This is going to come with a heavy performance penalty in any game that isn't CPU bottlenecked. Nvidia is talking like you're saving so much FPS over regular DSR, and I guess you are, but you can still probably expect to see your FPS drop in half in games that aren't CPU limited, right?

It should be a minor performance hit. If you're running at 1440p and use DLDSR it would be like running DLSS at 4k. DLSS at 4k high quality mode renders at 1440p anyways which is how DLSS provides such a massive performance boost.

repiv
Aug 13, 2009

Dr. Video Games 0031 posted:

I must be phrasing this in a bad way because nothing you guys are saying is contradicting what I'm trying to say. What you are describing is a fancy way of throwing away pixels. That is what I meant.

I get what you mean, throwing away pixels has a more literal meaning in scalers though (temporal methods do completely discard samples they deem to be invalid) so IMO it's clearer to call this weighting instead

Dr. Video Games 0031
Jul 17, 2004

EngineerJoe posted:

It should be a minor performance hit. If you're running at 1440p and use DLDSR it would be like running DLSS at 4k. DLSS at 4k high quality mode renders at 1440p anyways which is how DLSS provides such a massive performance boost.

I don't think that is how this works at all. My understanding is that, say you are at 1080p, you render the screen at 1620p instead and then do AI-assisted downsampling from there. They are not doing DLSS up to 1620p first; that's a raw native rendering. Rendering at 1620p is a lot more expensive than 1080p, and you're going to incur a huge frame rate penalty from that unless you have a GPU that's overpowered for your monitor resolution/refresh rate. The example image Nvidia showed is deceptive because it was done using a CPU-bottlenecked game.

Dr. Video Games 0031 fucked around with this message at 20:34 on Jan 11, 2022

repiv
Aug 13, 2009

Yeah it's going to be fairly expensive. They're pitching it as an alternative to monstrously expensive 4x supersampling, but it's still 1.78x or 2.25x the pixel count plus whatever the AI model costs to run.

The internal resolutions for 1080p are listed as 1440p (1.78x mode) or 1620p (2.25x mode). I guess for 1440p it will be 1920p or 2160p/4K internal.
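
Those factors read as pixel-count multipliers, so each axis scales by their square root, i.e. 4/3 and 3/2. That's my reading of the numbers, not anything Nvidia has documented, but it reproduces both the listed 1080p modes and the guessed 1440p ones:

code:
from fractions import Fraction

def dldsr_internal(width, height, axis_scale):
    # 1.78x ~ (4/3)^2 and 2.25x = (3/2)^2 as pixel-count factors
    return int(width * axis_scale), int(height * axis_scale)

for w, h in [(1920, 1080), (2560, 1440)]:
    print((w, h), dldsr_internal(w, h, Fraction(4, 3)), dldsr_internal(w, h, Fraction(3, 2)))
# (1920, 1080) (2560, 1440) (2880, 1620)   <- matches the listed 1080p modes
# (2560, 1440) (3413, 1920) (3840, 2160)   <- so 1920p / 4K internal for 1440p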

repiv fucked around with this message at 20:30 on Jan 11, 2022

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

EngineerJoe posted:

It should be a minor performance hit. If you're running at 1440p and use DLDSR it would be like running DLSS at 4k. DLSS at 4k high quality mode renders at 1440p anyways which is how DLSS provides such a massive performance boost.

Based on what Nvidia has said, it's really not like DLSS. It can't really be temporal, because they don't have motion vectors or subpixel jitter for the inputs. It's just a way of trying to make better decisions about how to do difficult downscaling when you don't have a lot of pixels to blend.

EngineerJoe
Aug 8, 2004
-=whore=-



Ah... what's the point. Well, I'll still try it with Overwatch. I have enough headroom that I should still hover around my monitor refresh rate.

Dr. Video Games 0031
Jul 17, 2004

It does seem like it's best used for games where you're already bottlenecked in one way or another. That way you can put your GPU to work improving the image quality slightly.

Otherwise you're probably better off with a higher frame rate (and imo, frame rate is quality, so trading it away for what may be just a slight increase in image sharpness is counterproductive)

Dr. Video Games 0031 fucked around with this message at 21:10 on Jan 11, 2022

repiv
Aug 13, 2009

Prey is a pretty good example, really; that's one of those awkward transitional games that has lots of detail but a poor TAA implementation, so it's a shimmer-fest

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

K8.0 posted:

Based on what Nvidia has said, it's really not like DLSS. It can't really be temporal, because they don't have motion vectors or subpixel jitter for the inputs. It's just a way of trying to make better decisions about how to do difficult downscaling when you don't have a lot of pixels to blend.

Use the video encoder unit to generate motion vectors. :marc:

--edit
So anyhow, I wonder if DLSS+DLDSR would give better results than DLAA.

Combat Pretzel fucked around with this message at 00:31 on Jan 12, 2022

kliras
Mar 27, 2021
AV1 encoding is getting wider adoption on YouTube; this HWUB video has AV1 across all resolutions:

https://www.youtube.com/watch?v=JTkIeBhVOck

code:
mp4   3840x2160 60 | 1.54GiB 11056k | av01.0.13M.08
webm  3840x2160 60 | 1.74GiB 12438k | vp9
YouTube is generally taking the road of "same quality, lower bitrate" rather than "same bitrate, higher quality" with AV1, which I assume is in large part due to the algo dorks who believe it will result in higher watchtimes, engagement, etc.
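
For scale, the 4K listing above works out to roughly an 11% bitrate cut for AV1 versus VP9 (quick arithmetic on those numbers):

code:
av1_kbps, vp9_kbps = 11056, 12438   # from the format listing above
print(f"{100 * (1 - av1_kbps / vp9_kbps):.1f}% lower bitrate for AV1")   # ~11.1%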

kliras fucked around with this message at 15:22 on Jan 18, 2022

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I doubt ISPs have stopped their "YouTube/Netflix needs to pay for the bandwidth they use on our networks" spiel. It's probably partly related to that. And IIRC back when Covid started, the EU mandated lower video bitrate to make room on the interwebs for home office data traffic. Not sure if that's still in effect.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
AV1 getting wider adoption is going to hurt the RX 6500 XT even more

Fame Douglas
Nov 20, 2013

by Fluffdaddy

Combat Pretzel posted:

And IIRC back when Covid started, the EU mandated lower video bitrate to make room on the interwebs for home office data traffic. Not sure if that's still in effect.

No, that was a dumb and pointless stunt by Thierry Breton to get himself in the news as a man of action and only lasted for a few months.

repiv
Aug 13, 2009

gradenko_2000 posted:

AV1 getting wider adoption is going to hurt the RX 6500 XT even more

The lack of H264/HEVC encode also means it won't work with the Oculus Quest, which will probably catch some people out

Are there any other modern GPUs that can't even do H264?
