Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!
I'm not so sure how good Windows' NUMA scheduler is, at least for Threadripper-style NUMA where the latencies between nodes are very low. Turning on NUMA in my BIOS tanked CPU performance on Windows (according to Geekbench, anyway), but I never did reinstall, and I know Windows can be weird about CPUs "changing" out from underneath it. They only mention Windows in that article so this might just be a lower level bandaid on a poor NUMA balancing implementation in Windows.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Someone out there appears to have access to Rome:



source: https://www.notebookcheck.net/Chiphell-leaks-another-apparent-AMD-Rome-Cinebench-score-this-time-with-some-proof.337356.0.html

Allegedly, whatever AMD chip(s) is/are running Cinebench there clocks in at a score of 12587.

For reference, the current world-record Cinebench run scored 10038 on 4x Xeon Platinum 8160s: http://hwbot.org/submission/3630844_rauwomos_cinebench___r15_4x_xeon_platinum_8160_10038_cb

The usual Chiphell disclaimers apply.

SwissArmyDruid fucked around with this message at 12:12 on Oct 8, 2018
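For scale, the two scores in that post work out to roughly a 25% lead for the leaked chip. A quick check, using only the numbers quoted above:

```python
# Scores quoted in the post: leaked Rome run vs. the standing record.
rome_score = 12587
record_score = 10038  # 4x Xeon Platinum 8160

lead = rome_score / record_score - 1
print(f"{lead:.1%}")  # prints "25.4%"
```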

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
If it's a fake, it's a very good fake though, also 12587 isn't really insane for 128C/256T.

Arzachel
May 12, 2012

EmpyreanFlux posted:

If it's a fake, it's a very good fake though, also 12587 isn't really insane for 128C/256T.

Dunno, 30% faster clock for clock over a theoretical 128c 7601 seems pretty good!

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Arzachel posted:

Dunno, 30% faster clock for clock over a theoretical 128c 7601 seems pretty good!

I think that is assuming too much about boost clocks imho. Based on the current record holder, it seems to me to be closer to 10-15% improvement, which I think is not only reasonable but I think AMD has insinuated as much? I could have fever dreamed that though.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
8 core ccx sounds pretty good.

E: either that or 4x4c ccx per die

Risky Bisquick fucked around with this message at 15:24 on Oct 8, 2018

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Speculation is actually that it's a dual 64C/128T.

--edit: Whoops, link says it.

Arzachel
May 12, 2012

EmpyreanFlux posted:

I think that is assuming too much about boost clocks imho. Based on the current record holder, it seems to me to be closer to 10-15% improvement, which I think is not only reasonable but I think AMD has insinuated as much? I could have fever dreamed that though.

My napkin math was that a 1700@4GHz scores 1750-ish, which works out to 12600 for 128 cores@1.8GHz with perfect scaling.

Who the hell still has a GT210 on hand anyways?
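Spelling out that napkin math, using only the figures from the post (an R7 1700, 8 cores at 4 GHz, scoring ~1750):

```python
# Napkin math from the post: R7 1700 (8 cores) at 4 GHz scores ~1750
# in Cinebench R15; scale linearly to 128 cores at 1.8 GHz.
r7_score, r7_cores, r7_ghz = 1750, 8, 4.0
points_per_core_ghz = r7_score / (r7_cores * r7_ghz)  # ~54.7

projected = points_per_core_ghz * 128 * 1.8  # perfect scaling assumed
print(round(projected))  # prints 12600, matching the post
```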

Anime Schoolgirl
Nov 28, 2002

Risky Bisquick posted:

8 core ccx sounds pretty good.

E: either that or 4x4c ccx per die
Four 4-core CCXes on a die sounds like a clusterfuck that's more trouble than it's worth, honestly. The internal tracing would be better with two larger CCXes, and there's better granularity in which cores you can disable for yields.

Plus an 8 core CCX would finally bring their APU line up to "very serious consideration" territory.

Anime Schoolgirl fucked around with this message at 17:01 on Oct 8, 2018

NewFatMike
Jun 11, 2015

I'm running an animation render overnight. Looks like for a 6 second loop at 1440p, it'll take about 7.5 hours on my R7 1700, which probably isn't too special.

I'm in love with the fact that my desktop is completely silent and maintaining a consistent 64°C at 100% CPU usage. I'll need to save a screenshot of the total time so I can compare it with the next upgrade.

If I keep doing stuff like this, it could be the excuse I need to upgrade to Threadripper 5 or whatever instead of Zen 4.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Anime Schoolgirl posted:

Four 4-core CCXes on a die sounds like a clusterfuck that's more trouble than it's worth, honestly. The internal tracing would be better with two larger CCXes, and there's better granularity in which cores you can disable for yields.

Plus an 8 core CCX would finally bring their APU line up to "very serious consideration" territory.

Yeah, I'm all for this. An 8-core CCX obviates the need for any NUMA-aware programming on the desktop and lets AMD field a single die for up to eight cores to better go toe-to-toe with their Intel counterparts. Realistically, 8c/16t is about the right size for desktop computers to grow into and get clocks/IPC up before increasing core counts again.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

NewFatMike posted:

If I keep doing stuff like this, it could be the excuse I need to upgrade to Threadripper 5 or whatever instead of Zen 4.

There is nothing quite like the feeling of maxing out a high-end rig, in whatever area. It is unironically super-cool to be running an encode at the same time as you game, or dumping files to your NAS from your main rig, another PC, and a spare HDD at full speed each, and it doesn't even bother your rig.

dumb toy chat: I got a J5005 NUC and I'm seeing how QuickSync encode quality compares to x264 on the new Gemini Lake chips. Probably badly but given how much it's making GBS threads on the CPU itself maybe the "best" setting is doing well? I can run two "best" quality encodes at 1080p on the NUC at 45fps each whereas my CPU (E5-1650/3930K) is doing 6fps at veryslow preset (I'm running a batch of CRF x264 runs to figure out where the quality matches).

I have Pascal for comparison on NVENC, if someone with Turing or Raven Ridge wants to try maybe PM me and we can figure out what to compare on.

Paul MaudDib fucked around with this message at 07:34 on Oct 9, 2018

NewFatMike
Jun 11, 2015

Paul MaudDib posted:

There is nothing quite like the feeling of maxing out a high-end rig, in whatever area. It is unironically super-cool to be running an encode at the same time as you game, or dumping files to your NAS from your main rig, another PC, and a spare HDD at full speed each, and it doesn't even bother your rig.

dumb toy chat: I got a J5005 NUC and I'm seeing how encode quality compares to x264 on the new Gemini Lake chips. Probably badly but given how much it's making GBS threads on the CPU itself maybe the "best" setting is doing well? I can run two "best" quality encodes at 1080p on the NUC at 45fps each whereas my CPU (E5-1650/3930K) is doing 6fps at veryslow preset (I'm running a batch of CRF x264 runs to figure out where the quality matches).

I have Pascal for comparison on NVENC, if someone with Turing or Raven Ridge wants to try maybe PM me and we can figure out what to compare on.

I may be getting the Sapphire Ryzen embedded 1807B for some prototyping at work in the next few weeks, so I'm definitely down to clown on checking out different use cases. Ripping videos is definitely something this will be doing.

Klyith
Aug 3, 2007

GBS Pledge Week

Desuwa posted:

They only mention Windows in that article so this might just be a lower level bandaid on a poor NUMA balancing implementation in Windows.

It's absolutely a bandaid, but I wouldn't call it a low-level one at all. It's like manually setting CPU affinity in task manager, but automatic by an app. That works well enough for people using threadrippers for gaming or video editing or other single-user stuff, but would totally not fly in more complicated environments.
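For the curious, "manually setting CPU affinity, but automated by an app" boils down to a single OS call. A minimal sketch, assuming Linux's `os.sched_setaffinity` (the rough Windows analogue is `SetProcessAffinityMask`, which is what Task Manager drives):

```python
import os

def pin_to_cpus(pid, cpus):
    """Restrict a process (pid 0 = this one) to the given logical CPUs.

    Linux-only; illustrative stand-in for what an affinity tool does.
    """
    os.sched_setaffinity(pid, cpus)

# Illustration: pin this process to the first half of the CPUs it is
# allowed to run on, standing in for "the cores of one NUMA node".
allowed = sorted(os.sched_getaffinity(0))
one_node = set(allowed[: max(1, len(allowed) // 2)])
pin_to_cpus(0, one_node)
print(sorted(os.sched_getaffinity(0)))  # now only the pinned CPUs
```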

Desuwa
Jun 2, 2011

I'm telling my mommy. That pubbie doesn't do video games right!

Klyith posted:

It's absolutely a bandaid, but I wouldn't call it a low-level one at all. It's like manually setting CPU affinity in task manager, but automatic by an app. That works well enough for people using threadrippers for gaming or video editing or other single-user stuff, but would totally not fly in more complicated environments.

Huh, I assumed it was lower level than that.

Whale Cancer
Jun 25, 2004

I'm planning a 2600x / 1070ti build. Is Samsung B-die RAM worth the extra $50, or would I be better off putting the $50 toward getting a 1080 or 1080ti?

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Whale Cancer posted:

I'm planning a 2600x / 1070ti build. Is Samsung B-die RAM worth the extra $50, or would I be better off putting the $50 toward getting a 1080 or 1080ti?

IIRC b-die was only a must-have for first gen Ryzen. My Team dark 3000 runs at 2933 just fine (which is all it ran on Intel too).

NewFatMike
Jun 11, 2015

Check the QVL for motherboard you're interested in/have the features you want, you may be able to find a solid deal

I think in any case, the 2000 series supports 2933MHz by default and it's kinda diminishing returns beyond that

If I were in your shoes, I'd put the money towards a 1080Ti and 2933MHz RAM. The extra GPU grunt will do you better for games and it sounds like this is primarily a gaming machine.

Whale Cancer
Jun 25, 2004

NewFatMike posted:

Check the QVL for motherboard you're interested in/have the features you want, you may be able to find a solid deal

I think in any case, the 2000 series supports 2933MHz by default and it's kinda diminishing returns beyond that

If I were in your shoes, I'd put the money towards a 1080Ti and 2933MHz RAM. The extra GPU grunt will do you better for games and it sounds like this is primarily a gaming machine.

Yes, it's primarily a gaming machine. I was just curious because corsair LPX 3200 is on the msi b450 tomahawks QVL list as able to run at 3200mhz, but nobody can seem to get it to that point. Ram for reference.
https://www.newegg.com/Product/Product.aspx?Item=N82E16820236214&cm_re=CMK16GX4M2Z3200C16-_-20-236-214-_-Product

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Whale Cancer posted:

Yes, it's primarily a gaming machine. I was just curious because corsair LPX 3200 is on the msi b450 tomahawks QVL list as able to run at 3200mhz, but nobody can seem to get it to that point. Ram for reference.
https://www.newegg.com/Product/Product.aspx?Item=N82E16820236214&cm_re=CMK16GX4M2Z3200C16-_-20-236-214-_-Product

I've been running the following RAM with 0 problems since day 1.

https://www.newegg.ca/Product/Product.aspx?Item=N82E16820232530

I have a 1700, a 2700x and a 1950x Threadripper all running that memory with 0 problems. Set XMP profile, save, reboot, done.

Munkeymon
Aug 14, 2003

Motherfucker's got an
armor-piercing crowbar! Rigoddamndicu𝜆ous.



I was gonna chime in with the RAM I bought last year but it's $120 more expensive this year :stare:

Klyith
Aug 3, 2007

GBS Pledge Week

ItBreathes posted:

IIRC b-die was only a must-have for first gen Ryzen. My Team dark 3000 runs at 2933 just fine (which is all it ran on Intel too).

Possibly just first-gen BIOSes, even. I have some Hynix-chip G.Skill 3000 that runs at 3000 + CAS 15 + the other settings it claims on the box. When I first put the system together it only did 2933. Results may be different with the super-premium stuff, but most tests I've seen say chasing a higher frequency at the expense of timings doesn't really get you much. As long as you can do 2900-3000 or higher to avoid the infinity fabric slowdown it's pretty much the same.


Whale Cancer posted:

$50 toward getting a 1080 or 1080ti.
If I were you I'd buy the cheap ram and a 1080, because the current pile-up of 1070, 1070ti, and 1080 all being within $75 of each other is insane. You can play around with speed & timings and see what you can get -- even if it doesn't do 3200-16 it might do 3000-15.
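The "3200-16 vs. 3000-15" wash is easy to see as first-word latency (CAS cycles divided by the memory clock); a quick check:

```python
# First-word latency = CAS cycles * clock period. DDR does two
# transfers per clock, so DDR4-3200 runs a 1600 MHz memory clock.
def cas_latency_ns(transfer_rate_mt_s, cas):
    clock_mhz = transfer_rate_mt_s / 2
    return cas / clock_mhz * 1000  # cycles / MHz * 1000 -> nanoseconds

print(round(cas_latency_ns(3200, 16), 3))  # 10.0 ns
print(round(cas_latency_ns(3000, 15), 3))  # 10.0 ns -- a wash
```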

SwissArmyDruid
Feb 14, 2014

by sebmojo
Computer Jesus rips Intel a new one: https://www.youtube.com/watch?v=D1mJMI_uaa8

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Klyith posted:

If I were you I'd buy the cheap ram and a 1080, because the current pile-up of 1070, 1070ti, and 1080 all being within $75 of each other is insane. You can play around with speed & timings and see what you can get -- even if it doesn't do 3200-16 it might do 3000-15.

The 1080 has aged, and will continue to age, better than the last few generations of x80 non-Ti chips. Get one.

GRINDCORE MEGGIDO
Feb 28, 1985


Stanley Pain posted:

I've been running the following RAM with 0 problems since day 1.

https://www.newegg.ca/Product/Product.aspx?Item=N82E16820232530

I have a 1700, a 2700x and a 1950x Threadripper all running that memory with 0 problems. Set XMP profile, save, reboot, done.

I'm ordering this set at the weekend. Can you post a memory bandwidth score for the 1950x please?

Stanley Pain
Jun 16, 2001

by Fluffdaddy

SPOOKCORE MEGGIDO posted:

I'm ordering this set at the weekend. Can you post a memory bandwidth score for the 1950x please?

I can't right now. That system died a couple of days ago and I'm in the process of RMAing the board. That's basically the reason I have a 2700x in the first place.

GRINDCORE MEGGIDO
Feb 28, 1985


Stanley Pain posted:

I can't right now. That system died a couple of days ago and I'm in the process of RMAing the board. That's basically the reason I have a 2700x in the first place.

Oh no!

I'm going to run it with a 2700x. I hope this board isn't total poo poo.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

SPOOKCORE MEGGIDO posted:

Oh no!

I'm going to run it with a 2700x. I hope this board isn't total poo poo.

Which board did you get?

TorakFade
Oct 3, 2006

I strongly disapprove


Hi guys, I recently got a 2600x and I have a few questions (coming from an FX 6300, it feels like a sports car would feel for a man coming from the middle ages)


In the BIOS I only activated XMP profile for my RAM and made sure "core performance boost" was on auto, if I understand correctly it should be XFR2; my system is boosting seemingly all the time to 4.1-4.2 Ghz / 1.40-1.45V on all cores even at idle on Windows 10 desktop, is it normal? (this is a fresh install with only Steam, games, and basic stuff like Chrome, VLC, drivers, and utilities for my RGB led stuff :pcgaming: )

Considering the above, are idle temperatures of ~35-45 C with very brief spikes up to 50-55C and load temperatures of 65-75C (real) / 80-90C (torture test) normal? I have a Meshify C with 2x 140mm intakes, 2x 120mm exhausts, CPU cooler Scythe Mugen 5 with an LL120 fan instead of the standard Kaze Flex - the LL120 has a little less CFM, but almost double the static pressure, and I still have to properly configure fan curves so I could probably shave some degrees just by running them at higher speeds.

Here's a screen with HWinfo sensors, Ryzen Master and task manager to maybe spot any possible issue? It feels weird that it's boosting all the time, and temps feel a little high, but that might be me not being used to modern CPUs at all



Thanks in advance :)

Edit: to answer my own question, at least partially, I reviewed my fan curves, disabled AMD Ryzen balanced power saving in favor of the "standard" Balanced, and now CPU is idling at a more comfortable 2.2Ghz, though temps did not change much: 30-40C in idle, 60-70C temps at load, 75-85C in torture test

Edit 2: in testing more with AIDA64 stress test, I notice that whatever the setting, it'll pull only 1.35V to get an all-core frequency of 3.9/4.0 Ghz at full load. Which means that at idle (with AMD Ryzen balanced setting), it pulls more voltage (1.40-1.45) and overclocks faster (4.2Ghz all core) than at full load? I really don't understand, help guys :confused:

TorakFade fucked around with this message at 10:24 on Oct 10, 2018

GRINDCORE MEGGIDO
Feb 28, 1985


Stanley Pain posted:

Which board did you get?

Asrock x470 itx.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE

TorakFade posted:

Edit 2: in testing more with AIDA64 stress test, I notice that whatever the setting, it'll pull only 1.35V to get an all-core frequency of 3.9/4.0 Ghz at full load. Which means that at idle (with AMD Ryzen balanced setting), it pulls more voltage (1.40-1.45) and overclocks faster (4.2Ghz all core) than at full load? I really don't understand, help guys :confused:
This is perfectly normal. When you're at low load and the CPU is clocking up to handle some minor background task, it has plenty of power and temperature budget to work with because it's not really doing much despite the high clock speed, and so it can clock higher. When you're putting a heavy load on all cores, it can't clock as high because there's no headroom in those power/temp budgets.
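A toy model of that budget logic (the constants are invented for illustration; this is nothing like AMD's actual boost algorithm) shows why one busy core can clock higher than eight:

```python
# Toy boost governor: a fixed package power budget shared by however
# many cores are active. All constants are made up for illustration.
PACKAGE_POWER_W = 105.0
WATTS_PER_CORE_PER_GHZ = 3.5
CLOCK_CEILING_GHZ = 4.2

def max_clock_ghz(active_cores):
    per_core_budget_w = PACKAGE_POWER_W / active_cores
    return min(CLOCK_CEILING_GHZ, per_core_budget_w / WATTS_PER_CORE_PER_GHZ)

print(max_clock_ghz(1))  # 4.2  -- light load: hits the clock ceiling
print(max_clock_ghz(8))  # 3.75 -- all-core: power budget caps the clock
```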

sauer kraut
Oct 2, 2004
Try the Windows 10 balanced setting, the AMD hack is no longer necessary. Make sure min cpu is not set to 100% by accident :shobon:
Idling at 4+ GHz is deffo not normal unless you're one of the crazy people who clamp it to 100% all the time.
I have no idea what "core performance boost" is though, maybe turn it off? You should run a 2_00X bone stock +XFR and just fix the memory.

sauer kraut fucked around with this message at 11:32 on Oct 10, 2018

TorakFade
Oct 3, 2006

I strongly disapprove


TheFluff posted:

This is perfectly normal. When you're at low load and the CPU is clocking up to handle some minor background task, it has plenty of power and temperature budget to work with because it's not really doing much despite the high clock speed, and so it can clock higher. When you're putting a heavy load on all cores, it can't clock as high because there's no headroom in those power/temp budgets.

Makes perfect sense, thanks, but why is the CPU clocking up on all cores, all the time? I understand if it would "spike" and then go back to normal, but I've been sitting in front of the idling PC for 20 minutes straight, and the clock never went lower than 4.0Ghz @1.4V ...

sauer kraut posted:

Try the Windows 10 balanced setting, the AMD hack is no longer necessary. Make sure min cpu is not set to 100% by accident :shobon:
Idling at 4+ GHz is deffo not normal unless you're one of the crazy people who clamp it to 100% all the time.
I have no idea what "core performance boost" is though, maybe turn it off? You should run a 2_00X bone stock +XFR and just fix the memory.

windows power saving options are :
full performance (100% CPU all the time, so 4.2Ghz @1.4V)
AMD balanced (90-100% CPU so in theory should idle a little lower, but it still stays all the time at 4.2Ghz - 1.4V ... my guess is that XFR2 detects decent thermal headroom and goes YEAHHH LET'S TEAR poo poo UP)
Balanced (5%-100% CPU, with apparently normal CPU behavior: 2.2Ghz @0.78V when idle, 4.0-4.1Ghz @1.38-1.40V when under load)

I really can't tell why AMD "balanced" is basically the same as full performance mode, and AMD doesn't recommend using standard "balanced" mode. I see something about "core parking" but apparently that's already included in the standard Balanced plan with the latest Windows10 updates... I don't want my CPU to run 100% all the time so I'm guessing Balanced would be the right choice, hopefully not leaving performance on the table when it's needed.

as far as I can tell, "core performance boost" is MSI's way of calling XFR2/Precision boost (no mention of XFR2 or Precision boost in BIOS)

Fake edit: according to Anandtech, yeah, and not even MSI's fault:

quote:

A side note: despite the marketing name being called "Precision Boost 2", the internal BIOS name is called "Core Performance Boost". It sounds similar to Multi-Core Enhancement, which is a feature on some Intel motherboards designed to go above and beyond the turbo mechanism. However, this is just AMD's standard PB2: disabling it will disable PB2. Initially we turned it off, thinking it was a motherboard manufacturer tool, only to throw away some testing because there is this odd disconnect between AMD's engineers and AMD's marketing.

Bulgakov
Mar 8, 2009


рукописи не горят


savage

I love the idea of gamer jesus mans getting confrontationally honest and doing journalism like this while expecting the same, exposing shill consultants that provide numbers that make it into corporate slides

Stanley Pain
Jun 16, 2001

by Fluffdaddy

SPOOKCORE MEGGIDO posted:

Asrock x470 itx.

I have two Asrock boards. The Fatality x370 Pro Gaming and the X470 Master SLI. Both have been rock solid. Asrock ITX boards are known for being really bad rear end too so I wouldn't worry.

ufarn
May 30, 2009
I finally got the spikes and temps on my 2700X more or less under control: first I set my fan curve point right above where the temperature fluctuates (around 55/60C), then I set my minimum CPU usage to 5% in the Power Options and slowly increased it until my temps/voltage stopped fluctuating like mad. For whatever reason, the fluctuations make the temps way higher than they get once you've found a steady level. You could also try upping the usage first and then adjusting your fan curve afterwards.

Mine is currently at 20% and I no longer hear the CPU fan hysteresis all the time. I'm not even sure I can hear the fan at all.

I'm currently on the XMP/DOCP profile in my ASUS bios, too, so it's not like it's not highly clocked or whatever.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

TorakFade posted:

In the BIOS I only activated XMP profile for my RAM and made sure "core performance boost" was on auto, if I understand correctly it should be XFR2; my system is boosting seemingly all the time to 4.1-4.2 Ghz / 1.40-1.45V on all cores even at idle on Windows 10 desktop, is it normal? (this is a fresh install with only Steam, games, and basic stuff like Chrome, VLC, drivers, and utilities for my RGB led stuff :pcgaming: )

Considering the above, are idle temperatures of ~35-45 C with very brief spikes up to 50-55C and load temperatures of 65-75C (real) / 80-90C (torture test) normal? I have a Meshify C with 2x 140mm intakes, 2x 120mm exhausts, CPU cooler Scythe Mugen 5 with an LL120 fan instead of the standard Kaze Flex - the LL120 has a little less CFM, but almost double the static pressure, and I still have to properly configure fan curves so I could probably shave some degrees just by running them at higher speeds.
XFR2 causes sort of a temperature ratchet during idle, which looks like a sawtooth wave when you graph it. If your fan curve is ramping in that temperature range, and your fans are loud and/or the controllers in the fans accelerate hard, you'll be hearing it.

So you need to adapt the curve accordingly. That can be annoying, however, because when things are under some, but not maximum, load, the noise-reducing curve might cause things to run slightly hotter, since the fans don't spin as much as they could. My Threadripper keeps spiking to 45°C and sometimes even 50°C (the latter usually when doing things like decoding a huge JPEG), but when playing a CPU-heavy-ish game like Watchdogs 2, it runs at 36-38°C. Watercooling though.

--edit: Those weird special characters were temperature signs before submitting. Who broke the forums again?

--edit: FYI, I fixed things by tweaking the fan curve (shallow ramp) and getting those fancy new Noctua A12x25s.

Combat Pretzel fucked around with this message at 17:49 on Oct 10, 2018

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Interesting test, though I'm not sure about the rigor of it.

https://www.youtube.com/watch?v=S3YKMf0BDno

Around a 35W cTDP, a 2700X will clock at 2.8-3.2GHz, but it loses its poo poo at 15W (probably running into an issue with the uncore). I wonder if 25W would be 2.4-2.8GHz? Or 2.2-2.6GHz? That's still hella good; I could see that with an MX150/RX M550 in a 13.3"-14" solution or a 1050/Ti/M560 in a 15" one.

GutBomb
Jun 15, 2005

Dude?

Bulgakov posted:

savage

I love the idea of gamer jesus mans getting confrontationally honest and doing journalism like this while expecting the same, exposing shill consultants that provide numbers that make it into corporate slides

He actually went to the consultant’s office unannounced and grilled the guy for 30 minutes on video.

https://www.youtube.com/watch?v=qzshhrIj2EY

Klyith
Aug 3, 2007

GBS Pledge Week

TorakFade posted:

I really can't tell why AMD "balanced" is basically the same as full performance mode, and AMD doesn't recommend using standard "balanced" mode. I see something about "core parking" but apparently that's already included in the standard Balanced plan with the latest Windows10 updates... I don't want my CPU to run 100% all the time so I'm guessing Balanced would be the right choice, hopefully not leaving performance on the table when it's needed.

Yeah, the AMD Balanced plan was a solution to a problem that MS independently fixed; Windows now does not park cores on Ryzen desktop / AC power.

Also, "problem" maybe makes it sound more severe than it deserved: I never noticed any difference back when it was an issue. Evidently the main effect was that parking could cause micro-stuttering in games, which is one of those things that you are either sensitive to or you aren't. I wasn't aware of micro-stuttering with Ryzen, and I also didn't notice it with my 7870 GPU during the time when AMD drivers were bad at micro-stutters. I can only see it in demo videos that deliberately exaggerate the effect.
