NewFatMike
Jun 11, 2015

SwissArmyDruid posted:

It's like "does my ext4/5 need ECC, because btrfs doesn't need ECC" questions all over again.

Bitch, if the use of ECC were important to you, you would be using it REGARDLESS of whatever file system you eventually arrive at.

I mean, the ideal scenario is putting yourself in a situation where you might conceivably need workstation parts and then just using the computer to push entertaining pixels. Throw ECC in there for extra good renders.

I could get by on a 4C/8T machine with my current GPU for my CAD and video games. But now, hardware availability might let me get up to a 32C/64T Threadripper machine, and hopefully that Navi GPU with 1080 perf @ $250 happens, because then I can virtualize my CAD environment in Windows and render in whatever Linux distro I choose with all those cores. And write it off on my taxes.

And then play video games.

Thank you AMD :911:

Klyith
Aug 3, 2007

GBS Pledge Week

PC LOAD LETTER posted:

Eventually but it'll be years before it does.

The developers have to wait for the hardware to be commonplace enough that they know it'll work on most people's machines before they start making 6-8 core CPUs the normal baseline to target. There are some games that'll use more than 4 threads now, and more will pop up over time, but they still aren't the norm yet.

Disagree: it won't happen until compilers or perhaps the major engines start having really good, mostly-automated methods to create multiprocessing optimizations. Which is kinda like saying it won't happen without a magic wand.

Multi-threaded programming is real hard, games are not naturally suited for it, and studios have limited resources, especially in the true-programmer department. History has shown that just handing some new hardware to game developers does not result in them having the ability or resources to meaningfully take advantage of it.


Combat Pretzel posted:

Yeah, I don't get the drama home users create about these exploits.

Enthusiasts who read the tech news just pick up on the general level of hype and alarm that the industry covers it with. And those exploits got some real "the sky is falling" type coverage because the biggest attack target is The Cloud, which the industry as a whole just spent billions of dollars in advertising alone to sell everyone on. If the Clown were not safe, it could cause a massive industry crash.


OTOH Meltdown really would have been a giant problem if they hadn't had fixes ready to go by the time of publication. A webpage running javascript could scan memory or keylog your whole system from inside the browser sandbox; that is a huge vulnerability for home users. Anyone running Intel systems who intentionally disables the Meltdown mitigations is a fool.

SwissArmyDruid
Feb 14, 2014

by sebmojo

NewFatMike posted:

I mean, the ideal scenario is putting yourself in a situation where you might conceivably need workstation parts and then just using the computer to push entertaining pixels. Throw ECC in there for extra good renders.

I could get by on a 4C/8T machine with my current GPU for my CAD and video games. But now, hardware availability might let me get up to a 32C/64T Threadripper machine, and hopefully that Navi GPU with 1080 perf @ $250 happens, because then I can virtualize my CAD environment in Windows and render in whatever Linux distro I choose with all those cores. And write it off on my taxes.

And then play video games.

Thank you AMD :911:

That's not the question that was being asked, though. The question was "do I need to turn off SMT to be SUPER ULTRA COMPLETELY TOTALLY ABSOLUTELY SECURE".

I mean, short of putting yourself in an Absolutely Safe Capsule, or literally rolling your own silicon based on RISC-V? No. You will never be safe. You will only ever be safe *enough*.

SlayVus posted:

I just tried their CorePrio program on my 1950X while running Indigo myself. Without CorePrio and without /affinity, I was getting 1.6 on Indigo. With just /affinity 0xFFFFFFFE I was getting 1.73 in Indigo (launching Indigo with core 0 affinity off). With CorePrio and no /affinity, I shot up to 2.2 in Indigo with my 1950X OC'd to 3.9GHz. So not the same 50% increase in performance he saw, but I did see a ~37% increase in performance in just Indigo.

I wonder if this would solve my OBS encoder issues I've been having where I'll drop 1% of my frames when trying to render a 1080p image to 720p output.

Yay, glad I was able to share news that helped.

SwissArmyDruid fucked around with this message at 23:59 on Jan 4, 2019

BangersInMyKnickers
Nov 3, 2004

I have a thing for courageous dongles

Klyith posted:

OTOH Meltdown really would have been a giant problem if they hadn't had fixes ready to go by the time of publication. A webpage running javascript could scan memory or keylog your whole system from inside the browser sandbox; that is a huge vulnerability for home users. Anyone running Intel systems who intentionally disables the Meltdown mitigations is a fool.

All browsers have mitigated JS Meltdown attacks fyi, mostly by clamping high-resolution timers and disabling SharedArrayBuffer.

SlayVus
Jul 10, 2009
Grimey Drawer

SwissArmyDruid posted:

Yay, glad I was able to share news that helped.

It didn't completely fix my dropped frames in OBS, but I went from tens of thousands of dropped frames in like an hour span to only 54 dropped frames after an hour of gaming. This poo poo seems to have fixed my OBS issues.

Inept
Jul 8, 2003

BangersInMyKnickers posted:

All browsers have mitigated JS Meltdown attacks fyi, mostly by clamping high-resolution timers and disabling SharedArrayBuffer.

Yeah, but if you're disabling meltdown mitigations you're probably running Brave Browser and followed some idiot's guide on Reddit to make the internet faster

PC LOAD LETTER
May 23, 2005
WTF?!

Klyith posted:

Disagree: it won't happen until compilers or perhaps the major engines start having really good, mostly-automated methods to create multiprocessing optimizations. Which is kinda like saying it won't happen without a magic wand.
While multithreaded programming is indeed real hard (never said it was easy FYI) and magic does-it-all-for-you compilers/middleware won't ever be arriving (never said they would be either), there are games that are indeed starting to use 8 threads today, which was once unheard of, and various middleware game engines have supported 8 threads since like 2-3yr ago. So I'm kind of confused why you'd think the sorts of stuff you're talking about aren't already appearing, aren't being actively worked on for near-future release, or haven't already sort of popped up now.

And yeah, games that make use of 8 threads are a faaar cry from a game that'd make use of say 16, 24, or 32 threads, like these newer Zen2 model Ryzens are rumored to have, but I don't think it's an unreasonable real-world example to point to of progress over time on this subject. And yes it took years for that to happen, but I said as much already.

Klyith posted:

History has shown that just handing some new hardware to game developers does not result in them having the ability or resources to meaningfully take advantage of it.
Also true, but I never said things worked this way either!

Lambert
Apr 15, 2018

by Fluffdaddy
Fallen Rib
Games still benefit from single-threaded performance much more (and will continue to in the future) because they don't tend to be well-suited to multithreaded processing. Magic middleware won't be able to change that.

Klyith
Aug 3, 2007

GBS Pledge Week

PC LOAD LETTER posted:

there are games that are indeed starting to use 8 threads today

Yeah, but the question is how much work those other threads are doing and how limited the game is by the performance of the 1 or 2 most important ones.

Looking back at some of the reviews from the original Ryzen launch, it's interesting to note that while there are some games that have significant performance loss on a 1300X (4c/4t), they get pretty much all of it back on a 1400 (4c/8t) and see only marginal benefit from 6 or more real cores. My interpretation would be that the threads beyond 4 aren't really doing that much work, though they do need the faster scheduling of HT. But only the people that wrote the game know if they used more than 4 threads because the consoles have more, and if it could have been equally good with 4 instead.

PC LOAD LETTER posted:

Also true, but I never said things worked this way either!

Sorry, didn't mean to imply that, or that you think multithreading is easy. Just my reasons for why I have a different opinion.

NewFatMike
Jun 11, 2015

SwissArmyDruid posted:

That's not the question that was being asked, though. The question was "do I need to turn off SMT to be SUPER ULTRA COMPLETELY TOTALLY ABSOLUTELY SECURE".

I mean, short of putting yourself in an Absolutely Safe Capsule, or literally rolling your own silicon based on RISC-V? No. You will never be safe. You will only ever be safe *enough*.


Yay, glad I was able to share news that helped.

Ah, yeah, you're right. I think I made a jump in thinking about it that wasn't there.

PC LOAD LETTER
May 23, 2005
WTF?!

Lambert posted:

Games still benefit from single-threaded performance much more (and will continue to in the future) because they don't tend to be well-suited to multithreaded processing.
True but also irrelevant to both my original post on this subject and my reply to Klyith's post above.

It is, however, worth pointing out on this line of thought that developers have been aware for a long time now that big (i.e. 20%+) single-thread performance increases were pretty much not going to happen anymore, due to thermal/power limits for CPUs plus inherent IPC limits of x86 itself, and that all AMD/Intel would eventually be able to offer in the way of consistent, big x86 performance gains would come through more multithreading/cores.

I'm sure they'd muuuch prefer to have more single-thread performance instead, but ultimately they're gonna have to make do with what they have to work with, which is part of the reason why they've slooowly begun to support and use more than 4 threads in-game as CPUs with 4+ cores and HT become commonplace. So I see no reason for the long-term trend of gradually using more and more threads in-game to change, even if games are and always will be poorly suited to multithreading.

Well at least until Amdahl's Law takes full effect anyways.
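
To put rough numbers on that ceiling (my own arithmetic, not anything from the leaks): Amdahl's Law says that if a fraction p of each frame's work can be spread across n cores, the overall speedup is

\[ S(n) = \frac{1}{(1 - p) + \frac{p}{n}}, \qquad \lim_{n \to \infty} S(n) = \frac{1}{1 - p} \]

so even a game that's 75% parallelizable tops out at 4x total no matter how many cores you throw at it, and going from 8 to 16 cores only moves it from about 2.9x to 3.4x.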

Lambert posted:

Magic middleware won't be able to change that.
Can you please quote whoever has unironically/seriously said magic middleware is a real thing or that it is coming? Thanks!

Klyith posted:

Yeah, but the question is how much work those other threads are doing and how limited the game is by the performance of the 1 or 2 most important ones.
Probably will vary from game to game considerably, buuuut at least one game as of ~2yr ago was doing a decent job of actually putting 8 CPU threads to use and not just churning through some minor background stuff. Even now that is sort of a practical best case for any game, but I think it does show that using lotsa threads to do Real Work in-game is not only possible but beneficial now, and not just X years into the future.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!
Yeah, things definitely are changing; just 3-4 years ago people here were still recommending dual cores for some budget gamers, and now 4/4 CPUs are starting to have issues in some games. There was definitely no concern from most people at the time I got my 6600K that the lack of hyperthreading would ever limit gaming performance; turns out that thinking was wrong.

MaxxBot fucked around with this message at 06:28 on Jan 5, 2019

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

MaxxBot posted:

Yeah, things definitely are changing; just 3-4 years ago people here were still recommending dual cores for some budget gamers, and now 4/4 CPUs are starting to have issues in some games. There was definitely no concern from most people at the time I got my 6600K that the lack of hyperthreading would ever limit gaming performance; turns out that thinking was wrong.

Well, I was recommending the 4790K with an H/B mobo over OCing a Haswell i5 with a Z-board back in 2014. There were already a few games where the stock factory-overclocked 4.2/4.4GHz 4790K beat an OCed 4690K, and yet people thought I was committing CPU heresy because "but-but-but OC IS EASY MAN AND WORTH IT"

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy
I have the worst luck, it seems. My PSU self-combusted on the one 40°C day we had last week. Fingers crossed my components are ok, because I can't afford a new PSU, let alone a new motherboard, CPU, or GPU.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Palladium posted:

Well, I was recommending the 4790K with an H/B mobo over OCing a Haswell i5 with a Z-board back in 2014. There were already a few games where the stock factory-overclocked 4.2/4.4GHz 4790K beat an OCed 4690K, and yet people thought I was committing CPU heresy because "but-but-but OC IS EASY MAN AND WORTH IT"

The 4790K without bothering to overclock (with an H97, for example) became a staple of the build thread; hell, I even built one for a family member

Falcorum
Oct 21, 2010
Most games nowadays have somewhat decent multithreading, outside of niche ones ("as the only developer, I'll make my own engine") or games with very specific requirements ("I need everything to be fully deterministic"). Render command submission used to be one of the main bottlenecks, but a lot of games nowadays just build command lists on separate threads and then actually submit them on the render thread. The other issue is physics, since that's significantly more of a pain in the rear end to multithread while keeping it deterministic, and I haven't heard of any recent work done on that (Bullet has had a GPU version as a WIP for about 10 years now, for example).

An issue with a bunch of games and threading nowadays, however, is not pinning worker threads to specific cores, so the end result is you'd have like 10 worker threads constantly context switching instead of actually doing useful things. :v:
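
For what it's worth, the pinning part is only a few lines. A toy Linux-only sketch (my own, not from any engine) that pins each worker to its own core through std::thread's native pthread handle; on Windows the equivalent call is SetThreadAffinityMask:

code:

// Build with: g++ -O2 -pthread pin.cpp
#include <pthread.h>   // pthread_setaffinity_np (glibc, Linux-only)
#include <sched.h>     // cpu_set_t, CPU_ZERO, CPU_SET
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    unsigned n = std::thread::hardware_concurrency();
    std::vector<std::thread> workers;
    for (unsigned core = 0; core < n; ++core) {
        workers.emplace_back([core] {
            // a real worker would sit in a job-fetching loop here
            std::printf("worker pinned to core %u\n", core);
        });
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);  // allow exactly one core for this thread
        pthread_setaffinity_np(workers.back().native_handle(),
                               sizeof(set), &set);
    }
    for (auto& t : workers) t.join();
}

Whether you want hard pins or just fewer threads depends on the game, but either way the scheduler stops playing musical chairs with them.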

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Monster Hunter World PC has literally hundreds of threads running at any one time, and around 20% of its CPU usage is spent just switching between them. It's the crappiest port I've ever seen.
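
If you want to put a number on what each of those switches costs, here's a toy ping-pong (my own sketch, nothing MHW-specific): two threads handing control back and forth through a condition variable, which forces a context switch per handoff. On a typical desktop it lands in the low single-digit microseconds per handoff, which adds up fast across hundreds of threads:

code:

// Build with: g++ -O2 -pthread pingpong.cpp
#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <thread>

int main() {
    constexpr int kIters = 100000;
    std::mutex m;
    std::condition_variable cv;
    bool ping = true;  // whose turn it is

    auto start = std::chrono::steady_clock::now();
    std::thread other([&] {
        std::unique_lock<std::mutex> lk(m);
        for (int i = 0; i < kIters; ++i) {
            cv.wait(lk, [&] { return !ping; });  // wait for our turn
            ping = true;                         // hand it back
            cv.notify_one();
        }
    });
    {
        std::unique_lock<std::mutex> lk(m);
        for (int i = 0; i < kIters; ++i) {
            cv.wait(lk, [&] { return ping; });
            ping = false;
            cv.notify_one();
        }
    }
    other.join();
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                  std::chrono::steady_clock::now() - start).count();
    std::printf("%.2f us per handoff\n", double(us) / (2.0 * kIters));
}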

Setset
Apr 14, 2012
Grimey Drawer

Zedsdeadbaby posted:

Monster Hunter World PC has literally hundreds of threads running at any one time, and around 20% of its CPU usage is spent just switching between them. It's the crappiest port I've ever seen.

Well the technology has to begin somewhere. At least they seem to be pushing the envelope and utilizing a lot of threads?

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

Lube banjo posted:

Well the technology has to begin somewhere. At least they seem to be pushing the envelope and utilizing a lot of threads?

Ideally you want N threads, where N is the number of logical processors. An arbitrarily high number of threads is bad: each thread switch is a kernel-level operation, requiring 2 context switches, from thread A into the kernel and from the kernel to thread B. This is expensive. In comparison, a thread running a loop fetching small jobs takes much less time to switch to a new task after finishing the current one, and won't be interrupted by a different task, which would kill cache efficiency.

It's an anti-pattern where you assign a thread to every little thing. Button animation? New thread. Scrolling a view? New thread. A sewage agent in the latest SimCity? Give it a thread. You might also find it in corporate business Java applications; it's not like you expect better from them. Also assign a thread to every single I/O channel you have: thread per file, thread per network connection, let's benchmark kernel context switching.
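
The sane version is boring: a fixed pool of N threads eating small jobs off one queue. A toy sketch of the shape (my own, single global lock, none of the work-stealing fanciness real engines add):

code:

// Build with: g++ -O2 -pthread pool.cpp
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class JobQueue {
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
public:
    void push(std::function<void()> job) {
        { std::lock_guard<std::mutex> lk(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }
    void shutdown() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_all();
    }
    // Blocks for the next job; returns false once drained and shut down.
    bool pop(std::function<void()>& job) {
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [&] { return done_ || !jobs_.empty(); });
        if (jobs_.empty()) return false;
        job = std::move(jobs_.front());
        jobs_.pop();
        return true;
    }
};

int main() {
    unsigned n = std::thread::hardware_concurrency();  // N = logical CPUs
    if (n == 0) n = 4;                                 // fallback if unknown
    JobQueue q;
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < n; ++i)
        workers.emplace_back([&q] {
            std::function<void()> job;
            while (q.pop(job)) job();  // the cheap fetch-small-jobs loop
        });
    for (int i = 0; i < 100; ++i)      // every little thing becomes a job,
        q.push([i] { std::printf("job %d\n", i); });  // not a thread
    q.shutdown();
    for (auto& t : workers) t.join();
}

The kernel only ever sees N runnable threads, which is the whole point.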

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Lube banjo posted:

Well the technology has to begin somewhere. At least they seem to be pushing the envelope and utilizing a lot of threads?

Not when you get things like this:

quote:

https://i.imgur.com/VvuZgpX.gifv

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I don't know how the gently caress somethingawful doesn't support properly downsizing gifv embeds yet

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Zedsdeadbaby posted:

Not when you get things like this:

What am I supposed to be seeing here?

Llamadeus
Dec 20, 2005

Sidesaddle Cavalry posted:

What am I supposed to be seeing here?
Every flick of the controller resulting in a frametime spike

Setset
Apr 14, 2012
Grimey Drawer

Llamadeus posted:

Every flick of the controller resulting in a frametime spike

Not every flick. There's one flick towards the end that causes a stutter from 16.7 ms to 26.6 ms (60 fps down to ~38 for that frame). Jitter within ±1 ms is very normal.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Oh that's a frametime histogram. Ohhh. Gotcha

I haven't perused the Monster Hunter thread in a long, long time, and a lot of the port controversy I only heard of by word of mouth and secondhand accounts. (I liked the port because it let me play with my friends, so I tend to be a profuse apologist for it.) Hundreds of threads and not 32?

karoshi posted:

let's benchmark kernel context switching.

*entitled gamer voice* Then it's time for kernel context switching to get faster. Get to it, Microsoft! Intel! Wintel! :v:

Inept
Jul 8, 2003

Zedsdeadbaby posted:

I don't know how the gently caress somethingawful doesn't support properly downsizing gifv embeds yet

SA has one volunteer that does some backend stuff in his spare time. I don't think lowtax has any employees anymore.

Varashi
Sep 1, 2006
THE MAN is limiting my BANDWIDTH :argh: [belgian goons]

Llamadeus posted:

Every flick of the controller resulting in a frametime spike

Also that looks like terrible input lag.

Subjunctive
Sep 12, 2006

✨sparkle and shine✨


They mean “2 = Model Revision”, right?

Arzachel
May 12, 2012

Subjunctive posted:

They mean “2 = Model Revision”, right?

Yeah. An alternate theory is that the 121 part isn't base clock but the core config.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Lambert posted:

Games still benefit from single-threaded performance much more (and will continue to in the future) because they don't tend to be well-suited to multithreaded processing. Magic middleware won't be able to change that.

In contrast, I would argue that each passing console generation that remains under AMD's control is more time spent acclimatizing devs to multicore programming.

Remember that last year, Sony committed a great many Ryzen LLVM improvements to the project, which will almost certainly be involved in the forthcoming PS5, as well as have ripple effects on anything else that AMD touches.

Zedsdeadbaby posted:

Monster Hunter World PC has literally hundreds of threads running at any one time, and around 20% of its CPU usage is spent just switching between them. It's the crappiest port I've ever seen.

You know, I wonder if that CorePrio tool might help MHW. I don't have MHW installed at the moment, can anyone check?

SwissArmyDruid fucked around with this message at 19:26 on Jan 5, 2019

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

In contrast, I would argue that each passing console generation that remains under AMD's control is more time spent acclimatizing devs to multicore programming.

Multicore consoles handily pre-date AMD entering the market for their CPUs though? It's not like anyone but Nintendo, maybe, would be stupid enough to revert to single-core or low core count consoles, and they've also never been on AMD CPUs anyway.

SwissArmyDruid
Feb 14, 2014

by sebmojo
Fishmech gonna fishmech.

Fine. Acclimating devs to multicore programming in a way that should yield results outside of a console environment, as this is the first generation where the majority of the console market runs on hardware built on an x86 architecture that is consequently more applicable to the wider industry as a whole.

And now I remember why I put you on ignore.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

SwissArmyDruid posted:

Fishmech gonna fishmech.

Fine. Acclimating devs to multicore programming in a way that should yield results outside of a console environment, as this is the first generation where the majority of the console market runs on hardware built on an x86 architecture that is consequently more applicable to the wider industry as a whole.

And now I remember why I put you on ignore.

What, because you're salty your statements are nonsensical? I can assure you that coding systems well for the triple-core symmetric PPC in the 360 was quite applicable to getting good results in PC multicore programming too.

GRINDCORE MEGGIDO
Feb 28, 1985


The easier it is for ports, the better imo

Khorne
May 1, 2002

Arzachel posted:

Yeah. An alternate theory is that the 121 part isn't base clock but the core config.
Core config makes more sense, because there was another Zen2 leak with a different number there and a 1.21 base clock. It could also just be an internal identifier of some kind.

Khorne fucked around with this message at 01:05 on Jan 6, 2019

PC LOAD LETTER
May 23, 2005
WTF?!

GRINDCORE MEGGIDO posted:

The easier it is for ports, the better imo

Yeah. The X360's CPU and cache structure was very different from a typical x86 CPU of the time. The Xb1's and PS4's CPU and cache structure is nearly identical to a typical x86 desktop CPU in comparison.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
The current-gen consoles came after a generation that was born in the recession era, and the console developers really didn't want to do strange, expensive bespoke designs again. That's why the current bunch use practically off-the-shelf PC configurations as opposed to weird hosed-up triple-core PPC or Cell tech with SPEs and such.

There's a lot of expertise overlap for multithreaded programming as a result

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

fishmech posted:

What, because you're salty your statements are nonsensical? I can assure you that coding systems well for the triple-core symmetric PPC in the 360 was quite applicable to getting good results in PC multicore programming too.

It even had SMT...

NewFatMike
Jun 11, 2015

Ryzen Mobile 3000 series got announced ahead of the CES keynote:

https://www.anandtech.com/show/13771/amd-ces-2019-ryzen-mobile-3000-series-launched

Process improvement over Raven Ridge, now fully on the Zen+ 12nm architecture. Looks like clock boosts within the same thermal envelopes, except there are now 35W parts. Some Excavator stuff got pulled back from the grave for 1C/2T Chromebook parts.

Still caps out at 4C/8T, which is a little disappointing. Hopefully this means the keynote will be much more exciting/focused on desktop parts.

SwissArmyDruid
Feb 14, 2014

by sebmojo
You're forgetting the most important part: DAY ZERO MOBILE APU DRIVER UPDATES (Gott im Himmel, finally)
