kuskus
Oct 20, 2007

Klyith posted:

For mATX B450s, the MSI B450M Gaming Plus can handle a 3900X. Or if you are wedded to Asus mobos, get one of their X570 boards because all of those are good.
Thanks for saving me headaches (this looks like it, so I'll wishlist). My primary interest is the CPU, all other elements can change.
I actually like Mini ITX (sometimes I need to take a full-sized GPU rig to demo things) and mATX seems the next step viable for this CPU.


Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.
Any chance a Gigabyte AB350 Gaming 3 could handle a 3900x? I know the VRM is a 4 + 3 phase, which is marginal for the chip but could probably manage it with PBO if you had a solid air cooler.

orcane
Jun 13, 2012

Fun Shoe
It should work (one of those fancy VRM tier lists for reference: https://i.redd.it/2iwdy5wrrly31.png), but you want airflow over those VRMs and definitely do NOT enable PBO or OC features with that combo.

Klyith
Aug 3, 2007

GBS Pledge Week

kuskus posted:

Thanks for saving me headaches (this looks like it, so I'll wishlist). My primary interest is the CPU, all other elements can change.
I actually like Mini ITX (sometimes I need to take a full-sized GPU rig to demo things) and mATX seems the next step viable for this CPU.

No, that's not the Gaming Plus. The board you're looking at has no heatsinks on the VRM, which is not good for a 3900X.

Of what's carried at Microcenter, your best bet is the ASRock B450M Pro4, followed by the Gigabyte B450 AORUS M. The ASRock is better, but has some ASRock weirdness.

redeyes
Sep 14, 2002

by Fluffdaddy
Buying a really low end mobo with a high end processor seems.. less than optimal.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Rabid Snake posted:

Man, I miss when mobo manufacturers paid more attention to mATX. It's kind of stuck in the middle between SFF builders (ITX) and regular ATX builds, but drat, the mATX was the perfect size

Yup. Really.

kuskus
Oct 20, 2007

redeyes posted:

Buying a really low end mobo with a high end processor seems.. less than optimal.
What's the specific optimal combo? Not coming from a frugal angle, coming in without up to date knowledge of the products.

Klyith
Aug 3, 2007

GBS Pledge Week
Unfortunately mATX mobos aren't much cheaper to make than a regular ATX mobo, but most people want a discount because it's smaller. Mobo makers are desperate for margin so they go to where the cool people who buy expensive poo poo are.

Still, it does seem like an opportunity -- it's a completely empty market now, so if just one of the oems made a single good mATX X570 you'd think they'd get some takers for it.


kuskus posted:

What's the specific optimal combo? Not coming from a frugal angle, coming in without up to date knowledge of the products.

If you're not overclocking, most high-end mobos are a waste of money. Motherboards have very little impact on overall system speed -- and when they do it's generally because they're cheating.

The optimal combo is a mobo with a good VRM for the CPU you'll put in it, and all the IO you need. That's it, really. Extra features like CPU-less BIOS flash are a big plus on AM4 but historically low value with Intel. But to do that you have to know what a VRM is and what counts as good enough.

the tl;dr rule is that if you see a motherboard with no heatsinks on the components around the CPU socket, don't buy it unless it's for an office / mom-PC or will only ever use a low-power CPU.

Klyith fucked around with this message at 15:51 on Jun 3, 2020

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

kuskus posted:

What's the specific optimal combo? Not coming from a frugal angle, coming in without up to date knowledge of the products.

if you're getting a 3900X, then you should really pair it with an X570 board, because those are the boards with the power delivery designed to handle that kind of thing

and from a productivity perspective, it's the X570s that have the I/O to support whatever it is you'd be doing

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

redeyes posted:

Buying a really low end mobo with a high end processor seems.. less than optimal.

Oh, I know it's not ideal, or even smart. That was just lurid curiosity since it's listed as officially supported, even if it's Not Swell.

Arzachel
May 12, 2012

gradenko_2000 posted:

if you're getting a 3900X, then you should really pair it with an X570 board, because those are the boards with the power delivery designed to handle that kind of thing

and from a productivity perspective, it's the X570s that have the I/O to support whatever it is you'd be doing

If you don't need the extra features, X570 is a waste of money with a chipset fan. A good B450/B550 board will have more than enough VRM beef to handle a 3900X.

redeyes
Sep 14, 2002

by Fluffdaddy

Arzachel posted:

If you don't need the extra features, X570 is a waste of money with a chipset fan. a good B450/B550 will have more than enough VRM beef to handle a 3900X.

That's not what I read, though. I think VRM quality is directly proportional to the temperatures you get. Could be wrong though. But with a 3900X you do need a bit of VRM cooling.

Klyith
Aug 3, 2007

GBS Pledge Week

redeyes posted:

That's not what I read, though. I think VRM quality is directly proportional to the temperatures you get. Could be wrong though. But with a 3900X you do need a bit of VRM cooling.

The VRM will not affect the CPU temperatures. A low-efficiency VRM generates more waste heat for itself that the system has to deal with, but that is fairly small potatoes compared to the heat of the CPU & GPU. A merely "ok" VRM is fine for a 3900X as long as the CPU is air-cooled, because an air cooler will generate some cooling for the VRM heatsink.


The B450 boards pointed out as being unsuitable for a 3900X aren't "ok", they're really bad and have no heatsink at all. But there are like 3 retail B450 boards that are so bad they shouldn't use a 3900X and the OP happened to pick 2 of them. There are plenty of B450s that are totally good for a 3900X -- the MSI Tomahawk that's the board of choice in the PC build thread can do a stock 3900X with zero airflow considerations.

Also, all of this "can't handle a 3900X" is relative. There are a lot of safeties built into modern power delivery; the board is unlikely to burn itself out. But one of the ways the safeties work is by limiting how much power the CPU can get, which means you don't get 100% performance from your expensive CPU. And the other downside is that hot components don't last as long.
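To put the waste-heat point in rough numbers, here's a back-of-envelope sketch -- the PPT figure is AMD's published stock limit, but the VRM efficiency is an assumed mid-range value, not a measurement:

```python
# Rough arithmetic, not a measurement: AMD's stock package power
# limit (PPT) for a 3900X is about 142 W. Assuming a ~90%-efficient
# VRM, the board draws more power at the 12 V input and dissipates
# the difference as heat in the VRM itself.
ppt_watts = 142          # stock PPT for a 3900X (approximate)
vrm_efficiency = 0.90    # assumed mid-range VRM efficiency

input_watts = ppt_watts / vrm_efficiency
vrm_waste_watts = input_watts - ppt_watts

print(f"VRM input: ~{input_watts:.0f} W")
print(f"VRM waste heat: ~{vrm_waste_watts:.0f} W, "
      f"vs {ppt_watts} W from the CPU itself")
```

Fifteen-odd watts of VRM heat really is small potatoes next to the CPU and GPU, which is the point: the problem boards fail not because of total heat, but because bare, unheatsinked components can't shed even that much.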


Klyith posted:

the tl;dr rule is that if you see a motherboard with no heatsinks on the components around the CPU area, don't buy it unless it's for an office / mom-pc or will only use a low-power CPU.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
The very bottom tier of $50-70 boards with weaksauce 3-phase VRMs with 30A stages are paperweight tier bad (and it's kind of dumb that people cite these as the alternative to more expensive Z490 boards), you get what you deserve buying something in that segment. But something like the B450 Tomahawk Max is fine for a 3900X if you have airflow and aren't sitting there running Prime95 24/7.

The real-world workload that most closely emulates Prime95 is probably video encoding or 3D/CAD rendering where you have a lot of AVX activity going on, but even still, it's a lot less intense than Prime95. A lot of the complaints come down to "doctor, it hurts when I do this", stop running Prime95 and looking for a problem and it won't be a problem in actual usage.

(although with AMD's current limiting even Prime95 probably just causes downclocking since the processors have a hard PPT limit...)

orcane
Jun 13, 2012

Fun Shoe
Who is suggesting these boards? Most B450 boards I'm seeing people recommend are closer to $90+ and are perfectly fine alternatives to the overpriced X570 and Z490 ~SUPER GAMER OC~ boards.

gradenko_2000 posted:

if you're getting a 3900X, then you should really pair it with an X570 board, because those are the boards with the power delivery designed to handle that kind of thing

and from a productivity perspective, it's the X570s that have the I/O to support whatever it is you'd be doing
A 3900X is not a significant step up from the power requirements of the old 2700X, a CPU for which the B450 boards were actually made.

orcane fucked around with this message at 19:08 on Jun 3, 2020

Worf
Sep 12, 2017

If only Seth would love me like I love him!

now im scared i have a bad mobo in my AMD build and ill be damned if i remember what i got

Klyith
Aug 3, 2007

GBS Pledge Week

orcane posted:

Who is suggesting these boards?

They've frequently been the only thing in stock.


Statutory Ape posted:

now im scared i have a bad mobo in my AMD build and ill be damned if i remember what i got

Apparently an ASRock B450M Pro4, which is the best VRM available on a mATX B450. Still has some good old ASRock weirdness with the M.2 slots, but it's not bad.

AARP LARPer
Feb 19, 2005

THE DARK SIDE OF SCIENCE BREEDS A WEAPON OF WAR

Buglord

Statutory Ape posted:

now im scared i have a bad mobo in my AMD build and ill be damned if i remember what i got

you should be!

In all known cases of keyboards spontaneously exploding into users’ faces, 95% of them involved “not knowing what kind of mobo I’ve got.”

plz be careful

Worf
Sep 12, 2017

If only Seth would love me like I love him!

Klyith posted:

Apparently an asrock B450M Pro4, which is the best VRM available on a mATX B450. Still has some good old asrock weirdness with the M.2 slots, but it's not bad.

lol nice ty. havent noticed any fuckery with the m.2 slots, was it placement of them or some other issue?

e; lol nice i set that goon up pretty well if he took my advice on going AMD. dude can upgrade for days

god this blows
Mar 13, 2003

Statutory Ape posted:

lol nice ty. havent noticed any fuckery with the m.2 slots, was it placement of them or some other issue?

e; lol nice i set that goon up pretty well if he took my advice on going AMD. dude can upgrade for days

I think the issue with the m.2 slots is that it disables SATA slots.

Klyith
Aug 3, 2007

GBS Pledge Week

god this blows posted:

I think the issue with the m.2 slots is that it disables SATA slots.

No, that would be perfectly normal -- most every ryzen mobo disables sata 5&6 when the m.2 slot is used. The Asrock board only has 4 sata ports though so it isn't a problem.

Statutory Ape posted:

lol nice ty. havent noticed any fuckery with the m.2 slots, was it placement of them or some other issue?

They're weird because one is NVMe/PCIe only and the other is Sata only (and takes away sata port 3 if you use it).

I understand why the 2nd one is sata-only, B450 doesn't have enough pcie lanes to wire it up. They stuck that on just so the board would have two m.2 slots shown on the pics. But why does the 1st one only do NVMe? That should be the one that can dynamically switch between NVMe and sata on the CPU, like every other ryzen mobo. IDGI.

Anyways if you ever want 2 nvme drives in the future you'll have to get an adapter card (cheap) to hook it up to the 2nd big PCIe slot.

Klyith fucked around with this message at 23:18 on Jun 3, 2020

PC LOAD LETTER
May 23, 2005
WTF?!

EmpyreanFlux posted:

Like unless Zen 3 is hella faster than Zen 2, a backcourt of Willow Cove should beat it in theory.

The rumored issue with all the new *cove cores is supposedly that clock speeds are mediocre to terrible, on either 14 or 10nm, within sane power envelopes.

There are lots of leaks on the IPC improvements of these upcoming Intel cores, but solid info on the base clocks isn't so good, and what you can find suggests something in the mid-to-low 3GHz range at the top end, with most chips being in the 2GHz-ish range.

There have been some leaks on boost clocks, which go into the 4GHz+ range... but those are just boost clocks.

So anyways, yeah: if they can get the clock speeds up to ~4GHz+ without blowing out the TDPs to 200W, then at stock it should meet or beat Zen 3 or 2 handily. If they're stuck at ~3-3.5GHz or so, though, then, perversely (AMD with a large clock speed advantage vs Intel would be bizarre), Zen 3 or even Zen 2 should be able to do well against it or perhaps even beat it (Zen 3, obviously) by OK margins.

Even then it should still scale down very well power-wise, so it'll be able to compete well in laptops, so it won't be a total loss for Intel no matter what.

But yeah, if the CEO of the company is talking down benchmarks and performance, then that doesn't bode well at all for that company's product.

Reminds me of the Itanium days all over again, actually, since Intel said similar things back then for a brief while.

BlankSystemDaemon
Mar 13, 2009



So long as the IPC is better than a Sandy Bridge era i7-2600, and it turbos to at least 4.2GHz, I'll honestly be happy as long as it has enough PCIe lanes to allow me to do what I want.

ConanTheLibrarian
Aug 13, 2004


dis buch is late
Fallen Rib

PC LOAD LETTER posted:

So anyways, yeah: if they can get the clock speeds up to ~4GHz+ without blowing out the TDPs to 200W, then at stock it should meet or beat Zen 3 or 2 handily. If they're stuck at ~3-3.5GHz or so, though, then, perversely (AMD with a large clock speed advantage vs Intel would be bizarre), Zen 3 or even Zen 2 should be able to do well against it or perhaps even beat it (Zen 3, obviously) by OK margins.
The 10900K's base clock is lower than the 3900X, but no one's under any illusion which one runs at the highest frequency in normal use.

That said, it would be a weird situation if the new desktop Cove part traded Hz for IPC to the point where it becomes more competitive in productivity tasks but worse at gaming.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

D. Ebdrup posted:

So long as the IPC is better than a Sandy Bridge era i7-2600, and it turbos to at least 4.2GHz, I'll honestly be happy as long as it has enough PCIe lanes to allow me to do what I want.

I'm pretty sure Zen 2 is already better than Skylake (or any of the Skylake sequels) clock for clock

vv I'm not sure Intel has a single-thread IPC lead... they have an absolute single-thread lead due to higher clocks and lower latencies, but per clock? I'm not so sure. Of course, results vary from application to application

HalloKitty fucked around with this message at 16:03 on Jun 4, 2020

Hasturtium
May 19, 2020

And that year, for his birthday, he got six pink ping pong balls in a little pink backpack.

HalloKitty posted:

I'm pretty sure Zen 2 is already better than Skylake (or any of the Skylake sequels) clock for clock

It depends on what you're doing with it. Intel still maintains a lead in memory latency-sensitive work and single thread IPC, but in multithreaded scenarios Zen 2 goes from competitive to superior. Those big, wide SMT cores working in tandem at high clocks are *really* effective.

It's possible that the right application with effective AVX-512 (or the new machine learning instruction set...) optimizations would give Skylake-X and its kin an advantage, but that's the definition of an edge case at a high price point. The clock speed offset for AVX-512 could also impact speed enough to let Zen 2 running AVX-256 keep stride. What I have noticed is that a 3900X with fast dual-channel memory is fast enough to beat or hold steady against my 7940X, despite that chip having quad-channel memory and two additional cores. Even with PBO enabled, it also consumes maybe 66% of the power under sustained load. AMD deserves the credit it's gotten here.

Cygni
Nov 12, 2005

raring to post

There are plenty of clock for clock IPC comparisons out there. Intel generally wins if the thing is memory latency sensitive (games) or leans on AVX2+ heavily (some pro stuff), but basically everything else is faster on AMD clock for clock. But Intel is also pushing more clock in a lot of parts. And also outside of big time nerd architectural convos and dipshit brand warriors who wanna argue about who is ~superior~, in the end, price is really all that matters.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Intel still is not bad at productivity, if you primarily game but occasionally want to be able to encode a video then Intel is still plenty fast to handle that. A stock 9900K or 10700F is still 52% faster in 1080p handbrake x264 encoding and 62% faster in handbrake x265 encoding than a 1800X that people were falling over themselves to recommend a few years ago as the "productivity king", and so on.

The whole "AMD efficiency" thing was largely that AMD had higher core counts and Intel was chasing clocks to catch up. AMD only had half the AVX2 performance and GF 12/14nm were actually much less efficient than Intel 14++, so once you equalize the core counts AMD was terribly inefficient. Intel could run like low 4 GHz range and still hammer AMD on performance in those productivity tasks, or run higher clocks and get a huge amount of additional performance.

But, Zen2 fixed the AVX2 performance problems and is no longer on a worse node than Intel, so between higher core counts per dollar and better power efficiency, it's not worth doing Intel for a primarily productivity build anymore, with few exceptions (digital audio workstations being one that comes to mind). They are running full speed AVX2 and on a better node, plus you get more cores at a given price point.

AVX-512 is a whole can of worms, if you have a very AVX-heavy task like blender then it can help efficiency significantly, despite (or because of) the offset. In particular the offset on HEDT chips is not as high as the server chips, iirc. Or you can disable the offset entirely but then you're pulling quite a bit more power. And regardless, you have to know your workload is heavy on AVX before that matters. Now that AMD is no longer making GBS threads their pants on AVX2, the theoretical benefits are a lot less in even the optimal situation, and they have 7nm to offset the efficiency benefits.

AVX-512 may pick up more performance benefits over time, especially once we see it adopted in desktop CPUs. I think a lot of stuff just isn't optimized for it right now because outside Skylake-X, Skylake-SP and Ice Lake laptops it's really just not available anywhere and there's no sense optimizing for it. Of course, there is also the problem that unlike AVX2, there is no "single" AVX-512 standard and every single implementation supports a slightly different mix of instructions...

Paul MaudDib fucked around with this message at 17:24 on Jun 4, 2020

BlankSystemDaemon
Mar 13, 2009



Egosoft also magically fixed their game in the latest beta-patch, so I no longer need to upgrade (I didn't really need to, but really wanted to even if I couldn't afford it).

MarsellusWallace
Nov 9, 2010

Well he doesn't WANT
to look like a bitch!

pyrotek posted:

Also RDNA2, not 1, combined with faster storage than anything the PC has in the case of PS5. The power jump is so huge I imagine it will take a few years at minimum for games to start properly taking advantage of them.

I've seen this bandied about a bunch of times - what does fast storage get you? I went from a Samsung 850 to a fast NVMe drive and, other than file transfers, have seen basically no difference. Even load times 'feel' very similar, and I have seen no impact whatsoever on framerates. NVMe has been out long enough that surely developers have had time to take advantage of whatever gains could be had. Is it marketing bupkis, or will game engines now actually benefit from blazing fast storage?

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

MarsellusWallace posted:

I've seen this bandied about a bunch of times - what does fast storage get you? I went from a Samsung 850 to a fast NVMe drive and, other than file transfers, have seen basically no difference. Even load times 'feel' very similar, and I have seen no impact whatsoever on framerates. NVMe has been out long enough that surely developers have had time to take advantage of whatever gains could be had. Is it marketing bupkis, or will game engines now actually benefit from blazing fast storage?

NVMe only pulls meaningfully ahead at queue depths higher than the average consumer ever puts their SSD under; for games the difference isn't perceptible.

The PS5 is supposed to have dedicated decompression hardware such that it can stream assets directly from storage in a 'new and revolutionary' way, but Sony's been pretty mum about the console as a whole, so other than some game devs totally swearing it'll change everything, nobody knows.

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.

MarsellusWallace posted:

I've seen this bandied about a bunch of times - what does fast storage get you? I went from a Samsung 850 to a fast NVMe drive and, other than file transfers, have seen basically no difference. Even load times 'feel' very similar, and I have seen no impact whatsoever on framerates. NVMe has been out long enough that surely developers have had time to take advantage of whatever gains could be had. Is it marketing bupkis, or will game engines now actually benefit from blazing fast storage?
As I understand it, the raw hardware isn't necessarily faster, but it's used in a much more optimised way: dedicated hardware decompression to transparently get more data over the same bandwidth without putting any load on the CPU, and some sort of optimisation to get data from the SSD to GPU memory without the traditional bottlenecks (which otherwise limit the gains from NVMe over SATA).

Pablo Bluth fucked around with this message at 21:05 on Jun 4, 2020

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.

MarsellusWallace posted:

I've seen this bandied about a bunch of times - what does fast storage get you? I went from a Samsung 850 to a fast NVMe drive and, other than file transfers, have seen basically no difference. Even load times 'feel' very similar, and I have seen no impact whatsoever on framerates. NVMe has been out long enough that surely developers have had time to take advantage of whatever gains could be had. Is it marketing bupkis, or will game engines now actually benefit from blazing fast storage?

Consoles have had limited amounts of RAM, and games have been restricted by how fast they can load assets from disk. One example that has been given was some Spiderman game, where how fast you could swing around the city was directly limited by drive speed. Since consoles had slow drives, and computers usually had more RAM and VRAM, there wasn't much reason to design games for fast drives.

Truga
May 4, 2014
Lipstick Apathy
also, "faster than anything else available" is a bit silly when you can easily RAID-0 cheap small NVMe drives and get absolutely insane speeds on a PC, yet nobody really does that.

what i wonder though is, are ps5 xboxxx going to be another $599 thing :v:

Inept
Jul 8, 2003

Even if they are, thanks to inflation, the PS3 launch price would now be ~$760. It was a stupid expensive console when it came out.
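That inflation adjustment is easy to sanity-check; the CPI index values below are assumed approximations, not official BLS figures:

```python
# Back-of-envelope inflation adjustment for the PS3's $599 launch
# price (Nov 2006) into mid-2020 dollars. The CPI index values are
# rough assumed figures, so treat the result as approximate.
cpi_2006 = 201.5   # assumed US CPI, late 2006
cpi_2020 = 257.0   # assumed US CPI, mid 2020

launch_price = 599
adjusted = launch_price * cpi_2020 / cpi_2006
print(f"${launch_price} in 2006 is roughly ${adjusted:.0f} in 2020 dollars")
```

Which lands in the same ~$760 ballpark as the post above.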

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:50 on Mar 23, 2021

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
They aren't going to be in the same price category. MS is going big unit first, small unit later, because the small unit wouldn't be distinguished enough from the Bone X. Sony is doing the reverse, because they can sell a cheaper unit that's still a huge step up from the PS4 Pro.

crazypenguin
Mar 9, 2005
nothing witty here, move along

MarsellusWallace posted:

I've seen this bandied about a bunch of times - what does fast storage get you? I went from a Samsung 850 to a fast NVMe drive and other than file transfers have seen basically no difference. Even load times 'feel' very similar,

We'll see, but the basic idea here is that an 8-core CPU can zlib-decompress about 600 MB/s of input data (which is about what SATA can supply) while using 100% of the CPU.

Both consoles (I think? At least the PS5) will have hardware decompression in the SoC. This can decode as fast as the data can arrive, without using any CPU time. It's this hardware offload that's key.

Basically, consoles went from spinning rust to SSDs and immediately went "oh god, we definitely need to offload decompression!", and then they just did it. Meanwhile, there are no announced plans for that to come to PC yet. We're waiting on AMD/Intel for that. Still crickets. But at least Microsoft has announced DirectStorage, so as soon as this capability comes to PC hardware, there's a way to use it.

Will it matter? Who knows, let's see. But at least in theory there's something like 17x faster loading possible there, or engines could be adapted to just 100% eliminate load times. That's pretty sweet.
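You can see the single-thread decompression cost for yourself with Python's zlib -- a hedged sketch on synthetic, highly repetitive data, so the throughput it prints will be far better than real game assets would get; the point is only that decompression eats CPU time at all:

```python
import time
import zlib

# Build ~64 MiB of synthetic, repetitive sample data. Real assets
# compress far worse, so this overstates throughput considerably.
raw = b"game asset data " * (4 * 2**20)   # 16 B * 4 Mi = 64 MiB
blob = zlib.compress(raw, level=6)

start = time.perf_counter()
out = zlib.decompress(blob)
elapsed = time.perf_counter() - start

mib = len(out) / 2**20
print(f"decompressed {mib:.0f} MiB in {elapsed:.3f} s "
      f"({mib / elapsed:.0f} MiB/s on one thread)")
```

On realistic data, per-thread numbers times core count land in the same ballpark as the ~600 MB/s estimate above -- which is exactly the CPU budget a hardware decompressor frees up.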

repiv
Aug 13, 2009

crazypenguin posted:

Both consoles (I think? At least the PS5) will have hardware decompression in the SoC.

Both do, the implementations are just a bit different. PS5 has the very strong general purpose Kraken codec in hardware, XBSX has the weaker zlib codec for general purpose data but also has a new specialized codec for super-compressing GPU compressed textures. It remains to be seen which strategy has better throughput in the real world.


mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

Some Goon posted:

Sony's been pretty mum about the console as a whole

I mean sure, unless you count that one video where Mark Cerny talks directly to a camera for fifty-three minutes about the PS5 hardware, with a focus on the design and performance of the storage subsystem. But other than that, yeah.
