Howard Phillips
May 4, 2008

His smile; it shines in the darkest of depths. There is hope yet.

movax posted:

As one of my profs used to say, there isn’t analog and digital, there’s just analog and really really fast analog. The design of high-performance caches and other SRAMs is fascinating.

I wonder if anyone foresaw the tight integration of power management, basically playing the game around a control loop that keeps voltage above BOR levels, becoming such a significant area of investment.

Cache design comes down to temporal and spatial locality. When running applications and background services, your computer is usually executing certain functions over and over again. If the memory addresses or access timing of that recurring code can be predicted, there's no need to go out to main memory, virtual memory, or disk to retrieve it; each of those accesses, respectively, is at least an order of magnitude slower. Looking forward to more efficient and clever caching algorithms that will keep improving processor performance even as core clock speeds reach their physical limits for transistor-based microarchitectures.
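
To make the locality idea concrete, here's a toy C sketch (the matrix size and layout are made up, purely illustrative, not from any real workload): summing a matrix row by row touches memory sequentially, so each cache line fetched from DRAM serves several consecutive elements, while summing column by column jumps a full row's worth of bytes every step and misses constantly, even though both loops do the same arithmetic.

code:
#include <stdio.h>
#include <stdlib.h>

#define N 4096  /* hypothetical 4096x4096 matrix of doubles, ~128 MiB */

/* Row-major traversal: consecutive iterations touch adjacent addresses,
 * so one 64-byte cache line serves 8 doubles (spatial locality). */
static double sum_rows(const double *m)
{
    double s = 0.0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            s += m[i * N + j];
    return s;
}

/* Column-major traversal of the same row-major array: each access lands
 * N * 8 bytes past the previous one, so nearly every access misses the
 * cache and waits on DRAM. Same math, very different memory behaviour. */
static double sum_cols(const double *m)
{
    double s = 0.0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            s += m[i * N + j];
    return s;
}

int main(void)
{
    double *m = calloc((size_t)N * N, sizeof *m);
    if (!m)
        return 1;
    printf("row-major sum: %f\n", sum_rows(m));
    printf("col-major sum: %f\n", sum_cols(m));
    free(m);
    return 0;
}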


Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I'm pretty sure he meant the electrical and signalling bullshit involved in implementing SRAM caches.

CerealKilla420
Jan 3, 2014

"I need a handle man..."
Just got a 3600x. I cheaped out because I was looking to spend <$500. It is incredibly cool and good. Not to mention fast.

A huge upgrade from my dogshit FX-8320E. God FX was bad. I'm sure people will try and defend it a few years from now but holy poo poo was that chip worthless.

movax
Aug 30, 2008

Combat Pretzel posted:

I'm pretty sure he meant the electrical and signalling bullshit involved in implementing SRAM caches.

Yeah, this is what I was primarily talking about, but the post about temporal and spatial locality being fundamental to a cache doing its job is certainly correct.

There are increasing amounts of EDAC (error detection and correction) being found in caches now as transistors shrink; ECC on L2 caches has been offered a la carte to silicon integrators by IP vendors for some time now.
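
For anyone wondering what EDAC in a cache boils down to at the very simplest level, here's a hedged toy sketch in C (not how any real cache controller is built, and the struct and function names are made up): keep one parity bit next to each data word and check it on every read, which detects any single-bit flip. Real L2/L3 designs extend the idea to SECDED codes (roughly 8 check bits per 64 data bits) so a single-bit error can also be corrected on the fly.

code:
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Toy "cache line" entry: 64 data bits plus a single even-parity bit.
 * One parity bit can only detect a single-bit error; the SECDED codes
 * used in real caches can correct it as well. */
struct line {
    uint64_t data;
    uint8_t  parity;
};

static uint8_t even_parity(uint64_t x)
{
    /* XOR-fold the word down to one bit. */
    x ^= x >> 32; x ^= x >> 16; x ^= x >> 8;
    x ^= x >> 4;  x ^= x >> 2;  x ^= x >> 1;
    return (uint8_t)(x & 1);
}

static void store(struct line *l, uint64_t value)
{
    l->data = value;
    l->parity = even_parity(value);
}

/* Returns false if the word no longer matches its stored parity. */
static bool load(const struct line *l, uint64_t *out)
{
    if (even_parity(l->data) != l->parity)
        return false;   /* detected, but not correctable with 1 bit */
    *out = l->data;
    return true;
}

int main(void)
{
    struct line l;
    uint64_t v;
    store(&l, 0xDEADBEEFCAFEF00Dull);
    l.data ^= 1ull << 17;  /* simulate a bit flip in the array */
    puts(load(&l, &v) ? "read ok" : "parity error detected");
    return 0;
}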

90s Solo Cup
Feb 22, 2011

To understand the cup
He must become the cup



64bit_Dophins posted:

Just got a 3600x. I cheaped out because I was looking to spend <$500. It is incredibly cool and good. Not to mention fast.

A huge upgrade from my dogshit FX-8320E. God FX was bad. I'm sure people will try and defend it a few years from now but holy poo poo was that chip worthless.

FX gave you more cores, but did it in such a lovely way that you were better off having fewer cores and better performance with Intel. Ryzen went a long way toward righting that wrong.

If the 3800x is still $329 in a few weeks, I might actually pull the trigger on one.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

64bit_Dophins posted:

Just got a 3600x. I cheaped out because I was looking to spend <$500. It is incredibly cool and good. Not to mention fast.

I don't think the 3600(x or otherwise) is cheaping out. I think it's a sweet spot for personal/desktop use. The Ryzen X600 chips have all had really good punch for the cost, and that cost has been low enough that a more aggressive upgrade cadence is affordable -- especially if you do those upgrades a year(ish) behind the release date.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

Balliver Shagnasty posted:

FX gave you more cores, but did it in such a lovely way that you were better off having fewer cores and better performance with Intel. Ryzen did magnitudes to right that wrong.

If the 3800x is still $329 in a few weeks, I might actually pull the trigger on one.

The 3800X doesn't meaningfully outperform the 3700X, literally a couple percent in select benchmarks, while drawing significantly more power. It just exists to fill a hole in the SKU lineup.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
What's the current recommended RAM for a new Ryzen 3600 build? Cheap 3200MHz, or slower E-die and overclock?

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Lungboy posted:

What's the current recommended RAM for a new Ryzen 3600 build? Cheap 3200MHz, or slower E-die and overclock?

the difference between 3200 and 3600 is not that big. E-die isn't that expensive though

Crunchy Black
Oct 24, 2017

by Athanatos

Paul MaudDib posted:

holy poo poo

If there was something awful gold, this post fuckin' deserves it, thanks!

GRINDCORE MEGGIDO
Feb 28, 1985


Crunchy Black posted:

If there was something awful gold, this post fuckin' deserves it, thanks!

Content like that makes me happy. Good thread.

Leandros
Dec 14, 2008

I have an i5 750 so it's time for a refresh and AMD definitely seems like a smart bet this time. I'd prefer to do it sooner rather than later, but the ~20% IPC gains Zen3 apparently touts are sounding pretty good too. Would it make sense to go for a 3700X now with an X570 mobo and swap out the CPU for a 4000 series somewhere down the line, or will the X570 be a bottleneck? The 3800X is about 35 euros more but doesn't seem worth the investment. I'm not getting any super authoritative info that Zen3 would even run on AM4, aside from the fact that Epyc3 will be running DDR4 and a move to AM5 would imply DDR5 memory, which won't be manufactured in high enough volumes around the release of Zen3. Considering the X570 is currently high-end, I would assume it ought to be enough, but I honestly don't know enough about this stuff.

VorpalFish
Mar 22, 2007
reasonably awesometm

Leandros posted:

I have an i5 750 so it's time for a refresh and AMD definitely seems like a smart bet this time. I'd prefer to do it sooner rather than later, but the ~20% IPC gains Zen3 apparently touts are sounding pretty good too. Would it make sense to go for a 3700X now with an X570 mobo and swap out the CPU for a 4000 series somewhere down the line, or will the X570 be a bottleneck? The 3800X is about 35 euros more but doesn't seem worth the investment. I'm not getting any super authoritative info that Zen3 would even run on AM4, aside from the fact that Epyc3 will be running DDR4 and a move to AM5 would imply DDR5 memory, which won't be manufactured in high enough volumes around the release of Zen3. Considering the X570 is currently high-end, I would assume it ought to be enough, but I honestly don't know enough about this stuff.

There's always something better around the corner with computer hardware. You should upgrade when you want something better unless you know the newer architecture is releasing like next month.

I'd also forget about upgrading zen2 to zen3. Socket compatibility sounds great in theory but in practice, spending 300+ for a 15-20% performance uplift is terrible value.

Coming from a 750, you'd probably be super happy with a 3700x or even a 3600 depending on your workloads. Also consider b450 over x570 if you don't have a compelling use case for pcie4 because it's cheaper and gently caress active chipset cooling forever.

CrazyLoon
Aug 10, 2015

"..."

VorpalFish posted:

I'd also forget about upgrading zen2 to zen3. Socket compatibility sounds great in theory but in practice, spending 300+ for a 15-20% performance uplift is terrible value.

The main reason I got a basic zen+ at bargain bin pricing instead. Would've been truly brave and gone for a 1000 series for max pricing :smuggo: buuut X570 mobos don't officially support those so...not that brave lol.

BeastOfExmoor
Aug 19, 2003

I will be gone, but not forever.

Leandros posted:

I'd prefer to do it sooner rather than later, but the ~20% IPC gains Zen3 apparently touts are sounding pretty good too.

20% IPC sounds wildly optimistic to me, but I'm a little out of the loop. Is there someone credible reporting these numbers?

Leandros
Dec 14, 2008

VorpalFish posted:

I'd also forget about upgrading zen2 to zen3. Socket compatibility sounds great in theory but in practice, spending 300+ for a 15-20% performance uplift is terrible value.

Coming from a 750, you'd probably be super happy with a 3700x or even a 3600 depending on your workloads. Also consider b450 over x570 if you don't have a compelling use case for pcie4 because it's cheaper and gently caress active chipset cooling forever.
I'd obviously sell the 3700X if I were to get a new one, so it'd not be that big a hit, but you're probably right that it might not be worth it. As for X570, I was originally looking for a mobo with 8 SATA ports as I have a buttload of storage and those seemed the cheapest option. I'm now going for a NAS build, so the more common 6 SATA ports should be fine.
I'm not a fan of the chipset cooler either, and will probably go for an RTX 2070S, so PCIe 3.0 seems to be alright for now. However I do tend to upgrade GPU about twice as often as CPU, so would I be alright with that for the coming, say, 5 years?

BeastOfExmoor posted:

20% IPC sounds wildly optimistic to me, but I'm a little out of the loop. Is there someone credible reporting these numbers?
Nothing authoritative either, but e.g. http://www.redgamingtech.com/amd-zen-3-more-info-on-ipc-clock-speeds-am5-follows-am4-exclusive/

Leandros fucked around with this message at 18:59 on Dec 29, 2019

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

lol god not this guy

uhhhhahhhhohahhh
Oct 9, 2012
I went 3600 & X570 when I upgraded from my 2500k because if Zen3 is super juicy I'll get an 8 core, flip the 3600 and stick with that for hopefully as long as the 2500k lasted me.

VorpalFish
Mar 22, 2007
reasonably awesometm

Leandros posted:

I'd obviously sell the 3700X if I were to get a new one, so it'd not be that big a hit, but you're probably right that it might not be worth it. As for X570, I was originally looking for a mobo with 8 SATA ports as I have a buttload of storage and those seemed the cheapest option. I'm now going for a NAS build, so the more common 6 SATA ports should be fine.
I'm not a fan of the chipset cooler either, and will probably go for an RTX 2070S, so PCIe 3.0 seems to be alright for now. However I do tend to upgrade GPU about twice as often as CPU, so would I be alright with that for the coming, say, 5 years?

Nothing authoritative either, but e.g. http://www.redgamingtech.com/amd-zen-3-more-info-on-ipc-clock-speeds-am5-follows-am4-exclusive/

I mean a 2080ti right now is fine on 8 lanes of pcie3, let alone 16. I don't see pcie4 being useful for gpus for awhile. Better use case is like 10/100gig ethernet or feeding nvme drives with fewer lanes so you can provide more connectivity from the chipset, or maybe some future thunderbolt spec.

*yes I know those recently released amd 5500s are showing bottlenecks on pcie3 x8 probably because amd hosed something up but they're garbage value cards and nobody should be buying them anyways.

movax
Aug 30, 2008

VorpalFish posted:

I mean a 2080ti right now is fine on 8 lanes of pcie3, let alone 16. I don't see pcie4 being useful for gpus for awhile. Better use case is like 10/100gig ethernet or feeding nvme drives with fewer lanes so you can provide more connectivity from the chipset, or maybe some future thunderbolt spec.

*yes I know those recently released amd 5500s are showing bottlenecks on pcie3 x8 probably because amd hosed something up but they're garbage value cards and nobody should be buying them anyways.

I think it’s more appealing for the first reasons there, using fewer lanes / physical I/O where possible. I always liked doing bandwidth bridging in designs with PCIe Gen 2 or 3 feeding into a switch that would fan out to slower devices that didn’t need all the bandwidth. Getting 10 GbE in x1 would be great for density.
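
Rough back-of-the-envelope numbers behind the 10 GbE-in-x1 idea (just the published per-lane rates, ignoring PCIe packet/protocol overhead): Gen3 runs 8 GT/s per lane with 128b/130b encoding, Gen4 doubles that, so a single Gen4 lane clears 10 Gb/s with room to spare while a Gen3 x1 falls just short.

code:
#include <stdio.h>

int main(void)
{
    /* Per-lane raw rates in GT/s; both generations use 128b/130b encoding.
     * These figures ignore PCIe header/protocol overhead. */
    const double encoding = 128.0 / 130.0;
    const double gen3_gbps = 8.0 * encoding;    /* ~7.88 Gb/s per x1 lane  */
    const double gen4_gbps = 16.0 * encoding;   /* ~15.75 Gb/s per x1 lane */

    printf("PCIe 3.0 x1: %.2f Gb/s (%.2f GB/s)\n", gen3_gbps, gen3_gbps / 8.0);
    printf("PCIe 4.0 x1: %.2f Gb/s (%.2f GB/s)\n", gen4_gbps, gen4_gbps / 8.0);
    printf("10GbE wants 10 Gb/s: Gen4 x1 fits, Gen3 x1 falls short\n");
    return 0;
}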

VorpalFish
Mar 22, 2007
reasonably awesometm

movax posted:

I think it’s more appealing for the first reasons there, using fewer lanes / physical I/O where possible. I always liked doing bandwidth bridging in designs with PCIe Gen 2 or 3 feeding into a switch that would fan out to slower devices that didn’t need all the bandwidth. Getting 10 GbE in x1 would be great for density.

100% agreed, but I think we're a long way away from the point where a typical home user is going to benefit, which is why I tend to steer people away unless they have a specific use case. How many people need even a single 10GbE port in their home desktop at this point, or will in the next 5 years?

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

VorpalFish posted:

I mean a 2080ti right now is fine on 8 lanes of pcie3, let alone 16. I don't see pcie4 being useful for gpus for awhile. Better use case is like 10/100gig ethernet or feeding nvme drives with fewer lanes so you can provide more connectivity from the chipset, or maybe some future thunderbolt spec.

*yes I know those recently released amd 5500s are showing bottlenecks on pcie3 x8 probably because amd hosed something up but they're garbage value cards and nobody should be buying them anyways.

The only reason they show a difference is that with 4GB of VRAM you need to stream data in over PCIe.


PCIe speed only matters if that's your bottleneck, and that only happens if your scene can't fit into VRAM.
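
Putting rough numbers on that (ballpark figures, assuming a hypothetical midrange card with a 128-bit GDDR6 bus at 14 Gbps per pin): local VRAM gives the GPU an order of magnitude more bandwidth than even a full PCIe 3.0 x16 link, so anything that has to stream over the bus every frame hurts, and dropping to x8 roughly doubles the pain.

code:
#include <stdio.h>

int main(void)
{
    /* PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, protocol overhead ignored. */
    const double lane_gbs = 8.0 * (128.0 / 130.0) / 8.0;  /* ~0.985 GB/s per lane */
    const double x16 = 16 * lane_gbs;                     /* ~15.8 GB/s */
    const double x8  =  8 * lane_gbs;                     /* ~7.9 GB/s  */

    /* Hypothetical midrange card: 128-bit GDDR6 bus at 14 Gbps per pin. */
    const double vram = (128.0 / 8.0) * 14.0;             /* 224 GB/s */

    printf("PCIe 3.0 x16: %6.1f GB/s\n", x16);
    printf("PCIe 3.0 x8 : %6.1f GB/s\n", x8);
    printf("local VRAM  : %6.1f GB/s\n", vram);
    return 0;
}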

tl;dr buy more vram

CrazyLoon
Aug 10, 2015

"..."

Leandros posted:

I'm not a fan of the chipset cooler either,

Just a quick note on this, if it performs as it should you will literally not even hear it over all the other fans + if you have any kind of decent aircooling from the front, I seriously doubt it's even needed. There were some early BIOS versions that made them annoying, but as of this past month they are utterly beneath notice imo.

Leandros
Dec 14, 2008

CrazyLoon posted:

Just a quick note on this, if it performs as it should you will literally not even hear it over all the other fans + if you have any kind of decent aircooling from the front, I seriously doubt it's even needed. There were some early BIOS versions that made them annoying, but as of this past month they are utterly beneath notice imo.

That's good to know, but it's not just the noise for me. I used my current mobo for a good 9 years, so fan failure is also on my mind. I'm guessing it's gonna be some poo poo custom part that's either impossible to find down the line or unnecessarily expensive.

After some more looking around, I think I'll just go for a high-end X470 and 3700X. Too many mobos that either are EATX or need multiple 12V connectors, which would mean upgrading the case and/or PSU. Guess I slowed down my incremental upgrading a bit too much to keep up v:shobon:v

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC
Finished my AMD 3600 build yesterday and went through some issues today.

So I was playing some CPU-heavy games and noticed the CPU wasn't boosting to its advertised 4.2GHz. It was only going to about 3.8-3.9GHz.

I did some Googling, headed into the BIOS, and turned on "Game Boost", and it boosted up to 4.18GHz. All good, or so I thought.

I then decided to put together the Lego set my uncle gave me for Christmas and switched on World Community Grid to crunch some tasks while I was away. When I got back the computer had shut down. While troubleshooting this I noticed in Task Manager that the clocks were stuck at 4.2GHz even as the machine was idling. When WCG was using all the cores and threads the temps rose to 104C before I suspended the program. I fixed this by turning "Game Boost" back off in the BIOS, and temps are stable at 88C while crunching WCG tasks.

Can the stock cooler the 3600 comes with just not handle the CPU when it's pushed that hard? It seemed fine while gaming, but obviously games aren't going to use as many cores and threads as WCG can.

OhFunny fucked around with this message at 03:44 on Dec 30, 2019

CrazyLoon
Aug 10, 2015

"..."
AFAIK the stock cooler for the 3600 is the same as the one for the 2600 that I just put into my build - namely the kind that dumps its heat onto and around the motherboard - so it's bound to be a very similar experience either way (i.e. not ideal for overclocking). I tried benchmarking it with a conservative OC for a while and decided very quickly that, rather than run searing-hot benchmarks that look fine until a real stress test shuts the computer down under that kind of voltage + heat on that sort of stock cooler, I'd prefer a chilly, undervolted experience that survives any stress test you throw at it. Now it never goes past 60 celsius while crunching anything hard at stock, and you know what...that's a-okay by me since I didn't buy this thing for super speed.

CrazyLoon fucked around with this message at 03:40 on Dec 30, 2019

SeaGoatSupreme
Dec 26, 2009
Ask me about fixed-gear bikes (aka "fixies")
The packaged AMD coolers are okay for stock, not pinned-for-hours cooling. If you ask anything more from most of them they just kind of fall apart.

Get a big ole baby-head-sized air cooler, or a two-fan AIO if you want to let the motherboard overvolt the CPU to make it boost as high as it can.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I just finished putting together an APU build.

I was pretty despairing last night upon seeing that at least one of the pins on the CPU was bent, especially since I'd watched a video on how to fix bent pins on an Intel motherboard and saw that it was a very delicate and difficult process, but as a Hail Mary I also looked for how to fix bent pins on an AMD CPU and found that it was much easier since you're not limited to working within the close confines of the motherboard socket.

https://www.youtube.com/watch?v=y8U2NkbiMAI

So I bought some razorblades at the drug store and had at it. I managed to straighten them out enough that I could now seat the CPU in the socket properly (I couldn't before), but by that time it was 1 AM so I left it off for later.

This morning I bench-tested the drat thing and it went to the BIOS, everything looked normal, and I was getting the full 16 gigs of RAM. I spent the rest of the morning putting together the rest of the case, the fans, the SSD, installed Windows and drivers, and everything seemed normal.

The only snag I ran into was that this old, busted-up circa-2001 case I was using (which was really the point of the build, I wanted to put something inside this case to practice my assembly skills) could not for some reason properly seat/fit the GTX 650 I wanted to put inside it. I think maybe a low-profile or single-slot GPU could have fit, but the only one I have is a GT 710 and the APU's on-board graphics are probably better than that.

It's an A8-7650K, so pretty old tech, but the whole build ran to less than 200 dollars, and it's the first computer I've built entirely by myself, so I feel pretty good about that, especially having tackled something like a bent pin as a bit of extra troubleshooting I had to learn along the way.

That said, I wouldn't recommend it for anyone else, because an Athlon 200GE/3000 is not that much more expensive, even if you have to get DDR4 RAM.

Cygni
Nov 12, 2005

raring to post

OhFunny posted:

Finished my AMD 3600 build yesterday and went through some issues today.

So I was playing some CPU-heavy games and noticed the CPU wasn't boosting to its advertised 4.2GHz. It was only going to about 3.8-3.9GHz.

I did some Googling, headed into the BIOS, and turned on "Game Boost", and it boosted up to 4.18GHz. All good, or so I thought.

I then decided to put together the Lego set my uncle gave me for Christmas and switched on World Community Grid to crunch some tasks while I was away. When I got back the computer had shut down. While troubleshooting this I noticed in Task Manager that the clocks were stuck at 4.2GHz even as the machine was idling. When WCG was using all the cores and threads the temps rose to 104C before I suspended the program. I fixed this by turning "Game Boost" back off in the BIOS, and temps are stable at 88C while crunching WCG tasks.

Can the stock cooler the 3600 comes with just not handle the CPU when it's pushed that hard? It seemed fine while gaming, but obviously games aren't going to use as many cores and threads as WCG can.

AMD and Intel (with its thermal velocity boost) both use boost clocks to mean "things you will only see for a millisecond when the CPU isn't loaded". The number is no longer something you should expect to see all the time, especially not for all-core workloads.

The newer BIOS versions with the newer AGESA revisions do help clock behavior, although most people still report that they land a little off the advertised peaks.

I will say, if the 3.8 number was on 1 core workloads, that does seem awfully low. Around 4.1 was what I saw with the 3600 I had my hands on.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot

OhFunny posted:

Finished my AMD 3600 build yesterday and went through some issues today.

So I was playing some CPU-heavy games and noticed the CPU wasn't boosting to its advertised 4.2GHz. It was only going to about 3.8-3.9GHz.

I did some Googling, headed into the BIOS, and turned on "Game Boost", and it boosted up to 4.18GHz. All good, or so I thought.

I then decided to put together the Lego set my uncle gave me for Christmas and switched on World Community Grid to crunch some tasks while I was away. When I got back the computer had shut down. While troubleshooting this I noticed in Task Manager that the clocks were stuck at 4.2GHz even as the machine was idling. When WCG was using all the cores and threads the temps rose to 104C before I suspended the program. I fixed this by turning "Game Boost" back off in the BIOS, and temps are stable at 88C while crunching WCG tasks.

Can the stock cooler the 3600 comes with just not handle the CPU when it's pushed that hard? It seemed fine while gaming, but obviously games aren't going to use as many cores and threads as WCG can.

Unlike Intel, current AMD CPUs generally do not actually boost to their advertised frequencies at all, or for well under a second if they do. It doesn't really matter, because all the benchmarks you've seen for every AMD CPU involved them running at their real rather than advertised boost clocks. The performance is real, the clocks are not. If you're hitting what you should in benches, leave it alone.

You could probably get VERY slightly more performance with a better cooler, but it's unlikely to be significant - certainly not enough to warrant the price. Specialized cooling is more about noise than performance.

e - I should also mention that the reason for this is that current AMD CPUs are automatically pushing themselves basically as hard as it makes any sense to push them, and there is really nothing to gain from overclocking.

K8.0 fucked around with this message at 07:12 on Dec 30, 2019

yomisei
Mar 18, 2011

gradenko_2000 posted:

I just finished putting together an APU build.

I was pretty despairing last night upon seeing that at least one of the pins on the CPU was bent, especially since I'd watched a video on how to fix bent pins on an Intel motherboard and saw that it was a very delicate and difficult process, but as a Hail Mary I also looked for how to fix bent pins on an AMD CPU and found that it was much easier since you're not limited to working within the close confines of the motherboard socket.

Pin-bender buddy :hfive:

I had all four corners of a 2200G bent after Mr. eBay from "2nd story, right apartment" (who writes these things on his shipping label?) just put the APU back into the plastic holder. Angered by this, I accidentally swept it off my desk and the thing hit the ground, pins toward the floor.

In the end I had to straighten up about 20 pins, but with a small razorblade-like knife it was pretty doable. Works fine in a DeskMini A300 with DDR4-3000 CL16.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
There seem to be lots of reports on various forums of the ASRock B450i Fatal1ty board having huge issues with 3000-series Ryzen chips. Has anyone experienced it? I'd rather not fork out the extra for the Asus board, but at least that one seems to work.

L33t_Kefka
Jul 16, 2000

My 1337 littl3 magic us3r, put 0n this cr0wn, bitch! H4W H4W! I 0wn j00!!!!

Lungboy posted:

There seem to be lots of reports on various forums of the ASRock B450i Fatal1ty board having huge issues with 3000-series Ryzen chips. Has anyone experienced it? I'd rather not fork out the extra for the Asus board, but at least that one seems to work.

I ended up recently getting an ASRock B450 Steel Legend and a 3600X and it's working fine for me.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP

L33t_Kefka posted:

I ended up recently getting an ASRock B450 Steel Legend and a 3600X and it's working fine for me.

Aye, it seems to be specific to the ITX Fatal1ty board.

vvvv I have ethernet at home so WiFi and Bluetooth are essentially useless for me.

Lungboy fucked around with this message at 15:01 on Dec 30, 2019

Mu Zeta
Oct 17, 2002

Me crush ass to dust

I really like the wifi and bluetooth on my Asus ITX board.

Lungboy
Aug 23, 2002

NEED SQUAT FORM HELP
Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

The 3600 will work in pretty much anything. It's not a demanding chip.

Arzachel
May 12, 2012

Lungboy posted:

Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock.

The VRM tier list is MSI > Asus > Gigabyte > ASRock, but all the B450 ITX boards will be perfectly fine with a stock 3900X, never mind a 3600.

orcane
Jun 13, 2012

Fun Shoe
VRM "tier lists" are dumb in many ways, but the B450 I Aorus Pro Wifi is still listed as a "mid range" option for Ryzen 2000 on the old X470/B450 list and the newer CPUs actually have a lower power consumption so :shrug:

Lungboy posted:

Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock.
I have that board with an undervolted and TDP-limited 2700X and it's fine, a 3600 will be alright. It has adequate VRMs and the heatsink seems serviceable. If you're overclocking you have to make sure airflow is hitting the area on all ITX boards, but you shouldn't be overclocking Ryzen 3000 in the first place.

The only real downsides, IMO, are 1) Gigabyte's RGB software is atrocious if you care about that and 2) unlike the MSI MAX mainboards there's no guarantee you get a board that supports Ryzen 3000 out of the box, so you'd have to use a loaner CPU or bring it to a place where they update the BIOS for you - the first BIOS release supporting the latest Ryzens was released in May but it's theoretically possible the board was sitting in a warehouse longer than that. Otherwise I think it's a great board for budget ITX Ryzen builds, with Intel WiFi and LAN and ALC1220 sound.


Spiderdrake
May 12, 2001



There is definitely no manufacturer VRM tier list, given that they vary from board to board immensely.

Lungboy posted:

Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock.
Here's da list https://docs.google.com/spreadsheets/d/1d9_E3h8bLp-TXr-0zTJFqqVxdCR9daIVNyMatydkpFA/edit#gid=611478281

Looks like you're right. It'll still work, though.
