|
movax posted:As one of my profs used to say, there isn’t analog and digital, there’s just analog and really really fast analog. The design of high-performance caches and other SRAMs is fascinating. Cache design comes down to temporal and spatial locality. When running applications and background services, your computer is usually doing certain functions over and over again. If the memory addresses or the access pattern of this recurring code can be predicted, there's no need to go out to main memory, virtual memory, or disk to retrieve it; each of those, respectively, is at least an order of magnitude slower. Looking forward to more efficient and clever caching algorithms that will improve processor performance even as core clock speeds reach their physical limits for transistor-based microarchitectures.
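The spatial-locality point above is easy to demo even from a high-level language. A toy sketch (the effect is far more dramatic in C, since CPython's interpreter overhead blunts it, but row-major traversal still tends to win because it walks consecutive addresses and uses every cache line it fetches):

```python
# Toy demonstration of spatial locality: iterating a 2D array in row-major
# order touches adjacent memory, so each cache line fetched from RAM is
# fully used; column-major order strides through memory and wastes them.
import time

N = 2000
# A flat list simulating a row-major N x N matrix.
matrix = [0.0] * (N * N)

def sum_row_major():
    total = 0.0
    for i in range(N):
        for j in range(N):
            total += matrix[i * N + j]  # consecutive addresses: cache-friendly
    return total

def sum_col_major():
    total = 0.0
    for j in range(N):
        for i in range(N):
            total += matrix[i * N + j]  # stride-N addresses: cache-hostile
    return total

for fn in (sum_row_major, sum_col_major):
    start = time.perf_counter()
    fn()
    print(fn.__name__, round(time.perf_counter() - start, 3), "s")
```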
|
# ? Dec 28, 2019 01:14 |
|
I'm pretty sure he meant the electrical and signalling bullshit involved in implementing SRAM caches.
|
# ? Dec 28, 2019 03:11 |
|
Just got a 3600x. I cheaped out because I was looking to spend <$500. It is incredibly cool and good. Not to mention fast. A huge upgrade from my dogshit FX-8320E. God FX was bad. I'm sure people will try and defend it a few years from now but holy poo poo was that chip worthless.
|
# ? Dec 28, 2019 08:17 |
|
Combat Pretzel posted:I'm pretty sure he meant the electrical and signalling bullshit involved in implementing SRAM caches. Yeah, this is what I was primarily talking about, but the post about temporal and spatial locality being fundamental to a cache doing its job is certainly correct. There are increasing amounts of EDAC (error detection and correction) being found in caches now as transistors shrink; ECC on L2 caches has been offered a la carte to silicon integrators by IP vendors for some time now.
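Real cache EDAC uses wide SECDED codes (Hamming(72,64) is typical for 64-bit words), but the mechanics scale down to the textbook Hamming(7,4). A minimal sketch of how a syndrome pinpoints and corrects a single flipped bit:

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits. The syndrome of a
# received codeword is 0 if clean; otherwise it equals the 1-based position
# of the single flipped bit, so correction is just one XOR.

def encode(nibble):
    d = [(nibble >> i) & 1 for i in range(4)]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    # codeword positions 1..7 = p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def correct(cw):
    c = list(cw)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the bad bit back
    data = (c[2], c[4], c[5], c[6])
    return sum(b << i for i, b in enumerate(data)), syndrome

cw = encode(0b1011)
cw[2] ^= 1                      # simulate a single-bit upset in storage
value, syndrome = correct(cw)
print(f"recovered {value:#06b}, syndrome pointed at bit {syndrome}")
```

Detecting (but not correcting) double-bit errors takes one extra overall parity bit, which is the "D" in SECDED.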
|
# ? Dec 28, 2019 16:43 |
|
64bit_Dophins posted:Just got a 3600x. I cheaped out because I was looking to spend <$500. It is incredibly cool and good. Not to mention fast. FX gave you more cores, but did it in such a lovely way that you were better off having fewer cores and better performance with Intel. Ryzen did magnitudes to right that wrong. If the 3800x is still $329 in a few weeks, I might actually pull the trigger on one.
|
# ? Dec 28, 2019 17:13 |
|
64bit_Dophins posted:Just got a 3600x. I cheaped out because I was looking to spend <$500. It is incredibly cool and good. Not to mention fast. I don't think the 3600(x or otherwise) is cheaping out. I think it's a sweet spot for personal/desktop use. The Ryzen X600 chips have all had really good punch for the cost, and that cost has been low enough that a more aggressive upgrade cadence is affordable -- especially if you do those upgrades a year(ish) behind the release date.
|
# ? Dec 28, 2019 17:52 |
|
Balliver Shagnasty posted:FX gave you more cores, but did it in such a lovely way that you were better off having fewer cores and better performance with Intel. Ryzen did magnitudes to right that wrong. The 3800X only outperforms the 3700X by literally a couple percent in select benchmarks, while drawing significantly more power. It just exists to fill a hole in the SKU lineup.
|
# ? Dec 28, 2019 18:10 |
|
What's the current recommended RAM for a new Ryzen 3600 build? Cheap 3200MHz, or slower E-die and overclock?
|
# ? Dec 28, 2019 19:28 |
|
Lungboy posted:What's the current recommended ram for a new ryzen 3600 build? Cheap 3200mhz or slower e-die and over clock? the difference between 3200 and 3600 is not that big. E-die isn't that expensive though
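For perspective on why the gap is small, the raw numbers (a back-of-the-envelope sketch; real-world scaling is smaller still, and on Zen 2 the usual advice is that ~3600 is about as far as the memory clock and Infinity Fabric stay 1:1):

```python
# Theoretical peak bandwidth of dual-channel DDR4:
# transfers/s * 8 bytes per transfer * number of channels.
def ddr4_peak_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # GB/s

for speed in (3200, 3600):
    print(f"DDR4-{speed}: {ddr4_peak_gbs(speed):.1f} GB/s")
# 51.2 vs 57.6 GB/s on paper: a 12.5% bump, before latency and timings
# eat into it.
```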
|
# ? Dec 28, 2019 19:55 |
|
Paul MaudDib posted:holy poo poo If there was something awful gold, this post fuckin' deserves it, thanks!
|
# ? Dec 29, 2019 03:19 |
|
Crunchy Black posted:If there was something awful gold, this post fuckin' deserves it, thanks! Content like that makes me happy. Good thread.
|
# ? Dec 29, 2019 03:42 |
|
I have an i5 750, so it's time for a refresh, and AMD definitely seems like a smart bet this time. I'd prefer to do it sooner rather than later, but the ~20% IPC gains Zen3 apparently touts are sounding pretty good too. Would it make sense to go for a 3700X now with an X570 mobo and swap out the CPU for a 4000 series somewhere down the line, or will the X570 be a bottleneck? The 3800X is about 35 euros more but doesn't seem worth the investment. I'm not getting any super authoritative info that Zen3 would even run on AM4, aside from the fact that Epyc3 will be running DDR4 and a move to AM5 would imply DDR5 memory, which won't be manufactured in high enough volumes around the release of Zen3. Considering the X570 is currently high-end, I would assume it ought to be enough, but I honestly don't know enough about this stuff.
|
# ? Dec 29, 2019 17:41 |
|
Leandros posted:I have an i5 750 so it's time for a refresh and AMD definitely seems like a smart bet this time. I'd prefer to do it sooner rather than later, but the ~20% IPC gains the Zen3 apparently touts is sounding pretty good too. Would it make sense to go for a 3700X now with an X570 mobo and swap out the CPU for a 4k series somewhere down the line or will the X570 be a bottleneck? 3800X is about 35 euros more but doesn't seem worth the investment. I'm not getting any super authoritative info that Zen3 would even run on AM4, aside from the fact that Epyc3 will be running DDR4 and a move to AM5 would imply DDR5 memory, which won't be manufactured in high enough volumes around the release of Zen3. Considering the X570 is currently high-end, I would assume it ought to be enough, but I honestly don't know enough about this stuff. There's always something better around the corner with computer hardware. You should upgrade when you want something better unless you know the newer architecture is releasing like next month. I'd also forget about upgrading zen2 to zen3. Socket compatibility sounds great in theory but in practice, spending 300+ for a 15-20% performance uplift is terrible value. Coming from a 750, you'd probably be super happy with a 3700x or even a 3600 depending on your workloads. Also consider b450 over x570 if you don't have a compelling use case for pcie4 because it's cheaper and gently caress active chipset cooling forever.
|
# ? Dec 29, 2019 18:05 |
|
VorpalFish posted:I'd also forget about upgrading zen2 to zen3. Socket compatibility sounds great in theory but in practice, spending 300+ for a 15-20% performance uplift is terrible value. The main reason I got a basic zen+ at bargain bin pricing instead. Would've been truly brave and gone for a 1000 series for max pricing buuut X570 mobos don't officially support those so...not that brave lol.
|
# ? Dec 29, 2019 18:13 |
|
Leandros posted:I'd prefer to do it sooner rather than later, but the ~20% IPC gains the Zen3 apparently touts is sounding pretty good too. 20% IPC sounds wildly optimistic to me, but I'm a little out of the loop. Is there someone credible reporting these numbers?
|
# ? Dec 29, 2019 18:19 |
|
VorpalFish posted:I'd also forget about upgrading zen2 to zen3. Socket compatibility sounds great in theory but in practice, spending 300+ for a 15-20% performance uplift is terrible value. I'm not a fan of the chipset cooler either, and will probably go for an RTX 2070S, so PCIe 3.0 seems to be alright for now. However I do tend to upgrade GPU about twice as often as CPU, so would I be alright with that for the coming, say, 5 years? BeastOfExmoor posted:20% IPC sounds wildly optimistic to me, but I'm a little out of the loop. Is their someone credible reporting these numbers? Leandros fucked around with this message at 18:59 on Dec 29, 2019 |
# ? Dec 29, 2019 18:50 |
|
Leandros posted:
lol god not this guy
|
# ? Dec 29, 2019 19:04 |
|
I went 3600 & X570 when I upgraded from my 2500k because if Zen3 is super juicy I'll get an 8 core, flip the 3600 and stick with that for hopefully as long as the 2500k lasted me.
|
# ? Dec 29, 2019 20:01 |
|
Leandros posted:I'd obviously sell the 3700X if I were to get a new one, so it'd not be that big a hit, but you're probably right that it might not be worth it. As for X570, I was originally looking for a mobo with 8 SATA ports as I have a buttload of storage and those seemed the cheapest option. I'm now going for a NAS build, so the more common 6 SATA ports should be fine. I mean a 2080ti right now is fine on 8 lanes of pcie3, let alone 16. I don't see pcie4 being useful for gpus for awhile. Better use case is like 10/100gig ethernet or feeding nvme drives with fewer lanes so you can provide more connectivity from the chipset, or maybe some future thunderbolt spec. *yes I know those recently released amd 5500s are showing bottlenecks on pcie3 x8 probably because amd hosed something up but they're garbage value cards and nobody should be buying them anyways.
|
# ? Dec 29, 2019 20:54 |
|
VorpalFish posted:I mean a 2080ti right now is fine on 8 lanes of pcie3, let alone 16. I don't see pcie4 being useful for gpus for awhile. Better use case is like 10/100gig ethernet or feeding nvme drives with fewer lanes so you can provide more connectivity from the chipset, or maybe some future thunderbolt spec. I think it’s more appealing for the first reasons there, using fewer lanes / physical I/O where possible. I always liked doing bandwidth bridging in designs with PCIe Gen 2 or 3 feeding into a switch that would fan out to slower devices that didn’t need all the bandwidth. Getting 10 GbE in x1 would be great for density.
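The x1 10GbE math checks out, assuming the usual effective per-lane figures after encoding overhead (8b/10b for gen1/2, 128b/130b for gen3/4). A quick sketch:

```python
import math

# Approximate usable throughput per PCIe lane, GB/s, net of encoding overhead.
PCIE_LANE_GBS = {1: 0.25, 2: 0.5, 3: 0.985, 4: 1.969}

def lanes_needed(required_gbs, gen):
    return math.ceil(required_gbs / PCIE_LANE_GBS[gen])

ten_gbe = 10 / 8  # 10 Gb/s line rate = 1.25 GB/s
for gen in (3, 4):
    print(f"10GbE on PCIe gen{gen}: x{lanes_needed(ten_gbe, gen)}")
# gen3 needs x2, gen4 fits in x1: hence the density win described above.
```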
|
# ? Dec 29, 2019 21:11 |
|
movax posted:I think it’s more appealing for the first reasons there, using fewer lanes / physical I/O where possible. I always liked doing bandwidth bridging in designs with PCIe Gen 2 or 3 feeding into a switch that would fan out to slower devices that didn’t need all the bandwidth. Getting 10 GbE in x1 would be great for density. 100% agreed, but I think we're a long way away from the point where a typical home user is going to benefit, which is why I tend to steer people away unless they have a specific use case. How many people need even a single 10GbE port in their home desktop at this point, or will in the next 5 years?
|
# ? Dec 29, 2019 21:31 |
|
VorpalFish posted:I mean a 2080ti right now is fine on 8 lanes of pcie3, let alone 16. I don't see pcie4 being useful for gpus for awhile. Better use case is like 10/100gig ethernet or feeding nvme drives with fewer lanes so you can provide more connectivity from the chipset, or maybe some future thunderbolt spec. the only reason they show a difference is that with 4gb vram you need to stream data in over pcie. PCIe speed only matters if that's your bottleneck, and that only happens if your scene can't fit into VRAM. tl;dr buy more vram
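A rough illustration of why overflowing VRAM hurts so much more than a narrower link ever does when everything fits (the link rates are approximate effective figures, not measurements):

```python
# Time to stream assets across PCIe, as a share of a 60 fps frame budget.
def stream_ms(gigabytes, link_gbs):
    return gigabytes / link_gbs * 1000

FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps
for label, gbs in (("PCIe3 x8", 7.9), ("PCIe3 x16", 15.8), ("PCIe4 x16", 31.5)):
    ms = stream_ms(0.5, gbs)  # re-fetch 512 MB of textures mid-frame
    print(f"{label}: {ms:5.1f} ms = {ms / FRAME_MS:.1f} frames at 60 fps")
# Even on PCIe4 x16 that's nearly a full frame blown; if the scene fits in
# VRAM, the transfer never happens and the link width is irrelevant.
```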
|
# ? Dec 29, 2019 21:54 |
|
Leandros posted:I'm not a fan of the chipset cooler either, Just a quick note on this, if it performs as it should you will literally not even hear it over all the other fans + if you have any kind of decent aircooling from the front, I seriously doubt it's even needed. There were some early BIOS versions that made them annoying, but as of this past month they are utterly beneath notice imo.
|
# ? Dec 29, 2019 21:56 |
|
CrazyLoon posted:Just a quick note on this, if it performs as it should you will literally not even hear it over all the other fans + if you have any kind of decent aircooling from the front, I seriously doubt it's even needed. There were some early BIOS versions that made them annoying, but as of this past month they are utterly beneath notice imo. That's good to know, but it's not just the noise for me. I used my current mobo for a good 9 years, so fan failure is also on my mind. I'm guessing it's gonna be some poo poo custom part that's either impossible to find down the line or unnecessarily expensive. After some more looking around, I think I'll just go for a high-end X470 and 3700X. Too many mobos that either are EATX or need multiple 12V connectors, which would mean upgrading the case and/or PSU. Guess I slowed down my incremental upgrading a bit too much to keep up vv
|
# ? Dec 29, 2019 23:42 |
|
Finished my AMD 3600 build yesterday and went through some issues today. So I was playing some heavy CPU games and noticed the CPU wasn't boosting to its advertised 4.2GHz. It was going to about 3.8-3.9GHz. I did some Googling, headed into the BIOS, and turned on "Game Boost", and it boosts up to 4.18GHz. All good, or so I thought. I then decided to put together the Lego set my Uncle gave me for Christmas and switched on World Community Grid to crunch some tasks while I'm away. When I got back the computer had shut down. While troubleshooting this I noticed in Task Manager that the clocks were stuck at 4.2GHz even as the machine was idling. When WCG was using all the cores and threads, the temps rose to 104C before I suspended the program. I fixed this by turning "Game Boost" back off in the BIOS, and temps are stable at 88C while crunching WCG tasks. Can the stock cooler the 3600 comes with just not handle the CPU when it's pushed that hard? It seemed fine while gaming, but obviously games aren't going to use as many cores and threads as WCG can. OhFunny fucked around with this message at 03:44 on Dec 30, 2019 |
# ? Dec 30, 2019 02:58 |
|
AFAIK the stock cooler for the 3600 is the same as the one for the 2600 that I just put into my build, namely the kind that spreads the heat onto and around the motherboard, so it's bound to be a very similar experience either way (i.e. not ideal for overclocking). I tried benchmarking it with a conservative OC for a while and decided very quickly that, rather than run searing hot benchmarks that look fine until a real stress test shuts the computer down from that kind of voltage + heat on that sort of stock cooler, I'd prefer a chilly, undervolted experience while putting the CPU through any kind of stress test. Now it never goes past 60 celsius while crunching anything hard at stock, and you know what...that's a-okay by me since I didn't buy this thing for super speed.
CrazyLoon fucked around with this message at 03:40 on Dec 30, 2019 |
# ? Dec 30, 2019 03:37 |
|
The packaged amd coolers are okay for stock, not pinned-for-hours cooling. If you ask anything more from most of them they just kind of fall apart. Get a big ole baby-head sized air cooler, or a two fan aio if you want to let the motherboard overvolt the CPU to make it boost as high as it can.
|
# ? Dec 30, 2019 03:38 |
|
I just finished putting together an APU build. I was pretty despairing last night upon seeing that at least one of the pins on the CPU was bent, especially since I'd watched a video on how to fix bent pins on an Intel motherboard and saw that it was a very delicate and difficult process, but as a Hail Mary I also looked for how to fix bent pins on an AMD CPU and found that it was much easier since you're not limited to working within the close confines of the motherboard socket. https://www.youtube.com/watch?v=y8U2NkbiMAI So I bought some razorblades at the drug store and had at it. I managed to straighten them out enough that I could now seat the CPU in the socket properly (I couldn't before), but by that time it was 1 AM so I left it off for later. This morning I bench-tested the drat thing and it went to the BIOS, everything looked normal, and I was getting the full 16 gigs of RAM. I spent the rest of the morning putting together the rest of the case, the fans, the SSD, installed Windows and drivers, and everything seemed normal. The only snag I ran into was that this old, busted-up circa-2001 case I was using (which was really the point of the build, I wanted to put something inside this case to practice my assembly skills) could not for some reason properly seat/fit the GTX 650 I wanted to put inside it. I think maybe a low-profile or single-slot GPU could have fit, but the only one I have is a GT 710 and the APU's on-board graphics are probably better than that. It's an A8-7650K, so pretty old tech, but the whole build ran to less than 200 dollars, and it's the first computer I've built entirely by myself, so I feel pretty good about that, and especially about handling something like a bent pin as a bit of extra troubleshooting I had to learn along the way. That said, I wouldn't recommend it for anyone else, because an Athlon 200GE/3000 is not that much more expensive, even if you have to get DDR4 RAM.
|
# ? Dec 30, 2019 06:59 |
|
OhFunny posted:Finished my AMD 3600 build the yesterday and went through some issues today. AMD and Intel (with its thermal velocity boost) both use boost clocks to mean "things you will only see for a millisecond when the CPU isn't loaded". The number is no longer something you should expect to see all the time, especially not for all-core workloads. The newer BIOS versions with the newer AGESA revisions do help clock behavior, though, although most people still report that they are a little off the advertised peaks. I will say, if the 3.8 number was on 1-core workloads, that does seem awfully low. Around 4.1 was what I saw with the 3600 I had my hands on.
|
# ? Dec 30, 2019 07:09 |
|
OhFunny posted:Finished my AMD 3600 build the yesterday and went through some issues today. Unlike Intel, current AMD CPUs generally do not actually boost to their advertised frequencies at all, or for well under a second if they do. It doesn't really matter, because all the benchmarks you've seen for every AMD CPU involved them running at their real rather than advertised boost clocks. The performance is real, the clocks are not. If you're hitting what you should in benches, leave it alone. You could probably get VERY slightly more performance with a better cooler, but it's unlikely to be significant - certainly not enough to warrant the price. Specialized cooling is more about noise than performance. e - I should also mention that the reason for this is that current AMD CPUs are automatically pushing themselves basically as hard as it makes any sense to push them, and there is really nothing to gain from overclocking. K8.0 fucked around with this message at 07:12 on Dec 30, 2019 |
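For anyone wanting to see this for themselves on Linux, per-core clocks can be scraped from /proc/cpuinfo. A sketch, with a captured sample inlined (with made-up MHz values) so the snippet runs anywhere; swap in open("/proc/cpuinfo").read() and poll it under load on a real box:

```python
import re

# Captured sample of /proc/cpuinfo output (hypothetical values).
SAMPLE = """\
processor : 0
cpu MHz : 4058.123
processor : 1
cpu MHz : 3799.551
processor : 2
cpu MHz : 4191.007
"""

def core_clocks(cpuinfo_text):
    """Return the instantaneous per-core clocks, in MHz."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

clocks = core_clocks(SAMPLE)
print(f"{len(clocks)} cores, min {min(clocks):.0f} / max {max(clocks):.0f} MHz")
# Brief spikes near the advertised boost with a lower sustained all-core
# clock is exactly the behavior described above.
```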
# ? Dec 30, 2019 07:09 |
|
gradenko_2000 posted:I just finished putting together an APU build. Pin-bender buddy! I had all four corners of a 2200G bent after Mr. eBay from "2nd story right apartment" (who writes these things on his shipping label?) just put the APU back into the plastic holder. Angered by this, I accidentally swept over my desk and the thing hit the ground, pins toward the floor. In the end I had to straighten up about 20 pins, but with a small razorblade-like knife it was pretty doable. Works fine in a DeskMini A300 with DDR4-3000 CL16.
|
# ? Dec 30, 2019 11:26 |
|
There seem to be lots of reports on various forums of the ASRock B450 Fatal1ty ITX board having huge issues with 3000 series Ryzen chips. Has anyone experienced it? I'd rather not fork out the extra for the Asus board, but at least that one seems to work.
|
# ? Dec 30, 2019 13:50 |
|
Lungboy posted:There seems to be lots of reports on various forums of the ASRock b450i fatality board having huge issues with 3000 series ryzen chips. Has anyone experienced it? I'd rather not fork out the extra for the Asus board but at least it seems to work. I ended up recently getting an ASRock B450 Steel Legend and a 3600X and it's working fine for me.
|
# ? Dec 30, 2019 14:16 |
|
L33t_Kefka posted:I ended up recently getting a ASRock B450 Steel legend and a 3600X and its working fine for me. Aye, it seems to be specific to the ITX Fatal1ty board. vvvv I have ethernet at home so WiFi and Bluetooth are essentially useless for me. Lungboy fucked around with this message at 15:01 on Dec 30, 2019 |
# ? Dec 30, 2019 14:17 |
|
I really like the wifi and bluetooth on my Asus ITX board.
|
# ? Dec 30, 2019 14:20 |
|
Am I correct in thinking the Gigabyte B450i has terrible VRMs and VRM cooling? Would a 3600 be too much for them? Otherwise the board seems good, with the same audio as the Asus and ASRock boards, and it's as cheap as the ASRock.
|
# ? Dec 30, 2019 15:38 |
|
The 3600 will work in pretty much anything. It's not a demanding chip.
|
# ? Dec 30, 2019 16:03 |
|
Lungboy posted:Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock. The VRM tier list is MSI > Asus > Gigabyte > ASRock, but all the B450 ITX boards will be perfectly fine with a stock 3900X, never mind a 3600.
|
# ? Dec 30, 2019 16:10 |
|
VRM "tier lists" are dumb in many ways, but the B450 I Aorus Pro Wifi is still listed as a "mid range" option for Ryzen 2000 on the old X470/B450 list and the newer CPUs actually have a lower power consumption so Lungboy posted:Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock. The only real downsides, IMO, are 1) Gigabyte's RGB software is atrocious if you care about that and 2) unlike the MSI MAX mainboards there's no guarantee you get a board that supports Ryzen 3000 out of the box, so you'd have to use a loaner CPU or bring it to a place where they update the BIOS for you - the first BIOS release supporting the latest Ryzens was released in May but it's theoretically possible the board was sitting in a warehouse longer than that. Otherwise I think it's a great board for budget ITX Ryzen builds, with Intel WiFi and LAN and ALC1220 sound.
|
# ? Dec 30, 2019 18:20 |
|
There is definitely no manufacturer VRM tier list, given they vary from board to board immensely. Lungboy posted:Am I correct in thinking the gigabyte b450i has terrible vrms and vrm cooling? Would a 3600 be too much for them? Otherwise the board seems good with the same audio as the Asus and Asrock boards and it's as cheap as the ASRock. Looks like you're right. It'll still work, though.
|
# ? Dec 30, 2019 18:21 |