priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

eames posted:

O/T but I have that CPU/Heatsink/Fan combination and found that the added second fan only lowered temperatures by 1-2 Celsius under full load. It wasn’t worth it and I ended up removing the second fan again. YMMV, I imagine the second fan would be useful for a larger CPU like the Threadripper.

Thanks for that! I recently started running folding@home on it, and the temperature spiked to 89C at one point, but I did have the fan on a really low speed profile. Now, with it ramping up with temperature, it doesn't go above 65C on a moderate OC (4.4GHz), so if a second fan doesn't do much I'll just be happy with that.

Love the heatsink, though; it is so ridiculously huge.

E: and my apologies for tracking Intel all up in the thread, y'all

dorkanoid
Dec 21, 2004

ratbert90 posted:

So with the 4000 series, AMD has now completed the trifecta of beating Intel, yes?

Server, Desktop, and now Mobile offerings are all pretty much universally better with AMD in just about every category that I can think of.

- Energy consumption
- Cost
- Gaming performance (save for the 99th percentile)
- Cooling
- Price

Is there any good reason to buy Intel for anything now?

We don't seem to be able to buy a multi-GPU AMD server with NVLink, so that's something.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

K8.0 posted:

Intel is still significantly superior for gaming, but nowhere near enough to justify the price difference.

I wouldn't say "significantly superior" by any means. Before the Ryzen 3000 series I'd have agreed, but "significant" is not a word I'd use any longer.
On the desktop with low-resolution, very high refresh rates in mind, there is a gain to be had in running a 9900K, but that's basically the last thing they have going for them. Intel has their work cut out.
If we're talking laptops at any point, then AMD is just flat-out ahead. Even at AMD 35W vs Intel 90W, single-threaded, the worst case for AMD, it's a bloodbath. I almost couldn't believe the benchmarks. Intel's best vs AMD's best at the same power? Oof.

HalloKitty fucked around with this message at 17:24 on Apr 3, 2020

uhhhhahhhhohahhh
Oct 9, 2012
It's super likely the Ryzen 4700 desktop parts, or whatever they're going to call them, will finally beat the 9900K. They only need something like a 5% IPC and 5% clock boost to get there. Whether that will cover them against what Intel comes out with after, :shrug:

Khorne
May 1, 2002

HalloKitty posted:

On the desktop with low-resolution, very high refresh rates in mind, there is a gain to be had in running a 9900K, but that's basically the last thing they have going for them.
Even there it's debatable. I have a 3900X and average 360 fps in CSGO at 1440p.

I only wish I had a 9900K in really poorly optimized games, where it'd take me from 72->80 or 26->30 fps, and in StarCraft 2, where you gain like 30% fps because Blizzard, and that's about it. But even in those games, your fps is either still awful or still fine. It's not that big of a gap. I'm sure there are some titles out there where there's a relevant gap, maybe Apex Legends, but I don't happen to be playing them.

Khorne fucked around with this message at 19:02 on Apr 3, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

HalloKitty posted:

I wouldn't say "significantly superior" by any means. Before the Ryzen 3000 series I'd have agreed, but "significant" is not a word I'd use any longer.
On the desktop with low-resolution, very high refresh rates in mind, there is a gain to be had in running a 9900K, but that's basically the last thing they have going for them. Intel has their work cut out.

Zen3's lead will likely be smaller than Intel's current "insignificant" lead, if it exists at all, and yet a lot of people will race to sell their 3000 series processors and buy the new hotness for that "insignificant" gain in performance. :shrug:

uhhhhahhhhohahhh posted:

It's super likely the Ryzen 4700 desktop parts, or whatever they're going to call them, will finally beat the 9900K. They only need something like a 5% IPC and 5% clock boost to get there. Whether that will cover them against what Intel comes out with after, :shrug:

A 5 GHz 9900K/KS is about 15% faster in average fps and 13% faster in minimum fps than a 3700X across Computerbase's test suite. The 9900KS is a proxy here for a 5 GHz 9900K; you can get that performance on most chips, just at higher temperatures.

Whether that matters to you is a different question, but there are still a fair number of CPU-intensive titles like Battlefield V, Metro Exodus, Far Cry 5, RDR2 where the 9900K makes a difference in minimum framerates, etc. Frequently that's the difference between 75 fps and 90 fps minimum. Sometimes that's the difference between 75 and 90 fps average.

Yes, if you're maxing out all the settings when playing your competitive FPS shooter (lol), you're just playing highly optimized esports titles that run fine on two cores, or you're playing slow-paced strategy games or whatever, then you may not care. But if you're running a 5700 XT or 2070S at 1440p, then yes, there probably are titles where you'd be getting 10-15% better averages and minimums with a 9900K.

Whether that's worth almost twice the cost is a different question, but Zen2 has a ways to go to catch the 9900K. Zen3 will probably slot in 0-5% faster than the 9900K, I'd think.
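
As a rough illustration of how a cross-suite figure like that gets aggregated, here's a minimal sketch using a geometric mean of per-game fps ratios; the titles and numbers below are invented placeholders, not Computerbase's actual data.

code:
from statistics import geometric_mean

# Hypothetical per-game average fps: (5 GHz 9900K/KS, 3700X). Review suites
# do something similar over dozens of titles, for minimums as well as averages.
avg_fps = {
    "Battlefield V": (162, 141),
    "Far Cry 5": (138, 118),
    "Metro Exodus": (117, 104),
    "Red Dead Redemption 2": (96, 85),
}

ratios = [intel / amd for intel, amd in avg_fps.values()]
print(f"9900K/KS lead (geomean of per-title ratios): {geometric_mean(ratios) - 1:.1%}")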

Paul MaudDib fucked around with this message at 18:32 on Apr 3, 2020

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
TechSpot has some nice CPU scaling benchmarks with a number of GPUs across various resolutions and settings that give a good idea as well.

https://www.techspot.com/review/1897-ryzen-5-ryzen-9-core-i9-gaming-scaling/

B-Mac fucked around with this message at 18:55 on Apr 3, 2020

Inept
Jul 8, 2003


Yeah, but once you move to 1440p in those tests, for both frametimes and FPS, the difference is basically margin of error. So mentioning 1440p and a 2070/5700 XT means that the CPU matters very little there, and you'd get a lot more out of upgrading to a 2080/2080 Ti. There are edge cases, sure, but for the large majority of people it doesn't matter.

edit: never mind, they mention UHD, by which I'm guessing they mean 4K. So I don't actually know what the difference is at 1440p. Probably somewhere in between.

Cygni
Nov 12, 2005

raring to post

I mean, at this point Intel barely makes boxed CPUs. They've got fewer SKUs on shelves than they've had since the Pentium II days, and all of 'em are way overpriced. Without double counting the K/F/KF SKUs, it's the 9900, 9700, 9600, 9400, 9100... and that's basically it. The 9350/9500/9320 occasionally show up, then vanish again for months.

They've shunted every chunk of manufacturing they can get away with over to server, given the insane demand for SP parts. I don't think there's any doubt that Intel could go to bat with AMD on price in the DIY market and make a lot of their parts very attractive. poo poo, they could sell every part for $10 and it wouldn't do much to their bottom line. But at this point, they're happy to just keep cranking out more SPs and raking in the money.

Once Ice Lake SP comes out later this year and 14nm capacity frees up, maybe Intel will be more willing to actually make it a contest. But I doubt we'll see much that's compelling from them in price/performance until Rocket Lake.

VorpalFish
Mar 22, 2007
reasonably awesometm

I mean, you're usually going to get way better returns buying the fastest single GPU you can afford for a pure gaming workload, so one would assume that if you're considering a 9900K/KS over the 3700X for gaming, you're pairing it with at least a 2080S, or you've got something like an emulator use case that really favors Intel. It doesn't make a ton of sense to spend an extra $150 on a processor when that could buy you a faster GPU.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

VorpalFish posted:

I mean, you're usually going to get way better returns buying the fastest single GPU you can afford for a pure gaming workload, so one would assume that if you're considering a 9900K/KS over the 3700X for gaming, you're pairing it with at least a 2080S, or you've got something like an emulator use case that really favors Intel. It doesn't make a ton of sense to spend an extra $150 on a processor when that could buy you a faster GPU.

The price-to-performance curve falls off a cliff after the 2070S; a 2080S is spending 60% more for 13% more performance. If you're willing to spend an extra $300 for a 15% return on your graphics, why not on your CPU, which will last you multiple GPUs?

Remember that Navi 2 and Ampere are both coming later this year, which means the price point at which you become CPU-bottlenecked shifts downwards as well. Right now the bottleneck starts showing up around $400 at 1440p; next year it will probably show up in the $250-300 tier of GPUs, and the $500-700 tier will probably be showing a noticeable bottleneck.
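
To put the diminishing-returns argument in numbers, here's a minimal sketch of the marginal cost per percent of extra fps for the GPU step-up versus the CPU step-up. The prices and uplifts are placeholders (the thread itself disagrees on the exact 2070S-to-2080S premium), so treat it as the shape of the argument rather than real data.

code:
def dollars_per_percent(base_price, step_price, perf_gain_pct):
    """Marginal cost of each extra percent of performance for a given step-up."""
    return (step_price - base_price) / perf_gain_pct

# Hypothetical US prices and uplift figures, not measured data.
gpu_step = dollars_per_percent(500, 700, 13)   # e.g. 2070S -> 2080S, ~13% faster
cpu_step = dollars_per_percent(330, 500, 12)   # e.g. 3700X -> 9900K in CPU-bound titles

print(f"GPU step-up: ~${gpu_step:.0f} per extra % of fps")
print(f"CPU step-up: ~${cpu_step:.0f} per extra % of fps")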

VorpalFish
Mar 22, 2007
reasonably awesometm

Paul MaudDib posted:

The price-to-performance curve falls off a cliff after the 2070S; a 2080S is spending 60% more for 13% more performance. If you're willing to spend an extra $300 for a 15% return on your graphics, why not on your CPU, which will last you multiple GPUs?

Remember that Navi 2 and Ampere are both coming later this year, which means the price point at which you become CPU-bottlenecked shifts downwards as well. Right now the bottleneck starts showing up around $400 at 1440p; next year it will probably show up in the $250-300 tier of GPUs, and the $500-700 tier will probably be showing a noticeable bottleneck.

Unless you live in a region with very different pricing from mine, it's like a 40% jump in cost, not 60%. I'm not saying it's not poor value; stuff at the far end of the performance curve usually is. But 3700X -> 9900K is also bad value. If I'm willing to spend into diminishing returns for better absolute performance and my use case is gaming (and I can only afford one), I'm taking the 2080S every time.

Obviously there's a point where it's ridiculous, which is why I didn't say Ti - at that price point the difference in processor cost doesn't even get you halfway to the next GPU step.

Edit: I guess it's only fair to mention the 9700K, which gets you the same performance in games, at least today, as the 9900K at 85% of the cost. I could see pairing that with less than a 2080S if you're willing to risk "only" having 8C/8T.

VorpalFish fucked around with this message at 22:12 on Apr 3, 2020

Indiana_Krom
Jun 18, 2007
Net Slacker

Paul MaudDib posted:

Whether that matters to you is a different question, but there are still a fair number of CPU-intensive titles like Battlefield V, Metro Exodus, Far Cry 5, RDR2 where the 9900K makes a difference in minimum framerates, etc.

Just gonna chime in here a bit about Far Cry 5; I did some pretty extensive benchmarking at different resolutions and settings to find the performance bottleneck, and even with just a GTX 1080 at 1080p/ultra that particular game still hits the CPU limit on a 9900K. The Ubershit DRM is probably responsible for some of that, but basically you have to go past 1440p before you consistently hit the GPU bottleneck. Literally turning down the details or using an aggressive resolution scale makes barely any difference in that game (the averages max out at about 10 FPS higher, but that is mostly in the peaks, because the minimums basically don't budge).
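
For anyone who wants to repeat that kind of test, the logic is simple enough to sketch: drop the resolution (or resolution scale) and see whether the framerate moves. If it barely moves, you're CPU-bound; if it scales, the GPU was the limit. The numbers below are made up and the 10% threshold is just a rule of thumb.

code:
def likely_bottleneck(fps_at_native, fps_at_low_res, threshold=0.10):
    """Crude check: cutting GPU load should raise fps noticeably only if the GPU was the limit."""
    gain = (fps_at_low_res - fps_at_native) / fps_at_native
    return "GPU-bound" if gain > threshold else "CPU-bound"

print(likely_bottleneck(92, 97))    # barely budges -> "CPU-bound" (the Far Cry 5 case)
print(likely_bottleneck(64, 110))   # scales with resolution -> "GPU-bound"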

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Indiana_Krom posted:

Just gonna chime in here a bit about Far Cry 5; I did some pretty extensive benchmarking at different resolutions and settings to find the performance bottleneck, and even with just a GTX 1080 at 1080p/ultra that particular game still hits the CPU limit on a 9900K. The Ubershit DRM is probably responsible for some of that, but basically you have to go past 1440p before you consistently hit the GPU bottleneck. Literally turning down the details or using an aggressive resolution scale makes barely any difference in that game (the averages max out at about 10 FPS higher, but that is mostly in the peaks, because the minimums basically don't budge).

Far Cry 5 is just super single-threaded, which in particular is why it works better on Intel despite being an AMD-sponsored game. Even at 3440x1440, it's CPU-bottlenecked for me on a 2080 Ti. It's one of the games I use to test my memory overclocks on my 3900X, to see how well I'm cutting into that ST lead that Intel has. I got my tuned 3900X to be better than at least a stock 9900K, but it still loses out to tweaked 9900Ks.

mdxi
Mar 13, 2006

to JERK OFF is to be close to GOD... only with SPURTING

I'm going to wait for benchmarks on desktop Ryzen 4X00 -- and for the full lineup reveal -- before deciding what my upgrade strategy will be for my 3900X machines.

What's a lot harder to figure out is what to do with my 2700 machines. I had planned to just leave them as-is for now and swap the 3900Xs into them as those machines got new CPUs. But with all the COVID-19 goings-on in the grid computing space these days, I feel like maybe it's worth upgrading them now and getting more work done, sooner. In any case, I have until next payday to torture myself thinking about it.

Anime Schoolgirl
Nov 28, 2002

I wonder if the 4700G is even going to exist, given how successful the Renoir dies are tentatively shaping up to be for mobile, because my DeskMini could definitely use 8 cores and MX250-level graphics.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
I'll probably be able to use my B350 motherboard for desktop-Renoir, right? just might need a BIOS update?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Paul MaudDib posted:

Zen3's lead will likely be smaller than Intel's current "insignificant" lead, if it exists at all, and yet a lot of people will race to sell their 3000 series processors and buy the new hotness for that "insignificant" gain in performance. :shrug:

I hope so, I'm looking forward to picking up a used 3950X

Edit: Also, there are plenty of varying benchmarks out there; TechSpot has the 9900KS vs the 3950X with tuned memory timings, with the 9900KS winning by an average of 4% at 1080p, which is not a resolution most will want to play at.
Increase that resolution even a bit and the difference is much smaller or simply not there. I of course appreciate that then you're testing the GPU, not the CPU, so it's useless as a CPU test, but it does reflect real-world conditions.
Then again, we could run around all day cherry-picking benchmarks, but I really don't give a poo poo, because I'm not getting paid by AMD to say this. I just appreciate the competition they've brought to the space.

HalloKitty fucked around with this message at 09:27 on Apr 4, 2020

Budzilla
Oct 14, 2007

We can all learn from our past mistakes.

gradenko_2000 posted:

I'll probably be able to use my B350 motherboard for desktop-Renoir, right? just might need a BIOS update?
At least a BIOS update.

Paul MaudDib posted:

A 5 GHz 9900K/KS is about 15% faster in average fps and 13% faster in minimum fps than a 3700X across Computerbase's test suite. The 9900KS is a proxy here for a 5 GHz 9900K; you can get that performance on most chips, just at higher temperatures. ... Whether that's worth almost twice the cost is a different question, but Zen2 has a ways to go to catch the 9900K. Zen3 will probably slot in 0-5% faster than the 9900K, I'd think.
Where I live (Oz), the 9900K is a 60% markup over the 3700X (don't forget the 3700X has a cooler thrown in, too). Those processors aren't even in the same price bracket, so why compare the two? That money could be spent on other parts, making the difference even more marginal in those specific cases (full HD gaming) and significantly better in everything else.

Arzachel
May 12, 2012

Budzilla posted:

Where I live (Oz), the 9900K is a 60% markup over the 3700X (don't forget the 3700X has a cooler thrown in, too). Those processors aren't even in the same price bracket, so why compare the two? That money could be spent on other parts, making the difference even more marginal in those specific cases (full HD gaming) and significantly better in everything else.

Unless you're already CPU limited, which is what Paul is talking about. If you want to drive a 240hz (or even 144hz, depending on the game) monitor, there's a pretty good argument for going with a 9900k over Zen 2.

Klyith
Aug 3, 2007

GBS Pledge Week

Arzachel posted:

Unless you're already CPU limited, which is what Paul is talking about. If you want to drive a 240hz (or even 144hz, depending on the game) monitor, there's a pretty good argument for going with a 9900k over Zen 2.

The people for whom this is a thing are pro CSGO players.

Having a high-refresh monitor is great, but it doesn't mean you have to run everything at 144 fps. Most people like having pretty graphics settings, and most people are GPU-limited in most of the games that they play. That doesn't make high-refresh monitors pointless; 80-100 fps on a 144hz monitor is nicer than 60 on a 60hz monitor. High refresh is kinda halfway to VRR: when a frame runs long, it has less time to wait before the next refresh update.
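
The math behind that last point, as a quick sketch: on a fixed-refresh display with vsync, a frame that misses a refresh waits for the next one, and that worst-case wait is one refresh interval, so it shrinks as the refresh rate goes up.

code:
def worst_case_extra_wait_ms(refresh_hz):
    # A late frame can sit for up to one full refresh interval before it's shown.
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: refresh interval {1000 / hz:5.2f} ms, "
          f"worst-case extra wait {worst_case_extra_wait_ms(hz):5.2f} ms")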

ufarn
May 30, 2009
Hopefully G-Sync or Freesync can pick up some of the slack, too.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Anime Schoolgirl posted:

I wonder if the 4700G is even going to exist, given how successful the Renoir dies are tentatively shaping up to be for mobile, because my DeskMini could definitely use 8 cores and MX250-level graphics.

Your comparison is too low here; Computerbase.de did some comparisons and the 35W 4800HS either beats or matches the 65W 3400G, and matches a desktop GT 1030. A 4700G or 4600G is probably going to surpass a GT 1030 noticeably and start chasing after Polaris 21 or GP107, if they can clock anything like a Vega VII can.

Truga
May 4, 2014
Lipstick Apathy
man, 32 threads is loving wild.

streaming x264 with OBS on medium preset gives me like 10% cpu usage lmao

Arzachel
May 12, 2012

Klyith posted:

The people for whom this is a thing are pro CSGO players.

Having a high-refresh monitor is great, but it doesn't mean you have to run everything at 144 fps. Most people like having pretty graphics settings, and most people are GPU-limited in most of the games that they play. That doesn't make high-refresh monitors pointless; 80-100 fps on a 144hz monitor is nicer than 60 on a 60hz monitor. High refresh is kinda halfway to VRR: when a frame runs long, it has less time to wait before the next refresh update.

I agree but competitive games are a big reason behind the 144hz push and you want the framerate to be rock solid for that. Ironically though, CSGO is one of the games where it's a wash between AMD and Intel for some reason.

EmpyreanFlux posted:

Your comparison is too low here; Computerbase.de did some comparisons and the 35W 4800HS either beats or matches the 65W 3400G, and matches a desktop GT 1030. A 4700G or 4600G is probably going to surpass a GT 1030 noticeably and start chasing after Polaris 21 or GP107, if they can clock anything like a Vega VII can.

I'm wondering how well desktop Renoir will do since a large chunk of the GPU performance increase seems to come from faster memory.

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!

Arzachel posted:

I agree but competitive games are a big reason behind the 144hz push and you want the framerate to be rock solid for that. Ironically though, CSGO is one of the games where it's a wash between AMD and Intel for some reason.


I'm wondering how well desktop Renoir will do since a large chunk of the GPU performance increase seems to come from faster memory.

The memory controller is supposedly fantastic, so getting to 4266 with the right kits (mostly Micron E-die or Hynix CJR) is not only probable, but you'll see a larger delta from it. The drop in CPU performance from looser timings might not be a noticeable bottleneck if you're using the iGPU.

DDR5 is going to be lit though, kits bottom out @ 4800 and apparently Hynix is aiming for 8400, which is like ~130GB/s for bandwidth. Zen4 or even Zen3 APUs might actually be 1650 tier or possibly more. At which point yeah, I think APUs will be a legit choice for 1080p gaming, not just budget.
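
Back-of-the-envelope check on that ~130GB/s figure, assuming a standard dual-channel desktop setup and a 64-bit channel width:

code:
transfers_per_sec = 8400e6   # DDR5-8400: 8400 million transfers per second
bytes_per_transfer = 8       # 64 bits per channel (DDR5 splits each DIMM into two
                             # 32-bit sub-channels, but the total width is still 64 bits)
channels = 2                 # typical dual-channel desktop configuration

bandwidth_gb_s = transfers_per_sec * bytes_per_transfer * channels / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")   # -> 134.4 GB/s, roughly the ~130GB/s quoted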

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

EmpyreanFlux posted:

DDR5 is going to be lit though, kits bottom out @ 4800 and apparently Hynix is aiming for 8400, which is like ~130GB/s for bandwidth.

:catstare:

eames
May 9, 2009

EmpyreanFlux posted:

DDR5 is going to be lit though, kits bottom out @ 4800 and apparently Hynix is aiming for 8400, which is like ~130GB/s for bandwidth. Zen4 or even Zen3 APUs might actually be 1650 tier or possibly more. At which point yeah, I think APUs will be a legit choice for 1080p gaming, not just budget.

There’s also talk about on-die ECC support, so sticks could support ECC without requiring extra chips for parity, because it would be built into the memory dies. That could also be great if consumer products end up supporting it.

eames fucked around with this message at 20:17 on Apr 4, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Arzachel posted:

I agree but competitive games are a big reason behind the 144hz push and you want the framerate to be rock solid for that. Ironically though, CSGO is one of the games where it's a wash between AMD and Intel for some reason.

CS:GO is actually faster on Zen2, significantly so. Source Engine responds really well to huge caches, I think. Maybe a lot of pointer chasing or something.

(It also seems like maybe that’s something that could be optimized but lol Valve)

Paul MaudDib fucked around with this message at 21:32 on Apr 4, 2020

ufarn
May 30, 2009
AMD literally had a huge slide with Source Engine performance improvements for Zen2 when they did that weird semi-announcement where they only ran Cinebench to show it off.

Valorant might finally be the game where no one gives a drat because it's optimized so much.

Cygni
Nov 12, 2005

raring to post

eames posted:

There’s also talk about on-die ECC support, so sticks could support ECC without requiring extra chips for parity, because it would be built into the memory dies. That could also be great if consumer products end up supporting it.

I saw that on the spec sheet too. And if AMD's DDR5 controller is like their DDR4 one, it should support ECC out of the box. ECC being standard on desktops may actually happen.

(yes yes i know it doesn't really matter for home use anyway)

Klyith
Aug 3, 2007

GBS Pledge Week

Cygni posted:

(yes yes i know it doesn't really matter for home use anyway)

I'm kinda suspecting that if ECC is being added on-die, it's because DDR5 will need it in normal operation.

NewFatMike
Jun 11, 2015

Side note: StoreMI is dead, long live the replacement, due... later this year:

https://www.tomshardware.com/amp/news/amd-axes-storemi-technology-replacement-q2-2020

This is kind of weak, because free tiered storage was a really nice-to-have for my modest homelab's next upgrade. Hopefully the new one is also good.

Seamonster
Apr 30, 2007

IMMER SIEGREICH

EmpyreanFlux posted:

DDR5 is going to be lit though, kits bottom out @ 4800 and apparently Hynix is aiming for 8400, which is like ~130GB/s for bandwidth. Zen4 or even Zen3 APUs might actually be 1650 tier or possibly more. At which point yeah, I think APUs will be a legit choice for 1080p gaming, not just budget.

Oh yes, my body is ready, but in reality it will be a bit much to expect. OEMs and laptops especially are going to go "welp, there's plenty of bandwidth now, so here's your 8GB DIMM in single channel, gently caress you very much - also it's soldered and there are no open slots." Unless of course AMD specifically prohibits this kind of fuckery.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

Seamonster posted:

Oh yes, my body is ready, but in reality it will be a bit much to expect. OEMs and laptops especially are going to go "welp, there's plenty of bandwidth now, so here's your 8GB DIMM in single channel, gently caress you very much - also it's soldered and there are no open slots." Unless of course AMD specifically prohibits this kind of fuckery.

https://www.youtube.com/watch?v=7TUjOb1H5pg

there's this video of single-channel vs dual-channel performance of a 4800HS, and the delta is pretty funny

orcane
Jun 13, 2012

Fun Shoe
DDR5 is going to be rad eventually, but platforms for it will have to mature first, and if DDR5-8400 is the goal, that will take like 3-5 years to show up after DDR5 launches in the first place.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

Klyith posted:

I'm kinda suspecting that if ECC is being added on-die, it's because DDR5 will need it in normal operation.

Given the incredible quantity of black loving magic needed to get DDR5 to even work at all, ECC isn't a 'nice to have', it's the 'only way to get this fucker to work at all'. Moving it on-die and doing the parity calcs in-line and in real time lets you get away with some pretty marginal signaling, and allows the feed-forward tuning engine to compensate for poo poo without the machine just horking its guts up and segfaulting on you.
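
As a toy illustration of the kind of parity math involved, here's the classroom Hamming(7,4) single-error-correcting code. Real DIMMs use wider SECDED codes (e.g. Hamming(72,64) over a 64-bit word), so treat this as a sketch of the principle, not DDR5's actual on-die scheme.

code:
def encode(d):
    """Hamming(7,4): four data bits -> seven-bit codeword with three parity bits."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]   # positions 1..7: p1 p2 d1 p3 d2 d3 d4

def decode(c):
    """Recompute the parity checks; a non-zero syndrome is the 1-based position of a flipped bit."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1                      # correct the single-bit error
    return [c[2], c[4], c[5], c[6]], syndrome

word = encode([1, 0, 1, 1])
word[5] ^= 1                                      # simulate one bit flipping on the bus
data, syndrome = decode(word)
print(data, syndrome)                             # -> [1, 0, 1, 1] 6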


This is the 'data eye'. That little solid ovalish thing in the center is the difference between a 1 and a 0, worst case. That picture is from GDDR6; DDR5 is apparently even worse. You need complex feed-forward equalization just to get the eye open enough to work at all - feedback that runs in real time, constantly tuning parameters to make the eye as large as possible, so a tiny hiccup in VCC doesn't close it and turn your data read into gibberish.

SwissArmyDruid
Feb 14, 2014

by sebmojo
...poo poo, I ain't gonna complain if memory makers just take ECC out of the hands of AMD and Intel, who wield it as a market segmentation feature.

edit: CNN did a short piece on AMD and Dr. Su. https://www.youtube.com/watch?v=lHT5MRky9SA

edit edit: Our favorite German did a piece on the new 4900HS, and even liquid-metals one. https://www.youtube.com/watch?v=_aLH0Q6CZF4

SwissArmyDruid fucked around with this message at 15:36 on Apr 5, 2020

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Methylethylaldehyde posted:

Given the incredible quantity of black loving magic needed to get DDR5 to even work at all, ECC isn't a 'nice to have', it's the 'only way to get this fucker to work at all'. Moving it on-die and doing the parity calcs in-line and in real time lets you get away with some pretty marginal signaling, and allows the feed-forward tuning engine to compensate for poo poo without the machine just horking its guts up and segfaulting on you.


This is the 'data eye'. That little solid ovalish thing in the center is the difference between a 1 and a 0, worst case. That picture is from GDDR6; DDR5 is apparently even worse. You need complex feed-forward equalization just to get the eye open enough to work at all - feedback that runs in real time, constantly tuning parameters to make the eye as large as possible, so a tiny hiccup in VCC doesn't close it and turn your data read into gibberish.

DDR is the last parallel single-ended interface in wide use; they could switch to serial like IBM does. In that case, differential signaling and FEC get you to dozens of gigabits per pair. But that's only over the wire.

Malcolm XML fucked around with this message at 17:56 on Apr 5, 2020

EoRaptor
Sep 13, 2003

by Fluffdaddy

Malcolm XML posted:

DDR is the last parallel single-ended interface in wide use; they could switch to serial like IBM does. In that case, differential signaling and FEC get you to dozens of gigabits per pair. But that's only over the wire.

RDRAM was serialized, so it's certainly possible. You'd need to solve the latency problem somehow if you wanted to use that method these days. I do wonder if any of the Rambus patents are still active.
