Craptacular!
Jul 9, 2001

Fuck the DH
I’m on a 3770K at 4.4 but I feel the pain. It helps that I plan to be foolish next time and buy a motherboard more for its looks than for its capabilities, going for an all-white build (quite the trend these days), and B450 (and thus MSI’s Arctic board) won’t be out for a while.

I just OCed this week. This can carry me through to Black Friday at minimum, and hopefully by then we’ll get rumors of Zen 2.

MagusDraco
Nov 11, 2011

even speedwagon was trolled
I'll be fine with my 3550 til Zen 2. Probably.

GRINDCORE MEGGIDO
Feb 28, 1985


That new motherboard smell though.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

PerrineClostermann posted:

My current CPU is a 2600k, running at 4.2GHz. Ryzen+ can do 4.2GHz with better IPC and twice the cores.

I'm so tempted.

Same here, but I'm running a 2500K@4.2 and without hyper-threading this CPU is really starting to show its age in newer games. I can only imagine going from a 4c/4t CPU to an 8c/16t CPU with better IPC to boot. For me it would be twice the cores and quadruple the threads.

spasticColon fucked around with this message at 03:42 on Apr 20, 2018

Otakufag
Aug 23, 2004
So has there been an explanation for why the AnandTech benchmarks gave like a +30% Ryzen 2 gaming advantage over Intel? Some say Intel's security patches lowered performance, others say it was something with RAM speeds running at the minimum supported for each platform. My uncle that works at AMD says they accidentally sent a ZEN 7nm sample.

eames
May 9, 2009

Der8auer has a nice frequency/voltage/power consumption chart 1 minute into this video (can’t hotlink from my phone).

https://youtu.be/ogYess5WelY

He said his sample was cherry-picked, so for an average 2700X we can roughly expect:

3900 MHz 1.125V 98W
4025 MHz 1.225V 121W
4125 MHz 1.325V 151W
4200 MHz 1.425V 192W
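
For a quick sanity check, those numbers track the usual CMOS rule of thumb (power ≈ static + k·V²·f) reasonably well. Back-of-envelope sketch, with the static draw and the constant fitted by hand to the first row, purely illustrative, not anything AMD or der8auer published:

```python
# Rough check of the numbers above against the usual CMOS dynamic-power
# rule of thumb, P ~= static + k * V^2 * f. "static_w" and "k" are fitted
# by hand to the 3900 MHz point -- purely illustrative guesses.

points = [  # (MHz, volts, reported watts)
    (3900, 1.125, 98),
    (4025, 1.225, 121),
    (4125, 1.325, 151),
    (4200, 1.425, 192),
]

static_w = 20.0                                # assumed fixed/platform draw
k = (98 - static_w) / (1.125 ** 2 * 3900)      # fit to the first row

for mhz, volts, reported in points:
    model_w = static_w + k * volts ** 2 * mhz
    print(f"{mhz} MHz @ {volts:.3f} V: model ~{model_w:.0f} W, reported {reported} W")
```

The model undershoots the top rows, which you'd expect since leakage climbs with voltage and temperature on top of the V²·f term.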

Klyith
Aug 3, 2007

GBS Pledge Week

Otakufag posted:

So has there been an explanation for why the AnandTech benchmarks gave like a +30% Ryzen 2 gaming advantage over Intel? Some say Intel's security patches lowered performance, others say it was something with RAM speeds running at the minimum supported for each platform. My uncle that works at AMD says they accidentally sent a ZEN 7nm sample.

I think the best explanation is the GamersNexus one: the time window for reviews provided by the manufacturer is way too short for a thorough & careful job. They had to slap an OS and their benchmarking suite on that poo poo and just plow through, with no time to look at unexpected results or tweak things for the fairest conditions.



Spidi posted:

I'm about to get a new PC but I'm debating if it's worth buying the 2600X over the 2600 for that little boost of performance or just saving that 30 bucks and getting myself a better keyboard :psyduck:

This is a tough choice this time around. The 1600X was pretty meh because any non-X 1600 could easily OC to the same performance. With Ryzen 2 it looks like there's a lot more performance in the XFR2 special sauce, and the X CPUs might be mildly binned for that. So the non-X may not be able to hit the same mark even OCed to the same 4 GHz.

If you hadn't already budgeted for a decent heatsink, definitely take the 2600 and throw in a Hyper 212 with the extra 30 bucks. Ryzen 2000 is definitely throwing more power dissipation around, and to manage it they're using the same types of temp vs power control they have on video cards. So good cooling literally equals more performance.

NIGARS
Sep 12, 2004

yeah nigars

Otakufag posted:

So has there been an explanation for why the AnandTech benchmarks gave like a +30% Ryzen 2 gaming advantage over Intel? Some say Intel's security patches lowered performance, others say it was something with RAM speeds running at the minimum supported for each platform. My uncle that works at AMD says they accidentally sent a ZEN 7nm sample.

The Intel numbers are the same as the original Coffee Lake review (actually marginally higher), so if there's something fucky it's with the Pinnacle Ridge numbers. No explanation yet. The discrepancy is so big it can't be explained by accidentally overclocking the RAM or something. At this stage it seems like it has to be some kind of massive human error.

NIGARS fucked around with this message at 09:04 on Apr 20, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Otakufag posted:

So has there been an explanation for why the AnandTech benchmarks gave like a +30% Ryzen 2 gaming advantage over Intel? Some say Intel's security patches lowered performance, others say it was something with RAM speeds running at the minimum supported for each platform. My uncle that works at AMD says they accidentally sent a ZEN 7nm sample.

Forget Meltdown/Spectre, Anandtech also has Ryzen 2000 as like 30% faster than Ryzen 1000 which it's just not. Anandtech hosed up their testing, they've said they're redoing it all just to be sure. There's an outside chance they discovered One Weird Trick For Better Ryzen Performance, but they probably just did something wrong.

Rocket League appears to be another weird outlier but it's possible that one is a legit Spectre/Meltdown impact. RAM clocks would also have a pretty strong impact on that as well.

It sounds like IMC stability is up, cache latency is down, XFR2 pretty much eliminates the need for manual overclocking (at better TDPs than you can do manually, which is cool), and clockspeeds+IPC are up modestly. Ryzen is finally out of beta and it's good. Can't wait to see what they come up with for Threadripper 2.

Intel better be making GBS threads bricks, 8% aggregate difference at 1080p is nothing. They need that 8086K yesterday and I don't even think they will have a response for 7nm Zen2, beyond "make more cores". They have engineered themselves into a corner and show no signs of extricating themselves so far. Bare minimum, they need to start getting the out-of-box clocks up, they do still have some headroom while AMD is pushing it right to the limit, but if you're not doing a 5+ GHz OC then AMD is right on their rear end.

edit: Here's L1Tech's launch coverage, not really a review but more of a preview/discussion

Paul MaudDib fucked around with this message at 09:48 on Apr 20, 2018

sauer kraut
Oct 2, 2004

Klyith posted:

This is a tough choice this time around. The 1600X was pretty meh because any non-X 1600 could easily OC to the same performance. With Ryzen 2 it looks like there's a lot more performance in the XFR2 special sauce, and the X CPUs might be mildly binned for that. So the non-X may not be able to hit the same mark even OCed to the same 4 GHz.

If you hadn't already budgeted for a decent heatsink, definitely take the 2600 and throw in a Hyper 212 with the extra 30 bucks. Ryzen 2000 is definitely throwing more power dissipation around, and to manage it they're using the same types of temp vs power control they have on video cards. So good cooling literally equals more performance.

Good advice; the effective difference between the 2600 and 2600X is like 2% in games. Budget for a decent $30 tower cooler (or buy a Prism from a 2700X owner who discarded it on eBay) and quality RAM.
If paying extra for the little X makes you feel good, that's fine too. We understand :)

eames
May 9, 2009

AnandTech still has excellent CPU reviews, they're just hidden on page 72 of one of their forum threads :lol:

https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/page-72#post-39391302

SwissArmyDruid
Feb 14, 2014

by sebmojo
From PCPer:



Intra-CCX latency slower than Zen 1, what?

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

eames posted:

AnandTech still has excellent CPU reviews, they're just hidden on page 72 of one of their forum threads :lol:

https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/page-72#post-39391302

That user review seems better than most of the site reviews.

NewFatMike
Jun 11, 2015

SwissArmyDruid posted:

From PCPer:



Intra-CCX latency slower than Zen 1, what?

Yeah... You want lower latency? Lower time = faster speed?

E: Oh, on the early logical cores. No clue, unless it's within margin of error or bad math? Latency is usually measured in clocks, not nanoseconds, so if clock speeds weren't matched or converted properly I could see it.

EE: matched with Intel, so maybe you had to trade in one area for the other? I'll take it? Seems like an overall gain.
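
The clocks-vs-nanoseconds conversion itself is trivial, but only if you know what clock the test actually ran at; a toy example with a made-up cycle count, not a measured Zen/Zen+ figure:

```python
# Converting a latency given in core clock cycles to nanoseconds depends
# entirely on the clock the test ran at. The cycle count is a placeholder.

def cycles_to_ns(cycles: float, clock_ghz: float) -> float:
    return cycles / clock_ghz  # one cycle lasts 1/f ns when f is in GHz

hit_latency_cycles = 40  # hypothetical cache-hit latency in core clocks

print(cycles_to_ns(hit_latency_cycles, 3.7))  # ~10.8 ns at 3.7 GHz
print(cycles_to_ns(hit_latency_cycles, 4.2))  # ~9.5 ns at 4.2 GHz -- same cycles, "faster" only in ns
```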

NewFatMike fucked around with this message at 14:12 on Apr 20, 2018

Anime Schoolgirl
Nov 28, 2002

SwissArmyDruid posted:

From PCPer:



Intra-CCX latency slower than Zen 1, what?
edit: Slower within its own core, apparently. Result of the new process?

Truga
May 4, 2014
Lipstick Apathy
intra-ccx

SwissArmyDruid
Feb 14, 2014

by sebmojo
Intra-, guys, not inter-.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

B-Mac posted:

That user review seems better than most of the site reviews.

Yeah, there's a pretty impressive level of detail. I'm particularly struck by the part about Pinnacle Ridge using a lot more power than advertised and the 2700X really being a 140W part. I would be curious to see similar graphs for wattage required at various performance levels for Raven Ridge, since I've been considering getting a 2200G to try underclocking it. Wattage figures for Ryzen Embedded suggest that it could get close to Core M territory and still deliver enough performance for a home server, but I haven't seen any mobos with the embedded version yet.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

Eletriarnation posted:

Yeah, there's a pretty impressive level of detail. I'm particularly struck by the part about Pinnacle Ridge using a lot more power than advertised and the 2700X really being a 140W part. I would be curious to see similar graphs for wattage required at various performance levels for Raven Ridge, since I've been considering getting a 2200G to try underclocking it. Wattage figures for Ryzen Embedded suggest that it could get close to Core M territory and still deliver enough performance for a home server, but I haven't seen any mobos with the embedded version yet.

TDP does not tell you how much power a CPU will use; they don't advertise that. I'm not really sure what utility it does have.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Eletriarnation posted:

Yeah, there's a pretty impressive level of detail. I'm particularly struck by the part about Pinnacle Ridge using a lot more power than advertised and the 2700X really being a 140W part. I would be curious to see similar graphs for wattage required at various performance levels for Raven Ridge, since I've been considering getting a 2200G to try underclocking it. Wattage figures for Ryzen Embedded suggest that it could get close to Core M territory and still deliver enough performance for a home server, but I haven't seen any mobos with the embedded version yet.

Don't worry, AMD already covered this with their "electrical watts are not thermal watts" statement. Instead, AMD processors actually convert that excess energy into enthusiastic Reddit posts.

Anarchist Mae
Nov 5, 2009

by Reene
Lipstick Apathy

Paul MaudDib posted:

Don't worry, AMD already covered this with their "electrical watts are not thermal watts" statement. Instead, AMD processors actually convert that excess energy into enthusiastic Reddit posts.

You know what I'd like to see from Adored TV? If he turned his zeal against that lovely reddit and those awful YouTube channels that post crap like "Ryzen 2000 will OC to 5GHz!". That'd be entertaining.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Measly Twerp posted:

You know what I'd like to see from Adored TV? If he turned his zeal against that lovely reddit and those awful YouTube channels that post crap like "Ryzen 2000 will OC to 5GHz!". That'd be entertaining.

That's not just Reddit, that's actually from AMD... AMD_Robert is Robert Hallock, director of technical marketing.

TDP is actually a real thing, and power in = heat out. It's certainly not an exact science; there is a lot of wiggle room in what you're measuring, e.g. base clocks vs boost clocks vs AVX, etc. But there are also times when the company just lies about what the TDP is... the GTX 970 was never a 145W GPU; even its reference card pulled more like 175W under load. Given all these factors, in practice it's more of a marketing number than a strict measurement, but there is an actual underlying basis for what a TDP is.

Pinnacle Ridge probably does hit its TDP at base clocks, but that boost clock comes at a price too. And to be fair, it's not like the 8700K is a lightweight either... the whole reason Intel hasn't been pushing up clocks aggressively is because it's going to inflate the TDP just as badly as on Ryzen. If you enable 4.2 GHz MCE, it's probably not far behind Ryzen 2000 (if at all), and it'll be higher if you're pushing the overclocks closer to 5 GHz.

The Reddit crowd is pretty bad, dunno if it's a lot of enthusiastic kids who haven't been through release cycles before or what, but they really get on the hype train. It turns into a game of Chinese whispers: someone says "Ryzen 2000 will probably clock 5-10% higher and have 0-5% IPC improvement", then someone else turns that into "definitely 10% clocks and 5% IPC, could be more", and by the end it blows up into 15-20% clocks and 10% IPC. Then someone finds a leaked benchmark that shows a big gain under unknown conditions/hardware/settings, throws that against another benchmark, and everyone is sure there's a big across-the-board gain coming.

Being skeptical of pre-launch stuff is probably the best policy in general. It's fun to speculate and project, but it doesn't actually count until it's in reviewers' hands and they can test it under controlled circumstances. I really hate that AMD always opens pre-orders before the review embargo lifts. They are really bad about this; they just blatantly milk the fanboys for a whole week every single time they do a launch, because they know a lot of them will suck it up and pay even if they don't know what they're buying.

(the thing about AMD trying to screw GamersNexus and GN ending up getting samples anyway, having 4x as much time to test as all of the other reviewers, and also not having to adhere to embargo or NDA, is just hilarious)

He's also right that they're super toxic, you can probably count on one hand the number of reviewers that r/AMD haven't declared jihad against in the last year... which is pretty much a uniquely AMD thing. And there is always that one fan who takes things too far and starts sending threatening emails or something.

Paul MaudDib fucked around with this message at 20:38 on Apr 20, 2018

Cygni
Nov 12, 2005

raring to post

TDP ratings are in a weird spot with turboing becoming the norm. You either strictly enforce TDP like Intel, which leads to your 4.3GHz 8700 running at 3.2GHz, or you ignore it given the thermal headroom, in which case what's the point of even publishing a TDP? It also feels misleading to say something has a specific TDP (which implies a certain electricity use/cost and heat level) and then just blast past it without user input. I dunno. Both answers seem dumb.

I guess the third answer is to just be honest, but you know...

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Cygni posted:

TDP ratings are in a weird spot with turboing becoming the norm. You either strictly enforce TDP like Intel, which leads to your 4.3GHz 8700 running at 3.2GHz, or you ignore it given the thermal headroom, in which case what's the point of even publishing a TDP? It also feels misleading to say something has a specific TDP (which implies a certain electricity use/cost and heat level) and then just blast past it without user input. I dunno. Both answers seem dumb.

I guess the third answer is to just be honest, but you know...

How hard is it to go "Base TDP 145W, Boost TDP up to 160W"? Cap the boost TDP strictly based on what the box says, and let the user override it if they want more. If the chip is garbage, well, you said "up to"; that works for ISPs having a 1Gbit line and selling it to 20 people as a gigabit connection.
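
Something like this is all I mean (toy sketch with the hypothetical numbers above, not how any shipping firmware actually behaves):

```python
# Toy sketch of the "cap boost at the box figure, let the user override" idea.
from typing import Optional

BASE_TDP_W = 145
BOOST_TDP_W = 160   # the "up to" figure printed on the box


def boost_power_cap(user_override_w: Optional[float] = None) -> float:
    """Default: never exceed the advertised boost TDP. Go higher only on request."""
    if user_override_w is not None:
        return user_override_w
    return BOOST_TDP_W


print(boost_power_cap())       # 160 -- stock, honest to the label
print(boost_power_cap(185.0))  # 185 -- user explicitly opted in
```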

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
Unless the standard is forced from an impartial authority, no one is going to take the first step and stick their hands into the blender of honest TDP ratings.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Cygni posted:

TDP ratings are in a weird spot with turboing becoming the norm. You either strictly enforce TDP like Intel, which leads to your 4.3GHz 8700 running at 3.2GHz, or you ignore it given the thermal headroom, in which case what's the point of even publishing a TDP? It also feels misleading to say something has a specific TDP (which implies a certain electricity use/cost and heat level) and then just blast past it without user input. I dunno. Both answers seem dumb.

I guess the third answer is to just be honest, but you know...

I don't even get why people get upset about this... you doubled the core count, you're going to pull pretty close to twice the power. That's always been the tradeoff for running HEDT processors, and Ryzen+Coffee Lake are basically Sandy Bridge/Haswell-era hexa/octocore processors stuffed into a consumer socket. I know big numbers are scary but AIOs are no longer an exotic thing, and do a reasonable-enough job of controlling heat (TIM issues aside). I can pull 200W through my 5820K and not break 60C under load on a 140mm AIO.

But yeah, if we keep ramping up the core count without increasing TDP, the base clocks/all-core turbo are just going to keep creeping lower and lower. And when you do push them, today's HEDT processors pull scary amounts of power: Threadripper and Skylake-X can hit 300-500W. It's hard to see that trend as sustainable for too much longer.

Paul MaudDib fucked around with this message at 20:57 on Apr 20, 2018

Klyith
Aug 3, 2007

GBS Pledge Week

Cygni posted:

I guess the third answer is to just be honest, but you know...

FaustianQ posted:

Unless the standard is forced from an impartial authority, no one is going to take the first step and stick their hands into the blender of honest TDP ratings.

It is an honest spec, enthusiasts just misuse it. If TDP was meant to indicate maximum power draw, it would be called MPD.

Also, the "design" in the name is there because it is a spec based on expected real-world use, not benchmarks. As chips get more sophisticated abilities to both save power when it's not needed and use extra when it is, TDP diverges from possible measured maximums and minimums. GPUs, and now CPUs, that can respond based on internal temperature sensing widen that potential gap even more -- since TDP is a spec for required thermal dissipation, a heatsink that can dissipate 200W is exceeding that spec.



Should they have a different spec for power consumption so people can add CPU max + GPU max + rest of system and buy a power supply based on those maximums? It wouldn't be a bad idea. But then they'd miss out on all the co-marketing deals.
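
The sizing math people actually want is just summing the worst cases and padding it, e.g. (all wattages made up for illustration):

```python
# Back-of-envelope PSU sizing from component maximums, as described above.
# All figures are placeholders, not a real parts list.
cpu_max_w = 190           # measured worst case, not the box TDP
gpu_max_w = 250
rest_of_system_w = 75     # board, RAM, drives, fans, USB gadgets

peak_w = cpu_max_w + gpu_max_w + rest_of_system_w
headroom = 1.25           # keeps the PSU out of its noisiest / least efficient range

print(f"peak draw ~{peak_w} W, so look for a PSU of at least {peak_w * headroom:.0f} W")
```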

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Klyith posted:

It is an honest spec, enthusiasts just misuse it. If TDP was meant to indicate maximum power draw, it would be called MPD.

Also, the "design" in the name is there because it is a spec based on expected real-world use, not benchmarks. As chips get more sophisticated abilities to both save power when it's not needed and use extra when it is, TDP diverges from possible measured maximums and minimums. GPUs, and now CPUs, that can respond based on internal temperature sensing widen that potential gap even more -- since TDP is a spec for required thermal dissipation, a heatsink that can dissipate 200W is exceeding that spec.

Reviewers aren't measuring peak power draw, they are measuring average power draw, and average should absolutely match the TDP.

"Smart" power management is great but it still averages out to the same thing, you are just moving power expenditure from time A to time B when it can be used more efficiently, and on any sort of a human-relevant timescale (seconds) it all averages out to whatever the TDP setting is.

Also, just because you have thermal headroom, doesn't mean you have power headroom available too. Most "smart" power management systems look at both when determining their available boost states.

Manufacturers may pretend that the phrase "TDP" doesn't have a meaning but it does, and unless they've managed to break the laws of thermodynamics then every bit of electricity that goes into the CPU comes out as heat, and every bit of heat that comes out of the CPU originally went in as power. They are one and the same, and on human-relevant timescales TDP = average power consumption. The fact that it may instantaneously be above or below the average is meaningless, that's why it's an average and not a hard cap.

Peak power consumption (on an instantaneous basis) is probably several hundred watts for most processors, for a zillionth of a second or something as AVX kicks in. But that's why the PSU has output caps.
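
The averaging argument in toy form (trace invented; the point is just that millisecond spikes barely move a one-second average):

```python
# Invented 1-second power trace: a millisecond-scale spike way above the
# limit barely moves the seconds-scale average.
samples_w = [95] * 990 + [300] * 10   # 990 ms at 95 W, 10 ms at 300 W
average_w = sum(samples_w) / len(samples_w)
print(f"average over 1 s: {average_w:.1f} W")   # ~97 W despite the 300 W excursion
```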

Paul MaudDib fucked around with this message at 22:05 on Apr 20, 2018

Pablo Bluth
Sep 7, 2007

I've made a huge mistake.
If 'every bit of electricity' that went into a CPU came out as heat, it wouldn't be able to do any other work and wouldn't be very useful. Just most of it.

lDDQD
Apr 16, 2006
The CPU is not a perfect resistor. Heck, even a resistor probably isn't a perfect resistor.
Some of the current simply leaks into the ground, dissipating very little. Some of the current leaves the CPU in order to drive I/O pins, for example... sure, the current's energy will be dissipated eventually - just elsewhere.

Craptacular!
Jul 9, 2001

Fuck the DH

Paul MaudDib posted:

(the thing about AMD trying to screw GamersNexus and GN ending up getting samples anyway, having 4x as much time to test as all of the other reviewers, and also not having to adhere to embargo or NDA, is just hilarious)

I wonder what GN did. He's been fairly enthusiastic about Ryzen in general and even slapped together a Threadripper machine to compress his videos as an example of something TR is good at.

Klyith
Aug 3, 2007

GBS Pledge Week

Paul MaudDib posted:

Manufacturers may pretend that the phrase "TDP" doesn't have a meaning but it does, and unless they've managed to break the laws of thermodynamics then every bit of electricity that goes into the CPU comes out as heat, and every bit of heat that comes out of the CPU originally went in as power. They are one and the same, and on human-relevant timescales TDP = average power consumption. The fact that it may instantaneously be above or below the average is meaningless, that's why it's an average and not a hard cap.

Misunderstanding the spec to mean something it's not intended for does tend to make things into nonsense, yes.

95W TDP = "this processor expects to generate 95W of heat in operation, the heatsink and system cooling need to dissipate that"

Put a crappy heatsink specced for 95W @ 40C over ambient on a 95W TDP processor, and I'll bet it generates something much closer to 95W than 140W.



e:

Craptacular! posted:

I wonder what GN did. He's been fairly enthusiastic about Ryzen in general and even slapped together a Threadripper machine to compress his videos as an example of something TR is good at.
AMD was doing dumb stuff with the Threadripper embargo, and he called it out as dumb. AMD is definitely trying to "manage" product launches in ways that are crappy but likely profitable, complete with hardware pre-orders before reviews come out. See also Nvidia Founders Edition and other ways to separate gullible nerds from their money.

Klyith fucked around with this message at 22:56 on Apr 20, 2018

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

lDDQD posted:

The CPU is not a perfect resistor. Heck, even a resistor probably isn't a perfect resistor.
Some of the current simply leaks into the ground, dissipating very little. Some of the current leaves the CPU in order to drive I/O pins, for example... sure, the current's energy will be dissipated eventually - just elsewhere.

Those are essentially negligible amounts though. There's also probably a few mW that gets thrown off as RF... but 99.99% of the electricity that goes into a CPU comes out as heat. I'd love to see proof otherwise.

For a sense of just how good electronics are at turning electricity into heat, a PC is basically as effective a space heater as an actual space heater... and that largely comes back to all the various processors in all the subsystems of a PC being incredibly efficient at turning all the electricity being pumped into them into heat. Some efficiency loss, of course, but what goes in as electricity comes back out as heat.


https://www.pugetsystems.com/labs/articles/Gaming-PC-vs-Space-Heater-Efficiency-511/

I mean, do people really think there's hundreds of watts surging around the various subsystems in their PC like a superconductor? :lol:

Paul MaudDib fucked around with this message at 00:04 on Apr 21, 2018

SwissArmyDruid
Feb 14, 2014

by sebmojo
never mind

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

SwissArmyDruid posted:

Christ, that's not what Paul is saying.

Paul is saying two outputs, one input:

One input: Power

Two outputs: Work, or heat.

The work is also heat, though. Switching transistors is the work, and that takes electricity that gets turned into heat. Even stuff like IO requires switching transistors, which is heat. Everything gets turned into heat. It's not like your CPU is turning a pulley or doing some other work to dissipate the energy. Really, a CPU is pretty much a black box that turns electricity into heat, minus some infinitesimal losses for other things.

Can you set a TDP throttle so that you can make a Max-Q processor or whatever: yes. Should a processor that has a "TDP" of 105W be pulling 140W: no.

But hey, at this point it's really just a marketing number, at least as far as an unlocked processor with no hard limits is concerned. You can make it say whatever you want by measuring boost TDP vs base TDP and so on.

Remember: last generation Ryzen's TDP was pretty much bang on for power consumption. Most other processors' TDPs are also bang-on for power consumption, with a few notable exceptions. Ryzen 2000 isn't some oddity that behaves totally different from every other CMOS chip.

AMD basically just turned on MCE by default, but didn't increase the TDP rating appropriately to match. The 105W TDP is technically wrong, but :capitalism:

Paul MaudDib fucked around with this message at 01:15 on Apr 21, 2018

Cygni
Nov 12, 2005

raring to post

Klyith posted:

It is an honest spec, enthusiasts just misuse it. If TDP was meant to indicate maximum power draw, it would be called MPD.
Right, but that's not my point. TDP may have an engineering definition, but they are not using the "honest spec" in the first place. They both use different calculations for the term "TDP" based on what they feel they can get away with in order to use the term for marketing, and neither uses the basic formula. 'Cause that's what the "TDP" printed on the box is: marketing that the consumer is attempting to glean information from. And that's really hard when the behavior with regards to the stated TDP is wildly different depending on brand and generation. Here are three real-world examples of stock CPU behaviors:

CPU A: 65W TDP that will turbo to marketed frequencies over 65W and stay there. Does not return to base clocks if thermals allow.
CPU B: 65W TDP that will turbo to unpublished frequencies over 65W for 3 seconds, then return strictly to 65W. Does not return to base clocks if thermals allow.
CPU C: 65W TDP that will turbo to marketed frequencies over 65W for 3 seconds, then return strictly to 65W. Returns to base clocks regardless of thermals.

For a consumer, it's a mess. Not only do we have different uses and calculations of the term, we also have different implementations. Which is why reviews now have a page talking about how the CPU actually behaves. Maybe I'm just an old man yelling at clouds, but it's annoying. And I'm on the internet, which means I need to complain about minor annoyances with way too many words 'cause I'm bored at work.
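
If it helps, those three behaviors boil down to something like this (toy pseudocode for the hypothetical CPUs above, not any vendor's actual turbo algorithm):

```python
# Toy model of the three stock behaviors above. All three have a 65W box TDP.

def cpu_a(t_seconds: float, thermals_ok: bool) -> str:
    # Turbos past 65W and stays there as long as thermals allow.
    return "turbo (>65W)" if thermals_ok else "base clocks"

def cpu_b(t_seconds: float, thermals_ok: bool) -> str:
    # Turbos past 65W for 3 s, then strictly holds 65W (not base clocks).
    return "turbo (>65W)" if t_seconds < 3 and thermals_ok else "hold at 65W"

def cpu_c(t_seconds: float, thermals_ok: bool) -> str:
    # Turbos for 3 s, then drops back to base clocks regardless of thermals.
    return "turbo (>65W)" if t_seconds < 3 else "base clocks"

for t in (1, 10, 60):
    print(f"{t:>2} s:", cpu_a(t, True), "|", cpu_b(t, True), "|", cpu_c(t, True))
```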

Drakhoran
Oct 21, 2012

So apparently the 2700X can be overclocked to 6GHz, if you use enough liquid nitrogen.

https://www.youtube.com/watch?v=ogYess5WelY

Kazinsal
Dec 13, 2011



AMD's back in the extreme WTF lead it seems. Good on them! :byoscience:

EmpyreanFlux
Mar 1, 2013

The AUDACITY! The IMPUDENCE! The unabated NERVE!
In crazy batshit news, someone has been able to push DDR4 4000 CL14 with Ryzen 2.



This bodes well for the 12nm refresh of Raven later this year if they can get this as a close-to-standard OC (like at least 60% of all dies), and especially so for the Ryzen 3000 series.

Alpha Mayo
Jan 15, 2007
hi how are you?
there was this racist piece of shit in your av so I fixed it
you're welcome
pay it forward~
Is it worth paying extra for X470 boards? I plan on upgrading my i5-2500K to a Ryzen 2700X next month, but X470 is pretty pricey (looks like they start around $180 at Microcenter).
