Theris
Oct 9, 2007

I hope AMD realizes that any product with names like those has to come in a box featuring a poorly CGI'd warrior lady in bikini armor.

Theris
Oct 9, 2007

PC LOAD LETTER posted:


Progressive tweaks of an existing design tend to get you middling to high single-digit performance increases with each revision. Sometimes not even that, really. Kind of like how Intel has been doing since Sandy Bridge. Which no one is very impressed with, at least not for Intel's prices.

Was Sandy Bridge a clean-sheet redesign? I was under the impression that the current Cores are still an evolution of P6.

Theris
Oct 9, 2007

Inside the Machine by Jon "Hannibal from Ars" Stokes is great as an approachable introduction to in-depth CPU stuff. Unfortunately it's pretty outdated. (Only goes up to K8 and Core 2) He's supposedly working on a new revision but has been pretty quiet about it.

Theris
Oct 9, 2007

I'm guessing SciMark uses AVX?

Theris
Oct 9, 2007

I've had that exact RAM in my Z170 system since Skylake was brand new. It's good stuff: it does 2666 at stock voltage and timings, so it would probably run much faster with relaxed timings and a voltage bump; I've just never bothered to try.

Theris
Oct 9, 2007

It should have the "Rip and tear your threads" Doom comic edit right on the box.

Theris
Oct 9, 2007

MaxxBot posted:

Console games can be optimized a lot better than PC games since you're always gonna be running on one set of hardware; it would probably be better to use a really well-optimized game like DOOM as a point of comparison.

The "XB1X is like a PC with a 1070" comparison has to be including the console optimization in the comparison. Because an XB1X has 40 CUs @ 1.2GHz, where on the PC to match a 1070 Vega needs 56 CUs at 1.4GHz. (Depending on the game, ofc)

Theris fucked around with this message at 17:21 on Oct 31, 2017

Theris
Oct 9, 2007

Combat Pretzel posted:

Who knows? There may be reasons we're not privy to. Just saying, the memory controller design seems to be more of a technical/design issue than a managerial one.

I believe the Ryzen IMC is a SIP block licensed from Rambus, not an in-house design. I don't know if that means AMD isn't capable of designing one, or if they could maybe do better if they shifted some resources to doing one in-house. Getting the Micron guy might be a step towards that.

Theris
Oct 9, 2007

ufarn posted:

Do Noctua still have the offer of the free parts needed to mount their fans on AM4 boards?

It wasn't a limited-time thing: if you have a Noctua cooler and they make a mounting kit for it that you need and don't have, they'll give you one for free, always.

Theris
Oct 9, 2007

HalloKitty posted:

Ah, the classiest of them all, with their fake cache chips 'n' all.
But didn't they get bought out by ECS?

IIRC, PC Chips bought ECS and took the name.

I had a K7S5A and it was great, but I get the impression that the K7S5A was a huge fluke and they've never made a board anywhere near that good before or since.

Theris
Oct 9, 2007

Kazinsal posted:

Should I know who Gadi Evron is?

He's obviously a well-respected security researcher and highly regarded member of the infosec Twitter community with... *squints* ...1,000 followers.

...who passes along "we, a company that popped up out of nowhere and is clearly associated with a group heavily shorting AMD stock, failed to follow anything resembling responsible disclosure because we care so much about you, the end user" without batting an eye.

Theris
Oct 9, 2007

Here's a thread of some actually well-respected infosec Twitter people:

https://twitter.com/taviso/status/973622044200919040

Consensus seems to be that yes, they're legit vulnerabilities, but not much worse than the average "having root access allows arbitrary code execution" caliber vulnerability.

Theris
Oct 9, 2007

Palladium posted:

I'm curious how well a 2700X can undervolt at, let's say, 4GHz, because 150W stock is too much for me, but I'm sure I won't be able to find out when everyone is gonna YOLO their chips because OCing street creds

Gamers Nexus posted:

To this end, we found that, at a given frequency of 4.0GHz, our R7 2700X held stable at 1.175V input at LLC level 4, which equated to 1.162V VCore at SVI2 TFN. The result was stability in Blender and Prime95 with torturous FFTs, while measuring at about 129W power consumption in Blender. For this same test, our 1700 at 4.0GHz required a 1.425V input at LLC level 5, yielding a 1.425VCore, a 201W power draw – so 70W higher – and pushed thermals to 79 degrees Tdie. That’s up from 57.8 degrees Tdie at the same ambient.

Theris
Oct 9, 2007

Bloody Antlers posted:

I wonder what the Intel engineers that designed the 8086 back in 1976 would say if they could see how far past the point of diminishing returns we've carried their baby.

It was only 40 years ago. Stephen Morse, Bill Pohlman, and Bruce Ravenel are still alive. Morse did an interview with PC World for the 30th anniversary. He doesn't talk much about modern CPUs but did mention that if they had known that x86 was going to stick around rather than just be a stopgap until the 432 could be released, they'd have done some things differently.

Theris
Oct 9, 2007

Deuce posted:

Mostly stick to Glorious Hair Man's Nerd Porn.

This is good advice. If you want to watch someone who knows his poo poo talk about PC hardware, even if the presentation is kinda dry, watch Gamers Nexus. If you want to see a dipshit do dumb things with PC hardware, but in a reasonably entertaining way, watch Linus. Almost everyone else (I'm sure there's probably a couple occasionally worth watching) is about as entertaining as Gamers Nexus and about as informative as Linus, and I have no idea how they have an audience.

Theris
Oct 9, 2007

Xae posted:

Once THE ALGORITHM has determined that you will like a video/channel there isn't a drat thing you can do to get it to gently caress off.

THE ALGORITHM thinks I'd be super into alt-right/MRA/gamergate dudes talking into a webcam in front of their anime figurine collection for an hour, presumably because I watch a lot of nerd poo poo. I always do the "not interested in this channel" thing but it never stops.

Theris
Oct 9, 2007

SwissArmyDruid posted:

Except that for a few months at the beginning of this year, Nvidia was in a mess of driver regressions and fuckups.

This is important to note. When most people who say "AMD's drivers are fine, and have been for several years" say it, they don't mean the drivers are good or even fine by any sane quality standard; they mean they aren't any more broken than Nvidia's drivers tend to be. That, or the speaker uses some particular feature that's perma-broken in Nvidia's drivers. (I'm AMD-only until Nvidia catches up to ATI circa 2006 and lets you switch between individual and spanned desktops without forcing you to reconfigure the spanned desktop and bezel compensation from scratch every time.)

Theris
Oct 9, 2007

wargames posted:

5k is just ultrawide 4k.

5k is 5120x2880, quad 1440p. Or to put it another way, 5k:1440p::4k:1080p.

But maybe some vendor uses 5k to mean ultrawide 2160p because there's nothing that says they can't.
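A quick pixel-count sketch of why 5k lines up with 1440p the same way 4k lines up with 1080p. The "ultrawide 2160p" entry is my assumed reading of that hypothetical vendor usage, not an actual standard:

```python
# Pixel counts for the common resolution names (standard meanings assumed).
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
    "ultrawide 2160p": (5120, 2160),  # assumed meaning of the vendor usage above
}

for name, (w, h) in resolutions.items():
    print(f"{name:>16}: {w}x{h} = {w * h / 1e6:.1f} MP")

# 5K is exactly 4x the pixels of 1440p, just as 4K is exactly 4x 1080p.
print((5120 * 2880) / (2560 * 1440), (3840 * 2160) / (1920 * 1080))  # 4.0 4.0
```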

Theris
Oct 9, 2007

The bigger cache should be a huge help for minimum frame times, but the inter-chiplet latency might wipe out any benefit from the cache. We won't know for sure until there are actual benchmarks.

Theris
Oct 9, 2007

PC LOAD LETTER posted:

That being said, even the "garbage tier" X570 mobos in the ~$150-200 range are looking to be better than the best X470 mobos, going by what Buildzoid has mentioned so far.

Better in terms of build quality and power delivery, not features/connectivity. So if you want, say, tons of SATA ports or 2.5G Ethernet or the like (or lots of RGB), you might still be better off with a mid/high-end X470 board vs a similarly priced low-end X570 board.

For someone like me who doesn't care about that stuff, though, the Asus Prime X570-P is looking really good if it actually comes in at the rumored $150.

Theris fucked around with this message at 03:22 on Jun 22, 2019

Theris
Oct 9, 2007

ilkhan posted:

Damnit. Someone take these things to the limit on air or water, please.

It's not annoying at all that every leaked overclocking result is on LN2 and every benchmark leak is on some oddball memory config.

Theris
Oct 9, 2007

iospace posted:

Also, when's the embargo lifting?

12PM EDT/9AM PDT on the 7th, I think.

Theris
Oct 9, 2007

eames posted:

I suspect that the difference in temps and performance is minimal because the heat transfer, not dissipation, is the bottleneck.

I believe you're right, and the tiny surface area of the chiplet is playing a big role here. A Zen 2 chiplet is less than half the area of a Zen+ die. Even using way less power, it's going to have trouble moving heat out as well.

Theris
Oct 9, 2007

lllllllllllllllllll posted:

For 65W-TDP these 3xxx CPUs sure are hot & power hungry. Wish AMD had something to offer for the cool and quiet crowd as well.

Despite exceeding TDP under heavy loads, they still use substantially less power than the performance-equivalent Intel parts, and while they run hotter than what we're used to, that's a function of physics w.r.t. the reduced die size. Your cooler doesn't have to work harder/louder (because it's not actually dissipating more energy); you just need to be comfortable with the chip sitting at 70C instead of 60.

vvvv: Right. When you're looking at review charts, a power/heat-throttling Intel part might look better in instantaneous power consumption, but it's going to use more total energy because it takes longer to complete the task. A couple of reviewers point this out in the text accompanying the charts, at least.
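To make that concrete, here's a toy calculation. The wattages and runtimes are made up purely to illustrate the point, not measured numbers:

```python
# Energy used = average power x time. A part that draws less at any given instant
# can still use more energy overall if it takes longer to finish the job.
def energy_wh(avg_power_w: float, runtime_s: float) -> float:
    return avg_power_w * runtime_s / 3600.0

fast_chip = energy_wh(avg_power_w=105, runtime_s=300)  # higher draw, done in 5 minutes
slow_chip = energy_wh(avg_power_w=95, runtime_s=380)   # looks better on the chart, runs longer

print(f"fast chip: {fast_chip:.1f} Wh, slow chip: {slow_chip:.1f} Wh")  # ~8.8 Wh vs ~10.0 Wh
```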

Theris fucked around with this message at 12:57 on Jul 10, 2019

Theris
Oct 9, 2007

SSJ_naruto_2003 posted:

Kind of curious why my processor sits in the 1.45 to 1.5 volt range even when idling pretty hard. It never goes above 1.5 though so...

Apparently it's a conflict between how some tools read voltage and the way core parking works on the 3000 series. CPU-Z and Ryzen Master will show the correct idle voltage; most everything else reads way high.

Theris
Oct 9, 2007

xgalaxy posted:

They have the Chromax line now, which is a decent compromise, all jokes aside.

Even that is extremely off-brand for them.

Theris
Oct 9, 2007

Amazon finally delivered the Prime X570-P I ordered launch day, to go with the 3600X I've had since Tuesday. Haven't had much time to play with it, but here's a quick trip report:

I missed entering UEFI setup on the initial boot to have it boot from my Windows 10 install USB, so it booted to my existing install and... it works? It's a Windows 10 Pro install from a 6700K/Z170 system and everything seems to work just fine. I went ahead and installed the AMD chipset drivers and have yet to run into any obvious problems. I'll eventually nuke and pave it anyway, because I know if I don't I'll eventually run into some mysterious problem that's impossible to troubleshoot, but for now I'm just going to leave it.

The BIOS it comes with freezes on the confirmation dialog if you make any changes at all, gg Asus. Fortunately the launch-day downloadable BIOS fixes it.

It might have the "won't boost past 4.3GHz" issue. HWiNFO says the current BIOS has AGESA 1.0.0.3, but I don't know if that's the 1.0.0.3A or 1.0.0.3AB that's supposed to have the fixes and it's just truncating the letters, or if 1.0.0.3 is its own version. With PBO enabled it sits at exactly 4.2 on all-core loads (which is normal even with the boost fixes, I think) and bounces around 4.3 on single core. I don't have the high idle voltage problem. Haven't had time yet to play around with the various PBO scaling and AutoOC options or to try different voltage offsets.

Currently running Prime95 Large FFT (don't want it to melt from small/blend while I'm not watching it) overnight. Under an NH-D15 it hasn't gotten above 75C yet.

Edit: Forgot to mention that I don't really notice the chipset fan. It's audible, but just blends in with the other fans and background noise unless I'm actually trying to listen for it.

Theris fucked around with this message at 05:10 on Jul 12, 2019

Theris
Oct 9, 2007

People should be thinking of the 3900X as an HEDT chip that just happens to work in regular desktop motherboards. There's absolutely no reason to get one over an 8- or 6-core if all you do with your PC is game.

Theris
Oct 9, 2007

There are lots of boards out there that either haven't been updated to an AGESA where boosting works properly or have had a BIOS update that actually breaks boosting again.

If you have good cooling and aren't hitting advertised boost clocks, it's more likely that's the reason than silicon quality. (Though it's entirely possible you did get a turd.)

Theris
Oct 9, 2007

Lambert posted:

Windows does run on ARM. You can buy laptops with Snapdragon CPUs that run Windows right now.

It's not something that comes up in the era of x86 dominance, but Windows NT's architecture independence was a big deal in the 90s. You could get Windows for Alpha, PowerPC, and MIPS up through NT 4, and 2000 (I think), XP, and the Vista/7-era server releases supported Itanium.

The aspects of Windows' design that enabled that didn't go away in Vista/7/etc.; there just wasn't a non-x86 architecture worth running a desktop OS on during that time.

Theris fucked around with this message at 12:12 on Jul 23, 2019

Theris
Oct 9, 2007

Mr.Radar posted:

The Scottish dude is done with AMD and won't release any more (public) speculation videos:

https://www.youtube.com/watch?v=kU5h0MYpmfg

And nothing of value was lost.

Theris
Oct 9, 2007

MaxxBot posted:

AMD actually lowered the performance of their chips in response to stupid people on Reddit complaining about a nonexistent issue, I mean it might be a totally insignificant difference but I still don't like it.

To be fair, there's something of an actual problem with the high reported idle temperatures causing fans to ramp when they don't really need to. But that can be fixed by adjusting fan curves or making sure AIOs change fan speed off the coolant temp instead of the CPU temp. AMD changing the boost behavior to mollify people freaking out over numbers they don't like, which aren't actually a problem, is lame as hell.

Edit: lmao that idle clocks have increased, so now they will actually be using more power at idle, but hey, the reported temps and voltages will be lower, so the people who stare at HWiNFO all day instead of using their computer will be happy.

Edit 2: Looks like they've changed the boost delay to be more like Windows Balanced or Ryzen Power Saver. Why not just tell people to switch their plan to one of those instead of changing Ryzen Balanced to match?

Theris fucked around with this message at 07:23 on Jul 31, 2019

Theris
Oct 9, 2007

Grog posted:

I'm only running a 3600 and MSI X570-A PRO, but it seems to get up to the specified max boost on at least a couple of cores (like the Hardware Unboxed results for the 3800X)

My 3600X on an Asus X570-P boosts to 4267MHz all-core, which I've heard is pretty good. Unfortunately it also boosts to 4267 single-core. No combination of PBO, Auto OC, or the Asus OC settings changes that. :shrug: I'm not super worried about it, but I wouldn't mind if a future update fixes it.

Edit: In other weirdness, I've been running a pair of Corsair 3000CL15 sticks (Hynix A-die) at 3200 with all the timings set via XMP. This gets me 73ns in AIDA64. I saw the positive results some people have been getting with manual timings, so I grabbed Thaiphoon Burner and the DRAM Calculator. The timings from DRAM Calculator on manual with the XML import from TB (including a drop to CL14) get me... 76ns.

It did up the bandwidth from the high 45GB/s range to the low 46GB/s range. I went back to auto timings on everything except leaving CAS at 14, and it went back to 73ns but kept the small bandwidth gain, so I guess I'll just leave it at that. DRAM Calculator's V1 and V2 presets have even looser timings than auto, so I haven't bothered trying them.

I also tried boosting the FCLK. It's stable at 1900, but latency goes up to 80ns, so I'm guessing this board ties UCLK to MCLK and I can't find a setting for it in the BIOS. (Not even under anything that seems like it might be a possible alternate name, like Memory Controller Clock, Uncore Clock, IO Die Clock, SOC Clock, or whatever.)

Theris fucked around with this message at 09:56 on Aug 22, 2019

Theris
Oct 9, 2007

ConanTheLibrarian posted:

prime95 doesn't use the GPU while F@H does, so it's probably just down to more heat in the same case.

It's probably this. My 3600X (in a Define R5 with an NH-D15) will sit at around 65C when doing F@H by itself. If the GPU actually manages to get a work unit assigned, the CPU jumps up to around 75C. It's the same delta over ambient; it's just that "ambient" inside the case is a lot higher with the GPU dumping another 200W of heat into the air that's headed for the CPU cooler.
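As a rough sketch of that delta-over-ambient logic, with numbers I've assumed purely to line up with the 65C/75C observation above (not measurements):

```python
# Toy model: the cooler holds the CPU at a roughly fixed offset above the air it's fed,
# so if the GPU warms the in-case air, the CPU temperature rises by about the same amount.
room_temp_c = 22
case_rise_gpu_idle = 8     # assumed in-case air rise with the GPU idle
case_rise_gpu_loaded = 18  # assumed rise with the GPU dumping ~200W into the case
cpu_offset_over_case = 35  # assumed CPU-over-intake-air offset at this F@H load

print("CPU temp, GPU idle:  ", room_temp_c + case_rise_gpu_idle + cpu_offset_over_case)    # ~65
print("CPU temp, GPU loaded:", room_temp_c + case_rise_gpu_loaded + cpu_offset_over_case)  # ~75
```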

Theris fucked around with this message at 20:01 on Apr 27, 2020

Theris
Oct 9, 2007

gradenko_2000 posted:

Also, this seems to suggest one more line of CPUs after Ryzen 4000 that's still on AM4 and still supported by B550 etc., though I guess that might just be Zen 3-based APUs.

I don't think there's any technical reason they couldn't pair Zen 4/5/whatever with a Zen 2/3-era I/O die to make it work on AM4, barring big changes to Infinity Fabric. Whether they'd want to do that is another question entirely, and I kinda doubt it.

Theris
Oct 9, 2007

Fame Douglas posted:

Motherboard manufacturers have been quietly increasing their default voltages for Intel CPUs above what Intel recommends for a long time; this type of behaviour is (unfortunately) very old hat.

The default voltage on my Asus Z170 board was something so absurd that I was able to run an offset in the -0.3V range even with a pretty hefty overclock.

Theris
Oct 9, 2007

The last time AMD had objectively superior products to Intel, they maxed out at around 50% market share. And that was with CPUs that were better in every way; there weren't any "but what about single-thread-limited high-framerate esports games?" reasons for buying Intel. (IIRC, Intel fanboys at the time justified their purchases by pointing to AMD's chipset issues, which were mostly a problem from the early Athlon days and weren't really a thing anymore by then.)

And yeah, the r/AMD crowd ensures that AMD has a market share floor regardless of the crap they're putting out, but it's way lower than Intel's.

Theris
Oct 9, 2007

Seamonster posted:

ehhhh no 5000 series for DDR5? booo

I have an incredibly hard time believing that AMD of all people would pass up the chance to use the 5000 series for a 5nm CPU that will use Socket AM5 and introduce support for DDR5 and PCIe 5.0, just to make their naming scheme slightly less confusing.

Edit: I'd be more inclined to believe they'll skip "Zen 4" or use it for what is presumably the Zen 3 refresh in Warhol than that they'll use Ryzen 5000 for Zen 3.

Theris fucked around with this message at 15:36 on Sep 10, 2020

Theris
Oct 9, 2007

Jeff Fatwood posted:

Wow, I had no idea, that's actually really cool. How many x86 generations have there been after that?

Hard to tell, really. I guess you could make an argument that if Nehalem/Westmere = 786, then *Bridge = 886, *Well = 986 and *Lake = 1086. But the whole thing ignores that Pentium 4 would have been 786, before they went back to 686 for the pre-i7 Cores. So...:shrug:

Edit: I think the problem you run into in trying to figure this out is deciding how big a change to the microarchitecture justifies an x86 generation increase in the first place. IIRC there was a bigger difference between Core 2 and the "686" Core than there is between Core 2 and the original i7 Nehalem.

Theris fucked around with this message at 10:45 on Sep 29, 2020

Theris
Oct 9, 2007

future ghost posted:

Elitegroup Computer Systems might have something to say about that.

Maybe, but there will always be a special place in my heart for the K7S5A.
