|
Looks like reviewer kits are in the wild: https://twitter.com/VideoCardz/status/1144319602383622144
|
# ? Jun 27, 2019 20:13 |
|
|
Just started watching the Buildzoid Gigabyte X570 Aorus Ultra video, which I've been waiting for because that's the motherboard I've been planning on getting for my upcoming 3900X build. He's super down on it compared to the X570 Pro Wifi (which I've also been considering). I haven't finished the video yet, but I'm surprised he hasn't made note of the 2 RAM slots on the Pro Wifi compared to the 4 on the Ultra. That seems like a big difference-maker for some users.
|
# ? Jun 28, 2019 02:04 |
|
surf rock posted:Just started watching the Buildzoid Gigabyte X570 Aorus Ultra video, which I've been waiting for because that's the motherboard I've been planning on getting for my upcoming 3900X build. Ryzen generally performs best with 2x single-rank RAM sticks as well. Most people only need 16GB or 32GB of RAM, and two slots will handle that.
|
# ? Jun 28, 2019 05:07 |
|
Cygni posted:Looks like reviewer kits are in the wild: Heeeeeeeeere we go https://wccftech.com/amd-ryzen-9-3950x-cpu-world-record-overclock-5-4-ghz-cinebench-r15/
|
# ? Jun 28, 2019 12:05 |
|
Damnit. Someone take these things to the limit on air or water, please.
|
# ? Jun 28, 2019 14:01 |
|
BangersInMyKnickers posted:Intel is a cautious company I mean they really are, in a "say fella, you might want to, uhhh, disable hyperthreading" kinda way. They were very cautious in scheduling quarterly security microcode patches to ensure the cumulative performance loss is experienced in gentle steps. Out of an abundance of caution, they reserved HUGE swaths of CVE identifiers for documenting all the security gently caress ups they're going to have to address.
|
# ? Jun 28, 2019 15:21 |
ilkhan posted:Damnit. Someone take these things to the limit on air or water, please. It's not annoying at all that every leaked overclocking result is on LN2 and every benchmark leak is on some oddball memory config.
|
|
# ? Jun 28, 2019 16:18 |
|
Bloody Antlers posted:I mean they really are, in a "say fella, you might want to, uhhh, disable hyperthreading" kinda way. They were very cautious in scheduling quarterly security microcode patches to ensure the cumulative performance loss is experienced in gentle steps. Out of an abundance of caution, they reserved HUGE swaths of CVE identifiers for documenting all the security gently caress ups they're going to have to address.
|
# ? Jun 28, 2019 16:20 |
|
Alpha Mayo posted:Intel got super greedy. I know 90% of the board still runs a 2500K but honestly, that shouldn't be possible to get away with. That CPU is 8 years old now, it came out in 2011. I'm still quite happily running a 4770K. It sits at 4.6ghz all day and never goes over 74c on a D15. All I want is more cores at a reasonable price at the same speed or better. The 2700x was almost but not quite that. Zen2 almost certainly will be but good lord how did it take until 2019 for me to even think about maybe being excited about a new CPU?
|
# ? Jun 28, 2019 16:35 |
|
I had a 3570K (or was it 3750K?) paired with an Nvidia 1070 that I had to upgrade earlier this year because it really could not keep up with new games like BF5. I would see drastic frame drops, sometimes 1-second lockups, which made the game unplayable. So I went out and got an i5 9600K and a new board and RAM for about $500 total, and the improvement was drastic. In BF5 I went from stutters and barely 60 FPS to 130 FPS; in Destiny 2 I went from 60 FPS to 140 FPS... So IDK how you guys are using the 2500K with modern games anymore. Granted, the 2500K was the better chip overall, but still.
|
# ? Jun 28, 2019 18:05 |
|
Peechka posted:I had a 3570K (or was it 3750K?) paired with an Nvidia 1070 that I had to upgrade earlier this year because it really could not keep up with new games like BF5. I would see drastic frame drops, sometimes 1-second lockups, which made the game unplayable. So I went out and got an i5 9600K and a new board and RAM for about $500 total, and the improvement was drastic. In BF5 I went from stutters and barely 60 FPS to 130 FPS; in Destiny 2 I went from 60 FPS to 140 FPS... So IDK how you guys are using the 2500K with modern games anymore. Granted, the 2500K was the better chip overall, but still. This is exactly what my 3570k is doing as well.
|
# ? Jun 28, 2019 18:08 |
|
Zotix posted:This is exactly what my 3570k is doing as well. Honestly same, but 3550 and BF1. I didn't bother trying BF5. Glad most games are going with Unreal instead of the timeline where Frostbite somehow became a giant engine.
|
# ? Jun 28, 2019 18:11 |
|
Zotix posted:This is exactly what my 3570k is doing as well. My kid has the i5 6800K OCed to 4.8 and he could not play Division 2. The game would pretty much lock the processor at 100% utilization. I'm sure it's just a bug with the game, although a bad one, but still, that's a 2-3? year-old proc. Honestly I would not get anything lower than a six-core from now on, actually probably stick to 8 cores for gaming from now on.
|
# ? Jun 28, 2019 18:31 |
|
Yeah the division 2 would do some hard rear end stutters for me.
|
# ? Jun 28, 2019 18:44 |
|
Isn't The Division 2 one of the games where profiling revealed that the CPU was spending 80% of its time checking that you were legitimate via DRM, and 20% actually running the game?
|
# ? Jun 28, 2019 18:45 |
*cries in Denuvo*
|
|
# ? Jun 28, 2019 18:57 |
|
Twerk from Home posted:Isn't The Division 2 one of the games where profiling revealed that the CPU was spending 80% of its time checking that you were legitimate via DRM, and 20% actually running the game? That's why I need more cores! Don't some of these games live encrypt and decrypt the code over and over?
|
# ? Jun 28, 2019 19:01 |
|
Twerk from Home posted:Isn't The Division 2 one of the games where profiling revealed that the CPU was spending 80% of its time checking that you were legitimate via DRM, and 20% actually running the game? People say that about every recent Ubisoft game, but there's usually not much to back it up. Asscreed Odyssey's bad performance was blamed on DRM too, but it actually ended up being a hilariously bad resource manager aggressively unloading assets, then reloading them, then unloading them, then reloading them, forever.
|
# ? Jun 28, 2019 19:11 |
|
Twerk from Home posted:Isn't The Division 2 one of the games where profiling revealed that the CPU was spending 80% of its time checking that you were legitimate via DRM, and 20% actually running the game?
|
# ? Jun 28, 2019 19:48 |
|
7/7 release date. Any idea what time during the day these kinds of things go live? Or is it very much a refresh your sites until they appear?
|
# ? Jun 28, 2019 21:42 |
|
Setzer Gabbiani posted:Heeeeeeeeere we go BY GOD.
|
# ? Jun 28, 2019 22:24 |
|
5.4GHz you say? I'm sure Intel can do better on freakin LN2. This is not useful.
|
# ? Jun 28, 2019 22:25 |
|
I've done all I can do on my Ryzen cross-gen comparison doc until I have a 3900X in hand. If you're interested in how gens 1 and 2 stack up, check it out here. The prose isn't polished, and the bottom section is incomplete, but the data is finalized. I took 4 whole extra days to run the WCG workunit comparison tests a second time, with SMT disabled, because there are still people all over reddit "helpfully" informing newbies that actually hyperthreading will slow your CPU down. I've never seen that happen, and it sure didn't happen here. In tasks which so constantly stream data into and out of the CPU that accidentally having the memory underclocked on one machine led to a 5% performance degradation, the smallest uplift I saw from SMT was 1.14x. If anybody else has any fairly self-contained things that they'd like to see run for comparisons, I'm happy to consider adding them in. No purely synthetic benchmarks, please; I'm interested in actual tasks.
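For anyone eyeballing those numbers, the SMT uplift figure is just a ratio of wall-clock times for the same fixed batch of work units with SMT off versus on; a minimal sketch (the example timings are made up for illustration, not mdxi's actual data):

```python
# Hypothetical sketch: SMT speedup from wall-clock timings of the same
# fixed batch of work units run with SMT disabled and enabled.
def smt_uplift(seconds_smt_off: float, seconds_smt_on: float) -> float:
    """Speedup factor from enabling SMT (>1.0 means SMT helped)."""
    return seconds_smt_off / seconds_smt_on

# e.g. a WCG-style batch taking 1000 s without SMT and 877 s with it
# lands right around the 1.14x floor described above:
print(round(smt_uplift(1000.0, 877.0), 2))  # prints 1.14
```

Anything above 1.0 contradicts the "hyperthreading slows you down" claim; the only way to get a number below 1.0 is for the SMT-on run to be genuinely slower.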
|
# ? Jun 28, 2019 22:37 |
|
Dr. Fishopolis posted:I'm still quite happily running a 4770K. It sits at 4.6ghz all day and never goes over 74c on a D15. All I want is more cores at a reasonable price at the same speed or better. The 2700x was almost but not quite that. Zen2 almost certainly will be but good lord how did it take until 2019 for me to even think about maybe being excited about a new CPU? Zen+ is technically a bit faster (at 4.4 I score ~4480 single core in geekbench), but eh. The 8700k tempted me but I waited for zen+ and wasn't using my pc much at the time. The 9900k certainly tempted me to upgrade, but its price tag did not. Although I seem to not mind $500 for a 12 core. I have no idea what I'll walk away with on the 1st/7th/8th/whatever, but it will probably be a 3700x or 3900x.
|
# ? Jun 28, 2019 23:48 |
|
redeyes posted:5.4ghz you say? Im sure Intel can do better on freakin LN2. This is not useful. The Intel cinebench r15 record that it has apparently beaten is under LN2 as well, so it's comparable in that respect. It's not like it matters to the average user, but top .0001% scores of anything never really do. The 9960x record is clocked at 5.9GHz for what it's worth. https://hwbot.org/submission/4168151_bigblock990_cinebench___r15_core_i9_9960x_5320_cb
|
# ? Jun 28, 2019 23:51 |
|
I haven't even overclocked my 4770k yet but I have not yet run into the game that makes it seem to suck, thankfully. I guess I ought to see what kind of OC headroom I have on it - I got all the supporting stuff to overclock it but I just kept it stock when nothing I was doing needed more CPU power. That said, definitely going to go with a higher core count on my next build now that games and apps are actually using them well. My first good comp that performed well for the generation was a Barton core AMD based one and I would be happy to go with AMD again if the value is there.
|
# ? Jun 29, 2019 01:19 |
|
Agreed posted:I haven't even overclocked my 4770k yet but I have not yet run into the game that makes it seem to suck, thankfully. I guess I ought to see what kind of OC headroom I have on it - I got all the supporting stuff to overclock it but I just kept it stock when nothing I was doing needed more CPU power. Guild Wars 2 absolutely sucks with mine @4.2 as it's quite old and notoriously CPU-bound, but I'm more than willing to sacrifice whatever I lose over a 9900k for the multi-core performance. I'm definitely waiting for the 3rd party reviews but I'm 90% in the AMD camp now. My ML workloads will benefit the most, I think. Things like filling a shuffle buffer will go a lot faster with 12c.
|
# ? Jun 29, 2019 04:08 |
|
You don't really need that many cores for most GPU ML use-cases - 2 cores per GPU as a rule of thumb is typically more than sufficient. I have a home ML server (1950X with 4x 1080 Tis) and desktop (1700 w/ RTX Titan) and have never hit a case where the CPU was the bottleneck.
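As a back-of-envelope check on that rule of thumb, you can estimate how many data-loader workers it takes to keep a GPU fed from per-batch timings; a hypothetical sketch with illustrative numbers, not measurements from the machines above:

```python
import math

def workers_needed(cpu_ms_per_batch: float, gpu_ms_per_batch: float) -> int:
    """Minimum CPU data-loader workers needed so batch preparation
    keeps pace with GPU compute (i.e. the pipeline isn't starved)."""
    return max(1, math.ceil(cpu_ms_per_batch / gpu_ms_per_batch))

# Illustrative: decoding/augmenting a batch takes 60 ms on one core,
# while the GPU chews through a batch in 40 ms:
print(workers_needed(60.0, 40.0))  # prints 2
```

Unless per-sample preprocessing is unusually heavy relative to the GPU step, the answer stays small, which is why a couple of cores per GPU is typically plenty.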
|
# ? Jun 29, 2019 07:00 |
|
mdxi posted:If anybody else has any fairly self-contained things that they'd like to see run for comparisons, I'm happy to consider adding them in. No purely synthetic benchmarks, please; I'm interested in actual tasks. See if you can get OpenFOAM CFD to work and run the benchmark case discussed here. CFD is great for giving memory access a workout.
|
# ? Jun 29, 2019 09:05 |
|
Pablo Bluth posted:See if you can get OpenFOAM CFD to work and run the benchmark case discussed here. CFD is great for giving memory access a workout. Halfassed home-lab CFD is like the reason to have a Threadripper. High clocks for serial tasks like meshing, and high memory bandwidth to feed the threads your 90GB of model data.
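Not OpenFOAM itself, but the streaming access pattern CFD hammers can be sketched with a toy STREAM-style "triad" kernel; this is a rough illustration of the shape of the workload only, since Python overhead dominates the measured figure:

```python
import time

def triad_gbps(n: int = 1_000_000, s: float = 3.0) -> float:
    """Toy STREAM-style triad (a[i] = b[i] + s*c[i]): every element is
    read or written exactly once, so throughput is bandwidth-bound."""
    b = [1.0] * n
    c = [2.0] * n
    t0 = time.perf_counter()
    a = [bi + s * ci for bi, ci in zip(b, c)]
    elapsed = time.perf_counter() - t0
    # three arrays of 8-byte floats touched per element
    return (3 * 8 * n) / elapsed / 1e9

print(f"{triad_gbps():.2f} GB/s (interpreter overhead dominates; indicative only)")
```

For real numbers you'd run the actual STREAM benchmark or the OpenFOAM case; the point is just that every iteration streams fresh data through the CPU, so memory channels matter as much as cores.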
|
# ? Jun 29, 2019 10:02 |
|
Anyone know what everyone's least-favorite bloviating scotsman is on about this time? A video posted 20 hours ago popped up in my youtube feed.
|
# ? Jun 29, 2019 11:33 |
|
SwissArmyDruid posted:Anyone know what everyone's least-favorite bloviating scotsman is on about this time? A video posted 20 hours ago popped up in my youtube feed. Not worth the watch this time, and I say that as someone who enjoys his videos. He just rants about that Intel letter that was released, and goes on a bit about Intel hiring basically a PR team.
|
# ? Jun 29, 2019 12:02 |
|
SwissArmyDruid posted:Anyone know what everyone's least-favorite bloviating scotsman is on about this time? A video posted 20 hours ago popped up in my youtube feed.
|
# ? Jun 29, 2019 12:53 |
|
Peechka posted:I had a 3570K (or was it 3750K?) paired with an Nvidia 1070 that I had to upgrade earlier this year because it really could not keep up with new games like BF5. I would see drastic frame drops, sometimes 1-second lockups, which made the game unplayable. So I went out and got an i5 9600K and a new board and RAM for about $500 total, and the improvement was drastic. In BF5 I went from stutters and barely 60 FPS to 130 FPS; in Destiny 2 I went from 60 FPS to 140 FPS... So IDK how you guys are using the 2500K with modern games anymore. Granted, the 2500K was the better chip overall, but still. Still have a 60Hz monitor and am overclocked to 4.5GHz. I don't get any lockups, though the most intensive game I play is BLOPS4. But I would like to get a 144Hz monitor, and my CPU needs to keep up for that to happen, so it's time to upgrade. I think part of why the 2500K held up so long is that both the Xbox One and PS4 use weak AMD APUs with 2x quad-core modules: 8 weak cores. Devs weren't too used to spreading out the workload at first, so most of the work was really done on one slow core in many early games, and 4 fast cores could easily keep up. Now that they are better at distributing the work across 8 cores and gaming APIs are more optimized for multicore, quad-cores are just going to fall further and further behind very quickly, and I'll need to upgrade to at least 8 cores too. Especially because the PS5/next Xbox is coming soon and will have 8 Ryzen cores. That's my theory, but I am not a game developer.
|
# ? Jun 29, 2019 14:30 |
|
Well this is an interesting development https://www.cpubenchmark.net/singleThread.html
|
# ? Jun 29, 2019 16:04 |
|
So why is only Ryzen 5 3600 getting leaks?
|
# ? Jun 29, 2019 16:12 |
|
Sininu posted:So why is only Ryzen 5 3600 getting leaks? Probably because it's the only one that's out in the wild? I wouldn't be surprised if they shipped it out as engineering samples since it's the "lowest" of the Zen2's.
|
# ? Jun 29, 2019 16:19 |
|
Don Dongington posted:Well this is an interesting development I am seriously considering dropping one in for my 1800X that I got at a discount earlier this year
|
# ? Jun 29, 2019 17:58 |
|
Agreed posted:I haven't even overclocked my 4770k yet but I have not yet run into the game that makes it seem to suck, thankfully. I guess I ought to see what kind of OC headroom I have on it - I got all the supporting stuff to overclock it but I just kept it stock when nothing I was doing needed more CPU power. It's worth delidding. I couldn't get mine over 4.4 without it, but a little dab of gallium and it's been going for 6 years solid at 4.6 with nary a complaint.
|
# ? Jun 29, 2019 18:06 |
|
|
Sininu posted:So why is only Ryzen 5 3600 getting leaks? There have been tons of leaks on the 3900X too. It's all just synthetic benches though, so I wouldn't pay them much mind for now. edit: here is one here if you want it really: https://browser.geekbench.com/v4/cpu/13698306 Generally speaking, going by all these leaks (and there are a bunch if you know where to look), Zen2 is meeting or slightly beating Coffee Lake on single thread and curb-stomping it on multithread performance at the same or similar clocks... but again, these are all leaks of synthetic benches. I'd be skeptical of them for now. PC LOAD LETTER fucked around with this message at 18:12 on Jun 29, 2019 |
# ? Jun 29, 2019 18:07 |