|
BurritoJustice posted:I'm itching to get a 9900K, something Z390 with a PLX chip and some 4000MHz+ RAM. What are the best estimates for the next gen release date at the moment? I read somewhere about a ~surprise~ release date of the 14th, but we'd have seen more leaks by now. Figure on the first or second week of October, with 'leaks' leading up to it.
|
# ? Sep 11, 2018 15:43 |
|
BurritoJustice posted:I'm itching to get a 9900K, something Z390 with a PLX chip and some 4000MHz+ RAM. What are the best estimates for the next gen release date at the moment? I don't remember any Z370 boards with a PLX chip. There might be some out there from Supermicro or somethin, but Asus def didn't offer one even on the Maximus skus when I was lookin. What's the use case you are looking for? You might be better served by Threadripper or Skylake-X if you need piles of PCIe lanes. Also I wouldn't recommend the 4000MHz baller RAM. Intel platforms generally plateau around 3000MHz, if not sooner.
|
# ? Sep 11, 2018 15:52 |
|
I asked before but I don't think I saw an answer. Can I put a 9700K in a Z370 board to replace my 8700K? Or will features be locked down without a Z390 board?
|
# ? Sep 11, 2018 15:56 |
|
We don’t know yet. Likely will be able to use a Z370 to OC, based on the leaks.
|
# ? Sep 11, 2018 15:59 |
|
Aeka 2.0 posted:I asked before but I don't think I saw an answer. Can I put a 9700K in a Z370 board to replace my 8700K? Or will features be locked down without a Z390 board? Yes, the 9700K/9900K chips are supported on Z370 boards. There should be a BIOS update from your board manufacturer that adds support for them.
|
# ? Sep 11, 2018 16:03 |
|
Cygni posted:I don't remember any Z370 boards with a PLX chip. There might be some out there from Supermicro or somethin, but Asus def didn't offer one even on the Maximus skus when I was lookin. Whats the use case you are looking for? You might be better served by Threadripper or Skylake-X if you need piles of PCIe lanes.

All the manufacturers are waiting on Z390 to do their halo-tier boards; Gigabyte told me ages ago that their PLX-equipped Gaming 9 will release with Z390, and it's a similar story for the other manufacturers. I SLI and need a Thunderbolt card as well, and I want to run an Optane drive without running out of lanes. I also have a PCIe USB controller for my audio interface, as it doesn't like to play nice if it shares a USB bus. PLX boards just mean way less headache when everything gets max lanes.

Intel scales linearly all the way to 4000+ in memory-bottlenecked single-thread games (think open world), and the super high memory speeds provide the ultra-smooth experience that everyone chases with the 5775C. At DDR4-4000 it's actually faster than the eDRAM on the 5775C. Great for 99th-percentile frame times and stability, plus major FPS boosts in CPU-intensive AAAs. I've been waiting 7 years to upgrade so I'm not going to skimp out. Good for the stability and load speeds of my silly 100GB Skyrim installs too. "The memory speeds don't matter on Intel" is a common theme in the part picking thread and I honestly don't know why.

E: I'm not saying it's cost effective, but it will be going into my custom water-cooled SLI computer and will definitely be the bottleneck, and the cheapest part at that. I'm hoping to get a long, long time out of it.

Fun unrelated fact: my 4TB Seagate is dying a noisy, painful death. I can hear it across the room while it's transferring data. BurritoJustice fucked around with this message at 16:24 on Sep 11, 2018 |
# ? Sep 11, 2018 16:21 |
|
Have a link to the 4000mhz ram gaming benchmarks? I’m curious to see how the performance is vs 3000/3200.
|
# ? Sep 11, 2018 16:44 |
|
B-Mac posted:Have a link to the 4000mhz ram gaming benchmarks? I’m curious to see how the performance is vs 3000/3200. Off the top of my head, this. There are many more but I am lazy. Digital foundry does live comparisons with frametime charts so you can see the difference in realtime, check their youtube channel.
|
# ? Sep 11, 2018 16:57 |
|
My experience with real-world testing has been that RAM speed made next to no difference at anything over 1080p. I game at 1440p/60 with an 8700 and 32GB of G.Skill 3200MHz. Anything over like 2666MHz made like no difference; I think my results at 3200MHz were actually slightly slower, even. Maybe if you are shooting for like 1080p/240Hz or somethin? e: BurritoJustice posted:Off the top of my head, this. There are many more but I am lazy. Digital foundry does live comparisons with frametime charts so you can see the difference in realtime, check their youtube channel. That link def doesn't line up with my experience, but maybe I'm wrong.
|
# ? Sep 11, 2018 17:03 |
|
For gaming at 4K 60 Hz*, I have little to no incentive to upgrade my 6700K, correct? I’m tempted to pick up a 9900K just for fun, but I really doubt I’d see real world benefits. I do mess around with VMs but rarely more than one at a time. I’m sure the 6700K will become less tenable in a few years, but I’m trying to follow the sensible advice of ‘only upgrade when you have a need’. *Assuming I ever get my GPU RMA debacle ironed out.
|
# ? Sep 11, 2018 17:17 |
|
4K is almost always going to be GPU-limited unless you have an absolutely cutting-edge GPU/multiple cutting-edge GPUs. A G4560 performs pretty much the same as a 7700K at 4K.
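As a rough sketch of why that happens: the frame rate is set by whichever of the CPU or GPU takes longer per frame, so once the GPU is the slow side, CPU differences stop showing up. All the millisecond figures below are made up for illustration, not measurements.

```python
# Toy frame-pipeline model: each frame needs CPU work (game logic, draw
# calls) and GPU work (rendering); the slower side sets the frame rate.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    """Achievable FPS when CPU and GPU work overlap fully."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical 1080p: GPU finishes fast, so the CPU decides the result.
print(fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=5.0))   # CPU-bound: 125.0

# Same CPU cost at 4K: GPU time balloons, so CPU differences vanish.
print(fps(cpu_ms_per_frame=8.0, gpu_ms_per_frame=25.0))  # GPU-bound: 40.0
print(fps(cpu_ms_per_frame=4.0, gpu_ms_per_frame=25.0))  # faster CPU, still 40.0
```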
|
# ? Sep 11, 2018 18:04 |
|
Paul MaudDib posted:4K is almost always going to be GPU-limited unless you have an absolutely cutting-edge GPU/multiple cutting-edge GPUs. A G4560 performs pretty much the same as a 7700K at 4K. Not really
|
# ? Sep 11, 2018 18:07 |
|
BurritoJustice posted:Off the top of my head, this. There are many more but I am lazy. Digital foundry does live comparisons with frametime charts so you can see the difference in realtime, check their youtube channel. Saw that tech spot before, I’ll have to search digital foundry again. I remember them doing ram speed tests but not with speeds that high.
|
# ? Sep 11, 2018 19:12 |
|
I’m having trouble finding HEVC benchmarks weighing RAM speeds and cores. I know that the i9 CPUs sweep the i7-8700K, but I’m not $1000+ CPU crazy enough to do it. However, an extra $100 for RAM that could buy me another 10% in FPS on encodes is pretty nice. In any case, so far I’m seeing that Ryzen is just not that great in Handbrake compared to similarly priced Intel CPUs and that tilts me back toward Intel... unless I’m being misled and a 2700X is spanking the i7-8700K (because I haven’t seen any encoding bench that made it even close). Plodding along with my old E3-1230 Sandy Bridge is humbling.
|
# ? Sep 11, 2018 22:09 |
|
RAM isn't going to do poo poo for encodes unless you are already maxed out just loading your normal programs. AMD has crap AVX performance, and that's why Intels are way faster at Handbrake encodes.
|
# ? Sep 11, 2018 23:03 |
|
Risky Bisquick posted:Not really He's right, there are only two games that bottleneck my 6600k at 4k and they're both trash ports (mhw, as:o) Every other game hits 100% usage on the 1080ti long before the cpu can reach 60-70%
|
# ? Sep 12, 2018 20:36 |
|
He was talking about a 2 core chip being good enough for 4k, which is not correct though. 4 cores will obviously have more runway between playable and unplayable, depending on the engine used. Frostbite games are getting to that point it seems like
|
# ? Sep 12, 2018 21:08 |
|
Zedsdeadbaby posted:He's right, there are only two games that bottleneck my 6600k at 4k and they're both trash ports (mhw, as:o) You are being disingenuous. A 2-core chip isn't the same as a 4-core chip, let alone the bottleneck of that Pentium's clock speed. I have a 1080 Ti. Jumping from a Skylake 6700K to an 8700K was huge for 3440x1440 @ 120Hz; I'd imagine 4K would be the same. Yes, higher resolutions tend to be GPU-bound, but come on dude. Especially if you're pushing a high refresh rate monitor, you'll notice the minimums easily. Add a better CPU and you'll easily have 4K @ 60Hz. I know you aren't pushing that on a 6600K, because I wasn't pushing 120Hz @ 3440x1440 on a 6700K until I upgraded to an 8700K OCed to 5GHz Rabid Snake fucked around with this message at 14:56 on Sep 16, 2018 |
# ? Sep 16, 2018 14:48 |
|
What's happening with higher resolutions that requires more CPU power in addition to more GPU power?
|
# ? Sep 17, 2018 02:35 |
|
Farmer Crack-rear end posted:What's happening with higher resolutions that requires more CPU power in addition to more GPU power? Probably stuff like LoD scaling.
|
# ? Sep 17, 2018 03:49 |
|
The last stat I saw that's of material importance for CPUs in games was minimum FPS, which really is an important one. Until reviewers start reporting percentiles instead of average or geometric-mean FPS, I can't say whether that minimum is statistically significant either. I wasn't really aware of anything interesting happening, even anecdotally, for higher FPS near 3440x1440 involving CPUs, though, and presumed it's mostly the 4K resolution that was causing the CPU crunch. Guess I'll get some good use out of the upcoming 9700K, although I'd pay extra for more threads to throw at my x265 jobs.
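A minimal sketch of why percentile frame times tell you more than an average FPS number. The frame-time list is hypothetical: a mostly smooth 60 FPS run with a few big CPU hitches mixed in.

```python
import statistics

# Hypothetical per-frame render times in milliseconds: 95 smooth frames
# plus 5 big hitches.
frame_times_ms = [16.7] * 95 + [50.0] * 5

avg_fps = 1000.0 / statistics.mean(frame_times_ms)

# 99th-percentile frame time: the level the worst ~1% of frames exceed.
p99_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]
p99_fps = 1000.0 / p99_ms

print(round(avg_fps))  # 54 "average FPS" hides the hitches
print(round(p99_fps))  # 20 FPS at the 99th percentile exposes them
```

The average looks fine while the percentile shows the stutter, which is the whole argument for reviewers publishing 1%/0.1% lows.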
|
# ? Sep 17, 2018 21:38 |
|
It's kind of annoying that people talk about resolutions being easy or difficult for a CPU to drive, when it's really the framerate that matters. 120 Hz 3440x1440 is not any easier to drive than 120 Hz 1920x1080, what matters is the refresh rate you're trying to drive.
|
# ? Sep 17, 2018 21:45 |
|
Paul MaudDib posted:It's kind of annoying that people talk about resolutions being easy or difficult for a CPU to drive, when it's really the framerate that matters. 120 Hz 3440x1440 is not any easier to drive than 120 Hz 1920x1080, what matters is the refresh rate you're trying to drive. Huh? I can't parse this comment right now. Are you saying that 3440x1440 is easier to drive than 1920x1080? Both resolution and refresh matter, as do min frame times, etc.
|
# ? Sep 18, 2018 17:12 |
|
Save for extreme cases (such as using an Atom or AM1-based CPU), higher resolutions don't appreciably tax the CPU, as pumping out that many pixels is a function of GPU bandwidth alone
|
# ? Sep 18, 2018 17:34 |
|
Paul MaudDib posted:It's kind of annoying that people talk about resolutions being easy or difficult for a CPU to drive, when it's really the framerate that matters. 120 Hz 3440x1440 is not any easier to drive than 120 Hz 1920x1080, what matters is the refresh rate you're trying to drive. 3440*1440*120 > 1920*1080*120, I'm not sure what you're getting at?
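For what it's worth, the arithmetic on those two targets, split into the part that lands on each side of the pipeline:

```python
# GPU-side pixel throughput scales with resolution x refresh rate, but the
# CPU-side game-logic work scales with the frame rate alone.
def pixels_per_second(width, height, hz):
    return width * height * hz

uw  = pixels_per_second(3440, 1440, 120)  # ultrawide 1440p @ 120 Hz
fhd = pixels_per_second(1920, 1080, 120)  # 1080p @ 120 Hz

print(uw / fhd)  # ~2.39x the pixels per second for the GPU to fill,
                 # yet both need the same 120 game-logic updates per second
```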
|
# ? Sep 18, 2018 17:40 |
|
Maybe Paul MaudDib is talking about games and saying something like "an increased resolution mainly creates work for the GPU, whereas increasing the FPS means that the game has to go through its main loop faster, which increases the load on the CPU"? Although I don't think most games are CPU-bound these days anyway.
|
# ? Sep 18, 2018 18:51 |
|
Strategy games in general are certainly cpu bound, mmo's would likely be as well
|
# ? Sep 18, 2018 19:46 |
I think Paul is saying that in the vast majority of games there isn't much if anything done on the CPU that is resolution dependent. If a CPU can run a game at 144fps at 640x480, it's generally going to be able to run that game at 144fps at 1080p or 1440p or 4k provided the GPU can too. The opposite as well: if a game is cpu limited then dropping resolution isn't going to get you much of a speed up.
|
|
# ? Sep 18, 2018 21:36 |
|
Theris posted:I think Paul is saying that in the vast majority of games there isn't much if anything done on the CPU that is resolution dependent. If a CPU can run a game at 144fps at 640x480, it's generally going to be able to run that game at 144fps at 1080p or 1440p or 4k provided the GPU can too. The opposite as well: if a game is cpu limited then dropping resolution isn't going to get you much of a speed up.

Yeah, this. Lower resolutions happen to be easier for the GPU to run, but talking about "a CPU that is good at 1080p" is actually a misnomer: what you actually mean is a CPU that is good at high refresh rates, it's just that it's easier for the GPU to drive high refresh rates at lower resolutions. If you are CPU bottlenecked then your system that does 100fps at 1440p will also do 100 fps at 720p or whatever - there is the same amount of game logic to run regardless of resolution.

It's more sensible to just cut out the whole "well lower resolutions are easier to drive and..." bit and just say "this CPU is good for 200 fps in title X". Whether or not you hit that will of course depend on your GPU as well, but graphics aren't the part that's running on the CPU, so it's a little nonsensical to talk about a CPU in terms of graphical performance.

The performance of two hypothetical systems, one with a 1050 and one with dual 2080 Tis, is going to be very different even though they're both "at 4K", and you're obviously going to need a much beefier CPU to keep up with the SLI 2080 Ti system. So cramming these both into the same metaphorical bucket by talking about a CPU's "4K performance" is dumb, what you really mean is HFR/not-HFR and it would be better to just say as much.

So instead of saying "good at 1080p" just say "good at HFR" instead, and instead of saying "good at 4K" say "targeting 60fps" instead. Paul MaudDib fucked around with this message at 23:49 on Sep 18, 2018 |
# ? Sep 18, 2018 23:28 |
|
Paul MaudDib posted:Yeah, this. Lower resolutions happen to be easier for the GPU to run, but talking about "a CPU that is good at 1080p" is actually a misnomer: what you actually mean is a CPU that is good at high refresh rates, it's just that it's easier for the GPU to drive high refresh rates at lower resolutions. If you are CPU bottlenecked then your system that does 100fps at 1440p will also do 100 fps at 720p or whatever - there is the same amount of game logic to run regardless of resolution.

This isn't completely true. Ideally the CPU would do the same work no matter the resolution, but in CPU benchmarks at different resolutions on an otherwise identical video card, different CPUs end up at different FPS. https://www.gamersnexus.net/guides/3009-amd-r7-1700-vs-i7-7700k-144hz-gaming According to this, the R7 1700 is a 200FPS CPU, so anything over 1080p should have completely identical scores. Yet the 1700 is always just a little bit slower, even at completely GPU-bound tasks. I've seen benchmarks where the difference was pronounced even at higher resolutions, but I don't have time to go hunting them down tonight.

If I had to guess what it was, it'd be textures. Dynamically loading textures from RAM to the GPU takes time, letting the driver recompress to the card-native format to maximize GPU memory space takes cycles, etc. Game->GPU is going to eat a few context switches as well, so faster clocks mean lower latency between the end of one frame and the start of the next. Slower context switches are going to show up no matter what the FPS is, as they add a constant number of microseconds to each frame. It's close enough for a buying guideline.
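One way to sanity-check that "constant microseconds per frame" idea: a fixed per-frame cost shaves more FPS the higher the base frame rate, which is consistent with a small persistent gap even in GPU-bound runs. The 200 µs overhead here is an illustrative guess, not a measurement.

```python
# Toy model: add a fixed per-frame overhead (e.g. slower context switches)
# to an otherwise CPU/GPU-determined frame time and see what it costs.
def fps_with_overhead(base_fps, overhead_us):
    frame_ms = 1000.0 / base_fps + overhead_us / 1000.0
    return 1000.0 / frame_ms

for base in (60, 144, 240):
    print(base, "->", round(fps_with_overhead(base, overhead_us=200), 1))
# 60 -> 59.3, 144 -> 140.0, 240 -> 229.0: the same 200 us costs well under
# 1 FPS at 60 Hz but about 11 FPS at 240 Hz
```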
|
# ? Sep 19, 2018 03:54 |
|
Looks like z390 motherboard showcase on October 8th, processor launch likely around two weeks after that. https://twitter.com/AorusOfficial/status/1042081436772257792
|
# ? Sep 19, 2018 09:18 |
|
Winks posted:Looks like z390 motherboard showcase on October 8th, processor launch likely around two weeks after that. Well, my wallet will weep a silent tear as I open up for this. I wanted to wait for Ryzen 2, but that's likely a year away, at a minimum. And realistically, for a gaming PC the 9900K is going to be faster anyway. Obviously more expensive, but that extra year of use will justify the cost. My only solace is that since I have a Skylake PC, I can at least salvage the DDR4-3000 in it and save myself 160 bucks on RAM. Based on benchmarks for other Coffee Lakes, it looks like DDR4-3000 is fast enough to not be a bottleneck, unless the 2 extra cores change this significantly.
|
# ? Sep 19, 2018 09:24 |
|
As someone with an X99, memory bandwidth shouldn't be a problem. I still have mine in dual channel mode because there's virtually no performance difference doubling the bandwidth to quad. You just need faster than 2400-15 and then diminishing returns hit hard.
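The back-of-the-envelope numbers behind that, using theoretical peak bandwidth per 64-bit DDR4 channel. Real-world throughput lands well below these figures, which is part of why doubling channels often shows little gaming benefit.

```python
# Theoretical peak DDR4 bandwidth: megatransfers/s x 8 bytes per 64-bit
# channel x number of channels, in GB/s.
def ddr4_bandwidth_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000.0

print(ddr4_bandwidth_gbs(2400, channels=2))  # 38.4 GB/s dual channel
print(ddr4_bandwidth_gbs(2400, channels=4))  # 76.8 GB/s quad channel
```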
|
# ? Sep 19, 2018 10:04 |
|
Anyone else waiting for Cannon Lake? I'm pretty sure my 6600k will work fine for gaming until then.
|
# ? Sep 19, 2018 13:04 |
|
AEMINAL posted:Anyone else waiting for cannon lake? Cannon Lake is never launching on desktop, so you might be waitin' a while! tbh, it's getting to the point that I don't think we will ever see Ice Lake either.
|
# ? Sep 19, 2018 17:09 |
|
Cygni posted:Cannon Lake is never launching on desktop, so you might be waitin' a while! Oh goddamn it!
|
# ? Sep 19, 2018 17:33 |
|
If I wanted to start piecing together stuff for a 9900k what would be the ram I'd want to look at? I have an old ivy bridge, so I know I'd need to upgrade to ddr4. I'd want some good ram, as I'd be overclocking the cpu. I just haven't done a build in 5 years so the specifics to current chips I'm a bit out of date on.
|
# ? Sep 19, 2018 19:25 |
|
Anything faster than 2400-15 is fine, which is pretty much any memory they're taking money for now. With an AMD CPU the fastest memory you could get is around 3600, with an Intel CPU around 4600.
|
# ? Sep 19, 2018 19:30 |
|
I'm guessing I can just plop my existing DDR4 2666mhz memory sticks in the new z390 mobos?
|
# ? Sep 19, 2018 19:54 |
|
Zedsdeadbaby posted:I'm guessing I can just plop my existing DDR4 2666mhz memory sticks in the new z390 mobos? Yes
|
# ? Sep 19, 2018 20:13 |