BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

BurritoJustice posted:

I'm itching to get a 9900K, something Z390 with a PLX chip and some 4000MHz+ RAM. What are the best estimates for the next gen release date at the moment?

I read somewhere about a ~surprise~ release date of the 14th, but we'd have seen more leaks by now. Figure on the first or second week of October, with 'leaks' leading up to it.


Cygni
Nov 12, 2005

raring to post

BurritoJustice posted:

I'm itching to get a 9900K, something Z390 with a PLX chip and some 4000MHz+ RAM. What are the best estimates for the next gen release date at the moment?

I don't remember any Z370 boards with a PLX chip. There might be some out there from Supermicro or somethin, but Asus def didn't offer one even on the Maximus skus when I was lookin. What's the use case you are looking for? You might be better served by Threadripper or Skylake-X if you need piles of PCIe lanes.

Also I wouldn't recommend the 4000mhz baller RAM. Intel platforms generally plateau around 3000mhz, if not sooner.

Aeka 2.0
Nov 16, 2000

:ohdear: Have you seen my apex seals? I seem to have lost them.




Dinosaur Gum
I asked before but I don't think I saw an answer. Can I put a 9700K in a Z370 board to replace my 8700k? Or will features be locked down without a Z390 board?

Cygni
Nov 12, 2005

raring to post

We don’t know yet. Likely will be able to use a Z370 to OC, based on the leaks.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender

Aeka 2.0 posted:

I asked before but I don't think I saw an answer. Can I put a 9700K in a Z370 board to replace my 8700k? Or will features be locked down without a Z390 board?

Yes, the 9700k/9900k chips are supported on Z370 boards. There should be a bios update from your board manufacturer that adds support for them.

BurritoJustice
Oct 9, 2012

Cygni posted:

I don't remember any Z370 boards with a PLX chip. There might be some out there from Supermicro or somethin, but Asus def didn't offer one even on the Maximus skus when I was lookin. What's the use case you are looking for? You might be better served by Threadripper or Skylake-X if you need piles of PCIe lanes.

Also I wouldn't recommend the 4000mhz baller RAM. Intel platforms generally plateau around 3000mhz, if not sooner.

All the manufacturers are waiting on Z390 to do their halo tier boards, gigabyte told me ages ago that their PLX equipped Gaming 9 will release with Z390 and it's a similar story for other manufacturers. I SLI and need a thunderbolt card as well and want to run an Optane drive without running out of lanes. I also have a PCIE USB controller for my audio interface as it doesn't like to play nice if it shares a USB bus. PLX boards just mean way less headache when everything gets max lanes.

Intel scales linearly all the way to 4000+ in memory bottlenecked single thread games (think open world), and the super high memory speeds provide the ultra smooth experience that everyone chases with the 5775C. At DDR4-4000 it's actually faster than the eDRAM on the 5775C. Great for 99th percentile frame times and stability, plus major FPS boosts in CPU intensive AAAs. I've been waiting 7 years to upgrade so I'm not going to skimp out. Good for the stability and load speeds of my silly 100GB Skyrim installs too.

"Memory speed doesn't matter on Intel" is a common theme in the part picking thread and I honestly don't know why.

E: I'm not saying it's cost effective but it will be going into my custom water-cooled SLI computer and will definitely be the bottleneck and the cheapest part at that. I'm hoping to get a long, long time out of it. Fun unrelated fact, my 4TB Seagate is dying a noisy painful death. I can hear it across the room while it's transferring data.

BurritoJustice fucked around with this message at 16:24 on Sep 11, 2018
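For anyone who wants to sanity-check the bandwidth half of that claim, here's the napkin math (a sketch assuming dual-channel operation; the ~50 GB/s figure for the 5775C's eDRAM is the commonly cited per-direction number, not something from this thread):

```python
# Theoretical peak DDR4 bandwidth: transfers/s x 8-byte bus width x channels.
def ddr4_peak_gbps(mt_per_s, channels=2):
    return mt_per_s * 1e6 * 8 * channels / 1e9

for speed in (3000, 4000):
    print(f"DDR4-{speed} dual channel: {ddr4_peak_gbps(speed):.0f} GB/s peak")

# DDR4-3000 dual channel: 48 GB/s peak
# DDR4-4000 dual channel: 64 GB/s peak
# vs roughly 50 GB/s per direction for the 5775C's eDRAM (assumed figure)
```

So purely on peak numbers, dual-channel DDR4-4000 does edge out the eDRAM, though latency is a separate story.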

B-Mac
Apr 21, 2003
I'll never catch "the gay"!
Have a link to the 4000mhz ram gaming benchmarks? I’m curious to see how the performance is vs 3000/3200.

BurritoJustice
Oct 9, 2012

B-Mac posted:

Have a link to the 4000mhz ram gaming benchmarks? I’m curious to see how the performance is vs 3000/3200.

Off the top of my head, this. There are many more but I am lazy. Digital foundry does live comparisons with frametime charts so you can see the difference in realtime, check their youtube channel.

Cygni
Nov 12, 2005

raring to post

My experience with real world testing has been that ram speed made next to no difference at anything over 1080p. I game at 1440p/60 with an 8700/32gb Gskill 3200mhz. Anything over like 2666mhz made like no difference. I think my results at 3200mhz were actually slightly slower even.

Maybe if you are shooting for like 1080p/240hz or somethin?

e:

BurritoJustice posted:

Off the top of my head, this. There are many more but I am lazy. Digital foundry does live comparisons with frametime charts so you can see the difference in realtime, check their youtube channel.

That link def doesn't line up with my experience, but maybe I'm wrong.

The Illusive Man
Mar 27, 2008

~savior of yoomanity~
For gaming at 4K 60 Hz*, I have little to no incentive to upgrade my 6700K, correct? I’m tempted to pick up a 9900K just for fun, but I really doubt I’d see real world benefits. I do mess around with VMs but rarely more than one at a time.

I’m sure the 6700K will become less tenable in a few years, but I’m trying to follow the sensible advice of ‘only upgrade when you have a need’.

*Assuming I ever get my GPU RMA debacle ironed out.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
4K is almost always going to be GPU-limited unless you have an absolutely cutting-edge GPU/multiple cutting-edge GPUs. A G4560 performs pretty much the same as a 7700K at 4K.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord

Paul MaudDib posted:

4K is almost always going to be GPU-limited unless you have an absolutely cutting-edge GPU/multiple cutting-edge GPUs. A G4560 performs pretty much the same as a 7700K at 4K.

Not really :thunk:

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

BurritoJustice posted:

Off the top of my head, this. There are many more but I am lazy. Digital foundry does live comparisons with frametime charts so you can see the difference in realtime, check their youtube channel.

Saw that tech spot before, I’ll have to search digital foundry again. I remember them doing ram speed tests but not with speeds that high.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
I’m having trouble finding HEVC benchmarks weighing RAM speeds and cores. I know that the i9 CPUs sweep the i7-8700K, but I’m not $1000+ CPU crazy enough to do it. However, an extra $100 for RAM that could buy me another 10% in FPS on encodes is pretty nice. In any case, so far I’m seeing that Ryzen is just not that great in Handbrake compared to similarly priced Intel CPUs and that tilts me back toward Intel... unless I’m being misled and a 2700X is spanking the i7-8700K (because I haven’t seen any encoding bench that made it even close).

Plodding along with my old E3-1230 Sandy Bridge is humbling.

redeyes
Sep 14, 2002

by Fluffdaddy
RAM isn't going to do poo poo for encodes unless you are already maxed out loading just your normal programs. AMD has crap AVX performance and that's why Intels are way faster at handbrake encodes.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

He's right, there are only two games that bottleneck my 6600k at 4k and they're both trash ports (mhw, as:o)

Every other game hits 100% usage on the 1080ti long before the cpu can reach 60-70%

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
He was talking about a 2 core chip being good enough for 4k, which is not correct though.

4 cores will obviously have more runway between playable and unplayable, depending on the engine used. Frostbite games are getting to that point it seems like

Rabid Snake
Aug 6, 2004



Zedsdeadbaby posted:

He's right, there are only two games that bottleneck my 6600k at 4k and they're both trash ports (mhw, as:o)

Every other game hits 100% usage on the 1080ti long before the cpu can reach 60-70%

You are being disingenuous. A 2 core chip isn't the same as a 4 core chip, let alone the bottleneck of the clock speed of that Pentium. I have a 1080 Ti. Jumping from a Skylake 6700k to an 8700k was huge for 3440x1440p @ 120hz. I'd imagine 4k would be the same. Yes, higher resolutions tend to be GPU bound, but come on dude. Especially if you're pushing a high refresh rate monitor, you'll notice the minimums easily.

You add a better CPU and you'll easily have 4k @ 60hz. I know you aren't pushing that on a 6600k because I wasn't pushing 120hz @ 3440x1440p on a 6700k until I upgraded to a 8700k OCed to 5 Ghz

Rabid Snake fucked around with this message at 14:56 on Sep 16, 2018

Farmer Crack-Ass
Jan 2, 2001

this is me posting irl
What's happening with higher resolutions that requires more CPU power in addition to more GPU power?

redeyes
Sep 14, 2002

by Fluffdaddy

Farmer Crack-Ass posted:

What's happening with higher resolutions that requires more CPU power in addition to more GPU power?

Probably stuff like LoD scaling.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
The last stat I saw that's of material importance for CPUs in games was minimum FPS, which really is an important one. Until reviewers start reporting percentiles instead of average or geometric mean FPS I can't precisely say whether that minimum is statistically significant either.

I wasn't really aware of anything interesting happening even anecdotally for higher FPS near 3440x1440 involving CPUs though, and presumed it was mostly the 4K resolution putting the crunch on the CPU. Guess I'll get some good use out of the 9700k coming up, although I'd pay extra for more threads to throw at my x265 jobs.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
It's kind of annoying that people talk about resolutions being easy or difficult for a CPU to drive, when it's really the framerate that matters. 120 Hz 3440x1440 is not any easier to drive than 120 Hz 1920x1080, what matters is the refresh rate you're trying to drive.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Paul MaudDib posted:

It's kind of annoying that people talk about resolutions being easy or difficult for a CPU to drive, when it's really the framerate that matters. 120 Hz 3440x1440 is not any easier to drive than 120 Hz 1920x1080, what matters is the refresh rate you're trying to drive.

Huh? I can't parse this comment right now.

Are you saying that 3440x1440 is easier to drive than 1920x1080? Both resolution and refresh matter, as do min frame times, etc.

Anime Schoolgirl
Nov 28, 2002

save for extreme cases (such as using an Atom or AM1-based CPU), higher resolutions don't appreciably tax the CPU, as simply pumping out that many pixels is a function of GPU bandwidth alone

hobbesmaster
Jan 28, 2008

Paul MaudDib posted:

It's kind of annoying that people talk about resolutions being easy or difficult for a CPU to drive, when it's really the framerate that matters. 120 Hz 3440x1440 is not any easier to drive than 120 Hz 1920x1080, what matters is the refresh rate you're trying to drive.

3440*1440*120 > 1920*1080*120, I'm not sure what you're getting at?
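Putting numbers on that inequality actually shows the distinction Paul is drawing: resolution times refresh is pixel throughput (GPU-side work), while the CPU runs the game loop once per frame regardless of how many pixels each frame has. A quick sketch:

```python
# Pixel throughput scales with resolution x refresh (GPU-side work);
# the CPU's game loop runs once per frame either way, so both targets
# cost the CPU the same 120 iterations per second.
def pixels_per_second(width, height, hz):
    return width * height * hz

ultrawide = pixels_per_second(3440, 1440, 120)
full_hd = pixels_per_second(1920, 1080, 120)
print(f"{ultrawide:,} vs {full_hd:,} px/s "
      f"-> {ultrawide / full_hd:.2f}x the GPU work, same CPU work")
```

So the inequality is real, but it lands on the GPU side of the ledger.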

mystes
May 31, 2006

Maybe Paul MaudDib is talking about games and saying something like "an increased resolution mainly creates work for the GPU, whereas increasing the fps means that the game has to go through its main loop faster, which increases the load on the CPU"? Although I don't think most games are cpu bound these days anyway.

Risky Bisquick
Jan 18, 2008

PLEASE LET ME WRITE YOUR VICTIM IMPACT STATEMENT SO I CAN FURTHER DEMONSTRATE THE CALAMITY THAT IS OUR JUSTICE SYSTEM.



Buglord
Strategy games in general are certainly cpu bound, mmo's would likely be as well

Theris
Oct 9, 2007

I think Paul is saying that in the vast majority of games there isn't much if anything done on the CPU that is resolution dependent. If a CPU can run a game at 144fps at 640x480, it's generally going to be able to run that game at 144fps at 1080p or 1440p or 4k provided the GPU can too. The opposite as well: if a game is cpu limited then dropping resolution isn't going to get you much of a speed up.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Theris posted:

I think Paul is saying that in the vast majority of games there isn't much if anything done on the CPU that is resolution dependent. If a CPU can run a game at 144fps at 640x480, it's generally going to be able to run that game at 144fps at 1080p or 1440p or 4k provided the GPU can too. The opposite as well: if a game is cpu limited then dropping resolution isn't going to get you much of a speed up.

Yeah, this. Lower resolutions happen to be easier for the GPU to run, but talking about "a CPU that is good at 1080p" is actually a misnomer: what you actually mean is a CPU that is good at high refresh rates, it's just that it's easier for the GPU to drive high refresh rates at lower resolutions. If you are CPU bottlenecked then your system that does 100fps at 1440p will also do 100 fps at 720p or whatever - there is the same amount of game logic to run regardless of resolution.

It's more sensible to just cut out the whole "well lower resolutions are easier to drive and..." bit and just say "this CPU is good for 200 fps in title X". Whether or not you hit that will of course depend on your GPU as well, but graphics aren't the part that's running on the CPU, so it's a little nonsensical to talk about a CPU in terms of graphical performance.

The performance of two hypothetical systems, one with a 1050 and one with dual 2080 Tis is going to be very different even though they're both "at 4K" and you're obviously going to need a much beefier CPU to keep up with the SLI 2080 Ti system. So cramming these both into the same metaphorical bucket by talking about a CPU's "4K performance" is dumb, what you really mean is HFR/not-HFR and it would be better to just say as much. So instead of saying "good at 1080p" just say "good at HFR" instead, and instead of saying "good at 4K" say "targeting 60fps" instead.

Paul MaudDib fucked around with this message at 23:49 on Sep 18, 2018
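Paul's framing boils down to a one-liner: delivered FPS is roughly min(CPU frame rate, GPU frame rate), and only the GPU term depends on resolution. A toy sketch of that model (every number here is invented for illustration, including the crude assumption that GPU frame rate scales inversely with pixel count):

```python
# Toy model: delivered FPS ~ min(CPU cap, GPU frame rate at this resolution).
# The CPU cap is resolution-independent; only the GPU side scales with pixels.
CPU_CAP_FPS = 100  # hypothetical game-logic limit, same at every resolution

def gpu_fps(base_fps_1080p, width, height):
    """Crude assumption: GPU frame rate scales inversely with pixel count."""
    return base_fps_1080p * (1920 * 1080) / (width * height)

for name, (w, h) in {"720p": (1280, 720), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    delivered = min(CPU_CAP_FPS, gpu_fps(144, w, h))
    bound = "CPU" if delivered == CPU_CAP_FPS else "GPU"
    print(f"{name}: {delivered:.0f} fps ({bound}-bound)")
```

In this made-up example the same CPU is the bottleneck at 720p and irrelevant at 4K, which is exactly why "good at 1080p" really means "good at high frame rates."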

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars


Plaster Town Cop

Paul MaudDib posted:

Yeah, this. Lower resolutions happen to be easier for the GPU to run, but talking about "a CPU that is good at 1080p" is actually a misnomer: what you actually mean is a CPU that is good at high refresh rates, it's just that it's easier for the GPU to drive high refresh rates at lower resolutions. If you are CPU bottlenecked then your system that does 100fps at 1440p will also do 100 fps at 720p or whatever - there is the same amount of game logic to run regardless of resolution.

It's more sensible to just cut out the whole "well lower resolutions are easier to drive and..." bit and just say "this CPU is good for 200 fps in title X". Whether or not you hit that will of course depend on your GPU as well, but graphics aren't the part that's running on the CPU, so it's a little nonsensical to talk about a CPU in terms of graphical performance.

The performance of two hypothetical systems, one with a 1050 and one with dual 2080 Tis is going to be very different even though they're both "at 4K" and you're obviously going to need a much beefier CPU to keep up with the SLI 2080 Ti system. So cramming these both into the same metaphorical bucket by talking about a CPU's "4K performance" is dumb, what you really mean is HFR/not-HFR and it would be better to just say as much. So instead of saying "good at 1080p" just say "good at HFR" instead, and instead of saying "good at 4K" say "targeting 60fps" instead.

This isn't completely true. Ideally the CPU would do the same work no matter the resolution, but when there are CPU benchmarks at different resolutions on an otherwise identical video card, different CPUs end up at different FPS.

https://www.gamersnexus.net/guides/3009-amd-r7-1700-vs-i7-7700k-144hz-gaming

According to this, the r7 1700 is a 200FPS CPU, so anything over 1080p should have completely identical scores. Yet the 1700 is always just a little bit slower, even at completely gpu-bound tasks. I've seen benchmarks where the difference was pronounced even at higher resolutions but I don't have time to go hunting them down tonight.

If I had to guess what it was, it'd be textures. Dynamically loading textures from RAM to the GPU takes time, letting the driver recompress to the card-native format to maximize GPU memory space takes cycles, etc. game->GPU is going to eat a few context switches as well, so faster clocks mean lower latency between the end of one frame and the start of the next. Slower context switches are going to show up no matter what the FPS is, as they add a constant number of microseconds to each frame.

It's close enough for a buying guideline.
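Harik's "constant number of microseconds per frame" point is easy to see numerically: a fixed per-frame cost shaves proportionally more off a high frame rate than a low one, which is one way a "200 FPS CPU" can still trail slightly even in GPU-bound scenes. A sketch (the 0.5 ms overhead figure is made up for illustration):

```python
# A fixed per-frame cost (driver work, context switches) hurts more at
# high frame rates, because it's a larger fraction of a short frame time.
def fps_with_overhead(base_fps, overhead_ms):
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms + overhead_ms)

for fps in (60, 144, 200):
    print(f"{fps} fps + 0.5 ms/frame -> {fps_with_overhead(fps, 0.5):.1f} fps")
```

At 60 fps that hypothetical overhead costs under 2 fps; at 200 fps it costs nearly 20.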

Winks
Feb 16, 2009

Alright, who let Rube Goldberg in here?
Looks like z390 motherboard showcase on October 8th, processor launch likely around two weeks after that.

https://twitter.com/AorusOfficial/status/1042081436772257792

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Winks posted:

Looks like z390 motherboard showcase on October 8th, processor launch likely around two weeks after that.

https://twitter.com/AorusOfficial/status/1042081436772257792

Well my wallet will weep a silent tear as I open up for this. I wanted to wait for Ryzen 2, but that's likely a year away, at a minimum. And realistically for a gaming PC, the 9900K is going to be faster anyway. Obviously more expensive, but that extra year of use will justify the cost. My only solace is that since I have a Skylake PC, I can at least salvage the DDR4-3000 in it and save myself 160 bucks on RAM. Based on benchmarks for other Coffee Lakes, it looks like DDR4-3000 is fast enough to not be a bottleneck, unless the 2 extra cores change this significantly.

craig588
Nov 19, 2005

by Nyc_Tattoo
As someone with an X99, memory bandwidth shouldn't be a problem. I still have mine in dual channel mode because there's virtually no performance difference doubling the bandwidth to quad. You just need faster than 2400-15 and then diminishing returns hit hard.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
Anyone else waiting for cannon lake?

I'm pretty sure my 6600k will work fine for gaming until then.

Cygni
Nov 12, 2005

raring to post

AEMINAL posted:

Anyone else waiting for cannon lake?

I'm pretty sure my 6600k will work fine for gaming until then.

Cannon Lake is never launching on desktop, so you might be waitin' a while!

tbh, it's getting to the point that I don't think we will ever see Ice Lake either.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf

Cygni posted:

Cannon Lake is never launching on desktop, so you might be waitin' a while!

tbh, its getting to the point that I don't think we will ever see Ice Lake either.

Oh goddamn it!

Zotix
Aug 14, 2011



If I wanted to start piecing together stuff for a 9900k what would be the ram I'd want to look at? I have an old ivy bridge, so I know I'd need to upgrade to ddr4. I'd want some good ram, as I'd be overclocking the cpu. I just haven't done a build in 5 years so the specifics to current chips I'm a bit out of date on.

craig588
Nov 19, 2005

by Nyc_Tattoo
Anything faster than 2400-15 is fine, which is pretty much any memory they're taking money for now. With an AMD CPU the fastest memory you could get is around 3600, with an Intel CPU around 4600.
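For translating "2400-15" style kit labels into something comparable: first-word latency in nanoseconds works out to 2000 x CAS / (MT/s), since the clock runs at half the transfer rate. A sketch (the CL values for the faster kits below are typical retail timings I'm assuming, not quotes from specific products):

```python
# First-word latency: CAS cycles divided by the clock (MT/s / 2),
# i.e. latency_ns = 2000 * CL / MT_per_s.
def true_latency_ns(mt_per_s, cas):
    return 2000.0 * cas / mt_per_s

for mt, cl in [(2400, 15), (3200, 16), (3600, 16), (4600, 19)]:
    print(f"DDR4-{mt} CL{cl}: {true_latency_ns(mt, cl):.2f} ns")
```

This is why the diminishing returns hit: the faster kits mostly buy bandwidth, while actual latency only creeps down a few nanoseconds.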

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I'm guessing I can just plop my existing DDR4 2666mhz memory sticks in the new z390 mobos?


Winks
Feb 16, 2009

Alright, who let Rube Goldberg in here?

Zedsdeadbaby posted:

I'm guessing I can just plop my existing DDR4 2666mhz memory sticks in the new z390 mobos?

Yes
