Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

snickothemule posted:

Hey fellas, with all the self isolation going on I think I'm developing a fresh case of brain worms. I'm looking at a 1660 v4 ES chip (6900K equivalent) that is reported to be overclockable. I understand an engineering sample is a huge risk, but having 2 extra cores over my 6800K could do wonders for some of my workloads in photogrammetry and would give me 40 PCIe lanes instead of the current 28 (which isn't a problem now, but I may end up adding a U.2 drive down the track).

I have a strong sense this is a foolish endeavor but...brain worms.

I keep telling myself to just wait for the 4900x or equivalent and stop horsing around, but it's been years since I've done anything with this machine and I have the itch to muck around.

AFAIK v4s are not overclockable, although the ES specifically may be. What's the s-spec (it will be a four-letter Q-code for an engineering sample)?

The 1660v3 retail chips are overclockable and are down to around $160 on eBay. I would strongly encourage you not to gently caress with ES silicon unless the price is really, really right. Haswell-E performs really close to Broadwell when overclocked, and I don't think you'll notice any difference unless you really push it balls to the wall.

You can probably flip your 6800K for at least $100 or so. A $60 upgrade isn't really a bad deal for two more cores. Zen2 is really well priced and now basically outperforms X99 in all respects on a per-core basis, but you'll be spending more like $270 for the 8-core or $400 for a 12-core. And Zen2 cannot substitute for the PCIe lanes unless you move to TR3000, where the entry level is $1400.

Waiting for the 4000 series is an option too, but prices will bump back up to MSRP, and we'll see whether that's better than the deals you can get on 3000 today, as prices continue to decline. Zen2 is a major performance increment; Zen3 will have to prove its case for its higher prices, just like Coffee Lake does.

Paul MaudDib fucked around with this message at 03:16 on Mar 31, 2020


snickothemule
Jul 11, 2016

wretched single ply might as well use my socks
It's a QK3S chip, sitting at around $280 USD; supposedly it can do 4.2GHz at 1.25v, which has raised my eyebrow a bit. I've also seen another unit that only does the standard clock speed that was also an engineering sample, IIRC. I'm predominantly thinking of selling the 6800K if this chip is OK; mostly I'm interested in gaining the two cores without having to upgrade CPU, MB, and RAM.

I'm still rather happy with how the 6800K performs for its age; for my workloads and for gaming it still hangs in there OK unless there's something that really only depends on clock speed, which is why I'm partial to trying to get a chip closer to 6900K performance with those additional cores.

Those v3s look pretty good, but if I can get this model to OC like a few folks have reported over on Reddit and Overclockers, it might be worth my time, even though Broadwell is such a strange product.

Also, thanks Paul. I appreciate your insight.

snickothemule fucked around with this message at 04:33 on Mar 31, 2020

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
That's like, a basic overclock on Haswell-E. You can do very close to that, if not the same thing, on one of those 1660v3s. Especially given that it's ES silicon - Broadwell was the first 14nm chip and early silicon was not good. Auto voltage on my 5820K gets me 4.13GHz at 1.27v for sure, and I've never hosed around exhaustively to see the exact limit. With sufficient voltage, Haswell-E will do about 4.6 or 4.7GHz and Broadwell-E will do as high as 4.7 or 4.8GHz.

I would strongly encourage you to get a used 1660v3 off eBay instead of spending almost twice as much on an ES. That's crazy and a waste of money; it's one thing if it's basically a $60 upgrade for one component, but at that point you should just pony up for a 3700X or a 3900X. ES chips are known to have weird quirks and may or may not run on your board, since the board needs microcode to support each specific processor. They are worth it if they're cheap, but not at almost $300.

If you want the best shot at good silicon, try sending a message to one of the 1660v3 sellers on eBay and asking them to handpick a chip with a production code that starts with J. Those are all late-production silicon that stands a very good chance of overclocking effortlessly. Don't reveal the wu-tang secret though lol, or they'll start upcharging for them. (Not that there's a huge community of X99 owners, let alone ones still using them, I guess...)

I'd love to own a 6950X too, but it's just not worth the money, even used. Broadwell has always been too expensive and a poor seller compared to Haswell-E, and this has translated into inflated prices on the used market as well. It has very little to offer except for the 10C version: it's barely higher IPC, doesn't clock notably better, and the power consumption difference is minor. It's just not worth paying twice as much for buggy engineering samples.

Paul MaudDib fucked around with this message at 16:28 on Mar 31, 2020

SwissArmyDruid
Feb 14, 2014

by sebmojo

D. Ebdrup posted:

So like I started with saying: we're screwed, and might as well go back 20 years unless we're doing HPC. :P

I mean, AMD keeps eating their lunch, might as well go dump some money into SiFive.

eames
May 9, 2009

Comet Lake-H, Intel’s response to Renoir:

https://www.hd-tecnologia.com/estas-son-las-especificaciones-de-los-nuevos-intel-comet-lake-h-para-portatiles/

Saukkis
May 16, 2003

Unless I'm on the inside curve pointing straight at oncoming traffic the high beams stay on and I laugh at your puny protest flashes.
I am Most Important Man. Most Important Man in the World.
I'm surprised there hasn't been talk of Intel's new Xeon R-line processors, their weird subterfuge to avoid officially dropping CPU prices. Feels like they are finally starting to feel pressure from AMD.

We were planning to purchase a new server, and the Xeon Gold 6248 was the prime candidate, when we noticed the 6248R at a slightly lower price. But instead of a cheaper 6248, the new processor is more like a rebranded Platinum 8268, a chip twice the price of the 6248.

Intel Xeon Gold 6248R Benchmarks and Review Big Refresh Gains
https://ark.intel.com/content/www/us/en/ark/compare.html?productIds=192481,199351,192446

BlankSystemDaemon
Mar 13, 2009



Probably because most of us don't spend 3000+ USD on a CPU?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
I've noticed it, in a quote from HPE... we were comparing EPYC and Intel, and bam, the new part was right there in the quote, with a much lower price than I've seen before from Intel.
There's not much to talk about though; AMD still has the better product in general. Prejudice is rampant, however.

Rastor
Jun 2, 2001

Cygni posted:

Cyrix has been dead for 20 years, my dude. NatSemi bought them, then flipped them to VIA, who used the IP to make their low-power C3 chips. They were still repackaging it for the embedded market, even making a quad-core version called Eden X4, but it's not in production anymore. They are using ARM now.

Rastor posted:

Centaur still Hauls, in China




This is interesting: Gamers Nexus smuggled one of the Cyrix/Centaur/VIA/Zhaoxin x86 systems out of China

https://www.youtube.com/watch?v=-DanhnASClQ


Edit: but this is one of the old VIA Nano designs, not the new microarchitecture :(

Rastor fucked around with this message at 14:03 on Apr 2, 2020

Cygni
Nov 12, 2005

raring to post

Rastor posted:

This is interesting: Gamers Nexus smuggled one of the Cyrix/Centaur/VIA/Zhaoxin x86 systems out of China

https://www.youtube.com/watch?v=-DanhnASClQ


Edit: but this is one of the old VIA Nano designs, not the new microarchitecture :(

I'm legit surprised they even got the thing to RUN some of those tests. That is such an ancient arch, and it was low performance when it was brand new... a decade ago.

In other news, the first Tiger Lake U (with Xe) 3DMark runs with everything actually running at some sort of real clock speed are showing up:

https://twitter.com/TUM_APISAK/status/1245899475991662592

5% ahead of the Ryzen 4800U in the graphics score, and the CPU is running at a locked 3GHz and not turboing, so there is probably more performance to come. On one hand, that's good for an IGP... the best for any laptop ever. On the other hand, it's not really the revolution some were hoping for.

People got excited about IGP and it will probably still be disappointing, I know, shocking.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Cygni posted:

People got excited about IGP and it will probably still be disappointing, I know, shocking.

Says you. I've been waiting for this moment for years. In one fell swoop, Intel and AMD have killed off Nvidia MX GPUs, and I couldn't be more pleased.

BlankSystemDaemon
Mar 13, 2009




Die-shot from AnandTech featuring 8 cores of Coffee Lake silicon.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

SwissArmyDruid posted:

Says you. I've been waiting for this moment for years. In one fell swoop, Intel and AMD have killed off Nvidia MX GPUs, and I couldn't be more pleased.

Agreed. The MX series always seemed to be highly compromised devices: not really performant enough to do much, but they sucked a lot more power than an iGPU, so they'd hurt battery life considerably. It always felt like you never got enough of anything to make it a legit good option, instead of "well, this is what we've got." If an iGPU can replace the MX entirely, that's all the better for the thin-and-light laptops I enjoy.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
so if Rocket Lake is expected to have Xe graphics, and that very early leak from Tiger Lake, which does have Xe graphics, suggests performance that's comparable to a Ryzen APU, doesn't that create a situation where any non-F Rocket Lake CPU is going to be able to drive graphics at a level similar to, say, an Athlon 3000G or a Ryzen 2200G?

That seems kind of interesting, I'm kinda thinking from the perspective of even office computers with a Rocket Lake Pentium having a decent iGPU.

Meanwhile Comet Lake on the desktop is still going to be using UHD, right?

eames
May 9, 2009

gradenko_2000 posted:

so if Rocket Lake is expected to have Xe graphics, and that very early leak from Tiger Lake, which does have Xe graphics, suggests performance that's comparable to a Ryzen APU, doesn't that create a situation where any non-F Rocket Lake CPU is going to be able to drive graphics at a level similar to, say, an Athlon 3000G or a Ryzen 2200G?

That seems kind of interesting, I'm kinda thinking from the perspective of even office computers with a Rocket Lake Pentium having a decent iGPU.

Meanwhile Comet Lake on the desktop is still going to be using UHD, right?

Yep. It’s also interesting because the dedicated Intel GPU card should have 2x or 4x the performance of a single Xe iGPU due to the rumored chiplet design, though there will be differences in CUs and certainly frequency.

Cygni
Nov 12, 2005

raring to post

Comet Lake won’t be available (and reviews are embargoed, bullshit) until May 27th. They are keeping the April 30th “launch date”, but it’s a paper launch.

It was initially supposed to be available in Feb/March, and only one part is really new. Prices had better be great, otherwise it's getting into "might as well wait for Zen3" territory.

Sipher
Jan 14, 2008
Cryptic
I've been tossing around the idea of a PC refresh whenever Nvidia's next x80 Ti comes out, but with the pandemic I'm bored and eyeing Comet Lake. Good idea, or wait it out, since the x80 Ti is probably a year out still? I'm not sure what the next proc after Comet Lake will bring. Currently on an i7-6700K and a 1080 Ti.

Sipher fucked around with this message at 20:53 on Apr 6, 2020

Cygni
Nov 12, 2005

raring to post

Sipher posted:

I've been tossing around the idea of a PC refresh whenever Nvidia's next x80 Ti comes out, but with the pandemic I'm bored and eyeing Comet Lake. Good idea, or wait it out, since the x80 Ti is probably a year out still? I'm not sure what the next proc after Comet Lake will bring. Currently on an i7-6700K and a 1080 Ti.

I'm in the same boat. The rumor is that the next x80 Ti is coming this year. The next Intel CPU is Rocket Lake, which is Tiger Lake backported to 14nm. So finally new features, but likely the same temp/power issues keeping a lid on performance.

I was gonna get the 10-core Comet Lake, but now I'm thinking I'll get the 3080 Ti instead and see what Zen 3 has to offer.

On the 6700k, GN has a good comparison video.

https://www.youtube.com/watch?v=LCV9yyD8X6M

movax
Aug 30, 2008

Sipher posted:

I've been tossing around the idea of a PC refresh whenever Nvidia's next x80 Ti comes out, but with the pandemic I'm bored and eyeing Comet Lake. Good idea, or wait it out, since the x80 Ti is probably a year out still? I'm not sure what the next proc after Comet Lake will bring. Currently on an i7-6700K and a 1080 Ti.

I think I'm probably stuck holding out in the 12-18 month timeframe for Intel HEDT to be competitive against Zen 3; Zen 2 Threadripper at work has spoiled me, but I've always run Intel at home. But if they can't get their poo poo together... :btroll: :rajatear:

EdEddnEddy
Apr 5, 2012



It still slightly puzzles me when people say their 6xxx-series or newer unlocked chips need an upgrade. Outside of some big multithreaded workloads, how big, really, is the performance difference between those and a super current chip in real-world usage?

I know benchmarks can show the difference, but at the same time, actual personal experience with the hardware can sometimes feel a bit closer than the numbers may suggest.

I built a few Ryzen 3700/3800X systems recently, and while they were faster, they didn't feel exponentially faster than my ancient rig. Hell, even running a few benchmarks only threw them a stone's throw ahead, since we were all using similar GPUs too. (Though it was a 2080 Super vs my non-Super.)

I can still max out ultrawide gaming, run a VM, and do some remote work at the same time on my now-ancient X79 1660v1 with a 2080. I can imagine stuff being faster and better, but I wonder truly by how much.

Don't get me wrong, I would love to upgrade someday, but having recently become house poor, I'm pretty much stuck with this beast for at least a few more years.

LRADIKAL
Jun 10, 2001

Fun Shoe
A 4- or 6-core chip can be a performance bottleneck in certain multitasking use cases like live streaming and certain games. If you do any sort of video encoding or software rendering, each extra core is that much more speed.

The territory you speak of, the less obvious stuff, is still a pretty big difference: power consumption, memory speed/latency, USB 3/C, and NVMe also add up to big gains.

EdEddnEddy
Apr 5, 2012



LRADIKAL posted:

A 4- or 6-core chip can be a performance bottleneck in certain multitasking use cases like live streaming and certain games. If you do any sort of video encoding or software rendering, each extra core is that much more speed.

The territory you speak of, the less obvious stuff, is still a pretty big difference: power consumption, memory speed/latency, USB 3/C, and NVMe also add up to big gains.

Yeah, I know about more cores for multithreaded-heavy tasks, which has prompted me to debate grabbing a 1680v2 for this old rig, but for the most part I can still play and livestream 1080p60 rather well, which is nice, though it is using the 2080's NVENC encoder vs software.

As for all the other things, this old thing has NVMe, an add-on USB 3.0 card for VR, as well as the onboard 3.0 ASMedia stuff that's hit or miss for sure.

I think the biggest benefit might be power consumption for me; however, it will probably remain about the same with whatever new build, since I'd stick with HEDT and any power savings I get will be canceled out by all the cores, among other things. Idle might be better though.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


EdEddnEddy posted:

It still slightly pizzles me when people say their 6xxx series or newer unlocked chips need an upgrade. Outside of some big multithreaded workloads, how big really is the performance difference between those and a super current chip in real world usage?

I know Benchmarks can show the difference, but at the same time, actual personal experience with the hardware can sometimes feel a bit closer than the numbers may suggest.

I built a few Ryzen 3700/3800X systems recently and while they were faster, they didn't feel that exponentially faster than my ancient rig. Hell even running a few Benchmarks only threw them a stones throw above since we were all using similar GPU's too. (Though it was a 2080 Super vs my non Super).

I still can max out Ultrawide gaming, run a VM, and do some remote work at the same time on my now ancient x79 1660v1 with a 2080. I can imagine stuff being faster and better, but I wonder truly by how much.

Don't get me wrong I would love to upgrade someday, but being I recently became House Poor, I'm pretty much stuck with this beast for at least a few more years.

The key for a 3700X build is fast-as-hell RAM, which might also benefit an Intel build. I've used some 6-core Intel builds with a single stick of RAM and an iGPU, because work computers. They work fine but are noticeably slower to respond. Then again, I spent more on parts for my gaming AMD build than the Dell I'm using at work cost, so I have way better RAM and a faster NVMe drive, and it's not a fair comparison. I think I'd be as happy with my work 6-core i5 desktop with more RAM and my RX 5700, though. I just like to run stuff in the background, like watching a Netflix stream on another monitor, which the extra cores certainly smooth out.

I think most people are just being told to get good RAM with a Ryzen. My 3600CL18 RAM takes longer from power-on to the BIOS splash screen than from BIOS to the Windows desktop. If I set my RAM to 2666CL16, which is more typical in an Intel build, it takes significantly longer to get into Windows (it's still fast as all hell). Infinity Fabric can't be that big a game changer, but without getting a feel myself I can't really comment, other than looking at benchmarks - and we all know average and max FPS aren't as important as worst-case response time, which is when things get frustrating.
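A quick back-of-the-envelope check (not from the thread, just arithmetic on the numbers above): converting CAS latency from cycles to nanoseconds shows why the 3600CL18 kit isn't actually "slower" than 2666CL16 despite the higher CL number. A minimal Python sketch:

```python
# First-word CAS latency in nanoseconds.
# DDR transfers twice per clock, so the real clock is (MT/s) / 2,
# and one cycle lasts 2000 / (MT/s) nanoseconds.
def cas_latency_ns(transfer_rate_mts: int, cl: int) -> float:
    return cl * 2000 / transfer_rate_mts

for rate, cl in [(3600, 18), (2666, 16), (3200, 16)]:
    print(f"DDR4-{rate} CL{cl}: {cas_latency_ns(rate, cl):.1f} ns")

# DDR4-3600 CL18: 10.0 ns
# DDR4-2666 CL16: 12.0 ns  <- higher absolute latency despite the lower CL
# DDR4-3200 CL16: 10.0 ns
```

So the 3600CL18 kit has lower absolute latency and far more bandwidth; the CL number alone is misleading without the clock.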

MiniSune
Sep 16, 2003

Smart like Dodo!

LRADIKAL posted:

A 4- or 6-core chip can be a performance bottleneck in certain multitasking use cases like live streaming and certain games. If you do any sort of video encoding or software rendering, each extra core is that much more speed.

This I would agree with. Where the 6600K sits in benchmarks isn't great in comparison, but it's often quite playable. Sadly, I'm on a Cities: Skylines kick at the moment, and CPU usage is getting right up there for large cities. Ditto for Gal Civ 3 in the endgame, so more cores and hertz and IPC would be welcome. Unless their engines are lovely, then drat.

quote:

The territory you speak of, the less obvious stuff, is still a pretty big difference: power consumption, memory speed/latency, USB 3/C, and NVMe also add up to big gains.

My Z170X Gaming 7 board with a 6600K has DDR4 RAM running at 3200 fine, plus USB 3.1 & C and NVMe support. It's old, but not that old.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
My 6700K has been nowhere near taxed in any of my games; I play at 4K60 and it's always the GPU doing most of the work. The 6700K is easily managing workloads at 60 frames so far. It's got a few years yet.

pixaal
Jan 8, 2004

All ice cream is now for all beings, no matter how many legs.


Zedsdeadbaby posted:

My 6700K has been nowhere near taxed in any of my games; I play at 4K60 and it's always the GPU doing most of the work. The 6700K is easily managing workloads at 60 frames so far. It's got a few years yet.

The CPU load at 1080p and 4K should be nearly identical. The CPU load will increase if you go from 60fps to 144fps.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
I know, that's why I referred to the 60fps in particular. For a guy like me that's just swell enough, but peeps who game at 120+ fps will obviously need a beefier CPU sooner than I will.

Indiana_Krom
Jun 18, 2007
Net Slacker
I had a 7700K feeding my GTX 1080 video card at 1080p high-Hz, and in several newer games it consistently couldn't keep up, making the GPU idle down. The next couple of generations of video cards are definitely going to leave 4-core CPUs behind with anything past mid-range.

LRADIKAL
Jun 10, 2001

Fun Shoe

Zedsdeadbaby posted:

I know, that's why I referred to the 60fps in particular. For a guy like me that's just swell enough, but peeps who game at 120+ fps will obviously need a beefier CPU sooner than I will.

Also, people who are sensitive to microstutter and transient fps drops still like stuff a little more cutting-edge.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness
5820k for life!

movax
Aug 30, 2008

DrDork posted:

5820k for life!

If I were less lazy I'd post the "friendship with 2600K ended, now 3960X is my new friend" meme here.

so many threads, so much RAM

Sphyre
Jun 14, 2001

kind of want to move my machine from an O11 Dynamic to an NCASE M1. should I buy an ITX mobo in anticipation of Z390 being phased out soon?

i suppose what i'm askin is how imminent is the new intel chipset

Sphyre fucked around with this message at 08:09 on Apr 8, 2020

Nutsak
Jul 21, 2005
All balls.

Zedsdeadbaby posted:

I know, that's why I referred to the 60fps in particular. For a guy like me that's just swell enough, but peeps who game at 120+ fps will obviously need a beefier CPU sooner than I will.

I'm running a 4790K with a 1070. If I run CoD: Warzone at 4K my GPU usage is 99% and my CPU is 65%-ish, and I get about 40fps. If I run at 1080p my GPU is 95% and my CPU is about 50%, and I cruise at over 100fps. This CPU just doesn't want to be replaced.

track day bro!
Feb 17, 2005

#essereFerrari
Grimey Drawer

DrDork posted:

5820k for life!

I bought a used 5820K and an ASUS X99 WS board and it's been the best build I've had. Wouldn't mind one of those Haswell-E 8-core Xeons, but the cheapest one is in the US and it adds a huge shipping cost.

I've seen second-hand 5960Xs go for around £200, but then that's in line with some of the modern Ryzen stuff. Although maybe if they drop a bit more in price.

iospace
Jan 19, 2038


EdEddnEddy posted:

It still slightly puzzles me when people say their 6xxx-series or newer unlocked chips need an upgrade. Outside of some big multithreaded workloads, how big, really, is the performance difference between those and a super current chip in real-world usage?

I know benchmarks can show the difference, but at the same time, actual personal experience with the hardware can sometimes feel a bit closer than the numbers may suggest.

I built a few Ryzen 3700/3800X systems recently, and while they were faster, they didn't feel exponentially faster than my ancient rig. Hell, even running a few benchmarks only threw them a stone's throw ahead, since we were all using similar GPUs too. (Though it was a 2080 Super vs my non-Super.)

I can still max out ultrawide gaming, run a VM, and do some remote work at the same time on my now-ancient X79 1660v1 with a 2080. I can imagine stuff being faster and better, but I wonder truly by how much.

Don't get me wrong, I would love to upgrade someday, but having recently become house poor, I'm pretty much stuck with this beast for at least a few more years.

I think part of it may have to do with the vulnerability mitigations. While your average consumer may not feel the performance drop from them, and even some high-end users for that matter, I think the peace of mind of using a chip that has hardware defenses against the recent big-name attacks is driving some of it.

Or it gives them an excuse to upgrade earlier than planned anyway, even if they didn't notice :shrug:
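As an aside, if you want to see exactly which of those mitigations your own chip is carrying, the Linux kernel reports per-vulnerability status in sysfs. A minimal sketch (assumes Linux; the path below doesn't exist on Windows or macOS):

```python
from pathlib import Path

# The kernel exposes one status file per known CPU vulnerability
# (Meltdown, Spectre variants, MDS, etc.). Linux-only.
VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

for entry in sorted(VULN_DIR.iterdir()):
    status = entry.read_text().strip()
    print(f"{entry.name:24} {status}")

# "Not affected" means hardware immunity or an in-silicon fix;
# "Mitigation: ..." means the kernel is paying a software cost
# to cover for the CPU.
```

Newer steppings tend to show more "Not affected" lines, which is the hardware-defense point above made concrete.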

B-Mac
Apr 21, 2003
I'll never catch "the gay"!

Sphyre posted:

kind of want to move my machine from an O11 Dynamic to an NCASE M1. should I buy an ITX mobo in anticipation of Z390 being phased out soon?

i suppose what i'm askin is how imminent is the new intel chipset

Rumors have the next chipset coming out in the coming months, but nothing official that I've seen. The ASRock Phantom Gaming ITX is generally considered the best mITX Z390 board; it at least has the best VRM and 3 fan headers. I'm using it with a 9900K in the NCASE and it works great.

I believe the next best is the ASUS ROG.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

iospace posted:

I think part of it may have to do with the vulnerability mitigations.

The real pros here are using chips old enough to not be vulnerable to some of those issues in the first place! :smug:

LRADIKAL
Jun 10, 2001

Fun Shoe

Nutsak posted:

I'm running a 4790K with a 1070. If I run CoD: Warzone at 4K my GPU usage is 99% and my CPU is 65%-ish, and I get about 40fps. If I run at 1080p my GPU is 95% and my CPU is about 50%, and I cruise at over 100fps. This CPU just doesn't want to be replaced.

If the game is maxing out individual cores, then you may benefit from an upgrade. In that scenario, the CPU will never hit 100% total.

Also, playing at an atrocious frame rate is easier on the CPU. Most people aim for 60fps.
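One way to actually verify the "maxing out individual cores" case, rather than trusting the total in Task Manager, is to sample per-core load while the game runs. A rough sketch using the third-party psutil package (an assumption on my part, not something anyone in the thread used; pip install psutil):

```python
import psutil  # third-party: pip install psutil

# Sample per-core utilization once a second while the game is running.
# One core pinned near 100% while the overall average stays modest is
# the classic main-thread bottleneck: total CPU never reaches 100%,
# but the game is CPU-limited all the same.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    avg = sum(per_core) / len(per_core)
    print(f"avg {avg:5.1f}%  busiest core {max(per_core):5.1f}%")
```

Run it alongside the game: if the busiest core sits at ~100% while the average hovers around 50-65%, that matches the Warzone numbers above and a faster-per-core CPU would help more than extra cores.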


Nutsak
Jul 21, 2005
All balls.
It's certainly something I'd test; I'd honestly never thought to check each core. The issue I have when I think of upgrading my CPU is whether to hold out for the Covid-19 stuff to end (potential price drops) and get a 10-series because of the new socket, or just bite the bullet, get a 9-series, and hope I don't need to upgrade in the next couple of years.
