Theris
Oct 9, 2007

lDDQD posted:

I'm not seeing any big wins for it, aside from Project CARS in that review. Other reviews paint an even bleaker picture.

Yeah, I really don't get why people are going so crazy about it. It's marginally better than the 6700K in games, slower in every other usage, and costs ~$100 more when you can find it. I could see an argument for it if you already have an LGA1150 board with an i5 and really feel like you need a CPU upgrade, or maybe if you're upgrading from Sandy/Ivy or earlier, only use your desktop for games, and can get it for MSRP. It just doesn't make sense to me otherwise.


Theris
Oct 9, 2007

Jan posted:

Unfortunately, few application benchmarks ever bother including compilation times nowadays.

Tech Report's review tested GCC compiling the Qt SDK. The 6700K is ~20% faster at stock clocks, just like it is in almost everything else that's not a game.

Overclocking may wipe that advantage out, since the comparably clocked Haswell (4790K) is much closer to the 6700K, but Broadwell C overclocking results are hard to find.
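If you want a rough feel for that kind of number on your own hardware, timing a clean parallel build is easy enough. This is only a sketch: the build directory, the plain make invocation, and the -j values are placeholders, not Tech Report's actual GCC/Qt methodology.

code:

import subprocess
import time

# Placeholder: any configured source tree with a Makefile will do; Tech Report's
# exact Qt SDK / GCC setup isn't reproduced here.
BUILD_DIR = "qtbase"

def timed_build(jobs: int) -> float:
    """Clean, then time a parallel build with the given job count."""
    subprocess.run(["make", "clean"], cwd=BUILD_DIR, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    start = time.perf_counter()
    subprocess.run(["make", f"-j{jobs}"], cwd=BUILD_DIR, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.perf_counter() - start

if __name__ == "__main__":
    for jobs in (4, 8):
        print(f"-j{jobs}: {timed_build(jobs):.1f} s")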

Theris fucked around with this message at 02:17 on Sep 23, 2015

Theris
Oct 9, 2007

Paul MaudDib posted:

Maybe there's some SPARCs or something too?

POWER8 :science:

quote:

Ivy Bridge lost a whoooole lot of clocks over Sandy Bridge (what happened there anyway?)


New, less mature process and the switch away from a soldered IHS.

quote:

Even the loss of 400 MHz would seriously ding Intel's cred in the gaming community with Ryzen snapping at their heels like that.


For the "It's useless garbage because it only gets 180 fps instead of 195 at 720p even though the game in question is GPU limited at any settings anyone would actually play it at" crowd, sure. For the rest of us, not so much. I'll gladly pickup a six core Coffee Lake if it works on my Z170 board and I can get ~4.2+GHz out of it and couldn't care less that it would be imperceptibly slower than my 4.5GHz 6700k at single threaded stuff. Otherwise I'll be building a 6 or 8 core Zen+ system when they drop.

Theris fucked around with this message at 02:28 on Apr 20, 2017

Theris
Oct 9, 2007

Happy_Misanthrope posted:

Eh, I'd say the 8400 has that price/performance ratio locked down right now. The 8350K does win some benchmarks when overclocked, but looking through the benches, its leads are minor while its losses in more multi-threaded apps can be quite large compared to the 8400. The 8400's 2.8 GHz base clock can largely be ignored, it seems.

The 8350K is definitely going to be the CPU of choice for the "I need 700 FPS in CS:GO" crowd and anyone else with a niche single-threaded workload, however.

Theris
Oct 9, 2007

I think Paul is saying that in the vast majority of games there isn't much, if anything, done on the CPU that is resolution dependent. If a CPU can run a game at 144 fps at 640x480, it's generally going to be able to run that game at 144 fps at 1080p or 1440p or 4K, provided the GPU can keep up too. The opposite holds as well: if a game is CPU limited, dropping the resolution isn't going to get you much of a speedup.
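A toy way to picture it: the frame rate you actually see is roughly the minimum of the CPU-limited rate and the GPU-limited rate, and only the GPU side really moves with resolution. Every number below is made up purely to illustrate the shape of it.

code:

# Toy bottleneck model: delivered fps ~= min(CPU-limited fps, GPU-limited fps).
# All numbers here are invented for illustration only.

CPU_LIMIT_FPS = 144.0  # assumed CPU-bound ceiling; roughly resolution-independent

def gpu_limit_fps(width: int, height: int, fps_at_1080p: float = 240.0) -> float:
    """Assume GPU throughput scales inversely with pixel count (a simplification)."""
    return fps_at_1080p * (1920 * 1080) / (width * height)

for width, height in [(640, 480), (1920, 1080), (2560, 1440), (3840, 2160)]:
    fps = min(CPU_LIMIT_FPS, gpu_limit_fps(width, height))
    print(f"{width}x{height}: ~{fps:.0f} fps")

In this made-up scenario the CPU ceiling holds from 640x480 up through 1080p, and dropping resolution in a CPU-limited game buys you nothing, which is the "opposite" case above.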

Theris
Oct 9, 2007

craig588 posted:

I thought the 14+++ thing was a joke, but there it is in Intel's slides.

:allears:

it's fake (or I'm getting whooshed)

Theris fucked around with this message at 14:30 on Jul 10, 2019

Theris
Oct 9, 2007

Sormus posted:

I am the "Products formerly called Comet Lake"

Surprisingly, that's terminology Intel actually uses internally, although it's still suspect because they don't typically do that until after something from that codename family hits retail. (And IIRC they use "formerly known as" rather than "formerly called.")

Anyway, it's less of a red flag than "14+++ nm," even if it's still pretty red.

Edit: it's just "Products formerly <codename>", no "called" or "known as" at all.

Edit 2: the chart actually has it right lol

Theris fucked around with this message at 08:41 on Jul 11, 2019


Theris
Oct 9, 2007

gradenko_2000 posted:

Is it just a production thing where it's not worth "removing" the iGPU?

It's this. Desktop chips are high-leakage laptop chips. It doesn't make sense to have a separate GPU-less die for mid-range desktops, since it's relatively low volume and there'd be nowhere to use the laptop chips that don't make the cut.

Theris fucked around with this message at 11:56 on Jan 16, 2020
