|
4790K@7GHz
|
# ? Jun 14, 2014 05:34 |
|
Avatar+post combo
|
# ? Jun 14, 2014 05:59 |
|
But it probably won't work for long @ 1.8V.
|
# ? Jun 14, 2014 11:30 |
|
Ika posted:But it probably won't work for long @ 1.8V. Err, yeah, those are suicide runs with engineering samples, 2 cores and LN2 cooling. But still goddamn impressive.
|
# ? Jun 14, 2014 17:33 |
|
What's the multiplier limit on haswell k?
|
# ? Jun 14, 2014 20:27 |
|
This might be a kind of weird question but is there anywhere I can find some decent or even comprehensive benchmarks and comparisons between old Core Duo chips like the E5200 and modern low power Celerons like the Celeron 1037U or Celeron G1820? More than the minimal info that sites like CPUBoss have.
|
# ? Jun 14, 2014 20:36 |
|
Cardboard Box A posted:This might be a kind of weird question but is there anywhere I can find some decent or even comprehensive benchmarks and comparisons between old Core Duo chips like the E5200 and modern low power Celerons like the Celeron 1037U or Celeron G1820? More than the minimal info that sites like CPUBoss have. The E5200 was the "Pentium" aka low end version of Wolfdale Core 2 Duo CPUs 6 years ago, so both the 1037U and G1820 parts should handle anything the E5200 could, unless the workload you have in mind specifically needs the additional processor cache the E5200 had. The main difference will be that the Celeron parts also have onboard GPUs. However, the E5200, assuming you already have it, will perform slightly better on most tasks that don't use any newer instructions, so if you were considering replacing one with those Celerons, it wouldn't be worth it.
|
# ? Jun 14, 2014 20:46 |
|
Shaocaholica posted:What's the multiplier limit on haswell k? 80x
|
# ? Jun 14, 2014 21:37 |
|
Cardboard Box A posted:This might be a kind of weird question but is there anywhere I can find some decent or even comprehensive benchmarks and comparisons between old Core Duo chips like the E5200 and modern low power Celerons like the Celeron 1037U or Celeron G1820? More than the minimal info that sites like CPUBoss have. I found it pretty funny that the 1.8Ghz Core Duo in my old 2006-era Macbook Pro gets outperformed by the Celeron 2955U in the Chromebook I bought to replace it.
|
# ? Jun 15, 2014 03:50 |
|
Cardboard Box A's question made me take a look at Intel's low end for the first time in a few years. I see they are blurring the waters between Haswell and Silvermont by using the Pentium and Celeron names for both product lines. Benchmarks indicate that the fastest Silvermont still performs only like 30% of the most crippled Haswell, so this seems pretty shady to me. Really, the distinction between Pentium and Celeron has been a mess ever since they introduced the Core brand, and mixing in Atom makes things worse. They should make Celeron mean high-end Atom and Pentium mean low-end Core.
|
# ? Jun 15, 2014 10:23 |
|
Nintendo Kid posted:The E5200 was the "Pentium" aka low end version of Wolfdale Core 2 Duo CPUs 6 years ago, so both the 1037U and G1820 parts should handle anything the E5200 could, unless the workload you have in mind specifically needs the additional processor cache the E5200 had. The main difference will be that the Celeron parts also have onboard GPUs. I have underclocked and undervolted the E5200 down to 1.2GHz I think, and I still don't think it's anywhere near the low power the 1037U can get to, so I'm sure it can't compete there, but maybe it will do ok once I bring the clock back up a little. Grey Area posted:Cardboard Box A's question made me take a look at Intel's low end for the first time in a few years. I see they are blurring the waters between Haswell and Silvermont by using the Pentium and Celeron names for both product lines. Benchmarks indicate that the fastest Silvermont still performs only like 30% of the most crippled Haswell, so this seems pretty shady to me.
|
# ? Jun 15, 2014 10:36 |
|
Maybe a matter of binning and making a product for every single piece of silicon they put out without regard to clarity? Now that I think about it, what was Pentium before in the Core era exactly? I remember looking at Celerons for a while for stuff cause they seemed like low end Core CPUs at the time, never really thought about how Pentiums fit in. As for benchmarks, for a raw CPU one there's Geekbench. Just going off the 32-bit numbers cause they're there (single/multi core), so yeah, it seems roughly in the same ballpark:
E5200: 1244/2217 (I guess you'd get roughly half that with your underclock though?)
Celeron 1037U: 1471/2583
Celeron G1820:
The 1037U sounded familiar, then I realized I bought one as part of this board a little while ago. I just use it to run WMC and record shows off a HDHomeRun so I can't say much about performance other than it works for that.
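The "roughly half" guess can be sanity-checked with naive linear frequency scaling. This is a rough assumption (memory-bound work scales less than linearly with clock); the scores are from the post above, and the 2.5GHz figure is the E5200's stock clock:

```python
# Naive linear-scaling estimate of an underclocked E5200's Geekbench score.
# Assumption: score scales proportionally with core clock, which is
# optimistic for anything memory-bound, so treat these as ballpark numbers.

STOCK_CLOCK_GHZ = 2.5   # E5200 stock clock
UNDERCLOCK_GHZ = 1.2    # underclock mentioned in the thread
STOCK_SINGLE = 1244     # 32-bit single-core Geekbench score quoted above
STOCK_MULTI = 2217      # 32-bit multi-core score quoted above

scale = UNDERCLOCK_GHZ / STOCK_CLOCK_GHZ  # 0.48

print(f"estimated single-core: {STOCK_SINGLE * scale:.0f}")  # 597
print(f"estimated multi-core:  {STOCK_MULTI * scale:.0f}")   # 1064
```

So "roughly half" (a 0.48 ratio) checks out, give or take the scaling assumption.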
|
# ? Jun 15, 2014 11:32 |
|
japtor posted:Maybe a matter of binning and making a product for every single piece of silicon they put out without regard to clarity? Now that I think about it, what was Pentium before in the Core era exactly? Pentium was the i7 for, like, a decade.
|
# ? Jun 15, 2014 14:08 |
|
Pentium was an absolutely huge brand and it's kind of funny how Intel resurrected it to be the new poo poo-tier of their processors.
|
# ? Jun 15, 2014 15:06 |
|
Cardboard Box A posted:Thanks for your answer. Ah I see, yeah if you're after low power draw, those Celeron parts will outperform that Wolfdale watt-for-watt. It's only at full power draw that the E5200 would come out ahead.
|
# ? Jun 15, 2014 17:15 |
|
mobby_6kl posted:Pentium was an absolutely huge brand and it's kind of funny how Intel resurrected it to be the new poo poo-tier of their processors. I'm waiting for Pentium 'Vee'...
|
# ? Jun 15, 2014 18:14 |
|
Am I correct there is basically no reason to go from a 4.3ghz 3570k to a 4790k for gaming.
|
# ? Jun 16, 2014 00:25 |
Don Lapre posted:Am I correct there is basically no reason to go from a 4.3ghz 3570k to a 4790k for gaming. I don't think I could say "no reason" if I can assume the 3570k in this case won't be getting higher than that, but also assuming the 4790k gets to ~4.8. I would say more like little reason, or a very cost inefficient reason. It's not like there isn't a difference between 500 mhz, +5-10% clock for clock speeds, and even a tiny bump due to hyper-threading for games. But then I'd probably argue that simply overclocking the 3570k further would diminish that since it is already well on its way. This is coming from the guy who was considering dumping a 4670k at 4.5 ghz for a 4790k if I could reasonably hope to hit 5 ghz. So in all honesty if I were in your shoes, I probably would do it, but I fully understand that it's not the best use of money in any way.
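For what it's worth, a back-of-the-envelope number for that upgrade, assuming the 4790k actually reaches ~4.8GHz and taking 7.5% as the midpoint of the 5-10% clock-for-clock figure (both assumptions, and ignoring any hyper-threading benefit):

```python
# Rough multiplicative estimate of the 3570k -> 4790k gaming upgrade.
# Assumptions (not measurements): the 4790k reaches 4.8 GHz, and
# Ivy -> Haswell is worth ~7.5% per clock (midpoint of 5-10%).

current_clock = 4.3   # GHz, the 3570k in question
target_clock = 4.8    # GHz, hoped-for 4790k overclock
ipc_gain = 1.075      # midpoint of the 5-10% clock-for-clock figure

speedup = (target_clock / current_clock) * ipc_gain
print(f"estimated combined speedup: {speedup:.2f}x")  # 1.20x
```

Around 20% on paper, which is real but probably not $400 real, and less if the 3570k gets pushed further.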
|
|
# ? Jun 16, 2014 01:20 |
|
Ignoarints posted:I don't think I could say "no reason" if I can assume the 3570k in this case won't be getting higher than that, but also assuming the 4790k gets to ~4.8. I would say more like little reason, or a very cost inefficient reason. It's not like there isn't a difference between 500 mhz, +5-10% clock for clock speeds, and even a tiny bump due to hyper-threading for games. But then I'd probably argue that simply overclocking the 3570k further would diminish that since it is already well on its way. So what's the chance these will ship on June 20 like amazon says. Bank of america has 10% back on tiger direct.
|
# ? Jun 16, 2014 01:28 |
|
Don Lapre posted:Am I correct there is basically no reason to go from a 4.3ghz 3570k to a 4790k for gaming. I'd just get a 3770K if I were you, as the main difference will be going from an i5 to an i7, unless you really want to pay for a Z97 board. Either way it's going to be a very marginal difference in most games, and you could probably do better with $400+ IMO. Hace fucked around with this message at 01:36 on Jun 16, 2014 |
# ? Jun 16, 2014 01:32 |
|
Don Lapre posted:So what's the chance these will ship on June 20 like amazon says. Bank of america has 10% back on tiger direct. Really? That's pretty dope.
|
# ? Jun 16, 2014 01:56 |
|
So this happened http://www.overclock.net/t/1496007/3570k-ht-unlocked-4c-8t
|
# ? Jun 16, 2014 13:47 |
|
Very interesting, and it's easy to believe after all the evidence, but I somehow doubt it will be replicated, since even forgetting the HT unlock, it was an incredible chip before that. It'll be interesting if he dumps the BIOS to see if there are differences to the stock BIOS, but my guess is this was a manufacturing fluke, and HT was supposed to be locked out in hardware. The people suggesting the 3570K isn't HT capable are a bit naïve; it's pretty certain that Intel just bins and lasers off or locks out in firmware the additional features instead of spinning a new die. This one just seems to be a manufacturing flaw. A freak. That said, if Intel did something silly like requiring the BIOS to respect HT off with certain CPU codes, and it's been possible to enable HT all along with a simple BIOS switch, that would be, uh, something. HalloKitty fucked around with this message at 14:27 on Jun 16, 2014 |
# ? Jun 16, 2014 14:24 |
|
HalloKitty posted:Very interesting, and it's easy to believe after all the evidence, but I somehow doubt it will be replicated, since even forgetting the HT unlock, it was an incredible chip before that. I have a 3570K, so it would be pretty awesome if that was the case.
|
# ? Jun 16, 2014 15:07 |
|
It's running at such ridiculously high clocks that I wonder if electromigration actually "healed" HT back on. Truly a golden chip if I've ever seen one.
|
# ? Jun 16, 2014 15:13 |
|
Yeah as cool as it would be I don't believe you can re-enable disabled features just via bitflips in microcode, if anything it might be an efuse that didn't quite open, or was marginal then failed closed in some manner. I wonder if you can even download microcode from the CPU, or just upload it?
Alereon fucked around with this message at 18:31 on Jun 16, 2014 |
# ? Jun 16, 2014 18:20 |
|
KillHour posted:I have a 3570K, so it would be pretty awesome if that was the case. Nah, you might just get the 2MB of cache back.
|
# ? Jun 16, 2014 18:31 |
|
Alereon posted:Yeah as cool as it would be I don't believe you can re-enable disabled features just via bitflips in microcode, if anything it might be an efuse that didn't quite open, or was marginal then failed closed in some manner. I wonder if you can even download microcode from the CPU, or just upload it? Remember how Intel dabbled with software CPU upgrade unlocks? Intel sold "upgrade cards" for certain low end Sandy Bridge CPU models through retail. You'd buy one, enter the code on it into an Intel website, and it would generate a program that would permanently change your CPU to a different, faster model. (I say "generate" because I believe it was tied to your CPU serial number -- you weren't getting a program that would unlock anybody's CPU.) Most of the time the upgrade was just a clock speed boost, but one of the upgrades Intel offered unlocked hyperthreading.

It's a good guess that on-chip firmware is involved in the feature enable process. Not the microcode, but a pre-boot ROM. It's common to design in protected configuration registers where there's a tiny window of time after powerup or hard reset to write to them, after which they lock their values. The reason for doing this is that fuses are expensive -- they take a lot of die area per bit. So you'd rather compress as much information as you can into a tiny number of fuse bits, then use pre-boot firmware stored in mask ROM to interpret them and do appropriate register configuration.

I'd guess that Intel uses a small number of fuse bits as a model ID code. If that guess is correct, the ROM would contain a table with all the necessary configuration information for every CPU model that particular die design can be, and it would use the ID code stored in fuses to select just one row from the table. (The software upgrade process likely involves blowing one fuse bit to select a different row.)
Getting back to this supposed accidental unlock, I could believe a damaged hidden configuration register getting stuck in the state which permits hyperthreading.
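The fuse-ID-plus-ROM-table scheme described above could be sketched like this. To be clear, everything here is hypothetical: the field width, fuse values, model names, and feature sets are invented for illustration, not Intel's actual layout:

```python
# Hypothetical sketch of the scheme above: a few fuse bits store a model ID,
# and pre-boot ROM expands that ID into a full feature configuration via a
# table lookup. All IDs and rows below are made up for illustration.

MODEL_TABLE = {
    # fuse ID: (marketing name, cores, hyperthreading, L3 cache in MB)
    0b000: ("i7-3770K", 4, True,  8),
    0b001: ("i5-3570K", 4, False, 6),
    0b010: ("i5-3450",  4, False, 6),
    0b011: ("i3-3220",  2, True,  3),
}

def configure_from_fuses(fuse_bits: int) -> dict:
    """Expand a compact fuse-stored model ID into a full feature config,
    the way a pre-boot mask ROM table might."""
    name, cores, ht, l3_mb = MODEL_TABLE[fuse_bits & 0b111]
    return {"name": name, "cores": cores, "hyperthreading": ht, "l3_mb": l3_mb}

# A software "upgrade" would then amount to blowing one fuse bit so a
# different row gets selected on the next boot:
print(configure_from_fuses(0b001))  # 3570K row: hyperthreading off
print(configure_from_fuses(0b000))  # after an "upgrade": hyperthreading on
```

The appeal of this design is exactly what the post says: three fuse bits select from up to eight full configurations, instead of burning a fuse per feature.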
|
# ? Jun 16, 2014 21:07 |
|
Don Lapre posted:Am I correct there is basically no reason to go from a 4.3ghz 3570k to a 4790k for gaming. Very few modern games will be CPU bottlenecked. Load up the game which is underperforming for you and check which component is at full load, then upgrade that component.
|
# ? Jun 16, 2014 22:17 |
|
^ An i7 920, for example, remains a very powerful piece of hardware for modern gaming. It usually benchmarks not significantly lower than even the 4790k, and is held back only by the increasingly archaic features of X58 and a comparatively silly power draw. The only reason to upgrade before Skylake, if you are coming from a 1366 or 2011 socket, is either changing your form factor or because your hardware straight up died. If I wasn't downsizing to mITX, I'd run my 920 until the capacitors blew off. Intel has completely and utterly poo poo the bed on delivering a worthwhile upgrade path now that AMD is no longer a threat. Rime fucked around with this message at 22:52 on Jun 16, 2014 |
# ? Jun 16, 2014 22:49 |
|
BobHoward posted:Remember how Intel dabbled with software CPU upgrade unlocks? Intel sold "upgrade cards" for certain low end Sandy Bridge CPU models through retail. Here we go: http://www.anandtech.com/show/4621/intel-to-offer-cpu-upgrades-via-software-for-selected-models
|
# ? Jun 17, 2014 00:20 |
|
Alereon posted:Yeah as cool as it would be I don't believe you can re-enable disabled features just via bitflips in microcode, if anything it might be an efuse that didn't quite open, or was marginal then failed closed in some manner. I wonder if you can even download microcode from the CPU, or just upload it? New microcode doesn't get saved on CPUs, it's not persistent. It's uploaded during the boot process either from bios or, if you don't want to/can't upgrade the bios, from the OS. In windows I'm guessing it's included as the "cpu driver" or whatever. In linux you can just check dmesg to see what version you have. If you want to know more, this is all in Intel's system programming guide. You want section 9.11. http://www.intel.com/content/dam/www/public/us/en/documents/manuals/64-ia-32-architectures-software-developer-vol-3a-part-1-manual.pdf Longinus00 fucked around with this message at 03:38 on Jun 17, 2014 |
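Besides dmesg, the loaded revision also shows up as a "microcode" field in /proc/cpuinfo on Linux. A small parsing sketch (the sample text and 0x12 revision below are made up for illustration, not from a real machine):

```python
# Extract the loaded microcode revision from /proc/cpuinfo-style text.
# On an actual Linux box you'd feed in open("/proc/cpuinfo").read();
# the sample string here is invented so the sketch is self-contained.

SAMPLE = """\
processor\t: 0
model name\t: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
microcode\t: 0x12
processor\t: 1
model name\t: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
microcode\t: 0x12
"""

def microcode_revisions(cpuinfo_text: str) -> set:
    """Return the set of microcode revisions reported across all cores."""
    revs = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("microcode"):
            revs.add(line.split(":", 1)[1].strip())
    return revs

print(microcode_revisions(SAMPLE))  # {'0x12'}
```

If the set ever has more than one element, some cores took an update and others didn't, which is worth knowing before blaming the BIOS.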
# ? Jun 17, 2014 03:33 |
|
Rime posted:^ An i7 920, for example, remains a very powerful piece of hardware for modern gaming. It usually benchmarks not significantly lower than even the 4790k, and is held back only by the increasingly archaic features of X58 and a comparatively silly power draw. I don't know if I agree with this; from what we know about Haswell-E it's going to be a very sweet platform, especially if you could use the extra cores or your board doesn't support NVMe. I'd be shocked if anything outperforms it for at least a year after its release. I'd definitely be keeping a close eye on it if I was still running an i7-920. I might be a little biased though since I do some scientific computing on my home box. Haswell-E and the DC P3500 can't get here soon enough. Chuu fucked around with this message at 03:42 on Jun 17, 2014 |
# ? Jun 17, 2014 03:37 |
|
Chuu posted:your board doesn't support NVMe. There should be no need for this if you're just plugging directly into a PCIe slot right?
|
# ? Jun 17, 2014 03:44 |
|
coffeetable posted:Very few modern games will be CPU bottlenecked. Load up the game which is underperforming for you and check which component is at full load, then upgrade that component. This is not a reliable way to gauge which component to upgrade for the biggest gains in your game. Many MMOs don't actually tax one's CPU to the fullest on a day-to-day basis, yet switching to a processor with more IPC (such as, perhaps, the switch from a Phenom 980 Black Edition to an i5-3570K) can make them run with far fewer hiccups just from how quickly they burn through awful coding. Or at least that's what I gather from a terrible layman's perspective of programming.
|
# ? Jun 17, 2014 03:49 |
|
Longinus00 posted:There should be no need for this if you're just plugging directly into a PCIe slot right? From what I understand (and I hope someone corrects me if I'm wrong) your bios needs to support NVMe if you want to boot off of it, but if you're just using it as a data drive it doesn't matter.
|
# ? Jun 17, 2014 03:57 |
|
Chuu posted:I don't know if I agree with this; from what we know about Haswell-E it's going to be a very sweet platform.
Yeah, I'm talking from the perspective of Joe Gamer who runs ARMA III as his most CPU-intensive task. If you're doing scientific computing or video / 3D rendering, you're more likely to benefit from the 15% performance upgrade. Just gaming though? The 920 is still making it into the top ten high score lists on most benchmarking sites.
|
# ? Jun 17, 2014 13:09 |
|
Rime posted:An i7 920, for example, remains a very powerful piece of hardware for modern gaming. It usually benchmarks not significantly lower than even the 4790k Rime posted:Yeah, I'm talking from the perspective of Joe Gamer who runs ARMA III as his most CPU-intensive task. If you're doing scientific computing or video / 3D rendering, you're more likely to benefit from the 15% performance upgrade. Just gaming though? The 920 is still making it into the top ten high score lists on most benchmarking sites. I'm pretty sure you didn't do much legwork making sure your post was factual here because the first benchmarking site I pulled up (Anandtech's "Bench") shows a 4790 faster than a 920 by a ratio greater than 2x in some of the single thread CPU tests. Granted, there's not a lot of tests from 2008 that AT still runs on modern CPUs, but still. Yes, the 920 is still a useful CPU, but so is a Core 2. Can we stop with the mythmaking? It's over 5 years old, and real progress has actually been made since then. It's time to let go and acknowledge that your pride and joy is obsolete.
|
# ? Jun 17, 2014 13:39 |
|
Chuu posted:From what I understand (and I hope someone corrects me if I'm wrong) your bios needs to support NVMe if you want to boot off of it, but if you're just using it as a data drive it doesn't matter. Hmmm this sounds right. Any UEFI bios should be able to handle it with an update. Might be able to get away with a bootloader on a bootable drive and have it point to the NVMe drive if you're not supported for some reason.
|
# ? Jun 17, 2014 14:56 |
Really hoping amazon's june 20 release date is accurate.
|
# ? Jun 17, 2014 15:03 |