|
Any place where I can reliably find out details of the virtualization improvements between Sandy Bridge and Haswell, and what's planned for Broadwell? Apart from Powerpoint slides from various Intel presentations?
|
# ? Jun 4, 2014 15:07 |
|
|
Devil's Canyon 5.5GHz OC by Intel. Then again, that was probably cherry-picked to hell. Still, if there's any truth in it, and we can get back to the 5GHz previously only attainable by Sandy Bridge, you've got a hell of a CPU.
|
# ? Jun 4, 2014 17:11 |
|
.
sincx fucked around with this message at 05:55 on Mar 23, 2021 |
# ? Jun 4, 2014 17:14 |
HalloKitty posted:Devil's Canyon 5.5GHz OC by Intel. UGhhhhhhhhHHHHHHhhhhHHhhHh

Combat Pretzel posted:Any place where I can reliably find out details of the virtualization improvements between Sandy Bridge and Haswell, and what's planned for Broadwell? Apart from Powerpoint slides from various Intel presentations? You know, I looked into this really extensively for a few days a few months ago and came up with very, very little. All reliable or comprehensive benchmarks were almost entirely server CPU related.
|
|
# ? Jun 4, 2014 17:17 |
|
Ignoarints posted:You know I looked into this really extensively for a few days a few months ago and came up with very, very little. All reliable or comprehensive benchmarks were almost entirely server CPU related
|
# ? Jun 4, 2014 17:31 |
|
sincx posted:Isn't there a theoretical cap of 8 GHz or so for silicon-based chips of any sort? I know IBM has built a 100 GHz chip, but I think that was based on graphene carbon. Ignoarints posted:You know I looked into this really extensively for a few days a few months ago and came up with very, very little. All reliable or comprehensive benchmarks were almost entirely server CPU related
|
# ? Jun 4, 2014 17:37 |
|
HalloKitty posted:The only thing that's memorable is their jingle. That said, they sell a product to people who have no idea what it even is, so I guess marketing have a difficult job. Fun fact: they call it the "Intel Bong"
|
# ? Jun 4, 2014 18:58 |
|
sincx posted:Isn't there a theoretical cap of 8 GHz or so for silicon-based chips of any sort? I know IBM has built a 100 GHz chip, but I think that was based on graphene carbon. A single transistor is very fast (>100 GHz, I think); the issue is that a pipeline stage is made up of many of them, and you have to wait until the value at the end of the transistor network has stabilized. For a single stage and operation you have a critical path: the path that takes the longest to stabilize. So to go faster you either split the work across more, shallower stages (hello, 31-stage Prescott!) or add voltage (hello, melted motherboard!). The whole system is limited by the switching time of the slowest stage. For example, your integer ALU add stage might have a transistor depth of 50; as inputs change they propagate down the adder network, and finally the outputs of the last transistor reflect the desired function. If your ideal transistor settle time is 10ps (100 GHz), your stage will achieve (100/50) = 2 GHz. Now, stage 3 in your FP unit is 200 transistors deep; it will fail at speeds above (100/200) = 500 MHz: Prime95 not stable! Caveat: I'm not an EE.
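That back-of-the-envelope math is easy to sanity-check. A toy sketch (made-up numbers, not real EE figures; note that 100 GHz divided by 50 levels works out to 2 GHz per stage):

```python
# Toy model: a pipeline stage can't clock faster than its critical path
# settles. Assumed numbers: 10 ps settle time per logic level (= 100 GHz).

def stage_max_ghz(settle_ghz: float, logic_depth: int) -> float:
    """Max stage clock if each logic level costs one settle time."""
    return settle_ghz / logic_depth

alu = stage_max_ghz(100, 50)    # 50-deep integer add stage -> 2.0 GHz
fp = stage_max_ghz(100, 200)    # 200-deep FP stage -> 0.5 GHz (500 MHz)

# The whole pipeline is limited by its slowest stage:
chip_max_ghz = min(alu, fp)     # 0.5 GHz -- hence deeper pipelining
```

This is also why deeper pipelines clock higher: splitting that 200-deep stage in half would roughly double its max frequency, at the cost of more stages to flush on a branch mispredict.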
|
# ? Jun 4, 2014 19:06 |
|
necrobobsledder posted:With that said, a theoretical quantum processor can do something like 4 quadrillion times as many computations / second faster than our digital sequential logic ones, so we have a long ways to go across all sorts of computing devices of the future. You might want to read up on how current quantum computers work: http://www.dwavesys.com/tutorials/background-reading-series/quantum-computing-primer#h2-0 They're only good for certain problems/algorithms. Don't expect to be playing BF2143 on a quantum computer.
|
# ? Jun 4, 2014 19:21 |
|
necrobobsledder posted:Think the Intel paper I read like 10+ years ago saying the smallest possible before quantum tunneling is unavoidable for silicon transistors would lead to something rather high like 40 GHz. Remember, though, that 10+ years ago they thought they could push NetBurst to 10 gigahertz.
|
# ? Jun 4, 2014 19:25 |
|
The skull on the SSD is pretty cool, but overall this stuff is pretty awkward coming from Intel, kind of like your millionaire banker uncle throwing the horns and pretending to be a hardcore metal dude or something. But cheesy skulls or not, Devil's Canyon just might be worth finally getting for me, at least if Broadwell really is delayed until H2/15. To respond to someone from way back, it's not that there was no need to upgrade until now, there was just no point -- you'd drop a grand for a new machine and get a 5-10% increase in CPU performance. That only really made sense if the marginal improvement was very beneficial, or money was no object. An 8-core Broadwell or something like that finally makes a pretty good case for itself.
|
# ? Jun 4, 2014 19:37 |
Shaocaholica posted:You might want to read up on how current quantum computers work: Well, that's how the D-Wave sorta-kinda works anyway. The theoretical one should be as capable and humanity-altering as we imagine.
|
|
# ? Jun 4, 2014 19:40 |
|
Ignoarints posted:Welll that's how the dwave sorta-kina works anyways. The theoretical one should be as capable and humanity altering as we imagine Well that's pretty far into the future. Kind of meaningless in the scope of discussion even in SHSC.
|
# ? Jun 4, 2014 19:43 |
|
Ignoarints posted:Welll that's how the dwave sorta-kina works anyways. The theoretical one should be as capable and humanity altering as we imagine Not quite. Solving NP-complete problems in polynomial time (with reasonable constants) could be humanity-altering, sure, but quantum computers don't do that. Basically, don't believe the hype.
|
# ? Jun 4, 2014 20:41 |
Longinus00 posted:Not quite. Solving np-complete problems in polynomial time (with reasonable constants) could be humanity altering sure but quantum computers don't do that. But but okay I believe you
|
|
# ? Jun 4, 2014 21:04 |
|
Now that devil's butthole is showing up on Amazon, Newegg, and TD I'm getting so tempted. I should really just hold onto my 3570k and wait for Broadwell/DDR4.
|
# ? Jun 4, 2014 21:31 |
|
Don Lapre posted:Now that devils butthole is showing up on amazon, newegg, and TD im getting so tempted. I should really just hold onto my 3570k and wait for Broadwell/ddr4 DC is giving me a real bad PC-building itch. I'm on an i7 960 at base clock, and was originally planning to wait until Broadwell as well. Since my CPU is quite a bit older than your Ivy Bridge, I'm curious if I'm a complete dumbass for sticking with it.
|
# ? Jun 5, 2014 00:18 |
|
I'm kinda wanting to use Devil's Canyon as an excuse to replace my 2500K system that's used as a gaming/HTPC, as there are a lot of minor issues that make me less than happy with it. Whether it's my particular CPU or the MB (which I cheaped out a bit on, to my dismay), I could never get it above 4.4GHz with any sort of reliability, and last time I updated the BIOS even that ran into problems, so I just dropped it down to 4.2 rather than muck with it more. And while the SilverStone FT-01 remains one of my favorite cases from an aesthetic design standpoint, it's really outdated in a lot of ways - no cable routing features, no CPU cutout, no USB 3.0, no support for larger CLCs, etc. On the other hand, I had a 4770K laying around for 6 months before finally deciding to use it to upgrade my Hackintosh (the i3 3225 really became a bottleneck after I moved my photo library to it and started using Lightroom), and the old parts are probably getting turned into a dedicated HTPC, so I'm already at the point of having more computers than I really have any semblance of need for. I guess I'll see if I can sell off the whole thing.
|
# ? Jun 5, 2014 00:49 |
|
staticman posted:DC is giving me a real bad PC building itch. I'm on an i7 960, base clock, and was originally planning to wait until Broadwell as well. Since my CPU is a quite a bit older than your Ivy Bridge, I'm curious if I'm a complete dumbass for sticking with it. That's still not that terrible of a CPU despite its age. Devil's Canyon might actually be better than Broadwell, though, or at least no worse, so I think it would be a prime time to upgrade.
|
# ? Jun 5, 2014 01:42 |
|
As exciting as Devil's Canyon is, I'd definitely hold off for Broadwell-K+Crystalwell next year if you have a system that's not in desperate need of replacement. I think having eDRAM will result in a much more significant generational performance boost than we've seen in some time, plus the 14nm shrink will be pretty significant. By this time next year there should be a wide set of mature SATA Express SSDs as well. I'm certainly going to replace my C2Q9550@3.6GHz, mostly due to the memory bandwidth limitations of DDR3, but I feel like any i7 is still a fine CPU if you overclock it. Even with DDR3-1066, triple-channel means you get equivalent bandwidth to dual-channel DDR3-1600. If you only have 6GB of RAM, replacing it with 12GB of DDR3-1600 might make sense, especially before DDR3 prices start to rise as supply drops off. This does mean that non-overclockable Lynnfield systems are showing their age, but their higher IPC means that they're still quite serviceable. I don't see how it makes much sense from a value perspective to replace a Sandy Bridge system, especially an overclocked one, but value isn't everyone's primary consideration.
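The triple-channel claim checks out on paper; a quick sketch of the napkin math (peak theoretical numbers, ignoring real-world efficiency):

```python
# Peak DDR bandwidth: (transfer rate in MT/s) x 8 bytes/transfer x channels.
# These are theoretical maxima; sustained bandwidth is lower in practice.

def ddr_peak_gbs(mts: int, channels: int) -> float:
    return mts * 8 * channels / 1000  # decimal GB/s

triple_1066 = ddr_peak_gbs(1066, 3)  # X58 triple-channel DDR3-1066
dual_1600 = ddr_peak_gbs(1600, 2)    # dual-channel DDR3-1600

# ~25.6 GB/s either way, so the old triple-channel boards keep up
```

Same formula puts dual-channel DDR3-2133 at ~34 GB/s, which is why the i7-9xx platforms only start feeling memory-starved against heavily overclocked RAM.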
|
# ? Jun 5, 2014 01:59 |
|
MaxxBot posted:That's still not that terrible of a CPU despite its age, Devil's Canyon might actually be better than Broadwell though or at least no worse so I think it would be a prime time to upgrade. I probably should've said I am waiting for SKYLAKE. Got the two mixed up.
|
# ? Jun 5, 2014 02:06 |
|
Alereon posted:I think having eDRAM will result in a much more significant generational performance boost than we've seen in some time I am a dinosaur, please tell me more about this newfangled computer chip.
|
# ? Jun 5, 2014 03:06 |
|
Sidesaddle Cavalry posted:I am a dinosaur, please tell me more about this newfangled computer chip. You know what cache is, right? It's basically L4 cache on the order of 128MB. Although from the AnandTech article it didn't seem like it was -that- fast.
|
# ? Jun 5, 2014 03:08 |
|
Shaocaholica posted:You know what cache is right? Its basically L4 cache on the magnitude of 128MB. Although from the anandtech article it didn't seem like it was -that- fast. Was this the on-die thing? I think I got it mixed up in my head with on-die graphics and that's why it flew under my radar.
|
# ? Jun 5, 2014 03:13 |
|
Sidesaddle Cavalry posted:Was this the on-die thing? I think I got it mixed up in my head with on-die graphics and that's why it flew under my radar. It's on package, separate die. eDRAM is the Shaocaholica fucked around with this message at 06:50 on Jun 5, 2014 |
# ? Jun 5, 2014 03:17 |
|
Shaocaholica posted:Its on package, separate die. Also, while image leeching isn't usually a big deal, please don't embed 2.6MB leeched images. Rehost to imgur (preferably a thumbnail) and link to the original. Alereon fucked around with this message at 04:14 on Jun 5, 2014 |
# ? Jun 5, 2014 04:11 |
|
Ah yes, the pic where I confused the small die for the big die! Skimmed a little bit of Anandtech's words on it and I'd be all for it on 14nm LGA products if it doesn't make the main goods prohibitively expensive or too warm under load to be practical. I like to think I have common sense sometimes.
|
# ? Jun 5, 2014 04:22 |
|
Alereon posted:As exciting as Devil's Canyon is I'd definitely hold off for Broadwell-K+Crystalwell next year if you have a system that's not in desperate need of replacement. I think having eDRAM will result in a much more significant generational performance boost than we've seen in some time, plus the 14nm shrink will be pretty significant. By this time next year there should be a wide set of mature SATAexpress SSDs as well. Desktop Broadwell might be delayed until summer 2015, so I might as well upgrade from my i7-975, especially since the mobo only recognizes 2GB of RAM. And every component is nearly five years old.
|
# ? Jun 5, 2014 08:57 |
You won't be disappointed if you upgrade now, and you probably won't be disappointed if you wait, either. Whichever you do, you will be happy. I do wish DDR4 was supported, but I figure by the time it's a reasonable price I'll be in the mood to upgrade again.
|
|
# ? Jun 5, 2014 14:45 |
|
Ignoarints posted:You won't be disappointed if you upgrade now. You won't be disappointed if you wait probably either. Whichever you do, you will be happy.
|
# ? Jun 5, 2014 15:19 |
I'm dying to know if the 4690k will be as capable as they say the 4790k can be.
|
|
# ? Jun 5, 2014 15:23 |
|
Ignoarints posted:I'm dying to know if the 4690k will be as capable as how much they say the 4790k can be
|
# ? Jun 5, 2014 16:50 |
|
Welmu posted:What are the advantages eDRAM brings for general use (desktop, gaming, video editing) if I have a discrete graphics card? eDRAM (50GB/s, IIRC) is used as a victim cache for the LLC; in English: not much. It's great for integrated 3D, but won't matter much otherwise, save your money. You can use the iGPU for OpenCL computations and it helps there, but the iGPU will again be massacred by a decent discrete GPU in parallel compute (OpenCL/CUDA/DirectCompute/C++ AMP).
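For anyone wondering what "victim cache" means here: lines evicted from the LLC get parked in the eDRAM, so a later LLC miss might still be caught before going all the way to DRAM. A toy sketch (made-up sizes and LRU policy, purely illustrative):

```python
from collections import OrderedDict

# Toy victim cache: lines evicted from the LLC land in a much larger L4,
# so a subsequent LLC miss may hit the L4 instead of DRAM.
# Real Crystalwell is 128MB and more complicated; sizes here are tiny.

class VictimCache:
    def __init__(self, llc_lines=4, l4_lines=16):
        self.llc = OrderedDict()  # insertion order doubles as LRU order
        self.l4 = OrderedDict()
        self.llc_lines, self.l4_lines = llc_lines, l4_lines

    def access(self, addr):
        if addr in self.llc:
            self.llc.move_to_end(addr)
            return "LLC hit"
        # On an LLC miss, check the victim cache before "going to DRAM"
        hit = "L4 hit" if self.l4.pop(addr, None) is not None else "DRAM"
        self.llc[addr] = True
        if len(self.llc) > self.llc_lines:
            victim, _ = self.llc.popitem(last=False)  # evict LRU line...
            self.l4[victim] = True                    # ...into the eDRAM
            if len(self.l4) > self.l4_lines:
                self.l4.popitem(last=False)
        return hit
```

Which is also why it mostly helps the iGPU: a framebuffer blows through a 6-8MB LLC constantly, so the evicted lines actually get re-requested; typical desktop workloads with a discrete GPU mostly don't.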
|
# ? Jun 5, 2014 17:13 |
Welmu posted:Napkin calculation: at Computex Intel provided 4790K's for competitive overclocking (CPU Frequency 4cores8threads) that had to be cooled by off-the-shelf all-in-one watercoolers. Teams were able to reach 5.5 GHz, so a 4690K + heavy-duty custom coolant loop + several months of BIOS upgrades might just reach 5.0 GHz as it has 500 MHz lower stock clocks. Here's hopin' Yeah, there just seems to be a wider margin between the two this time. They must be cherry-picking the new 4790Ks simply based on factory boost speed (I mean, obviously the 5.5 thing was a pretty sweet indication), but there's no reason for me to think they will for the 4690K either way. Before, it was a very simple choice NOT to buy a 4770K unless you needed HT, whether you planned to overclock or not. But perhaps this time around it might be worth the extra money to spring for the i7 if overclocking is the goal. I dunno, eager to see.
|
|
# ? Jun 5, 2014 17:22 |
|
Welmu posted:What are the advantages eDRAM brings for general use (destop, gaming, video editing) if I have a discrete graphics card? karoshi posted:eDRAM (50Gbyte/s, iirc) is used as a victim cache for the LLC, in english: not much. It's great for integrated 3D, but won't matter much otherwise, save your money. You can use the iGPU for OpenCL computations and it helps there, but the iGPU will again be massacred by a decent discrete GPU in parallel compute (OpenCL/CUDA/DirectCompute/C++AMP).
|
# ? Jun 5, 2014 19:29 |
|
I was expecting the eDRAM to have more bandwidth than quad-channel DDR3-2400. Maybe that doesn't matter.
|
# ? Jun 5, 2014 21:03 |
|
Bandwidth and latency aren't the same!?!?? Stop the presses!
|
# ? Jun 5, 2014 21:11 |
|
I had mentioned latency before, and according to the AnandTech tests the latency was only half that of DDR3. I was expecting an order of magnitude there. But whatever, as long as things get faster.
|
# ? Jun 5, 2014 21:41 |
|
Shaocaholica posted:I was expecting the eDRAM to have more bandwidth than quad channel ddr3 ~2400. Maybe that doesn't matter. Keep in mind that quad-channel DDR4 will only exist on workstation platforms that won't have Crystalwell. Home users will only have dual-channel DDR4 beginning with Skylake in 2016 or so, for a total of between 40-50GB/sec of memory bandwidth and likely at higher latency than DDR3 at launch. I'd definitely expect eDRAM to continue scaling in both capacity and bandwidth, Intel's long-term goal is to completely replace system RAM with eDRAM on their mobile products.
|
# ? Jun 5, 2014 21:49 |
|
|
Alereon posted:The eDRAM is the CPU's L4 cache, adding a 128MB L4 cache provides huge performance increases in any application that cares about memory bandwidth or latency. Some of the biggest examples are 3D rendering, video editing/encoding, or any applications involving data compression, and some games would also see noticeable boosts. My google-fu is failing me; I can't find an in-depth benchmark of similar CPUs with and without the L4. But I don't expect the L4 to make a double-digit difference in any non-graphics benchmark. It's a waste of money if you get a discrete GPU. Anyway, I would buy one if I were rich, possibly in an ultrabook. But I'm a nerd. In the parts-picking thread there should be no references to an L4 CPU, unless they add an "I'm rich, bitch" build.
|
# ? Jun 5, 2014 21:58 |