Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Any place where I can reliably find out details of the virtualization improvements between Sandy Bridge and Haswell, and what's planned for Broadwell? Apart from Powerpoint slides from various Intel presentations?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Devil's Canyon 5.5GHz OC by Intel.

Then again, that was probably cherry-picked to hell.

Still, if there's any truth in it, and we can get back to the 5GHz previously only attainable by Sandy Bridge, you've got a hell of a CPU.

sincx
Jul 13, 2012

furiously masturbating to anime titties
.

sincx fucked around with this message at 05:55 on Mar 23, 2021

Ignoarints
Nov 26, 2010

HalloKitty posted:

Devil's Canyon 5.5GHz OC by Intel.

Then again, that was probably cherry-picked to hell.

Still, if there's any truth in it, and we can get back to the 5GHz previously only attainable by Sandy Bridge, you've got a hell of a CPU.

UGhhhhhhhhHHHHHHhhhhHHhhHh

Combat Pretzel posted:

Any place where I can reliably find out details of the virtualization improvements between Sandy Bridge and Haswell, and what's planned for Broadwell? Apart from Powerpoint slides from various Intel presentations?

You know, I looked into this really extensively for a few days a few months ago and came up with very, very little. Almost all of the reliable or comprehensive benchmarks were server-CPU related.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Ignoarints posted:

You know, I looked into this really extensively for a few days a few months ago and came up with very, very little. Almost all of the reliable or comprehensive benchmarks were server-CPU related.
It doesn't necessarily need to be benchmarks, just something actually naming the improvements and their purpose. I initially wanted to upgrade to Haswell for at least the improved VT-x round-tripping, but the virtual fileserver project it was based on fell flat. Now I'm curious whether anything else is coming with Broadwell.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost

sincx posted:

Isn't there a theoretical cap of 8 GHz or so for silicon-based chips of any sort? I know IBM has built a 100 GHz chip, but I think that was based on graphene carbon.
I think the Intel paper I read 10+ years ago said the smallest feature size possible before quantum tunneling becomes unavoidable for silicon transistors would lead to something rather high, like 40 GHz. However, the economics of process improvements will likely mean we don't get anywhere near that with silicon, and we'll only reach that point with something like graphene. With that said, a theoretical quantum processor can do something like 4 quadrillion times as many computations per second as our digital sequential-logic ones, so we have a long way to go across all sorts of future computing devices.

Ignoarints posted:

You know, I looked into this really extensively for a few days a few months ago and came up with very, very little. Almost all of the reliable or comprehensive benchmarks were server-CPU related.
All I'm aware of is that there are inconsistencies with VT-x/EPT support on Haswell. Here's a chart I've found helpful: http://www.ilsistemista.net/index.p...st.html?start=2

canyoneer
Sep 13, 2005


I only have canyoneyes for you

HalloKitty posted:

The only thing that's memorable is their jingle. That said, they sell a product to people who have no idea what it even is, so I guess marketing have a difficult job.

Fun fact: they call it the "Intel Bong" :drugnerd:

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

sincx posted:

Isn't there a theoretical cap of 8 GHz or so for silicon-based chips of any sort? I know IBM has built a 100 GHz chip, but I think that was based on graphene carbon.

A single transistor is very fast (>100 GHz, I think); the issue is that a pipeline stage is made up of many of them. You have to wait until the value at the end of the transistor network has stabilized. For a single stage and operation you will have a critical path, the path that takes the longest to stabilize. So to go faster you either split the work across more pipeline stages, each with a shorter critical path (hello, 31-stage Prescott!), or add voltage (hello, melted motherboard!). The whole system is limited by the switching time of the slowest stage.

For example, your integer ALU add stage might have a transistor depth of 50; as inputs change they will propagate down the adder network, and finally the outputs of the last transistor will reflect the desired function. If your ideal transistor settle time is 10ps (100GHz), your stage will achieve (100/50) = 2GHz speeds. Now, stage 3 in your FP unit is 200 transistors deep, so it will fail at speeds above (100/200) = 500MHz: Prime95 not stable :byodood:!

Caveat: I'm not an EE.
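
To put numbers on the arithmetic above, here's a rough sketch in Python. It only uses the hypothetical figures from the post (a 10 ps per-level settle time and 50/200-deep stages), not real Haswell numbers.

```python
def max_clock_ghz(settle_time_ps: float, logic_depth: int) -> float:
    """Highest clock a stage could run at if every level of logic
    needs settle_time_ps picoseconds to settle before the next edge."""
    stage_delay_ps = settle_time_ps * logic_depth
    return 1000.0 / stage_delay_ps  # 1000 ps in a ns; a 1 ns period is 1 GHz

settle = 10.0  # ps per level, i.e. ~100 GHz for a single device

print(max_clock_ghz(settle, 50))   # 50-deep ALU add stage  -> 2.0 GHz
print(max_clock_ghz(settle, 200))  # 200-deep FP stage      -> 0.5 GHz (500 MHz)

# The whole pipeline is capped by its slowest stage:
print(min(max_clock_ghz(settle, depth) for depth in (50, 200)))  # -> 0.5 GHz
```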

Shaocaholica
Oct 29, 2002

Fig. 5E

necrobobsledder posted:

With that said, a theoretical quantum processor can do something like 4 quadrillion times as many computations per second as our digital sequential-logic ones, so we have a long way to go across all sorts of future computing devices.

You might want to read up on how current quantum computers work:

http://www.dwavesys.com/tutorials/background-reading-series/quantum-computing-primer#h2-0

They're only good for certain problems/algorithms. Don't expect to be playing BF2143 on a quantum computer.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

necrobobsledder posted:

I think the Intel paper I read 10+ years ago said the smallest feature size possible before quantum tunneling becomes unavoidable for silicon transistors would lead to something rather high, like 40 GHz.

Remember, though, that 10+ years ago they thought they could push NetBurst to 10 gigahertz.

mobby_6kl
Aug 9, 2009

by Fluffdaddy

The skull on the SSD is pretty cool, but overall this stuff is pretty awkward coming from Intel, kind of like your millionaire banker uncle throwing the horns and pretending to be a hardcore metal dude or something.

But cheesy skulls or not, Devil's Canyon just might be worth finally getting for me, at least if Broadwell really is delayed until H2/15. To respond to someone from way back, it's not that there was no need to upgrade until now, there was just no point -- you'd drop a grand for a new machine and get a 5-10% increase in CPU performance. That only really made sense if the marginal improvement was very beneficial, or money was no object. An 8-core Broadwell or something like this finally makes a pretty good case for itself.

Ignoarints
Nov 26, 2010

Shaocaholica posted:

You might want to read up on how current quantum computers work:

http://www.dwavesys.com/tutorials/background-reading-series/quantum-computing-primer#h2-0

They're only good for certain problems/algorithms. Don't expect to be playing BF2143 on a quantum computer.

Well, that's how the D-Wave sorta-kinda works anyway. The theoretical one should be as capable and humanity-altering as we imagine.

Shaocaholica
Oct 29, 2002

Fig. 5E

Ignoarints posted:

Well, that's how the D-Wave sorta-kinda works anyway. The theoretical one should be as capable and humanity-altering as we imagine.

Well that's pretty far into the future. Kind of meaningless in the scope of discussion even in SHSC.

Longinus00
Dec 29, 2005
Ur-Quan

Ignoarints posted:

Well, that's how the D-Wave sorta-kinda works anyway. The theoretical one should be as capable and humanity-altering as we imagine.

Not quite. Solving NP-complete problems in polynomial time (with reasonable constants) could be humanity-altering, sure, but quantum computers don't do that.

Basically, don't believe the hype marketing.

Ignoarints
Nov 26, 2010

Longinus00 posted:

Not quite. Solving NP-complete problems in polynomial time (with reasonable constants) could be humanity-altering, sure, but quantum computers don't do that.

Basically, don't believe the hype marketing.

But but

okay I believe you

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Now that Devil's Butthole is showing up on Amazon, Newegg, and TD I'm getting so tempted. I should really just hold onto my 3570K and wait for Broadwell/DDR4.

staticman
Sep 12, 2008

Be gay
Death to America
Suck my dick Israel
Mess with Texas
and remember to lmao

Don Lapre posted:

Now that Devil's Butthole is showing up on Amazon, Newegg, and TD I'm getting so tempted. I should really just hold onto my 3570K and wait for Broadwell/DDR4.

DC is giving me a real bad PC-building itch. I'm on an i7 960, base clock, and was originally planning to wait until Broadwell as well. Since my CPU is quite a bit older than your Ivy Bridge, I'm curious if I'm a complete dumbass for sticking with it.

GokieKS
Dec 15, 2012

Mostly Harmless.
I'm kinda wanting to use Devil's Canyon as an excuse to replace my 2500K system that's used as a gaming/HTPC, as there are a lot of minor issues that make me less than happy with it. Whether it's my particular CPU or the MB (which I cheaped out a bit on, to my dismay), I could never get it above 4.4GHz with any sort of reliability, and the last time I updated the BIOS even that ran into problems, so I just dropped it down to 4.2 rather than muck with it more. And while the SilverStone FT-01 remains one of my favorite cases from an aesthetic design standpoint, it's really outdated in a lot of ways - no cable routing features, no CPU cutout, no USB 3.0, no support for larger CLCs, etc.

On the other hand, I had a 4770K lying around for 6 months before I finally decided to use it to upgrade my Hackintosh (the i3 3225 really became a bottleneck after I moved my photo library to it and started using Lightroom), and the old parts are probably getting turned into a dedicated HTPC, so I'm already at the point of having more computers than I really have any semblance of need for. I guess I'll see if I can sell off the whole thing.

MaxxBot
Oct 6, 2003

you could have clapped

you should have clapped!!

staticman posted:

DC is giving me a real bad PC-building itch. I'm on an i7 960, base clock, and was originally planning to wait until Broadwell as well. Since my CPU is quite a bit older than your Ivy Bridge, I'm curious if I'm a complete dumbass for sticking with it.

That's still not that terrible of a CPU despite its age. Devil's Canyon might actually be better than Broadwell, though, or at least no worse, so I think it would be a prime time to upgrade.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
As exciting as Devil's Canyon is, I'd definitely hold off for Broadwell-K+Crystalwell next year if you have a system that's not in desperate need of replacement. I think having eDRAM will result in a much more significant generational performance boost than we've seen in some time, plus the 14nm shrink will be pretty significant. By this time next year there should be a wide set of mature SATA Express SSDs as well.

I'm certainly going to replace my C2Q9550@3.6GHz, mostly due to the memory bandwidth limitations of DDR3, but I feel like any i7 is still a fine CPU if you overclock it. Even with DDR3-1066, triple-channel means you get bandwidth equivalent to dual-channel DDR3-1600. If you only have 6GB of RAM, replacing it with 12GB of DDR3-1600 might make sense, especially before DDR3 prices start to rise as supply drops off. This does mean that non-overclockable Lynnfield systems are showing their age, but their higher IPC means that they're still quite serviceable. I don't see how it makes much sense from a value perspective to replace a Sandy Bridge system, especially an overclocked one, but value isn't everyone's primary consideration.
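
Quick napkin math behind that triple-channel equivalence, as a sketch (peak theoretical bandwidth only: channels times transfer rate times 8 bytes, ignoring real-world efficiency):

```python
# Peak theoretical bandwidth: channels * transfer rate (MT/s) * 8 bytes per transfer.
def peak_gb_s(channels: int, mt_per_s: int) -> float:
    return channels * mt_per_s * 8 / 1000.0  # MB/s -> GB/s

print(peak_gb_s(3, 1066))  # triple-channel DDR3-1066 -> ~25.6 GB/s
print(peak_gb_s(2, 1600))  # dual-channel DDR3-1600   ->  25.6 GB/s
```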

staticman
Sep 12, 2008

Be gay
Death to America
Suck my dick Israel
Mess with Texas
and remember to lmao

MaxxBot posted:

That's still not that terrible of a CPU despite its age. Devil's Canyon might actually be better than Broadwell, though, or at least no worse, so I think it would be a prime time to upgrade.

I probably should've said I am waiting for SKYLAKE. Got the two mixed up.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Alereon posted:

I think having eDRAM will result in a much more significant generational performance boost than we've seen in some time

I am a dinosaur, please tell me more about this newfangled computer chip.

Shaocaholica
Oct 29, 2002

Fig. 5E

Sidesaddle Cavalry posted:

I am a dinosaur, please tell me more about this newfangled computer chip.

You know what cache is, right? It's basically an L4 cache on the order of 128MB. Although from the AnandTech article it didn't seem like it was -that- fast.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Shaocaholica posted:

You know what cache is, right? It's basically an L4 cache on the order of 128MB. Although from the AnandTech article it didn't seem like it was -that- fast.

Was this the on-die thing? I think I got it mixed up in my head with on-die graphics and that's why it flew under my radar.

Shaocaholica
Oct 29, 2002

Fig. 5E

Sidesaddle Cavalry posted:

Was this the on-die thing? I think I got it mixed up in my head with on-die graphics and that's why it flew under my radar.

It's on package, separate die.



eDRAM is the smaller die.

Shaocaholica fucked around with this message at 06:50 on Jun 5, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Shaocaholica posted:

It's on package, separate die.



eDRAM is the bigger die.
eDRAM is actually the smaller die; the normally rectangular Haswell die is square because of the additional graphics hardware.

Also, while image leeching isn't usually a big deal, please don't embed 2.6MB leeched images. Rehost to imgur (preferably a thumbnail) and link to the original.

Alereon fucked around with this message at 04:14 on Jun 5, 2014

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Ah yes, the pic where I confused the small die for the big die! Skimmed a little bit of Anandtech's words on it and I'd be all for it on 14nm LGA products if it doesn't make the main goods prohibitively expensive or too warm under load to be practical. I like to think I have common sense sometimes. :v:

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.

Alereon posted:

As exciting as Devil's Canyon is, I'd definitely hold off for Broadwell-K+Crystalwell next year if you have a system that's not in desperate need of replacement. I think having eDRAM will result in a much more significant generational performance boost than we've seen in some time, plus the 14nm shrink will be pretty significant. By this time next year there should be a wide set of mature SATA Express SSDs as well.
What are the advantages eDRAM brings for general use (desktop, gaming, video editing) if I have a discrete graphics card?

Desktop Broadwell might be delayed until summer 2015, so I might as well upgrade from my i7-975, especially since the mobo only recognizes 2GB of RAM. And every component is nearly five years old.

Ignoarints
Nov 26, 2010
You won't be disappointed if you upgrade now. You probably won't be disappointed if you wait, either. Whichever you do, you will be happy.

I do wish DDR4 were supported, but I figure by the time it's reasonably priced I'll be in the mood to upgrade.

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.

Ignoarints posted:

You won't be disappointed if you upgrade now. You probably won't be disappointed if you wait, either. Whichever you do, you will be happy.

I do wish DDR4 were supported, but I figure by the time it's reasonably priced I'll be in the mood to upgrade.
I'm going to upgrade to an i5-4690K in order to overclock the hell out of it, then two or three years down the line I'll build a new Intel Skylake + DDR4 & nVidia Pascal rig. I'm just curious what benefits Broadwell has over Haswell, since Alereon has mentioned several times that it'd be worth the wait.

Ignoarints
Nov 26, 2010
I'm dying to know if the 4690K will be as capable as they say the 4790K can be.

Welmu
Oct 9, 2007
Metri. Piiri. Sekunti.

Ignoarints posted:

I'm dying to know if the 4690K will be as capable as they say the 4790K can be.
Napkin calculation: at Computex, Intel provided 4790Ks for competitive overclocking (CPU frequency, 4 cores / 8 threads) that had to be cooled by off-the-shelf all-in-one watercoolers. Teams were able to reach 5.5 GHz, so a 4690K + heavy-duty custom coolant loop + several months of BIOS upgrades might just reach 5.0 GHz, since it has 500 MHz lower stock clocks. Here's hopin'

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

Welmu posted:

What are the advantages eDRAM brings for general use (desktop, gaming, video editing) if I have a discrete graphics card?

eDRAM (50GB/s, IIRC) is used as a victim cache for the LLC; in English: not much. It's great for integrated 3D, but it won't matter much otherwise, so save your money. You can use the iGPU for OpenCL computations and it helps there, but the iGPU will again be massacred by a decent discrete GPU in parallel compute (OpenCL/CUDA/DirectCompute/C++AMP).
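
To make "victim cache" concrete, here's a toy sketch of the idea only: lines evicted from the cache in front of it get parked and can be pulled back on a later miss without a trip to DRAM. This is a conceptual illustration, not how Crystalwell is actually implemented.

```python
from collections import OrderedDict

class VictimCache:
    """Toy model of a victim cache: holds lines recently evicted from the
    cache in front of it (here, the LLC), so a later miss can be serviced
    without going all the way to DRAM. Conceptual only."""

    def __init__(self, capacity_lines: int):
        self.capacity = capacity_lines
        self.lines = OrderedDict()  # address -> data, oldest first

    def insert_victim(self, addr, data):
        # A line evicted from the LLC lands here.
        self.lines[addr] = data
        self.lines.move_to_end(addr)
        if len(self.lines) > self.capacity:
            self.lines.popitem(last=False)  # drop the oldest victim

    def lookup(self, addr):
        # On an LLC miss, check here before going to DRAM.
        return self.lines.pop(addr, None)  # hit: line goes back up; miss: None

# 128MB of 64-byte lines is roughly 2 million entries.
l4 = VictimCache(capacity_lines=128 * 1024 * 1024 // 64)
l4.insert_victim(0x1000, b"evicted cache line")
print(l4.lookup(0x1000))  # b'evicted cache line' -- saved a DRAM round trip
print(l4.lookup(0x2000))  # None -- would have to go to DRAM
```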

Ignoarints
Nov 26, 2010

Welmu posted:

Napkin calculation: at Computex, Intel provided 4790Ks for competitive overclocking (CPU frequency, 4 cores / 8 threads) that had to be cooled by off-the-shelf all-in-one watercoolers. Teams were able to reach 5.5 GHz, so a 4690K + heavy-duty custom coolant loop + several months of BIOS upgrades might just reach 5.0 GHz, since it has 500 MHz lower stock clocks. Here's hopin'

Yeah, there just seems to be a wider margin between the two this time. They must be cherry-picking the new 4790Ks simply based on factory boost speed (and obviously the 5.5 thing was a pretty sweet indication); however, there's no reason for me to think they will for the 4690K either way. Before, it was a very simple choice NOT to buy a 4770K unless you needed HT, whether you planned to overclock or not. But perhaps this time around it might be worth the extra money to spring for the i7 if overclocking is the goal. I dunno, eager to see.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Welmu posted:

What are the advantages eDRAM brings for general use (desktop, gaming, video editing) if I have a discrete graphics card?

karoshi posted:

eDRAM (50GB/s, IIRC) is used as a victim cache for the LLC; in English: not much. It's great for integrated 3D, but it won't matter much otherwise, so save your money. You can use the iGPU for OpenCL computations and it helps there, but the iGPU will again be massacred by a decent discrete GPU in parallel compute (OpenCL/CUDA/DirectCompute/C++AMP).
The eDRAM is the CPU's L4 cache, and adding a 128MB L4 cache provides huge performance increases in any application that cares about memory bandwidth or latency. Some of the biggest examples are 3D rendering, video editing/encoding, and applications involving data compression, and some games would also see noticeable boosts.

Shaocaholica
Oct 29, 2002

Fig. 5E
I was expecting the eDRAM to have more bandwidth than quad-channel DDR3 ~2400. Maybe that doesn't matter.

JawnV6
Jul 4, 2004

So hot ...
Bandwidth and latency aren't the same!?!?? Stop the presses!

Shaocaholica
Oct 29, 2002

Fig. 5E
I had mentioned latency before, and according to the AnandTech tests the latency was only half that of DDR3. I was expecting an order of magnitude there. But whatever, as long as things get faster.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Shaocaholica posted:

I was expecting the eDRAM to have more bandwidth than quad-channel DDR3 ~2400. Maybe that doesn't matter.
They wanted to get as much bandwidth as possible within reasonable cost and power constraints. The optimal way to do this would typically be to have a very wide, slow bus, for example a 1GHz 512-bit bus. The issue is that this would require at least ~600 additional contacts on the bottom of the CPU, plus routing of those traces on the substrate, which would be hideously complex and expensive. Every watt drawn by the eDRAM is a watt not going to the CPU or integrated graphics (or another watt sucked from the battery), so just clocking the hell out of the memory on a narrower bus wasn't really a workable solution. What they settled on was using a very fast, low-power serial link to a larger, slower eDRAM die, providing a good balance of bandwidth, cost/complexity, and power usage.

Keep in mind that quad-channel DDR4 will only exist on workstation platforms that won't have Crystalwell. Home users will only have dual-channel DDR4 beginning with Skylake in 2016 or so, for a total of 40-50GB/sec of memory bandwidth, and likely at higher latency than DDR3 at launch. I'd definitely expect eDRAM to continue scaling in both capacity and bandwidth; Intel's long-term goal is to completely replace system RAM with eDRAM on their mobile products.
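
The same napkin math applies to the bus-width trade-off above. This sketch assumes one transfer per clock on the hypothetical wide bus, and dual-channel DDR4-2666 is only an assumption for where a 40-50GB/sec figure would land, not something stated in the post.

```python
# Peak bandwidth = bytes per transfer * transfers per second (one transfer/clock assumed).
def bus_gb_s(width_bits: int, transfers_per_s: float) -> float:
    return width_bits / 8 * transfers_per_s / 1e9

print(bus_gb_s(512, 1e9))        # hypothetical 1GHz x 512-bit wide bus -> 64.0 GB/s
print(bus_gb_s(2 * 64, 2.666e9)) # dual-channel (2 x 64-bit) DDR4-2666  -> ~42.7 GB/s
# ...versus the roughly 50 GB/s cited earlier for the Crystalwell eDRAM link.
```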

karoshi
Nov 4, 2008

"Can somebody mspaint eyes on the steaming packages? TIA" yeah well fuck you too buddy, this is the best you're gonna get. Is this even "work-safe"? Let's find out!

Alereon posted:

The eDRAM is the CPU's L4 cache, and adding a 128MB L4 cache provides huge performance increases in any application that cares about memory bandwidth or latency. Some of the biggest examples are 3D rendering, video editing/encoding, and applications involving data compression, and some games would also see noticeable boosts.

My Google-fu is failing me; I can't find an in-depth benchmark of similar CPUs with and without the L4. But I don't expect the L4 to make a double-digit difference in any non-graphics benchmark. It's a waste of money if you get a discrete GPU. Anyway, I would buy one if I were rich, possibly in an ultrabook. But I'm a nerd. In the parts-picking thread there should be no references to an L4 CPU, unless they added an "I'm rich, bitch" build.
