Durinia
Sep 26, 2014

The Mad Computer Scientist

EoRaptor posted:

Intel hasn't shown any desire to use XPoint as a flash RAM replacement in SSDs or phones or what have you.
wat?
http://www.anandtech.com/show/9541/intel-announces-optane-storage-brand-for-3d-xpoint-products

Intel posted:

The Optane products will be available in 2016, in both standard SSD (PCIe) form factors for everything from Ultrabooks to servers, and in a DIMM form factor for Xeon systems for even greater bandwidth and lower latencies. As expected, Intel will be providing storage controllers optimized for the 3D XPoint memory, though no further details on that subject matter were provided.

EoRaptor posted:

They are targeting server memory via specialty DIMMs that allow a huge increase in the amount of memory a server can have, by using a blend of XPoint and regular memory on a single DIMM. This is either managed by the CPU itself or by an 'XPoint-aware' memory manager (or both!)
This part is awesome.

EoRaptor posted:

On the consumer front, I'd actually expect Apple to be the first to use XPoint, in their Mac Pro series. They have the total control over hardware and operating system you need to turn such a product around quickly, and price isn't the first concern for people purchasing workstation-class Mac products. XPoint in a high-end laptop would also make a lot of sense, if the price is justifiable.

Did I fall asleep and wake up in a world where memory capacity was a constraint in the PC space?


EoRaptor
Sep 13, 2003

by Fluffdaddy

I completely missed this. Oops.

Durinia posted:

This part is awesome.

Did I fall asleep and wake up in a world where memory capacity was a constraint in the PC space?

It's more that you can stick a terabyte of memory in a server today, but for the same price you could stick 4TB in. For workstations, this is basically the same selling point. It applies especially well to Mac Pros, which are heavily used for video and image editing, where having lots of memory helps but you generally only work with small chunks of it at a time.

For laptops, the fact that XPoint requires no refresh cycle means it should be much more power-efficient than DRAM. So a system with 4GB of DRAM and 4GB of XPoint should perform as if it has 8GB of memory, but with battery life equal to the 4GB model. It gets even better as you increase the amount of XPoint memory in the system.

Durinia
Sep 26, 2014

The Mad Computer Scientist

EoRaptor posted:

I completely missed this. Oops.


It's more that you can stick a terabyte of memory in a server today, but for the same price you could stick 4TB in. For workstations, this is basically the same selling point. It applies especially well to Mac Pros, which are heavily used for video and image editing, where having lots of memory helps but you generally only work with small chunks of it at a time.

For laptops, the fact that XPoint requires no refresh cycle means it should be much more power-efficient than DRAM. So a system with 4GB of DRAM and 4GB of XPoint should perform as if it has 8GB of memory, but with battery life equal to the 4GB model. It gets even better as you increase the amount of XPoint memory in the system.

The smallest granularity of XPoint - a single die - is 128 Gb = 16 GB. That die would ostensibly deliver the equivalent bandwidth of a single DRAM die in the best case, which is like 1/16th of DDR channel BW.

The minimum usable granularity for XPoint on a DIMM is probably well north of 64GB.
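
Back-of-the-envelope, the math looks something like this (the one-DRAM-die-of-bandwidth-per-XPoint-die figure is my estimate above, not a published spec):

[code]
# Rough XPoint DIMM granularity math. The per-die bandwidth figure is the
# estimate above, not anything Intel has published.
die_gbit = 128                 # announced 3D XPoint die density, in gigabits
die_gb = die_gbit / 8          # = 16 GB per die

# A 64-bit DDR channel built from x4 DRAM dies has 16 dies per rank, so if
# one XPoint die only matches one DRAM die's bandwidth, you need ~16 of them
# to keep up with a single channel.
dies_to_fill_channel = 64 // 4

min_dimm_gb = die_gb * dies_to_fill_channel
print(f"{die_gb:.0f} GB/die x {dies_to_fill_channel} dies = {min_dimm_gb:.0f} GB")
# Even at half that die count you'd be at 128 GB, hence "well north of 64GB".
[/code]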

Skandranon
Sep 6, 2008
fucking stupid, dont listen to me
XPoint makes a lot of sense in tablets too, if the cost can be brought down. No more need for separate buses for memory and storage; just partition 8GB as memory and the other 120GB as storage.

Durinia
Sep 26, 2014

The Mad Computer Scientist

Skandranon posted:

XPoint makes a lot of sense in tablets too, if the cost can be brought down. No more need for separate buses for memory and storage; just partition 8GB as memory and the other 120GB as storage.

Or, thought of another way, provide 128GB of "RAM", of which 120GB is some kind of non-volatile ramdisk. An interesting thought.

Curious how bad the performance would be without any DRAM though.
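
The programming model for that already exists, more or less: map the persistent region into your address space and treat plain loads and stores as I/O. A toy sketch in Python, with an ordinary file standing in for the hypothetical XPoint region (real persistent-memory hardware would map the media directly):

[code]
import mmap

# Toy "non-volatile ramdisk": mmap a region and poke bytes at it directly.
# An ordinary file stands in for the hypothetical XPoint range here; real
# persistent-memory hardware would map the media itself into the process.
PATH, SIZE = "xpoint_region.bin", 1 << 20   # 1 MiB stand-in region

with open(PATH, "w+b") as f:
    f.truncate(SIZE)
    with mmap.mmap(f.fileno(), SIZE) as region:
        region[0:5] = b"hello"   # a plain memory store, no read()/write() call
        region.flush()           # the "make it durable" step

with open(PATH, "rb") as f:
    assert f.read(5) == b"hello"  # survives the unmap, unlike a plain ramdisk
[/code]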

Nintendo Kid
Aug 4, 2011

by Smythe
I'm eager to see how well consumer XPoint storage devices perform for long-term storage.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
IIRC Intel wanted to target an IC price point in between NAND flash and DRAM (e: admittedly a wide price range), so the limiting factor seems like it will be how much of it they make.

I think I might also settle for an NVMe solution instead of a DIMM solution. It is the interface that was made to push Optane, after all.

Sidesaddle Cavalry fucked around with this message at 20:57 on Nov 6, 2015

High Protein
Jul 12, 2009

Panty Saluter posted:

Interesting. So would allowing the processor speed to throttle down in "high performance" still yield better results than balanced? IIRC default minimum processor state is 100%

There are hidden options (which can be enabled through the registry) that result in 'high performance' clocking up the CPU quicker than 'balanced' does. So running 'high performance' with the minimum at 0% effectively gives you a more aggressive version of 'balanced'. These settings are different for AC and DC power.

Additionally, when you create a new power plan, it copies over the 'personality' (balanced, high performance, etc.) of the original you based it on, but I'm not sure what that's actually used for.

I've seen an application run slowly because it somehow couldn't convince Windows to raise the CPU speed under the balanced plan.
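
For anyone who just wants the 0% minimum without registry spelunking, powercfg can set that much from an elevated prompt. A minimal sketch, using only documented powercfg aliases (the truly hidden ramp-up settings still need their GUIDs un-hidden first, which this skips):

[code]
import subprocess

# Drop the High Performance plan's minimum processor state to 0% so it can
# clock down at idle, per the above. SCHEME_MIN is the documented powercfg
# alias for High Performance; PROCTHROTTLEMIN is "minimum processor state".
# The hidden registry-only ramp-up settings are deliberately not touched.
def powercfg(*args):
    subprocess.run(["powercfg", *args], check=True)

powercfg("/setacvalueindex", "SCHEME_MIN", "SUB_PROCESSOR", "PROCTHROTTLEMIN", "0")  # AC
powercfg("/setdcvalueindex", "SCHEME_MIN", "SUB_PROCESSOR", "PROCTHROTTLEMIN", "0")  # DC
powercfg("/setactive", "SCHEME_MIN")  # re-apply the plan so the change sticks
[/code]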

Panty Saluter
Jan 17, 2004

Making learning fun!
Trying that now... it's definitely more aggressive about ramping up the CPU.

eggyolk
Nov 8, 2007


Skylake finally prompted me to upgrade. From an i3-2100T to an... i7-3770. Turns out they go for under $200 occasionally.

EdEddnEddy
Apr 5, 2012



eggyolk posted:

Skylake finally prompted me to upgrade. From an i3-2100T to an... i7-3770. Turns out they go for under $200 occasionally.

Nice. That i3 isn't a bad chip, but an upgrade to that i7 has to be night and day.

I went from that i3 to a Pentium 20th Anniversary after it kept throwing a bad core error, which is strange to me for a non-overclocked CPU.

But for $52, getting a Pentium to run at 4.4GHz makes for a great HTPC gaming chip.


Also, I knew I had seen that sleep-mode glitch on my laptop on occasion in every OS from Windows 7 through 8.1, but it was pretty rare and only happened after a long time asleep. A quick reboot seemed to fix it, and since I never (can't really) sleep my desktop because of the overclock, I never run into that issue there.

I will have to check out using this High Performance profile like Balanced and see how it handles a few of my VR apps. Throttling up properly where Balanced does not, while still throttling down at idle, might be worth exploring.

keyframe
Sep 15, 2007

I have seen things
Just built a new PC, coming from an i7-920 (stock) to a Skylake i7. It is amazing how fast this thing is. :allears:

suck my woke dick
Oct 10, 2012

:siren:I CANNOT EJACULATE WITHOUT SEEING NATIVE AMERICANS BRUTALISED!:siren:

Put this cum-loving slave on ignore immediately!

keyframe posted:

Just built a new PC, coming from an i7-920 (stock) to a Skylake i7. It is amazing how fast this thing is. :allears:

Yeah, that was the last great jump in processor performance (Sandy Bridge and on were more incremental for the average user).

EdEddnEddy
Apr 5, 2012



If he had run that 920 at 4.0GHz instead of stock, it wouldn't feel like such a huge jump.

But stock for stock, there is no comparison, of course.


While each generation has gotten a bit faster as well as more power-efficient, I miss the days of real performance bumps like we had from the P4 to the C2D to the i series.

Fallorn
Apr 14, 2005
If you were jumping from an i5-2500K, what processor would you go with right now?

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Fallorn posted:

If you were jumping from an i5-2500K, what processor would you go with right now?

Most would just overclock it.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Fallorn posted:

If you were jumping from an i5-2500K, what processor would you go with right now?

I'm assuming that your upgrade is driven mostly by wanting modern platform features, so I'd say you really want to go for a 6600K or 6700K on H170 or Z170.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

Fallorn posted:

If you were jumping from an i5-2500K, what processor would you go with right now?

If you are buying now, get a Skylake setup.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Fallorn posted:

If you were jumping from an i5-2500K, what processor would you go with right now?
Assuming you can hit 4GHz with your 2500K, you wouldn't. The actual processing improvements up to and including Skylake are pretty minor unless you're looking at a few fairly specific use cases (multi-threaded video rendering probably being the most popular). If all you do is game with it, though, upgrading is wasted money in the CPU department.

Now that's not to say there's no reason to upgrade. Access to native USB 3.0 with Type-C connectors is pretty nice, and will only get nicer as time goes on and more devices support the C-style port. DDR4 isn't really any better than DDR3 right now, but I suppose if you plan on keeping your setup for several years, at some point you should be able to drop in faster DDR4 sticks.

In all seriousness, though, the 2500K is a really hard chip to beat. The 4790K is a solid play if you really feel the need to upgrade, as is the 6600K/6700K, though I personally feel the cost premium of the 6700K is rather high for what you get. If you can manage to wait for the next iteration, I would.

Comedy option: 5775C

EdEddnEddy
Apr 5, 2012



Yeah, the 2500K should be able to hit 4GHz with ease and 4.4-4.6+ with a little tuning. As mentioned above, unless there is a specific feature you really want from a newer chip/chipset, I'd hold off for a bit longer, just get a good cooler if you don't already have one, and enjoy your "free" upgrade (if you haven't overclocked it already).

I built a 2600K system when it was new and was able to overclock it to 4.5GHz, with turbo to 5GHz. It was an amazing chip and still runs great to this day for the guy I built it for.

Anime Schoolgirl
Nov 28, 2002

DrDork posted:

Comedy option: 5775C
*not actually a comedy option

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Anime Schoolgirl posted:

*not actually a comedy option
*true, especially now that you can buy it at WalMart of all places for $375 or so.

Lovable Luciferian
Jul 10, 2007

Flashing my onyx masonic ring at 5 cent wing n trivia night at Dinglers Sports Bar - Ozma
Is there any good reason to go with the 5820K over the 6700K for a gaming build?

Anime Schoolgirl
Nov 28, 2002

Only if you stream or play large amounts of Cities: Skylines.

Slider
Jun 6, 2004

POINTS
Depends on where you're getting the parts and whether you plan on overclocking.

If you can get the 5820K/X99 for nearly the same price as the 6700K/Z170, and are going to overclock, the 5820K is probably the better buy.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Slider posted:

Depends on where you're getting the parts and whether you plan on overclocking.

If you can get the 5820K/X99 for nearly the same price as the 6700K/Z170, and are going to overclock, the 5820K is probably the better buy.

Current X99 boards will also support Broadwell-E with a BIOS update, whenever Intel decides to get around to releasing it.

mobby_6kl
Aug 9, 2009

by Fluffdaddy
Speaking of Broadwell-E, there should be a pretty sweet 10-core chip this time, the i7-6950X: http://arstechnica.com/gadgets/2015/11/intels-broadwell-e-lineup-rumoured-to-feature-monster-10-core-cpu/

Ten cores, 20 threads, and 25MB L3 cache. Maybe.

68Bird
Sep 2, 2004
I could see myself biting on the speedier 6-core processor for my next build.

sincx
Jul 13, 2012

furiously masturbating to anime titties
Skylake-E when? Any hope for mid-2016?

EdEddnEddy
Apr 5, 2012



That's the idea. Around or after summer.

I may have to bite on that 10-core because... I'm stupid.

Also, I hope it overclocks well. I could probably get by with the 8-core one too, so we will see. If I am going to do an X99 build, I really would like to go all out with M.2 SSDs and such as well. I guess it all depends on what VR pushes next year and whether DX12 needs more than my 6-core 3930K can push at its 4.4GHz speed.

A Bad King
Jul 17, 2009


Suppose the oil man,
He comes to town.
And you don't lay money down.

Yet Mr. King,
He killed the thread
The other day.
Well I wonder.
Who's gonna go to Hell?
buy the new thing when the new thing gets extreme and not the old thing because the old thing won't upgrade to a new thing later (especially not a new extreme thing).

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.
Would the rumored i7-6850K (6 cores @ 3.6GHz) work well in gaming vs an i7-2600K @ 4.0GHz?
I do some video encoding where the older CPU hurts me, but I don't want to upgrade my CPU/mobo/RAM only to lose a bunch of frames in games.

vvv: thanks, I think I'll plan an upgrade to Broadwell-E then

Malloc Voidstar fucked around with this message at 10:26 on Nov 17, 2015

BurritoJustice
Oct 9, 2012

Malloc Voidstar posted:

Would the rumored i7-6850K (6 cores @ 3.6GHz) work well in gaming vs an i7-2600K @ 4.0GHz?
I do some video encoding where the older CPU hurts me, but I don't want to upgrade my CPU/mobo/RAM only to lose a bunch of frames in games.

The increase in IPC between Sandy Bridge and Broadwell is greater than the difference in clock speed, so even if you don't overclock the Broadwell-E CPU (you should), you won't lose per-core performance (which is what matters in games).
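
Ballpark numbers, treating the per-generation IPC bumps as rough rules of thumb rather than benchmarks:

[code]
# Rough per-core comparison: 4.0GHz Sandy Bridge vs 3.6GHz Broadwell-E.
# The IPC gains are ballpark estimates, not measured results.
old_clock, new_clock = 4.0, 3.6        # i7-2600K OC vs rumored i7-6850K stock
ipc_gain = 1.05 * 1.10 * 1.05          # ~21% total: SNB->IVB, IVB->HSW, HSW->BDW
per_core = ipc_gain * new_clock / old_clock
print(f"~{(per_core - 1) * 100:.0f}% faster per core, before any overclocking")
[/code]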

Lolcano Eruption
Oct 29, 2007
Volcano of LOL.

sincx posted:

Skylake-E when? Any hope for mid-2016?

Broadwell-E is supposedly out Q1 2016; they're not going to release Skylake-E three months later!

The original Skylake-E release was Q4 2016, but I think they released a new roadmap somewhere showing Q1 2017.

slidebite
Nov 6, 2005

Good egg
:colbert:

It's crazy, but that's pretty much exactly what they did with the mobile processors.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
Grain of salt because it's a roadmap, but slightly more detailed than usual:

[roadmap image]

Here's to hoping that 4+4e LGA Skylake part stays alive to make it to release!

pmchem
Jan 22, 2010


mid-Q4 for Kaby Lake desktop chips, so sad :(

Josh Lyman
May 24, 2009


pmchem posted:

mid-Q4 for Kaby Lake desktop chips, so sad :(
That means Q2 2017 for wide availability and sales.

My 3570K + mobo will be 5 years old at that point.

A Bad King
Jul 17, 2009


Suppose the oil man,
He comes to town.
And you don't lay money down.

Yet Mr. King,
He killed the thread
The other day.
Well I wonder.
Who's gonna go to Hell?
Are we stuck at 2013 performance levels then? Doesn't Intel have a huge amount of cash for research?


Skandranon
Sep 6, 2008
fucking stupid, dont listen to me

A Bad King posted:

Are we stuck at 2013 performance levels then? Doesn't Intel have a huge amount of cash for research?

There are some physical limitations we are running up against with current processor technology. It's going to take something major to break through them.
