The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE

Ignoarints posted:

I was really hoping for "well it was 32GB at least" but 8... I'm sorry Lord Bude.

They only made 2GB DIMMs in those days; otherwise I'm sure my younger idiot self would have bought more. Even running four 2GB DIMMs was challenging on those old Nvidia chipsets.

I still haven't told you about the horrible wind-tunnel Antec 900 case, or the ridiculous 1500W PSU I thought I needed because I had no clue back then.

The Lord Bude fucked around with this message at 15:54 on Mar 19, 2014


Rastor
Jun 2, 2001

I've been there, only instead of PC parts it was $4600 for the first-ever HDTV with LED local dimming. It was selling for less than $1600 within nine months.

Oh well. At least it still works and the picture still looks good.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
I paid top dollar for my flat panel TV and Pioneer DVD player back in 2000. They both still work :aaa:

Ignoarints
Nov 26, 2010

go3 posted:

I paid top dollar for my flat panel TV and Pioneer DVD player back in 2000. They both still work :aaa:

Haha, I remember my dad buying a first-generation Sony DVD player for $550 on :siren: sale :siren:. We got eight free DVDs, including The Matrix, which was incredibly awesome at the time. That thing still works to this day.

Siochain
May 24, 2005

"can they get rid of any humans who are fans of shitheads like Kanye West, 50 Cent, or any other piece of crap "artist" who thinks they're all that?

And also get rid of anyone who has posted retarded shit on the internet."


Ignoarints posted:

Haha, I remember my dad buying a first-generation Sony DVD player for $550 on :siren: sale :siren:. We got eight free DVDs, including The Matrix, which was incredibly awesome at the time. That thing still works to this day.

To continue the derail, we still have a 1986 4-head Sony VCR. The thing still works, but the clock on it only had dates up to 2007. Sony made some incredible products back in the day.

For less derail - it's getting close to time to update my media center. Hopefully the lower TDP will let me run a decent CPU with a passive heatsink, since my Celeron G620 is starting to not cut it - but it did work with a Hyper 212 and no fan.

Gunjin
Apr 27, 2004

Om nom nom

Ignoarints posted:

Haha, I remember my dad buying a first-generation Sony DVD player for $550 on :siren: sale :siren:. We got eight free DVDs, including The Matrix, which was incredibly awesome at the time. That thing still works to this day.

My family paid $99 for a single movie on VHS in 1985.

Ignoarints
Nov 26, 2010
I'm sure he really wanted a VCR in the '80s, but he was in the army then, so he probably had to wait a few years (luckily).

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map
How well is Haswell-E's non-GPU performance predicted to scale over good old Sandy/Ivy/Haswell with DDR4, anyway? I need more fastness for muh MMOs.

td4guy
Jun 13, 2005

I always hated that guy.

Siochain posted:

To continue the derail, we still have a 1986 4-head Sony VCR. The thing still works, but the clock on it only had dates up to 2007.
Set the year to, like, 1997.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.
If Z97 isn't going to have SATA Express, what exactly is it going to have that Z87 doesn't?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Intel has announced some details of their Haswell Refresh and Broadwell lineups:

Intel has announced Haswell Refresh K-edition processors with overclocking enhancements launching in June, most importantly an improved thermal interface between the die and the heat spreader, potentially removing much of the desire to delid the CPU. These CPUs will be paired with motherboards based on the new 9-series chipsets; they will likely be compatible with 8-series chipsets as well, but this is not confirmed.

Intel has announced that the Broadwell desktop CPUs coming in 2015 will include a K-edition processor with Iris Pro graphics.

Intel has announced that Haswell Refresh will include a K-edition Pentium processor, the first unlocked low-end CPU from Intel. This comes after the news that they were expanding QuickSync to include Pentium and Celeron processors. The idea is that an overclocked Pentium would be a very compelling gaming CPU compared to AMD's offerings.

Finally, Intel has confirmed that Haswell-E will be the first 8-core desktop CPU, and shared some other tidbits on the X99 chipset.

SpaceBum
May 1, 2006

Alereon posted:

Intel has confirmed that Haswell-E will be the first 8-core desktop CPU, and shared some other tidbits on the X99 chipset.

I'm thinking this means Intel will have the new Xeon E5 series drop most 4-core products, except for a few low-power versions, and move to a standard of 6 cores - unless Intel is going to push some kind of new Atom server, or has a brain fart and gimps the Xeon line.

Dilbert As FUCK
Sep 8, 2007

by Cowcaster
Pillbug
That's pretty interesting; I was going to do a HW upgrade but I might hold out. No use upgrading a DDR3 system when DDR4 is just around the corner.

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

Dilbert As FUCK posted:

That's pretty interesting; I was going to do a HW upgrade but I might hold out. No use upgrading a DDR3 system when DDR4 is just around the corner.

good luck being an early adopter :shepface:

Ignoarints
Nov 26, 2010
Hooray, called it.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Dilbert As FUCK posted:

That's pretty interesting; I was going to do a HW upgrade but I might hold out. No use upgrading a DDR3 system when DDR4 is just around the corner.
Keep in mind that Broadwell will still be using DDR3. This has got me pretty excited for an upgrade to Haswell Refresh in June.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
God dammit Intel, now I have to wait till June before I buy a new computer :argh:

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Iris Pro was cool when it came out, but dedicated graphics has been steadily improving in performance per watt since then, to the point where Iris Pro only seems remotely relevant in the mobile space - and now Maxwell is starting to hit it there too. If only they'd put Iris Pro in something that ISN'T an expensive quad-core, 8-thread beast.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Remember that Iris Pro comes with 64/128MB of L4 cache, which can have a more general impact on performance. There are also Iris Pro Core i5s.

EoRaptor
Sep 13, 2003



Alereon posted:

Remember that Iris Pro comes with 64/128MB of L4 cache, which can have a more general impact on performance. There are also Iris Pro Core i5s.

I'm more interested in what market Intel is going after with a K series that has Iris Pro. A general rule is that the bigger the die, the lower the overclock potential, and Iris Pro is a huge amount of silicon. Is it separately clocked? Can you turn it off and use the 'area' to help with cooling?

The performance potential of 128MB of fast, local L4 cache is nice, but few programs will really be able to take advantage of it, and the CPU cache control hardware won't be optimized for it, so it may not yield as much benefit as it could.
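
(Not from the thread, but to make the working-set question concrete: below is a minimal Python sketch of a pointer-chasing probe. The sizes and read counts are invented, and Python's object overhead makes the absolute numbers mushy, but the shape of the curve is the point - per-read latency jumps each time the working set outgrows a cache level, and a 128MB L4 pushes that cliff far past the usual ~8MB L3.)

```python
# Pointer-chase microbenchmark: random dependent reads over working
# sets of increasing size. Latency climbs whenever the working set
# outgrows a cache level. Sizes and read counts are illustrative.
import random
import time

def ns_per_read(size_bytes, reads=1_000_000):
    n = size_bytes // 8                  # treat each slot as ~8 bytes
    perm = list(range(n))
    random.shuffle(perm)                 # defeat hardware prefetching
    idx = 0
    start = time.perf_counter()
    for _ in range(reads):
        idx = perm[idx]                  # each read depends on the last
    return (time.perf_counter() - start) / reads * 1e9

for mb in (1, 4, 16, 64):
    print(f"{mb:3d} MB working set: {ns_per_read(mb * 2**20):6.1f} ns/read")
```

(On a Crystalwell part you'd expect the latency step between the L3-sized and L4-sized working sets to be much smaller than on a regular Haswell.)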

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Seamonster posted:

Iris Pro was cool when it came out, but dedicated graphics has been steadily improving in performance per watt since then, to the point where Iris Pro only seems remotely relevant in the mobile space - and now Maxwell is starting to hit it there too. If only they'd put Iris Pro in something that ISN'T an expensive quad-core, 8-thread beast.

Intel have made a real mess of Iris Pro, since it hardly shows up anywhere relevant.

Other than the ASUS Zenbook Infinity, where are my 13" laptops with the i7-4558U? Even then it's only Iris 5100, but that's close to good enough.

Also, unlocked CPU with Iris Pro? No, Intel, that's not what people wanted. People wanted "Intel TSX-NI, Intel VT-d, vPro, and TXT" extensions that were disabled for no particular reason on the K CPUs, and a heatspreader that wasn't the width of the Grand Canyon away from the die.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


HalloKitty posted:

Intel have made a real mess of Iris Pro, since it hardly shows up anywhere relevant.

Other than the ASUS Zenbook Infinity, where are my 13" laptops with the i7-4558U? Even then it's only Iris 5100, but that's close to good enough.

Also, unlocked CPU with Iris Pro? No, Intel, that's not what people wanted. People wanted "Intel TSX-NI, Intel VT-d, vPro, and TXT" extensions that were disabled for no particular reason on the K CPUs, and a heatspreader that wasn't the width of the Grand Canyon away from the die.

I think you can honestly blame Apple for that. Since they bought up all of the initial supply, other OEMs designed their products around the chips that were available; short of doing a mid-gen refresh once parts became available, they were just going to skip them altogether.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
I don't understand why they don't save some money by ditching the integrated GPU from -K processors altogether. Only enthusiasts/gamers have any interest in -K processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real GPU. It's entirely redundant.

ShaneB
Oct 22, 2002


The Lord Bude posted:

I don't understand why they don't save some money by ditching the integrated GPU from -K processors altogether. Only enthusiasts/gamers have any interest in -K processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real GPU. It's entirely redundant.

Maybe it's harder/more expensive to run a separate production line?

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Alereon posted:

Remember that Iris Pro comes with 64/128MB of L4 cache, which can have a more general impact on performance. There are also Iris Pro Core i5s.

In hindsight, I was focusing purely on the graphics capabilities of Iris Pro since it was marketed so heavily as some sort of "this is where dGPUs start to die off," but so many months on that notion seems laughable.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Seamonster posted:

In hindsight, I was focusing purely on the graphics capabilities of Iris Pro since it was marketed so heavily as some sort of "this is where dGPUs start to die off," but so many months on that notion seems laughable.

I saw it more as a decent option in the mobile space to get rid of those low- and mid-range discrete GPUs.

Except it never fulfilled that promise, because there are very few Iris 5100 and Iris Pro 5200 laptops available. (The MacBook Pro does have the 5200, though.)

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

The Lord Bude posted:

I don't understand why they don't save some money by ditching the integrated GPU from -K processors altogether. Only enthusiasts/gamers have any interest in -K processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real GPU. It's entirely redundant.

I still use QuickSync on an offscreen display.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

HalloKitty posted:

I saw it more as a decent option in the mobile space to get rid of those low- and mid-range discrete GPUs.

Except it never fulfilled that promise, because there are very few Iris 5100 and Iris Pro 5200 laptops available. (The MacBook Pro does have the 5200, though.)

Yields are probably not great, and the industry is only just getting out of the race-to-the-bottom mindset in which they ceded the high end, with its high profits, to Apple for dubious market-share gains.

Being able to ditch the dGPU for a one-chip solution is worth a lot, and might even make Iris Pro cheaper.

ShaneB
Oct 22, 2002


deimos posted:

I still use QuickSync on an offscreen display.

Speaking of QuickSync, why can't I enable it in Open Broadcaster Software? I have an i5-4670K. Is there some way to enable the integrated graphics that I don't know about? I went through my BIOS last night and couldn't find anything.

SamDabbers
May 26, 2003



ShaneB posted:

Speaking of QuickSync, why can't I enable it in Open Broadcaster Software? I have an i5-4670K. Is there some way to enable the integrated graphics that I don't know about? I went through my BIOS last night and couldn't find anything.

Only some motherboards support having both integrated and PCIe graphics enabled simultaneously. Is there an option in your BIOS for enabling LucidLogix Virtu support?

ShaneB
Oct 22, 2002


SamDabbers posted:

LucidLogix Virtu support

Don't make things up

SamDabbers
May 26, 2003



Some ASUS motherboards (the P8Z77-V, for one) have a BIOS option for multi-GPU mode, and the caption says "enable this for LucidLogix Virtu." Virtu itself is a shitty gimmick, but the multi-GPU mode option makes QuickSync work in OBS while your dedicated GPU runs your games.

QuickSync encodes look like shit at reasonable bitrates though, so you might not want to do this if you're setting it up for live streaming.
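
(As an aside, once the iGPU is enabled, one way to sanity-check that QuickSync is reachable at all - independent of OBS - is to push a synthetic clip through ffmpeg's QSV encoder. A minimal sketch, assuming an ffmpeg build with QuickSync support is on your PATH:)

```python
# Encode 5 seconds of synthetic video with ffmpeg's h264_qsv encoder.
# A non-zero exit code usually means the iGPU is disabled, the driver
# is missing, or this ffmpeg build lacks QuickSync support.
import subprocess

cmd = [
    "ffmpeg", "-hide_banner",
    "-f", "lavfi", "-i", "testsrc=duration=5:size=1280x720:rate=30",
    "-c:v", "h264_qsv",   # Intel QuickSync H.264 encoder
    "-f", "null", "-",    # discard the output; we only want pass/fail
]
result = subprocess.run(cmd, capture_output=True, text=True)
print("QuickSync OK" if result.returncode == 0 else result.stderr[-500:])
```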

ShaneB
Oct 22, 2002


SamDabbers posted:

Some ASUS motherboards (the P8Z77-V, for one) have a BIOS option for multi-GPU mode, and the caption says "enable this for LucidLogix Virtu." Virtu itself is a shitty gimmick, but the multi-GPU mode option makes QuickSync work in OBS while your dedicated GPU runs your games.

QuickSync encodes look like shit at reasonable bitrates though, so you might not want to do this if you're setting it up for live streaming.

It wasn't for anything besides general experimenting. My PC seems like it can handle 720p 30FPS streaming with no big deal in the tests I've done, though.

JawnV6
Jul 4, 2004

So hot ...

The Lord Bude posted:

I don't understand why they don't save some money by ditching the integrated GPU from -K processors altogether. Only enthusiasts/gamers have any interest in -K processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real GPU. It's entirely redundant.

Processors don't come from an arbitrary creation process. When an architect and a designer love each other very much, they work with a few hundred close friends and come up with a design of architectural blocks and circuit designs that come together to make a processor. They send that over to the fab, where the design is reproduced a few million times. Si isn't a perfect process, and a sizeable chunk of those millions are DOA long before they're even packaged. The ones that survive are "binned". In short, -K variants aren't designed from the ground up; they're the cherry-picked units that happened to get the best Si off the wafer.
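
(To put the binning point in toy form: a minimal sketch in which every die comes off the wafer with some maximum stable clock and the best ones become the -K parts. The frequency distribution and bin cutoffs are completely invented; the sorting mechanism is the point.)

```python
# Toy speed-binning model: dies come off the wafer with a spread of
# maximum stable frequencies, and SKUs are just slices of that spread.
# All numbers here are invented for illustration.
import random
from collections import Counter

random.seed(42)
dies_ghz = [random.gauss(3.6, 0.35) for _ in range(10_000)]

def bin_die(f_max):
    if f_max < 3.0:
        return "scrap / low-end"
    if f_max < 3.5:
        return "locked i5"
    if f_max < 3.9:
        return "locked i7"
    return "i7 -K (cherry-picked)"

print(Counter(bin_die(f) for f in dies_ghz).most_common())
```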

Ignoarints
Nov 26, 2010
I too wondered if there could be cost savings or, perhaps, performance gains from doing away with integrated graphics on K-series chips, but I just assumed it was far more complex than simply leaving something out. In fact, I was actually turned off Intel a while ago when I found out I couldn't get one without integrated graphics (I figured, hey, is that why they cost so much more?), but that was before I actually looked into it.

evilweasel
Aug 24, 2002

Wouldn't the integrated graphics basically be acting as a heatsink?

Ignoarints
Nov 26, 2010

evilweasel posted:

Wouldn't the integrated graphics basically be acting as a heatsink?

I think whatever physical space it uses would get saturated by heat pretty much instantly. If it were being used it'd just create more heat, and if the die were actually smaller I'm sure heat would be more concentrated per *crazy unit of measurement*, but I don't even really mean that. I wondered if leaving it out would leave "room" in the design to make the processor gerbils run better. But I doubt it's something as simple as that.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Ignoarints posted:

I too wondered if there could be cost savings or, perhaps, performance gains from doing away with integrated graphics on K-series chips, but I just assumed it was far more complex than simply leaving something out. In fact, I was actually turned off Intel a while ago when I found out I couldn't get one without integrated graphics (I figured, hey, is that why they cost so much more?), but that was before I actually looked into it.
Intel desktop CPUs aren't limited by physical silicon area, and when the integrated graphics isn't in use it's fully powered down and thus drawing no power, so there's nothing to be gained by removing it. Even if you never plan to use the integrated graphics, the hardware acceleration it offers will come in handy, and having the ability to fall back to integrated graphics for troubleshooting is helpful.

BobHoward
Feb 13, 2012

The only thing white people deserve is a bullet to their empty skull

EoRaptor posted:

I'm more interested in what market Intel is going after with a K series that has Iris Pro. A general rule is that the bigger the die, the lower the overclock potential, and Iris Pro is a huge amount of silicon. Is it separately clocked? Can you turn it off and use the 'area' to help with cooling?

The performance potential of 128MB of fast, local L4 cache is nice, but few programs will really be able to take advantage of it, and the CPU cache control hardware won't be optimized for it, so it may not yield as much benefit as it could.

Overclocking potential: if you're just going to be overclocking the CPU cores, you're still overclocking the same amount of chip area. The GPU clocks are wholly independent, and I believe the GPU also has its own power plane in Haswell (in other words, yes, you can turn it off, by not using it).

Also, a major reason for reduced OC potential on a larger die is process variation: transistors in one corner may not perform as well as those in the opposite corner. The Iris Pro die is much more square than the regular Haswell dies, which have such a rectangular shape that their diagonal may actually be longer.
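
(The geometry is easy to check with toy numbers: for a fixed die area, the closer the die is to square, the shorter the corner-to-corner diagonal that process variation has to span. A minimal sketch with invented dimensions:)

```python
# For equal area, a long skinny rectangle has a longer diagonal than
# a square, so its far corners see more process variation. The area
# and aspect ratio below are invented for illustration.
import math

area_mm2 = 180.0
square = (math.sqrt(area_mm2), math.sqrt(area_mm2))   # ~13.4 x 13.4 mm
skinny = (area_mm2 / 7.0, 7.0)                        # ~25.7 x 7.0 mm

for name, (w, h) in (("square", square), ("skinny", skinny)):
    print(f"{name} die: {w:.1f} x {h:.1f} mm, diagonal {math.hypot(w, h):.1f} mm")
```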

Cache: what basis do you have for claiming the CPU's cache hierarchy isn't "optimized" for the L4 cache? And plenty of programs will benefit from the L4. Not all types of programs by any means, but it's a pretty nice thing to have.

bull3964 posted:

I think you can honestly blame Apple for that. Since they bought up all of the initial supply, other OEMs designed their products around the chips that were available; short of doing a mid-gen refresh once parts became available, they were just going to skip them altogether.

I don't think you can blame Apple for it at all. The "initial supply" argument puts the cart before the horse. If Intel was being run in a halfway competent fashion at that time, I'm sure they chose how to allocate wafer start capacity based on what each OEM was interested in ordering, and in what quantity. If other OEMs had been seriously interested, I'm sure they could have had supply at launch. IMO, the only sense in which you can blame Apple is that Apple has the high end laptop market sewn up so completely that high volume PC OEMs aren't trying real hard to be there.

One cause of that problem: it's hard for PC OEMs to sell stuff like this. Iris Pro 5200 is more expensive and clocked slightly lower than similar ordinary mobile quadcore Haswells. Even though the L4 cache largely makes up for the clock speed, and then some, this combo is hard for PC OEMs to sell to the public. Apple has the luxury of being able to offer something no PC OEM can (OS X), and markets largely by saying "our stuff is awesome, buy it". PC OEMs have to market mostly on specs, and it's tough to inform nontechnical consumers that they're actually better off with slightly less MHz for more money.

HalloKitty posted:

Also, unlocked CPU with Iris Pro? No, Intel, that's not what people wanted. People wanted "Intel TSX-NI, Intel VT-d, vPro, and TXT" extensions that were disabled for no particular reason on the K CPUs, and a heatspreader that wasn't the width of the Grand Canyon away from the die.

What relevance do those named features have to the average overclocking enthusiast? As far as I can tell, fuck all, except maybe TSX-NI (and even then, not much, and not today). VT-d is only useful if you're a heavy user of virtualization and you need the VMs to have native I/O performance. vPro/TXT are enterprise management features.

The L4 cache, on the other hand, actually has a chance of being somewhat relevant. It'd be interesting to see someone do a serious study of how it affects CPU-limited gaming, for example.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

evilweasel posted:

Wouldn't the integrated graphics basically be acting as a heatsink?

When not in use, bet your sass it does! Ask anyone with a 2600K/2700K in a P67 board about the really cool core. I've got arguably the world's best air heatsink bolted to my 2600K, and there's still one core that is consistently, under load, a solid 7-10°C cooler than the others.

However, it really only works on whatever it's adjacent to, so I don't think it does much for the overall heat or stability of the chip. If the cores were mounted inside the integrated graphics rather than adjacent to it... wait, how is Haswell laid out, again? Is it four cores in a row meeting the IGP block, like Sandy Bridge, or did they get fancy with it?
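
(If you want to spot that cool core yourself, per-core temperatures are easy to pull. A minimal sketch, assuming Linux with the coretemp driver loaded and the third-party psutil package installed:)

```python
# Print per-core temperatures. On a chip whose iGPU sits idle, the
# core next to it often reads a few degrees cooler under load.
import psutil  # pip install psutil; needs Linux + the coretemp driver

for sensor in psutil.sensors_temperatures().get("coretemp", []):
    if sensor.label.startswith("Core"):
        print(f"{sensor.label}: {sensor.current:.0f} °C")
```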
