|
Ignoarints posted:I was really hoping for "well it was 32GB at least" but 8... I'm sorry Lord Bude. They only made 2GB DIMMs in those days. Otherwise I'm sure my younger idiot self would have bought more. Even using four 2GB DIMMs was challenging on those old Nvidia chipsets. I still haven't told you about the horrible wind-tunnel Antec 900 case or the ridiculous 1500W PSU I thought I needed because I had no clue back then. The Lord Bude fucked around with this message at 15:54 on Mar 19, 2014
# ? Mar 19, 2014 15:34 |
|
|
I've been there, only instead of PC parts it was $4600 for the first ever HDTV with LED local dimming. Within nine months it was selling for under $1600. Oh well. At least it still works and the picture still looks good.
|
# ? Mar 19, 2014 15:43 |
|
I paid top dollar for my flat panel TV and Pioneer DVD player back in 2000. They both still work
|
# ? Mar 19, 2014 15:58 |
go3 posted:I paid top dollar for my flat panel TV and Pioneer DVD player back in 2000. They both still work Haha, I remember my dad buying a first-generation Sony DVD player for $550 on sale. We got eight free DVDs, including The Matrix, which was incredibly awesome at the time. That thing still works to this day
|
|
# ? Mar 19, 2014 18:36 |
|
Ignoarints posted:Haha, I remember my dad buying a first-generation Sony DVD player for $550 on sale. We got eight free DVDs, including The Matrix, which was incredibly awesome at the time. That thing still works to this day To continue the derail, we still have a 1986 4-head Sony VCR. The thing still works, but the clock on it only had dates up to 2007. Sony made some incredible products back in the day. For less derail - it's getting close to time to update my media center. Hopefully the lower TDP will let me have a decent CPU with a passive heatsink, since my Celeron G620 is starting to not cut it - but it worked with a Hyper 212 and no fan.
|
# ? Mar 19, 2014 19:30 |
|
Ignoarints posted:Haha, I remember my dad buying a first-generation Sony DVD player for $550 on sale. We got eight free DVDs, including The Matrix, which was incredibly awesome at the time. That thing still works to this day My family paid $99 for a single movie on VHS in 1985.
|
# ? Mar 19, 2014 19:37 |
I'm sure he really wanted a VCR in the 80's, but he was in the army then so he probably had to wait a few years (luckily)
|
|
# ? Mar 19, 2014 20:26 |
|
How well is Haswell-E's non-GPU performance predicted to scale over good old Sandy/Ivy/Haswell with DDR4 anyways? I need more fastness for muh MMOs
|
# ? Mar 19, 2014 21:49 |
|
Siochain posted:To continue the derail, we still have a 1986 4-head Sony VCR. The thing still works, but the clock on it only had dates up to 2007.
|
# ? Mar 20, 2014 00:48 |
|
If Z97 isn't going to have SATA Express, what exactly is it going to have that Z87 doesn't?
|
# ? Mar 20, 2014 00:59 |
|
Intel has announced some details of their Haswell Refresh and Broadwell lineups: Haswell Refresh K-edition processors with overclocking enhancements launch in June, most importantly with an improved thermal interface between the die and the heat spreader, potentially removing much of the desire to delid the CPU. These CPUs will be paired with motherboards based on the new 9-series chipsets; they will likely be compatible with 8-series chipsets as well, but this is not confirmed. The Broadwell desktop CPUs coming in 2015 will include a K-edition processor with Iris Pro graphics. Haswell Refresh will also include a K-edition Pentium processor, the first unlocked low-end CPU from Intel; this comes after the news that QuickSync is being expanded to Pentium and Celeron processors, the idea being that an overclocked Pentium would be a very compelling gaming CPU compared to AMD's offerings. Finally, Intel has confirmed that Haswell-E will be the first 8-core desktop CPU, along with some other tidbits on the X99 chipset.
|
# ? Mar 20, 2014 01:48 |
|
Alereon posted:Intel has confirmed that Haswell-E will be the first 8-core desktop CPU, as well as some other tidbits on the X99 chipset. I'm thinking this means Intel will have the new Xeon E5 series drop most 4-core products except for a few low-power versions, moving to a standard of 6 cores, unless Intel decides to push some kind of new Atom server, has a brain fart, and gimps the Xeon line.
|
# ? Mar 20, 2014 02:10 |
|
That's pretty interesting; I was going to do a HW upgrade but I might hold out. No use upgrading a DDR3 system when DDR4 is just around the corner.
|
# ? Mar 20, 2014 02:13 |
|
Dilbert As gently caress posted:That's pretty interesting; I was going to do a HW upgrade but I might hold out. No use upgrading a DDR3 system when DDR4 is just around the corner. good luck being an early adopter
|
# ? Mar 20, 2014 02:23 |
Hooray called it
|
|
# ? Mar 20, 2014 02:24 |
|
Dilbert As gently caress posted:That's pretty interesting; I was going to do a HW upgrade but I might hold out. No use upgrading a DDR3 system when DDR4 is just around the corner.
|
# ? Mar 20, 2014 02:41 |
|
God dammit intel, now I have to wait till June before I buy a new computer
|
# ? Mar 20, 2014 02:44 |
|
Iris Pro was cool when it came out, but since then dedicated graphics has been steadily improving in performance per watt, to the point that Iris Pro only seems remotely relevant in the mobile space, and even there Maxwell is starting to hit it too. If only they'd put Iris Pro in something that ISN'T an expensive quad-core, 8-thread beast.
|
# ? Mar 20, 2014 14:39 |
|
Remember that Iris Pro comes with 64/128MB of L4 cache, which can have a more general impact on performance. There are also Iris Pro Core i5s.
|
# ? Mar 20, 2014 15:27 |
|
Alereon posted:Remember that Iris Pro comes with 64/128MB of L4 cache, which can have a more general impact on performance. There are also Iris Pro Core i5s. I'm more interested in what market Intel is going after with a K series that has Iris Pro. A general rule is that the bigger the die, the lower the overclock potential, and Iris Pro is a huge amount of silicon. Is it separately clocked? Can you turn it off and use the 'area' to help with cooling? The performance potential of 128MB of fast, local L4 cache is nice, but few programs will really be able to take advantage of it, and the CPU's cache control hardware won't be optimized for it, so it may not yield as much benefit as it could.
|
# ? Mar 20, 2014 16:03 |
|
Seamonster posted:Iris Pro was cool when it came out but since that time dedicated graphics has been steadily improving in performance/watt to the point that it only seems remotely relevant in the mobile space and even now Maxwell is just starting to hit it there too. If only they'd put Iris Pro in something that ISN'T a quad core, 8 thread expensive beast. Intel have made a real mess of Iris Pro, since it hardly shows up anywhere relevant. Other than the ASUS Zenbook Infinity, where are my 13" laptops with i7-4558U? Even then it's only Iris Pro 5100, but that's close to good enough. Also, unlocked CPU with Iris Pro? No, Intel, that's not what people wanted. People wanted "Intel TSX-NI, Intel VT-d, vPro, and TXT" extensions that were disabled for no particular reason on the K CPUs, and a heatspreader that wasn't the width of the Grand Canyon away from the die.
|
# ? Mar 20, 2014 16:05 |
|
HalloKitty posted:Intel have made a real mess of Iris Pro, since it hardly shows up anywhere relevant. I think you can honestly blame Apple for that. Since they bought up all the initial supply, other OEMs designed their products around the available chips. Short of doing a mid-gen refresh once the parts became available, they are just going to skip them altogether.
|
# ? Mar 20, 2014 16:20 |
|
I don't understand why they don't save some money by ditching the integrated gpu from -k processors altogether. Only enthusiasts/gamers have any interest in -k processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real gpu. It's entirely redundant.
|
# ? Mar 20, 2014 16:24 |
|
The Lord Bude posted:I don't understand why they don't save some money by ditching the integrated gpu from -k processors altogether. Only enthusiasts/gamers have any interest in -k processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real gpu. It's entirely redundant. Maybe harder/more expensive to have a separate production line?
|
# ? Mar 20, 2014 16:26 |
|
Alereon posted:Remember that Iris Pro comes with 64/128MB of L4 cache which can have a more general impact on performance. There also are Iris Pro Core i5s. In hindsight, I was focusing purely on the graphics capabilities of Iris Pro since it was marketed so heavily as some sort of "this is where dGPUs start to die off," but so many months on that notion seems laughable.
|
# ? Mar 20, 2014 16:43 |
|
Seamonster posted:In hindsight, I was focusing purely on the graphics capabilities of Iris Pro since it was marketed so heavily as some sort of "this is where dGPUs start to die off," but so many months on that notion seems laughable. I saw it more as a decent option in the mobile space to get rid of those low- and mid-range discrete GPUs. Except it never fulfilled that promise, because there are very few Iris Pro 5100 and 5200 laptops available. (The MacBook Pro does have the 5200, though.)
|
# ? Mar 20, 2014 16:45 |
|
The Lord Bude posted:I don't understand why they don't save some money by ditching the integrated gpu from -k processors altogether. Only enthusiasts/gamers have any interest in -k processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real gpu. It's entirely redundant. I still use quicksync on an offscreen.
|
# ? Mar 20, 2014 16:53 |
|
HalloKitty posted:I saw it more of a decent option in the mobile space to get rid of those low and mid range discrete GPUs. Yields are probably not great and the industry is only just getting out of the race to the bottom mindset where they ceded the high end with its high profits to Apple for dubious market share gains. Being able to ditch the dGPU for a one chip solution is worth a lot and might even make iris pro cheaper.
|
# ? Mar 20, 2014 16:53 |
|
deimos posted:I still use quicksync on an offscreen. Speaking of quicksync, why can't I enable it in Open Broadcaster Software? I have an i5-4670K. Is there some way I enable the stock video chipset that I don't know about? I went through my BIOS last night and couldn't find anything.
|
# ? Mar 20, 2014 17:05 |
|
ShaneB posted:Speaking of quicksync, why can't I enable it in Open Broadcaster Software? I have an i5-4670K. Is there some way I enable the stock video chipset that I don't know about? I went through my BIOS last night and couldn't find anything. Only some motherboards support both integrated and PCI-e graphics enabled simultaneously. Is there an option in your BIOS for enabling LucidLogix Virtu support?
|
# ? Mar 20, 2014 17:20 |
|
SamDabbers posted:LucidLogix Virtu support Don't make things up
|
# ? Mar 20, 2014 17:22 |
|
Some ASUS motherboards (P8Z77-V, for one) have a BIOS option for multi-GPU mode and the caption says "enable this for LucidLogix Virtu." It's a lovely gimmick, but the multi-GPU mode option makes QuickSync work in OBS while your dedicated GPU runs your games. QuickSync encodes look like poo poo at reasonable bitrates though, so you might not want to do this if you're setting it up for live streaming.
|
# ? Mar 20, 2014 17:31 |
|
SamDabbers posted:Some ASUS motherboards (P8Z77-v, for one) have a BIOS option for multi-GPU mode and the caption says "enable this for LucidLogix Virtu." It's a lovely gimmick, but the multi-GPU mode option makes QuickSync work in OBS while your dedicated GPU runs your games. It wasn't for anything besides general experimenting. My PC seems like it can handle 720p 30FPS streaming with no big deal in the tests I've done, though.
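For a rough sense of what "720p 30FPS streaming" costs in bandwidth, here's a quick sketch of the bits-per-pixel heuristic sometimes used to ballpark encoder bitrates. The 0.1 bpp figure is an assumption for illustration, not an OBS or QuickSync default.

```python
# Ballpark a video bitrate from resolution and framerate using a
# bits-per-pixel (bpp) heuristic. The 0.1 bpp value is an assumed
# rule of thumb for H.264, not a setting from any particular encoder.
def estimate_bitrate_kbps(width, height, fps, bpp=0.1):
    """Return an estimated video bitrate in kbit/s."""
    return width * height * fps * bpp / 1000

# 720p30, as discussed above:
print(estimate_bitrate_kbps(1280, 720, 30))
```

By this estimate, 720p30 lands in the high-2000s of kbit/s, which matches the kind of upload most home connections of the era could sustain.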
|
# ? Mar 20, 2014 17:45 |
|
The Lord Bude posted:I don't understand why they don't save some money by ditching the integrated gpu from -k processors altogether. Only enthusiasts/gamers have any interest in -k processors anyhow, and there isn't a single one of us on the face of the earth who wouldn't have a real gpu. It's entirely redundant. Processors don't come from an arbitrary creation process. When an architect and a designer love each other very much, they work with a few hundred close friends and come up with a design of architectural blocks and circuit designs that come together to make a processor. They send that over to the fab, where that design is reproduced a few million times. Si isn't a perfect process, and a sizeable chunk of those millions are DOA long before they're even packaged. The ones that survive are "binned". In short, -k variants aren't designed from the ground up; they're the cherry-picked units that happened to get the best Si off the wafer.
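The binning described above can be sketched as a toy simulation: every die gets a random "silicon quality" score, defective dies are discarded, and the best survivors are sold as the unlocked -K parts. The defect cutoff and bin fraction here are made-up numbers purely for illustration.

```python
import random

# Toy sketch of die binning. Dies below the defect cutoff are DOA;
# the top slice of the survivors becomes the -K bin. All parameters
# (defect cutoff, K fraction) are invented for this example.
def bin_wafer(n_dies, defect_cutoff=0.2, k_fraction=0.1, seed=42):
    rng = random.Random(seed)
    quality = [rng.random() for _ in range(n_dies)]
    working = sorted(q for q in quality if q > defect_cutoff)
    n_k = int(len(working) * k_fraction)
    return {
        "k_parts": working[len(working) - n_k:],   # cherry-picked best silicon
        "regular": working[:len(working) - n_k],
    }

bins = bin_wafer(1000)
print(len(bins["k_parts"]), "K parts,", len(bins["regular"]), "regular")
```

The point of the sketch: the -K parts don't come off a separate production line, they're just the upper tail of the same distribution, which is why "ditching the iGPU from -K chips" would actually mean designing and fabbing a whole separate die.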
|
# ? Mar 20, 2014 17:48 |
I too wondered if there could be cost savings or, perhaps, performance gains from doing away with integrated graphics on k series chips, but I just assumed it was far more complex than just leaving something out. In fact I was actually turned off on Intel a while ago when I found out I couldn't get one without integrated graphics (I figured, hey is that why they cost so much more?) but that was before I actually looked into it.
|
|
# ? Mar 20, 2014 18:03 |
|
Wouldn't the integrated graphics basically be acting as a heatsink?
|
# ? Mar 20, 2014 18:55 |
evilweasel posted:Wouldn't the integrated graphics basically be acting as a heatsink? I think whatever physical space it uses would get saturated by heat pretty much instantly. Now if it were being used it'd just create more heat, and if the die were actually smaller I'm sure heat would be more concentrated per *crazy unit of measurement*, but I don't even really mean that. I wondered if leaving it out would leave "room" in the design to make the processor gerbils run better. But I kind of doubt it's something as simple as that.
|
|
# ? Mar 20, 2014 19:05 |
|
Ignoarints posted:I too wondered if there could be cost savings or, perhaps, performance gains from doing away with integrated graphics on k series chips, but I just assumed it was far more complex than just leaving something out. In fact I was actually turned off on Intel a while ago when I found out I couldn't get one without integrated graphics (I figured, hey is that why they cost so much more?) but that was before I actually looked into it.
|
# ? Mar 20, 2014 20:36 |
|
EoRaptor posted:I'm more interested in what market intel is going after with a K series that has Iris Pro. A general rule is that the bigger the die, the lower the overclock potential, and Iris Pro is a huge amount of silicon. Is it separately clocked? Can you turn it off and use the 'area' to help with cooling? Overclocking potential: if you're just going to be OCing the CPU cores, you're still overclocking the same amount of chip area. The GPU clocks are wholly independent, and I believe the GPU also has its own power plane in Haswell (in other words, yes, you can turn it off, by not using it). Also, a major reason for reduced OC potential on a larger die is process variation: transistors on one corner may not perform as well as those on the opposite corner. The Iris Pro die is much more square than the regular Haswell dies, which have such a rectangular shape that the diagonal may actually be larger. Cache: what basis do you have for claiming the CPU's cache hierarchy isn't "optimized" for the L4 cache? And plenty of programs will benefit from the L4. Not all types of programs by any means, but it's a pretty nice thing to have. bull3964 posted:I think you can honestly blame Apple for that. Since they bought up all the initial supply, other OEMs designed their products around the available chips. Short of doing a mid-gen refresh once the parts became available, they are just going to skip them altogether. I don't think you can blame Apple for it at all. The "initial supply" argument puts the cart before the horse. If Intel was being run in a halfway competent fashion at that time, I'm sure they chose how to allocate wafer start capacity based on what each OEM was interested in ordering, and in what quantity. If other OEMs had been seriously interested, I'm sure they could have had supply at launch. IMO, the only sense in which you can blame Apple is that Apple has the high-end laptop market sewn up so completely that high-volume PC OEMs aren't trying real hard to be there.
One cause of that problem: it's hard for PC OEMs to sell stuff like this. Iris Pro 5200 is more expensive and clocked slightly lower than similar ordinary mobile quad-core Haswells. Even though the L4 cache largely makes up for the clock speed, and then some, this combo is hard for PC OEMs to sell to the public. Apple has the luxury of being able to offer something no PC OEM can (OS X), and markets largely by saying "our stuff is awesome, buy it". PC OEMs have to market mostly on specs, and it's tough to convince nontechnical consumers that they're actually better off with slightly fewer MHz for more money. HalloKitty posted:Also, unlocked CPU with Iris Pro? No, Intel, that's not what people wanted. People wanted "Intel TSX-NI, Intel VT-d, vPro, and TXT" extensions that were disabled for no particular reason on the K CPUs, and a heatspreader that wasn't the width of the Grand Canyon away from the die. What relevance do those named features have to the average overclocking enthusiast? Far as I can tell, gently caress all, except maybe TSX-NI (and even then, not much and not today). VT-d is only useful if you're a heavy user of virtualization and you need the VMs to have native I/O performance. vPro / TXT are enterprise management features. The L4 cache, on the other hand, actually has a chance of being somewhat relevant. It'd be interesting to see someone do a serious study of how it affects CPU-limited gaming, for example.
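To see why a big L4 "has a chance of being somewhat relevant," here's a back-of-the-envelope average-memory-access-time (AMAT) sketch. Every hit rate and latency below is an illustrative assumption, not a measured Haswell/Crystalwell figure.

```python
# AMAT sketch: each access pays the latency of every cache level it
# reaches; the fraction reaching deeper levels shrinks with each hit
# rate. All numbers are invented to illustrate a cache-unfriendly case.
def amat(levels):
    """levels: list of (hit_rate, latency_ns); the final level (DRAM)
    should have hit_rate 1.0 so every access is eventually satisfied."""
    time, miss_frac = 0.0, 1.0
    for hit_rate, latency_ns in levels:
        time += miss_frac * latency_ns   # accesses that reach this level
        miss_frac *= (1.0 - hit_rate)    # fraction that miss and go deeper
    return time

# A workload with a working set too big for L3, so L1/L2/L3 miss a lot:
no_l4 = [(0.80, 1), (0.50, 4), (0.30, 12), (1.00, 80)]
with_l4 = [(0.80, 1), (0.50, 4), (0.30, 12), (0.60, 30), (1.00, 80)]
print(amat(no_l4), amat(with_l4))  # the L4 variant comes out lower
```

With these assumed numbers the L4 absorbs most of what would have been DRAM trips, which is the mechanism behind the "largely makes up for the clock speed" claim; for workloads that already fit in L3, the extra level buys roughly nothing.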
|
# ? Mar 20, 2014 21:42 |
|
|
evilweasel posted:Wouldn't the integrated graphics basically be acting as a heatsink? When not in use, bet your sass it does! Ask anyone with a 2600K/2700K in a P67 about the really cool core. I've got arguably the world's best air heatsink bolted to my 2600K and there's still one core that is consistently, under load, a solid 7-10°C cooler than the others. However, it really only works on whatever it's adjacent to, so I don't think it does much for the overall heat or stability of the chip. If the cores were mounted inside the integrated graphics rather than adjacent to them... Wait, how is Haswell laid out, again? Is it the four-in-a-row that then meets the IGP block like Sandy Bridge, or did they get fancy with it?
|
# ? Mar 20, 2014 21:43 |