SCheeseman
Apr 23, 2003

There were price wars back then too; Intel did cut prices in response to AMD's offerings (particularly during the P4 years).

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Anime Schoolgirl posted:

They're still going to do that; Intel has enjoyed a "premium" reputation no matter where you look, even back in the Athlon XP days.

Yeah, running a ~copper~ Thunderbird 'back in the day' felt almost counter-cultural.

SuperDucky
May 13, 2007

by exmarx

BIG HEADLINE posted:

Yeah, running a ~copper~ Thunderbird 'back in the day' felt almost counter-cultural.

My coppermine was probably my favorite chip. Such a scrappy lil' core.

Y'all know more than I remembered was possible pre-NDA, but that probably has more to do with protracted development schedules these days. That said, if you're looking for desktop/workstation multimedia performance and a long-term, ~5 year solution, Kaby will be a good place to land until some of the upcoming technologies, like non-volatile storage on the memory bus, settle on a standard.

apropos man
Sep 5, 2016

You get a hundred and forty one thousand years and you're out in eight!

Anime Schoolgirl posted:

They're still going to do that; Intel has enjoyed a "premium" reputation no matter where you look, even back in the Athlon XP days.

True dat. I watched the Linus Tech Tips video yesterday where he had a run-through of what all the viewers have been buying for Nov 2016. I had a stab at guessing what the top CPU would be during the intro. I went high-end with my guess and I was right: the 6700K. That's a lot of money Intel are raking in from just one (significant) section of the enthusiast community.

https://www.youtube.com/watch?v=227fgWPPPNU

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

SuperDucky posted:

My coppermine was probably my favorite chip. Such a scrappy lil' core.

Y'all know more than I remembered was possible pre-NDA, but that probably has more to do with protracted development schedules these days. That said, if you're looking for desktop/workstation multimedia performance and a long-term, ~5 year solution, Kaby will be a good place to land until some of the upcoming technologies, like non-volatile storage on the memory bus, settle on a standard.

But Coppermine was a Pentium III. I had one, 1GHz, ran it at 1.4. Great chip back in the day.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
There were two Thunderbird core variants: everything under 950MHz was aluminum, and 950MHz and above was copper.

NihilismNow
Aug 31, 2003

BIG HEADLINE posted:

There were two Thunderbird core variants: everything under 950MHz was aluminum, and 950MHz and above was copper.

And the P3 "Coppermine" used aluminium interconnects. They got a bit of poo poo for that codename in some magazines at the time because people thought it was misleading.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
Ah, the Coppermine, that was my second dual-processor rig. Thing outlasted and out-performed a lot of Pentium 4s in non-gaming tasks. Intel stopped supporting multiprocessing in consumer CPUs after that. I sat it out a long while until AMD started with the dual cores.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

Combat Pretzel posted:

Ah, the Coppermine, that was my second dual-processor rig. Thing outlasted and out-performed a lot of Pentium 4s in non-gaming tasks. Intel stopped supporting multiprocessing in consumer CPUs after that. I sat it out a long while until AMD started with the dual cores.

Same here, I ran my dual-PIII-850 box until AMD Socket 939 came out. I also had a 1.4@1.7 Tualatin P-3 gaming rig that poo poo all over P-4s for years until the 3.0 P-4s came out, and those things were furnaces.

gently caress Netburst, forever.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE
Are there any refreshes expected for the Avoton series? I'm trying to plan out a NAS upgrade and I'd like ECC RAM for ZFS.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.

Paul MaudDib posted:

Are there any refreshes expected for the Avoton series? I'm trying to plan out a NAS upgrade and I'd like ECC RAM for ZFS.

http://www.anandtech.com/show/10489/spot-the-denverton-atom-c3000-silicon-on-display

Spotted an early sample months ago, but still not available at retail. These chips are really slow to work their way to small-scale distributors too. If you're after a luxury NAS in a small power budget, look at Xeon-D stuff.

SuperDucky
May 13, 2007

by exmarx

Twerk from Home posted:

http://www.anandtech.com/show/10489/spot-the-denverton-atom-c3000-silicon-on-display

Spotted an early sample months ago, but still not available at retail. These chips are really slow to work their way to small-scale distributors too. If you're after a luxury NAS in a small power budget, look at Xeon-D stuff.

E3-1225v3 would be a sweet spot for that application.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

SuperDucky posted:

E3-1225v3 would be a sweet spot for that application.

Since Haswell you don't even need a Xeon for ECC if you're OK settling for a dual-core (dual-core Xeons are no longer a thing, and the i3s picked up ECC support). I've been eyeing this Lenovo micro-ATX server with an i3-4150 for $190 but haven't quite been able to justify it yet, since my embedded N3150 board is doing great. It only has 4 drive bays, though, which could be an issue for a NAS if you want a lot of drives.

Eletriarnation fucked around with this message at 06:41 on Nov 8, 2016

EdEddnEddy
Apr 5, 2012



Man, I wish I knew about P3 dual-CPU setups back then, that would have been fun to mess with. I still have my old P3 933 sitting next to my desk for some old-school non-DOSBox gaming lol. Though getting a real SB16 card to work for DOS games is a PITA since PCI apparently isn't the way to do it, and I lose interest after a day or two wasted trying. Ugh. MW2 will always be a royal pain in the rear end to play natively.

On the next rig I made, a P4 1.8GHz with 256M of RDRAM, I could swear that thing ran snappier and faster in some ways than the P4 3.0GHz Northwood I built to replace it. I assume it had something to do with how genuinely fast RDRAM was, being PC800 vs DDR1's PC400? Outside of how drat expensive that stuff was, that system lived way past its prime (it finally got retired as an office computer like a year ago). The P4 3.0 is still used by a friend as a light web-browsing rig. :stare:

I like building rigs that last, even if the P4 is total crap in comparison to today's goodness.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Intel owes a ton of props to their Israeli branch, who came up with the Banias microarchitecture, then Merom, and also the Sandy Bridge and Ivy Bridge chips. They're the ones who saved Intel from Netburst hell.

Shalom, fellas - thanks for the 2500K: http://www.zdnet.com/article/israel-inside-a-history-of-intels-r-d-in-israel/

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
I really like the Xeon-D and would love an 8 core for a home server/NAS. 4 core would be more than sufficient really, but 8 core, yessss.

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

EdEddnEddy posted:

Man, I wish I knew about P3 dual-CPU setups back then, that would have been fun to mess with. I still have my old P3 933 sitting next to my desk for some old-school non-DOSBox gaming lol. Though getting a real SB16 card to work for DOS games is a PITA since PCI apparently isn't the way to do it, and I lose interest after a day or two wasted trying. Ugh. MW2 will always be a royal pain in the rear end to play natively.

On the next rig I made, a P4 1.8GHz with 256M of RDRAM, I could swear that thing ran snappier and faster in some ways than the P4 3.0GHz Northwood I built to replace it. I assume it had something to do with how genuinely fast RDRAM was, being PC800 vs DDR1's PC400? Outside of how drat expensive that stuff was, that system lived way past its prime (it finally got retired as an office computer like a year ago). The P4 3.0 is still used by a friend as a light web-browsing rig. :stare:

I like building rigs that last, even if the P4 is total crap in comparison to today's goodness.

I have an Alienware desktop from 2004 that came with a 3.0GHz Northwood P4 and 2x256MB of PC400 400MHz DDR. The Northwood actually wasn't bad in my experience, I was able to get it up to at least 3.6 but it did a fine job for the time even at stock. Fortunately, Alienware used what seems to be ASUS' best Socket 478 motherboard ever and it supports all kinds of cool stuff including a Socket 479 adapter. Some goon was selling said adapter on SA-Mart back in 2006 along with a Pentium M 725 for $160 and I snapped it up, then overclocked the Pentium M from 1.6 to 2.56. It was incredibly fast for the time, faster for games than virtually any other processor, and only needed a little aluminum heatsink with a 70mm fan. I used that machine all the way from summer 2004 until the end of 2008, when I upgraded to an i7-920. Probably at least 5 times as fast as my Pentium M, and I'll never see that kind of jump again.

The Alienware Pentium M system still works, too. I have it maxed out with 4GB of ECC and running Windows 10 32-bit off of a SATA 1 SSD, but the AGP slot and inability to run anything with more than one core limits its usefulness.

Eletriarnation fucked around with this message at 15:11 on Nov 10, 2016

EdEddnEddy
Apr 5, 2012



Eletriarnation posted:

I have an Alienware desktop from 2004 that came with a 3.0GHz Northwood P4 and 2x256MB of PC400. The Northwood actually wasn't bad in my experience, I was able to get it up to at least 3.6 but it did a fine job for the time even at stock. Fortunately, Alienware used what seems to be ASUS' best Socket 478 motherboard ever and it supports all kinds of cool stuff including a Socket 479 adapter. Some goon was selling said adapter on SA-Mart back in 2006 along with a Pentium M 725 for $160 and I snapped it up, then overclocked the Pentium M from 1.6 to 2.56. It was incredibly fast for the time, faster for games than virtually any other processor, and only needed a little aluminum heatsink with a 70mm fan. I used that machine all the way from summer 2004 until the end of 2008, when I upgraded to an i7-920. Probably at least 5 times as fast as my Pentium M, and I'll never see that kind of jump again.

The Alienware Pentium M system still works, too. I have it maxed out with 4GB of ECC and running Windows 10 32-bit off of a SATA 1 SSD, but the AGP slot and inability to run anything with more than one core limits its usefulness.

That is pretty awesome. I stuck around on that P4 3.0 for quite a while (and ended up swapping the Northwood for a Prescott, since someone else I was upgrading had a motherboard that supported the Northwood but not the Prescott for whatever reason), so I got 64-bit capabilities and held on with 2GB of DDR1 for as long as I could until I got a Q9550. That was a major jump as well. The i7 920 of course was more, but we both got one of the last major jumps we were going to feel. The move from my overclocked Q9550 to my 3930K didn't "feel" like as big of a difference, though the major RAM bump (from 8GB to 32GB) and the extra cores definitely help in multitasking and VM usage scenarios that you can feel.

I have no idea how I was able to run Adobe Premiere like I did on that P4. But it was all pre-HD footage back then anyway, so I guess it makes sense.

I actually have an old LGA775 ASRock board here that I got years ago for SnG to play with some old hardware, as it has slots for both DDR1 and DDR2 RAM, and both AGP and PCI-E slots. It's not going to be anything fast compared to a good LGA775 board, but I feel it would be fun to play around with some old GPUs and stuff to see the performance difference between AGP and PCI-E versions of the same card, or just nostalgically go back and see how gaming was on older hardware or something.

Or maybe I can use it as a sort of old museum piece to show off how bad it was in the good/bad old days.... I still haven't gotten the thing out of the box yet, so we will see if it ever gets used. The only semi-good free LGA775 chip I have lying around is a Pentium Dual-Core E2160. Those little things back then were like the early version of the Pentium G3258: a 1.8GHz chip that overclocks to 3.6GHz with minor cooling. It was an overachieving little thing for the sub-$100 price you could nab them for back in the day.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

EdEddnEddy posted:

I actually have an old LGA775 ASRock board here that I got years ago for SnG to play with some old hardware, as it has slots for both DDR1 and DDR2 RAM, and both AGP and PCI-E slots. It's not going to be anything fast compared to a good LGA775 board, but I feel it would be fun to play around with some old GPUs and stuff to see the performance difference between AGP and PCI-E versions of the same card, or just nostalgically go back and see how gaming was on older hardware or something.

I've got a couple of those boards too; I got 'em cheap so I could run my AGP Radeon X800 All-In-Wonder for HTPC duties, since I paid $450 for it in like 2005. Worked well until I retired that setup in favor of something that could deal with digital signals and HD.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
So evidently Kaby-X and Skylake-X will share the same socket: http://www.guru3d.com/news-story/intel-skylake-x-series-cpu-photo-surfaces.html

The downside being that LGA2066 will very likely be a dead-end socket at launch.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
Can someone explain the difference between the X and E lines? IIRC, E is the enthusiast socket, with more memory channels, PCIe lanes, and cores. Is X a super E? Replacing E? Why does it exist?

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
X is rebranded E. Not sure how the Kaby Lake fits into it.

--edit: Skylake-X seems to have less L3 cache per core than Haswell-E and Broadwell-E according to rumors?

Combat Pretzel fucked around with this message at 07:19 on Nov 10, 2016

japtor
Oct 28, 2005

Combat Pretzel posted:

X is rebranded E. Not sure how the Kaby Lake fits into it.
Maybe the Kaby one is just using the extra power to bump up stock clocks a bunch?

Kazinsal
Dec 13, 2011



Holy wow Intel needs an old-fashioned AMD rear end kicking.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

japtor posted:

Maybe the Kaby one is just using the extra power to bump up stock clocks a bunch?

The 112W TDP over the 7700K's 95W suggests it might push 4.8GHz+ on Turbo. I somehow doubt it'll be 5GHz, but you never know.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
I'm just puzzled about it, because it's dual channel only despite the huge socket.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Combat Pretzel posted:

I'm just puzzled about it, because it's dual channel only despite the huge socket.

I think it's just a case of "you can choose either/or depending on your need for extra cores or clockspeed."

Lowen SoDium
Jun 5, 2003

Highen Fiber
Clapping Larry

BIG HEADLINE posted:

So evidently Kaby-X and Skylake-X will share the same socket: http://www.guru3d.com/news-story/intel-skylake-x-series-cpu-photo-surfaces.html

The downside being that LGA2066 will very likely be a dead-end socket at launch.

Why do you think 2066 will be a dead-end?

Anime Schoolgirl
Nov 28, 2002

BIG HEADLINE posted:

I think it's just a case of "you can choose either/or depending on your need for extra cores or clockspeed."
And people wonder why Intel is making GBS threads the bed with HPC.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"

Lowen SoDium posted:

Why do you think 2066 will be a dead-end?

Because LGA3647 is also in the pipe. It's entirely possible that 3647 will be Xeon-focused or a Saudi Prince-level enthusiast part, while 2066 could be the "accessible" upper-level SKU.

Until Intel gets chattier, we can't know, and no one with an NDA is telling.

Krailor
Nov 2, 2001
I'm only pretending to care
Taco Defender
It looks like Intel is going to start supporting 3 concurrent sockets, up from the 2 they've historically supported (1150/2011):

1151
Consumer/Entry-level server/Low-end workstation. Uses the 100 and 200 chipsets on the consumer side and C236 on the server side.
Replaced socket 1150 and is the standard consumer platform that will be used for Skylake, Kaby Lake, and Coffee Lake (socketed Cannonlake).
Assuming the leaks are correct, this socket will be updated when Ice Lake is released (2018?), since they're putting the FIVR back on-chip.

2066
Enthusiast/Mid-level server/High-end workstation. Uses the X299 chipset on the consumer side; I'm not sure the server-side chipset has been announced (C422?).
Replaces socket 2011 and will be used for high-end, single-processor machines. Also changes the microarchitecture suffix from -E to -X.
The interesting thing about this is that it looks like they're going to launch Skylake-X and Kaby Lake-X at the same time.
Skylake-X will be the standard 6/8/10-core chips that the enthusiast line has traditionally offered, with up to 44 PCIe lanes.
Kaby Lake-X looks like it just consists of a single 4-core chip with only 16 PCIe lanes. I guess there are two scenarios where this makes sense: offering an entry-level 2066 product priced below the 6-core Skylake-X, or a high-clocked, 'Gamerzzz'-focused chip (5GHz stock?) priced accordingly.
Users would then have an upgrade path with Cannonlake-X, and the socket would probably be replaced whenever Ice Lake-X gets released (2020?).

3647
High-end and multi-processor servers.
This is the new socket that is being shared by Xeon Phi and Skylake-EP/EX chips. I don't think Intel is planning on making a consumer chipset for this socket so I doubt that we'll ever see a non-server board using this socket.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.
That's a great summary. I also highly doubt we'd ever see 3647 in anything consumer-facing. I don't know if they'll even be doing a socket mount for it that doesn't require torquing down some screws. Any kind of lever-based retention like on previous sockets would have to be fairly ridiculous to keep those monsters in place.

Twerk from Home
Jan 17, 2009

This avatar brought to you by the 'save our dead gay forums' foundation.
Is Xeon-D expected to get a Skylake refresh or operate on an entirely unique product cycle? It's more than a year old at this point.

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

Twerk from Home posted:

Is Xeon-D expected to get a Skylake refresh or operate on an entirely unique product cycle? It's more than a year old at this point.

I'm wondering if it might be some kind of CPU/FPGA hybrid with the Altera stuff in there to change what peripherals it supports.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

priznat posted:

I'm wondering if it might be some kind of CPU/FPGA hybrid with the Altera stuff in there to change what peripherals it supports.

That would actually be really handy for certain kinds of scientific calculations, assuming you can get whatever weirdass algorithm into usable Verilog code. FPGAs can be 8-20x as power-efficient as a regular CPU for a lot of stuff that you can either parallelize out the rear end by putting 90 functional compute units on the FPGA, or that involves a lot of chewing on small bits of data, like a hash or checksum function, where you can store it in the FPGA's local memory and then pipeline the crap out of the calculation process. I know certain implementations of the OpenCV spec on FPGAs would get real-time 1080p Haar cascade performance for like 30W, compared to a CPU+GPU at like 200ish.
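For reference, here's a minimal sketch of the CPU-side version of that Haar cascade workload, using OpenCV's Python bindings. The cascade path assumes the opencv-python package, and the camera index and 1080p capture settings are just placeholders; nothing here is FPGA-specific, it's only the software baseline those ~30W figures get compared against.

```python
# Minimal CPU-side Haar cascade sketch (the workload the FPGA numbers above
# are compared against). Assumes the opencv-python package; camera index 0
# and the 1080p capture size are placeholders.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # detectMultiScale sweeps a sliding window over an image pyramid --
    # exactly the repetitive, small-data chewing that pipelines well on an FPGA.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"{len(faces)} face(s) in this frame")

cap.release()
```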

priznat
Jul 7, 2009

Let's get drunk and kiss each other all night.

Methylethylaldehyde posted:

That would actually be really handy for certain kinds of scientific calculations, assuming you can get whatever weirdass algorithm into usable Verilog code. FPGAs can be 8-20x as power-efficient as a regular CPU for a lot of stuff that you can either parallelize out the rear end by putting 90 functional compute units on the FPGA, or that involves a lot of chewing on small bits of data, like a hash or checksum function, where you can store it in the FPGA's local memory and then pipeline the crap out of the calculation process. I know certain implementations of the OpenCV spec on FPGAs would get real-time 1080p Haar cascade performance for like 30W, compared to a CPU+GPU at like 200ish.

Absolutely, I am convinced a big reason to move to the giant 3647 sockets is so they can have FPGAs with the CPU as a multi-chip module and do a lot of the stuff that offload cards/GPUs do now. Makes sense to have some configurability on the smaller side too.

AEMINAL
May 22, 2015

barf barf i am a dog, barf on your carpet, barf
nice to see the 1151 isn't becoming obsolete after a generation. sadly i doubt we'll see much of a performance gain at the end of its cycle

SuperDucky
May 13, 2007

by exmarx
Krailor is correct: 2066 will be the enthusiast socket moving forward, because Intel is hemorrhaging margins and they know that gamers and enthusiasts are two markets they can make huge bucks on by binning Xeons and forcing any and all 2+ SLI setups (and, realistically, any SLI build at all) onto 2066.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

priznat posted:

Absolutely, I am convinced a big reason to move to the giant 3647 sockets is so they can have FPGAs with the CPU as a multi-chip module and do a lot of the stuff that offload cards/GPUs do now. Makes sense to have some configurability on the smaller side too.

It also means you can do incredibly stupid poo poo like put a general-purpose port on a machine, route the traces to the inputs of one of the FPGAs, and exploit whatever shared-memory/DMA system the FPGA uses to talk to the host CPU to get a 100G ultra-low-latency interconnect for whatever HPC node/cluster thing you're working on. Or have it poll a 1024x1024 CCD sitting on top of a civil defense beta calibration source, use the FPGA to run analysis and normalization on it, and get a retarded quantity of ultra-high-grade random numbers to generate keys or salted hashes.
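As a toy illustration of just the conditioning half of that last idea, here's a sketch that hashes "frames" of raw sensor noise down into key material. The CCD read is faked with os.urandom purely so the example runs; a real design would DMA raw frames in from the FPGA and would need actual entropy estimation before trusting the output.

```python
# Toy whitening/conditioning sketch: hash noisy sensor frames into key material.
# The "CCD" is faked with os.urandom so the example is runnable; the FPGA/DMA
# plumbing and the entropy estimation a real design needs are not represented.
import hashlib
import os

FRAME_BYTES = 1024 * 1024  # one 1024x1024 frame, 8 bits per pixel

def read_ccd_frame() -> bytes:
    # Placeholder for "poll the CCD through the FPGA's DMA window".
    return os.urandom(FRAME_BYTES)

def random_block(frames: int = 4) -> bytes:
    # Fold several raw frames into one digest so the 32-byte output depends
    # on far more raw input entropy than it emits.
    h = hashlib.sha256()
    for _ in range(frames):
        h.update(read_ccd_frame())
    return h.digest()

if __name__ == "__main__":
    print(random_block().hex())
```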

in a well actually
Jan 26, 2011

dude, you gotta end it on the rhyme

Methylethylaldehyde posted:

It also means you can do incredibly stupid poo poo like put a general-purpose port on a machine, route the traces to the inputs of one of the FPGAs, and exploit whatever shared-memory/DMA system the FPGA uses to talk to the host CPU to get a 100G ultra-low-latency interconnect for whatever HPC node/cluster thing you're working on. Or have it poll a 1024x1024 CCD sitting on top of a civil defense beta calibration source, use the FPGA to run analysis and normalization on it, and get a retarded quantity of ultra-high-grade random numbers to generate keys or salted hashes.

Yeah, but I'm not sure you're going to bother with an FPGA when there's 100G Omni-Path on-die (another reason for all those pins).
