brap
Aug 23, 2004

Grimey Drawer
1156 has GPU-on-chip processors -- the dual-core i3s and i5s.

Spime Wrangler
Feb 23, 2003

Because we can.

Ok yep I don't know what the hell I'm talking about and am going to shut up.

Lum
Aug 13, 2003

Well yes of course you'll need new/different pins to put the GPU on a socket, same as how the last generation needed new pins to handle the onboard memory controller.

Just means I'm going to wait until they stop loving about and move everything onto the CPU that they're going to move, then maybe we can have a socket that lasts for more than a year again.

Let's face it, until the XBox 720 or whatever comes out, there's no need to upgrade your gaming PC anyway, as there won't be any lovely console ports that can use the extra power.

PC LOAD LETTER
May 23, 2005
WTF?!

MachinTrucChose posted:

You can't overclock like you used to
Overclocking is a stupid waste of money and shouldn't be done but limiting the option is a negative for the CPU riceboy types.

You know they demo'd an OC'd SB running at ~5GHz with just the stock cooler and a small overvolt, right? 50-100% overclocks on a CPU that is already pretty fast are no joke.

Just to stir the pot: it looks like AMD may be able to get BD out sooner than 2H '11, at least for servers anyway. So we'll know how it will perform even if we can't buy a desktop version until April or May or something. Reading RWT's BD architecture overview, it sure sounds like BD could be pretty fast. Apparently it's supposed to reach very high clocks like the P4 or POWER6 while having better per-clock performance than current Phenom II chips. If they pull that off, AMD could end up with a chip that performs as well as or even outperforms SB.

~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD

MachinTrucChose posted:

Overclocking is a stupid waste of money and shouldn't be done but limiting the option is a negative for the CPU riceboy types.

What the hell?

$300 CPU + $50 cooler is a waste of money compared to a $600 CPU?

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

~Coxy posted:

What the hell?

$300 CPU + $50 cooler is a waste of money compared to a $600 CPU?

For many users, a $75 CPU at stock speed is more than enough - for most everyday desktop tasks, the bottleneck is the hard drive and network, not the CPU, even when you're talking about a comparatively wimpy processor like an Athlon II X3 or a Core i3. For most of the remainder, a $200-300 CPU at stock speed is plenty; that's more than enough for just about any game on the planet, and "home power user" stuff like home video editing. Many of the very small remainder, for whom a Core i7-860 or Phenom II X6 1090T isn't enough, are generally using their systems in professional environments where even a tiny risk of overclocking-related instability is unacceptable.

If you're just looking at the size of your e-peen (sorry, the speed of your processor in gigahertz), yeah, it's a great deal. When you look at overall system performance, though, the downsides (increased noise, heat, power consumption, and cooler cost) often outweigh the "benefit" of a CPU that's just going to be bottlenecking harder rather than running faster.

I will admit, overclocking can make sense in some situations. For instance, right now, I'm running a mildly overclocked Conroe. It's allowed me to put off upgrading for a little while, and I'm pretty happy with that. However, the gradual shift of the market to quad-and-more core support is eventually going to leave me behind, and at that point even a balls-to-the-wall 100% overclock won't catch me up to a dirt cheap Athlon II X4. Overclocking can be useful as a stopgap, but the value proposition of overclocking brand new CPUs in the $200-300 range is not very good.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Modern games do scale pretty well with higher CPU clock speeds though, and if you get a good cooler it can be near-silent at anything from stock to a 1GHz overclock.

Regarding the move to the LGA-1155 socket, I remember reading it was mostly to let the on-chip graphics drive additional monitors compared to LGA-1156, but I'm not sure if that's correct.

Zhentar
Sep 28, 2003

Brilliant Master Genius

MachinTrucChose posted:

New architecture with 10% power savings and 10% speed improvement
Too negligible to matter for the home user. Only big companies will care, and hopefully they realize 99% of their employees can get by on Atoms.

That's a 10% speed improvement per cycle, but you'll also get more cycles at a lower price, so it's more like 25-30% better at the same price point (depending on how well the new turbo boost works out).
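
Quick back-of-the-envelope, with my own guess of roughly 15% more clock for the same money plugged in alongside the 10% per-clock figure:

\[ 1.10 \times 1.15 \approx 1.27 \]

which is where that 25-30% ballpark comes from, before turbo even enters into it.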


Spime Wrangler posted:

So I'm not a computer engineer or anything (so I should probably shut up), but given the level of architectural rearrangement we're seeing with CPUs today, I think it's somewhat natural that sockets become obsolete relatively quickly.

You're actually right, although it's not the GPU that's driving it. The last sockets truly bridging several architectures were LGA 775 on the Intel side, and Socket A on the AMD side. Coincidentally, those worked with the last architectures before they moved to on-die memory controllers. On the AMD side now, they've been working to maintain some compatibility in spite of this, which you can see from the AM2/AM2+/AM3 sockets. Intel hasn't bothered with it, which is why we get the 1156/1155 forced incompatibility.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


The way I view it is, if you can get more speed out of a processor without doing anything crazy, why the hell not?

I have an Athlon II X3 that I'm overclocking to 3.3GHz from 2.9GHz on stock cooling. Am I really going to notice those extra 400MHz? Probably not. Then again, why not use the thing to its full potential? The only thing I did was spend about 30 seconds mucking around in the BIOS.
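
For what it's worth, that works out to about a 14% clock bump:

\[ 3.3\,\text{GHz} / 2.9\,\text{GHz} \approx 1.14 \]

Not earth-shattering, but it cost thirty seconds and nothing else.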

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

Space Gopher posted:

When you look at overall system performance, though, the downsides (increased noise, heat, power consumption, and cooler cost)

Following this logic to the extreme would suggest that everyone should underclock their processors so as to get reduced noise, heat and power consumption. The downsides of running at stock speeds as opposed to underclocking outweigh the advantages the extra clocks give you :downs:

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Alereon posted:

Modern games do scale pretty well with higher CPU clock speeds though, and if you get a good cooler it can be near-silent at anything from stock to a 1GHz overclock.

Again, only if you're interested in making numbers go up rather than improving your subjective experience.

Game performance is almost always bottlenecked on the GPU. Take that bottleneck out of the picture by dropping to low resolution and visual settings, and the CPU bottleneck usually sits above 60fps. That means that, for the vast majority of users, the limiting factor is their video card or monitor. Running at 125fps on a 60Hz monitor isn't really useful, unless you're playing Quake 3.
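
To put rough numbers on the monitor point (just refresh-rate arithmetic, nothing game-specific):

\[ \frac{1000\ \text{ms}}{60\ \text{Hz}} \approx 16.7\ \text{ms per frame}, \qquad \frac{1000\ \text{ms}}{125\ \text{fps}} = 8\ \text{ms per frame} \]

The system may be finishing a frame every 8 ms, but a 60Hz panel still only shows you a new one every 16.7 ms.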

Jabor posted:

Following this logic to the extreme would suggest that everyone should underclock their processors so as to get reduced noise, heat and power consumption. The downsides of running at stock speeds as opposed to underclocking outweigh the advantages the extra clocks give you :downs:

Or you could just buy the CPU you need for decent performance in whatever it is you do, and let it underclock itself at idle like any modern x86 CPU. Sorry about your [H] cred, but sometimes it's not necessary to "tweak" or "tune" your system.

freeforumuser
Aug 11, 2007
Let's face it, the only real apps left that are still primarily CPU-limited are rendering and video encoding. Interestingly, both lend themselves well to massively parallel processing on GPUs, and the same goes for gaming physics. And now we see Intel and AMD pushing CPUs with integrated GPUs. Coincidence? Methinks not; let me proclaim that the multicore era is already over and welcome our new GPU-dominant processor overlords.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Space Gopher posted:

Again, only if you're interested in making numbers go up rather than improving your subjective experience.

Game performance is almost always bottlenecked on the GPU. Take that bottleneck out of the picture by dropping to low resolution and visual settings, and the CPU bottleneck usually sits above 60fps. That means that, for the vast majority of users, the limiting factor is their video card or monitor. Running at 125fps on a 60Hz monitor isn't really useful, unless you're playing Quake 3.

That just isn't true; take a look at these CPU scaling benchmarks for StarCraft II. MMOs like WoW usually scale similarly. Obviously old games are pretty much maxed out, but in newer games you can usually see very significant differences in achieved framerate. Remember also that even if a game averages 60fps, the minimum framerate will be much lower than that, and improving framerates during high-action scenes can noticeably improve how smooth the game feels.

4 Day Weekend
Jan 16, 2009

freeforumuser posted:

Let's face it, the only real apps left that are still primarily CPU-limited are rendering and video encoding. Interestingly, both lend themselves well to massively parallel processing on GPUs, and the same goes for gaming physics. And now we see Intel and AMD pushing CPUs with integrated GPUs. Coincidence? Methinks not; let me proclaim that the multicore era is already over and welcome our new GPU-dominant processor overlords.

I don't think GPUs can do video encoding better than CPUs. Faster, yes, but the quality is pretty bad in comparison.

movax
Aug 30, 2008

4 Day Weekend posted:

I don't think GPUs can do video encoding better than CPUs. Faster, yes, but the quality is pretty bad in comparison.

You need to define "better". A good deal of the mathematical operations utilized in H.264 encoding can be performed much faster on massively-parallel hardware like a GPU. However, Intel has also been providing SSEx extensions for years now, and some current encoders are written to utilize the very wide SIMD/vector instructions, allowing for boosts in CPU performance. We can look at decoders...ffmpeg vs. CoreAVC (with CUDA enabled) vs. Broadcom CrystalHD card vs. DXVA. Some are pure software, some can leverage hardware in the form of a GPU/add-in card, some are hybrids...and they all turn out varying output quality (go to AVSForum to see people spergin' out about decoder vs. decoder).
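
To make the SIMD point concrete, here's a rough sketch (a toy example of my own, not lifted from x264 or any real encoder -- the function names and block size are just illustrative) of the kind of inner loop encoders hammer during motion estimation: a 16x16 sum of absolute differences, once in plain C and once with SSE2 intrinsics, where a single PSADBW instruction chews through 16 pixels at a time.

[code]
/* Toy 16x16 sum-of-absolute-differences (SAD), the workhorse of motion
 * estimation in H.264-style encoders. Scalar vs. SSE2; same result,
 * but the SSE2 path handles 16 pixels per instruction via PSADBW. */
#include <emmintrin.h>  /* SSE2 intrinsics */
#include <stdint.h>

/* Plain C reference version: one pixel at a time. */
static unsigned sad16x16_scalar(const uint8_t *a, const uint8_t *b, int stride)
{
    unsigned sum = 0;
    for (int y = 0; y < 16; y++) {
        for (int x = 0; x < 16; x++) {
            int d = a[x] - b[x];
            sum += d < 0 ? -d : d;
        }
        a += stride;
        b += stride;
    }
    return sum;
}

/* SSE2 version: _mm_sad_epu8 sums absolute differences of 16 unsigned
 * bytes, leaving two 16-bit partial sums (one per 64-bit half). */
static unsigned sad16x16_sse2(const uint8_t *a, const uint8_t *b, int stride)
{
    __m128i acc = _mm_setzero_si128();
    for (int y = 0; y < 16; y++) {
        __m128i va = _mm_loadu_si128((const __m128i *)a);
        __m128i vb = _mm_loadu_si128((const __m128i *)b);
        acc = _mm_add_epi64(acc, _mm_sad_epu8(va, vb));
        a += stride;
        b += stride;
    }
    /* Fold the two halves together; each partial sum fits in 16 bits. */
    return (unsigned)_mm_cvtsi128_si32(acc) +
           (unsigned)_mm_cvtsi128_si32(_mm_srli_si128(acc, 8));
}
[/code]

None of that says anything about output quality, of course -- that's the quantization/CABAC decision-making side discussed below.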

It all comes down to software. An encoder that runs on fast hardware but sucks at CABAC will always deliver shittier results than an encoder running on slow hardware but with an appropriate CABAC implementation. The true test, which I don't know if anyone has gotten to, is running the same encoder on a CPU and GPU, optimized for each respectively, but with the same algorithmic decision-making when it comes to quantization/CABAC/etc.

movax fucked around with this message at 15:21 on Sep 21, 2010

Eletriarnation
Apr 6, 2005

People don't appreciate the substance of things...
objects in space.


Oven Wrangler

Space Gopher posted:

When you look at overall system performance, though, the downsides (increased noise, heat, power consumption, and cooler cost) often outweigh the "benefit" of a CPU that's just going to be bottlenecking harder rather than running faster.

Considering that I managed, with a $24 cooler, to increase my i7 920's speed from 2.66GHz to 3.15GHz *and* decrease its VCore by around 15% from stock... I think I actually probably came out ahead on performance, heat, and power consumption for a fairly minor outlay in cooling cost.

Anecdotal, yes, but it seems to me that minor overclocking has quite a nice risk-and-outlay-to-reward ratio.
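
A quick sanity check on the power side of that claim: to first order, dynamic power scales with frequency times voltage squared, so plugging in those numbers (my arithmetic, ignoring leakage and everything else):

\[ P \propto f V^2 \quad\Rightarrow\quad \frac{3.15}{2.66} \times (0.85)^2 \approx 0.86 \]

i.e. about 14% less dynamic power than stock, even at the higher clock.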

rscott
Dec 10, 2009

Lum posted:

People saying that 775 was an exception and you should expect sockets to be short lived need to remember further into the past.

Slot A was around for ever, then it was replaced with Socket 7 with which it was electrically compatible and adaptors were available. Lots of people kept their old 440BX boards for years, starting with the likes of a 300MHz Celeron and finishing up on a pIII running over 1GHz.

Of course it didn't help that the chipsets designed to replace the 440BX were all terrible.

I then went to AMD as the P4 was terrible. I can't even remember the name of the socket (Socket A?) but that socket lasted a long time as well. I think I started with a 1400+ tbird then a 2000+ Athlon XP and finally a 2600+ Athlon-XPM that I harvested from a dead company laptop.

I was lucky (poor) enough to skip the Socket 939 debacle and went straight onto 775 with a C2D and later a C2Q.

So yeah, to me at least, a socket that lives for only a year is a shameful socket.


As for the new chipset features. USB3 I can add with a card if I ever happen across a USB3 device and find it too slow and I guess the new SATA is good for people using SSDs? It's certainly worthless for people using harddrives.

Uh, Socket 7 was for old Pentium/K6-2/3s/Cyrix/etc. Intel went to Slot 1 to prevent AMD CPUs from being pin-compatible drop-in replacements for their CPUs. Slot 1 and Slot A were mechanically compatible but not pin compatible. Socket 370 was electrically compatible with Slot 1 and Socket 462 was electrically compatible with Slot A; Intel/AMD moved away from the slots because the packaging for the cartridge CPUs was more expensive than for socket-based CPUs.

:goonsay:

Nebulis01
Dec 30, 2003
Technical Support Ninny
Hell yeah, Slockets :)

They were such a pain in the rear end. I didn't know that Slot A / 462 were compatible. I've never seen an AMD slocket converter.

rscott
Dec 10, 2009
AMD had better chipsets come out for Socket 462 than they did for Slot A, so there wasn't much impetus for enthusiasts to keep their older mobos, and I can't think of any Slot A motherboards with DDR support off the top of my head. The 440BX was probably the best chipset ever made and everything from a Pentium II 233 up to a PIII 1.4 GHz would work with it.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

rscott posted:

AMD had better chipsets come out for Socket 462 than they did for Slot A, so there wasn't much impetus for enthusiasts to keep their older mobos, and I can't think of any Slot A motherboards with DDR support off the top of my head. The 440BX was probably the best chipset ever made and everything from a Pentium II 233 up to a PIII 1.4 GHz would work with it.

440BX didn't have that much longevity. It was replaced within a year, and obsolete within two years, as all of Intel's CPUs after that point required a 133MHz FSB. The 1100MHz Coppermine P3 was the last supported CPU; the Coppermine-T and Tualatin CPUs (as well as all previous 133MHz FSB CPUs) required at least an Intel i810E chipset.

rscott
Dec 10, 2009

Alereon posted:

440BX didn't have that much longevity. It was replaced within a year, and obsolete within two years, as all of Intel's CPUs after that point required a 133MHz FSB. The 1100MHz Coppermine P3 was the last supported CPU; the Coppermine-T and Tualatin CPUs (as well as all previous 133MHz FSB CPUs) required at least an Intel i810E chipset.

Tualatin wasn't officially supported on 440BX but I had an Abit BH6 that had no problem running up to 150MHz. Basically the only reason I went from 440BX to 815 was my Radeon 8500 couldn't tolerate the overclocked AGP bus like my old GeForce 2 did.

Jabor
Jul 16, 2010

#1 Loser at SpaceChem

Space Gopher posted:

Or you could just buy the CPU you need for decent performance in whatever it is you do, and let it underclock itself at idle like any modern x86 CPU. Sorry about your [H] cred, but sometimes it's not necessary to "tweak" or "tune" your system.

Or you can buy a CPU that's lower than what you need, overclock it so that it does what you need it to, and let it clock itself back down at idle like any modern x86 CPU. Same performance, lower cost.

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."
Overclockers tend to turn off SpeedStep and other equivalent power-saving features to preserve stability at higher voltages. If you have a need for more CPU power (SCII, BC2/some other console ports, hobbyist rendering, whatever) I don't think anybody's going to say you shouldn't, but it's one of those things that has taken on a life of its own in the minds of idiot enthusiasts, like water cooling and Velociraptors.

Jabor posted:

Or you can buy a CPU that's lower than what you need...Same performance, lower cost.

When has this been the case since 2008? The market differentiation right now is between number of cores, not clock speed, and the cost of another core on the AMD side is about $15. It's not as if we're all looking at Conroes anymore, where the difference was 0.5GHz and a cache bump for $70.

BlackMK4
Aug 23, 2006

wat.
Megamarm

Srebrenica Surprise posted:

When has this been the case since 2008? The market differentiation right now is between number of cores, not clock speed, and the cost of another core on the AMD side is about $15. It's not as if we're all looking at Conroes anymore, where the difference was 0.5GHz and a cache bump for $70.

Look at the 930 i7s that are conservatively overclocking to 3.8GHz on air fairly easily.

leppo
Jul 12, 2003

Alereon posted:

440BX didn't have that much longevity. It was replaced within a year, and obsolete within two years, as all of Intel's CPUs after that point required a 133MHz FSB. The 1100MHz Coppermine P3 was the last supported CPU; the Coppermine-T and Tualatin CPUs (as well as all previous 133MHz FSB CPUs) required at least an Intel i810E chipset.

Apollo Pro133A all the way!

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."

BlackMK4 posted:

Look at the 930 i7s that are conservatively overclocking to 3.8GHz on air fairly easily.
Well yes, but overclocking your $285 CPU to (over) the speed of a $1,040 CPU isn't what I'm talking about. No enthusiast was going to buy the $1,040 CPU in the first place, so you're not saving anything. The point is that it used to be that you could buy a lower-end Core 2 (like, say, the E6400, E6450 or E6600) and overclock the poo poo out of it to match a mainstream chip with perhaps a slight difference in cache but not much of a performance difference, saving you substantial amounts of money.

It's not that way anymore, though, because the price difference between mainstream CPUs is much smaller and dependent on # of cores, not clock speed. The closest analogy would be overclocking a Phenom X4 9xx to match the i5 750, which is a bad idea for a bunch of other reasons: heat, power consumption, the 9xx's poorer results in heavily parallel apps, its lack of turbo mode, whatever. None of those were really a concern when you were comparing two Conroes, but multi-core has become so prevalent (and architectures perform so differently, if you're comparing across those) that the "buy cheap and overclock" strategy for the mainstream pretty much died as soon as the Phenom/Athlon II hit.

Lum
Aug 13, 2003

rscott posted:

Uh, Socket 7 was for old Pentium/K6-2/3s/Cyrix/etc. Intel went to Slot 1 to prevent AMD CPUs from being pin-compatible drop-in replacements for their CPUs. Slot 1 and Slot A were mechanically compatible but not pin compatible. Socket 370 was electrically compatible with Slot 1 and Socket 462 was electrically compatible with Slot A; Intel/AMD moved away from the slots because the packaging for the cartridge CPUs was more expensive than for socket-based CPUs.

:goonsay:

You're right, I got the names mixed up. It was a long time ago after all.

Doesn't invalidate the point I was making though.

freeforumuser
Aug 11, 2007

Alereon posted:

440BX didn't have that much longevity. It was replaced within a year, and obsolete within two years, as all of Intel's CPUs after that point required a 133MHz FSB. The 1100MHz Coppermine P3 was the last supported CPU; the Coppermine-T and Tualatin CPUs (as well as all previous 133MHz FSB CPUs) required at least an Intel i810E chipset.

Good old times when chipsets actually affected CPU performance. VIA ruled the roost back in 2000/01 because Intel tried to dictate the RAM market and shove RDRAM down our throats, and what a karmic epic failure that was. Even without adjusting for inflation, PC700 RDRAM cost more than entire gaming rigs of today. However, once Intel got their poo poo together by supporting DDR and nForce came out for AMD... VIA was pretty much dead.

rscott
Dec 10, 2009

Lum posted:

You're right, I got the names mixed up. It was a long time ago after all.

Doesn't invalidate the point I was making though.

Yeah I was just sperging out because I miss the old days of using graphite pencils to unlock extra multipliers or setting jumpers to 2x to get 6x multipliers on my old super socket 7 boards. :)

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


To this day, I think my most stable computer was still my old Socket 370 system: a 440BX motherboard with a Celeron 366 (5.5x multiplier). I set the bus speed at 100MHz from day one and enjoyed a 50% overclock and speed equivalent to a top-of-the-line P3 of the day. This was back when Celerons weren't nearly as gimped, so you could get 95% of the performance of an equivalently clocked P3. I used that PC from 1999 until 2004, when I finally built an Athlon XP system so I could play Half-Life 2.
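
For anyone who missed that era: the multiplier was locked, so the front-side bus was the only knob, and the math was just multiplier times FSB:

\[ 5.5 \times 66\ \text{MHz} \approx 366\ \text{MHz}, \qquad 5.5 \times 100\ \text{MHz} = 550\ \text{MHz} \]

Hence the "free" 50% once the chip and board could take the 100MHz bus.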

I just pulled it out of the closet last weekend so I could recycle it. Part of me wants to hold onto it for nostalgia, but it really is worthless today.

rscott
Dec 10, 2009
Celeron A's at a given speed were actually faster than Pentium IIs because they had 128KB of full speed L2 cache instead of 512KB of half speed. 300As were probably my second favorite CPU of all time behind 35W mobile Barton 2500+s. Unlocked from the factory + binned like poo poo to be able to hit 1.67GHz at 35 watts = 100% overclocks if you had some BH-5 and a good mobo.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


rscott posted:

300As were probably my second favorite CPU of all time

Ha, this is bringing back a flood of memories. If I recall correctly, the 366@550 overclock that I did was the big successor to the massive 300A overclocks.

And yes, you are right, I now remember mine was supposed to be FASTER than a P3 550 due to the cache issue.

WhyteRyce
Dec 30, 2001

bull3964 posted:

Ha, this is bringing back a flood of memories. If I recall correctly, the 366@550 overclock that I did was the big successor to the massive 300A overclocks.

The 300As had a near-guaranteed overclock to 450. The 366s had a higher multiplier, which meant they had to go to 550 if you wanted an FSB of 100MHz, but the success rate of getting those to 550 was much lower than getting the 300s to 450.

I had a 366 which didn't hit 550 so I had to settle for 450 at some funky FSB :(

Spime Wrangler
Feb 23, 2003

Because we can.

freeforumuser posted:

Even without adjusting for inflation, PC700 RDRAM cost more than entire gaming rigs of today.

ell oh ell


I know, old, out of production etc etc but still

That Athlon XP 2100+, 512MB DDR, Radeon 8500 dream machine I put together during the fallout of that battle as my first MY COMPUTER will always have a special place in my heart. :3:

man, computers so fast it was like living in the future...

Now look at us. Just look at us.

PUBLIC TOILET
Jun 13, 2009

Spime Wrangler posted:

ell oh ell


I know, old, out of production etc etc but still

That Athlon XP 2100+, 512MB DDR, Radeon 8500 dream machine I put together during the fallout of that battle as my first MY COMPUTER will always have a special place in my heart. :3:

man, computers so fast it was like living in the future...

Now look at us. Just look at us.

Look at us watching old people fall down stairs on YouTube via our Google Android phones.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


WhyteRyce posted:

The 300As had a near-guaranteed overclock to 450. The 366s had a higher multiplier, which meant they had to go to 550 if you wanted an FSB of 100MHz, but the success rate of getting those to 550 was much lower than getting the 300s to 450.

I had a 366 which didn't hit 550 so I had to settle for 450 at some funky FSB :(

I think I spent an extra $10 on my 366 and got a reseller-binned one that was guaranteed to hit 550.

Which brings back even more memories: it was quite common at the time to go to a computer parts vendor and pay a little extra to have them pre-test the CPU at an overclocked speed.

I think I managed to push mine a bit beyond 550, and I think the motherboard grew unstable before the CPU did. That Abit board had support for FSBs over 100MHz even though there were no chips yet that supported it.

I look back at the Alpha cooler on it, which seemed huge at the time, and the retail heatsink that came with my X3 dwarfs it.

bull3964 fucked around with this message at 19:17 on Sep 22, 2010

rscott
Dec 10, 2009
Best HSFU for skt370 was the Golden Orb. It had a 60mm fan!

spanko
Apr 7, 2004
winnar

Space Gopher posted:

Game performance is almost always bottlenecked on the GPU.

This isn't true anymore for a lot of popular games.

WhyteRyce
Dec 30, 2001

rscott posted:

Best HSFU for skt370 was the Golden Orb. It had a 60mm fan!

Incorrect. The best cooler was the Glacier 4500C with the Arctic cap

I was so angry I bought this cooler for my 366 that wouldn't even boot at 550

edit - crap attached the wrong photo

Here is the correct one:


[attached photo]

WhyteRyce fucked around with this message at 19:26 on Sep 22, 2010

rscott
Dec 10, 2009
That is a Slot 1 cooler, sir. :colbert:
