WhyteRyce
Dec 30, 2001

rscott posted:

That is a slot 1 cooler sir. :colbert:

Yes but you could buy a socket 370 to slot 1 adapter :colbert:

I forget if that cooler actually worked on the adapter though.

pienipple
Mar 20, 2009

That's wrong!
The Athlon XP days of huge rear end CPU coolers, just so the box didn't sound like a hair dryer. I had the aluminum/copper Zalman beast with the 120mm fan. Needed its own little support brackets that bolted to the board. Just BARELY cleared the PSU.

The stock cooler on my 64 X2 is quieter and smaller, and keeps everything at a satisfactory temp. :unsmith:

Spime Wrangler
Feb 23, 2003

Because we can.

COCKMOUTH.GIF posted:

Look at us watching old people fall down stairs on YouTube via our Google Android phones.

overclocked to 1.2ghz on a broadband connection in the woods

"Can I get high on these mushrooms? Cmon google goggles, tell me yes!"

Squibbles
Aug 24, 2000

Mwaha ha HA ha!

WhyteRyce posted:

Incorrect. The best cooler was the Glacier 4500C with the Arctic cap

I was so angry I bought this cooler for my 366 that wouldn't even boot at 550

edit - crap attached the wrong photo

Here is the correct one:



Haha I had a cooler similar to that on my 300A, except it had three (3!) fans across the width of the slot. Worked great! I eventually traded that whole computer, case and all, for a Linksys wireless router. Doh!

PC LOAD LETTER
May 23, 2005
WTF?!
VOS32 was the best cooler from that time period IIRC. Giant hunk of aluminum almost half again as big as the entire Slot A module, and it only cost like $25 too. Put a few 80mm fans on it (or the ever popular 10k RPM Black Label Delta screamers) and you were set to get those 600-700MHz Athlons up to nearly 1GHz. More if you were lucky.



I think Golden Orbs and Alpha heatsinks were much more popular back then though.

old skool OC chat is awesome but anyone know how much those K series SB's will cost anyways?

leppo
Jul 12, 2003
I can still remember penciling in the L1 bridges on my Duron.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PC LOAD LETTER posted:

old skool OC chat is awesome but anyone know how much those K series SB's will cost anyways?
There will be K versions of the i7 2600 and i5 2500. Given this roadmap, I'd expect around $299 and $199 as the base prices, so figure ~10% above that for the K versions.
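Back-of-envelope math on that, if anyone cares (the base prices are my guess, obviously):

code:

base = {"i5 2500": 199, "i7 2600": 299}   # guessed base prices
for cpu, price in base.items():
    print(cpu + "K", "~$%d" % round(price * 1.10))  # ~10% K premium
# i5 2500K ~$219
# i7 2600K ~$329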

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

rscott posted:

Yeah I was just sperging out because I miss the old days of using graphite pencils to unlock extra multipliers or setting jumpers to 2x to get 6x multipliers on my old super socket 7 boards. :)

That's the "old days?" Bah, you kids these days have no appreciation for the times when we had to desolder the oscillator module on the motherboard and replace it with a higher-frequency one. Of course, there were no provisions for mounting a cooler on the 386's socket, so you had to get creative with thermal adhesives (not easy to find in those days) and heatsinks designed for other devices. But, on the other hand, a 386-33 at 40 MHz could be as fast as a low-end 486!

spanko posted:

This isn't true anymore for a lot of popular games.

Mind providing some examples?

Of course, bottlenecks are going to depend on the exact hardware configuration - but a system built right now with a roughly even balance between video card, CPU, and monitor is almost certain to bottleneck on the video card in any game recent enough that bottlenecks matter. Yeah, if you run with an Athlon II X2 and a pair of GTX480s on a 1280x1024 display it won't be the CPU holding you back, but as long as everything's in the same rough category ("budget," "midrange," "high-end," or "I burn piles of money for laughs") the CPU is rarely the limiting factor.

Admittedly, CPU bottlenecks can be a more serious concern; as your system ages, it's generally possible to dodge a video card bottleneck that leads to unacceptable performance just by dropping settings, but a CPU bottleneck can't be escaped that way. However, that's one place where overclocking is still a viable solution, at least for now. We'll see how it plays out as the market moves away from single-core performance and towards parallelism that can't be made up so easily by just cranking up the clocks.
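(Quick way to check which side is holding you back, per the settings-dropping trick above: re-run your benchmark at a lower resolution and see if the frame rate moves. Rough Python sketch; measure_fps() is a made-up stand-in for whatever benchmark you actually use.)

code:

def likely_cpu_bound(measure_fps, tolerance=0.10):
    # measure_fps(width, height) is hypothetical: run your game's
    # benchmark at that resolution and return the average FPS.
    fps_native = measure_fps(1920, 1080)
    fps_low = measure_fps(1024, 768)
    # GPU-bound games speed way up at a lower resolution;
    # CPU-bound games barely move.
    return (fps_low - fps_native) / fps_native < tolerance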

brap
Aug 23, 2004

Grimey Drawer
My experience is that the video card tends to be the bottleneck most of the time, except under heavy load, where the CPU can frequently bottleneck - most notably in the badly optimized new content in TF2. I had to overclock my C2D from 2.33GHz to 3GHz to get acceptable (not even great) frame rates on koth_viaduct, with a 9600GT video card that should digest the game at medium settings at 1680x1050.

coke
Jul 12, 2009

rscott posted:

Best HSFU for skt370 was the golden orb. It had a 60mm fan!

Years later you came to realize what a piece of poo poo it was, as it had very little surface area compared to a normal-looking cooler. It did look 'cool' though...

Zhentar
Sep 28, 2003

Brilliant Master Genius
I bought a "huge" Thermalright AX-7 for my Athlon... it was big enough for an 80mm fan! I had to dremel off the bottom of a couple fins to fit it on my motherboard.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

rscott posted:

Tualatin wasn't officially supported on 440BX but I had an Abit BH6 that had no problem running up to 150MHz. Basically the only reason I went from 440BX to 815 was my Radeon 8500 couldn't tolerate the overclocked AGP bus like my old GeForce 2 did.

Yeah, and there were factory 440BX boards made expressly for a 133MHz FSB, with the right multipliers to keep AGP/PCI speeds correct. I had an MSI BX-Master like this - still have it, somewhere.

Alereon posted:


Apollo Pro133A all the way!


gently caress VIA and gently caress that MOTHERFUCKING SHITSUCKING ASSDILDO of a chipset. Those goddamn Apollo Pros had the IDE transfer rate of a sedated snail, a total inability to stay stable with a Sound Blaster card, and general instability under heavy multitasking whenever you were hitting the HD. People everywhere bitched about it for years; VIA kept releasing new 4-in-1s that did absolutely nothing but give me temporary hope.

Christ, that stupid chipset gave more hassles than any computer hardware I ever used, before or since. I had a dual-proc P-III/933 box that should have been the cat's meow, but that goddamn chipset was a constant thorn in my side. And gently caress Intel, too, I didn't have $1000 to buy an 820 or 840 board and RDRAM.

mew force shoelace
Dec 13, 2009

by Ozmaugh
You know what would go a long way towards making computers seem better? A dedicated chip and memory just to run the OS. It'd be pretty tiny and it'd take out a ton of the apparent slowness of a computer.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

mew force shoelace posted:

You know what would go a long way towards making computers seem better? A dedicated chip and memory just to run the OS. It'd be pretty tiny and it'd take out a ton of the apparent slowness of a computer.
That wouldn't really help. If you've got a quad-core with 4GB+ of RAM, you're not going to be running low on RAM for the OS or experiencing latency due to CPU load. Granted the Windows scheduler sucks and Explorer does retarded things like lock the interface while waiting for a drive to spin up, but that's not something you can fix by throwing hardware at it.

JnnyThndrs posted:

Yeah, and there were factory 440BX boards with made expressly for a 133 FSB, with the right multipliers to keep AGP/PCI speeds correct. I had an MSI BX-Master like this - still have it, somewhere.
This is kind of a derail, but no board based on the BX chipset had proper dividers for a 133MHz FSB. The boards that advertised support were rated to overclock to 133MHz, but you were still putting a 33% overclock on the PCI and AGP buses. Also, for the record, I didn't post that thing you quoted about VIA chipsets.

Zhentar
Sep 28, 2003

Brilliant Master Genius
If you want to take apparent slowness out of a computer, just get an SSD.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Alereon posted:


This is kind of a derail, but no board based on the BX chipset had proper dividers for a 133MHz FSB. The boards that advertised support were rated to overclock to 133MHz, but you were still putting a 33% overclock on the PCI and AGP buses. Also, for the record, I didn't post that thing you quoted about VIA chipsets.

That's true of the AGP, but not the PCI. I had an ABit BM6 (440BX) which had a 1/4 PCI divider, so you could run the PCI bus at 33MHz while at a 133MHz FSB. Your AGP would be a little over 88MHz at that point, but a lot of cards could take that.
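To put numbers on it, since it's just divider math:

code:

# PCI and AGP clocks on a BX board are fixed fractions of the FSB
def bus_speeds(fsb_mhz, pci_div, agp_div=2.0/3):
    return round(fsb_mhz * pci_div, 2), round(fsb_mhz * agp_div, 2)

print(bus_speeds(100, pci_div=1.0/3))  # (33.33, 66.67): everything in spec
print(bus_speeds(133, pci_div=1.0/3))  # (44.33, 88.67): PCI way out of spec
print(bus_speeds(133, pci_div=1.0/4))  # (33.25, 88.67): the BM6's 1/4 divider
                                       # saves PCI; AGP still runs ~88MHz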

eames
May 9, 2009

Somebody please open an OC nostalgia thread. That picture of the Slot A Athlon with the little debug module (gold finger something?) can only be topped by a Celeron 300A with a peltier/air cooler. :unsmith:

Content: I think I'm going to wait for the Sandy Bridge update of Apple's MBP line before I replace this 17" C2D. Should arrive roughly one year from now, no?

spanko
Apr 7, 2004
winnar

Space Gopher posted:

Mind providing some examples?

SC2, WoW, Dragon Age, Mass Effect 2, Left 4 Dead (most Source engine games), and most MMOs are CPU limited. In fact I'd say in general most new RPG/RTS games are CPU limited on a $180-$220 videocard, but for FPS games like AvP, Crysis, BioShock, etc., what you said is true and they are GPU limited. My main point is you are way better off spending extra to get a quad core CPU than spending $250 or more on a videocard, unless you mainly play bleeding-edge FPSes.

forbidden dialectics
Jul 26, 2005

My first overclock was when I built my first computer, a Tualatin Celeron 1100A on an 815 motherboard (Soyo SY-TISU). Both I pulled out of the garbage. Got that sucker running at 1400MHz. This was around when the 2.53GHz Northwood P4 was a mid-range part, so I was pretty behind the times, but I was proud since most of the parts were scavenged and the only thing I actually bought was a GeForce4 4200. How did I cool this monstrosity? Well, the case didn't have any of the side panels, so I just stuck a small window fan next to it.

Hoffmann
Dec 29, 2007
Just reading the thread and I'm not too up on hardware, but thought I'd ask a few questions.

My impression was that stuff like GPGPU was a big boon to low cost high performance computing applications like high frequency trading and scientific computing. What sort of market share do those applications represent and are they enough to sustain the discrete card makers?

Also does this mean by this time next year that the sweet spot build in the system building thread might not have a discrete card?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

pseudointellectual posted:

My impression was that stuff like GPGPU was a big boon to low cost high performance computing applications like high frequency trading and scientific computing. What sort of market share do those applications represent and are they enough to sustain the discrete card makers?
The market is still pretty tiny from what I understand, because while there are a lot of theoretical applications, it still takes time and money to rewrite applications for CUDA, and for those applications to get adoption within the industry. It's definitely highly profitable, but there's just not that much volume yet.

quote:

Also does this mean by this time next year that the sweet spot build in the system building thread might not have a discrete card?
For a gaming system a discrete videocard will definitely be necessary, but for a non-gaming system it looks like the integrated graphics will be able to handle basic usage and HTPC applications with good performance. Basically, this will replace the lovely videocards used in low-end systems, but isn't good enough for someone that actually cares about gaming performance.

JawnV6
Jul 4, 2004

So hot ...

mew force shoelace posted:

You know what would go a long way towards making computers seem better? A dedicated chip and memory just to run the OS. I'd be pretty tiny and it'd take out a ton of the apparent slowness of a computer.

Yeah, on an application page fault what you really want to do is go off-chip for the handler instead of just running the OS right on that core. That'd make things zippy.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
It'd be even better if you could have it pull down the pages directly from Microsoft or kernel.org. Trusted computing, gently caress yeah.

Medikit
Dec 31, 2002

que lástima

WhyteRyce posted:

The 300As had a near guaranteed overclock to 450. The 366s had a higher multiplier, which meant they had to go to 550 if you wanted a FSB of 100MHz, but the success rate of getting those to 550 was much lower than getting the 300s to 450.

I had a 366 which didn't hit 550, so I had to settle for 450 at some funky FSB :(

I hit 550 :). The last great Celeron processor. The only low budget processor that outperformed the flagship CPU, thanks to its full-speed on-die L2 cache. The prelude to Coppermine.
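For anyone who wasn't there: the multipliers were locked, so core clock = FSB x multiplier, and the FSB was the only knob you had.

code:

def core_clock(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(core_clock(66, 4.5))   # ~300: Celeron 300A stock (real FSB was 66.6MHz)
print(core_clock(100, 4.5))  # 450: the classic 300A overclock
print(core_clock(66, 5.5))   # ~366: Celeron 366 stock
print(core_clock(100, 5.5))  # 550: why the 366 had to hit 550 at a 100MHz FSB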

Malloc Voidstar
May 7, 2007

Fuck the cowboys. Unf. Fuck em hard.

freeforumuser posted:

Let's face it, the only real apps left that are still primarily CPU limited are rendering and video encoding. Interestingly, both apps lend themselves well to massively parallel processing on GPUs, same for gaming physics. And now, we see Intel and AMD are pushing with CPUs with integrated GPUs. Coincidence? Me thinks no and let me proclaim the multicore era is already over and welcome our new GPU-dominant processor overlords.
Hell loving no it doesn't. GPUs are really good at embarrassingly parallel things. Mathematical computations that can be done in 1000+ threads at once are a good example; GPUs will slaughter CPUs at that.

Video encoding, if you're looking for good quality, cannot use very many threads. Frames reference each other in H.264. If you're encoding hundreds of frames at once, they obviously can't effectively do that; you have to use another method of threading, like cutting frames into slices and encoding each frame with multiple threads. Sliced threading is inherently lower quality.

On the whole, CPUs are better for video encoding, though there are certain things GPUs can do better. There's been some study into offloading stuff like motion estimation (see the qual. task too); note that it's listed as very difficult and needing a large amount of work. There have also probably been lots of papers, but papers are just theory, and are very often not practical.
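A toy illustration of the difference, with Python standing in for real shader/encoder code:

code:

from multiprocessing import Pool

def shade(x):
    # independent per-element math: embarrassingly parallel,
    # exactly what GPUs are built for
    return x * 0.5 + 3

def encode(frame, reference):
    # stand-in for motion-compensated prediction: each frame
    # is coded against the previous one
    return frame - reference

if __name__ == "__main__":
    with Pool() as pool:
        shaded = pool.map(shade, range(1_000_000))  # fan out freely

    # frame referencing is a serial dependency chain: frame N needs
    # frame N-1's result, so you can't just fan it out the same way
    reference, encoded = 0, []
    for frame in range(100):
        encoded.append(encode(frame, reference))
        reference = frame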

Thanks a lot for making me sperg :mad:

TOOT BOOT
May 25, 2010

Aleksei Vasiliev posted:

Video encoding, if you're looking for good quality, cannot use very many threads. Frames reference each other in H.264. If you're encoding hundreds of frames at once, they obviously can't effectively do that; you have to use another method of threading, like cutting frames into slices and encoding each frame with multiple threads. Sliced threading is inherently lower quality.

Couldn't the GPU be used for a really-fast first pass, since it can look at tons of frames in parallel?

Mr VacBob
Aug 27, 2003
Was yea ra chs hymmnos mea

TOOT BOOT posted:

Couldn't the GPU be used for a really-fast first pass, since it can look at tons of frames in parallel?

The last method tried was to move x264's lookahead thread onto the GPU - it runs ~50 frames ahead of actual encoding and is used to decide some things like frame types and visual importance.

It also avoids one of the worst problems with GPGPU: the motion decisions and so on are chosen just as much for how well the MVs compress as for how well they match. That's really easy when you make all the decisions in the same order they're being encoded in... but GPUs do everything at the same time, so it doesn't work there. Badaboom etc. most likely just don't bother doing any of it, so they're fast because they suck.
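The shape of it, if that helps: a bounded queue keeps the analysis pass at most N frames ahead of the encoder. Toy sketch, not x264's actual code:

code:

import threading, queue

LOOKAHEAD = 50  # analysis runs up to ~50 frames ahead of encoding
decisions = queue.Queue(maxsize=LOOKAHEAD)

def lookahead(frames):
    for f in frames:
        # stand-in for the frame-type / visual-importance decisions
        decisions.put((f, {"type": "P", "weight": 1.0}))
    decisions.put(None)  # end of stream

def encoder():
    while True:
        item = decisions.get()
        if item is None:
            break
        frame, info = item  # encode the frame using the precomputed decisions

threading.Thread(target=lookahead, args=(range(500),), daemon=True).start()
encoder()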

Anyway, the two people working on it still haven't finished after months, so it still can't be that easy...

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
A couple welcome updates:

Sandy Bridge's 6-series chipsets will have native USB 3.0 support after all. Both notebook and desktop platforms will have it; no idea how many ports, unfortunately. Apparently Intel kept this on the down-low because they weren't sure if their native implementation would pass qualification, but it did.

Sandy Bridge graphics details, including branding. Graphics on Sandy Bridge will come in two flavors: the 6 Execution Unit version called Intel HD Graphics 100, and the 12 EU version called Intel HD Graphics 200. HD Graphics 200 will only be available on Core i5 2500 and i7 2600 desktop processors (and presumably higher), though according to previous information all mobile processors will have HD Graphics 200. This does appear to confirm that Anandtech tested the 12 EU version.

Murodese
Mar 6, 2007

Think you've got what it takes?
We're looking for fine Men & Women to help Protect the Australian Way of Life.

Become part of the Legend. Defence Jobs.
I just want it all to come out already so I can upgrade my PC for the first time in years :smith:

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
Same here. I hope they release them in January but that's just wishful thinking. Which CPU got that 5GHz overclock on air cooling with a minor voltage increase? I'm hoping it was the i5-2500K because I want one of those so bad.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
I hope this chipset is finally the one to force motherboard makers to UEFI.

wolrah
May 8, 2006
what?

incoherent posted:

I hope this chipset is finally the one to force motherboard makers to UEFI.

There are rumors that the next version of Windows will require UEFI, though I doubt that will be the case, since almost no one has it right now and very few computers could be upgraded. It would be nice if they'd come out and say that the version after that will require it, though; that gives plenty of time for hardware manufacturers to get up to speed and for most computers worth upgrading the OS on to have support.

Ryokurin
Jul 14, 2001

Wanna Die?
Very doubtful. The people who started that rumor are probably the same ones saying Microsoft will "allow" it next year, when it's been an official boot option since Vista SP1. It's doubtful because a major reason there's still a 32-bit version of Windows is that, up until a year or so ago, a decent number of Intel chips were 32-bit only. There's no way Microsoft would ignore millions of machines that could run the next OS fine but can't because of some technicality.

Zhentar
Sep 28, 2003

Brilliant Master Genius

wolrah posted:

There are rumors that the next version of Windows will require UEFI, though I doubt that will be the case, since almost no one has it right now and very few computers could be upgraded. It would be nice if they'd come out and say that the version after that will require it, though; that gives plenty of time for hardware manufacturers to get up to speed and for most computers worth upgrading the OS on to have support.

The only thing companies will do if you give them a couple extra years like that is spend a couple years hoping you change your mind.

Lum
Aug 13, 2003

Zhentar posted:

The only thing companies will do if you give them a couple extra years like that is spend a couple years hoping you change your mind.

At least it will be more clear-cut than the Vista driver support fiasco. It'll either boot or it won't.

That said, all those people like me with decent gaming rigs and no plans to upgrade the hardware will probably be able to run Windows 8 just fine, and will hold off on it if UEFI is a requirement.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Remember that UEFI is required in order to boot from HDDs larger than 2TB, and since we currently have 3TB HDDs that are relegated to external storage applications, that's kind of important.
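(Where the 2TB wall comes from, for reference: MBR partition tables address sectors with 32-bit LBAs, and sectors are traditionally 512 bytes.)

code:

max_mbr_bytes = 2**32 * 512
print(max_mbr_bytes)  # 2199023255552 bytes = 2 TiB (~2.2 TB)
# anything bigger needs GPT's 64-bit addressing, and booting from
# GPT is where UEFI comes in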

greasyhands
Oct 28, 2006

Best quality posts,
freshly delivered

incoherent posted:

I hope this chipset is finally the one to force motherboard makers to UEFI.

MSI announced all their Sandy Bridge boards will be UEFI; don't know about other manufacturers.

http://techpinger.com/2010/06/msi-working-on-uefi-that-will-kill-bios-in-three-years/

movax
Aug 30, 2008

incoherent posted:

I hope this chipset is finally the one to force motherboard makers to UEFI.

I hope so, because I'm tired of dealing with AMI's BIOS development environment and x86 assembly. So looking forward to being lazy as poo poo and being able to use C for firmware development.

incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010

Alereon posted:

Remember that UEFI is required in order to boot from HDDs larger than 2TB, and since we currently have 3TB HDDs that are relegated to external storage applications, that's kind of important.

This is the primary reason I'm looking for a UEFI board.

4 Day Weekend
Jan 16, 2009

incoherent posted:

This is the primary reason I'm looking for a UEFI board.

Well, right now all 3TB HDDs are 5400RPM, so not exactly something you'd want to have as a boot drive. Still, hopefully 1155/new AMD boards will make UEFI standard.
