mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Alereon posted:

Sup Barton bros, I had a Barton 2500+ @ 2.2GHz also, in an Abit NF7-S v2.0 motherboard.

:hfive: I had the exact same setup with a slightly lower OC. That board was one of the best I've ever had. I think it was one of the first to do Dolby Digital Live, which encoded stereo into surround on the fly. I used the poo poo out of that, and it was years before a discrete card did it.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I can't recall the motherboard precisely, but it was good at the time: AGP 8X, yeeeeeehaaaw, and really good memory compatibility with dual-channel capability. I was running a 2800+ at 2.2GHz using a baller big ol' block of copper and lots of fins. The good old days, haha. If I'd waited just a few months, I'd have been able to build an Athlon64 system instead, use an early heat pipe cooler on it, and have a PCIe-capable computer instead of one of the last AGP- and PCI-only platforms. That caused no end of poo poo until I finally upgraded it, but it did last from 2003 to 2008, in the sense that I did not feel an impending NEED to replace it during that time. By the time I finally did, I moved from that old Barton 2800+ at 2.2GHz to a Q9550 at 3.4GHz at stock voltage (that same computer is still trucking, too, at 3.8GHz with a bit of an overvolt - gave it to my brother-in-law because he'd been trying to game on a lovely old laptop that barely runs Bastion, heh), and from 1GB of, I want to say, single-data-rate dual-channel RAM to 8GB of dual-channel DDR2 (P45Q-E motherboard). I wouldn't see another :tviv: moment in computing until SSDs came out and made everything ludicrously faster.

Upgrading from that old computer, I remember one of the first tests I ran was to see how long it'd take DVDShrink to do a 2-pass encode. What took 2.5 hours on the Barton core at 2.2GHz with its 1GB of memory took less than seven minutes on the then-new computer. I was blown away by how much faster it was. Then its staying power got me to Sandy Bridge, which is where I'm still at, even though I've got a box full of parts that I ought to look into turning into a Haswell computer; it's just, uuugh, effort (and also my back hurts really badly, so I don't know if I could do it anymore - the 2011 build took me three days or so because it hurt so much to get in and mess around in the guts of the thing that I'd have to stop after a bit, and since then I've had a failed surgery, so I don't think it'd be any more pleasant an experience).

Panty Saluter
Jan 17, 2004

Making learning fun!

SwissArmyDruid posted:

And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets.

Buncha lightweights, I tell ya :clint:

I will say the pushpin design is kinda crap but after the first time it's cake. Socket A never got any easier.

SCheeseman
Apr 23, 2003

I really want AMD to have some kind of big comeback. Anyone who wants progress in the personal computer space should. Please AMD, do something amazing :cry:

Shitty Treat
Feb 21, 2012

Stoopid?

Agreed posted:

The good old days, haha. If I'd waited just a few months, I'd have been able to build an Athlon64 system instead, use an early heat pipe cooler on it, and have a PCIe-capable computer instead of one of the last AGP- and PCI-only platforms

Did you never get one of those crazy ASRock boards that had both AGP and PCIe graphics slots and took either DDR or DDR2?
They were great for upgrading your system gradually; mine let me keep using my BH5 RAM for ages until DDR2 prices dropped a bit.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Shitty Treat posted:

Did you never get one of those crazy ASRock boards that had both AGP and PCIe graphics slots and took either DDR or DDR2?
They were great for upgrading your system gradually; mine let me keep using my BH5 RAM for ages until DDR2 prices dropped a bit.

Nooooope, after replacing the heat sink successfully once, I honestly felt my luck would run out if I tried again. It was so damned easy to damage those fragile little wafers, holy crap. A friend ended up turning his into a useless (if pretty!) irregular polygon with a very nicely crushed corner while trying to install basically the same heat sink and fan setup I was using, and after that I figured This Is Good Enough For Me (since the alternative was trying to figure out how to afford a new computer some time between senior year of high school and... poo poo, I guess junior year of college).

SwissArmyDruid
Feb 14, 2014

by sebmojo

Shitty Treat posted:

Imagine those people having to do a CPU pin mod to overclock instead of being able to just raise a multiplier a little and get an extra GHz clockspeed.

Sticker, wire-wrap, or solder? :colbert:

Shitty Treat
Feb 21, 2012

Stoopid?

SwissArmyDruid posted:

Sticker, wire-wrap, or solder? :colbert:

Oh god, I forgot about some of the other ones: cutting traces on the CPU, joining traces with conductive pen/paint.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Shitty Treat posted:

Imagine those people having to do a CPU pin mod to overclock instead of being able to just raise a multiplier a little and get an extra GHz clockspeed.

The best mods were the ones you could do on old Athlon XP processors, using some conductive paint to short the bridges. I did that on at least two of mine; I think it actually made the chip default to a 333MHz FSB versus the stock 266MHz. The original AXP chip I bought came with an Asus board and 512MB of DDR; it wasn't too bad overall, and I got it from the stock 1.53GHz up to a tad over 1.7GHz. Sadly, that chip died when the power supply decided to literally throw sparks out the back and fry everything. :(
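For anyone who never did the mod, the clock math was simple: core clock = FSB base clock x multiplier, and the quoted 266/333 FSB figures are double-pumped, so the underlying base clocks were 133MHz and 166MHz. A quick sketch of the arithmetic (the 11.5x multiplier is illustrative - it's the Athlon XP 1800+'s stock setting, not necessarily what my chip ran):

```python
# Socket A clock math: core clock = FSB base clock x multiplier.
# "266" and "333" FSB are double-pumped figures; the base clocks are
# 133MHz and 166MHz. The 11.5x multiplier is illustrative (the Athlon
# XP 1800+'s stock setting), not a measured value.

def core_clock_mhz(fsb_base_mhz: float, multiplier: float) -> float:
    """Athlon XP core clock from FSB base clock and multiplier."""
    return fsb_base_mhz * multiplier

stock = core_clock_mhz(133, 11.5)   # ~1530MHz, sold as the "1800+"
modded = core_clock_mhz(166, 11.5)  # ~1910MHz, if the silicon cooperates
print(f"stock: {stock:.0f} MHz, after the 266->333 bridge mod: {modded:.0f} MHz")
```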

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer
conductive paint? I used a heavy hand with a 0.5mm mechanical pencil. The graphite was just conductive enough.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

adorai posted:

conductive paint? I used a heavy hand with a 0.5mm mechanical pencil. The graphite was just conductive enough.
I did this too, but multipliers kept dropping off after a couple months until I put a square of scotch tape over the L1 bridges. I guess it was due to the graphite flaking off slowly, as it shouldn't have been hot enough for it to evaporate.

SwissArmyDruid
Feb 14, 2014

by sebmojo
I think we all already knew that our K13 - no more than four physical cores, no integrated graphics, and 14nm Samsung FinFET - was in the pipeline, but apparently the ambidextrous x86/ARM thing is not it.

http://fudzilla.com/home/item/34769-amd-plans-new-architecture-for-2016

The big takeaway: CMT (clustered multithreading) is dead.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

I did this too, but multipliers kept dropping off after a couple months until I put a square of scotch tape over the L1 bridges. I guess it was due to the graphite flaking off slowly, as it shouldn't have been hot enough for it to evaporate.

And that's why conductive paint was used.

Ragingsheep
Nov 7, 2009
Does this mean that AMD is looking to target the "high end" CPU market again?

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

Ragingsheep posted:

Does this mean that AMD is looking to target the "high end" CPU market again?

I don't know about high end, but if they can get competitive in the mid-range Core i5 market and below, then AMD has a chance. For consumer CPUs, the overwhelming majority of sales comes from the lower end of the market. I don't know what AMD has planned for the server market, but they might just concede it, since they probably have no hope of competing with the solutions Intel and IBM have there.

Rastor
Jun 2, 2001

This is AMD basically admitting that the whole CMT architecture was a mistake and planning a comeback attempt for 2016. So yes, they are targeting the "high end" CPU market (along with the other CPU markets), but we won't see silicon until 2016, and there are many challenges along the way.

We're pulling for you AMD, you crazy underdogs.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

This is pretty much the same strategy they used leading up to Bulldozer: pretending to be a much bigger company than they actually are and spending accordingly, while somehow forgetting that, as always, they're up against Intel - a company so willing to gently caress you up that it doesn't even matter if you've got a better product (like, I dunno, an integrated modem); they'll sell a package that costs them more just so it competes, at a lower price, and "insist" via back channels that vendors go with their version, because of course they will, they're the apex predator right now. Not even patents work against them. Your good ideas are jack poo poo against their huge piles of money and cutthroat tactics.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Abandon desktops, run to low power markets, hope for the best.

Sidesaddle Cavalry
Mar 15, 2013

Oh Boy Desert Map

Agreed posted:

This is pretty much the same strategy they used leading up to Bulldozer: pretending to be a much bigger company than they actually are and spending accordingly, while somehow forgetting that, as always, they're up against Intel - a company so willing to gently caress you up that it doesn't even matter if you've got a better product (like, I dunno, an integrated modem); they'll sell a package that costs them more just so it competes, at a lower price, and "insist" via back channels that vendors go with their version, because of course they will, they're the apex predator right now. Not even patents work against them. Your good ideas are jack poo poo against their huge piles of money and cutthroat tactics.

This is a very depressing post. Is there no hope of persuading the unwashed masses - er, grassroots vendors - that AMD processors would increase their overall sales if only they were stocked more often and marketed enough to be given a chance?

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Intel has the money, but at the end of the day, if AMD can push out some CPUs that get recommended in the parts picking thread from top to bottom (much like their GPUs), then that would be a good start.

They'd need to significantly increase single thread performance and significantly reduce power usage to do this.

SwissArmyDruid
Feb 14, 2014

by sebmojo

HalloKitty posted:

Intel has the money, but at the end of the day, if AMD can push out some CPUs that get recommended in the parts picking thread from top to bottom (much like their GPUs), then that would be a good start.

They'd need to significantly increase single thread performance and significantly reduce power usage to do this.

This is why GloFo licensing Samsung's 14nm FinFET process is so exciting. The process shrink means they'll be able to increase transistor counts in general; on their high-end parts, the new direction is to cut core count, freeing up area for more transistors per core.

Not Wolverine
Jul 1, 2007
All this time I thought "better luck next platform" was just a joke. :( Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green. :smith:

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Crotch Fruit posted:

All this time I thought "better luck next platform" was just a joke. :( Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green. :smith:

Oh, no, there's absolutely no joke.

The only desktop reason to buy an AMD CPU now is if you absolutely need to spend the least possible amount for a passable graphics chip, but never, ever intend to upgrade that system: Richland APUs.

There's a niche element to some of the CPUs if you want a few more cores for a few less dollars, but that's mainly if you're into virtualisation, video encoding and suchlike.

So, in summary, no, no joke, and there's no reason to buy an AMD CPU right now for general use (gaming, web browsing, laptops, etc).

Intel has single-thread performance and power consumption completely sewn up, which matters to all general purpose scenarios.

AMD GPUs are fine, and have now come down to a great price, so the ATI side of things is still doing well.

As for future hope for AMD CPUs, well... there are rumours of a new architecture in the future, but right now, there's not much to say.

HalloKitty fucked around with this message at 15:50 on May 24, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I've become uncomfortable even with the "good" AMD APUs since Anandtech revealed that most non-overclocking boards power off or restart under load because the VRMs aren't sufficient for the APU's sustained TDP. That's just bullshit.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

HalloKitty posted:

As for future hope for AMD CPUs, well... there are rumours of a new architecture in the future, but right now, there's not much to say.

I'm honestly holding on to hope that the whole deal with CMT was a one-off, sort of like when Intel released the Prescott P4 chips way back that were overly hot and didn't have the performance to compete with that generation of A64 processors. I think it's good news that AMD is looking ahead to the next year or two of development, and going back to a more "tried and true" method after the Bulldozer flop. Maybe someday Intel or AMD (or hell, even IBM) might get the whole CMT idea working better, but who knows for sure. I just hope AMD gets back in the fold and can bring back the competition that gave everyone awesome CPU choices between them and Intel.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Crotch Fruit posted:

All this time I thought "better luck next platform" was just a joke. :( Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green. :smith:

Only if you want an inexpensive SFF HTPC. That's about all that AMD APU chips are good for right now, since they'll give you... oh, let's say 75% of the performance of an Intel chip, with equal or better graphics (Intel has really stepped up their game in the past year or so), without the price premium that Intel commands. And since it'll be plugged into the mains, no considerations about battery life need be made.

edit: That said, I *would* consider the use of non-APU parts in gaming computers on a very destitute budget. The Pentium name still holds too many bad memories for me.

SwissArmyDruid fucked around with this message at 21:44 on May 24, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

SwissArmyDruid posted:

Only if you want an inexpensive SFF HTPC. That's about all that AMD chips are good for right now, since they'll give you... oh, let's say 75% of the performance of an Intel chip, with equal or better graphics (Intel has really stepped up their game in the past year or so), without the price premium that Intel commands. And since it'll be plugged into the mains, no considerations about battery life need be made.
And this isn't even so compelling now that Intel is including the QuickSync and ClearVideo engines on Celeron and Pentium processors, rather than just the Core i3. A Celeron G1840 for $42 is a pretty capable HTPC CPU now.

SwissArmyDruid
Feb 14, 2014

by sebmojo

Alereon posted:

And this isn't even so compelling now that Intel is including the QuickSync and ClearVideo engines on Celeron and Pentium processors, rather than just the Core i3. A Celeron G1840 for $42 is a pretty capable HTPC CPU now.

Heh, funny you mention the Pentiums and Celerons. Those names hold very painful memories for me, having worked in IT during the P4 HT days. loving Prescott.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

SwissArmyDruid posted:

Heh, funny you mention the Pentiums and Celerons. Those names hold very painful memories for me, having worked in IT during the P4 HT days. loving Prescott.

Some of us remember the 300A and have GOOD MEMORIES that went on to be tarnished by lololol deep pipeline dreams.

sincx
Jul 13, 2012

furiously masturbating to anime titties

deimos posted:

Some of us remember the 300A and have GOOD MEMORIES that went on to be tarnished by lololol deep pipeline dreams.

It's interesting that AMD made the same mistake with Bulldozer (chasing high frequencies with deep pipelines) that Intel made with Prescott. You'd think they'd have learned from their competitor.

PerrineClostermann
Dec 15, 2012

by FactsAreUseless
So what in particular made deep pipelines so bad? Do they intrinsically have worse per-clock performance? Is it a simple problem of branch predictors not being good enough?

atomicthumbs
Dec 26, 2010


We're in the business of extending man's senses.

PerrineClostermann posted:

So what in particular made deep pipelines so bad? Do they intrinsically have worse per-clock performance? Is it a simple problem of branch predictors not being good enough?

with a deep pipeline, when the predictors are wrong, there's more work to throw away and it takes more time to redo it

Arzachel
May 12, 2012
Longer pipelines mean each stage does less work. If you can't make it up with higher clocks, it's going to be slower, unless you do some smart caching like Intel does with their micro-op cache.
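To put rough numbers on the last two posts: charge every mispredicted branch a flush costing about one pipeline's worth of cycles, then compare a shallow and a deep design. All of the figures below (branch frequency, misprediction rate, depths) are illustrative assumptions, not measurements of any real chip:

```python
# Toy CPI model: a mispredicted branch flushes roughly one pipeline's
# worth of in-flight work, so the penalty scales with depth. Branch
# frequency, misprediction rate, and the depths are assumptions.

def effective_cpi(base_cpi, branch_frac, mispredict_rate, depth):
    """Average cycles per instruction once mispredict flushes are charged."""
    return base_cpi + branch_frac * mispredict_rate * depth

shallow = effective_cpi(1.0, 0.20, 0.05, 12)  # "P6-ish" depth
deep    = effective_cpi(1.0, 0.20, 0.05, 31)  # "Prescott-ish" depth

print(f"shallow CPI: {shallow:.2f}, deep CPI: {deep:.2f}")
print(f"clock increase needed to break even: {deep / shallow:.2f}x")
```

In this toy model the deep design needs roughly 17% more clock just to break even on branchy code, before counting the extra power the longer pipeline burns - which is what ultimately capped Prescott's clocks.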

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

PerrineClostermann posted:

Is it a simple problem

Not even close. I'd actually say it's an ultra-complex problem, and any attempt at explaining it outside the confines of academia and research is gonna be high on snark and low on detail. No offense intended to regulars. :)

Mirificus
Oct 29, 2004

Kings need not raise their voices to be heard

Alereon posted:

I've become uncomfortable even with the "good" AMD APUs since Anandtech revealed that most non-overclocking boards power off or restart under load because the VRMs aren't sufficient for the APU's sustained TDP. That's just bullshit.
Could you link to the article in question? I'd like to read more about the issue.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



sincx posted:

It's interesting that AMD made the same mistake with Bulldozer (chasing high frequencies with deep pipelines) that Intel made with Prescott. You'd think they'd have learned from their competitor.
Prescott was just the culmination of the mistake that was the NetBurst era for Intel. They should have stuck with derivatives of the P6 microarchitecture from the get-go.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Mirificus posted:

Could you link to the article in question? I'd like to read more about the issue.
So far, all of the Kaveri boards Anandtech has tested reboot under sustained load unless an additional fan is pointed at the VRMs; this includes the Asus A88X-Pro, which has somewhat upgraded power delivery. The overall issue seems to be that AMD APUs draw more power under sustained load than the specs suggest.
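For a sense of scale: at around 1.3V core voltage, a nominal 95W part is already pulling roughly 70A through the VRM, and the conversion losses get dissipated by the MOSFETs themselves. A back-of-the-envelope sketch - the Vcore, phase count, efficiency, and "real" sustained draw are all assumed figures, not Anandtech's measurements:

```python
# Back-of-the-envelope VRM heat estimate. Vcore, phase count, converter
# efficiency, and the sustained-draw figure are assumptions for
# illustration, not numbers from the Anandtech testing.

def vrm_stress(cpu_watts, vcore=1.30, phases=4, efficiency=0.90):
    """Return (amps delivered to the CPU, watts of heat per VRM phase)."""
    amps = cpu_watts / vcore
    loss_watts = cpu_watts * (1 - efficiency) / efficiency
    return amps, loss_watts / phases

for label, watts in [("on-spec 95W TDP", 95), ("sustained ~125W draw", 125)]:
    amps, per_phase = vrm_stress(watts)
    print(f"{label}: {amps:.0f} A total, {per_phase:.1f} W of heat per phase")
```

An extra watt of heat per phase doesn't sound like much, but bare MOSFETs with no heatsink and no airflow have little thermal margin, which fits the "point a fan at the VRMs" workaround.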

PerrineClostermann
Dec 15, 2012

by FactsAreUseless

Agreed posted:

Not even close. I'd actually say it's an ultra-complex problem, and any attempt at explaining it outside the confines of academia and research is gonna be high on snark and low on detail. No offense intended to regulars. :)

Please, make an attempt to do so anyway.

SYSV Fanfic
Sep 9, 2003

by Pragmatica

Alereon posted:

So far, all of the Kaveri boards Anandtech has tested reboot under sustained load unless an additional fan is pointed at the VRMs; this includes the Asus A88X-Pro, which has somewhat upgraded power delivery. The overall issue seems to be that AMD APUs draw more power under sustained load than the specs suggest.

Wow. Even if AMD makes a killer CPU, people are going to be hesitant to buy one after all the problems this go-round.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

PerrineClostermann posted:

Please, make an attempt to do so anyway.
Anandtech's Bulldozer Aftermath article is what you're looking for here: it provides a detailed yet still accessible overview of the Bulldozer microarchitecture and why it sucks. The articles on RealWorldTech by David Kanter go into incredible depth, though you already have to know a lot to make sense of them. Here's his piece on the Bulldozer microarchitecture. The Anandtech Trinity review also goes a bit into the improvements from Bulldozer to Piledriver, which shows what AMD thought were the most important and attainable performance tweaks.

I don't agree that CMT is an inherently bad or wrong technology. We have to separate the technology itself from its implementation in AMD's products. One major issue was that GloFo's manufacturing sucked, so even if the architecture had been great, AMD was hamstrung by low clockspeeds, high power usage, and low yields. Another was that AMD's limited engineering resources prevented them from advancing the architecture the way they needed to. I see Vishera (FX-8350) as what Bulldozer should have been; it wouldn't have knocked Intel out of the park, but it would have been competitive with Sandy Bridge at launch. Fundamentally, CMT is a more efficient way of allocating transistors to multi-threaded designs, but on the actual shipping product, features like Turbo Core weren't sufficient to provide enough per-thread performance for desktop workloads.
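To make the transistor-allocation point concrete: AMD's launch-era claim was roughly that the second integer core in a module cost about 12% more area while delivering about 80% of the throughput of a full second core. Taking those marketing numbers at face value (an assumption, not a measurement), the trade looks like this:

```python
# CMT's efficiency argument in miniature, using AMD's rough launch-era
# claims (second integer core ~= +12% module area for ~80% of a full
# second core's throughput). Treat these as assumptions.

def throughput_per_area(throughput: float, area: float) -> float:
    return throughput / area

two_full_cores = throughput_per_area(2.00, 2.00)  # baseline: 1.00
cmt_module     = throughput_per_area(1.80, 1.12)  # ~1.61

print(f"two full cores: {two_full_cores:.2f} throughput/area")
print(f"one CMT module: {cmt_module:.2f} throughput/area")
# The module wins on throughput-per-area, but each thread tops out around
# 0.9x of a full core -- the per-thread shortfall desktop workloads felt.
```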

Alereon fucked around with this message at 19:36 on May 25, 2014
