|
Alereon posted:Sup Barton bros, I had a Barton 2500+ @ 2.2GHz also, in an Abit NF7-S v2.0 motherboard. I had the exact same setup with a slightly lower OC. That board was one of the best I've ever had. I think it was one of the first to do Dolby Digital Live, which encoded stereo into surround on the fly. I used the poo poo out of that and it was years before a discrete card did it.
|
# ? May 19, 2014 14:11 |
|
I can't recall the motherboard precisely, but it was good at the time and had AGP 8X, yeeeeeehaaaw, and really good memory compatibility with dual-channel capability. I was running a 2800+ at 2.2GHz using a baller big ol' block of copper and lots of fins. The good old days, haha. If I'd waited just a few months, I'd have been able to build an Athlon64 system instead, use an early heat pipe cooler on it, and have a PCIe-capable computer instead of one of the last AGP and PCI only platforms. That caused no end of poo poo until I finally upgraded it, but it did last from 2003 to 2008, in the sense that I did not feel an impending NEED to replace it during that time period. By the time I finally did, I moved from that old Barton 2800+ at 2.2GHz to a Q9550 at 3.4GHz at stock voltage (that same computer is still trucking, too, at 3.8GHz with a bit of an overvolt - gave it to my brother-in-law because he's been trying to game on a lovely old laptop and it barely runs Bastion, heh), and moved from 1GB of (I want to say) standard-data-rate dual-channel RAM to 8GB of dual-channel DDR2 (P45Q-E motherboard); I wouldn't see another similar moment in computing until SSDs came out and made everything ludicrously faster. Upgrading from that old computer, I remember one of the first tests I did was to see how long it'd take DVDShrink to do a 2-pass encode. What took 2.5 hours on the Barton core at 2.2GHz with its 1GB of memory took less than seven minutes on the then-new computer. I was blown away by how much faster it was.
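Just for fun, a quick sanity check on that speedup (assuming exactly 2.5 hours before and a full 7 minutes after, since the post only says "less than seven"):

```python
# Rough speedup of the DVDShrink 2-pass encode, using the times above.
old_seconds = 2.5 * 3600   # ~2.5 hours on the Barton 2800+
new_seconds = 7 * 60       # "less than seven minutes" on the Q9550
print(f"speedup: roughly {old_seconds / new_seconds:.0f}x")
```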
Then its staying power got me to Sandy Bridge, which is where I'm still at even though I've got a box full of parts that I ought to look into turning into a Haswell computer, it's just, uuugh effort (and also my back hurts really bad so I don't know if I could do it anymore - building in 2011 took me three days or so, it just hurt so much having to get in and mess around in the guts of the thing that I'd have to stop after a bit, and since then I've had a failed surgery so I don't think it'd be any more pleasant an experience).
|
# ? May 19, 2014 14:24 |
|
SwissArmyDruid posted:And to think, people get terrified of loving up their processors NOW with the Intel LGA sockets. Buncha lightweights, I tell ya. I will say the pushpin design is kinda crap, but after the first time it's cake. Socket A never got any easier.
|
# ? May 19, 2014 15:25 |
|
I really want AMD to have some kind of big comeback. Anyone who wants progress in the personal computer space should. Please AMD, do something amazing
|
# ? May 19, 2014 16:27 |
|
Agreed posted:The good old days, haha. If I'd waited just a few months, I'd have been able to build an Athlon64 system instead, use an early heat pipe cooler on it, and have a PCIe-capable computer instead of one of the last AGP and PCI only platforms. Did you never get one of those crazy ASRock boards that had AGP and PCIe graphics slots and took either DDR or DDR2? They were great for upgrading your system gradually, and allowed me to use my BH5 RAM for ages until DDR2 prices dropped a bit.
|
# ? May 19, 2014 16:28 |
|
lovely Treat posted:Did you never get one of those crazy ASRock boards that had AGP and PCIe graphics slots and took either DDR or DDR2? Nooooope, after replacing the heat sink successfully once, I honestly felt that my luck would run out if I tried again. It was so damned easy to damage those fragile little wafers, holy crap. A friend ended up turning his into a useless (if pretty!) irregular polygon with a very nicely crushed corner from trying to install ~basically the same heat sink and fan setup that I was using, and after that I figured This Is Good Enough For Me (since the alternative was trying to figure out how to afford a new computer some time between senior year of high school and... poo poo, I guess junior year of college).
|
# ? May 19, 2014 16:39 |
|
lovely Treat posted:Imagine those people having to do a CPU pin mod to overclock instead of being able to just raise a multiplier a little and get an extra GHz clockspeed. Sticker, wire-wrap, or solder?
|
# ? May 19, 2014 17:09 |
|
SwissArmyDruid posted:Sticker, wire-wrap, or solder? Oh god, I forgot about some of the other ones: cutting traces on the CPU, joining traces with conductive pen/paint.
|
# ? May 19, 2014 17:32 |
|
lovely Treat posted:Imagine those people having to do a CPU pin mod to overclock instead of being able to just raise a multiplier a little and get an extra GHz clockspeed. The best mods were the ones you could do on old Athlon XP procs using some conductive paint to short the bridges. Did that on at least 2 of mine; I think it would actually make it default to a 333MHz FSB versus the stock 266MHz. The original AXP chip I first bought, I got with an Asus board and 512MB DDR; it wasn't too bad overall and I got it from the stock 1.53GHz up to a tad over 1.7GHz. Sadly that chip died when the power supply decided to literally throw sparks out the back and fry everything.
|
# ? May 19, 2014 21:12 |
|
conductive paint? I used a heavy hand with a 0.5mm mechanical pencil. The graphite was just conductive enough.
|
# ? May 19, 2014 22:24 |
|
adorai posted:conductive paint? I used a heavy hand with a 0.5mm mechanical pencil. The graphite was just conductive enough.
|
# ? May 20, 2014 01:12 |
|
I think we all already knew that our K13 with no more than four physical cores, no integrated graphics, and 14nm Samsung FinFET was in the pipeline, but apparently the ambidextrous x86/ARM thing is not it. http://fudzilla.com/home/item/34769-amd-plans-new-architecture-for-2016 The big takeaway: CMT is dead.
|
# ? May 20, 2014 02:22 |
|
Alereon posted:I did this too, but multipliers kept dropping off after a couple months until I put a square of scotch tape over the L1 bridges. I guess it was due to the graphite flaking off slowly, as it shouldn't have been hot enough for it to evaporate. And that's why conductive paint was used.
|
# ? May 20, 2014 02:24 |
|
Does this mean that AMD is looking to target the "high end" CPU market again?
|
# ? May 20, 2014 06:24 |
|
Ragingsheep posted:Does this mean that AMD is looking to target the "high end" CPU market again? I don't know about high end, but if they can get competitive in the mid-range Core i5 market and lower, then AMD has a chance. For consumer CPU sales, the overwhelming majority of sales comes from the lower end of the market. I don't know what AMD has planned for the server market, but they might just concede it, since they probably have no hope of competing with the solutions Intel and IBM have there.
|
# ? May 20, 2014 07:39 |
|
This is AMD basically admitting that the whole CMT architecture idea was a bad idea and planning a comeback attempt for 2016. So yes, they are targeting the "high end" CPU market (along with the other CPU markets), but we won't see silicon until 2016, and there are many challenges on the way. We're pulling for you AMD, you crazy underdogs.
|
# ? May 20, 2014 11:36 |
|
This is pretty much the same strategy they used leading up to Bulldozer: pretending to be a much bigger company than they actually are and spending accordingly, while somehow forgetting that, as always, they're up against Intel - a company so willing to gently caress you up that it doesn't even matter if you've got a better product (like, I dunno, an integrated modem). They'll sell a package that costs them more just so it competes at a lower price, and "insist" via back channels that vendors go with their version, because of course they will, they're the apex predator right now. Not even patents work against them. Your good ideas are jack poo poo against their huge piles of money and cutthroat tactics.
|
# ? May 20, 2014 12:41 |
|
Abandon desktops, run to low power markets, hope for the best.
|
# ? May 20, 2014 14:55 |
|
Agreed posted:This is pretty much the same strategy they did leading up to Bulldozer, pretending to be a much bigger company than they actually are and spending accordingly, while forgetting somehow that as always they're up against Intel, a company so willing to gently caress you up that it doesn't even matter if you've got a better product (like, I dunno, an integrated modem), they'll sell a package that costs them more just so that it competes, for a lower price, and "insist" via back channels that vendors go with their version because of course they will they're the apex predator right now. Not even patents work against them. Your good ideas are jack poo poo to their huge piles of money and cutthroat tactics. This is a very depressing post. Is there no hope in persuading the
|
# ? May 20, 2014 19:15 |
|
Intel has the money, but at the end of the day, if AMD can push out some CPUs that get recommended in the parts picking thread from top to bottom (much like their GPUs), then that would be a good start. They'd need to significantly increase single thread performance and significantly reduce power usage to do this.
|
# ? May 21, 2014 10:53 |
|
HalloKitty posted:Intel has the money, but at the end of the day, if AMD can push out some CPUs that get recommended in the parts picking thread from top to bottom (much like their GPUs), then that would be a good start. This is why GloFo licensing Samsung's 14nm FinFET process is so exciting. The process shrink means they'll be able to increase transistor count in general, and on their high-end parts they'll take the new direction by cutting core count, leaving more area (and more transistors) per core.
|
# ? May 24, 2014 03:17 |
|
All this time I thought "better luck next platform" was just a joke. Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green.
|
# ? May 24, 2014 14:14 |
|
Crotch Fruit posted:All this time I thought "better luck next platform" was just a joke. Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green. Oh, no, there's absolutely no joke. The only desktop reason to buy an AMD CPU now is if you absolutely need to spend the least possible amount for a passable graphics chip, but never, ever intend to upgrade that system: Richland APUs. There's a niche element to some of the CPUs if you want a few more cores for a few less dollars, but that's mainly if you're into virtualisation, video encoding and suchlike. So, in summary, no, no joke, and there's no reason to buy an AMD CPU right now for general use (gaming, web browsing, laptops, etc). Intel has single-thread performance and power consumption completely sewn up, which matters to all general purpose scenarios. AMD GPUs are fine, and have now come down to a great price, so the ATI side of things is still doing well. As for future hope for AMD CPUs, well... there are rumours of a new architecture in the future, but right now, there's not much to say. HalloKitty fucked around with this message at 14:50 on May 24, 2014 |
# ? May 24, 2014 14:47 |
|
I've become uncomfortable even with the "good" AMD APUs since Anandtech revealed that most non-overclocking boards power off or restart under load because the VRMs aren't sufficient for the APU's sustained TDP. That's just bullshit.
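Back-of-envelope on why that bites (my own illustrative numbers here, not Anandtech's measurements): at core voltages around 1.3 V, a sustained draw of ~100 W means the board's VRM has to push a lot of current continuously:

```python
# Illustrative only: VRM output current at an assumed sustained
# package power and core voltage (numbers made up for the example).
tdp_watts = 100.0  # assumed sustained APU power draw
vcore = 1.3        # assumed core voltage
amps = tdp_watts / vcore
print(f"~{amps:.0f} A sustained through the VRM")
```

Cheap boards with few phases and no VRM heatsinks aren't built to shed heat at that kind of current indefinitely, hence the shutdowns.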
|
# ? May 24, 2014 17:52 |
|
HalloKitty posted:As for future hope for AMD CPUs, well... there are rumours of a new architecture in the future, but right now, there's not much to say. I'm honestly holding on to hope that the whole deal with CMT was a one-off, sort of like when Intel released the Prescott P4 chips way back that were overly hot and didn't have the performance to compete with that generation of A64 processors. I think it's good news that AMD is looking ahead to the next year or two of development, and going back to a more "tried and true" method after the Bulldozer flop. Maybe someday Intel or AMD (or hell, even IBM) might get the whole CMT idea working better, but who knows for sure. I just hope AMD gets back in the fold and can bring back the competition that gave everyone awesome CPU choices between them and Intel.
|
# ? May 24, 2014 18:39 |
|
Crotch Fruit posted:All this time I thought "better luck next platform" was just a joke. Is there any AMD CPU I should buy for any reason? Or at least an excuse I can tell people for why I didn't just buy Intel? I'm still holding out hope for big green. Only if you want an inexpensive SFF HTPC. That's about all that AMD APU chips are good for, right now, since they'll give you... oh, let's say 75% of the power that an intel chip will have, with equal or better graphics, (Intel has really stepped up their game in the past year or so) without the price premium that Intel commands. And since it'll be plugged into the mains, no considerations about battery life need be made. edit: That said, I *would* consider the use of non-APU parts in gaming computers on a very destitute budget. The Pentium name still holds too many bad memories for me. SwissArmyDruid fucked around with this message at 20:44 on May 24, 2014 |
# ? May 24, 2014 20:40 |
|
SwissArmyDruid posted:Only if you want an inexpensive SFF HTPC. That's about all that AMD chips are good for, right now, since they'll give you... oh, let's say 75% of the power that an intel chip will have, with equal or better graphics, (Intel has really stepped up their game in the past year or so) without the price premium that Intel commands. And since it'll be plugged into the mains, no considerations about battery life need be made.
|
# ? May 24, 2014 20:47 |
|
Alereon posted:And this isn't even so compelling now that Intel is including the QuickSync and ClearVideo engines on Celeron and Pentium processors, rather than just the Core i3. A Celeron G1840 for $42 is a pretty capable HTPC CPU now. Heh, funny you mention the Pentiums and Celerons. Those names hold very painful memories for me, having worked in IT during the P4 HT days. loving Prescott.
|
# ? May 24, 2014 21:11 |
|
SwissArmyDruid posted:Heh, funny you mention the Pentiums and Celerons. Those names hold very painful memories for me, having worked in IT during the P4 HT days. loving Prescott. Some of us remember the 300A and have GOOD MEMORIES that went on to be tarnished by lololol deep pipeline dreams.
|
# ? May 25, 2014 06:37 |
|
deimos posted:Some of us remember the 300A and have GOOD MEMORIES that went on to be tarnished by lololol deep pipeline dreams. It's interesting that AMD made the same mistake with Bulldozer (high frequencies with deep pipelines) that Intel made with Prescott. You'd think they'd have learned from their competitor.
|
# ? May 25, 2014 06:55 |
|
So what in particular made deep pipelines so bad? Do they intrinsically have worse per-clock performance? Is it a simple problem of branch predictors not being good enough?
|
# ? May 25, 2014 07:14 |
|
PerrineClostermann posted:So what in particular made deep pipelines so bad? Do they intrinsically have worse per-clock performance? Is it a simple problem of branch predictors not being good enough? with a deep pipeline, when the predictors are wrong, there's more work to throw away and it takes more time to redo it
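The usual back-of-envelope model makes the scaling obvious. A minimal sketch, with made-up but plausible numbers (branch mix, mispredict rate, and roughly Northwood-vs-Prescott-ish flush penalties are all illustrative):

```python
# Toy model: how pipeline depth inflates the average cost of a
# branch mispredict. All inputs are illustrative, not measured.

def effective_cpi(base_cpi, branch_frac, mispredict_rate, flush_penalty):
    """Average cycles per instruction once mispredict flushes are included."""
    return base_cpi + branch_frac * mispredict_rate * flush_penalty

for name, penalty in [("~20-stage pipe", 20), ("~31-stage pipe", 31)]:
    cpi = effective_cpi(base_cpi=1.0, branch_frac=0.2,
                        mispredict_rate=0.05, flush_penalty=penalty)
    print(f"{name}: effective CPI ~ {cpi:.2f}")
```

Same predictor accuracy, but the deeper pipe pays more per miss, so it needs higher clocks just to break even.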
|
# ? May 25, 2014 10:11 |
|
Longer pipelines mean each stage does less work. If you can't make it up with higher clocks, it's going to be slower unless you do some smart caching, like Intel does with their micro-op cache.
|
# ? May 25, 2014 12:07 |
|
PerrineClostermann posted:Is it a simple problem Not even close. I'd actually say it's an ultra-complex problem, and any attempt at explaining it outside the confines of academia and research is gonna be high on snark and low on detail. No offense intended to regulars.
|
# ? May 25, 2014 16:29 |
|
Alereon posted:I've become uncomfortable even with the "good" AMD APUs since Anandtech revealed that most non-overclocking boards power off or restart under load because the VRMs aren't sufficient for the APU's sustained TDP. That's just bullshit.
|
# ? May 25, 2014 16:59 |
|
sincx posted:It's interesting that AMD made the same mistake with Bulldozer (high frequencies with deep pipelines) that Intel made with Prescott. You think they'd have learned from their competitor.
|
# ? May 25, 2014 17:23 |
|
Mirificus posted:Could you link to the article in question? I'd like to read more about the issue.
|
# ? May 25, 2014 17:24 |
|
Agreed posted:Not even close. I'd actually say it's an ultra-complex problem, and any attempt at explaining it outside the confines of academia and research is gonna be high on snark and low on detail. No offense intended to regulars. Please, make an attempt to do so anyway.
|
# ? May 25, 2014 17:50 |
|
Alereon posted:So far all of the Kaveri boards Anandtech has tested reboot under sustained load unless an additional fan is pointed at the VRMs, this includes the Asus A88X-Pro, which has somewhat upgraded power delivery. The overall issue seems to be that AMD APUs actually draw more power than the specs suggest under sustained load. Wow. Even if AMD makes a killer CPU, people are going to be hesitant to buy one after all the problems this go round.
|
# ? May 25, 2014 18:14 |
|
PerrineClostermann posted:Please, make an attempt to do so anyway. I don't agree that CMT is an inherently bad or wrong technology. We have to separate the technology itself from the implementation of that technology in AMD's products. One major issue was that GloFo's manufacturing sucked, so even if the architecture had been great, AMD was hamstrung by low clockspeeds, high power usage, and low yields. Another was that AMD's limited engineering resources prevented them from making progress with the architecture the way they needed to. I see Vishera (FX-8350) as what Bulldozer should have been; it wouldn't have knocked Intel out of the park, but it would have been competitive with Sandy Bridge at launch. Fundamentally, CMT is a more efficient way of allocating transistors to multi-threaded designs, but on the actual shipping product, features like Turbo Core weren't sufficient to provide enough per-thread performance for desktop workloads. Alereon fucked around with this message at 18:36 on May 25, 2014 |
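To make the "allocating transistors" point concrete, here's a toy comparison with completely made-up relative numbers (mine, not AMD's): the idea behind CMT is buying most of a second core's throughput for well under double the area, at the cost of some per-thread speed.

```python
# Toy illustration of the CMT trade-off. Areas and throughputs are
# invented relative units, not real measurements of any product.
designs = {
    # name: (die area, per-thread perf, two-thread aggregate perf)
    "two full cores":                   (2.00, 1.00, 2.00),
    "CMT module (shared frontend/FPU)": (1.60, 0.95, 1.70),
    "SMT core (shared everything)":     (1.05, 1.00, 1.25),
}
for name, (area, st, mt) in designs.items():
    print(f"{name}: {mt / area:.2f} aggregate perf per unit area, "
          f"{st:.2f} per-thread")
```

With numbers like these the CMT module wins on throughput per unit area, which is the pitch; the shipping parts lost because the per-thread figure (and the process behind it) came in too low, not because the accounting is wrong.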
# ? May 25, 2014 18:17 |