|
Lord Windy posted:In this thread, does anyone explain the problems that Bulldozer had? I am interested and want to read up on it. http://www.agner.org/optimize/microarchitecture.pdf Section 15.19 is of interest to you. Long story short: poo poo instruction decode in early models, poo poo execution unit balance which harms integer-based performance, long latencies for many operations, long pipelines causing problems with mispredicted branches aaaaaand finally some issues with caches (more severe on some models).
|
# ? Mar 20, 2015 11:09 |
|
No Gravitas posted:http://www.agner.org/optimize/microarchitecture.pdf All of this is pretty shocking to me, especially that AMD's best years from 2003-2006 were primarily because Intel had an overly long pipeline and long latencies that made their chips less power efficient and less competitive. I guess AMD failed to learn a single thing from watching Intel flounder with the P4.
|
# ? Mar 20, 2015 13:49 |
|
Bulldozer was the epitome of 'hurry up and wait'
|
# ? Mar 20, 2015 14:02 |
|
No Gravitas posted:http://www.agner.org/optimize/microarchitecture.pdf Was anything but the pipeline theoretically fixable, and if so would it have had any real improvement on performance? I get the impression long pipelines are intrinsic to the design, but I'm not savvy enough to read the pdf as much more than words in a comparative sense; I completely lack the reference points to draw conclusions from.
|
# ? Mar 20, 2015 22:28 |
|
Twerk from Home posted:All of this is prettying shocking to me, especially that AMD's best years from 2003-2006 were primarily because Intel had an overly long pipeline and long latencies that made their chips less power efficient and less competitive. I guess AMD failed to learn a single thing from watching Intel flounder with the P4. Wasn't part of their success in that time period due to folks from DEC who worked on Alpha getting jobs at AMD and using the stuff they learned from that?
|
# ? Mar 21, 2015 06:27 |
|
They were also a lot closer to Intel when it came to process tech back then too. They even had a lead on Intel for a brief time during the transition from aluminum to copper interconnects; AMD got there first by at least a few months. Today they're almost two years behind Intel on process tech and they don't have control of the fab they use like they used to.
|
# ? Mar 21, 2015 11:27 |
|
I'm still confused on why Intel thought Netburst was going to be a thing. I think it speaks volumes that they could force their way through that disaster yet come out fine. Even weirder is that at no point did they really need such a radical new architecture, the PIII was fine. In theory they could have moved the PIII to 90nm and P4 would have had no reason to exist; if 479 replaced 478 in this new timeline AMD would have been sunk even before Piledriver, as the Dothans and Yonahs were perfectly capable of keeping up with equivalent AMD stuff. Maybe there was some deepcover AMD shill at Intel who managed to convince everyone to shove their heads up their asses until they saw daylight again. Further reading on the PDF Gravitas linked has me in stitches at how enormously awful the Atom must be to lose to the VIA Nano. Or I am just not being appreciative enough of how capable the Nano actually is?
|
# ? Mar 21, 2015 14:16 |
|
FaustianQ posted:I'm still confused on why Intel thought Netburst was going to be a thing. I think it speaks volumes that they could force their way through that disaster yet come out fine. Even weirder is that at no point did they really need such a radical new architecture, the PIII was fine. In theory they could have moved the PIII to 90nm and P4 would have had no reason to exist, if 479 replaced 478 in this new timeline AMD would have been sunk even before Piledriver, as the Dothans and Yonahs were perfectly capable of keeping up with equivalent AMD stuff. Atom's whole bag was "low power x86 at any cost, and do it as quickly as possible" so it's no surprise it kinda sucked until the latest revs.
|
# ? Mar 21, 2015 14:43 |
|
FaustianQ posted:I'm still confused on why Intel thought Netburst was going to be a thing. Even when cooled with liquid nitrogen I don't think the P4 ever got to the 10 GHz it was expected to hit on air at some point during its lifespan. The highest OC that I can recall for that chip is around 7 GHz stable with extreme cooling. What gets me is why the hell AMD decided to take the same route with Bulldozer and do a speed demon architecture after the failures of the P4 were clear and it was also clear that power and heat were going to still be a big problem in future process shrinks.
|
# ? Mar 21, 2015 17:39 |
|
PC LOAD LETTER posted:What gets me is why the hell AMD decided to take the same route with Bulldozer and do a speed demon architecture after the failures of the P4 were clear and it was also clear that power and heat were going to still be a big problem in future process shrinks. I believe the initial idea for BD was that their fpu sharing was going to be a home run. It was only after it was apparent internally that it was not that they started going for clock speed.
|
# ? Mar 21, 2015 19:15 |
|
Nintendo Kid posted:Atom's whole bag was "low power x86 at any cost, and do it as quickly as possible" so it's no surprise it kinda sucked until the latest revs. It also wasn't intended for general-purpose PCs at launch. Intel's idea for Atom was that it would give them a credible competitor to ARM in the broad middle class of devices that need way more processing power than a microcontroller can give them, but not enough to make the cost and packaging of a full x86 PC setup worth the hassle. Think set-top boxes, routers, car infotainment, smart appliances, and things like that. This has been a weak point for Intel for a long time. They didn't care so much when they could just pump out high-margin PC and server processors and leave cheap embedded stuff to others, but with a lot of consumer demand moving from PCs to smart devices, ARM was looking like a much bigger threat. Because Intel was getting eaten alive in that market and wanted to jump-start a competitive product, they sold bargain-price Atoms to OEMs under a term that was supposed to lock out PCs: a maximum display size restriction in the complete hardware. Ten inches would be more than enough for a barcode scanner, industrial control interface, or something like that, but nobody wanted a laptop that small, right? Of course, OEMs saw dirt-cheap CPUs that they could slap together with legacy chipsets to run Windows or Linux, and it was off to the races in a machine that just barely met the licensing specs. The original Atom netbooks sucked because the CPUs were never intended to run a full-fat consumer OS on a PC. Now that Intel has actually focused on the "cheap low-end PC" market for Windows tablets, Atoms are actually pretty good at it. This also explains why "netbook" came and went as a size category; it was entirely driven by Intel (and Microsoft) licensing restrictions. 
When AMD started building netbook-class processors without the display size limit, OEMs immediately slapped them into larger chassis, and they became super-cheap 15" laptops instead of netbooks. And, because Intel and Microsoft are fighting for the tablet market, you can get cheap Atom x86 tablets right now.
|
# ? Mar 21, 2015 19:32 |
|
adorai posted:I believe the initial idea for BD was that their fpu sharing was going to be a home run. It was only after it was apparent internally that it was not that they started going for clock speed. BD was always supposed to have higher clocks than K10 though to make up for the loss in IPC, and the module idea, or at least AMD's implementation of it, panned out worse than expected for single-thread performance, so BD turned into a real mess. On top of that, multithreaded applications never took off as fast as AMD expected them to either, so even today very few programs allow BD to shine on the one thing it was meant to be, and sorta is, good at.
|
# ? Mar 21, 2015 20:28 |
|
All I remember about BD was that JF guy from AMD marketing ultimately poisoning the well
|
# ? Mar 21, 2015 21:03 |
|
PC LOAD LETTER posted:Yea the module approach was supposed to give them better multi threaded performance than SMT for a minor hit in single thread performance while using less die space than a 'full' dual or quad core CPU. The idea sounded good and I was quite optimistic about it at first. It'd be weird if AMD Thubans and Visheras age better than their Intel cousins if we head into a heavy multithreaded era. Considering the challenges of this, I doubt it but it'd be kinda funny. Is it me or does AMD always seem to kind of trip over itself in its dash to "The Future"? Maybe I'm just not hearing enough groundbreaking stuff from Intel, or just that any activity from AMD is good.
|
# ? Mar 21, 2015 23:18 |
|
I suspect that a lot of future heavily multithreaded stuff is going to be hampered on Bulldozer-type chips by the shared-FPU thing.
|
# ? Mar 21, 2015 23:20 |
|
Maybe for FP-intensive workloads, sure. There is a lot of stuff that is still integer/branchy as heck out there though. FaustianQ posted:Is it me or does AMD always seem to kind of trip over itself in its dash to "The Future"? edit: \/\/\/\/ Well sure but lots of stuff isn't vectorized either. You can always nitpick stuff but what is the common case scenario? PC LOAD LETTER fucked around with this message at 02:07 on Mar 22, 2015 |
# ? Mar 21, 2015 23:35 |
|
PC LOAD LETTER posted:Maybe for FP-intensive workloads, sure. There is a lot of stuff that is still integer/branchy as heck out there though. Floating-point hardware is frequently reused for integer SIMD instructions, so vectorized integer code has just as much trouble.
|
# ? Mar 22, 2015 01:24 |
|
FaustianQ posted:I'm still confused on why Intel thought Netburst was going to be a thing. I think it speaks volumes that they could force their way through that disaster yet come out fine. Even weirder is that at no point did they really need such a radical new architecture, the PIII was fine. In theory they could have moved the PIII to 90nm and P4 would have had no reason to exist, if 479 replaced 478 in this new timeline AMD would have been sunk even before Piledriver, as the Dothans and Yonahs were perfectly capable of keeping up with equivalent AMD stuff. Intel loves new architectures. https://en.wikipedia.org/wiki/Intel_iAPX_432 https://en.wikipedia.org/wiki/Intel_i860 https://en.wikipedia.org/wiki/Itanium You might notice a common pattern behind the failure of those projects.
|
# ? Mar 22, 2015 11:39 |
|
Longinus00 posted:Intel loves new architectures. They all start with the letter "i"?
|
# ? Mar 22, 2015 13:00 |
|
Over-reliance on compilers to make software go fast.
|
# ? Mar 22, 2015 13:31 |
|
Running slower than the 68000 series?
|
# ? Mar 22, 2015 21:26 |
|
Man, WCCFTech is known for posting some really stupid rumors but the rumor that Samsung is planning to acquire AMD may be the stupidest one yet.
|
# ? Mar 25, 2015 18:23 |
|
Jesus, that old chestnut again?
|
# ? Mar 25, 2015 19:51 |
|
How good are the Samsung fabs? Wouldn't they be better off doing something else than attempt to make AMD designs better?
|
# ? Mar 25, 2015 22:29 |
|
Lord Windy posted:How good are the Samsung fabs? Wouldn't they be better off doing something else than attempt to make AMD designs better? They already make ARM-based Exynos chips, as well as both planar and 3D TLC NAND. And that's only at their wholly-owned facilities. Recent financials from Nvidia also suggest that they are making chips for Nvidia too, although whether this is GPUs or Tegra is uncertain.
|
# ? Mar 25, 2015 22:56 |
|
SwissArmyDruid posted:They already make ARM-based Exynos chips, as well as both planar and 3D TLC NAND. And that's only at their wholly-owned facilities. Recent financials from Nvidia also suggest that they are making chips for Nvidia too, although whether this is GPUs or Tegra is uncertain. Since we're already in crazy land, couldn't they just make ARM desktop chips instead of fighting with Intel on x86? If there is even a market to fight for that is, but Android could probably work as a Desktop thing.
|
# ? Mar 25, 2015 23:49 |
|
Lord Windy posted:Since we're already in crazy land, couldn't thye just make ARM desktop chips instead of fighting with Intel on x86? If there is even a market to fight for that is, but Android could probably work as a Desktop thing. Samsung doesn't need AMD to make a lovely ARM desktop if they want, since AMD has no fabs and has less experience with ARM than Samsung themselves have.
|
# ? Mar 26, 2015 00:19 |
|
Lord Windy posted:Since we're already in crazy land, couldn't they just make ARM desktop chips instead of fighting with Intel on x86? If there is even a market to fight for that is, but Android could probably work as a Desktop thing. Lemme ask you a question. If Intel wants to get into the mobile SoC market with their sub-5-watt parts and heavily subsidizing these parts, why is it crazy for Samsung to want to get into the x86 market? The idea isn't for Samsung to make ARM desktops, it's to get their fingers into the cross-licensing agreement that AMD has with Intel. Intel lets AMD use x86 free of charge, because AMD lets Intel use x86-64 free of charge. This was the crux of a pretty big lawsuit a few years ago where Intel's lawyers tried to break the agreement when rumors of Samsung buying AMD came up. SwissArmyDruid fucked around with this message at 00:24 on Mar 26, 2015 |
# ? Mar 26, 2015 00:21 |
|
SwissArmyDruid posted:Lemme ask you a question. If Intel wants to get into the mobile SoC market with their sub-5-watt parts and heavily subsidizing these parts, why is it crazy for Samsung to want to get into the x86 market? Want? x86?
|
# ? Mar 26, 2015 00:51 |
|
SwissArmyDruid posted:Lemme ask you a question. If Intel wants to get into the mobile SoC market with their sub-5-watt parts and heavily subsidizing these parts, why is it crazy for Samsung to want to get into the x86 market? The idea isn't for Samsung to make ARM desktops, it's to get their fingers into the cross-licensing agreement that AMD has with Intel. Intel lets AMD use the x86 free of charge, because AMD lets Intel use the x86-64 free of charge. Samsung doesn't have the magic ability to make AMD's current lineup perform as well as Intel's stuff, so they'd still need to buy a lot of Intel parts for any of their desktops/laptops/servers that are worth a poo poo.
|
# ? Mar 26, 2015 01:00 |
|
Nintendo Kid posted:Samsung doesn't have the magic ability to make AMD's current lineup perform as well as Intel's stuff, so they'd still need to buy a lot of intel parts for any of their desktops/laptops/servers that are worth a poo poo. Agreed. But having three other fabs whose 14nm FinFET process can be directly applied to an AMD product is going to improve TDPs relative to current parts, at least, so it's *sort of* like magic. (Especially since they were going to be using Samsung's FinFET at GloFo anyways.) Remember how AMD was said to have lost an Apple contract because of concerns that they wouldn't be able to keep them supplied a few years back? Remember how AMD couldn't keep up with production when demand spiked heavily for bitcoin mining last year? I'm not saying that Samsung buying AMD would make everything better. But it wouldn't be completely without benefit or benefit-neutral for AMD. They would actually get something out of it, which, I assume, is why the rumors persist and keep coming back every few years or so. SwissArmyDruid fucked around with this message at 01:29 on Mar 26, 2015 |
# ? Mar 26, 2015 01:22 |
|
SwissArmyDruid posted:Agreed. But having three other fabs whose 14nm FinFET process can be directly applied to an AMD product is going to improve TDPs relative to current parts, at least, so it's *sort of* like magic. (Especially since they were going to be using Samsung's FinFET at GloFo anyways.) Samsung's fabs are largely locked up with producing all their current products and fabbing things on contract. They would need to invest quite a bit more money to build extra capacity to try running off x86 stuff as well. It was also because AMD's performance was terrible. Lack of capacity was just the cherry on top. AMD couldn't keep up with production for graphics cards because all the AMD cards good for bitcoin mining were obsolete cards that were not being produced in large numbers anymore. The then current cards were worse for mining, since it was a fluke that any of the GPUs they made were good for mining at all. AMD would get something out of being bought, but Samsung gets about dick out of buying them. Which is why they ain't gonna buy.
|
# ? Mar 26, 2015 01:45 |
|
Does AMD have any good IP, like Sun did? (I assume Sun did, I wasn't old enough, or in the loop enough to understand what happened there). I'm surprised nobody has bought AMD out; they have to be worth only a few billion with all their debt.
|
# ? Mar 26, 2015 02:02 |
|
Lord Windy posted:Does AMD have any good IP, like Sun did? (I assume Sun did, I wasn't old enough, or in the loop enough to understand what happened there). Well they have the x64 patent/IP and the license to use x86 which is pretty major. The issue is that in order to use either you have to work off of their current offerings, since Intel sure ain't going to sell you some Core designs.
|
# ? Mar 26, 2015 02:06 |
|
Maybe Zen will be super fantastic and solve all problems forever and ever. I'd like that.
|
# ? Mar 26, 2015 02:11 |
|
Lord Windy posted:Maybe Zen will be super fantastic and solve all problems forever and ever. I'd like that.
|
# ? Mar 26, 2015 02:29 |
|
Lord Windy posted:Maybe Zen will be super fantastic and solve all problems forever and ever. I'd like that. I'd be delighted if Zen provided an Intel alternative that you wouldn't feel compelled to make excuses for buying.
|
# ? Mar 26, 2015 03:01 |
|
El Scotch posted:I'd be delighted if Zen provided an Intel alternative that you wouldn't feel compelled to make excuses for buying.
|
# ? Mar 26, 2015 03:14 |
|
Samsung buying AMD makes very little sense, and I doubt it's real. There's not a lot worth having at AMD: the embedded systems contracts, the GPU business, low-power x86, some IP, and I guess maybe a fixer-upper chip design to jump into the desktop market. None of those comes without a significant caveat; they'd all take some serious elbow grease to utilize successfully. The dreamer in me wishes it was true, it would be great if there were a competitor to keep Intel moving forward. But I don't think it really makes sense for Samsung to buy out a client who is barely holding on to profitability with very specific niches, especially when that client is locked into using their fabs to produce high-performance chips. OK, fanboy time is over. Gotta go drag my new 4690K into the den.
|
# ? Mar 26, 2015 04:11 |
|
I suspect that at most there is some kind of deal being worked on, maybe for Samsung to do some fabbing for AMD, and the rumor mill then blew it up into "Samsung is buying AMD!!!!"
|
# ? Mar 26, 2015 12:03 |