|
Ozz81 posted:The perception though is that cheap budget brand = not as good with a lot of consumers. Well, that's because they've also not been as good for most of their history. And they self-advertised as being all about cheap for a reasonable amount of time.
|
# ? Aug 20, 2014 03:22 |
|
|
Once Intel started selling the C2D E6300 for $183 in mid-2006 there was no reason to buy an AMD CPU for a gaming system ever again. It was so good that it made even the cheapest $120 A64 X2 look like a ripoff in comparison. Oh, so you wanna save $63 to buy a 2GHz X2 when the E6300 can easily be OCed to 3GHz, which makes it the equivalent of a 4GHz X2? Be my guest. What, 8 years since then isn't enough to make the AMD rose-tinted glasses drop already?
|
# ? Aug 20, 2014 03:48 |
|
I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance compared to 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress.
|
# ? Aug 20, 2014 04:14 |
|
Palladium posted:Once Intel started selling the C2D E6300 for $183 in mid-2006 there was no reason to buy an AMD CPU for a gaming system ever again. It was so good that it made even the cheapest $120 A64 X2 look like a ripoff in comparison. Oh, so you wanna save $63 to buy a 2GHz X2 when the E6300 can easily be OCed to 3GHz, which makes it the equivalent of a 4GHz X2? Be my guest. Ugh, I remember this like it was yesterday. Up until those C2Ds hit, AMD was the current hotness. I was in 10th grade, and finally convinced my parents to give me money to build a new family PC (aka a new gaming PC for myself). Not having a job and being in 10th grade and all, it took some prodding to convince them to hand 800 bucks to their son. They eventually did, though, and I built what I thought was a pretty sweet system. Thing is, I didn't know the C2D was on the horizon, and at the time, reading all of the conflicting information out there about different builds and such was pretty overwhelming. I wound up getting an AMD dual-core 4000+ processor (the exact name escapes me atm) and building it in, I think, June 2006. You can imagine my disappointment when I found out that not only was a better processor being released in a few weeks' time, but that its performance absolutely blew away anything else available, including the system I had just built with my parents' money.
|
# ? Aug 20, 2014 04:54 |
|
I'll say one thing, I'm glad AMD finally dropped the stupid "performance rating" poo poo when naming their CPUs.
|
# ? Aug 20, 2014 05:00 |
|
The Lord Bude posted:I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance compared to 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress. It looks like it was pretty close to Intel's offerings at its price point at the time of its launch. Intel didn't really run away with the performance crown until Sandy Bridge; the jump from Lynnfield to that was gigantic, on top of Sandy Bridge's amazing overclocking capabilities.
|
# ? Aug 20, 2014 05:21 |
|
Nintendo Kid posted:I'll say one thing, I'm glad AMD finally dropped the stupid "performance rating" poo poo when naming their CPUs. What, and the cryptic model numbers we get these days from the AMD and Intel camps are actually any better?
|
# ? Aug 20, 2014 10:37 |
|
fart simpson posted:Yeah but back then AMD had a competitive if not outright superior product. You know why a salesman framing it that way was unfair, then. At any rate, outside of the gamer & budget subsets most people didn't really give a ship whose chip was in the PC, and since the OEMs were being threatened/bribed into pushing Intel, that's what they got.
|
# ? Aug 20, 2014 15:55 |
|
cisco privilege posted:What a lot of people like to forget is that AMD chipsets from the era were loving awful, with missing or misfiring features compared to Intel chipsets. Sure, a Pentium 4 wouldn't run nearly as well, but when AMD boards had VIA- or SiS-level quality concerns and weird compatibility issues, it usually wasn't worth trying them out compared to an Intel setup, which would presumably just work, albeit with measurably worse performance. Apple switched to Intel during AMD's heyday as well. This is a totally blind guess on my part, but I assume things like that and superior supply-chain management are more important to these guys than just CPU benchmarks.
|
# ? Aug 20, 2014 16:29 |
|
WhyteRyce posted:Apple switched to Intel during AMD's heyday as well. This is a totally blind guess on my part, but I assume things like that and superior supply-chain management are more important to these guys than just CPU benchmarks. The big deal with Intel was performance per watt.
|
# ? Aug 20, 2014 16:31 |
|
cisco privilege posted:What a lot of people like to forget is that AMD chipsets from the era were loving awful, with missing or misfiring features compared to Intel chipsets. Sure, a Pentium 4 wouldn't run nearly as well, but when AMD boards had VIA- or SiS-level quality concerns and weird compatibility issues, it usually wasn't worth trying them out compared to an Intel setup, which would presumably just work, albeit with measurably worse performance. That wasn't really an issue anymore once the nForce series was released, though.
|
# ? Aug 20, 2014 16:32 |
|
The Lord Bude posted:I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance compared to 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress. Phenom IIs have a special place in my heart, warranted or not.
|
# ? Aug 20, 2014 16:35 |
|
Don Lapre posted:The big deal with Intel was performance per watt. OEMs probably don't care about a regular consumer's power bill, and your average Joe consumer had no concept of what that was 10 years ago.
|
# ? Aug 20, 2014 16:35 |
|
WhyteRyce posted:OEMs probably don't care about a regular consumer's power bill, and your average Joe consumer had no concept of what that was 10 years ago. It's not about power bills, it's about laptop battery life.
|
# ? Aug 20, 2014 16:36 |
|
HalloKitty posted:What, and the cryptic model numbers we get these days from the AMD and Intel camps are actually any better? Uh, yeah? The model numbers at least tell you a decent bit of info about the generation and the like of the CPU, on top of the other stats being listed. The "performance rating" system was entirely marketing, with no coherent correspondence between generations of chips and only limited correspondence within a generation.

Don Lapre posted:It's not about power bills, it's about laptop battery life. Yeah, this was and is huge. Consumers shifted to buying mostly laptops in 2005 or 2006, and AMD really hasn't been able to match Intel there in general.
|
# ? Aug 20, 2014 16:37 |
|
Don Lapre posted:It's not about power bills, it's about laptop battery life. Oh I see, you're talking about power/performance with Intel chips for laptops, not saying that the big (negative) deal with Intel was bad performance per watt on desktop.
|
# ? Aug 20, 2014 16:52 |
|
WhyteRyce posted:Oh I see, you're talking about power/performance with Intel chips for laptops, not that the big (negative) deal with Intel being bad performance per watt on desktop. What? That's referring to Core, not Netburst. Intel rather handily reversed the whole bad performance per watt thing.
|
# ? Aug 20, 2014 16:57 |
|
WhyteRyce posted:Oh I see, you're talking about power/performance with Intel chips for laptops, not saying that the big (negative) deal with Intel was bad performance per watt on desktop. Apple moved to Intel when Intel moved to Core.
|
# ? Aug 20, 2014 17:01 |
|
Factory Factory posted:What? That's referring to Core, not Netburst. Intel rather handily reversed the whole bad performance-per-watt thing. Yes, I'm referring to Core Duo and not the Netburst garbage that was prevalent on the desktop at the same time it was released. quote:Apple moved to Intel when Intel moved to Core. Correct, I forgot that Apple didn't touch the Pentium D when they first switched over. WhyteRyce fucked around with this message at 17:10 on Aug 20, 2014 |
# ? Aug 20, 2014 17:07 |
|
SwissCM posted:That wasn't really an issue anymore once the nForce series was released, though. nForce didn't have some of the stability issues, but it did tend to have other issues with its Ethernet, firewall and real-time clock. For an AMD chipset it was awesome, but it still didn't rival Intel's best at the time.
|
# ? Aug 20, 2014 18:07 |
|
Nintendo Kid posted:Uh, yeah? The model numbers at least tell you a decent bit of info about the generation and the like of the CPU, on top of the other stats being listed. The "performance rating" system was entirely marketing, with no coherent correspondence between generations of chips and only limited correspondence within a generation. I'd argue, though, that things aren't really better now. Desktop-side, try telling someone that an i5 means quad core and no Hyper-Threading while i7 translates to quad core with Hyper-Threading, but that performance varies by app and that more cores doesn't automatically equal higher performance, etc., and watch their eyes glaze over. Then go into explaining how mobile i5s can be dual core but with Hyper-Threading, and there can be i7s that are the same but also ones that are quad core with Hyper-Threading, etc. There's really no effective way without keeping tables/ARK handy. It's just as bad now as it ever was.
|
# ? Aug 20, 2014 18:15 |
|
SourKraut posted:Not to defend the stupid marketing gimmick that much, but the whole point of the Athlon XP / 64-era rating systems was to try and indicate what the chip's performance was relative to a Pentium 4 at the given clock speed, i.e. an Athlon XP 2100+ was equivalent to a 2.1 GHz Pentium 4. Regardless of whether or not it was, it actually worked a little bit, as I could go into stores at the time and hear salesmen saying stuff like that and crap such as "Not only is it equal to that speed of Pentium, but since it's running slower it's also running cooler too!!!" (Though by the time of Prescott that'd be the truth anyway.) The problem was that they quickly started inflating their performance rating numbers into exaggerations not long after - similar to the shenanigans they pulled with the performance ratings on their not-quite-Pentium-level 486-based CPUs in the 90s. Let alone how the performance ratings used for laptop CPUs didn't seem to hold much relation to the ones for desktops.

SourKraut posted:I'd argue though that things aren't really better now. Desktop-side try telling someone that an i5 means quad core and no hyper threading while i7 translates to quad core with hyper threading but that performances varied by app and that # cores doesn't automatically equal highest performance, etc. and watch their eyes glaze over. Then go into explaining how mobile i5s can be dual core but with hyper threading and there can be i7s that are the same but also are quad core and with hyper threading, etc. Theres really no effective way without keeping tables/ARK handy. It's just as bad now as it ever was. The thing is that now they at least don't try to stick a plain performance number onto things; both AMD and Intel marketing admit that comparing speeds is super tricky in a world where the feature sets of the processors can't reasonably be brought down to a single number.
It was confusing to compare things then and now, but at least now you don't have an additional layer of obfuscation with the PR system.
|
# ? Aug 20, 2014 18:35 |
|
I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now: http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/
|
# ? Aug 20, 2014 23:45 |
|
Nostrum posted:I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now: This is a fantastic read, thanks for the link!
|
# ? Aug 21, 2014 00:44 |
|
As a hardcore AMD fanboy in the early '00s, I got an early Athlon 64 on the Socket 754 platform. I got the 3200+ and a board for like $150 as part of an AMD training program when I worked at Staples in college. I know the board was a VIA K8T800 chipset, and I think it was an Asus. That board was nothing but a pain in the rear end. After like a year, I finally figured out that having a PCI NIC in the bottom slot was literally cutting the AGP performance in half. I had a Radeon 9700 that was not playing HL2 well, so I sold it and got a GF 6800 GT that was also lovely. I finally took everything out and it was the NIC. I got fed up with it and got a DFI LANPARTY (UV PAINT!!) nForce 3 board, which was better but still sucked. That machine lived on for a few more years as an HTPC until I had enough cash to dump it for a newer setup. That was a good day, and it would be 6 years until I touched an AMD system again, and I am not thrilled with it now due to some driver issues.
|
# ? Aug 21, 2014 02:39 |
|
Nostrum posted:I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now: That is really depressing. I wish someone with a heap of money would buy AMD out and turn it around. I'm sure a company like Apple, IBM or Samsung could turn AMD around with additional funding. All three already make chips, so it isn't a big leap.
|
# ? Aug 21, 2014 02:59 |
|
Lord Windy posted:That is really depressing. I wish someone with a heap of money would buy AMD out and turn it around. I'm sure a company like Apple, IBM or Samsung could turn AMD around with additional funding. All three already make chips, so it isn't a big leap. No one is interested in buying a dead chipmaker with serious financial issues, though; they could just shoulder them out of the market completely for less effort.
|
# ? Aug 21, 2014 03:11 |
|
Lord Windy posted:That is really depressing. I wish someone with a heap of money would buy AMD out and turn it around. I'm sure a company like Apple, IBM or Samsung could turn AMD around with additional funding. All three already make chips, so it isn't a big leap. Apple "makes" a few ARM chips with a subsidiary they bought. IBM has deliberately stopped making x86 CPUs these days (they used to, you know). Samsung and the rest all have the money to buy out AMD's reciprocal licenses with Intel to produce x86 and x86-64 stuff, if they really wanted to, without buying anything else in AMD's hulk.
|
# ? Aug 21, 2014 03:53 |
|
Nintendo Kid posted:Apple "makes" a few ARM chips with a subsidiary they bought. IBM has deliberately stopped making x86 CPUs these days (they used to, you know). Samsung and the rest all have the money to buy out AMD's reciprocal licenses with Intel to produce x86 and x86-64 stuff, if they really wanted to, without buying anything else in AMD's hulk. Well, I was saying Apple just because they have piles of money. I think they make almost as much in operating profit as Intel does in revenue. Also, wouldn't AMD come with ATI, which isn't terrible?
|
# ? Aug 21, 2014 04:27 |
|
Lord Windy posted:Well, I was saying Apple just because they piles of money. I think they make almost as much in operating profit as Intel does in revenue. Also, wouldn't AMD come with ATI which isn't terrible? Apple was one of the big movers behind OpenCL, which seems like it's one of AMD's few significant selling points right now (Jaguar and GPUs). The question is what Apple would get out of it, they haven't really coordinated on designing chips since the PowerPC days from what I remember. Paul MaudDib fucked around with this message at 04:42 on Aug 21, 2014 |
# ? Aug 21, 2014 04:39 |
|
HP is bringing out a Chromebook competitor powered by the A4 Micro-6400T with a $200 price tag. The model in the picture is equipped with a 14" 1366x768 display, but when they release a model with a 9" or 10.1" screen (and hopefully the same resolution) with 64 GB storage, I'm going to snag it ASAP. I currently use an EeePC 1015BX, which would otherwise be acceptable, but it has 1 GB RAM (and the GPU reserves 275 MB of it) so it is painful to use.
|
# ? Aug 21, 2014 05:16 |
|
Paul MaudDib posted:Apple was one of the big movers behind OpenCL, which seems like it's one of AMD's few significant selling points right now (Jaguar and GPUs). The question is what Apple would get out of it, they haven't really coordinated on designing chips since the PowerPC days from what I remember. Yeah, OpenCL is something I am very interested in, and the AMD drivers are simply the best. They don't work with Nvidia cards, but they will work with any ATI card, AMD CPU or Intel CPU. I don't know what Apple would get out of it. It's just wishful thinking.
|
# ? Aug 21, 2014 09:45 |
|
mayodreams posted:As a hardcore AMD fanboy in the early '00s, I got an early Athlon 64 on the Socket 754 platform. I got the 3200+ and a board for like $150 as part of an AMD training program when I worked at Staples in college. I know the board was a VIA K8T800 chipset, and I think it was an Asus. That board was nothing but a pain in the rear end. After like a year, I finally figured out that having a PCI NIC in the bottom slot was literally cutting the AGP performance in half. I had a Radeon 9700 that was not playing HL2 well, so I sold it and got a GF 6800 GT that was also lovely. I finally took everything out and it was the NIC. AMD's reliance on hit-and-miss third-party chipsets played no small part in tarnishing their own reputation. The only AMD chipset universally regarded as excellent is the nForce 2... and that's it. Chipsets for the A64 went from garbage VIA, to the hit-and-miss BSODing-firewall nForce 4, to the GeForce 6100, which was excellent but only on a tiny selection of mobo models.
|
# ? Aug 21, 2014 11:04 |
|
Rosoboronexport posted:HP is bringing out a Chromebook competitor powered by A4 Micro-6400T with $200 price tag. The model in picture is equipped with 14" 1366x768 display, but when they release a model with 9" or 10.1" screen(and hopefully same resolution) with 64 gb storage, I'm going to snag it ASAP. I currently use a EeePC 1015BX which would otherwise be acceptable but it has 1 GB RAM(and the GPU reserves 275 MB of it) so it is painful to use. I kind of want one just for Steam streaming
|
# ? Aug 21, 2014 12:24 |
|
Lord Windy posted:Yeah, OpenCL is something I am very interested in and the AMD drivers are simply the best. It doesn't work with Nvidia cards, but the drivers will work with any ATI card, AMD CPU or Intel CPU. OpenCL works just fine on nVidia cards though?
|
# ? Aug 21, 2014 13:16 |
|
Nostrum posted:I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now: Ooof. That is bleak. They should sell out the CPU side and put their eggs into IBM.
|
# ? Aug 21, 2014 15:21 |
|
The Lord Bude posted:I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance compared to 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress. I had the 1090T also, and when they went to CMT that was the nail in the coffin. It still outperformed them all with a 4GHz+ overclock, although they start getting real hot and draw more power than 4 Intels. I eventually just said gently caress it, enough's enough, and got in bed with Intel. I miss my X6 for running VMs and stuff, but I don't miss it in gaming 1 bit. I feel sorry for the fools who have gone X2 > X3 > X4 > X6 > FX > FX > FX 8-core monstrosity and not really gotten anywhere. 'it'll get better guys' 'it has to'

Palladium posted:AMD's reliance on hit-and-miss third-party chipsets played no small part in tarnishing their own reputation. The only AMD chipset universally regarded as excellent is the nForce 2... and that's it. Chipsets for the A64 went from garbage VIA, to the hit-and-miss BSODing-firewall nForce 4, to the GeForce 6100, which was excellent but only on a tiny selection of mobo models. Those VIA chipsets and ASRock boards where you could run AGP or PCIe, DDR or DDR2, were loving great fun at the time and dirt cheap; had much good fun with those.
|
# ? Aug 21, 2014 16:01 |
|
1gnoirents posted:Ooof. That is bleak. They should sell out the CPU side and put their eggs into IBM. You're beyond bleak if the savior is IBM. I used to be concerned that their team-building software (put in project parameters, it scans a DB of employees/contractors and spits out project team options) would be the core of our dystopian future. But they were bitten by the financialization bug and don't look likely to pull out. Further reading: http://www.forbes.com/sites/stevedenning/2014/05/30/why-ibm-is-in-decline/ http://www.forbes.com/sites/stevedenning/2014/06/03/why-financialization-has-run-amok/
|
# ? Aug 21, 2014 16:30 |
|
My first PC was built on the Asus A7A266, with the ALi Magik 1 chipset by Acer that enabled both SDR and DDR SDRAM support for Thunderbird Athlons. I recall it was many, many years before motherboards were good enough that reviewers stopped routinely counting the number of crashes and bluescreens during testing. Re: Ars' pre-post-mortem of AMD, this one blew my mind: quote:According to both reports at the time and to Ruiz’s own book, Nvidia was considered a potential acquisition target first, since the company had plenty of graphics experience and some of the best chipsets for AMD's K7 and K8-based CPUs. But Jen-Hsun Huang, Nvidia's outspoken CEO, wanted to run the combined company—a non-starter for AMD's leadership. The discussion then turned to ATI, which AMD eventually bought for $5.4 billion in cash and more in stock in October 2006. A combined AMD-Nvidia would've been a loving POWERHOUSE. Who knows what would've happened to ATI, but AMD-Nvidia vs. Intel... Holy poo poo. Factory Factory fucked around with this message at 16:34 on Aug 21, 2014 |
# ? Aug 21, 2014 16:32 |
|
|
NVidia's CEO has a huge hateboner for Intel too; wouldn't that have been a fun combination.
|
# ? Aug 21, 2014 17:37 |