Nintendo Kid
Aug 4, 2011

by Smythe

Ozz81 posted:

The perception though is that cheap budget brand = not as good with a lot of consumers.

Well, that's because they've also been not as good for most of their history. And they advertised themselves as being all about cheapness for quite a while.


Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️
Once Intel started selling the C2D E6300 for $183 in mid-2006, there was no reason to ever buy an AMD CPU for a gaming system. It was so good that it made even the cheapest $120 A64 X2 look like a ripoff in comparison. Oh, so you wanna save $63 to buy a 2GHz X2 when the E6300 can easily be OCed to hit 3GHz, which makes it the equivalent of a 4GHz X2? Be my guest.

What, 8 years since then isn't enough to make the AMD rose-tinted glasses come off already?

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance than 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM

Palladium posted:

Once Intel started selling the C2D E6300 for $183 in mid-2006, there was no reason to ever buy an AMD CPU for a gaming system. It was so good that it made even the cheapest $120 A64 X2 look like a ripoff in comparison. Oh, so you wanna save $63 to buy a 2GHz X2 when the E6300 can easily be OCed to hit 3GHz, which makes it the equivalent of a 4GHz X2? Be my guest.

What, 8 years since then isn't enough to make the AMD rose-tinted glasses come off already?

Ugh, I remember this like it was yesterday. Up until those C2Ds hit, AMD was the current hotness.

I was in 10th grade, and finally convinced my parents to give me money to build a new family PC (aka a new gaming PC for myself :v: ). Not having a job and being in 10th grade and all, it took some prodding from me to convince them to hand 800 bucks to their son. They eventually did though, and I built what I thought was a pretty sweet system. Thing is, I didn't know the C2D was on the horizon, and at the time reading all of the conflicting information out there about different builds and such was pretty overwhelming. I wound up getting an AMD dual-core 4000+ processor (the exact name escapes me atm) and building it in, I think, June 2006.

You can imagine my disappointment when I found out that not only was a better processor being released in a few weeks' time, but that its performance absolutely blew away anything else available, including the system I had just built with my parents' money :(

Nintendo Kid
Aug 4, 2011

by Smythe
I'll say one thing, I'm glad AMD finally dropped the stupid "performance rating" poo poo when naming their CPUs.

Beautiful Ninja
Mar 26, 2009

Five time FCW Champion...of my heart.

The Lord Bude posted:

I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance than 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress.

It looks like it was pretty close to Intel's offerings at its price point at the time of its launch. Intel didn't really run away with the performance crown until Sandy Bridge; the jump from Lynnfield to that was gigantic, on top of Sandy Bridge's amazing overclocking capabilities.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Nintendo Kid posted:

I'll say one thing, I'm glad AMD finally dropped the stupid "performance rating" poo poo when naming their CPUs.

What, and the cryptic model numbers we get these days from the AMD and Intel camps are actually any better?

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

fart simpson posted:

Yeah but back then AMD had a competitive if not outright superior product. You know why a salesman framing it that way was unfair.

What a lot of people like to forget is that AMD chipsets from the era were loving awful, with missing or misfiring features compared to Intel chipsets. Sure, a Pentium 4 wouldn't run nearly as well, but when AMD boards had VIA or SiS-level quality concerns and weird compatibility issues, it usually wasn't worth trying them out compared to an Intel setup which would presumably just work, albeit with measurably worse performance.

At any rate, outside of the gamer & budget subsets most people didn't really give a ship whose chip was in the PC, and since the OEMs were being threatened/bribed into pushing Intel, that's what they got.

WhyteRyce
Dec 30, 2001

cisco privilege posted:

What a lot of people like to forget is that AMD chipsets from the era were loving awful, with missing or misfiring features compared to Intel chipsets. Sure, a Pentium 4 wouldn't run nearly as well, but when AMD boards had VIA or SiS-level quality concerns and weird compatibility issues, it usually wasn't worth trying them out compared to an Intel setup which would presumably just work, albeit with measurably worse performance.


Apple switched to Intel during AMD's heyday as well. This is a totally blind guess on my part, but I assume things like that and superior supply chain management are more important to these guys than just CPU benchmarks.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

WhyteRyce posted:

Apple switched to Intel during AMD's heyday as well. This is a totally blind guess on my part, but I assume things like that and superior supply chain management are more important to these guys than just CPU benchmarks.

The big deal with intel was performance per watt.

SCheeseman
Apr 23, 2003

cisco privilege posted:

What a lot of people like to forget is that AMD chipsets from the era were loving awful, with missing or misfiring features compared to Intel chipsets. Sure, a Pentium 4 wouldn't run nearly as well, but when AMD boards had VIA or SiS-level quality concerns and weird compatibility issues, it usually wasn't worth trying them out compared to an Intel setup which would presumably just work, albeit with measurably worse performance.

At any rate, outside of the gamer & budget subsets most people didn't really give a ship whose chip was in the PC, and since the OEMs were being threatened/bribed into pushing Intel, that's what they got.

That wasn't really an issue anymore when the nForce series was released, though.

1gnoirents
Jun 28, 2014

hello :)

The Lord Bude posted:

I actually bought a Phenom IIx6 1090t when it came out (I wasn't educated by the goon university at the time) but I don't think that was too unreasonable - at the time it had only slightly lower single threaded performance compared to 1st gen socket 1156 stuff from what I could tell at the time (maybe I was wrong). Then they went backwards with bulldozer, and it's taken them 4 years to catch back up to that cpu I bought back in 2010, and meanwhile intel has made substantial progress.

Phenom IIs have a special place in my heart, warranted or not.

WhyteRyce
Dec 30, 2001

Don Lapre posted:

The big deal with intel was performance per watt.

OEMs probably don't care about a regular consumer's power bill, and your average Joe consumer had no concept of what that was 10 years ago.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

WhyteRyce posted:

OEMs probably don't care about a regular consumer's power bill, and your average Joe consumer had no concept of what that was 10 years ago.

It's not about power bills, it's about laptop battery life.

Nintendo Kid
Aug 4, 2011

by Smythe

HalloKitty posted:

What, and the cryptic model numbers we get these days from the AMD and Intel camps are actually any better?

Uh, yeah? The model numbers at least tell you a decent bit about what generation the CPU is and the like, on top of the other stats being listed. The "performance rating" system was entirely marketing, with no coherent correspondence between generations of chips and only limited correspondence within a generation.

Don Lapre posted:

It's not about power bills, it's about laptop battery life.

Yeah, this was and is huge. Consumers shifted to mostly buying laptops in 2005 or 2006, and AMD really hasn't been able to match Intel there in general.

WhyteRyce
Dec 30, 2001

Don Lapre posted:

It's not about power bills, it's about laptop battery life.

Oh I see, you're talking about performance per watt with Intel chips for laptops, not about the big (negative) deal of Intel having bad performance per watt on the desktop.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

WhyteRyce posted:

Oh I see, you're talking about performance per watt with Intel chips for laptops, not about the big (negative) deal of Intel having bad performance per watt on the desktop.

What? That's referring to Core, not Netburst. Intel rather handily reversed the whole bad performance per watt thing.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

WhyteRyce posted:

Oh I see, you're talking about performance per watt with Intel chips for laptops, not about the big (negative) deal of Intel having bad performance per watt on the desktop.

Apple moved to Intel when Intel moved to Core.

WhyteRyce
Dec 30, 2001

Factory Factory posted:

What? That's referring to Core, not Netburst. Intel rather handily reversed the whole bad performance per watt thing.

Yes, I'm referring to Core Duo and not the Netburst garbage that was prevalent on the desktop at the same time it was released.

quote:

Apple moved to Intel when Intel moved to Core.

Correct, I forgot that Apple didn't touch Pentium D when they first switched over

WhyteRyce fucked around with this message at 18:10 on Aug 20, 2014

Ryokurin
Jul 14, 2001

Wanna Die?

SwissCM posted:

That wasn't really an issue anymore when the nforce series was released though.

The nForce didn't have some of the stability issues, but it did tend to have other issues with its Ethernet, firewall, and real-time clock. For an AMD chipset it was awesome, but it still didn't rival Intel's best at the time.

Canned Sunshine
Nov 20, 2005

Nintendo Kid posted:

Uh, yeah? The model numbers at least tell you a decent bit about what generation the CPU is and the like, on top of the other stats being listed. The "performance rating" system was entirely marketing, with no coherent correspondence between generations of chips and only limited correspondence within a generation.

Not to defend the stupid marketing gimmick that much, but the whole point of the Athlon XP / Athlon 64-era rating system was to try to indicate what the chip's performance was relative to a Pentium 4 at the given clock speed, i.e. an Athlon XP 2100+ was supposedly equivalent to a 2.1 GHz Pentium 4. Regardless of whether or not it actually was, it worked a little bit, as I could go into stores at the time and hear salesmen saying stuff like that, plus crap such as "Not only is it equal to that speed of Pentium, but since it's running slower it's also running cooler too!!!" (Though by the time of Prescott that'd be the truth anyway.)

I'd argue, though, that things aren't really better now. Desktop-side, try telling someone that an i5 means quad core with no hyper-threading while i7 translates to quad core with hyper-threading, but that performance varies by app and that the number of cores doesn't automatically equal the highest performance, etc., and watch their eyes glaze over. Then go into explaining how mobile i5s can be dual core but with hyper-threading, and that there are i7s that are the same but also ones that are quad core with hyper-threading, etc. There's really no effective way to do it without keeping tables/ARK handy. It's just as bad now as it ever was.

Nintendo Kid
Aug 4, 2011

by Smythe

SourKraut posted:

Not to defend the stupid marketing gimmick that much, but the whole point of the Athlon XP / Athlon 64-era rating system was to try to indicate what the chip's performance was relative to a Pentium 4 at the given clock speed, i.e. an Athlon XP 2100+ was supposedly equivalent to a 2.1 GHz Pentium 4. Regardless of whether or not it actually was, it worked a little bit, as I could go into stores at the time and hear salesmen saying stuff like that, plus crap such as "Not only is it equal to that speed of Pentium, but since it's running slower it's also running cooler too!!!" (Though by the time of Prescott that'd be the truth anyway.)


The problem was that they started inflating their Performance Rating numbers into exaggerations not long after - similar to the shenanigans they pulled with performance ratings on their not-quite-Pentium-level 486-based CPUs in the '90s. To say nothing of how the performance ratings used for laptop CPUs didn't seem to bear much relation to the ones for desktops.


SourKraut posted:

I'd argue, though, that things aren't really better now. Desktop-side, try telling someone that an i5 means quad core with no hyper-threading while i7 translates to quad core with hyper-threading, but that performance varies by app and that the number of cores doesn't automatically equal the highest performance, etc., and watch their eyes glaze over. Then go into explaining how mobile i5s can be dual core but with hyper-threading, and that there are i7s that are the same but also ones that are quad core with hyper-threading, etc. There's really no effective way to do it without keeping tables/ARK handy. It's just as bad now as it ever was.

The thing is that now they at least don't try to stick a plain performance number onto things; both AMD's and Intel's marketing admit that comparing speeds is super tricky in a world where the feature sets of the processors can't reasonably be brought down to a single number.

It was confusing to compare things then and it's confusing now, but at least now you don't have the additional layer of obfuscation from the PR system.

forbidden dialectics
Jul 26, 2005





I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now:

http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Nostrum posted:

I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now:

http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/

This is a fantastic read, thanks for the link!

mayodreams
Jul 4, 2003


Hello darkness,
my old friend
As a hard-core AMD fanboy in the early '00s, I got an early Athlon 64 on the Socket 754 platform. I got the 3200+ and a board for like $150 as part of an AMD training program when I worked at Staples in college. I know the board was a VIA K8T800 chipset, and I think it was an Asus. That board was nothing but a pain in the rear end. After like a year, I finally figured out that having a PCI NIC in the bottom slot was literally cutting the AGP performance in half. I had a Radeon 9700 that was not playing HL2 well, so I sold it and got a GF 6800 GT that also was lovely. I finally took everything out and it was the NIC.

I got fed up with it and got a DFI LAN PARTY (UV PAINT!!) nForce 3 board, which was better but still sucked. That machine lived on for a few more years as an HTPC until I had enough cash to dump it for a newer setup. That was a good day, and it would be 6 years until I touched an AMD system again, and I am not thrilled with it now due to some driver issues.

Lord Windy
Mar 26, 2010

Nostrum posted:

I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now:

http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/

That is really depressing. I wish someone with a heap of money would buy AMD out and turn it around. I'm sure a company like Apple, IBM or Samsung could turn AMD around with additional funding. All three already make chips, so it isn't a big leap.

orange juche
Mar 14, 2012



Lord Windy posted:

That is really depressing. I wish someone with a heap of money would buy AMD out and turn it around. I'm sure a company like Apple, IBM or Samsung could turn AMD around with additional funding. All three already make chips, so it isn't a big leap.

No one is interested in buying a dead chipmaker with serious financial issues, though; they could just shoulder them out of the market completely for less effort.

Nintendo Kid
Aug 4, 2011

by Smythe

Lord Windy posted:

That is really depressing. I wish someone with a heap of money would buy AMD out and turn it around. I'm sure a company like Apple, IBM or Samsung could turn AMD around with additional funding. All three already make chips, so it isn't a big leap.

Apple "makes" a few ARM chips with a subsidiary they bought. IBM has deliberately stopped making x86 CPUs these days (they used to you know). Samsung and the rest all have the money to buy out AMD's reciprocal licenses with Intel to produce x86 and x86-64 stuff, if they really wanted to, without buying anything else in AMD's hulk.

Lord Windy
Mar 26, 2010

Nintendo Kid posted:

Apple "makes" a few ARM chips with a subsidiary they bought. IBM has deliberately stopped making x86 CPUs these days (they used to you know). Samsung and the rest all have the money to buy out AMD's reciprocal licenses with Intel to produce x86 and x86-64 stuff, if they really wanted to, without buying anything else in AMD's hulk.

Well, I was saying Apple just because they have piles of money. I think they make almost as much in operating profit as Intel does in revenue. Also, wouldn't AMD come with ATI, which isn't terrible?

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Lord Windy posted:

Well, I was saying Apple just because they have piles of money. I think they make almost as much in operating profit as Intel does in revenue. Also, wouldn't AMD come with ATI, which isn't terrible?

Apple was one of the big movers behind OpenCL, which seems like it's one of AMD's few significant selling points right now (Jaguar and GPUs). The question is what Apple would get out of it; they haven't really coordinated on designing chips since the PowerPC days, from what I remember.

Paul MaudDib fucked around with this message at 05:42 on Aug 21, 2014

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme
HP is bringing out a Chromebook competitor powered by an A4 Micro-6400T with a $200 price tag. The model in the picture is equipped with a 14" 1366x768 display, but when they release a model with a 9" or 10.1" screen (and hopefully the same resolution) and 64 GB of storage, I'm going to snag it ASAP. I currently use an EeePC 1015BX, which would otherwise be acceptable, but it has 1 GB of RAM (and the GPU reserves 275 MB of it), so it is painful to use.

Lord Windy
Mar 26, 2010

Paul MaudDib posted:

Apple was one of the big movers behind OpenCL, which seems like it's one of AMD's few significant selling points right now (Jaguar and GPUs). The question is what Apple would get out of it; they haven't really coordinated on designing chips since the PowerPC days, from what I remember.

Yeah, OpenCL is something I am very interested in, and the AMD drivers are simply the best. The AMD driver doesn't work with Nvidia cards, but it will work with any ATI card, AMD CPU or Intel CPU.

I don't know what Apple would get out of it. It's just wishful thinking.

Palladium
May 8, 2012

Very Good
✔️✔️✔️✔️

mayodreams posted:

As a hard-core AMD fanboy in the early '00s, I got an early Athlon 64 on the Socket 754 platform. I got the 3200+ and a board for like $150 as part of an AMD training program when I worked at Staples in college. I know the board was a VIA K8T800 chipset, and I think it was an Asus. That board was nothing but a pain in the rear end. After like a year, I finally figured out that having a PCI NIC in the bottom slot was literally cutting the AGP performance in half. I had a Radeon 9700 that was not playing HL2 well, so I sold it and got a GF 6800 GT that also was lovely. I finally took everything out and it was the NIC.

I got fed up with it and got a DFI LAN PARTY (UV PAINT!!) nForce 3 board, which was better but still sucked. That machine lived on for a few more years as an HTPC until I had enough cash to dump it for a newer setup. That was a good day, and it would be 6 years until I touched an AMD system again, and I am not thrilled with it now due to some driver issues.

AMD's reliance on hit-and-miss third-party chipsets played no small part in tarnishing their reputation. The only AMD chipset universally regarded as excellent is the nForce 2... and that's it. Chipsets for the A64 went from garbage VIA, to the hit-and-miss BSODing-firewall nForce 4, to the GeForce 6100, which was excellent but only on a tiny selection of mobo models.

Panty Saluter
Jan 17, 2004

Making learning fun!

Rosoboronexport posted:

HP is bringing out a Chromebook competitor powered by an A4 Micro-6400T with a $200 price tag. The model in the picture is equipped with a 14" 1366x768 display, but when they release a model with a 9" or 10.1" screen (and hopefully the same resolution) and 64 GB of storage, I'm going to snag it ASAP. I currently use an EeePC 1015BX, which would otherwise be acceptable, but it has 1 GB of RAM (and the GPU reserves 275 MB of it), so it is painful to use.

I kind of want one just for Steam streaming :v:

One Eye Open
Sep 19, 2006
Am I awake?

Lord Windy posted:

Yeah, OpenCL is something I am very interested in, and the AMD drivers are simply the best. The AMD driver doesn't work with Nvidia cards, but it will work with any ATI card, AMD CPU or Intel CPU.

I don't know what Apple would get out of it. It's just wishful thinking.

OpenCL works just fine on nVidia cards though? :confused:
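
An aside for anyone curious how both posts can be right: OpenCL runtimes ship per vendor, and an application simply enumerates whichever platforms are installed, so NVIDIA's runtime covers NVIDIA cards while AMD's covers Radeon GPUs plus x86 CPUs. Below is a minimal sketch of that enumeration; it assumes the third-party pyopencl package and at least one vendor runtime are installed, neither of which is mentioned in the thread.

code:
# Hypothetical illustration, not from the thread: requires the pyopencl
# package and at least one installed OpenCL runtime (NVIDIA, AMD, or Intel).
# Each vendor's runtime appears as its own platform, so "works on Nvidia"
# just means Nvidia's platform shows up in this list.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name, "by", platform.vendor)
    for device in platform.get_devices():
        print("  Device:", device.name,
              "[" + cl.device_type.to_string(device.type) + "]")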

1gnoirents
Jun 28, 2014

hello :)

Nostrum posted:

I thought this was a really interesting post-mortem on what went wrong at AMD and why they're so totally screwed now:

http://arstechnica.com/business/2013/04/the-rise-and-fall-of-amd-how-an-underdog-stuck-it-to-intel/

Ooof. That is bleak. They should sell out the CPU side and put their eggs into IBM.

Shitty Treat
Feb 21, 2012

Stoopid?

The Lord Bude posted:

I actually bought a Phenom II X6 1090T when it came out (I wasn't educated by the goon university at the time), but I don't think that was too unreasonable - at the time it had only slightly lower single-threaded performance than 1st-gen socket 1156 stuff, from what I could tell (maybe I was wrong). Then they went backwards with Bulldozer, and it's taken them 4 years to catch back up to that CPU I bought back in 2010, and meanwhile Intel has made substantial progress.

I had the 1090T also, and when they went to CMT that was the nail in the coffin - it still outperformed them all with a 4GHz+ overclock, although they start getting real hot and draw more power than 4 Intels :ohdear:.
I eventually just said gently caress it, enough's enough, and got in bed with Intel. I miss my X6 for running VMs and stuff, but I don't miss it in gaming one bit.
I feel sorry for the fools who have gone X2 > X3 > X4 > X6 > FX > FX > FX 8-core monstrosity and not really gotten anywhere. 'It'll get better, guys.' 'It has to.'

Palladium posted:

AMD's reliance on hit-and-miss third-party chipsets played no small part in tarnishing their reputation. The only AMD chipset universally regarded as excellent is the nForce 2... and that's it. Chipsets for the A64 went from garbage VIA, to the hit-and-miss BSODing-firewall nForce 4, to the GeForce 6100, which was excellent but only on a tiny selection of mobo models.

Those VIA chipsets and ASRock boards where you could run AGP or PCIe and DDR or DDR2 were loving great fun at the time, and dirt cheap; had much good fun with those.

JawnV6
Jul 4, 2004

So hot ...

1gnoirents posted:

Ooof. That is bleak. They should sell out the CPU side and put their eggs into IBM.

You're beyond bleak if the savior is IBM. I used to be concerned that their teambuilding software (put in project parameters, and it scans a DB of employees/contractors and spits out project team options) would be the core of our dystopian future. But they were bitten by the financialization bug and don't look likely to pull out.

Further reading: http://www.forbes.com/sites/stevedenning/2014/05/30/why-ibm-is-in-decline/
http://www.forbes.com/sites/stevedenning/2014/06/03/why-financialization-has-run-amok/

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
My first PC was built on the Asus A7A266, with the ALi Magik 1 chipset by Acer that enabled both SDR and DDR SDRAM support for Thunderbird Athlons.

I recall it was many, many years before motherboards were good enough that reviewers stopped routinely counting the number of crashes and bluescreens during testing.

Re: Ars' pre-post-mortem of AMD, this one blew my mind:

quote:

According to both reports at the time and to Ruiz’s own book, Nvidia was considered a potential acquisition target first, since the company had plenty of graphics experience and some of the best chipsets for AMD's K7 and K8-based CPUs. But Jen-Hsun Huang, Nvidia's outspoken CEO, wanted to run the combined company—a non-starter for AMD's leadership. The discussion then turned to ATI, which AMD eventually bought for $5.4 billion in cash and more in stock in October 2006.

A combined AMD-Nvidia would've been a loving POWERHOUSE. Who knows what would've happened to ATI, but AMD-Nvidia vs. Intel... Holy poo poo.

Factory Factory fucked around with this message at 17:34 on Aug 21, 2014


canyoneer
Sep 13, 2005


I only have canyoneyes for you
Nvidia's CEO has a huge hateboner for Intel too; wouldn't that have been a fun combination?

  • Locked thread