Rexxed
May 1, 2010

Dis is amazing!
I gotta try dis!

Lord Windy posted:

Where would I go to learn more about the future of AMD? I'm forever hopeful that they will make something awesome.

The APUs interest me. Are any of them decent, or are they basically just the same as the rest and utter crap?

EDIT:

How do the graphics on the CPU stand up against the graphics on the Intel CPUs?

The APUs aren't very good unless you specifically need somewhat stronger integrated video. Their GPU is a bit better than Intel's onboard HD graphics, although Intel is improving, and the CPU trails behind in performance. In a laptop they provide a little better video than the Intel HD, but they still don't have the single-core performance of an Intel chip, so gaming on one wouldn't be that fast (the GPU shares system memory, which means plain DDR3 for video, and the CPU won't run the game as well either).
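To put rough numbers on that shared-memory point, here's a back-of-the-envelope sketch. The parts picked (dual-channel DDR3-1600 for the APU, a 128-bit GDDR5 card for comparison) are illustrative assumptions rather than any specific laptop, and these are theoretical peaks, not measured throughput:

```python
# Back-of-the-envelope memory bandwidth comparison. Theoretical peaks only;
# the parts below are illustrative, not a specific laptop's spec sheet.

def peak_bandwidth_gbs(transfer_rate_mts, bus_width_bits, channels=1):
    """Peak bandwidth in GB/s: transfer rate (MT/s) * bytes per transfer * channels."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) * channels / 1e9

# APU: dual-channel DDR3-1600 system RAM, shared between the CPU and the GPU.
apu_shared = peak_bandwidth_gbs(1600, 64, channels=2)   # ~25.6 GB/s

# A cheap discrete card: 128-bit GDDR5 at ~4500 MT/s, dedicated to the GPU.
discrete = peak_bandwidth_gbs(4500, 128)                 # ~72 GB/s

print(f"APU (shared DDR3):  ~{apu_shared:.1f} GB/s, split with the CPU")
print(f"Discrete (GDDR5):   ~{discrete:.1f} GB/s, all for the GPU")
```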

My father is using an APU and it's fine for him but all he runs is Firefox, MS Word, Adobe reader and the like. With an SSD that system will probably last him years, but only due to his particular use case of very light use. He's an AMD fanboy so he really wanted an AMD system but there was no reason to get one of the FX chips. It would be a better PC with an i3 or something but it's not a big deal either way since he doesn't stress it much at all and the whole thing was pretty cheap.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Rexxed posted:

The APUs aren't very good unless you specifically need somewhat stronger integrated video. Their GPU is a bit better than Intel's onboard HD graphics, although Intel is improving, and the CPU trails behind in performance. In a laptop they provide a little better video than the Intel HD, but they still don't have the single-core performance of an Intel chip, so gaming on one wouldn't be that fast (the GPU shares system memory, which means plain DDR3 for video, and the CPU won't run the game as well either).



This is not entirely true. AMD APUs allow you to play a lot of games that you would otherwise not be able to play with any other integrated GPU solution. The important thing to remember is the price point of said laptop. If you're looking at something in the really cheap $300-$400 range, it can't be beat. I have an old A6-3xxx series or whatnot (cost $350 CDN) and it plays the games I'd want to play while mobile. Things like Diablo 3, a couple of different MMOs, etc.

orange juche
Mar 14, 2012



AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip), and she can run games on it at an acceptable quality (TF2, other Source games).

She was getting it mainly for college work and maybe some very light mobile gaming when she gets bored of listening to the professor, and the only Intel-driven HP Envy I could find was $200 more for performance she wouldn't notice in office applications and an absolutely pitiful last-gen Intel HD 4400 that would have serious issues running any 3D games above 720p low quality.

E: Fixed, I was jetlagged and phone posting.

orange juche fucked around with this message at 20:51 on Aug 17, 2014

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

orange juche posted:

She was getting it mainly for college work and maybe some very light mobile gaming when she is bored of listening to the professor, and the only intel driven HP Envy I could find was $200 more for performance she wouldn't notice in office applications and an absolutely pitiful last gen intel HD 4400 that would have serious issues running any 3d games above 1280p low quality.

With that resolution, games be damned. You probably mean 720p.

chocolateTHUNDER
Jul 19, 2008

GIVE ME ALL YOUR FREE AGENTS

ALL OF THEM

orange juche posted:

AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip), and she can run games on it at an acceptable quality (TF2, other Source games).

She was getting it mainly for college work and maybe some very light mobile gaming when she is bored of listening to the professor, and the only intel driven HP Envy I could find was $200 more for performance she wouldn't notice in office applications and an absolutely pitiful last gen intel HD 4400 that would have serious issues running any 3d games above 1280p low quality.

How would battery life be when compared to a haswell, though?

SYSV Fanfic
Sep 9, 2003

by Pragmatica

chocolateTHUNDER posted:

How would battery life be when compared to a haswell, though?

My Trinity A8 had a battery life of about 4 hours playing something ancient like EverQuest or WoW, so about half (or less) of a Haswell.

It was a desktop replacement for me - I was spending a lot of time at my girlfriend's house and it was nice to be able to log in and see what was going on. It sat on her kitchen table and never moved. When we moved in together, I gave it to my parents to replace their Pentium 4 desktop. For most people, battery life isn't going to be a big issue.

orange juche
Mar 14, 2012



chocolateTHUNDER posted:

How would battery life be when compared to a haswell, though?

If you're playing video games on any platform you really want it plugged into the wall. Haswell, Richland, doesn't matter, because your iGPU performance will be crippled severely by simply unplugging from the wall.

In actuality, once you have more than 3-4 hours of endurance on a battery, it's basically just bragging rights unless you live Swiss Family Robinson style and have zero access to power. I can't foresee being in a situation where I'd need to use a laptop for more than 4 hours and couldn't find somewhere to charge it (in-flight situations notwithstanding). Anyway, if you're on a long-haul cross-country flight or something, there's usually a 120V outlet under your seat.

SwissArmyDruid
Feb 14, 2014

orange juche posted:

AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip), and she can run games on it at an acceptable quality (TF2, other Source games).

[emphasis mine] That's more thanks to the engine, with how well it scales to hardware, than it is to the actual hardware.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

SwissArmyDruid posted:

[emphasis mine] That's more thanks to the engine, with how well it scales to hardware, than it is to the actual hardware.

No, not really. The difference between AMD APUs and ANY other integrated solution is the difference between completely playable and not playable at all :)

No one is saying that an AMD APU is going to net you 60FPS @ 1080p on BF4 on ultra, but it sure as heck can get you playing a LOT of games (even some modern ones).

Nintendo Kid
Aug 4, 2011

by Smythe

Stanley Pain posted:

No, not really. The difference between AMD APUs and ANY other integrated solution is the difference between completely playable and not playable at all :)

I haven't seen an integrated graphics solution that couldn't handle Source games in a very long time. Integrated graphics on some lovely 2005 HP notebook I have around can handle 1280x1024 TF2 on low settings.

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Nintendo Kid posted:

I haven't seen an integrated graphics solution that couldn't handle Source games in a very long time. Integrated graphics on some lovely 2005 HP notebook I have around can handle 1280x1024 TF2 on low settings.

That's true I guess. Seems like the HD4600 can do pretty well in some modern games as well.

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer
It's hilarious that when it comes to integrated graphics, an Intel solution is "good enough for anything from 2005," but for compute, if it can't run Crysis 6 at 4K it's crap.

Nintendo Kid
Aug 4, 2011

by Smythe

Stanley Pain posted:

That's true I guess. Seems like the HD4600 can do pretty well in some modern games as well.

I checked Valve's stated recommendations just now. Turns out then-ATI had integrated graphics solutions released starting in 2003 that met the current Source engine (TF2 version) minimum, and Intel and ATI both had "recommended"-level integrated graphics starting in 2005. :v:

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Nintendo Kid posted:

I checked Valve's stated recommendations just now. Turns out then-ATI had integrated graphics solutions released starting in 2003 that met the current Source engine (TF2 version) minimum, and Intel and ATI both had "recommended"-level integrated graphics starting in 2005. :v:

Hahah oh my. Yeah so I guess TF2 can run on my coffee maker now :shobon:

sweart gliwere
Jul 5, 2005

better to die an evil wizard,
than to live as a grand one.
Pillbug

Stanley Pain posted:

That's true I guess. Seems like the HD4600 can do pretty well in some modern games as well.

As someone using an i5-4570 with its HD 4600 graphics and no discrete GPU until I can justify one monetarily, it works pretty well on the handful of games I've tested. The downside is that Intel's driver support misses some stuff, including the comically easy task of running those Baldur's Gate re-releases on hardware from 2013. Literally unplayable, despite them being complete ports that run on iOS and Android tablets, while the original PC release was functional on an Atom 330/Ion system.

It's like some weird purgatory where stuff after 2003 is generally good, but earlier things and random cases are a minefield. Just glad I'm not invested in gaming as a hobby anymore. And yeah, Source is crazy versatile, mostly because it was mass-market practically a decade ago.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Nintendo Kid posted:

I haven't seen an integrated graphics solution that couldn't handle Source games in a very long time. Integrated graphics on some lovely 2005 HP notebook I have around can handle 1280x1024 TF2 on low settings.

I managed to play Portal 2 (with a config file change) on a Latitude D800 with Radeon 9600 graphics, though it doesn't support the shaders for the fluid, so I literally could not see it. Thankfully, I was playing co-op in the same room!

Source can scale down to an extreme extent.

SwissArmyDruid
Feb 14, 2014

sweart gliwere posted:

And yeah, Source is crazy versatile, mostly because it was mass-market practically a decade ago.

I'd note that Titanfall uses Source, but I'm not convinced that Respawn hasn't ripped out and recoded > 50% of the engine.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
We're getting kind of far afield here, but keep in mind that the Source engine undergoes pretty regular overhauls as Valve releases new products. You can read about some past engine enhancements on the Valve publications page. These new engine revisions are periodically back-ported to active games, so for example TF2 is currently running Valve's very latest version of the Source engine.

Paul MaudDib
May 2, 2006

TEAM NVIDIA:
FORUM POLICE

Alereon posted:

We're getting kind of far afield here, but keep in mind that the Source engine undergoes pretty regular overhauls as Valve releases new products. You can read about some past engine enhancements on the Valve publications page. These new engine revisions are periodically back-ported to active games, so for example TF2 is currently running Valve's very latest version of the Source engine.

Yeah, and over the years TF2 has gotten worse and worse with the particle effects. They used to be fairly rare items; nowadays everyone is a walking particle fountain. Cosmetic models have also gone up in complexity.

Source does a pretty good job of scaling to the available processing resources. When you turn the graphics down, TF2 is heavily bottlenecked by the CPU, and most of that happens on a single core. I used to play it on a Compaq CQ56-115DX with a single-core 2.3 GHz AMD V140 CPU and a Mobility Radeon HD 4250 graphics chipset. It couldn't do much more than minimum-spec graphics and it could chug a bit during intense combat, but it was tolerably playable.

Paul MaudDib fucked around with this message at 16:46 on Aug 18, 2014

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
The Source engine (and TF2 in particular) is actually pretty good at scaling across multiple cores, I just tested and during multiplayer combat I had pretty even load across four logical cores, somewhat less load on a fifth, and light load across the other three (I verified all were flat before launching TF2). Multi-core rendering used to be disabled by default due to hitching and freezing issues, though. I think there's room for surprisingly good performance on Broadwell-Y, though "surprisingly good" may not mean playable given that we're talking about single-digit watts. I'd mention some AMD products, but it almost seems like they've given up selling the products they do launch.
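For anyone who wants to repeat that kind of check, a sketch along these lines will log per-core load once a second while a game is running. It assumes the third-party psutil package is installed; it is not the tool used for the test above, which could just as easily have been Task Manager:

```python
# Per-logical-core CPU load logger. Requires the psutil package
# (pip install psutil); run it alongside the game and watch the columns.
import psutil

SAMPLES = 60  # one sample per second for a minute
for t in range(SAMPLES):
    # cpu_percent(percpu=True) returns one utilisation % per logical core,
    # averaged over the one-second interval.
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(f"{t:3d}s  " + "  ".join(f"{load:5.1f}%" for load in per_core))
```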

SwissArmyDruid
Feb 14, 2014

Alereon posted:

The Source engine (and TF2 in particular) is actually pretty good at scaling across multiple cores, I just tested and during multiplayer combat I had pretty even load across four logical cores, somewhat less load on a fifth, and light load across the other three (I verified all were flat before launching TF2). Multi-core rendering used to be disabled by default due to hitching and freezing issues, though. I think there's room for surprisingly good performance on Broadwell-Y, though "surprisingly good" may not mean playable given that we're talking about single-digit watts. I'd mention some AMD products, but it almost seems like they've given up selling the products they do launch.

Just waiting on "K13", something that doesn't have loving wasted die space being used for graphics that could instead go toward stronger cores, plus a 10-series chipset.

SwissArmyDruid fucked around with this message at 21:37 on Aug 18, 2014

Nintendo Kid
Aug 4, 2011

by Smythe
The graphics "wasted die space" isn't why the CPU cores suck.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.
Their highest-end FX chips don't have any onboard graphics and are still terrible.

Lord Windy
Mar 26, 2010
Why do their FX chips suck so much? They draw more power and run at higher GHz, but they just aren't as good as comparable Intel chips.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Lord Windy posted:

Why do their FX chips suck so much? They draw more power and run at higher GHz, but they just aren't as good as comparable Intel chips.

They don't do as much work per clock tick. They are severely outclassed by Intel in single-core performance, to the point where 8 FX cores at very high clocks struggle to keep up with 4 Intel cores at lower clocks. Programs that cannot use all eight cores (and there are a ton of them, especially games) are dominated by Intel.
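To make the shape of that gap concrete, here's a toy calculation. The instructions-per-clock figures are invented purely for illustration and are not measured values for any real chip:

```python
# Toy throughput model: work per second ~ cores * clock (GHz) * IPC.
# The IPC values here are made up to illustrate the argument, not benchmarks.

def throughput(cores, clock_ghz, ipc):
    return cores * clock_ghz * ipc  # arbitrary "work units" per second

fx_all_cores    = throughput(cores=8, clock_ghz=4.4, ipc=1.0)  # hypothetical FX
intel_all_cores = throughput(cores=4, clock_ghz=3.5, ipc=2.2)  # hypothetical Core i5/i7

fx_one_core    = throughput(1, 4.4, 1.0)
intel_one_core = throughput(1, 3.5, 2.2)

print(f"All cores busy:  FX ~{fx_all_cores:.1f}  vs  Intel ~{intel_all_cores:.1f}")
print(f"Single thread:   FX ~{fx_one_core:.1f}   vs  Intel ~{intel_one_core:.1f}")
```

Even with all eight cores fully loaded the totals come out roughly even, and the moment a game leans on one or two threads the chip that does more work per clock wins easily.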

And drawing more power isn't a good thing, just the opposite. For a given level of performance, it's better to achieve it using less electricity, not more. And Intel just kills AMD chips here, too.

There are all sorts of reasons as to why, but it all boils down to Intel doing a good job at CPUs for Core and AMD doing a bad job for A-series and FX.

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer

Factory Factory posted:

There are all sorts of reasons as to why, but it all boils down to Intel doing a good job at CPUs for Core and AMD doing a bad job for A-series and FX.

I think it boils down to Intel bribing (or blackmailing) PC manufacturers while the p4 was using 1.21 gigawatts of power, depriving AMD of R&D money to stay competitive. As a result, AMD had to spin off their foundries, putting them at an even worse disadvantage.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

adorai posted:

I think it boils down to Intel bribing (or blackmailing) PC manufacturers while the p4 was using 1.21 gigawatts of power, depriving AMD of R&D money to stay competitive. As a result, AMD had to spin off their foundries, putting them at an even worse disadvantage.

They obtained roughly 25% of the server market at the height of their popularity, and failed to make any meaningful advancements with the move from the Athlon 64s to the X2 series of processors and onward. K10 was only acceptable in performance, never excellent, and meanwhile Intel's very large bags of money sure didn't get in the way of tick, tock, tick, tock...

But it's myopic to view AMD's failure as entirely Intel's fault. Intel definitely did some underhanded poo poo, and there's no effective way to punish a corporation that large for doing heinous things so they more or less got away with it - no argument there - but at the peak of AMD's popularity, they failed badly to reach out and grasp the moment. They only had the moment. FIN

Rastor
Jun 2, 2001

AMD was definitely stabbed by Intel, but they also stumbled with their choice to go with a CMT architecture which is weak on a per-core basis. Unfortunately an architecture decision such as that lasts for years; Intel's NetBurst was [on the desktop] from Willamette (late 2000) until Conroe (2006).

AMD CMT / Bulldozer is expected to be replaced by a new, non-CMT architecture -- but not until 2016. Until then, Intel gets to do what they want without any significant competition. And even if AMD comes up with a competitive architecture, Intel is expected to maintain significant foundry advantages.

Canned Sunshine
Nov 20, 2005

CAUTION: POST QUALITY UNDER CONSTRUCTION



Agreed posted:

but at the peak of AMD's popularity, they failed badly to reach out and grasp the moment.

Even at the peak of their popularity though, their yearly revenue itself was still below that of Intel's R&D budget, so they were still rather limited in what they could do. And, you know, regardless of their "popularity", when Intel is threatening all the first-tier OEMs with cancelation of rebates that all of those companies depended upon if any of them used AMD's product, yeah, you can't really "grasp the moment". It was literally monopolistic behavior at its worst and no one should be defending it.

OldPueblo
May 2, 2007

Likes to argue. Wins arguments with ignorant people. Not usually against educated people, just ignorant posters. Bing it.
Yeah, if I recall, they were actually getting screwed hardest by Intel monopoly poo poo right during the time they had their best chip to compete (and were actually winning). So an argument could be made that they may have been cheated out of a ton of momentum. Although I'm not sure they had the production capacity to supplant Intel contracts anyway.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

SourKraut posted:

Even at the peak of their popularity though, their yearly revenue itself was still below that of Intel's R&D budget, so they were still rather limited in what they could do. And, you know, regardless of their "popularity", when Intel is threatening all the first-tier OEMs with cancelation of rebates that all of those companies depended upon if any of them used AMD's product, yeah, you can't really "grasp the moment". It was literally monopolistic behavior at its worst and no one should be defending it.

The one thing I think everyone with a pair of brain cells to rub together can agree on is that Intel is a giant with all the money and they behave very, very badly toward others - e.g. patent infringement for a modem on the chip? gently caress it, put the competition under by operating at a loss if necessary to move the product (simplified, but you get me I hope). I do think, however, that it's important to remember that despite practices by Intel that are simply anticompetitive by any measure, AMD did have a moment where they could have likely made some serious inroads, and when that moment passed, they ran on inertia in the desktop space much more than innovation. The story isn't as simple as "Intel bad, AMD good, bad Intel pay make AMD die!" which it gets reduced to far too often in these discussions (though thankfully not here, generally speaking, and certainly not in this exchange in case you felt that was in any way aimed at you).

It'd be unfair to give AMD all the credit and Intel all the blame. Intel deserves plenty of blame, too, nobody is suggesting otherwise - it's just that AMD's big moment wasn't handled by them very well, and the results made Intel's job of pushing them out of the market easier.

Arzachel
May 12, 2012
While Intel's shady deals certainly didn't help, the fall of AMD was more down to AMD itself, or Hector Ruiz to be more precise: insider trading, cutting R&D funds for Phenom and the fabs, buying ATI at a massively inflated price (netting him a cozy bonus), spinning off GlobalFoundries on the worst terms imaginable (netting him a cozy bonus and a chairman position to boot).

The funniest thing is that while the Bulldozer family is not very good, to say the least, it would at least be competitive if not for GF. Their 32nm SOI node is worse in every way except density compared to Intel's 32nm, never mind 22nm.

WhyteRyce
Dec 30, 2001

Even at the peak of their popularity, consumers were dumb and didn't know any better. Only nerds gave a poo poo about benchmarks, and a sizable portion of buyers probably stuck to the brands they recognized. It didn't help that earlier non-Intel chips were poo poo. I had a hell of a time convincing people I knew who trusted my opinion to buy AMD at the time; I can't even imagine what it was like for your average consumer asking the Circuit City guy for help.

BOOTY-ADE
Aug 30, 2006
Probation
Can't post for 6 hours!
IMHO, a lot of AMD's problem came down to advertising - you'd see a TON of Intel commercials touting their tech and who they partnered with for servers/desktops/laptops/etc., but I never saw nearly as much from AMD. Sure, you can find a few commercials they pushed out for some of their Athlon/Athlon XP line of chips, but compared to Intel, advertising was practically nonexistent. I don't know if that came down to some of the PC vendors being paid off to not advertise that they carried AMD-based systems, or if it was just AMD not spending the time and money to market itself, but that was a pretty big factor in how well known they were overall.

cstine
Apr 15, 2004

What's in the box?!?

Ozz81 posted:

don't know if that came down to some of the PC vendors being paid off to not advertise that they carried AMD-based systems, or if it was just AMD not spending the time and money to market itself, but that was a pretty big factor in how well known they were overall.

It was very much both of those.

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

WhyteRyce posted:

Even at the peak of their popularity, consumers were dumb and didn't know any better. Only nerds gave a poo poo about benchmarks, and a sizable portion of buyers probably stuck to the brands they recognized. It didn't help that earlier non-Intel chips were poo poo. I had a hell of a time convincing people I knew who trusted my opinion to buy AMD at the time; I can't even imagine what it was like for your average consumer asking the Circuit City guy for help.

Back in those days I watched a Circuit City guy tell someone to get a Pentium 4 system if he wanted great performance because AMD was a cheap budget brand.

Don Lapre
Mar 28, 2001

If you're having problems you're either holding the phone wrong or you have tiny girl hands.

fart simpson posted:

Back in those days I watched a Circuit City guy tell someone to get a Pentium 4 system if he wanted great performance because AMD was a cheap budget brand.

AMD was/is a cheap budget brand. They always have been.

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Intel threw AMD into a hole.

AMD then started digging.

BOOTY-ADE
Aug 30, 2006
Probation
Can't post for 6 hours!

Don Lapre posted:

AMD was/is a cheap budget brand. They always have been.

The perception with a lot of consumers, though, is that cheap budget brand = not as good. It's no different than having two identical 4K LED televisions, one a name brand like Samsung and the other something like Emerson. People gravitate toward a name and product history and tend to avoid changing unless there's some really good, solid proof that the change is worth it.

fart simpson
Jul 2, 2005

DEATH TO AMERICA
:xickos:

Don Lapre posted:

AMD was/is a cheap budget brand. They always have been.

Yeah, but back then AMD had a competitive if not outright superior product, which is why a salesman framing it that way was unfair.
