Lord Windy posted: Where would I go to learn more about the future of AMD? I'm forever hopeful that they will make something awesome.

The APUs aren't very good unless you need slightly higher-end integrated video for some reason. Their GPU is a bit better than the onboard Intel HD GPU, although Intel is improving. The CPU trails behind in performance. In a laptop they provide somewhat better video than the Intel HD, but still don't have the single-core performance of an Intel chip, so if you wanted to game with one it wouldn't be that fast (it shares system memory, which means DDR3 for video, and the CPU won't run the game as well). My father is using an APU and it's fine for him, but all he runs is Firefox, MS Word, Adobe Reader and the like. With an SSD that system will probably last him years, but only because his particular use case is very light. He's an AMD fanboy so he really wanted an AMD system, but there was no reason to get one of the FX chips. It would be a better PC with an i3 or something, but it's not a big deal either way since he doesn't stress it much at all and the whole thing was pretty cheap.

# Aug 16, 2014 12:24

Rexxed posted: The APUs aren't very good unless you need slightly higher-end integrated video for some reason. Their GPU is a bit better than the onboard Intel HD GPU, although Intel is improving. The CPU trails behind in performance. In a laptop they provide somewhat better video than the Intel HD, but still don't have the single-core performance of an Intel chip, so if you wanted to game with one it wouldn't be that fast (it shares system memory, which means DDR3 for video, and the CPU won't run the game as well).

This is not entirely true. AMD APUs let you play a lot of games that you would otherwise not be able to play with any other integrated GPU solution. The important thing to remember is the price point of said laptop. If you're looking at something in the really cheap $300-$400 range it can't be beat. I have an old A6-3xxx series or whatnot (cost $350 CDN) and it plays the games I'd want to play while mobile: things like Diablo 3, a couple of different MMOs, etc.

# Aug 16, 2014 15:00

AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip) and she can run games on it at an acceptable quality (TF2, other Source games). She was getting it mainly for college work and maybe some very light mobile gaming when she is bored of listening to the professor, and the only Intel-driven HP Envy I could find was $200 more for performance she wouldn't notice in office applications and an absolutely pitiful last-gen Intel HD 4400 that would have serious issues running any 3D games above 720p low quality.

E: Fixed, I was jetlagged and phone posting.

orange juche fucked around with this message at 20:51 on Aug 17, 2014

# Aug 17, 2014 12:46

orange juche posted: She was getting it mainly for college work and maybe some very light mobile gaming when she is bored of listening to the professor, and the only Intel-driven HP Envy I could find was $200 more for performance she wouldn't notice in office applications and an absolutely pitiful last-gen Intel HD 4400 that would have serious issues running any 3D games above 1280p low quality.

With that resolution, games be damned. You probably mean 720p.

# Aug 17, 2014 14:07

orange juche posted: AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip) and she can run games on it at an acceptable quality (TF2, other Source games).

How would battery life be when compared to a Haswell, though?

# Aug 17, 2014 16:12

chocolateTHUNDER posted: How would battery life be when compared to a Haswell, though?

My Trinity A8 had a battery life of about 4 hours playing something ancient like EverQuest or WoW, so about half (or less) of Haswell's. It was a desktop replacement for me - I was spending a lot of time at my girlfriend's house and it was nice to be able to log in and see what was going on. It sat on her kitchen table and never moved. When we moved in together, I gave it to my parents to replace their Pentium 4 desktop. For most people, battery life isn't going to be a big issue.

# Aug 17, 2014 19:36

chocolateTHUNDER posted: How would battery life be when compared to a Haswell, though?

If you're playing video games on any platform you really want it plugged into the wall. Haswell, Richland, doesn't matter, because your iGPU performance will be crippled severely simply by unplugging from the wall. In actuality, once you have more than 3-4 hours of endurance on a battery, it's basically just bragging rights unless you live Swiss Family Robinson-style and have zero access to power. I can't foresee being in a situation where I needed to use a laptop for more than 4 hours and could not find a charging port (in-flight situations notwithstanding). Anyway, if you are on a long-haul flight cross country or something, usually there is a 120V outlet under your seat.

# Aug 17, 2014 20:54

orange juche posted: AMD APUs in laptops definitely aren't all that terrible as long as you keep in mind what their strengths are. I just recommended my sister buy a 15-inch HP Envy with a Richland A10 in it. It is faster than her old laptop (a 2010 C2D ULV chip) and she can run games on it at an acceptable quality (TF2, other Source games). [emphasis mine]

That's more thanks to the engine, with how well it scales to hardware, than it is to the actual hardware.

# Aug 17, 2014 21:07

SwissArmyDruid posted: [emphasis mine] That's more thanks to the engine, with how well it scales to hardware, than it is to the actual hardware.

No, not really. The difference between AMD APUs and ANY other integrated solution is the difference between completely playable and not playable at all. No one is saying that an AMD APU is going to net you 60 FPS @ 1080p in BF4 on ultra, but it sure as heck can get you playing a LOT of games (even some modern ones).

# Aug 18, 2014 01:12

Stanley Pain posted: No, not really. The difference between AMD APUs and ANY other integrated solution is the difference between completely playable and not playable at all.

I haven't seen an integrated graphics solution that couldn't handle Source games in a very long time. The integrated graphics on some lovely 2005 HP notebook I have around can handle TF2 at 1280x1024 on low settings.

# Aug 18, 2014 01:15

Nintendo Kid posted: I haven't seen an integrated graphics solution that couldn't handle Source games in a very long time. The integrated graphics on some lovely 2005 HP notebook I have around can handle TF2 at 1280x1024 on low settings.

That's true, I guess. Seems like the HD 4600 can do pretty well in some modern games as well.

# Aug 18, 2014 01:34

It's hilarious that when it comes to integrated graphics, an Intel solution is "good enough for anything from 2005", but for compute, if it can't run Crysis 6 at 4K it's crap.

# Aug 18, 2014 01:44

Stanley Pain posted: That's true, I guess. Seems like the HD 4600 can do pretty well in some modern games as well.

I checked Valve's system requirements just now. Turns out then-ATI had integrated graphics solutions, released starting in 2003, that met the current Source engine (TF2 version) minimum, and both Intel and ATI had "recommended"-level integrated graphics starting in 2005.

# Aug 18, 2014 02:05

Nintendo Kid posted: I checked Valve's system requirements just now. Turns out then-ATI had integrated graphics solutions, released starting in 2003, that met the current Source engine (TF2 version) minimum, and both Intel and ATI had "recommended"-level integrated graphics starting in 2005.

Hahah, oh my. Yeah, so I guess TF2 can run on my coffee maker now.

# Aug 18, 2014 02:11

Stanley Pain posted: That's true, I guess. Seems like the HD 4600 can do pretty well in some modern games as well.

As someone using an i5-4570 with HD 4600 graphics and no discrete GPU until I can justify one monetarily, it works pretty well on the handful of games I've tested. The downside is that Intel's drivers miss some stuff, including the comically easy task of running those Baldur's Gate rereleases on hardware from 2013. Literally unplayable, despite the games having complete ports that run on iOS and Android tablets, and the PC release being functional on an Atom 330/Ion system. It's like some weird purgatory where stuff after 2003 is generally good, but earlier things and random cases are a minefield. Just glad I'm not invested in gaming as a hobby anymore.

And yeah, Source is crazy versatile, mostly because it was mass-market practically a decade ago.

# Aug 18, 2014 02:44

Nintendo Kid posted: I haven't seen an integrated graphics solution that couldn't handle Source games in a very long time. The integrated graphics on some lovely 2005 HP notebook I have around can handle TF2 at 1280x1024 on low settings.

I managed to play Portal 2 (albeit with a config file change) on a Latitude D800 with Radeon 9600 graphics, though it doesn't support the shaders for the fluid, so I literally could not see it. Thankfully, I was playing co-op in the same room! Source can scale down to an extreme extent.

# Aug 18, 2014 08:11

sweart gliwere posted: And yeah, Source is crazy versatile, mostly because it was mass-market practically a decade ago.

I'd note that Titanfall uses Source, but I'm not convinced that Respawn hasn't ripped out and recoded more than 50% of the engine.

# Aug 18, 2014 09:21

We're getting kind of far afield here, but keep in mind that the Source engine undergoes pretty regular overhauls as Valve releases new products. You can read about some past engine enhancements on the Valve publications page. These new engine revisions are periodically back-ported to active games, so for example TF2 is currently running Valve's very latest version of the Source engine.

# Aug 18, 2014 16:11

Alereon posted: We're getting kind of far afield here, but keep in mind that the Source engine undergoes pretty regular overhauls as Valve releases new products. You can read about some past engine enhancements on the Valve publications page. These new engine revisions are periodically back-ported to active games, so for example TF2 is currently running Valve's very latest version of the Source engine.

Yeah, and over the years TF2 has gotten worse and worse with the particle effects. They used to be fairly rare items; nowadays everyone is a walking particle fountain. Cosmetic models have also gone up in complexity. Source does a pretty good job of scaling to the available processing resources. When you turn the graphics down, TF2 is heavily bottlenecked by CPU, and most of that happens on a single core. I used to play it on a Compaq CQ56-115DX with a single-core 2.3 GHz AMD V140 CPU and a Mobility Radeon HD 4250 graphics chipset. It couldn't do much more than minimum-spec graphics and it could chug a bit during intense combat, but it was tolerably playable.

Paul MaudDib fucked around with this message at 16:46 on Aug 18, 2014

# Aug 18, 2014 16:37

The Source engine (and TF2 in particular) is actually pretty good at scaling across multiple cores. I just tested, and during multiplayer combat I had pretty even load across four logical cores, somewhat less load on a fifth, and light load across the other three (I verified all were flat before launching TF2). Multi-core rendering used to be disabled by default due to hitching and freezing issues, though. I think there's room for surprisingly good performance on Broadwell-Y, though "surprisingly good" may not mean playable given that we're talking about single-digit watts. I'd mention some AMD products, but it almost seems like they've given up selling the products they do launch.
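For anyone who wants to reproduce the check, here's a minimal Python sketch of one way to sample per-logical-core load while a game runs. It assumes the third-party psutil package; the function name and parameters are just illustrative, and this isn't necessarily how the numbers above were gathered.

code:
    # One way to watch per-logical-core load while a game runs.
    # Assumes the third-party psutil package (pip install psutil).
    import psutil

    def sample_per_core_load(seconds=30, interval=1.0):
        """Print one utilization figure per logical core, once per interval."""
        for _ in range(int(seconds / interval)):
            # percpu=True returns a list with one percentage per logical
            # core; the call blocks for `interval` seconds while it measures.
            loads = psutil.cpu_percent(interval=interval, percpu=True)
            print(" ".join("{:5.1f}%".format(load) for load in loads))

    if __name__ == "__main__":
        sample_per_core_load()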

# Aug 18, 2014 17:23

Alereon posted: The Source engine (and TF2 in particular) is actually pretty good at scaling across multiple cores. I just tested, and during multiplayer combat I had pretty even load across four logical cores, somewhat less load on a fifth, and light load across the other three (I verified all were flat before launching TF2). Multi-core rendering used to be disabled by default due to hitching and freezing issues, though. I think there's room for surprisingly good performance on Broadwell-Y, though "surprisingly good" may not mean playable given that we're talking about single-digit watts. I'd mention some AMD products, but it almost seems like they've given up selling the products they do launch.

Just waiting on "K13", something that doesn't have loving wasted die space being used for graphics that they could be spending on stronger cores, and a 10-series chipset.

SwissArmyDruid fucked around with this message at 21:37 on Aug 18, 2014

# Aug 18, 2014 21:35

The graphics "wasted die space" isn't why the CPU cores suck.

# Aug 18, 2014 21:50

Their highest-end FX chips don't have any onboard graphics and are still terrible.

# Aug 18, 2014 22:08

Why do their FX chips suck so much? They draw more power and run at higher clock speeds in GHz terms, but they just aren't as good as comparable Intel chips?

# Aug 18, 2014 22:37

Lord Windy posted: Why do their FX chips suck so much? They draw more power and run at higher clock speeds in GHz terms, but they just aren't as good as comparable Intel chips?

They don't do as much work per clock tick. They are severely outclassed by Intel in single-core performance, to the point where 8 FX cores at very high clocks struggle to keep up with 4 Intel cores at lower clocks. Programs that cannot use all eight cores (and there are a ton of them, especially games) are dominated by Intel. And drawing more power isn't a good thing, just the opposite: for a given level of performance, it's better to achieve it using less electricity, not more. And Intel just kills AMD chips here, too. There are all sorts of reasons as to why, but it all boils down to Intel doing a good job at CPUs for Core and AMD doing a bad job for A-series and FX.
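To make the "work per clock tick" point concrete, here's a toy back-of-the-envelope model in Python. The IPC figures are made-up assumptions chosen only to illustrate the shape of the gap (they are not measured benchmark numbers), and the Amdahl's-law scaling is a deliberate simplification.

code:
    # Toy model: effective performance ~ IPC x clock, scaled by Amdahl's law.
    # The IPC figures below are illustrative assumptions, NOT vendor data.

    def effective_perf(ipc, clock_ghz, cores, parallel_fraction):
        """Single-thread rate (IPC * clock) times the Amdahl's-law speedup."""
        speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
        return ipc * clock_ghz * speedup

    # Hypothetical chips: 8 low-IPC cores at a high clock vs 4 high-IPC cores.
    fx = dict(ipc=1.0, clock_ghz=4.0, cores=8)
    i5 = dict(ipc=2.0, clock_ghz=3.4, cores=4)

    for label, frac in [("mostly serial game", 0.2), ("perfectly parallel", 1.0)]:
        print(label,
              "FX:", round(effective_perf(parallel_fraction=frac, **fx), 1),
              "i5:", round(effective_perf(parallel_fraction=frac, **i5), 1))
    # With these made-up numbers the FX only pulls ahead in the perfectly
    # parallel case, which matches the "programs that cannot use all eight
    # cores are dominated by Intel" point above.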

# Aug 18, 2014 23:02

Factory Factory posted: There are all sorts of reasons as to why, but it all boils down to Intel doing a good job at CPUs for Core and AMD doing a bad job for A-series and FX.

I think it boils down to Intel bribing (or blackmailing) PC manufacturers while the P4 was using 1.21 gigawatts of power, depriving AMD of R&D money to stay competitive.

# Aug 19, 2014 00:09

adorai posted: I think it boils down to Intel bribing (or blackmailing) PC manufacturers while the P4 was using 1.21 gigawatts of power, depriving AMD of R&D money to stay competitive.

As a result, AMD had to spin off their foundries, putting them at an even worse disadvantage. They obtained roughly 25% of the server market at the height of their popularity, and failed to make any meaningful advancements with the move from the Athlon 64s to the X2 series of processors and onward. K10 was only acceptable in performance, never excellent, and meanwhile Intel's very large bags of money sure didn't get in the way of tick, tock, tick, tock... But it's myopic to view AMD's failure as entirely Intel's fault. Intel definitely did some underhanded poo poo, and there's no effective way to punish a corporation that large for doing heinous things, so they more or less got away with it - no argument there - but at the peak of AMD's popularity, they failed badly to reach out and grasp the moment. They only had the moment. FIN

# Aug 19, 2014 00:19

AMD was definitely stabbed by Intel, but they also stumbled with their choice to go with a CMT architecture, which is weak on a per-core basis. Unfortunately, an architecture decision like that lasts for years; Intel's NetBurst was on the desktop from Willamette (late 2000) until Conroe (2006). AMD's CMT/Bulldozer line is expected to be replaced by a new, non-CMT architecture, but not until 2016. Until then, Intel gets to do what they want without any significant competition. And even if AMD comes up with a competitive architecture, Intel is expected to maintain significant foundry advantages.

# Aug 19, 2014 00:22

Agreed posted: but at the peak of AMD's popularity, they failed badly to reach out and grasp the moment.

Even at the peak of their popularity, though, their yearly revenue itself was still below Intel's R&D budget, so they were still rather limited in what they could do. And, you know, regardless of their "popularity", when Intel is threatening all the first-tier OEMs with cancellation of rebates that all of those companies depended upon if any of them used AMD's product, yeah, you can't really "grasp the moment". It was literally monopolistic behavior at its worst and no one should be defending it.

# Aug 19, 2014 07:05

Yeah, if I recall they were actually getting screwed hardest by Intel's monopoly poo poo right during the time they had their best chip to compete with (when they were actually winning). So an argument could be made that they may have been cheated out of a ton of momentum. Although I'm not sure they had the production capacity to supplant Intel contracts anyway.

# Aug 19, 2014 07:12

SourKraut posted: Even at the peak of their popularity, though, their yearly revenue itself was still below Intel's R&D budget, so they were still rather limited in what they could do. And, you know, regardless of their "popularity", when Intel is threatening all the first-tier OEMs with cancellation of rebates that all of those companies depended upon if any of them used AMD's product, yeah, you can't really "grasp the moment". It was literally monopolistic behavior at its worst and no one should be defending it.

The one thing I think everyone with a pair of brain cells to rub together can agree on is that Intel is a giant with all the money, and they behave very, very badly toward others - e.g. patent infringement for a modem on the chip? gently caress it, put the competition under by operating at a loss if necessary to move the product (simplified, but you get me, I hope). I do think, however, that it's important to remember that despite practices by Intel that are simply anticompetitive by any measure, AMD did have a moment where they could likely have made some serious inroads, and when that moment passed, they ran on inertia in the desktop space much more than innovation. The story isn't as simple as "Intel bad, AMD good, bad Intel pay make AMD die!", which it gets reduced to far too often in these discussions (though thankfully not here, generally speaking, and certainly not in this exchange, in case you felt that was in any way aimed at you). It'd be unfair to give AMD all the credit and Intel all the blame. Intel deserves plenty of blame, too, nobody is suggesting otherwise - it's just that AMD's big moment wasn't handled very well, and the results made Intel's job of pushing them out of the market easier.

# Aug 19, 2014 07:58

While Intel's shady deals certainly didn't help, the fall of AMD was more down to AMD itself, or Hector Ruiz to be more precise: insider trading, cutting R&D funds for Phenom and the fabs, buying ATI at a massively inflated price (netting him a cozy bonus), spinning off GlobalFoundries at the worst terms imaginable (netting him a cozy bonus and a chairman position to boot). The funniest thing is that while the Bulldozer family is not very good, to say the least, it would at least be competitive if not for GlobalFoundries: their 32nm SOI node is worse in every way except density compared to Intel's 32nm, never mind 22nm.

# Aug 19, 2014 12:12

Even at the peak of their popularity, consumers were dumb and didn't know any better. Only nerds gave a poo poo about benchmarks, and a sizable portion of buyers probably stuck to brands they recognized. It didn't help that earlier non-Intel chips were poo poo. I had a hell of a time convincing people I knew who trusted my opinion to buy AMD at the time; I can't even imagine what it was like for your average consumer asking the Circuit City guy for help.

# Aug 19, 2014 16:00

IMHO, a lot of AMD's problem came down to advertising - like, you'd see a TON of Intel commercials touting their tech and who they partnered with for servers/desktops/laptops/etc., but I never saw nearly as much from AMD. Sure, you can find a few commercials they pushed out for some of their Athlon/Athlon XP line of chips, but compared to Intel, their advertising was practically nonexistent. I don't know if that came down to some of the PC vendors being paid off not to advertise that they carried AMD-based systems, or if it was just AMD not spending the time and money to market itself, but that was a pretty big factor in how well known they were overall.

# Aug 19, 2014 18:55

Ozz81 posted: I don't know if that came down to some of the PC vendors being paid off not to advertise that they carried AMD-based systems, or if it was just AMD not spending the time and money to market itself, but that was a pretty big factor in how well known they were overall.

It was very much both of those.

# Aug 19, 2014 20:02

WhyteRyce posted: Even at the peak of their popularity, consumers were dumb and didn't know any better. Only nerds gave a poo poo about benchmarks, and a sizable portion of buyers probably stuck to brands they recognized. It didn't help that earlier non-Intel chips were poo poo. I had a hell of a time convincing people I knew who trusted my opinion to buy AMD at the time; I can't even imagine what it was like for your average consumer asking the Circuit City guy for help.

Back in those days I watched a Circuit City guy tell someone to get a Pentium 4 system if he wanted great performance, because AMD was a cheap budget brand.

# Aug 19, 2014 20:52

fart simpson posted: Back in those days I watched a Circuit City guy tell someone to get a Pentium 4 system if he wanted great performance, because AMD was a cheap budget brand.

AMD was/is a cheap budget brand. They always have been.

# Aug 19, 2014 20:58

Intel threw AMD into a hole. AMD then started digging.

# Aug 19, 2014 21:10

Don Lapre posted: AMD was/is a cheap budget brand. They always have been.

The perception among a lot of consumers, though, is that a cheap budget brand = not as good. No different than having two identical 4K LED televisions, one being a name brand like Samsung and the other something like Emerson. People gravitate towards a name and product history, and tend to avoid changing unless there's some really good, solid proof that the change is worth it.

# Aug 20, 2014 01:40

Don Lapre posted: AMD was/is a cheap budget brand. They always have been.

Yeah, but back then AMD had a competitive if not outright superior product, which is why a salesman framing it that way was unfair.

# Aug 20, 2014 02:03