Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

~Coxy posted:

It's a bit worrying at the high end that the advantage to the AMD CPUs is merely "more cores". That's not a particularly useful drawcard for most enthusiasts, I would imagine.

Most of the stuff people do that actually leans hard on the CPU will take advantage of more cores. AMD's already got a pretty good thing going with core count in the low end: you can get a quad-core Athlon II for a little bit less than a Core i3 dual, and the Athlon ends up doing very well for itself. The Phenom II X6 put up a good showing against the last-gen Core i5 quads, and even the lower-end i7s, in heavily parallel tasks. Of course, a lot comes down to performance per core, but "more cores for the same money" can be a compelling performance advantage.
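To put some rough numbers on the cores-versus-clock trade-off, here's a quick Amdahl's-law sketch in Python. The per-core speeds and parallel fractions are made-up assumptions for illustration, not benchmarks of any real chip:

code:
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# the workload that can run in parallel and n is the number of cores.
# Scaling by per-core speed gives relative throughput. All inputs here are
# assumptions for illustration, not measurements of any real CPU.

def throughput(per_core_speed, cores, parallel_fraction):
    serial = 1.0 - parallel_fraction
    return per_core_speed / (serial + parallel_fraction / cores)

fast_dual = ("fast dual-core (assumed 1.3x per core)", 1.3, 2)
slow_quad = ("cheaper quad-core (1.0x per core)", 1.0, 4)

for p in (0.2, 0.5, 0.9):  # lightly, moderately, heavily threaded work
    a = throughput(fast_dual[1], fast_dual[2], p)
    b = throughput(slow_quad[1], slow_quad[2], p)
    winner = fast_dual[0] if a > b else slow_quad[0]
    print(f"parallel fraction {p:.0%}: dual={a:.2f} quad={b:.2f} -> {winner}")

# Lightly threaded work rewards per-core speed; heavily threaded work
# (encoding, rendering) rewards more cores, even slower ones.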


Space Gopher
Jul 31, 2006


Faceless Clock posted:

Does it really matter?

The AMD Athlon X4 system is going to be substantially slower. You'd only buy it if you can't afford the Intel. It's not even a "bang-for-your-buck" solution. It's just what you buy if you can't afford better.

That depends entirely on what you're doing. For most common desktop work, processor performance is close to irrelevant. An i3 or Athlon II will run Word, Powerpoint, and sane Excel just as well as an i7 or Xeon hexcore. And, as far as games are concerned, there aren't that many games which bottleneck on the CPU. If you drop the resolution to CRT-circa-1994 levels, or have some ridiculous $1500 3-GPU setup, you might shift the bottleneck back to the processor, but it's simply not a huge concern for the vast majority of users. Most people don't encode video or run CFD simulations on their desktop all day long.
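As a toy illustration of where the bottleneck sits, frame rate is roughly set by whichever of the CPU or GPU takes longer per frame. Every millisecond figure in this sketch is invented purely for illustration:

code:
# Toy frame-time model: whichever of the CPU or GPU takes longer per frame
# sets the frame rate. All timings below are invented, not measured.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms_per_frame = {"cheap quad (assumed)": 8.0, "fast i5-class quad (assumed)": 5.0}
gpu_ms_per_frame = {"1920x1080": 16.0, "800x600": 3.0}  # GPU cost scales with resolution

for res, gpu_ms in gpu_ms_per_frame.items():
    for cpu, cpu_ms in cpu_ms_per_frame.items():
        print(f"{res:>9} | {cpu:<28} -> {fps(cpu_ms, gpu_ms):5.1f} fps")

# At 1080p both CPUs land on the same GPU-limited frame rate; only at the
# silly-low resolution does the faster CPU actually pull ahead.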

Yes, the i5 quad is a substantially faster processor in benchmarks and CPU-heavy tasks. If budget were no object and you simply wanted the fastest general-purpose system, it would be a great choice. However, for a lot of users the Athlon II X4 will feel exactly the same in day-to-day use. Given that budget almost always is of some concern, why not go for the option that effectively performs just as well for less money?

Powercrazy posted:

Or if you are an irrational fanboy who doesn't understand that performance per core is more important than more cores.

How do you figure? Sure, there are exceptions, but most stuff that leans hard on the CPU is written to take advantage of multiple cores these days. At that point, of course, both performance per core and number of cores play an important role in overall performance. Typical office crap is lightly threaded - but again, that's the stuff where even a $60-100 bargain-basement CPU handles everything without breaking a sweat.

Space Gopher
Jul 31, 2006


Alereon posted:

That article is a year old, doesn't include the current generation of Intel processors, and STILL shows the Intel Core i5 750 as having the best performance/dollar when total system cost is factored in.

I mean, obviously if you can't afford $350 for your motherboard and CPU then AMD is going to give you compellingly better value than Intel will, and of course when I built a computer for my grandma I used an Athlon II X4 and the onboard video because she's barely going to use that CPU and the higher performance would have been wasted. If you do notice performance though, and you have the money, it's a little silly to say an i5 2500 wouldn't be worth it.

Those "performance per dollar" numbers can be misleading, though. If you want to express things as a simple ratio, you have to distill "performance" into a single number, and that's always going to have issues. For instance, a lot of people don't give the slightest of shits about Folding@Home, but it's factored into the results. There are very few situations in which it matters whether it takes 16 or 20 seconds to encode a 10 minute MP3, but again, that's factored into the "performance" number.

Ultimately, it comes down to one question: what does the extra $100-150 over the minimum acceptable option get the user? For a whole lot of people, the answer is, "not much." Hell, probably half of the first world would be just fine with a wimpy little single-core ARM in an iPad 1 for their home computing needs. Most of the rest could get by just fine with a cheap dual-core/integrated-graphics laptop - and many of them do. Even gamers, and people who occasionally do things that require a fair amount of CPU power (like, say, video encoding) will for the most part be perfectly happy with an inexpensive quad core. There are certainly exceptions, but for the vast majority of the world and even a significant chunk of SH/SC types, "good enough, fast enough, and dirt cheap" wins out over "MAXIMUM X-TREME PERFORMANCE."

Space Gopher
Jul 31, 2006


pienipple posted:

Ooh, I hope that means a Zacate price drop is in the works.

The rumor mills are saying that Zacate's supply is seriously constrained, and that prices should fall naturally once AMD ramps up production to fulfill demand. Given that you can buy a complete 15.6" laptop with an E-350 for under $350, I don't think that the price of the chip itself is what's driving prices on stuff like Zacate mITX and desktop boards so high.

Space Gopher
Jul 31, 2006


Not A Gay Name posted:

Well, for CPUs AMD is still at 45nm, not 40nm. As far as I know there are no 40nm CPUs, only GPUs, in which case Nvidia is still at the 40nm process as well.

In that regard they are certainly behind Intel's 32nm process, which is what I thought (at least the high-end) Bulldozer was going to use, rather than 28nm.

Though I don't see any reason they can't skip 32 and go to 28 if it's ready.

AMD's shipping four 40nm bulk CPUs right this very minute. They're selling like crazy.

It takes time to develop a design that works on a smaller process; it's not just a matter of loading it into the magical shrinking machine. Bulldozer and Llano started down that road quite a while ago, and they're already in mass production. The very first Wichita products, which are much simpler by comparison, have just now gone to initial 28nm test manufacturing. They'll find problems, like always, and it'll be a while before they're ready to ship.

Wichita is an ideal early step for the move to 28nm, because it's simple, and it's small. If you've got a bunch of defects on a wafer thanks to a new process that the foundry assures you is 100% ready for mass production, that's a killer if you've only got a small number of dies per wafer. If you've got some ridiculous number of tiny dies on there, you're not going to give a shit if a few dozen come out fucked up. You've still got plenty of products to sell.
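For a back-of-the-envelope feel for that, here's the standard Poisson yield model with assumed die sizes and an assumed defect density (these are illustrative numbers, not anyone's actual fab data):

code:
import math

def yield_fraction(die_area_mm2, defects_per_cm2):
    """Poisson yield model: probability a die picks up zero killer defects."""
    return math.exp(-die_area_mm2 * defects_per_cm2 / 100.0)

def good_dies_per_wafer(die_area_mm2, defects_per_cm2):
    # Ignores edge loss and scribe lines; this is a rough estimate only.
    wafer_area = math.pi * (300.0 / 2) ** 2  # 300 mm wafer
    return (wafer_area / die_area_mm2) * yield_fraction(die_area_mm2, defects_per_cm2)

DEFECTS = 1.0  # defects/cm^2, an assumed figure for an immature process
for name, area in [("small Bobcat-class die (assumed 75 mm^2)", 75.0),
                   ("big high-end die (assumed 300 mm^2)", 300.0)]:
    print(f"{name}: yield {yield_fraction(area, DEFECTS):.0%}, "
          f"~{good_dies_per_wafer(area, DEFECTS):.0f} good dies per wafer")

# The tiny die still yields hundreds of sellable chips per wafer; the big
# die on the same immature process yields barely a dozen.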

Basically, AMD's 32nm products are ready. They're taking the first steps in the move to 28nm.

Space Gopher
Jul 31, 2006


Sinestro posted:

Zacate is ultra-portable only. Llano (later BD?) will go into laptops.

Not necessarily. AMD's not putting restrictions on Zacate like Intel has with Atom. As a result, there are a number of sub-$350 15" laptops with E-350s, and I believe that there are some even cheaper models with the C-50. Acer's even got a Windows tablet with an even-lower-power version of the C-50, although reviews haven't been kind.

Sinestro posted:

That is not too hard. Look at what AMD did with the E350, they are known for GPU excellence.

It'll be interesting to see what happens as the power levels move up, though. AMD has a dynamite GPU design team, but throwing a powerful processor and a powerful GPU on the same die means that you're going to need a lot of cooling when things throttle up. From what I understand, while the E-350 has the GPU and CPU on the same die, it's not really a well-integrated setup; it's a bit like Intel's Clarkdale approach, with two discrete areas on the chip that just happen to have a very short on-die interconnect. Mobile limits have always been more about power and cooling than what's achievable at the top end of performance, and it remains to be seen how well AMD can turn CPU/GPU integration into power savings.
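As a toy illustration of the shared power budget problem (the package TDP and GPU draw figures are assumptions, not E-350 or Llano specs):

code:
# Toy shared-power-budget model: on a single-die APU, whatever the GPU burns
# comes out of the same package limit the CPU has to live under. The package
# TDP and GPU draw figures are assumptions, not specs for any real chip.

PACKAGE_TDP_W = 35.0  # assumed mobile APU package limit

def cpu_headroom(gpu_draw_w):
    return max(PACKAGE_TDP_W - gpu_draw_w, 0.0)

for scenario, gpu_w in [("desktop idle", 3.0), ("light gaming", 15.0), ("GPU flat out", 28.0)]:
    print(f"{scenario:<13} GPU {gpu_w:4.1f} W -> CPU headroom {cpu_headroom(gpu_w):4.1f} W")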


Space Gopher
Jul 31, 2006


PC LOAD LETTER posted:

Yea, there'll be Llano laptop chips. Model TDP is supposed to be 25-45W depending on the chip you get. Obviously the top-end one will have the highest TDP, so if you want to get that 6550-ish performance + quad Phenom II cores (aka Husky) you can kiss good battery life goodbye, but decent battery life may still be possible since that power rating is for the CPU+GPU+NB.

TDP isn't a great way to look at power consumption and battery life any more. It specifies a sustained maximum power draw, but it doesn't give you any information about how the chip performs with lighter loads. Intel's current Sandy Bridge mobile quads have high TDPs, but still get excellent battery life under typical light-usage scenarios like web browsing because they're aggressive about clocking down, sleeping, and even gating off parts of the CPU that aren't in active use. It remains to be seen if AMD can match Intel's progress on that front, but I wouldn't assume that a 45W TDP automatically means poor runtime.
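A quick sketch of why that is; the battery size, wattages, and duty cycles here are all assumptions for illustration:

code:
# Battery runtime depends on average draw, not the TDP ceiling. Every
# wattage, the battery size, and the duty cycles here are assumptions.

BATTERY_WH = 56.0   # assumed battery capacity
PLATFORM_W = 6.0    # screen, chipset, SSD, etc. (assumed)

def runtime_hours(cpu_active_w, cpu_idle_w, active_fraction):
    avg_cpu_w = cpu_active_w * active_fraction + cpu_idle_w * (1 - active_fraction)
    return BATTERY_WH / (PLATFORM_W + avg_cpu_w)

print(f"naive 'always at 45W TDP' estimate: {BATTERY_WH / (PLATFORM_W + 45.0):.1f} h")
print(f"light web browsing (5% active):     {runtime_hours(45.0, 1.5, 0.05):.1f} h")
print(f"sustained video encode (100%):      {runtime_hours(45.0, 1.5, 1.0):.1f} h")

# Same 45W TDP, wildly different runtimes depending on how aggressively the
# chip idles between bursts of work.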

Space Gopher
Jul 31, 2006


Factory Factory posted:

This is why good review sites will actually load up some apps and games and time a standardized task that mimics real-world use.

Well, that's what Sysmark does, too. It all comes down to what tasks you've chosen as representative of "real-world use."

Space Gopher
Jul 31, 2006


Alereon posted:

but with only 30GB/sec of memory bandwidth shared between the CPU and GPU we're probably not going to be seeing games with high-res, detailed textures. On the plus side, you'll notice low-res textures less because there will be neat shader tricks covering them up. Unfortunately this isn't the same result as just throwing a real GPU in there with dedicated GDDR5 memory, which WOULD give you enough bandwidth for high-res textures and real anti-aliasing.

Why are you assuming they'll go with standard desktop memory? Seems to me it'd make a lot more sense to take an APU with a beefed-up memory controller (it's going to be a custom chip, after all) and hook that straight to some high-bandwidth video RAM in a unified-memory architecture. That's the way pretty much all consoles have done things for quite a while, now. You get all the benefits of a discrete GPU, and some extras (the CPU can screw around with "VRAM" directly if you want it to), and with the APU you don't even have to deal with two big complex hot chips and two sets of RAM.
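The peak-bandwidth arithmetic is straightforward; the bus widths and data rates below are typical-for-the-era assumptions, not the specs of any actual console design:

code:
# Peak-bandwidth arithmetic: bandwidth = (bus width / 8) * transfer rate.
# All configurations here are assumed, illustrative setups.

def gb_per_s(bus_bits, transfers_per_sec):
    return bus_bits / 8 * transfers_per_sec / 1e9

configs = {
    "dual-channel DDR3-1866, shared CPU+GPU": (128, 1.866e9),
    "128-bit GDDR5 @ 4 Gbps, console-style UMA (assumed)": (128, 4.0e9),
    "256-bit GDDR5 @ 5 Gbps, discrete-card class (assumed)": (256, 5.0e9),
}

for name, (bits, rate) in configs.items():
    print(f"{name:<54} ~{gb_per_s(bits, rate):6.1f} GB/s")

Point the APU's memory controller at GDDR5 instead of desktop DDR3 and the "only 30GB/sec shared" objection goes away.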

The bigger thing, to me, would be moving consoles to x86. I guess we've gotten to the point where everything's so high-level that architecture doesn't even matter all that much, but it still seems kind of weird.

Space Gopher
Jul 31, 2006


adorai posted:

The original xbox was x86.

Sure, but when it was designed, Microsoft placed a lot more importance on getting something out the door in time to compete than on building the optimal solution. They went with x86 not because it was the best choice for the task at hand, but because they had a lot of people who were very good with x86, and they could do it all with off-the-shelf parts. Nobody pretended it was the best choice.

When it came time to develop the 360, which wasn't nearly as much of a rush job, they went with a custom Power chip just like the rest of the industry. Now, if they are in fact going with a Bulldozer-based APU, it seems like their best and brightest have sat down and said, "starting from a clean sheet, x86 is clearly the best option for our new console." If AMD's starting to do some really interesting stuff with CPU/GPU hybrids behind closed doors, then it might be realistic, but it still feels kind of weird.

Space Gopher
Jul 31, 2006


Factory Factory posted:

Oh, Lord, they're copying the K from Intel. :ughh:

Still, if the secondhand marketing mumbo-jumbo is correct, they both look like very attractive alternatives to an i3 at that price.

AMD was actually doing the Black Edition unlocked-multiplier thing well before Intel brought out the K models. I don't really see a point in this price range, though. Even if an unlocked multiplier is only a few bucks, it's money better spent saving up for a discrete video card.

e: :doh: You're talking about the model numbers themselves, aren't you?

Agreed posted:

Which they really need. If AMD can stop making the ancient crap they're making now, they don't have to stomp all over it for performance, just make money off it. Having a better, competitively priced, well-performing option for the low end that brings in more profit and doesn't tie up manufacturing with crap they don't need to make? Sounds like a feasible path to getting some black ink, and working into brand names better. Wal-Mart PCs, you know what I mean, but it'd be a big step up from i3 for graphically mildly intensive stuff without requiring a separate card, would it not?

Could be grasping at straws, I just want AMD to make some money so I can stop feeling like an Intel fanboy for putting forward what I feel is a pretty solid argument that there isn't a good reason to make an AMD-based computer at any budget :(

The mobile APUs are actually really good for low- and mid-budget netbooks and laptops, and the low-end desktop CPUs are still a decent budget choice. The bigger problem for AMD is the future: their new high-end architecture is a tremendous flop, they're going to get squeezed hard on the low end (where they really do have a compelling product) by ARM, and they still haven't managed much with the Fusion stuff beyond putting a decent GPU and a decent CPU on the same piece of silicon.


Space Gopher
Jul 31, 2006


HalloKitty posted:

Why not just U for Unlocked?

Hard consonants are more x-treme.

Space Gopher
Jul 31, 2006


Factory Factory posted:

PowerVR is in the game, still.

In the mobile sector, they practically are the game.

Space Gopher
Jul 31, 2006


HalloKitty posted:

But isn't that from 12 cores to 8 Bulldozer modules? Not even clocked very high - I can't imagine single-thread performance budged at all. The real upgrade is the huge number of GPUs. Sad AMD couldn't get in on that action, since GCN is powerful in compute.

Supercomputers give absolutely no fucks about single-threaded performance*. And, by all accounts, Bulldozer-based designs can perform well under certain circumstances. For you and me, that doesn't matter all that much; we're not going to be rewriting off-the-shelf software packages to target our hardware. For a giant supercomputer in a lab where even the janitors probably have comp sci doctorates, that equation shifts somewhat.

*well, ok, it's not quite that clear cut. Per-core and per-thread performance is always nice. But, raw parallel power is the main goal.
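To put rough numbers on "raw parallel power," here's a toy node-throughput comparison; the core counts, clocks, FLOPs-per-cycle, and GPU figure are illustrative assumptions, not the specs of the actual machines:

code:
# Toy cluster-node throughput comparison: what a supercomputer buys is
# cores x clock x FLOPs/cycle summed over the whole machine, not any one
# thread. Every figure below is an illustrative assumption.

def node_gflops(cores, clock_ghz, flops_per_cycle):
    return cores * clock_ghz * flops_per_cycle

old_cpu_node = node_gflops(cores=12, clock_ghz=2.2, flops_per_cycle=4)  # assumed
new_cpu_node = node_gflops(cores=16, clock_ghz=2.2, flops_per_cycle=8)  # assumed wider FP units
gpu_card_gflops = 1300.0  # assumed double-precision throughput per accelerator

print(f"old CPU-only node: {old_cpu_node:7.1f} GFLOPS")
print(f"new CPU-only node: {new_cpu_node:7.1f} GFLOPS")
print(f"new node + 2 GPUs: {new_cpu_node + 2 * gpu_card_gflops:7.1f} GFLOPS")

# Per-thread speed barely moved, but per-node throughput more than doubled,
# and the accelerators dwarf the CPUs either way.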


Space Gopher
Jul 31, 2006


Nintendo Kid posted:

Atom's whole bag was "low power x86 at any cost, and do it as quickly as possible" so it's no surprise it kinda sucked until the latest revs.

It also wasn't intended for general-purpose PCs at launch.

Intel's idea for Atom was that it would give them a credible competitor to ARM in the broad middle class of devices that need way more processing power than a microcontroller can give them, but not enough to make the cost and packaging of a full x86 PC setup worth the hassle. Think set-top boxes, routers, car infotainment, smart appliances, and things like that. This has been a weak point for Intel for a long time. They didn't care so much when they could just pump out high-margin PC and server processors and leave cheap embedded stuff to others, but with a lot of consumer demand moving from PCs to smart devices, ARM was looking like a much bigger threat.

Because Intel was getting eaten alive in that market and wanted to jump-start a competitive product, they sold bargain-price Atoms to OEMs under a term that was supposed to lock out PCs: a maximum display size restriction in the complete hardware. Ten inches would be more than enough for a barcode scanner, industrial control interface, or something like that, but nobody wanted a laptop that small, right?

Of course, OEMs saw dirt-cheap CPUs that they could slap together with legacy chipsets to run Windows or Linux, and it was off to the races in a machine that just barely met the licensing specs. The original Atom netbooks sucked because the CPUs were never intended to run a full-fat consumer OS on a PC. Now that Intel has actually focused on the "cheap low-end PC" market for Windows tablets, Atoms are actually pretty good at it.

This also explains why "netbook" came and went as a size category; it was entirely driven by Intel (and Microsoft) licensing restrictions. When AMD started building netbook-class processors without the display size limit, OEMs immediately slapped them into larger chassis, and they became super-cheap 15" laptops instead of netbooks. And, because Intel and Microsoft are fighting for the tablet market, you can get cheap Atom x86 tablets right now.
