Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


The hell? They can't be right, the integrated graphics core has a Radeon HD 6550? That's what I've got in my desktop right now...


Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

Tab8715 posted:

The hell? They can't be right, the integrated graphics core has a Radeon HD 6550? That's what I've got in my desktop right now...

Llano's IGP line is :krad:.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Tab8715 posted:

The hell? They can't be right, the integrated graphics core has a Radeon HD 6550? That's what I've got in my desktop right now...
To be fair, it isn't directly comparable to desktop-class cards because of the reduced memory bandwidth. Llano has 29.8GB/sec shared between the CPU cores and GPU, desktop cards (those intended for 3D) have at least 64GB/sec. The lower-end cards intended for HTPC and video applications do have as little as 28.8GB/sec though, so Llano will easily mean videocards are only necessary for gaming.
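As a rough sanity check on where that shared-bandwidth figure comes from (assuming it's dual-channel DDR3-1866 with 64-bit channels, which is what the quoted 29.8GB/sec implies), the arithmetic looks like this:

```python
# Peak theoretical DRAM bandwidth: transfer rate * bus width in bytes * channels.
# Assumes dual-channel DDR3-1866 with 64-bit channels; illustrative math only.

def dram_peak_gb_s(transfer_rate_mt_s, bus_width_bits=64, channels=1):
    """Peak bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

print(dram_peak_gb_s(1866, channels=2))  # ~29.9 GB/s, matching the ~29.8 figure above
```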

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Alereon posted:

To be fair, it isn't directly comparable to desktop-class cards because of the reduced memory bandwidth. Llano has 29.8GB/sec shared between the CPU cores and GPU, desktop cards (those intended for 3D) have at least 64GB/sec. The lower-end cards intended for HTPC and video applications do have as little as 28.8GB/sec though, so Llano will easily mean videocards are only necessary for gaming.

Actually, video cards aren't anywhere near as bandwidth-intensive as their interconnects suggest. Everything below a dual-GPU card loses only a few percent of its performance scaling all the way down to a PCIe x4 link.

Also, PCIe 2.0 x16 only runs at 8 GB/s (500 MB/s per lane times 16 lanes). You might be thinking gigabits.
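For reference, here's that lane math written out (taking PCIe 2.0's ~500 MB/s of usable per-lane, per-direction bandwidth after 8b/10b encoding as a given):

```python
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> roughly 500 MB/s usable
# per lane, per direction. Illustrative arithmetic only.
PCIE2_MB_PER_LANE = 500

for lanes in (4, 8, 16):
    print(f"x{lanes}: {PCIE2_MB_PER_LANE * lanes / 1000:.1f} GB/s")
# x4: 2.0 GB/s, x8: 4.0 GB/s, x16: 8.0 GB/s
```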

VVVV

Ahh, derp. :downs: 'kay.

Factory Factory fucked around with this message at 19:05 on May 21, 2011

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

Actually, video cards aren't anywhere near as bandwidth-intensive as their interconnects suggest. Everything below a dual-GPU card loses only a few percent of its performance scaling all the way down to a PCIe x4 link.

Also, PCIe 2.0 x16 only runs at 8 GB/s (500 MB/s per lane times 16 lanes). You might be thinking gigabits.
I'm talking about memory bandwidth: a Radeon HD 6570 has a 128-bit 4GHz GDDR5 memory bus, but the Radeon HD 6550 is going to have to make do with a 128-bit 1866MHz DDR3 bus, shared with the CPU.
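A quick sketch of the gap those bus specs imply (assuming the 4GHz figure is the GDDR5 effective transfer rate, and reusing the shared DDR3 figure worked out earlier; the ratio ignores CPU contention on the shared bus):

```python
# Peak bandwidth = effective transfer rate (MT/s) * bus width (bytes).
GDDR5_MT_S, BUS_BITS = 4000, 128       # Radeon HD 6570 bus spec quoted above
LLANO_SHARED_GB_S = 29.9               # dual-channel DDR3-1866, from the earlier calculation

gddr5_gb_s = GDDR5_MT_S * 1e6 * (BUS_BITS / 8) / 1e9
print(gddr5_gb_s)                      # 64.0 GB/s
print(gddr5_gb_s / LLANO_SHARED_GB_S)  # ~2.1x more, before the CPU even takes its share
```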

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
I still expect it to slaughter the Intel HD 3000, though.

PC LOAD LETTER
May 23, 2005
WTF?!

Alereon posted:

I'm talking about memory bandwidth: a Radeon HD 6570 has a 128-bit 4GHz GDDR5 memory bus, but the Radeon HD 6550 is going to have to make do with a 128-bit 1866MHz DDR3 bus, shared with the CPU.
This is true, but if you look at the Zacate APUs and how well they perform with just a single DDR3-1066 channel, this might not be so bad at all. I have no clue if it's because today's CPUs have so much L1/L2 cache that memory bandwidth isn't too important past a certain point, or if they're hiding a bunch of cache in the GPU itself, or something else, but AMD appears to be getting some pretty good performance out of relatively low bandwidth. They may actually be able to get close to a "real" 6550. That'd be a heck of a bargain chip if they pull that off, particularly for a laptop.

Gucci Loafers
May 20, 2006

Ask yourself, do you really want to talk to a pair of really nice gaudy shoes?


PC LOAD LETTER posted:

This is true, but if you look at the Zacate APUs and how well they perform with just a single DDR3-1066 channel, this might not be so bad at all. I have no clue if it's because today's CPUs have so much L1/L2 cache that memory bandwidth isn't too important past a certain point, or if they're hiding a bunch of cache in the GPU itself, or something else, but AMD appears to be getting some pretty good performance out of relatively low bandwidth. They may actually be able to get close to a "real" 6550. That'd be a heck of a bargain chip if they pull that off, particularly for a laptop.

The hell? This is going in a laptop?

Verizian
Dec 18, 2004
The spiky one.
Zacate is the mobile/desktop middle-ground chip, right? So it's going in mid- to high-end laptops, budget desktops, and media HTPCs, and there were rumours last year that it could fit into a 12" or larger tablet.
Of course, it would have had heat/battery issues in a tablet, and according to market research nobody wants one bigger than 10".

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

Verizian posted:

Zacate is the mobile/desktop middle-ground chip, right? So it's going in mid- to high-end laptops, budget desktops, and media HTPCs, and there were rumours last year that it could fit into a 12" or larger tablet.
Of course, it would have had heat/battery issues in a tablet, and according to market research nobody wants one bigger than 10".

Zacate is ultra-portable only. Llano (later BD?) will go into laptops.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
What would be neat is if Llano could make gaming on a laptop cheaper and more feasible.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

spasticColon posted:

What would be neat is if Llano could make gaming on a laptop cheaper and more feasible.

That is not too hard. Look at what AMD did with the E-350; they are known for GPU excellence.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

Sinestro posted:

Zacate is ultra-portable only. Llano (later BD?) will go into laptops.

Not necessarily. AMD's not putting restrictions on Zacate like Intel has with Atom. As a result, there are a number of sub-$350 15" laptops with E-350s, and I believe that there are some even cheaper models with the C-50. Acer's even got a Windows tablet with an even-lower-power version of the C-50, although reviews haven't been kind.

Sinestro posted:

That is not too hard. Look at what AMD did with the E-350; they are known for GPU excellence.

It'll be interesting to see what happens as the power levels move up, though. AMD has a dynamite GPU design team, but throwing a powerful processor and a powerful GPU on the same die means that you're going to need a lot of cooling when things throttle up. From what I understand, while the E-350 has the GPU and CPU on the same die, it's not really a well-integrated setup; it's a bit like Intel's Clarkdale approach with two discrete areas on the chip that just happen to have a very short on-die interconnect. Mobile limits have always been more about power and cooling than what's capable at the top end of performance, and it remains to be seen how well AMD can turn CPU/GPU integration into power savings.

Space Gopher fucked around with this message at 23:15 on May 22, 2011

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
The Brazos APU is pretty tightly integrated, it's definitely not comparable to Clarkdale or Atom, which don't have integrated memory controllers. Clarkdale actually used separate chips (32nm CPU cores, 45nm northbridge) on the same package, Atom uses a single chip but with two independent ICs linked by an on-die FSB. These approaches require little development work and provide some of the cost savings of integrated memory controllers, but you don't get the performance improvements offered by an IMC (and an actual IMC uses even less power and die space). The main limitation to Brazos performance is the very low CPU clocks and lack of Turbo support, both of which should be remedied in the 28nm die shrink. New chipsets probably also wouldn't hurt (especially for nettops), but overall platform power usage is already pretty low.

PC LOAD LETTER
May 23, 2005
WTF?!

Tab8715 posted:

The hell? This is going in a laptop?
Yeah, there'll be Llano laptop chips. Model TDPs are supposed to be 25-45W depending on the chip you get. Obviously the top-end one will have the highest TDP, so if you want to get that 6550-ish performance + quad Phenom II cores (aka Husky) you can kiss good battery life goodbye, but decent battery life may still be possible since that power rating is for the CPU+GPU+NB. Supposedly they're coming in June. There was a video posted a few pages back that AMD released a month or so ago of one running.

edit: Looks like we've got a good leak on BD clocks and prices from ASUS.

e2: Looks like we got some prices for some Llano-based laptops. They're in euros and the laptop has a discrete GPU in it too (commonplace CF in a low- to mid-range laptop, ahoy!), but it still gives you a good idea of what they'll be like in dollars.

PC LOAD LETTER fucked around with this message at 13:15 on May 24, 2011

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
Anand Lal Shimpi is rather unsubtly confirming the BD-at-Computex rumor on Twitter.

Sinestro fucked around with this message at 21:57 on May 24, 2011

pienipple
Mar 20, 2009

That's wrong!
I'm kinda :swoon: over the 8110.

Frugality may win out, but I still want it.

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

PC LOAD LETTER posted:

Yeah, there'll be Llano laptop chips. Model TDPs are supposed to be 25-45W depending on the chip you get. Obviously the top-end one will have the highest TDP, so if you want to get that 6550-ish performance + quad Phenom II cores (aka Husky) you can kiss good battery life goodbye, but decent battery life may still be possible since that power rating is for the CPU+GPU+NB.

TDP isn't a great way to look at power consumption and battery life any more. It specifies a sustained maximum power draw, but it doesn't give you any information about how the chip performs with lighter loads. Intel's current Sandy Bridge mobile quads have high TDPs, but still get excellent battery life under typical light-usage scenarios like web browsing because they're aggressive about clocking down, sleeping, and even gating off parts of the CPU that aren't in active use. It remains to be seen if AMD can match Intel's progress on that front, but I wouldn't assume that a 45W TDP automatically means poor runtime.
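To put rough numbers on why average draw matters more than the TDP ceiling (all figures below are made up for illustration, not measurements of any particular laptop):

```python
# Battery runtime is set by average platform power under the actual workload,
# not by the CPU's rated TDP. Every number here is a made-up illustration.
def runtime_hours(battery_wh, avg_platform_watts):
    return battery_wh / avg_platform_watts

BATTERY_WH = 56  # assumed 6-cell-ish pack, purely for the example

print(runtime_hours(BATTERY_WH, 10))  # ~5.6 h: light web browsing, CPU mostly clocked down/gated off
print(runtime_hours(BATTERY_WH, 50))  # ~1.1 h: sustained load near a 45W TDP plus the rest of the platform
```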

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

Space Gopher posted:

TDP isn't a great way to look at power consumption and battery life any more. It specifies a sustained maximum power draw, but it doesn't give you any information about how the chip performs with lighter loads. Intel's current Sandy Bridge mobile quads have high TDPs, but still get excellent battery life under typical light-usage scenarios like web browsing because they're aggressive about clocking down, sleeping, and even gating off parts of the CPU that aren't in active use. It remains to be seen if AMD can match Intel's progress on that front, but I wouldn't assume that a 45W TDP automatically means poor runtime.

Wow, I am now even more excited about Bulldozer.

PC LOAD LETTER
May 23, 2005
WTF?!

Space Gopher posted:

TDP isn't a great way to look at power consumption and battery life any more. It specifies a sustained maximum power draw, but it doesn't give you any information about how the chip performs with lighter loads. Intel's current Sandy Bridge mobile quads have high TDPs, but still get excellent battery life under typical light-usage scenarios like web browsing because they're aggressive about clocking down, sleeping, and even gating off parts of the CPU that aren't in active use. It remains to be seen if AMD can match Intel's progress on that front, but I wouldn't assume that a 45W TDP automatically means poor runtime.
This is all true, but I'm pessimistic about such things for practical purposes. We don't know yet exactly how well AMD's power gating and power management software/tech will work. And if you do intend to run a significant workload on it (i.e. gaming), then you have a pretty good idea of just what the battery life will be.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

PC LOAD LETTER posted:

edit: Looks like we've got a good leak on BD clocks and prices from ASUS.

Could this be AMD heralding their triumphant return? :stare::fh:

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
If this reaches IPC similar to even Nehalem, then drat

HalloKitty fucked around with this message at 09:19 on May 25, 2011

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Fudzilla has an article about AMD's Fusion strategy (direct link to AMD presentation slides), as well as another article with more details about the upcoming Fusion Z-series APUs for tablets. Intel has announced they're slashing the prices on their upcoming 32nm Atoms by about 50%, so competition is really starting to heat up in the low-power computing arena. I'm thinking we'll see Atoms in cheap, low-performance devices (like ChromeOS smartbooks), with Fusion processors making significant headway in Windows-based devices that require higher performance. The real question for AMD longer-term is whether they can write Linux/Android graphics drivers that will allow them to capture some of that market, or if they'll just leave it to the ARM SoCs. AMD has two years until Intel produces a competitive Atom (22nm Silvermont in 2013), so they had better take advantage of this time by executing well.

Let's all just pray the rumors of Carly Fiorina being selected as the new AMD CEO were false, otherwise AMD is just plain done.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
There is some talk that BD isn't going to show up at Computex because "client" means Llano, and the Q3 server release will include the workstation/enthusiast parts.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
AMD says it has sold 5 million Fusion APUs so far, and that it is sold out, with demand far exceeding supply. Engadget is patting themselves on the back for predicting the death of the netbook, but I think the reality is that consumers are catching on to the fact that Atoms simply aren't fast enough for netbooks. They were barely adequate when the netbook form factor first appeared, but because the Atom never evolved, the computing demands for basic web browsing far outpaced it. It's unfortunate but understandable that Fusion APUs are ending up in low-end laptops rather than the netbooks they'd be perfect for.

Devian666
Aug 19, 2008

Take some advice Chris.

Fun Shoe

Alereon posted:

It's unfortunate but understandable that Fusion APUs are ending up in low-end laptops rather than the netbooks they'd be perfect for.

I recommended a Fusion laptop to one of the guys in the office as the low-end cheap laptop for his family to use. Initially his kids moaned at him for buying a lovely laptop, but once they started using it all the moaning went away. Between it having enough CPU and GPU to actually do stuff and the 4.5-hour battery life, they're pretty happy with it.

AMD will do pretty well once they release their entire range of Fusion chips as indicated in the slides.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Bulldozer has been delayed until approximately late July due to performance issues. Apparently the B0 and B1 steppings had unacceptable performance, so AMD is spinning up a B2 stepping and hoping that it will make a big enough difference. So much for those optimistic performance projections :(

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Nordic Hardware is reporting that the Bulldozer lineup has been canceled, replaced with a new lineup with much more conservative clockspeed targets, to be launched in September. This news story came out before we got confirmation at Computex that AMD was going to have to make a new stepping, so it seems pretty likely right now. Since Llano still seems to be on track, I'm thinking this is more of an issue with the Bulldozer architecture than the manufacturing process (though we have no idea what the clockspeed targets are on Llano). Unfortunately, this probably spells the end of any chance AMD had at competing with Sandy Bridge in terms of per-thread performance, and doesn't bode well for their chances with well-threaded workloads when compared to Sandy Bridge-E. It seems like Bulldozer is going to end up like the Thuban hex-core Phenom IIs, not the fastest, but the cheapest way to get 6+ cores if you have an application that can use them all.

On the plus side, we know that Intel's next-generation Ivy Bridge CPUs have been pushed back from Q1 2012 to Q2, meaning AMD is going to have a generous period of graphics dominance with Llano and Brazos.

ZeroConnection
Aug 8, 2008
I planned to build a new computer next month; my desktop is still a single-core 1.8GHz Athlon with an X700 card. :negative:

So does this affect the release dates of new AM3+ motherboards?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
I think it's fair to say that the 900-series chipsets and boards will be delayed to launch alongside Bulldozer, but that's just a rebrand anyway and you can buy AM3+ boards now (list linked in the OP). Realistically though, you should probably put some serious thought into an Intel Sandy Bridge CPU and Z68 board. AMD could still pull off something awesome, but that's looking less likely and less worth waiting for.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I dunno how kosher it is to repeat this, but Star War Sex Parrot gave an offhand comment in the System Building thread about being both really disappointed with Bulldozer and bound by NDA about the subject. Also mentioned was how it seemed nice for workstations and servers, but otherwise was not impressive.

I'm thinking the FPU design backfired - the half-as-many-as-cores, double-wide, bifurcating thing just doesn't sound like it would really hold up to the floating-point-strong Intel Core processors. I was optimistic for BD, but ever since I heard about its design, I felt like it was pushing the Phenom "more cores" strategy out too far. Rather than being designed for today's (or even tomorrow's) most pressing processing needs, the chips are designed toward some highly-parallelized vision of software years from now, putting power in the wrong places compared to what most people usually wait on their processors for now.

PC LOAD LETTER
May 23, 2005
WTF?!
It's a drat shame AMD dropped the ball, again. :sigh: They're lucky they have a decent GPU to cram onto a single die with their older cores. Llano will make a good mainstream chip, but that is a real disappointment to those of us who were hoping for something more than that.

The FPU, though? I'd be surprised if the FPU was the problem. For whatever reason, AMD's problems seem to pop up in the L1 cache and the decoders/schedulers.

adorai
Nov 2, 2002

10/27/04 Never forget
Grimey Drawer

Factory Factory posted:

I felt like it was pushing the Phenom "more cores" strategy out too far. Rather than being designed for today's (or even tomorrow's) most pressing processing needs, the chips are designed toward some highly-parallelized vision of software years from now, putting power in the wrong places compared to what most people usually wait on their processors for now.
The high-margin chips are the ones people buy for their datacenters. For example, we run ~120 VMs on three physical boxes, each with 2x 6-core hyperthreaded processors. The procs themselves were outrageously expensive when we purchased them. AMD would like to capture that market.

fishmech
Jul 16, 2006

by VideoGames
Salad Prong

Factory Factory posted:

I'm thinking the FPU design backfired - the half-as-many-as-cores, double-wide, bifurcating thing just doesn't sound like it would really hold up to the floating-point-strong Intel Core processors

I have to say I agree with this, it just never made sense to me to do it. Not having an FPU was bad on single core CPUs back in the 90s - why would we make sure that a design in the 2010s lacked an FPU for half the cores?

Especially since it seems like it was kinda supposed to be the AMD version of hyperthreading.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.

fishmech posted:

I have to say I agree with this, it just never made sense to me to do it. Not having an FPU was bad on single core CPUs back in the 90s - why would we make sure that a design in the 2010s lacked an FPU for half the cores?

Especially since it seems like it was kinda supposed to be the AMD version of hyperthreading.

That is not true. There are 2 128-bit FPUs in a module, but for 256-bit FP calculation, they can fuse. Unless AVX is being used on one core, there are two FPUs, each tied to one core.

Sinestro fucked around with this message at 01:15 on May 31, 2011

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
We can speculate, but personally I want to see the chips out and benchmarked by someone reliable before I write off AMD

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Sinestro posted:

That is not true. There are 2 128-bit FPUs in a module, but for 256-bit FP calculation, they can fuse. Unless AVX is being used on one core, there are two FPUs, each tied to one core.
That's not really accurate: there's a single FPU with two 128-bit FMACs, which is the amount of FPU resources you normally dedicate to a single core. See the Bulldozer slides in comparison to Phenom II on this page. The FPU is, in effect, using Hyperthreading to allow it to serve two cores. That said, I don't think FPU performance is a limiting factor in most workloads, but if you're building a cluster to process your floating-point scientific data, you'll probably buy Xeons. The actual problem for AMD appears to have been that Bulldozer couldn't scale to high enough clockspeeds to offer competitive per-thread performance, which is what has always been the big risk of going with >4 core CPUs.
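To make the sharing concrete, here's a back-of-the-envelope peak-FLOPS sketch for one module, assuming each 128-bit FMAC retires one fused multiply-add per cycle (counted as 2 FLOPs) on 2 doubles, at a made-up 3.5GHz clock; these are not vendor-published figures:

```python
# Illustrative peak FP throughput for one Bulldozer module under the stated
# assumptions: two 128-bit FMACs shared by the module's two cores, one FMA
# (2 FLOPs) per FMAC per cycle, 2 doubles per 128-bit operation.
def module_peak_dp_gflops(clock_ghz, fmacs=2, doubles_per_fmac=2, flops_per_fma=2):
    return clock_ghz * fmacs * doubles_per_fmac * flops_per_fma

clock = 3.5  # hypothetical clock, just for illustration
print(module_peak_dp_gflops(clock))      # ~28 GFLOPS for the whole module
print(module_peak_dp_gflops(clock) / 2)  # ~14 GFLOPS per core when both cores hit the shared FPU at once
```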

PC LOAD LETTER
May 23, 2005
WTF?!
That makes lots more sense, especially given the rumors of poor yields and delays from GF's 32nm process.

Devian666
Aug 19, 2008

Take some advice Chris.

Fun Shoe

Alereon posted:

The actual problem for AMD appears to have been that Bulldozer couldn't scale to high enough clockspeeds to offer competitive per-thread performance, which is what has always been the big risk of going with >4 core CPUs.

The clockspeed is one issue. However, I had a look at the article and there's one thing that sticks in my mind. The main feature of this architecture that I see is going from 3 to 4 decode units. That will certainly provide some improvement, with the theoretical best being a 33% increase in instructions decoded per clock cycle, but it is also a limit.

I'm wondering when AMD will provide a 5-module unit, and the corresponding 5 decoders. Though this could be linked to the current limits of memory speed.


PC LOAD LETTER
May 23, 2005
WTF?!
x86 is pretty IPC-limited; IIRC the Athlon had 3 decoders and only averaged around 1.5 IPC of throughput. 4 is already overkill, and adding a 5th would be a waste. Resources would probably be better spent on a bigger/faster cache, branch prediction, or improving clockspeed. I don't believe memory bandwidth is an issue right now either; almost nothing seems to be limited by it for desktop workloads.
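A tiny sketch of why the fourth (let alone a fifth) decoder rarely pays off: sustained throughput is bounded by whichever is smaller, the decode width or the IPC the code can actually extract. The ~1.5 figure is the rough Athlon average cited above; everything else here is illustrative.

```python
# Sustained IPC is capped by min(decode width, what the code/back-end can extract).
# 1.5 is the rough average cited above for a 3-wide Athlon; purely illustrative.
def sustained_ipc(decode_width, achievable_ipc=1.5):
    return min(decode_width, achievable_ipc)

for width in (3, 4, 5):
    print(width, sustained_ipc(width))  # widening from 4 to 5 changes nothing for typical code
```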
