Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Keep in mind that this generation of consoles is all about cost reduction, both in terms of bill of materials and ESPECIALLY hardware development investment. I'd love to see AMD develop some custom APU with a wide GDDR5 memory bus, but I think it's more likely that any console using a Bulldozer APU will use a regular AMD Trinity APU (Bulldozer cores plus VLIW4 graphics) and a custom chipset to provide a low-cost integrated platform.


Ryokurin
Jul 14, 2001

Wanna Die?

Space Gopher posted:

Sure, but when it was designed Microsoft placed a lot more importance on getting something out there to compete now than on an optimum solution. They went with x86 not because it was the best choice for the task at hand, but because they had a lot of people very good with x86, and they could do it all with off-the-shelf parts. Nobody pretended it was the best choice.

When it came time to develop the 360, which wasn't nearly as much of a rush job, they went with a custom Power chip just like the rest of the industry. Now, if they are in fact going with a Bulldozer-based APU, it seems like their best and brightest have sat down and said, "starting from a clean sheet, x86 is clearly the best option for our new console." If AMD's starting to do some really interesting stuff with CPU/GPU hybrids behind closed doors, then it might be realistic, but it still feels kind of weird.

About the only thing I can point to that suggests it isn't true is that the main reason they killed the original Xbox so quickly was that Nvidia wanted more money for their chips, and there were rumors that Intel didn't want to make that processor any longer. What's stopping AMD from making a similar call a few years into Xbox Next production? The 360 wasn't really about following everyone else; it was about making sure they owned as much of the process as possible.

And then again, they need to make sure whatever they make doesn't cost more than the current system. The days of charging $600 for a console are over, and Bulldozer is probably one of the easiest ways to do one for $400.

Not A Gay Name
Nov 8, 2008

Ryokurin posted:

About the only thing I can point to that suggests it isn't true is that the main reason they killed the original Xbox so quickly was that Nvidia wanted more money for their chips, and there were rumors that Intel didn't want to make that processor any longer. What's stopping AMD from making a similar call a few years into Xbox Next production? The 360 wasn't really about following everyone else; it was about making sure they owned as much of the process as possible.

And then again, they need to make sure whatever they make doesn't cost more than the current system. The days of charging $600 for a console are over, and Bulldozer is probably one of the easiest ways to do one for $400.

It was a poor contract with Nvidia, as I understood it. It was an off-the-shelf GeForce 3 (or 4, I can't remember for sure), and as retail prices dropped, Nvidia had MS locked into paying the original high price. It left a sour taste in the mouth, so they went with ATI.

I wouldn't put doing the same thing past AMD/ATI, but I think MS has learned their lesson in that regard.

AMD would probably be ecstatic to have that many APUs getting into people's homes, though.

PC LOAD LETTER
May 23, 2005
WTF?!
AMD may be willing to sell them a long-term license to the chip itself, which I don't think Intel was willing to do. If so, it might be made on TSMC's bulk process rather than GlobalFoundries' custom process (or maybe even GF's bulk process), which would make more sense if they're looking to control the manufacture of their next console as much as possible.

Alereon posted:

Keep in mind that this generation of consoles is all about cost reduction, both in terms of bill of materials and ESPECIALLY hardware development investment. I'd love to see AMD develop some custom APU with a wide GDDR5 memory bus, but I think it's more likely that any console using a Bulldozer APU will use a regular AMD Trinity APU (Bulldozer cores plus VLIW4 graphics) and a custom chipset to provide a low-cost integrated platform.
I'm actually expecting something like a more updated version of the X360, which has worked out very well for MS both cost- and performance-wise overall. Give it some more eDRAM, or put some small fast GDDR5 on-package with the APU, and then have one big pool of GDDR5 on a "narrow" 128-bit bus as the main RAM (the 6-7GHz stuff might be commonly available by then), and you might have one heck of a machine for cheapish. I know MS was making money/breaking even on their hardware far sooner than Sony did; I think it was a year or so after launch. I'm assuming they'll be targeting 1080p as the standard resolution, which even mid-range GPUs today can perform well at, so it shouldn't be hard for a console coming out in 2013 or so to pull off.
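
Quick back-of-the-envelope on that bus, treating the 6-7GHz as the effective per-pin data rate (my speculation about 2013-era GDDR5, not any announced part):

code:
# Peak bandwidth = bus width in bytes x effective data rate.
# The 6 and 7 GT/s figures are speculation, not specs.
def peak_gbs(bus_bits, gtps):
    return bus_bits / 8 * gtps

for rate in (6.0, 7.0):
    print(f'128-bit @ {rate:.0f} GT/s -> {peak_gbs(128, rate):.0f} GB/s')
# 96-112 GB/s, vs. the 360's 22.4 GB/s GDDR3 main memory bus.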

PC LOAD LETTER fucked around with this message at 01:10 on Jul 23, 2011

You Am I
May 20, 2001

Me @ your poasting

Not A Gay Name posted:

It was a poor contract with Nvidia, as I understood it. It was an off-the-shelf GeForce 3 (or 4, I can't remember for sure), and as retail prices dropped, Nvidia had MS locked into paying the original high price. It left a sour taste in the mouth, so they went with ATI.
Yeah, Nvidia screwed Microsoft hard there, which is strange considering the lawyers MS has; they should've seen that coming.

There is no way Microsoft was going to deal with Nvidia after that little episode. The 360 was a rush job, as the RROD issue showed in the early versions. I'm sure if Nvidia hadn't screwed MS so hard, MS wouldn't have killed the original Xbox so quickly and would've spent more time developing the 360, avoiding the hardware issues they had.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

You Am I posted:

The 360 was a rush job, as the RROD issue showed in the early versions. I'm sure if Nvidia hadn't screwed MS so hard, MS wouldn't have killed the original Xbox so quickly and would've spent more time developing the 360, avoiding the hardware issues they had.

Wasn't the RROD issue due to lead-free solder hassles (similar to those lovely HP DV-series laptops), rather than poor design? Or were there multiple causes?

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

JnnyThndrs posted:

Wasn't the RROD issue due to lead-free solder hassles (similar to those lovely HP DV-series laptops), rather than poor design? Or were there multiple causes?
While I think the actual problem was similar (high temperature differentials weakening and disconnecting the solder balls), the cause was different. On the HP laptops it was due to a design defect in the packaging of the GeForce 8M GPUs (specifically, a missing underfill layer that's supposed to anchor the solder balls to the die), which is the generation following the 360's GPU. I think the key problem for the 360 was insufficient cooling, since the later models had much larger heatsinks with heatpipes.
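
To put rough numbers on those temperature differentials, here's a toy expansion-mismatch calculation. The coefficients are textbook ballparks, and the ball-array span and temperature swing are assumptions, not measured 360 values:

code:
# Differential expansion between the silicon die and the organic package
# substrate over one idle-to-load cycle. All values are illustrative.
ALPHA_SILICON = 2.6e-6     # CTE of silicon, 1/K
ALPHA_SUBSTRATE = 17e-6    # CTE of an FR4-like organic substrate, 1/K
SPAN_MM = 20.0             # assumed distance across the ball array, mm
DELTA_T = 60.0             # assumed temperature swing, K

shear_um = (ALPHA_SUBSTRATE - ALPHA_SILICON) * SPAN_MM * DELTA_T * 1000
print(f'~{shear_um:.0f} um of shear at the outermost solder balls per cycle')
# Repeat that for thousands of cycles and unsupported lead-free joints
# fatigue, which is why the missing underfill mattered so much.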

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

JnnyThndrs posted:

Wasn't the RROD issue due to leadfree solder hassles(similar to those lovely HP DV-series laptops), rather than poor design? Or were there multiple causes?

Here's a fantastic series of articles on The Inquirer about the NVIDIA issue: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective

As far as I understand it, the Xbox 360 issue was mainly one of heat.
At least part of that must be down to the cooling shroud, which is poorly designed: the airflow path to the GPU has extremely high resistance compared to the CPU's, so most of the air just goes over the CPU, leaving the GPU to stew ridiculously.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

HalloKitty posted:

Here's a fantastic series of articles on The Inquirer about the NVIDIA issue: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective

As far as I understand it, the Xbox 360 issue was mainly one of heat.
At least part of that must be down to the cooling shroud, which is poorly designed: the airflow path to the GPU has extremely high resistance compared to the CPU's, so most of the air just goes over the CPU, leaving the GPU to stew ridiculously.

This reminds me: a big part of why the 360 wasn't an x86 variant may be that it arrived right before power consumption became a big thing, when the Athlon 64 was competing with the P4. With so much of both the x86 market and the microprocessor market in general shifting to performance per watt, an AMD APU solution might make all the more sense in the next console generation.

JnnyThndrs
May 29, 2001

HERE ARE THE FUCKING TOWELS

HalloKitty posted:

Here's a fantastic series of articles on The Inquirer about the NVIDIA issue: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective

Wow... great, in-depth analysis, thanks! :)

waxluthor
May 28, 2003
http://www.youtube.com/watch?v=LxlQLzOCxEc

I came across this video and thought people here would be interested. The guy compares the AMD A8-3850 with an i3 2105 in a number of games at low and high settings. He also tested the performance boost from adding an HD 6670 to the AMD system.

JustAnother Fat Guy
Dec 22, 2009

Go to hell, and take your cheap suit with you!
Hate to do a lovely bump like this, but has any more credible news come out about a release date for these? It's getting close to the end of August, and all I'm seeing on tech sites is more explanations of Bulldozer's architecture, but nothing that says when we'll actually see these elusive chips, either at retail or in OEM systems.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

rexelation posted:

http://www.youtube.com/watch?v=LxlQLzOCxEc

I came across this video and thought people here would be interested. The guy compares the AMD A8-3850 with an i3 2105 in a number of games at low and high settings. He also tested the performance boost from adding an HD 6670 to the AMD system.



Video review? Really?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

rexelation posted:

http://www.youtube.com/watch?v=LxlQLzOCxEc

I came across this video and thought people here would be interested. The guy compares the AMD A8-3850 with an i3 2105 in a number of games at low and high settings. He also tested the performance boost from adding an HD 6670 to the AMD system.

He left out the bench of just the discrete graphics card without the CrossFire setting. That makes it really unhelpful if you were trying to see what Hybrid CrossFire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to AMD's, but why go only halfway toward demonstrating the performance of the AMD onboard graphics versus the discrete card it's supposed to compete with? (That was the idea, right?)

Agreed fucked around with this message at 13:50 on Aug 19, 2011

freeforumuser
Aug 11, 2007

Agreed posted:

He left out the bench of just the discrete graphics card without the CrossFire setting. That makes it really unhelpful if you were trying to see what Hybrid CrossFire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to AMD's, but why go only halfway toward demonstrating the performance of the AMD onboard graphics versus the discrete card it's supposed to compete with? (That was the idea, right?)

Put in an actual gaming-grade video card like a 5770 and watch the i3 eat Llano for breakfast at around the same cost for both.

It's disingenuous to compare Hybrid CrossFire Llano with the SB GPU while downplaying the fact that the Intel setup can also use the same discrete card.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yeah, I don't really understand the point of Llano; I'm still waiting to see what Bulldozer will do. But if he's going to go to the trouble of trying to demonstrate something, it seems like he ought to at least get the basic design of his tests settled. I have no idea what the Hybrid CrossFire thing is even really doing, whether it's offloading the rendering mainly to the card or what. He had all the stuff right there and set out to run some tests; it just bugs me that he didn't do the last one to make it some kind of meaningful data set, I guess.

Sinestro
Oct 31, 2010

The perfect day needs the perfect set of wheels.
Llano is the chip without a market. Anyone who would want one would get SB + /[HZ]6\d/ using the IGP, or AM3 + dGPU. :iiam: why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs the category of "any dGPU + SB", and Optimus makes the battery life comparable to an IGP's. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense.

But enough :spergin: for now.
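
OK, one more :spergin: moment: a quick sanity check on that chipset pattern (the board names here are just for illustration):

code:
import re

# The Sandy Bridge boards with IGP output: H61/H67/Z68 and friends.
chipset = re.compile(r'[HZ]6\d')

for name in ('H61', 'H67', 'Z68', 'P67', 'A75'):
    print(name, bool(chipset.fullmatch(name)))
# H61/H67/Z68 match; P67 (no IGP output) and AMD's A75 don't.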

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

It's bringing gaming to $500 laptops and desktops. That's huge, because you can't game for poo poo on a $500 laptop without Llano, and they sell millions of the drat things. Average people don't even buy computers over a couple hundred dollars, much less gaming PCs with video cards that cost $200 alone.

freeforumuser
Aug 11, 2007

Sinestro posted:

Llano is the chip without a market. Anyone who would want one would get SB + /[HZ]6\d/ using the IGP, or AM3 + dGPU. :iiam: why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs the category of "any dGPU + SB", and Optimus makes the battery life comparable to an IGP's. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense.

But enough :spergin: for now.

It would have been a lot more impressive if it had been released in mid-2010. The delays of Llano and BD have taken their toll.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Sinestro posted:

Llano is the chip without a market. Anyone who would want one would get SB + /[HZ]6\d/ using the IGP, or AM3 + dGPU. :iiam: why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs the category of "any dGPU + SB", and Optimus makes the battery life comparable to an IGP's. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense.
A Llano system is a good $100-$200 cheaper than a Sandy Bridge system, and it actually has sufficient graphics horsepower for moderate gaming and HTPC applications. You only get the Intel HD Graphics 3000 if you buy a K-edition desktop processor (or a lovely i3 2105, or the low-power i5 2405S), and I don't think you really appreciate how terrible previous-generation integrated graphics was on the AM3 platform. It's true that the CPU performance, equivalent to an Athlon II X4 640 or so, isn't very impressive, but it's fast enough to beat Intel's dual-core CPUs and fast enough for basic gaming and desktop applications. Basically, if you're buying anything less than an i5 2500K, or care more about decent graphics performance than high-end CPU performance, Llano is a better option. I really wish they'd get Turbo working in a future refresh, though; that should help a lot in applications that don't use all four cores, and really boost competition against Intel.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
The fact is that cheap consumer systems are generally skewed toward CPU power. Not as badly as in the clock-speed wars, when everything was marketed by CPU speed and the RAM, video, hard drive, and everything else would be woefully inadequate even for common tasks, but all the same it's still the case outside the ultra low end, where you get an Atom or whatever single-core chips are still in production.

The sub-$500 desktop and sub-$600 laptop ranges are still ruled by systems where CPU power is okay for anything but power users, while GPU performance is miserably inadequate. And even though PCIe has meant even cheap systems usually have expansion slots, a lot of power supplies are still too anemic to support much of a card, even if someone who knows computers (since power users and tech types aren't the target market here) recognizes the need.

I agree, Llano would be much more impressive if it had hit last year, or if the Bulldozer core were in it, but as it stands? Most users in the market for a prebuilt system in that price range aren't going to notice a Llano taking a few extra seconds zipping files or re-encoding videos off their phone or whatever. They're a lot more likely to notice that their Sandy Bridge just doesn't play games adequately, even for a casual gamer, unless they're strictly into six-year-old titles. Even if you're not some clueless type who just wants it to work, that's still a system with a niche: it's often not about a fast processor but a balanced system, and Llano can deliver that cheapest within its niche.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Digitimes is reporting that yields are significantly lower than expected on AMD's 32nm process, resulting in shortages of Llano chips. This is probably also the reason why the Bulldozer release date keeps slipping.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
Looks like the E-450 is out now. It's only a little faster than the E-350, but it adds support for Llano-style Turbo Core, which apparently can give it a significant boost in GPU-limited games. So it's a good little advancement for casual gaming in the sub-notebook sector.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
SemiAccurate got their hands on an AMD internal presentation disclosing the Bulldozer die size as 315mm^2, which is very large compared to other contemporary CPUs. A full Sandy Bridge die is 216mm^2, while the Phenom II X6 (Thuban) 45nm die was 346mm^2.
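
For a sense of what that area costs, here's a rough dies-per-wafer sketch using the standard gross-die approximation. The defect density is purely illustrative, not AMD's actual 32nm number:

code:
import math

def gross_dies(wafer_mm, die_mm2):
    # Classic approximation: wafer area / die area, minus an edge-loss term.
    r = wafer_mm / 2
    return math.pi * r * r / die_mm2 - math.pi * wafer_mm / math.sqrt(2 * die_mm2)

def poisson_yield(die_mm2, d0_per_cm2):
    # Simple Poisson defect model; d0 is assumed, not a real fab figure.
    return math.exp(-(die_mm2 / 100) * d0_per_cm2)

for name, area in (('Bulldozer', 315), ('Sandy Bridge', 216)):
    g = gross_dies(300, area)
    print(f'{name}: ~{g:.0f} gross dies/wafer, ~{g * poisson_yield(area, 0.4):.0f} good')
# The bigger die loses twice: fewer candidates per wafer AND lower yield per die.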

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

Look at the layout. No wonder it's so drat big: there's 2 megs of L2 cache per module and 8 megs of shared L3. I'm guessing this is how they solved the memory bottleneck.

That's a total of 16 megs of cache on die, compared with a Phenom X6's 3 megs of L2 and 6 megs of L3 (total).
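
Quick tally of that, assuming the four-module layout (2MB L2 per module, 8MB shared L3; Thuban is 6 x 512KB L2 plus 6MB shared L3):

code:
# Cache totals implied by the die shot vs. Phenom II X6 (Thuban).
caches_mb = {
    'Bulldozer': {'L2': 4 * 2.0, 'L3': 8.0},  # 4 modules x 2 MB, shared 8 MB L3
    'Thuban':    {'L2': 6 * 0.5, 'L3': 6.0},  # 6 cores x 512 KB, shared 6 MB L3
}
for name, c in caches_mb.items():
    print(f"{name}: {c['L2'] + c['L3']:.0f} MB total "
          f"({c['L2']:.0f} MB L2 + {c['L3']:.0f} MB L3)")
# Bulldozer: 16 MB total (8 + 8); Thuban: 9 MB total (3 + 6).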

JawnV6
Jul 4, 2004

So hot ...
Can BD do an L3 chop besides 8M/0M? Or does the cache setup still have that hole if L3 exists and has less space than the combined L2's?

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

JawnV6 posted:

Can BD do an L3 chop besides 8M/0M? Or does the cache setup still have that hole if L3 exists and has less space than the combined L2's?

I'm not sure anyone has that information other than AMD's engineers and the motherboard manufacturers.

JawnV6
Jul 4, 2004

So hot ...

Devian666 posted:

I'm not sure anyone has that information other than AMD's engineers and the motherboard manufacturers.

Yeah I guess it's super-hard to scan a product line and see if they're selling anything besides 0M L3 and 8M L3 parts? These threads are normally full of system-builder types who could answer that at a glance.

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

JawnV6 posted:

Yeah I guess it's super-hard to scan a product line and see if they're selling anything besides 0M L3 and 8M L3 parts? These threads are normally full of system-builder types who could answer that at a glance.

So difficult that you didn't do this yourself. You could research this and respond to the thread.

You may find that any information is subject to change, given that the previous version of the chip didn't perform as expected.

JawnV6
Jul 4, 2004

So hot ...

Devian666 posted:

So difficult that you didn't do this yourself. You could research this and respond to the thread.

You may find that any information is subject to change, given that the previous version of the chip didn't perform as expected.

Yes. System builders who are constantly shopping for parts and aware of product lines have easier access to this information than I do. I'm seriously asking if anyone's seen a 2, 4, or 6 in the L3 column, or if it's all 0 or 8. This shouldn't involve research; it's a yes/no question that I'm sure someone following this thread can answer offhand. I'm not sure why you're being so prickly about this; you didn't even understand the question well enough the first time I asked it to produce a coherent response. Motherboard manufacturers wouldn't have a clue what sizes the L3 can come in, and the availability of chops is public information.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
FFS, it's not even out yet. Stop having a sperg contest.

According to Wikipedia, the L3 is shared between all cores, but none of the currently-known (i.e. possibly old and outdated) SKUs will have less than 8 MB of L3 cache. L2 cache is paired with the Bulldozer modules and is slated to be 1 MB/core regardless of number of cores.

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

JawnV6 posted:

Yes. System builders who are constantly shopping for parts and aware of product lines have easier access to this information than I do. I'm seriously asking if anyone's seen a 2, 4, or 6 in the L3 column, or if it's all 0 or 8. This shouldn't involve research; it's a yes/no question that I'm sure someone following this thread can answer offhand. I'm not sure why you're being so prickly about this; you didn't even understand the question well enough the first time I asked it to produce a coherent response. Motherboard manufacturers wouldn't have a clue what sizes the L3 can come in, and the availability of chops is public information.

Sorry for trying to be helpful.

The answer is no at this time, as all eight versions are listed with 8 megs of L3 cache. I don't have any information beyond the release versions of the desktop cores.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Any day now, right?

:f5:

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

Bob Morales posted:

Any day now, right?

:f5:

They delayed until September, so let's hope so. It could take a while for stock to be available.

Star War Sex Parrot
Oct 2, 2003

I updated the thread title to see if there's interest in creating other roadmap discussion threads. NVIDIA's a bit hazy right now, but there's some interesting stuff to talk about for Intel. If it's a terrible idea, feel free to let me know. I figure it's more appropriate than just calling it "the AMD megathread", since that's sorta what it's turned into.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

I bought NVIDIA and AMD stock during the crash two weeks ago; let's see how this pans out.

Longinus00
Dec 29, 2005
Ur-Quan

Devian666 posted:

They delayed until September, so let's hope so. It could take a while for stock to be available.

Welp!

http://www.xbitlabs.com/news/cpu/display/20110831102650_AMD_s_Highly_Anticipated_Bulldozer_Chips_Might_Face_Further_Delay.html

Devian666
Aug 20, 2008

Take some advice Chris.

Fun Shoe

:ohdear:

So it looks like the good part of their yield is going into the server market. AMD will most likely build up high stock levels of "low-end" CPUs from the defective chips. A shame they won't be able to sell those until some time after the Bulldozer release.

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug
Gigabyte just revealed some upcoming models and specs for Bulldozer chips, apparently:

http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=3880

Top end is the FX-8150, with 8 cores, 3.6GHz, and 125W TDP.


Maxwell Adams
Oct 21, 2000

T E E F S

Killer robot posted:

Gigabyte just revealed some upcoming models and specs for Bulldozer chips, apparently:

http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=3880

Top end is the FX-8150, with 8 cores, 3.6GHz, and 125W TDP.

That page says there are two kinds of FX-8120, the only difference being the TDP. That seems a bit weird.
