|
Keep in mind that this generation of consoles is all about cost reduction, both in terms of bill of materials and ESPECIALLY hardware development investment. I'd love to see AMD develop some custom APU with a wide GDDR5 memory bus, but I think it's more likely that any console using a Bulldozer APU will use a regular AMD Trinity APU (Bulldozer cores plus VLIW4 graphics) and a custom chipset to provide a low-cost integrated platform.
|
# ? Jul 22, 2011 02:06 |
|
Space Gopher posted:Sure, but when it was designed Microsoft placed a lot more importance on getting something out there to compete now than on an optimum solution. They went with x86 not because it was the best choice for the task at hand, but because they had a lot of people very good with x86, and they could do it all with off-the-shelf parts. Nobody pretended it was the best choice. About the only thing I can say that goes against that is that the main reason they killed the original Xbox so quickly was because Nvidia wanted more money for their chips, and there were rumors that Intel didn't want to make that processor any longer. What's stopping AMD from making a similar call a few years into Xbox Next production? The 360 wasn't really about following everyone else; it was about making sure they owned as much of the process as possible. And then again, they need to make sure whatever they make doesn't cost more than the current system. The days of charging $600 for a console are over, and Bulldozer is probably one of the easiest ways to do one for $400.
|
# ? Jul 22, 2011 17:41 |
|
Ryokurin posted:About the only thing I can say that goes against that is that the main reason they killed the original Xbox so quickly was because Nvidia wanted more money for their chips, and there were rumors that Intel didn't want to make that processor any longer. What's stopping AMD from making a similar call a few years into Xbox Next production? The 360 wasn't really about following everyone else; it was about making sure they owned as much of the process as possible. It was a poor contract with Nvidia as I understood it. It was an off-the-shelf GeForce 3 (or 4, I can't remember for sure), and as retail prices dropped Nvidia had MS locked into paying the original high price. Left a sour taste in the mouth, so they went with ATI. I wouldn't put doing the same thing past AMD/ATI, but I think MS has learned their lesson in that regard. AMD would probably be ecstatic to have that many APUs getting into people's homes though.
|
# ? Jul 22, 2011 23:58 |
|
AMD may be willing to sell them a long-term license to the chip itself, which I don't think Intel was willing to do. If so, it might be made on TSMC's bulk process rather than GF's custom process (or maybe even GF's bulk process), which would make more sense if they're looking to control the manufacture of their next console as much as possible. Alereon posted:Keep in mind that this generation of consoles is all about cost reduction, both in terms of bill of materials and ESPECIALLY hardware development investment. I'd love to see AMD develop some custom APU with a wide GDDR5 memory bus, but I think it's more likely that any console using a Bulldozer APU will use a regular AMD Trinity APU (Bulldozer cores plus VLIW4 graphics) and a custom chipset to provide a low-cost integrated platform. PC LOAD LETTER fucked around with this message at 00:10 on Jul 23, 2011 |
# ? Jul 23, 2011 00:04 |
|
Not A Gay Name posted:It was a poor contract with Nvidia as I understood it. It was an off-the-shelf GeForce 3 (or 4, I can't remember for sure), and as retail prices dropped Nvidia had MS locked into paying the original high price. Left a sour taste in the mouth, so they went with ATI. There is no way Microsoft was going to deal with Nvidia after that little episode. The 360 was a rush job, as the RROD issue showed in the early versions. I'm sure that if Nvidia hadn't screwed MS so hard, MS wouldn't have killed the original Xbox so quickly and would've spent more time developing the 360, avoiding the hardware issues it had.
|
# ? Jul 23, 2011 01:46 |
|
You Am I posted:The 360 was a rush job, as the RROD issue showed in the early versions. I'm sure that if Nvidia hadn't screwed MS so hard, MS wouldn't have killed the original Xbox so quickly and would've spent more time developing the 360, avoiding the hardware issues it had. Wasn't the RROD issue due to lead-free solder hassles (similar to those lovely HP DV-series laptops), rather than poor design? Or were there multiple causes?
|
# ? Jul 23, 2011 02:10 |
|
JnnyThndrs posted:Wasn't the RROD issue due to lead-free solder hassles (similar to those lovely HP DV-series laptops), rather than poor design? Or were there multiple causes?
|
# ? Jul 23, 2011 02:16 |
|
JnnyThndrs posted:Wasn't the RROD issue due to lead-free solder hassles (similar to those lovely HP DV-series laptops), rather than poor design? Or were there multiple causes? Here's a fantastic series of articles on The Inquirer about the NVIDIA issue: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective As far as I understand it, the Xbox 360 issue was mainly one of heat. At least part of that must be down to the cooling shroud; it's poorly designed. The path to the GPU has much higher airflow resistance than the path over the CPU, so most of the air just goes over the CPU, leaving the GPU to stew ridiculously.
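To make that concrete, here's a toy sketch of the parallel-path idea. The resistance numbers are made up purely for illustration, and the linear flow model is a simplification, but the point stands: with a shared fan, airflow splits roughly in inverse proportion to each path's resistance, so a restrictive GPU duct gets starved.

```python
# Toy model: treat the two duct paths like parallel resistors, so for a given
# pressure drop each path's flow is proportional to 1/resistance.
# Both resistance values are hypothetical, chosen only to illustrate the point.
cpu_path_resistance = 1.0   # hypothetical: easy path over the CPU heatsink
gpu_path_resistance = 4.0   # hypothetical: restrictive path to the GPU

conductances = {"CPU path": 1 / cpu_path_resistance,
                "GPU path": 1 / gpu_path_resistance}
total = sum(conductances.values())

for path, g in conductances.items():
    print(f"{path}: {g / total:.0%} of the airflow")
# CPU path: 80% of the airflow
# GPU path: 20% of the airflow
```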
|
# ? Jul 24, 2011 09:17 |
|
HalloKitty posted:Here's a fantastic series of articles on The Inquirer about the NVIDIA issue: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective This reminds me: a big part of why the 360 wasn't an x86 variant may be that it came along right before power consumption became a big thing, back when the Athlon 64 was competing with the P4. With so much of the x86 market, and the microprocessor market in general, shifting to performance per watt, an AMD APU solution might make all the more sense for the next console generation.
|
# ? Jul 24, 2011 09:41 |
|
HalloKitty posted:Here's a fantastic series of articles on The Inquirer about the NVIDIA issue: http://www.theinquirer.net/inquirer/news/1004378/why-nvidia-chips-defective Wow... great, in-depth analysis, thanks!
|
# ? Jul 24, 2011 21:40 |
|
http://www.youtube.com/watch?v=LxlQLzOCxEc I came across this video and thought people here would be interested. The guy compares an AMD A8-3850 with an i3-2105 in a number of games at low and high settings. He also tested the performance boost from adding an HD 6670 to the AMD system.
|
# ? Jul 29, 2011 19:29 |
|
Hate to do a lovely bump like this, but has any more credible news come out about a release date for these? It's getting close to the end of August, and all I'm seeing on tech sites is just more explanations of Bulldozer's architecture, but nothing that says when we'll see these elusive chips in either retail or OEM systems.
|
# ? Aug 19, 2011 10:26 |
|
rexelation posted:http://www.youtube.com/watch?v=LxlQLzOCxEc Video review? Really?
|
# ? Aug 19, 2011 12:26 |
|
rexelation posted:http://www.youtube.com/watch?v=LxlQLzOCxEc He left out the bench of just the discrete graphics card without the CrossFire setting. That makes it really unhelpful if you were trying to see what Hybrid CrossFire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to the AMD one, but why go only halfway toward demonstrating the performance of the AMD on-board versus the discrete card it's supposed to compete with (that was the idea, right?) Agreed fucked around with this message at 12:50 on Aug 19, 2011 |
# ? Aug 19, 2011 12:48 |
|
Agreed posted:He left out the bench of just the discrete graphics card without the CrossFire setting. That makes it really unhelpful if you were trying to see what Hybrid CrossFire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to the AMD one, but why go only halfway toward demonstrating the performance of the AMD on-board versus the discrete card it's supposed to compete with (that was the idea, right?) Put in an actual gaming-grade video card like a 5770 and watch the i3 eat Llano for breakfast at around the same cost for both. It's disingenuous to compare Hybrid CrossFire Llano with the SB GPU while downplaying the fact that the Intel setup can also use the same discrete card.
|
# ? Aug 19, 2011 14:54 |
|
Yeah, I don't really understand the point of Llano; I'm still waiting to see what Bulldozer will do. But if he's going to go to the trouble of trying to demonstrate something, it seems like he ought to at least get the basic design of his tests settled. I have no idea what the Hybrid CrossFire thing is even really doing, whether it's offloading the rendering mainly to the card or what. He had all the stuff right there and set out to run some tests; it just bugs me that he didn't do the last one to make it some kind of meaningful data set, I guess.
|
# ? Aug 19, 2011 15:01 |
|
Llano is the chip without a market. Anyone who would want one would get SB + /[H|Z]6\d/ using the IGP, or AM3 + dGPU. I don't know why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs the category of "Any dGPU + SB", and Optimus makes the battery life comparable to an IGP. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense. But enough for now.
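As a side note on that chipset shorthand, here's a minimal sketch, assuming Python's re module, of what it's meant to cover: any Intel H6x or Z6x board, i.e. the 6-series chipsets that can actually expose Sandy Bridge's IGP. Strictly speaking, a '|' inside a character class is a literal, so [HZ]6\d states the intent more precisely.

```python
import re

# Intent of the shorthand above: any Intel 6-series H or Z chipset (H61, H67, Z68),
# i.e. the boards that expose Sandy Bridge's IGP. Note that '|' inside [...] is a
# literal character, so [HZ]6\d expresses this more precisely than [H|Z]6\d.
sb_igp_chipsets = re.compile(r"[HZ]6\d")

for board in ["H61", "H67", "Z68", "P67", "X58"]:
    print(board, "matches" if sb_igp_chipsets.fullmatch(board) else "no match")
# H61 matches, H67 matches, Z68 matches, P67 no match, X58 no match
```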
|
# ? Aug 19, 2011 16:03 |
|
It's bringing gaming to $500 laptops and desktops. That's huge, because you can't game for poo poo on a $500 laptop without Llano, and they sell millions of the drat things. Average people don't even buy computers over a couple hundred dollars, much less gaming PCs with video cards that are $200 alone.
|
# ? Aug 19, 2011 16:06 |
|
Sinestro posted:Llano is the chip without a market. Anyone who would want one would get SB + /[H|Z]6\d/ using the IGP, or AM3 + dGPU. I don't know why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs the category of "Any dGPU + SB", and Optimus makes the battery life comparable to an IGP. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense. It would have been a lot more impressive if it had been released in mid-2010. The delays of Llano and BD have taken their toll.
|
# ? Aug 19, 2011 16:34 |
|
Sinestro posted:Llano is the chip without a market. Anyone who would want one would get SB + /[H|Z]6\d/ using the IGP, or AM3 + dGPU. I don't know why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs the category of "Any dGPU + SB", and Optimus makes the battery life comparable to an IGP. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense.
|
# ? Aug 19, 2011 17:59 |
|
The fact is that cheap consumer systems generally are skewed toward CPU power. Not as badly as in the clock speed wars, when everything was marketed by CPU speed and RAM while video, hard drive, and everything else would be woefully inadequate even for common tasks, but all the same it's still the case outside of the ultra low end where you get an Atom or whatever single-core chips are still in production. The sub-$500 desktop and sub-$600 laptop ranges are still ruled by systems where CPU power is okay for anything but power users, but GPU performance is miserably inadequate, and even though PCIe means even cheap systems usually have expansion slots, a lot of power supplies are still too anemic to stick much of a card in, even if someone who knows computers (since power users and tech types aren't the target market here) recognizes the need. I agree, Llano would be much more impressive if it had hit last year, or if the Bulldozer core were in it, but as it stands? Most users in the market for a prebuilt system in that price range aren't going to notice a Llano taking a few extra seconds zipping files or re-encoding videos off their phone or whatever. They're a lot more likely to notice that their Sandy Bridge just doesn't play games even adequately for a casual gamer, unless they're strictly into six-year-old titles. Even if you're not some clueless type who just wants it to work, that's still a system with a niche: it's often not about a fast processor but a balanced system, and Llano can deliver that cheapest within its niche.
|
# ? Aug 19, 2011 18:41 |
|
Digitimes is reporting that yields are significantly lower than expected on AMD's 32nm process, resulting in shortages of Llanos. This is probably also the reason why the Bulldozer release date keeps creeping back.
|
# ? Aug 21, 2011 19:03 |
|
Looks like the E-450 is out now. It's only a little faster than the E-350, but it's added support for Llano-style Turbo Core, which apparently can give it a significant boost in GPU-limited games. So a good little advancement for casual gaming in the sub-notebook sector.
|
# ? Aug 22, 2011 04:25 |
|
SemiAccurate got their hands on an AMD internal presentation disclosing the Bulldozer die size as 315mm^2, which is very large compared to other contemporary CPUs. A full Sandy Bridge die is 216mm^2, and the Phenom II X6 (Thuban) 45nm die was 346mm^2.
|
# ? Aug 22, 2011 08:45 |
|
Alereon posted:SemiAccurate got their hands on an AMD internal presentation disclosing the Bulldozer die size as 315mm^2, which is very large compared to other contemporary CPUs. A full Sandy Bridge die is 216mm^2, and the Phenom II X6 (Thuban) 45nm die was 346mm^2. Look at the layout. No wonder it's so drat big: there's 2 megs of L2 cache per module, plus 8 megs of shared L3 on top of that. I'm guessing this is how they solved the memory bottleneck. That's a total of 16 megs of cache on die, compared with a Phenom II X6's 3 megs of L2 and 6 megs of L3 (total).
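A quick back-of-the-envelope check on those totals, as a minimal sketch assuming the commonly reported Orochi layout (4 modules with 2 MB of L2 each, plus 8 MB of shared L3) against Thuban's 6 x 512 KB L2 plus 6 MB L3:

```python
# Bulldozer (Orochi die, FX-8150 class): assumed 4 modules x 2 MB L2 + 8 MB shared L3.
bulldozer_l2_mb = 4 * 2
bulldozer_l3_mb = 8
print("Bulldozer on-die cache:", bulldozer_l2_mb + bulldozer_l3_mb, "MB")  # 16 MB

# Phenom II X6 (Thuban): 6 cores x 512 KB L2 + 6 MB shared L3.
thuban_l2_mb = 6 * 0.5
thuban_l3_mb = 6
print("Thuban on-die cache:", thuban_l2_mb + thuban_l3_mb, "MB")           # 9.0 MB
```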
|
# ? Aug 22, 2011 20:07 |
|
Can BD do an L3 chop besides 8M/0M? Or does the cache setup still have that hole if L3 exists and has less space than the combined L2's?
|
# ? Aug 22, 2011 21:05 |
|
JawnV6 posted:Can BD do an L3 chop besides 8M/0M? Or does the cache setup still have that hole if L3 exists and has less space than the combined L2's? I'm not sure anyone has that information other than AMD's engineers and the motherboard manufacturers.
|
# ? Aug 22, 2011 21:36 |
|
Devian666 posted:I'm not sure anyone has that information other than AMD's engineers and the motherboard manufacturers. Yeah I guess it's super-hard to scan a product line and see if they're selling anything besides 0M L3 and 8M L3 parts? These threads are normally full of system-builder types who could answer that at a glance.
|
# ? Aug 22, 2011 23:03 |
|
JawnV6 posted:Yeah I guess it's super-hard to scan a product line and see if they're selling anything besides 0M L3 and 8M L3 parts? These threads are normally full of system-builder types who could answer that at a glance. So difficult that you didn't do this yourself. You could research this and respond to the thread. You may find that any information is subject to change, given that the previous version of the chip didn't perform as expected.
|
# ? Aug 22, 2011 23:58 |
|
Devian666 posted:So difficult that you didn't do this yourself. You could research this and respond to the thread. Yes. System builders who are constantly shopping for parts and aware of product lines have easier access to this information than I do. I'm seriously asking if anyone's seen a 2, 4, or 6 in the L3 column, or if it's all 0 or 8. This shouldn't involve research; it's a yes/no question that I'm sure someone following this thread can answer offhand. I'm not sure why you're being so prickly about this; you didn't even understand the question well enough the first time I asked it to produce a coherent response. Motherboard manufacturers wouldn't have a clue what sizes the L3 can come in, and the availability of chops is public information.
|
# ? Aug 23, 2011 00:56 |
|
FFS, it's not even out yet. Stop having a sperg contest. According to Wikipedia, the L3 is shared between all cores, but none of the currently-known (i.e. possibly old and outdated) SKUs will have less than 8 MB of L3 cache. L2 cache is paired with the Bulldozer modules and is slated to be 1 MB/core regardless of number of cores.
|
# ? Aug 23, 2011 01:08 |
|
JawnV6 posted:Yes. System builders who are constantly shopping for parts and aware of product lines have easier access to this information than I do. I'm seriously asking if anyone's seen a 2, 4, or 6 in the L3 column, or if it's all 0 or 8. This shouldn't involve research; it's a yes/no question that I'm sure someone following this thread can answer offhand. I'm not sure why you're being so prickly about this; you didn't even understand the question well enough the first time I asked it to produce a coherent response. Motherboard manufacturers wouldn't have a clue what sizes the L3 can come in, and the availability of chops is public information. Sorry for trying to be helpful. The answer is no at this time, as all eight versions are listed with 8 megs of L3 cache. I don't have any information outside of the release versions of the desktop cores.
|
# ? Aug 23, 2011 01:22 |
|
Any day now, right?
|
# ? Aug 30, 2011 14:23 |
|
Bob Morales posted:Any day now, right? They delayed until September so let's hope so. It could take a while for stock to be available.
|
# ? Aug 31, 2011 02:25 |
|
I updated the thread title to see if there's interest in creating other roadmap discussion threads. NVIDIA's a bit hazy right now, but there's some interesting stuff to talk about for Intel. If it's a terrible idea, feel free to let me know. I figure it's more appropriate than just calling it "The AMD megathread" since that's sorta what it's turned into.
|
# ? Aug 31, 2011 02:28 |
|
I bought NVIDIA and AMD stock during the crash two weeks ago, let's see how this pans out.
|
# ? Aug 31, 2011 12:36 |
|
Devian666 posted:They delayed until September so let's hope so. It could take a while for stock to be available. Welp! http://www.xbitlabs.com/news/cpu/display/20110831102650_AMD_s_Highly_Anticipated_Bulldozer_Chips_Might_Face_Further_Delay.html
|
# ? Aug 31, 2011 17:44 |
|
Longinus00 posted:Welp! So it looks like the good part of their yield is going into the server market. AMD will most likely build up high stock levels of "low end" CPUs from the defective chips. A shame they won't be able to sell those until some time after the Bulldozer release.
|
# ? Aug 31, 2011 20:15 |
|
Gigabyte just revealed some upcoming models and specs for Bulldozer chips, apparently: http://www.gigabyte.com/support-downloads/cpu-support-popup.aspx?pid=3880 Top end is the FX-8150, with 8 cores, 3.6GHz, and 125W TDP.
|
# ? Sep 2, 2011 09:43 |
|
Killer robot posted:Gigabyte just revealed some upcoming models and specs for Bulldozer chips, apparently: That page says there are two kinds of FX-8120, the only difference being the TDP. That seems a bit weird.
|
# ? Sep 3, 2011 01:31 |