fishmech
Jul 16, 2006


Red_Mage posted:

I was under the impression Microsoft was going to handle porting .NET and all relevant libraries to ARM. Like it might be an issue if you are writing using Mono GTK, or in something like IronRuby, but in theory C#/Visual Basic apps using whatever the rumored WPF replacement is should run fine.

In theory a lot of things are true. In practice...

But seriously, there are other .NET platforms already - most notably .NET for Windows on Itanium.


Alereon
Feb 6, 2004

And now back to our regularly scheduled Bulldozerchat:

XbitLabs is reporting that Bulldozer-based Opterons will have user-configurable TDPs with 1W granularity. This means AMD will no longer be shipping SE/HE/EE versions of its CPUs, instead you just buy the speed grade you want and set the TDP in the BIOS to exactly what you want it to be.

Alereon
Feb 6, 2004

Llano reviews are officially out:

Anandtech's A8-3850 Desktop Review
Anandtech's A8-3850 HTPC Review
Anandtech's Desktop Llano Overclocking and ASRock A75 Extreme6 Review

Turbo Core: Only available on the 65W TDP processors, and it adds up to 300MHz to the CPU clock speed only. Disappointing.

Overclocking: Overclocking is of limited usefulness because the chip still tries to stay within the same power envelope, so overclocking the CPU underclocks the GPU. Overclocking the memory is highly beneficial: it overclocks the CPU cores as well, and while that eats into the power available to the GPU, the improved memory bandwidth more than makes up for it. For an unoverclocked system, you definitely want DDR3-1866 if possible, or at least DDR3-1600.
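
For a rough sense of why memory speed matters so much here, a back-of-the-envelope calculation of theoretical dual-channel DDR3 bandwidth (a sketch only; real-world sustained bandwidth is lower):

[code]
#include <stdio.h>

/* Theoretical peak for dual-channel DDR3:
 * transfers/sec x 8 bytes per channel x 2 channels. */
static double ddr3_peak_gbs(double mts) {
    return mts * 1e6 * 8.0 * 2.0 / 1e9;
}

int main(void) {
    double bw1600 = ddr3_peak_gbs(1600); /* ~25.6 GB/s */
    double bw1866 = ddr3_peak_gbs(1866); /* ~29.9 GB/s */
    printf("DDR3-1600 dual channel: %.1f GB/s\n", bw1600);
    printf("DDR3-1866 dual channel: %.1f GB/s\n", bw1866);
    printf("Extra peak bandwidth:   %.0f%%\n", (bw1866 / bw1600 - 1.0) * 100.0);
    return 0;
}
[/code]

That extra ~17% of peak bandwidth is shared memory the IGP gets to use, which is why the faster DIMMs show up so clearly in the graphics benchmarks.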

WhyteRyce
Dec 30, 2001

Alereon posted:

For an unoverclocked system, you definitely want DDR3-1866 if possible, or at least DDR3-1600.

I think the concept of buying more expensive memory for a budget build just to get better graphics performance is a little silly. But spending an extra $35 isn't too bad.

WhyteRyce fucked around with this message at 19:02 on Jun 30, 2011

Sinestro
Oct 31, 2010

:smith: Wow, I hope this isn't a trend.

Agreed
Dec 30, 2003


Sinestro posted:

:smith: Wow, I hope this isn't a trend.

Not sure how it could be interpreted as anything but a trend - a continuing one, not a new one - they've lost on performance every generation since the Athlon 64, haven't they? Intel caught up with the architecture switch in the Pentium M and has blasted past them on performance ever since.

Un-l337-Pork
Sep 9, 2001



AMD has done everything but come straight out and say that they aren't interested in an arms race with Intel. The Fusion APU is where they are going to be successful (if they are at all), not Bulldozer.

Bob Morales
Aug 18, 2006



If they can offer 80% of the performance for 30% of the price (like the X6 compared to the i7) then that's good enough

Factory Factory
Mar 19, 2010


Bob Morales posted:

If they can offer 80% of the performance for 30% of the price (like the X6 compared to the i7) then that's good enough

Intel's been doing that, too. The $1000 hexacore Nehalem i7 is matched by the quad-core Sandy Bridge i7 in most tasks (and that's just at stock clocks), and the Sandy Bridge part costs $300. Granted, the top-end Phenom II X6 is down to ~$200 now, but still. Intel has seriously closed the price gap, and its chips are monstrous performers.
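
For what it's worth, here's a throwaway performance-per-dollar sketch of that argument. The relative-performance numbers are assumptions for illustration only (the 0.8 is just the "80% of the performance" guess from earlier, and the 1.1 for the 990X is a guess too), not benchmark results:

[code]
#include <stdio.h>

struct chip { const char *name; double rel_perf; double price_usd; };

int main(void) {
    /* rel_perf values are illustrative assumptions, not measurements */
    struct chip chips[] = {
        { "Phenom II X6 1100T", 0.80,  200.0 },
        { "Core i7 2600K",      1.00,  300.0 },
        { "Core i7 990X",       1.10, 1000.0 },
    };
    for (int i = 0; i < 3; i++)
        printf("%-20s %.2f relative perf per $100\n",
               chips[i].name, chips[i].rel_perf / chips[i].price_usd * 100.0);
    return 0;
}
[/code]

Even with a generous number for the 990X, it's the $200-$300 parts that win on value under these assumptions, which is the point both posts are making.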

Bob Morales
Aug 18, 2006



Factory Factory posted:

Intel's been doing that, too. The $1000 hexacore Nehalem i7 is matched by the quad-core Sandy Bridge i7 in most tasks (and that's just at stock clocks), and the Sandy Bridge part costs $300. Granted, the top-end Phenom II X6 is down to ~$200 now, but still. Intel has seriously closed the price gap, and its chips are monstrous performers.

Right, I was thinking of threaded stuff like Cinebench and compiling, where the X6 still seems to be faster than the 4-core i5. Other than that, there aren't many reasons to buy an X6.

Longinus00
Dec 29, 2005
How is GPGPU for AMD parts in Linux? If this more or less requires the official AMD drivers to actually use the Fusion part of the CPU, I'm not going to be interested at all.

Alereon
Feb 6, 2004


Longinus00 posted:

How is GPGPU for AMD parts in Linux? If this more or less requires the official AMD drivers to actually use the Fusion part of the CPU, I'm not going to be interested at all.
Phoronix has lots of tests of AMD hardware in Linux. Here's their review of AMD Fusion with open source drivers, though that was Brazos and not Llano. Why don't you want to use the Catalyst drivers?

Devian666
Aug 19, 2008

I haven't had any issues running computational stuff using the Linux Catalyst drivers. Is there a specific application for which you need other drivers?

Longinus00
Dec 29, 2005

Alereon posted:

Phoronix has lots of tests of AMD hardware in Linux. Here's their review of AMD Fusion with open source drivers, though that was Brazos and not Llano. Why don't you want to use the Catalyst drivers?

That's just testing graphics performance; I see nothing there about compute. If I'm going to have to run Catalyst, I'll just reboot into Windows, where it's faster anyway.

Devian666 posted:

I haven't had any issues running computational stuff using the Linux Catalyst drivers. Is there a specific application for which you need other drivers?

No, I was just looking at playing around with it. What is it currently being programmed in?

Devian666
Aug 19, 2008


Longinus00 posted:

That's just testing graphics performance; I see nothing there about compute. If I'm going to have to run Catalyst, I'll just reboot into Windows, where it's faster anyway.


No, I was just looking at playing around with it. What is it currently being programmed in?

The GPGPU stuff is in OpenCL. The stream processors are being included in all new APUs and CPUs, so the code will be portable from the GPUs. Though for support you'll probably need the Catalyst drivers initially to make use of OpenCL on them.

I'm still learning some of this stuff so I might be slightly out with some details.
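
If you just want to check whether the Llano GPU actually shows up as an OpenCL compute device under whatever driver stack you've got (Catalyst or otherwise), a minimal device query like this will tell you. Untested sketch; it assumes an OpenCL ICD is installed and you link with -lOpenCL:

[code]
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
        printf("No OpenCL platforms found.\n");
        return 1;
    }
    if (nplat > 8) nplat = 8;

    for (cl_uint p = 0; p < nplat; p++) {
        char pname[256] = "";
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof pname, pname, NULL);
        printf("Platform: %s\n", pname);

        cl_device_id devices[8];
        cl_uint ndev = 0;
        /* The Llano IGP should appear as a GPU-type device if the driver
         * exposes OpenCL at all; CPUs show up as CPU-type devices. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &ndev) != CL_SUCCESS)
            continue;
        if (ndev > 8) ndev = 8;
        for (cl_uint d = 0; d < ndev; d++) {
            char dname[256] = "";
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof dname, dname, NULL);
            printf("  Device: %s\n", dname);
        }
    }
    return 0;
}
[/code]

If no GPU device is listed, your OpenCL code will only ever run on the CPU, whatever the graphics side of the driver can do.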

Alereon
Feb 6, 2004

[H]ardOCP has an article on E3 rumors about next generation console hardware. The really interesting thing here is that Sony may be using Bulldozer as the Playstation 4's CPU. Nintendo is switching to an IBM quad-core CPU, and Microsoft may be switching to a next-generation Cell for the next Xbox. Only the news that the PS4 might use Bulldozer is shocking: the Wii used IBM PowerPC cores, the PS3's Cell used IBM PowerPC cores plus the SPE vector co-processors, and the 360 used the same IBM PowerPC cores from the Cell minus the SPEs. I would certainly expect upcoming consoles to continue using IBM PowerPC CPU cores, though the continued usefulness of the SPEs is questionable now that we have GPUs with incredible, accessible compute performance.

On the GPU front, AMD will be powering all three next-generation consoles. The 360 and Wii are both powered by AMD GPUs already, and it makes a lot of sense that Sony would switch to AMD given the direction nVidia seems to be going with GPUs. It's possible that Sony could be using a Bulldozer-based APU in the PS4; I don't think this would necessarily provide the graphics horsepower that you'd want, but then again it makes sense in the context of Sony saying they wanted to make the PS4 a less expensive console with less investment in hardware development.

Agreed
Dec 30, 2003


That's good news. AMD'll hold on strong with that much console support.

Killer robot
Sep 6, 2010


Agreed posted:

That's good news. AMD'll hold on strong with that much console support.

Whatever's going on with their CPU designs themselves, buying ATI has paid off for them in a lot of ways.

wicka
Jun 28, 2007

Alereon posted:

it makes a lot of sense that Sony would switch to AMD given the direction nVidia seems to be going with GPUs.

What does that mean, exactly?

Alereon
Feb 6, 2004


wicka posted:

What does that mean, exactly?
nVidia's GPUs are overly compute-focused for the console market, and their power efficiency is significantly worse. This article from SemiAccurate about the development of nVidia's upcoming Kepler GPU has a bit more information, especially about the manufacturing and engineering challenges nVidia is facing on TSMC's 28nm process that AMD seems to have surpassed. Basically AMD can deliver a graphics solution that's smaller, cooler, and less expensive for the same amount of performance, and they seem to be in a better position to actually deliver upcoming products sooner (remember how late the Geforce 400-series was). I'm mostly comparing the Radeon HD 6870 to the Geforce GTX 560 Ti here since I don't think anyone's putting a GTX 580/R6970 in a console, but I think the same relative proportions are likely to apply to the next generation of mid-range parts.

PC LOAD LETTER
May 23, 2005

Alereon posted:

and the 360 used the same IBM PowerPC cores from the Cell minus the SPEs.
[nitpick]Xenon also used a customized vector FPU that the version in Cell didn't have, and the cache structure was different: more like what you'd see in a PC CPU, and more complex than Cell's local stores.[/nitpick]

It would be pretty strange and amazing if MS uses a revamped Cell and Sony goes with a BD variant after pimping Cell as the end-all be-all of the future back when they released the PS3. It's almost a total reversal in each company's design ideology.

Alereon posted:

On the GPU front, AMD will be powering all three next-generation consoles.
:monocle: Wow, big win for them. I really hope BD is better than it's rumored to be, but it looks like either way AMD may end up financially sound for the future, if for no other reason than that all the consoles will be using their hardware or at least licensing their designs. I guess AMD really nailed the perfect balance between GPGPU support, graphics performance, power usage, and die size (cost).

edit: I know and I don't think I said otherwise, but Sony sure thought it would.\/\/\/\/\/\/\/

PC LOAD LETTER fucked around with this message at 02:05 on Jul 8, 2011

Bob Morales
Aug 18, 2006



Cell never took over the world...

wicka
Jun 28, 2007

PC LOAD LETTER posted:

It would be pretty strange and amazing if MS uses a revamped Cell and Sony goes with a BD variant after pimping Cell as the end-all be-all of the future back when they released the PS3. It's almost a total reversal in each company's design ideology.

It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.

PC LOAD LETTER
May 23, 2005

wicka posted:

It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.

They dumped a heap of cash into it as well though IIRC. They seemed pretty serious about pushing it for a while but after a couple of years nothing really came of it outside of the PS3.

Alereon
Feb 6, 2004


wicka posted:

It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.
To be fair, the Cell was an interesting solution to the problem that was solved by the next generation of GPUs that came out right after the PS3. The Cell really isn't a bad idea inherently; it just seems like the GPU (Geforce 7950 GT) is so weak that no one ever had much use for the capabilities, especially given the programming effort needed to unlock them. Sony tried to offer the Cell for embedded applications where you need a lot of performance, like TVs and video boxes, but it turns out an SoC with an ARM CPU core and some dedicated video decode hardware is a much better, cheaper, and more efficient solution. There just aren't that many applications that require a lot of processing power, don't have available ASICs, and won't/can't be run on x86.

Gucci Loafers
May 20, 2006



Alereon, where the hell do you get all this information?

freeforumuser
Aug 11, 2007

wicka posted:

It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.

At least they won with Blu-ray! In an era when people are moving away from optical media.

Devian666
Aug 19, 2008


Tab8715 posted:

Alereon, where the hell do you get all this information?

If you follow all the hardware news closely you hear about a lot of this. The philosophy of the PS3 was discussed a lot around the time it was released. That all dates back a long time now. The use of SPEs seemed like a good idea at the time, but between yields and an SPE being dedicated to the PS3 OS, it didn't end up delivering what was promised.

Cell processors weren't the next big thing, but all three consoles use processors based on the same IBM PowerPC technology, and at least two will continue to use it in their next consoles.

If AMD can deliver a CPU that can handle the PS4's encryption without being a bottleneck, they may win the contract.

JustAnother Fat Guy
Dec 22, 2009

It's kinda nice to watch AMD/ATI steam ahead in the graphics market after years of being condemned as untermenschen, and maybe with Bulldozer they might close the gap a bit with Intel.

Riso
Oct 11, 2008


JustAnother Fat Guy posted:

It's kinda nice to watch AMD/ATI steam ahead in the graphics market after years of being condemned as untermenschen, and maybe with Bulldozer they might close the gap a bit with Intel.

I hope they do because I want to buy a new computer, and don't want to go for the K10s or Intel.

Ryokurin
Jul 14, 2001


Alereon posted:

nVidia's GPUs are overly compute-focused for the console market, and their power efficiency is significantly worse. This article from SemiAccurate about the development of nVidia's upcoming Kepler GPU has a bit more information, especially about the manufacturing and engineering challenges nVidia is facing on TSMC's 28nm process that AMD seems to have surpassed. Basically AMD can deliver a graphics solution that's smaller, cooler, and less expensive for the same amount of performance, and they seem to be in a better position to actually deliver upcoming products sooner (remember how late the Geforce 400-series was). I'm mostly comparing the Radeon HD 6870 to the Geforce GTX 560 Ti here since I don't think anyone's putting a GTX 580/R6970 in a console, but I think the same relative proportions are likely to apply to the next generation of mid-range parts.

It will be interesting to see which generation of GPU they use: VLIW4, or the vector + scalar design that Southern Islands is alleged to be based on. VLIW5 and VLIW4 were much more efficient than Nvidia's designs, but were also pretty hard to fully utilize, which explains why they never really walked all over anything Nvidia put out. Vector + scalar is closer to what Nvidia uses, but with some improvements. I doubt it will run hotter than what Nvidia will produce, but we also don't know yet whether the improvements they've made will actually be seen in the real world.
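
To see why a wide VLIW design is hard to keep fed, here's a toy greedy scheduler that packs a list of operations into 4-wide bundles. It's purely illustrative (not AMD's actual compiler or ISA); the point is just that independent work packs tightly while a dependent chain wastes three of every four slots:

[code]
#include <stdio.h>

/* Toy greedy scheduler packing 8 operations into 4-wide VLIW bundles.
 * Each op lists the index of the op it depends on (-1 = no dependency);
 * an op can only issue once its dependency completed in an earlier bundle. */
#define NOPS  8
#define WIDTH 4

static int count_bundles(const int dep[NOPS]) {
    int done[NOPS] = {0};
    int remaining = NOPS, bundles = 0;
    while (remaining > 0) {
        int issue[NOPS] = {0};
        int issued = 0;
        for (int i = 0; i < NOPS && issued < WIDTH; i++)
            if (!done[i] && (dep[i] < 0 || done[dep[i]])) {
                issue[i] = 1;
                issued++;
            }
        for (int i = 0; i < NOPS; i++)
            if (issue[i]) { done[i] = 1; remaining--; }
        bundles++;
    }
    return bundles;
}

int main(void) {
    /* 8 independent ops (e.g. two float4 multiply-adds): packs tightly */
    const int independent[NOPS] = { -1, -1, -1, -1, -1, -1, -1, -1 };
    /* 8 ops forming a serial chain: only one useful slot per bundle */
    const int chain[NOPS]       = { -1, 0, 1, 2, 3, 4, 5, 6 };

    printf("independent ops: %d bundles for %d ops\n", count_bundles(independent), NOPS);
    printf("serial chain:    %d bundles for %d ops\n", count_bundles(chain), NOPS);
    return 0;
}
[/code]

Shader code tends to look like the first case and a lot of general-purpose compute like the second, which is roughly why VLIW was efficient for graphics but hard to fully utilize for everything else.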

Alereon
Feb 6, 2004


Ryokurin posted:

It will be interesting to see which generation of GPU they use: VLIW4, or the vector + scalar design that Southern Islands is alleged to be based on. VLIW5 and VLIW4 were much more efficient than Nvidia's designs, but were also pretty hard to fully utilize, which explains why they never really walked all over anything Nvidia put out. Vector + scalar is closer to what Nvidia uses, but with some improvements. I doubt it will run hotter than what Nvidia will produce, but we also don't know yet whether the improvements they've made will actually be seen in the real world.
Here's the Anandtech article about Graphics Core Next (GCN) for those who haven't seen it. I'm pretty sure that GCN is still pretty far out, more likely a target for the Radeon HD 8000 or even 9000 series. I think it's confirmed that Southern Islands, the Radeon HD 7000-series due for release in a couple of months, will be VLIW4-based. Bulldozer APUs are definitely VLIW4-based. I think AMD is likely to maintain their power usage lead for a while, since it's more of a philosophy of aiming for a lower target, and AMD has a significant technology lead in memory controllers (one of the reasons the R6870 has such good efficiency).

wicka
Jun 28, 2007
So is there any more news about potential Bulldozer performance/release date?

Star War Sex Parrot
Oct 2, 2003

wicka posted:

So is there any more news about potential Bulldozer performance/release date?
Last I heard, late August for desktop Bulldozer parts. Server chips should be out any day now.

wicka
Jun 28, 2007

Star War Sex Parrot posted:

Last I heard, late August for desktop Bulldozer parts. Server chips should be out any day now.

Whoa, that's closer than I expected. I might actually have the patience to wait and see if it's something I'd be willing to buy. Thanks.

Devian666
Aug 19, 2008


Star War Sex Parrot posted:

Last I heard, late August for desktop Bulldozer parts. Server chips should be out any day now.

Last I heard Bulldozer was September, but I won't be complaining if it turns up earlier. This is still one of the more interesting CPU developments in a long time.

tijag
Aug 6, 2002

Alereon posted:

Here's the Anandtech article about Graphics Core Next (GCN) for those who haven't seen it. I'm pretty sure that GCN is still pretty far out, more likely a target for the Radeon HD 8000 or even 9000 series. I think it's confirmed that Southern Islands, the Radeon HD 7000-series due for release in a couple of months, will be VLIW4-based. Bulldozer APUs are definitely VLIW4-based. I think AMD is likely to maintain their power usage lead for a while, since it's more of a philosophy of aiming for a lower target, and AMD has a significant technology lead in memory controllers (one of the reasons the R6870 has such good efficiency).

I do not believe this is accurate.

Right now, based on what we can see in the drivers [check the Beyond3D forums], there will likely be a low-end VLIW4 card, to be used in hybrid CrossFire with Trinity [the BD APU], with the other designs all being GCN-based.

PC LOAD LETTER
May 23, 2005
Supposedly GCN is big and hot compared to the VLIW5/4 5xxx/6xxx GPUs, so it won't be put on an APU until another die shrink or two. No one is really sure what the 7xxx will be exactly yet, but it could very well be VLIW4's last hurrah before GCN shows up in the 8xxx cards.

PC LOAD LETTER fucked around with this message at 06:11 on Jul 11, 2011

Peechka
Nov 10, 2005
AMD's Bulldozer-based FX-8130P benchmarked early
By Jose Vilches, TechSpot.com
Published: July 11, 2011, 9:00 AM EST

Last month at the E3 conference in Los Angeles, AMD officially reintroduced the FX brand for its top-performing processors aimed at PC enthusiasts and gaming aficionados. Although no actual products were launched, we already have a pretty good idea of the initial lineup, and now Turkish website DonanimHaber is offering a glimpse at the performance we can look forward to.

The site managed to get their hands on an engineering sample of AMD's forthcoming FX-8130P and ran it through a range of tests. The 8-core chip features 3.2GHz clock speeds, 2MB of L2 cache per pair of cores (8MB in total), and 8MB of L3 cache shared between all modules. The motherboard used was a Gigabyte 990FXA-UD5, which was paired with a GeForce GTX 580.

Bulldozer scores P6265 in the 3DMark 11 benchmark, 3045 in PCMark 7, 24434 in Cinebench R10, and manages 136 and 45 frames per second in x264 encoding tests for Pass 1 and Pass 2, respectively. In addition, it took 19.5 seconds to complete SuperPi 1M. Unfortunately there are no Core i7 2600K scores to compare with -- and the benchmark programs used differ from our usual range of tests -- but VR-Zone claims typical scores for Intel's top Sandy Bridge part are lower in all tests except SuperPi 1M, where it is significantly faster.

Compared to the Thuban-based Phenom II X6 1100T, Bulldozer should end up about 50% faster, while overall it slots right in between the Sandy Bridge Core i7 2600K and Gulftown-based Core i7 990X in terms of performance.

Of course scores will vary from platform to platform so we'll reserve judgment until we can put Bulldozer to the test ourselves. If these early comparisons hold up, though, AMD could finally have an answer to Intel on the high-end. The rumored $320 price tag suggests that will be the case considering Intel's Core i7 2600K costs roughly the same.
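
As a quick sanity check on what those x264 figures would mean in practice, here's the arithmetic for a hypothetical 30,000-frame clip (roughly 21 minutes at 23.976 fps; the clip length is a made-up example, only the 136/45 fps numbers come from the article):

[code]
#include <stdio.h>

int main(void) {
    const double frames    = 30000.0; /* hypothetical clip length */
    const double pass1_fps = 136.0;   /* from the leaked benchmark */
    const double pass2_fps = 45.0;

    double pass1_s = frames / pass1_fps; /* ~221 s */
    double pass2_s = frames / pass2_fps; /* ~667 s */
    printf("Pass 1: %.0f s, Pass 2: %.0f s, total: %.1f min\n",
           pass1_s, pass2_s, (pass1_s + pass2_s) / 60.0); /* ~14.8 min */
    return 0;
}
[/code]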


freeforumuser
Aug 11, 2007

Peechka posted:

AMD's Bulldozer-based FX-8130P benchmarked early
By Jose Vilches, TechSpot.com
Published: July 11, 2011, 9:00 AM EST

As much as I want AMD to succeed, I would take any unofficial benchmarks with a grain of salt for now.
