|
Red_Mage posted:I was under the impression Microsoft was going to handle porting .NET and all relevant libraries to ARM. Like it might be an issue if you are writing using Mono GTK, or in like IronRuby, but in theory C#/Visual Basic apps using whatever the rumored WPF replacement is should run fine. In theory a lot of things are true. In practice... But seriously, there are other .NET platforms already - most notably .NET for Windows on Itanium.
|
# ? Jun 28, 2011 08:37 |
|
And now back to our regularly scheduled Bulldozerchat: XbitLabs is reporting that Bulldozer-based Opterons will have user-configurable TDPs with 1W granularity. This means AMD will no longer be shipping SE/HE/EE versions of its CPUs, instead you just buy the speed grade you want and set the TDP in the BIOS to exactly what you want it to be.
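Just to put the configurable-TDP idea in perspective, here's a toy sketch of what dialing the power limit does to sustainable clocks. The numbers and the cubic power model (frequency times voltage squared, with voltage roughly tracking frequency) are illustrative assumptions, not anything AMD has published; the real mechanism lives in firmware, not code like this.

```python
# Toy illustration of a user-configurable TDP: given a power limit, estimate
# the highest clock a simple cubic power model can sustain. The base clock
# and base power are made-up example numbers, not real Opteron specs.

def max_clock_ghz(tdp_watts, base_clock=2.0, base_power=60.0):
    """Clock sustainable at tdp_watts, assuming power scales ~ clock^3."""
    return base_clock * (tdp_watts / base_power) ** (1 / 3)

# 1W granularity means any integer watt value in the supported range works.
for tdp in (40, 60, 80):
    print(f"{tdp} W -> {max_clock_ghz(tdp):.2f} GHz")
```

The point of the sketch: small TDP changes move the clock ceiling smoothly, which is why per-watt configurability can replace discrete SE/HE/EE bins.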
|
# ? Jun 29, 2011 21:01 |
|
Llano reviews are officially out: Anandtech's A8-3850 Desktop Review Anandtech's A8-3850 HTPC Review Anandtech's Desktop Llano Overclocking and ASRock A75 Extreme6 Review Turbo Core: Only available on the 65W TDP processors, and adds up to 300MHz to the CPU clock speed only. Disappointing. Overclocking: Overclocking the CPU is of limited usefulness because the chip still tries to stay within the same power envelope, so overclocking the CPU underclocks the GPU. Overclocking the memory is highly beneficial: it overclocks the CPU cores as well, but the improved memory bandwidth more than makes up for the reduced power available to the GPU. For an unoverclocked system, you definitely want 1866MHz DDR3 if possible, or at least 1600MHz.
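The bandwidth sensitivity is easy to see on paper. A quick back-of-envelope sketch, assuming the usual dual-channel, 64-bit-per-channel DDR3 setup and ideal peak transfer rates (real sustained numbers are lower, but the relative gap between speed grades is what matters for the IGP):

```python
# Rough peak-bandwidth figures for dual-channel DDR3 feeding an IGP.
# Assumes two 64-bit (8-byte) channels and ideal transfer rates.

def ddr3_peak_bandwidth_gbs(transfer_rate_mt_s, channels=2, bytes_per_transfer=8):
    """Peak bandwidth in GB/s (decimal) for a given DDR3 speed grade."""
    return transfer_rate_mt_s * 1e6 * channels * bytes_per_transfer / 1e9

for grade in (1333, 1600, 1866):
    print(f"DDR3-{grade}: {ddr3_peak_bandwidth_gbs(grade):.1f} GB/s")
```

Going from DDR3-1333 to DDR3-1866 is roughly a 40% jump in peak bandwidth, which is why the faster memory matters so much for a bandwidth-starved on-die GPU.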
|
# ? Jun 30, 2011 17:57 |
|
Alereon posted:For an unoverclocked system, you definitely want 1866MHz DDR3 if possible, or at least 1600MHz. I think the concept of buying more expensive memory for a cheap, budget-oriented build just to get better graphics performance is a little silly. But spending an extra $35 isn't too bad. WhyteRyce fucked around with this message at 20:02 on Jun 30, 2011 |
# ? Jun 30, 2011 20:00 |
|
Wow, I hope this isn't a trend.
|
# ? Jun 30, 2011 20:12 |
|
Sinestro posted:Wow, I hope this isn't a trend. Not sure how it could be interpreted as anything but a trend - continuing, not starting - they've lost on performance every generation since the Athlon 64, haven't they? Intel caught up with the architecture switch to the Pentium M and has blasted past on performance ever since.
|
# ? Jun 30, 2011 21:26 |
|
AMD has done everything but come straight out and say that they aren't interested in an arms race with Intel. The Fusion APU is where they are going to be successful (if they are at all), not with Bulldozer.
|
# ? Jun 30, 2011 21:55 |
|
If they can offer 80% of the performance for 30% of the price (like the X6 compared to the i7), then that's good enough.
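A quick sanity check on that value argument, using the illustrative 80%/30% ratios from the post rather than real benchmark data:

```python
# Perf-per-dollar sanity check on the "80% of the performance for 30% of
# the price" argument. Ratios are illustrative, not measured numbers.

def relative_value(perf_fraction, price_fraction):
    """Performance-per-dollar relative to the faster, pricier chip."""
    return perf_fraction / price_fraction

print(f"{relative_value(0.80, 0.30):.1f}x the performance per dollar")
```

So at those ratios the cheaper chip delivers roughly 2.7 times the performance per dollar, which is the whole case for "good enough".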
|
# ? Jun 30, 2011 21:59 |
|
Bob Morales posted:If they can offer 80% of the performance for 30% of the price (like the X6 compared to the i7), then that's good enough. Intel's been doing that, too. The $1000 hexacore Nehalem i7 is matched by the quad-core Sandy Bridge i7 in most tasks (and that's just at stock clocks), and it costs $300. Granted, the top-end Phenom II X6 is down to ~$200 now, but still. Intel seriously closed the price gap, and its chips are monstrous performers.
|
# ? Jun 30, 2011 22:03 |
|
Factory Factory posted:Intel's been doing that, too. The $1000 hexacore Nehalem i7 is matched by the quad-core Sandy Bridge i7 in most tasks (and that's just at stock clocks), and it costs $300. Granted, the top-end Phenom II X6 is down to ~$200 now, but still. Intel seriously closed the price gap, and its chips are monstrous performers. Right, I was thinking of threaded stuff like Cinebench and compiling, which the X6 still seems to be faster at than the quad-core i5. Outside of that, there aren't many reasons to buy an X6.
|
# ? Jun 30, 2011 22:55 |
|
How is GPGPU for AMD parts in Linux? If this more or less requires the official AMD drivers to actually use the Fusion part of the CPU, I'm not going to be interested at all.
|
# ? Jul 1, 2011 00:56 |
|
Longinus00 posted:How is GPGPU for AMD parts in Linux? If this more or less requires the official AMD drivers to actually use the Fusion part of the CPU, I'm not going to be interested at all. Phoronix has lots of tests of AMD hardware in Linux. Here's their review of AMD Fusion with open source drivers, though that was Brazos and not Llano. Why don't you want to use the Catalyst drivers?
|
# ? Jul 1, 2011 02:27 |
|
I haven't had any issues running computational stuff with the Linux Catalyst drivers. Is there a specific application for which you need other drivers?
|
# ? Jul 1, 2011 04:58 |
|
Alereon posted:Phoronix has lots of tests of AMD hardware in Linux. Here's their review of AMD Fusion with open source drivers, though that was Brazos and not Llano. Why don't you want to use the Catalyst drivers? That's just testing graphics performance; I see nothing there about compute. If I'm going to have to run Catalyst I'll just reboot into Windows, where it's faster anyway. Devian666 posted:I haven't had any issues running computational stuff with the Linux Catalyst drivers. Is there a specific application for which you need other drivers? No, I was just looking at playing around with it. What is it currently being programmed in?
|
# ? Jul 1, 2011 08:46 |
|
Longinus00 posted:That's just testing graphics performance; I see nothing there about compute. If I'm going to have to run Catalyst I'll just reboot into Windows, where it's faster anyway. The GPGPU stuff is done in OpenCL. The stream processors are being included in all new APUs and CPUs, so the code will be portable from the GPUs. For support, though, you'll probably need the Catalyst drivers initially to make use of OpenCL on them. I'm still learning some of this stuff so I might be slightly out on some details.
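For anyone curious what the OpenCL side actually looks like, here's a minimal vector-add sketch. The kernel string is standard OpenCL C; the plain-Python function just mirrors what each work-item computes. The host-side setup (compiling the kernel and queueing it with bindings such as pyopencl) is deliberately omitted, since that's exactly the part that needs Catalyst or another OpenCL runtime installed.

```python
# Minimal OpenCL vector-add sketch. Each work-item adds one element.
# Launching the kernel for real requires an OpenCL runtime (e.g. via the
# Catalyst drivers) plus host bindings such as pyopencl, omitted here.

VECTOR_ADD_KERNEL = """
__kernel void vector_add(__global const float *a,
                         __global const float *b,
                         __global float *out)
{
    int i = get_global_id(0);   /* one work-item per element */
    out[i] = a[i] + b[i];
}
"""

def vector_add_reference(a, b):
    """What the kernel computes, expressed in plain Python for comparison."""
    return [x + y for x, y in zip(a, b)]

print(vector_add_reference([1.0, 2.0], [3.0, 4.0]))
```

The portability point above is exactly this: the kernel source doesn't care whether it runs on a discrete GPU or the stream processors in an APU, only the runtime underneath changes.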
|
# ? Jul 5, 2011 12:32 |
|
[H]ardOCP has an article on E3 rumors about next-generation console hardware. The really interesting thing here is that Sony may be using Bulldozer as the PlayStation 4's CPU. Nintendo is switching to an IBM quad-core CPU, and Microsoft may be switching to a next-generation Cell for the Xbox. Only the news that the PS4 might use Bulldozer is shocking: the Wii used IBM PowerPC cores, the PS3's Cell used IBM PowerPC cores plus the SPE vector co-processors, and the 360 used the same IBM PowerPC cores from the Cell minus the SPEs. I would certainly expect upcoming consoles to continue using IBM PowerPC CPU cores, though the continued usefulness of the SPEs is questionable now that we have GPUs with incredible, accessible compute performance. On the GPU front, AMD will be powering all three next-generation consoles. The 360 and Wii are both powered by AMD GPUs, and it makes a lot of sense that Sony would switch to AMD given the direction nVidia seems to be going with GPUs. It's possible that Sony could be using a Bulldozer-based APU in the PS4. I don't think this would necessarily provide the graphics horsepower that you'd want, but then again it makes sense in the context of Sony saying they wanted to make the PS4 a less expensive console with less investment in hardware development.
|
# ? Jul 7, 2011 20:30 |
|
That's good news. AMD'll hold on strong with that much console support.
|
# ? Jul 7, 2011 20:38 |
|
Agreed posted:That's good news. AMD'll hold on strong with that much console support. Whatever's going on with their CPU designs themselves, buying ATI has paid off for them in a lot of ways.
|
# ? Jul 7, 2011 20:45 |
|
Alereon posted:it makes a lot of sense that Sony would switch to AMD given the direction nVidia seems to be going with GPUs. What does that mean, exactly?
|
# ? Jul 8, 2011 01:01 |
|
wicka posted:What does that mean, exactly? nVidia's GPUs are overly compute-focused for the console market, and their power efficiency is significantly worse. This article from SemiAccurate about the development of nVidia's upcoming Kepler GPU has a bit more information, especially about the manufacturing and engineering challenges nVidia is facing on TSMC's 28nm process that AMD seems to have surpassed. Basically AMD can deliver a graphics solution that's smaller, cooler, and less expensive for the same amount of performance, and they seem to be in a better position to actually deliver upcoming products sooner (remember how late the Geforce 400-series was). I'm mostly comparing the Radeon HD 6870 to the Geforce GTX 560 Ti here since I don't think anyone's putting a GTX 580/R6970 in a console, but I think the same relative proportions are likely to apply to the next generation of mid-range parts.
|
# ? Jul 8, 2011 01:58 |
|
Alereon posted:and the 360 used the same IBM PowerPC cores from the Cell minus the SPEs. It would be pretty strange and amazing if MS uses a revamped Cell and Sony goes with a BD variant after pimping Cell as the end-all be-all of the future back when they released the PS3. It's almost a total reversal of each company's design ideology. Alereon posted:On the GPU front, AMD will be powering all three next-generation consoles. edit: I know and I don't think I said otherwise, but Sony sure thought it would. \/\/\/\/\/\/\/ PC LOAD LETTER fucked around with this message at 03:05 on Jul 8, 2011 |
# ? Jul 8, 2011 02:40 |
|
Cell never took over the world...
|
# ? Jul 8, 2011 02:46 |
|
PC LOAD LETTER posted:It would be pretty strange and amazing if MS uses a revamped Cell and Sony goes with a BD variant after pimping Cell as the end all be all of the future back when they released the PS3. Its almost a total reversal in each company's design ideology. It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.
|
# ? Jul 8, 2011 02:48 |
|
wicka posted:It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing. They dumped a heap of cash into it as well though IIRC. They seemed pretty serious about pushing it for a while but after a couple of years nothing really came of it outside of the PS3.
|
# ? Jul 8, 2011 03:06 |
|
wicka posted:It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.
|
# ? Jul 8, 2011 03:15 |
|
Alereon, where the hell do you get all this information?
|
# ? Jul 8, 2011 04:58 |
|
wicka posted:It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing. At least they won with Bluray! In an era where people are moving away from optical media.
|
# ? Jul 8, 2011 05:59 |
|
Tab8715 posted:Alereon, where the hell do you get all this information? If you follow all the hardware news closely you hear about a lot of this. The philosophy of the PS3 was discussed a lot around the time it was released; that all dates back a long time now. The use of SPEs seemed like a good idea at the time, but between yields and an SPE dedicated to the PS3 OS, it didn't end up like what was promised. Cell processors weren't the next big thing, but all three consoles use processors based on the technology, and at least two will continue to use the technology in their next consoles. If AMD can deliver a CPU that can handle the PS4's encryption without being a bottleneck, they may win the contract.
|
# ? Jul 8, 2011 07:53 |
|
It's kinda nice to watch AMD/ATI steam ahead in the graphics market after years of being condemned as untermenschen, and maybe with Bulldozer they might close the gap a bit with Intel.
|
# ? Jul 8, 2011 10:55 |
|
JustAnother Fat Guy posted:It's kinda nice to watch AMD/ATI steam ahead in the graphics market after years of being condemned as untermenschen, and maybe with Bulldozer they might close the gap a bit with Intel. I hope they do, because I want to buy a new computer and don't want to go for the K10s or Intel.
|
# ? Jul 8, 2011 11:49 |
|
Alereon posted:nVidia's GPUs are overly compute-focused for the console market, and their power efficiency is significantly worse. This article from SemiAccurate about the development of nVidia's upcoming Kepler GPU has a bit more information, especially about the manufacturing and engineering challenges nVidia is facing on TSMC's 28nm process that AMD seems to have surpassed. Basically AMD can deliver a graphics solution that's smaller, cooler, and less expensive for the same amount of performance, and they seem to be in a better position to actually deliver upcoming products sooner (remember how late the Geforce 400-series was). I'm mostly comparing the Radeon HD 6870 to the Geforce GTX 560 Ti here since I don't think anyone's putting a GTX 580/R6970 in a console, but I think the same relative proportions are likely to apply to the next generation of mid-range parts. It will be interesting to see which generation of GPU they use: VLIW4, or the vector + scalar design that Southern Islands is alleged to be based off of. VLIW5 and VLIW4 were much more efficient than Nvidia's designs, but also pretty hard to fully utilize, which explains why they never really walked all over anything Nvidia put out. Vector + scalar is closer to what Nvidia uses, but with some improvements. I doubt it will be hotter than what Nvidia will produce, but we also don't know yet whether the improvements they have made will actually be seen in the real world.
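The "hard to fully utilize" point can be sketched with a toy occupancy model: a VLIW4 instruction packs up to four independent ops per issue cycle, so any cycle where the compiler can't find four wastes slots, while a scalar design issues one op per lane and stays full. The per-cycle counts below are made-up illustrative numbers, not a real shader trace.

```python
# Toy model of VLIW4 slot utilization. Each VLIW instruction bundles up to
# `width` independent ops; cycles with fewer independent ops waste slots.
# Purely illustrative - not a model of any real shader workload.

def vliw_utilization(independent_ops_per_cycle, width=4):
    """Fraction of issued VLIW slots doing useful work, given how many
    independent ops the compiler found in each issue cycle."""
    used = sum(min(n, width) for n in independent_ops_per_cycle)
    issued_slots = width * len(independent_ops_per_cycle)
    return used / issued_slots

# Dependency-chain-heavy code: mostly 1-2 independent ops per cycle.
print(vliw_utilization([1, 2, 1, 4, 2, 1]))
```

With mostly serial code the packed slots sit well under half full, which is why VLIW's on-paper efficiency didn't translate into walking all over Nvidia, and why a vector + scalar design trades peak density for utilization.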
|
# ? Jul 8, 2011 12:21 |
|
Ryokurin posted:It will be interesting to see which generation of GPU they use: VLIW4, or the vector + scalar design that Southern Islands is alleged to be based off of. VLIW5 and VLIW4 were much more efficient than Nvidia's designs, but also pretty hard to fully utilize, which explains why they never really walked all over anything Nvidia put out. Vector + scalar is closer to what Nvidia uses, but with some improvements. I doubt it will be hotter than what Nvidia will produce, but we also don't know yet whether the improvements they have made will actually be seen in the real world. Here's the Anandtech article about Graphics Core Next (GCN) for those who haven't seen it. I'm pretty sure that GCN is still pretty far out, more likely a target for the Radeon HD 8000 or even 9000 series. I think it's confirmed that Southern Islands, the Radeon HD 7000-series due for release in a couple months, will be VLIW4-based. Bulldozer APUs are definitely VLIW4-based. I think AMD is likely to maintain their power usage lead for awhile, since it's more of a philosophy of aiming for a lower target, and AMD has a significant technology lead in memory controllers (one of the reasons the R6870 has such good efficiency).
|
# ? Jul 8, 2011 16:50 |
|
So is there any more news about potential Bulldozer performance/release date?
|
# ? Jul 9, 2011 03:20 |
|
wicka posted:So is there any more news about potential Bulldozer performance/release date? Last I heard, late August for desktop Bulldozer parts. Server chips should be out any day now.
|
# ? Jul 9, 2011 03:25 |
|
Star War Sex Parrot posted:Last I heard, late August for desktop Bulldozer parts. Server chips should be out any day now. Whoa, that's closer than I expected. I might actually have the patience to wait and see if it's something I'd be willing to buy. Thanks.
|
# ? Jul 9, 2011 03:31 |
|
Star War Sex Parrot posted:Last I heard, late August for desktop Bulldozer parts. Server chips should be out any day now. Last I heard Bulldozer was September, but I won't be complaining if they turn up earlier. This is still one of the more interesting CPU developments in a long time.
|
# ? Jul 11, 2011 00:59 |
|
Alereon posted:Here's the Anandtech article about Graphics Core Next (GCN) for those who haven't seen it. I'm pretty sure that GCN is still pretty far out, more likely a target for the Radeon HD 8000 or even 9000 series. I think it's confirmed that Southern Islands, the Radeon HD 7000-series due for release in a couple months, will be VLIW4-based. Bulldozer APUs are definitely VLIW4-based. I think AMD is likely to maintain their power usage lead for awhile, since it's more of a philosophy of aiming for a lower target, and AMD has a significant technology lead in memory controllers (one of the reasons the R6870 has such good efficiency). I do not believe this is accurate. Right now, based on what we can see in the drivers [check the Beyond3D forums], there will likely be a low-end VLIW4 card, to be used in hybrid CF with Trinity [the BD APU], with the other designs all being GCN-based.
|
# ? Jul 11, 2011 05:31 |
|
Supposedly GCN is big and hot compared to the VLIW5/4 5xxx/6xxx GPUs, so it won't be put on an APU until another die shrink or two. No one is really sure exactly what the 7xxx will be yet, but it could very well be VLIW4's last hurrah before GCN shows up in the 8xxx cards.
PC LOAD LETTER fucked around with this message at 07:11 on Jul 11, 2011 |
# ? Jul 11, 2011 07:09 |
|
AMD's Bulldozer-based FX-8130P benchmarked early
By Jose Vilches, TechSpot.com
Published: July 11, 2011, 9:00 AM EST

Last month at the E3 conference in Los Angeles AMD officially reintroduced the FX brand for their top-performing processors aimed at PC enthusiasts and gaming aficionados. Although no actual products were launched, we already have a pretty good idea of the initial lineup, and now Turkish website DonanimHaber is offering a glimpse at the performance we can look forward to.

The site managed to get their hands on an engineering sample of AMD's forthcoming FX-8130P and ran it through a range of tests. The 8-core chip features a 3.2GHz clock speed, 2MB of L2 cache per pair of cores (8MB in total), and 8MB of L3 cache shared between all modules. The motherboard used was a Gigabyte 990FXA-UD5, which was paired with a GeForce GTX 580.

Bulldozer scores P6265 in the 3DMark 11 benchmark, 3045 in PCMark 7, 24434 in Cinebench R10, and manages 136 and 45 frames per second in x264 encoding tests for Pass 1 and Pass 2, respectively. In addition, it took 19.5 seconds to complete SuperPi 1M. Unfortunately there are no Core i7 2600K scores to compare with -- and the benchmark programs used differ from our usual range of tests -- but VR-Zone claims typical scores for Intel's top Sandy Bridge part are lower in all tests except SuperPi 1M, where it is significantly faster.

Compared to the Thuban-based Phenom II X6 1100T, Bulldozer should end up about 50% faster, while overall it slots right in between the Sandy Bridge Core i7 2600K and the Gulftown-based Core i7 990X in terms of performance. Of course scores will vary from platform to platform, so we'll reserve judgment until we can put Bulldozer to the test ourselves. If these early comparisons hold up, though, AMD could finally have an answer to Intel on the high end. The rumored $320 price tag suggests that will be the case, considering Intel's Core i7 2600K costs roughly the same.
|
# ? Jul 12, 2011 16:47 |
|
Peechka posted:AMD's Bulldozer-based FX-8130P benchmarked early As much as I want AMD to succeed, I would take any unofficial benchmarks with a grain of salt for now.
|
# ? Jul 12, 2011 16:57 |