|
If those benchmarks are true:
|
# ? Jul 12, 2011 17:00 |
|
JustAnother Fat Guy posted:If those benchmarks are true: I wonder how the four-core model fares (there's no emoticon for that).
|
# ? Jul 12, 2011 18:36 |
|
What if AMD is also waiting for Mac OS X Lion to come out, because Apple is going to offer a cheaper Mac Pro powered by AMD?
|
# ? Jul 12, 2011 18:37 |
|
Apple makes a big deal out of their relationship with Intel, and Intel does make better CPUs for the time being, so I don't see them switching soon unless AMD literally blows Intel clean out of the water. They are pretty big on ATI graphics now that ATI has steamed ahead. It probably wouldn't be that big a step to offer Llano in some of their offerings, as the power draw decrease would do wonders for battery life, or allow a more powerful CPU, but for their Mac Pros they are likely to keep using Intel for the time being.
|
# ? Jul 12, 2011 18:51 |
|
JustAnother Fat Guy posted:Apple makes a big deal out of their relationship with Intel, and Intel does make better CPUs for the time being, so I don't see them switching soon unless AMD literally blows Intel clean out of the water. I'm imagining this involves AMD hiring Xe to blow up ships loaded with Intel chips on their way to Costa Rica and Vietnam for assembly.
|
# ? Jul 12, 2011 19:40 |
|
Bob Morales posted:What if AMD is also waiting for Mac OS X Lion to come out, because Apple is going to offer a cheaper Mac Pro powered by AMD? As much as I'd like an affordable Mac Pro, this won't happen.
|
# ? Jul 12, 2011 19:43 |
|
freeforumuser posted:As much as I want AMD to succeed I would take any unofficial benchmarks with a grain of salt for now. These are also benchmarks on engineering samples. We don't know for sure whether these engineering samples are from before or after the performance tweaks. I'm still happy to wait for some credible sites to run a wide range of benchmarks.
|
# ? Jul 12, 2011 22:31 |
|
What kind of processor is being allegedly benchmarked there? Is it a server/workstation part, or something meant for typical desktops?
|
# ? Jul 13, 2011 00:19 |
|
Top-end 8-core desktop part, the FX-8130P, so 125W TDP. Supposed to cost around $300 or so if the rumors hold up.
|
# ? Jul 13, 2011 05:06 |
|
What I don't understand is how the chip can slot between the i7-2600K and the i7-990X. In all but a very few tasks, those two are neck and neck, with the 2600K winning on per-thread tasks and the 990X winning on heavily multithreaded/floating point tasks. Where does the FX-8130P fit in, exactly?
|
# ? Jul 13, 2011 05:44 |
|
That thin sliver of the market that wants 8 cores for the price of a 2600K?
|
# ? Jul 13, 2011 06:02 |
|
I think it is ~2600K.
|
# ? Jul 13, 2011 06:08 |
|
Factory Factory posted:Where does the FX-8130P fit in, exactly? It's a top-end enthusiast part, pretty niche, so AMD fans would buy it even if it doesn't overclock well. If the motherboards work out to be cheap enough there could be some platform value, I guess.
|
# ? Jul 13, 2011 06:22 |
|
PC LOAD LETTER posted:It's a top-end enthusiast part, pretty niche, so AMD fans would buy it even if it doesn't overclock well. If the motherboards work out to be cheap enough there could be some platform value, I guess. I have a friend who I'm reasonably sure will buy this. He thinks that all Intel chips suck, and refuses to even read anything that states otherwise. Also, he is adamant that more cores = better. If AMD made a processor with 1000 cores that each had the power of an 8080, he'd buy it in a second. Edit: Then he'd blame Windows 7 for running slow on it.
|
# ? Jul 13, 2011 06:40 |
|
KillHour posted:I have a friend who I'm reasonably sure will buy this. He thinks that all Intel chips suck, and refuses to even read anything that states otherwise. Also, he is adamant that more cores = better. If AMD made a processor with 1000 cores that each had the power of an 8080, he'd buy it in a second. Dumb people are everywhere, even within PC enthusiast circles. I'm not even sure how anyone can think a $200 AM3+ board + $120 Phenom II can be remotely considered a good deal compared to a 2500K combo at the same price. "But it's upgradable to BD!" Yeah, as if AMD is gonna give BD away for free when it hits to anyone who bought an AM3+ board beforehand.
|
# ? Jul 13, 2011 10:07 |
|
I'm personally eyeing up the 4-core or possibly 6-core variant to replace my Phenom II X4, but it'd be a pretty unneeded buy as my CPU already crunches almost everything; BF3 might make me reconsider, though. Depends on the price really, and on real production chip benchmarks. I hope AMD keeps their price edge on Intel, as the i5 2500K is hot poo poo and I'd like it if the 4-core Bulldozer could compete against it. As I'm in no real hurry to upgrade, I'm just going to sit around a bit and watch. JustAnother Fat Guy fucked around with this message at 12:22 on Jul 13, 2011 |
# ? Jul 13, 2011 10:16 |
|
freeforumuser posted:Dumb people are everywhere, even within PC enthusiast circles. I'm not even sure how anyone can think a $200 AM3+ board + $120 Phenom II can be remotely considered a good deal compared to a 2500K combo at the same price. "But it's upgradable to BD!" Yeah, as if AMD is gonna give BD away for free when it hits to anyone who bought an AM3+ board beforehand. The only way I could see someone buying an AMD is if they did something like got the 1090T for $179 with a free motherboard, like some retailers do. You're only saving $75 over the cost of an i5 2500K + low-end MB, but hey, it's $75. And it's going to be faster at things like compiling, encryption, and compression, and lose out in almost all games and anything that uses fewer than 4 cores, especially single-threaded stuff.
|
# ? Jul 13, 2011 12:35 |
|
Bob Morales posted:..1090T.. ..it's going to be faster at.. Bob Morales posted:compression Bob Morales posted:encryption Not by quite a way, and this is against Nehalem/Gulftown, not Sandy Bridge. HalloKitty fucked around with this message at 12:44 on Jul 13, 2011 |
# ? Jul 13, 2011 12:38 |
|
What kills me is people who are cheering on AMD to succeed and become competitive with Intel again, just so they can turn around and buy more Intel chips at what they hope will be lower prices.
|
# ? Jul 13, 2011 12:40 |
|
HalloKitty posted:Not by quite a way, and this is against Nehalem/Gulftown, not Sandy Bridge. I was looking at Anand's bench numbers for the 1090T (1100T actually) vs the i5 2500K, but the point was there are very few things the AMD is faster at.
|
# ? Jul 13, 2011 13:31 |
|
Coredump posted:What kills me is people who are cheering on AMD to succeed and become competitive with Intel again, just so they can turn around and buy more Intel chips at what they hope will be lower prices. I've built both AMD and Intel systems. If AMD just outright wins a generation, I'll be loving tickled pink to build another AMD system; it'd be pretty nice to be able to do that. But my job requires performance, and the smart money is Intel for now, as it was in 2008 when I built my last computer. In 2003 it was a different story, and I loved the AMD Athlon XP system at the time; it felt like lightning. It's not about fanboyism, it's just practical decision-making based on price and performance.
|
# ? Jul 13, 2011 13:45 |
|
Coredump posted:What kills me is people who are cheering on AMD to succeed and become competitive with Intel again, just so they can turn around and buy more Intel chips at what they hope will be lower prices. BD might end up being competitive on the desktop front, but it would still be a mostly hollow victory, because it will be a non-starter in the mobile market where the real money is to be made. Most won't agree the $300 2600K is worth the money on the desktop side; but to get the lowest-end i7-2620M mobile SB part you are already paying $346 to Intel alone. The lack of an integrated northbridge and on-die IGP means BD is ill-suited for laptops; BD Fusion will be the key to unlocking this segment, but god knows how long we have to wait.
|
# ? Jul 13, 2011 15:51 |
|
Coredump posted:What kills me is people who are cheering on AMD to succeed and become competitive with Intel again, just so they can turn around and buy more Intel chips at what they hope will be lower prices. Yeah, I hate it when people act like rational consumers instead of blind fanboys.
|
# ? Jul 13, 2011 16:03 |
|
Is $320 a realistic price point? Some things that concern me: if it's expensive and it just matches Intel's current-gen chip for performance, that sucks. Intel will come out with Ivy Bridge and blow it away for enthusiast/high-end usage, while remaining price-to-performance competitive in the $200-$300 CPU market. I'm also concerned that it might not have a similarly easy-to-use overclocking method, and that it might be bottlenecked on memory performance with lower-speed RAM. If it needs faster RAM to perform up to snuff, that adds more cost compared to the inexpensive and common 9-9-9-24 DDR3-1333 that gets you to pretty much ideal performance with Sandy Bridge. But I obviously don't know enough about it at this point to say one way or another. I don't understand some of the benchmark results, either, or the way they're presented as exceeding Intel; e.g. 3DMark 11's Performance mode score. That score is slightly lower than what I was getting with my 2600K before I overclocked it, also with a GTX 580, but it's weird to even use a 3DMark score as an indicator of CPU performance. Is it because they're just done with Sandra and have to have SOMETHING? Should the whole thing be taken with a grain of salt as far as that goes, just PR (since I can personally attest that those scores are not superior to Intel's comparably priced chip, which means it's at best inaccurate reporting or at worst misleading on purpose)? On the other hand, if it's serious, then that's not exactly encouraging: if it costs more than a 2600K out of the gate and doesn't perform better, all AMD is doing is playing catch-up to a months-old processor that is going to get another iteration around the time of AMD's launch, which will make Intel an undisputed performance winner again in all price categories except very-low-budget systems. Edit: I just hate to see such lofty hype boil down to meet-but-not-exceed performance. If this is a generation where they're just trying to get their poo poo together and let the fact that they're powering all the consoles subsidize the rest, okay, I hope it pays off in the next generation... But at this point it looks like what they're offering up is a slightly more expensive alternative to the 2600K that doesn't improve on it meaningfully at all. Which is great if you've been itching to build an AMD system for ideological reasons, but not great if you just want to put together a high-performance computer without paying the enthusiast premium for the chips coming down the pipeline from Intel. Agreed fucked around with this message at 16:45 on Jul 13, 2011 |
# ? Jul 13, 2011 16:40 |
|
Bob Morales posted:The only way I could see someone buying an AMD is if they did something like got the 1090T for $179 with a free motherboard, like some retailers do. You're only saving $75 over the cost of an i5 2500K + low-end MB, but hey, it's $75. And it's going to be faster at things like compiling, encryption, and compression, and lose out in almost all games and anything that uses fewer than 4 cores, especially single-threaded stuff. On my current desktop, I bought AMD a year and a half ago largely because it meant I didn't have to stress over how many PCIe lanes Intel would allow me on a reasonably priced system. A year ago I would have added USB3 and SATA3 to the list and still gotten AMD, even with a bit of a performance hit. But the performance gap has just widened so much even since then that today it would be a different story. Though if I were shopping today, I'd hold out a little while longer to see what Bulldozer looks like.
|
# ? Jul 13, 2011 17:18 |
|
JawnV6 posted:Yeah, I hate it when people act like rational consumers instead of blind fanboys. Yeah, I hate it when people want to post witty zingers instead of adding anything worthwhile to the conversation. Agreed posted:I've built both AMD and Intel systems. If AMD just outright wins a generation, I'll be loving tickled pink to build another AMD system; it'd be pretty nice to be able to do that. But my job requires performance, and the smart money is Intel for now, as it was in 2008 when I built my last computer. In 2003 it was a different story, and I loved the AMD Athlon XP system at the time; it felt like lightning. It's not about fanboyism, it's just practical decision-making based on price and performance. I really don't think it's about being a fanboy in this case. In order for AMD to "outright win a generation" they have to get the revenue to pump into R&D in the first place. I just don't see AMD ever being able to get the money they need to take the lead back from Intel if everyone holds this view. In a situation like this, where AMD is the largest thing keeping Intel's prices in check, I feel like it's part of being a responsible consumer to support AMD, to keep the x86 marketplace from becoming more of a monopoly ruled by one company.
|
# ? Jul 13, 2011 20:35 |
|
Coredump posted:Yeah, I hate it when people want to post witty zingers instead of adding anything worthwhile to the conversation. This can never happen. AMD controls the patent for the 64-bit instruction set, and Intel owns x86. If one went down, they could pull the other with them.
|
# ? Jul 13, 2011 21:01 |
|
If AMD went down, the administrators would sell off all the assets. It would require someone else to outbid Intel for the AMD64 instruction set before there would really be a problem. If I were a patent troll, that would definitely be one I would be keeping an eye on, however.
|
# ? Jul 14, 2011 11:24 |
|
Can't AMD just use the literally booming profits from their graphics business (ATI/AMD Graphics) to pump into R&D for CPUs? Edit: The chances of AMD folding are pretty slim currently.
|
# ? Jul 14, 2011 11:28 |
|
JustAnother Fat Guy posted:Can't AMD just use the literally booming profits from their graphics business (ATI/AMD Graphics) to pump into R&D for CPUs? AMD had operating income (that's profit from operations, before interest, taxes, and all that other accounting poo poo) of $100 million on revenue of $1.2 billion for their CPU-related operations last quarter; the operating income of the GPU business was $19 million on revenue of $413 million. As much as the GPU side of their business is the superior product in the marketplace, the CPU is still the engine that drives the company's profits.
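To put those numbers another way (the percentages below are just my arithmetic on the same figures quoted above, nothing new):

```python
# Quick margin check on the quarterly figures quoted above.
cpu_income, cpu_revenue = 100e6, 1.2e9   # CPU-related operations
gpu_income, gpu_revenue = 19e6, 413e6    # GPU business

print(f"CPU operating margin: {cpu_income / cpu_revenue:.1%}")                   # ~8.3%
print(f"GPU operating margin: {gpu_income / gpu_revenue:.1%}")                   # ~4.6%
print(f"CPU brings in {cpu_income / gpu_income:.1f}x the GPU operating income")  # ~5.3x
```

So the CPU side isn't just bigger in absolute terms, it's also running at nearly twice the margin.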
|
# ? Jul 14, 2011 16:08 |
|
If AMD ever got to the point where bankruptcy was a real possibility, Intel would probably throw them an indirect lifeline, similar to how Microsoft purchased Apple stock in the 90s, simply because they are the only thing that keeps an antitrust inquiry from gaining traction. Not to mention they can't afford to have patents go to another party who may not be as flexible when it comes to agreements. Unless it's a world where x86 is dying, or AMD slides so much that they are well below 10%, they'll survive somehow like everyone else.
|
# ? Jul 14, 2011 16:22 |
|
Ryokurin posted:If AMD ever got to the point where bankruptcy was a real possibility, Intel would probably throw them an indirect lifeline, similar to how Microsoft purchased Apple stock in the 90s, simply because they are the only thing that keeps an antitrust inquiry from gaining traction. Not to mention they can't afford to have patents go to another party who may not be as flexible when it comes to agreements. It should be noted that Microsoft's Macintosh Business Unit itself is still the largest third-party developer for the Macintosh, in revenue as well as employees.
|
# ? Jul 14, 2011 16:38 |
|
Ryokurin posted:If AMD ever got to the point where bankruptcy was a real possibility, Intel would probably throw them an indirect lifeline, similar to how Microsoft purchased Apple stock in the 90s, simply because they are the only thing that keeps an antitrust inquiry from gaining traction. Not to mention they can't afford to have patents go to another party who may not be as flexible when it comes to agreements. I think a second supplier for a part is a common prerequisite for government contracts, as well.
|
# ? Jul 14, 2011 23:26 |
|
AMD got their x86 license out of that sort of requirement, but it's clearly not a factor today because there is no x86 chip with two suppliers. And if there were such a requirement, AMD couldn't meet it anyway because they can't fabricate chips anymore. Edit: also, while Intel would certainly not want AMD's patents going to another party, very few could afford to outbid Intel. Zhentar fucked around with this message at 14:29 on Jul 15, 2011 |
# ? Jul 15, 2011 14:26 |
|
HardOCP is now saying that MS is going to go with what amounts to a customized version of a BD-based APU. Very much rumor-mongering, but it would make more sense than them going with Cell for the X720 or whatever. That would be a pretty big win for AMD, maybe even bigger than getting their GPU into all the next-gen consoles, if true of course.
|
# ? Jul 21, 2011 10:44 |
|
It makes a bit more sense for MS to use an AMD CPU than it does for Sony, as MS doesn't have to find a way to reimplement code that was using the SPEs. Still, I remain unconvinced that an APU would provide enough graphics horsepower to provide a real generational leap over the 360 and PS3. Console games are EXTREMELY shader-bound, and you can do cool things with more shader horsepower like use post-processing to hide how lovely your game looks. Currently console games are rendered at below native resolution (1024x576 for example), then scaled up for display using a soft filter. The goal is to increase rendering performance, while using the blur effect from scaling the image up to hide aliasing and your low-resolution textures. With more shader horsepower, you can render at native resolution and use a shader-based antialiasing filter like nVidia's FXAA (which works on all hardware since it's just a pixel shader and is integrated into the game by the developers) or AMD's MLAA (which is part of the drivers so only works on AMD hardware, but is otherwise similar). That'll improve edge sharpness and clarity a lot, but with only 30GB/sec of memory bandwidth shared between the CPU and GPU we're probably not going to be seeing games with high-res, detailed textures. On the plus side, you'll notice low-res textures less because there will be neat shader tricks covering them up. Unfortunately this isn't the same result as just throwing a real GPU in there with dedicated GDDR5 memory, which WOULD give you enough bandwidth for high-res textures and real anti-aliasing.
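For what it's worth, here's a rough back-of-the-envelope sketch of where a ~30GB/sec shared figure would come from and what it's up against; the dual-channel DDR3-1866 and 256-bit GDDR5 parts are my own assumptions for illustration, not specs from any leak:

```python
# Peak theoretical memory bandwidth: transfer rate x bus width x channels.
# The memory configurations below are assumptions for the sake of the comparison.

def peak_bandwidth_gbs(transfers_per_sec: float, bus_width_bits: int, channels: int = 1) -> float:
    return transfers_per_sec * (bus_width_bits / 8) * channels / 1e9

# Shared system memory an APU would sit on: dual-channel DDR3-1866
shared_ddr3 = peak_bandwidth_gbs(1866e6, 64, channels=2)   # ~29.9 GB/s, split between CPU and GPU

# A discrete card's dedicated memory: 256-bit GDDR5 at 4.2 GT/s effective
dedicated_gddr5 = peak_bandwidth_gbs(4.2e9, 256)           # ~134.4 GB/s, all for the GPU

print(f"Shared DDR3-1866, dual channel: {shared_ddr3:.1f} GB/s")
print(f"Dedicated 256-bit GDDR5:        {dedicated_gddr5:.1f} GB/s")
```

Even before the CPU takes its slice, that's roughly a quarter of the bandwidth a mid-range discrete card keeps entirely to itself, which is why high-res textures and real anti-aliasing are the first things to go.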
|
# ? Jul 21, 2011 20:01 |
|
Alereon posted:but with only 30GB/sec of memory bandwidth shared between the CPU and GPU we're probably not going to be seeing games with high-res, detailed textures. On the plus side, you'll notice low-res textures less because there will be neat shader tricks covering them up. Unfortunately this isn't the same result as just throwing a real GPU in there with dedicated GDDR5 memory, which WOULD give you enough bandwidth for high-res textures and real anti-aliasing. Why are you assuming they'll go with standard desktop memory? Seems to me it'd make a lot more sense to take an APU with a beefed-up memory controller (it's going to be a custom chip, after all) and hook that straight to some high-bandwidth video RAM in a unified-memory architecture. That's the way pretty much all consoles have done things for quite a while, now. You get all the benefits of a discrete GPU, and some extras (the CPU can screw around with "VRAM" directly if you want it to), and with the APU you don't even have to deal with two big complex hot chips and two sets of RAM. The bigger thing, to me, would be moving consoles to x86. I guess we've gotten to the point where everything's so high-level that architecture doesn't even matter all that much, but it still seems kind of weird.
|
# ? Jul 22, 2011 00:25 |
|
Space Gopher posted:The bigger thing, to me, would be moving consoles to x86. I guess we've gotten to the point where everything's so high-level that architecture doesn't even matter all that much, but it still seems kind of weird. The original xbox was x86.
|
# ? Jul 22, 2011 01:21 |
|
adorai posted:The original xbox was x86. Sure, but when it was designed Microsoft placed a lot more importance on getting something out there to compete now than on an optimum solution. They went with x86 not because it was the best choice for the task at hand, but because they had a lot of people very good with x86, and they could do it all with off-the-shelf parts. Nobody pretended it was the best choice. When it came time to develop the 360, which wasn't nearly as much of a rush job, they went with a custom Power chip just like the rest of the industry. Now, if they are in fact going with a Bulldozer-based APU, it seems like their best and brightest have sat down and said, "starting from a clean sheet, x86 is clearly the best option for our new console." If AMD's starting to do some really interesting stuff with CPU/GPU hybrids behind closed doors, then it might be realistic, but it still feels kind of weird.
|
# ? Jul 22, 2011 01:51 |
|
Space Gopher posted:Why are you assuming they'll go with standard desktop memory? Seems to me it'd make a lot more sense to take an APU with a beefed-up memory controller (it's going to be a custom chip, after all) and hook that straight to some high-bandwidth video RAM in a unified-memory architecture. That's the way pretty much all consoles have done things for quite a while, now. You get all the benefits of a discrete GPU, and some extras (the CPU can screw around with "VRAM" directly if you want it to), and with the APU you don't even have to deal with two big complex hot chips and two sets of RAM. Given that existing consoles tend to use tightly integrated or even unified memory architectures with very high local bandwidth, moving to an APU might be a relatively small step. An off-the-shelf Llano would be a bad idea, but something more purpose-built and a year or so off has more potential. x86 is potentially an asset these days, if only because there are so many tools and so much development experience invested in it, and a console that uses it also potentially gets to share more resources with Windows ports, which is useful in a marketplace with so many multiplatform games. With how much development times and costs have ballooned in the last ten years, and how new console generations suffer at first as developers get used to new hardware, anything that helps keep devs on familiar ground is a good thing. My big hope is that none of the three new consoles skimp on total RAM, though. Sure, the current generation might be limited by having the power of a five-year-old gaming system and by multiplatform games being written to play off one DVD at a time with no hard drive (thanks, 360 Arcade), but having to fit everything in 256MB of main memory on the PS3 is its own sort of tight fit, to say nothing of the Wii. I don't play much console anymore, but with most games being multiplatform they're all built to the lowest common denominator.
|
# ? Jul 22, 2011 02:01 |