|
pienipple posted:Since the launch is so close I'm gonna wait and see what the AM3+ board selection is like. Think of current AM3 boards, only with USB 3.0 as a baseline standard. Chipsets are so homogenized that there's no real difference between them; that's exactly why Intel intentionally disabled SB overclocking on H67, so they could earn a few extra bucks on P67 boards.
|
# ¿ Apr 18, 2011 01:49 |
|
Factory Factory posted:Is that the top 4-core or comparing 8-core to the i7-2600K's 4-core? AMD seems to be going all-in on multithreaded performance if that's an 8 vs. 4 comparison. Then again, that would mirror their graphics strategies. Most likely the Zambezi 8-core / 4-module part. If the chart is accurate, I can understand why Intel was selling SB on the cheap. Still, we don't know how the frequency, overclocking and power consumption will work out.
|
# ¿ May 5, 2011 04:10 |
|
You Am I posted:AMD will find someway of screwing it up, like dodgey third party chipsets or some CPU bug. Not going to happen, since AMD completely ditched third-party chipsets three years ago. Quality has shot up tremendously compared to the days of the quirky nForce 3/4 (which were at least still much better than Nvidia's disastrous 680/780/790i chipsets).
|
# ¿ May 6, 2011 06:10 |
|
Alereon posted:Llano previews are out, I'm thinking of giving it its own thread. Anandtech Desktop Preview, Anandtech Notebook Preview. The trouble for AMD is that SB + GT 540M beats mobile Llano in every aspect, and the former isn't exactly expensive either.
|
# ¿ Jun 14, 2011 11:34 |
|
Space Gopher posted:Well, that's what Sysmark does, too. It all comes down to what tasks you've chosen as representative of "real-world use." Anyone with a clue isn't going to care about Sysmark scores. "I'm so going to get a 2500K over a Phenom II because MS Office runs faster hoho"
|
# ¿ Jun 25, 2011 05:31 |
|
wicka posted:It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing. At least they won with Blu-ray! In an era when people are moving away from optical media.
|
# ¿ Jul 8, 2011 05:59 |
|
Peechka posted:AMD's Bulldozer-based FX-8130P benchmarked early As much as I want AMD to succeed, I would take any unofficial benchmarks with a grain of salt for now.
|
# ¿ Jul 12, 2011 16:57 |
|
KillHour posted:I have a friend who I'm reasonably sure will buy this. He thinks that all Intel chips suck, and refuses to even read anything that states otherwise. Also, he is adamant that more cores = better. If AMD made a processor with 1000 cores that each had the power of an 8080, he'd buy it in a second. Dumb people are everywhere, even within PC enthusiast circles. I'm not even sure how anyone can think a $200 AM3+ board + $120 Phenom II is remotely a good deal compared to a 2500K combo at the same price. "But it's upgradable to BD!" Yeah, as if AMD is going to hand out BD chips for free at launch to anyone who bought an AM3+ board beforehand.
|
# ¿ Jul 13, 2011 11:07 |
|
Coredump posted:What kills me is people who are cheering on AMD to succeed and become competitive with Intel again so they turn around and buy more Intel chips at what they hope will be lower prices. BD might end up being competitive on the desktop front, but it would still be a mostly hollow victory, because it will be a non-starter in the mobile market, where the real money is made. Most won't agree the $300 2600K is worth the money on the desktop side; but to get even the lowest-end mobile SB i7, the i7-2620M, you are already paying $346 to Intel alone. The lack of an integrated northbridge and on-die IGP means BD is ill-suited for laptops; a BD-based Fusion part will be the key to unlocking this segment, but god knows how long we'll have to wait.
|
# ¿ Jul 13, 2011 16:51 |
|
Agreed posted:He left out the bench of just the discrete graphics card without the crossfire setting. Makes that really unhelpful if you were trying to see what hybrid crossfire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to the AMD one but why go only half of the rest of the way toward demonstrating the performance of the AMD on-board versus the discrete card that it's supposed to compete with for performance (that was the idea, right?) Put in an actual gaming-grade video card like a 5770 and watch the i3 eat Llano for breakfast, at around the same total cost for both. It's disingenuous to compare hybrid-Crossfire Llano against the SB IGP while downplaying the fact that the Intel setup can also use the same discrete card.
|
# ¿ Aug 19, 2011 15:54 |
|
Sinestro posted:Llano is the chip without a market. Anyone who would want one would get SB + /[H|Z]6\d/ using the IGP, or AM3 + dGPU. why AMD thinks there is a big market for "lovely CPU with good (for a IGP) GPU". The laptop performance is poo poo vs the category of "Any dGPU + SB", and Optimus makes the batt. life compatible to a IGP. The notion of Llano in the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense. It would have been a lot more impressive had it launched in mid-2010. The delays to Llano and BD have taken their toll.
|
# ¿ Aug 19, 2011 17:34 |
|
http://www.xbitlabs.com/news/cpu/display/20110901142352_Gigabyte_Accidentally_Reveals_AMD_s_FX_Launch_Lineup_Specs.html First chip to officially break the 4GHz barrier. The last near-candidate was the 3.8GHz P4 570 in Nov 2004.
|
# ¿ Sep 3, 2011 03:21 |
|
HalloKitty posted:This isn't good. A CPU that is - lest we forget - closing in on 2 years old, as the comparison for AMD's not yet released, high end processor? Remember the HD 4850? It was previewed before release because it was a 9800GTX killer, and that was all that mattered; AMD didn't need to touch Nvidia's top end to entice buyers. Same for the 4870 vs the GTX 260. The same goes for BD. Hardly anyone cares whether BD can compete against a $999 i7-990X, or maybe even a 2600K. All AMD needs is a chip that can compete with the 2500K, the current top dog in price/performance. A $150 AMD chip that reaches 90% of a 2500K is already a winner. Now, if AMD had this winning chip, why would they be so hush-hush about actual unbiased benchmarks and instead rely on lolmarketing? With no other reliable counter-evidence for now, plus the months of delays upon delays, I can only conclude BD is simply going to disappoint.
|
# ¿ Sep 25, 2011 15:47 |
|
Peechka posted:
Uh, I don't agree with this "upgradeability" thing. History has shown that, provided you bought smart at the start, upgrading CPUs within the same socket and mobo is a bad move for the money. In your case, you're also forgetting that buying AM3+ now means buying a current AMD CPU with worse performance/price than a 2500K no matter which you pick, and being stuck at that level of performance until BD releases (assuming it doesn't get delayed again, and is actually enough faster than current AMD offerings to justify the price), which then costs more money to upgrade to. All in all, you might as well have gone 2500K in the first place.
|
# ¿ Sep 26, 2011 15:19 |
|
Agreed posted:Yeah, but I'm wondering if that's just because they've got complex processes spread out in discrete threads with core affinity and then they sync it all up at the end, or if it's because the game is more deeply multi-threaded and could run just as well (thought experiment follows) on an 8-core system with half the clock per core, assuming clock for clock parity. In other words, multi-threaded, or quad-threaded? One thing people tend not to realize is that a hypothetical single-core CPU A with twice the single-threaded performance of dual-core CPU B is superior in every kind of workload, whether single-threaded or multithreaded. Single-thread performance is still very important, today and in the future: we're already seeing workloads that simply don't scale well beyond 2-4 threads, and diminishing returns get worse as you add more and more threads, even for workloads that can in principle be split across any number of threads.
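The diminishing-returns point can be made concrete with Amdahl's law. A minimal sketch; the 10% serial fraction here is an assumption for illustration, not a measured figure from any game:

```python
def amdahl_speedup(serial_fraction: float, n_threads: int) -> float:
    """Upper bound on speedup for a workload with a fixed serial portion."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_threads)

# Even a modest 10% serial portion caps the payoff of extra threads:
for n in (1, 2, 4, 8):
    print(n, "threads ->", round(amdahl_speedup(0.10, n), 2), "x")
# 1 -> 1.0x, 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x
```

Going from 4 to 8 threads here buys only ~1.5x despite doubling the cores, which is why trading single-thread speed for core count is such a risky bet.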
|
# ¿ Sep 27, 2011 03:12 |
|
Found a pretty legit BD leak from PCWorld France. quote:"At the higher end, the FX processors based on the Bulldozer architecture also disappointed us: while they are generally more efficient than their predecessors and let AMD get much closer to the latest Core i5 and i7, their performance remains below expectations. Besides, as we already announced in our previous issue, while they can sometimes compete with Sandy Bridge in raw-compute applications, in video games the results are very far behind. Only overclockers (and fanboys) will find them of great interest, given their predisposition in this area."
|
# ¿ Sep 29, 2011 12:50 |
|
Longinus00 posted:Hmm, I wonder why the architecture is always so behind in games? It never had the massive IPC advantage Intel has enjoyed since the first C2Ds; Intel chips simply perform well regardless of thread count. Clock for clock, a Phenom II X4 is only about as fast as a Q6600. When you pit a Phenom II X4 against an Intel CPU with a ~50% IPC advantage in a dual-threaded game, it looks like this:
|
# ¿ Sep 30, 2011 00:27 |
|
Alereon posted:I think that's a little pessimistic. Their desktop performance has been generally weak, but they delivered a hex-core processor far more cheaply than Intel did, and maintained performance leadership in the value segment up until Sandy Bridge. Their strategy has always been more cores/$ and more performance/$ and watt in servers (as well as memory quantity and bandwidth/$), which doesn't translate to a desktop strategy since desktop applications are still poorly threaded. Marketshare numbers don't exactly tell the whole story because AMD Opterons excel in 4+ socket configurations, while Intel leads in 1-2 sockets, so each server AMD wins sells more processors and those processors are worth more (as they're 12-core, 8-way models). AMD has had great success in the supercomputer market for this reason, for example. Basically AMD's server strategy is targeting huge servers for virtualization, while Intel concentrates on more performance-sensitive applications. It is pessimistic because Intel can annihilate AMD in every sector if they choose to. Servers: I don't track this space, but anything wrong with Intel's server platform was fixed back at Nehalem with the IMC, QPI and HT; Sandy Bridge only made the same stuff even better. Desktops: release an unlocked i3 and there's no reason to even buy AMD anymore. Laptops: cut SB prices against Llano. Actually Intel doesn't even need to do anything, since dual-core SB laptops with Llano-grade GPUs are already competitive as it is, pricing-wise (considering how slow the Llano CPU is). Netbooks: release a single-core SB. Zacate? What is that again?
|
# ¿ Sep 30, 2011 01:52 |
|
trandorian posted:Considering Atoms are already dual core, it'd be silly to replace them with a single core sandy bridge. Noone wants to be stuck with single-core today. A single-core SB would murder any Atom or Zacate CPU regardless of core count, with much better performance/watt to boot. http://www.xbitlabs.com/articles/cpu/display/core-i3-2100t_11.html#sect0 But I think I'm derailing a BD thread so I'll just stop here.
|
# ¿ Sep 30, 2011 04:15 |
|
dissss posted:I wonder what their share is now? We're seeing a huge number of E-350 laptops (not just netbooks) - sure perhaps it's ill advised as far as long term reputation goes but at the low end of the market people very much buy on price and AMD is considerably undercutting even the low end Pentiums. 2011 Q3: AMD 10.4% vs Intel 81.8% by revenue. Keep in mind the E-350 isn't exactly a money-printing machine like Sandy Bridge.
|
# ¿ Sep 30, 2011 10:20 |
|
quote:Meanwhile, AMD's server processor codenamed Interlagos will also have difficulty shipping on schedule and is expected to be delayed to November. Source: http://www.digitimes.com/news/a20110930PD207.html I'm speechless.
|
# ¿ Sep 30, 2011 15:21 |
|
Jago posted:Here's the 105 price point. The Intel dual core beats the Athon II at games and the sysmark stuff, but it seems that anything multithreaded at all the Athlon meets or beats it. I'd call this one a toss up. Hardly anyone buys a CPU alone. Add in a $120 mobo and AMD loses hands down in price/performance/power draw even if the 2500K were priced $100 higher. That is how far behind AMD is now. Put overclocking into the mix and AMD loses even more: it's difficult for a 965BE to hit 4GHz, and even then it won't touch a stock 2500K while turning into a small oven, whereas a 2500K is guaranteed 4GHz on the stock cooler. With regards to "60 fps in games" (side note: see how a four-year-old Q6600 beats a 3GHz Phenom II despite a 600MHz clock deficit!): AMD is far from 60 fps. You really want to save $100 now for that? And how many games are going to become even more CPU-limited in the future? No one disputes AMD is cheaper, but they are a far cry from the price/performance champions of the Athlon XP/64 days. They can't even hold on to that at the $100 mark... and that is how deep into poo poo they are. freeforumuser fucked around with this message at 04:06 on Oct 1, 2011 |
# ¿ Oct 1, 2011 04:02 |
|
frumpsnake posted:AMD does provide good value, but you've got to go lower than the i3. http://www.anandtech.com/bench/Product/204?vs=406 *takes a look at the chart* Nope. A G620 is almost neck and neck with a 565 BE while being $32 cheaper, despite an 800MHz clock deficit, and while drawing a good 50W less power; and the HD 2000 IGP is a good deal better than the crappy IGPs on AMD mobos. Needless to say, the Athlon II X2 line gets entirely slaughtered if the 565 BE fares that badly. Neither of those CPUs is even good value for money in the grand scheme of things, but it does demonstrate that AMD doesn't even have a low end to speak of. More like a "how cheap can one build a PC without caring about performance or power consumption" end. freeforumuser fucked around with this message at 12:22 on Oct 1, 2011 |
# ¿ Oct 1, 2011 12:18 |
|
Bloody Antlers posted:If BD stinks at launch, we could see another Phenom -> Phenom II type transition where engineers save the day by fine tuning the design and dramatically increasing performance. There was no fine-tuning from Phenom I to II; all AMD did was slap on an extra 4MB of L3 cache and clock it higher on a 45nm process. The thing is, if BD fails, it can't be saved as easily as Phenom II was: it already has a ton of cache, is very highly clocked, and is on 32nm like SB. If you're talking about an "extremely complex design", the same applies to Intel. How was Intel able to stick to its roadmap like clockwork through three entire architectures (Conroe, Nehalem, SB) and two die shrinks (Penryn, Westmere), while AMD couldn't even get BD out the door in the four years since Phenom I? There is no "introducing parity" here, more like chasing an airplane on a bicycle. Mind you, this is the same AMD that rode the K7-to-A64 winning streak from '99 to '03. freeforumuser fucked around with this message at 01:47 on Oct 3, 2011 |
# ¿ Oct 3, 2011 01:42 |
|
Star War Sex Parrot posted:It's weird to think about how long we've been reading about Bulldozer. I think it first started getting thrown around in summer 2007, but maybe even earlier than that. The whole BD debacle looks eerily reminiscent of the Phenom I launch. Lots of official slides on how BD is better than Intel blah blah blah with no mention of actual performance or power consumption, a useless 2560x1440 gaming comparison vs a 990X, the 8GHz LN2 OC stunt, and leaked underwhelming benchmarks, which is exactly what happened with Phenom I. In fact, I would take AMD's silence on BD performance as evidence that BD is nowhere near as good as AMD's hype. You simply don't keep mum if you have a killer product after losing consistently for 5.5 years, and you can't hide it from your $100+ billion competitor, who knows everything through good ol' corporate espionage anyway. One could argue AMD was pretty quiet before the HD 5000 series launch too, but that was more about how underwhelming the lineup looked next to cheap HD 4000s than actual failure, and the GPU sector was and still is a very competitive market. freeforumuser fucked around with this message at 02:49 on Oct 4, 2011 |
# ¿ Oct 4, 2011 02:46 |
|
PCM NL leaked benchmarks: http://www.overclock.net/rumors-unconfirmed-articles/1134704-pcm-leaked-dutch-fx-8150-review.html Summary:
Cinebench 11.5: FX-8150 = 6.01, 2600K = 6.73, 2500K = 5.73
Dirt 3 (1080p, HD 5970): FX-8150 = 105 avg / 75 min, i7-965 = 93 avg / 71 min
Far Cry 2 (1080p, DX10 max): FX-8150 = 111 avg / 23 min, i7-965 = 126 avg / 75.2 min
Mafia 2: FX-8150 = 68.3 avg, i7-965 = 76 avg
Power consumption (full load): FX-8150 = 120W
Overclocking: 5GHz @ 1.47V
Edit: Link fixed. Thanks Movax! freeforumuser fucked around with this message at 17:41 on Oct 6, 2011 |
# ¿ Oct 6, 2011 17:27 |
|
Bob Morales posted:Didn't we already know BD would be behind the highest-end of the last-gen i7's in gaming and single-thread stuff? There's Cinebench 10 in there: 2600K = 5800, FX-8150 = 4024. Single-threaded IPC: (5800 / 3800MHz) / (4024 / 4200MHz) ≈ 1.6, so SB has about 1.6x the per-clock performance of BD. That's really sad. Actually I think that's even worse than Phenom II. freeforumuser fucked around with this message at 18:08 on Oct 6, 2011 |
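Working that per-clock estimate explicitly, with the scores and clocks taken from the leak above (score divided by clock is only a rough IPC proxy, since turbo behavior muddies the true frequency):

```python
# Cinebench 10 single-threaded score per MHz as a crude IPC proxy.
sb_score, sb_mhz = 5800, 3800   # i7-2600K (leaked score, nominal turbo clock)
bd_score, bd_mhz = 4024, 4200   # FX-8150

ipc_ratio = (sb_score / sb_mhz) / (bd_score / bd_mhz)
print(round(ipc_ratio, 2))  # ~1.59: SB does roughly 1.6x the work per clock
```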
# ¿ Oct 6, 2011 18:02 |
|
FX-4110 quad-core @ 4.2GHz leaks from China @ xtremesystems.org. It doesn't take much effort to Google current Phenom II scores for comparison.
SuperPi: worse than 965BE (20.529 vs 18.252 secs)
3DMark Vantage CPU: worse than 965BE (10664 vs 11395)
wPrime: much worse than 965BE (17.191s vs 10.764s)
7-Zip compression: worse than 975BE (11387 vs 12547)
7-Zip decompression: worse than 975BE (12701 vs 14294)
Cinebench single-threaded: worse than 975BE (1.03 vs 1.09); BTW this is the same score reported for the 8150 @ 4.2GHz in the PCM NL leak
Cinebench multi-threaded: much worse than 975BE (3.31 vs 4.27)
What is this I don't even. How can AMD make a new quad-core that loses to their previous-generation quads clocked 600-800MHz slower? This is a trainwreck of epic proportions. freeforumuser fucked around with this message at 13:31 on Oct 8, 2011 |
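One way to see how consistent the regression is: normalize the FX-4110's leaked result against the Phenom II in each benchmark (for time-based tests like SuperPi and wPrime, lower is better, so the ratio is inverted). The numbers are copied from the leak above:

```python
# (benchmark, fx_4110, phenom_ii, higher_is_better)
leaks = [
    ("SuperPi (s)",      20.529, 18.252, False),
    ("Vantage CPU",      10664,  11395,  True),
    ("wPrime (s)",       17.191, 10.764, False),
    ("7-Zip compress",   11387,  12547,  True),
    ("7-Zip decompress", 12701,  14294,  True),
    ("Cinebench 1T",     1.03,   1.09,   True),
    ("Cinebench MT",     3.31,   4.27,   True),
]

for name, fx, ph, higher in leaks:
    rel = fx / ph if higher else ph / fx
    print(f"{name}: FX at {rel:.0%} of Phenom II")
```

Every ratio comes out below 100% (wPrime is the worst at ~63%), i.e. the 4.2GHz part loses across the board to chips clocked 600-800MHz lower.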
# ¿ Oct 8, 2011 13:11 |
|
The cat is out of the bag now, with a legit Romanian hardware review site doing a FX-8150 preview: http://lab501.ro/procesoare-chipseturi/amd-fx-8150-bulldozer-preview/14 tl;dr version: loses heavily to the 2600K in everything except Handbrake, where it comes within 1% of the 2600K. BD isn't going to find itself in the SH/SC recommendation thread anytime soon.
|
# ¿ Oct 9, 2011 05:36 |
|
All these official reviews simply prove beyond doubt that BD is a fail of epic proportions, in performance and especially power draw (MY GOD, an extra 200W from stock to 4.6GHz at full load?!). As seen from: http://www.guru3d.com/article/amd-fx-8150-processor-review/7 freeforumuser fucked around with this message at 05:21 on Oct 12, 2011 |
# ¿ Oct 12, 2011 05:18 |
|
http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested Anandtech review is up!
|
# ¿ Oct 12, 2011 06:29 |
|
And those aren't even crossing the 60 fps barrier.
|
# ¿ Oct 12, 2011 07:13 |
|
Maxwell Adams posted:So... Intel is dropping the price on i5's when bulldozer hits the market, right? There will be price drops, but it won't be Intel.
|
# ¿ Oct 12, 2011 09:02 |
|
Hog Butcher posted:That A8's APU's going to blow the HD3000 out of the water. A prebuilt with a video card in it'll beat it, but I think the only place AMD's got a chance right now's if they're competing with the HD3000. That A8 APU is slower than a discrete 5570, which is in turn molasses-slow compared to a 5770, which I consider the minimum GPU for current gaming at a decent resolution, and even that is NOT FAST ENOUGH for many! No gamer is going to touch it just because it's 2x better than the HD 3000. Besides, the CPU portion is subpar to boot.
|
# ¿ Oct 13, 2011 00:28 |
|
wipeout posted:EDIT - the Anand posters commenting on buying Bulldozer because AMD need the money.. Wow, that's strange logic. Even before BD, AMD fanboys loved to say their CPUs were "good enough" for things like "Internet surfing" while AMD was, and still is, whacked to hell and back by Sandy Bridge... as if that somehow excuses AMD for making subpar chips that aren't even worth it on performance/price.
|
# ¿ Oct 13, 2011 15:48 |
|
Agreed posted:What the hell are they going to do with that problem? The Phenom II X6 1055T $149.99 should overclock pretty trivially to any of the Black Edition specs, and thus make Bulldozer look foolish. And it also features this ad copy: A die-shrunk 32nm GPU-less Llano with L3 cache and AVX would have a smaller die for the same cores, and would have been much faster than BD per clock per core, since 32nm K10 already has +6% IPC over 45nm K10. This hypothetical 8-core chip at 3.6GHz (compared to the 3.3GHz 1100T) would score around (8/6) * (3600/3300) * 1.06 * 5.9 ≈ 9.1 in Cinebench 11.5, a true multithreaded monster next to the cocktease 6.0 of the current FX-8150. It's staggering how the gently caress anyone at AMD thought BD was a good idea when they could already have made something much better with much less effort.
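The back-of-envelope estimate above, spelled out. Note the assumptions baked in: perfectly linear core and clock scaling, the claimed +6% IPC for 32nm K10, and the 1100T's ~5.9 Cinebench 11.5 score as the baseline; this is the post's napkin math, not a measurement.

```python
# Hypothetical 8-core, 3.6GHz, 32nm K10 (GPU-less Llano) vs the 6-core 3.3GHz 1100T
cores_scale = 8 / 6          # two extra cores over the X6, assumed linear scaling
clock_scale = 3600 / 3300    # clock bump over the 1100T
ipc_scale   = 1.06           # claimed 32nm K10 IPC gain over 45nm
x6_1100t_cb115 = 5.9         # Cinebench 11.5 multithreaded score of the 1100T

estimate = cores_scale * clock_scale * ipc_scale * x6_1100t_cb115
print(round(estimate, 1))  # ~9.1, vs the FX-8150's ~6.0
```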
|
# ¿ Oct 14, 2011 07:07 |
|
Longinus00 posted:The linux kernel patch only increased performance by up to 10%, a 40% performance increase is pretty crazy all things considered. AMD has had previous problems with windows kernel scheduling (aka phenom cool and quiet problem) so I suppose this isn't unprecedented. That also a problem that core parking would have solved, it just took until intel got on the case to get that implemented into the kernel. Meh, I will just tell AMD to suck it up. Nobody is going to optimize for your CPU when it runs current code molasses-slow. How about designing a CPU that is actually fast NOW in the first place instead of this pathetic whining, AMD?
|
# ¿ Oct 16, 2011 12:41 |
|
PC LOAD LETTER posted:I think that had more to do with Bobcat. Still plenty that is lol worthy in the old pre launch BD slides given what we know now though. If Intel has taught us anything about designing a good x86 processor, it's to beef up the decoders, branch predictors and out-of-order resources as much as possible, since those are the main bottlenecks. Sharing one decoder between two cores is a recipe for disaster, aka BD.
|
# ¿ Oct 17, 2011 14:31 |
|
ClosedBSD posted:What the gently caress would cause a problem like this? Did AMD leave out whole x86_64 instruction sets or something? BD just keeps getting better, doesn't it? Even the Phenom TLB bug wasn't this bad; at least that was extremely unlikely to affect consumer apps. Civil posted:Newegg is a hotbed of AMD fanboyism. Not that the A2's and Phenom2's were bad CPU's, but the reviews speak of them as if they're water-walking Jesus processors. For the most part, bulldozer will run all modern software and games acceptably, just not as well as intel parts. And the tardcore AMD fans are just fine with that. For AMD fanboys, it always boils down to the same old tired arguments:
1. It's OK for AMD to suck because they are the underdog.
2. Buy AMD unless you like sky-high Intel prices. (People would buy AMD stuff if it were actually good, not poo poo like BD)
3. I've been using AMD all my life. (Irrelevant)
4. AMD is good because they are cheap. (They are cheap because they are simply not good enough... and they have no chip that even touches a 2500K in perf/OC/power/price)
5. AMD is fast enough for games. (Nope, not for CPU-limited games like SC2)
6. AMD has a better upgrade path. (Debunked for AM2/3; AM3+ makes it moot)
|
# ¿ Oct 20, 2011 16:28 |
|
cinder posted:Has AMD directly addressed the questionable performance of BD versus their existing offerings? I had read through the previously linked thread mentioning AMD's desire to answer questions directly from the enthusiast crowd and I'm interested to see their responses, but it doesn't seem like it has been posted yet. I could understand that, except they totally trolled us after none of their BD performance and perf/watt pre-launch claims materialized in the final product. I doubt asking AMD why BD sucked so hard is going to get us any honest answers.
|
# ¿ Oct 24, 2011 17:11 |