freeforumuser
Aug 11, 2007

pienipple posted:

Since the launch is so close I'm gonna wait and see what the AM3+ board selection is like.

Think of current AM3 boards, only with USB 3.0 as a baseline standard. Chipsets are so homogenized that there's no real difference between them; that's why Intel intentionally disabled SB overclocking on H67, so they could earn a few extra bucks on P67 boards.

freeforumuser
Aug 11, 2007

Factory Factory posted:

Is that the top 4-core or comparing 8-core to the i7-2600K's 4-core? AMD seems to be going all-in on multithreaded performance if that's an 8 vs. 4 comparison. Then again, that would mirror their graphics strategies.

It'll kinda suck when 6/8-core Sandy Bridge E parts come out. Poor AMD :saddowns:

Most likely the Zambezi 8-core / 4-module part. If the chart is true, I can understand why Intel was selling SB on the cheap.

Still, we don't know how the frequency, overclocking and power consumption will work out.

freeforumuser
Aug 11, 2007

You Am I posted:

AMD will find some way of screwing it up, like dodgy third-party chipsets or some CPU bug.

Not going to happen, since AMD completely ditched third-party chipsets three years ago. Quality has shot up tremendously compared to the days of the quirky nForce 3/4 (but at least those were still much better than Nvidia's disastrous 680i/780i/790i chipsets).

freeforumuser
Aug 11, 2007

Alereon posted:

Llano previews are out, I'm thinking of giving it its own thread. Anandtech Desktop Preview, Anandtech Notebook Preview.

Notebook: Absolutely unbeatable for inexpensive gaming performance, with Sandy Bridge-like excellent general-usage battery life. The CPU performance is poor, but you're clearly buying a Llano notebook for the graphics. The CPU does hold back the GPU in CPU-heavy games like StarCraft 2, but not by enough to keep the GPU from taking a commanding lead. The new Hybrid CrossFire isn't bad (when it works), though it depends on how cheaply they can throw low-end GPUs into laptops (low-end AMD mobile GPUs seem pretty drat cheap). No graphics turbo :(

Desktop: Disappointing. The GPU is more hamstrung by sharing memory bandwidth with the CPU than hoped, though it still has a compelling performance advantage. The CPU is showing its age, easily trounced by Sandy Bridge Core i3s. It still obviously wins at gaming without a dedicated graphics card thanks to 50-100% faster graphics performance, but I'm holding out for final reviews and to find out how overclockable it is.

Trouble for AMD is that SB + a GT 540M is better than mobile Llano in every respect, and the former isn't exactly expensive either.

freeforumuser
Aug 11, 2007

Space Gopher posted:

Well, that's what Sysmark does, too. It all comes down to what tasks you've chosen as representative of "real-world use."

Anyone with a clue is not going to care about Sysmark scores.

"I'm so going to get a 2500K over Phenom II because MS Office runs faster hoho"

freeforumuser
Aug 11, 2007

wicka posted:

It's not really strange or amazing that Sony promoted something new and expensive and it turned out not to be the next big thing.

At least they won with Blu-ray! In an era when people are moving away from optical media.

freeforumuser
Aug 11, 2007

Peechka posted:

AMD's Bulldozer-based FX-8130P benchmarked early
By Jose Vilches, TechSpot.com
Published: July 11, 2011, 9:00 AM EST

Last month at the E3 conference in Los Angeles, AMD officially reintroduced the FX brand for their top-performing processors aimed at PC enthusiasts and gaming aficionados. Although no actual products were launched, we already have a pretty good idea of the initial lineup, and now Turkish website DonanimHaber is offering a glimpse at the performance we can look forward to.

The site managed to get their hands on an engineering sample of AMD's forthcoming FX-8130P and ran it through a range of tests. The 8-core chip features 3.2GHz clock speeds, 2MB of L2 cache per pair of cores (8MB in total), and 8MB of L3 cache shared between all modules. The motherboard used was a Gigabyte 990FXA-UD5, which was paired with a GeForce GTX 580.

Bulldozer scores P6265 in the 3DMark 11 benchmark, 3045 in PCMark 7, 24434 in Cinebench R10, and manages 136 and 45 frames per second in x264 encoding tests for Pass 1 and Pass 2, respectively. In addition, it took 19.5 seconds to complete SuperPi 1M. Unfortunately there are no Core i7-2600K scores to compare with -- and the benchmark programs used differ from our usual range of tests -- but VR-Zone claims typical scores for Intel's top Sandy Bridge part are lower in all tests except SuperPi 1M, where it is significantly faster.

Compared to the Thuban-based Phenom II X6 1100T, Bulldozer should end up about 50% faster, while overall it slots right in between the Sandy Bridge Core i7 2600K and Gulftown-based Core i7 990X in terms of performance.

Of course scores will vary from platform to platform so we'll reserve judgment until we can put Bulldozer to the test ourselves. If these early comparisons hold up, though, AMD could finally have an answer to Intel on the high-end. The rumored $320 price tag suggests that will be the case considering Intel's Core i7 2600K costs roughly the same.

As much as I want AMD to succeed I would take any unofficial benchmarks with a grain of salt for now.

freeforumuser
Aug 11, 2007

KillHour posted:

I have a friend who I'm reasonably sure will buy this. He thinks that all Intel chips suck, and refuses to even read anything that states otherwise. Also, he is adamant that more cores = better. If AMD made a processor with 1000 cores that each had the power of an 8080, he'd buy it in a second.

Edit: Then he'd blame Windows 7 for running slow on it.

Dumb people are everywhere, even in PC enthusiast circles. I'm not even sure how anyone can think a $200 AM3+ board + $120 Phenom II is remotely a good deal compared to a 2500K combo at the same price. "But it's upgradable to BD!" Yeah, as if AMD is gonna hand out BD chips for free to anyone who bought an AM3+ board beforehand.

freeforumuser
Aug 11, 2007

Coredump posted:

What kills me is people who are cheering on AMD to succeed and become competitive with Intel again so they turn around and buy more Intel chips at what they hope will be lower prices.

BD might end up competitive on the desktop front, but it will still be a mostly hollow victory, because it will be a non-starter in the mobile market, where the real money is made. Most won't agree the $300 2600K is worth the money on the desktop side; but to get even the lowest-end mobile SB i7, the i7-2620M, you are already paying $346 to Intel alone.

The lack of an integrated northbridge and an on-die IGP means BD is ill-suited for laptops; a BD-based Fusion part will be the key to unlocking this segment, but God knows how long we'll have to wait.

freeforumuser
Aug 11, 2007

Agreed posted:

He left out the bench of just the discrete graphics card without the CrossFire setting. That makes it really unhelpful if you were trying to see what Hybrid CrossFire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to the AMD one, but why go only half of the rest of the way toward demonstrating the performance of the AMD onboard versus the discrete card that it's supposed to compete with for performance? (That was the idea, right?)

Put in an actual gaming-grade video card like a 5770 and watch the i3 eat Llano for breakfast at around the same cost for both.

It's disingenuous to compare Hybrid CrossFire Llano with the SB GPU while downplaying the fact that the Intel setup can also use the same discrete card.

freeforumuser
Aug 11, 2007

Sinestro posted:

Llano is the chip without a market. Anyone who would want one would get SB + /[HZ]6\d/ using the IGP, or AM3 + dGPU. :iiam: why AMD thinks there is a big market for "lovely CPU with good (for an IGP) GPU". The laptop performance is poo poo vs. the category of "any dGPU + SB", and Optimus makes the battery life comparable to an IGP. The notion of Llano on the desktop is laughable, because the desktop is the land of the power user, where Intel is king. I guess for gaming on a *very* tight budget at 720p resolutions, it might make sense.

But enough :spergin: for now.

It would have been a lot more impressive if it had been released in mid-2010. The delays to Llano and BD have taken their toll.

freeforumuser
Aug 11, 2007
http://www.xbitlabs.com/news/cpu/display/20110901142352_Gigabyte_Accidentally_Reveals_AMD_s_FX_Launch_Lineup_Specs.html

First chip to finally break the 4GHz barrier, officially. The last near-candidate was the 3.8GHz P4 570 in November 2004.

freeforumuser
Aug 11, 2007

HalloKitty posted:

This isn't good. A CPU that is - lest we forget - closing in on two years old, as the comparison for AMD's not-yet-released, high-end processor?

Sandy Bridge ALREADY gives you this kind of performance for the prices AMD are claiming. Sandy Bridge IS the lower-priced competition for the 980X/990X.

Sad, so very sad

Remember the HD 4850? It was previewed before release because it was a 9800 GTX killer, and that was all that mattered; AMD didn't need to touch Nvidia's top end to entice buyers. Same for the 4870 vs the GTX 260.

The same goes for BD. Hardly anyone cares whether BD can compete against a $999 i7-990X, or maybe even a 2600K. All AMD needs is a chip that can compete with the 2500K, the current top dog in price/performance. A $150 AMD chip that gets within 90% of a 2500K is already a winner.

Now, if AMD had this winning chip, why would they be so hush-hush about actual unbiased benchmarks and rely on lolmarketing instead? With no reliable counter-evidence for now, plus the months of delays upon delays, I can only conclude BD is set to disappoint.

freeforumuser
Aug 11, 2007

Peechka posted:


But yeah, I do agree that if their chips are anywhere close to the 2600K and they can get them out at a cheap price, then yeah, it's a winner. The thing that's always awesome with AMD is they don't change their sockets every drat year, so chances are if I buy an AM3+ system now, I could still upgrade the processor when next-gen Bulldozer hits as well. Which IMO lends itself to a better, more upgradeable system overall.

Uh, I don't agree with this "upgradeability" thing. History has shown that, provided you bought smart at the start, upgrading the CPU later without changing the socket and mobo is a bad move for the money.

In your case, you're also forgetting that buying AM3+ now means buying a current AMD CPU with worse performance/price than a 2500K no matter which one you pick, and you're stuck with that level of performance until BD releases (assuming it doesn't get delayed again, and is actually enough faster than current AMD offerings to justify the price), which then costs more money to upgrade to. All in all, you might as well have gone 2500K in the first place.

freeforumuser
Aug 11, 2007

Agreed posted:

Yeah, but I'm wondering if that's just because they've got complex processes spread out in discrete threads with core affinity and then they sync it all up at the end, or if it's because the game is more deeply multi-threaded and could run just as well (thought experiment follows) on an 8-core system with half the clock per core, assuming clock for clock parity. In other words, multi-threaded, or quad-threaded?

Programs that have long been multi-threaded and can take more or less linear advantage of additional cores tend not to have any uncertainties in the time domain. Rendering, for example. Four cores at the same speed will be almost four times as fast as one core. Eight cores will be almost twice as fast as four cores. There's some overhead involved in managing the workload, but it's not much. Games have to worry about syncing up unknown variables within a pretty short window, and so past games just parceled out specific tasks that could more easily be synced up, gave them affinity on core 2, then put it all back together. I'm wondering if games that are using 4 cores are still doing that, just with cleverer discrete tasks, or if they're actually multi-threaded yet.

One thing people tend not to realize is that a hypothetical single-core CPU A with twice the single-threaded performance of a dual-core CPU B is superior in all kinds of workloads, no matter whether they're single-threaded or multithreaded.

Single-thread performance is still very important today and will stay that way: we're already seeing workloads that simply don't scale well beyond 2-4 threads, and diminishing returns get worse as you add more and more threads, even for workloads that can in principle scale to any number of them.
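To put rough numbers on that, here's a minimal Amdahl's-law sketch in Python; the 90%-parallel workload is a made-up illustration, not a measurement of anything:

```python
# Minimal Amdahl's-law illustration: even a highly parallel workload sees
# rapidly diminishing returns from extra cores, which is why per-core speed
# keeps mattering. The 0.9 parallel fraction is an assumed example value.

def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Ideal speedup on `cores` cores when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (1, 2, 4, 8, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(cores, 0.9):.2f}x")
# 1 -> 1.00x, 2 -> 1.82x, 4 -> 3.08x, 8 -> 4.71x, 16 -> 6.40x
```

Note how going from 8 to 16 cores buys barely 1.4x here, while the hypothetical CPU A above gets its full 2x on everything.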

freeforumuser
Aug 11, 2007
Found a pretty legit BD leak from PCWorld France.

quote:

"At the higher end, the FX processors based on the Bulldozer architecture also disappointed us: while they are generally more efficient than their predecessors and let AMD approach the latest Core i5s and i7s, their performance remains below expectations. Moreover, as we already announced in our previous issue, while they can sometimes compete with Sandy Bridge in raw-compute applications, in video games the results are very far behind. Only overclockers (and fanboys) will find much of interest in them, given their predisposition in that area."

freeforumuser
Aug 11, 2007

Longinus00 posted:

Hmm, I wonder why the architecture is always so behind in games?

It never had the massive IPC advantage Intel has enjoyed since the first C2Ds; Intel chips simply perform well regardless of thread count. Clock for clock, a Phenom II X4 is only about as fast as a Q6600.

When you pit a Phenom II X4 against an Intel CPU with a ~50% IPC advantage in a dual-threaded game, it looks like this:

freeforumuser
Aug 11, 2007

Alereon posted:

I think that's a little pessimistic. Their desktop performance has been generally weak, but they delivered a hex-core processor far more cheaply than Intel did, and maintained performance leadership in the value segment up until Sandy Bridge. Their strategy has always been more cores/$ and more performance/$ and watt in servers (as well as memory quantity and bandwidth/$), which doesn't translate to a desktop strategy since desktop applications are still poorly threaded. Marketshare numbers don't exactly tell the whole story because AMD Opterons excel in 4+ socket configurations, while Intel leads in 1-2 sockets, so each server AMD wins sells more processors and those processors are worth more (as they're 12-core, 8-way models). AMD has had great success in the supercomputer market for this reason, for example. Basically AMD's server strategy is targeting huge servers for virtualization, while Intel concentrates on more performance-sensitive applications.

The situation is almost entirely reversed in the low-power market, as Intel completely hosed themselves with Atom and won't begin to recover until 2013. AMD's low-power Fusion processors (E, C, Z-series) have a commanding lead and sell as fast as AMD can make them. Their Llano processors have pretty disappointing CPU performance, but putting a decent GPU on-die makes them pretty compelling for laptops where the costs (especially in terms of power) of a discrete GPU aren't worth it. Llano also delivers 4 cores in a space where Intel usually delivers 2, which doesn't always help but isn't as useless as having more than 4 cores typically is.

It is pessimistic, because Intel can annihilate AMD in every sector if they choose to.

Servers: I don't track this space, but anything wrong with the Intel server platform was fixed long ago, starting with Nehalem's IMC, QPI and HT. Sandy Bridge only made the same stuff even better.

Desktops: Release an unlocked i3. No reason to even buy AMD anymore.

Laptops: Cut SB prices against Llano. Actually, Intel doesn't even need to do anything, since dual-core SB laptops with Llano-grade GPUs are already competitive as it is, pricing-wise (considering how slow the Llano CPU is).

Netbooks: Release a single-core SB. Zacate? What is that again?

freeforumuser
Aug 11, 2007

trandorian posted:

Considering Atoms are already dual-core, it'd be silly to replace them with a single-core Sandy Bridge. No one wants to be stuck with a single core today.

A single-core SB is going to murder any Atom or Zacate CPU regardless of core count, with much better performance/watt to boot.

http://www.xbitlabs.com/articles/cpu/display/core-i3-2100t_11.html#sect0

But I think I'm derailing a BD thread so I'll just stop here. :)

freeforumuser
Aug 11, 2007

dissss posted:

I wonder what their share is now? We're seeing a huge number of E-350 laptops (not just netbooks) - sure, perhaps it's ill-advised as far as long-term reputation goes, but at the low end of the market people very much buy on price, and AMD is considerably undercutting even the low-end Pentiums.

Q3 2011: AMD 10.4% vs Intel 81.8%, by revenue.

Keep in mind that the E-350 isn't exactly a money-printing machine like Sandy Bridge.

freeforumuser
Aug 11, 2007

quote:

Meanwhile, AMD's server processor codenamed Interlagos will also have difficulty shipping on schedule and is expected to be delayed to November.

Source: http://www.digitimes.com/news/a20110930PD207.html

I'm speechless.

freeforumuser
Aug 11, 2007

Jago posted:

Here's the $105 price point. The Intel dual-core beats the Athlon II at games and the Sysmark stuff, but it seems that in anything multithreaded at all the Athlon meets or beats it. I'd call this one a toss-up.
http://www.anandtech.com/bench/Product/188?vs=143

This is the $129 price point. The Phenom X4 965 reigns supreme.
http://www.anandtech.com/bench/Product/102?vs=118

At $189 Intel takes a decisive lead except for heavily threaded apps.
http://www.anandtech.com/bench/Product/203?vs=363

I guess I'm coming from a home-build perspective myself, so I might be biased. Also, I guess I am building computers that are right around that 500-700 dollar range. On a practical side, do you really need more than 60fps in a game? AMD's generally larger number of cores blasts out video encoding much faster... ah, well, Intel is going to be using its onboard video to do that soon anyway.

I guess I'll just say that if you are assembling a "bad-rear end" system and trying to max out the price performance curve, AMD wins hands down. If you are trying to make a super bad rear end system... well.... Intel, no arguments.

AMD is hosed, and only exists because of ATI and because regulators won't allow Intel to crush them.

Hardly anyone is going to buy a CPU alone. Add in a $120 mobo and AMD already loses hands down on price/performance/power draw, even if the 2500K were priced $100 higher. That is how far behind AMD is now. Put overclocking into the mix and AMD loses even more: it's difficult for a 965BE to hit 4GHz, and even then it won't touch a stock 2500K while turning into a small oven, whereas a 2500K is practically guaranteed 4GHz on the stock cooler.
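As a sketch of the platform-cost point in Python: every price and the 1.5x relative-performance figure below is an illustrative assumption for the sake of the argument, not quoted benchmark data.

```python
# Illustrative platform perf-per-dollar comparison. All numbers below are
# assumed example values, not measured benchmarks or actual 2011 prices.

def perf_per_dollar(relative_perf: float, cpu_price: float, mobo_price: float) -> float:
    """Relative performance per dollar of the whole CPU + motherboard platform."""
    return relative_perf / (cpu_price + mobo_price)

# Assumed: 2500K ~1.5x a 965BE in lightly threaded work; both boards ~$120.
intel = perf_per_dollar(1.5, 220.0, 120.0)  # hypothetical 2500K platform
amd = perf_per_dollar(1.0, 120.0, 120.0)    # hypothetical 965BE platform

print(f"Intel {intel:.4f} vs AMD {amd:.4f} perf/$")
# The fixed board cost dilutes AMD's CPU price advantage at the platform level.
```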

With regards to "60 fps in games":

(Side note: see how a four-year-old Q6600 beats a 3GHz Phenom II despite a 600MHz clock deficit!)

AMD is far from 60 fps. You really want to save $100 now for that? And how many games are going to become even more CPU-limited in the future?

No one is disputing that AMD is cheaper, but they are a far cry from the price/performance champions they were in the Athlon XP/64 days. They can't even hold on to that title at the $100 mark... and that is how deep into poo poo they are.

freeforumuser fucked around with this message at 04:06 on Oct 1, 2011

freeforumuser
Aug 11, 2007

frumpsnake posted:

AMD does provide good value, but you've got to go lower than the i3.

http://www.anandtech.com/bench/Product/204?vs=406

*takes a look at the chart*

Nope.

A G620 is almost neck and neck with a 565 BE while being $32 cheaper, running at an 800MHz clock deficit, and consuming a good 50W less power, and the HD 2000 IGP is a good deal better than the crappy IGPs on AMD boards. Needless to say, the Athlon II X2 line gets entirely slaughtered if the 565 BE fares that badly.

Neither of those CPUs is even good value for money in the grand scheme of things, but it does demonstrate that AMD doesn't have a low end to speak of. More like a "how cheap can you build a PC if you don't care about performance or power consumption" end.

freeforumuser fucked around with this message at 12:22 on Oct 1, 2011

freeforumuser
Aug 11, 2007

Bloody Antlers posted:

If BD stinks at launch, we could see another Phenom -> Phenom II type transition where engineers save the day by fine tuning the design and dramatically increasing performance.

We're talking about an extremely complex design being implemented by an equally complex fabrication process. Months of delay shouldn't be surprising to anyone.

How anyone could feel strongly about the long term status of either company evades me. Each successive generation introduces new engineering challenges, and having a larger R&D budget doesn't guarantee success by any means.

Nature has a way of eventually introducing parity wherever competition exists.

There was no fine-tuning from Phenom I to II; all AMD did was slap on an extra 4MB of L3 cache and clock it higher on a 45nm process. The thing is, if BD fails, it can't be saved in the same easy manner as Phenom II, since it already has a ton of cache, is already clocked very high, and is already on 32nm like SB.

If you are talking about an "extremely complex design", the same applies to Intel. How was Intel able to stick to its roadmap like clockwork through three entire architectures (Conroe, Nehalem, SB) and two die-shrinks (Penryn, Westmere), while AMD couldn't even get BD out the door four years after Phenom I? There is no "introducing parity" here; it's more like chasing an airplane on a bicycle. Mind you, this is the same AMD that rode a winning streak from the original K7 to the A64 from '99 to '03.

freeforumuser fucked around with this message at 01:47 on Oct 3, 2011

freeforumuser
Aug 11, 2007

Star War Sex Parrot posted:

It's weird to think about how long we've been reading about Bulldozer. I think it first started getting thrown around in summer 2007, but maybe even earlier than that.

The whole BD debacle looks eerily reminiscent of the Phenom I launch: lots of official slides on how BD is better than Intel blah blah blah but no mention of actual performance or power consumption, a useless 2560x1440 gaming comparison vs a 990X, the 8GHz LN2 OC stunt, and leaked underwhelming benchmarks. Exactly what happened with Phenom I.

In fact, I would take AMD's silence on BD performance as evidence that BD is nowhere near as good as AMD's hype makes it out to be. You simply don't keep mum if you have a killer product after losing consistently for 5.5 years, and you can't hide it from a $100+ billion competitor that knows everything through good ol' corporate espionage.

One can argue AMD was pretty quiet before the HD 5000 series launch too, but that was more about how underwhelming the lineup looked next to cheap HD 4000s than about actual failure, and the GPU sector was and still is a market AMD is very competitive in.

freeforumuser fucked around with this message at 02:49 on Oct 4, 2011

freeforumuser
Aug 11, 2007
PCM NL leaked benchmarks:
http://www.overclock.net/rumors-unconfirmed-articles/1134704-pcm-leaked-dutch-fx-8150-review.html

Summary:

Cinebench 11.5
FX-8150 = 6.01
2600K = 6.73
2500K = 5.73

Dirt 3 - 1080p HD5970
FX-8150 = 105 avg, 75 min
i7-965 = 93 avg, 71 min

Far Cry 2 - 1080p DX10 max
FX-8150 = 111 avg, 23 min
i7-965 = 126 avg, 75.2 min

Mafia 2
FX-8150 = 68.3 avg
i7-965 = 76 avg

Power consumption - full load
FX-8150 = 120W

Overclocking
5GHz, 1.47V

Edit: Link fixed. Thanks Movax!

freeforumuser fucked around with this message at 17:41 on Oct 6, 2011

freeforumuser
Aug 11, 2007

Bob Morales posted:

Didn't we already know BD would be behind the highest-end of the last-gen i7's in gaming and single-thread stuff?

I'd like to see the #'s for Cinebench, single CPU

There's Cinebench 10 (single CPU) in there:
2600K = 5800
FX-8150 = 4024

Single-threaded per-clock performance: (5800 / 3800MHz) / (4024 / 4200MHz) ≈ 1.6, i.e. SB has about 1.6x the IPC of BD.

That's really sad. Actually, I think that's even worse than Phenom II.
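As a sanity check on that arithmetic, a back-of-the-envelope script using the leaked scores above; the clocks are the assumed single-thread turbo speeds, so treat the result as rough:

```python
# Per-clock (rough IPC) comparison from the leaked Cinebench 10 1-CPU scores.
# Clocks are assumed single-thread turbo speeds, so this is only approximate.

def points_per_mhz(score: float, clock_mhz: float) -> float:
    """Benchmark points per MHz: a crude proxy for per-clock performance."""
    return score / clock_mhz

sb = points_per_mhz(5800, 3800)  # i7-2600K @ ~3.8GHz turbo
bd = points_per_mhz(4024, 4200)  # FX-8150 @ ~4.2GHz turbo

print(f"SB per-clock advantage: {sb / bd:.2f}x")  # ~1.59x
```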

freeforumuser fucked around with this message at 18:08 on Oct 6, 2011

freeforumuser
Aug 11, 2007
FX-4110 Quad-core @ 4.2GHz leaks from China @ xtremesystems.org

It doesn't take much effort to Google current Phenom II scores for comparison.

SuperPi
Worse than 965BE (20.529 vs 18.252 secs)

3DMark Vantage CPU
Worse than 965BE (10664 vs 11395)

wPrime
Much worse than 965BE (17.191s vs 10.764s)

7-Zip compression
Worse than 975 BE (11387 vs 12547)

7-Zip decompression
Worse than 975 BE (12701 vs 14294)

Cinebench
Single-threaded: Worse than 975 BE (1.03 vs 1.09); BTW, this is the same score the 8150 @ 4.2GHz reported in the PCM NL leak.
Multi-threaded: Much worse than 975 BE (3.31 vs 4.27)

What is this I don't even. How can AMD make a new quad-core that loses to their previous-generation quads clocked 600-800MHz lower? This is a trainwreck of epic proportions.

freeforumuser fucked around with this message at 13:31 on Oct 8, 2011

freeforumuser
Aug 11, 2007
The cat is out of the bag now, with a legit Romanian hardware review site posting an FX-8150 preview:

http://lab501.ro/procesoare-chipseturi/amd-fx-8150-bulldozer-preview/14

tl;dr version: it loses heavily to the 2600K in everything except Handbrake, where it comes within 1% of the 2600K. BD isn't going to find itself in the SH/SC recommendation thread anytime soon.

freeforumuser
Aug 11, 2007
All these official reviews simply prove beyond doubt that BD is a fail of epic proportions, in performance and especially in power draw (MY GOD, an extra 200W going from stock to 4.6GHz at full load?!)

As seen from: http://www.guru3d.com/article/amd-fx-8150-processor-review/7

freeforumuser fucked around with this message at 05:21 on Oct 12, 2011

freeforumuser
Aug 11, 2007
http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested

Anandtech review is up!

freeforumuser
Aug 11, 2007

:suicide::ughh::negative::psyduck:

And those aren't even crossing the 60 fps barrier.

freeforumuser
Aug 11, 2007

Maxwell Adams posted:

So... Intel is dropping the price on i5's when bulldozer hits the market, right?

There will be price drops, but they won't be coming from Intel.

freeforumuser
Aug 11, 2007

Hog Butcher posted:

That A8's APU's going to blow the HD3000 out of the water. A prebuilt with a video card in it'll beat it, but I think the only place AMD's got a chance right now's if they're competing with the HD3000.

Which means if the Ivy Bridge chipset's better than the 6520G, I'm out of ways to even defend AMD. :v:

That A8 APU is slower than a discrete 5570, which in turn is molasses-slow compared to a 5770, which I consider the minimum GPU for current gaming at a decent resolution, and even that is NOT FAST ENOUGH for many!

No gamer is going to touch it just because it's 2x better than the HD 3000. Besides, the CPU portion is subpar to boot.

freeforumuser
Aug 11, 2007

wipeout posted:

EDIT - the Anand posters commenting on buying Bulldozer because AMD needs the money... wow, that's strange logic.
If they must fund AMD in some way, buy the decent products it makes, such as Bobcat or Llano, or better yet just go outside and don't touch a PC for a few years.

Even before BD, AMD fanboys loved to say their CPUs were "good enough" for things like "Internet surfing" while AMD was, and still is, being whacked to hell and back by Sandy Bridge... as if that somehow excuses AMD for making subpar chips that aren't even worth it on performance/price.

freeforumuser
Aug 11, 2007

Agreed posted:

What the hell are they going to do with that problem? The $149.99 Phenom II X6 1055T should overclock pretty trivially to any of the Black Edition specs, and thus make Bulldozer look foolish. And it also features this ad copy:


... which is rather unfortunate in light of Bulldozer's extraordinary power draw.

I mean, do they kill what has been one of their golden geese now that keeping it around means kind of looking like assholes? They could keep having them made, but there's only so much fab time and space available, and obviously they're going to want to scoot people along to Bulldozer. Problem is it's not actually any better, just more expensive, more power hungry, and embarrassing in how not-competitive it is with Intel's current-gen stuff despite prices and marketing that beg you to think otherwise.

The sunk cost fallacy starts to look pretty frightening when it's your company's path and perception at stake.

A die-shrunk 32nm GPU-less Llano with L3 cache and AVX would have a smaller die for the same core count, and would have been much faster than BD per clock and per core, since Llano already has ~6% higher IPC than 45nm K10.

This hypothetical 8-core chip at 3.6GHz (versus the 3.3GHz 1100T) would have scored around (8/6) × (3600/3300) × 1.06 × 5.9 ≈ 9.1 in Cinebench 11.5, a true multithreaded monster next to the cocktease 6.0 of the current FX-8150. It's staggering how the gently caress anyone at AMD thought BD was a good idea when they could already have made something much better with much less effort.
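For what it's worth, the same back-of-the-envelope as a script; linear scaling with cores and clock is an optimistic assumption (real multithreaded scaling is sublinear), so treat the result as an upper bound:

```python
# Naive linear-scaling estimate for the hypothetical 8-core 32nm K10 part.
# Inputs are the post's assumptions (1100T: 6 cores @ 3.3GHz, ~5.9 in
# Cinebench 11.5, +6% IPC from the shrink); real scaling would be lower.

def scaled_score(base_score: float, base_cores: int, base_ghz: float,
                 cores: int, ghz: float, ipc_gain: float = 1.0) -> float:
    """Scale a multithreaded score linearly with cores, clock, and IPC."""
    return base_score * (cores / base_cores) * (ghz / base_ghz) * ipc_gain

estimate = scaled_score(5.9, 6, 3.3, 8, 3.6, ipc_gain=1.06)
print(f"Hypothetical 8-core K10 @ 3.6GHz: ~{estimate:.2f}")  # ~9.10 vs FX-8150's ~6.0
```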

freeforumuser
Aug 11, 2007

Longinus00 posted:

The Linux kernel patch only increased performance by up to 10%; a 40% performance increase is pretty crazy, all things considered. AMD has had previous problems with Windows kernel scheduling (aka the Phenom Cool'n'Quiet problem), so I suppose this isn't unprecedented. That was also a problem that core parking would have solved; it just took Intel getting on the case for it to be implemented in the kernel.

I wouldn't personally believe anything until more people can actually try this stuff out and see what changes are really being made.

Considering Bulldozer is supposed to be designed to improve server performance, I'm sad that all or most of the benchmarks so far have been desktop apps on Windows. Hopefully someone will get their hands on some Opterons and start doing those tests.

Meh, I will just tell AMD to suck it up. Nobody is going to optimize for your CPU when it runs current code molasses-slow. How about designing a CPU that is actually fast NOW in the first place, instead of this pathetic whining, AMD?

freeforumuser
Aug 11, 2007

PC LOAD LETTER posted:

I think that had more to do with Bobcat. Still, there's plenty that is lol-worthy in the old pre-launch BD slides, given what we know now.

If Intel has taught us anything about designing a good x86 processor, it's to beef up the decoders, branch predictors and out-of-order resources as much as possible, since those are the main bottlenecks. Sharing one decoder between two cores is a recipe for disaster, aka BD.

freeforumuser
Aug 11, 2007

ClosedBSD posted:

What the gently caress would cause a problem like this? Did AMD leave out whole x86_64 instruction sets or something?


BD just keeps getting better, doesn't it? Even the Phenom TLB bug wasn't this bad; at least that was extremely unlikely to affect consumer apps.

Civil posted:

Newegg is a hotbed of AMD fanboyism. Not that the Athlon IIs and Phenom IIs were bad CPUs, but the reviews speak of them as if they're water-walking Jesus processors. For the most part, Bulldozer will run all modern software and games acceptably, just not as well as Intel parts. And the tardcore AMD fans are just fine with that.

For AMD fanboys, it always boils down to the same old tired arguments:

1. It's OK for AMD to suck because they are the underdog.
2. Buy AMD unless you like sky-high Intel prices. (People would buy AMD's stuff if it were actually good, not poo poo like BD.)
3. I've been using AMD all my life. (Irrelevant.)
4. AMD is good because they are cheap. (They are cheap because they are simply not good enough... and they have no chip that even touches a 2500K in perf/OC/power/price.)
5. AMD is fast enough for games. (Nope, not for CPU-limited games like SC2.)
6. AMD has a better upgrade path. (Debunked for AM2/3; AM3+ makes it moot.)

freeforumuser
Aug 11, 2007

cinder posted:

Has AMD directly addressed the questionable performance of BD versus their existing offerings? I had read through the previously linked thread mentioning AMD's desire to answer questions directly from the enthusiast crowd and I'm interested to see their responses, but it doesn't seem like it has been posted yet.

I could understand that, except they totally trolled us after all their pre-launch BD performance and perf/watt claims failed to materialize in the final product. I doubt asking AMD why BD sucked so hard is going to get us any honest answers.
