Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That's horrible loving news. I haven't built an AMD system in years and years but they're the only balancing factor keeping the arms race going for consumers. Who else is even remotely positioned to offer an alternative to Intel? What's the future of AMD (and ATI) if they lose the CPU race this many generations in a row?


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Not make or break, but they acknowledge their market share is pretty crap for anything serious. It seems like the only segments propping them up are people who see lower price and don't care if it's Intel or AMD (we're talking Best Buy shoppers, here) and enthusiasts on a budget who will accept the slower processor in order to save money. It's looking more and more like K6 vs Pentium, I just hate to see AMD slip into that slump again. They kicked serious rear end with the Athlon XP and first-gen 64-bit chips. Man. Lame.

Ryokurin posted:

While it's likely correct that it's going to disappoint, you need to take Theo Valich's articles with a grain of salt. He's been dead wrong several times in the past.

I'll keep that in mind.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

Keep in mind that nVidia and Via quit at the same time along with AMD. This is really an issue of Intel controlling BAPCo and selecting and weighting the benchmarks to show Intel's products in the best possible light. That probably means a focus on single-threaded performance with minimal emphasis on graphics performance, which would definitely disadvantage competitors who optimized for multi-threaded performance and GPU speed.

Intel is manipulating the market in underhanded ways? :aaaaa:

nVidia leaving is kinda funny to me, though; they have their own sordid history with benchmark software and do plenty of their own Intel-like bundling stuff for developers. It's easier for them to have compatibility in advance since they'll give you plenty of nVidia cards to work with if you do the whole "the way it's meant to be played" thing, although someone will surely point out that nVidia and ATI both have crap drivers if I don't throw that out there beforehand :)

So which benchmarks are worth a drat, then? None? If they're all gamed, what's the yardstick by which to measure performance so that an informed purchase can be made?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Sinestro posted:

:smith: Wow, I hope this isn't a trend.

Not sure how it could be interpreted as anything but a trend - continuing, not starting - they've lost on performance every generation since the Athlon 64, haven't they? Intel caught up with the architecture switch in the Pentium M and has blasted past on performance ever since.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That's good news. AMD'll hold on strong with that much console support.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Coredump posted:

What kills me is people who are cheering on AMD to succeed and become competitive with Intel again so they turn around and buy more Intel chips at what they hope will be lower prices.

I've built both AMD and Intel systems. If AMD just outright wins a generation I'll be loving tickled pink to build another AMD system; it'd be pretty nice to be able to do that. But my job requires performance and the smart money is on Intel for now, as it was in 2008 when I built my last computer. In 2003 it was a different story and I loved the AMD Athlon XP system at the time, felt like lightning. It's not about fanboyism, it's just practical decision-making based on price and performance.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Is $320 a realistic price point?

Some things that concern me - if it's expensive and it just matches Intel's current-gen chip for performance, that sucks. Intel will come out with Ivy Bridge and blow it away for enthusiast/high end usage, while remaining price to performance competitive in the $200-$300 CPU market. I'm also concerned that it might not have a similarly easy-to-use overclocking method and that it might be bottlenecked by memory performance with lower-speed RAM. If it needs faster RAM to perform up to snuff, that adds more cost compared to the inexpensive and common 9-9-9-24 DDR3-1333 that gets you to pretty much ideal performance with Sandy Bridge. But I don't know enough about it at this point, obviously, to say one way or another.

I don't understand some of the benchmark results, either, or the way they're presented as exceeding Intel; e.g. the 3DMark 11 Performance mode score. That score is slightly lower than what I was getting with my 2600K before I overclocked it, also with a GTX 580, but it's weird to even use a 3DMark score as an indicator of CPU performance. Is it because they're just done with Sandra and have to have SOMETHING? Should the whole thing be taken with a grain of salt as far as that goes, just PR (since I can personally attest that those scores are not superior to Intel's comparably priced chip, which means that's at best inaccurate reporting or at worst misleading on purpose)?

On the other hand, if it's serious, that's not exactly encouraging: if it costs more than a 2600K out of the gate and doesn't perform better, all AMD is doing is playing catch-up to a months-old processor that is going to get another iteration around the time of AMD's launch, which will make Intel the undisputed performance winner again in all price categories except very low budget systems.

Edit: I just hate to see such lofty hype boil down to meet-but-not-exceed performance. If this is a generation where they're just trying to get their poo poo together and let the fact that they're powering all the consoles subsidize the rest, okay, I hope it pays off in the next generation... But at this point it looks like what they're offering up is a slightly more expensive alternative to the 2600K that doesn't improve on it meaningfully at all. Which is great if you've been itching to build an AMD system for ideological reasons but not great if you're just wanting to put together a high performance computer without paying the enthusiast premium for the chips coming down the pipeline from Intel.

Agreed fucked around with this message at 17:45 on Jul 13, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

rexelation posted:

http://www.youtube.com/watch?v=LxlQLzOCxEc

I came across this video and thought people here would be interested. The guy compares the AMD A8-3850 with an i3 2105 in a number of games at low and high settings. He also tested the performance boost from adding an HD 6670 to the AMD system.

He left out the bench of just the discrete graphics card without the Crossfire setting. That makes it really unhelpful if you were trying to see what hybrid Crossfire offers over just using the graphics card. I don't really understand why he bothered doing the tests and left out a pretty important row for people seriously considering options for a budget system like that. It demonstrates plenty well that the i3's onboard graphics suck compared to the AMD one, but why go only partway toward demonstrating the performance of the AMD on-board versus the discrete card it's supposed to compete with (that was the idea, right?)

Agreed fucked around with this message at 13:50 on Aug 19, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yeah, I don't really understand the point of Llano; I'm still waiting to see what Bulldozer will do. But if he's going to go to the trouble to try to demonstrate something, it seems like he ought to at least get the basic design of his tests settled. I have no idea what the hybrid Crossfire thing is even really doing - whether it's offloading the rendering mainly to the card, or what. He had all the stuff right there and set out to run some tests; it just bugs me that he didn't do the last one to make it some kind of meaningful data set, I guess.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

Anandtech has a post detailing pricing and model lineup for the initial AMD Bulldozer FX launch. It looks like they're pushing the release date back until Q4, which starts in October. Intel plans to compete by releasing a Core i7 2700K CPU, which will take the top slot and push down pricing on the i7 2600K and i5 2500K.

Intel seems like they're going to win this round on price as well as performance if that's the case. AMD's lowering of expectations from 50% clock for clock to 35% clock for clock isn't really comforting, but then they just haven't been especially forthcoming with performance details at all. I don't know how to feel about any of it except that I'm pissed that they're releasing a 2700K because I'd have bought that had I known it was coming in a few months. But, then, there's always something around the corner, no point trying to time the market. I'll wait for Ivy Bridge's second stepping after the overclockers.net folks have a go at it and toast some silicon before deciding on an upgrade path, if I do choose to move on from the 2600K to something socket-compatible rather than wait a few years to build a new computer as I've done in the past.

I hope AMD's expectations management isn't ALL loving bad news. I can see the potential benefits of 256-bit floating point hardware in unison, it's frankly a bit cooler than Hyperthreading I think, but... Numbers. Well, I guess we'll get them sooner or later. Q4. Hell of a wait for this thing, and Intel's already prepared to push at the same time, I hope it isn't crap. Hope hope hope.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Star War Sex Parrot posted:

Properly-optimized PC games finally want a quad, yeah.

Are they truly multi-threaded, or are they spreading individual tasks across up to (and only) four cores in a way they can still sync up at the end, like the older dual-core-optimized games did, with the main gameplay on core 1 and then physics/AI/etc. that could be synced up on core 2? I've been wondering about that: are the games recommending quad-core CPUs spreading the full workload, or are they doing discrete threads for discrete tasks and bringing them together, sort of a ramped-up version of the dual-core usage? I know CPUs are really, really powerful these days, but could the games recommending 4 cores theoretically get a performance boost (were it required) from 6 or 8 or whatever, or are they still using the compromise method rather than totally multi-threading the game itself?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Rawrbomb posted:

DX:HR seemed to like multiple cores, and I know that BF3 is recommended to have quad cores for recommended performance.

Yeah, but I'm wondering if that's just because they've got complex processes spread out in discrete threads with core affinity and then they sync it all up at the end, or if it's because the game is more deeply multi-threaded and could run just as well (thought experiment follows) on an 8-core system with half the clock per core, assuming clock for clock parity. In other words, multi-threaded, or quad-threaded?

Programs that have long been multi-threaded and can take more or less linear advantage of additional cores tend not to have any uncertainties in the time domain. Rendering, for example. Four cores at the same speed will be almost four times as fast as one core. Eight cores will be almost twice as fast as four cores. There's some overhead involved in managing the workload, but it's not much. Games have to worry about syncing up unknown variables within a pretty short window, and so past games just parceled out specific tasks that could more easily be synced up, gave them affinity on core 2, and then put it back together. I'm wondering if games that are using 4 cores are still doing that, but with cleverer discrete tasks, or if they're actually multi-threaded yet.
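
To make that scaling difference concrete, here's a minimal Amdahl's-law sketch (Python; the parallel fractions are made-up assumptions for illustration, not measurements of any actual game or renderer):

# Amdahl's law: speedup is capped by the fraction of the work that actually runs in parallel.
# The fractions below are illustrative assumptions, not measured profiles.
def speedup(cores, parallel_fraction):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (1, 2, 4, 8):
    renderer = speedup(cores, 0.95)   # render-style workload: almost everything parallelizes
    old_game = speedup(cores, 0.50)   # old dual-core model: only physics/AI/etc. offloaded
    print(cores, round(renderer, 2), round(old_game, 2))

At 4 cores the "renderer" is roughly 3.5x faster while the loosely threaded game is only about 1.6x, and the game gains almost nothing going from 4 cores to 8 - which is basically the question I'm asking about the current "quad recommended" titles.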

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

Games have been AMD's weak area for a while; it's due to their poor per-thread performance.

How long do we let them slide on the "Intel are basically rat bastards" thing before we just come out and say that everything has been AMD's weak area for, what, five years now? You don't shrink from 25% to less than 5% of the server market despite extremely low priced hardware without some culpability. Their processors have been lagging in performance since Intel ditched the long pipeline and started building more around the Pentium M architecture. The good old days are going to be awfully hard to reclaim at this rate, and I'm worried it's going to put Intel in a position of even more market dominance that'll be like a boot on AMD's throat. How do they get up from that?

At least their graphics cards are legitimately great price-to-performance cards, and they swept the next-gen consoles. That'll... help. But Bulldozer is looking like a total disappointment at this point, maybe even a flop if they can't get the price down (and how will they do that when their yields are so crap?), and I am pretty upset about it. Every single thing that could go wrong has, both from a bad luck standpoint and from a decision-making standpoint. Yeah, Intel still hosed them over pretty bad, it would be unfair to forget about that, but god drat it.

Agreed fucked around with this message at 23:53 on Sep 29, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

AMD does still hold the x64 patent. That's a hell of a safety net. And Intel would get AT&T'd if they really tried to eliminate AMD as a competitor. But that's such a hosed up place for AMD to be - under the limited protection of a mutually assured destruction patent war, and the tenuous mercy of Intel because AMD is what protects them from being broken up as a monopoly. I just... It's crap to know that they are and will be held at such a competitive disadvantage because Intel has more money than god, and all the problems AMD has, Intel doesn't have. AMD attempts these innovations but they're stuck behind. They can't even get yields at their designed process and are having yield issues with ATI cards at the same time, while Intel is steadily progressing to smaller lithography and making basically all the right choices for power saving, etc.

AMD seems to be on life support from their "competitor" for no other reason than one really good bargaining chip (x64 and cross-licensing agreements) and the need for there to be an ostensible competitor for Intel so regulatory bodies stay off their back (anti-trust settlements and extensions under duress of the aforementioned agreements). AMD are on a leash and it's choking the poo poo out of them.

Extremely parallel supercomputers for less money with off the shelf parts are cool but they don't change the fact that confidence in the AMD brand is total crap, as expressed privately by review organizations and publicly by shareholders. Key people leaving... Find the silver lining, I don't see it.

Agreed fucked around with this message at 03:00 on Sep 30, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

They have zero alternatives. It's catch as catch can. Intel won't move to totally cut them off because Intel needs them to exist. I think the "in a world without regulations" (my interpretation of "where _____ could do whatever they want") is a little specious, no offense intended Alereon, because at any point after the glory days between Thunderbird and Hammer, Intel would actually have been able to kill AMD dead without regulation to worry about. Whereas at no point during the Thunderbird-->Hammer period could AMD have done the same. It was an impressive example of rising to the occasion and forcing competition from Intel, but that's exactly what it accomplished. Now, there is no day to seize. You said Intel makes a lot of bad decisions, but their processors are blazing and their marketshare is as close to complete as you can hope for without immediately being told to gently caress off as a monopoly.

I get what you mean when you say AMD's cheap cheap cheap processors make them a good choice for highly parallel supercomputers, but 5% vs 95% of the server market really can't be interpreted as anything but "AMD has taken a massive beating," and I would add "in no small part because their chips stopped getting faster some time toward the end of 2007 until now."

It's great that they'll finally have a chip that will compete with an i7!... It's just the wrong damned i7, they'll be two generations behind at launch, and the price category that they seem to be falling into doesn't promote their chances to reclaim lost marketshare anywhere that matters.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Jago posted:

I was looking around newegg and comparing some prices and performance on the Anandtech bench, and it seems to me that Intel/AMD are pretty competitive below 200 dollars. If you want to spend less than 150 though, AMD seems like a no brainer. Lots of 3+ core options whereas Intel has nothing at that level. No one is really OK with a dual core anymore, right?

Are you guys all only interested in chips that cost more than 200 dollars? Aren't AMD motherboards a bit cheaper as well?

With current processors, there's a very small price gap from about $500 to $650 where you can build a system based on AMD and it'll be competitive with Core 2 Quads for per-thread performance. Below that, you can't beat pre-built for performance/$ (that segment's Sandy Bridge i3 processors, with their high clocks, are very solid performers for lower end systems even though they're dual core with hyperthreading). Above that, you ought to be building an Intel based system unless you just like being several years behind for performance.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

frumpsnake posted:

No, you're comparing the 965 to the old i3-530 rather than the Sandy Bridge-based i3-2100, which as of writing is $5 cheaper than the 965. Comparable boards are about the same price, +/- $10. Let's call it even.

http://www.anandtech.com/bench/Product/102?vs=289

In many real world apps and most games (particularly some not listed, such as Starcraft 2), the dual core Intel equals or handily beats the 965. And draws a lot less power doing so.

AMD does provide good value, but you've got to go lower than the i3.

joe944 posted:

How about the $89 price point, which is how much I just picked up a Phenom II 925 for. :p

To someone who isn't brand loyal and thinks Intel has hosed AMD over with effectively-forced bundling and other really underhanded market tactics, it is not even remotely comforting to see that AMD provides a means to put together a computer that is performance competitive with an Intel computer from late 2007, for a price that's about $200 away from dramatically better performance (you know, 2011 performance). And the current indications about Bulldozer seem to be that by the time they DO get it out, they'll be an even narrower price gap away from providing performance that's competitive with Intel chips from two generations ago. It's an ugly picture.

It seems unrecoverable for AMD right now. Every bit of news that comes out is bad news. Intel cedes markets where they don't mind doing so because they don't have to worry about getting busted up as a monopoly. If AMD is betting the farm on ARM platform computing, maybe things will change, but right now I don't see how anyone can be anything but pessimistic. What keeps AMD going? The fact that computers don't have to be blazing fast for most things and there's this narrow region where they are price competitive. But that's becoming tenuous as hell, the price gap has narrowed substantially. How are they going to hit that same price point with yield difficulties that are clearly more than just an annoyance, while Intel is just quietly continuing to push toward superior manufacturing processes, performance increases, architectural improvements (as opposed to AMD's architectural gambles), and power efficiency?

The one advantage AMD enjoys could be taken away from them if Intel were stupid enough to do it. If Bulldozer had come out when it was supposed to, that would have been a foothold because there are some people who would probably buy on principle, and the server architecture might have actually been pretty cool/useful, but Intel has had a year to keep pushing the envelope.

I am glad I don't have AMD stock. Intel stock doesn't seem to enjoy performance relative to their market position, but AMD stock sure does take a hit comparable to their processors' dwindling penetration into key sectors. gently caress, man. This blows.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Jago posted:

I would argue that at the price points where you are choosing between a dual core Intel solution vs a 4 core AMD solution AMD is still the good choice. Still, it's a small slice, p sad.

In the end though I guess it's time to learn the Intel lines. :(

You're acting like a core is a core is a core and more cores=better always. Not at -all- true. Single-threaded performance has to do with the efficiency of the architecture itself. AMD's clock-for-clock performance is poo poo compared to Intel, hence having to add extra cores to even have a horse in the race with usage scenarios that can make use of additional cores. Intel is kicking AMD's rear end all over town at inter-process communication.

There is a reason the articles you've found are from 2010 and 2007, man.

Agreed fucked around with this message at 18:02 on Oct 1, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

All this time I've taken IPC to refer to Instructions Per Clock (which Intel is also kicking AMD's butt with). The More You Know.

I would bet that is the more common usage; a search on enthusiast forums seems to turn it up more often than the other. No happy face to put on it, really - there isn't anything AMD is better at right now except price (not price:performance, not price:power:performance, just "yeah, you can make a cheap computer if you want to with their stuff") - that narrow price category for building a complete system, or a narrow price category where I'm taking Alereon's word as reliable that AMD is the processor of choice for off-the-shelf-sourced highly parallel supercomputers (which apparently make the decline from 75%/25% server market share Intel vs AMD to 95%/5% somehow less completely devastating as a statistic).

5% in servers, 25% on Steam (where you'd expect to see representation of the enthusiast market, I'd think; that's not -too- bad I suppose, shows people are building with them at least), and the Llanos are doing okay as a processor for low end laptops, but does anyone have numbers there? The last news regarding the departure of the company's relatively shining star was that it was, paraphrasing, ohhh god whyyy; did that get any better since then? I'll pretty much take any good news at this point. No benches this close to launch, maaajor downward expectations management for yields and profit, :negative: all over. I want to see this as something other than a disaster, so if anyone with their finger seriously on the pulse of the market can see a way that their situation isn't basically "Intel's required competition under law" right now, I'd be pleased as punch, seriously.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Look, your basic assumptions about how processors work are wrong. I was trying to say that less directly; I shouldn't have. Until you understand more about what makes a given processor faster than another, you're missing the requisite groundwork to know what you're talking about. The statement "in games that can make use of more cores, turning off some cores on the same processor results in performance drops compared to not turning them off" is so obvious it's practically tautological.

In one of the articles you linked, there was one comparison that showed just how much of a per-core, per-clock advantage the previous generation of Intel processors enjoyed over the AMD processor under comparison. Specifically, they limited the AMD processor to 2 cores out of 4 and performance nosedived; when they did the same to the last-gen i7 Intel processor, it barely dropped. That's because the i7 processor's per-core performance outpaces AMD's by enough that it didn't need all of its available processing power to avoid being CPU limited.

Extrapolate that to games which are now advertising compatibility with 4-core processors. If a 2-core processor with hyperthreading exceeds the performance of a different 4-core processor, and both versions of IPC are strongly in favor of the 2-core processor with hyperthreading, what you have is a dual core processor meeting or exceeding a slower and less advanced 4-core processor. Because it's not really how many cores there are, it's how efficiently they operate.
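
To put made-up numbers on that (nothing here is a benchmark; it's just a sketch of how the arithmetic works out): throughput on a well-threaded workload is roughly threads x clock x per-clock work, so fewer, stronger cores can keep pace with more, weaker ones.

# Purely illustrative: hypothetical chips, not real benchmark figures.
def rough_throughput(threads, clock_ghz, work_per_clock, smt_yield=1.0):
    # smt_yield discounts hyperthreaded logical threads relative to full physical cores
    return threads * clock_ghz * work_per_clock * smt_yield

four_weak_cores = rough_throughput(threads=4, clock_ghz=3.0, work_per_clock=1.0)
two_strong_cores_ht = rough_throughput(threads=4, clock_ghz=3.1, work_per_clock=1.5, smt_yield=0.65)
print(four_weak_cores, round(two_strong_cores_ht, 1))  # 12.0 vs ~12.1: the dual core keeps up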

Systems are a fair point. You can build an AMD system, complete, that has about as good a chance of overclocking as any other, for that little niche between $500 and $650. Okay, got yourself a computer, and Steam's got enough of a view of things to make some sense out of it, apparently you would not be alone. A quarter or a bit better of the market are running AMD too. How many of them built it, god knows; AMD does still ship in pre-built computers, it's just not a very good option for desktops, though Llano is pretty cool and maybe some casual Steam gamers are doing the whole integrated GPU thing. AMD does have something vaguely sexy going on there but if it carries them I will be very surprised, as Ivy Bridge is bringing Intel much closer, fast, with better power management. Llano will never be and was never intended to be a desktop replacement; anyone think AMD would be in a good position if they lost inexpensive, low performance laptop buyers?

I digress. Back to systems and the DIY thing. Raise the budget for building that system by a couple hundred bucks and it becomes completely no contest because there are architectural flaws in AMD's current-gen desktop CPUs (pre-Bulldozer) which prevent them from exceeding 4GHz in a 64-bit operating environment, while any 2500K made will not only dramatically outperform them in pretty much anything at stock settings, but also overclock trivially to between 4GHz and 4.3GHz, with most of them able to hit 4.4GHz or 4.5GHz before there's any kind of a voltage wall. That clock disparity, which is, again, an architectural ceiling on AMD's 64-bit processors at least until Bulldozer hits, is a total slaughter. Intel doesn't just beat AMD clock for clock this generation, it beats modern AMD processors clock for clock with its Core 2 processors. Substantially.

To demonstrate further that just having more cores doesn't mean much, look at what happens when an AMD Phenom II X6 1100T BE 3.3GHz goes up against a 2500K 3.3GHz. The 2500K has 4 cores, no hyperthreading; the AMD has 6 cores. Same clock speed, just dramatically better efficiency on the part of the 2500K. It's hard to watch, because that's 6 of AMD's best pre-Bulldozer cores up against 4 of Intel's second best pre-Ivy Bridge cores, and it is not at all flattering to AMD in terms of price:performance or price:performance:power. Add overclocking to the mix and it's a total blowout.

What AMD has right now is not a price:performance advantage, just "for $150-$200 less than a system that performs dramatically better, you can build a computer that won't bottleneck you for now provided that you overclock it." In business terms, there's a niche there where they can do it cheaper. Gotta be better, cheaper, or both, right? Intel wins the lower end market in desktops delivering both, and the higher end market in desktops and servers by delivering drastically better performance at a price that is still affordable.

For DIYers I feel it's worth the extra $150-$200 to get such a huge performance leap, personally, but if all you care about is "will it play a game that can use 4 cores" then I'll bunt and acknowledge, for whatever it's worth, that yes, for now it will. But so will the Core 2 Q6600, and the Q6600 is from 2007, clocked at 2.4GHz to the Athlon II X4 645's 3.1GHz.

I hope it's clear that it isn't about just having cores, it's about how the processor and chipset as a whole integrates the processing horsepower, the multiple core architecture, and the system functions like the path from RAM to the processor and many other factors into a package which can efficiently perform calculations to do whatever. If the only advantage AMD has in the consumer market is in DIYers who are 100% stuck with a budget, that's a really meager niche. And those guys could be in for some real disappointment in a year or two when their computer sporting the 4-core processor that started off with four year old performance the day it was built stops running games so well, and it's not necessarily so easy to get them near or to that 4.0GHz maximum clock to try to buy some more time. Meanwhile Sandy Bridge is still humming along just fine (and Intel's made better, faster, less power hungry processors with better integration of everything, including the damned GPU where AMD has put so much of its weight).

Bulldozer was supposed to fix the situation and it was supposed to do so back when the situation wasn't quite as hosed as it is now. Obviously it hasn't worked out that way for what are, it is becoming clearer, a lot of reasons.

So the current generation of AMD processors are a swiftly eroding stopgap, and Bulldozer is a day late and a dollar short unless every bit of information we've got so far is wrong (and some of it definitely isn't which suggests the rest of it very well isn't either - there's no upside to the well-respected honcho heading for the hills, or admitted yield issues and sharp revenue adjustments on their old AND new production processes, and daily stock dives taking the company's value into the toilet).

Agreed fucked around with this message at 10:23 on Oct 2, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Caseman posted:

Good lord. Thanks for taking the time to type all that. Having just bought a $200 990fx motherboard, I feel pretty silly at this point. I'll have to stick to AMD for the time being because my investment has already been made. I'm hoping that the architecture changes in bulldozer going forward will provide better real-world performance in a way that's not immediately measurable with the benchmarks we currently use. It's a longshot and a pipedream, I'm probably just trying to justify the money I spent.

I know I paint a really bleak picture there, and there are lots of reasons to try to maintain realistic expectations even though that means seeing a brand we all need to perform strongly instead get mired in all kinds of poo poo... But nonetheless, there is still some hope; we don't completely know what Bulldozer performance will be like. It's an architectural gamble in a lot of ways, and their leaving BAPCo was justified, even though it stings the image of the company (taking their toys and going home because the benchmark software makes them look bad - justified or not, doesn't go over well for consumer confidence). SysMark is Intelmark now.

Once it launches and Anandtech gets their hands on it to really put it through its paces, we'll know what we're dealing with. AMD's stock is suffering, and I've presented a lot of reasons I think that has been the case, but it still remains to be seen what we can really expect from the new architecture. All it has to do is be close enough, within a certain price range, for AMD to hold on. And I do think Llano is still viable if they can maintain performance on it going forward, though Ivy Bridge will probably provide a substantial challenge to them because of the dramatic efficiency improvement Intel has managed there.

AMD calling its CPUs with an integrated GPU "APUs" (which should be reserved for the special class it denotes, accelerated processing units) is kind of bullshit and they know it, but nonetheless if more things are coded to take advantage of the sort of fast processing that can be done on GPUs, AMD might get some unexpected advantages in the future based on heavy investment on the die. But so might Intel. Someone said recently that Intel is very good at seeing AMD's good ideas and capitalizing on them, and that is supremely true.

Nobody knows what ARM is going to do. That could be a game changer, but AMD has totally different competition there.

I tried to start this post not sounding like I'm :smithicide: at AMD's chances, I hope I haven't failed to hold onto that over the course of it. There's something really dreary about having to qualify every potentially positive thing with how it could actually not work out so well, when the same is not true of Intel's stuff and hasn't been for a few years. AMD can't afford a Netburst gently caress-up, Intel could. So let's hope it's not like that.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

frumpsnake posted:

Not on Intel's two cores.



That is loving brutal. Jesus. Bulldozer, please be at least as good as Lynnfield, because that hurts to look at.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

freeforumuser posted:

The whole BD debacle is looking eerily reminiscent of the Phenom I launch. Lots of official slides on how BD is better than Intel blah blah blah, but no mention of any actual performance or power consumption, a useless 2560x1440 gaming comparison vs a 990X and the 8GHz LN2 OC, and leaked underwhelming benchmarks - which is exactly what happened with Phenom I.

In fact, I would take AMD's silence over BD performance as evidence of BD being nowhere near as good as AMD's hype makes it out to be. You simply don't keep mum if you have a killer product after losing consistently for 5.5 years, and you can't hide it from your $100+ billion competitor that knows everything through good ol' corporate espionage.

I wonder how it goes from "60% clock for clock, this thing is gonna kill!"... to "no, 50% clock for clock, that's still good!"... to "okay, 35% clock for clock... look, it is really hard to make a processor" :smith:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Reads like the first one. But I'm all for unexpected good news. Companies usually don't sit on it after crowing in the early stages about how profound an improvement it's going to be. Get quieter and quieter closer to release while bad things happen, then jump out with confetti and a sign saying "Gotcha!" while your stockholders go all "et tu" on you for ruining that portion of their portfolio?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

gemuse posted:

Where CPU performance is concerned, it does mean Instructions Per Clock. While a fast CPU (and especially fast/efficient core/socket interconnects) may speed up Inter-Process Communication, that is mainly a software implementation issue and not a metric for CPU speed. Unless there are benchmarks showing a marked performance advantage for Intel vs. AMD in Inter-Process Communication, which is independent of other characteristics of the CPUs (such as memory, cache or integer performance), I think Agreed is confused.

That's a definite possibility, though I was under the impression (could be wrong!) that there's a significant factor in the actual instruction sets/microarchitecture in terms of how they can be utilized by operating systems. E.g. AMD's K10 vs. Bulldozer, and Sandy Bridge.

K10: Superscalar, out-of-order execution, 32-way set associative L3 victim cache, 32-byte instruction prefetching

Bulldozer: Shared L3 cache, multithreading, multicore, integrated memory controller

Sandy Bridge: Simultaneous multithreading, multicore, integrated memory controller, L1/L2/L3 cache. 2 threads per core.

Compare that to Intel's post-Netburst return to the P6 style with the Core architecture - 4 issues wide, ditched hyperthreading and introduced macro-op fusion, an increase to 64 KB L1 cache/core split between L1 data and L1 instruction.

I thought all that would have a pretty significant impact on Inter-Process Communication because of the fairly dramatic differences in how the microarchitecture structures execution. But if I'm wrong, tell me so, always happy to learn.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

How does the first Bulldozer review describe performance? "Downright tragic."

If those numbers are anywhere near what the platform should be producing, AMD is hosed. It's simply not performing worth a drat in multi-threaded applications, and that should be its strong suit. The fact that in some tests it's even losing to a Phenom II leaves me at a loss for words.

Yeah, this is just incredibly bad news. I am still waiting for Anandtech to get their hands on it, but it's looking more and more like AMD is delivering this product effectively stillborn. Meanwhile, tick, tock, tick, tock, tick, tock...

Edit: I can't believe how miserable the floating point performance is. How in the gently caress can they put in two flexible 128-bit floating point hardware units per module and get results that are so deflated? And its overclocking seems to be "about as good as Sandy Bridge for clock rate" with worse performance and ungodly power draw. Intel's improvements to hyperthreading, optimizations for its usage, and processing efficiency mean they're wiping their asses with Bulldozer's floating point performance, and if there's anything, ANYTHING it should be able to dominate at, it should be that. Some kind of absolutely tragic mistakes and bad decisions and poor guesses and rotten luck combined here; the modules thing appears to be just totally screwed. I was concerned about the way their new execution process might look when the gaming benches suggested that AMD has much wilder variance when intensive processing that involves more guesswork in the pipeline starts going on, but this really underlines that their module idea is a rotten egg.

This is just poo poo. They must have spent a lot of time trying to make this into something worthwhile but it's just more disappointment and they can't afford that. I dread the opening of trading if this information is generally out by Monday. God drat.

Again, I am still waiting for official stuff, but silence from AMD is absolutely deafening at this point given the nature of the information coming out. Fuuuuuuck.

Agreed fucked around with this message at 06:37 on Oct 9, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Civil posted:

I thought Intel didn't cut prices of their chips. But yeah, AMD is getting a serious black eye off this one. I hope it doesn't mean the beginning of the end. Intel needs a competitor.

They've announced a 2700K which will take the price spot of the 2600K; it should just be a binned 2600K if I understand it correctly, but that would trickle down.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I wonder if anyone at Intel feels silly for having done that, or if it'll just be at worst meaningless to their bottom line and at best prompt enthusiasts to switch from 2600K to 2700K for no reason, or bring in more people in the in-between now that the in-between exists. Hyperthreading really does have performance benefits for everything thanks to how well modern operating systems integrate it and the architecture's efficiency at implementing it; fewer wasted cycles is a pretty cool trick. If the 2600K gets close enough to the 2500K in price, it could start being a decent recommendation for more general usage after all instead of "get this if you do time-sensitive content creation or a lot of rendering and not for any other reason."

So AMD's switched gears already, before dropping this turd of a processor on the market, to hyping up its improved successor? Generously assuming that they can improve the performance dramatically enough to make it competitive, how many years will that have been between the announcement of the new architecture and an implementation on the market that isn't non-competitive junk?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Thought they already announced (back when there was still some doubt as to Bulldozer sucking) the 2700K taking over the 2600K's price point. If it changes nothing else, it's supposed to bump that down. They could say "did we say that? Oops, typo, meant ON TOP OF the 2600K, hah, weird us right?" because there's no reason not to, I guess.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

They could also just stop selling the 2600K.

They better not :mad:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

That your portfolio talking? :ohdear:

This is 2011; I guess maybe, given the new and unique nature of the architecture, it really is possible that there is some serious OS miscommunication... Intel's Hyperthreading works so well in part because operating systems integrate it so well. But a last minute save that turns it from poo poo to fine? When they've already, as previously mentioned, shifted gears to talking up the successor, before this even hits?

I'm still dubious, but god drat it, I'm not going to be a total pessimist here. I'd love for Intel to have some real competition, and frankly I think the module idea is a neat one, if it works, which I guess is what they're trying to fix.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Devian666 posted:

Between reading tomshardware and anandtech I'm satisfied that AMD have almost caught up to their main competitor the Phenom II X6 1100T.

Brutally true. Jeeesus. :suicide:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I think the whole thing is pretty ridiculous, but what kills it completely is the power draw. What the gently caress is up with such an absolutely massive difference in power draw under load compared to Intel?

Even if hardware sites didn't remind everyone about the Phenom II X6, these numbers would still look bad - there's no way to fix that - but with that on there as well (which is just due diligence, frankly), it really is a case of completely loving the dog. :bang:

Who thought up the modules=cores idea? Why? If it was an engineer I don't understand it, if it was a marketing guy fire the fucker now.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The -idea- isn't stupid, it's actually pretty cool. 2x 128-bit or 1x 256-bit FP, that's neat. But advertising modules as cores is awful. At their best these will perform like modern 4-core processors. A module is not a core in the sense that people expect something WOWEE from an 8-core processor. Though it sure uses power like an 8-core...

movax posted:

Hopefully the Radeon 7000s own face (heh, we're already back at the 7000 numbering there) and can help keep AMD solvent.

They won every console in the next generation, so unless the processors literally kill them, the graphics cards should do well.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

movax posted:

:laugh:

This is going to be turtles all the way down, isn't it?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The i3 is a really good processor, though. Is AMD genuinely competitive in the region where you can get a Dell with an i3 for like $450?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The server market is where they needed to regain some ground, having declined from 25% marketshare to 5% now, and they VEEEERY clearly aren't going to do that, not with power draw where it is. And I doubt many people making supercomputers are going to be sticking these things in them either, which would be pretty bad, as apparently that represents a substantial portion of the current 5% they hold.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Yeah, basically the best case scenario is power draw that's not exactly attractive and performance that doesn't offer any incentive for switching from Intel to AMD. Or it could eat poo poo and be terrible and just a really obvious wrong choice. But the big problem is just no reason to switch and every reason to stay, even if they can squeeze out that extra 6% with the kernel patch. By the time Piledriver makes what I hope to god is a better showing than this, at least, Ivy Bridge will have already happened a while ago, bringing superior thermal performance and all that along with it, and Intel will be on to the next thing. They're really well positioned at this point; they basically need to gently caress up badly for AMD to have a shot, and while they aren't infallible, they already had this problem and won't be revisiting it.

I haven't read about NetBurst in forever, but Wikipedia talking about an architecture from 2000 reads like a find-and-replace description of what's not to like about Bulldozer. It isn't a totally analogous situation, but it's awfully similar. Remember when Intel was talking about super high clock rates? TDP >100W for a single core processor when the higher clocked models were considered? Finally just couldn't dissipate the heat at all (despite the ridiculous turbine-like shrouds they used to have for them)? Hell, didn't AMD take the world record clock rate here from a Pentium 4?

Then they brought out the Pentium M, which proved that the P6 architecture's efficient, shorter pipeline and dramatically superior thermal performance was still viable with more modern fabrication processes, and life's been good since. That was back when AMD's XP processors had a 12-stage pipeline compared to the 31-stage pipeline on the later Pentium 4 processors. The story of AMD and Intel sucks these days.

Agreed fucked around with this message at 04:59 on Oct 13, 2011

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Well, the actual idea at the root of that is legitimate. Consumers don't benefit from a monopoly; without competition there's not much to keep a totally dominant company pushing the envelope and keeping prices sane. However, acting to prop up a really, really bad product is the panicked response to that underlying sentiment, and it isn't effective. People aren't going to act as a class to boycott a far, far superior price:performance product, and the industries that purchase parts at a larger scale certainly aren't going to say "oh, dear, it seems AMD has lost a number of product generations in a row, we better put their power-hungry underperforming parts in our systems so they can build better stuff!"

Intel's timing has been pitch-perfect ever since returning to the P6 architecture and widening the lanes. Core onwards has been exactly the right direction to both match and shape industry demand and larger market trends. Now, their actions during the period where their processors were poo poo are pretty despicable in my opinion; they went totally knives-out to keep as much of a boot as they possibly could on AMD's neck with bundling practices and sweetheart deals with major brands that could be viewed as legitimately anti-competitive. I seem to remember a decision in AMD's favor in the Euro markets to that effect, but I could be off... Still, ever since then it's been a continued push towards power efficiency. That really pays off if you do the math on the useful life of a processor; the one-time expense of buying Intel is negated by the substantial power savings. And they just keep managing to shrink and improve, shrink and improve, so their power usage keeps going down, down, down and performance keeps going up. Hell, Ivy Bridge looks like it's going to be able to fit an actual quad into a laptop without killing the power efficiency or putting out stupid high heat levels. They're looking at putting out a powerful, quad-core processor that has a TDP as low as the old Pentium M that started them back down the road to (legitimate) success. That's remarkable.
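
To show what I mean by "do the math" (every input here is an assumption I'm making up for illustration, not a measured figure, and how far the savings go obviously depends entirely on usage):

# All inputs are assumptions for illustration only, not measured numbers.
watts_saved_under_load = 80       # assumed load power difference between two chips
price_per_kwh = 0.12              # assumed electricity cost in USD

def lifetime_savings(hours_under_load_per_day, years=4):
    kwh = watts_saved_under_load / 1000 * hours_under_load_per_day * 365 * years
    return round(kwh * price_per_kwh, 2)

print(lifetime_savings(4))    # desktop loaded a few hours a day: ~$56 over four years
print(lifetime_savings(24))   # server pegged around the clock: ~$336 over four years

It matters a lot more for a rack of servers running flat out than for a single desktop, but it's never zero, and it compounds across every machine you deploy.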

More to the point, that's the direction things need to go in. AMD's reversal here is totally baffling. It would be one thing if it offered performance commensurate with its power consumption. It doesn't. It's dramatically in the other direction. (Edit: For their desktop and server parts, anyway. I know they've got a pretty slick competitor to Atom, which is way, way in the other direction, but obviously not intended for anything but super low power applications. I do still think Llano has promise, but Ivy Bridge is going to push them up against a wall and make them fight to keep it relevant, with Intel devoting a lot more silicon on their lower end processors to their GPU than they did with Sandy Bridge; hopefully AMD can at least hold on to what it's got.)

Agreed fucked around with this message at 16:58 on Oct 13, 2011


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Poor bastards. But, hey, you're right, this isn't like a Pentium 4 or anything ridiculous. It's more like they're picking a quad-core Pentium D, but with modern instruction sets :smithicide:
