Nintendo Kid
Aug 4, 2011

by Smythe

freeforumuser posted:

Netbooks: Release a single core SB. Zacate? What is that again?

Considering Atoms are already dual-core, it'd be silly to replace them with a single-core Sandy Bridge. No one wants to be stuck with a single core today.


Nintendo Kid
Aug 4, 2011

by Smythe

nodm posted:

Which didn't amount to a whole lot in the grand scheme of things, since they didn't really have the brand recognition to command a high selling price, or enough fabbing capacity to grab a large market share. I think they owned something like 22% of the consumer market while their CPUs were murdering the competition.

I dunno how accurate this chart is (it's based on benchmarks submitted to a benchmark site), but it seems to say AMD wasn't that low back in the day:


(market share chart from http://www.cpubenchmark.net/market_share.html )

Nintendo Kid
Aug 4, 2011

by Smythe

Civil posted:

Sufficient gaming performance doesn't require an i5/i7 - most current games are still playable on a 2/3/4-core Athlon II. I'll eat one of my socks if these upcoming benchmarks actually outpace Intel's current line, though.

Yeah, the fact they phrased it that way made me worry too.

Nintendo Kid
Aug 4, 2011

by Smythe

freeforumuser posted:

Far Cry 2 - 1080p DX10 max
FX-8150 = 111 avg, 23 min
i7-965 = 126 avg, 75.2 min

This is just confusing: how can the FPS vary by almost 90 for the AMD chip while the Intel chip only varies by about 51?
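(For anyone eyeballing it: the "variance" here is just average FPS minus minimum FPS. A quick check of the quoted numbers, sketched in Python:)

code:

# Spread in the quoted Far Cry 2 results: average FPS minus minimum FPS.
results = {
    "FX-8150": (111, 23),    # (avg, min) from the quoted post
    "i7-965": (126, 75.2),
}
for chip, (avg_fps, min_fps) in results.items():
    print(chip, avg_fps - min_fps)  # FX-8150 -> 88, i7-965 -> 50.8

The averages are close; it's the worst-case dips that make the AMD number look so bad.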

Nintendo Kid
Aug 4, 2011

by Smythe

hootimus posted:

Man, this is beyond pathetic. AMD really has no hope of regaining any lost ground. Especially considering Windows 8 is coming soon and will run on ARM, the old battle of Intel vs. AMD will just become irrelevant. It's going to be Intel vs. ARM. No one in their right mind is going to buy this lovely loving processor.

Intel vs. ARM isn't even a battle. Intel laptops can run for 7 hours (MacBook Air, MacBook Pro) or longer (netbooks), all while having way more computing power on hand than any ARM chip.

The latest, most cutting-edge eight-core ARM chips are on par, performance-wise, with a Core 2 Duo from 2006, and draw just as much power doing it as those Core 2 Duo chips did. They cost more, too. It would take a radical change in a lot of things to get comparable performance for everyday, non-trivial Windows applications on the ARM platform, and if things actually started getting close, Intel could simply license ARM themselves and start cranking them out again.

Nintendo Kid
Aug 4, 2011

by Smythe

roadhead posted:

Also, I love the little A8-3850 in my HTPC - so what AMD lacks in the low-volume enthusiast gamer market perhaps can be made up in the "I want the cheapest computer you have that plays games" segment at Best Buy.

A prebuilt Core i3 is that already.

Nintendo Kid
Aug 4, 2011

by Smythe

Hog Butcher posted:

That A8's APU's going to blow the HD3000 out of the water. A prebuilt with a video card in it'll beat it, but I think the only place AMD's got a chance right now's if they're competing with the HD3000.

Which means if the Ivy Bridge chipset's better than the 6520G, I'm out of ways to even defend AMD. :v:

Prebuilts come with video cards. So yeah, AMD is still screwed.

Nintendo Kid
Aug 4, 2011

by Smythe

THAT drat DOG posted:

Some new developments:

http://quinetiam.com/?p=2356

Site claims to be working on a patch that increases performance by ~40% across the board. They claim their patch forces Windows 7 to recognize the Bulldozer processor as having 8 cores. With the patch, their PassMark CPU score increased about 4,500 points, from 8,500 to 13,000.

What do you guys think, bullshit or are they onto something?

Isn't the whole problem that Windows is recognizing it as 8 fully independent cores?
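To put that in concrete terms, here's a toy Python sketch (purely illustrative - this is not the actual patch or the Windows scheduler; only the pairing of cores into modules reflects the real chip) of why a scheduler that knows Bulldozer's 8 "cores" are really 4 modules places threads differently than one that doesn't:

code:

# Toy sketch: Bulldozer presents 8 integer cores, but they're paired into
# 4 modules that share an FPU and L2 cache. Packing two busy threads onto
# one module hurts; spreading across modules first helps.

MODULES = [(0, 1), (2, 3), (4, 5), (6, 7)]  # sibling logical CPUs per module

def pick_cpu(busy):
    """Prefer a CPU whose module sibling is idle, like an SMT-aware scheduler."""
    for a, b in MODULES:
        if a not in busy and b not in busy:
            return a  # whole module free: no shared-FPU/cache contention
    for a, b in MODULES:
        if a not in busy:
            return a
        if b not in busy:
            return b
    return None  # everything busy

print(pick_cpu({0}))  # -> 2: the next free module, not CPU 1 (0's sibling)

A scheduler that sees 8 interchangeable cores might happily put the second thread on CPU 1 and eat the contention.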

Nintendo Kid
Aug 4, 2011

by Smythe

Longinus00 posted:

Hey, monopoly markets are fine. Look at how much innovation is going on in the ISP/telecom industry; we keep getting more and more bandwidth and better prices, amiright guys? Hell, IE6 was so good Microsoft didn't even need to upgrade it for years and years.

But I did get my speeds upgraded this year out of the blue? The only "competition" my cable provider has here is 3 Mbps DSL. AMD's been about as effective a competitor against Intel as DSL has been against cable for at least the last 2 years - which is to say, not very.

And IE6 wasn't upgraded because Vista was supposed to be out in 2004 instead of 2006. Take a minute to remember back, too: all the other browsers were crap compared to IE6 until Firefox finally went stable in 2004. IE6 was even the most standards-compliant browser for several years.

Nintendo Kid
Aug 4, 2011

by Smythe

Longinus00 posted:

That's great news to all the AT&T and Comcast/Verizon customers that get faster speeds and lower bandwidth caps! It's also nice that IE6 was so standards compliant that when standards-compliant browsers came about, none of those sites worked with them, and the later IEs have an IE-compliant mode (that is to say, IE6 wasn't standards compliant; it's just that sites were forced to work with it).

Why yes, when browsers more standards-compliant than IE6 came out, they were more standards-compliant than IE6. How insightful! You do realize that Microsoft's plan was to have IE releases tied to OSes, and that XP SP2 happening screwed everything up and delayed Vista, right? Microsoft actually halted development of the next OS for a decent period of time to revamp XP with SP2. IE7 was due to come out at roughly the same time that browsers on par with IE6 were finally appearing.

Seriously, just because IE6 turned out not to have kept up with web standards for 10 years after its release doesn't mean it wasn't the best at the time (2001) and for several years after. Hell, Bulldozer was supposed to launch in 2007 originally, right (not with the same name, but the same concept)? AMD's suffering the same kind of problem MS did with IE: schedules go out of whack and all that.

Nintendo Kid
Aug 4, 2011

by Smythe

Longinus00 posted:

I'm not sure you're getting it, unless you were making a commentary on the differences between de jure and de facto standards. IE6 was purposely not standards compliant, but because of its market share all the sites had to code to its standards, thus screwing other browsers with much smaller market share. AMD is also in a totally different position than Microsoft, because BD is not some vehicle with which it is going to change the CPU standards that other people will have to deal with later.

IE6 was the most standards-compliant browser when it was released; that's a fact, and your Microsoft bashing doesn't change it. No browser was fully standards compliant before IE6, and in fact there are still none now that are compliant with everything. And there was none as compliant as IE6 until many years after its release. Nor was "introduce proprietary features" a Microsoft-only thing: Netscape was especially bad about trying that, and adware-era Opera had its own special things it supported.

AMD is in fact trying to make the case that processors should be designed like Bulldozer: modules of two integer cores sharing cache and an FPU. And of course, throughout their history Intel and AMD have each tried to introduce new instruction sets and convince people to use them (MMX, 3DNow!, etc.). Just because they're failing doesn't make it not the case!

Nintendo Kid
Aug 4, 2011

by Smythe
How would you even break up Intel? Not let the desktop and laptop CPU teams talk to each other?

Nintendo Kid
Aug 4, 2011

by Smythe

feedmegin posted:

ARM might want a word with you on that one...

ARM's nowhere near close enough yet. How many ARM servers on the internet and private networks (banking, etc.) versus x86 servers, or even mainframes and SPARC? How many ARM machines in your ATMs, at government offices, in the military, and so on, versus x86?

A very big part of the reason that Intel would be under scrutiny if AMD went out of business is that so much of the world relies on x86 CPUs.

Nintendo Kid
Aug 4, 2011

by Smythe

Agreed posted:

Windows 8 is supposed to run on ARM, right? That could legitimize it further, considering that ARM and related processors do power a number of consumer devices.

It could, but the necessity of recompiling everything means you don't get the legacy app support that eased the move to x86-64, or to the various expansions of the instruction set since the original 8086/8088. Not to mention there are an awful lot of x86 devices that don't run Windows.

And even within Windows, look how much of a problem it was for many businesses and people that 64-bit Windows no longer ran 16-bit applications natively.

Nintendo Kid
Aug 4, 2011

by Smythe

streetgang posted:

Hey, since we have all these faster processors cranking out, how does an 8-core affect gaming? Do half the MMOs and PC games out there even have coding to use an 8-core?

I think all the benchmarks have shown "8-core" Bulldozer chips losing handily to 4-core Sandy Bridge, and even pre-Sandy Bridge chips, in games.

Nintendo Kid
Aug 4, 2011

by Smythe

Bob Morales posted:

Are there any cases where disabling HT on the latest Intel processors increases performance?

If you're still running XP SP2 or earlier, the OS's scheduler has problems handling Hyper-Threading correctly, if I remember right. An edge case, to be sure.

Nintendo Kid
Aug 4, 2011

by Smythe

Combat Pretzel posted:

Holy crap. Are they going to blame that on scheduler problems? I wonder how that ARMA benchmark would look like with a 2600K in play, for 4C8T SMT.

Well, it probably IS a scheduler problem, isn't it? Fixing it would probably only smooth out the spikes in both directions, though, not increase overall performance.

Nintendo Kid
Aug 4, 2011

by Smythe

Bob Morales posted:

What are the sheer odds that a startup could make an x86 chip?

Are they greater than another platform coming along (ARM+Linux or something) and helping tablets/etc. kill the desktop? Will desktop PCs remain a niche for content developers, power users, and business computing?

"Desktop PCs" really includes laptops you know.

And there's massive inertia in the market, too; it's not as if everyone buys a new computer yearly - many people only buy once every 5 years or so. Say that starting tomorrow morning no x86/x86-64 computers were ever sold again, and everyone who would have bought one bought an iPad or Android tablet instead: it would take 4, maybe 5 years for those to outnumber x86/x86-64 computers, and probably a decade to eclipse them fully.
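Rough math on that, as a sketch (the retirement rate is a made-up assumption, not real market data):

code:

# Back-of-the-envelope sketch: if x86 sales stopped cold and every retired
# PC were replaced by a tablet, how long until tablets outnumber the
# surviving x86 installed base?

x86 = 100.0          # installed x86/x86-64 base, arbitrary units
tablets = 0.0
RETIRE_RATE = 0.12   # assume ~12% of the base is retired each year

year = 0
while tablets <= x86:
    retired = x86 * RETIRE_RATE
    x86 -= retired
    tablets += retired   # every retirement becomes a tablet purchase
    year += 1

print(year)  # -> 6: even with zero x86 sales, the crossover takes years

Tweak the retirement rate and the crossover moves, but it never happens overnight; that's the inertia.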

Nintendo Kid
Aug 4, 2011

by Smythe

Aleksei Vasiliev posted:

Seeing as it directly says they are realigning to low power processors, even if they aren't pulling out of x86 they won't really be much competition for Intel anymore.

As if they're much competition for Intel anyway?

Nintendo Kid
Aug 4, 2011

by Smythe

Ryokurin posted:

They may very well be, as that market is going to get smaller over the years, or at the very least it will become harder and harder to justify an upgrade every 2-3 years as it has been. Either way, I don't see it happening in the near term (that is, the next year), as the alternatives still need work. It would be different if they hadn't sold off their ARM division a few years back, still made their own memory instead of rebranding, or had someone who could execute their low-powered chip production properly. They owe it to their shareholders and their future to take a hard look at where they see things going.

There really isn't a good reason to assume the market for x86 computers is going to get smaller - only that it will keep growing while other categories grow faster alongside it, without shrinking the x86 market.

Nintendo Kid
Aug 4, 2011

by Smythe

Ryokurin posted:

I was talking more about the desktop computer market than x86 as a whole. Notebooks already outsell desktops, and tablets will eventually become a bigger part of the market. Desktops won't go away, but they will shift toward a server role, or toward heavy-duty tasks that need higher resolution or more power than the average notebook/tablet can provide.

Well, desktops have been outsold by notebooks since 2005 or so, and if I remember right notebooks are currently about 65% of computer sales - might even be 70% or more.

And frankly, AMD has done even worse in laptops than in desktops, since the whole power-usage problem is especially apparent and annoying when you have a battery that runs down faster and a machine you're touching directly that heats up more.

Nintendo Kid
Aug 4, 2011

by Smythe

Spiderdrake posted:

Didn't AMD make gains in notebooks and release chips people actually want to buy, as opposed to their desktop plan of sucking on every possible level? Several benchmarks show the power/battery life of Llano is pretty much fine.

They just, you know, can't ship enough product. This really lines up with everything written about them. It's sad seeing a company mismanage itself out of a market, but then Intel hasn't exactly been forgiving for the last five years.

They've made minor gains, but an AMD notebook is still all but guaranteed to perform worse and get shorter battery life than a similarly priced Intel one. Being "fine" doesn't help against "great".

IIRC Intel only dipped from 86% of laptops to 85% recently.

Nintendo Kid
Aug 4, 2011

by Smythe

Beef posted:

They are rebranding, simply adding marketing bullshit to Kingston RAM.

I was about to ask how they were making their own RAM without fabs or (to my knowledge) a RAM design team, but your post beat me to it with the explanation.

Nintendo Kid
Aug 4, 2011

by Smythe

Shaocaholica posted:

For FPU intensive work, does the extra hardware in BD really add any performance? How does it fare for something like folding at home?

No, and terrible.

Nintendo Kid
Aug 4, 2011

by Smythe
When you can get a 2 GB video card and 8 GB of system RAM in a fairly cheap laptop, it'd be rather nice if the consoles we'll be using for the next 8 years had that much to work with.

Nintendo Kid
Aug 4, 2011

by Smythe

wipeout posted:

Fanboys are retarded - but none so perfectly as a guy I saw on a forum who bought a brand new Intel P4 EE and prised the heatspreader off it, not realising it was soldered on. £900 dead chip.

Reminds me of this kid who, in 2011, was running XP Home (32-bit, of course, since no 64-bit version exists for it) on a high-end six-core AMD CPU with 16 GB of RAM and two video cards that needed auxiliary power leads plugged in - which he hadn't done, so they were struggling by on PCI Express slot power alone - and bragging about the system.

And in a separate forum on that same site, he was asking why his machine wasn't performing as well as it should, while insisting that running a 32-bit, ten-year-old OS made it faster.

Nintendo Kid
Aug 4, 2011

by Smythe

Wedesdo posted:

The funniest part: unless he was running Win XP x64 (which he probably wasn't), only ~3.2GB of that 16GB was actually usable.

I did explicitly say he was running XP Home and that it's 32-bit only, yes. Also, he'd already told the forum that he'd recently added more RAM to speed things up and it was "totally working", even though it of course couldn't be.
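The arithmetic behind that ~3.2 GB figure, sketched out (the size of the hardware-reserved region is an assumption; it varies by machine):

code:

# A 32-bit OS can address 2**32 bytes = 4 GiB, and part of that window is
# claimed by device mappings (video memory, PCI, firmware), not RAM.
addressable_gib = 2**32 / 2**30     # 4.0 GiB of total address space
mmio_reserved_gib = 0.8             # assumed reserve; varies by machine
print(addressable_gib - mmio_reserved_gib)  # -> 3.2 GiB visible to Windows

The other ~12.8 GB of his 16 GB just sat there unaddressable.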

Nintendo Kid
Aug 4, 2011

by Smythe
The S3 ViRGE was pretty much good for 2D acceleration only, which still mattered back then.

Nintendo Kid
Aug 4, 2011

by Smythe

Fatal posted:

Care to say why? It seems so similar to HDMI that I don't really get it (although I plan to use it when my 7970 comes)

It's also free of licensing costs, or nearly so, in addition to what the parrot said.

Nintendo Kid
Aug 4, 2011

by Smythe
So wait, do I need to change my sign now? :saddowns:

Nintendo Kid
Aug 4, 2011

by Smythe

SwissCM posted:

What could be interesting is seeing the ARM core being used for virtualization for the emulation of ARM-based devices. Probably not possible though.

It's a very old core, and slow as well. Kinda like deciding to use a Pentium III core to emulate a modern x86 CPU.

Nintendo Kid
Aug 4, 2011

by Smythe

syzygy86 posted:

I think it's just the cheapest core that has the TrustZone feature. If that's really all they want it for, there's no sense in using a better core.

Well, yes, but the guy I was quoting specifically said "seeing the ARM core being used for virtualization for the emulation of ARM-based devices". That simply won't be practical with it.

Nintendo Kid
Aug 4, 2011

by Smythe
I'd also point out it's been known for console makers to have several different initial system designs and change them later. Take the Dreamcast, for instance - there were two competing designs a year before the console's actual release:

quote:

Hideki Sato's group used Hitachi SH4 and PowerVR to make a video game machine called "White Belt".
Tatsuo Yamamoto's group used IBM/Motorola PowerPC 603e and 3dfx Voodoo 2 to make a video game machine called "Blackbelt".

A variant of the "White Belt" system became the production Dreamcast, while the Blackbelt system went nowhere.


It wouldn't be surprising if Microsoft had had both an all-AMD and this Intel-nVidia combo system from the get-go.

Nintendo Kid
Aug 4, 2011

by Smythe

Alereon posted:

While I'd agree with that, I don't see how it would serve any purpose to send out a dev kit of an entirely different architecture. The 360 dev kit was at least a G5 (based on POWER4) + ATI R420 (Radeon X800), though the final shipping product was Cell PPEs (based on POWER6) + ATI R520 (Radeon X1800). If they really are sending out Intel+nVidia dev kits (this might all be a hoax or they might just be dev workstations or something), that's a pretty good sign the earlier rumors of POWER7 + AMD were wrong.

I thought the next Xbox rumors had always been (since at least 2010) an x86-64 AMD CPU + AMD graphics, not some kind of POWER7 CPU + AMD graphics. So having other next-Xbox stuff come out that was Intel x86-64 plus nVidia graphics isn't much of a structural change.

Nintendo Kid
Aug 4, 2011

by Smythe

pigdog posted:

If you're gaming, then there are substantial differences in overall smoothness of gameplay, even if the differences in GPU-dependent averaged framerates aren't big. Most of the time there are other bottlenecks in the system, but in spots where it's CPU-dependent, the difference is noticeable. E.g. the time to render one frame at a constant 60 FPS is 16.7 ms, and if a frame takes longer than that, that's a stutter - a drop in framerate. The graph on the Skyrim timedemo is especially telling.



The fact that the Phenom II X4 980 manages to outperform every other AMD chip gets me every time. How did AMD let something like that happen?
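On the frame-time point in the quote, the conversion is just 1000 ms divided by the target FPS - a quick sketch:

code:

# Frame-time budget at a given FPS target: exceed it and the frame is late.
for target_fps in (30, 60, 120):
    budget_ms = 1000 / target_fps
    print(target_fps, round(budget_ms, 1))  # 30 -> 33.3, 60 -> 16.7, 120 -> 8.3

That's why a CPU that occasionally blows past 16.7 ms feels stuttery even when its average FPS looks fine.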

Nintendo Kid
Aug 4, 2011

by Smythe

Goon Matchmaker posted:

If anything I see Apple scaling their processors up to the point where they're comparable to whatever Intel offers and then giving Intel the finger.

This is close to impossible: ARM stuff is way slower than what Intel can put out, and Apple doesn't have the money or the R&D knowledge to remedy that on their own.

Nintendo Kid
Aug 4, 2011

by Smythe

Chuu posted:

Apple has over $100B in cash, and the Apple A6 CPU shows their hardware engineers know what they're doing.

I really doubt that Apple is looking to sever their relationship with Intel, and Haswell looks like it's going to be an incredible architecture for mobile computing, but if Apple really was looking at developing a high-performance ARM processor, they have the resources.

Do you have any idea how much money it took to get Intel or AMD processors to the performance they have today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM-based architecture to x86-64 performance in laptop/desktop applications.

If I remember right, the fastest ARM-based CPU anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it.

Nintendo Kid
Aug 4, 2011

by Smythe
Apple didn't develop high-DPI displays; they don't even make them. They just buy them.

Colonel Sanders posted:

No poo poo, who financed the research?

Not Apple, dude.

Nintendo Kid fucked around with this message at 04:41 on Oct 17, 2012

Nintendo Kid
Aug 4, 2011

by Smythe

chocolateTHUNDER posted:

I ask this not as a troll, but as a genuinely curious person:

How is it that AMD fell this far behind in the CPU arms race? I mean 10 years ago they were neck-and-neck with Intel.

Intel woke up and stopped pulling stupid poo poo, and had the advantage of all their money to follow through.


Nintendo Kid
Aug 4, 2011

by Smythe

Maxwell Adams posted:

Didn't Bulldozer have slightly better performance/power draw on Windows 8? Is that still the case with Piledriver?

Emphasis on the "slightly" here: most computing tasks still aren't suited to it, but the OS can schedule things onto it slightly better.

Any recent Intel chip still wipes the floor with it in most uses.
