|
freeforumuser posted:Netbooks: Release a single core SB. Zacate? What is that again? Considering Atoms are already dual core, it'd be silly to replace them with a single-core Sandy Bridge. No one wants to be stuck with single-core today.
|
# ¿ Sep 30, 2011 02:15 |
|
nodm posted:Which didn't amount to a whole lot in the grand scheme of things, since they didn't really have the brand recognition to command a high selling price or enough fabbing capacity to grab a large market share. I think they owned something like 22% of the consumer market while their CPUs were murdering the competition. I dunno how accurate this chart is (it's based on benchmarks submitted to a benchmark site), but it seems to say AMD wasn't that low back in the day: ( from http://www.cpubenchmark.net/market_share.html )
|
# ¿ Sep 30, 2011 18:25 |
|
Civil posted:Sufficient gaming performance doesn't require an i5/i7 - most current games are still playable on a 2/3/4 core Athlon 2. I'll eat one of my socks if these upcoming benchmarks actually outpace Intel's current line, though. Yeah, the fact they phrased it that way made me worry too.
|
# ¿ Oct 6, 2011 15:55 |
|
freeforumuser posted:Far Cry 2 - 1080p DX10 max This is just confusing: how can the FPS vary by almost 90 for the AMD chip while the Intel chip only varies by 51?
|
# ¿ Oct 6, 2011 19:09 |
|
hootimus posted:Man, this is beyond pathetic. AMD really has no hope of regaining any lost ground. Especially considering windows 8 is coming soon, and it will run on ARM, the old battle of Intel vs AMD will just become irrelevant. It's going to be Intel vs ARM. No one in their right mind is going to buy this lovely loving processor. Intel vs. ARM isn't even a battle. Intel laptops can run for 7 hours (MacBook Air, MacBook Pro) or more (netbooks), all while having way more computing power on hand than any ARM chip. The latest, most cutting-edge 8-core ARM chips are on par, performance-wise, with a Core 2 Duo from 2006, take just as much power to do it as those Core 2 Duo chips did, and cost more too. It would take a radical change in a lot of things to get comparable performance for everyday non-trivial Windows applications on the ARM platform, and if things actually started getting close, Intel could simply license ARM themselves and start cranking them out again.
|
# ¿ Oct 9, 2011 10:07 |
|
roadhead posted:Also I love the little A8-3850 in my HTPC - so what AMD lacks in the low-volume enthusiast gamer market perhaps can be made-up in the "I want the cheapest computer you have that plays games" segment at BestBuy. A prebuilt Core i3 is that already.
|
# ¿ Oct 12, 2011 21:18 |
|
Hog Butcher posted:That A8's APU's going to blow the HD3000 out of the water. A prebuilt with a video card in it'll beat it, but I think the only place AMD's got a chance right now's if they're competing with the HD3000. Prebuilts come with video cards. So yeah, AMD is still screwed.
|
# ¿ Oct 12, 2011 23:13 |
|
THAT drat DOG posted:Some new developments: Isn't the whole problem that Windows is recognizing it as 8 full independent cores?
|
# ¿ Oct 16, 2011 03:11 |
|
Longinus00 posted:Hey, monopoly markets are fine. Look at how much innovation is going on in the ISP/telecom industry, we keep getting more and more bandwidth and better prices amiright guys? Hell, IE6 was so good microsoft didn't even need to upgrade it for years and years. But I did get my speeds upgraded this year out of the blue? The only "competition" my cable provider has here is 3 Mbps DSL. AMD's been as effective a competitor against Intel as DSL has been against cable for at least the last 2 years, which is to say, not very. And IE6 wasn't upgraded because Vista was supposed to be out in 2004 instead of 2006. Take a minute to remember back, too: all the other browsers were crap compared to IE6 until Firefox finally went stable in 2004. IE6 was even the most standards-compliant browser for several years.
|
# ¿ Oct 20, 2011 19:55 |
|
Longinus00 posted:That's great news to all at&t and comcast/verizon customers that get faster speeds and lower bandwidth caps! It's also nice that IE6 was so standards compliant that when standards compliant browsers came about none of those sites worked with them, and the later IEs have an IE compliant mode (that is to say IE6 wasn't standards compliant it's just that sites were forced to work with it). Why yes, when browsers more standards-compliant than IE6 came out, they were more standards-compliant than IE6. How insightful! You do realize that Microsoft's plan was to tie IE releases to OS releases, and that XP SP2 happening screwed everything up and delayed Vista, right? Microsoft actually halted development of the next OS for a decent period of time to revamp XP with SP2. IE7 was due to come out at roughly the same time that browsers on par with IE6 were finally arriving. Seriously, just because IE6 turned out not to keep up with web standards for the 10 years after its release doesn't mean it wasn't the best at its time (2001) and for several years after. Hell, Bulldozer was supposed to launch in 2007 originally, right (not with the same name, but the same concept)? AMD's suffering the same kind of problem MS did with IE: schedules go out of whack and all that.
|
# ¿ Oct 20, 2011 20:59 |
|
Longinus00 posted:I'm not sure you're getting it unless you were making a commentary on the differences between dejure and defacto standards. IE6 was purposely not standards compliant but because of it's market share all the sites had to code to its standards thus screwing other browsers with much smaller market share. AMD is also in a totally different position than microsoft because BD is not some vehicle with which it is going to change the cpu standards that other people are going to have to deal with later. IE6 was the most standards-compliant browser when it was released; that's a fact, and your Microsoft bashing doesn't change it. No browser was fully standards compliant before IE6, and in fact there's still none now that is compliant with everything. Nor was there one as compliant as IE6 until many years after its release. And "introduce proprietary features" wasn't a Microsoft-only thing: Netscape was especially bad about trying that, and the adware-era Opera had its own special things it supported. AMD is in fact trying to make the case that processors should be designed like Bulldozer, with modules of two integer cores sharing cache and an FPU. And of course throughout their history Intel and AMD have each tried to introduce new instruction sets and convince people to use them (MMX, 3DNow!, etc.). Just because they're failing doesn't make it not the case!
|
# ¿ Oct 20, 2011 22:30 |
|
How would you even break up Intel? Not let the desktop and laptop CPU teams talk to each other?
|
# ¿ Oct 21, 2011 01:53 |
|
feedmegin posted:ARM might want a word with you on that one... ARM's nowhere near close enough yet. How many ARM servers are on the internet and private networks (banking, etc.) versus x86 servers, or even mainframes and SPARC? How many ARM machines are in your ATMs, at government offices, in the military, and so on versus x86? A very big part of the reason Intel would be under scrutiny if AMD went out of business is that so much of the world relies on x86 CPUs.
|
# ¿ Oct 21, 2011 16:28 |
|
Agreed posted:Windows 8 is supposed to run on ARM, right? That could legitimize it further, considering that ARM and related processors do power a number of consumer devices.. It could, but the necessity of recompiling everything means you don't get the legacy app support that eased the move to x86-64, or even to the various expansions of the instruction set since the original 8086/8088. Not to mention there's an awful lot of x86 devices that don't run Windows. And even in Windows, look how much of a problem it was for many businesses and people that 64-bit Windows no longer ran 16-bit applications natively.
|
# ¿ Oct 21, 2011 17:00 |
|
streetgang posted:Hey since we have all these faster processors cranking out, how does a 8 core affect gaming? do half the mmo's out there and pc games even have coding to use a 8 core ? I think all the benchmarks have shown "8-core" Bulldozer chips losing handily to 4-core Sandy Bridge and even pre-Sandy Bridge chips in games.
|
# ¿ Oct 27, 2011 14:25 |
|
Bob Morales posted:Are there any cases where disabling HT on the latest Intel processors increases performance? If you're still running XP SP2 or earlier, the OS's scheduler has problems handling hyperthreading correctly, if I remember right. An edge case, to be sure.
|
# ¿ Oct 28, 2011 14:07 |
|
Combat Pretzel posted:Holy crap. Are they going to blame that on scheduler problems? I wonder how that ARMA benchmark would look like with a 2600K in play, for 4C8T SMT. Well, it probably IS a scheduler problem, isn't it? Fixing it would probably only smooth out the spikes in both directions, though, not increase overall performance.
|
# ¿ Nov 5, 2011 16:30 |
|
Bob Morales posted:What are the sheer odds that a startup could make an x86 chip? "Desktop PCs" really includes laptops, you know. And there's massive inertia in the market too; it's not as if everyone buys a new computer yearly, and many people only buy one every 5 years or so. Let's say that starting tomorrow morning no x86/x86-64 computers are ever sold again, and the people who were buying them all buy iPads and Android tablets instead: it would take 4, maybe 5 years for those to outnumber x86/x86-64 computers, and probably a decade to eclipse them fully.
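To make that inertia argument concrete, here is a rough back-of-envelope sketch; every figure in it (installed base, tablet sales rate, PC retirement rate) is a hypothetical placeholder rather than real market data, so treat the output as an illustration of the reasoning, not a forecast.

```python
def years_to_outnumber(x86_installed: int,
                       tablet_sales_per_year: int,
                       x86_retired_per_year: int) -> int:
    """Years until the tablet installed base exceeds the remaining x86 installed base,
    assuming x86 sales stop entirely and tablets absorb all new purchases."""
    tablets, years = 0, 0
    while tablets <= x86_installed:
        tablets += tablet_sales_per_year                              # one more year of tablet sales
        x86_installed = max(0, x86_installed - x86_retired_per_year)  # old PCs die off
        years += 1
    return years

# Placeholder figures (not real data): 1.4 billion x86 machines in use,
# 300 million tablets sold per year, 150 million old x86 machines retired per year.
print(years_to_outnumber(1_400_000_000, 300_000_000, 150_000_000))  # -> 4
```

With those placeholder numbers the crossover lands around year four, which is the ballpark the post is gesturing at: the installed base only erodes as fast as the replacement cycle lets it.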
|
# ¿ Nov 29, 2011 22:14 |
|
Aleksei Vasiliev posted:Seeing as it directly says they are realigning to low power processors, even if they aren't pulling out of x86 they won't really be much competition for Intel anymore. As if they're much competition for Intel anyway?
|
# ¿ Nov 30, 2011 01:23 |
|
Ryokurin posted:They may very well be, as that market is going to get smaller over the years or at the very least become harder and harder to justify an upgrade every 2-3 years as it has been. Either way I don't see it happening in the near (that is the next year) as the alternatives still need work. It would be different if they didn't sell off their ARM division a few years back, still made their own memory instead of rebranding or had someone who could execute their low powered chip production properly. They owe it to their shareholders and their future to take a hard look at where they see things going. There really isn't a good reason to assume the market for x86 computers is going to shrink; it's more likely to keep growing while other stuff simply grows faster, which doesn't reduce the x86 market at all.
|
# ¿ Nov 30, 2011 01:37 |
|
Ryokurin posted:I was talking more about the desktop computer market than just the x86. Notebooks already outsell Desktops and Tablets will eventually become a bigger part of the market. Desktops won't go away but they will be more of a server role, or for heavy duty tasks that need a higher resolution or power than the average Notebook/Tablet can provide. Well, desktops have been outsold by notebooks since 2005 or so, and if I remember right notebooks are currently about 65% of computer sales, maybe even 70% or more. And frankly AMD has done even worse in laptops than in desktops, since the whole power usage problem is especially apparent and annoying when you have a battery that runs down sooner and a machine you're touching directly that heats up more.
|
# ¿ Nov 30, 2011 02:50 |
|
Spiderdrake posted:Didn't AMD make gains in notebooks and release chips people actually want to buy as opposed to their desktop plan of sucking on every possible level? Several benchmarks show the power / battery life of Llano is pretty much fine. They've made minor gains, but an AMD notebook is still all but guaranteed to perform worse and run shorter on battery than a similarly priced Intel. Being "fine" doesn't help against "great". IIRC Intel only dipped from 86% of laptops to 85% recently.
|
# ¿ Nov 30, 2011 07:52 |
|
Beef posted:They are rebranding, simply adding marketing bullshit to Kingstone RAM. I was about to ask how they were making their own RAM without fabs or (to my knowledge) a RAM design team, but your post answered that before I could.
|
# ¿ Nov 30, 2011 15:53 |
|
Shaocaholica posted:For FPU intensive work, does the extra hardware in BD really add any performance? How does it fare for something like folding at home? No, and it fares terribly.
|
# ¿ Dec 7, 2011 03:09 |
|
When you get a 2 GB video card and 8 GB of system RAM in a fairly cheap laptop, it'd be rather nice if the consoles we'll be using for the next 8 years had that much to work with.
|
# ¿ Dec 22, 2011 00:22 |
|
wipeout posted:Fanboys are retarded - but none so perfectly as a guy I saw on a forum, who bought a brand new Intel P4 EE and prised the heatspreader off it - not realising it was soldered on. £900 dead chip. Reminds me of this kid who, in 2011, was running XP Home (32-bit of course, since 64-bit doesn't exist for it) on a high-end 6-core AMD CPU, with 16 GB of RAM and two video cards that needed power leads plugged into them. He didn't have those plugged in, so the cards were struggling by on PCI Express slot power alone, and he was bragging about the system. And in a separate forum on that same site, he was asking for help with why his machine wasn't performing as well as it should, while insisting that running a 32-bit, 10-year-old OS made it faster.
|
# ¿ Jan 18, 2012 17:47 |
|
Wedesdo posted:The funniest part: unless he was running Win XP x64 (which he probably wasn't), only ~3.2GB of that 16GB was actually usable. I did explicitly say he was running XP Home, and that it's 32-bit only, yes. Also, he'd already told the forum that he'd added more RAM recently to speed things up and that it was "totally working", even though of course it couldn't be.
|
# ¿ Jan 18, 2012 18:39 |
|
The S3 ViRGE was pretty much good for 2D acceleration only, which still mattered back then.
|
# ¿ Jan 22, 2012 17:41 |
|
Fatal posted:Care to say why? It seems so similar to HDMI I don't really get it (although I plan to use it when my 7970 comes) It's also free of licensing costs, or nearly so, in addition to what the parrot said.
|
# ¿ Feb 28, 2012 02:50 |
|
So wait do I need to change my sign now?
|
# ¿ May 9, 2012 01:57 |
|
SwissCM posted:What could be interesting is seeing the ARM core being used for virtualization for the emulation of ARM-based devices. Probably not possible though. It's a very old core, and a slow one as well. Kinda like deciding to use a Pentium III core to emulate a modern x86 CPU.
|
# ¿ Jun 15, 2012 03:22 |
|
syzygy86 posted:I think its just the cheapest core that has the TrustZone feature. If that's really all they want it for, there's no sense in using a better core. Well yes, but the guy I was quoting specifically said "seeing the ARM core being used for virtualization for the emulation of ARM-based devices.". That simply won't be practical with it.
|
# ¿ Jun 15, 2012 17:06 |
|
I'd also point out, it's been known for console makers to have several different initial system designs and change them later. Take the Dreamcast for instance - there were two competing designs a year before the console's actual release: quote:Hideki Sato's group used Hitachi SH4 and PowerVR to make a video game machine called "White Belt". A variant of the "White Belt" system became the production Dreamcast, while the Blackbelt system went nowhere. It wouldn't be surprising if Microsoft had had both an all-AMD and this Intel-nVidia combo system from the get-go.
|
# ¿ Jul 31, 2012 19:22 |
|
Alereon posted:While I'd agree with that, I don't see how it would serve any purpose to send out a dev kit of an entirely different architecture. The 360 dev kit was at least a G5 (based on POWER4) + ATI R420 (Radeon X800), though the final shipping product was Cell PPEs (based on POWER6) + ATI R520 (Radeon X1800). If they really are sending out Intel+nVidia dev kits (this might all be a hoax or they might just be dev workstations or something), that's a pretty good sign the earlier rumors of POWER7 + AMD were wrong. I thought the next Xbox rumors had always been (since like 2010 at least) x86-64 AMD CPU + AMD graphics, not some kind of POWER7 chip for the CPU + AMD graphics. So having other next Xbox stuff come out that was Intel x86-64 plus nVidia graphics isn't much of a structural change.
|
# ¿ Aug 19, 2012 21:30 |
|
pigdog posted:If you're gaming, then there are substantial differences in overall smoothness of gameplay, even if the differences in GPU-dependent, averaged framerates aren't big. Most of the time there are other bottlenecks in the system, but in spots which it is CPU-dependent, the difference is noticeable. E.g. the time taken to render one frame at constant 60 FPS is 16.7 ms, and if it takes longer than that, then that's a stutter, a drop in framerate. The graph on the Skyrim timedemo is especially telling. The fact that the Phenom II X4 980 manages to outperform every other AMD chip gets me every time. How did AMD let something like that happen?
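As a quick illustration of the frame-time math quoted above, here is a minimal sketch: at a target of 60 FPS the per-frame budget is 1000/60 ≈ 16.7 ms, and any frame that takes longer than that shows up as a stutter. The sample frame times below are hypothetical, purely for illustration.

```python
def frame_budget_ms(target_fps: float) -> float:
    """Per-frame time budget at a given framerate, in milliseconds."""
    return 1000.0 / target_fps

def count_stutters(frame_times_ms, target_fps=60.0):
    """Count frames that took longer than the budget for the target framerate."""
    budget = frame_budget_ms(target_fps)
    return sum(1 for t in frame_times_ms if t > budget)

if __name__ == "__main__":
    # Hypothetical frame times in ms, purely for illustration.
    samples = [14.2, 15.9, 16.1, 33.4, 16.5, 18.0, 15.0]
    print(f"Budget at 60 FPS: {frame_budget_ms(60):.1f} ms")                # ~16.7 ms
    print(f"Stutters: {count_stutters(samples)} of {len(samples)} frames")  # 2 of 7
```

This is why CPU-bound spikes matter even when the averaged framerate looks fine: a single frame that blows well past the budget is visible as a hitch regardless of the average.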
|
# ¿ Oct 3, 2012 21:08 |
|
Goon Matchmaker posted:If anything I see Apple scaling their processors up to the point where they're comparable to whatever Intel offers and then giving Intel the finger. This is close to impossible; ARM stuff is way slower than what Intel can put out, and Apple does not have the money or R&D knowledge to remedy that on their own.
|
# ¿ Oct 15, 2012 02:03 |
|
Chuu posted:Apple has over $100B in cash, and the Apple A6 CPU shows their hardware engineers know what they're doing. Do you have any idea how much money it took to get Intel or AMD processors to the performance they have today? $100 billion is nothing in comparison, and frankly the A6 doesn't show poo poo as far as getting an ARM-based architecture to x86-64 performance in laptop/desktop applications. If I remember right, the fastest ARM-based CPU anyone's got still performs like a Core Duo or early Core 2 Duo while sucking down more power - and it certainly wasn't Apple who made it.
|
# ¿ Oct 15, 2012 03:28 |
|
Apple didn't develop high-DPI displays; they don't even make them. They just buy them. Colonel Sanders posted:No poo poo, who financed the research. Not Apple, dude.
|
# ¿ Oct 16, 2012 23:39 |
|
chocolateTHUNDER posted:I ask this not as a troll, but as a genuinely curious person: Intel woke up and stopped pulling stupid poo poo, and had the advantage of all their money to follow through.
|
# ¿ Oct 24, 2012 04:38 |
|
Maxwell Adams posted:Didn't Bulldozer have slightly better performance/power draw on Windows 8? Is that still the case with Piledriver? Emphasis on the slightly here; most computing tasks still aren't suited to it, but the OS can schedule things onto it slightly better. Any recent Intel chip still wipes the floor with it in most use.
|
# ¿ Oct 25, 2012 05:04 |