movax
Aug 30, 2008

AMD copped to production issues with 32nm, not a good sign :ohdear: You never admit to this, never!

movax
Aug 30, 2008

Desperate attempt to draw traffic to a site? Or legitimate excitement?

quote:

I'm sitting in on a press briefing for AMD Bulldozer right now, and while everything is embargoed, I will say this: If you're building a gaming PC, this is going to be the way to go.

Edit 1 We're gonna be covering the normal stuff (Benchmarks, etc.) but we're also going to talk about value proposition against Intel as well as some of the exciting new advancements that Bulldozer brings to the table. On October 12th, 12:01am CST.

Edit 2 "We" means Icrontic . I'm not trying to shill my site or anything; we do have a Bulldozer on the testbench, we sat in on a press briefing tonight, and we will have a launch-day piece about it. Of course, you'll also find reviews and other awesome content at [H], AnandTech, TechReport, and so on. Please consider us in your content rotation, we're a small but very, very dedicated team who have been doing this since 2000. Thanks!

I won't be surprised if this delivers all the performance you need for games (and then some) at a very competitive price point compared to Intel. And sufficient performance for modern games usually means sufficient performance for most other desktop tasks.

I'm kind of excited to see what the G34 Bulldozer variants will deliver. I'll likely be working on a G34-based refresh for one of our server boards soon, and having some Bulldozer action to toss in (and the lower TDP) will be awesome.

Have they said what southbridge this mates with yet?

movax
Aug 30, 2008

freeforumuser posted:

PCM NL leaked benchmarks:
http://www.overclock.net/rumors-unco...50-review.html

Fixed Link

Module architecture reminds me a bit of what the Xenon can do: 3 cores, each 2-way SMT capable.

Benchies look good; not many people are going to really need more than four cores, but the price point is nice and it can at least play in the same field as Intel now.
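For the curious, here's a rough sketch of how that core/thread distinction shows up to software on Linux (assuming the usual /proc/cpuinfo fields; names vary by platform). Counting unique (physical id, core id) pairs separates real cores from SMT siblings, so a Xenon-style part would report 3 physical / 6 logical; note Bulldozer's paired cores report as full cores, not siblings, which is part of the confusion:
code:
import os
from collections import defaultdict

def physical_cores():
    """Group logical CPUs by (physical id, core id) parsed from /proc/cpuinfo."""
    cores = defaultdict(list)
    block = {}
    # processor blocks are separated by blank lines; append "" to flush the last one
    with open("/proc/cpuinfo") as f:
        for line in list(f) + [""]:
            line = line.strip()
            if not line:
                if block:
                    cores[(block.get("physical id"), block.get("core id"))].append(block.get("processor"))
                block = {}
            elif ":" in line:
                key, _, val = line.partition(":")
                block[key.strip()] = val.strip()
    return cores

print(f"{len(physical_cores())} physical cores, {os.cpu_count()} logical CPUs")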

movax fucked around with this message at 17:43 on Oct 6, 2011

movax
Aug 30, 2008

Ragingsheep posted:

If Bulldozer matches a 2500K in terms of performance and price, is that enough?

I don't think so; the 2500K was essentially completed a year ago, if not longer. Several revisions of the chip have been taped out, the chipsets for it are mature, and Intel is busy at work getting Ivy Bridge ready for mass production; 22nm ES silicon is already at the majority of ISVs, and that product will only exceed Sandy Bridge performance with no regressions (ideally) as well as correcting some Sandy Bridge errata.

I think AMD has to provide a very compelling budget-conscious processor, an area they have historically dominated in (hell I picked an Athlon II for my server build several years ago because the equivalent Intel hardware was a good $150+ more and delivered less performance).

Delivering a comparable product is good, but it is not so good when you're always racing to catch up and your competitor has the resources to put out an immediate successor to the chip you're trying to compete with, as well as being a full generation ahead with physical silicon of their next generation architecture.

movax
Aug 30, 2008

gently caress, I just feel terrible for the AMD engineers. They've had a tough road in developing this chip and they've released it knowing full well that it's about to get poo poo upon by everyone. Kind of like sending your kid out onto the field knowing he's about to get his rear end kicked into the ground at worst, and barely managing to keep pace with the other kids at best.

That said, the weapon they can bring to bear on Intel is pricing. I wouldn't have an issue tossing a chip like this into a system for a non-gaming, non-techie family member if the price for the mobo and CPU was right. One thing I like about AMD is how long that socket has lasted, and the relative lower costs of their boards. The Intel PDG is a pretty thick book with exacting specifications on every little thing; the AMD guidebook is a little looser, and the specs are pretty tolerant, so you can shortcut a bit at the artwork stage.

Just remember, the consumer is hosed if AMD ceases to be a viable competitor in the desktop x86 market space.

movax
Aug 30, 2008

HalloKitty posted:

Here's a wild card: Hardware Heaven's review
http://www.hardwareheaven.com/revie...revolution.html

Seems completely off to me, can anyone spot the problems with it? I'm more inclined to believe AnandTech, but it's interesting..

Uggh that style sheet/site is hard to read, but gently caress even that little site has some awesome hardware to test with!

Only thing that really jumps out to me is that they used DDR3-1866, but they did use that on both platforms. No idea if they played with BIOS settings on the Intel board to increase memory frequency.
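For scale, a back-of-envelope sketch of what DDR3-1866 buys in theoretical peak bandwidth, assuming a standard 64-bit (8-byte) channel; real-world throughput lands well below these numbers:
code:
def ddr_peak_gb_s(mt_per_s, channels=2, bus_bytes=8):
    """Theoretical peak: transfers/sec * bus width in bytes * channel count."""
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

print(ddr_peak_gb_s(1866))              # ~29.9 GB/s, dual channel
print(ddr_peak_gb_s(1866, channels=1))  # ~14.9 GB/s, single channel
print(ddr_peak_gb_s(1333))              # ~21.3 GB/s, stock DDR3-1333 for comparison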

movax
Aug 30, 2008

Alereon posted:

Yeah I read further and the conclusion and 9/10 rating offended me enough to call them out on their own forums for lack of editorial integrity and writing reviews to please their sponsors. My favorite part was when the reviewer said Bulldozer needed a lower price, and still gave it a 9/10 for value. My second-favorite part was where they didn't mention the higher power usage in the conclusion at all, or even on the page with the power usage numbers.

Nice post, but sadly that HW site is just part of the "noise" that fosters a really insular, groupthinking community and spreads FUD :( I hope someone answers that post politely!

e: hah, you already got called out within two posts.

movax
Aug 30, 2008

Agreed posted:

Who thought up the modules=cores idea? Why? If it was an engineer I don't understand it, if it was a marketing guy fire the fucker now.

I think the concept has merit, I just hope that AMD gets the chance to further explore it. They burned a lot of transistors on the branch predictor for this one; hopefully a process shrink or further development (likely already in progress, seeing as BD probably taped out six months ago) will let it pay off.

Hopefully the Radeon 7000s own face (heh, we're already back at the 7000 numbering there) and can help keep AMD solvent.

movax
Aug 30, 2008

Alereon posted:

While that's true for floating point workloads, most people really care about integer performance. If Bulldozer actually performed like an 8-core for integer stuff but a quad-core for floating point, pretty much everyone would consider that a good deal. Except somehow they managed to get it to perform like a slower hex-core at best.
My goal is to get a Radeon HD 7870 for the holidays, hopefully some of the initial cards that trickle onto the market before we get volume in Q1. If things go according to plan it will basically be a tweaked, higher-clocked Radeon HD 6970 with lower power usage and a lower price, which is all I could possibly ask for. At this point I'm concerned about how well the 7900-series will turn out given that it's completely unlike any GPU anyone's ever designed before, so drivers and its overall performance are an unknown quantity.

I believe the mass-market launch has slipped to Q1 2012, but we should still get a paper launch by the end of the year. They should have a good few months to optimize and get ready to combat Kepler as well.

I'm holding out for whatever single card Kepler will get me close to 60FPS @ 2560x1600, personally.

e: You knew this was coming, Hitler sees the Bulldozer benchmarks. Downfall is a great movie, and made even more amazing by this scene being so subtitle-ready.

:hitler: "Everyone who bought a Sandy Bridge needs to get the gently caress out now! What the gently caress has AMD even been doing these past few years?"
:geno: Jacking off to hentai and My Little Pony?
:hitler: "I could poo poo a better CPU! 2 billion transistors and this is what we get?"

movax fucked around with this message at 17:26 on Oct 12, 2011

movax
Aug 30, 2008

@anandshimpi posted:

anandshimpi
404W at 4.8GHz :-/

:laugh:
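That figure tracks with basic CMOS scaling: dynamic power goes roughly as C*V^2*f, so raising voltage and frequency together compounds fast. A sketch with illustrative numbers (the baseline below is a guess, not a measured Bulldozer figure):
code:
def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Dynamic power scales ~linearly with frequency and quadratically with voltage."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical: a 125W chip at 3.6GHz/1.25V pushed to 4.8GHz/1.55V
print(round(scaled_power(125, 3.6, 4.8, 1.25, 1.55)))  # ~256W, before leakage
Leakage and VRM losses only pile on top of that, which is how you end up staring at a 404W wall figure.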

movax
Aug 30, 2008

It's begun: a poor goon posted in the parts megathread about perhaps getting this processor, wooed by the eight-core marketing. We were able to save him, but how many more will fall!

Ok, so it's not like they're picking up a Pentium 4 vs. an Athlon 64; it'll still deliver performance, just while sucking down more power than a comparable Intel chip and suffering from unoptimized schedulers/software.
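On the scheduler point, here's a minimal sketch of the manual workaround on Linux, assuming the common enumeration where CPUs 2n and 2n+1 share a module (check your actual topology first). Pinning one thread per module keeps pairs from fighting over a module's shared front end:
code:
import os

def pin_one_core_per_module(pid=0, n_cpus=8):
    """Restrict a process to the even-numbered CPU of each two-core module."""
    os.sched_setaffinity(pid, set(range(0, n_cpus, 2)))

pin_one_core_per_module()        # this process now runs on CPUs {0, 2, 4, 6}
print(os.sched_getaffinity(0))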

movax
Aug 30, 2008

Alereon posted:

We should adapt the Programming Language Checklist.
code:
BAD COMPUTER PURCHASE CHECKLIST: Check the boxes that apply

You appear to be considering purchasing a:
[ ] AMD FX-series Bulldozer processor
[ ] Intel LGA-1366/2011 Core i7 processor
[ ] Intel Atom processor
[ ] AMD Radeon HD 6990
[ ] nVidia Geforce GTX 590
[ ] 2000MHz+ overclocking RAM
[ ] RAID0 array
[ ] Watercooler
[ ] Other:

You should not buy this product because:
[ ] It is slow
[ ] It is unreliable/poor quality
[ ] It has poor power-efficiency
[ ] It has been replaced with a newer, better product for the same price or cheaper
[ ] The previous generation is still: [ ] Better [ ] Faster [ ] Cheaper
[ ] You won't use the product's capabilities in your application
[ ] I am a raving fanboy and/or its manufacturer raped and/or killed my dog
[ ] I tried one of these once 6 years ago and it broke NEVER AGAIN
[ ] Other:

Instead you should buy a:
[ ] Competitor's product
[ ] Newer generation of this product
[ ] Nothing, you're already getting 60fps in Crysis you ninny!
[ ] Weed
[ ] loving Mac you idiot

This is pretty awesome, heh. You should probably PM Crackbone anyways to get him to add your little Bulldozer blurb to the OP. Sometimes people do read it! :eng101:

Again though, at least this chip manages to mostly "keep pace" with Sandy Bridge outside of a few applications. Not as big of a performance gulf as, say, NetBurst vs. the A64. Intel's marketing and buddy-buddy (honestly kind of suspicious) relationships with OEMs kept power-hungry Pentium 4s shipping like crazy even though Athlons were running circles around them.

If I recall correctly, Adobe Premiere 6.5 was one benchmark where an Athlon 64 that was behind by nearly a gigahertz in clock would still defeat a Pentium 4 by a very healthy margin. It took until Premiere 7 (Premiere Pro) for the Pentium to start winning that benchmark.

movax
Aug 30, 2008

PC LOAD LETTER posted:

Given that what he said about BD largely panned out, I don't think you can hand-wave away what those ex-engineers said as "disgruntled employees bitching" or exaggeration or something.

As for the scheduler being the problem...I don't think anyone outside of AMD knows exactly what is wrong with BD. Most likely it's a combo of several design problems and process issues.

I think those guys are floundering because they don't have enough money. They're having to cut corners somewhere, be it the architecture team, process, software support, packaging, etc. They can't fire on all cylinders. In an ideal world they'd have an army of software engineers preparing drivers and updates for the major operating systems while the hardware team gets the actual hardware ready.

If they really have switched to a ton of EDA tools as well, I can see a disconnect between some old guard engineers and fresh guys that studied with EDA in school. I know I'm a baby engineer and I had EDA tools at my disposal during school, but I've had to go back to the dark ages a bit in supporting some legacy products.

movax
Aug 30, 2008

trandorian posted:

How would you even break up Intel? Not let the desktop and laptop cpu teams talk to each other?

Better question: why would you gently caress with a company that has the majority of institutional knowledge when it comes to computer hardware (from the process level up to software), when the world literally runs on their hardware and innovations? The Intel Architecture Labs developed PCI and AGP, among many other contributions. In fact, a lot of their research was shut down prematurely because they were beginning to compete with Microsoft.

That's a bit sensationalist though; on a more sane note, if there were some busting going on, I assume it'd be split along business-unit lines.

movax
Aug 30, 2008

Nostrum posted:

Is it really even necessary? They paid huge fines and settlements to AMD over their anti-competitive behavior. Pretty sure AMD's only recent fiscal year in the black was because of that. Is there any evidence that they are still actively pursuing anti-competitive practices? Their prices are pretty fair considering they ARE delivering the best product.

I dunno if there is any real concern over their current business practices, but I think what people are "worried" about is when/if they get a virtual monopoly on a gigantic market because their products are the only ones available/worth buying.

The barrier to entry for a new competitor in the x86-processor market would be almost insurmountable, I think.

movax
Aug 30, 2008

I read the Windows NTDebugging Blog quite often, and they just put up a write-up on debugging a CLOCK_WATCHDOG_TIMEOUT, which has recently come into the limelight as happening with Bulldozer. Interesting read if you want to see a Microsoft engineer step through and isolate the problem.

movax
Aug 30, 2008

streetgang posted:

Hey, since we have all these faster processors cranking out, how does an 8-core affect gaming? Do half the MMOs and PC games out there even have code to use an 8-core?

HalloKitty pretty much covered it. The biggest wins for multiple cores are in productivity applications. Developers will see faster compile times (we use one of our old 32-core Opteron chassis to compile/simulate VHDL/Verilog), certain server applications like databases prefer cores to raw clock speed, etc. VMs of course also benefit from many cores, and it's nice to get a large number of cores while consuming fewer rack units.

Multiple cores are definitely awesome, but all you need for gaming at the moment is a quad-core.
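A toy illustration of why batch work scales with core count while games mostly don't: embarrassingly parallel jobs (compiles, simulation runs) divide cleanly across workers with no coordination between them:
code:
import multiprocessing as mp
import time

def crunch(n):
    """Stand-in for an independent compile/simulation job."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 16
    for workers in (1, 4):
        start = time.perf_counter()
        with mp.Pool(workers) as pool:
            pool.map(crunch, jobs)
        print(f"{workers} worker(s): {time.perf_counter() - start:.2f}s")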

movax
Aug 30, 2008

pienipple posted:

Might be good for an HTPC setup as it eliminates a major source of heat in the small case. That's not a very large market though.

Especially with competitors like Sigma whose SoCs are cheaper/somewhat less complex options for powering set-top media boxes. TI's DaVinci chips are also very powerful for their price; they even sport integrated DDR2/3 controllers, PCI Express, SATA, USB and Ethernet. All on one BGA for ~$80 at quantity, IIRC. Why would you even bother dealing with x86 at that price? Licensing a BIOS, trying to minimize power consumption, etc...painful.

movax
Aug 30, 2008

WhyteRyce posted:

Oops, sorry, I should say I have an E-350. It's nowhere near as rock solid as the other setup was. I've got some weird HDMI issue where occasionally the resolution looks like it gets set really low (i.e. my WMC looks like it's running 1024x768 stretched to fit my screen), which goes away if I minimize and re-maximize. Which is super-annoying in an HTPC setup that you want to control only with a remote. If I elect to connect directly to my TV instead of through my receiver, then my screen will turn black randomly while idle. WMC will also crash if I have too many files in a video directory. It also crashes when it tries to render the thumbnail for certain MKVs. Continually waiting around for Silverlight 5 so I can do HD Netflix is also fun. And AMD removed the overscan correction tool in a couple of their driver releases.

Not to derail too much, but this is why I left HTPCs behind. I know a lot of people have them running successfully and love them to death, but I just had so much trouble with them that I went back to dedicated set-top media boxes. There's a sacrifice in broader software compatibility, but it's much less painful. It didn't help that my target display was connected via 1080i component, which meant a fun battle against overscan.

That said, maybe some AMD hardware will find its way into the aforementioned boxes, but they've got stiff competition from the existing players in the field.

movax
Aug 30, 2008

Factory Factory posted:

I do a full HTPC because I actually don't have a huge transcoded media archive. I spend most of my storage on documents and media I create and backups of my PCs, and all my video stuff is live/recorded cable TV, streaming from YouTube and Hulu, and physical disks. It would actually be more of a pain in the rear end to convert everything over to STB-playable stuff than just deal with the four different player softwares and web browser - all of which work great, they just aren't centralized. And I'd need more, expensive storage to boot.

It's an E350 mini-ITX box, and I love it.

Yep, I'm the opposite, with a huge NAS (~15T) full of media that the Sigma can play back. I occasionally run into issues with compressed headers because my NMT is an older generation, but nothing that can't be overcome with a little work.

movax
Aug 30, 2008

necrobobsledder posted:

There's also the sheer laziness factor by your customers not wanting to deal with the hardware-supported formats better as mentioned above. Me, I'm really pissed off at having to transcode stuff because most of what I have is so low quality to begin with I can't accept transcoding for convenience.

But basically it all boils down to the age-old problem of "software defines your hardware requirements." That's why we always ask people buying hardware regardless of if they're your grandma or a Fortune 500 customer wtf they want to run, right?

You hit the issue on the head exactly. Right now, your best bet for playing any given media format is generally an ffmpeg-based player solution on your PC. With set-top boxes, you're at the mercy of the firmware maintainers/developers, however long they plan on continuing to support their box, and whatever formats they choose to implement.

Maybe someone will put out an AMD APU/similar-based solution running x86 Android or similar to make a decent turn-key HTPC, who knows? I've mostly played with TI DaVinci hardware, and it's quite powerful, but the work involved in getting software decoders to leverage the hardware makes me want to eat a gun most of the time. :(

movax
Aug 30, 2008

If I recall correctly, most textbook examples of implementing the kind of memory you'd use for cache need something like six transistors per bit.

I am going 100% off memory at a bar right now; I can check my VLSI textbook when I get home to make sure. Someone like JawnV6 or the other chip-design goons could comment on it in more detail, though; I've only done small mixed-signal designs.

movax
Aug 30, 2008

DNova posted:

I'm not sure why you're bringing this up, but generally on-die cache is SRAM, which is generally a minimum of 6 transistors per bit. Sometimes more. Rarely less. Contrast to standard DRAM which is 1 transistor per bit.

Ah, so I was remembering correctly :woop:. I was pretty sure it was SRAM, but I was impaired and couldn't think straight so I left it to someone else to clarify.

I figured that the massive amount of cache on Bulldozer would have contributed a significant amount to the various transistor count figures bandied about by marketing/engineering. Not that it matters, because 1.2 billion or 2 billion transistors, it still screwed AMD either way.
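The sanity check is simple arithmetic, assuming 6T cells and the FX-8150's 8MB of L2 plus 8MB of L3 (and ignoring tags, ECC, and peripheral logic):
code:
MB = 1 << 20
cache_bits = 16 * MB * 8                                  # 8MB L2 + 8MB L3, in bits
print(f"{cache_bits * 6 / 1e9:.2f} billion transistors")  # ~0.81 billion
So the cache alone is a huge chunk of either transistor figure.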

movax
Aug 30, 2008

Shaocaholica posted:

Is there a nicely formatted wikipedia article listing all the GPU families and the hosed up re-naming? I would think that would be one of the best forms of information against the practice.

I'm not sure of a family comparison in particular, but each generation generally has a pretty good table listing all the specs, plus launch pricing and launch dates, which is a very good place to start.

I gave up trying to understand laptop GPU naming long ago (I have no idea what my MBP's GT330M compares to), but they are plenty guilty of this on the desktop as well. The low-end parts are often based on last-generation stuff and share very little in common with their higher-end buddies.

I wish that nvidia and AMD would just stop bothering, seeing as how integrated graphics are growing in penetration (especially thanks to, you know, shipping *with* the CPU) and I don't see very many use cases where you'd need a low-end discrete GPU over the IGP in your CPU or chipset.

Of course they won't, and people will still pay $100 for a POS discrete card that performs maybe 10% better than integrated graphics yet still falls into the dead zone of being plenty for regular computing but useless for playing games.

e: I actually went and looked up some stuff for nvidia in particular. The GeForce article has a table at the bottom that attempts to make sense of part names. Here is a giant spec table of nvidia GPUs. The most important thing to pay attention to there is the code-name, which can give you a relative idea of what generation is at play. And here are Red Team's charts.

movax fucked around with this message at 19:18 on Dec 7, 2011

movax
Aug 30, 2008

HalloKitty posted:

That basically already exists in Windows Experience Index, and it was clearly designed to fulfill this role - but it never gets used.

Basically, marketing always wins. They want to shift more cards to gullible consumers, and give OEMs a new number to stick on their spec charts every 6 months. That's all there is to it.

3DMark tried, but then came the issue of driver cheating and vendors optimizing for specific benchmarks.

Basically, fire all marketing departments into the sun. Impossible demands and promising poo poo we can't deliver as engineers, and then confusing the poo poo out of the customer. :argh:

movax
Aug 30, 2008

Shaocaholica posted:

That list is still a bit hard to read. Maybe I should make my own on wikipedia.

The avalanche of [citation needed] heard around the world...

movax
Aug 30, 2008

necrobobsledder posted:

Jesus, 3w at idle? That's some serious power gating happening, you'd think they're more power efficient at desktop 2D than integrated chips from Intel are by now.

It's like someone decided to drop in ClkEns everywhere in the design and actually properly partition it! Could be a big selling point over nvidia's entry if they can't pull off the same. (I will buy green no matter what anyways, though.)

movax
Aug 30, 2008

Chas McGill posted:

I'd watch that.

People have talked about hardware for games consoles being part of AMD's revenue stream, but how significant is it? Is it quite a low margin enterprise?

At this point, I think it's a decent cash stream. The chips were taped out years ago, process improvements are always marching on, and you have a guaranteed customer until they EOL the console and stop production.

As the consoles shrink too, you can migrate those chips to your newer processes so you can decommission your older lines/furnaces/lithography/etc.

movax
Aug 30, 2008

Shaocaholica posted:

^^^ I would think devs are using 32bit pointers somehow to save space, since neither the 360 nor the PS3 needs the additional addressing bits even if their CPUs are 64bit.


I think you misread. I'm hoping next gen consoles have more than 4GB of memory so all builds will have to be 64bit to use it all.

Also, do you really think game developers for PPC and x86 are all programming in assembly? Game devs are constantly porting from 360/PS3 to x86. I would think it would be easier to go from a native 64bit source to another 64bit build.

I think he's getting at the fact that x86 and PPC have vast architectural differences as well. Obviously they aren't programming entirely in assembly, given the robust development tools supplied by each manufacturer; this isn't the dark ages of game development.

Remember that the 360/PS3 are exotic compared to our x86s. The 360 has a three-core PPC-based CPU, each core capable of 2-way SMT. The Cell's hardware far outpaced the state of the art in compiler development and parallel-programming tools, and it supplies a single PPC-derived core mated to 7 128-bit RISC cores.

And yet, most companies buy an engine that's been designed to run on both of those platforms to base their games on. The obvious exceptions tend to be each system's halo titles; I remember some blog posts from a Naughty Dog developer where he details dropping down to the assembly level to get some effects in Uncharted working properly.

I think we'll definitely see more memory in the new-generation consoles, especially for high-res textures, but I don't see why they'd need in excess of 4GB. Even our PC GPUs are fine pushing 1080p with 1GB VRAM, and I don't think most titles in the PC environment eat more than 1GB or so of system memory.
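Quick numbers behind that: the 4GB ceiling is just the 32-bit pointer limit, and raw framebuffers are tiny next to it; it's textures and assets that actually eat VRAM:
code:
print(2 ** 32 / 2 ** 30, "GiB max with 32-bit addresses")  # 4.0
for w, h in ((1920, 1080), (2560, 1600)):
    mib = w * h * 4 / 2 ** 20      # 32-bit color, one buffer
    print(f"{w}x{h}: {mib:.1f} MiB per framebuffer")       # ~7.9 / ~15.6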

Anyways, AMD getting some much-needed revenue for winning a contract to supply a GPU would be most welcome! And IBM will probably get the business (again) for the CPU design.

movax
Aug 30, 2008

Agreed posted:

nVidia and ATI both get up to some heinous bullshit when it comes to the low-end and mobile SKUs, do they not? And "why" is probably "because they get away with it every time since the market for those cards is squarely aimed at people who have no idea what they're missing out on."

The low-end GPU business is being cannibalized (and rightfully so) by integrated GPUs. There are very few reasons to get something lower than a midrange card these days...you either need the GPU (games) or you don't (integrated will handle Aero, DXVA, etc).

I remember my GeForce 7300 being some kind of GeForce 6100 or something equally retarded. All this does is confuse buyers and show the shareholders "look how many products/segments we serve!"

We're all smart enough to know not to buy anything below 7xx0 or whatnot, but the average consumer isn't :(

movax
Aug 30, 2008

Agreed posted:

It's worth it if you can afford, are willing to, and intend to continually upgrade. If it's just part of your cycle, then you can buy in each generation at a nice sweet spot (this generation, two 6950s or two 570s; next generation, who knows) and get dramatically superior performance over a top-end card for the same or a marginally higher outlay. If you go top of the line every year, you're spending a lot of money trying to push the highest numbers.

Dogen gives me very well-deserved poo poo for weighing the pros and cons of adding a second 580 to my setup, because he plays the same games I do and is perfectly happy with the performance - but I'm not, and so it looks like this generation I've locked myself into either continuing to be unusually bothered by framerate dips when settings are maxed despite an OC to 925MHz (which scores well in 3DMark11 and other synthetics that aren't heavily weighted to favor ATI's parallelized calculation), or grabbing that second 580. My minimum desire is no FPS below 30, ever - I'd prefer a minimum of 45, or ideally 60, but 30 is the bottom number, and one overclocked 580 won't deliver that in modern games at 1080p. I mean, it will in games like Space Pirates and Zombies, but I play a lot of S.T.A.L.K.E.R. with extra shaders and texture overhauls, Metro 2033 (and its upcoming sequel), Crysis 2, and other games that actually can stress even this expensive poo poo of a card.

But the figures for a single 7970, even heavily overclocked, aren't past that point. Two 580s still outperform one 7970 like crazy in games that are well supported. In games with mediocre SLI support, scaling can be as low as 40%, which still beats what the OC'd 7970 manages in most games, but on average two-card SLI scales to around a 90% increase. In Metro 2033 it's almost a 100% increase; there's very little overhead in that game. In other games, I could definitely live with 80%-90% scaling. Because there is no way, NO WAY, that a single card from either company is going to improve framerates by 80% for the amount it would cost to add another 580.

It's dumb, the high end sucks. But if I were building a computer right now I'd be clicking everywhere to try to find a 7970. Sigh. Idiot me. Shiny things pretty things.

I'm glad I don't have your illness when it comes to shiny things (I game at 2560x1600). GTX 460 still going strong :patriot:

movax
Aug 30, 2008

Factory Factory posted:

CES is in progress, and no sign of Kepler so far. I think we were speculating that that would mean Kepler is not gonna trounce Southern Islands or anything.

Also, want to trade two 6850s for your 580 and break the hell-cycle that is high-end upgrading? :v:

SemiAccurate had a quiet article that mentioned Kepler slipping further, to 2013. Or maybe I was drunk, confused "Kepler slipping to 2012" with the year still being 2011, and began to sob helplessly.

Also, I think Agreed should :toxx: himself somehow with regards to the 7970, for our entertainment :toot:

movax
Aug 30, 2008

HalloKitty posted:

Ah, christ, for a minute I poo poo myself. I saw 3xxxK, and thought.. gently caress, Ivy Bridge sucks, how?
Oh, AMD, copying a little name recognition are we?

Same here, hah, took me a second to remember which thread I was in.

movax
Aug 30, 2008

I met an actual Bulldozer fanboy yesterday :psyduck:

He had a nice watercooling setup going (red flag #1), so I asked what he was running; it was a 6-core Phenom. I mentioned offhandedly that Bulldozer was somewhat of a failure compared to the 2500/2600, and he got really upset and was like "WHAT NO BRO, ITS AWESOME, I NEED THOSE CORES, IVE GOT LIKE 50 WINDOWS OPEN AT ALL TIMES!"

At that point I wasn't going to try to discuss further, but there are AMD customers out there somewhere!

movax
Aug 30, 2008

pixaal posted:

I got water cooling once; I no longer have it (it sprung a leak, and the rug still has a green stain from the fluid). It looked really cool. It was mostly because I was bringing my computer to the dorm where the entire building was programming/IT majors, so people actually saw it and thought it looked cool. Oh god, I spent so much money on making the outside look cool, but I guess people spent even more making their cars look cool in high school and I avoided that poo poo.

Yeah I used to water-cool back in college as well, but once heat-pipe coolers hit the stage, I went back to air. My Ultra 120 has been going strong, and I'm confident that adapters will keep coming out for upcoming CPU sockets.

movax
Aug 30, 2008

Too many posts with nvidia and AMD in this thread, not enough 3dfx and Voodoo :smug:

movax
Aug 30, 2008

Agreed posted:

Excuse me, I'd like to call your attention to exhibit A

:eng99:

Touché.

movax
Aug 30, 2008

grumperfish posted:

Kyro was a PowerVR part.

My first 3D card was a Voodoo Banshee. It died and got replaced by an abomination that S3 poo poo out. The 3D demo included with the card dragged it down to a slideshow. :effort:

I think XGI attempted to be a thing at one point too with the Volari line. Didn't last long IIRC.

For some reason unknown to me, before I joined my company, the Volari Z9 was chosen as the GPU for the platform (a single-board PC). Granted, they run mostly headless, so maybe it was the cheapest option that could deliver basic VGA output.

movax
Aug 30, 2008

Alereon posted:

To be fair, that article is only for gaming, which is extremely limited in its ability to consistently use more than two cores and gives a significant advantage to processors with Turbo. I would consider an A6-3670K, potentially even overclocked. The big advantage is the onboard graphics; if he doesn't care at all about graphics and definitely wouldn't want to overclock at all, then an older platform with an Athlon II X4 may be better. That said, I would strongly recommend he consider just springing for the i5 2400+ if he cares about CPU performance; especially with Turbo, it's a really fast platform.

Hey, you have a star now!

FWIW, I think the Intel platforms make better server boards/platforms for the home anyways: stable, clean drivers across all platforms (Windows, Linux, and Solaris!). I hadn't realized that i3 performance had caught up to this point, though; I may have to start considering that for any super-budget builds.

movax
Aug 30, 2008

Huh. *Looks at his GTX 460* Guess you have to last another year*, buddy! :downs:

* - at 2560x1600
