|
Space Gopher posted:AMD was actually doing the Black Edition unlocked-multiplier thing well before Intel brought out the K models. I don't really see a point in this price range, though. Even if an unlocked multiplier is only a few bucks, it's money better spent saving up for a discrete video card. Yes, but they denoted that with Black Edition or BE. There doesn't seem to be any reason to go with K other than to leech off the success of Intel's unlocked CPUs and blur the line between brands.
|
# ¿ Dec 17, 2011 18:35 |
|
Moey posted:That's my logic too. 90% of people who use computers don't understand any of the "voodoo" that goes on inside. Do companies like Dell and HP use unlocked processors in their machines? If so, wtf, I still have not seen a bios from a computer manufacturer that allows any real changes. Dell offers unlocked processors on its Alienware machines. It also offers non-unlocked processors, but gets more juice out of them by sticking them on P67 boards for +4 bins of Turbo. Of course, by default, the overclock is only 1 bin, or a whopping 1x multiplier/100 MHz. But the BIOS does allow adjustments.
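For anyone wondering what a "bin" works out to in clock terms, here's a rough sketch of the multiplier math (the 34x stock multiplier is just a hypothetical example, not a claim about any specific SKU):

```python
# Rough Sandy Bridge Turbo "bin" arithmetic: one bin = one multiplier step on a 100 MHz base clock.
BCLK_MHZ = 100

def turbo_clock_mhz(stock_multiplier: int, extra_bins: int) -> int:
    """Effective clock after adding Turbo bins on top of the stock multiplier."""
    return (stock_multiplier + extra_bins) * BCLK_MHZ

# Hypothetical 34x part:
print(turbo_clock_mhz(34, 1))  # 3500 -> the default single-bin Turbo bump
print(turbo_clock_mhz(34, 4))  # 3800 -> the +4 bins a P67 board allows on non-unlocked chips
```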
|
# ¿ Dec 17, 2011 19:32 |
|
Next time on Dragon Core Z, our heroes gather the seven dragon cores and assemble them onto one die to revive Goku, who has learned the Turbo technique from King Kai.
|
# ¿ Dec 18, 2011 01:05 |
|
I think you have that backwards. Intel is dominating in everything but per-socket, per-watt server-load performance, and AMD is only competitive there because it crams 12 or 16 cores onto a chip where Intel maxes out at 10. Per core, Intel parts are faster and more power efficient.
|
# ¿ Dec 20, 2011 23:16 |
|
Star War Sex Parrot posted:We're talking about gamers here. Didn't you recently build a computer just for games? And then Star War Sex Parrot was a zombie.
|
# ¿ Dec 21, 2011 03:32 |
|
Bob Morales posted:What percentage of gamers are still using a 32-bit OS? According to Steam, 17% use XP 32-bit, 11.5% use Vista 32-bit, and 10% use Windows 7 32-bit. So almost 40%.
|
# ¿ Dec 21, 2011 21:41 |
|
Good lord. AnandTech has some news about lower-end HD 7000 SKUs. quote:So much of what we take for granted with retail cards – well defined specifications and formal product announcements through press releases – simply don't happen in the OEM market. Instead the OEM market is ambiguous on its best days and secretive at its worst.
|
# ¿ Jan 6, 2012 19:29 |
|
While I was reading that article, I was also struck by the apparently 1:1 percentage scaling between GPU/RAM clocks and FPS. As in, a 10% overclock gives a 10% performance increase. The Radeon 6850 scales very well (roughly 9% more performance from a 10% overclock), but not quite that well.
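In other words, the comparison is just overclock percentage versus FPS percentage (a trivial sketch, using the numbers above as the assumed inputs):

```python
# Scaling efficiency: what fraction of a clock increase shows up as an FPS increase.
def scaling_efficiency(clock_gain_pct: float, fps_gain_pct: float) -> float:
    return fps_gain_pct / clock_gain_pct

print(scaling_efficiency(10, 10))  # 1.0 -> the article's apparent perfect 1:1 scaling
print(scaling_efficiency(10, 9))   # 0.9 -> the 6850: very good, but not quite 1:1
```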
|
# ¿ Jan 9, 2012 18:07 |
|
CES is in progress, and no sign of Kepler so far. I think we were speculating that that would mean Kepler is not gonna trounce Southern Islands or anything. Also, want to trade two 6850s for your 580 and break the hell-cycle that is high-end upgrading?
|
# ¿ Jan 9, 2012 22:40 |
|
They also demoed a single-cable DisplayPort, USB 3.0, and power connector. It'll take a bit longer than Trinity to come out, though.
|
# ¿ Jan 13, 2012 04:39 |
|
Install Gentoo posted:I did explicitly say he was running XP home and that that is 32 bit only, yes. Also he'd already told the forum that he'd added more RAM recently to speed things up and it was "totally working" even though it of course couldn't be. Maybe he was using totally legitimate copies of Adobe Creative Suite software, which could manage their own memory and thus use 4 GB chunks of RAM that the OS itself couldn't touch. For his Important Projects That Justify Such a Machine. This kid, doing that. Sounds right to me.
|
# ¿ Jan 18, 2012 18:42 |
|
Alereon posted:On nVidia you can do up to 2 monitors per-card OR 2 monitors total in SLI. Small correction: [url=http://www.nvidia.com/object/3d-vision-surround-system-requirements.html]with higher-end cards, you can do 3 monitors total in SLI[/url], with very limited choices of resolution.
|
# ¿ Jan 19, 2012 15:26 |
|
It has to do with how the signal is generated. DVI and HDMI use TMDS generators on the card to create a clock signal, with matching circuitry on the receiving end to recover it. You need one generator per unique display. These generators cost money and take up card space, so it's not super cost-effective to install more of them when most people use one or two monitors. Same deal with VGA connections and RAMDACs, roughly. And the connector pins are fixed-function, with distinct pins for the clock signal; you gotta work with that.

DisplayPort, OTOH, works on packetized data with self-clocking. And since each connector and signal generator comes with four distinct "lanes," you can handle them with some flexibility: either run the lanes separately and drive a 1080p60 display on each one, or send a quarter of a single large display down each and aggregate them for, say, 1600p120.

Of course, you need proper driver support and hardware for this lane tomfoolery, and nVidia gives next to no shits about DisplayPort. Their driver updates regularly disable the DP output entirely until they accidentally turn it back on a month or two down the line. Never mind triple monitor on one card. nVidia gives no shits.
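For a rough sense of the per-lane headroom, here's a back-of-the-envelope bandwidth check (a sketch only: it assumes DP 1.2 HBR2 link rates and 24 bpp, and the pixel clocks are approximate, ignoring some blanking and protocol overhead):

```python
# Approximate DisplayPort lane-bandwidth math (illustrative assumptions, not a spec reference).
HBR2_GBPS_PER_LANE = 5.4   # raw line rate per lane at DP 1.2 / HBR2
PAYLOAD_FRACTION = 0.8     # 8b/10b encoding leaves ~80% of the raw rate for pixel data

def lane_payload_gbps(lanes: int) -> float:
    return lanes * HBR2_GBPS_PER_LANE * PAYLOAD_FRACTION

def mode_gbps(pixel_clock_mhz: float, bpp: int = 24) -> float:
    return pixel_clock_mhz * bpp / 1000.0

# 1080p60 (~148.5 MHz pixel clock) fits on a single lane:
print(mode_gbps(148.5), "vs", lane_payload_gbps(1))   # ~3.6 vs ~4.3 Gbit/s

# 2560x1600 @ 120 Hz (~537 MHz with reduced blanking) fits if all four lanes are aggregated:
print(mode_gbps(537.0), "vs", lane_payload_gbps(4))   # ~12.9 vs ~17.3 Gbit/s
```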
|
# ¿ Jan 19, 2012 18:13 |
|
tijag posted:Not sure what to make of this. I may be working from a small sample size, but SemiAccurate seems like a site full of hysterical horseshit to me. When Intel used a video to demo Ivy Bridge's DX11 graphics, SemiAccurate's response was 1000 words of "JESUS gently caress THIS IS AN ABOMINATION! THIS IS A CRIMINAL ACT OF DECEPTION AND FRAUD BEYOND ALL KEN, JUST LIKE THAT TIME NVIDIA USED WOOD SCREWS AMIRITE?" AnandTech's coverage? "We saw the VLC interface during the DX11 demo. Whoops. Turns out the demo was slapped together last-minute. Now, I've already seen Ivy Bridge, but for posterity I asked Intel to re-run the demo on an IVB laptop, and they did. Here's the YouTube." I'm not inclined to believe anything that site says until I hear it somewhere else as well.
|
# ¿ Jan 19, 2012 21:53 |
|
Man, we have all these opinions and nowhere to put them besides the Internet.
|
# ¿ Jan 20, 2012 18:28 |
|
PowerVR is in the game, still.
|
# ¿ Jan 21, 2012 00:42 |
|
AMD's FY11 earnings report is out, and there will be a company strategy meeting next week.
|
# ¿ Jan 26, 2012 02:16 |
|
FISHMANPET posted:With per core performance the Intel is kicking rear end on the low end, but until a single i3-2100 core is twice as fast as an Athlon x4 645 core, I'm not sure why you're all saying it's such a clear winner. Not all of those benchmarks are synthetic. The Cinebench ones, for example, measure rendering of a real-world scene, and there the i3's single-threaded performance is about 54% higher than the Athlon's. Combined with Hyperthreading, the dual-core is only about 6% behind the full quad-core Athlon at equal clocks in multithreaded performance. The i3's fully-loaded performance will be lower than the Athlon's, that's true. But its moderate-load performance will be competitive at 25% less power consumption, and if single-thread performance ever becomes paramount (say the Minecraft server gets heavily loaded), the i3 will have a major advantage over the Athlon. So the two options are pretty close, with trade-offs either way.
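A back-of-the-envelope throughput comparison, taking the ~54% single-thread gap from above and assuming a typical ~20-25% Hyperthreading yield (the yield number is my assumption, not a benchmark result):

```python
# Rough fully-loaded throughput estimate, normalizing one Athlon x4 645 core to 1.0.
ATHLON_CORES, ATHLON_PER_CORE = 4, 1.0
I3_CORES, I3_PER_CORE = 2, 1.54     # ~54% faster per core, per the Cinebench numbers above
HT_YIELD = 0.22                     # assumed Hyperthreading gain per core

athlon = ATHLON_CORES * ATHLON_PER_CORE             # 4.00
i3 = I3_CORES * I3_PER_CORE * (1 + HT_YIELD)        # ~3.76

print(f"i3 fully loaded vs Athlon: {i3 / athlon:.0%}")  # ~94%, i.e. the ~6% deficit quoted above
```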
|
# ¿ Feb 1, 2012 18:16 |
|
Well, it's the end of an era. My first ever build was a 1.4GHz Thunderbird when I was in high school. Made my repugnant nerd friend who got me into this stuff so jealous because all he had was a 1 GHz which he hamfistedly overclocked to 1.2. Gonna drink to AMD tonight. Here's hoping they do well in their future endeavours. Also, gently caress you, Spell Check, that's how the space shuttle is spelled.
|
# ¿ Feb 2, 2012 19:24 |
|
Yeah, Bulldozer isn't going anywhere. There will even be new AM3+ FX CPUs when Piledriver comes out. But the next version beyond that seems to drop the AM3+ desktop CPUs, according to the roadmap.
|
# ¿ Feb 3, 2012 03:27 |
|
So, if that's accurate, AMD will have dominated the high-performance market for six months with a more power-efficient architecture that has a bright future, while their main competitor in the space is having trouble bringing an enormous chip to market, one that might outperform AMD's product under ideal conditions but increasingly looks like it won't win on raw performance and will draw more power at that?
|
# ¿ Feb 8, 2012 16:49 |
|
Agreed posted:I don't know, I might just get a second 580 under the "gently caress it" clause, this is ridiculous. You've probably already done this, but look into how that would impact your CUDA work. I know with dual-card AMD setups that you can't do GPGPU on two cards when CF is enabled. It looks like this holds true for SLI, as well, so there might be some hassle involved.
|
# ¿ Feb 8, 2012 17:37 |
|
movax posted:I wouldn't go Intel because I play games (if I didn't game I would not even bother with a discrete GPU unless I needed more displays), but I won't go ATI/AMD because their drivers are awful compared to nvidia. Their Radeon 7500/8500 "drivers" left a very sour taste in my mouth. The gaudy, bloated CCC (do they still do that?) doesn't help either. That's really not fair. nVidia has had at least as many high-profile driver fuckups as AMD in the past two years. Catalyst 11 wasn't that bad (and was it really THAT much worse than the nVidia control panel?), and Catalyst 12 has been de-gaudied and brought to feature parity with nVidia's control panel. And Eyefinity just completely blows away nVidia's multi-monitor support, on both the hardware and software side.
|
# ¿ Feb 9, 2012 14:57 |
|
I went from a Radeon 4850 to a 6850 to a pair of 6850s. Depends on what you're upgrading from. HD 5000 to HD 6000 was evolutionary, as was GeForce 400 to 500.
|
# ¿ Feb 9, 2012 18:13 |
|
DX11 mode is a hambeast, but it scales well. It's completely playable and almost entirely buttery smooth on CF 6850s, even where "as much GPU" in a single card apparently still drops to single digits. The biggest performance factor there is tessellation, and I don't use much of it.
|
# ¿ Feb 10, 2012 00:33 |
|
The GPU market is all about price discrimination. Eventually, these will be good buys. At launch, they don't really differentiate themselves from current offerings on performance, so they'll suck money out of people who buy computers and parts with an enthusiasm for big numbers.
|
# ¿ Feb 15, 2012 18:04 |
|
Agreed posted:Two of them is one of the better price:performance choices at the moment, offering performance similar to a GTX 580. On that note, if I were dropping $300 on graphics cards all at once, and doing so today, an overclockable GeForce 560 Ti-448 would be my preference. No CrossFire headaches, similar performance, slightly lower power consumption, plus you get PhysX for your troubles.
|
# ¿ Feb 15, 2012 20:00 |
|
As long as they're using memes in marketing, nVidia should put up a billboard in front of AMD HQ with a "no1curr" macro.
|
# ¿ Feb 18, 2012 02:48 |
|
Alereon posted:AMD Trinity APU (two Piledriver modules, four "cores", plus VLIW4 graphics) Eh? I thought Trinity was going to have GCN graphics. Leaked slides apparently referenced "next-gen DX11," which is kind of a limited field but suggests DX11.1.
|
# ¿ Feb 29, 2012 23:36 |
|
Alereon posted:It's been confirmed as VLIW4 since last year, there's a pretty long lead time between when a new architecture hits the discrete market and when it gets integrated. Friggin' Google only giving me results since October. Thanks.
|
# ¿ Feb 29, 2012 23:50 |
|
According to completely unsourced rumors, in hours. We probably need a GPU thread.
|
# ¿ Mar 22, 2012 05:10 |
|
Agreed posted:Why the drop from the higher memory bus to the lower one? Weird choice, they're packing fast GDDR5 and yet I've got 1.5x the memory bus on my 580 as the 680 has. But, the price is right, it'll force competition from ATI. Might wait around to see what partners do with it before picking one up, might wait on GK110, overclocked 580 is holding steady for the time being and that's a great price for the performance it offers. Wait and see is my plan, with the eventual goal of purchasing. According to AnandTech, there's an even-higher-end single-chip Kepler forthcoming. The GTX 680 is the current single-chip king, but it won't remain the single-chip king of this generation.
|
# ¿ Mar 22, 2012 18:20 |
|
Maybe it'll tie into Intel's trend of pumping up the last two digits of Core i processor model numbers, and two years from now both will be dipping into hexadecimal digits since they used up 99 already. Get your Alienware AlienAnus with i7-5AAA A-core CPU @ AAA GHz with two nVidia GeForce GTX AAA GTX in SLI! With 0xAAA CUDA cores! And 0xAAAAAA KB of RAM! We call it, "AAAAAAAAAAAAAAAAAAAAAA!"
|
# ¿ Mar 22, 2012 18:55 |
|
Magic Underwear posted:So the 680 is a great card. Can we agree that even so it is overkill and a bad value for 1080p or lower? It's clearly great for 1600p or 5760x1200 but the extra $200+ that you're spending over a mid-range card doesn't get you $200+ in performance. You don't understand. TRANSISTORS! But yes.
|
# ¿ Mar 23, 2012 02:29 |
|
What this card does is force a new price/performance curve to emerge. The 7970 was priced against Radeon HD 6000/GeForce 500, even though it's probably cheaper to build a 7970 than a 6970 that sells for $200 less right now. That pricing made sense from AMD's point of view: semiconductor products have low marginal costs relative to the investment needed to start building a part in the first place.

But we're already starting to see prices budge a bit on 7900 cards; Sapphire and XFX, for example, are offering volume discounts for ordering 2 or 3 at once. More directly, the 6900 series is almost gone now that the 7800 series is there to replace it. The higher-end GeForce 500 series (560 Ti-448, 570, 580) got big price cuts to bring them in line with the 680 being a $500 card, and so are likely having their stock drained from retail channels, not to be replenished once Kepler-based replacements are out there.

What the GeForce 680 isn't changing yet is the lower end of the market. There are plenty of Radeon 6850s and GeForce 560 Tis filling out the big-volume parts of the market, and their prices have been about the same since months before the 7970 hit. That's likely because there's just plenty of stock lying around.
|
# ¿ Mar 23, 2012 08:17 |
|
That's pretty much what application profiles are: big fancy driver conditionals. Graphics driver packages aren't ~150 MB for the fun of it. But it's a natural evolution of having APIs like DirectX or OpenGL, many studios being eternally rushed and cash-strapped, and many games being console ports when PCs so overwhelmingly overpower consoles. From a studio's perspective there is little to gain in knowing PC hardware inside and out when 1) you can program to the API, 2) there are multiple major hardware vendors out there for graphics and CPUs, each with a wide range of offerings, and 3) it's the hardware manufacturer's job to make the hardware-to-API interface work well anyway.

Meanwhile, look at Valve: pushing a seven-year-old engine that still looks pretty drat good and performs like mad on modern hardware because they invest a lot of time in optimization. Well, that, and art direction that plays to the engine's strengths, plus having designed those strengths well in the first place.
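In spirit, an application profile boils down to something like this, matching the game's executable and flipping per-game workaround flags (a toy sketch with made-up names and flags, not any vendor's actual profile format):

```python
# Toy illustration of a driver application profile: per-game overrides on top of defaults.
DEFAULTS = {"sli_mode": "afr", "force_vsync": False, "disable_driver_msaa": False}

PROFILES = {  # hypothetical executables and flags
    "some_console_port.exe": {"sli_mode": "single_gpu", "force_vsync": True},
    "engine_with_afr_bugs.exe": {"sli_mode": "afr2", "disable_driver_msaa": True},
}

def settings_for(exe_name: str) -> dict:
    """Defaults overridden by the game's profile, if one exists."""
    return {**DEFAULTS, **PROFILES.get(exe_name, {})}

print(settings_for("some_console_port.exe"))
# {'sli_mode': 'single_gpu', 'force_vsync': True, 'disable_driver_msaa': False}
```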
|
# ¿ Mar 24, 2012 18:34 |
|
Game rendering can get really screwed up if a game encounters SLI and tries some tricks that aren't SLI-compatible. Sometimes the proper compatibility mode is "Oh, that game? Don't even wake up the second card, just do it all on one."
|
# ¿ Mar 25, 2012 04:47 |
|
That's nothing, they recently enabled CrossFire scaling for Mass Effect 3. Now instead of 60 frames per second faster than my monitor can refresh, I'm doing 180 frames per second faster.
|
# ¿ Mar 25, 2012 08:58 |
|
Might be a bad cable, they can do screwy things. Have you tried each interface on each monitor? But yeah, with a simple Active DP adapter like this one you can plug in a DVI monitor or, from that plug with an extra cheap pin adapter or DVI->HDMI cable, an HDMI screen.
|
# ¿ Mar 26, 2012 00:35 |
|
Yes, that's right. And what you read is that DVI and HDMI are the same video signal, so a simple pin adapter can swap between them (except that DVI won't carry the audio). So unless you're hooking up a screen with speakers or something, DVI = HDMI with a $3 adapter from Monoprice. Any DisplayPort-equipped Radeon since the 5000 series can do at least three monitors.
|
# ¿ Mar 26, 2012 01:05 |