movax
Aug 30, 2008

Agreed posted:

Serious talk: if I were guessing, this would be my guess. They've gone pretty far out of their way to establish and promote the Ti branding.

GeForce4 Ti4400 :unsmith:

e: new page, poo poo. Comments spammed with AMD Fanboys claiming that their beloved Team Red has more OCing headroom. More at 11.

Srebrenica Surprise
Aug 23, 2008

"L-O-V-E's just another word I never learned to pronounce."

Crackbone posted:

Actually, they could use 685, like they did back with the 285. The 95 is usually reserved for the dual-GPU/single-card setup. But then again, who knows with Nvidia, they seem to make poo poo up as they go along.
The $829 MSRP 3.25GB VRAM 680 Ultra :getin:

movax
Aug 30, 2008

Srebrenica Surprise posted:

The $829 MSRP 3.25GB VRAM 680 Ultra :getin:

Really, all the people who saved some $$$ buying the Korean Catleap 27" monitor should throw some of those savings at a 680 and enjoy playing every game currently available with a gigantic erection.

KillHour
Oct 28, 2007


Oh God, I just ordered an MSI one. I couldn't resist. Someone tell me I didn't just make a huge mistake and blow 500 bucks.

To be fair, it will be replacing a pair of GTX260's that should have been put out to pasture a LOOOONG time ago.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

KillHour posted:

Oh God, I just ordered an MSI one. I couldn't resist. Someone tell me I didn't just make a huge mistake and blow 500 bucks.

To be fair, it will be replacing a pair of GTX260's that should have been put out to pasture a LOOOONG time ago.

There are very few situations for which the 680 isn't total overkill. Also, if the expense isn't cutting into your needs budget, and you like the pretty stuff in games, you didn't make a mistake, it's clearly a badass card. Enjoy PhysX and high framerates and stuff, I'll be joining the club sooner or later I'm sure.

tijag
Aug 6, 2002
I bought it too. First $500 video card I've ever had. I'm building an IVB box and this will complete it. I'm probably going to overspend and get the i7-3770K as well. The computer will be with me for 4 years.

The $230 difference between what I SHOULD get and what I WANT to get [should get the i5-3570K and HD 7870] is like $4.70 a month over that period of time. I can't say no to what I want for such a small difference in money over the length of time I'll own these cards.
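(For what it's worth, the monthly figure roughly checks out if you spread the $230 over the full four years:)

\[
\frac{\$230}{4 \times 12\ \text{months}} = \frac{\$230}{48\ \text{months}} \approx \$4.79\ \text{per month}
\]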

Also, given that the development cycle for GPUs seems to have slowed down, I'm pretty sure that the 680 will really be very solid over that length of time. We will get at least 1 more generation on 28nm for sure, and I wouldn't be surprised if we are on 28nm for even a bit longer than we were on 40nm.

Anyway, here comes maxed out adaptive V-synced TXAA2 graphics to my humble 2048x1152 monitor [in about a month].

I'm pretty excited.

:dance:

KillHour
Oct 28, 2007


Agreed posted:

There are very few situations for which the 680 isn't total overkill. Also, if the expense isn't cutting into your needs budget, and you like the pretty stuff in games, you didn't make a mistake, it's clearly a badass card. Enjoy PhysX and high framerates and stuff, I'll be joining the club sooner or later I'm sure.

I totally will. I don't treat myself enough. :)

Just need to wait until IB drops, and I can replace my ancient E6300 CPU.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

KillHour posted:

I totally will. I don't treat myself enough. :)

Just need to wait until IB drops, and I can replace my ancient E6300 CPU.

You'll need to, man, talk about bottlenecked! But, hey, you've got a 680 :v:

Magic Underwear
May 14, 2003


Young Orc
So the 680 is a great card. Can we agree that even so it is overkill and a bad value for 1080p or lower? It's clearly great for 1600p or 5760x1200 but the extra $200+ that you're spending over a mid-range card doesn't get you $200+ in performance.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Magic Underwear posted:

So the 680 is a great card. Can we agree that even so it is overkill and a bad value for 1080p or lower? It's clearly great for 1600p or 5760x1200 but the extra $200+ that you're spending over a mid-range card doesn't get you $200+ in performance.

You don't understand. TRANSISTORS!

But yes.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Magic Underwear posted:

So the 680 is a great card. Can we agree that even so it is overkill and a bad value for 1080p or lower? It's clearly great for 1600p or 5760x1200 but the extra $200+ that you're spending over a mid-range card doesn't get you $200+ in performance.

Yeah, nobody would go into the parts picking thread and tell people running "mere" 1080p or whatever to go out and spend $500 on a graphics card. This is a different kind of thread, where we can nerd out about how great high end graphics are this generation.

That said, it's everyone's prerogative how they want to spend their money. I feel like there's no room to judge someone for something as totally subjective as how much shiny stuff they want to turn on. It's like this - a 560Ti or a 6870 will run 1080p with a lot of pretty settings turned on in many games, at framerates above 30fps almost all the time. Turn down some settings and you can get 60fps. But who gets to tell someone that if they want to play games with graphically intense options cranked all the way up, they're not allowed to splurge on a card that can take advantage of them?

I didn't get my card for games, I got it for CUDA. A 570 would have been a better choice, in retrospect, but even so, in gaming I've seen my VRAM get close enough to maxed out at 1080p with my GTX 580 (1.5GB framebuffer) with current gen games and all settings as high as they'll go. And even though 1080p seems to be (in my opinion not really legitimately) considered a low resolution, there are plenty of games which do take full advantage of the rendering power of the card. In games that don't, with a few extra fancy options enabled in nvidiaInspector I can get a game to go from 35-50% GPU utilization to pegging it out at 99% utilization, with commensurate improvements in graphical shiny stuff.

If you're spending someone else's money and they've given you a flexible budget to put together a work-oriented rig, that's one thing. If you're strapped for cash and weighing "eat ramen and vitamins for the next month to get a GTX 680 at launch," that's another. But if it's within your means and you want it, it may not be required - but then a discrete graphics card isn't strictly required at all these days anyway, with Intel's integrated graphics doing just fine for even multi-monitor normal desktop usage. So whatever you can swing there and whatever you're happy with performance-wise is your business, I think, regardless of whether it slots ideally on the price:performance curve.

We've also become more attuned to issues (like microstuttering) with SLI and Crossfire setups that have us less and less commonly recommending them for higher resolution usage, so cards that can handle gaming at high resolutions without relying on doubling up resources can be a sensible buy even if, again, raw FPS benchmarks show less expensive setups outperforming them in FRAPS measured framerate.

freeforumuser
Aug 11, 2007
drat, GTX 680 is mightily impressive.

This is hilarious. It took just one card from Nvidia to steal all the thunder from AMD's entire 7xxx lineup.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
What this card does is it forces a new price/performance curve to emerge. The 7970 was priced based on Radeon HD 6000/GeForce 500, even though it's probably cheaper to build a 7970 than a 6970 that sells for $200 less right now. It was pricing that made sense from AMD's point of view - semiconductor products have low marginal costs relative to the investments needed to start building a part in the first place.

But already we're starting to see prices budge a bit on 7900 cards, for example - Sapphire and XFX are offering volume discounts for ordering 2 or 3 at once. More directly, the 6900 series is almost gone now that the 7800 series is there to replace it. The higher-end GeForce 500 series (560 Ti-448, 570, 580) got big price cuts to bring them in line with the 680 being a $500 card, and so are likely having their stock drained from retail channels, not to be replenished once Kepler-based replacements are out there.

What the GeForce 680 isn't changing yet is the lower end of the market. There are plenty of Radeon 6850s and GeForce 560 Tis filling out the big-volume parts of the market, and their prices have been about the same since months before the 7970 hit. This is likely because there's just plenty of stock lying around.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

freeforumuser posted:

drat, GTX 680 is mightily impressive.

This is hilarious. It took just one card from Nvidia to steal all the thunder from AMD's entire 7xxx lineup.

Eh, that's not entirely true. It is clearly the best card right now, but if AMD shifts the pricing of their cards, they could still be a wholly viable option.

It is of course a bit painful for AMD since they needed the cash to bolster themselves, due to their entirely flaccid Bulldozer.

kuddles
Jul 16, 2006

Like a fist wrapped in blood...
I'll probably pick one up, but I'm waiting until a store I buy from here in Canada has stock of the EVGA version. Now that the results are in, I'm calling off my experiment with the AMD card.

I know people have had different experiences so I'm not trying to start another debate. All I know is that personally the past year-and-a-half since I switched to using a Radeon card has been a non-stop barrage of driver issues.

The fact that you can now turn on Triple Buffering in DirectX and apply FXAA to anything directly in the new Nvidia drivers without fooling around with third-party tools is the clincher for me. That new vsync option sounds pretty interesting, too.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Man, last year was a lovely year for drivers for everyone. Major releases had all kinds of driver trouble on both sides. RAGE and ATI (there's irony there if you go back far enough in ATI's history, which as far as I know nobody has brought up yet, so ten points to loving Agreed, score), Skyrim and nVidia (I mean thanks for the free 40% performance improvement but what the hell happened there that I was only getting 60% to start with?), everyone and Batman:AC (... in DX11), 560Ti and BF3, year suuuucked for drivers.

I do agree that the more aggressive control panel forcing is exciting for the "don't have to dick around with stuff" factor, but I betcha it's not going to be quite as straightforward as it sounds. Or, rather, while it will take the pain in the rear end out of using stuff like FXAA injector, it won't solve the "wtf is FXAA doing to the HUD" issue.

nVidiaInspector will still be super useful for forcing "wrong" AA flags to allow different/better/will-actually-use-the-card's-GPU AA in games for the resolution-challenged, though, betcha.

Magic Underwear
May 14, 2003


Young Orc
Doesn't forcing FXAA blur the text on the screen? I remember reading that a game-specific FXAA implementation is needed to avoid that.

kuddles
Jul 16, 2006

Like a fist wrapped in blood...

Magic Underwear posted:

Doesn't forcing FXAA blur the text on the screen? I remember reading that a game-specific FXAA implementation is needed to avoid that.
Yes it does, although that has been improving gradually. It's not something I would use for most things, but for games where traditional AA either doesn't work at all or doesn't do a very good job, I find it preferable to AMD's solution (MLAA). It's actually one reason why I switched to AMD in the first place: at the time, a lot of the games I was playing did not let you force AA at all, and since AMD was providing MLAA while Nvidia seemed focused on 3D, I decided to make the switch.

Nvidia are also working on something called TXAA which could be interesting.

kuddles fucked around with this message at 19:21 on Mar 23, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

kuddles posted:

Nvidia are also working on something called TXAA which could be interesting.

If they manage to pull off a proprietary AA system (like MLAA was initially) and it actually sticks for them the way PhysX has, after having basically pulled the rug out from under MLAA with the cross-platform FXAA, that's a serious needling of ATI.

Setzer Gabbiani
Oct 13, 2004

Agreed posted:

AA stuff

See, here's what really bugs me about current GPUs in general - lazy AA implementation, or none at all - both on the manufacturer side and the game developer side. I think postprocessed AA is a neat feature, but neat only if you're on a GPU that can't handle 'traditional' AA, or a console, and chances are, most of us are buying high-end/enthusiast cards capable of traditional AA, and would rather not have shader-based AA at the cost of global blurring, decreased IQ, or overly-sharpened/smoothed text and UI elements

AMD's CAPs are the worst offenders, because they disable more than they fix. If a game has issues with Catalyst AI applying AA when it's that special case of way off-standard deferred shading, weird SSAO, or a game making the mistake of being DX10/DX11, instead of a driver-level workaround/special flag, it just disables it completely. I think the last time I ever saw a CAP restore AA functionality was with loving Bionic Commando

nVidia does this too, but at least Inspector lets you customize flags on a per-game basis. It's been proven that finicky things like LA Noire and Dead Space 2 are totally playable with normal AA; granted you get a drop in FPS, but on something like a 680 you're not going to notice forced SGSSAA when the framerate is already in the 60-something range. It's one of the more convenient things that tempts me to go back to a GeForce someday

Devs making some attempt to work with AMD and nVidia would be nice, granted it's a "hey, help us make our crazy implementation of Unreal 3.0 work with your drivers", and not just a fist-bump and $100k to throw an AMD or nV logo on the splash screen, or something that favors one manufacturer entirely like the initial Arkham Asylum release

And compared to both MLAA and FXAA, I like SMAA more

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I really don't think it's quite like that - while lazy implementation is definitely a problem, attentive implementation of shader-based methods is really impressive in terms of performance for image quality. If the HUD isn't part of what FXAA is working on (as in games that support it natively), it looks great. I don't feel like we should be happy to take a technological walk backwards just because the demands on performance aren't high enough to max current gen hardware. FSAA/SSAA has its own issues with certain image quality compromises, despite its tremendous performance cost and great job at catching standard jaggies.

We need both tech improvements and performance improvements, not just one or the other. FXAA's next iteration should offer a great combo of pixel-based and edge detection, with sharpening to help prevent the softness that can accompany its anti-aliasing, all in a basically-free render pass. Being portable to DX9 and older hardware doesn't mean devs can't implement higher-performance versions for really nice-looking AA that doesn't carry a 30%+ performance hit. (Just using FXAA as an example here, not the be-all end-all, though I have followed its development and think it's a great tech.)
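For anyone curious what a pixel-based, contrast-driven filter actually amounts to, here's a toy C++ sketch of the general idea behind post-process AA like FXAA. To be clear, this is not Lottes' actual algorithm or any shipping shader - just the gist: measure local luma contrast and, where it's high, blend the pixel toward its neighbors.

```cpp
// Toy post-process AA in the spirit of FXAA (illustrative only, not the real algorithm).
// Works on a tiny hard-coded "luma" image instead of a real framebuffer.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int W = 8, H = 6;
    // A hard, aliased diagonal edge: 0.0 = dark, 1.0 = bright.
    std::vector<float> luma(W * H, 0.0f);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x)
            if (x > y) luma[y * W + x] = 1.0f;

    std::vector<float> out = luma;
    const float edgeThreshold = 0.25f;  // only touch pixels with real local contrast
    const float blendStrength = 0.5f;   // how far to pull toward the neighborhood average

    for (int y = 1; y < H - 1; ++y) {
        for (int x = 1; x < W - 1; ++x) {
            float c = luma[y * W + x];
            float n = luma[(y - 1) * W + x], s = luma[(y + 1) * W + x];
            float w = luma[y * W + x - 1],  e = luma[y * W + x + 1];
            float lumaMin = std::min({c, n, s, w, e});
            float lumaMax = std::max({c, n, s, w, e});
            // FXAA-style early out: low contrast means no visible aliasing here.
            if (lumaMax - lumaMin < edgeThreshold) continue;
            float avg = (n + s + w + e) * 0.25f;
            out[y * W + x] = c + (avg - c) * blendStrength;  // soften the stair-step
        }
    }

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) std::printf("%4.2f ", out[y * W + x]);
        std::printf("\n");
    }
    return 0;
}
```

The real thing filters along the detected edge direction and handles sub-pixel aliasing, which is why it can smooth jaggies without turning the whole frame to mush - and why HUD text gets caught in the crossfire when it's forced from the driver instead of applied before the UI is drawn.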

shodanjr_gr
Nov 20, 2007
There is an issue with doing native AA in that a lot of games have moved to deferred rendering pipelines, and super-sample AA doesn't work well with those, especially if you are targeting DX9-era hardware. Hence we get the proliferation of post-processing techniques.
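To put rough numbers on that: with deferred shading, every G-buffer target has to be multisampled, and the lighting pass then has to touch all of those samples. A back-of-the-envelope C++ sketch - the G-buffer layout and byte counts below are just an assumed, fairly typical arrangement for the era, not any specific engine's:

```cpp
// Rough G-buffer memory cost with and without 4x MSAA (illustrative assumptions only).
#include <cstdio>

int main() {
    const long long width = 1920, height = 1080;
    const long long pixels = width * height;

    // Assumed G-buffer layout (bytes per pixel per render target):
    const long long albedoRGBA8    = 4;  // base color + spec mask
    const long long normalsRGBA16F = 8;  // normals + roughness
    const long long extraRG16F     = 4;  // motion vectors / misc
    const long long depthD24S8     = 4;  // depth-stencil
    const long long bytesPerPixel  = albedoRGBA8 + normalsRGBA16F + extraRG16F + depthD24S8;

    const double mb = 1024.0 * 1024.0;
    const double noMsaa = pixels * bytesPerPixel / mb;
    const double msaa4x = pixels * bytesPerPixel * 4 / mb;  // every target stores 4 samples

    std::printf("G-buffer @ 1080p, no MSAA : %6.1f MB\n", noMsaa);
    std::printf("G-buffer @ 1080p, 4x MSAA : %6.1f MB\n", msaa4x);
    return 0;
}
```

Roughly 40 MB ballooning to ~160 MB (plus the bandwidth to light it) is a big part of why post-process AA took over - and on DX9 you can't read individual samples out of a multisampled render target in the lighting pass anyway, so the memory math is almost beside the point there.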

hillaryous clinton
May 11, 2003

super dynamic
Taco Defender

Agreed posted:

Man, last year was a lovely year for drivers for everyone. Major releases had all kinds of driver trouble on both sides. RAGE and ATI (there's irony there if you go back far enough in ATI's history, which as far as I know nobody has brought up yet, so ten points to loving Agreed, score), Skyrim and nVidia (I mean thanks for the free 40% performance improvement but what the hell happened there that I was only getting 60% to start with?), everyone and Batman:AC (... in DX11), 560Ti and BF3, year suuuucked for drivers.

Some of the blame should go to the game developers, no? They knew what they were working with when they coded the game, and they released it knowing exactly how it would perform.

Does anyone here have experience writing code for modern video drivers? When I see driver release notes with "fixes performance issues in <game>" I imagine a driver codebase littered with game-specific conditionals. Nasty! Come to think of it, I wonder if AMD and Nvidia drivers have codepaths that mimic the PS3 and Xbox 360 GPU drivers. Considering most PC games are console ports...

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That's pretty much what application profiles are, big fancy driver conditionals. Graphics driver packages aren't ~150 MB for the fun of it.

But it's a natural evolution of having APIs like DirectX or OpenGL, many studios being eternally rushed and cash-strapped, and many games being console ports when PCs so overwhelmingly overpower consoles. There is little to gain, from a studio perspective, in knowing PC hardware inside and out when 1) you can program to the API, 2) there are multiple major hardware vendors out there for graphics and CPU, as well as a wide range of offerings within those manufacturers, and 3) it's the hardware manufacturer's job to make the hardware to API interface work well anyway.

Meanwhile, look at Valve. Pushing a seven-year-old engine that still looks pretty drat good and performs like mad on modern hardware because they invest a lot of time into optimization. Well, that and art direction that plays to the engine's strengths, and good art direction in designing those strengths in the first place.
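To make the "big fancy driver conditionals" point above concrete, here's a minimal C++ sketch of what an application-profile lookup conceptually looks like. To be clear, this is not how NVIDIA or AMD actually structure their drivers, and the game names, enum, and flag values are made up - it's just the shape of the idea: the driver keys overrides off the executable name and falls back to safe defaults otherwise.

```cpp
// Conceptual sketch of per-application driver profiles (hypothetical names and values).
#include <cstdint>
#include <cstdio>
#include <string>
#include <unordered_map>

enum class SliMode { SingleGpu, AFR, AFR2 };  // "don't wake the second card" vs. alternate-frame modes

struct AppProfile {
    SliMode  sli;                  // safest default: render on one GPU
    uint32_t aaCompatBits;         // opaque flags the AA path checks for workarounds
    bool     forceTripleBuffering;
};

static const AppProfile kDefaultProfile = {SliMode::SingleGpu, 0x0, false};

// The "big list of conditionals", keyed by executable name (entries are invented).
static const std::unordered_map<std::string, AppProfile> kProfiles = {
    {"citybussimulator.exe", {SliMode::SingleGpu, 0x0,        false}},  // broken with SLI: keep it on one GPU
    {"bf3.exe",              {SliMode::AFR2,      0x080000F5, true }},
    {"skyrim.exe",           {SliMode::AFR,       0x00000045, false}},
};

AppProfile LookupProfile(const std::string& exeName) {
    auto it = kProfiles.find(exeName);
    return it != kProfiles.end() ? it->second : kDefaultProfile;  // global defaults otherwise
}

int main() {
    const char* tests[] = {"citybussimulator.exe", "quake3.exe"};
    for (const char* exe : tests) {
        AppProfile p = LookupProfile(exe);
        std::printf("%-22s sli=%d aaBits=0x%08X tripleBuffer=%d\n",
                    exe, static_cast<int>(p.sli), p.aaCompatBits, static_cast<int>(p.forceTripleBuffering));
    }
    return 0;
}
```

Multiply that by thousands of titles, separate flags for DX9 versus DX10/11 paths, and per-game SLI, AA, and vsync overrides, and a ~150 MB driver package stops looking so mysterious.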

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

maninacape posted:

Some of the blame should go to the game developers, no? They knew what they were working with when they coded the game, and they released it knowing exactly how it would perform.

Does anyone here have experience writing code for modern video drivers? When I see driver release notes with "fixes performance issues in <game>" I imagine a driver codebase littered with game-specific conditionals. Nasty! Come to think of it, I wonder if AMD and Nvidia drivers have codepaths that mimic the PS3 and Xbox 360 GPU drivers. Considering most PC games are console ports...

[screenshot: nVidia Inspector's per-game profile settings, showing the long list of SLI compatibility bits]

That's just the SLI compatibility bits; there's a similarly long list of options for all the stuff that's grayed out in the global profile. And there are unused flags that you can set manually as well to get not-exactly-intended behavior. Note that there's the regular option and then a separate "DX1x" one, so double the specificity for games that ship a DX9 path plus a newer DX renderer...

And scroll to the bottom to see a bunch of really specific weird stuff - basically hacks in the driver needed to match the various hacks games themselves sometimes use.

Yeah, it's really complicated for everybody.

Methylethylaldehyde
Oct 23, 2004

BAKA BAKA

Agreed posted:



City Bus Simulator. Needs SLI?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Game rendering can get really screwed up if a game encounters SLI and tries some tricks that aren't SLI-compatible. Sometimes the proper compatibility mode is "Oh, that game? Don't even wake up the second card, just do it all on one."

movax
Aug 30, 2008

maninacape posted:

Does anyone here have experience writing code for modern video drivers? When I see driver release notes with "fixes performance issues in <game>" I imagine a driver codebase littered with game-specific conditionals. Nasty! Come to think of it, I wonder if AMD and Nvidia drivers have codepaths that mimic the PS3 and Xbox 360 GPU drivers. Considering most PC games are console ports...

I've got experience with Intel and...XGI (the Volari). So, two GPU vendors/families that are basically unsuitable for modern 3D gaming. There's a reason, though, like the above posters have mentioned, that there's a not-small software engineering department at both Team Green and Team Red dedicated to making sure titles work.

APIs like DirectX and OpenGL are supposed to make this job easier, but even then, rendering pipelines can get very complex, very quickly. Especially when you factor in that the programmers are generally being slave-driven and often subject to design-by-committee.

The thing that amuses me about Agreed's screenshot is that it reminds me of emulator GPU plugins that have very game-specific hacks as well.

Not Wolverine
Jul 1, 2007

Methylethylaldehyde posted:

Agreed posted:


City Bus Simulator. Needs SLI?

I don't know what is more depressing, the fact that drivers are just a huge hack job or this in-depth City Bus Simulator 2010 review.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Methylethylaldehyde posted:

City Bus Simulator. Needs SLI?

Needs on-board T&L, that's for sure.

https://www.youtube.com/watch?v=extonQIEVfs

Make sure you select if you've got a Soundblaster compatible (100% or it won't work ok)

Edit: rated M for mature.

Agreed fucked around with this message at 07:40 on Mar 25, 2012

Yaos
Feb 22, 2003

She is a cat of significant gravy.
I was going to ask why the cards don't just leave SLI off unless the game specifically turns it on, then I remembered many developers are idiots and would write long blog posts about how unfair it is that their lovely game needs to be written to tell the graphics drivers what features they want to support and how it would cost trillions of dollars to add this to the game.

Then there would be a thread in the games forum about how small developers are being screwed by Nvidia and AMD and then it would derail into a thread about which engines are actually new and which ones are just upgrades of old ones as if nobody reuses any old assets in new projects.

So we should thank Nvidia and AMD for preventing bad games forums threads.

I guess the real question is who are the jerks running cards so old they get some kind of benefit with SLI in Grand Theft Auto 3, Freedom Fighters, or Tron 2.0? Presumably only one of those represents "leave SLI off", so at least two of those games are supported with SLI. And what the hell is Tabula Rasa still doing in there? They could decrease the size of the drivers by like 1 kb if they removed that.

Yaos fucked around with this message at 08:29 on Mar 25, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That's nothing, they recently enabled CrossFire scaling for Mass Effect 3. Now instead of 60 frames per second faster than my monitor can refresh, I'm doing 180 frames per second faster.

Sombrero!
Sep 11, 2001

So I bought an HD 7750 and just installed the Catalyst 12.2 drivers into my W7-64bit install from here: http://support.amd.com/us/gpudownload/windows/Pages/radeonaiw_vista64.aspx

However, it doesn't seem to have actually installed any display drivers. My Devices window still says 'Standard VGA adapter' for the display. Is there something else I'm missing?

Yaos
Feb 22, 2003

She is a cat of significant gravy.

Sombrero! posted:

So I bought an HD 7750 and just installed the Catalyst 12.2 drivers into my W7-64bit install from here: http://support.amd.com/us/gpudownload/windows/Pages/radeonaiw_vista64.aspx

However, it doesn't seem to have actually installed any display drivers. My Devices window still says 'Standard VGA adapter' for the display. Is there something else I'm missing?
Sometimes Windows gets confused and will not associate the driver with the card correctly; I've had this happen with different devices at work. In Device Manager, right-click on the VGA adapter and tell it to update drivers; it should find the drivers and install them, or tell you it can't find them. If that fails to work, delete the adapter from Device Manager, right-click any entry, and click "Scan for hardware changes". If that fails to work, check and see if your motherboard has onboard video; if it does, go into the BIOS and find the option that turns it off.

Yaos fucked around with this message at 17:25 on Mar 25, 2012
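If Device Manager keeps being cagey, another quick sanity check is to ask the display API directly which adapter name Windows has actually bound. A minimal Win32 C++ sketch (it only lists what's there, it doesn't fix anything; assumes any Windows C++ compiler):

```cpp
// List the display adapters Windows currently knows about.
// If the Catalyst driver isn't bound, you'll see "Standard VGA Graphics Adapter" here too.
#include <windows.h>
#include <cstdio>

int main() {
    DISPLAY_DEVICEA dd;
    dd.cb = sizeof(dd);
    for (DWORD i = 0; EnumDisplayDevicesA(nullptr, i, &dd, 0); ++i) {
        const bool active = (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) != 0;
        std::printf("%lu: %s%s\n", i, dd.DeviceString, active ? " (attached to desktop)" : "");
        dd.cb = sizeof(dd);  // reset before the next call
    }
    return 0;
}
```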

Sombrero!
Sep 11, 2001

Thanks a lot, finally got it working.

New question: is it possible to get 3 monitors with HDMI and DVI output running on the HD 7750? It's got HDMI, DVI, and DisplayPort outputs. Can I use one output for each monitor?

Reason I ask is I'm using HDMI and DVI to power 2 monitors for now and it works great; however, I get random BSODs related to a file named atikmpag.sys and I'm wondering if it's because I'm not supposed to be using HDMI and DVI at the same time. I don't BSOD when I'm only using HDMI or only using DVI.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Might be a bad cable, they can do screwy things. Have you tried each interface on each monitor?

But yeah, with a simple Active DP adapter like this one you can plug in a DVI monitor or, from that plug with an extra cheap pin adapter or DVI->HDMI cable, an HDMI screen.

Sombrero!
Sep 11, 2001

Factory Factory posted:

Might be a bad cable, they can do screwy things. Have you tried each interface on each monitor?

But yeah, with a simple Active DP adapter like this one you can plug in a DVI monitor or, from that plug with an extra cheap pin adapter or DVI->HDMI cable, an HDMI screen.

So with that adapter and my video card I could do:

DVI -> DVI
HDMI -> HDMI
DisplayPort -> DVI via active adapter

All at the same time. Am I getting this right?

Sorry for the dumb questions but Googles have been telling me that the HDMI and DVI port use the same circuitry or something. Not sure if that applies to my card though.

I'll also try replacing the HDMI cable.

Sombrero! fucked around with this message at 02:05 on Mar 26, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Yes, that's right. And what you read is that DVI and HDMI are the same video signal, so a simple pin adapter can swap between them (except that DVI won't carry the audio). So unless you're hooking up a screen with speakers or something, DVI = HDMI with a $3 adapter from Monoprice.

Any DisplayPort-equipped Radeon since the 5000 series can do at least three monitors.

Sombrero!
Sep 11, 2001

I ordered this because it was on AMD's list of Eyefinity approved dongles. It gets here Wednesday so hopefully this works!

I've also replaced the HDMI cable from my card to my monitor and am crossing my fingers and hoping that stops the BSODs. Will post a trip report soon. Thanks a lot for your help, Yaos & Factory Factory.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

Sombrero! posted:

I ordered this because it was on AMD's list of Eyefinity approved dongles. It gets here Wednesday so hopefully this works!

I've also replaced the HDMI cable from my card to my monitor and am crossing my fingers and hoping that stops the BSODs. Will post a trip report soon. Thanks a lot for your help, Yaos & Factory Factory.
When you get it, can you run a test with the monitor connected to that dongle? When you power that monitor off, does Windows think you've lost an available monitor or not?
