HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Here we go, we knew Tahiti and GCN in general have headroom, so AMD keeps ramping clocks.
AMD Radeon HD 7970 GHz Edition

Quick summary for you: 1050MHz clock (it drops back to 1GHz when the power usage is too high), which causes it to beat the GTX 680 in places it didn't before; across the board, numbers are up (as you'd expect) and they've even boosted memory clocks.
Idle power consumption and noise are very low as you'd expect from AMD, but the boost in clocks has made noise under load terrible. But this is of little concern, really, because you'll see large fan coolers out from the usual suspects in no time, I'd wager. Avoid the reference cooler.

In AnandTech's testing here are the number of wins/card:
Gaming: 7970 - 18 / 680 - 16
Compute: 7970 - 5 / 680 - 2
Synthetics: 7970 - 4 / 680 - 1
Overall: 7970 - 27 / 680 - 19

HalloKitty fucked around with this message at 19:49 on Jun 22, 2012

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Agreed posted:

A hundred dollars for that level of performance is sort of a joke when the AMD 7750-900 eats its lunch for about twenty five bucks more.

Don't eat fast food one day, double your graphics performance for the current generation of cards.

How much fast food do you eat :stare:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

How much fast food do you eat :stare:

Okay, pizza? Or just don't be a cheapass; it's an even smaller price difference now, as both are $109 MSRP. The unique selling point, as it turns out, is that the 640 actually works well for 4K HTPC use. Both are specced for it, but only the 640 makes it not suck.

eggyolk
Nov 8, 2007


Agreed posted:

A hundred dollars for that level of performance is sort of a joke when the AMD 7750-900 eats its lunch for about twenty five bucks more.

Don't eat fast food one day, double your graphics performance for the current generation of cards.

Where did you find a small form factor version of the 7750?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

HalloKitty posted:

Here we go, we knew Tahiti and GCN in general has headroom, so AMD keeps ramping clocks.
AMD Radeon HD 7970 GHz Edition

Quick summary for you: 1GHz base clock, turbo up to 1050MHz, which causes it to beat the GTX 680 in places it didn't before, across the board numbers are up (as you'd expect) and they've even boosted memory clocks.
Idle power consumption and noise are very low as you'd expect from AMD, but the boost in clocks has made noise under load terrible. But this is of little concern, really, because you'll see large fan coolers out from the usual suspects in no time, I'd wager. Avoid the reference cooler.

In AnandTech's testing here are the number of wins/card:
Gaming: 7970 - 18 / 680 - 16
Compute: 7970 - 5 / 680 - 2
Synthetics: 7970 - 4 / 680 - 1
Overall: 7970 - 27 / 680 - 19

More significant is that they're launching it price-competitive with the 680 at $499 MSRP. Still, with the 670 being the current card to beat for high-end gaming, they need to do something roughly as spectacular as what nVidia's up to at the $400 mark if they want to pull off a real coup here.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

eggyolk posted:

Where did you find a small form factor version of the 7750?

http://hexus.net/tech/reviews/graphics/37477-sapphire-hd-7750-ultimate-vtx3d-hd-7750/

or if you have slot space but not length,

http://hexus.net/tech/news/graphics/38157-powercolor-outs-passively-cooled-hd-7750-go-green/

eggyolk
Nov 8, 2007


Sorry, meant low profile.

:eng99:

There's never gonna be a competent low profile video card is there? The 6670 is about as good as it gets right now.

hobbesmaster
Jan 28, 2008

eggyolk posted:

Sorry, meant low profile.

:eng99:

There's never gonna be a competent low profile video card is there? The 6670 is about as good as it gets right now.

Seems like something that should be possible. I wish I could install something better than a FireGL 3800 (or whatever they call them now) in the Z200s at work.

real_scud
Sep 5, 2002

One of these days these elbows are gonna walk all over you

HalloKitty posted:

Here we go, we knew Tahiti and GCN in general has headroom, so AMD keeps ramping clocks.
AMD Radeon HD 7970 GHz Edition

Quick summary for you: 1GHz base clock, turbo up to 1050MHz, which causes it to beat the GTX 680 in places it didn't before, across the board numbers are up (as you'd expect) and they've even boosted memory clocks.
Idle power consumption and noise are very low as you'd expect from AMD, but the boost in clocks has made noise under load terrible. But this is of little concern, really, because you'll see large fan coolers out from the usual suspects in no time, I'd wager. Avoid the reference cooler.

In AnandTech's testing here are the number of wins/card:
Gaming: 7970 - 18 / 680 - 16
Compute: 7970 - 5 / 680 - 2
Synthetics: 7970 - 4 / 680 - 1
Overall: 7970 - 27 / 680 - 19
Oh hey that's pretty badass, guess I should find some aftermarket cooling for my 7970 and start OC'ing that bad boy.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

eggyolk posted:

Sorry, meant low profile.

:eng99:

There's never gonna be a competent low profile video card is there? The 6670 is about as good as it gets right now.

AFOX makes a low-profile Radeon 6850, but I don't think there are any North American distributors yet.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

real_scud posted:

Oh hey that's pretty badass, guess I should find some aftermarket cooling for my 7970 and start OC'ing that bad boy.

Also, corrected the figures. It actually runs at 1050MHz most of the time; when PowerTune kicks in, it drops to 1GHz.

Another good review here.

tijag
Aug 6, 2002

Tunga posted:

Same for Batman. My 670 already runs it completely maxed out without any issues, I can't imagine the 680 needs another 13% performance or whatever it was.

Maybe for 3D?

My GTX 680 + i5-3570k @ 4.5ghz regularly stutters if I have everything maxed on Batman:AC. What resolution are you at? I'm on a 2048x1152 monitor.

Incredulous Dylan
Oct 22, 2004

Fun Shoe
Have you tried the new nvidia beta drivers? There has been a specific stuttering and underclocking bug for the 680 that was just fixed with those.

For more anecdotal evidence I run Arkham City in 3D off my single EVGA 680 SC+ with everything max except for AA (I keep it at 8XCSAA) and Physx on High and get amazing performance at 1920x1800.

Incredulous Dylan fucked around with this message at 21:13 on Jun 22, 2012

tijag
Aug 6, 2002

Incredulous Dylan posted:

Have you tried the new nvidia beta drivers? There has been a specific stuttering and underclocking bug for the 680 that was just fixed with those.

For more anecdotal evidence I run Arkham City in 3D off my single EVGA 680 SC+ with everything max except for AA (I keep it at 8XCSAA) and Physx on High and get amazing performance at 1920x1800.

Hmm, perhaps that's what I need to do.

I didn't enjoy the game very much though, so I uninstalled it and am now trying to work through ME1 and then ME2 for the first time.

Is anyone familiar with how to force AA in ME1 with nvidia gpu? I have googled this but the posts are all out of date.

td4guy
Jun 13, 2005

I always hated that guy.

tijag posted:

Hmm, perhaps that's what I need to do.

I didn't enjoy the game very much though, so I uninstalled it and am now trying to work through ME1 and then ME2 for the first time.

Is anyone familiar with how to force AA in ME1 with nvidia gpu? I have googled this but the posts are all out of date.
Go to the nVidia Control Panel. Manage 3D settings. Program Settings tab. Add MassEffect.exe or whatever. Then set the antialiasing mode to override the application, then set the Setting to whatever level you want.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

td4guy posted:

Go to the nVidia Control Panel. Manage 3D settings. Program Settings tab. Add MassEffect.exe or whatever. Then set the antialiasing mode to override the application, then set the Setting to whatever level you want.

If a 670 can't handle FXAA and at least 2xSSAA (total performance hog, but amazing visual quality, and it works with Mass Effect series' deferred rendering better than MSAA or CSAA), something's up.

tijag
Aug 6, 2002

td4guy posted:

Go to the nVidia Control Panel. Manage 3D settings. Program Settings tab. Add MassEffect.exe or whatever. Then set the antialiasing mode to override the application, then set the Setting to whatever level you want.

That doesn't work. It's not a big deal.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

tijag posted:

That doesn't work. It's not a big deal.

Download nVidiaInspector 1.9.6.6 (released yesterday). Set the Antialiasing option to "Override Application Setting" so that it takes over from the in-game deferred renderer and sets up its own back buffer. Set it to either 2xMSAA or (as the card should handle fine) 8XSQ (combines 2x2 SSAA and 2xMSAA). Then, a couple options down, under Transparency Antialiasing, set it to 2x Sparse Grid SSAA. Then, a ways down, right above "Negative LOD Bias" where it should be defaulting to "Allow," set a manual negative LOD bias at -0.500. Turn FXAA on for good measure because why the hell not.

You need to match MSAA level with SGSSAA level, because Sparse Grid Supersampling is a huge performance saver compared to regular transparency Supersampling, but it gets its points of comparison from MSAA. I like 8XSQ because it combines 2x2 SSAA and 2xMSAA for superb jaggie reduction without a dramatic performance hit (on a card like this - these aren't options to use with lower end cards, this is what $400-$500 gets you at 1080p-ish resolutions, no guarantees with multi-monitor or 1440p/1600p).

The negative LOD bias is only applicable to DX9 games, but it helps to ensure crisper textures. -0.500 for 2xSGSSAA, -1.000 for 4xSGSSAA, -1.500 for 8xSGSSAA... But the truth is there is virtually no distinguishable visual difference past 2xSGSSAA, though you'll feel the hit performance-wise for sure. In DX10/DX11 games you don't have the option of setting a negative LOD bias, which can lead to less texture sharpness or even some unintended shimmering, but the visual quality trade is generally worth it, or taken care of by the heavy-duty AA you've got going on elsewhere.
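
For anyone sanity-checking those LOD numbers: they follow the usual "bias = -0.5 * log2(sample count)" rule of thumb that gets passed around the AA-forcing threads. That framing is mine, not anything nVidia documents in the control panel, but a tiny Python sketch reproduces the same values:

code:
import math

# Rule-of-thumb negative LOD bias for a given SGSSAA sample count.
# Assumption: bias = -0.5 * log2(samples), which reproduces the -0.500 / -1.000 / -1.500 values above.
def sgssaa_lod_bias(samples):
    return -0.5 * math.log2(samples)

for n in (2, 4, 8):
    print("%dxSGSSAA -> LOD bias %.3f" % (n, sgssaa_lod_bias(n)))
# 2xSGSSAA -> LOD bias -0.500
# 4xSGSSAA -> LOD bias -1.000
# 8xSGSSAA -> LOD bias -1.500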

SSAA and SGSSAA are only feasible these days because GPUs are incredibly powerful. It used to be the method of choice, but then MSAA came along and offered an acceptable compromise with a much lower performance hit. Then shader-based post processing algorithms (MLAA, FXAA, SMAA, TXAA) came along and they're compatible with anything if implemented right, and (in their modern iterations) are practically free in terms of performance hit for AA comparable to 4xMSAA. MLAA-based algorithms are cool because they're kind of shader agnostic, they don't mess with the sorts of effects modern games like to use to look purty, but it has to be implemented in-game, it was initially AMD-only, and FXAA came out and kicked its rear end cross-platform. There's pretty much no reason to ever turn FXAA off in a game unless it causes unacceptable blurring of text, because the driver-level implementation is effective and leans on post-sharpening enough that it isn't as blurry as some injectors could be before it became officially force-able through nVidia's control panel.

The problem is that deferred rendering engines (which are not a bad thing at all, they can be quite efficient and offer some really impressive visuals with less of a hardware requirement) don't play well with conventional MSAA or even fancier new algorithms like CSAA. But SSAA is brute force and can be forced even on D3D games, from DX9 to DX11. Combine 2xSSAA with 2xMSAA in that 8XSQ (because transparency 2xSGSSAA needs the 2xMSAA to know where to take its lower sample count from) and you should have great image quality even in games which are not amenable to MSAA or CSAA.

I know this sounds like a lot of stuff, but it's really not, you just install one application, launch it as administrator, hit the button on the middle right to get into elevated driver control mode, and you change four things. Boom, good to go.

Boten Anna
Feb 22, 2010

Tessellation is going to be the death of me. I thought I'd be happy forever with a 670 even with my gigantic Korean monitor, but apparently breaking a surface up into about a billion triangles to make things look goddamned amazing is a thing we're going to be doing going forward, and in some things it does push the 670 to where it might dip below 60FPS :v:

Tunga
May 7, 2004

Grimey Drawer

tijag posted:

My GTX 680 + i5-3570k @ 4.5ghz regularly stutters if I have everything maxed on Batman:AC. What resolution are you at? I'm on a 2048x1152 monitor.
I run 1680x1050 so that probably explains why I don't really stretch the card.
Is this stuff applicable to all games? Can I just do this once and never touch it again, or will I be forever fiddling with settings for different games? I've never really got into overriding game-applied video settings but I'm interested.

It would also be a lot simpler if we didn't, as a species, feel the need to invent seventeen different ways to make video game edges look less jagged.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Tunga posted:

I run 1680x1050 so that probably explains why I don't really stretch the card.

Is this stuff applicable to all games? Can I just do this once and never touch it again, or will I be forever fiddling with settings for different games? I've never really got into overriding game-applied video settings but I'm interested.

It would also be a lot simpler if we didn't, as a species, feel the need to invent seventeen different ways to make video game edges look less jagged.

Best to do it game-by-game, because universal settings can be problematic. Exceptions: it's fine to have the universal profile set up a frame limiter (it works in tandem with vsync to provide less input lag with deferred renderers and/or triple buffering) - 58 frames for vsync at 60Hz, different values for different refresh rates. It's also fine to have FXAA turned on universally. No real hassle there.
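
If you want that limiter arithmetic written down somewhere: going by the 58-for-60Hz figure, the pattern is just "cap a frame or two under the refresh rate." A minimal sketch, where the margin of 2 is my assumption for other refresh rates rather than an official number:

code:
# Hypothetical helper: frame limiter value a hair under the refresh rate,
# per the 58-for-60Hz suggestion above. The margin of 2 is an assumption.
def frame_cap(refresh_hz, margin=2):
    return refresh_hz - margin

for hz in (60, 75, 120):
    print("%dHz -> cap at %d FPS" % (hz, frame_cap(hz)))
# 60Hz -> 58 FPS, 75Hz -> 73 FPS, 120Hz -> 118 FPS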

As far as "seventeen different ways," more like hundreds. Most games have preconfigured driver conditionals that force certain behaviors, some of it opaque but others editable by a user that knows what he or she is doing. Drivers are two things - underlying interface between the hardware and the OS to take advantage of acceleration as such, and a huge collection of the accumulated per-game hacks that make various games work right.

To recall a recent example, Skyrim started off with serious problems with nVidia cards. I found it astonishing that my GTX 580 ran it kinda like poo poo at 1920x1080, especially indoors. It didn't make sense. Then nVidia released a new driver set that offered about a 25% performance increase, and an accompanying fix in the drivers for a bug that caused undue poor performance indoors. Then the 300-series added another 40% performance increase on top of that, more or less patching up the bugs that were preventing Skyrim and nVidia's Fermi and Kepler cards from working to their capability. That's two significant hacks in a row and a clear indication that in the pre-adjusted state, high end hardware was only performing up to about 50% of what it could do.

Often for forcing special kinds of AA you need to enter in manual compatibility bits, too, which tells the control panel to ignore nVidia's chosen hacks in favor of other hacks that work better for something specific (e.g. if you're having trouble using the nVidia default UE3-friendly profile for Mass Effect 2, users have experimented and come up with 0x08009CC5 as an AA compatibility bit which allows for a wider variety of AA to be applied, at the cost of some occasional visual artifacts).

Also, with regard to technological progress in AA, it's kind of funny that if you CAN, "2x SSAA for fullscreen + 2x MSAA for a basically-free pass edge detection enhancement and more importantly tuning for transparency AA + 2xSGSSAA for lighter performance hit transparency AA + FXAA for postprocessing" is a great way to leverage older technologies for extreme visual performance. But the only thing in that whole mess of acronyms that's remotely new is FXAA, the other stuff has been around since at least the 8800GT. The goal with new technologies is to provide high performance antialiasing without sacrificing image quality for that performance. Hence FXAA, hence upcoming TXAA (which, being integrated into Unreal Engine 4, should be dynamite). MLAA was ATI's shot at a proprietary AA format and it's cool because it's shader-agnostic, but only recent versions get into the "basically free" territory of performance, and it's highly debatable whether they offer something that's worth it compared to the unquestionably higher performance FXAA.

But so long as game developers do stuff with the DX and OpenGL APIs that are not, strictly speaking, "to the letter of the law," expect it to be on a game by game basis. That's the fun part!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Here's a list of nVidia compatibility bits that can make a game go from "the control panel forcing doesn't work :smith:" to "hooray AA :unsmith:"

http://www.forum-3dcenter.org/vbulletin/showthread.php?t=490867

The table is in English, and there are other good ones linked. This saves a LOT of effort, I mean, imagine trying out various bits yourself to see if they'd work... Talk about excruciatingly boring. Use hardware to play games, not play games to look at what your hardware can do. Lately I've been really digging the two Ys games released, though since Absolute Nature 3 came out I am very much looking forward to a full playthrough of some of my favorite mods with that amazing, I would say legitimately redefining, texture and model pack for S.T.A.L.K.E.R. CoP.

Looks so great, I've spent tons of hours playing the various S.T.A.L.K.E.R. games, plenty on CoP alone because the engine just comes together so nicely... And I didn't recognize screenshots of some "famous" locations in CoP when AN3 came out. And man, it gets up to a very, very high VRAM utilization.

But it still runs at 60FPS (without PhysX even entering the equation - it's not implemented in the game, like most games - so there's presumably some performance hit from running the 680 in a PCI-e 2.0 x8 slot, with the 580 just chilling... well, hotting... so to speak). God drat, Kepler is a great generation. This is a game that couldn't be comfortably run in DX11 mode on my GTX 580 because of performance impacts that weren't acceptable (dips into the 40s maxed out with no forced AA). But the overclocked 680, which as others have noted is a good sample - I'm lucky in that regard, I guess - just chews through it, no problem. Totally, totally smooth performance.

Then I forced the above quad-AA crap. 2xSSAA and 2xMSAA mode, 2xSGSSAA, and FXAA (on top of some built-in post processing it does in the engine that is not identical to FXAA). I expected to have to turn some details down or something, but it freaking still runs at 60FPS, just completely, totally without issue. Unbelievable.

---

Now, to a different topic...

Remember that the 7970 is actually a better clock-for-clock performer than the 680, and is generally regarded as being a better overclocker, all things told. Both cards show dramatic improvements in DX11 in-game compared to previous games, especially when it comes to the all-important (if you care about graphics like this, which is ~40% too much to be considered normal :v:) minimum framerate.

The GTX 580 and the Radeon 7970 spend almost the exact same time rendering a given frame, though with particular DX11 features the 7970 will pull ahead. Of course, clocking a GTX 580 to 1200MHz or greater? Yeah, good luck, that's a suicide run - but the 7970, that ought to be in reach.

So all this nVidia talk, well, I figure there have got to be some dorks on the AMD side of things who can solve AA/control panel issues because of a dramatic overabundance of care. And it is worth looking into, because now that AMD's brought their cards into price and default performance parity with the 680, I'd think it's time to focus less on the GTX 680 and how rad Kepler is, and see what people can get going on with the 7970 side of things.

PhysX is only so cool, after all. ;) AMD's working hard to answer nVidia, since AMD created this generation's basic price:performance criteria and nVidia won :smuggo:-style. Still on the edge of my seat wondering what nVidia's mid-range will look like, whether they'll try to compete on performance or on price. Seems to me they need to aim for price or the 7800 cards will kill 'em; I just don't see them putting out a part that makes sense in that price bracket. Are they going to top GTX 580 performance for $250? AMD can do that, and of course they're happy to grab up the sales... But nVidia has some concerns there that AMD doesn't, obviously. I am quite excited to see what kind of lean, mean chip we end up with - how many SMX, and especially how many ROPs. It's possible nVidia is going to be kind of stuck... But we'll see. I'm excited. It's yet another competition, much more interesting than just a boring generation win. :D

Even the GT 640, which has some moderate things about it that improve the HTPC experience for people, saw a nearly immediate answer from a close AMD partner:

Seriously, look at this crazy thing!

Space Gopher
Jul 31, 2006

BLITHERING IDIOT AND HARDCORE DURIAN APOLOGIST. LET ME TELL YOU WHY THIS SHIT DON'T STINK EVEN THOUGH WE ALL KNOW IT DOES BECAUSE I'M SUPER CULTURED.

My favorite low-profile packaging wackiness will probably always be Sparkle's lo-pro 9800GT:



You can't really see it because of the VRM heatsinks, but it had a PCIe power connector and ran at full stock clocks. The GPU might not be so impressive any more, but that board design is some engineer's magnum opus.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
In a similar vein is this PowerColor Radeon HD 6750 1GB Low-Profile card with two small fans.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Low-profile cards that pack some oomph to them are really, really cool, as stated mainly because they've gotta be somebody or some team's pet project. That's not a mass-market part, it's... well, for comparison's sake, here's the 7750 in other hands. Note some are obviously the 7770 or the GHz edition or both, but still, while some engineers are making tiny ones, other folks are doing all manner of weird stuff that makes hilariously gigantic housings for a card that slots in as "well it's better than a GTX 285 anyway" price:performance-wise.

http://www.google.com/search?q=radeon+7750&um=1&ie=UTF-8&hl=en&tbm=isch

Edit: Personal favorite, also from Sapphire, has to be the Sapphire Radeon HD 7750 Ultimate. Look at this crazy bastard. Passively cooled, maxxximum HD 7750 performance with no fan in sight.




... of course, high performance for a Radeon HD 7750 is still somewhere a bit over half a GTX 460, but imagine the work that went into making it! With no fan to fail, what parts do you warranty? :v:

Agreed fucked around with this message at 15:54 on Jun 24, 2012

Star War Sex Parrot
Oct 2, 2003

Holy poo poo these beta NVIDIA drivers are what the GTX 680 needed on day one. Arkham City is phenomenal now, and the Witcher 2 is beautiful at 2560x1440.

Rekkit
Nov 5, 2006

I have a GT 570m on my laptop that kicks in real loud when I'm playing games. What's a good program to monitor the temperature? I tried using HW Monitor but it only shows the CPU temperature.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Rekkit posted:

I have a GT 570m on my laptop that kicks in real loud when I'm playing games. What's a good program to monitor the temperature? I tried using HW Monitor but it only shows the CPU temperature.

HWiNFO64 or GPU-Z.
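
If you'd rather log it from a script than stare at a GUI, the driver's bundled nvidia-smi can usually report the temperature from the command line - assuming your particular notebook part and driver actually expose the sensor, which isn't guaranteed, so treat this as a sketch and fall back to the tools above if it comes up empty:

code:
import subprocess
import time

# Poll the GPU temperature via nvidia-smi every 5 seconds.
# Assumes nvidia-smi is on the PATH and the GPU exposes its temperature sensor.
def gpu_temp_c():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"]
    )
    return int(out.decode().strip().splitlines()[0])

while True:
    print("GPU temperature: %d C" % gpu_temp_c())
    time.sleep(5)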

Henry Black
Jun 27, 2004

If she's not making this face, you're not doing it right.
Fun Shoe
So if you guys weren't satisfied with the 680's performance because you're some sort of sperglord that has to game on a 27" monitor, would you opt for a second 680 and a PSU upgrade for SLI, or just suck up the cost and get a 690?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Why not get the 690, keep the 680, and do triple SLI?

Meanwhile, I want a pony who rides a goose who lays golden toilets.

Also, if you would need a PSU upgrade for a 680 SLI setup, you'd need it for the 690 as well.

madsushi
Apr 19, 2009

Baller.
#essereFerrari

LittleBob posted:

So if you guys weren't satisfied with the 680's performance because you're some sort of sperglord that has to game on a 27" monitor, would you opt for a second 680 and a PSU upgrade for SLI, or just suck up the cost and get a 690?

The 680 can easily power a 27" monitor. The issue was that the initial drivers sucked, and had several issues. The new drivers are like a night and day difference -- not in performance -- but in lack of stuttering and other bad effects.

Animal
Apr 8, 2003

LittleBob posted:

So if you guys weren't satisfied with the 680's performance because you're some sort of sperglord that has to game on a 27" monitor, would you opt for a second 680 and a PSU upgrade for SLI, or just suck up the cost and get a 690?

The 690 is a badass card and the first multi GPU card I would consider. That said, a 680 really should be enough unless you need to max out Metro 2033

poo poo, my 560 Ti 448 runs 1440p like a champ in all but Metro.

Rekkit
Nov 5, 2006

Factory Factory posted:

HWiNFO64 or GPU-Z.

HW doesn't detect the discrete card, but GPU-Z works perfectly, thanks.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Animal posted:

The 690 is a badass card and the first multi GPU card I would consider. That said, a 680 really should be enough unless you need to max out Metro 2033

poo poo, my 560 Ti 448 runs 1440p like a champ in all but Metro.

I doubt you've played Skyrim, because especially with the high res texture pack, that will chew through your 1GB VRAM and leave it begging for mercy

Edit: unless there's a 2GB version

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
The 560 Ti 448 has 1.25 GB of VRAM, which is enough headroom for many games that would just barely exceed 1 GB.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Skyrim is screwy, poor resource utilization in my opinion. GTX 580 got up to about 1.4GB of texture memory at 1080p. Though I think in general we simultaneously over-worry about and underestimate the importance of the framebuffer; paging to and from VRAM is not the end of the world, often it's barely a momentary hitch in performance. Having a goodly amount (and for current-gen high quality gaming I do think that 1.5GB is the "goodly amount" point) is obviously desirable, but it's not game over if you need to hit storage. Especially on a game like Skyrim, where you might reasonably run it from an SSD.

Hell, look at RAGE. For its noted flaws with regard to depth, and noted successes with regard to gameplay, its rendering method relies on a large storage decompression cache and constant texture streaming. Arguably a choice made for scalability of the megatexturing system's implementation, but it for 100% sure runs smoothly on even comparably low-powered hardware, hitting the HDD or SSD constantly. (20 gigs is a lot of space to give a short and kinda unfinished or unpolished game, but there, in minor contradiction to my general point, it does run flawlessly without micro-hitching on that much faster medium...)

Games that don't handle VRAM limitations gracefully are another matter, obviously, but most of the time it's not a huge deal to need to move stuff in and out.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Star War Sex Parrot posted:

Holy poo poo these beta NVIDIA drivers are what the GTX 680 needed on day one. Arkham City is phenomenal now, and the Witcher 2 is beautiful at 2560x1440.

It's a good kick in the pants for the 580 as well. Seems like nearly all AA and lingering adaptive vsync weirdness has been resolved.

Animal
Apr 8, 2003

HalloKitty posted:

I doubt you've played Skyrim, because especially with the high res texture pack, that will chew through your 1GB VRAM and leave it begging for mercy

Edit: unless there's a 2GB version

More than 300 hours of Skyrim. Runs like a charm, mostly locked at 60fps with some dips to 45ish in certain spots. No stuttering at all.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Animal posted:

More than 300 hours of Skyrim. Runs like a charm, mostly locked at 60fps with some dips to 45ish in certain spots. No stuttering at all.

Fair enough. I've seen 1.8GB peak use once, and I've seen threads online of people reporting 2.2GB(!)

But far from stock..

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Could be coded to use whatever it can. Not necessarily a good or a bad thing, once it's at "enough for the user not to notice paging."

No modern game is going to be able to load every single texture, bar none, into VRAM at once. Games like Skyrim will encounter situations where their previous textures are useless and new textures are needed and it's time to page... Pretty sure since it's Gamebryo++, that means it'll have somewhat aggressive precaching with available video memory to prevent hitching.

Basically, look at any frame-by-frame GPU benchmarks of high resolution games. Watch how performance over time doesn't go to poo poo even though, say, an SLI 1GB card has to load stuff in and out of memory with some frequency. Does it cause a performance hit? Probably yes. A HUGE world-ender of one? Nah.
