Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Power management mode seemed like it made a difference until I started Precision to verify it was making a difference, then things got choppy again and locked to low clocks.

Next up to try is a reboot I guess. Weird.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Speaking of the Dunia 2 engine and overclocking: my two cards are stuck at top clocks right now because Far Cry 3 crashed after staying at its menu forever. I lowered my core clock VERY BEGRUDGINGLY, because now I can't legitimately say it's 300MHz over stock, just... like... 296MHz or something, and that sounds like crap.

If it crashes again it's still memory being too high. I'm actually kind of betting on memory, because nothing else in my experience takes nearly as long to manifest as a problem as memory. Gotta wait for the on-board memory interface to warm up (fast), then the actual memory controllers and modules (slow), then wait for instability to show up (uggghhhh seriously no saves during "missions?" I'm always on a god damned mission, what the poo poo is this poo poo for poo poo's sake!). Lose some progress, lower your expectations, try the core first because you have an inkling it was one turbo bin too high anyway, then lower memory later if it crashes again.

God drat it I skinned a deer, that's gone now. My deer skin! And now I have to restart because my GPUs won't power down from max clocks and voltage. They're not doing anything, just kinda sitting there, amped and ready, for no reason.

Edit: Precision 4.2.0 needs a revision; somehow the driver crash... I poo poo you not... turned on K-boost. Hence the clocks and voltage. Well, it's good to know K-Boost works in case I need it (e.g. Far Cry 3: Blood Dragon has the same issue with my setup as it does with Factory Factory's, which I will find out in about half a month or so when I can go home, get my real internet connection, and actually download it!).

Agreed fucked around with this message at 11:32 on Jun 30, 2013

uhhhhahhhhohahhh
Oct 9, 2012
Have you tried MSI Afterburner?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

uhhhhahhhhohahhh posted:

Have you tried MSI Afterburner?

The same dude makes it but it gets updated a lot more frequently because MSI lets him post betas, whereas EVGA does not.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

For what it's worth, it is possible that my wife or I accidentally turned on K-Boost after Far Cry crashed. My recuperatin' bed has my keyboard and mouse on it, and I had Precision open monitoring some stuff because I feel like I'm really zeroing in on my final overclock, just haven't *quite* finished that process yet. It would only take one button click to turn K-boost on, I think, so it's not necessarily true that it had anything to do with the drivers or Precision. But if it did do that on its own, I'd kinda dig getting an update, beta or otherwise... Though I do appreciate the fact that EVGA insists on stability for their overclocking software. The jump from version 3 to version 4 was insanely good for my former (F^2's current) 680, and it locks in clocks like nobody's business.

I know Afterburner is cross-platform since MSI isn't a solely nVidia partner like EVGA is, and I think I prefer having software that was developed specifically to fill the niche of overclocking nVidia graphics cards, overseen by EVGA's quality assurance team and all that. I haven't had any issues with Precision 4.2.0 before this thing, and it seems premature to assume that it was definitely the driver crash that did it rather than one of us just clicking stuff by accident, since the mouse is on the bed and we both move in our sleep.



Edit: Oh, by the way - based on your recommendation I picked up a Bitfenix Spectre Pro (and the side panel magnetic DEMCiflex filter), intend to run it full bore to give maximum airflow to the graphics cards and push their exhaust to the left/up-left, toward the front exhaust and top exhaust. I have the weirdest airflow but my temps have never been better, even without a side fan both of my GPUs idle sub-30ºC, only breaking the 30ºC barrier when it gets up to 80ºF inside on days when it's 100ºF outside. Thanks for the two solid recommendations, on specs the Spectre Pro looks badass, and the side panel has no such 20mm width limitation.

I wish I weren't /having/ to replace it, but one of the Xigmateks was DOA (the rest continue to perform exactly to specifications, just a fluke - somebody's gotta catch the bad ones, this time it was me), and Newegg is refunding it and the price of my original overnight shipping, too, plus paying for a UPS shipping label. All told, the refund more than covers the Bitfenix Spectre Pro's cost, plus Amazon Prime's cheap overnight shipping so I can get it in before the 4th guaranteed.

I picked the red LED version because it turns out that even as a grown-assed man I apparently do like having a little silly bling in my computer, and reviews remark that it's not obnoxiously bright, but rather understated. So, cool stuff there. With its high CFM and static pressure I should be able to make sure that all air needed for the three exhausts is coming from the side panel one way or another, making that one magnetic filter all I need to worry about cleaning. Plus, with more than 50CFM advantage over the Xigmateks, it ought to really push air to where the cards need them to stay frosty, what with the side fan mount being right over where the PCI-e slots are on a standard sized ATX board. Ought to be the last ingredient in this particular overclocking setup, so thanks for the solid advice, man.

Agreed fucked around with this message at 15:44 on Jun 30, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Driver crashes lock your clocks? Well, that might explain it. I've been trying to speedrun Half-Life 2, but even at stock clocks, it crashes after about an hour (maybe because I'm forcing DirectX 9.0, and there's a reason it isn't used by default?).

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

Driver crashes lock your clocks? Well, that might explain it. I've been trying to speedrun Half-Life 2, but even at stock clocks, it crashes after about an hour (maybe because I'm forcing DirectX 9.0, and there's a reason it isn't used by default?).

In general, no, that only happened once. nVidia driver crashes do restore stock settings and voltage, though.

Try a full uninstall and use the 314 drivers instead. The 680 doesn't benefit from 320+, and this might help us figure out if there is something driver related about game crashes, after all.

Agreed fucked around with this message at 16:46 on Jun 30, 2013

Animal
Apr 8, 2003

The new NvidiaInspector has an overclock menu with the option of temperature boost and overvoltage, pretty much everything important is in there. With PrecisionX my overvolt option would not stick over reboots, and I think someone else in this thread had the same issue. With NvidiaInspector it sticks.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I had that problem with windows 8 and afterburner early on and had to use the overclocking menu in nvinspector to make it stick, as well. A later release of afterburner fixed that.

Also a lot of people bitch about the 320 drivers, seems like. Crash here, dead card there...

ijyt
Apr 10, 2012

Dogen posted:

I had that problem with windows 8 and afterburner early on and had to use the overclocking menu in nvinspector to make it stick, as well. A later release of afterburner fixed that.

Also a lot of people bitch about the 320 drivers, seems like. Crash here, dead card there...

The 320.11 drivers that came on the disc are working great for me; it's also the first time in ages I've actually used the disc drivers.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
314 has clocking back to normal, yay! Now to try it against Half Life 2.

Nondescript Van
May 2, 2007

Gats N Party Hats :toot:
Anybody heard of any problems with the newest 320.49 beta drivers? Specifically, making your entire computer freeze and stutter for 2-10 seconds, making everything impossible to play and other tasks (such as typing this) a chore. My entire computer isn't actually freezing, because my music still plays just fine, and in games I end up where I would have been if I had actually been playing, but all display output freezes.

Going back to what I had before (320.11)

edit: switched back. No more stuttering. I think I'll wait a while for nvidia to get their poo poo together.

Nondescript Van fucked around with this message at 18:02 on Jun 30, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I'm gonna jump on board and say this driver release is not awesome. There's some root instability here. A driver's just a pack o' hacks to begin with, and this one has some funky poo poo going on. Apologies for dismissing earlier complaints about them being Bad Drivers.

Hey, that puts nVidia and AMD in the same boat right now vis a vis drivers! They both suck!

Animal
Apr 8, 2003

I'm pretty much putting my overclocking on hold until a stable release. You just can't know whether the games are crashing because of your clocks or because the drivers were coded by the same two neckbeards who designed the Game of Pwns poster.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
314 has given me no clocking problems, and HL2 doesn't crash. :ms:

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Factory Factory posted:

314 has given me no clocking problems, and HL2 doesn't crash. :ms:

I think Nvidia has recommended people using the latest drivers (and having problems) roll back to the last stable 314 WHQL drivers. I've been using the 314.22's for a while now and haven't had so much as a hiccup with them, but the 320 revisions all gave me issues of some sort in my games (freezing, unstable overclocks, artifacts).

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
How about that...I rolled back to the 314.22 drivers just now as well and the fans on the pre-installed custom cooler on my ASUS 660Ti stopped freaking out during loading screens.

spasticColon fucked around with this message at 22:49 on Jun 30, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

This does not leave those of us without that option in a great place. Anyone want to walkthrough how to inf mod the drivers to support my 780... while it is currently attached... please?

MagusDraco
Nov 11, 2011

even speedwagon was trolled

Agreed posted:

This does not leave those of us without that option in a great place. Anyone want to walkthrough how to inf mod the drivers to support my 780... while it is currently attached... please?

There's an inf modding guide over on guru3d you could take a look at: http://forums.guru3d.com/showthread.php?t=377158

You'll have to find your gpu string and device id from elsewhere since the pastebins in that thread are from before the 780 was released.
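For reference, the mod itself is just a couple of lines added to the driver's display .inf. A hypothetical sketch of the pattern only (the install section name and the exact device-section header are illustrative here; copy the format from a neighboring entry in your driver's .inf, and pull the real hardware ID from Device Manager as noted above):

```ini
; Added under the [NVIDIA_Devices.NTamd64...] device section of the .inf.
; DEV_1004 is the GTX 780's PCI device ID; "Section001" is illustrative --
; reuse whatever install section the neighboring entries point at.
%NVIDIA_DEV.1004% = Section001, PCI\VEN_10DE&DEV_1004

[Strings]
; The display string only has to exist; the DEV id is what has to match.
NVIDIA_DEV.1004 = "NVIDIA GeForce GTX 780"
```

Since the .inf no longer matches the driver's signed catalog, expect Windows to warn about an unsigned driver when you install it.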

MagusDraco fucked around with this message at 00:57 on Jul 1, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Once you have the NVIDIA.DEV.XXXX part, you can actually make the string whatever you want.

Give your card a pet name.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Actually, it looks like the 320.49 main branch of the drivers includes fixes for a lot of the known odd behavior. They are aware of bugs and agree that these drivers have multiple issues, which they are working on. So that's fantastic :v: Right now Battlefield 3 on the 560Ti is a major bugfest apparently, but if you are running new hardware, 320.49 beta is probably the best way to go - if you're running anything without a 7 in front of it, use the 314s, they were stable as can be.

I won't trust my overclock limit until they get the drivers fixed, although it does seem more stable since I backed the core down by one turbo bin and reduced the memory by another 25MHz baseclock.

But three cheers for nVidia loving the drivers up just in time for ATI to launch their product range and be awesome! I hope!

Agreed fucked around with this message at 01:09 on Jul 1, 2013

BONESTORM
Jan 27, 2009

Buy me Bonestorm or go to Hell!
I was having major issues with BF3 on a 560 Ti with the latest WHQL drivers, but the newest beta drivers solved all the artifacting issues I was having. The only issue I have in that game now is that I occasionally get a very weird stuttering effect that feels like my fps has dropped to single-digit levels for 2-3 seconds, but it only happens like once in 2 hours of playtime. On the whole that game has been unkind to 560 Ti owners, to put it lightly. It works well enough for every other game I play at the moment, so I think I will hold off until the Maxwell cards come out and then go hog wild and treat myself to a near top-of-the-line card for once. Might be a rough couple of months, though, once truly next-gen games come out around the console launches, since the card has so little RAM compared to the new hotness.

Animal
Apr 8, 2003

The 560 Ti has always had problems with BF3. I originally had one that had horrible corruptions just on that game. Ended up replacing it with a 560 Ti 448 and that was a little monster of a video card.

BONESTORM
Jan 27, 2009

Buy me Bonestorm or go to Hell!
They were more like mildly crippled 570s, weren't they? Seemed like a really good value; they came out shortly after I built my desktop, so they weren't an option for me at the time. I was lucky to avoid most of the issues 560 Ti users were having at launch; it was just the last set of drivers that completely messed up that game for me. I even tried running the card at stock clocks, not even the factory overclock, and was still having massive issues after a few minutes of play. Every texture you could think of would be corrupted, and black triangles everywhere.

Animal
Apr 8, 2003

Sergeant Steiner posted:

They were more like mildly crippled 570s, weren't they? Seemed like a really good value; they came out shortly after I built my desktop, so they weren't an option for me at the time. I was lucky to avoid most of the issues 560 Ti users were having at launch; it was just the last set of drivers that completely messed up that game for me. I even tried running the card at stock clocks, not even the factory overclock, and was still having massive issues after a few minutes of play. Every texture you could think of would be corrupted, and black triangles everywhere.
Yes, and like the 780 and Titan, it overclocked better than the 570. For some reason they put them on 680 PCBs, so they clocked higher.
That's exactly the corruption I would get back then.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

In a quick round of "follow that dev!," here's what post-processing genius Timothy Lottes has been up to lately. After making FXAA (several versions) and TXAA (several versions) for nVidia's software team, he's now working at Epic, on their render team.

As what must be at least a minor god of post-processing, he will surely bring amazing things to all of us in the form of UE4 just TOTALLY KICKING ALL THE rear end, ALL OF IT. Big loss for nVidia, big gain for Epic - while TXAA is cool and cinematic, making it proprietary was a dick move as there is nothing at all inherently proprietary about how it works, and if you turn it sideways it's kind of a more precise FXAA plus a post-processing version of MSAA that smooths more than usual. That's fast and loose, but close enough. I can only imagine that his presence at Epic will result in some seriously cool poo poo in the post-processing pipeline, since UE is deferred these days anyway and he is a wiz at that stuff.

So what else has he done lately? Well, he did a low-level breakdown of the PS4's hardware and why it's awesome, which appears to be redacted from his blog for who knows what reason. But it exists, check it out, it's a good read. He's pretty clearly on the PS4 bandwagon, and now you can find out why!

He also linked to a neat white paper on practical clustered shading, which is an interesting read if you want to check out what Avalanche games is up to lately (and see some real-world problems and how they are solved in videogame engine design). I won't spoil too much, but the cool thing is that it's still a forward renderer... with some deferred features. AA compatibility: high.

Timothy Lottes, makin' stuff, sayin' things. :allears:


Bonus content: 4A's chief technical officer Oles Shishkovstov talks about Metro: Last Light, consoles vs. PC, and other smart-person-working-on-engines stuff. Make sure to read the Metro 2033 interview from three years back, linked early in the article, too, it provides some excellent context for the degree of growth of their kickin' rad in-house engine and demonstrates a great practical distinction between DX9, DX10, and DX11 if you've ever been confused about anything there or just want to hear somebody with a fancy title and a couple of awesome games under his belt talk about them.

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.

Agreed posted:

In a quick round of "follow that dev!," here's what post-processing genius Timothy Lottes has been up to lately. After making FXAA (several versions) and TXAA (several versions) for nVidia's software team, he's now working at Epic, on their render team.

As what must be at least a minor god of post-processing, he will surely bring amazing things to all of us in the form of UE4 just TOTALLY KICKING ALL THE rear end, ALL OF IT. Big loss for nVidia, big gain for Epic - while TXAA is cool and cinematic, making it proprietary was a dick move as there is nothing at all inherently proprietary about how it works, and if you turn it sideways it's kind of a more precise FXAA plus a post-processing version of MSAA that smooths more than usual. That's fast and loose, but close enough. I can only imagine that his presence at Epic will result in some seriously cool poo poo in the post-processing pipeline, since UE is deferred these days anyway and he is a wiz at that stuff.
Is TXAA still only available in precious few games or have you been applying it with some kind of injector suite like SweetFX all along? I've wanted it for quite some time, I hate texture shuffling/shimmering.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

TheRationalRedditor posted:

Is TXAA still only available in precious few games or have you been applying it with some kind of injector suite like SweetFX all along? I've wanted it for quite some time, I hate texture shuffling/shimmering.

Precious few. Off the top of my head, Crysis 3 (looks totally amazing with everything else set to Very High too - I live in Arkansas and while it is a political shithole, there's a lot of actual nature here, pretty much everywhere, and Crysis 3 has the most realistic overgrowth that I've personally ever seen, it's damned near photorealistic)... Assassin's Creed III... The Secret World?

It's a very small list. Borderlands 2 was supposed to support it, and was even advertised as having support for it by nVidia in the initial 600-series TXAA announcement, but they quietly made that advertising false and I'm pretty sure it's using FXAA 3.1 on the PC version when you turn AA on. I do know it definitely is not using TXAA.

TXAA seems like a flop, which blows because it's fantastic. The only competitive algorithms are kind of similar in overall nature visually, but very different in execution; where TXAA applies a sort of modified tent filter and then a modified MSAA pass with some fancy stuff going on (and it is not, regardless of anyone telling you otherwise, related to ATI's attempt to do the tent filter AA thing years back, totally different implementations), the visually competitive MSAA implementations tend to borrow a bit of SSAA mojo and have a big ol' performance hit to go with your lack of jaggies. Talking Sleeping Dogs ultra quality AA, the in-house 4A engine's MSAA for Metro 2033... Battlefield 3's MSAA implementation in a deferred rendering engine was a cool accomplishment and could have easily been altered to allow for TXAA with just a few differences, but no such luck, and it ended up just being a performance hog compared to their visually almost identical implementation of FXAA. Metro: Last Light uses their amped up AAA method, which was originally kind of a supercharged MLAA and is on par with the highest quality, not-just-a-post-shader SMAA mode. But for best quality in Last Light, buy three Titans in SLI and turn on SSAA, of course.

:sigh:

The quick and dirty on why TXAA seems like it might just be dead in the water isn't too hard to grasp. One, it's hardware specific. Even with the lion's share of the market, nVidia doesn't have all of it, and there's no real impetus to put in a specialized AA mode that's one card only. They should have learned from AMD's early adventures with MLAA a few years back on that one, but nope. Now MLAA is cross platform and the basis of a lot of sub-pixel AA shaders, while nVidia gets the FXAA roundup of options from the original to the console optimized to the PC higher fidelity one that Lottes did before moving on to TXAA.

If it were made cross-platform (which, again, completely possible), maybe it'd show up in more games.

It's also possible that developers just don't want a softer, cinematic look. It isn't nearly as sharp as "normal" AA (and I include all methods of AA that work with deferred and with forward rendering engines, there). It was basically Timothy Lottes' pet project to make games look like movies and solve the problem of temporal aliasing without having a resource hit as high as SSAA, and it does that, but with the side effect of being probably a bit blurrier than 2x2 SSAA, and with developers not wanting to take the time to implement MSAA in deferred rendering engines (or capitalize on it for just one company when they do), TXAA doesn't have broad prospects I'm afraid. But in the few games it is in, god drat does it look good. SSAA quality, or better, at MSAA performance hit levels.

I wish it weren't proprietary to nVidia hardware. It'd be much more likely to go the MLAA route and become more widely available, or even get tweaked and used as the basis of other AA technologies if it were cross-platform. Everyone seems to agree SMAA in its highest quality iteration looks nice, and it grew from MLAA. The best FXAA may be a little dated in mid-2013 but it still works on everything (that isn't implementing FXAA in their own engine) and looks good. I'm rambling at this point. Just disappointed that some of the promised support at the outset of TXAA dropped off like ballast when games actually made it through development and hit the market. At least Crytek held up their end of the bargain.

Agreed fucked around with this message at 08:25 on Jul 1, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
That really sucks. Contemporary polycount averages and higher texture/native resolutions than ever before have made the "jaggies" that inspired classical anti-aliasing methods less evident and troublesome than other visual artifacts, and it's lovely that seemingly no one wants to throw in for the next advent of superficial graphical improvement.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
I don't know about NVIDIA cards, but for what it's worth, ever since someone mentioned RadeonPro and SMAA, that's what I've been injecting all the time.

Good performance, and it leaves my card free to render everything else at a higher frame rate.

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

TheRationalRedditor posted:

Is TXAA still only available in precious few games or have you been applying it with some kind of injector suite like SweetFX all along? I've wanted it for quite some time, I hate texture shuffling/shimmering.

At my old job (well, 1 week old job :v:), we had a visit from nVidia to offer help implementing nVidia specific features, namely TXAA. The truth is, the algorithm is too advanced to inject into an engine and seems to require more render target information than an injector can divine.

So yeah, the bottom line behind the scenes was that it's very low priority on the list compared to optimisations and more generic DX11 features. Even though nVidia has a greater market share on PC, it's still a lot of investment. On the other hand, nVidia is very aware of this, which is why they were pretty cool about offering dedicated help implementing all this stuff. I'm looking forward to seeing the game ship to see what the end result will be.

Grim Up North
Dec 12, 2011

Did y'all think that your card would survive going from 1080p to 4K? :stonklol: Pretty fun, if way too short, article at Anandtech.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

The contour of the coming console generation will likely remain 1080p w/r/t gameplay targets, while computers scale over that time to 4K gameplay targets.

Jan posted:

At my old job (well, 1 week old job :v:), we had a visit from nVidia to offer help implementing nVidia specific features, namely TXAA. The truth is, the algorithm is too advanced to inject into an engine and seems to require more render target information than an injector can divine.

So yeah, the bottom line behind the scenes was that it's very low priority on the list compared to optimisations and more generic DX11 features. Even though nVidia has a greater market share on PC, it's still a lot of investment. On the other hand, nVidia is very aware of this, which is why they were pretty cool about offering dedicated help implementing all this stuff. I'm looking forward to seeing the game ship to see what the end result will be.

It will never be injected, imo, because it requires a modified MSAA pass that talks back, to use terms you're gonna cringe at :D. Even on a totally front rendering engine, that is, as you note, way more than an injector gets. And on a deferred rendering engine it absolutely has to be implemented from the ground up.

Peteyfoot
Nov 24, 2007
Is there some kind of preferred list of AA options or combinations from most optimal to least optimal, or is it really dependent on taste and source material?

I've learned quite a bit from reading here and on Wikipedia but my in-game tweaking habit is still pretty much "just turn off in-game AA and just use default SweetFX SMAA settings".

Animal
Apr 8, 2003

I just bought Crysis 3 for $10. I really wanted to try TXAA, but it incurs a pretty big performance hit over FXAA at 1440p with a GeForce 780. So it's either High settings and TXAA, or Very High and FXAA.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

terre packet posted:

Is there some kind of preferred list of AA options or combinations from most optimal to least optimal, or is it really dependent on taste and source material?

I've learned quite a bit from reading here and on Wikipedia but my in-game tweaking habit is still pretty much "just turn off in-game AA and just use default SweetFX SMAA settings".

Different AA modes target different things. SMAA is fine, but TXAA was specifically targeted at getting a cinematic quality to motion - no temporal aliasing. And it works. Nailed it. SMAA is more "AA but cheaper," yet another way to remove traditional jaggies-style aliasing with some nice agnosticism toward shader effects without the overhead of MSAA. As long as we're working on things like that, we are not working on things like TXAA - which was aimed at solving both the jaggies problem and the temporal aliasing problem at once.

Basically, the gold standard for AA is to supersample the image. Supersampling just renders everything internally at a higher sample rate and then downsamples that to your screen size, removing jaggies. So up your effective resolution by whatever supersampling factor you prefer - I use 2x2 for less demanding D3D games. But that is as brute force as it gets.
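The brute-force step is easy to sketch in a few lines of NumPy - a toy illustration of the downsample resolve only, not how actual GPU hardware does it (the "render at higher resolution" part is assumed to have already happened):

```python
import numpy as np

def supersample_downscale(hi_res: np.ndarray, factor: int = 2) -> np.ndarray:
    """Box-filter an (H*factor, W*factor, C) render down to (H, W, C).

    This is the whole SSAA/FSAA trick: render internally at a higher
    resolution, then average each factor x factor block of samples
    into one output pixel.
    """
    h, w = hi_res.shape[0] // factor, hi_res.shape[1] // factor
    # Split each spatial axis into (pixel, subsample) and average subsamples.
    blocks = hi_res[:h * factor, :w * factor].reshape(h, factor, w, factor, -1)
    return blocks.mean(axis=(1, 3))

# A hard black/white vertical edge rendered at 2x2 internal resolution...
hi = np.zeros((4, 4, 3))
hi[:, 1:] = 1.0
lo = supersample_downscale(hi, factor=2)
# ...resolves to a smooth 50%-gray transition pixel at the boundary.
print(lo[0, :, 0])  # prints [0.5 1. ]
```

Pixels straddling the edge average their covered subsamples, which is exactly the jaggy-smoothing effect - at the cost of shading factor-squared as many samples.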

So MSAA was created to use a set of sample points (and there are a shitload of different types of MSAA, by the way) and process just part of the image in a given pass, thus reducing the cost of AA hopefully without affecting image quality. They got it down to about 25% the cost of SSAA. That's a success, but now MSAA is pretty dated at the hardware level - only on a forward rendering engine can MSAA be forced. Deferred rendering engines require it to be built in, and that is a huge pain in the rear end and I do not know why Battlefield 3 went to the trouble and then didn't implement TXAA (oh wait it wasn't out yet) Also, isn't 25% of some huge performance hit (see that chart up there? Basically SSAA in action) still a big hit?
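To make the sample-point idea concrete, here's a toy sketch (the pattern below is one plausible rotated grid, not any vendor's actual hardware layout) of how 4x MSAA tests coverage at several points per pixel while shading only once:

```python
import numpy as np

# One plausible 4x rotated-grid sample pattern (offsets within a pixel).
# Real hardware patterns are vendor- and mode-specific.
SAMPLES_4X = np.array([[0.375, 0.125], [0.875, 0.375],
                       [0.125, 0.625], [0.625, 0.875]])

def msaa_coverage(edge_x: float, width: int = 4) -> np.ndarray:
    """Coverage along one pixel row for a vertical triangle edge at x = edge_x.

    MSAA's saving: the cheap coverage test (is this sample inside the
    triangle?) runs at all N sample points, but the expensive pixel shader
    runs once per pixel; the shaded color is weighted by covered fraction.
    """
    cov = np.empty(width)
    for px in range(width):
        xs = px + SAMPLES_4X[:, 0]      # sample x-positions in screen space
        cov[px] = np.mean(xs < edge_x)  # fraction of samples covered
    return cov

# An edge landing mid-pixel gives that pixel partial coverage.
print(msaa_coverage(1.5))  # fractions per pixel: [1.0, 0.5, 0.0, 0.0]
```

The half-covered pixel gets a 50% blend without ever shading twice - that's the roughly-quarter-the-cost-of-SSAA win.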

Enter post-processing AA, like MLAA, FXAA, SMAA in most of its modes, AAA, and many more in-house AA engines. They can cost next to nothing in terms of rendering performance (console optimized FXAA is about as close to literally free as you can get, seriously). They work on deferred rendering engines and forward-rendering engines alike. They improve that cost of doing AA business much, much further! And they don't at all address the temporal aliasing or annoying texture shimmer or etc., etc., etc., that make up the gestalt of image quality.
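The cheapness comes from working on the finished frame alone. A minimal sketch of the first stage of an FXAA-style filter - luma contrast edge detection - assuming classic Rec. 601 luma weights and an illustrative threshold (the real filters then blend along the detected edge direction, which is omitted here):

```python
import numpy as np

LUMA = np.array([0.299, 0.587, 0.114])  # Rec. 601 luma weights

def detect_edges(img: np.ndarray, threshold: float = 0.125) -> np.ndarray:
    """Boolean mask of pixels a post-process AA pass would touch.

    Works purely on the rendered image: compute per-pixel luma, look at
    each pixel's 4 neighbors, and flag pixels whose local contrast
    exceeds a threshold. No geometry, no extra samples - which is why
    this style runs fine on deferred renderers and costs next to nothing.
    """
    luma = img @ LUMA
    pad = np.pad(luma, 1, mode='edge')
    neighbors = np.stack([pad[:-2, 1:-1], pad[2:, 1:-1],   # up, down
                          pad[1:-1, :-2], pad[1:-1, 2:]])  # left, right
    contrast = neighbors.max(axis=0) - neighbors.min(axis=0)
    return contrast > threshold

# A hard black/white seam: only the two columns touching it get flagged.
img = np.zeros((4, 4, 3))
img[:, 2:] = 1.0
print(detect_edges(img)[0])  # [False  True  True False]
```

Everything downstream (the actual blending) happens only on the flagged pixels, which is why these filters are nearly free on a modern GPU.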

TXAA did, but they took it out back and shot it before it could grow up. Poor TXAA. But not every dev wants a more cinematic motion experience, as stated, and as you've heard from an actual graphics designer, it's kind of a thing to implement. No injector for this, it has to be built into the game with modified MSAA (which, again, performance hit and why bother when there're so many ways to do AA now?).

We're stuck in a loop where jaggies are becoming less of a problem, that we're getting better and better at solving, while ignoring everything else.

Peteyfoot
Nov 24, 2007
Thanks, I really appreciate the explanation. Also, is there any point to using a post-process AA if you can afford to crank up the FSAA to x4 or x8?

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Agreed posted:

because it requires a modified MSAA pass that talks back, to use terms you're gonna cringe at :D.

No harm, no foul. I'm actually still rather inexperienced in terms of rendering. I've researched a few techniques and implemented some very specific algorithms, but my broader knowledge is still kind of lacking. Not to mention that I started after deferred renderers became the norm, so while I know how forward renderers work, I'm not familiar enough with them to say with all confidence what works better in each type. But I'm slowly reading through Real-Time Rendering to fill in the gaps. :eng101:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

terre packet posted:

Thanks, I really appreciate the explanation. Also, is there any point to using a post-process AA if you can afford to crank up the FSAA to x4 or x8?

FSAA/SSAA is the "gold standard" for reducing jaggies, so if that's your goal, then yeah, it's great. It's also ancient - it was the first style of practical anti-aliasing in games - and it doesn't play so well with shaders or HUD elements. SMAA is nice because it's fairly shader-agnostic, yet still offers good sub-pixel AA. Since, like the MLAA it grew out of, it operates entirely via post-processing, a well-configured SMAA pass can offer visual results close to brute-force FSAA/SSAA for way less time spent calculating (or to the slightly more svelte SGSSAA, though getting that to work can be a pain in the rear end).
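
For contrast, here's brute-force supersampling in miniature - a hedged Python sketch, grayscale floats standing in for a rendered frame: "render" at double resolution, then average each 2x2 block down to one output pixel. That box filter is the whole gold-standard trick, and also why 2x2 SSAA costs 4x the shading work.

```python
def downsample_2x2(hires):
    """hires: 2D list of floats at 2x the target resolution.
    Averages each 2x2 block into one output pixel (box filter)."""
    h, w = len(hires), len(hires[0])
    assert h % 2 == 0 and w % 2 == 0
    return [
        [
            (hires[2*y][2*x] + hires[2*y][2*x+1]
             + hires[2*y+1][2*x] + hires[2*y+1][2*x+1]) / 4
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]
```

Every sample here is fully shaded before the average, which is why SSAA smooths literally everything in the frame - and also why it blurs HUD text and shader detail along with the jaggies.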

But that goes back to the problem we're facing. We are getting just loving flat-out amazing at eliminating aliasing - even within engines themselves, aliasing at 1080p is less noticeable and less obnoxious than ever before, and most games running on deferred rendering engines have some kind of post-processing pass that helps with jaggies regardless of AA settings. We keep getting better and better at solving the jaggies problem, and that is what the path from FSAA/SSAA through MSAA and its many, many iterations to post-processing methods like FXAA, MLAA, SMAA, AAA, etc. offers: cheaper and cheaper jaggy reduction. It doesn't address texture shimmering, or temporal aliasing...

... or (while we're talking about this kind of stuff) the fact that rote DX11 optimizations pale in comparison to what a well-written engine can do (see Metro: Last Light for a practical example of home-brewed optimizations beating the standard fare over the head for performance). Batman: Arkham City is a good example of bad DX11, and it's games like that which really do still call out for an AA method that knocks out jaggies cheaply so the rest of the frame time can go to exhaustively rendering everything.

Developers face strict constraints, and this is not really all on them. Calling it lazy devs ignores the massive pressure they're under to operate on what can only be described as a superhuman time scale. In that sense 4A is a bit of a fluke as a development studio, producing amazing quality with 20% of the resources that many higher-profile American studios have.

And what else contributes to this problem? Well, consoles, for one. The outgoing generation was still more powerful than integrated graphics up until recently, and the incoming one will be about 16x as powerful on brute force alone (and more heavily optimized on top of that). Most PC users don't have top-of-the-line hardware, and the ones that do still find that these really fancy AA methods - the ones that go beyond removing jaggies - carry that old "roughly 25% of SSAA" performance hit. See the above poster who can't run TXAA and Very High on a higher-than-1080p monitor.

Develop for the platform that the publisher wants emphasis on. You have X amount of time to do Y amount of work and that is IT. In all this, given that there isn't a huge consumer demand for super cool AA technology, and even enthusiasts with $650 graphics cards can't use them, for whom is this niche being developed?

Forgive a pun, but right now it's a deferred problem :haw:. But as the PC pushes toward 4K and hardware keeps up to make the pixel count work, aliasing is going to disappear almost completely - once we're at resolutions that are effectively 1080p at 4x SSAA just by default - and then hopefully the industry can turn toward answering the other problems. Timothy Lottes' TXAA is one method, ahead of its time and unfortunately proprietary, which carries a penalty when trying to get it implemented.
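
The "effectively 1080p at 4x SSAA" bit is just pixel arithmetic, sketched here in Python with the standard 4K and 1080p resolutions:

```python
# A 4K frame carries exactly four rendered samples for every 1080p pixel,
# laid out as a 2x2 grid - the same sample pattern as 4x ordered-grid SSAA.
w_4k, h_4k = 3840, 2160
w_1080, h_1080 = 1920, 1080

samples_per_pixel = (w_4k * h_4k) / (w_1080 * h_1080)
grid = (w_4k // w_1080, h_4k // h_1080)
print(samples_per_pixel, grid)  # 4.0 (2, 2)
```
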

Now that he's at Epic, working on the Unreal rendering team, imagine what might be. UE4 is going to power a -lot- of games, and it's hardly static; I'm sure he's bringing some killer design acumen to bear already, and we'll reap the rewards as the years go on. He's one of a few "high profile" guys like that, though you do have to be careful not to get too :allears: - for every high-profile dev there are a ton who only ever show up in the credits that most people don't watch, responsible for a hell of a lot of what visually amazes. That's fame for you.

Anyway. Final answer: SSAA can blur things a little, but it looks great and applies to any D3D game without visual artifacts beyond that blurring. Post-processing AA methods are cool because they were invented after shaders became a thing, so they tend to be much better about not applying AA unnecessarily to shader effects. Don't discount in-engine, tuned AA methods either; the SMAA injector is sure neato, and I'm guilty of using FXAA across the board through the default global driver profile, so I'm really one to talk here, right - but sometimes it's better to have more fast AA going on and sometimes it isn't. Case by case is best; it's just so easy to set it up in the driver to apply to everything and go from there. Solving that jaggies problem... that we more and more don't actually have.


Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Jan posted:

No harm, no foul, I'm actually still rather inexperienced in terms of rendering. I've researched a few techniques and implemented some very specific algorithms but my broader knowledge is still kind of lacking. Not to mention that I started after deferred renderers became the norm, so while I know how forward renderers work, I'm not familiar enough with them to say with all confidence what works better in each type. But I'm slowly reading through Real Time Rendering to fill in the gaps. :eng101:

I do really appreciate your being here to offer a practical perspective, though - I don't work in the industry at all, you can probably guess my industry by my avatar :) Did you check out the Practical Clustered Shading white paper? It looks really promising as an alternative to totally deferred or totally forward rendering, though it's not the first approach to that goal, conceptually. Just nicely open about it, at least in the abstract. I appreciated the visual reminder that a "rote" mental picture of a frustum isn't really correct, hadn't thought about it like that before - but that's just one thing, the paper as a whole was very interesting and it seems like their particular approach, while not production proven yet, could yield interesting results.
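The core idea of clustered shading can be caricatured in a few lines of Python - this is a loose, hedged sketch where an axis-aligned view volume and uniform slices stand in for the real frustum-shaped clusters the paper describes, and every name and number is made up: subdivide the view volume into a 3D grid of clusters, bin each light into every cluster its radius touches, and then shading a pixel only loops over its own cluster's light list.

```python
from collections import defaultdict

def cluster_index(x, y, z, bounds, grid):
    """Map a view-space point to integer cluster coordinates, clamped
    to the grid. bounds: ((x0, x1), (y0, y1), (z0, z1)); grid: (gx, gy, gz)."""
    (x0, x1), (y0, y1), (z0, z1) = bounds
    gx, gy, gz = grid
    cx = min(gx - 1, max(0, int((x - x0) / (x1 - x0) * gx)))
    cy = min(gy - 1, max(0, int((y - y0) / (y1 - y0) * gy)))
    cz = min(gz - 1, max(0, int((z - z0) / (z1 - z0) * gz)))
    return cx, cy, cz

def bin_lights(lights, bounds, grid):
    """lights: list of ((x, y, z), radius). Returns a dict mapping each
    cluster coordinate to the indices of lights that may affect it."""
    clusters = defaultdict(list)
    for i, ((x, y, z), r) in enumerate(lights):
        # Conservatively add the light to every cluster overlapped by
        # its axis-aligned bounding box.
        lo = cluster_index(x - r, y - r, z - r, bounds, grid)
        hi = cluster_index(x + r, y + r, z + r, bounds, grid)
        for cx in range(lo[0], hi[0] + 1):
            for cy in range(lo[1], hi[1] + 1):
                for cz in range(lo[2], hi[2] + 1):
                    clusters[(cx, cy, cz)].append(i)
    return clusters
```

The appeal versus pure deferred or pure forward is visible even in the toy: the expensive part (which lights matter where) is computed once per frame into the cluster table, and the per-pixel loop shrinks from "all lights" to "my cluster's lights" regardless of which rendering style consumes it.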
