Jan
Feb 27, 2008

Is this the thread where I ask whether these Catalyst 12.8 drivers are any good? Seems like hardly a month since I installed the 12.7 beta.

I'm running 7850s in Crossfire, a notoriously driver-capricious setup, so I'm always apprehensive about this stuff.


Jan
Feb 27, 2008


EvilMuppet posted:

Thanks heaps, that's exactly what I needed to know. I'm more than happy with that for the moment and can update the card later. Thanks.

To expand on this with some personal experience:

I got myself a 30" (2560x1600) monitor a little while ago, while still running a Radeon HD 5870. Among other things, I was able to run Serious Sam 3 and Skyrim at high detail (not ultra), in 2560x1600, while easily staying above 30FPS. I haven't tried Crysis 2, Metro 2033 or any of the traditionally "taxing" games, but for the most part the old 5870 more than pulled its weight. The only game that had difficulties was SWTOR, but that's because their rendering engine is a massive turd.

I ended up replacing it anyway, but it wasn't for lack of performance as much as the fact that the reference cooler on it was like having a B-52 in the room.

Jan
Feb 27, 2008

I realise this is going to be a highly engine-dependent thing, but what are some GPU troubleshooting tools and approaches?

I'm running 7850s in Crossfire and I've been experiencing randomly warping polygons, seemingly since switching to Catalyst 12.8. Tried switching to 12.9 beta, same deal -- I'll have to try rolling back to an earlier revision, I suppose. This happens to a minor extent in Borderlands 2, but Skyrim's UI is especially bad. Turning off Crossfire helps a lot.

Also, what's that HUD that some people seem to be using to display GPU usage in Crossfire? I think it's supposed to be MSI Afterburner, but I could never get it to work.

Starting to regret picking Crossfire as a solution to run in 2560x1600. :smith:

Jan
Feb 27, 2008


Goon Matchmaker posted:

How can a game like Skyrim be 32bit and have in aggregate more than 4GB of RAM allocated between Video RAM and System RAM? This has been bugging me for a while now...

I'm pretty certain VRAM uses a completely separate address space. Direct3D resources work in terms of "buffers" instead of pointers, and said buffers wrap up all the memory mapping information for GPU resources.

I've mostly done GPU work on consoles, though, so I'm not 100% positive how it works for PC with non-unified memory.
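
Just to illustrate the "buffers instead of pointers" bit, this is roughly what creating a vertex buffer looks like in plain D3D11 -- a generic example, not from any particular engine:

#include <d3d11.h>

// You describe the resource and let the runtime/driver decide where it actually
// lives; what you get back is an ID3D11Buffer interface, never a raw VRAM pointer.
ID3D11Buffer* CreateVertexBuffer(ID3D11Device* device, const void* verts, UINT byteSize)
{
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth = byteSize;
    desc.Usage = D3D11_USAGE_DEFAULT;            // GPU-accessible memory, placement is the driver's call
    desc.BindFlags = D3D11_BIND_VERTEX_BUFFER;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = verts;                        // initial data is copied from system RAM

    ID3D11Buffer* buffer = nullptr;
    device->CreateBuffer(&desc, &init, &buffer); // how/where it gets mapped is hidden from you
    return buffer;
}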

Jan
Feb 27, 2008


Factory Factory posted:

I was hesitant to post because I'm not sure either, but I think the graphics card maps only a portion of its RAM into the system's address space. Recall that older AGP cards had a configurable Aperture defining how much data could flow through (which was subject to a lot of enthusiast nonsense, since it turned out aperture size didn't really affect performance). Googling around suggests that's still the case. For example, this TechNet blog describing the addressing limit shows that, on a 32-bit system with 4GB of RAM and 2.2GB accessible, the author's two 1GB GeForce 280s were using 256MB of address space each. Much of the rest was apparently either reserved for dynamic mapping or over-conservatively reserved.

Other devices have changed the paradigm somewhat. I've watched the RAM-as-VRAM allocations on my laptop's HD 3000, and it only carves a very small amount out of the address space for the framebuffer. Other allocations are done dynamically by the driver and show up the way other system services do.

Nevertheless, current Intel (and AMD) platforms maintain separate address spaces for VRAM and system RAM, as far as the GPU and CPU are concerned. Changing that is something AMD is betting the farm on.

Since the subject piqued my curiosity, I did some extra research, and that does sort of match what I've found. What I'm unsure about is that while bus I/O (AGP, PCI-E or otherwise) does seem to require some shared address space (for memory-mapped I/O, at least), there shouldn't be any correlation between the amount of VRAM a GPU has and how much mapped space it takes up. All that window does is give the CPU and GPU a way to communicate, and there's no point making it much larger than the bus bandwidth can actually fill.

It's not really clear to me how much of this memory responsibility belongs to the program, the GPU driver or the OS... I hadn't realised how much simpler unified memory (on 360) is. I will definitely have to read that article.

Jan
Feb 27, 2008


KillHour posted:

I don't know why they don't just program games to be 64 bit nowadays. Is anyone really still using a 32 bit OS for gaming?

I can answer this one from a game developer standpoint.

The #1 reason, as movax mentions, really is the effort of porting an existing engine to 64 bit. If you're licensing an engine that only supports 32 bit, it'd be a tremendous waste of effort to do that conversion yourself.

And for those with in-house engines, sure, fundamentally all it means is that all the pointers in your engine will now be 64 bits instead of 32... But depending on how the engine was written, there are plenty of places where this can go wrong. The most trivial example that comes to mind is the integer type used to hold a pointer -- it's not uncommon to see a programmer take a pointer and cast it to int. Bam, you just lost the top half of your pointer on a 64-bit system.
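
A contrived little example of that exact bug, and the pointer-sized type that avoids it:

#include <cstdint>
#include <cstdio>

int main()
{
    int value = 42;
    int* p = &value;

    // The classic bug: stuffing a pointer into a 32-bit int. On a 64-bit build
    // this silently throws away the top half of the address. (Written as a
    // two-step cast here so it still compiles everywhere; the real-world version
    // is usually just "(int)p" plus a compiler warning nobody reads.)
    int bad = (int)(intptr_t)p;

    // What the engine should have used from day one: a pointer-sized integer type.
    uintptr_t good = (uintptr_t)p;

    std::printf("pointer:      %p\n", (void*)p);
    std::printf("as int:       0x%08x   <- top 32 bits gone\n", (unsigned)bad);
    std::printf("as uintptr_t: 0x%llx\n", (unsigned long long)good);
}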

Most pointer-based fuckery and low-level optimisations probably won't convert over very well either.

So, as a corollary, an engine that foresaw this eventuality and imposed specific types for pointers from the get-go will have a much easier time making the switch. But since current consoles are all 32 bit, there was really not much incentive for cross-platform developers to do it. That choice will be forced on developers soon, though, both first party and middleware, since Durango and PS4 have more than 4GB of RAM and obviously need 64-bit OSes. That will likely be the moment when most PC games make the switch and stop supporting a dwindling 32-bit userbase.

Jan
Feb 27, 2008


So, taking another pass at trying to figure out my terrible Skyrim Crossfire performance. Out of curiosity, I decided to have a look at what PCIe speed my GPUs are currently using... According to GPU-Z, the primary one is PCIe 3.0 16x, and the secondary is PCIe 3.0 4x. Shouldn't they both instead be set to 8x?

From a cursory search, my motherboard (Gigabyte P55-USB3) does some PCIe lane sharing with the main 16x slot in order to support USB 3.0 and SATA3's 6Gb/s speed. What this apparently means (according to the engrish from the manual, as shown here) is that in a single-GPU context, the 16x slot will be slowed down to 8x if USB 3.0 and SATA3 are both enabled. And indeed, turning USB 3.0 turbo on reduces the 16x slot to 8x. But the secondary slot stays at 4x.

Now, I really couldn't give a poo poo if USB is running at super duper turbo speeds, as long as I'm getting the most from my GPUs and SSDs. But I can't find an option or a combination of options that raises the secondary slot to 8x. (And yes, I plugged in the Crossfire ribbon connector between the GPUs.) Evidently, from the benchmarks quoted above, there isn't a huge difference between 16x/8x/4x with PCIe 3.0. Still, in the default Crossfire context (AFR), it sounds to me like the primary GPU would end up being hobbled down to 4x since both GPUs need to sync the same data...

Is this something I can fix? Or, rather, is this something I want to fix?

Edit: Searching for Gigabyte P55 Crossfire instead of USB3 seems to indicate that the secondary slot is capped at 4x, in spite of that message from Gigabyte in my link above, which seems to hint at Crossfire making it run 8x/8x. Oh well. That's what I get for going with Crossfire on a budget motherboard. :downs:

Jan fucked around with this message at 00:04 on Oct 18, 2012

Jan
Feb 27, 2008


movax posted:

Typo I assume, a P55 mobo wouldn't have PCIe 3.0 support.

A card strapped to a x4 link off the PCH also has to travel DMI to the CPU.

Yeah, that's totally my bad, sorry! I was misreading the GPU-Z display:



The primary (first one) GPU says it's 1.1, but running the render test shows it at 2.0. So: 16x 2.0, and... 4x 1.1. Ick. No wonder my Crossfire scaling is so bad, and that the whole upgrade from a 5870 felt rather underwhelming.
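
For anyone curious just how lopsided that is, here's the back-of-the-envelope math using the standard theoretical per-lane figures (nothing measured on my actual system):

#include <cstdio>

int main()
{
    // Theoretical per-lane figures: PCIe 1.1 runs at 2.5 GT/s with 8b/10b
    // encoding, so ~250 MB/s per lane; PCIe 2.0 doubles that to ~500 MB/s.
    const double gen1Lane = 0.25;  // GB/s
    const double gen2Lane = 0.50;  // GB/s

    std::printf("primary   (PCIe 2.0 x16): %.1f GB/s\n", gen2Lane * 16);  // ~8 GB/s
    std::printf("secondary (PCIe 1.1 x4):  %.1f GB/s\n", gen1Lane * 4);   // ~1 GB/s
    // ...and per movax, that x4 link hangs off the PCH, so it also has to share
    // the DMI link to the CPU with everything else on the chipset.
}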

Agreed posted:

Aligning my answer with Factory Factory's, now. New motherboard time. Shouldn't be too hard to find an inexpensive price:performance board from last gen, should it?

I'd gotten this P55 from the top picks back when LGA 1156 was still the reasonable choice (over LGA 1366). I suppose it's still perfectly adequate as a single-GPU motherboard, but that ship sort of sailed when I sprung for a 30" monitor. In retrospect, I should've gotten a good single-GPU Kepler card instead of assuming that the old approach of running 6850s in Crossfire for higher resolutions would carry over to 7850s.

I guess if I'm going to replace the motherboard, I could consider upgrading the CPU as well. But I'll move this imminent talk of parts picking to the upgrade thread, heh.

Thanks for clearing this up, guys. :unsmith:

vvvvv

Edit: I never benchmarked it properly -- I got over my "zomg 3dmark points" phase a while ago, and now I pretty much just estimate performance by "yep, this feels smooth". Which, incidentally, is also why I never really bothered checking why my Crossfire upgrade didn't feel so awesome until now.

I suppose I'll have to try it out now. It'll be an excuse to finally install Metro 2033, too! :v:

Jan fucked around with this message at 03:05 on Oct 18, 2012

Jan
Feb 27, 2008


movax posted:

Not sure actually, that's an interesting question. From a hardware perspective, I could see the GPUs recognizing that there is an SLI bridge present and changing the BARs they request appropriately.

Not sure about SLI, but I know that for Crossfire in Alternate Frame Rendering mode, you essentially need each GPU to have a copy of the exact same resources (give or take one frame :v:), otherwise they can't render alternate frames efficiently. So in AFR, you effectively only have as much usable VRAM as a single GPU. I see no reason SLI would be any different with regard to AFR.

Now, for the fancier multi-GPU techniques, things get a lot more complicated than "render one frame on each GPU", so all bets are off. I haven't really looked into the techniques involved, but an engine that explicitly supports multiple GPUs definitely could make effective use of both cards' VRAM. But then the challenge becomes dividing the work of one frame evenly across all GPUs, in a way that depends as little as possible on what's actually in said frame.

Depending on the renderer, there are lots of possible ways one could divide draw calls between multiple GPUs... But absolutely none of them can guarantee a perfectly even workload without sharing a considerable amount of resources. So you might as well stick to the trivially simple AFR.
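
If it helps to picture why AFR doesn't pool VRAM, here's a toy sketch of the idea -- made-up structures, obviously nothing like the actual driver code:

#include <cstdint>
#include <cstdio>
#include <vector>

// Toy sketch only.
struct Gpu {
    int id;
    // In AFR, every GPU keeps its own full copy of the frame's textures and
    // buffers, since it may be asked to render any frame. That's why two 2GB
    // cards still behave like 2GB of VRAM, not 4GB.
};

void submitFrame(const Gpu& gpu, uint64_t frame)
{
    std::printf("frame %llu -> GPU %d\n", (unsigned long long)frame, gpu.id);
}

int main()
{
    std::vector<Gpu> gpus = { {0}, {1} };
    for (uint64_t frame = 0; frame < 8; ++frame) {
        // Round-robin: frame N goes to GPU (N mod gpuCount).
        submitFrame(gpus[frame % gpus.size()], frame);
    }
}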

DaNzA posted:

So how much faster are your games in CF configuration under that condition vs single card?

So, just ran the Metro 2033 benchmark with the following settings:

Options: Resolution: 2560 x 1600; DirectX: DirectX 11; Quality: Very High; Antialiasing: AAA; Texture filtering: AF 16X; Advanced PhysX: Disabled; Tesselation: Enabled; DOF: Disabled

With Crossfire:

Average Framerate: 41.33
Max. Framerate: 174.89
Min. Framerate: 9.08

Without Crossfire:

Average Framerate: 22.67
Max. Framerate: 77.41
Min. Framerate: 5.43

So that's roughly 1.8x scaling -- I guess it's not a complete loss. But I suspect both would sink pretty low if I used 4x MSAA instead, or turned on DoF.

Edit: Huh, turned on MSAA and DoF, and didn't quite get what I expected.

Crossfire + DOF

Average Framerate: 26.33
Max. Framerate: 188.01
Min. Framerate: 6.18

Crossfire + MSAA 4x + DOF

Average Framerate: 23.00
Max. Framerate: 154.42
Min. Framerate: 4.63

Crossfire + MSAA 4x

Average Framerate: 32.33
Max. Framerate: 149.84
Min. Framerate: 5.48

I would've expected the hit from MSAA to be greater than that of DoF. I guess I can conclude that in this particular case, the game isn't memory bound (even with 2560x1600 resolution) as much as ROP fragment bound. Which makes the memory bandwidth thing sort of moot. :v:

Edit 2: I probably meant fragment bound, not ROP bound. GPU programming is still sort of new to me.

Jan fucked around with this message at 04:58 on Oct 18, 2012

Jan
Feb 27, 2008


Agreed posted:

Both companies have had on-board high quality scaling that's basically free

Caveat emptor: AMD/ATI's GPU scaling option has a completely idiotic usability issue where you can't change the scaling type (i.e. Maintain aspect ratio, Stretch to full size or Center image) unless scaling is currently active, and the default is to stretch. And, of course, if you alt-tab away from a game to change the GPU scaling option, you're back at your native resolution, so scaling isn't active anymore. The workaround: lower your desktop resolution to whatever the fullscreen game will use, set the scaling type to Maintain aspect ratio (or whichever you prefer), then start the game, and restore your native resolution once you're done.

:psyduck:

Jan
Feb 27, 2008


axeil posted:

Are these drivers still having issues with weird artifacts in DX9 games in the new 7xxx series? I had to go back to the 12.6 Catalysts to play Skyrim without large angry black boxes everywhere. There was a long thread in I think the D3D message board about the bug and a lot of games had it, but I haven't checked it lately.

Is this seriously a thing?

I'm actually in the middle of support and RMA procedures with MSI for one of my 7850s because I'm getting occasional warping polygons. I have one semi-reliable test case in Skyrim where I can get it to happen when wearing a certain suit of armour... It also happens in Borderlands 2; incidentally, both games are DX9. And it only seems to happen in Crossfire, or in single GPU but only with one of the two cards.

If it was happening on both GPUs I'd tend to lay the blame on drivers, since it does seem to have started after a particular driver upgrade, but one of the cards doesn't seem to exhibit it at all. I'm going to have to put the "good" one back in and double-check...

Edit: I captured a YouTube video of this delightful GPU experience. Note that this is actually a really pathological capture, like the artefacts wanted to show off for the camera or something. It's normally more subdued, but definitely noticeable.

Edit 2: Welp. Put the other GPU back in, loaded my Skyrim test case, same poo poo. Hooray ATI! :suicide:

Between that and the lovely driver russian roulette, I'm pretty sure this is the last time I'm using ATIs. I'd never had this much trouble with nVidias. I don't even know why I strayed from them at this point... Probably because of Fermi. But seeing as that's all over and done with...

Jan fucked around with this message at 04:02 on Dec 13, 2012

Jan
Feb 27, 2008


spasticColon posted:

Is it the hardware itself that is causing this issue or did they just botch the drivers?

Sadly, that's not even known at this point. I've read through the Guru3D post that was linked earlier, and it does seem to vary wildly between games and drivers. I didn't really have any issues when I first got the 7850s, and it's been getting noticeably worse over time, so it could be drivers or just hardware degrading.

I'll be running some more tests, although it's hard to test anything without having a precise repro case.

Jan
Feb 27, 2008

So, on the polygon warping issue, I managed to get a frame capture of the problem with Intel GPA, by setting it to capture every 10 seconds and hoping it would eventually trigger at just the right moment. I can look at the frame screenshot in the capture browser and see the issue, like so:



But then when I load the capture to try and see which draw call is freaking out:



I should have known better, of course it's not going to show. It's playing back the frame and getting the expected results, not those of the hardware being lovely. :suicide:

Oh well, looks like my awesome debugging process ends here.

Jan
Feb 27, 2008

Well, to be fair, my 7850s are doing the job very nicely, not counting the polygon corruption issues. And on that particular note, I went ahead and tried the newest 12.11 beta11 drivers (I was at beta4), and they pretty much entirely get rid of the problem. I think I noticed a flicker or two in Skyrim, but it could easily have been my imagination -- nothing like the psychedelicfest from my video capture.

The periodic high-latency issue (or whatever you could call it) mentioned in that TechReport article has been acknowledged by AMD, so hopefully it won't be too long before they address it.

The only other issues I've had are the occasional dud driver release that actually worsens performance in some games, but that's happened when I had nVidias as well.

The negative issues just happen to be more apparent in these discussions. :v:

Jan
Feb 27, 2008

What the hell is a TDR?

Jan
Feb 27, 2008

For what it's worth, a colleague of mine worked at ATI and he is adamant that their hardware design team is excellent, but the driver guys just aren't able to keep up.

Can't really blame them for that, PC drivers must be one of the most finicky pieces of software to write.

Jan
Feb 27, 2008

Gave 13.1 a try, hoping it might have helped with the stutters I get in SWTOR, but nope. Those are just a lousy engine making blocking loads on the main thread or some poo poo. :shepface:

On the bright side, I figured out that, contrary to my usual habit, turning VSync on majorly helps most games when running Crossfire. So all the stutters I had in Dishonored went away, same for Skyrim. The only exception is Guild Wars 2, which still gets worse performance in most areas with Crossfire than without. Unless that's changed with 13.1...

Jan
Feb 27, 2008


Endymion FRS MK1 posted:

Wait, people actually used the CCC updater? I always hear about that breaking things. Uninstall via Add/Remove Programs, then run the new driver's .exe is the way to go.

You mean: uninstall using Add/Remove Programs, wait for the uninstaller to inevitably crash, use Driver Sweeper to clean up what's left, reboot, then run the new installer executable.

Jan
Feb 27, 2008

Nah, you'd be surprised at how much the consoles influence game development. There's a reason so many PC games these days get written off as console ports.

If anything, having both consoles use essentially the same architecture will make PC games from the next console gen far better. The PS3's Cell processor was an awesome piece of engineering, but a very specialised one that pretty much required hand-coded effort to fully harness its power. These new specs, on the other hand, are much closer to traditional PC multicore development. So odds are the barrier that kept games exclusive to PS3 or 360 will be a lot thinner now, and by extension PC versions will be much easier to produce.

But yeah, architecture aside, PC development is way more complicated, even if drivers are supposed to hide all the ugly PC complexities. And, more importantly, the PC market is still much smaller than that of the consoles, which is really what will dictate the amount of effort developers are willing to invest in the end.

Jan
Feb 27, 2008


bull3964 posted:

I don't doubt that there is room for improvement. I'm just saying, when was the last time visuals were the selling point of a game?

Pretty much all the loving time on PC, and probably even consoles. Skyrim comes to mind.

Jan
Feb 27, 2008


SocketSeven posted:

But when was the last time visuals have really improved on a game?

When they weren't forced to scale the totality of their engine down to archaic consoles with roughly 350MB of RAM shared between CPU and GPU?

You guys are arguing against needing new consoles because we don't have any better graphics, when it's the other way around. :psyduck:

Jan
Feb 27, 2008


Endymion FRS MK1 posted:

Speaking of Vsync, is there any way to enable it without also getting the horrendous mouse acceleration?

That depends on the way a game engine handles its updates. If it does it intelligently, VSync won't affect mouse input. If it does it badly, you're SOL and no setting you pick will do anything about it.
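
By "intelligently" I mean something like sampling input on its own clock instead of once per presented frame. A hand-wavy sketch of the idea -- the commented-out calls are hypothetical placeholders, not any real engine's API:

#include <atomic>
#include <chrono>
#include <thread>

// Input gets sampled on its own fixed clock, so it doesn't care whether the
// render thread is blocked waiting on VSync. Engines that instead read the
// mouse once per rendered frame are the ones where VSync turns into input lag
// that feels like "acceleration".
std::atomic<bool> running{true};

void inputThread()
{
    while (running) {
        // pollMouseAndKeyboard();   // hypothetical placeholder
        std::this_thread::sleep_for(std::chrono::milliseconds(1));  // ~1000 Hz, frame rate be damned
    }
}

void renderThread()
{
    while (running) {
        // buildFrameFromLatestInput();  // hypothetical placeholder
        // Present(1);                   // would block on VSync; input keeps flowing regardless
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}

int main()
{
    std::thread input(inputThread), render(renderThread);
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    input.join();
    render.join();
}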

Jan
Feb 27, 2008


Endymion FRS MK1 posted:

Forgot to mention, I have a 7950.

RadeonPro has some special VSync settings, namely an equivalent to Adaptive VSync. Give it a try.

Jan
Feb 27, 2008


TheRationalRedditor posted:

The 7850 is a no-frills kinda thing so nearly all of them ever produced have that ATi reference single-fan leafblower.

What? No. The good brands will put their own cooling solution on them, even if they're "no frills" cards. Manufacturers are aware that noise levels are an increasing concern, and that there will be a demand for a quieter version of any model.

I have two of the above, and both of them running at max load are still quieter than the XFX 5870 I upgraded from (because it was so loving noisy).

That said, I could never find a decent aftermarket cooler like the ones mentioned by Factory Factory and grumperfish. I initially wanted to get one of those for my 5870, but it turned out to be nowhere near as simple, cheap and convenient as an aftermarket cooler for a CPU.

Jan
Feb 27, 2008

Yeah, since the only reason I wanted to upgrade at the time was for noise levels, I waited patiently until MSI released their 7850. Which took forever but was worth it.

I'd still rather they used 120mm fans, but it's a pretty good cooler regardless.

Jan
Feb 27, 2008


Tab8715 posted:

With new consoles having or rumored to have ATI gpus - would it be a good idea to purchase an ATI-Card over a nVidia card?

No matter what GPU the next generation has, it will very likely also have unified memory. That one detail changes everything about how engines are designed and optimised, so until PCs also have unified memory, console-specific optimisations don't mean anything.

Jan
Feb 27, 2008


Chasiubao posted:

360 has 512 unified, PS3 does a 256 even split, although developers have been known to use the video side for non-video stuff.

Correct.

But the OS also takes a bunch more memory for itself, so the effective available memory on 360 is closer to 325MB. Don't remember off-hand on PS3.

As for video RAM vs. regular RAM on PC, remember that the most memory-consuming resources (textures, non-streamed sounds) don't have to live entirely in VRAM. While unified memory makes some PC mechanisms, like texture caching, unnecessary, I doubt any next-gen title is going to use up 6+GB worth of attribute data, especially not if tessellation is available to reduce bandwidth usage. With textures, sure, but PC already takes advantage of larger address spaces by using bigger textures.

I think the bigger challenge, video RAM-wise, is going to come from render buffers. Pretty much every modern engine is migrating towards full deferred shading, which is far easier to work with and more efficient... but it requires quite a bit of memory to store a bunch of render targets, some of which need wider, higher-precision formats and take up to 4 times more memory. While the 360 had 300+MB of available memory, that memory was rather slow to render to, so they added some embedded RAM that is tremendously faster. But it was only 10MB. A full set of 720p render targets won't fit in 10MB, let alone 1080p render targets. Most games cheated by using 640p render targets and upscaling, or they used tiling, which is a chore.
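
To put some rough numbers on the 10MB problem (assuming plain 32-bit targets; real G-buffer layouts vary, and this ignores MSAA entirely):

#include <cstdio>

int main()
{
    const double MB = 1024.0 * 1024.0;
    const int bytesPerPixel = 4;  // one plain 32-bit (RGBA8-ish) target

    double t720  = 1280 * 720  * bytesPerPixel / MB;  // ~3.5 MB per 720p target
    double t1080 = 1920 * 1080 * bytesPerPixel / MB;  // ~7.9 MB per 1080p target

    // A typical deferred setup of 4 colour targets plus depth:
    std::printf("720p:  %.1f MB per target, ~%.1f MB for 4 targets + depth\n", t720,  t720 * 5);
    std::printf("1080p: %.1f MB per target, ~%.1f MB for 4 targets + depth\n", t1080, t1080 * 5);
    // Even at 720p that's already ~18 MB -- nowhere near fitting in 10 MB of
    // eDRAM, hence the tiling and the sub-720p render targets.
}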

If the new consoles also have an eDRAM mechanism (and I can't see how they wouldn't), I'm hoping it'll be more than 10MB. It'd be awesome to have our more important geometry buffers all reside in fast RAM without having to manage copies/resolves between each deferred pass.

Jan
Feb 27, 2008


Chasiubao posted:

Where on Earth did you get your numbers? :psyduck: Oh never mind, don't even need to break NDA, it's public:

http://blogs.msdn.com/b/xboxteam/archive/2007/11/30/december-2007-system-update.aspx

I haven't actually looked up the official documentation to see how memory is allocated. There are probably some managed resources and system memory used by Direct3D itself. But I do know that the biggest block we could allocate to our memory manager without running out of memory was around 350MB at best.

Jan
Feb 27, 2008

Well, that very much matches what playing with my 2x7850s feels like. Without VSync, it worsens what tearing I would've otherwise experienced and feels subjectively worse than running single GPU. With VSync, stuttering doesn't feel as bad, and frame rates are relatively smooth, but nothing near what a second GPU should be like.

CrossfireX -- never again.

Jan
Feb 27, 2008


spouse posted:

Made me nauseous looking at Far Cry 3 in ultra, with AA on. Is that the "microstutter" you guys mention?

I'd call microstutter many things, but nauseating is not one of them. Odds are you're getting some motion sickness, which can manifest more easily at higher frame rates. But you can severely reduce the impact of microstuttering if you want to rule that out.

For starters, as grumperfish mentions, make sure VSync is on. In my experience, Crossfire basically isn't worth running without VSync.

An even better solution would be to use RadeonPro to enable Dynamic VSync. The procedure is pretty simple, but I'll just link to this article that explains it in detail. (If you're curious, look in the preceding pages for the one that gives the frame rate graphs while exhibiting microstuttering. It's pretty amazing, in a disappointing sort of way.)

Jan fucked around with this message at 16:46 on Mar 15, 2013

Jan
Feb 27, 2008


hayden. posted:

Can you run two video cards in SLI/Crossfire with one in a 2.0 and one in a 3.0 PCI-E? I'm not sure there even exists a motherboard that's labeled SLI/CF compatible in this configuration.

Yes.



For my motherboard (P55-USB3), Gigabyte decided to include a second PCI-E x16 slot, but it's routed through the PCH, resulting in considerably lower PCI-E bandwidth. The end result is that the whole setup is effectively limited by the lowest PCI-E version and speed in the mix.

In my case, this hobbles the Crossfire setup pretty badly, since the PCH is at a lousy 1.1 x4. I didn't realise this at the time I bought said motherboard, but then I didn't plan on doing Crossfire either. Either way, I'm still getting some gains from Crossfire in quite a few cases, so I've been living with it.

You're right that few motherboards will advertise multi-GPU compatibility in these circumstances. Since SLI/CrossfireX is still a market fairly restricted to enthusiasts, boards that do advertise it are usually designed to be as efficient as possible in that context. In my case, I think Gigabyte designed the motherboard first and foremost to allow USB3 speeds on LGA1156, and the lane sharing that required made it possible to throw in Crossfire as an afterthought. But this is a pretty extreme example.

I don't believe this would be possible with SLI, though. Last I checked, SLI is far more stringent in having perfectly matched cards and buses. But that might've changed a bit, so don't take my word for it.

Jan
Feb 27, 2008


spasticColon posted:

With newer games eating more VRAM will the memory bus on my 660Ti start to become a problem in the near future? Right now it seems my 660Ti is borderline overkill at 1080p but with the new consoles coming with GPUs having access to 8GB of GDDR5 RAM I fear that video cards with 2GB of VRAM or less will become obsolete very quick.

As I've mentioned earlier in the thread, consoles will have shared memory, so having 6-8GB of RAM doesn't necessarily mean they'll use all that up on GPU resources.

It's too early to tell, but I think the real VRAM wasters on PC will be the same as now -- large resolutions, MSAA, etc. which you don't often see on consoles.
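
Just to put numbers on why resolution and MSAA are the wasters (rough math, ignoring any compression tricks the hardware might pull):

#include <cstdio>

int main()
{
    const double MB = 1024.0 * 1024.0;

    double buf1080 = 1920 * 1080 * 4 / MB;  // one 32-bit colour buffer at 1080p, ~7.9 MB
    double buf1600 = 2560 * 1600 * 4 / MB;  // same thing at 2560x1600, ~15.6 MB

    // 4x MSAA stores four samples per pixel for both colour and depth:
    double msaa1600 = buf1600 * 4 * 2;      // ~125 MB before a single texture is loaded

    std::printf("1080p colour buffer:               %.1f MB\n", buf1080);
    std::printf("2560x1600 colour buffer:           %.1f MB\n", buf1600);
    std::printf("2560x1600, 4x MSAA colour + depth: ~%.0f MB\n", msaa1600);
}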

Jan
Feb 27, 2008


TheRationalRedditor posted:

It's not really a big deal when you can just add additional cards in SLI for considerable gain (Nvidia, anyway). This generation's GPU value has been pretty favorable versus modern game tech.

But SLI won't help if you're running out of VRAM, as seems to be the case for Bioshock Infinite. With these reports of filling up 2GB in 1080p, I'm almost afraid to try it out on my own PC at 2560x1600.

What I really did, though, was buy XCOM and get a Bioshock Infinite preorder for free. And I'm still not done playing XCOM. :haw:

Jan
Feb 27, 2008

So, with Haswell coming out, has there been any recent news on its integrated GPU? Last I heard, it was surprisingly competitive (compared to previous Intel offerings at least), and I've been holding off on building my HTPC to see if it could also half-decently play some games.

Edit:
vvvvv
That's the thing, I'm not looking for a gaming HTPC, I'm looking for an HTPC that just happens to have some gaming ability. Any serious gaming will still take place on my desktop PC, but the option of dicking around with 3-player co-op in Trine 2 would be neat. The priorities for the HTPC remain small size, low power and low noise.

Jan fucked around with this message at 22:55 on Apr 7, 2013

Jan
Feb 27, 2008


Urzza posted:

Is it worth my time to look at this if I'm making something for playing games on?

Look at it this way: Crossfire on different cards will always run at the speed and bandwidth of the lowest card. So if you have a fancy HD7990, it will only run as fast as your lovely integrated GPU.

Urzza posted:

I'm set on an AMD chip however, I got one and a mobo for free from AMD fan day.

Your loss. If I want a free turd, I'll just go to the bathroom instead.

Jan
Feb 27, 2008


Killer robot posted:

I don't know, if I remember this right wasn't the 8800GT just a revision and die shrink that gave incremental improvement over the initial GTS/GTX (and landed between them in performance)? The GTX was the one that made a huge leap in features and performance and became what a serious card needed to match. I mean, the GT was out longer and sold more, but I can't argue with this any more than I can with the GeForce 256 being on there instead of the longer lived and better selling GeForce 2 that just built on it a little.

Right. I had an 8800GT and it lasted a long time; the 8800GTX was far less appealing at the time because it was the then-equivalent of Fermi in terms of unreasonable TDP.

spasticColon posted:

Those optimum reqs...Jesus Christ. :stare:

Remember that Metro 2033's "Best" preset uses some prohibitively expensive AA, DoF and SSAO settings. Turning those off makes it playable at 2560x1600 with little to no loss of visual quality. I wouldn't be surprised if they pile on even more expensive PostFX this time around that can be turned off for massive performance gains.

Jan
Feb 27, 2008


Dogen posted:

You can actually do most of that for nvidia with the nvidia control panel, although it's a bit clunky.

Clunky? Please. Remember that its competing equivalent is Catalyst Control Centre.

Jan
Feb 27, 2008

Whether an AMD driver release is signed or not has pretty much no impact on it being a turd or not.

Jan
Feb 27, 2008


iuvian posted:

Multi-card setups in general are dumb as heck, the appeal of a single card solution is less noise/power/heat issues.

For a while, a pair of aftermarket-cooled ATIs in Crossfire was a quieter and cooler-running solution than a single Fermi card.

Klyith posted:

e: ^^^ My understanding is that if you enable vsynch it's relatively fine and always has been. I'd think the main benefit of a SLI/CF setup is so you can enable vsynch and never drop below 60 frames.

Yeah, Vsync pretty much eliminates the microstuttering from AFR, and Dynamic Vsync lets you at least keep a tolerable performance if your frame rate goes below 60.

But whether AFR actually helps a game engine or not is pretty much a crapshoot.


Jan
Feb 27, 2008


Agreed posted:

During that time, I loved my gaming experience and never had any issues, while he often had reason to grumble about how much of a pain in the rear end CF was and that he wouldn't be doing THAT again.

Oh, I'm not saying it was a more convenient setup, only relatively more quiet. ;)

For what it's worth, 7850s in Crossfire has been good... When it works. I had to go through a series of driver changes to try to fix the geometry corruption issue from a little while back, but that plagued single card setups as well. All of the CF-specific issues I had went away once I figured out about Vsync needing to be on.

Lately, the only games where CF hasn't given me noticeable gains are either CPU bound (Guild Wars 2) or completely unoptimised for multi-GPU setups (the beta of a certain MMO).

Jan fucked around with this message at 14:11 on Apr 27, 2013
