Dominoes
Sep 20, 2007

Hey, I use Eyefinity/CrossFire with 3 portrait monitors. I get pretty bad screen tearing because vsync doesn't work. As far as I can tell, there is no way to get Eyefinity to work with vsync. Searching turns up a few cases of this with no solution, but I haven't found solid information, e.g. an article from AMD explaining that there are technical issues preventing vsync from working with Eyefinity.

Have any of you found a solution? Or know that there will or won't be one?

Does Nvidia Surround work with vsync?

Running games in a window is a workaround; vsync works there, although CrossFire doesn't.

Dominoes fucked around with this message at 18:19 on Oct 14, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
Set a framerate cap to 60fps; that doesn't fix the problem, but it will significantly reduce the amount of tearing you see. From some Googling, there's no way vsync will work with Eyefinity until MST hubs are available and you can connect all three monitors via DisplayPort.
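
The idea behind a cap is just to keep the GPU from finishing frames much faster than the display can show them. A minimal sketch of the principle in C++ (illustrative only; render_frame is a stand-in, and in practice you'd let the driver or a third-party limiter do this):

    // Illustrative frame limiter: render, then sleep until the next 60Hz deadline.
    #include <chrono>
    #include <thread>

    static void render_frame() { /* stand-in for the game's draw + present */ }

    int main() {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::microseconds(16667); // ~1/60th of a second
        auto deadline = clock::now() + frame_budget;
        for (;;) {
            render_frame();                          // draw as fast as the GPU allows
            std::this_thread::sleep_until(deadline); // then idle instead of racing ahead
            deadline += frame_budget;
        }
    }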

Dominoes
Sep 20, 2007

Alereon posted:

Set a framerate cap to 60fps; that doesn't fix the problem, but it will significantly reduce the amount of tearing you see. From some Googling, there's no way vsync will work with Eyefinity until MST hubs are available and you can connect all three monitors via DisplayPort.
A framerate cap helps, but it's a 20% solution. I do connect all three monitors via DisplayPort. One of the links I found described that as the problem too, but it's apparently not (or there are multiple causes).

I use two of these:

Dominoes fucked around with this message at 18:22 on Oct 14, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
What videocards are you using?

Dominoes
Sep 20, 2007

ASUS EAH6950

Longinus00
Dec 29, 2005
Ur-Quan
What monitors are you running?

Dominoes
Sep 20, 2007

Three HP ZR24ws in portrait.

My specific setup shouldn't matter for diagnosing or assessing this if I'm right that no one has gotten Eyefinity to work with vsync in fullscreen.

Animal
Apr 8, 2003

Alereon posted:

Don't get the FTW model; it has stacked power connectors that will break compatibility with many aftermarket coolers. Factory-overclocked Kepler cards have also been problematic due to the difficulty of testing for stability with GPU Boost and the TDP cap in play. The only reason to get a non-stock card is if you're buying it for the cooler or for improved power-delivery components for high overclocking.

To make sure you're looking at the best prices: Newegg has the EVGA GeForce GTX 670 for $379.99 - $20 MIR = $359.99 with free shipping.

I would like to overclock as high as I can. I love the cooler on my MSI Twin Frozr 560 Ti 448. But I read that MSI is being shady with voltages now.

BusinessWallet
Sep 13, 2005
Today has been the most perfect day I have ever seen
Does anyone have any experience with XFX RMAs? I have a 5850 that has the double lifetime warranty, it poo poo the bed so I'm sending it in. I've heard they replace the cards pretty quickly, but I'm wondering if they're going to replace it with a current card, or send me back some old crusty card.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I do, from a few years ago; I recently posted about it upthread. Unless they've changed, they'll send you a crusty card. If you're lucky, it'll be a 6850 or a 7770 and there won't be much difference, but you may not be lucky.

roadhead
Dec 25, 2001

BusinessWallet posted:

Does anyone have any experience with XFX RMAs? I have a 5850 that has the double lifetime warranty, it poo poo the bed so I'm sending it in. I've heard they replace the cards pretty quickly, but I'm wondering if they're going to replace it with a current card, or send me back some old crusty card.

XFX sent me a 6950 to replace my 5870 - though somehow they managed to find a 1GB 6950 AND it appears to have been a return/exchange. My brother has the 6950 now.

Chuu
Sep 11, 2004

Grimey Drawer
I noticed that there are a ton of GTX 670 and GTX 680 4GB cards on the market. At what resolution do you actually need that much memory?

Animal
Apr 8, 2003

Is the NVIDIA Borderlands 2 promotion still going? I am about to pull the trigger on this card, but the listing doesn't mention anything about it:

http://www.amazon.com/gp/product/B0...&pf_rd_i=507846

Also, is this a good 670 to get over the reference model?

-edit-
I ordered the reference card because it was smaller. I can't find the Borderlands 2 promotion on either Amazon or Newegg, so I think that's over :(

It's only available if you order from EVGA, and the shipping there is too expensive.
I guess I'll beg NVIDIA for a code like I did with Arkham City; it worked then :downs:

Animal fucked around with this message at 06:01 on Oct 16, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Chuu posted:

I noticed that there are a ton of GTX 670 and GTX 680 4GB cards on the market. At what resolution do you actually need that much memory?

You can maybe make an argument if you're doing 8x or higher MSAA in Battlefield 3, or maxed-out GTA IV with High/Ultra graphics, on a triple-1080p multimonitor setup. Maybe; even then, 2GB can go a surprisingly long way. Obviously, you'd be constrained by other performance factors on any single card before the VRAM really made a difference. Extra RAM also helps with serious GPGPU (OpenCL/CUDA) work, but that's really a workstation/server consideration.

So don't pay for 4GB.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I would say it's completely contraindicated if you're still on PCI-e 2.0. The only instance where it's going to be useful is SLI at extremely high resolutions, in 3D, or both, and only PCI-e 3.0 has sufficient real bandwidth to feed more than one 670/680 (or 7970, take your pick).

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
It would also come in handy for supersampling, or if you want to run custom high-res texture packs (e.g. for Skyrim), but that's rather uncommon right now.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Alereon posted:

It would also come in handy for supersampling, or if you want to run custom high-res texture packs (e.g. for Skyrim), but that's rather uncommon right now.

That was more salient in the previous generation when a lot of performance cards were 1GB and it was accepted wisdom that 1GB was enough (when truthfully by the end of Fermi there were some games getting into the 1.2GB-1.4GB region at 1080p).

The performance jump from the 560 Ti to the 660 Ti is huge, and even with its weird asymmetric VRAM "issue" the 660 Ti turns in great performance in standard SLI, though it eats poo poo scaling past two cards. With superior access to the framebuffer, two 670s or two 680s can be expected to do pretty amazingly well even in very texture-heavy games.

My prediction is that for single-monitor gaming, we won't see a need for more than 2GB of VRAM until the next consoles launch. Going to watch for Metro 2034 to prove me wrong, if anything PC-oriented will. :)

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.
I realise this is going to be a highly engine-dependent thing, but what are some GPU troubleshooting tools and approaches?

I'm running 7850s in Crossfire and I've been experiencing randomly warping polygons, seemingly since switching to Catalyst 12.8. Tried switching to 12.9 beta, same deal -- I'll have to try rolling back to an earlier revision, I suppose. This happens to a minor extent in Borderlands 2, but Skyrim's UI is especially bad. Turning off Crossfire helps a lot.

Also, what's that HUD that some people seem to be using to display GPU usage in Crossfire? I think it's supposed to be MSI Afterburner, but I could never get it to work.

Starting to regret picking Crossfire as a solution to run in 2560x1600. :smith:

Animal
Apr 8, 2003

Alright, so I am getting my 670 tomorrow. I have an MSI Twin Frozr II GeForce 560 Ti 448 that has depreciated to the point where it might just be worth keeping around for PhysX.

Will it actually provide a benefit, or is the 670 fast enough that I am just better off using it alone and selling/giving the other card away? My power supply is a Seasonic 620, and I will be gaming at 1440p.

Either way I will benchmark both scenarios, but if the answer is unanimous then maybe I should not waste my time.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Any GF110-based card (that's the 560 Ti 448, 570, or 580) will do PhysX with virtually no resource hit at all. Like, shrug it off, temps at nearly idle levels.

If you're on a P67 or Z68 motherboard, you may only have PCI-e 2.0 support for 8x/8x. That does shave some performance off of the 670 in demanding scenarios. If you can run 16x/4x, do that.

Animal
Apr 8, 2003

Agreed posted:

Any GF110-based card (that's the 560 Ti 448, 570, or 580) will do PhysX with virtually no resource hit at all. Like, shrug it off, temps at nearly idle levels.

If you're on a P67 or Z68 motherboard, you may only have PCI-e 2.0 support for 8x/8x. That does shave some performance off of the 670 in demanding scenarios. If you can run 16x/4x, do that.

I have a Z68 motherboard (Asus Gene-Z). I will try to figure out if 16x/4x is possible.

-edit-
Could not find a setting in the BIOS or a how-to on Google. Is it actually possible to force a motherboard to 16x/4x?

Animal fucked around with this message at 20:49 on Oct 16, 2012

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Animal posted:

I have a Z68 motherboard (Asus Gene-Z). I will try to figure out if 16x/4x is possible.

-edit-
Could not find a setting in the BIOS or a how-to on Google. Is it actually possible to force a motherboard to 16x/4x?

On my Asus P8P67 Pro, I can force PCIe slot 3 to a specific speed (I think my choices are Auto, 1x, and 4x).

Animal
Apr 8, 2003

Dogen posted:

On my Asus P8P67 Pro, I can force PCIe slot 3 to a specific speed (I think my choices are Auto, 1x, and 4x).

The Gene-Z has only two slots (mATX). Where in the BIOS does yours allow the change?

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
You won't have the option. Larger boards get it because the last slot is the PCH slot, and setting it to full x4 bandwidth will disable some peripherals.

Animal
Apr 8, 2003

Factory Factory posted:

You won't have the option. Larger boards get it because the last slot is the PCH slot, and setting it to full x4 bandwidth will disable some peripherals.

Lame... how large is that 8x performance drop on the 670 at 1440p?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Animal posted:

Lame... how large is that 8x performance drop on the 670 at 1440p?

It's noticeable at 1080p; I reckon you'll feel it at 1440p.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Let's put some numbers to it, eh? TechPowerUp on GeForce 680 and Radeon 7970 performance scaling by PCIe bandwidth.

PCIe 2.0 x8 can have as much as a 17.5% drop in frames per second, or as little as zero, depending on the card-to-RAM bandwidth needs of the particular game.

norg
Jul 5, 2006
Animal, are you playing Skyrim modded with hi-res texture packs? Because if so, 2GB of VRAM is going to get tested. I've seen people using in excess of 2GB in Skyrim while playing at 1080p, let alone 1440p.

Are you tied into Nvidia for any particular reason? A 7950/7970 with 3GB of VRAM would probably be better for modded Skyrim, or even a 4GB 670.

That said, it is just one game, and it's one where ultimately the VRAM usage is in your hands. If you're hitting the wall you can always install lower-res textures. It's just something to bear in mind, I guess.

Goon Matchmaker
Oct 23, 2003

I play too much EVE-Online
How can a game like Skyrim be 32-bit and have, in aggregate, more than 4GB of RAM allocated between video RAM and system RAM? This has been bugging me for a while now...

Jan
Feb 27, 2008

The disruptive powers of excessive national fecundity may have played a greater part in bursting the bonds of convention than either the power of ideas or the errors of autocracy.

Goon Matchmaker posted:

How can a game like Skyrim be 32-bit and have, in aggregate, more than 4GB of RAM allocated between video RAM and system RAM? This has been bugging me for a while now...

I'm pretty certain VRAM uses a completely separate address space. Direct3D resources work in terms of "buffers" instead of pointers, and those buffers wrap up all the memory-mapping information for GPU resources.

I've mostly done GPU work on consoles, though, so I'm not 100% positive how it works for PC with non-unified memory.
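
Roughly what that looks like through the API; a minimal D3D11 sketch (my own illustration, hypothetical helper, error handling omitted). You describe the resource and get back an opaque COM interface, not an address you could dereference:

    // Sketch: the driver decides where this buffer actually lives (VRAM or system RAM).
    #include <d3d11.h>

    ID3D11Buffer* create_vertex_buffer(ID3D11Device* device,
                                       const void* vertices, UINT byte_size) {
        D3D11_BUFFER_DESC desc = {};
        desc.ByteWidth = byte_size;
        desc.Usage     = D3D11_USAGE_IMMUTABLE;      // GPU-read-only; placement is the driver's call
        desc.BindFlags = D3D11_BIND_VERTEX_BUFFER;

        D3D11_SUBRESOURCE_DATA init = {};
        init.pSysMem = vertices;                     // initial data is copied out of our address space

        ID3D11Buffer* buffer = nullptr;
        device->CreateBuffer(&desc, &init, &buffer); // we get a handle, not a VRAM pointer
        return buffer;                               // nullptr on failure
    }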

Wozbo
Jul 5, 2010
It's two separate address spaces.

E: And the video card's memory controller might have its own addressing scheme, so the CPU literally doesn't know/care beyond feeding/fetching data. I know Maxwell almost certainly will, but I'm not 100% sure on current tech.

suddenlyissoon
Feb 17, 2002

Don't be sad that I am gone.
Just picked up the Gigabyte 670 OC from Amazon today for $339 (there was a $20 coupon you "clip"). I am hoping I can justify my purchase by seeing quite a bit of an upgrade from my 6870. I could still run most games (like Battlefield) at medium or high with some options off. Just plan on using my 6870 in a yet-to-be-built HTPC down the road.

Animal
Apr 8, 2003

norg posted:

Animal, are you playing Skyrim modded with hi-res texture packs? Because if so, 2GB of VRAM is going to get tested. I've seen people using in excess of 2GB in Skyrim while playing at 1080p, let alone 1440p.

Are you tied into Nvidia for any particular reason? A 7950/7970 with 3GB of VRAM would probably be better for modded Skyrim, or even a 4GB 670.

That said, it is just one game, and it's one where ultimately the VRAM usage is in your hands. If you're hitting the wall you can always install lower-res textures. It's just something to bear in mind, I guess.

Too late, I have the 670 here. I can live without super high res textures for Skyrim if it just benefits that one game.

I am brand-agnostic; I went NVIDIA because it seems to perform better in the games I like, and I am one of those dorks who thinks PhysX is kinda cool :)

Proud Christian Mom
Dec 20, 2006
READING COMPREHENSION IS HARD
Luckily there are enough Skyrim texture packs out there that you can find a middle ground between "black and white" and "my computer melted"

Animal
Apr 8, 2003

I just installed my EVGA 670 (reference) and am getting lower benchmark results than Anandtech did here:

http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/6

They average 34fps at 1600p on very high, and I average 22fps at 1440p.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Animal posted:

I just installed my EVGA 670 (reference) and am getting lower benchmark results than Anandtech did here:

http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/6

They average 34fps at 1600p on very high, and I average 22fps at 1440p.

I have a hard time believing that you have an i7-3960X @ 4.3GHz and 8GB of Ripjaws RAM at 1867.

Animal
Apr 8, 2003

Rigged Death Trap posted:

I have a hard time believing that you have an i7-3960X @ 4.3GHz and 8GB of Ripjaws RAM at 1867.

No, but I have an i7-2600K overclocked to 4.5GHz and 8GB of RAM at 1600, all of which should make about a 0.1% difference.

I tried without DoF enabled and got much better results; I think they disabled it and didn't mention it.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
High quality, 16x anisotropic filtering, and no MSAA? Because the hit from MSAA is generally about 1/3 and would account for the difference.

Animal
Apr 8, 2003

Factory Factory posted:

High quality, 16x anisotropic filtering, and no MSAA? Because the hit from MSAA is generally about 1/3 and would account for the difference.

I have it set exactly like them: High Quality, AAA, 16xAF. I turned off DoF, and that seems to put the results at parity; higher scores than theirs, which is about right considering they're at 1600p and I'm at 1440p.

I did some overclocking: brought the Turbo Boost clock to 1065MHz and the memory to 6.9GHz. So far it flies and is stable. Power target 120%.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Jan posted:

I'm pretty certain VRAM uses a completely separate address space. Direct3D resources work in terms of "buffers" instead of pointers, and those buffers wrap up all the memory-mapping information for GPU resources.
I'm pretty sure VRAM and system RAM do use the same address space; that's why 32-bit systems can only address (4GB minus VRAM minus all other hardware reservations) worth of system RAM. This isn't relevant for the case of a 32-bit app running on a 64-bit system, because Skyrim doesn't care about the VRAM; only the video driver does, and that's a 64-bit application.
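
Back-of-the-envelope, with assumed numbers (a 1GB card and ~512MB of other MMIO reservations; the real reservations vary by board):

    // Illustrative arithmetic for a 32-bit system's physical address budget.
    #include <cstdio>

    int main() {
        const unsigned total_mb = 4096;  // 2^32 bytes addressable
        const unsigned vram_mb  = 1024;  // video memory aperture mapped into that space
        const unsigned mmio_mb  = 512;   // other hardware reservations (assumed)
        std::printf("System RAM a 32-bit OS can use: ~%u MB\n",
                    total_mb - vram_mb - mmio_mb);
        return 0;
    }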

Animal posted:

I have it set exactly like them: High Quality, AAA, 16xAF. I turned off DoF, and that seems to put the results at parity; higher scores than theirs, which is about right considering they're at 1600p and I'm at 1440p.
Do make sure you have it running in DX11 mode; accidentally falling back to an older DX mode will significantly impair performance. Another Metro 2033 benchmarking pro-tip: there's significant run-to-run variability, so you need to do multiple runs and average them. Also make sure your fan speed is turned up high enough to get meaningful results; if the video card breaks 69C it throttles, so you have to throw out the results for that run.
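
A trivial sketch of that bookkeeping (the run numbers are made up; you'd log your own FPS and peak temperatures):

    // Average several benchmark runs, discarding any run where the card broke 69C.
    #include <cstdio>
    #include <vector>

    struct Run {
        double avg_fps;     // reported by the benchmark
        double peak_temp_c; // logged during the run
    };

    double averaged_fps(const std::vector<Run>& runs) {
        double sum = 0.0;
        int kept = 0;
        for (const Run& r : runs) {
            if (r.peak_temp_c > 69.0) continue; // throttled: throw this run out
            sum += r.avg_fps;
            ++kept;
        }
        return kept > 0 ? sum / kept : 0.0;
    }

    int main() {
        std::vector<Run> runs = {{34.1, 66.0}, {32.8, 68.5}, {29.9, 71.2}, {33.6, 67.1}};
        std::printf("Average over valid runs: %.1f fps\n", averaged_fps(runs));
        return 0;
    }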
