|
Hey, I use Eyefinity/Crossfire with 3 portrait monitors. I get pretty bad screen tearing because vsync doesn't work. As far as I can tell, there is no way to get Eyefinity to work with vsync. Searching turns up a few cases of this with no solution, although I haven't found solid information, i.e. an article from AMD explaining that there are technical issues preventing vsync from working with Eyefinity. Have any of you found a solution? Or know whether there will or won't be one? Does Nvidia Surround work with vsync? Running games in a window is a workaround; vsync works there, although Crossfire doesn't. Dominoes fucked around with this message at 18:19 on Oct 14, 2012 |
# ? Oct 14, 2012 18:07 |
|
Set a framerate cap at 60fps; that doesn't fix the problem, but it will significantly reduce the amount of tearing you see. From some Googling, there's no way this will work until MST hubs are available and you can connect all three monitors via DisplayPort.
|
# ? Oct 14, 2012 18:12 |
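The frame-cap suggestion above is simple enough to sketch. A minimal limiter loop in Python, with a hypothetical `render()` stub standing in for the game's draw call (real limiters live in the driver or an injector like the era's FPS limiter tools, not in game code you control):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms frame budget

def render():
    """Stand-in for the game's draw call (hypothetical)."""
    pass

def run(frames):
    for _ in range(frames):
        start = time.perf_counter()
        render()
        # Sleep away the remainder of the frame budget so frames are
        # never presented faster than the refresh rate. Fewer surplus
        # frames per refresh means fewer tear lines on screen, though
        # without vsync tearing is reduced, not eliminated.
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)
```

The point of the cap is that with no vsync and no limiter, a fast card can present several frames per refresh, each one a potential tear line; capping at the refresh rate brings that down to at most one.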
|
Alereon posted:Set a framerate cap to 60fps, that doesn't fix the problem but will significantly reduce the amount of tearing you see. From some Googling there's no way this will work until MST hubs are available and you connect all three monitors via DisplayPort. I use two of these: Dominoes fucked around with this message at 18:22 on Oct 14, 2012 |
# ? Oct 14, 2012 18:14 |
|
What videocards are you using?
|
# ? Oct 14, 2012 18:23 |
|
ASUS EAH6950
|
# ? Oct 14, 2012 18:24 |
|
What monitors are you running?
|
# ? Oct 14, 2012 19:15 |
|
Three HP ZR24ws in portrait. My specific setup shouldn't help diagnose or assess this if I'm right that no one has gotten eyefinity to work with vsync in fullscreen.
|
# ? Oct 14, 2012 19:21 |
|
Alereon posted:Don't get the FTW model, it has stacked power connectors that will break compatibility with many aftermarket coolers. Factory-overclocked Kepler cards have also been problematic due to the difficulty of testing for stability with boost and TDP cap operating. The only reason to get a non-stock card is if you're buying it for the cooler or improved power delivery components for high overclocking. I would like to overclock as high as I can. I love the cooler on my MSI Twin Frozr 560 Ti 448. But I read that MSI is being shady with voltages now.
|
# ? Oct 14, 2012 19:23 |
|
Does anyone have any experience with XFX RMAs? I have a 5850 with the double lifetime warranty; it poo poo the bed, so I'm sending it in. I've heard they replace the cards pretty quickly, but I'm wondering if they're going to replace it with a current card or send me back some old crusty card.
|
# ? Oct 15, 2012 02:35 |
|
I do from a few years ago, recently posted upthread. Unless they've changed, they'll send you a crusty card. If you're lucky, it'll be a 6850 or a 7770 and there won't be much difference, but you may not be lucky.
|
# ? Oct 15, 2012 02:50 |
|
BusinessWallet posted:Does anyone have any experience with XFX RMAs? I have a 5850 that has the double lifetime warranty, it poo poo the bed so I'm sending it in. I've heard they replace the cards pretty quickly, but I'm wondering if they're going to replace it with a current card, or send me back some old crusty card. XFX sent me a 6950 to replace my 5870 - though somehow they managed to find a 1GB 6950 AND it appears to have been a return/exchange. My brother has the 6950 now.
|
# ? Oct 15, 2012 22:19 |
|
I noticed that there are a ton of GTX 670 and GTX 680 4GB cards on the market. At what resolution do you actually need that much memory?
|
# ? Oct 16, 2012 05:27 |
|
Is the NVIDIA Borderlands 2 promotion still going? I am about to pull the trigger on this card but it doesn't mention anything about it: http://www.amazon.com/gp/product/B0...&pf_rd_i=507846 Also, is this a good 670 to get over reference? -edit- I ordered the reference card because it was smaller. I can't find the Borderlands 2 promotion on either Amazon or Newegg, so I think that's over. It's only available if you order from EVGA, and the shipping there is too expensive. I guess I'll beg NVIDIA for a code like I did with Arkham City; it worked then. Animal fucked around with this message at 06:01 on Oct 16, 2012 |
# ? Oct 16, 2012 05:38 |
|
Chuu posted:I noticed that there are a ton of GTX 670 and GTX 680 4GB cards on the market. At what resolution do you actually need that much memory? You can maybe make an argument if you're doing 8x or higher MSAA on Battlefield 3 or maxed-out GTA IV with High/Ultra graphics on a triple 1080p multimonitor setup. Maybe; even then, 2GB can go a surprising way. Obviously, you'd be constrained by other performance factors on any single card before the VRAM really made a difference. RAM also helps with serious GPGPU (OpenCL/CUDA) work, but that's really a workstation/server consideration. So don't pay for 4GB.
|
# ? Oct 16, 2012 07:50 |
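To put rough numbers behind the "maybe" above, here's a back-of-envelope color-buffer estimate. It assumes RGBA8 (4 bytes/pixel) and triple buffering, and deliberately ignores depth buffers, textures, and geometry, which is where most real VRAM usage actually goes:

```python
def framebuffer_mib(width, height, msaa=1, buffers=3):
    """Rough color-buffer footprint in MiB: 4 bytes per pixel (RGBA8),
    multiplied by MSAA sample count and number of buffered frames."""
    return width * height * 4 * msaa * buffers / 2**20

# Single 1080p monitor, no MSAA: ~24 MiB of color buffers.
single = framebuffer_mib(1920, 1080)

# Triple-1080p Surround/Eyefinity at 8x MSAA: ~570 MiB before a
# single texture or shadow map is counted.
surround_8x = framebuffer_mib(5760, 1080, msaa=8)
```

So raw render targets alone don't get near 4GB even in the extreme case; it's the combination with high-resolution assets that does, which is why the multi-monitor-plus-MSAA argument is the only one that holds up.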
|
I would say it's completely contraindicated if you're still on PCI-e 2.0, because the only instance where it's going to be useful would be if it were in SLI at extremely high resolutions/3D/both, and only PCI-e 3.0 has sufficient real bandwidth to use more than one 670/680 (or 7970, take your pick).
|
# ? Oct 16, 2012 18:32 |
|
It would also come in handy for supersampling or if you want to run custom high-res texture packs (such as Skyrim), but that's rather uncommon right now.
|
# ? Oct 16, 2012 18:52 |
|
Alereon posted:It would also come in handy for supersampling or if you want to run custom high-res texture packs (such as Skyrim), but that's rather uncommon right now. That was more salient in the previous generation when a lot of performance cards were 1GB and it was accepted wisdom that 1GB was enough (when truthfully by the end of Fermi there were some games getting into the 1.2GB-1.4GB region at 1080p). The performance jump from the 560Ti to the 660Ti is huge, and even with its weird asymmetric VRAM "issue" the 660Ti turns in great performance in standard SLI, though it eats poo poo scaling past two cards. With superior access to the framebuffer two 670s or two 680s can be expected to do pretty amazingly well even in very texture-heavy games. My prediction is for single-monitor gaming, we won't see a need for more than 2GB of VRAM until the next consoles launch. Going to watch for Metro 2034 to prove me wrong, if anything PC-oriented will.
|
# ? Oct 16, 2012 19:01 |
|
I realise this is going to be a highly engine-dependent thing, but what are some GPU troubleshooting tools and approaches? I'm running 7850s in Crossfire and I've been experiencing randomly warping polygons, seemingly since switching to Catalyst 12.8. Tried switching to 12.9 beta, same deal -- I'll have to try rolling back to an earlier revision, I suppose. This happens to a minor extent in Borderlands 2, but Skyrim's UI is especially bad. Turning off Crossfire helps a lot. Also, what's that HUD that some people seem to be using to display GPU usage in Crossfire? I think it's supposed to be MSI Afterburner, but I could never get it to work. Starting to regret picking Crossfire as a solution to run in 2560x1600.
|
# ? Oct 16, 2012 19:10 |
|
Alright, so I am getting my 670 tomorrow. I have an MSI Twin Frozr II Geforce 560 Ti 448 that has depreciated to the point where it might just be worth keeping around for PhysX. Will it actually provide a benefit, or is the 670 fast enough that I am just better off using it alone and selling/giving the other card away? My power supply is a Seasonic 620, and I will be gaming at 1440p. Either way I will benchmark both scenarios, but if the answer is unanimous then maybe I should not waste my time.
|
# ? Oct 16, 2012 20:21 |
|
Any GF110 based card (that's 560Ti-448, 570, 580) will do PhysX with virtually no resource hit at all. Like, shrug it off, temps at nearly idle levels. If you're on a P67 or Z68 motherboard, you may only have PCI-e 2.0 support for 8x/8x. That does shave some performance off of the 670 in demanding scenarios. If you can run 16x/4x, do that.
|
# ? Oct 16, 2012 20:27 |
|
Agreed posted:Any GF110 based card (that's 560Ti-448, 570, 580) will do PhysX with virtually no resource hit at all. Like, shrug it off, temps at nearly idle levels. I have a Z68 motherboard (Asus Gene-Z). I will try to figure out if 16x/4x is possible. -edit- Could not find a setting in the BIOS or a how-to on Google. Is it actually possible to force a motherboard to 16x/4x? Animal fucked around with this message at 20:49 on Oct 16, 2012 |
# ? Oct 16, 2012 20:36 |
|
Animal posted:I have a Z68 motherboard (Asus Gene-Z). I will try to figure out if 16x/4x is possible. On my asus p8p67 pro, I can force PCIe slot 3 to be a specific speed (I think my choices are auto, 1x, and 4x).
|
# ? Oct 16, 2012 20:55 |
|
Dogen posted:On my asus p8p67 pro, I can force PCIe slot 3 to be a specific speed (I think my choices are auto, 1x, and 4x). The Gene-Z has only two slots (mATX). Where in the BIOS does yours allow the change?
|
# ? Oct 16, 2012 20:56 |
|
You won't have the option. Larger boards get it because the last slot is the PCH slot, and setting it to full x4 bandwidth will disable some peripherals.
|
# ? Oct 16, 2012 22:11 |
|
Factory Factory posted:You won't have the option. Larger boards get it because the last slot is the PCH slot, and setting it to full x4 bandwidth will disable some peripherals. Lame... how large is that 8x performance drop on the 670 at 1440p?
|
# ? Oct 16, 2012 23:07 |
|
Animal posted:Lame... how large is that 8x performance drop on the 670 at 1440p? It's noticeable at 1080p, I reckon you'll feel it at 1440p.
|
# ? Oct 17, 2012 00:34 |
|
Let's put some numbers to it, eh? TechPowerUp on GeForce 680 and Radeon 7970 performance scaling by PCIe bandwidth. PCIe 2.0 x8 can have as much as a 17.5% drop in frames per second, or as little as zero, depending on the card-to-RAM bandwidth needs of the particular game.
|
# ? Oct 17, 2012 02:07 |
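The raw link numbers behind that scaling, sketched as per-direction bandwidth. Line-code overhead is why Gen 2 loses a flat 20% while Gen 3 barely loses 2%:

```python
def pcie_gbps(gen, lanes):
    """Effective per-direction bandwidth in GB/s for a PCIe link.
    Gen 2 signals at 5 GT/s with 8b/10b encoding (80% efficient);
    Gen 3 signals at 8 GT/s with 128b/130b (~98.5% efficient)."""
    per_lane_gbit = {2: 5.0 * 8 / 10, 3: 8.0 * 128 / 130}
    return per_lane_gbit[gen] * lanes / 8  # bits -> bytes

x8_gen2 = pcie_gbps(2, 8)    # 4.0 GB/s
x16_gen3 = pcie_gbps(3, 16)  # ~15.75 GB/s
```

So a Gen 2 x8 slot offers roughly a quarter of the bandwidth of Gen 3 x16, which is consistent with the worst-case drops TechPowerUp measured showing up only in games that stream a lot of data over the bus.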
|
Animal, are you playing Skyrim modded with hi-res texture packs? Because if so 2GB VRAM is going to get tested. I've seen people using in excess of 2GB in Skyrim while playing at 1080p, let alone 1440p. Are you tied into Nvidia for any particular reason? A 7950/7970 with 3GB of VRAM would probably be better for modded Skyrim, or even a 4GB 670. That said, it is just one game, and it's one where ultimately the VRAM usage is in your hands. If you're hitting the wall you can always install lower res textures. It's just something to bear in mind, I guess.
|
# ? Oct 17, 2012 11:22 |
|
How can a game like Skyrim be 32bit and have in aggregate more than 4GB of RAM allocated between Video RAM and System RAM? This has been bugging me for a while now...
|
# ? Oct 17, 2012 13:16 |
|
Goon Matchmaker posted:How can a game like Skyrim be 32bit and have in aggregate more than 4GB of RAM allocated between Video RAM and System RAM? This has been bugging me for a while now... I'm pretty certain VRAM uses a completely separate addressing space. Direct3D resources work in terms of "buffers" instead of pointers, and said buffers wrap up all the memory mapping information for GPU resources. I've mostly done GPU work on consoles, though, so I'm not 100% positive how it works for PC with non-unified memory.
|
# ? Oct 17, 2012 15:30 |
|
It's two separate addressing spaces. E: And the video card's memory controller might have a separate addressing scheme, so the CPU literally doesn't know/care other than feeding/fetching data. I know Maxwell will almost for sure, but I'm not 100% sure on current tech.
|
# ? Oct 17, 2012 16:46 |
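A quick sketch of why the numbers in the question work out. The 2GiB figure is the classic 32-bit Windows user/kernel split; large-address-aware executables can be granted more, which is an assumption about the specific game's build, not something stated above:

```python
# A 32-bit process can address at most 2**32 bytes of its own
# virtual memory; 32-bit-era Windows defaults to splitting that
# 2GiB user / 2GiB kernel.
total_gib = 2**32 / 2**30         # 4.0 GiB of virtual addresses
default_user_gib = 2**31 / 2**30  # 2.0 GiB usable by the process

# VRAM lives in the GPU's own address space and is NOT mapped into
# the process wholesale: the driver maps small staging windows only
# while the CPU is reading or writing a resource. That is how a
# 32-bit game can drive 2GB+ of VRAM while its own allocations stay
# inside the 4GiB virtual-address limit.
```

This matches the buffer-handle point above: the game holds opaque Direct3D resource handles, not raw pointers into VRAM, so GPU memory never has to fit inside the process's address space.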
|
Just picked up the Gigabyte 670 OC from Amazon today for 339 (there was a $20 coupon you "clip"). I am hoping I can justify my purchase by seeing quite a bit of an upgrade from my 6870. I could still run most games (like Battlefield) at medium or high with some options off. Just plan on using my 6870 in a yet-to-be-built HTPC down the road.
|
# ? Oct 17, 2012 17:18 |
|
norg posted:Animal, are you playing Skyrim modded with hi-res texture packs? Because if so 2GB VRAM is going to get tested. I've seen people using in excess of 2GB in Skyrim while playing at 1080p, let alone 1440p. Too late, I have the 670 here. I can live without super high res textures for Skyrim if it just benefits that one game. I am brand agnostic, I went NVIDIA because it seems to perform better on the games I like, and I am one of those dorks who thinks PhysX is kinda cool
|
# ? Oct 17, 2012 17:24 |
|
Luckily there are enough Skyrim texture packs out there that you can find a middle ground between "black and white" and "my computer melted"
|
# ? Oct 17, 2012 18:09 |
|
I just installed my EVGA 670 (reference) and am getting lower benchmark results than Anandtech did here: http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/6 They average 34fps at 1600p on very high, and I average 22fps at 1440p.
|
# ? Oct 17, 2012 18:27 |
|
Animal posted:I just installed my EVGA 670 (reference) and am getting lower benchmark results than Anandtech did hre: I have a hard time believing that you have an i7-3960X @ 4.3GHz and 8GB of Ripjaws RAM at 1867.
|
# ? Oct 17, 2012 18:52 |
|
Rigged Death Trap posted:I have a hard time believing that you have an i7-3960X @ 4.3GHz and 8GB of Ripjaws RAM at 1867. No, but I have an i7 2600K overclocked to 4.5GHz and 8GB of RAM at 1600, all of which should make about 0.1% difference. I tried without DoF enabled, much better results; I think they disabled it and didn't mention it.
|
# ? Oct 17, 2012 18:56 |
|
High quality, 16x anisotropic filtering, and no MSAA? Because the hit from MSAA is generally about 1/3 and would account for the difference.
|
# ? Oct 17, 2012 19:06 |
|
Factory Factory posted:High quality, 16x anisotropic filtering, and no MSAA? Because the hit from MSAA is generally about 1/3 and would account for the difference. I have it set exactly like them: High Quality, AAA, 16x AF. I turned off DoF and that seems to put the results in parity, higher scores than theirs, which is about right considering they are at 1600p and I am at 1440p. I did some overclocking, brought the Turbo Boost clock to 1065MHz and the memory to 6.9GHz effective. So far it flies and is stable. Power target 120%.
|
# ? Oct 17, 2012 19:48 |
|
Jan posted:I'm pretty certain VRAM uses a completely separate addressing space. Direct3D resources work in terms of "buffers" instead of pointers, and said buffers wrap up all the memory mapping information for GPU resources. Animal posted:I have it set exactly like them. High Quality, AAA, 16AF. I turned off DoF and that seems to put the results in parity, higher scores than them which is about right considering they are 1600p and I am 1440p.
|
# ? Oct 17, 2012 19:58 |