Animal
Apr 8, 2003

I have one of the Korean 1440p monitors and an MSI 560 Ti-448 Twin Frozr III. It's doing an admirable job driving this resolution while overclocked, but feels the strain in certain games.

What would be best: sell it while I can still get a good price for it and buy a 670, or wait until the price drops, get another one, and SLI them?

Animal
Apr 8, 2003

Biggest human being Ever posted:

Yes, but if you use two cards with different clock speeds or amounts of memory, the faster card will work like the slower one: the extra memory will remain unused and clock speeds will be lowered to match the slower card.

They need to have the same amount of memory too. At least this is what an EVGA rep once told me.

Animal
Apr 8, 2003

Will it be able to max out Metro 2033 with PhysX at 1600p? :downs:

Animal
Apr 8, 2003

I guess I'm one of the lucky ones who is not jaded on graphics. To me those screenshots look awesome. Of course, nothing will beat that day when I fired up Tomb Raider and Quake on my 3Dfx Voodoo.

Animal
Apr 8, 2003

I just Googled that Euphoria animation engine. Saw a video and thought it was some amazing new tech... then saw the date it was posted: 2007.

:negative:

Animal
Apr 8, 2003

This guy at hardforums.com compared an overclocked 670 vs an overclocked 680. Both cards are the same brand and cooler. It pretty much cements the conclusion that the 680 is a waste of $100.

http://hardforum.com/showthread.php?t=1694021

Animal
Apr 8, 2003

I'm not giving my money to scalpers. My 560 Ti 448 is being a trooper at 1440p; the only game it can't play smoothly is Metro. I'm just playing the less demanding games in my backlog, and I have plenty.

Animal
Apr 8, 2003

EvilCoolAidMan posted:

Quick question. Can I run BF3 at ultra on a 27" monitor with a GTX670, or do I need to step up to a GTX680?

You did not list a resolution, but yes, the 670 will run it fine on any 27" regardless of resolution.

Animal
Apr 8, 2003

To me it's well worth it. The tearing is minimal and a fair trade for the gain in fps stability.

Animal
Apr 8, 2003

LittleBob posted:

So if you guys weren't satisfied with the 680's performance because you're some sort of sperglord that has to game on a 27" monitor, would you opt for a second 680 and a PSU upgrade for SLI, or just suck up the cost and get a 690?

The 690 is a badass card and the first multi-GPU card I would consider. That said, a 680 really should be enough unless you need to max out Metro 2033.

poo poo, my 560 Ti 448 runs 1440p like a champ in all but Metro.

Animal
Apr 8, 2003

HalloKitty posted:

I doubt you've played Skyrim, because especially with the high res texture pack, that will chew through your 1GB VRAM and leave it begging for mercy

Edit: unless there's a 2GB version

More than 300 hours of Skyrim. Runs like a charm, mostly locked at 60fps with some dips to 45ish in certain spots. No stuttering at all.

Animal
Apr 8, 2003

unpronounceable posted:

You don't wait: You buy a GPU, and might miss out on a better one, but you can make use of it a week sooner.
You wait: You deal with whatever you have right now, and make an informed decision in a week or so.

From the benchmarks we've seen, it looks like it'll be a pretty great card. Whether it'll be worth buying or not depends on how much you'll be able to snag it for.

Or if you are able to snag it at all, considering past shortages.

Animal
Apr 8, 2003

Dogen posted:

More than enough.

Is a Seasonic 620w enough for an overclocked i7 2600k, an overclocked Geforce 670, and a 560 Ti 448 for PhysX?

Animal
Apr 8, 2003

Dogen posted:

If you were running both cards flat out I think it would be borderline, but I don't think (unsure on this) that a physx load is going to ever get close to pushing the 560 to the max, so it should be fine.

Maybe Agreed could chime in with what kind of load just being a physx processor puts on a GPU?

My 448 is not cutting it for 1440p (drat you Koreans!) and I am thinking of relegating it to PhysX duties, thanks to Agreed's jizzing all over the thread about his setup.
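
For a rough sanity check on the 620W question, here is a back-of-the-envelope sum in Python. The wattage figures are my own ballpark assumptions (roughly TDP for the GPUs, a generous guess for the overclocked 2600K and the rest of the system), not measurements.

code:
# Rough PSU headroom estimate; every wattage here is a ballpark assumption, not a measurement.
loads_watts = {
    "i7 2600K @ 4.5GHz (overclocked)": 150,  # stock TDP is 95W, the overclock adds a fair bit
    "GTX 670 (overclocked)": 200,            # ~170W TDP plus some overclocking headroom
    "560 Ti 448 on PhysX duty": 80,          # far below its ~210W TDP when it only runs PhysX
    "Motherboard, RAM, drives, fans": 60,
}

total = sum(loads_watts.values())
psu = 620
print(f"Estimated draw: {total}W of {psu}W ({total / psu:.0%} of the PSU)")
# Roughly 490W of 620W, so there is comfortable headroom as long as the
# 560 Ti is only doing PhysX and not rendering flat out.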

Animal
Apr 8, 2003

The motherboard is an ASUS GENE-Z. Case is a Fractal Define Mini. The 448 is an MSI Twin Frozr Lightning, so I could get a nice price for it and put that toward the 670... but I do love Arkham City, and I will play Borderlands 2 since it's bundled with the 670. Also I like the Metro games... decisions.

Animal
Apr 8, 2003

MixMasterMalaria posted:

Wait, so to get the sweet particle effects in borderlands I'd need the 660 and another nvidia card or can you just run it off of one?

You can run it off one. It's just much faster to offload it to an extra card.

Animal
Apr 8, 2003

I am gonna get a 670 because my 560 Ti 448 is not enough for 1440p. Resale value for the 560 Ti seems bad, so I might as well use it for PhysX.

Animal
Apr 8, 2003

I am looking at GeForce 670s on Amazon. There is a $30 difference between the regular EVGA 670 and the FTW model, and a bigger jump to the Superclocked. Does the factory overclock make a big difference on these cards?

1440p, Skyrim, BF3, etc.

Animal
Apr 8, 2003

Alereon posted:

Don't get the FTW model, it has stacked power connectors that will break compatibility with many aftermarket coolers. Factory-overclocked Kepler cards have also been problematic due to the difficulty of testing for stability with boost and TDP cap operating. The only reason to get a non-stock card is if you're buying it for the cooler or improved power delivery components for high overclocking.

To make sure you're looking at the best prices, Newegg has the EVGA Geforce GTX 670 for $379.99-$20 MIR=$359.99 with free shipping.

I would like to overclock as high as I can. I love the cooler on my MSI Twin Frozr 560 Ti 448, but I read that MSI is being shady with voltages now.

Animal
Apr 8, 2003

Is the NVIDIA Borderlands 2 promotion still going? I am about to pull the trigger on this card, but it doesn't mention anything about it:

http://www.amazon.com/gp/product/B0...&pf_rd_i=507846

Also is this a good 670 to get over reference?

-edit-
I ordered the reference card because it was smaller. I can't find the Borderlands 2 promotion on either Amazon or Newegg, so I think that's over :(

It's only available if you order from EVGA, and the shipping there is too expensive.
I guess I'll beg NVIDIA for a code like I did with Arkham City; it worked then :downs:

Animal fucked around with this message at 06:01 on Oct 16, 2012

Animal
Apr 8, 2003

Alright, so I am getting my 670 tomorrow. I have an MSI Twin Frozr II GeForce 560 Ti 448 that has depreciated to the point where it might just be worth keeping around for PhysX.

Will it actually provide a benefit, or is the 670 fast enough that I am just better off using it alone and selling/giving the other card away? My power supply is a Seasonic 620, and I will be gaming at 1440p.

Either way I will benchmark both scenarios, but if the answer is unanimous then maybe I should not waste my time.

Animal
Apr 8, 2003

Agreed posted:

Any GF110 based card (that's 560Ti-448, 570, 580) will do PhysX with virtually no resource hit at all. Like, shrug it off, temps at nearly idle levels.

If you're on a P67 or Z68 motherboard, you may only have PCI-e 2.0 support for 8x/8x. That does shave some performance off of the 670 in demanding scenarios. If you can run 16x/4x, do that.

I have a Z68 motherboard (Asus Gene-Z). I will try to figure out if 16x/4x is possible.

-edit-
Could not find a setting in the BIOS or a how-to on Google. Is it actually possible to force a motherboard to 16x/4x?

Animal fucked around with this message at 20:49 on Oct 16, 2012
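
As a rough aside on the 8x/8x point: PCIe 2.0 moves about 500 MB/s of usable data per lane, per direction, so the slot widths are easy to tally. A minimal sketch, assuming only that standard per-lane figure:

code:
# PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, i.e. ~500 MB/s usable per lane, per direction.
PER_LANE_MBPS = 500

for lanes in (16, 8, 4):
    print(f"PCIe 2.0 x{lanes}: {lanes * PER_LANE_MBPS / 1000:.1f} GB/s per direction")
# x16 -> 8.0 GB/s, x8 -> 4.0 GB/s, x4 -> 2.0 GB/s
# So 8x/8x halves the bandwidth to each card compared to a single card at x16,
# and an x4 slot for the PhysX card gets a quarter of it, which is usually
# plenty since PhysX traffic is light.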

Animal
Apr 8, 2003

Dogen posted:

On my asus p8p67 pro, I can force PCIe slot 3 to be a specific speed (I think my choices are auto, 1x, and 4x).

The Gene-Z has only two slots (mATX). Where in the BIOS does yours allow the change?

Animal
Apr 8, 2003

Factory Factory posted:

You won't have the option. Larger boards get it because the last slot is the PCH slot, and setting it to full x4 bandwidth will disable some peripherals.

Lame... how large is that 8x performance drop on the 670 at 1440p?

Animal
Apr 8, 2003

norg posted:

Animal, are you playing Skyrim modded with hi-res texture packs? Because if so 2GB VRAM is going to get tested. I've seen people using in excess of 2GB in Skyrim while playing at 1080p, let alone 1440p.

Are you tied into Nvidia for any particular reason? A 7950/7970 with 3GB of VRAM would probably be better for modded Skyrim, or even a 4GB 670.

That said, it is just one game, and it's one where ultimately the VRAM usage is in your hands. If you're hitting the wall you can always install lower res textures. It's just something to bear in mind, I guess.

Too late, I have the 670 here. I can live without super high res textures in Skyrim if only that one game benefits from the extra VRAM.

I am brand agnostic; I went NVIDIA because it seems to perform better on the games I like, and I am one of those dorks who thinks PhysX is kinda cool :)

Animal
Apr 8, 2003

I just installed my EVGA 670 (reference) and am getting lower benchmark results than Anandtech did here:

http://www.anandtech.com/show/5818/nvidia-geforce-gtx-670-review-feat-evga/6

They average 34fps at 1600p on very high, and I average 22fps at 1440p.

Animal
Apr 8, 2003

Rigged Death Trap posted:

I have a hard time believing that you have an i7-3960X @ 4.3GHz and 8GB of Ripjaws RAM at 1867.

No, but I have an i7 2600K overclocked to 4.5GHz and 8GB of RAM at 1600, all of which should make about a 0.1% difference.

I tried without DoF enabled and got much better results; I think they disabled it and didn't mention it.

Animal
Apr 8, 2003

Factory Factory posted:

High quality, 16x anisotropic filtering, and no MSAA? Because the hit from MSAA is generally about 1/3 and would account for the difference.

I have it set exactly like them: High Quality, AAA, 16x AF. I turned off DoF and that seems to put the results in parity, with slightly higher scores than theirs, which is about right considering they are at 1600p and I am at 1440p.

I did some overclocking: brought the Turbo Boost clock to 1065MHz and the memory to 6.9GHz. So far it flies and is stable. Power target is at 120%.

Animal
Apr 8, 2003

Alereon posted:

Do make sure you have it running in DX11 mode; accidentally falling back to an older DX mode will significantly impair performance. Another Metro 2033 benchmarking pro-tip: there's significant run-to-run variability, so you need to do multiple runs and average them. Also make sure you have your fan speed turned high enough to get meaningful results; if the video card breaks 69C it throttles, so you have to throw out the results for that run.

Yes, it's on DX11; everything is set just like Anandtech. Disabling DoF did the trick.

Thanks for the fan tip, I will monitor that.

Animal fucked around with this message at 20:18 on Oct 17, 2012
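
Following Alereon's advice, the way to keep the Metro runs honest is to average several of them and throw out any run where the card crossed the throttle temperature. A minimal sketch with made-up run data (the numbers and the simple dict format are purely illustrative, not output from any real tool):

code:
# Average benchmark runs, discarding any run that broke the 69C throttle point.
# The run data below is invented for illustration.
THROTTLE_C = 69

runs = [
    {"avg_fps": 41.2, "max_temp_c": 66},
    {"avg_fps": 43.0, "max_temp_c": 68},
    {"avg_fps": 36.5, "max_temp_c": 72},  # throttled, throw it out
    {"avg_fps": 42.4, "max_temp_c": 67},
]

valid = [r["avg_fps"] for r in runs if r["max_temp_c"] <= THROTTLE_C]
print(f"Kept {len(valid)} of {len(runs)} runs, average {sum(valid) / len(valid):.1f} fps")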

Animal
Apr 8, 2003

Agreed posted:

Metro 2033 does so much right in terms of their engine and scalability that their crap implementation of ADoF is kind of baffling. The underlying engine similarities to GSC's (rest in peace :qq:) X-Ray rendering engine are profound and yet S.T.A.L.K.E.R. CoP, even with extraordinary texture mods and added visual goodies from the fanbase, runs like greased lightning on a 670/680. View distance: forever, all the bells and whistles, and even some supersampling forced through the CC, runs like a champ.

In Metro you're in these super enclosed environments most of the time, and its turn-everything-up settings have a ton of root similarities to how the shaders and particles and poo poo work in CoP, yet the ADoF is a totally disproportionate FPS hog.

You must be right. I could not believe DoF was the cause; why should it have such a big impact?

Animal
Apr 8, 2003

Majumbo posted:

Do you normally see a POST screen with that monitor? With my 2560x1440 I won't get a POST screen and won't get anything until the OS loads.

And then I know some newer motherboards (like my ASUS motherboards) require you to enter the BIOS when there is a hardware change. Whenever I change hardware I have to plug in a smaller monitor and follow the stupid instructions to enter the BIOS just to get the system to boot that first time after the change.

I don't think not getting a POST screen has much to do with the monitor. More likely it's about which video card is set as primary in the BIOS.

Animal
Apr 8, 2003

Squibbles posted:

The cheap Korean IPSs don't do image scaling, so if the video card isn't doing the scaling to native resolution then the monitor won't show anything. My NVIDIA 570 does do scaling, but as the poster above said, not all cards do. Perhaps it's a problem with the 670 not scaling?

I have a 670 and a Korean IPS. No issues showing POST. No issues ever.

Animal
Apr 8, 2003

Rigged Death Trap posted:

You do realize that the Tech Demos are real time, don't you?

The ones he posted are, too. I think his hidden point is that they are more impressive because they are 64k tech demos (that's the size of the whole file: impressive, but still graphically inferior).

Animal
Apr 8, 2003

I am playing around with Adaptive VSync on my 670 (1440p). How do you guys go about it? I am thinking of forcing Adaptive in the NVIDIA CP and choosing VSync off in the games, so as not to cause any kind of conflict.

Animal
Apr 8, 2003

To be fair, that's four GPUs vs three; still an impressive showing.

Animal
Apr 8, 2003

slidebite posted:

What's Zotac like as a manufacturer for the 6xx series? I'd never heard of them until recently.

They are decent and gaining a good reputation.

Animal
Apr 8, 2003

BeanBandit posted:

For those having problems with Tomb Raider on Nvidia cards, the latest beta drivers dramatically increased performance for me:

code:
driver   314.07   314.21 (beta)

Tomb Raider - Ultra settings (fps)
min       18.20    44.05
max       33.95    61.30
avg       31.20    52.70

Tomb Raider - Ultimate settings (fps)
min        8.35    24.85
max       26.20    42.80
avg       22.05    33.85

What is your GPU? That is an impressive gain.
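
Out of curiosity, here is a quick way to turn that table into percentage gains between the two drivers; the fps figures are copied straight from the post above.

code:
# Gains from 314.07 to the 314.21 beta; numbers are taken from the table above.
results = {
    "Ultra":    {"min": (18.20, 44.05), "max": (33.95, 61.30), "avg": (31.20, 52.70)},
    "Ultimate": {"min": (8.35, 24.85),  "max": (26.20, 42.80), "avg": (22.05, 33.85)},
}

for preset, stats in results.items():
    for stat, (old, new) in stats.items():
        print(f"{preset:9} {stat}: {old:5.2f} -> {new:5.2f} fps (+{(new - old) / old:.0%})")
# The minimum framerates more than double in both presets, which is exactly
# where stutter lives, so the subjective improvement makes sense.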

Animal
Apr 8, 2003

It seems Bioshock Infinite is using all 2GB on my 670 at 1440p and causing some stuttering.

Animal
Apr 8, 2003

Factory Factory posted:

Texture resolution and MSAA. FXAA is fine.

At 1440p, even lowering texture resolution and using FXAA is not enough. The game is a video RAM hog, either by design or by flaw, so there is significant stuttering ruining an otherwise perfect framerate. Hopefully with new drivers and patches it will get smoothed out. I am inclined to believe it's a flaw, because one benchmark run had no stuttering with a minimum of 47fps, while the next run, with the exact same settings, was stuttering down to 10fps. Makes no sense.

As for Tomb Raider with the new drivers, it runs perfectly at 1440p on a GeForce 670 with only TressFX off and FXAA instead of MSAA (I always choose FXAA), all other settings maxed out. Before the drivers, some guys on Ars Technica were buying SLI Titans to run it at 1080p. I poo poo you not.

Animal fucked around with this message at 15:55 on Mar 27, 2013

Animal
Apr 8, 2003

That's up to Final Fantasy: The Spirits Within CGI standards. I'll believe it when I see it running on my PC, at that framerate, on my 670.
