Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

tijag posted:

I'm pretty sure the game still needs to 'support' it, right?

It's not like you can force TXAA, can you?

Yeah you're right, it just works on some mmorpger that came out today :(

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
nVidia's Timothy Lottes has a new blog post about TXAA here.

Boten Anna
Feb 22, 2010

Dogen posted:

Yeah you're right, it just works on some mmorpger that came out today :(

...and that game is The Secret World, and it owns :colbert:

g0del
Jan 9, 2001



Fun Shoe

El Grillo posted:

So this is intensely annoying - Catalyst Control Centre won't open. I click, get an hourglass cursor for a sec, then nothing.
Seems like a fairly common problem, but I haven't found any working solution. Any advice??
I've had some success following these steps and also using a driver cleaner program to make sure everything got removed.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.
Or try the atiman uninstaller. It's specifically made to be a complete driver cleaner/uninstaller for ATI/AMD drivers, and it is (anecdotally) idiot-proof.

http://www.mediafire.com/?0jdko53gk5npzo0

Carecat
Apr 27, 2004

Buglord
Well, I got about 10°C off my temperatures by cleaning fluff out of the heatsinks with a Q-tip. Thanks, cat! It didn't look that bad and there wasn't a lot of it, but it made a difference.

General_Failure
Apr 17, 2005
Got a new mobo/CPU (well, APU)/RAM combo coming hopefully this week if Aussie Post pulls their finger out. I just wanted to ask if there is any reason whatsoever I shouldn't just disable the on-chip HD6550D?

The card in my current system is an nVidia 8800GTX. Lol old, shutup. It's a nice, older, power-hungry card which meets my needs, including CUDA. That was one of the reasons I got it (secondhand, cheap), but cascading hardware failures since then have prevented me from having a play with its lovely processing abilities.

According to Passmark the nVidia blows the poo poo out of the ATI, but I have concerns. Mostly I couldn't help noticing that they have benchmark results for VGA going something like 5-10 times faster than things like a TNT2, which says to me that their GPU benchmarks aren't differentiating between software and hardware. That puts all the results into question for me, because of the varying specs of the machines used to do the benchmarks.

I don't think this is a card picking question because I already have the hardware.
Also don't suppose anyone has another 8800gtx kicking around unloved? I'd love to play with SLI bridging.

Also ^^^^ Fluff makes a massive difference. I live in a super dusty place and need to clean the internals every couple of months. In summer it can be the difference between running happily and thermal shutoff in 5 minutes of heavy load.

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib
You know how AMD introduced the 7970 GHz edition to compete with the 680? Well one performance notch down, they're apparently doing the same with a 7950 GHz edition to compete with the 670.

jisforjosh
Jun 6, 2006

"It's J is for...you know what? Fuck it, jizz it is"
I'm looking to purchase a 670 in the coming weeks and to sell the eVGA 560 GTX Superclocked that is currently in my system. Can anyone suggest a price range I should try to sell it for? I think they're going for about $190 new on Newegg, and I was contemplating somewhere in the low to mid 100s.

El Grillo
Jan 3, 2008
Fun Shoe
Been looking around the forums but no help. Anyone know of decent monitoring/fan control software for an 8800GTX these days? Used to use RivaTuner but that can't handle new drivers, it seems.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Try HWiNFO64 or GPU-Z; FurMark should also have some reporting when you run its benchmark.
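
If you'd rather log readings from a script than watch a GUI, something like the sketch below also works. This is a minimal example, assuming the nvidia-smi utility that ships with NVIDIA's drivers is on the PATH and that the card/driver actually supports its query interface (older cards may just report "N/A"):

```python
# Minimal sketch: print the GPU temperature once a second using nvidia-smi.
# Assumes nvidia-smi is on the PATH and the card/driver supports the query
# interface; older cards may report "N/A" instead of a number.
import subprocess
import time

def gpu_temperature():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
    )
    return out.decode().strip()

for _ in range(60):                     # log for about a minute
    print("GPU temperature:", gpu_temperature(), "C")
    time.sleep(1)
```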

Fatal
Jul 29, 2004

I'm gunna kill you BITCH!!!
Got a question about many-monitor setups. Right now I've got my 7970 driving 4 monitors, but that just isn't enough. I've got an old 4670 that I've tried to install in my 2nd x16 slot, but with no luck; it seems the AMD Catalyst drivers just disable this 2nd card (I'm assuming it's too old to work in conjunction with the 7970). I really just need some sort of confirmation that buying a closer-to-current-generation AMD card should let me do what I want. I'm thinking something like a 6450 like this guy: http://www.newegg.com/Product/Product.aspx?Item=N82E16814102933

I'm not planning on gaming on the final 2 monitors in this setup but hardware acceleration would be nice for video playback. Any thoughts?

Rosoboronexport
Jun 14, 2006

Get in the bath, baby!
Ramrod XTreme

El Grillo posted:

Been looking around the forums but no help. Anyone know of decent monitoring/fan control software for an 8800GTX these days? Used to use RivaTuner but that can't handle new drivers, it seems.

http://event.msi.com/vga/afterburner/download.htm MSI Afterburner is a modernized RivaTuner. I use it for my 9800 GT.

Ervin K
Nov 4, 2010

by Jeffrey of YOSPOS
I have a question on the relationship between the CPU and GPU. Is there some kind of a performance ratio that we have to maintain between the video card and the processor? What would happen if you paired an old Core 2 Duo processor with a GTX 690?

The reason I'm asking is because I'm planning on building a PC with a 3570k (that I plan on overclocking) and a factory overclocked GTX 670, and in the future possibly add a second 670. I want to know if this will cause some kind of bottle-necking to occur or something like that.

Edit: I will be gaming and doing 3d stuff on this computer

Also, completely separate question. On my current PC I have an HD 5770, and for some reason every time my computer is restarted Catalyst tells me that my current driver is horribly outdated (v 11.something) and that I should download the latest 12.something. This is despite the fact that I have downloaded and installed the latest version about half a dozen times, and every time it tells me the same loving thing. WTF is going on?

Ervin K fucked around with this message at 20:26 on Jul 9, 2012

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

Ervin K posted:

I have a question on the relationship between the CPU and GPU. Is there some kind of a performance ratio that we have to maintain between the video card and the processor? What would happen if you paired an old Core 2 Duo processor with a GTX 690?

The reason I'm asking is because I'm planning on building a PC with a 3570k (that I plan on overclocking) and a factory overclocked GTX 670, and in the future possibly add a second 670. I want to know if this will cause some kind of bottle-necking to occur or something like that.

Edit: I will be gaming and doing 3d stuff on this computer

From what I understand, you have to have a pretty huge gap between GPU and CPU to have one bottleneck the other. I don't think you're gonna run into this problem with your specific setup, since both of those components were released at pretty much the same time and are similar in performance (both are enthusiast products).

Gwaihir
Dec 8, 2009
Hair Elf
Practically speaking, the only way to make the CPU a bottleneck for nearly any game is to run at super low resolution (1024x768, maybe) with no details. There are a few games that are more CPU-bound than others (Blizzard games, Civ games, stuff like that), but in general the GPU is always going to be the limiting factor at modern resolutions. You'd have to do something like get a 2+ generation old CPU for it to really choke your performance.
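
To put a rough model on that intuition: each frame the CPU prepares the work and the GPU renders it, so the achievable frame rate is capped by whichever side takes longer. A toy sketch (the per-frame millisecond figures are made up purely for illustration, not measurements of any real hardware):

```python
# Toy model of CPU/GPU bottlenecking: frame rate is roughly limited by the
# slower of the two per-frame costs. All numbers below are illustrative.
def effective_fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(effective_fps(cpu_ms=6.0, gpu_ms=12.0))   # ~83 fps, GPU-bound at a high resolution
print(effective_fps(cpu_ms=6.0, gpu_ms=3.0))    # ~167 fps, CPU-bound at a very low resolution
print(effective_fps(cpu_ms=20.0, gpu_ms=5.0))   # 50 fps, an old CPU starving a fast GPU
```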

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Ervin K posted:

I have a question on the relationship between the CPU and GPU. Is there some kind of a performance ratio that we have to maintain between the video card and the processor? What would happen if you paired an old Core 2 Duo processor with a GTX 690?

The reason I'm asking is because I'm planning on building a PC with a 3570k (that I plan on overclocking) and a factory overclocked GTX 670, and in the future possibly add a second 670. I want to know if this will cause some kind of bottle-necking to occur or something like that.

Edit: I will be gaming and doing 3d stuff on this computer

Also, completely separate question. On my current PC I have an HD 5770, and for some reason every time my computer is restarted Catalyst tells me that my current driver is horribly outdated (v 11.something) and that I should download the latest 12.something. This is despite the fact that I have downloaded and installed the latest version about half a dozen times, and every time it tells me the same loving thing. WTF is going on?

Except for exceptionally CPU-heavy games like StarCraft 2, Civ5, BF3 (more about core count than clock speed), SWTOR (ditto), or the ARMA series, most CPUs are good enough for most games regardless of GPU. The one exception is when doing SLI or CrossFire setups. SLI/CF require additional CPU power to coordinate compared to a single-card setup, so a good-enough CPU might become not good enough when you compare a single card to an SLI/CF pair. That said, an i5-3570K at stock clocks has plenty of power to handle a dual-card setup; triple or quad SLI/CF is when you would be looking at overclocking or an i7. Not that the performance hit would be huge, but if you're spending $1,500 or so on graphics cards, you can afford an extra $100 and/or some tweaking time to get the most out of it.

As for the driver thing, :iiam:. If it really bothers you, uninstall all the drivers, murderize them with Driver Sweeper, and then install a fresh batch of 12.7 Beta.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Not super happy with the most recent beta drivers; they basically broke the framerate limiter function. Previously it was great at reducing input lag by capping frames to 58, with either forward or deferred rendering methods both working perfectly well to fill out the vsync for no screen tearing (I am not using Adaptive vsync, too much tearing... but that's mainly due to really aggressive settings fuckery to get high image quality; if my GPU isn't being utilized I try to take steps to fix that).

The current version may be closer to the intended behavior of framerate targets and Kepler hardware - it certainly controls power and voltage much, much more aggressively than the previous versions, which used a framerate limiting method that has been in nVidiaInspector for some time and didn't seem to have much if anything to do with how the hardware and software controlled the various clock and power states. So it could be that they're fine-tuning it, and the eventual result will be both as-required performance and lower power usage. But in the meantime there's no real way to get the old version of framerate limiting, and I don't like seeing my card's core and SMX clocks dip down into the 600s when I'm playing a game that would otherwise be utilizing the full power of the card, since I've forced either aggressive CSAA (if it's not a deferred rendering engine) or the heavier-handed but workable "2xSSAA+2xMSAA w/ 2xSGSSAA transparency" go-to that plays nice with deferred rendering engines.

The aggressive downclocking results in a very not-smooth gameplay experience. I've had to disable the "framerate targets" manually and can't use the previous framerate limiter, even in games where vsync is poorly implemented, or whatever. Not ideal; hope they keep adjusting it so that it works better. This seems to be a very problematic intermediary step rather than a working technology.
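
For what it's worth, the core idea of a frame limiter is simple: if a frame finishes ahead of the target frame time, wait out the remainder before starting the next one. The sketch below is a toy illustration of the concept only; it is not how the driver-level limiter or nVidiaInspector actually implement it, and it ignores the vsync and power-state interactions discussed above:

```python
# Conceptual frame-rate limiter: sleep out whatever is left of the frame
# budget after rendering. Illustration only; real limiters live in the
# driver and interact with vsync and power management.
import time

TARGET_FPS = 58                  # cap just under a 60 Hz refresh, as in the post above
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.005)            # stand-in for real game/render work

for _ in range(600):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```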

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
Is there a way to check if my video card is broken? I bought a GTX 670 yesterday and it crashes when, I guess, the load gets too heavy, namely when I try playing Crysis at max settings and also when running the Unigine Heaven benchmark at max settings. The screen goes dark and all my fans start spinning at max RPM; I have to restart the computer, and then I get a message that I got a BSOD with BCCode 116. It seems to happen once the card's temperature is around 70°C. That temperature is not too high, I think, but I don't know what else to look out for.

The card is a GTX 670 OC by Gigabyte.

Tunga
May 7, 2004

Grimey Drawer

Biggest human being Ever posted:

Is there a way to check if my video card is broken? I bought a GTX 670 yesterday and it crashes when, I guess, the load gets too heavy, namely when I try playing Crysis at max settings and also when running the Unigine Heaven benchmark at max settings. The screen goes dark and all my fans start spinning at max RPM; I have to restart the computer, and then I get a message that I got a BSOD with BCCode 116. It seems to happen once the card's temperature is around 70°C. That temperature is not too high, I think, but I don't know what else to look out for.

The card is a GTX 670 OC by Gigabyte.

A lot of the factory OCed 670s are very close to the edge of what they can do. Try setting the Power Target to 117% or whatever the maximum is that Afterburner will let you set it to. Otherwise you can try underclocking it and see if it stabilises. Obviously you should return it if it needs a big underclock to be stable.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
You should return it if it requires any underclock to be stable! This isn't a cheap, crappy piece of hardware, and it should work well.

Definitely sounds like there are some testing and validation issues with the overclocked 670s, which is a shame, because they fall in a sweet spot.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
Yeah, I'm gonna return it I think; even if I get it stable by underclocking it, I don't think that's really an acceptable solution.

One weird thing I noticed, checking the card with HWiNFO64: the "GPU Geometry Clock" speed, which I assume is the core clock, goes up to 1175.8 MHz under load, but on the Gigabyte website it says the card boosts to 1058 MHz. I'm wondering if that could be the reason for my crashes or if that's just a wrong reading from HWiNFO.

Gonna do some more benchmarks now that I've underclocked it to the 1058 MHz from the Gigabyte website.

Belan
May 7, 2007
You shouldn't have to underclock it; the extra boost you see is called "Kepler boost" and gets added on top of the normal boost clock.

The amount of Kepler boost changes from card to card, which is probably why a lot of overclocked cards seem to be crashy.

More here http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide

6XX overclocking got very silly with all the boosts and temperature throttle points.
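
To put rough numbers on the situation above, using the clocks reported in this thread (the ~13 MHz boost step size is the commonly cited figure for the 670/680 and is an assumption here, used only for illustration):

```python
# Rough arithmetic on GPU Boost for the card discussed above. The card
# advertises a 1058 MHz boost clock but HWiNFO64 saw 1175.8 MHz under load;
# the difference is the per-card "Kepler boost" added in small fixed steps.
# BIN_MHZ is an assumed step size, not something from the spec sheet.
RATED_BOOST_MHZ = 1058
OBSERVED_MHZ = 1175.8
BIN_MHZ = 13

extra = OBSERVED_MHZ - RATED_BOOST_MHZ
print(f"{extra:.1f} MHz above the rated boost, roughly {round(extra / BIN_MHZ)} bins of {BIN_MHZ} MHz")
# ~117.8 MHz, about 9 bins -- so running well past 1058 MHz is expected
# behaviour for this card, not a misreading.
```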

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Buying a factory OC'd card seems like it's been a bad idea for a while, unless you're getting a full-on custom PCB one (MSI Lightning, etc). Total waste of money for the same basic part which, as you have discovered, has a lovely overclock that hasn't even been fully tested.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness

Belan posted:

You shouldn't have to underclock it; the extra boost you see is called "Kepler boost" and gets added on top of the normal boost clock.

The amount of Kepler boost changes from card to card, which is probably why a lot of overclocked cards seem to be crashy.

More here http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide

6XX overclocking got very silly with all the boosts and temperature throttle points.

drat, here I thought my problems were kinda solved. Guess I'll just exchange the card and roll the dice again. Thanks for the advice guys!

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Belan posted:

You shouldn't have to underclock it; the extra boost you see is called "Kepler boost" and gets added on top of the normal boost clock.

The amount of Kepler boost changes from card to card, which is probably why a lot of overclocked cards seem to be crashy.

More here http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide

6XX overclocking got very silly with all the boosts and temperature throttle points.

It got pretty awesome if you ask me. It's really solid technology, and while it does affect tiered/binned overclocking practices, I'm happy for all the power-saving I can get, in and out of game.

Gigabyte has a history of bad power delivery in addition to this generation's "how... do we test this overclock, exactly?" problem.

Argas
Jan 13, 2008
SRW Fanatic




I've an EVGA 560 Ti and it tends to run in the 70s/80s when playing some rather GPU-intensive/poorly optimized games. Aside from just looking into aftermarket cooling (wary of voiding the warranty, since that's what EVGA is good for) and turning down settings, any ideas on how to get the temperatures lower? I cleaned out the dust from my case, fans, etc. today and it's gotten to the high 60s. Set up a fan profile with MSI Afterburner yesterday, which got it down from the low 90s, and installed two intake fans on the front of my case (Antec P193). I'm not too sure how much the intake fans will help, though.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
What's the problem, exactly? The GPU can take it.

Argas
Jan 13, 2008
SRW Fanatic




Factory Factory posted:

What's the problem, exactly? The GPU can take it.

Mostly looking into ways to cool it.

Edit: The fan gets rather loud and annoying at the high temperatures.

Argas fucked around with this message at 01:18 on Jul 13, 2012

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Argas posted:

Mostly looking into ways to cool it.

Edit: The fan gets rather loud and annoying at the high temperatures.
Your temperatures are perfectly acceptable; if you're not happy with the fan noise levels, turn the fan speeds down. There's really not much reason to spend effort/noise to reduce temperatures below an already acceptable margin. Also, you have gone a bit crazy with case fans, which will increase noise while providing little additional benefit. There's almost never a reason for more than two case fans: an intake in the lower front of the case and an exhaust in the upper rear.

Argas
Jan 13, 2008
SRW Fanatic




Alereon posted:

Your temperatures are perfectly acceptable; if you're not happy with the fan noise levels, turn the fan speeds down. There's really not much reason to spend effort/noise to reduce temperatures below an already acceptable margin. Also, you have gone a bit crazy with case fans, which will increase noise while providing little additional benefit. There's almost never a reason for more than two case fans: an intake in the lower front of the case and an exhaust in the upper rear.

Alright. I'll have to mess with the profile a little to find a sweet spot in cooling/noise.

As for fan noise from the new case fans, they're surprisingly quiet. I can hear them if things are silent but that's a rarity. The GPU's fan stands out because it's a slightly higher-pitched whine compared to the whirring of the other fans if I set them to high.
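
In case it helps with the tuning: a custom fan profile like Afterburner's is conceptually just a piecewise-linear curve mapping GPU temperature to fan duty cycle, and the card follows whatever percentage the curve gives at the current temperature. A small sketch of that idea; the curve points are made-up examples, not recommended settings:

```python
# Piecewise-linear fan curve, the same idea as an Afterburner custom profile.
# Curve points are illustrative only: (temperature in C, fan duty in %).
FAN_CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_percent(temp_c, curve=FAN_CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(68))   # ~58% with the example curve above
```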

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

A side fan is going to do way more for keeping a GPU cooler than front intake. Front intake is for HDDs and general airflow - a side fan usually sits right where the GPU(s) is/are. I've got... an unusual and power hungry high end setup, and the 200mm side fan (optional) is part of why even playing the most intensive games I've got, my temperatures don't go above 60ºC on either the graphics or the physx card. In the vast majority of games, temps stay right around 50ºC on the 680. Stock coolers on both the 680 and the 580.

But that's a pretty unusual and not very recommended setup. If you are having actual problems with your temperatures, that's different, but the cooling tends to be optimized for noise levels by default and so setting a more aggressive cooling profile isn't necessarily a good idea, especially if you're not doing some heavy overclocking where you might want to stay out of the higher reaches. I do agree with other regulars that the thermals you've noted are within the card's safety window for sure, but I get a little uncomfortable seeing a GPU running at 90ºC today - these cards shouldn't be running that hot, that's more HD4870/GTX280 under full load territory, makers have gotten better at power efficiency and cooling since then. If it's overclocked, that could cause some instability in marginal cases, I'd think, given the increased resistance as heat rises. But a video card between 70ºC and 80ºC with the stock cooler and default fan control is not outside the "24/7" safety margin for them, they can run a lot hotter than CPUs by design.

If cleaning out your dust went from the high 80ºC region to the mid-to-high 60ºC region, I'd say just be more vigilant about blowing the dust out of the case. Which the intake fans will probably not help a lot with, unfortunately, more intake means more dust coming in even with filters. :-/

Argas
Jan 13, 2008
SRW Fanatic




Agreed posted:

A side fan is going to do way more for keeping a GPU cooler than front intake. Front intake is for HDDs and general airflow - a side fan usually sits right where the GPU(s) is/are. I've got... an unusual and power hungry high end setup, and the 200mm side fan (optional) is part of why even playing the most intensive games I've got, my temperatures don't go above 60ºC on either the graphics or the physx card. In the vast majority of games, temps stay right around 50ºC on the 680. Stock coolers on both the 680 and the 580.

But that's a pretty unusual and not very recommended setup. If you are having actual problems with your temperatures, that's different, but the cooling tends to be optimized for noise levels by default and so setting a more aggressive cooling profile isn't necessarily a good idea, especially if you're not doing some heavy overclocking where you might want to stay out of the higher reaches. I do agree with other regulars that the thermals you've noted are within the card's safety window for sure, but I get a little uncomfortable seeing a GPU running at 90ºC today - these cards shouldn't be running that hot, that's more HD4870/GTX280 under full load territory, makers have gotten better at power efficiency and cooling since then. If it's overclocked, that could cause some instability in marginal cases, I'd think, given the increased resistance as heat rises. But a video card between 70ºC and 80ºC with the stock cooler and default fan control is not outside the "24/7" safety margin for them, they can run a lot hotter than CPUs by design.

If cleaning out your dust went from the high 80ºC region to the mid-to-high 60ºC region, I'd say just be more vigilant about blowing the dust out of the case. Which the intake fans will probably not help a lot with, unfortunately, more intake means more dust coming in even with filters. :-/

Well, the default fan settings left it at 90+. It's a P193, so there's a 20cm side fan built in, and all the case's intakes have dust filters. By default the only intake is the side fan, and the case comes with three exhaust fans. Setting up a fan profile with a higher fan speed was what helped the most; the dust wasn't clogging the GPU all that much. As far as I can tell, it's largely just the GPU not being all that good at cooling itself without upping the fan speed. The side fan brings in a fair amount of air for it.

Edit: vvv It was setting up fan profiles that largely did it.

Argas fucked around with this message at 03:37 on Jul 13, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Well, you might be able to get away with re-applying the TIM; on AR parts of the period, if I recall correctly EVGA's policy was "undo any modifications and restore the card to factory condition and it will be considered under warranty unless we discover something damaged by inept modification."

I note that because to me it sounds like the card might have misapplied TIM at first glance (er, first listen? ... metaphors). But then you note cleaning the dust out dropped your temps 20ºC. That's a lot. So obviously something was going on there. If you take an action and the effect is a huge reduction in temperatures, I'd say that's a pretty solid indicator that whatever you just did had something significant to do with the problem in the first place.

Incessant Excess
Aug 15, 2005

Cause of glitch:
Pretentiousness
Little update on my GTX 670: they tested it at the store and of course it's working fine there. So I'm wondering, did they maybe just not put enough load on it, or is there something else on my end that could be causing my problems?

I get BSODs when I do really graphics-intensive stuff (Crysis at max settings, the Unigine benchmark), unless I manually downclock the card. Could my motherboard or PSU be to blame here (Asus P8Z77 and XFX 750W, both a month old), or could it be a driver issue (using the latest beta drivers)?

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
If they weren't running Crysis at max or Unigine or whatever, it's not really a good test. If you can fix it by downclocking the card, it's pretty much the card's fault.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Biggest human being Ever posted:

Little update on my GTX 670: they tested it at the store and of course it's working fine there. So I'm wondering, did they maybe just not put enough load on it, or is there something else on my end that could be causing my problems?

I get BSODs when I do really graphics-intensive stuff (Crysis at max settings, the Unigine benchmark), unless I manually downclock the card. Could my motherboard or PSU be to blame here (Asus P8Z77 and XFX 750W, both a month old), or could it be a driver issue (using the latest beta drivers)?

When you pay $400 for a graphics card, it's not your problem when it doesn't work as advertised. That is a lot of money to drop on a luxury item; if it can't deliver the box specs, it's defective, and it'll be easier to get it treated as such the quicker you are about it, I'd guess.

Carecat
Apr 27, 2004

Buglord
Is the TOP version of the Asus 2GB GeForce GTX 670 DirectCU II any different from the regular one apart from the clock speeds? All the reviews are of the TOP version, and I'm mainly after the low temperature and fan noise when the card is already extremely fast.

Star War Sex Parrot
Oct 2, 2003

Agreed posted:

TXAA is:
3. What killed development on FXAA (booooo).
Wait what?! :toughguy:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
FXAA developer Timothy Lottes said he has no further plans to update FXAA. I think he hit a wall where further innovation with FXAA required specific engine support, like the temporal antialiasing mode in FXAA4. If you're building a new antialiasing mode that will require games to be developed to support it, you might as well develop it in hardware too. He points out that while FXAA has quality around 4X MSAA, TXAA performs like 4X MSAA but looks like 4X SSAA.
