craig588
Nov 19, 2005

by Nyc_Tattoo
TweakTown did some overclocking on their 660 Ti sample. http://www.tweaktown.com/articles/4873/nvidia_geforce_gtx_660_ti_2gb_reference_video_card_overclocked/index.html

It doesn't overclock better than a fully enabled GK104, but they haven't been able to say anything about the quality of the factory cooler yet. They also don't have any voltage control right now, so it's possible the 660 Ti is slightly undervolted for better power efficiency.

I'm so excited for this card; even without overclocking it I was already ready to buy one as soon as it launches. Hoping a manufacturer puts one out with a huge, quiet 3-slot cooler.

craig588
Nov 19, 2005

by Nyc_Tattoo
At stock speeds a Q6600 with a 670 is probably a waste. If you can't overclock, then get something cheaper; if overclocking's on the table, then it shouldn't be much of a restriction at 3.2+ GHz.

There's also the other side of it: if you plan to keep the card for 5 years or whatever, then go nuts, it'll make your next CPU upgrade all the better. It's not like having too powerful a card will slow anything down if you do keep the CPU at stock speeds; it'll just be a bit of a waste of money right now.

craig588 fucked around with this message at 21:04 on Aug 7, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo
Here are 560 Ti benchmarks: http://www.anandtech.com/bench/Product/547
Here are 660 Ti benchmarks: http://www.tweaktown.com/reviews/4869/nvidia_geforce_gtx_660_ti_2gb_reference_video_card_review/index.html

Better benchmarks for the 660 Ti will be out once the NDA expires. As it looks right now, it blows the 560 Ti away, because it's generally on par with the 580 GTX and even beats it in places.

craig588
Nov 19, 2005

by Nyc_Tattoo
I saw a 670 with a dual fan cooler on sale for $380, and after seeing what launch prices for 660s with custom coolers are looking like I couldn't hold back. I guess I was planning on getting a 2560x1440 monitor anyways...

Seriously, I'm happy with buying it. Going from an 8800 GTX SLI setup to it will be great.

craig588 fucked around with this message at 04:01 on Aug 14, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo

InstantInfidel posted:

Get a 560Ti, then. It still rocks the socks of any 1080p display and is right around the $200 mark right now, and often hits $180-$190 on sale.

Used 580s are also hitting the $200 mark now. I figure if you're not buying a current-generation card you might as well get a used one, because the market's going to get flooded with them, pushing down prices.

craig588
Nov 19, 2005

by Nyc_Tattoo
I usually feel safest buying from various forums, from people with well over the minimum requirements to sell there.

I just installed this 670 with Gigabyte's dual 100mm fan cooler and it's amazingly quiet; combined with the Noctua D14 CPU heatsink, the computer is eerily quiet. The noise reduction from a pair of overclocked 8800 GTXes is the single best thing about the upgrade.

craig588
Nov 19, 2005

by Nyc_Tattoo
http://www.anandtech.com/bench/Product/146?vs=551
The Phenom is pretty terribly outclassed, and as a bonus Ivy Bridge overclocks incredibly well if you ever feel like it. Pretty much everyone will recommend the 3570K, but I didn't see it in the drop-down. It's only 100MHz slower with 2MB less cache, which is insignificant in normal use.

My personal experience with an overclocked 2500K and a 670 has been running everything maxed out, including some level of AA, at 2560x1440 with vsync on and consistently hitting 60 FPS. I don't benchmark anymore (it led to me buying stuff I never needed, like Extreme Edition CPUs), but I haven't even had to think about performance; just max everything and play, and nothing's choppy. There's a setting in the Nvidia control panel that lets you choose between centered, full screen, and scaled while maintaining the aspect ratio. I'm on my tablet right now, so I'm sure I don't have the specific setting names right, but that's the gist of them.

I'd keep the 5970 and just get an Ivy Bridge setup and see how that does. 5970s seem to be selling for about $300 right now though.

Edit: Ignore my link, Agreed's is much better for game benchmarks.

Another edit to fix typos I'll blame on my tablet.

craig588 fucked around with this message at 04:28 on Aug 27, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo
To put it simply: upgrading your videocard before you upgrade your CPU is a waste of money. Yeah, a 690 will be faster, but upgrading your CPU will provide you with larger gains for less money.

I like to reformat after switching motherboards and videocards, but it's probably not strictly necessary. It's good for the peace of mind of knowing for sure that there's nothing left over from a previous driver installation, but there are tools that can take care of that well enough.

Low resolutions highlight CPU limits better, but it's not like those limits go away when you step up in resolution.

craig588
Nov 19, 2005

by Nyc_Tattoo
He's just trolling this thread; the Phenom was never the best-performing option, but for some reason he has one.

craig588 fucked around with this message at 16:18 on Aug 28, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo
My 8800s could go into single card mode with the bridge still installed; there was a radio button called something like "maximize 3D performance" that turned SLI on. I never had any dedicated CUDA workloads to really determine anything beyond that, though.

craig588
Nov 19, 2005

by Nyc_Tattoo
I'm pretty sure it has almost always been true, except when 3D cards were a relatively new concept. I can remember as far back as the GeForce 2 having a "halo" tier for around $400 and a cheaper version with 25% lower stock clock speeds that only had 32MB of memory vs 64MB, but was otherwise an identical card.

Edit: ATI even had their Rage Fury MAXX dual GPU card back then.

craig588 fucked around with this message at 21:29 on Sep 1, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo
After buying a dual fan Gigabyte 670 I wouldn't buy a second one. The fan bracket wasn't solid enough and allowed the fans to vibrate. It's not something you'd normally hear, but I already have a Noctua D14 with low-speed fans, so it was really annoying. I made a little brace from a cut-up credit card and epoxied it to the fan bracket and the edge of the card, and that quieted it down.

I'd say go with the MSI aftermarket cooler if you want large, slow fans; they've been doing it longer and probably have more of the bugs worked out.

I think the card landscape gets a lot more interesting when the MVK Tech guys finish up their BIOS reverse engineering. From the looks of it right now, power target percentages are entirely arbitrary, and there's no reason one card couldn't have 100% be 200 watts while another's 122% is only 175 watts. Excepting, of course, the limitations of the card's own power delivery system. It explains how some people are able to overclock and mess with voltages for days and barely break 80%, while other people need 145% to not even hit the same speeds. The fallout will be revealing which cards are designed to handle a lot of power and which are built to be just enough. I have a feeling my Gigabyte will be on the lower end of the scale; it's constantly pegged at 122% and pulling down voltage and clock speed. (Not below stock or anything really crazy, but at under 60C it will only break ~1280MHz when it's dropping the voltage to 1.137V or even lower, and if you force the voltage up it'll drop way down to around 1000MHz.)
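
To make the arbitrariness concrete, here's a minimal sketch (Python, with made-up wattages that don't come from any real BIOS) of how the same slider percentage can mean very different real limits on two cards:

```python
# Toy illustration: the power target slider is just a percentage of
# whatever base wattage the BIOS defines, so identical slider positions
# can mean very different real caps. All numbers here are invented.

def real_power_cap(bios_base_watts, target_percent):
    """Convert a power target percentage into an actual wattage cap."""
    return bios_base_watts * target_percent / 100.0

# Card A: BIOS base power of 200 W, left at the default 100% target.
print(real_power_cap(200, 100))  # 200.0 W

# Card B: BIOS base power of ~143 W, slider maxed out at 122%.
print(real_power_cap(143, 122))  # ~174.5 W -- still below card A at stock
```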

craig588
Nov 19, 2005

by Nyc_Tattoo
How low? Even approximately lateral replacements are going to be over $200 unless you want to get into buying used previous-generation cards. Nvidia doesn't really have a card worth considering below $300 right now. An AMD 7850 would be a cheaper option that performs slightly better than your current card. Unfortunately, anything below the Nvidia 660 or the AMD 7850 will be a dramatic performance drop compared to your current card.

craig588
Nov 19, 2005

by Nyc_Tattoo
In many cases they even turned out to be slower than forcing PhysX to run in software on a contemporary CPU.

craig588
Nov 19, 2005

by Nyc_Tattoo
FurMark isn't the best stability tester for the GK104 series of cards because it will cause the GPU to throttle pretty low. I've been enjoying using Heaven maxed out and letting it run its demo loop for an hour or so.

Blower-style cards are generally better for cramped spaces. It has to do with pressure vs. volume: blower-style fans are better at moving a fixed amount of air regardless of how much restriction they're facing, while conventional fans can move more air at a given size if they're not trying to overcome a lot of resistance.
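
If you want to see how that plays out, here's a toy model (Python; both fan curves and the restriction constants are invented, not measured) that finds where a fan's pressure-flow curve crosses the case's restriction curve:

```python
import math

# Toy model, invented numbers: a fan's flow falls off as the pressure it
# must overcome rises. Approximate each fan as P = P_max * (1 - (Q/Q_max)^2)
# and the case/heatsink restriction as P = k * Q^2. Where the two curves
# cross is the airflow you actually get.

def operating_flow(p_max, q_max, k):
    """Solve P_max * (1 - (Q/Q_max)^2) = k * Q^2 for Q."""
    return math.sqrt(p_max / (k + p_max / q_max**2))

blower = (8.0, 30.0)   # high max pressure, modest max flow
axial = (2.5, 60.0)    # low max pressure, big max flow

for k in (0.001, 0.02):  # open case vs. cramped case
    print(f"k={k}: blower {operating_flow(*blower, k):.1f}, "
          f"axial {operating_flow(*axial, k):.1f}")
# In the open case the axial fan moves more air; behind the heavy
# restriction the blower wins, matching the rule of thumb above.
```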


Glen Goobersmooches posted:

Gigabyte's 670 is literally the thinnest model there is.

I don't know where you got the idea that other manufacturers were only making 3-slot cards, but (almost?) all of the blower cards are also 2-slot designs. If anything, I think the Gigabyte cooler might exceed the width of 2 slots a bit, while the blower shrouds actually fit within that limit.

craig588
Nov 19, 2005

by Nyc_Tattoo
I meant volume in terms of space. Noise is caused by turbulence and has a lot to do with the construction of the blades and where the air is getting forced to go. There's also stuff like bearing whine and housing vibration, but in most circumstances those are overwhelmed by the sound of air being forced to change direction. To be really thorough, though: in general you probably could make blowers quieter, because they move air through restrictions better, so a baffling system could be more aggressive than one paired with a conventional axial fan. There's also the benefit of high pressure letting you put the fan at the far end of a duct somewhere. It doesn't really work out like that with computers, though; a large, thin, low-speed fan fits the space available on a videocard better than a blower and noise-reducing baffles would.

craig588
Nov 19, 2005

by Nyc_Tattoo

quote:

GPU Utilization will show as 0% when this is enabled.

I can take a guess as to how this works, and I'll bet it has the same effect as all of the power target stuff I talked about in the overclocking thread. Instead of giving the card effectively no limit, it just ignores or overwrites the sensors so the card thinks it's never maxed out. The only disadvantage I see with this software hack is that you lose power management with it enabled. I changed my card's BIOS limit from 170 watts to 225 watts and gained over 150MHz on average with no side effects.
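
For what it's worth, here's the mechanism I'm picturing, as a hypothetical Python sketch (this is a guess at the logic, not actual driver or Precision X code):

```python
# Hypothetical boost loop: if throttling only triggers when the reported
# power crosses the cap, then spoofing the sensor to always read zero
# (which seems to be what the hack does) means the throttle branch never
# fires -- and neither does any downclocking for power management.

def boost_step(clock_mhz, reported_watts, cap_watts):
    """One iteration of a simplified boost controller."""
    if reported_watts > cap_watts:
        return clock_mhz - 13            # back off when over the cap
    return min(clock_mhz + 13, 1300)     # otherwise creep toward max boost

clock = 1000
for _ in range(30):
    sensor = 0  # hack enabled: the power/utilization sensor reads nothing
    clock = boost_step(clock, sensor, cap_watts=170)
print(clock)  # pinned at 1300 MHz, and it would sit there at idle too
```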

I'm too busy at the moment to check if that's actually how it works, but it's how I read it.

craig588
Nov 19, 2005

by Nyc_Tattoo
I'd try replacing the DVI cable before you replace any other hardware, if it comes to that. I had a VGA cable fail once and it was the most frustrating thing, because I completely ignored it as a possibility until I had tried 3 different monitors and 2 different sources.

craig588
Nov 19, 2005

by Nyc_Tattoo
Buy a single card now and add more later? My experience with only a single 670 at 2560x1440 is maxing out everything and still getting a solid 60FPS.

craig588 fucked around with this message at 15:37 on Dec 20, 2012

craig588
Nov 19, 2005

by Nyc_Tattoo
Are you running 7680x1600 or something? I'm sure a dual 680 setup at most would be more than enough for all but the most ridiculous of monitor setups. By the time you need to add a second 690, there will be better single-card options available.

craig588
Nov 19, 2005

by Nyc_Tattoo
From my experience with SLI, I still got weird dips but much higher average frame rates. Sometimes the cards would just get confused and kick out a few seconds of single-digit frame rates for seemingly no reason. This was with 8800 GTXes, so it's probably a more mature technology now.

craig588
Nov 19, 2005

by Nyc_Tattoo
GIS tells me this is an E510


You're going to have crazy high videocard temperatures if that's how yours is laid out, because your cool air is immediately warmed up by the CPU.

craig588
Nov 19, 2005

by Nyc_Tattoo
Since you already own both of them, can't you just try them and see? If we're placing bets, I'd say the 230 would be faster based on theoretical performance, especially if that 620 is an OEM card; the OEM version is almost half of a retail 620. (I'm just guessing you have an OEM 620, considering the 230 was only available in OEM form.)

They'll both be pretty terrible, but you could probably sell both of them to scrape together $50 to buy a used 260.

craig588
Nov 19, 2005

by Nyc_Tattoo
http://unigine.com/products/heaven/

Max out everything you can, and once the demo is running the hotkey for running a benchmark is F9. You might have to set it to DX10 for both cards, because I guess the 620 has some level of DX11 support. (It's not going to be powerful enough to ever see those features in a playable state, so don't worry about not having them on the 230.)

I think the major difference is going to come from the 230's much faster memory. The 620 is killed by having only a 64-bit bus compared to the 230's 192-bit. I don't think the architectural improvements of the 620 will be enough for it to pull ahead.
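
Rough numbers, since peak bandwidth is just arithmetic. The bus widths are the ones above, but the effective memory data rates below are guesses for illustration, not verified specs:

```python
# Back-of-the-envelope peak bandwidth: bytes per transfer * transfers/sec.
# Bus widths from the post; the effective data rates are rough guesses.

def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_mt_s / 1000.0

print(bandwidth_gb_s(64, 1800))   # 64-bit bus @ 1800 MT/s:  ~14.4 GB/s
print(bandwidth_gb_s(192, 1600))  # 192-bit bus @ 1600 MT/s: ~38.4 GB/s
```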

craig588
Nov 19, 2005

by Nyc_Tattoo
I remember seeing those patterns when I used a cheap CRT years ago. I'm betting it doesn't show up in screenshots because it's a problem somewhere along the path from the videocard's output to the monitor's PCB, before it's displayed on the panel. You don't happen to be using a VGA cable, or even a digital cable from a really shady vendor, do you?

craig588
Nov 19, 2005

by Nyc_Tattoo
I'm pretty confident it's not a problem with the videocard or from overclocking. There might even be a setting you can mess with in your monitor's controls (maybe reducing overdrive? Sharpness? Dynamic contrast? Disabling any processing or special features the monitor might have could get rid of it). I've only seen it happen because of problems in the last step of the image display chain. Not that something else couldn't cause it, but the only time I saw it I could consistently reproduce it on one specific monitor. If it makes you feel any better, the monitor never got worse and lasted for years without any problems before I sold it.

craig588
Nov 19, 2005

by Nyc_Tattoo

SocketSeven posted:

After setting K-boost to on in EVGA's precision X utility, everything seems stable, except for the whole wasting hundreds of watts of power at idle and increased card temps.

I'd like to be able to disable K-boost and let the cards adaptively underclock themselves and save my poor electric bill, but I'm lost on where to even begin fiddling with this stuff because EVGA's documentation appears to have been written by their marketing department. :psyduck:

Should I not even bother, and just let my PC be a space heater? Or is there some place I should be researching to tweak things so this crap doesn't happen.

Mod your BIOS so it has (effectively) no power limits. :smug: My card used to drop down to sub-900MHz levels depending on the load; now I've never seen it drop below 1200MHz, it's rare for it to even dip below 1300MHz, and I keep full dynamic clock and voltage support.
Really, though, unless you have awful stock power limits you can probably ignore it.

Have you tried each card by itself? I don't think I've ever heard of the dynamic clocking causing any sort of artifacts at all; it sounds like there might be a hardware issue.

Oh, idea: in EVGA Precision, use the adjust voltage button and max it all the way. That's still a perfectly safe voltage, because it's what your card jumps to in 3D mode until it hits power limits and lowers voltage and clock speed to stay within range. All the setting does is set a minimum voltage, so you'll lose a lot more clock speed: the cards won't have the option of lowering voltage, so they'll only be able to drop clocks. If the problem goes away like that, then you definitely have a bad card and need to test them individually to find out which one to RMA.
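
The reason flooring the voltage costs so much clock speed is that dynamic power scales roughly with V² times frequency while the power cap stays fixed. A toy model (Python, with an invented scaling constant, not real card telemetry):

```python
# Toy model: dynamic power ~ c * V^2 * f. With a fixed power cap, the card
# stays in budget by cutting voltage, clock, or both; flooring the voltage
# means all the adjustment has to come out of clock speed. The constant c
# is invented so the numbers land in a plausible GK104-ish range.

def max_clock_mhz(cap_watts, voltage, c=0.105):
    """Highest clock that fits under the cap at a given voltage."""
    return cap_watts / (c * voltage**2)

cap = 170.0
print(max_clock_mhz(cap, 1.050))  # voltage free to droop: ~1469 MHz fits
print(max_clock_mhz(cap, 1.175))  # voltage floored at max: ~1173 MHz fits
```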

Another idea: What power supply do you have? If you're near the limit it could be reacting poorly to the rapidly changing load.

craig588 fucked around with this message at 14:22 on Jan 21, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo

SocketSeven posted:

In Precision X, sync is set to off, as both cards have different stock clocks, (#1 has a boost clock of 1100mhz or so, and #2 has a boost clock of 1050)

That could be a problem. Try syncing them to the slowest card's speed.

quote:

K-boost is set to off, and adjusting the voltages doesn't seem to do anything, I'm assuming because the clocks and volts are being adaptively managed.

Something's wrong there. I just confirmed it pushes the voltage up even at idle. All the setting does is set a minimum dynamic voltage; the (indicated) voltage should never drop below what it's set to.

quote:

Both cards appear to be running OVER their stated boost speeds in Precision X, by about 75mhz

Rated boost speeds are kind of a lie. There's a whole range of possibilities depending on temperature and load.

You never, ever should have to use the K-boost setting to be stable at stock speeds. It's a really great way to bypass all of the dynamic clock changes when someone wants to benchmark without putting serious effort into BIOS editing, but it's terrible for pretty much everything else. Unfortunately, it's really looking like one of your cards is bad.

craig588 fucked around with this message at 17:53 on Jan 21, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo
I still have such paranoia about that that I reformat my computer anytime I change a videocard or motherboard.

craig588
Nov 19, 2005

by Nyc_Tattoo
On the other hand, I found the Gigabyte cooler kind of loud at stock and super loud after overclocking. I swapped out the fans with a pair of quiet 120mm ones and it's much better now. The mechanical hard drives are now the loudest part of my computer.

craig588
Nov 19, 2005

by Nyc_Tattoo

TheRationalRedditor posted:

I'm talking about the OC model, which is the "Windforce 3x". I have no problem believing anything with fewer, possibly smaller fans has issues.

Yep, that's the one I had. Gigabyte's whole line of cards is sort of disappointing: they look great but perform poorly relative to competitors' cards. Maybe they'll get it right next generation. I remember a time when Asus had terrible videocards; it seems like it's different every year.

Gigabyte also does semi-shady things with the BIOSes on some of their cards to make them quieter and cooler but perform worse. That's a pretty good reason to stay away from them too.

craig588 fucked around with this message at 00:05 on Feb 4, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo

Endymion FRS MK1 posted:

Then take matters into your own hands and mod the bios :v:

Seriously though, I had to. Gigabyte voltage locked my 7950. When the whole point of me buying a 7950 over a GeForce was the insane OC-ability of the 7900 cards. Ugh.

Yeah, I had to do that with my personal 670 because it was limited to something like 140 watts. No one has made an easy tool for it yet, so I needed to read a whole bunch of forums to find the offsets to edit and figure out how to update the checksum manually. I wouldn't want to recommend someone else do it, or even risk doing it for someone else, because of potential long-term problems. Gigabyte uses lots of custom PCBs, so maybe there's a good reason they set low current limits and one day a FET will blow off my card?

craig588 fucked around with this message at 00:44 on Feb 4, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo
Don't buy based on promises. That goes for pretty much everything, really. The original Xbox had an Nvidia GPU, and there was little to no benefit to using an Nvidia card for console ports back then. Get whatever is best for the games you play right now.

craig588
Nov 19, 2005

by Nyc_Tattoo

TheRationalRedditor posted:

pair of quiet 120s.

120mm fans are quite a bit bigger than the stock ones (I think those are 80mm?). If I wanted to fit 3 fans on it I probably could, by letting the 2 on the edges hang off a whole lot, but I'm not wasting much of the heatsink as it is.

TheRationalRedditor posted:

has always been stealthy

To be fair, I also think a Corsair H100 is incredibly loud, well beyond even the loudest the Gigabyte cooler ever got, and I know a lot of people who feel the H100 is quiet. I have 140mm fans on a fan controller and 200mm fans on their lowest setting; I want to hardly be aware of a computer making noise.

Jan posted:

No matter what GPU the next generation has, it very likely will also have unified memory. This one detail changes everything about how engines are designed and optimised, so until PCs also have unified memory, console specific optimisations don't mean anything.

I think either the Xbox 360 or the PS3 already has unified memory. I remember that being a feature one of them was holding over the other at some point.

craig588 fucked around with this message at 01:14 on Feb 4, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo
I could see it being hard on PCs for a while if consoles end up with 8GB of unified memory and some scenes use up 6 or 7GB of video memory without a second thought on a console. Since any worthwhile videocard has more than 512MB of memory now, it's not really an issue for 360 ports, but when it was new the 360 was ridiculously powerful. If it becomes an important feature, I'm sure videocard manufacturers will start offering 8GB cards.

craig588
Nov 19, 2005

by Nyc_Tattoo

Skilleddk posted:

Does anyone have an opinion of the Windforce 3x coolers, compared to "normal" ones on GTX 680? It's just 20$ more, thinking of getting that

They're much better than the normal one as long as you have even slightly reasonable case airflow.

craig588
Nov 19, 2005

by Nyc_Tattoo
If you don't have any warranty left, it's probably time to disassemble it, reseat the videocard, apply new TIM, and blow out any dust. If you have warranty left, you should let their service center deal with it and borrow another one while it's there if you really can't get along without it. It's incredibly unlikely to be a software problem.

craig588 fucked around with this message at 04:40 on Feb 13, 2013

craig588
Nov 19, 2005

by Nyc_Tattoo
They seem to have a really hard time getting the heat under control. I remember previous rumors were that it was going to run at something like 700MHz. It's not dramatically faster than a 680 despite having almost double the compute units. I think they had to go 2x 690 and 3x Titan in order to show an improvement. A single 690 is probably faster than a single Titan, and they're both in the $900+ range.

craig588
Nov 19, 2005

by Nyc_Tattoo

TheRationalRedditor posted:

The 660Ti is good, the 660 is terrible.

Yeah, if you're looking at spending any less than a 660 Ti costs, you should probably look to something from the previous generation. You can probably get something in the 570 or 580 range for about the same price that'd perform better than the 660.

craig588
Nov 19, 2005

by Nyc_Tattoo
What brand is your power supply? I don't really think it's that, considering the 6970 and the 7970 have similar TDPs, but it's a possibility.
