Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed

College Slice

I just pulled the trigger on an EVGA Geforce GTX 670 2GB for $399.99 at Newegg. This is the first time in my life I've bought a videocard on launch day, or an nVidia videocard ever. I picked this particular card because I needed something shorter than 10" and it seemed slightly better than the other nearly-stock cards.


Alereon

Josh Lyman posted:

My gaming consists of Starcraft 2 and Diablo 3. Should I just get the GTX 460 for $140 now or wait for prices to fall as Kepler continues to roll out?
You'd have to be insane to buy a GTX 460 for $140. Head over to the parts picking megathread.

Alereon

Klyith posted:

So they invented two things that are still in use today: 1-cycle trilinear filtering and a little thing called S3TC, which was licensed by Microsoft to use in DirectX as DXTC: DirectX Texture Compression.
I still remember installing the S3TC texture pack that came with Unreal Tournament (UT99) GOTY edition and being completely awed by the amazing, high-resolution graphics.

Alereon

Aws posted:

Anybody having issues with the 12.4 Catalyst? On my HD 5870 I've noticed two things. First, my CCC settings are completely ignored by games that previously didn't ignore them. I have 33 games installed and I tested on a fair chunk of them, and none of them have the settings applied.
Are you setting a profile for each game? If you don't then the default profile (which may not be your global settings) will apply.

Alereon

Aws posted:

Nope. Just a single global profile.
Try saving a per-application profile and see if that makes a difference.

Alereon

I thought it might be interesting to post a Geforce GTX 670 trip report from the perspective of a long-time ATI and then AMD user. I bought an EVGA GTX 670 2GB, non-superclocked version, which is basically a stock card with some minimal tweaks to the fan and the exhaust venting:

The first thing that struck me was that, out of the box, image quality is TERRIBLE. I noticed a lot of texture shimmering, and thin lines were mangled badly by antialiasing. After spending a few minutes digging through the nVidia Control Panel to disable all their optimizations and Gamma Correction for Antialiasing, quality came up to what I was expecting. Coverage Sampling Antialiasing (CSAA) is loving amazing. 16X CSAA (4 geometry samples and 12 coverage samples) looks drat near flawless and isn't too much slower than 4X MSAA. I can play less demanding games in 32X CSAA (8 geometry samples and 24 coverage samples), which looks amazing. I can't wait for more demanding games to come out with support for TXAA.

Noise levels are a lot better than I was expecting, even when I maxed out the TDP. Definitely not something I'd notice with headphones, and I think it's actually quieter at idle than my Sapphire Radeon HD 4850 which had a reasonably quiet stock cooler. The blower had a bit of bearing whine when I first booted up after installing the card. I was worried that I might have to RMA it, but it stopped within about a minute and I haven't heard it since, so I think it's fine.

Overclocking is more difficult than it would seem at first due to Boost and the TDP cap. It's hard to test your overclock, as heavy load will hit the TDP cap before the boost cap, so it won't test the higher clockspeeds. This results in overclocks that passed torture tests just fine failing under more moderate gaming loads, where there's TDP headroom to boost higher. I've also found memory overclocking to have a larger performance impact than expected. Reviews are correct that it doesn't have much impact on average framerates, but it raises minimum framerates and makes valleys in framerate graphs shallower.

Metro 2033 is still a goddamned hog, though

Alereon

Aws posted:

As a longtime ATI/AMD user, how would you compare nvidia's control panel with AMD?
In terms of managing settings, especially per-game settings, nVidia's is far better. I have noticed more UI bugs in nVidia's control panel, things like the panel not knowing which setting I'm hovering over, linked settings not updating properly, and other weirdness, but I am using a beta driver (301.34). AMD's panel scaling options seem better and more effective, even though they do require you to be in a non-native resolution to expose them. I haven't done much with the other settings, though the Digital Vibrance option under Desktop Color Settings was an easy and effective way to make my older secondary display look less washed out.

Alereon

Dotcom656 posted:

Anyone else having hard locking issues in Metro 2033 with a GTX 670? I'm running 301.34 drivers, and about 30 seconds after starting the game (or rather 10 seconds after all splash screens end) the game just hard locks. Everything freezes and I can't move my mouse cursor. I can bring up task manager and end the process, and that's about it.

EDIT: Just pulled up Metro again (I didn't end the process this second time) and it's working now. Not sure what that's all about.

EDIT 2: Restarted the game to apply some graphics changes and it's still acting weird. Maybe it hates my extended desktop?
What card do you have? That sounds like the issue the EVGA GTX 670 SuperClocked cards were recalled for. That's exactly what happens to me when I have the card overclocked too high.

Agreed posted:

You have to get nVidiaInspector. It allows for much more robust management of game profiles, including the ability to override certain flags and thus enable different types of AA, or allow SSAO, etc., in games that wouldn't support them with stock settings. It also gives you access to every nVidia AA mode, including supersampling AA as well as sparse-grid supersampling options that you can adjust to taste for the perfect balance of performance and incredible appearance.
Cool, thanks, I'll check it out.

Alereon

Factory Factory posted:

I also discovered that my graphics overclock is stable for Furmark but not for Metro. Good lord, that game.
That's expected: when running Furmark the GPU spends all its time being throttled to stay under the TDP cap, so you're not actually testing the overclock much.

Alereon

Endymion FRS MK1 posted:

Minor question here, exactly how does enabling Tessellation (AMD Optimized) and Triple Buffering affect my game performance? Do they give only a minor performance hit if I keep them on? Are the effects noticeable?
AMD Optimized Tessellation lowers the amount of tessellation, improving performance at a cost in image quality. Triple buffering makes a huge difference; if you're going to run with Vsync enabled, triple buffering is practically mandatory.
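To see why triple buffering matters so much under Vsync, here's a rough back-of-the-envelope model (a sketch, not a real renderer: it assumes a fixed per-frame render time and ignores input latency). With double buffering, a frame that misses a refresh has to wait for the next one, so frame times round up to whole refresh periods:

```python
import math

def vsync_fps(render_ms, refresh_hz, triple_buffered):
    """Rough model of effective framerate under Vsync.

    Double buffering: a frame that misses a refresh interval waits for
    the next one, so the effective frame time rounds up to a whole
    number of refresh periods. Triple buffering: the GPU keeps rendering
    into a third buffer, so average frame time stays close to the actual
    render time (capped at the refresh rate).
    """
    period_ms = 1000.0 / refresh_hz
    if triple_buffered:
        return 1000.0 / max(render_ms, period_ms)
    intervals = math.ceil(render_ms / period_ms)
    return 1000.0 / (intervals * period_ms)

# A 20 ms frame (50 fps uncapped) on a 60 Hz display:
print(vsync_fps(20, 60, triple_buffered=False))  # 30.0 -- double buffering drops you to 30 fps
print(vsync_fps(20, 60, triple_buffered=True))   # 50.0 -- triple buffering keeps ~50 fps
```

That cliff from 50 to 30 fps the moment you can't sustain 60 is exactly what triple buffering smooths out.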

Alereon

I'm thinking many of these problems are related to how difficult it is to test for stability with dynamic clockspeeds. The GPU actually boosts higher with less than full load, meaning traditional tools that heavily load the GPU don't work well for testing. As an example, FurMark could run fine at nearly any clockspeed setting on my GTX 670, as the card was pegged at its 130% TDP cap the entire time, so it wasn't using much of the headroom available to it.
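One way to catch this is to log clocks and power draw during your stress run and check how often the card sits pinned at its cap. A minimal sketch (the log format and the 292.5 W figure are hypothetical; the pairs could come from polling `nvidia-smi --query-gpu=clocks.gr,power.draw --format=csv,noheader,nounits` once a second):

```python
def power_limited_fraction(samples, tdp_limit_w, margin_w=5.0):
    """Fraction of samples where the board is pinned at its power cap.

    `samples` is a list of (graphics_clock_mhz, power_draw_w) pairs
    collected during a stress run. If most samples sit within `margin_w`
    of the cap, the run is power-throttled and is NOT exercising your
    top boost clocks -- so passing it proves little about the overclock.
    """
    if not samples:
        return 0.0
    pinned = sum(1 for _clk, watts in samples if watts >= tdp_limit_w - margin_w)
    return pinned / len(samples)

# Hypothetical FurMark-style log on a card with its cap raised to 292.5 W:
furmark = [(915, 290), (915, 292), (928, 291), (915, 293)]
print(power_limited_fraction(furmark, tdp_limit_w=292.5))  # 1.0 -> fully throttled
```

If that fraction is near 1.0, swap FurMark for a lighter game workload that leaves TDP headroom for Boost to reach its highest bins.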

Alereon

Something interesting just came up at Anandtech: AMD is launching a new Radeon HD 7750 clocked at 900MHz that boosts card TDP to 83W (from 75W). They're going to validate all 7750 GPUs they sell at 900MHz, and it will be up to board partners to decide whether to run them at 900MHz on boards with a 6-pin power connector, or clock them at 800MHz on boards powered only by the motherboard. AMD will charge the same price for the GPU however they are clocked, but board partners will likely charge slightly more for the 900MHz cards because of the additional power conversion hardware and circuitry.
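The reason the 900MHz boards need the 6-pin comes straight from the PCIe power budget: a x16 slot alone is only specified to deliver 75W. A trivial check, assuming the board's TDP is the worst-case draw:

```python
PCIE_SLOT_LIMIT_W = 75  # max a card may draw from the x16 slot alone per the PCIe spec

def needs_aux_power(board_tdp_w):
    """A board whose TDP exceeds the 75 W slot budget needs an aux connector."""
    return board_tdp_w > PCIE_SLOT_LIMIT_W

print(needs_aux_power(75))  # False -- the 800 MHz HD 7750 runs off the slot alone
print(needs_aux_power(83))  # True  -- the 900 MHz version needs the 6-pin
```

So the 83W TDP puts the 900MHz cards just over the line, which is why that clock bump forces the extra connector and circuitry.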

Alereon

Factory Factory posted:

Silly question: don't PCIe 2.1 and 3.0 slots deliver up to 150W power on their own? So such a card installed on a PCIe 3.0 motherboard would still be able to run without the aux power hooked up?
The PCIe spec actually allows only 75W from the x16 slot itself; 150W and 300W are total board power classes that assume auxiliary connectors. I don't think anyone's ever made a card that pulls more than 75W from the slot alone, probably because they don't want to break compatibility with older and lovely motherboards. I could definitely see a lot of cheaper boards, even current ones, not coping well or at all with a card pulling 150W from the slot.

Alereon

Sagebrush posted:

In terms of general gaming performance, what's a rough desktop equivalent to a GTX660M/2GB? How about a 640M? A Quadro K2000M?

Work is buying me a new laptop, and those are the cards in some of the options I'm considering...wondering how they'd compare to the 9800GTX/512 in my desktop right now. (Yeah, I need to upgrade).
There are no desktop equivalents for any of those cards because they haven't launched yet. It's hard to draw comparisons to older desktop cards because they are a completely different architecture, but I would say that the GTX 660M would be much faster, the GTX 640M has a faster GPU but less than half the memory bandwidth (meaning it will be faster at low-res/light duty but performance falls off a cliff), and I can't find anything about what a Quadro K2000M is.

Mr.Fuzzywig posted:

So I'm building a new computer, and because I'm fairly insistent that everything run at max settings, I was going to get a 680, but I hear now that the 670s are almost the exact same card. Is the performance boost big enough to justify another $100 or so?
Just get the 670; performance is within a few percent, especially overclocked. It should be absolutely zero effort to raise the power cap to 122% and add the +91MHz clock offset to boost to the same clocks as the GTX 680, and the extra shaders don't make a big difference. I'm really happy with my choice of the GTX 670.

Alereon fucked around with this message at 01:23 on Jun 13, 2012

Alereon

Mr.Fuzzywig posted:

Thank you very much, does this http://www.newegg.com/Product/Produ...N82E16814130787 look like an acceptable card? I'll admit I just picked the first result off Newegg, but this one seems to have higher clock speeds?
Remember that time I accidentally edited my reply into your post? Edit != quote is apparently a pretty easy mistake for mods to make. Anyway, I would recommend against factory-overclocked cards due to the difficulty of testing overclocks for stability on the GTX 600-series. I have the base EVGA GTX 670 card and I am very happy with it. It has some very, very minor tweaks over the reference design that should improve cooling and noise by an immeasurably small amount.

Alereon

Mr.Fuzzywig posted:

So this http://www.newegg.com/Product/Produ...ID=3938566&SID= would be a regular card as opposed to the factory-overclocked one?
Yes, I believe that's the exact card I have.

Alereon

td4guy posted:

Speaking of minor heatsinks, I was surprised to see that my GTX 680's RAM chips are naked. Is vram cooling not really a big deal?
Right, it's not a big deal: memory chips produce very little heat, a few watts each at most.

Alereon

Boten Anna posted:

What do they mean exactly by "Pentium 1 cores"? Just that the core lacks all the fancy extensions (MMX, etc.) or is it a literal Pentium 1 just slapped on 22nm process so it's now faster? Kinda both?
P54C (pre-MMX Pentium) cores according to Intel. That said, I'm a bit surprised/skeptical they're not really using Atom cores, as those are essentially a Pentium tweaked for efficiency and with support for the current ISA bolted on.

Alereon

Anandtech has their review of the Geforce GT 640 2GB up. It's a pretty sweet option for HTPCs, though gaming performance is significantly worse than the expectations nVidia set due to the extremely low memory bandwidth. It would be interesting to see how a similar card performed with GDDR5.

Alereon

In a similar vein is this PowerColor Radeon HD 6750 1GB Low-Profile card with two small fans.

Alereon

ACID POLICE posted:

I haven't been up to date on the video card scene since Radeon 9800 Pros were the best cards around.

Am I retarded for asking if I can drive two 1080p monitors with a single 1GB 6450?
That would be fine, as long as you're not expecting any kind of graphics performance. If you just need it for desktop use and no-frills video playback, it'll do. If you've already got the card, go ahead and use it; if you're buying one, get something better so you're not so heavily bottlenecked even in video playback and web browsing.

Alereon

Carecat posted:

This is a new motherboard/CPU/ram but the power supply is the same. If it was the PSU wouldn't it be very likely to cause the PC to shutdown rather than just a driver crash?
Nope, you can get all sorts of interesting symptoms from a bad (either failed or just crappy) power supply.

Alereon

nVidia's Timothy Lottes has a new blog post about TXAA here.

Alereon

Argas posted:

Mostly looking into ways to cool it.

Edit: The fan gets rather loud and annoying at the high temperatuers.
Your temperatures are perfectly acceptable; if you're not happy with the fan noise, turn the fan speeds down. There's really not much reason to spend effort/noise reducing temperatures that are already within an acceptable margin. Also, you have gone a bit crazy with case fans, which will increase noise without providing much benefit. There's almost never a reason for more than two case fans: an intake in the lower front of the case and an exhaust in the upper rear.

Alereon

FXAA developer Timothy Lottes said he has no further plans to update FXAA. I think he hit a wall where further innovation with FXAA required specific engine support, like the temporal antialiasing mode in FXAA4. If you're building a new antialiasing mode that requires games to be developed to support it, you might as well develop it in hardware too. He points out that while FXAA has quality around 4X MSAA, TXAA performs like 4X MSAA but looks like 4X SSAA.

Alereon

Yes, the new Flash versions added a Protected Mode to reduce the risk of unpatched exploits, but it seems to have murdered both performance and reliability. I'd make sure you have the Catalyst 12.7 Beta drivers installed, the very latest version of Flash, and the very latest version of Firefox. If you still experience issues, there are instructions in the Firefox thread for how to disable Flash's Protected Mode.

Alereon

Dead Man Posting posted:

So I think I'll try here as well with my problem. A few months ago I was playing all my games perfectly, usually at max settings (i.e. Skyrim, which probably isn't really resource intensive), with an NVIDIA 285M graphics card on the 296.10 driver. After each consecutive update, I've noticed a decrease in performance and increased microstutter in Skyrim and FPS loss in other games, in addition to my card running hotter and hotter. Now on the current beta drivers (304.79), it's just outright stuttering in that game, and all my other games, old and new, run my card very hot to the touch despite not doing this before.

Turning off anti-aliasing and using just FXAA seems to not make the card run that hot and makes the stuttering less-noticeable. Running games in windowed mode makes them run fine with no problems. I have clean-installed the beta.

My question is: I had been playing Skyrim and other games in the past with no problems, but after each beta update it gets worse and worse. Is this due to NVIDIA adding support for FXAA at all, or adaptive vertical sync? Less optimization for the games? Anyone else experiencing these problems or have an idea? I know there's a thread on both the Official and Steam forums about this, but I doubt GPU people visit those awful forums often.
Clean out all your vents and fans with a canned air duster, dust buildup causing increasing temperatures could explain that. Aside from that, uninstall your drivers, use Driver Sweeper to remove the remnants, then reinstall the latest drivers.

Alereon

tijag posted:

I get a lot of crashes to desktop / BSOD while playing SW:ToR with my GTX 680. I'm pretty sure it's the drivers. My build is brand new, the PSU is solid, I have no problems in other games, and my wife's computer [which is my old i5-750 + HD5850 build] never crashes at all.

*shakes fist at nvidia drivers*
Try underclocking it a bit, and if it's factory overclocked, push it down to stock. Boost can make overclocks pretty iffy to test, so overclocks that are unstable only in specific situations seem more common. Or it could just be bad drivers; try the latest betas. I don't run SW:ToR so I can't tell you for sure.

Alereon

It would be interesting to compare that with the performance of the Arctic Cooling Accelero Xtreme 7970.

Alereon

Yeah I think a GTX 670 would be a good choice, though you may want to consider whether you're really going to see a difference in your experience that will justify the upgrade. Here's a direct comparison between the GTX 470 and GTX 670 from Anandtech.

Alereon

Factory Factory posted:

Heads up, sports fans nerds: the files for Source Filmmaker have leaked the existence of an in-development Source 2 engine.

I'm not sure the jump will be as drastic as the difference between HL1 and HL2... but then again, if the lighting model is competitive with Unreal Engine 4, maybe it will be.
This is interesting when paired with this blog post from Valve about how they've found OpenGL more efficient and faster than Direct3D, even when the OpenGL app is actually a Direct3D app running via OpenGL through a translation layer.

Alereon

shodanjr_gr posted:

That's not what that blog post says.
Can you elaborate on how I misinterpreted it?

Alereon

The Source engine is Direct3D, to run in OpenGL they use a layer that translates the Direct3D calls to OpenGL. What they're saying in the "OpenGL versus Direct3D on Windows 7" section then is that the reduced overhead in OpenGL is so significant that it more than makes up for the overhead of the translation layer. That would seem a pretty significant result, though this may have something to do with Source not being DX10+. Unless of course I'm misunderstanding this in some way, which is possible, but I don't think is the case.
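To make the translation-layer idea concrete, here's a toy sketch (every name here is invented for illustration; Valve's actual layer is native code inside Source, not anything like this): the game issues Direct3D-style calls, and a thin shim remaps each one to the corresponding OpenGL-style call. The shim itself adds overhead, which is what makes Valve's result notable.

```python
class GLBackend:
    """Stand-in for the OpenGL side: just records the calls it receives."""
    def __init__(self):
        self.log = []
    def glClear(self, mask):
        self.log.append(("glClear", mask))
    def glDrawArrays(self, mode, first, count):
        self.log.append(("glDrawArrays", mode, first, count))

class D3DToGL:
    """Translation layer: each D3D-ish entry point forwards to GL calls.

    The extra indirection and argument remapping per call is the layer's
    overhead -- the surprising claim is that OpenGL's lower driver
    overhead more than pays for it.
    """
    def __init__(self, gl):
        self.gl = gl
    def Clear(self, flags):
        self.gl.glClear(flags)
    def DrawPrimitive(self, topology, start, count):
        self.gl.glDrawArrays(topology, start, count)

gl = GLBackend()
device = D3DToGL(gl)
device.Clear("COLOR")
device.DrawPrimitive("TRIANGLES", 0, 3)
print(gl.log)  # [('glClear', 'COLOR'), ('glDrawArrays', 'TRIANGLES', 0, 3)]
```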

Alereon

Factory Factory posted:

I'm not seeing that, though I'm not about to say it's unambiguous. But this sentence:

shodanjr_gr posted:

First of all I don't see anything in that post that states that Valve translates D3D calls to OpenGL.
Sorry, I should have been clearer: I know the part about using a translation layer between Direct3D and OpenGL calls isn't covered on that page, but that is how Valve does it in the Source engine. The Phoronix article here has a bit of info, and this thread on the Steam forums has some technical info.

Edit: Actually, details of a talk from Valve on L4D2 on Linux and the slide deck are now available.

Alereon fucked around with this message at 14:18 on Aug 12, 2012

Alereon

Endymion FRS MK1 posted:

New AMD pricing strategy/deals:
I keep hearing that Sleeping Dogs is amazing, and the Radeon HD 7870 looks like a compelling deal at $249, especially if you were planning on buying the game anyway at $50.

Alereon

The Consultant posted:

This is tempting. Are they dual link dvi, typically?
The $20-$30 ones are not. Are all three too high resolution for single-link?
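Whether a monitor needs dual-link comes down to pixel clock: single-link DVI tops out at a 165 MHz TMDS clock. A rough check (the 1.15 blanking factor is an approximation I'm assuming for reduced-blanking timings; real modes come from the CVT/CVT-RB formulas):

```python
SINGLE_LINK_MAX_MHZ = 165.0  # single-link DVI TMDS pixel-clock ceiling

def fits_single_link(width, height, refresh_hz=60, blanking=1.15):
    """Rough check: does a mode fit under single-link DVI's 165 MHz clock?

    `blanking` approximates the extra horizontal/vertical blanking
    overhead on top of the active pixels (~15% for reduced-blanking
    timings); exact values depend on the actual CVT/CVT-RB mode.
    """
    pixel_clock_mhz = width * height * refresh_hz * blanking / 1e6
    return pixel_clock_mhz <= SINGLE_LINK_MAX_MHZ

print(fits_single_link(1920, 1080))  # True  -- 1080p is fine on single-link
print(fits_single_link(2560, 1440))  # False -- 1440p needs dual-link DVI
```

So three 1080p panels are fine on single-link adapters; it's the 2560-wide panels that need the pricier dual-link ones.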

Alereon

dog nougat posted:

I keep finding myself wanting to upgrade to a 660 or a 670. I've currently got a 570 HD. I only game at 1920 x 1080. But this 570 is just so drat loud, everything still looks fine in games too, I just want more. I think I have a problem.

Let's say I decide to get another card, would I be better off upgrading to a 6 series or would another 570 do me better? My motherboard can support SLI. I have an Antec Earthwatts 650, would that be enough to run 2 570's?
You would need a new power supply (220W per card plus CPU plus everything else plus headroom, 850W as a bare minimum, more if you overclock), and if one 570 is loud to you, two would be insane. I'd just gently caress with fan speeds and such. I don't think even a GTX 680 would provide enough of a performance improvement to be worth an upgrade. If noise is really a problem for you and you can't fix it by loving with fan speeds, an aftermarket cooler may be a good investment.
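That sizing logic can be written down as a back-of-the-envelope calculation (a sketch only: the CPU and "everything else" wattages are rough placeholder figures I'm assuming, not measurements, and overclocking pushes every number up):

```python
def psu_recommendation_w(gpu_tdp_w, gpu_count, cpu_w=150, rest_w=50, headroom=1.3):
    """Back-of-the-envelope PSU sizing: sum worst-case draws, add headroom.

    The ~30% headroom keeps the PSU in its efficient load range and
    covers transient spikes. Result is rounded to a typical retail size.
    """
    load = gpu_tdp_w * gpu_count + cpu_w + rest_w
    return round(load * headroom / 50) * 50

print(psu_recommendation_w(220, 2))  # 850 -- two 220 W GTX 570s want ~850 W minimum
```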

Here are benchmarks from Anandtech comparing a GTX 570 to a GTX 680, and while there would be an obvious improvement, I don't think it justifies the price of the card. Reevaluate when nVidia eventually releases GK100-based cards, especially if you care about compute.

Alereon fucked around with this message at 21:13 on Aug 26, 2012

Alereon

Anandtech has their review of the Geforce GTX 660 up. While it slots in well in nVidia's lineup, it simply can't compete with AMD on value. The GTX 660 delivers performance between the Radeon HD 7850 and 7870, but is priced above the 7870, which is insane. I guess nVidia is hoping people don't know about the game bundles on the Radeon cards? Even if you don't care about the bundled game at all and eBay it for a fraction of the retail price, you're still coming in $10 ahead of the slower GTX 660 (or $30 after rebate).

Alereon

As much as nVidia cards have their advantages, I don't think it's fair to say that Rage never worked right on AMD cards. The launch experience sucked, but cutting-edge games tend to take a driver revision or two before everything catches up (and id didn't spend nearly enough time testing or polishing Rage, which John Carmack admits). As long as you use updated drivers and install the profiles, things work well.


Alereon

MixMasterMalaria posted:

How is the value on the GTX 660? Mercury engine playback support for Premiere CS6 would be nice, and it seems to be quite a bit faster than the 7850, but I'm not seeing it recommended here.
Isn't Mercury only supported on Fermi (GTX 400/500) cards? I thought we didn't get support on Kepler until the GTX 700-series.
