Fish Cake
Jun 13, 2008

woof

real_scud posted:

Maybe? I bought some DP cables from Monoprice so I don't think they're really out of spec. Do you happen to have a link to where you bought your cables?

I can't imagine Monoprice would sell bad cables, so it might just be a monitor quirk; a lot of other people in the thread seem to have the same issue, so it could be something strange with my setup, but I can't complain :v:

FWIW I own the cheapest cable I could find on Amazon (http://amzn.com/B001MIB0SU) and it connects a 5750 to a Dell U2312.


incoherent
Apr 24, 2004

01010100011010000111001
00110100101101100011011
000110010101110010
I believe I've found a terrible use for my GTX 670. I'm watching the rover land on Mars with this app:

http://eyes.nasa.gov/

I've locked the framerate at 30 fps and am running 16xQ CSAA. Java doesn't hit the graphics card that hard, but it's going to look sweet. I love the adaptive frame rate capabilities (I tried it on my older 260, and it didn't look nearly as smooth).


incoherent fucked around with this message at 02:59 on Aug 6, 2012

future ghost
Dec 5, 2005

:byetankie:
Gun Saliva

incoherent posted:

I believe I've found a terrible use for my GTX 670. I'm watching the rover land on Mars with this app:

http://eyes.nasa.gov/

I've locked the framerate at 30 fps and am running 16xQ CSAA. Java doesn't hit the graphics card that hard, but it's going to look sweet. I love the adaptive frame rate capabilities (I tried it on my older 260, and it didn't look nearly as smooth).



Whoa, thanks for the link.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Heads up, sports fans nerds: the files for Source Filmmaker have leaked the existence of an in-development Source 2 engine.

I'm not sure the jump will be as drastic as the difference between HL1 and HL2... but then again, if the lighting model is competitive with Unreal Engine 4, maybe it will be.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Surely HL2:Ep3 will be the launch title for the engine!... :sigh:

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

Heads up, sports fans nerds: the files for Source Filmmaker have leaked the existence of an in-development Source 2 engine.

I'm not sure the jump will be as drastic as the difference between HL1 and HL2... but then again, if the lighting model is competitive with Unreal Engine 4, maybe it will be.
This is interesting when paired with this blog post from Valve about how they've found OpenGL more efficient and faster than Direct3D, even when the OpenGL version is actually a Direct3D app running through a Direct3D-to-OpenGL translation layer.

cliffy
Apr 12, 2002

Does anyone think my current machine:

Q6600 @ 2.4GHz
4GB DDR2 RAM @ 800MHz
Running at 2560x1440

Would hold back a GTX 670, or would it be able to utilize the card efficiently?

If it can't, why not? How would I go about diagnosing the bottleneck myself? I want to be absolutely sure before I go spending money to upgrade my system.

Anti-Hero
Feb 26, 2004

cliffy posted:

Does anyone think my current machine:

Q6600 @ 2.4GHz
4GB DDR2 RAM @ 800MHz
Running at 2560x1440

Would hold back a GTX 670, or would it be able to utilize the card efficiently?

If it can't, why not? How would I go about diagnosing the bottleneck myself? I want to be absolutely sure before I go spending money to upgrade my system.

I'm not speaking from experience here, but I think you'll need to OC that Q6600 to avoid a bottleneck. I don't own one personally, but I've seen a smattering of threads on other hardware forums asking a similar question, and my recollection is that a mild overclock to at least 3.0 GHz is needed.

craig588
Nov 19, 2005

by Nyc_Tattoo
At stock speeds a Q6600 with a 670 is probably a waste. If you can't overclock, get something cheaper; if overclocking's on the table, it shouldn't be much of a restriction at 3.2+ GHz.

There's also the other side of it: if you plan to keep the card for 5 years or whatever, then go nuts; it'll make your next CPU upgrade all the better. It's not like having too powerful a card will slow anything down if you do keep the CPU at stock speeds, it'll just be a bit of a waste of money right now.

craig588 fucked around with this message at 21:04 on Aug 7, 2012

rainy day
Jul 20, 2009

by Ralp
I just upgraded my computer monitors so I'm finally at 1920x1200 and my old GTX 460 768MB is really starting to struggle. Should I spring for a 670?

My other specs are:
i5 2500k @ 3.3GHz
8GB DDR3 RAM

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That's a question for the parts picking sticky thread.

shodanjr_gr
Nov 20, 2007

Alereon posted:

This is interesting when paired with this blog post from Valve about how they've found OpenGL more efficient and faster than Direct3D, even when the OpenGL version is actually a Direct3D app running through a Direct3D-to-OpenGL translation layer.

That's not what that blog post says.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

shodanjr_gr posted:

That's not what that blog post says.
Can you elaborate on how I misinterpreted it?

Wozbo
Jul 5, 2010
It looks like OpenGL and Direct3D are running side by side, not being translated.

quote:

The second category would include reducing overhead in calling OpenGL, and extending our renderer with new interfaces for better encapsulation of OpenGL and Direct3D.

Or am I reading that wrong? If it is then hot drat that's some sick performance.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice
The Source engine is Direct3D; to run in OpenGL, it uses a layer that translates the Direct3D calls to OpenGL. What they're saying in the "OpenGL versus Direct3D on Windows 7" section, then, is that the reduced overhead of OpenGL is so significant that it more than makes up for the overhead of the translation layer. That would seem a pretty significant result, though it may have something to do with Source not being DX10+. Unless of course I'm misunderstanding this in some way, which is possible, but I don't think that's the case.
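
To make that architecture concrete, here's a purely illustrative C++ sketch of the translation-layer idea - my own toy example with made-up wrapper names, not Valve's actual code. The engine keeps issuing Direct3D-shaped calls, and a thin wrapper forwards each one to the equivalent OpenGL calls:

code:

// Toy sketch of a Direct3D-to-OpenGL translation layer (illustration only,
// not Valve's implementation). The game calls D3D-shaped methods; each one
// is forwarded to the equivalent OpenGL calls.
#include <GL/gl.h>

// A D3D9-ish primitive enum, defined locally just for this example.
enum class D3DStylePrimitive { TriangleList, TriangleStrip };

class D3DToGLDevice {
public:
    // Mirrors a Direct3D-style Clear() call.
    void Clear(float r, float g, float b, float a) {
        glClearColor(r, g, b, a);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    }

    // Mirrors a Direct3D-style DrawPrimitive(type, startVertex, primCount) call.
    void DrawPrimitive(D3DStylePrimitive type, int startVertex, int primitiveCount) {
        switch (type) {
        case D3DStylePrimitive::TriangleList:
            // A list of N triangles uses 3*N vertices.
            glDrawArrays(GL_TRIANGLES, startVertex, primitiveCount * 3);
            break;
        case D3DStylePrimitive::TriangleStrip:
            // A strip of N triangles uses N + 2 vertices.
            glDrawArrays(GL_TRIANGLE_STRIP, startVertex, primitiveCount + 2);
            break;
        }
    }
};

Every forwarded call adds a little bookkeeping of its own, which is why it's notable that the translated path still came out ahead of native Direct3D in Valve's numbers.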

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
I'm not seeing that, though I'm not about to say it's unambiguous. But this sentence:

quote:

The second category ["Modifying our game to work better with OpenGL"] would include reducing overhead in calling OpenGL, and extending our renderer with new interfaces for better encapsulation of OpenGL and Direct3D.

The end of that sentence reads to me like renderer calls for both interfaces are being treated equally by the engine itself. There are already versions of the engine in OpenGL for Mac OS X and Playstation 3. The PS3 version was someone else's doing, but the Mac OS version is in-house. The early SteamPlay Source games were probably ports, but a long time has passed since then.

It's also frustratingly difficult to figure this out based on factors like performance, because from a gaming perspective, Apple's drivers have tons and tons of issues that limit performance. For example, here's a Valve blog post on one, showing how an occlusion call was hogging up OS X's OpenGL process because there was no way to trade accuracy for speed. Problems like this hold true even for cross-platform games that had OpenGL development for PS3.
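
For anyone curious what "hogging up the process" looks like, here's a minimal sketch of the classic OpenGL occlusion-query pitfall - my own illustration, not the code from the blog post. Demanding the result immediately forces the driver to stall until the GPU catches up, whereas polling for availability lets you keep rendering at the cost of a frame of accuracy:

code:

// Minimal occlusion-query illustration (not Valve's or Apple's code).
// Query functions are core since OpenGL 1.5; on some platforms you'd
// load them through GLEW or a similar loader.
#include <GL/gl.h>

GLuint query;

void issueQueryAndBlock() {
    glGenQueries(1, &query);
    glBeginQuery(GL_SAMPLES_PASSED, query);
    // ... draw the cheap proxy/occluder geometry here ...
    glEndQuery(GL_SAMPLES_PASSED);

    // BAD: asking for the result right away makes the CPU wait for the
    // GPU to finish everything submitted so far.
    GLuint samplesPassed = 0;
    glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samplesPassed);
}

void pollQueryWithoutBlocking() {
    // BETTER: check whether the result is ready and move on if it isn't,
    // trading a little accuracy (a frame of latency) for speed.
    GLuint ready = GL_FALSE;
    glGetQueryObjectuiv(query, GL_QUERY_RESULT_AVAILABLE, &ready);
    if (ready == GL_TRUE) {
        GLuint samplesPassed = 0;
        glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samplesPassed);
        // use samplesPassed to decide whether to draw the full-detail object
    }
}

If a driver only exposes the blocking path, or makes the cheap path expensive, the whole rendering thread ends up waiting on that one call, which is the kind of problem the Valve post describes.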

Here's a late-2010 Macworld investigation on gaming performance for a bit more info. See COD4: heavy OpenGL development for PS3, close performance to Windows on varying hardware (equal on a Radeon 4850). The Snow Leopard Graphics Update apparently raised Portal performance to similar near-parity. So if there is Direct3D translation going on, it's not doing much performance-wise.

shodanjr_gr
Nov 20, 2007
First of all I don't see anything in that post that states that Valve translates D3D calls to OpenGL.

They mention this:

quote:

The second category would include reducing overhead in calling OpenGL, and extending our renderer with new interfaces for better encapsulation of OpenGL and Direct3D.

and from the wording I infer that they have both D3D and OpenGL rendering back ends.

Secondly, they spent time optimizing Source for Linux/OpenGL and they worked with hardware vendors (who they got to fix stuff in their drivers) to optimize their OpenGL stacks. Notice this:

quote:

Now that we know the hardware is capable of more performance, we will go back and figure out how to mitigate this effect under Direct3D.

Valve seems to believe that the same optimizations they applied to OpenGL can be used for D3D.

Sorry if I'm coming across as pedantic but I've read discussions about that blog post in a bunch of places that basically boiled down to "OpenGL > D3D herp derp" and I don't see how people can logically reach that conclusion...

FamDav
Mar 29, 2008
So should I take this August 16th release date for the 660 as reliable? Want to decide if it's worth it to wait.

unpronounceable
Apr 4, 2010

You mean we still have another game to go through?!
Fallen Rib

FamDav posted:

So should I take this August 16th release date for the 660 as reliable? Want to decide if it's worth it to wait.

You don't wait: You buy a GPU, and might miss out on a better one, but you can make use of it a week sooner.
You wait: You deal with whatever you have right now, and make an informed decision in a week or so.

From the benchmarks we've seen, it looks like it'll be a pretty great card. Whether it'll be worth buying or not depends on how much you'll be able to snag it for.

Animal
Apr 8, 2003

unpronounceable posted:

You don't wait: You buy a GPU, and might miss out on a better one, but you can make use of it a week sooner.
You wait: You deal with whatever you have right now, and make an informed decision in a week or so.

From the benchmarks we've seen, it looks like it'll be a pretty great card. Whether it'll be worth buying or not depends on how much you'll be able to snag it for.

Or if you are able to snag it at all, considering past shortages.

Boten Anna
Feb 22, 2010

Animal posted:

Or if you are able to snag it at all, considering past shortages.

I suggest camping the EVGA site on launch morning. This is how I got a 670 on launch day with absolutely no issues, though goon warning: you'll have to wake up before noon. If no specifics are announced ahead of time, just set some parameters for yourself: buy IF it's less than $X, the spending cap is $X so only buy the TURBO EDITION if it's less than $X, and be ready to mash the "add to cart" button as soon as it pops up in the store.

Boten Anna fucked around with this message at 23:19 on Aug 8, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

EVGA is really good to order from even if you can't get a card immediately - in my experience they fulfill committed end-user orders before general retailer stock.

movax
Aug 30, 2008

Welp, I ordered a GTX 670 from Amazon. The GTX 680 was well within budget range, but I just couldn't bring myself to spend the $100 more for the minor performance increase.

GTX 460 to GTX 670, here we go! :downs: Looking forward to 80FPS+ with Skyrim @ 2560x1600.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

You're in for a major treat. That's a hell of an upgrade. poo poo, I'm at 1080p and feeling the 580-->680 upgrade big time, though partly because I lucked out and got a high-overclocking GTX 680 sample, I guess.

When overclocking, remember to find your VRAM OC hard limit first, because at that resolution you're going to want to weight the OC in favor of memory to get the bandwidth the GPU/SMX units need to work well. Try to get to at least 6500MHz (effective) on the GDDR5 before you start adding core clock.

One would think voltage shouldn't have much to do with overclocking stability, since the card will automatically grab the highest possible voltage immediately if it needs it, but I have found that to be 100% false. Raising your voltage is a sort of less-problematic "load line calibration" - the dynamic power adjustment functionality is good, but it's not perfect, and if you hit a segment that calls for max voltage (1.175V on my card) while it's sitting at the default 0.985V, the power scaling might not ramp up fast enough, and that results in instability or a crash.

Ease of overclocking:

1. Drag the power target slider all the way over; it opens the card up to whatever it's capable of as far as the core clock goes.

2. Raise your voltage to the maximum to ensure that your memory and core/shaders are well-fed and don't suffer a crash due to insufficient momentary voltage on transition (otherwise, it will automatically raise the voltage anyway, but you risk instability if it can't go from 1.135V to 1.15V or 1.175V fast enough for the immediate demands on the hardware).

3. Make a custom fan profile to keep it from ever seeing 70ºC because it begins to automatically downclock at that temperature. The fan runs quiet and the cooling system is very well designed, so pegging it a bit ahead of temperature to aggressively cool the card isn't obnoxious. My 680 doesn't get past 65ºC in Metro 2033 at maxed out settings.

4. Start overclocking with the VRAM. Once you've found your max there, for TDP-throttling reasons you probably don't want to run it higher than 6.7GHz even if it'll do a full 7GHz, as some cards' GDDR5 and memory controllers will. I'm running my GTX 680's VRAM at 6.8GHz, but my card has extra TDP headroom and that lets me get right to the point where my core can't go any faster without instability anyway. Balance the memory and the core, and enjoy higher minimum framerates as the main fruit of your labors (average and max go up too, but we care about hitching down to a low framerate a hell of a lot more than about "oh, hooray, it'll run a max of 138FPS in this demanding game," yeah?). There's a rough bandwidth sketch below to show why the memory clock matters so much.
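
As promised, a back-of-the-envelope sketch of what those memory clocks mean in bandwidth terms, assuming the GTX 680's stock 256-bit bus (my own numbers, just for illustration):

code:

// Rough GDDR5 bandwidth arithmetic for a 256-bit bus (GTX 680).
// bandwidth (GB/s) = effective data rate (MT/s) * bus width (bytes) / 1000
#include <cstdio>

int main() {
    const double busWidthBytes = 256.0 / 8.0;   // 256-bit bus = 32 bytes per transfer
    const double ratesMTs[] = {
        6008.0,   // stock GTX 680 memory clock (effective)
        6500.0,   // the suggested starting target above
        6800.0,   // the 24/7 overclock mentioned above
    };
    for (double rate : ratesMTs) {
        std::printf("%6.0f MT/s -> %6.1f GB/s\n", rate, rate * busWidthBytes / 1000.0);
    }
    return 0;
}
// Prints roughly: 6008 -> 192.3, 6500 -> 208.0, 6800 -> 217.6 GB/s

So the jump from stock to 6.8GHz is about 13% more memory bandwidth, which is exactly the kind of headroom a 2560x1600 framebuffer eats up.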

Have fun! :allears:

Agreed fucked around with this message at 09:49 on Aug 11, 2012

VulgarandStupid
Aug 5, 2003
I AM, AND ALWAYS WILL BE, UNFUCKABLE AND A TOTAL DISAPPOINTMENT TO EVERYONE. DAE WANNA CUM PLAY WITH ME!?




Saw some Nvidia-branded 660 Tis at work this morning. $300 retail, but we can't sell them until the 18th.

doomisland
Oct 5, 2004

VulgarandStupid posted:

Saw some Nvidia-branded 660 Tis at work this morning. $300 retail, but we can't sell them until the 18th.

I'm so in there. Boom!

FamDav
Mar 29, 2008

VulgarandStupid posted:

Saw some Nvidia-branded 660 Tis at work this morning. $300 retail, but we can't sell them until the 18th.

Speaking of which, http://wccftech.com/nvidia-geforce-gtx-660ti-spotted-bestbuy-officially-priced-299/. Kind of hoping that this is the Best Buy markup rather than the actual price, though I don't know how BB prices graphics cards.

FamDav fucked around with this message at 18:59 on Aug 11, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

It makes sense; they could almost certainly afford to sell them at $250 and still make a tidy profit. Much like the 560 Ti over its lifetime - which also started at or slightly above $300 - as ATI refreshes their lineup, nVidia has to put up more of a fight, and the price:performance curve shifts, they can walk the 660 Ti price down without any problems. Right now they have no reason to do so - it slots in perfectly price:performance-wise and pressures AMD just fine as-is, since it's hitting their $300 price point with the performance of their $375-$400 cards. And remember, this whole damned generation of nVidia's "real" consumer graphics card lineup is one freaking chip.

Toward the end of Fermi we saw the 560Ti-448, which was just a way to keep the "Ti" branding strong while offloading third-tier Fermi GF110 chips. It could just as easily have been called the GTX 570-LE or whatever, but they've now got a strong association between the "Ti" branding and price:performance dominance. Pushing that further - skipping the idea of spinning a new chip for the x60 Ti slot and just doing the 560Ti-448 "thing" from the start - is a pretty good move on their part.

Alereon
Feb 6, 2004

Dehumanize yourself and face to Trumpshed
College Slice

Factory Factory posted:

I'm not seeing that, though I'm not about to say it's unambiguous. But this sentence:

shodanjr_gr posted:

First of all I don't see anything in that post that states that Valve translates D3D calls to OpenGL.
Sorry, I should have been clearer: I know the part about using a translation layer between Direct3D and OpenGL calls isn't covered on that page, but that is how Valve does it in the Source engine. The Phoronix article here has a bit of info, and this thread on the Steam forums has some technical details.

Edit: Actually, details of a talk from Valve on L4D2 on Linux and the slide deck are now available.

Alereon fucked around with this message at 15:18 on Aug 12, 2012

Opus125
Jul 29, 2011

by Y Kant Ozma Post
What is the performance boost of the 660ti over its predecessor?

craig588
Nov 19, 2005

by Nyc_Tattoo
Here are 560 ti benchmarks: http://www.anandtech.com/bench/Product/547
Here are 660 ti benchmarks: http://www.tweaktown.com/reviews/4869/nvidia_geforce_gtx_660_ti_2gb_reference_video_card_review/index.html

Better benchmarks for the 660 Ti will be out once the NDA expires. As it looks right now, it blows the 560 Ti away; it's generally on par with, and sometimes even beats, the GTX 580.

Rigged Death Trap
Feb 13, 2012

BEEP BEEP BEEP BEEP

Opus125 posted:

What is the performance boost of the 660ti over its predecessor?

Around 20-40%, depending on the game.
It still creams current games at 1920x1200/1080, with most running well at 50+ FPS (4x AA at the highest detail).

Basically it's a ~10-20% slower 670 at about 75% of the price, but the 670 handles the more stressful loads better.

Rigged Death Trap fucked around with this message at 10:24 on Aug 13, 2012

Whale Cancer
Jun 25, 2004

I'm looking to invest about $300 into a card for gaming. I'll be running an i5 3570 chip. I'm currently set on the 560 ti 448. Is there a better option at my price point? I don't plan on overclocking as I'm a bit nervous to do that.

This is the card I'm looking at.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130738

Grim Up North
Dec 12, 2011

Whale Cancer posted:

I'm looking to invest about $300 into a card for gaming. I'll be running an i5 3570 chip. I'm currently set on the 560 ti 448. Is there a better option at my price point?

Wait for the GTX 660 Ti to come out in three days. It will initially cost $300 and be quite a bit faster. (Read the posts above yours.)

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Whale Cancer posted:

I'm looking to invest about $300 into a card for gaming. I'll be running an i5 3570 chip. I'm currently set on the 560 ti 448. Is there a better option at my price point? I don't plan on overclocking as I'm a bit nervous to do that.

This is the card I'm looking at.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130738

Definitely don't bother pairing a previous-generation card with such a nice new CPU. AMD offers the best cards in this price bracket at this exact point in time, but the 660 Ti is out soon and should be the card you're looking for.

HalloKitty fucked around with this message at 21:13 on Aug 13, 2012

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Whale Cancer posted:

I'm looking to invest about $300 into a card for gaming. I'll be running an i5 3570 chip. I'm currently set on the 560 ti 448. Is there a better option at my price point? I don't plan on overclocking as I'm a bit nervous to do that.

This is the card I'm looking at.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130738

Check the last page or so; we've gone from "holy smokes, the 660Ti is in fact a 670 with a smaller memory bus and that's IT, I wonder how they'll price it?" to "they're pricing it at $300, which is a fair and expected price point and gives them room to walk the price down as the slightly-askew generations proceed, sort of like the 560Ti did, but with a build choice much more like the 560Ti-448."

Basically, your question is all we've been talking about when it comes to actual graphics card stuff, since it's the news right now. It would be a good idea to wait a little bit for a 660Ti: it costs the same as the card you're looking at but outperforms the GTX 580 according to current reports (and according to its specs it should, especially at 1920x1200 or 1920x1080 - above that, the lower memory bandwidth starts to sorta kneecap the card a bit and you might benefit from going with a 7850 instead and overclocking the poo poo out of it).
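
To put a rough number on that "kneecap" - my own arithmetic, using the 660Ti's 192-bit bus against the 670's 256-bit bus at the same GDDR5 speed:

code:

// Why the 660 Ti's narrower memory bus matters above 1080p:
// same GDDR5 data rate, 192-bit vs 256-bit bus.
#include <cstdio>

int main() {
    const double dataRateMTs = 6008.0;                               // stock effective GDDR5 rate
    const double gtx670GBs   = dataRateMTs * (256.0 / 8.0) / 1000.0; // ~192.3 GB/s
    const double gtx660TiGBs = dataRateMTs * (192.0 / 8.0) / 1000.0; // ~144.2 GB/s
    std::printf("GTX 670:    %.1f GB/s\n", gtx670GBs);
    std::printf("GTX 660 Ti: %.1f GB/s (%.0f%% of the 670)\n",
                gtx660TiGBs, 100.0 * gtx660TiGBs / gtx670GBs);
    return 0;
}

At 1920x1080 the core is usually the limit, so the missing quarter of the bandwidth barely shows; push the resolution up and it starts turning into real framerate.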

What is your resolution, by the way? Resolution and CPU are the questions people need to know to assess your best choice.

Whale Cancer
Jun 25, 2004

I'm still monitor shopping. My current monitor is 1920x1080, though I won't go above 1920x1200.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Whale Cancer posted:

I'm still monitor shopping. My current monitor is 1920x1080, though I won't go above 1920x1200.

Shoo-in for a 660Ti. Be patient (and vigilant - I guarantee there'll be a run on them immediately). EVGA is good about filling customer orders in a timely fashion; don't pay marked-up prices for no good reason, just be willing to wait a few days for the card to be ready to ship!

Anti-Hero
Feb 26, 2004
Are any of the Kepler offerings, when SLI'd, tempting enough to justify an upgrade from my 580 SLI rig? I game at 2560x1600 on a Nehalem i5.


Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Well, you'd cut your system's power draw in half, but the performance increase from, say, a pair of GeForce GTX 670s would be measurable in some games and still not worth $800.
