~Coxy
Dec 9, 2003

R.I.P. Inter-OS Sass - b.2000AD d.2003AD

Agreed posted:

Or, let UNIGINE Heaven 3.0 go on its merry loop and you'll find out real quick what kind of stability your overclock has. Heaven is my favorite for a few reasons: they've consistently sorta one-upped Futuremark for relevance despite it being a freeware product; there's no manual dicking around required, since it auto-loops through a scene where every hard camera change is testing/showing off something new and DX11, so feel free to just let it run for a while, and if you come back to a driver crash you know to reduce clocks; and you can monitor it to see if you've got shader artifacts, geometry issues, etc. to help narrow down what isn't working right.

Heaven is also the first benchmark for OS X since the Radeon 9600 days! :woz:
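(If you want a paper trail while Heaven loops overnight instead of babysitting it: here's a rough Python sketch that logs temperature, core clock and load every few seconds. It assumes nvidia-smi is installed and on your PATH and that your driver exposes these query fields; purely illustrative, not a polished tool.)

code:
# Log GPU temperature, graphics clock and utilization every 5 seconds
# while a benchmark loops. Assumes nvidia-smi is on PATH; the available
# query fields depend on the driver version. Stop with Ctrl+C.
import subprocess
import time

QUERY = "temperature.gpu,clocks.current.graphics,utilization.gpu"

def sample():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=" + QUERY,
        "--format=csv,noheader,nounits",
    ])
    return out.decode().strip()  # e.g. "61, 1000, 99"

with open("gpu_log.csv", "a") as log:
    while True:
        log.write(time.strftime("%H:%M:%S") + ", " + sample() + "\n")
        log.flush()
        time.sleep(5)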

MeruFM
Jul 27, 2010
How much quieter are the 2-fan/3-fan custom-board versions of the GTX 670 compared to the reference ones?

Been waiting 2 weeks for one of the custom ones. They seem to be in really short supply.

I just don't want another jet-engine cooler like my GTX 470, or the Radeon 4890 before that.

mayodreams
Jul 4, 2003


Hello darkness,
my old friend
New WHQL Nvidia drivers dropped today.

http://www.geforce.com/drivers/results/44967

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down


Speak of the devil... This is for Fermi, not Kepler, an architecture that's getting mildly long in the tooth at this point and yet:

"GTX 570/580:
...
Up to 21% in The Elder Scrolls V: Skyrim
..."


So adding that to the previous WHQL drivers' improvement, that's, what, something like 70% cumulative performance improvement compared to the drivers that were out when Skyrim launched? Something was going on between the engine and the architecture of the card that was costing somewhere around half of its possible performance, and they've only now got that ironed out, what, half a year later? Neither company can say a word about driver superiority; both are just doing the best they can to keep up with the engine antics that accompany new AAA game launches. Drivers are one part features, hundreds of parts individual per-game hacks to improve compatibility.

Edit: I checked, and yeah, drivers 295.73 WHQL boasted a "Game-changing performance boost of up to 45% in The Elder Scrolls V: Skyrim, 'the fastest selling title in Steam's history'". Welp!
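(For the arithmetic: compounding the two claimed gains gives 1.21 × 1.45 ≈ 1.75, i.e. roughly 75% over the Skyrim-launch drivers, while simply adding them gives 66%; hence the ballpark above.)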

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I've been using beta/dev drivers for a while, but if this is the first WHQL version with adaptive vsync I highly recommend it. I get more tearing than I did with regular ol' vsync, but the performance overall is much better, and on games my GPU vastly overpowers it doesn't waste as much juice.

ToG
Feb 17, 2007
Rory Gallagher Wannabe
I've just bought a replacement motherboard for my old desktop PC and need a new GPU. What would be a good match for an Intel Q6600, 4GB RAM (800MHz) machine? I had an 8800GT back in the day but I sold it. Is that a decent GPU for this level of stuff?

My budget is <£50 (70-80USD) and I've no qualms buying used. Just looking for an indication of where to look (i.e. ATI/Nvidia and possibly a range to look at).

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
System building/parts picking thread is ^^^^ thataway.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

ToG posted:

I've just bought a replacement motherboard for my old desktop PC and need a new GPU. What would be a good match for an Intel Q6600, 4GB RAM (800MHz) machine? I had an 8800GT back in the day but I sold it. Is that a decent GPU for this level of stuff?

My budget is <£50 (70-80USD) and I've no qualms buying used. Just looking for an indication of where to look (i.e. ATI/Nvidia and possibly a range to look at).

This is probably the most GPU you'll get for that money:
http://www.ebuyer.com/280038-powercolor-hd-6670-1gb-ddr3-dvi-vga-hdmi-pci-e-graphics-card-ax6670-1gbk3-h

Yes, there's an MSI for a couple of quid less, but this just pushes you over the £50 needed for free 5 day delivery.

Had a look on ebay for you, and I didn't see anything that stood out in that price range.

HalloKitty fucked around with this message at 23:04 on May 22, 2012

ToG
Feb 17, 2007
Rory Gallagher Wannabe
Yeah, there are some decent ones on ebay but they're ALL auctions. What's the equivalent Nvidia card to that thing?

ToG fucked around with this message at 23:24 on May 22, 2012

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
There is none; low-end Nvidia cards are terrible values. And this still isn't the parts-picking thread.

ToG
Feb 17, 2007
Rory Gallagher Wannabe

HalloKitty posted:

This is probably the most GPU you'll get for that money:
http://www.ebuyer.com/280038-powercolor-hd-6670-1gb-ddr3-dvi-vga-hdmi-pci-e-graphics-card-ax6670-1gbk3-h

Yes, there's an MSI for a couple of quid less, but this just pushes you over the £50 needed for free 5 day delivery.

Had a look on ebay for you, and I didn't see anything that stood out in that price range.

I found this, which gave me the sort of anchor I was looking for. That 6670 looks like the one to get; I might spend a few more quid on a better make, though.

Cheers for the help.

edit: Looks like ebuyer want £10 for delivery. I'll look elsewhere.

ToG fucked around with this message at 00:26 on May 23, 2012

Spagett
Feb 24, 2012

MeruFM posted:

How much quieter are the 2-fan/3-fan custom-board versions of the GTX 670 compared to the reference ones?

Been waiting 2 weeks for one of the custom ones. They seem to be in really short supply.

I just don't want another jet-engine cooler like my GTX 470, or the Radeon 4890 before that.

My GTX 670 is completely silent; some people complain of a motor noise, but I don't hear anything. Now, that may just be my extremely loud PC drowning it out, but the reference model seems very quiet to me (I'm using an EVGA one).

Lovable Luciferian
Jul 10, 2007

Flashing my onyx masonic ring at 5 cent wing n trivia night at Dinglers Sports Bar - Ozma

brettlaf posted:

My GTX 670 is completely silent; some people complain of a motor noise, but I don't hear anything. Now, that may just be my extremely loud PC drowning it out, but the reference model seems very quiet to me (I'm using an EVGA one).

I'm not a computer wizard (not anymore, anyway), but wouldn't that all depend on the manufacturer and whether they put a noisy fan on it?

EvilCoolAidMan
Jun 26, 2008
Quick question. Can I run BF3 at ultra on a 27" monitor with a GTX670, or do I need to step up to a GTX680?

Animal
Apr 8, 2003

EvilCoolAidMan posted:

Quick question. Can I run BF3 at ultra on a 27" monitor with a GTX670, or do I need to step up to a GTX680?

You did not list a resolution, but yes, the 670 will run it fine on any 27" regardless of resolution.

coffeetable
Feb 5, 2006

TELL ME AGAIN HOW GREAT BRITAIN WOULD BE IF IT WAS RULED BY THE MERCILESS JACKBOOT OF PRINCE CHARLES

YES I DO TALK TO PLANTS ACTUALLY

EvilCoolAidMan posted:

Quick question. Can I run BF3 at ultra on a 27" monitor with a GTX670, or do I need to step up to a GTX680?

Hardly any difference between them, especially with the better 670 cards.

EvilCoolAidMan
Jun 26, 2008

Animal posted:

You did not list a resolution, but yes, the 670 will run it fine on any 27" regardless of resolution.

Sorry it's 2560x1440, but thanks for the info!

movax
Aug 30, 2008

Yeah, GTX670 will own balls at that resolution, should be really really pretty.

Also, RizieN kindly made some graphics for this OP, just like the monitor megathread's; they look loving awesome. :woop:

Star War Sex Parrot
Oct 2, 2003

Hey these fixed my GTX 680's DisplayPort audio issue. Awesome!

Kramjacks
Jul 5, 2007

Can anyone recommend a driver sweeper? A bunch come up when I google the term, but I don't want to just install a random one without knowing whether it's good or not.

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

ToG posted:

I found this, which gave me the sort of anchor I was looking for. That 6670 looks like the one to get; I might spend a few more quid on a better make, though.

Cheers for the help.

edit: Looks like ebuyer want £10 for delivery. I'll look elsewhere.

It's free if you select 5 day delivery, as long as the order is over 50 quid. I did say that...

HalloKitty fucked around with this message at 08:04 on May 23, 2012

td4guy
Jun 13, 2005

I always hated that guy.

Kramjacks posted:

Can anyone recommend a driver sweeper? A bunch come up when I google the term, but I don't want to just install a random one without knowing whether it's good or not.
Phyxion's Driver Sweeper is the most popular and awesomest one.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

I've been using beta/dev drivers for a while, but if this is the first WHQL version with adaptive vsync I highly recommend it. I get more tearing than I did with regular ol' vsync but the performance overall is much better and on games where my GPU is vastly overpowering it doesn't waste as much juice.

I can't be arsed to read the loving manual because I am almost literally a babby, could you share some experience-based insight as to which adaptive vsync I ought to go with since we're using the same usually-drastically-overpowered card? I'm using a 60hz Samsung 1080p monitor, appears to be roughly, ah... 24" or so. Regular adaptive, or Half?

Framerate limits have been available for some time through nVidiaInspector, but I am interested in the Framerate Target feature... Not quite sure what that's going to do, though.

I should probably rtfm after all. But any insight you can offer as someone who has been using beta drivers with the same card would be very much appreciated.

Edit: Also, if I get a 670/680 ($100 for a 10% improvement? ... we'll see) I'll sell you my GTX 580 at around half price, promise. Put that Cadillac of power supplies to use. :v: But I'm a little concerned at Kepler's apparently rather mediocre DX9 performance thus far... Still enough games using DX9 (e.g. The Witcher 2) that I really don't want to spend a bunch of money to downgrade my performance there. Anyone know if more has come of that since initial reports of the peculiar behavior?
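(For what a plain framerate limit is doing under the hood, it's basically just "sleep away the rest of each frame's time budget." A toy Python sketch with an assumed 60 fps cap; the driver's Framerate Target presumably does something smarter around clocks and boost, this is only the basic idea.)

code:
# Toy frame limiter: cap a render loop at a target frame rate by sleeping
# out whatever remains of each frame's time budget. Illustrative only.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~16.7 ms per frame

def render_frame():
    time.sleep(0.005)  # stand-in for ~5 ms of real rendering work

for _ in range(120):   # about two seconds' worth of frames
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)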

Agreed fucked around with this message at 11:02 on May 23, 2012

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
I overclocked my HD7850 to 1GHz and ran the Heaven Benchmark 3.0 on it for about an hour without issues. The GPU only got to 61C under load at 1GHz too. :stare::fh:

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Agreed posted:

I can't be arsed to read the loving manual because I am almost literally a babby, could you share some experience-based insight as to which adaptive vsync I ought to go with since we're using the same usually-drastically-overpowered card? I'm using a 60hz Samsung 1080p monitor, appears to be roughly, ah... 24" or so. Regular adaptive, or Half?

Framerate limits have been available for some time through nVidiaInspector, but I am interested in the Framerate Target feature... Not quite sure what that's going to do, though.

I should probably rtfm after all. But any insight you can offer as someone who has been using beta drivers with the same card would be very much appreciated.

Edit: Also, if I get a 670/680 ($100 for a 10% improvement? ... we'll see) I'll sell you my GTX 580 at around half price, promise. Put that Cadillac of power supplies to use. :v: But I'm a little concerned at Kepler's apparently rather mediocre DX9 performance thus far... Still enough games using DX9 (e.g. The Witcher 2) that I really don't want to spend a bunch of money to downgrade my performance there. Anyone know if more has come of that since initial reports of the peculiar behavior?

Regular; half limits it to 30fps, and I don't really know why that option is in there.

You might not like it; I still get tearing occasionally (I guess when frames drastically shoot up and it has trouble compensating? Not sure).

Animal
Apr 8, 2003

To me it's well worth it; the tearing is minimal compared to the gain in fps stability.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Dogen posted:

Regular; half limits it to 30fps, and I don't really know why that option is in there.

You might not like it; I still get tearing occasionally (I guess when frames drastically shoot up and it has trouble compensating? Not sure).

I can put up with some tearing if it means I get more consistent FPS. High-end cards are in a weird spot when it comes to that "well, it'd do 50+ fps but it won't lock in at 60" situation at high graphics settings, especially with triple buffering.

I am very interested in seeing FXAA just plain enable-able across the board, as FXinjector (the most configurable one, anyway) is limited to DX9 games.

I'll be testing how FXAA works on games I've already enabled FXAA on via the injector :v:

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
Yeah, that's the beauty of it: you don't get *BOOM* down to 30 when frames drop below 60 anymore.

I have been enjoying playing with FXAA on titles that don't support it already.
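(For anyone wondering what the driver is actually deciding each frame: conceptually, adaptive vsync syncs when you can hold the refresh rate and stops syncing when you can't, instead of halving you down to 30. A hand-wavy Python sketch of that policy; wait_for_vblank and flip_immediately are made-up stand-ins, not NVIDIA's actual code.)

code:
# Conceptual sketch of the adaptive vsync decision, not driver code.
REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ   # ~16.7 ms at 60 Hz

def wait_for_vblank():
    print("vsync on: wait for vblank (no tearing)")

def flip_immediately():
    print("vsync off: flip now (may tear, but no drop to 30 fps)")

def present(frame_time_s):
    if frame_time_s <= FRAME_BUDGET:
        wait_for_vblank()      # fast enough to hold 60: sync normally
    else:
        flip_immediately()     # too slow: tear a little instead of halving

present(0.012)   # a 12 ms frame holds 60 fps, so it syncs
present(0.022)   # a 22 ms frame can't, so it doesn't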

Star War Sex Parrot
Oct 2, 2003

Dogen posted:

Regular; half limits it to 30fps, and I don't really know why that option is in there.
They justified it as "some really old games don't expect to go above 30, so use this."

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
A few modern games, too, like LA Noire. The facial animation system is a 30FPS video normal map, basically, so the engine locks in sync with that.

Also, this is neither here nor there, but every time I type "LA Noire" I very nearly typo "LA Norse," and I imagine the most wonderful Skyrim mashup.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Factory Factory posted:

A few modern games, too, like LA Noire. The facial animation system is a 30FPS video normal map, basically, so the engine locks in sync with that.

Also, this is neither here nor there, but every time I type "LA Noire" I very nearly typo "LA Norse," and I imagine the most wonderful Skyrim mashup.

I think they came up with some kind of workaround in the patch before last, but don't quote me on that.

movax
Aug 30, 2008

Factory Factory posted:

A few modern games, too, like LA Noire. The facial animation system is a 30FPS video normal map, basically, so the engine locks in sync with that.

Also, this is neither here nor there, but every time I type "LA Noire" I very nearly typo "LA Norse," and I imagine the most wonderful Skyrim mashup.

LA Noire is actually running really well on my GTX 460 @ 2560x1600, shockingly enough. Even without AA, it's a pretty-looking game.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

FXAA on top of FXAA looks... pretty good. A little soft on some things, but a quick CSAA pass takes care of that.

Niiiiiice.

Star War Sex Parrot
Oct 2, 2003

Just for shits and giggles I fired up Half-Life last night with 32x CSAA + FXAA with transparency super sampling. :pcgaming:

mayodreams
Jul 4, 2003


Hello darkness,
my old friend

Star War Sex Parrot posted:

Just for shits and giggles I fired up Half-Life last night with 32x CSAA + FXAA with transparency super sampling. :pcgaming:

Source or the original?

Mayne
Mar 22, 2008

To crooked eyes truth may wear a wry face.
Any idea why adaptive vsync is not working with Diablo 3's fullscreen windowed mode?
It only works in fullscreen mode for some reason, and I'd prefer to use adaptive mode instead of Diablo 3's in-game vsync.

Star War Sex Parrot
Oct 2, 2003

mayodreams posted:

Source or the original?
Original.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Why not go for full SSAA then? Render internally at 10x resolution and then downscale. :getin:
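(That really is all supersampling is: render at some multiple of the target resolution, then filter back down. A toy illustration with Pillow, assuming you have an oversized render saved as "render_4x.png"; the real resolve obviously happens on the GPU, this is just the concept.)

code:
# Toy SSAA resolve: downscale an image rendered at 4x the width and height
# back to display resolution. Assumes Pillow is installed and that
# "render_4x.png" exists; the file name and 4x factor are arbitrary.
from PIL import Image

FACTOR = 4
hi = Image.open("render_4x.png")
lo = hi.resize((hi.width // FACTOR, hi.height // FACTOR), Image.LANCZOS)
lo.save("resolved.png")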

Socialism
May 9, 2009
It looks like the Asus GTX 670 is now available. I ordered one from Newegg (http://www.newegg.com/Product/Product.aspx?Item=N82E16814121638) earlier today, although it's currently OOS. I think they restock pretty frequently, so it's worthwhile to keep an eye out.

The reviews for the Asus 670 have been phenomenal, especially for the virtually silent fans, which will be a drastic improvement over my EVGA 580 and its awful blower. I also look forward to plugging my Apple display in directly instead of routing through Lucid Virtu (no DisplayPort on the reference 580 for some reason).

Has anyone tried overclocking the 670? With a good custom cooler it seems the only limitation is voltage and stability rather than noise/heat. It seems insane to me that in some reviews a factory-OC'd 670 is within single-digit percentage of the 590's performance. Also, with the option of custom-cooled 670 SLI for ~$850, I can't imagine the 690 having much of a place anymore.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Socialism posted:

Has anyone tried overclocking the 670? With a good custom cooler it seems the only limitation is voltage and stability rather than noise/heat. It seems insane to me that in some reviews a factory-OC'd 670 is within single-digit percentage of the 590's performance. Also, with the option of custom-cooled 670 SLI for ~$850, I can't imagine the 690 having much of a place anymore.

Info gathered 'round the net suggests that its overclocking capability is pretty much the same as the GTX 680's, and since they perform extraordinarily close to one another in most tasks even with the stock clock discrepancy, the fact that some people get bummed out if they can't get their GTX 670 to at least 1200MHz core is pretty funny.

Here's a wrinkle: from what I can tell, the 7970 has better clock-for-clock performance at most tasks than the GTX 680, around 5% to 7% depending. And you can overclock the poo poo out of it. But I do feel nVidia is a clear features leader right now, regardless of the potential performance discrepancy there, and driver support for nVidia's lineup has resulted in what seems to me to be a more thorough performance improvement over previous generations in most titles. Not all, but most. Plus you get the basically-free FXAA and PhysX acceleration as a bonus. AMD/ATI is working on MLAA, I know, and the latest I saw suggested their image quality is getting more and more impressive (an area where their tech probably has a slight edge over FXAA as development continues, though its quirky and proprietary nature, compared to nVidia's use of FXAA as a general post-processor for any game, still hinders it a bit).

What's really vexing me is that the GTX 580 is still holding on. It's virtually clock-for-clock even with the 7970, which means a heavily overclocked GTX 580 performs kinda similarly to a (stupidly) non-overclocked 680. While it is fun to be able to say "wow, an overclocked GTX 670/680 sometimes outperforms a GTX 590 and has smoother gameplay!!", you can very nearly say the same thing about a heavily overclocked GTX 580, just because GF110 was never the right chip to slap together for a two-GPU card in the first place and performs like poo poo :v:

Basically, if you've got a GTX 580 right now, your best bet is to just accept it for what it is, get the highest clock you can, and wait for the next cards to land and push the performance envelope farther. Assuming rough overclocking-percentage parity (a pretty safe bet, and as mentioned the 580 gets at least a 5% per-clock head start), you're stuck at around 25-35% better performance in most games, up to 40% in some, with a 680... and pretty much the same story, just with virtually identical clock-for-clock per-frame rendering performance, with a 7970.

Even overclocking the poo poo out of them, that's not going to make the difference between "can't turn everything up" and "CAN turn everything up!" in the most demanding titles that nutjobs like me enjoy punishing our graphics cards trying to get looking good.
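(To put toy numbers on the percentage-parity point, with illustrative overclocks only: a reference GTX 580 pushed from 772MHz to around 900MHz is a ~17% overclock, and a GTX 680 pushed from its 1006MHz base to around 1180MHz is also ~17%; if both scale with clock about equally, the stock-versus-stock gap of 25-35% simply carries over, less the 580's ~5% per-clock head start.)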

Edit: The reason that bums me out instead of making me go :unsmith: is that I really want to turn stuff up all the way and I've got cash in hand to get a card, but the performance just isn't there yet except for the 690, and there's no way in hell I'm paying a grand for a god damned graphics card :suicide:

Agreed fucked around with this message at 05:47 on May 26, 2012
