HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast

Agreed posted:

Dudes... what is fuckin dumb is that GENS doesn't have a maintain aspect ratio option for above 2X scaling, because peak gaming HAPPENED with Shadowrun for the Genesis.

Have you tried KEGA Fusion?

HalloKitty fucked around with this message at 14:45 on Jun 11, 2013

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down


Yeah, shortly after typing, uh, that message back there, I went on a totally competent search for other emulators and luckily landed on Kega Fusion pretty quickly. It owns bones, getting my Shadowrun on. Forgot how tedious running is, but abbacab still works.

It would be an unintentional laugh riot if, just as we were debating the merits of AMD vs. nVidia drivers, nVidia went totally off the reservation and put out their first killer drivers since 2007-ish. I mean, for those of us in this thread, anyway; the rest of the world keeps on spinning :v:

metasynthetic
Dec 2, 2005

in one moment, Earth

in the next, Heaven

Megamarm
EVGA has the 4GB 770 Classifieds listed on their store page, but no stock yet. Hopefully soon.

http://www.evga.com/Products/ProductList.aspx?type=0&family=GeForce+700+Series+Family&chipset=GTX+770

Jmcrofts
Jan 7, 2008

just chillin' in the club
Lipstick Apathy
Where's the 760 Ti, NVidia? :argh:

metasynthetic
Dec 2, 2005

in one moment, Earth

in the next, Heaven

Megamarm

metasynthetic posted:

EVGA has the 4GB 770 Classifieds listed on their store page, but no stock yet. Hopefully soon.

http://www.evga.com/Products/ProductList.aspx?type=0&family=GeForce+700+Series+Family&chipset=GTX+770

Update: in stock now, just placed my order.

noxiousg
May 24, 2013

Is there anyone here with a GTX 780 that has figured out how to use the 'Shadowplay' feature that is supposed to let you record gameplay footage? I've tried searching Google with terms like 'shadowplay' and 'gtx 780 record gameplay', but all these searches return is a bunch of news articles about the feature. I can't find anywhere that actually tells you how to use it (or where to download it, if it isn't included in the GeForce Experience app).

WHERE MY HAT IS AT
Jan 7, 2011

noxiousg posted:

Is there anyone here with a GTX 780 that has figured out how to use the 'Shadowplay' feature that is supposed to let you record gameplay footage? I've tried searching Google with terms like 'shadowplay' and 'gtx 780 record gameplay', but all these searches return is a bunch of news articles about the feature. I can't find anywhere that actually tells you how to use it (or where to download it, if it isn't included in the GeForce Experience app).

http://www.geforce.com/whats-new/articles/geforce-experience-official-release

Shadowplay hasn't been released yet; it'll be coming in an update.

noxiousg
May 24, 2013

Well, that explains why I couldn't figure out how to use it. Thanks!

Shadowhand00
Jan 23, 2006

Golden Bear is ever watching; day by day he prowls, and when he hears the tread of lowly Stanfurd red, from his Lair he fiercely growls.
Toilet Rascal
Since I had some extra budget for play, I ended up grabbing a GTX 780.

I definitely didn't need this card. After playing with it for a few days and maxing every game I have at 1440p, I realized that I don't even need to overclock this card in order to maximize all of the details.

Overall a great card, but it's not something you would ever need for any of the current-gen games.

Edit: Unless, you know, you like maxing out games and seeing all of the possible eye candy.

Magic Underwear
May 14, 2003


Young Orc

Shadowhand00 posted:

Since I had some extra budget for play, I ended up grabbing a GTX 780.

I definitely didn't need this card. After playing with it for a few days and maxing every game I have at 1440p, I realized that I don't even need to overclock this card in order to maximize all of the details.

Overall a great card, but it's not something you would ever need for any of the current-gen games.

Edit: Unless, you know, you like maxing out games and seeing all of the possible eye candy.

You will though. All the next-gen games at E3 looked really good, they're going to be amazing on PC. And since apparently nobody considers PC as a competitor to consoles, most of the "exclusives" are coming to PC as well.

redstormpopcorn
Jun 10, 2007
Aurora Master

Magic Underwear posted:

You will though. All the next-gen games at E3 looked really good, they're going to be amazing on PC. And since apparently nobody considers PC as a competitor to consoles, most of the "exclusives" are coming to PC as well.

I think the PC is going to get many more, much better console ports this generation since both of the performance-oriented consoles are literally running on PC hardware. I wonder if we'll see AMD release cards with an APU and 6+GB GDDR5 with the advertising slant of "just drop a PS4 in your PC!"

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
The dark horse underdog platform where all the games look completely better and are infinitely more customizable!

The_Franz
Aug 8, 2003

redstormpopcorn posted:

I think the PC is going to get many more, much better console ports this generation since both of the performance-oriented consoles are literally running on PC hardware. I wonder if we'll see AMD release cards with an APU and 6+GB GDDR5 with the advertising slant of "just drop a PS4 in your PC!"

The next generation of APUs coming next year will support a GDDR5 memory controller, so having a unified GDDR5 memory pool will be possible. The catch is that any GDDR5-equipped boards are going to be BGA-integrated boards that can't be upgraded.

Animal
Apr 8, 2003

I personally don't mind BGA-integrated boards. I bought my Gene Z with Sandy Bridge, and by the time I upgrade it will be a different socket. The CPU could have been integrated into my motherboard and it would not have made any difference for me.

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


As long as the board manufacturer isn't crap.

So I guess as long as it's an Asus or ASRock board (or maybe one of a scant few other manufacturers).

I wonder how much of the firmware configuration menu is even necessary on a system so integrated.

movax
Aug 30, 2008

Just updated to 320.18... to be fair, the update took unusually long compared to any previous driver install, and my cards beeped several times as well. No other signs of damage or malfunction, though.

Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride
I've been using it since it came out, but I also have a fancy non-reference card, so who knows. If it really is kicking ailing VRMs or something, corner-cutting non-reference designs could be the ones dying.

randyest
Sep 1, 2004

by R. Guyovich
Has Nvidia said anything about 320.18 being a problem? I've only seen forum posters and bloggers complaining. I've been running it since release with no trouble, and it's still up on Nvidia's website for download, so I assume Nvidia hasn't acknowledged any problem (assuming one exists)?

Also, if this is not cool let me know and I'll edit it out, but just in case anyone here might be interested in a deal on a GTX 570 Superclock: I just put one up in SA Mart and I'm willing to wheel and deal :)

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

I survived back surgery and my recovery is going well, and I was on EVGA's site the very moment an SC with ACX cooling became available. Moth to a flame. :kiddo:



So I have an EVGA GTX 680 SC+ that reliably overclocks to the 1240-1250MHz range on core/shaders and 6740MHz VRAM, and an EVGA GTX 580 SC that will hang out around 900MHz core all day long with some room on the VRAM too (less of an issue with the GF110 chips), both of which I need to get rid of. I haven't sold a graphics card in forever; I guess I just load them up on SA Mart? I'm offering bro rates (thinking $300ish for the 680, $200ish for the 580, fair prices?) for thread regulars.

I promised FactoryFactory dibs on a card a long time ago, if he wants to buy it when it goes up for sale. Given all his work maintaining the thread and being a personal friend to me, I figure that's cool.

Since I'm running a Sandy Bridge system with a P67 motherboard, and I obviously won't be getting the new card in 'til next week, I can't exactly part these suckers out immediately; but however one puts hardware up for sale, they're about to be up for sale. Talk to me if you'd like to get something going before I head to SA Mart to peddle my wares. :)




Edit: They're still in stock at the moment, good luck folks - http://www.evga.com/Products/ProductList.aspx?type=0&family=GeForce+700+Series+Family&chipset=GTX+780

Agreed fucked around with this message at 00:15 on Jun 15, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Honestly, I could probably go for that. I'm downsizing to a Prodigy and was eyeballing the 770, but I'm probably gonna go for a 670 for cost reasons.

E: Wait, custom cooled or stock blower? I can't stand blower noise :saddowns:

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Stock blower on each. I imagine the 680 would go nuts with a custom cooling solution; it stays within nominal temps and clocks to a high percentile on nVidia's (EVGA's) reference vapor chamber.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
Darn, I'm gonna have to pass. With a tabletop system, noise gets a lot more noticeable.

Scratch that, I'm down. Down like a clown. One 680, please.

Factory Factory fucked around with this message at 02:41 on Jun 15, 2013

Snorri
Apr 23, 2002
Just to amuse everyone, I wanted to post my 3DMark Fire Strike score with a GTX 770 2GB bottlenecked by an overclocked Q6600. Yes, that's right: I am using a 6-year-old CPU with a brand-new $400 GPU. The 770 is the MSI Twin Frozr Gaming and the OC potential is great. I have mine stable at 1180MHz core / 2024MHz memory at 73C under load.

http://www.3dmark.com/fs/549145

Animal
Apr 8, 2003

Not bad. Still a higher graphics score than my overclocked 670 with an overclocked i7 2600K. It's amazing how great a processor the Q6600 was for its time. If it lasted that long, I wonder how long Sandy Bridge will last, considering the new processors are focused on power consumption reductions instead of performance. Probably longer than 6 years, unless there is a big paradigm shift.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Factory Factory posted:

Darn, I'm gonna have to pass. With a tabletop system, noise gets a lot more noticeable.

Scratch that, I'm down. Down like a clown. One 680, please.

Coming right up (as soon as the new card arrives, works, etc.; we'll work it out via skype, eh).

So that just leaves the EVGA GTX 580 SC: stable as heck when overclocked, still putting out some salty performance at 1080p and under; it was the prime of its time, the stallion of its stable. Does $200 sound fair to anybody?

Animal posted:

Not bad. Still a higher graphics score than my overclocked 670 with an overclocked i7 2600K. It's amazing how great a processor the Q6600 was for its time. If it lasted that long, I wonder how long Sandy Bridge will last, considering the new processors are focused on power consumption reductions instead of performance. Probably longer than 6 years, unless there is a big paradigm shift.

I don't have the Fire Strike bench, but I can see how my SB 2600K at 4.7GHz does with the new 780 SC ACX in 3DMark 11 once it gets in. Kind of boring by comparison, but benchmarking software is expensive, and my wife's Father's Day / happy-you-made-it-through-surgery gift only goes so far :saddowns:

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

Agreed posted:

Coming right up (as soon as the new card arrives, works, etc.; we'll work it out via skype, eh).

So that just leaves the EVGA GTX 580 SC: stable as heck when overclocked, still putting out some salty performance at 1080p and under; it was the prime of its time, the stallion of its stable. Does $200 sound fair to anybody?

I'd say so. A stock one is within a decent margin of my 7950, which ran me $300 last fall. $200 for an SC 580 is good value.

BIFF!
Jan 4, 2009
I'm on 320.18 with my GTX 770. Just had some bad artifacts in BF3 with the temps at 80C. I closed my game and raised my fan speed a bit; the temps dropped to 70C and the artifacts went away. I didn't think 80C was that hot. Should I be worried about this?
E: forgot to mention that it's not overclocked at all.

BIFF! fucked around with this message at 05:49 on Jun 15, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
80C isn't good; I'm assuming you have a model with a reference blower and/or your case is a tomb of dust.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

TheRationalRedditor posted:

80C isn't good; I'm assuming you have a model with a reference blower and/or your case is a tomb of dust.

Edit: Bleh, brevity is the soul of not being full of poo poo - the temperature itself is not unsafe, though it is higher than expected for that particular chip. That said, it could point to something else, like a cooler that isn't making proper contact, which would explain the artifacts and corruption.
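If you want to watch what the card is doing while you play, here's a minimal logging sketch (assuming a driver recent enough that nvidia-smi supports the --query-gpu flags; the polling setup is purely illustrative):

```python
# Minimal GPU temperature/fan logger via nvidia-smi. Assumes the NVIDIA
# driver's nvidia-smi tool is on the PATH and supports --query-gpu.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu,fan.speed",
         "--format=csv,noheader"],
        text=True,
    ).strip()
    print(time.strftime("%H:%M:%S"), out)  # e.g. '23:41:03 80, 55 %'
    time.sleep(5)                          # poll every five seconds
```

If the temperature climbs whenever the artifacts appear, it points at cooling; artifacts at modest temperatures point elsewhere.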

Agreed fucked around with this message at 06:04 on Jun 15, 2013

BIFF!
Jan 4, 2009
OK, so this is getting stranger. I popped into BF3 to record a bit to test out Dxtory's settings, and everything was fine. I went to view the recorded file in VLC and it was artifacting so badly that I couldn't see anything on screen. I opened the same file in Windows Media Player and it worked just fine. I rendered the file out, then viewed that in VLC, and that was fine too. What the heck is going on?

E: Welp, just played a bit more BF3 and it started showing artifacts like crazy again. GPU temps only hit 63C. I should probably RMA this, right?

BIFF! fucked around with this message at 07:58 on Jun 15, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
That sounds like a funky video codec on the raw Dxtory file, like it might be DXVA decode-accelerated but the card is making GBS threads the bed on that particular codec for some reason. What were your capture settings?

E: Or yeah just RMA that sumgun.
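If you want to double-check which codec a capture actually got written with before blaming the card, here's a naive sketch (plain Python; assumes a single-video-stream AVI, and Lagarith shows up as the 'LAGS' FourCC):

```python
# Quick-and-dirty peek at an AVI capture's video codec. Naive parse:
# find the first 'strh' stream header and read its fccType/fccHandler
# FourCCs. Good enough for a single-video-stream capture file.
import sys

def avi_stream_fourccs(path):
    with open(path, "rb") as f:
        head = f.read(64 * 1024)
    i = head.find(b"strh")
    if i < 0:
        return None
    fcc_type = head[i + 8 : i + 12]      # e.g. b'vids' for a video stream
    fcc_handler = head[i + 12 : i + 16]  # e.g. b'LAGS' for Lagarith
    return (fcc_type.decode("ascii", "replace"),
            fcc_handler.decode("ascii", "replace"))

print(avi_stream_fourccs(sys.argv[1]))
```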

Gonkish
May 19, 2004

I'm not sure if this is the right thread, so correct me if I'm wrong:

I'm looking into upgrading from my 560 Ti sometime this year, as I'm building a completely new system and carrying it over for the time being. Ultimately, I was thinking of either selling it or simply using it as a dedicated PhysX card. My question is: how viable/difficult would the latter be? Is there even a reason to do that over simply selling it?

BIFF!
Jan 4, 2009

Factory Factory posted:

That sounds like a funky video codec on the raw Dxtory file, like it might be DXVA decode-accelerated but the card is making GBS threads the bed on that particular codec for some reason. What were your capture settings?

E: Or yeah just RMA that sumgun.

I'm using the Lagarith Lossless Codec with multithreading enabled. You're right: I recorded with Fraps and it was fine. E: Found it; it was changing the default RGB mode to something else that caused it.

BIFF! fucked around with this message at 08:17 on Jun 15, 2013

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Gonkish posted:

I'm not sure if this is the right thread, so correct me if I'm wrong:

I'm looking into upgrading from my 560 Ti sometime this year, as I'm building a completely new system and carrying it over for the time being. Ultimately, I was thinking of either selling it or simply using it as a dedicated PhysX card. My question is: how viable/difficult would the latter be? Is there even a reason to do that over simply selling it?

Agreed can give you the full skinny, but here's how I remember it:

It's stupidly easy to use the card as a PhysX accelerator. It's just a little tickbox in the driver.

Performance gains can be mild or wild, depending on the game. In Borderlands 2, you're looking at a ~10% improvement over a single card, with a larger increase in minimum framerate. Meanwhile, Batman: Arkham Asylum gains ~70% in average and minimum FPS with a sufficiently fast PhysX card. At least, that was with older hardware; the gains are smaller but still impressive with current-gen stuff. Sometimes the gains can make a single card plus a PhysX card faster than a pair of identical cards in SLI running PhysX without a dedicated card.

Of course, in any game that doesn't use GPU PhysX, the card is just wasting electricity.
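To make those percentages concrete, here's a quick back-of-the-envelope sketch in plain Python. The baseline FPS numbers are invented purely for illustration; only the gain percentages echo the ballpark figures above:

```python
def with_gain(base_fps, gain_pct):
    """Return boosted FPS plus before/after frame times in milliseconds."""
    boosted = base_fps * (1 + gain_pct / 100.0)
    return boosted, 1000.0 / base_fps, 1000.0 / boosted

# Hypothetical single-card baselines; the gain percentages are the
# rough dedicated-PhysX figures quoted above.
for game, base, gain in [("Borderlands 2", 60.0, 10),
                         ("Batman: Arkham Asylum", 40.0, 70)]:
    fps, t_before, t_after = with_gain(base, gain)
    print(f"{game}: {base:.0f} -> {fps:.0f} FPS "
          f"({t_before:.1f} ms -> {t_after:.1f} ms per frame)")
```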

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

FF as usual covered it perfectly well, but here's one weird note, I guess - having had the experience of using a headless coprocessor (god drat it, "a PhysX card" sounds lame :smith:), I'm not going back. Probably going to pick up a 650 Ti-level card that will 1. not slow the 780 or later cards down for a while, and 2. not be a GTX 580 pulling down around 200W of power just to wake up, get the transistors going, and sleepily calculate a really simple CUDA load. GF110 was born to run; it's absurd to have that much power tied up in nothing but PhysX. A card well below GTX 580 level will still do the job perfectly well and free the rendering card from having to juggle quite so much. It honestly seems to me, user-experience-wise, that it's the juggling that hurts performance. PhysX is simple CUDA, but the cores are either engaged in a seamless rendering operation and calculating accordingly OR they're being told to switch up what they're doing... The latter makes for a somewhat jarring gameplay experience.

There are not that many games that support PhysX, and I'm not stupid; I don't expect there to be a ton in the future, and there's no apparent reason for that fact to change. It's proprietary and unnecessary, and CPU physics gives idle cores something to do. Alienating potential customers from the best possible in-game experience for no other reason than that they picked the wrong brand is a bad spot for devs to be in, regardless of nVidia swinging in with development support and big bags of money.

But it would totally blow to go back to not having one, because some of the games I do play and care about use it. I just want a more efficient and purposeful solution to the problem. The 580's doing the job because I had it on hand when I upgraded to the 680; it ought instead to be out there still maxing out a lot of games at 1080p. A simpler setup is called for here, and I will appreciate the power savings and lower heat, too. Right now I'm looking at used cards on Amazon. A card dedicated to PhysX can't be overclocked (well, not the GPU/CUDA cores; you can overclock the memory if you wish, and if it's a real edge-case card that's barely keeping up with high-end modern cards, that extra bandwidth might help, since all CUDA workloads, PhysX included, are highly VRAM and memory-bandwidth limited), so the consideration is really just getting enough but not too much, and the 580 overshoots by a mile on that metric.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
Another consideration is that more powerful PhysX capability doesn't necessarily provide better visuals, in that most games that do support it don't have all the crazy options we get with primary graphics. So the optimal coprocessor would be the lowest possible idle power with sufficient grunt to handle PhysX for the life of that system. Otherwise, unless power is really expensive, I'd suggest just keeping old cards around instead of selling them and buying something cheaper every upgrade cycle; that just sounds silly.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.
One thought that passes through my head now and then is that if the new consoles with their AMD GCN GPUs cause an OpenCL physics engine to be born, a lot of modern gaming computers will have built-in, for-free physics coprocessors via integrated graphics. AMD APUs and Intel HD Graphics all support OpenCL.

It may well be that the lower-end SKUs (e.g. anything below HD 4000) would be too wimpy to coprocess effectively, but I can dream.
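For what it's worth, checking what such an engine would have to work with is easy; here's a sketch (assuming the pyopencl package and a vendor OpenCL runtime are installed) that lists every OpenCL device in a system, integrated graphics included:

```python
# Enumerate OpenCL platforms/devices. An integrated GPU (AMD APU or
# Intel HD Graphics) shows up here right alongside any discrete card.
# Assumes pyopencl and a vendor OpenCL runtime are installed.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(f"{platform.name}: {device.name} "
              f"({device.max_compute_units} compute units, "
              f"{device.global_mem_size // 2**20} MB)")
```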

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

necrobobsledder posted:

Another consideration is that more powerful PhysX capability doesn't necessarily provide better visuals, in that most games that do support it don't have all the crazy options we get with primary graphics. So the optimal coprocessor would be the lowest possible idle power with sufficient grunt to handle PhysX for the life of that system. Otherwise, unless power is really expensive, I'd suggest just keeping old cards around instead of selling them and buying something cheaper every upgrade cycle; that just sounds silly.

Shouldn't be necessary to buy something cheaper every upgrade cycle, but a 580 for just PhysX is a shitload of power usage for a workload that absolutely tops out at around 15-18% utilization in THE most intense PhysX games I can find.

Plus, my 750W supply is really an 850W supply under the hood (and will output a solid 930W or a bit higher before problems manifest). For a 680 and a 580 that's fine, but the incoming 780 with an overclock will be pulling around 240W from the wall under load. The 580, if it were fully loaded, would be close to 240W too. Since it's not, and is just waking up and doing light CUDA calculations, it probably tops out closer to 180W or so in the most demanding PhysX games, just because it still has to wake the silicon up and ramp up clocks; there's some effectively wasted power overhead.
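A quick sanity check on that budget (plain Python; every wattage here is one of the rough estimates above, not a measurement, and the rest-of-system allowance is a made-up placeholder):

```python
# Back-of-the-envelope PSU headroom check using the rough figures above.
psu_label = 750        # rated wattage on the label
psu_actual = 850       # what the unit is claimed to really be built as
gpu_780_oc = 240       # estimated overclocked GTX 780 draw (W)
gpu_580_physx = 180    # estimated GTX 580 draw under a PhysX-only load (W)
rest_of_system = 150   # hypothetical allowance for CPU, board, drives (W)

total = gpu_780_oc + gpu_580_physx + rest_of_system
print(f"Estimated load: {total}W, "
      f"{total / psu_label:.0%} of the label rating, "
      f"{total / psu_actual:.0%} of the claimed real capacity")
```

Plenty of headroom either way; the objection is efficiency, not capacity.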

The idea isn't to buy a new card every generation for PhysX; the idea is to replace the GTX 580 with something that isn't so massively overpowered for just PhysX.

Agreed fucked around with this message at 22:26 on Jun 15, 2013

Unormal
Nov 16, 2004

Mod sass? This evening?! But the cakes aren't ready! THE CAKES!
Fun Shoe
So that raises the question: what would the optimal price/performance/power-usage headless physics helper be? Something like a GTS 450?

One Eye Open
Sep 19, 2006
Am I awake?

Factory Factory posted:

One thought that passes through my head now and then is that if the new consoles with their AMD GCN GPUs cause an OpenCL physics engine to be born, a lot of modern gaming computers will have built-in, for-free physics coprocessors via integrated graphics. AMD APUs and Intel HD Graphics all support OpenCL.

It may well be that the lower-end SKUs (e.g. anything below HD 4000) would be too wimpy to coprocess effectively, but I can dream.

Bullet Physics has had an OpenCL solver for a while, and will have a full GPU pipeline in version 3.
