Grim Up North
Dec 12, 2011

You know, apart from hoping that they get my 50 bucks' worth of silicon working again, I'm now really interested to finally hear what the hell is going on with the whole 560 Ti TDR stuff.

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Rahu X posted:

Yep. I uninstalled them, cleaned them up with CCleaner, then even cleaned them in Safe Mode with Driver Sweeper. I even made sure to remove the AMD drivers I have for my current replacement before installing the GPU and NVIDIA drivers. No dice.

Before anyone mentions reformatting and re-installing Windows, I've tried that before and got the same issues on my GTX 580 upon installing the drivers, so I doubt that will fix anything.


Just finished doing so. Came up clean, no issues.



Also, small update, I gave my GTX 580 to one of my friends to test and see if he gets the same issues on his system. In the meantime, I'll probably go ahead and send the GTX 780 in for replacement. I'll keep you guys updated when I get more information, and if the GTX 580 turns out to be fine, I'll continue testing with that.

I'm really hoping I won't have to get a new motherboard, as I just got this one back in March.

Just an idea: do you have another monitor you can test with? LCD or otherwise? Before assuming it's the board or card, I wonder if your monitor is on the fritz. You'll probably be able to tell for sure once your friend tests the 580. If it works on his system, I'd see whether a different display has the same problems.

Wistful of Dollars
Aug 25, 2009

The rumour mill is reporting on the pending existence of some uber version of the 780 Ti with up to 12gb of ram. :shrug:

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I'm sure there's a simple answer for this but why don't GPUs just have slots for you to put more ram into?

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Ozz81 posted:

Just an idea: do you have another monitor you can test with? LCD or otherwise? Before assuming it's the board or card, I wonder if your monitor is on the fritz. You'll probably be able to tell for sure once your friend tests the 580. If it works on his system, I'd see whether a different display has the same problems.

You know, I just thought about that after I got back home from taking the 780 to the UPS store.

No idea why it would cause driver crashes though.

EDIT: Out of curiosity, I looked up some info about my motherboard, and it turns out the specific revision I have tends to have issues throttling voltage because of the VRM getting too hot. I think it only affects the CPU in most cases though, and I find it interesting that my GPU issues only started happening after the more recent drivers.

Still, if the 580 turns out to be alright, I'm thinking about pulling out my old motherboard and processor to test on that. If that turns out fine, then I guess I better start saving for a new motherboard.

Rahu X fucked around with this message at 01:09 on Nov 2, 2013

Squibbles
Aug 24, 2000

Mwaha ha HA ha!
I assume it's just added expense for something not many people would use.

My S3 ViRGE S2000 had 2MB of ram expandable to 4MB(!!) if I remember correctly.

MiniSune
Sep 16, 2003

Smart like Dodo!
But the ViRGE was not a video card; it was an instrument of torture.

Gonkish
May 19, 2004

World's first graphics decelerator!

Killer robot
Sep 6, 2010

I was having the most wonderful dream. I think you were in it!
Pillbug

Zero VGS posted:

I'm sure there's a simple answer for this but why don't GPUs just have slots for you to put more ram into?

They had some like that in the old days! Like, before they called them "GPUs" and when you were lucky to get meaningful 3D feature sets anyway. I imagine higher speeds, higher bandwidth, varying clock speeds, physical demands imposed by cooler design, and generally exacting requirements compared to desktop RAM mean that it would be a headache for manufacturers, of limited value even for enthusiasts, and a poor cost-to-benefit proposition overall.

Kaddish
Feb 7, 2002
Whelp. I broke down and bought a 780 today. I've been thinking about getting something with more than 2GB anyway, and the price difference between it and a 4GB 770 didn't seem worth it. Anyone want a just fine for everyone who isn't crazy 660 ti?

jane came by
Jun 29, 2013

by Fistgrrl

Kaddish posted:

Anyone want a just fine for everyone who isn't crazy 660 ti?

Who did what now?

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

jane came by posted:

What games are you people playing to justify high end GPUs? Stop making me feel bad about my 560ti. :(

I don't give a fraction of a gently caress about actually justifying my "fun" purchases, but the kinds of games that can actually stress high-end graphics cards are generally released in the last couple of years and either run on really taxing in-house DX9 engines or take good advantage of a full DX11 feature set. Sometimes there's an element of somewhat unnecessarily high requirements, like the MSAA implementations in Battlefield 3 and the Metro games, or the "Ultra" AA mode in Sleeping Dogs. Those are generally due to favoring visual fidelity very strongly over performance, with the assumption that the user knows what they're in for and is willing to dial in the graphics experience they can swing with their system.

The funny thing about AA too (especially the more conventional AA methods) is that it tends to fill up VRAM to the point that you're actually looking at a performance requirement similar to just having more pixels on screen. If we take more-or-less fully optimized 4xMSAA as having both a good-enough antialiasing function and about a 75% lower performance cost than the really inefficient but boss as heck FSAA/SSAA (basically rendering the whole frame at a higher resolution and then scaling it down for display), that's still a hit equal to 25% of SSAA's cost. The irony there is that higher resolutions remove the need for exceptional AA in the first place, but either way, framebuffer size becomes a significant factor. So whether you're still hanging out at 1080p like I am and cranking everything plus piling on the AA, or whether you're running at a resolution like 1440p or so, the same visual fidelity (roughly speaking, here) has a remarkably similar performance cost, just as far as jaggies are concerned.
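A rough back-of-the-envelope of that arithmetic, using round hypothetical numbers (assuming shading cost scales linearly with shaded samples, and taking the quoted figure of 4xMSAA's overhead being ~25% of full 4x supersampling's):

```python
# Toy shading-cost comparison: 1080p + 4xMSAA vs. native 1440p.
# Assumptions (round numbers, not measurements): cost scales linearly with
# shaded samples, and 4xMSAA's AA overhead is 25% of 4xSSAA's overhead.

base_1080p = 1920 * 1080        # pixels shaded per frame, no AA
base_1440p = 2560 * 1440

ssaa_extra = 3 * base_1080p     # 4x SSAA renders 4x the pixels: 3x extra work
msaa_extra = 0.25 * ssaa_extra  # MSAA assumed 75% cheaper than SSAA

msaa_total = base_1080p + msaa_extra
print(msaa_total / base_1080p)  # 1.75: 1080p + 4xMSAA vs. plain 1080p
print(base_1440p / base_1080p)  # ~1.78: native 1440p vs. plain 1080p
```

Which is the "remarkably similar performance cost" point: under these assumptions, 1080p with 4xMSAA and native 1440p land within a few percent of each other.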

Of course there's a lot more to requirements than just the AA pass, but I've always found that kind of interesting, especially as discussion turns to games which can actually nose up to or exceed 2GB of VRAM at the conventionally modest-for-a-PC resolution of 1080p, when very high settings are used.

Some other DX11 features in particular, like ADoF and, to a lesser extent (provided there's any effort at all made toward optimization, fuckin' Crysis 2...), tessellation, can have a significant performance hit depending on how they're implemented. Tessellated water with full reflection and refraction looks amazing, but it's some really heavy lifting. Games are also getting really, really good at having realistic shadows, though true global illumination and volumetric lighting are still more experimental than practical. I haven't kept up with the development of imperfect voxelized shadow volumes, but I felt it was approaching an elegant solution when I read the whitepaper on it (jointly authored by a developer at a university and one of nVidia's in-house devs).





The short answer is "the pretty ones made in the last year or two" though. :) At sub-4K resolutions, I think there's a nice push-pull going on with hardware getting more powerful and studios finding ways to take advantage of that power. For a good stretch we were basically stuck with DX9 games and the fact is that they just don't tax modern cards as much unless they're at REALLY high resolutions, or use exceptionally high-poly models, or extraordinary AA; see The Witcher 2 for an example of a DX9 game that uses a resource-heavy DX9 engine made in-house by competent developers who wanted to make your PC cry. There are some very clever tricks used in it that give it a far better than average look, I think largely because it was developed with the PC in mind first and consoles second. Most games, it goes the other way 'round, and so we ended up getting games like RAGE with 20GB worth of textures but nothing really taxing for contemporary price:performance hardware.

Agreed fucked around with this message at 05:47 on Nov 2, 2013

TheRationalRedditor
Jul 17, 2000

WHO ABUSED HIM. WHO ABUSED THE BOY.
BF4 is gorgeous Ultramaxed with 4xMSAA and extremely playable at 1080p with my 670. I turn AA off for online, though, just to keep FPS above 60 at all costs. It's a great example of why GPU advancements matter.

jane came by
Jun 29, 2013

by Fistgrrl
Thanks Agreed, very informative post.

Gonkish
May 19, 2004

I was running the BF4 beta at 1080p on a single 4GB 760 at around 40-45fps on Ultra. Once I got my second card it was 60fps all the way because I turned Vsync on.

Guni
Mar 11, 2010

Gonkish posted:

I was running the BF4 beta at 1080p on a single 4GB 760 at around 40-45fps on Ultra. Once I got my second card it was 60fps all the way because I turned Vsync on.

Gonk, do you have blowers or custom fans (i.e. anything other than blowers)? IIRC your case is ITX?

Digital Jesus
Sep 11, 2001

Gonk has a Corsair Carbide 500R. Regular ol' ATX.

Gonkish
May 19, 2004

Yeah, it's a 500R and it's relatively big and I love it. :3:

Kakarot
Jul 20, 2013

by zen death robot
Buglord

El Scotch posted:

The rumour mill is reporting on the pending existence of some uber version of the 780 Ti with up to 12gb of ram. :shrug:

Yeah, that's pretty insane. They posted some benchmarks too:
http://videocardz.com/47522/nvidia-geforce-gtx-780-ti-great-overclocking-potential
Looking forward to gaming benches.

The Lord Bude
May 23, 2007

ASK ME ABOUT MY SHITTY, BOUGIE INTERIOR DECORATING ADVICE
If it exists, I want one.

redstormpopcorn
Jun 10, 2007
Aurora Master
I'll just run my games on an 8GB GDDR5 RAMdisk.

Scarecow
May 20, 2008

3200mhz RAM is literally the Devil. Literally.
Lipstick Apathy
So, for any of the R9-290x owners out there: have any of you experienced coil whine when your card gets loaded up? The high fan noise I can live with, but this high-pitched buzzing noise is starting to get on my nerves.

...guess that's what I get for buying a gigabyte card

This Jacket Is Me
Jan 29, 2009

Rahu X posted:

Just finished doing so. Came up clean, no issues.

sether01 posted:

Have you memtested your ram?

From last page, but if you have any other compatible RAM laying around, try swapping it in anyway. I used to get MCE crashes on my current machine, memtest86 said everything was okay, I tried swapping the modules from one DIMM to another, tried running with only one DIMM populated, etc. On a whim, bought some new RAM, popped it in, and haven't had a crash since. All that while memtest86 said everything was dandy.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Agreed posted:

The irony there is that higher resolutions remove the need for exceptional AA in the first place,

This is incorrect. You still need temporal AA (high frame rates fix this, but 60 Hz is still very hard to do at 4K) and even spatial AA if the frequency of your features is high enough. Movies tend to have intensive AA even though they might render at 8K and beyond.

NVIDIA's TXAA was pretty cool, but unfortunately proprietary. With Lottes now working for Epic, we might see some advances across the board, since gamedevs seem to be more willing to share techniques.
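The frequency point shows up even in a one-dimensional toy "render" (hypothetical numbers, numpy only): a pattern above the sampling (Nyquist) limit folds into a false low-frequency pattern at full strength unless you supersample and filter it down.

```python
import numpy as np

def render(width, cycles, ss=1):
    """'Render' a 1-D test pattern sin(2*pi*cycles*x) at `width` pixels,
    optionally supersampling by `ss` and box-filtering back down."""
    n = width * ss
    x = np.arange(n) / n
    samples = np.sin(2 * np.pi * cycles * x)
    return samples.reshape(width, ss).mean(axis=1)

width = 64                     # 64 pixels can resolve at most 32 cycles
aliased = render(width, 48)    # 48 cycles: above Nyquist, folds to 16 cycles
filtered = render(width, 48, ss=8)

# Point sampling keeps the folded pattern at full amplitude; supersampling
# plus the box filter attenuates the unresolvable frequency instead.
print(np.abs(aliased).max())   # 1.0: the alias survives at full strength
print(np.abs(filtered).max())  # well under 1: mostly averaged away
```

Same reason movie renderers keep filtering hard even at 8K: sampling doesn't make high-frequency content go away, it just decides whether it turns into garbage or into attenuation.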

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Malcolm XML posted:

This is incorrect. You still need temporal AA (high frame rates fix this, but 60 Hz is still very hard to do at 4K) and even spatial AA if the frequency of your features is high enough. Movies tend to have intensive AA even though they might render at 8K and beyond.

NVIDIA's TXAA was pretty cool, but unfortunately proprietary. With Lottes now working for Epic, we might see some advances across the board, since gamedevs seem to be more willing to share techniques.

You know exactly what I mean, man. I've been banging the TXAA drum and hoping we'd get more sophisticated, specialized AA methods for a long time. Hit the question mark, go back in time to when TXAA was getting its first implementation... Or don't, but at the very best we're talking past each other, at the worst just talking at each other, and I have to say I prefer it when people talk to each other so I find it slightly disingenuous to come off a post talking about current and explicitly noted-as-traditional "AA-for-jaggies" with the lateral gotcha for temporal AA and all that.

I followed Tim Lottes' blog well enough that I managed to snag some of his stuff once he started scrubbing post job change, and I honestly don't think he'd be super willing to trade off the option of having their proprietary renderer do some hot poo poo vs. having everybody do some hot poo poo, since I'm going to go real far out on a limb and guess any rendering tech he invents while working on the rendering team at Epic belongs to Epic. At least it's widely licensed.

Agreed fucked around with this message at 16:51 on Nov 2, 2013

Paradox Personified
Mar 15, 2010

:sun: SoroScrew :sun:
Call of Duty: Ghosts: better on slim PS3 architecture, or on an i5-4570 3.2GHz quad-core PC running an R9 280x (with either 8GB or 16GB of RAM)?

V Because I quite literally just woke up and I have not built a PC since 2001. I feel ashamed. :smith:

Paradox Personified fucked around with this message at 18:20 on Nov 2, 2013

John McCain
Jan 29, 2009
How could you possibly see better graphics on a PS3 (running graphics hardware from 2005) than on the modern gaming computer you described?

You will be able to turn on a lot more eye candy for equivalent performance on that PC.

Mill Village
Jul 27, 2007

The PC version is the definitive version; Activision even has a dedicated team working on it. Keep in mind you will miss out on NVIDIA-exclusive effects, though.

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Agreed posted:

You know exactly what I mean, man. I've been banging the TXAA drum and hoping we'd get more sophisticated, specialized AA methods for a long time. Hit the question mark, go back in time to when TXAA was getting its first implementation... Or don't, but at the very best we're talking past each other, at the worst just talking at each other, and I have to say I prefer it when people talk to each other so I find it slightly disingenuous to come off a post talking about current and explicitly noted-as-traditional "AA-for-jaggies" with the lateral gotcha for temporal AA and all that.

I followed Tim Lottes' blog well enough that I managed to snag some of his stuff once he started scrubbing post job change, and I honestly don't think he'd be super willing to trade off the option of having their proprietary renderer do some hot poo poo vs. having everybody do some hot poo poo, since I'm going to go real far out on a limb and guess any rendering tech he invents while working on the rendering team at Epic belongs to Epic. At least it's widely licensed.

True, but even spatial AA is needed at retina resolutions; you just need some subtle blurring here and there.

By and large though, you're right: once we get the horsepower for 4k+ resolutions we can start focusing on getting good motion interpolation and other things.

I doubt Epic will release source code, but they will likely put out a whitepaper, or at least a presentation at GDC, after their cool game engine with awesome AA has had a few years of design wins.

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

Malcolm XML posted:

True, but even spatial AA is needed at retina resolutions; you just need some subtle blurring here and there.

By and large though, you're right: once we get the horsepower for 4k+ resolutions we can start focusing on getting good motion interpolation and other things.

I doubt Epic will release source code, but they will likely put out a whitepaper, or at least a presentation at GDC, after their cool game engine with awesome AA has had a few years of design wins.

Yeah, my gut feeling on that is that the spatial AA will be (hopefully more sophisticated/effective) post-processing shader-based AA, since those thankfully care a lot less about resolution when it comes to rendering costs than any variation of FSAA or its progeny (and, as you know, there isn't an MSAA variant around that isn't some "clever" form of site-specific FSAA, even these many years on).

I share your hope in the co-development stuff; there's a huuuge degree of enlightened self-interest for a company like nVidia, who obviously stands to profit from more advanced graphics requiring more advanced hardware, in allowing one of their devs to work on a coauthored paper like the IVSV thing or any number of developer-friendly API tips... But, and importantly, you are right that there are also some neat examples, like the Practical Clustered Shading whitepaper, put out now and again by very clever folks trying to solve interesting problems (I really want to see what that ends up being; the idea of advancing both forward and deferred rendering in the same engine is a cool concept) - and get their tech licensed; well, nobody's doing this poo poo for free, after all.

My concern with Epic in particular is that Tim Lottes really is a friggin' genius when it comes to this stuff but he's also proved very good at keeping his cards close ever since he moved on - nVidia allowed him a great deal of freedom on his blog when he was under their employ, my guess being because he used the technical expertise on display to promote things that would be impressive and proprietary and thus benefit the company. But nVidia proprietary is waaay broader than Epic proprietary, if you get me, since nVidia sells the hardware that any software should run on, while Epic's one of many (doing it best, judging by how many games use their engine, but you know what I mean I think) - I don't think Epic have been especially active in the overall conversation, preferring instead to just provide the most robust product for licensing. Much more about showing off what their stuff does and what it can do for you as a developer.

Even with that more restricted space though I'd hope that conferences would show some cool tech and they have to talk about it, it's going to be explicit anyway when devs get the API, but they've managed to snap up one of the guys who used to be really talkative and since then he hasn't said nearly as much. Hell, he even pulled a simple bare-metal analysis of the PS4.

Of course he's not the only rendering genius out there, but there aren't a lot of them who speak up. :smithcloud:

Agreed fucked around with this message at 19:27 on Nov 2, 2013

Hakkesshu
Nov 4, 2009


VVVVVV Pardon me!

Hakkesshu fucked around with this message at 20:00 on Nov 2, 2013

beejay
Apr 7, 2002

This used to be in the thread title but it's still the first line of the OP: "Mod Note: This thread is for general GPU and videocard discussion, head over to the parts picking megathread if you just need help picking a card to buy."

Tgent
Sep 6, 2011
I have a gtx 460 that randomly freezes my pc on any driver past 314.22, seems to be a known issue going by the geforce forums. So that's fine, I'll just stay on that driver. Except my drivers are updating by themselves to the most recent version and causing the freezing to return. It's not windows update (there's nothing in the update history, and it's set to only check for updates, not install them automatically). Nvidia update is not installed. I see absolutely no part of the driver install process, it's completely invisible which seems bizarre. Does anyone know what the hell is going on here? What could possibly be doing the updating? I'm on windows 8.1 if that affects anything.

Cream
May 6, 2007
Fett-kart

Tgent posted:

I have a gtx 460 that randomly freezes my pc on any driver past 314.22, seems to be a known issue going by the geforce forums. So that's fine, I'll just stay on that driver. Except my drivers are updating by themselves to the most recent version and causing the freezing to return. It's not windows update (there's nothing in the update history, and it's set to only check for updates, not install them automatically). Nvidia update is not installed. I see absolutely no part of the driver install process, it's completely invisible which seems bizarre. Does anyone know what the hell is going on here? What could possibly be doing the updating? I'm on windows 8.1 if that affects anything.

Windows Update also updates drivers. Have a look and see if that's what's happening to you.

Tgent
Sep 6, 2011

Cream posted:

Windows Update also updates drivers. Have a look and see if that's what's happening to you.

I've had a look around and it seems that's likely. Unfortunately there doesn't seem to be any way to stop it apart from stopping windows from getting any new hardware drivers at all.

Bloody Hedgehog
Dec 12, 2003

💥💥🤯💥💥
Gotta nuke something

Tgent posted:

I've had a look around and it seems that's likely. Unfortunately there doesn't seem to be any way to stop it apart from stopping windows from getting any new hardware drivers at all.

:ssh: You should stop letting Windows get new hardware drivers.
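For reference, on Windows 8.x that's the Device Installation Settings dialog (Control Panel, System, Advanced system settings, Hardware tab), which as far as I know just flips one registry value. A sketch of the equivalent .reg file, assuming the standard DriverSearching location; back up the key before importing:

```
Windows Registry Editor Version 5.00

; Stop Windows from searching Windows Update for device drivers.
; Roughly equivalent to picking "Never install driver software from
; Windows Update" in Device Installation Settings. Setting it back to
; 1 re-enables driver retrieval from Windows Update.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching]
"SearchOrderConfig"=dword:00000000
```

Note this blocks Windows Update driver pulls for all hardware, not just the GPU, so you'll be back to grabbing drivers from vendor sites manually.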

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

This Jacket Is Me posted:

From last page, but if you have any other compatible RAM laying around, try swapping it in anyway. I used to get MCE crashes on my current machine, memtest86 said everything was okay, I tried swapping the modules from one DIMM to another, tried running with only one DIMM populated, etc. On a whim, bought some new RAM, popped it in, and haven't had a crash since. All that while memtest86 said everything was dandy.

I do not, so I guess I'll just have to hope that's not the issue. I don't think it would be though, because I'm running perfectly fine right now on this weaker GPU. No crashes, no video artifacts. Perfectly stable.

Do motherboard VRMs affect GPUs as well, or is it only CPUs? I'm curious, because I'm thinking the reason why I'm getting away with running on this older GPU is simply because it's using less power (it also doesn't use any PSU power connectors). Of course, I can't really confirm that, but it's just a thought. I'll know more once I get my 580 back and test it on my old motherboard. Well, if the 580 is good, that is.

Factory Factory
Mar 19, 2010

This is what
Arcane Velocity was like.

Rahu X posted:

I do not, so I guess I'll just have to hope that's not the issue. I don't think it would be though, because I'm running perfectly fine right now on this weaker GPU. No crashes, no video artifacts. Perfectly stable.

Do motherboard VRMs affect GPUs as well, or is it only CPUs? I'm curious, because I'm thinking the reason why I'm getting away with running on this older GPU is simply because it's using less power (it also doesn't use any PSU power connectors). Of course, I can't really confirm that, but it's just a thought. I'll know more once I get my 580 back and test it on my old motherboard. Well, if the 580 is good, that is.

The motherboard's VRMs only affect CPU and DRAM. Everything else on the board is fed directly from the PSU's voltage rails.

Rahu X
Oct 4, 2013

"Now I see, lak human beings, dis like da sound of rabbing glass, probably the sound wave of the whistle...rich agh human beings, blows from echos probably irritating the ears of the Namek People, yet none can endure the pain"
-Malaysian King Kai

Factory Factory posted:

The motherboard's VRMs only affect CPU and DRAM. Everything else on the board is fed directly from the PSU's voltage rails.

I thought so, just wasn't too sure. The apparent VRM issues on my specific motherboard revision were starting to make me think that could be the issue.

Paradox Personified
Mar 15, 2010

:sun: SoroScrew :sun:
So, confirm/deny: "PhysX really requires a second card to look nice"?
Someone told me this and I almost threw my keyboard across the room. My build has gone through so many choices... 7970 GHz editio-no, the Toxic version of the R9 280x, wait, no, should I try PhysX? So it has to be the 770 becau-no wait, it has to be more than one card, so what the gently caress am I doing? "Just get whatever the best Sapphire card in your price range is." gently caress me running. I just saw one of the versions of the R9 280x; the Toxic, three fans, you know, that one. Sold out right now, but since they're just out, it should restock fairly quickly?


Someone shoot me. Just tie me to a chair, pick a card (but tell me why because I'm an OCD motherfucking freak and am barely read up on the chipsets - Looking at you, 7970, you know why), and then shoot me. In that order.

Paradox Personified fucked around with this message at 19:03 on Nov 3, 2013
