dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


LuciferMorningstar posted:

fermi card flaking out

Fame Douglas posted:

the card broken

Normally I'd agree with you, but LuciferMorningstar is part of a decent-sized class of exceptions. And, in fact, you went on to mischaracterize the problem, which was mostly a matter of power states, as entirely a material defect.

nVidia left in a bug in power handling that pretty much wrecked Fermi cards for a solid year. HOWEVER, this bug has since been patched out. Current GeForce drivers and latest betas both work just fine for me on a 560 Ti.

Is it possible that this damaged his card? Yeah, maybe. But it's more likely that nVidia's driver, like most video card drivers, is a goddamn mess, and driver cleaning software (on the rare occasion it's necessary) is stuck playing catchup.

In LuciferMorningstar's case, I would recommend clean-uninstalling the nVidia driver, running Display Driver Uninstaller to clean up the rest, and clean-installing current drivers.

If the card is still having problems after that, I would at least strongly consider [imaging my system drive just in case and] clean-re-installing Windows, not allowing Windows Update drivers (you may have to install without an Ethernet cable plugged in or a Wi-Fi access point configured - and without a mobile card installed/plugged in, if applicable), immediately installing current GeForce drivers, and seeing how that works before throwing out a still-probably-usable video card. In fact, graphics drivers are so bad that even if you panic-replaced the card on Fame Douglas's advice, you might be stuck doing this anyway to get your system stable again.
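
If you want to double-check that Windows Update didn't sneak its own driver back in after the clean install, something like this works - just a rough sketch, assuming nvidia-smi shipped with your driver and is on your PATH:

code:
# Rough sketch: print the driver version and card name the system actually sees,
# so you can confirm the clean install took and nothing swapped the driver out.
# Assumes nvidia-smi is installed with the driver and is on PATH.
import subprocess
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version,name", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in out.stdout.strip().splitlines():
    print(line)  # e.g. "337.88, GeForce GTX 560 Ti"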

dont be mean to me fucked around with this message at 09:11 on May 27, 2014


Fame Douglas
Nov 20, 2013

by Fluffdaddy
The problem, as described, really does sound like a hardware defect in the card: unless we're thinking of different incidents, the Fermi-specific driver problem (apparently present in the driver software for something like six months) mainly happened while browsing on the desktop.

The time frame mentioned, as well as the nature of the problem, make this sound like a regular ol' chip that can't handle the workload to me. But feel free to correct me if you still think I'm wrong!

dont be mean to me
May 2, 2007

I'm interplanetary, bitch
Let's go to Mars


It could happen anywhere, really. It happened a lot less in a game or 3D application, but it was mostly associated with doing interesting things with power states that most cards (even older cards) could deal with but Fermi could not - plus some knock-on effects from Fermi being kind of an odd duck in general. Not all games and applications need full power, so a few could drop the GPU's power state. It shouldn't really be a surprise that the problem shows up least when the GPU is at its rated operating frequency/voltage/other characteristics.

And like I said before, nVidia drivers are messy. I don't have insider information describing their compatibility or on-update configuration repopulation (nVidia profiles especially have changed a lot since SGSSAA became a popular tweak), but dragging stuff up from a year ago that no one thought to clear out is entirely within the realm of possibility, and if that had a hand in causing problems, it'd be really hard to ferret out.

Which is why I suggested testing with fresh Windows that's only ever seen current drivers as a diagnostic stage (and backups because you'd have to be crazy not to when doing that sort of thing).
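
And if anyone wants to actually watch for the power state thing, here's a rough sketch - assumes nvidia-smi is on PATH and your driver's version of it supports these query fields - that logs the performance state and clocks once a second. A card stuck in a low P-state while something 3D is on screen is exactly the sort of behavior I mean:

code:
# Rough sketch: poll the GPU's performance state (P0 = full power, P8/P12 = idle)
# and clocks once a second to see whether the card drops to a low power state
# when it shouldn't. Assumes nvidia-smi is on PATH; Ctrl-C to stop.
import subprocess
import time
QUERY = "pstate,clocks.gr,clocks.mem,power.draw"
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(time.strftime("%H:%M:%S"), out.stdout.strip())
    time.sleep(1)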

dont be mean to me fucked around with this message at 10:26 on May 27, 2014

Ignoarints
Nov 26, 2010
I'm going to go ahead and follow those steps because I just got a driver failure with Watch Dogs as well as BF4. When I looked at Afterburner, the clock offset was -105; I've never seen it actually offset itself in the box you enter values into. That put it at a 1100 MHz clock speed. Max temp on the hottest card was 84 degrees.

There were tons of visual errors when it happened; it's a lot more obvious in Watch Dogs.

Fame Douglas
Nov 20, 2013

by Fluffdaddy

Sir Unimaginative posted:

And like I said before, nVidia drivers are messy. I don't have insider information describing their compatibility or on-update configuration repopulation (nVidia profiles especially have changed a lot since SGSSAA became a popular tweak), but dragging stuff up from a year ago that no one thought to clear out is entirely within the realm of possibility, and if that had a hand in causing problems, it'd be really hard to ferret out.

That seems like a stretch to me; a hardware failure is really likely in this case. I'd be surprised if doing a crazy driver/Windows re-install ever fixed this problem (but going for a clean install is a reasonable troubleshooting step that should be done, considering software tends to be a complex beast).

Ignoarints posted:

There were tons of visual errors when it happened; it's a lot more obvious in Watch Dogs.

Sounds like you're in for a replacement.

Fame Douglas fucked around with this message at 13:58 on May 27, 2014

warcake
Apr 10, 2010
Just a quick question:

I have had 2 280x's and RMA'ed both of them, the last one being a sapphire toxic 280x. I've got a credit note now, and definitely don't want a 280x again.

Would something like this be a worthwhile replacement?

http://www.aria.co.uk/SuperSpecials...productId=56419

I don't mind paying a bit more towards it.

warcake fucked around with this message at 14:57 on May 27, 2014

Hace
Feb 13, 2012

<<Mobius 1, Engage.>>

Slider posted:

Does the GTX 770 not overclock that well? Mine is of the MSI flavor. I'm not able to get much more than an extra 50 MHz out of the core before the display drivers start crashing running the Heaven benchmark thing. Also, surprisingly, upping the core voltage/power limit in MSI Afterburner doesn't make any difference.

I've had the exact same experience with that card, haha. Luck of the draw I suppose.

Ignoarints
Nov 26, 2010

Fame Douglas posted:

That seems like a stretch to me; a hardware failure is really likely in this case. I'd be surprised if doing a crazy driver/Windows re-install ever fixed this problem (but going for a clean install is a reasonable troubleshooting step that should be done, considering software tends to be a complex beast).


Sounds like you're in for a replacement.

If something is defective, I'd better figure out how to pinpoint which one it is, because I'm still within a 30-day return policy for one.

Really unrelated, but does anybody remember a game that only really worked well with AMD cards where all nvidia cards were "strangely" limited to 55 fps or something? Someone is blowing up my phone about the watch dogs thing and I can't roll my eyes any harder

Liam Acerbus
Sep 17, 2007

Good morning, thread. I have an HD 6950 that I've been thinking of upgrading for a while. What's a reasonably priced* upgrade I can shoot for? Hopefully something that isn't physically larger. My PSU is 750 watts, if that matters.

*I don't think I want to spend over $400. Closer to 300 would be ideal.

SlayVus
Jul 10, 2009
Grimey Drawer
So I really thought that my GTX 680 would be able to handle Wolfenstein and Watch_Dogs full tilt, but sadly I am limited by the VRAM on my card. A measly 2GB of VRAM doesn't seem like it's going to cut it in this day and age of consoles.

Wolfenstein eats upwards of ~1990 MB of VRAM. Watch_Dogs hits 2020 MB of VRAM usage at 1080p with max settings.
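
If anyone wants to check their own numbers, a quick and dirty logger like this does it (just a sketch, assuming nvidia-smi is on PATH - Afterburner's monitoring graph shows the same thing). Leave it running while the game is up and watch how close memory.used gets to memory.total:

code:
# Quick-and-dirty VRAM logger: prints per-GPU memory used/total once a second.
# Assumes nvidia-smi is on PATH; Ctrl-C to stop.
import subprocess
import time
FIELDS = "index,name,memory.used,memory.total"
while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(time.strftime("%H:%M:%S"))
    for line in out.stdout.strip().splitlines():
        print("  " + line)  # e.g. "0, GeForce GTX 680, 2020 MiB, 2048 MiB"
    time.sleep(1)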

SlayVus fucked around with this message at 15:33 on May 27, 2014

beejay
Apr 7, 2002

Liam Acerbus posted:

Good morning, thread. I have an HD 6950 that I've been thinking of upgrading for a while. What's a reasonably priced* upgrade I can shoot for? Hopefully something that isn't physically larger. My PSU is 750 watts, if that matters.

*I don't think I want to spend over $400. Closer to 300 would be ideal.

You need to ask this in the parts-picking thread, and ideally when you post there, tell them what games you want to play, what resolution, why you want to upgrade, what other parts you have, and so forth.

Liam Acerbus
Sep 17, 2007

Okay, thanks.

Ignoarints
Nov 26, 2010

SlayVus posted:

So I really thought that my GTX 680 would be able to handle Wolfenstein and Watch_Dogs full tilt, but sadly I am limited by the VRAM on my card. A measly 2GB of VRAM doesn't seem like it's going to cut it in this day and age of consoles.

Wolfenstein eats upwards of ~1990 MB of VRAM. Watch_Dogs hits 2020 MB of VRAM usage at 1080p with max settings.

For what it's worth, I've consistently passed 2GB of VRAM with 770s with no obvious downsides up until now (at 1440p). But not everything is equal, I'm sure, and it probably depends greatly on what exactly is being stored in there, which is something that can change with new games.

That being said, I'm sure I'm hitting a memory bottleneck with Watch Dogs. I played for about an hour and a half, and before I crashed my driver I could really only be happy with high settings, not "ultra" (don't even remember if it's called that). The fps itself wasn't actually all that bad; it was the input lag that really got to me more than anything, but the fps was suffering too. Didn't have time to look into what was going on, just wanted to play for a bit.

I've barely had this setup for a month and I already want to upgrade though. drat you games. Also the difference between ultra and high settings was very, very noticeable.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy
I remember a while back when the R9 290s first came out and they announced how they can Crossfire without a cable. Did that wind up being any good or does it have microstutter? Moreover, does anyone know if there's a minimum speed required for the PCIe slot, because I have two "3.0/2.0 x16 or dual x8" and one "2.0 x16 (x2 mode)".

Zero VGS fucked around with this message at 17:06 on May 27, 2014

Stanley Pain
Jun 16, 2001

by Fluffdaddy

Zero VGS posted:

I remember a while back when the R9 290s first came out and they announced how they can Crossfire without a cable. Did that wind up being any good or does it have microstutter? Moreover, does anyone know if there's a minimum speed required for the PCIe slot, because I have two "3.0/2.0 x16 or dual x8" and one "2.0 x16 (x2 mode)".

Yes, it's good, and microstutter specifically is MUCH better on the R9 290 cards. I've run mine on dual 8x and 16x (3.0) and noticed no difference (benchmarks might show a slight one). From a visual perspective I have not noticed ANY microstutter, which used to drive me completely insane with dual 6870s.
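
For what it's worth, on the nVidia side you can check what link each card actually negotiated without rebooting into the BIOS - rough sketch below, assuming nvidia-smi is on PATH and supports the pcie query fields. For AMD cards like the 290s, GPU-Z's Bus Interface readout shows the same thing:

code:
# Rough sketch: show the PCIe generation and lane width each nVidia GPU is
# currently running at, plus the maximum it supports. Assumes nvidia-smi is on PATH.
import subprocess
FIELDS = ("index,name,"
          "pcie.link.gen.current,pcie.link.gen.max,"
          "pcie.link.width.current,pcie.link.width.max")
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in out.stdout.strip().splitlines():
    print(line)  # e.g. "0, GeForce GTX 770, 3, 3, 8, 16"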

Ignoarints
Nov 26, 2010
Out of curiosity, is there a reason not to use a cable besides just not having one on hand at the moment? I've seen that question here and there.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Ignoarints posted:

Out of curiosity, is there a reason not to use a cable besides just not having one on hand at the moment? I've seen that question here and there.

You can't on the 290 and up - the bridge connector simply isn't there.

Ignoarints
Nov 26, 2010
Oh. Haha.

Three different SLI cables on 660 Tis and literally 4 different 770s now have been super finicky and loose. Since I already had 3 different brands I didn't try my luck with another, but I wish it were more solid. If it's off by even a little, it can cause blue lines and all sorts of other weird stuff.

deimos
Nov 30, 2006

Forget it man this bat is whack, it's got poobrain!

Ignoarints posted:

Oh. Haha.

Three different SLI cables on 660 Tis and literally 4 different 770s now have been super finicky and loose. Since I already had 3 different brands I didn't try my luck with another, but I wish it were more solid. If it's off by even a little, it can cause blue lines and all sorts of other weird stuff.

I've always used the motherboard's solid adapters; they seem to work best.

Zero VGS
Aug 16, 2002
ASK ME ABOUT HOW HUMAN LIVES THAT MADE VIDEO GAME CONTROLLERS ARE WORTH MORE
Lipstick Apathy

deimos posted:

I've always used the motherboard's solid adapters; they seem to work best.

ASUS lost the ribbon-type SLI bridge I sent in with my motherboard RMA so I might use that as an excuse to get the PCB-type SLI bridge.

LuciferMorningstar
Aug 12, 2012

VIDEO GAME MODIFICATION IS TOTALLY THE SAME THING AS A FEMALE'S BODY AND CLONING SAID MODIFICATION IS EXACTLY THE SAME AS RAPE, GUYS!!!!!!!

Sir Unimaginative posted:

It could happen anywhere, really. It happened a lot less in a game or 3D application, but it was mostly associated with doing interesting things with power states that most cards (even older cards) could deal with but Fermi could not - plus some knock-on effects from Fermi being kind of an odd duck in general. Not all games and applications need full power, so a few could drop the GPU's power state. It shouldn't really be a surprise that the problem shows up least when the GPU is at its rated operating frequency/voltage/other characteristics.

And like I said before, nVidia drivers are messy. I don't have insider information describing their compatibility or on-update configuration repopulation (nVidia profiles especially have changed a lot since SGSSAA became a popular tweak), but dragging stuff up from a year ago that no one thought to clear out is entirely within the realm of possibility, and if that had a hand in causing problems, it'd be really hard to ferret out.

Which is why I suggested testing with fresh Windows that's only ever seen current drivers as a diagnostic stage (and backups because you'd have to be crazy not to when doing that sort of thing).

Thing is, this isn't limited to the Fermi cards, because apparently even the Keplers run into this issue. And in my case, at least, it's almost certainly the GPU itself because it only happens when I "push" the card. And by push, I mean do anything other than run games at low settings at reduced resolutions. Even then, that's not always enough. It's really oddly variable. I could run Dark Souls just fine at max resolution, but Space Engineers eats poo poo no matter what quality or resolution I run it at.

LuciferMorningstar fucked around with this message at 02:07 on May 28, 2014

Shaocaholica
Oct 29, 2002

Fig. 5E
So it looks like my VRAM discussion 5 pages back is now quite relevant wrt Watchdogs/Wolfenstein.

http://forums.somethingawful.com/showthread.php?threadid=3484126&userid=0&perpage=40&pagenumber=218

Seems like the days of not worrying about having enough VRAM for actual game assets (not just resolution) are coming to an end. I wonder if 2GB high-end parts will be dropping in price.

Ignoarints
Nov 26, 2010
I was really curious about it. Still not sure if bandwidth or capacity is my bottleneck, though. I'm clearly blowing my 2GB capacity out of the water with Watch Dogs, but I've been doing so on older games too. I'm sure it will be tested by someone.

"figured out" what was going on with my weird driver crash problem. I disabled SLI and re enabled it and now it's running at normal factory boost speeds (1176-1228) and using the TDP it's supposed to. For whatever reason it was trying to do 1280 mhz on both cards at 70% tdp with 0 clock offset.

edit: Oh man, MSI 780ti is $600 on newegg

Ignoarints fucked around with this message at 04:25 on May 28, 2014

BurritoJustice
Oct 9, 2012

Ignoarints posted:

edit: Oh man, MSI 780ti is $600 on newegg

Buy two, it's the logical step (2x660ti -> 2x770 -> 2x780ti). Make sure to only use them with a 1080p monitor.

Ignoarints
Nov 26, 2010

BurritoJustice posted:

Buy two, it's the logical step (2x660ti -> 2x770 -> 2x780ti). Make sure to only use them with a 1080p monitor.

Yeah already bought one. Lol

I'm sure I can find a 1080p around here somewhere. Maybe I'll hook it up to my tv for most overkill possible

Overall legitimately excited to see the difference. I have many expectations here, time to see if I'm right about them.

Hooray for purposefully losing serious GPU power for science

(and I am planning on SLI, but just not right now)

edit: if I get a reasonable resale on my 770 I can't return, combined with the Tiger Direct return, that will be enough to cover the cost of the 780ti completely since Newegg doesn't do tax yet. Not even considering the $30 rebate card and another free Watch Dogs code. Considering how much I was already in the hole, at the end of the day this MSI 780ti will end up costing me ~$420-$460 :lol:

Ignoarints fucked around with this message at 05:37 on May 28, 2014

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If you decide to go for a second 780Ti for SLI for science and opulence I'll buy your second Watch Dogs code fair market value :) I'd like to see some real world, close proximity testing done on the memory issue since I keep seeing evidence that it's a real issue and evidence that it isn't as big of an issue and it's all contradictory down to the frame analysis tools, could just be engine specific as to how assets are handled and moved around - I mean, really we've been able to max 2GB at 1080p since Fallout 3: NV came out and people started getting ridiculous with the texture mods. It got way easier with Skyrim, since the engine allowed for a lot more fluidity in similar practices. But we've had games that purposefully use every single bit of your VRAM, like RAGE, which despite its many many fair criticisms did at least do some cool poo poo in OpenGL... And we've had games that accidentally kill your card's perf, like Crysis 2, with the world's most profoundly correct Jersey barriers ever and spooky, lovingly rendered and never seen underwater tessellated oceans that pervade the level.

I'm excited to see games finally casting off the yoke of really tiny VRAM limitations, but I want to know where it actually puts us. The PS4 is the new hardware standard for kick-rear end end to end VRAM performance, without any of the obnoxious overhead of the PC when the video card talks to the rest of the system, or the unfortunate and continued choice to bottleneck the flow and try to claim an overall higher number of the XBox (One). If it turns out that basically any Really Good Card going forward needs to be able to allocate at least 3GB of VRAM for assets, that's going to be cool as heck and we ought to see some really pretty games as a result, assuming artists use their newfound asset powers for good and not eeeevil. :)

Anyway seriously I will do my part to subsidize your horrible error of judgment so we can see what happens without enough VRAM vs. with enough VRAM in the same system, let's do this man!

Shaocaholica
Oct 29, 2002

Fig. 5E
If 2GB is barely cutting it then I doubt 3GB is going to last more than 6-12 months. I don't game nearly as much as I used to, but I think during the peak of my college years my GPUs were averaging 18-24 months between upgrades. Not sure what the current forum recommendation is, but if we're talking about $300-$1000 worth of hardware then maybe even more longevity should be expected.

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance
I'm hoping this means video cards with 3GB or 4GB VRAM variants will come down in price. I would have gotten a 3GB 660Ti but it cost almost $100 more at the time and back then you didn't need that third gig of VRAM for 1080p gaming. The next video card I get is going to have at least 4GB of VRAM. :pcgaming:

Ignoarints
Nov 26, 2010

Agreed posted:

If you decide to go for a second 780Ti for SLI for science and opulence I'll buy your second Watch Dogs code fair market value :) I'd like to see some real world, close proximity testing done on the memory issue since I keep seeing evidence that it's a real issue and evidence that it isn't as big of an issue and it's all contradictory down to the frame analysis tools, could just be engine specific as to how assets are handled and moved around - I mean, really we've been able to max 2GB at 1080p since Fallout 3: NV came out and people started getting ridiculous with the texture mods. It got way easier with Skyrim, since the engine allowed for a lot more fluidity in similar practices. But we've had games that purposefully use every single bit of your VRAM, like RAGE, which despite its many many fair criticisms did at least do some cool poo poo in OpenGL... And we've had games that accidentally kill your card's perf, like Crysis 2, with the world's most profoundly correct Jersey barriers ever and spooky, lovingly rendered and never seen underwater tessellated oceans that pervade the level.

I'm excited to see games finally casting off the yoke of really tiny VRAM limitations, but I want to know where it actually puts us. The PS4 is the new hardware standard for kick-rear end end to end VRAM performance, without any of the obnoxious overhead of the PC when the video card talks to the rest of the system, or the unfortunate and continued choice to bottleneck the flow and try to claim an overall higher number of the XBox (One). If it turns out that basically any Really Good Card going forward needs to be able to allocate at least 3GB of VRAM for assets, that's going to be cool as heck and we ought to see some really pretty games as a result, assuming artists use their newfound asset powers for good and not eeeevil. :)

Anyway seriously I will do my part to subsidize your horrible error of judgment so we can see what happens without enough VRAM vs. with enough VRAM in the same system, let's do this man!

Sure man, this is actually my second code so as soon as I get it I'll PM you. Was just going to sell it on SA-Mart.

Shaocaholica
Oct 29, 2002

Fig. 5E

spasticColon posted:

I'm hoping this means video cards with 3GB or 4GB VRAM variants will come down in price. I would have gotten a 3GB 660Ti but it cost almost $100 more at the time and back then you didn't need that third gig of VRAM for 1080p gaming. The next video card I get is going to have at least 4GB of VRAM. :pcgaming:

Some interesting observations.

- 4GB 760 costs less than the 2GB on Newegg right now
- Watch Dogs being bundled with 2GB cards (probably Wolfenstein too)

Agreed
Dec 30, 2003

The price of meat has just gone up, and your old lady has just gone down

If there's any notable performance difference, I'd posit that it's likely to be WAY more noticeable going from 2GB to 3GB than from 3GB to 4GB. I'd like to wait for some numbers to substantiate that conjecture but at some point you move entirely away from base assets and toward other uses of the framebuffer and I think that those tend to be more flexible in terms of perceived total usage vs. actual render time cost. It's a truism to say that "not enough VRAM will be not enough VRAM, and that will have some consequences" - pinning that specific number down as 3GB or 4GB has a lot to do with what resolution you're gaming at and what the game is actually doing with all that VRAM.

Yes, the two hardware-intensive consoles pack a lot more game-accessible memory than the previous consoles, by quite a substantial margin - but it's limited in several ways in the case of the XBox One, and even on the PS4 it still has to pull double duty as general memory vs. video addressable for graphics tasks specifically. Let's wait until we see some numbers on current gen titles that we can reasonably say are Good PC Ports before we start getting too crazy about what is and what isn't enough VRAM. We need more (hell, some) data first, then we can get down to the fun part of the discussion. :)

spasticColon
Sep 22, 2004

In loving memory of Donald Pleasance

Shaocaholica posted:

Some interesting observations.

- 4GB 760 costs less than the 2GB on Newegg right now
- Watch Dogs being bundled with 2GB cards (probably Wolfenstein too)

That's good to know, but I'd rather wait until Maxwell before I get a video card with 4GB of VRAM, and the GTX 760 is only slightly faster than the 660 Ti anyway. I want a much faster GPU to go with the extra VRAM.

Shaocaholica
Oct 29, 2002

Fig. 5E
I'm doing some light reading on VRAM and I'm not sure where the whole 'VRAM as a function of resolution' idea came from.

A 1080p framebuffer is only ~8 MB. A 4K framebuffer is ~32 MB. Double and triple buffering don't impact those numbers much with respect to common VRAM sizes these days. And the game itself might store a few render passes as well, but those are all linear functions. Only when you get into AA does the memory mushroom, but who the hell is sacrificing VRAM for AA at a high enough resolution for it to matter? Logically you allocate free VRAM to AA if you have it. If you don't have it, then AA is a logical sacrifice.

Shouldn't VRAM choice be driven by game/engine/usage design? The impact of giving up AA is going to be less noticeable to most folks than texture resolution or whatever else the developers are shoving into VRAM. Someone wanting to play a modern or even future game at the highest settings at 1080p is going to need practically the same amount of VRAM as someone at 4K if you take AA out of the equation. I don't see why resolution should be a dominant influencing factor.
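
To put rough numbers behind that (back-of-the-envelope sketch, assuming 32-bit color buffers and ignoring compression, padding, and driver overhead):

code:
# Back-of-the-envelope framebuffer math: 4 bytes per pixel for the color buffer,
# ignoring compression, padding, and driver overhead. MSAA is treated as a rough
# linear multiplier on buffer storage.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2, msaa=1):
    # buffers: 2 = double buffering, 3 = triple buffering
    return width * height * bytes_per_pixel * buffers * msaa / (1024 ** 2)
for name, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
    single = framebuffer_mb(w, h, buffers=1)
    print("%s: ~%.0f MB per buffer, ~%.0f MB triple-buffered, ~%.0f MB triple-buffered with 4x MSAA"
          % (name, single, framebuffer_mb(w, h, buffers=3),
             framebuffer_mb(w, h, buffers=3, msaa=4)))

Even the triple-buffered 4x MSAA case at 4K only lands in the hundreds of MB, so the multi-GB budgets these games are asking for are clearly going to assets, not the framebuffer.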

HalloKitty
Sep 30, 2005

Adjust the bass and let the Alpine blast
Probably posted already, but Watch_Dogs is suffering with 3GiB or less VRAM with ultra settings: http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/3

The difference between high and ultra textures is noticeable, too: http://hardocp.com/article/2014/05/28/watch_dogs_image_quality_preview

Seamonster
Apr 30, 2007

IMMER SIEGREICH
Anybody find some Wolfenstein: NO benches yet? All I see is the console garbage.

Ignoarints
Nov 26, 2010

HalloKitty posted:

Probably posted already, but Watch_Dogs is suffering with 3GiB or less VRAM with ultra settings: http://hardocp.com/article/2014/05/27/watch_dogs_amd_nvidia_gpu_performance_preview/3

The difference between high and ultra textures is noticeable, too: http://hardocp.com/article/2014/05/28/watch_dogs_image_quality_preview

I guess the easiest test is to use a 3GB 780 and a 6GB 780. I did definitely notice a difference between high and ultra.

Seamonster posted:

Anybody find some Wolfenstein: NO benches yet? All I see is the console garbage.

Yeah, it's "60 fps" lol. It's a weird game.

http://www.hardocp.com/article/2014/05/21/wolfenstein_new_order_performance_review

Seamonster
Apr 30, 2007

IMMER SIEGREICH

Wow.

They've done gently caress all to improve the engine after 3 loving years AND made it clear as day all they care about is bending over backwards for the consoles. A shameful release.

Shaocaholica
Oct 29, 2002

Fig. 5E
^ Since when have game devs not bent over backwards for consoles (in the last decade)? ^

I see decent 4GB and 6GB cards on Newegg for less than $300, so at least it's not a (big) price issue if you're lacking the VRAM and can flip your current card.

Seamonster
Apr 30, 2007

IMMER SIEGREICH
The Battlefield series is a good example, I'd say. Console versions are limited to 32 players while PC gets the full 64.

EDIT: There's a difference between accommodating and bending over backwards for.

Seamonster fucked around with this message at 14:47 on May 28, 2014


Ignoarints
Nov 26, 2010

Shaocaholica posted:

^ Since when have game devs not bent over backwards for consoles (in the last decade)? ^

I see decent 4GB and 6GB cards on Newegg for less than $300, so at least it's not a (big) price issue if you're lacking the VRAM and can flip your current card.

I'm not disagreeing with you, but Wolfenstein takes things a little farther than normal. Some wtf things going on there.

Really, really would hold off on 4GB nVidia stuff in that price range until someone tests it first. So far, even under the most ridiculous scenarios, it's been a waste of money, but I'm totally down if new games are going to somehow make use of it while not being stuck at other limitations first. Of course, if it's the same price or cheaper there's no reason not to (like the MSI, wow).
