some dillweed
Mar 31, 2007

FetalDave posted:

There was an nVidia post on their forums saying they're aware of the 36 hour driver reset, and that they plan on fixing it with the next WHQL release.
Just in case anyone's wondering, the 32x branch drivers have also been causing these issues on 500 and 600 series cards, from the reports I've seen elsewhere (happens on both my GTX 580 and 770, for example). It seems like a lot of people reporting problems are on 700 series cards, though. Also, yeah, the 314.22 drivers with a modified nv_disp.inf seem to help stop the 36-hour crashes for a decent number of people.

some dillweed
Mar 31, 2007

Yeah, he posted a list of fixes that should be in the next driver:

ManuelG posted:

Here is a list of some bugs which are fixed in our upcoming driver:

-Some 27" 1440p monitors with Korean B grade panels are not detected. [1329305]
-The NVIDIA Driver fails to install on some systems. [1329909]
-TDR error may occur when the system is idle for 36 hours. [1330455]
-[Metro: Last Light]: Game performance drops with latest PhysX software. [1316114]
-[GeForce GTX TITAN][Sniper Elite V2]: The benchmark crashes when Advanced Shadows is turned on. [1253786]
-[GeForce GTX 650Ti][Surround]: When three displays are connected in 2D Surround mode, the Windows Start button cannot be clicked. [1299351]
-[SLI][Sleeping Dogs]: The game crashes when played with SLI enabled. [1327315]
-[NFS]: There is corruption on the screen when playing the game. [1315717]
-[Surround][Far Cry 3]: With 3D Surround enabled, the system hangs when changing the in-game Surround resolution. [1256120]
-[SLI][Windows Magnifier]: With SLI enabled, the application crashes when attempting to close. [1339556]
-Faceworks demo shows Digital Ira character with a yellow beard. [1338771]
-[SLI][Far Cry 3]: Severe FPS drops sometimes occur while gaming in SLI mode. [1349607]

Here is a list of bugs that are targeted for the next major driver branch. I will be continually adding more bugs to this list:
1328590 - Black artifacts on screen in Battlefield 3 when playing with the Ultra preset graphics setting on the Gulf of Oman map
1295996 - [Surround]: 320.18: Dragon Age II exhibits lower performance after driver update
1349581 - GRID 2 minimum FPS is much lower in SLI mode than in single GPU mode

some dillweed
Mar 31, 2007

Sorry if these have been discussed before, but I have a few questions for anybody with the knowledge. Basically, I'm trying to figure out the right setup for eliminating tearing in games while keeping input lag to a minimum.

First, what exactly does the "Maximum pre-rendered frames" setting in the Nvidia control panel do? From what little I've gleaned, it controls how many frames the CPU is allowed to prepare ahead of the GPU: higher values are meant to smooth out the framerate, but the more frames queued up, the more input lag you get. Is there any reason not to set this globally to 1 on a single-GPU system if you're trying to minimize perceptible input lag? Does this setting apply to both OpenGL and DirectX games? If not, how do you override it?
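
The closest I've gotten to a concrete answer is that a game can set the same limit on itself through DXGI, something like this sketch (assumes an existing D3D11 device, error handling skipped, and this is just my reading of it, not any game's actual code):

code:
// Rough sketch: cap how many frames the CPU may queue ahead of the GPU.
// As far as I understand, this is the same limit the driver's
// "Maximum pre-rendered frames" setting overrides from outside.
#include <d3d11.h>
#include <dxgi.h>

void CapFrameLatency(ID3D11Device* device)
{
    IDXGIDevice1* dxgiDevice = nullptr;
    if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                         reinterpret_cast<void**>(&dxgiDevice))))
    {
        // 1 = at most one CPU-prepared frame waiting on the GPU:
        // less input lag, but less buffering against frame-time spikes.
        dxgiDevice->SetMaximumFrameLatency(1);
        dxgiDevice->Release();
    }
}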

Also, is there a way to enable proper triple-buffered VSync for all games (OpenGL and DirectX alike)? From the limited experience and research I have on the subject, proper triple buffering with VSync seems like the best tradeoff: it removes tearing with minimal additional input lag compared to non-VSynced rendering. I don't like tearing, adaptive VSync doesn't keep tearing from happening below 60 fps (and sometimes doesn't seem to prevent it even at framerates where it should), and standard double-buffered VSync makes games feel unresponsive, at least when they drop below 60 fps.
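
For reference, the game-side half of this is basically just asking for an extra buffer on the swap chain; a minimal D3D11/DXGI sketch of the idea (not any particular game's code, and whether this behaves like "proper" triple buffering or just a deeper queue seems to depend on the swap model, which I gather is part of why tools like D3DOverrider exist):

code:
// Sketch: swap chain with one more buffer than plain double buffering,
// presented with VSync. Resolution/format/window setup omitted.
DXGI_SWAP_CHAIN_DESC desc = {};
desc.BufferCount = 3;
desc.SwapEffect  = DXGI_SWAP_EFFECT_DISCARD;
// ... D3D11CreateDeviceAndSwapChain(..., &desc, &swapChain, ...) ...

// Per frame: SyncInterval = 1 waits for the next vertical blank (VSync on).
// The extra buffer lets the GPU start on a new frame instead of stalling
// until the blank, which is where the responsiveness is supposed to come from.
swapChain->Present(1, 0);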

And lastly, I've seen some people mention that they run both VSync and a forced framerate cap, either at 60 or 59 for some reason. What does this actually do compared to just using VSync?
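
From what I can piece together, a limiter just stalls the render loop to a fixed frame time so the driver's frame queue never backs up; a toy sketch of the idea (59 fps being the example target people mention, and real tools like Inspector are obviously smarter about it):

code:
// Toy frame limiter: sleep until the target frame time has passed since
// the last present. Called once per frame, right before Present(),
// e.g. LimitFrameRate(59.0);
#include <chrono>
#include <thread>

void LimitFrameRate(double targetFps)
{
    using clock = std::chrono::steady_clock;
    static auto nextDeadline = clock::now();
    nextDeadline += std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps));
    std::this_thread::sleep_until(nextDeadline);
}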

My brain is mush.

some dillweed
Mar 31, 2007

But isn't that just like capping the max framerate below the refresh rate and leaving VSync off? Wouldn't there still be tearing?

some dillweed
Mar 31, 2007

Ignoarints posted:

I tried 59, 60, and 61 with 60hz set on the monitor (and also a plain 60 hz monitor actually to start with). I had best results with 59 for whatever reason.
Do you just use Nvidia Inspector to set the cap?

It seems like the amount of input lag associated with enabling VSync is kind of dependent on the game. I don't really notice increased latency with VSync on and capped at 60 in Borderlands 2, but Sniper Elite V2 has noticeable input lag as soon as you enable VSync in the menu and everything is obviously lagging behind your mouse movements in the game. The input lag doesn't seem to be as much of an issue if you're playing with a controller. All of the games I'm playing seem to use DirectX, so forced triple buffering from the control panel isn't an option. You can apparently force some form of triple buffering with D3DOverrider or RadeonPro, so I'll have to check that and see if it makes any difference with VSync on.

In Borderlands 2, a 60 fps framerate cap alone doesn't stop tearing, and VSync alone doesn't seem to completely stop it either, but VSync along with the framerate cap seems to eliminate it. I don't know if it would be any different or have less input lag capped at 59 instead, but if a game uses standard double-buffered VSync, I'm guessing capping the framerate anywhere below 60 fps will just make it drop to 30 automatically.

some dillweed fucked around with this message at 04:35 on Jun 6, 2014

some dillweed
Mar 31, 2007

Hm. It might be that I'm only seeing real tearing when VSync is off or the cap is above 60, and the rest is the display being too slow to keep up with fast motion in a way that just looks like tearing. I don't even know anymore. I'm just going to try poo poo and hopefully figure out a way to eliminate the tearing while keeping things responsive.

some dillweed
Mar 31, 2007

Yeah, I just tried the various capping options: Borderlands 2's own limiter and a 59 fps cap through Nvidia Inspector, plus Sniper Elite V2 and CS:GO capped through Inspector. A 59 fps cap by itself still noticeably tears in all of them, but enabling VSync with the frame limit at 59 fps does seem to essentially eliminate the extra input lag from VSync. There's next to no improvement in input lag using triple-buffered VSync alone compared to double-buffered in CS:GO, but with a 59 fps limit and triple-buffered VSync, it's like there's no extra input lag at all. I have no idea why it works, but it seems to for any game that implements triple-buffered VSync. Now I understand why people use that and D3DOverrider or RadeonPro to enable triple buffering in games without the option.

There's still slight judder/afterimage/image persistence/motion blur or whatever with fast movement (it's not really blurry to my eyes; it just looks like older frames, or parts of frames, being held too long on the display), but I'm guessing that's just a technical limitation of LCDs, or at least 60 Hz LCDs. It seems like you can't completely get rid of that without specialized technology like a strobed backlight, which has inherent problems of its own, and blah blah blah.

some dillweed
Mar 31, 2007

You said you've tested one before, but I'm just running a single GTX 770. Maybe it's a difference somewhere else in our systems, or you might just be less OCD about tearing than me. I have a tendency to nitpick the hell out of and obsess over visual issues.

some dillweed
Mar 31, 2007

1gnoirents posted:

That's 8 chips, as far as I know 1gb GDDR5 chips are (were) really expensive. But a lot of time has passed now so we shall see.
All of this GPU memory talk piqued my interest. I couldn't find anything more recent in my quick search, and I don't know where you'd go for more in-depth information, but I found a thread on HardOCP where somebody posted a couple of images from Mercury Research detailing graphics card component costs for some cards from both companies in 2009 and 2011. The figures put the cost of GDDR5 (in 2011) at around $18.45/GB. That would put 6GB and 8GB of GDDR5 at 2011 prices at $110.70 and $147.60, respectively.

I wish I knew where to find more up-to-date info on this stuff. It's kind of interesting.

some dillweed
Mar 31, 2007

Along with the 970/980 NDAs lifting, the new R343 branch GeForce driver (344.11) is up and ready for download, for those of you interested (Windows 8.1 download link).

Here's the Guru3D link for all versions: http://www.guru3d.com/files-details/geforce-344-11-whql-driver-download.html

some dillweed fucked around with this message at 04:54 on Sep 19, 2014

some dillweed
Mar 31, 2007

Sorry if this isn't really the place to ask, but is there a general consensus on the "best" method of enabling VSync in games while minimizing input latency? Still forcing triple buffering through D3DOverrider if it's not built into the game? Do I still need to test out every game with triple-buffered VSync enabled and see how the controls respond on a 59 fps cap versus 60 fps cap and all of that poo poo? How well does Nvidia's "Adaptive VSync" work? I think the last time I tried it I still noticed heavy tearing if the frame rate dropped below the refresh (which I guess is what you'd expect since it disables VSync at that point).

I'm just trying to finally get back to my huge backlog and don't want to be annoyed by tearing or unresponsive controls.

some dillweed
Mar 31, 2007

Yeah, I'm just dealing with a standard 60 Hz IPS display. It's probably one of the worst options as far as overall latency is concerned, but it's what I have to work with. As for the frame cap thing, I only mentioned it because I've tried it in combination with triple-buffered VSync in certain games in the past, and it seems to reduce the input lag (at least slightly, but still perceptibly) compared to just using triple buffering and VSync. It seems to be really dependent on the specific game: in some games it causes occasional stutter, while others don't have much of an issue. I've seen other people mention capping 2 frames below the refresh rate instead of 1, or putting the limit a frame above the refresh rate, and setting the global "Maximum pre-rendered frames" setting to 1, as possible ways to reduce input latency, but I haven't really looked into the "pre-rendered frames" setting further or tried anything with it. Dropping the frame cap further below 59 didn't seem to make any huge change on my setup, from what I remember, and I haven't tried going above the refresh rate. I have no idea if a cap above the refresh rate actually does anything with VSync enabled, though.

I apparently started making a list back in September of all of the settings that work best in the various games I've played... I wish I were less obsessive about this stuff. I need sleep.

some dillweed
Mar 31, 2007

sauer kraut posted:

Why would you use Vsync and a framerate limiter at the same time, on a 60Hz display especially?
That seems like a surefire recipe for rock stable 30fps gaming :confused:
Triple buffering with VSync doesn't drop the frame rate to even divisions of the refresh rate (60, 30, 20, 15, etc.) the way standard double buffering does. If you're using standard double-buffered VSync, then yeah, I'd think you'd want to avoid capping the frame rate, because that would likely make the experience significantly worse.

edit: Capping below the refresh rate, I mean. Like Zero VGS and other people elsewhere have said, capping above the refresh rate can provide certain advantages, I guess.
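
To put numbers on the "divisions" thing, assuming a 60 Hz refresh:

code:
60 Hz refresh            -> a vertical blank every 1000/60 = 16.7 ms
double buffering + VSync -> a frame that takes 18 ms misses its blank and
                            waits for the next one: 2 x 16.7 = 33.3 ms/frame = 30 fps
triple buffering + VSync -> the GPU keeps drawing into the third buffer
                            instead of waiting, so the rate can land anywhere
                            between 30 and 60 instead of snapping to divisions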

some dillweed fucked around with this message at 23:17 on Feb 10, 2015

some dillweed
Mar 31, 2007

Man, all of you talking about old video cards brings on some nostalgia. The first video card I ever got was an Elsa Erazor II (RIVA TNT) that I convinced my dad to buy for me because it was on sale. I think we had the PCI version, though. This was the box (except PCI instead of AGP, I'm pretty sure):

[image: Elsa Erazor II retail box]

I think we had just moved to a K6-2 400 around that time, too. I had seen and played Jedi Knight on a good quality, fast CRT at the local game store ("Microplay") and had started playing the demo on a regular basis, and wanted to experience those amazing accelerated graphics at home. Just talking about it makes me want to go and play JK again. I have some seriously great memories thanks to that game and the community that sprouted up around it.

I honestly don't remember what I moved to from there. I think a Radeon SDR might have been the first card I bought with my own money, which then had to be replaced with either a Radeon 7000 or 7200 (one of which was replaced by the other, I don't remember the order), then eventually a Radeon 9600 XT, 8800 GTS (320 MB version), GTX 460, GTX 580, and a GTX 770. I'm waiting on more RX 480 details to see if it's any good before grabbing either it or a 1070. $900+ (CAD) for a 1080 is too rich for my blood.

some dillweed
Mar 31, 2007

Kazinsal posted:

1080s are $750+ CAD. Cheapest 1070 is $560.

Furnaceface posted:

We get absolutely hosed up the rear end for parts prices up here for some reason. It's why I was relying on AMD to mess with the GPU market so I could actually buy a new card.
From the previous page, but just so you guys know, some of the prices here are basically just the cost in USD converted to CAD. The exchange rate is just that bad right now, at $1.29 CAD to every USD. A $420 USD Gigabyte GTX 1070 G1 Gaming is listed for $564 CAD on Vuugo (I have no idea if they're reputable), which is ~$20 more than the direct currency conversion, while the EVGA SC Gaming ACX 3.0 is $450 USD in the US and $590 CAD at Memory Express, which is ~$10 more than the direct conversion. It's the same deal with several GTX 1080s, as well. For others, though, it definitely seems like there's some degree of price gouging (mainly because they're out of stock everywhere else).

But basically, if the US gets screwed on prices, then we do as well because the exchange rate sucks and that seems to be how a lot of prices get determined here. It looks like it's generally around $650-700 USD for any given 1080, which should usually mean around $840-905 here. The cheapest I see at a cursory glance is $550 for a 1070 and $815 for a 1080, which pretty much matches the cheapest models you can find (constantly out of stock) on Newegg.com after conversion.
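
The napkin math I'm going by, at that $1.29 rate:

code:
$420 USD x 1.29     = $541.80 CAD   (Vuugo's $564 is ~$22 over straight conversion)
$450 USD x 1.29     = $580.50 CAD   (Memory Express' $590 is ~$10 over)
$650-700 USD x 1.29 = $838.50-$903 CAD   (hence the $840-905 estimate for 1080s)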

some dillweed
Mar 31, 2007

http://wccftech.com/gtx-1060-3gb-detailed-specs-leak/

quote:

We will be taking this leak with a grain of salt, but owing to the differences between both graphic cards, the 3GB version has been reported to retail for $149 USD, giving gamers on a very tight budget the freedom to enjoy the latest games with a large number of GPU taxing settings enabled at their highest configurable point. This clearly shows that the card will tackle the Radeon RX 470 which is AMD’s $149 US solution that features the cut down Polaris 10 GPU with 4 GB GDDR5 VRAM.
http://www.tweaktown.com/news/52981/nvidia-geforce-gtx-1060-3gb-variant-priced-149-199/index.html

quote:

Well, NVIDIA could hit a lower price point with the 3GB variant of its GeForce GTX 1060, as the company has only unveiled the 6GB version thus far. NVIDIA is reported to hit a $149 price on the partner cards, while the GTX 1060 3GB Founders Edition could be priced at $199.

What these "reports" are and who the sources are supposed to be, no idea.

some dillweed
Mar 31, 2007

The only comparison I know of offhand is the Digital Foundry one with a 2500K and 3770K compared to a 6500: http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k. It seems like increased memory speed has a bigger effect on Sandy Bridge, with relatively minor gains on Ivy for certain games. The minimums increase a decent amount in some cases, but it seems like memory speed doesn't make a huge improvement on their Ivy setup.

There are still pretty big gains in minimum frame rates going from the overclocked 2500K and DDR3-2133 to the other CPUs, though. I'm still on a 2500K at 4.5 GHz, DDR3-1600, and a GTX 770. All of this is starting to make me really feel the itch to upgrade...

some dillweed
Mar 31, 2007

wicka posted:

:agreed:, and i'm pretty surprised that anyone in 2016 cares about having an optical drive. like...what are they even using it for?
Old games I haven't made ISO copies of yet, or audio CDs I haven't ripped losslessly yet. Those are the only reasons I reinstalled my DVD drive. In any case, regardless of form factor, if you need an optical drive and don't already have one, you can generally get by with a USB-connected unit. Built-in optical drives are basically unnecessary extras at this point.

some dillweed
Mar 31, 2007

wargames posted:

With DX12 we can single out the people who give zero shits about optimization now?
Well, in Nixxes' case at least, they posted a Steam announcement a couple of days ago stating that the game isn't going to have DirectX 12 support at launch, specifically because it needs more optimization: https://steamcommunity.com/games/337000/announcements/detail/930377969893113169. They're expecting it to be done by the week of September 5.

That doesn't really explain what settings cause Very High/Ultra to run so slowly in general, though. Might be things like the contact hardening shadows, "Ultra" volumetric lighting, and "Ultra" screenspace reflections.

some dillweed fucked around with this message at 06:47 on Aug 20, 2016

some dillweed
Mar 31, 2007

Is there a consensus on whether the GTX 1060 6GB or the RX 480 is the better overall option if you're stuck at 1080p60 and like most settings cranked up? It seems like the 1060 generally does better in most benchmarked games, but the 480 outperforms it by a decent amount in others like Hitman and Doom. There are some 1060 models priced around $330-350 CAD (EVGA's mini single-fan model, Gigabyte's non-G1 version, PNY, Zotac Mini), but they're still waiting for stock or are backordered everywhere, and the RX 480 only has stock-cooler models so far, which are also all around that $340 mark.

I'm probably going to want to upgrade from my GTX 770 2GB soon but I can't justify replacing my display because it still works fine (for now), and I'd rather not spend $600+ CAD for a GTX 1070. I'm not expecting a ton of "future-proofing" from cards at this level, but if I can get something that'll at least be likely to last me until 2018 that'd be nice. Like I said, though, I'm not expecting miracles from these things.

some dillweed
Mar 31, 2007

Yeah, the lack of stock for the decent-priced models is more the issue than anything right now. The ones that are actually available at Newegg.ca, for example, are basically just at the cost they'd be after a currency conversion from the USD price, or they're being sold at $70+ markups from third-party sellers. The cheaper models are out of stock pretty much everywhere both here and in the US.

some dillweed
Mar 31, 2007

For a GTX 1070, if you have a choice between an EVGA FTW or an MSI Gaming X for the same price, or a non-OC ASUS Strix for around $30 less, is there a "best" option? In tests, I'm seeing the Strix as very slightly noisier, and the FTW as running slightly hotter and also noisier if the fans ever run at full speed (though I don't know what kind of situation would make them run at full speed). In terms of support, some people seem to have problems with both ASUS and MSI, but both have service locations in Canada, while EVGA might have a slightly better reputation but is located in the US.

Furnaceface posted:

Also why the gently caress is the 1060 3gb $100 more than the RX480 8gb? Wasnt that supposed to be the nvidia option for us poors?
Huh? In stock on Newegg:
Most RX 480 8GB are around $330-340 or $390 for the Sapphire Nitro+.

some dillweed fucked around with this message at 07:53 on Aug 25, 2016

some dillweed
Mar 31, 2007

Grog posted:

For a GTX 1070, if you have a choice between an EVGA FTW or an MSI Gaming X for the same price, or a non-OC ASUS Strix for around $30 less, is there a "best" option?
Anyone have any opinions on this? I'm thinking of going in and buying a card today. Either one of those or possibly the single slightly overpriced 1060 in Winnipeg. The ASUS is also actually around $40 CAD cheaper.

I'd rather get the quietest cooler that performs well, but reviews don't go super in-depth on the kind of noise these things make, just the decibel level. At defaults, the ASUS seems to cool the best but is slightly louder and slightly noticeable because of it, the MSI is quieter and cools slightly less effectively, and the EVGA is supposedly quiet like the MSI but cools even less effectively.

some dillweed
Mar 31, 2007

http://www.evga.com/support/stepup/

They say you can step up to a higher model of the same card. The time it takes to go through the process seems to depend on the model you want and its demand, and is decided on a case-by-case basis.

e: And yes, it also looks like EVGA is the only company that offers anything like a "Step-Up" program.

some dillweed fucked around with this message at 02:19 on Aug 26, 2016

some dillweed
Mar 31, 2007

Anybody have any experience with the EVGA GTX 1060 SC? It's selling for about $60 (CAD) cheaper than an MSI Gaming X, but someone on SPCR got one and says the fan's bearing starts to "howl" at speeds around 1100 RPM, and some people on EVGA's forums can hear it closer to 1000 RPM. The only reviews I've found with temperatures have it around 75°C at load and "quiet" at 38 dB on the newer "0dB" BIOS, with the previous BIOS having the load noise around 5 dB louder at ~1500 RPM. If you overclocked it and wanted to keep temperatures low enough to avoid significant throttling, it sounds like the fan noise might get annoying.

I have overly sensitive hearing and I guess I'm just trying to decide if extra quiet is worth an extra $60.

some dillweed
Mar 31, 2007

BIG HEADLINE posted:

One person's 'howl' is another person's 'quiet enough for me.'

As for grades...someone surmised a while back that it seems that very little 'binning' is going on with the 10x0 chips, and the lack of the EVGA Kingpin, Classified, and 'Lightning' SKUs from MSI seems to kind of confirm that. My guess is it comes down to 'does it work? Yeah? Okay, slap it on a loving card, we've got orders to fulfill" at this point.
Yeah, I wasn't really worried about binning and am more concerned with the noise and keeping the card cool enough. If I could find a recording of the fan somewhere then I'd be able to decide easily, but nobody seems to have recorded it.

some dillweed
Mar 31, 2007

BIG HEADLINE posted:

Like this? https://www.youtube.com/watch?v=kOEryFcZfnY

Or this? https://www.youtube.com/watch?v=awB3VivRPAo

Not that these should be taken as gospel, because every card is potentially different - either better or much worse.
Yeah, like those. My problem is finding recordings of the specific cooler on the EVGA 1060 SC, since it uses a single-fan setup. It's supposedly the same cooler as on their 960 SC, but I can't seem to find any recordings of that either.

Lockback posted:

I have one, I can hear it a bit under full fan load but that was pretty rare playing doom. I'd call it a quiet card.
Thanks. I've found a couple of other people who said it's usually quiet, but others who say the opposite. The lack of reviews and detailed accounts and then conflicting reports about it are making me doubt everything.

some dillweed fucked around with this message at 01:07 on Aug 27, 2016

some dillweed
Mar 31, 2007

Lockback posted:

There's some luck of the draw, some impact on case acoustics and heat dissipation affecting how hard the fan has to work. I don't think you're going to get an answer; I'd order one and return it if it's too loud.
Thanks for the advice. I'm sure it's still a great card, but I decided to go with the MSI anyway. I debated with myself over it for way too long, but I ended up choosing the MSI because the retailer (like most of them, at least in Canada, it looks like) charges a 15% restocking fee if the product is opened. Add that to the fact that MSI has an RMA service center actually within Canada, whereas EVGA only has its location in the US, and I wouldn't be saving any money if I didn't like the sound or anything eventually went wrong with the card. I'm coming from a cooler that's extremely quiet most of the time (this Gigabyte GTX 770), and from listening to recordings of the MSI cooler, I'm fairly confident it'll basically be silent even during heavy use. With everything else, I just didn't feel like taking a chance on the EVGA. Again, though, thanks for the help, Lockback and BIG HEADLINE.

Man. I remembered today why I don't like buying hardware. I fret over it and have way too much ingrained guilt over spending money on these things. Thanks, childhood! :cry:

some dillweed
Mar 31, 2007

repiv posted:

They show the RX480 8GB taking a smoothness hit from DX12 as well. It's not as bad as the Fury, but it's still hitting 70-80ms peaks every second or so :shrug:

Mankind Divided's current DX12 support is part of a beta patch. Not that I'm saying a more polished release will fix everything, but maybe it's just a problem/bug with the way their DX12 render path works right now. They also posted this as part of the patch announcement:

quote:

Known DirectX 12 issue:
- There is a known bug that causes some very high end cards to regress relative to DirectX 11. This bug is being addressed by the development team.

Still, it doesn't negate the point that average frame rate alone doesn't tell you everything about performance.

some dillweed
Mar 31, 2007

Hey, uh, in case anybody else is using the modified-driver method of EDID override for your Nvidia card, don't forget to disable driver signature enforcement. My tired brain completely forgot to check that, and I just spent around an hour and a half trying to install my new 1060. I looked up the other EDID override methods and ended up installing the clean driver and doing manual registry edits before realizing that driver signature enforcement might have been turned back on. Yeeeeaah.
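
For anybody else who forgets: the usual way to keep signature enforcement off across reboots is Windows test-signing mode from an admin command prompt (generic Windows stuff, nothing Nvidia-specific, so treat it as a pointer rather than gospel):

code:
bcdedit /set testsigning on    :: then reboot; a "Test Mode" watermark appears
bcdedit /set testsigning off   :: turn it back off when you're done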

:pseudo:

some dillweed
Mar 31, 2007

EdEddnEddy posted:

Can't you just push a custom res in the Nvidia Control Panel or is it to push it outside the clock range that that tool allows?

My 1440P screen can go up to 110Hz easy, but it artifacts like mad; 100Hz works but still shows a few bad pixels in specific scenes, and at 90Hz it's perfect, which is good enough.

And I can't even use my ASUS 120Hz one anymore because passive DP -> DVI adapters suck and can only do 1080P at 100Hz, with the screen whining at me on every power-up that the cable is wrong. Stupid single DVI port on modern GPUs. :argh:
In my case, the EDID override is for my TV to disable HDMI audio and be recognized as a DVI-connected monitor instead, which lets it display a proper 4:4:4/RGB full range image. It's also stuck at single link DVI speeds with the HDMI-DVI cable, so unfortunately I can't get over the default 1080p60. I tried an active HDMI-DisplayPort adapter, but it doesn't enable 4:4:4, at least on this TV.

some dillweed
Mar 31, 2007

I don't know what the differences are between Nvidia and AMD cards when it comes to memory usage/handling/whatever, but at least Guru3D's tests on a 12 GB Titan X showed nowhere near the 7.3 GB VRAM usage that Computer Base saw in Mankind Divided on 1080p Ultra (they were a little under 4 GB). TechPowerUp also mentioned the same 5.5 GB figure as Guru3D got for Ultra at 4K, but that was also on another Titan X.

some dillweed
Mar 31, 2007

Well, if you're using settings where you'd want a 9 GB 1060, you'd likely need a higher performance card.

woodenchicken posted:

All I know is Rise of the Tomb Raider laughs at my 8gb 1070 and drops to 50fps if I choose highest textures.
(All other 'big' options like AA and AO also need to be turned to second highest. And that's on 1080/60. That card is no overkill.)
Every review/test and player report/complaint I'm seeing says it uses around 6.5-7 GB max at 1080p. It's probably not just textures causing the performance drops at Very High settings. What happens if you turn everything else down to the second-highest setting or High and leave the textures at Very High? You could also check VRAM usage with some kind of monitor (like Afterburner) if you really wanted to know how much the game uses at the highest settings.

some dillweed fucked around with this message at 21:50 on Sep 15, 2016

some dillweed
Mar 31, 2007

From last page:

japtor posted:

Probably gonna wait it out until Black Friday (considering I'm basically getting this for Pac Man of all things), but assuming nothing really changes in terms of any new cards coming, is the EVGA 1060 6GB the one to get if I want a short card? Doesn't seem like there's that many other options and I vaguely recall some positive posts or something about it. And is the SC version worth the extra :10bux: or however much it is? I don't really plan on OCing or anything so maybe it'd be worth the higher base clock?
According to some reddit posts, customer reviews, and posts on EVGA's forums, the non-SC cooler is pretty bad. Some people talk about 85+°C temperatures under regular gaming loads using the stock fan settings, and the throttle point is supposed to be 83°C. That might be due to poor airflow in their cases, I don't know. The standard one supposedly uses EVGA's GTX 950 cooler, which is just a solid block heatsink, whereas the SC version uses their 960 SC cooler, which has a couple of heatpipes and generally seems to run around 10°C cooler. If there isn't much of a price difference (it's $15 CAD here), you should probably get the SC.

some dillweed fucked around with this message at 20:53 on Sep 17, 2016

some dillweed
Mar 31, 2007

Are you specifically looking for one that works with Windows 10? Just doing a basic Google search and filtering for the past month brought up 372.60 as the most recent iCafe driver, but it only mentions being for 7 through 8.1. It doesn't look like they're really working on iCafe drivers that are compatible with 10, so if that's what you're waiting on then you'll probably be waiting a while. That Guru3D download is through the Chinese Nvidia site; LaptopVideo2Go has a link to the German site if it makes any difference. Those drivers are supposed to support all three of the 10-series cards.

some dillweed
Mar 31, 2007

With pretty much every manufacturer except EVGA, it sounds like whether you'll have a good RMA experience depends on the region you live in and the specific service person you get. After reading multiple accounts of terrible customer service from ASUS and Gigabyte, it's hard to trust most AIB partners for post-purchase support. At the very least, horror stories are nowhere near exclusive to MSI. Aside from the universally loved EVGA, MSI seems to be one of the few decent companies to buy hardware from in Canada, which is one of the main reasons I chose to buy that MSI 1060. If they had a terrible rep here like the other companies, I would have just bought another EVGA and dealt with possibly having to ship internationally for any RMAs.

Also, not really relevant, but why would you buy from TigerDirect? They haven't been reputable in more than a decade.

some dillweed
Mar 31, 2007

I have the same card with the same Samsung RAM. At least right now, you shouldn't need to do anything. One of MSI's moderators has posted an updated BIOS for that card on their forums a couple of times, but they aren't giving any kind of changelog so nobody knows what's different, and if you're not actually experiencing any problems then you shouldn't need to update the BIOS at all.

some dillweed
Mar 31, 2007

Is your BIOS up-to-date, and is your monitor plugged into the video card and not the integrated graphics? Other people using those ASUS Z77 boards seemed to be able to properly disable the integrated GPU after updating their BIOS/UEFI and changing those settings you've already mentioned.

some dillweed
Mar 31, 2007

According to bit-tech's review, that model doesn't have a silent/passive mode, and that 37% speed is the lowest it will go. The Zotac GTX 1060 Mini apparently also doesn't go below 40% fan speed, and other models of Pascal cards also don't have a passive mode.

some dillweed fucked around with this message at 20:58 on Feb 18, 2017

some dillweed
Mar 31, 2007

Are we still doing the GPU history thing? I couldn't really afford to upgrade my graphics card very often. That didn't stop me from buying a couple of subpar cards.

Riva TNT (Elsa Erazor II, the PCI version)
Radeon 7200 SDR (still PCI)
Radeon 7000 to replace the dying 7200 SDR above (thanks for downgrading a naive kid from an already bad card, Future Shop)
Radeon 9600 XT (with the Half-Life 2 voucher thing. I was hyped for Half-Life 2.)
8800 GTS (320 MB)
GTX 460 (1 GB)
GTX 580
GTX 770
GTX 1060 (6 GB)

Aside from the 7200 SDR starting to show red artifacts over everything and causing reboots as it slowly died, I luckily haven't had any (hardware) problems with my cards.
