RGX
Sep 23, 2004
Unstoppable
I'm planning on pulling the trigger on a GTX 970 this week, and I'm opting for the MSI NVIDIA GTX 970 Gaming Twin Frozr edition; it seems to be the top-rated version, and hey, I'm a sucker for red on black. Anything about this card I should know or might have missed?

Also, up until yesterday I saw some retailers offering this card with a free copy of the new Metal Gear, but some of them have now removed the promo from the product page. Is anywhere in the UK still offering this card with a free game? It's not a huge deal or anything, but I don't upgrade very often and a free AAA game would be sweet.

RGX
Sep 23, 2004
Unstoppable
So I'm interested in the thread's opinion on the new VRR-capable TVs and what this means for gaming on the big screen. My company is going to pay for a powerful (i.e. gaming) laptop at some point this year, and I'm also looking to purchase a big-screen TV for my home, so it'd be really nice if I could just hook it up to the laptop via HDMI and have a one-stop gaming solution with a super smooth framerate. Given AMD's... less than stellar performance recently, it's likely I'd be getting something with an NVIDIA card. I've seen a few people speculating that NVIDIA will start supporting VRR to go along with the new HDMI spec and that G-Sync is likely dead in the water; how likely do we think this is to become a reality? Is it even possible to enable this via a driver change/software support, or does it have to be physically built into the card?

I've been looking with some interest at the new G-Sync TVs they announced recently, but they're likely to be much more expensive than a regular 4K screen, and some of the first impressions I've seen have commented that the backlight bloom is pretty atrocious, which is a deal breaker for me. I really don't see the logic in buying a very expensive TV that only looks good for gaming, but butter-smooth frames would be very nice.

RGX
Sep 23, 2004
Unstoppable
Haven't seen a lot of discussion on the RTX Max-Q mobile GPUs here; what's the thread consensus? From what I've read so far they're generally regarded as a pretty big waste of money given how low they're clocked: around a 30% performance drop versus the desktop cards while still running very hot. I would be prepared to drop some cash on a Max-Q 2080 laptop if it could output to my TV at 1440p 60fps with high settings, but I haven't found many benchmarks that suggest it can get even close to a stable 60 at that res.

Seems a shame, given that the gaming laptop segment really seems to be expanding at a rapid rate these days; I guess the bigger size and heat are significant issues. Even Razer's new Blade Pro has been announced with Max-Q only, whereas the previous model managed to fit a fully fledged 1080 in a slim form factor, which suggested to me their cooling solutions were becoming a lot more advanced with a bigger 17" chassis to play with. Everything I've seen with a proper 2XXX series onboard looks hilariously huge and reminds me of the gigantic space heater laptops of the mid 2000s.

RGX
Sep 23, 2004
Unstoppable

Some Goon posted:

It's hot, it's not "Too Hot".

:hmmyes:

A lot of people get way too concerned about temps imo; modern hardware is very good at throttling when things get too toasty, and those temps are nowhere near "too hot" levels. By all means have a look at your airflow if you want an efficiency project or if your card is underperforming in the games you play, but don't let it keep you up at night, particularly if you run it with hot ambient temps. That's well within reasonable margins.

RGX
Sep 23, 2004
Unstoppable
The real question is this: do we think this thread can reach page 3080 before anyone in it actually gets hold of one?

My bet is yes, by some margin.

RGX
Sep 23, 2004
Unstoppable
I miss when this thread was about wild, baseless speculation rather than a news feed about people's fruitless attempts to buy cards.

As such, let's have some speculation!

The rumour mill is churning regarding the new mobile Ampere variants, with the consensus being that we are likely to see a launch in January 2021. Leaked slides appear to show three variants, in m3060, m3070 and m3080 form, interestingly all appearing to use GDDR6 rather than the X variant. Also, AMD mobile CPU support!

So let the speculation begin. For a start, the desktop cards are notoriously power hungry, which is generally bad news for a laptop part. Mobile GPUs normally run at a much lower clock than their desktop counterparts, with lower performance to match; do we think this is likely to be the biggest discrepancy between desktop and mobile performance yet? I struggle to see how they are going to shave any meaningful power draw off the chip(s) without a massive underclock, and that is likely bad news for performance.
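
Napkin maths on that, for what it's worth: a rough sketch assuming dynamic power scales with frequency × voltage² and that voltage has to track frequency (so power goes roughly with the cube of the clock). The 320W desktop and 150W laptop figures below are illustrative budgets I've picked, not leaked specs:

# Rough scaling sketch: how far would clocks need to drop to fit a laptop power budget?
# Assumes dynamic power ~ f * V^2 with voltage tracking frequency, so P ~ f^3.
# The wattage figures are illustrative guesses, not leaked specs.

desktop_power_w = 320.0   # ballpark desktop Ampere board power
laptop_budget_w = 150.0   # generous high-end laptop GPU budget

clock_ratio = (laptop_budget_w / desktop_power_w) ** (1 / 3)
print(f"clock ratio needed: {clock_ratio:.2f}")    # ~0.78, i.e. roughly a 20-25% underclock
print(f"implied performance if perf tracks clock: {clock_ratio:.0%}")

That ignores memory power, binning and cut-down dies entirely, so treat it as napkin maths only.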

I'm also curious as to the thread's opinion on just how much performance will be lost using plain GDDR6? And, for that matter, there's the age-old problem of thermals. I've noticed over the past couple of years the rise of the gaming laptop from "hilarious oversized space heater" to "genuinely usable mobile gaming platform", but everything I've seen of the new cards suggests they will not play nicely sandwiched into a small chassis. Thoughts?

I've even seen some people speculating that Nvidia's first 7nm offerings will be these mobile chips, in a new variant we haven't seen yet, in an effort to solve the power and heat issues. Is this even feasible? I have no idea how fabs work, please send help.

RGX
Sep 23, 2004
Unstoppable
I've got a quick question that's bound to get lost amongst the stock chat, but here goes.

I've got a laptop I use for production work and light gaming that has a 960M and a 60Hz screen, and for the life of me I've never been able to force V-sync with the Nvidia Control Panel. I've had it for a few years and update the drivers regularly, and it's never once worked, despite the fact that in-game V-sync works perfectly to prevent screen tearing.

I have Fallout 4 installed, which has a really annoying V-sync implementation (it forces the game to 30fps and causes hitching), and I'm also playing a couple of other things where V-sync causes engine issues (Wolcen, the buggy piece of crap, looks super smooth with in-game V-sync enabled but gets epileptic flashing water effects; V-sync off fixes it but brings tons of tearing).

Any idea why forcing it with the drivers doesn't work at all? I've tried enabling it game by game in the control panel and in the global settings as well, but no dice. Screen tearing in Fallout 4 is really, really bad, especially indoors, and I've become pretty sensitive to it in general.

RGX
Sep 23, 2004
Unstoppable

bus hustler posted:

Do you have Intel graphics as well, or is it just a 960m without Optimus? I'd try disabling Optimus (gpu switching on demand) and running just on the 960m to see if that allows the nvidia driver to force it.

I've got the two games I mentioned set in the control panel to use the "high-performance NVIDIA processor" and that definitely isn't doing it; is there anything else I can do to make sure I'm not using the Intel chip? I'm pretty sure it does have integrated graphics as well; I've had system diagnostic tools misread it before.

RGX
Sep 23, 2004
Unstoppable

bus hustler posted:

I'd start with just disabling it in device manager and rebooting. Since the system has another one sometimes this just does the trick. Some laptops also have the option to disable in the bios. Another option is to try and force it to the most aggressive performance power plan possible

there are some older implementations of optimus (could be modern too i am not an expert) where this doesn't really work, but i'd start there. as far as i know in some implementations what happens is the nvidia card writes directly to the intel gpu's frame buffer so I could see all sorts of wonky poo poo happening here.

That's really interesting, and it sounds like a likely candidate. Thanks for the tip, I'll have a mess around with it and see what happens.

RGX
Sep 23, 2004
Unstoppable

Rinkles posted:

I'm getting a stutter watching Prime Video on my PC. Is this likely an Nvidia driver issue? It plays fine on my UHD 620 laptop.

e:might just be limited to chrome. no issue using edge

Don't know if it's related, but I recently had an odd hitching issue playing local media with VLC. No problem gaming whatsoever. I narrowed it down to the Nvidia driver, did a clean wipe with a driver remover and installed the October drivers, which solved it. I've been testing each iteration as it's been released since then, and all of them gave me intermittent video playback issues of different types.

I installed the most recent update the other day and everything seems fine again; I marathoned a load of video and it was all super smooth. I've heard other people discussing various issues with team green drivers recently, but the latest release seems to have solved my problem. I hope it works for you, as that sort of thing drives me crazy.

RGX
Sep 23, 2004
Unstoppable
https://www.rockpapershotgun.com/resident-evil-village-pc-requirements

Thought this was interesting: Capcom are listing a 3070 as the recommended requirement for 4K 60FPS in their new Resident Evil game with ray tracing on. Apparently a 2070 can manage 45FPS at the same settings.

Given how resource-intensive ray tracing currently is in other games, never mind at 4K, do we think this is because their usage of it is relatively light, or is ray tracing going to become more optimised in general? Is the resource-hogging aspect of the tech simply because developers have yet to refine the implementation, or are we still simply not there hardware-wise for it to be a viable option at 4K for all but the absolute top-end cards?

I wonder, now that developers have had a decent amount of time to play with the tech, whether we are going to see increasingly optimised versions of it that might give the lower-end cards more of a chance at decent framerates with fancy lighting.

RGX
Sep 23, 2004
Unstoppable

FilthyImp posted:

I think Dallas had a big Nerd War

Manager said no line until 6am, dweebs camped anyway.
Guys started a Rules Line at 6am counter to the Scalper Line.

Manager came out and said that Scalp Line was fake. Guy in front of Scalpline starts open carrying and complaining.

Cops are called. Manager comes out and tells Rules Line to pound sand because he doesn't want to deal with angry gun scalper.

Then Manager says 3rd line by door for some reason so it was a complete clusterfuck.

This is one of the most dystopian things I've ever read. What the actual gently caress.

RGX
Sep 23, 2004
Unstoppable
Had an interesting article pop up in my feed and would be very interested in the thread's opinion:

https://www.techradar.com/news/nvidia-rtx-4090-gpu-is-alarmingly-good-at-cracking-passwords

Is this something that might shift the metric on the prevalence of password cracking, or is it simply an evolution of something that was already commonplace? The article suggests a standard eight-character password could be brute-forced within an hour with eight 4090s. That seems like a relatively low barrier to entry for somebody willing to spend a bit of cash to break into some specific (and presumably lucrative) targets.
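
For a sense of scale, here's the napkin maths behind that claim. The ~300 GH/s per card figure is just the ballpark NTLM benchmark number that gets thrown around for the 4090, and the whole thing assumes an unsalted fast hash, so treat both as assumptions rather than gospel:

# Back-of-envelope for the "eight characters in under an hour" claim.
# Assumes an unsalted fast hash (e.g. NTLM) and ~300 GH/s per RTX 4090;
# both figures are assumptions, and a slow hash like bcrypt changes the picture completely.

charset = 95                     # printable ASCII characters
length = 8
keyspace = charset ** length     # ~6.6e15 candidate passwords

per_card_hs = 300e9              # hashes per second per card (assumed)
cards = 8
total_hs = per_card_hs * cards   # ~2.4e12 hashes per second

worst_case_hours = keyspace / total_hs / 3600
print(f"keyspace: {keyspace:.2e} candidates")
print(f"worst case: {worst_case_hours:.1f} hours")   # ~0.8 hours, so "within an hour" checks out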

RGX
Sep 23, 2004
Unstoppable
BL3 was a weird one for me: I played it on a 1060 with a mix of high/medium settings, and the benchmark stuttered like crazy. In-game? I don't think I had more than three micro-stutters in my entire playthrough. Did it do enough shader compiling in the benchmark that the game itself wasn't affected?

It seems really fishy to me when a dev seems totally blindsided by such an obvious issue, a la The Callisto Protocol. Part of any product pre-launch should be the physical act of replicating the end-user experience, i.e. right, let's get a fresh machine and a fresh copy of the game, boot it up and see what happens. It seems almost wilfully negligent that such a basic bit of testing didn't take place, to the detriment of the game's launch and presumably its initial sales. Lawsuit worthy, even?

RGX
Sep 23, 2004
Unstoppable

repiv posted:

the callisto protocol got a patch which greatly reduced the stutters just a day after the public launch, so i suspect they were fully aware of the issue and intended to fix it but it just barely didn't make it in for the deadline

If that's the case, I wonder how many potential sales were lost for the sake of a single day. Even reviewers I follow who are normally sympathetic about pre-release performance issues slated it as virtually unplayable. What a clusterfuck.

RGX
Sep 23, 2004
Unstoppable
Forza Horizon 5 spent forever pre-compiling shaders before dumping you straight into a scripted sequence that deliberately throws the kitchen sink at you: literally every environment in the game, one after the other, in a "seamless" sequence, stuttering away all the time. After you get into the game proper, there are no issues whatsoever.

It was like the whole intro was designed to get the stuttering out of the way as early as possible, but boy was it a bad first impression.

RGX
Sep 23, 2004
Unstoppable

Animal posted:

That’s actually very clever

Agreed, as solutions go it wasn't a terrible idea, but it does make you spend the first two minutes of the game wishing it would let you tweak the settings... only to find it runs very well once it's had a chance to settle down.

TBH I'd take most games doing that over the current stutter situation. Nothing brings you out of the experience quite like playing the meta-game of "did I leave something open, or is this just poorly optimised?"

RGX
Sep 23, 2004
Unstoppable

Turds in magma posted:

Let's say my biggest ambitions at the moment are Baldur's Gate 3 at 1440p 60 Hz, what's the decent value card? Still 3060 Ti? I've been out of the loop for a while.

I was reading up on whether my laptop could play it the other day, and apparently it still runs well on a 1060 at 1080p; it seems to be very well optimised.

A 3060 Ti at 1440p would give you some nice headroom, I would imagine; I wouldn't go much lower than that, simply for longevity.
