gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
is it correct that if I feel like my GPU is running too hot for my liking, but I also don't want to ramp up the fans, that the solution is to cap the FPS and/or run at lower settings, so that the GPU doesn't have to work as hard, assuming I don't mind running 30 FPS at Medium?

anything else I could be doing?
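The mechanism behind a frame cap is simple: the GPU sits idle for the unused part of each frame's time budget instead of rendering flat-out. A toy CPU-side sketch of how a limiter works (illustrative only; real limiters live in the driver, RTSS, or the game engine):

```python
import time

def run_capped(render, frames, target_fps):
    """Render a fixed number of frames, sleeping away each frame's unused budget."""
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render()  # at 30 FPS the GPU does half the work per second it would at 60
        spent = time.perf_counter() - t0
        if spent < budget:
            time.sleep(budget - spent)  # idle time instead of rendering = less heat
    return time.perf_counter() - start
```

Lowering settings attacks the other half of the equation: each frame costs less to render, so the "spent" portion shrinks too.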


sauer kraut
Oct 2, 2004

gradenko_2000 posted:

is it correct that if I feel like my GPU is running too hot for my liking, but I also don't want to ramp up the fans, that the solution is to cap the FPS and/or run at lower settings, so that the GPU doesn't have to work as hard, assuming I don't mind running 30 FPS at Medium?

anything else I could be doing?

Are you still running a 580? I'm guessing it's one of the cheaper models that struggle with the 185W TDP, like my lovely Asus Dual does.
Here are the settings that got my noise under control; maybe they'll help.
Just lop 100mV off the higher power states and drag the power slider to -25 to -30%.



If it's still real bad at -30% you might have a serious airflow problem, or need to repaste the card.
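On Linux the same power-slider move can be scripted against amdgpu's hwmon interface; a sketch assuming a single AMD card at `card0` (the `power1_cap` file takes microwatts, and writing it needs root):

```python
from pathlib import Path

def power_cap_microwatts(watts):
    # amdgpu's power1_cap hwmon file is denominated in microwatts
    return int(watts * 1_000_000)

def set_power_cap(watts, card="card0", sysfs=Path("/sys/class/drm")):
    # The hwmon directory name varies per boot (hwmon0, hwmon2, ...), so glob for it
    hwmon = next((sysfs / card / "device" / "hwmon").glob("hwmon*"))
    (hwmon / "power1_cap").write_text(str(power_cap_microwatts(watts)))

# e.g. set_power_cap(130)  # roughly -30% from a 185W board power limit
```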

sauer kraut fucked around with this message at 12:22 on Aug 21, 2020

Arzachel
May 12, 2012
Yeah, undervolting and repasting are both worth a shot.

SwissArmyDruid
Feb 14, 2014

by sebmojo

HERAK posted:

This is true but how many households might not classify a new game console, possibly the main source of media and entertainment for 5 or so years, as discretionary ?

Maybe if the PS5 had come out at the beginning of Quarantine with the big fat 1400 check.

Truga
May 4, 2014
Lipstick Apathy

Craptacular! posted:

The reasons to not go Linux only are not related to Nvidia. They're related to anti-cheat kmods. Speaking of, someone figured out how to get GeForce Now to run and made a Lutris install script.

i don't think any games i still play feature lovely anticheats so i'm ok on that front at least.

SwissArmyDruid posted:

Maybe if the PS5 had come out at the beginning of Quarantine with the big fat 1400 check.

yeah, $500 on entertainment might have been doable in march, today like 30 million are being threatened with eviction, no way are they gonna be buying consoles

cheesetriangles
Jan 5, 2011

24 GB of Vram just says to me I can install a billion more mods into a Bethesda game. Mod authors always think this gun needs 4000x4000 textures, ship it.

NewFatMike
Jun 11, 2015

24GB of VRAM might be a super good CAD card for me since it'll have G-Sync, too.

If only it had verified drivers for that sweet sweet real time SOLIDWORKS photorealistic rendering :negative:

Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.
NVidia gimps Geforce for CAD. Gotta pay that Quadro tax!

repiv
Aug 13, 2009

https://press.crytek.com/crytek-announces-release-date-for-crysis-remastered

Release date announcement for Crysis Remastered (Sept 18th) also confirms DLSS support, and hardware RTX ray tracing in addition to their software solution for other hardware

BOOTY-ADE
Aug 30, 2006

BIG KOOL TELLIN' Y'ALL TO KEEP IT TIGHT

Truga posted:

i don't think any games i still play feature lovely anticheats so i'm ok on that front at least.


yeah, $500 on entertainment might have been doable in march, today like 30 million are being threatened with eviction, no way are they gonna be buying consoles

Same goes for video cards, people are sort of jumping the gun at the moment saying they'd gladly pay $1500-2000 for a card...forgetting exactly how many people have been affected by the pandemic (job/wage losses, no 2nd stimulus yet, no additional unemployment, etc). Unless people get some government help within like the next couple weeks tops (yeah loving right), I don't see NV selling many cards at those ridiculous prices.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

OhFunny posted:

Of the consoles priced at $500 or higher, the Neo Geo and 3DO are clear failures. Sony had to spend considerable time turning things around with the PS3. The Xbone has been crushed sales-wise by the PS4 and recently overtaken by the Switch.

$500 in 1993 is not the same as $500 in 2020. You’re forgetting to account for inflation.

Internet posted:

“In other words, $500 in 1993 is equivalent in purchasing power to about $896.54 in 2020, a difference of $396.54 over 27 years.”
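That figure is just a ratio of price levels; with approximate annual-average CPI-U values (the calculator quoted above may use a different month or series, so the result differs slightly):

```python
# Approximate annual-average CPI-U levels (BLS); the quoted $896.54 likely
# uses a slightly different month or series, so expect a small difference.
CPI_1993 = 144.5
CPI_2020 = 258.8

def inflate(amount, cpi_from, cpi_to):
    """Scale a dollar amount by the ratio of price levels."""
    return amount * cpi_to / cpi_from

print(round(inflate(500, CPI_1993, CPI_2020), 2))  # ~895, same ballpark as 896.54
```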

Subjunctive posted:

This sort of study is a good start: https://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2016/03/household-expenditures-and-income

Most US households would find a $500 discretionary purchase harder to afford today than 5 years ago, I think you’ll find.

This is also a good point though.

E: Holy poo poo I had an old version of the thread pulled up, my bad

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

repiv posted:

https://press.crytek.com/crytek-announces-release-date-for-crysis-remastered

Release date announcement for Crysis Remastered (Sept 18th) also confirms DLSS support, and hardware RTX ray tracing in addition to their software solution for other hardware

Oh, wow. That's much further than I expected. I was expecting a straight port of the console remaster. DLSS and temporal anti-aliasing on Crysis are going to be huge. Its anti-aliasing was one of the original's weak points, especially with all its foliage.

quote:

For the first time a Crytek game will feature ray tracing on Xbox One X and PlayStation 4 Pro powered by CRYENGINE’s proprietary software based ray tracing solution.

If any developer could put ray tracing on legacy consoles (!) of course it's going to be Crytek. I am fascinated by the possibility of ray tracing on PS4.

Zedsdeadbaby fucked around with this message at 16:56 on Aug 21, 2020

MikeC
Jul 19, 2004
BITCH ASS NARC

BOOTY-ADE posted:

Same goes for video cards, people are sort of jumping the gun at the moment saying they'd gladly pay $1500-2000 for a card...forgetting exactly how many people have been affected by the pandemic (job/wage losses, no 2nd stimulus yet, no additional unemployment, etc). Unless people get some government help within like the next couple weeks tops (yeah loving right), I don't see NV selling many cards at those ridiculous prices.

I don't think it is the same crowd. As many people that are now in tough times, there are people who are still working and now have ample discretionary funds because they stopped going out and have no vacation expenses.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
It's completely 100% not the same people. That has a very good chance of changing over time as the loss in production moves job losses into the white collar sectors, but that's a problem for later.

The launch itself is timed perfectly for the high end market as it currently stands. The only way it could be any better is if they dropped a stim check right on top of the launch window, which won't happen, but that's just gravy anyways.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

MikeC posted:

I don't think it is the same crowd. As many people that are now in tough times, there are people who are still working and now have ample discretionary funds because they stopped going out and have no vacation expenses.

Yeah. This right here is a story of how we basically have two Americas right now: we have the well-off, mostly in tech, finance, and other white-collar jobs, who have been inconvenienced by the pandemic and whatnot, but otherwise are more or less just carrying on with their 401k's and shoving all that discretionary money they normally would have spent going out and partying into other purchases--like vidya game cards. Compared to a year ago a lot of them are more willing to spend do$$ars on high-end cards because they have the spare cash to do so, and they've been stuck inside for a while and would see it as a "good" investment in the sense that they'll likely get a lot of use out of it in the near future.

The other chunk of America, the factory workers, shop clerks, service industry, artists, and blue-collar jobs in general are pretty hosed. They have minimal spare money to buy GPUs and whatnot, but at the same time, this chunk of the country was probably never going to buy a $1k GPU in the first place. In another time they'd be your market for the 3060 and/or consoles, so it'll be real interesting to see what the gaming industry in general does to try not to miss out on being able to sell to that considerable slice of the consumer base.

e: I mean, not to take SA as representative of anything at all, but how many people in this very thread have stated that their contention is basically between the 3090 and whatever is one step down? A $2k card might be an effort to move, but a $1500 card would probably sell straight through stock pretty quickly if it had compelling enough performance.

DrDork fucked around with this message at 18:49 on Aug 21, 2020

redreader
Nov 2, 2009

I am the coolest person ever with my pirate chalice. Seriously.

Dinosaur Gum

CactusWeasle posted:

I am still on a GTX 1070. The only reason I didnt buy a 2070 was because 80% of my issue was running out of vram so I sure wasnt going to spend 500~ for a new card with the exact same amount, and sure as hell wasnt spending 1300~ for a Ti. So I am one of the idiots praying for a 16GB 3070 :pray:

I saw a rumour (probably linked here) about a 3070ti which runs a bit better according to the benchmark, than a 2080ti. If it has more than 10gb of ram (i bet 3070 has 8 and 3070ti has 16) I'm willing to buy it and use it for another 4+ years. I just want to be able to do decent ray-tracing at 2k and put everything on 'max/ultra' for at least 2 years, AND have ray-tracing. Ray-tracing is what's making me want to buy the 30 series.

edit: the 2080ti only has 11gb of ram? drat. maybe the 3070ti will have something similar.

redreader fucked around with this message at 19:12 on Aug 21, 2020

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

Zedsdeadbaby posted:

It's spitballing more than anything else. Common sense alone tells us there is no way the XSX's graphical output will match anywhere near a $1100 video card.

'Common sense' tells us that comparing the price of a card that sells in absolutely tiny volumes to the massive orders that a console manufacturer will demand for components is foolish. The economies of scale are completely different; MS likely pays far less for GDDR6 due to the amount they order at one time, plus their revenue is partly based on Xbox Live subscriptions and the software license cut. They don't have anywhere near the same demands for margins off the hardware that Nvidia requires.

Digital Foundry saw a 2-week port of Gears 5 on the Series X months ago and said the performance was basically equivalent to it running on a 2080 (and Gears 5 is well optimized on the PC).

quote:

The XSX is tiny and has to think about thermals, something AMD is notoriously poor at with high-end GPUs (vega 56 and 64 says hello).

The most common critique of the Series X when it was revealed was "omg huge". There's obviously been a lot of work put into the cooling. Why compare it to Vega? This is RDNA2, which has significantly better perf/watt. On the CPU side, as we've seen with the Zen 2 architecture as well, it scales incredibly well with lower clock speeds; just dropping the MHz down a tad cuts the power draw massively.
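A toy model of that claim: dynamic power scales roughly as C·V²·f, and near the top of the voltage/frequency curve voltage has to rise with clocks, so a modest downclock compounds. The assumption here (voltage tracking frequency one-for-one) is illustrative, not measured:

```python
def relative_dynamic_power(freq_scale, volt_scale=None):
    # P_dyn ~ C * V^2 * f; assume voltage tracks frequency near the top of the curve
    if volt_scale is None:
        volt_scale = freq_scale
    return volt_scale ** 2 * freq_scale

print(round(relative_dynamic_power(0.90), 2))  # 0.73 -> ~27% less dynamic power for a 10% downclock
```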

quote:

If it truly was 2080ti level power, their flagship game Halo Infinite wouldn't look like such a bag of poo poo now would it? I think that's the most telling thing.

It says absolutely nothing. If we go by what early random console titles look like, then I guess the PS5 is 10X more powerful based on Horizon: Forbidden West footage.

Happy_Misanthrope fucked around with this message at 19:20 on Aug 21, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
The working class is a different loving world. I grew up poor (UK) and it's no way to live. Neither of my parents made it to 70 cos they lived such lovely loving lives eating terrible red meat and junk that was out of date because that was all they could afford. I remember getting back from school and literally picking apart bread to eat because it had mold on it. And this isn't a unique thing, pretty much all of us went through this as kids in my area.

I still can't believe half the poo poo I went through growing up, I count my blessings every day while simultaneously cursing my parents who somehow thought it was a good idea to have five of us while impoverished... ??? I don't know why they did that. It just made us all miserable to no end. Thanks for reading my blog lol, whenever working class stuff comes up I always have to vent. What a shitshow man.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

DrDork posted:

money they normally would have spent going out

uh what now

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
People who are working are spending less money. They aren't going out to expensive restaurants. They aren't taking expensive vacations. They aren't spending money on gatherings. So their disposable income is way up. And when you're spending more time at home, that GPU or console which never was that expensive to you, but now might actually get some use, suddenly starts to look like a good buy.

Happy_Misanthrope
Aug 3, 2007

"I wanted to kill you, go to your funeral, and anyone who showed up to mourn you, I wanted to kill them too."

How are you confused by this, other than you being Paul

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

Happy_Misanthrope posted:

How are you confused by this, other than you being Paul

his prescience doesn't extend to reality :(

There's a lot of people who have disposable $ locked at home and are price insensitive.

The working class that got hosed by the lockdowns didn't have much $ to spend and would never spend it on a $400 graphics card had they had it.

Worf
Sep 12, 2017

If only Seth would love me like I love him!

p sure the joke is just that he didnt go out to begin with/doesnt think nerds are inclined to go out


Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

Statutory Ape posted:

p sure the joke is just that he didnt go out to begin with/doesnt think nerds are inclined to go out

^

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Happy_Misanthrope posted:

How are you confused by this, other than you being Paul

are you really that desperate to @ me on an obvious joke

FuturePastNow
May 19, 2014


Malcolm XML posted:

The working class that got hosed by the lockdowns didn't have much $ to spend and would never spend it on a $400 graphics card had they had it.

Ironically, this is normally me but the Pandemic Unemployment Assistance stars aligned and I'm considering spending too much on a video card. But if I buy an expensive one, I'll probably get 5 years out of it.

Edit: and by expensive, I mean $500ish and not anywhere near Titan territory

FuturePastNow fucked around with this message at 20:19 on Aug 21, 2020

Malcolm XML
Aug 8, 2009

I always knew it would end like this.

FuturePastNow posted:

Ironically, this is normally me but the Pandemic Unemployment Assistance stars aligned and I'm considering spending too much on a video card. But if I buy an expensive one, I'll probably get 5 years out of it.

Edit: and by expensive, I mean $500ish and not anywhere near Titan territory

It'd be bananas to spend unemployment funds on a graphics card my dude just buy a cheap last gen one if you must

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Malcolm XML posted:

It'd be bananas to spend unemployment funds on a graphics card my dude just buy a cheap last gen one if you must

I mean, while I agree with you in principle, there's a lot we don't know here. Maybe he lost his job, got the $1200, and then a few weeks later got a new, better job. Or moved home and now doesn't have to worry about rent. Or had his 6-figure-making S/O move in. Whatever.

For a guy who would be keeping a card for 5 years, I'd actually probably recommend not picking up a Turing card unless prices absolutely tank on them. The reason for this is because Ampere will (presumably--holy gently caress it would be laughable for them to not) finally add HDMI 2.1 to the spec sheet, plus most likely an optimized iteration on the RTX/DLSS stack.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

DrDork posted:

finally add HDMI 2.1 to the spec sheet

is that mostly relevant for hooking up to tvs for VRR?

OhFunny
Jun 26, 2013

EXTREMELY PISSED AT THE DNC

Rinkles posted:

is that mostly relevant for hooking up to tvs for VRR?

I think DrDork has triple 4K120hz monitors and current specs don't let them all run at max resolution and frame rate.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

OhFunny posted:

I think DrDork has triple 4K120hz monitors and current specs don't let them all run at max resolution and frame rate.

Not yet! Waiting for HDMI 2.1, you see.

But, yes, HDMI 2.1 will let you connect up to TVs/monitors that are 4k@120 w/VRR, which is pretty swank. For standard PC use, current HDMI 2.0b limits you to 4k@60, and DP 1.4 can do 4k@120Hz, but only with DSC. While this isn't an immediate concern for most people right this minute, in a few years those sorts of monitors will be commonplace, and it'd suck to lose out on that because you wanted to save $50 or something on a GPU. Again, that's a "I'm keeping it for 5 years" view, which I think applies to a minority of people in this thread.

Also, it'll naturally depend on what the prices actually end up looking like. If I had $500 and it was a choice between a 3060 or a used 2080Ti, you'd get a lot more with the 2080Ti. But if it's more like a $500 3070 vs a $450 used 2080 that gets beat by the 3070, the savings wouldn't be worth it. I guess we'll find out in a few weeks.
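The bandwidth numbers above check out on the back of an envelope (raw pixel rate only, 8 bpc RGB, ignoring blanking intervals, so real-world requirements run somewhat higher):

```python
def raw_gbps(width, height, hz, bits_per_pixel=24):
    # Uncompressed pixel data rate, ignoring blanking intervals
    return width * height * hz * bits_per_pixel / 1e9

HDMI_20_USABLE = 14.4  # Gbps (18 Gbps line rate, 8b/10b encoding)
HDMI_21_USABLE = 42.7  # Gbps (48 Gbps line rate, 16b/18b FRL encoding)

print(round(raw_gbps(3840, 2160, 60), 1))   # 11.9 Gbps: fits in HDMI 2.0b
print(round(raw_gbps(3840, 2160, 120), 1))  # 23.9 Gbps: needs 2.1 (or DSC over DP 1.4)
```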

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
HDMI 2.1 VRR is on the current RTX cards and it rules.


OhFunny posted:

I think DrDork has triple 4K120hz monitors and current specs don't let them all run at max resolution and frame rate.

Not even one monitor can do 4k/120 on HDMI right now, and even when they can, it won't be across 3 monitors unless it's a very old game.

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Taima posted:

HDMI 2.1 VRR is on the current RTX cards and it rules.

Not even one monitor can do 4k/120 on HDMI right now, and even when they will, it will not be across 3 monitors unless it's a very old game.

Yeah, just note that Turing is running HDMI 2.0b with VRR enabled, not actual HDMI 2.1. That VRR is an (optional) part of the HDMI 2.1 spec makes it kinda confusing with how some sites made the announcement, but there's no bandwidth bump from the update.

You are right that no current PC monitor supports HDMI 2.1, but we already have TVs that do (LG C9 series) for 4k@120Hz. Presumably we will see PC monitors start dropping in the not too distant future with HDMI 2.1 and 4k@120 panels, given that HDMI 2.1 has been this bizarre chicken-and-egg thing for a few years. I think we can mostly thank consoles for finally "fixing" that issue for us all.

You don't need to span a game across 3x4k@120 screens to enjoy them. Me, specifically, I'll be looking for one ~5k 21:9 @120+ central panel for gaming with two flanking 1440p@100+ panels for other things. While those side ones already exist, that central one can't exist given HDMI 2.0/DP 1.4 bandwidth, so I'm waiting for now.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?

DrDork posted:

That VRR is an (optional) part of the HDMI 2.1 spec

oh god that is gonna be a pain when looking for tvs

Cygni
Nov 12, 2005

raring to post

HDMI 2.1 will make the 4k/144/HDR monitors worth considering seriously (and producing seriously for the OEMs), and that also makes it Good

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Rinkles posted:

oh god that is gonna be a pain when looking for tvs

Oh yeah, it's gonna be a total poo poo-show. While I expect that your higher-end TVs will throw it in there pretty much all the time, the $500 budget model? Good luck! It's just gonna say "HDMI 2.1 support" and hope you don't ask too many questions.

Cygni
Nov 12, 2005

raring to post

https://twitter.com/VideoCardz/status/1296903927259111433

das a big boy

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

Cygni posted:

das a big boy

Waiting for one of the NVidia partners to steal the Thiccc naming scheme from AMD for that. Jesus.

BIG HEADLINE
Jun 13, 2006

"Stand back, Ottawan ruffian, or face my lumens!"
Wouldn't surprise me if we get our first four slot AIB card. Or at least a "3.75" slotter.


VelociBacon
Dec 8, 2009

I think my 2080ti xc ultra was 3.75 slots before I swapped the cooler but I can't remember.
