Arivia
Mar 17, 2011

punk rebel ecks posted:

I guess. I just really want to upgrade my build. It's literally ancient.

Ea-Nasir, this GPU only does one triangle per second!

Sagebrush
Feb 26, 2012

punk rebel ecks posted:

I guess. I just really want to upgrade my build. It's literally ancient.

Then upgrade your build and get vastly increased performance today*, but accept that maybe you won't get 4k 60fps with raytracing on ultra on the newest games that aren't even out yet. Turn down one or two settings though (I recommend using DLSS) and you should be fine well into the future.





*you may not actually be able to buy a card today

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Sagebrush posted:

Then upgrade your build and get vastly increased performance today*, but accept that maybe you won't get 4k 60fps with raytracing on ultra on the newest games that aren't even out yet. Turn down one or two settings though (I recommend using DLSS) and you should be fine well into the future.





*you may not actually be able to buy a card today

But I'm still on a 2500k and 970. :(

Dr. Video Games 0031
Jul 17, 2004

The main issue is that 4K is just stupidly expensive to render. Unless you're outputting to a large TV or something, just stick to 1440p and you'll be able to max out everything without a problem (for now).

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.
My i5-2500k will end up lasting me 12 years. 970 will last me 6 years. Jesus loving christ. That's unheard of mileage.

teagone
Jun 10, 2003

That was pretty intense, huh?

Otakufag posted:

I had that happen to me in Destiny 2 and Desmume DS emulator. Was able to fix this by enabling g-sync only for fullscreen, not windowed+fullscreen.

From what I remember, G-Sync is only enabled for fullscreen already. He's able to recreate the issue consistently by alt-tabbing though. So weird!

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Has he capped framerate to 140? I strongly suspect this behavior is related to frames coming too fast and VRR toggling off and on.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy

punk rebel ecks posted:

I guess. I just really want to upgrade my build. It's literally ancient.

punk, we've been on this train for a while now, and my advice to you is to upgrade your CPU, mobo, and RAM. Get a 5600X, or an i3-10100(F) if you're really on a budget. CPU prices are pretty stable at the moment, and availability is good.

worry about the GPU later if you can't make up your mind right now.

NeverRelax
Jul 16, 2021

by Jeffrey of YOSPOS
A 5600X and a B550 Tomahawk is a really sweet combo that won't hold back any of the top GPUs for some time to come. I built three 5600X builds for friends, and it's a little upsetting how close they come to my 5900X build at a fraction of the price. If I were to do it again, I would go 10th gen Intel; the prices are so good right now, and you can get something 8-core and overclockable for the same price as a 5600X.

Inept
Jul 8, 2003

punk rebel ecks posted:

My i5-2500k will end up lasting me 12 years. 970 will last me 6 years. Jesus loving christ. That's unheard of mileage.

the 970 will be 7 years old in 2 weeks :haw:

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
Kinda nuts. The GTX 970 came out during the Xbox One's generation, the same way Ampere launched alongside the PS5/Series X. It's wild to think that a midrange GPU an entire console generation old is still in demand because of how hosed things are.

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Inept posted:

the 970 will be 7 years old in 2 weeks :haw:

:negative:

FuturePastNow
May 19, 2014


NeverRelax posted:

Up until recently, 3060 Tis had been super rare, way harder to find than a 3080.
Were they all being made into 3070s? Would that indicate that 3060 Tis might have pretty decent headroom, making an overbuilt cooler possibly worthwhile?

The 3060 Ti had the best price:perf ratio for mining.

edit: best for power efficiency too. Basically miners ate almost 100% of the 3060 Ti production until the LHR cards.

FuturePastNow fucked around with this message at 18:41 on Sep 5, 2021

shrike82
Jun 11, 2005

https://twitter.com/VideoCardz/status/1434453987781226496?s=20

i hope so

Alan Smithee
Jan 4, 2005


A man becomes preeminent, he's expected to have enthusiasms.

Enthusiasms, enthusiasms...
"geek is back baby!"

*sheldon noises*

Dr. Video Games 0031
Jul 17, 2004

punk rebel ecks posted:

My i5-2500k will end up lasting me 12 years. 970 will last me 6 years. Jesus loving christ. That's unheard of mileage.

I know you're looking for a huge upgrade that'll get you very high end performance, but if you (or anyone else here) is willing to settle for something more midrange, then the RX 6600 XT is still available for prices near-ish MSRP on Newegg if you get them in a combo deal with some motherboards: https://www.newegg.com/p/pl?d=rx+6600+xt&Order=1

I would never want to spend $500 on a 6600 XT class of graphics card in a normal market, though I suppose I paid almost that much for my almost identical performing 5700 XT a couple years ago and had no complaints. Still, this is an option worth considering for anyone looking to build a new midrange PC since these are also pretty reasonable motherboards they're being paired with.

edit: though lol at the only itx motherboard combo coming with a triaxial.

Dr. Video Games 0031 fucked around with this message at 12:44 on Sep 5, 2021

Yeep
Nov 8, 2004

Dr. Video Games 0031 posted:

edit: though lol at the only itx motherboard combo coming with a triaxial.

I finally settled on a Gigabyte Eagle 6700 XT for my ITX build. Finding a strictly two-slot card was much harder than meeting the length constraints.

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!
How is AMD's FSR/FidelityFX/whatever supposed to work? Is it applied at the very end of the render? If so, what's the point? I'm just watching some Farcry 6 gameplay supposedly recorded at 4K60, and I can clearly see it's upscaled because the UI is jaggy. It pretty much looks the same as if I were playing 1440p upscaled by my 4K display.

You might as well just run it native 4K, subsample it and hope TAA makes something decent out of it (I'm currently playing FC5 at 0.7x rendering at 4K, and it seems to work OK, and I get sharp UI).

I wish these clowns would put DLSS in.

Dr. Video Games 0031
Jul 17, 2004

Combat Pretzel posted:

How is AMD's FSR/FidelityFX/whatever supposed to work? Is it applied at the very end of the render? If so, what's the point? I'm just watching some Farcry 6 gameplay supposedly recorded at 4K60, and I can clearly see it's upscaled because the UI is jaggy. It pretty much looks the same as if I were playing 1440p upscaled by my 4K display.

You might as well just run it native 4K, subsample it and hope TAA makes something decent out of it (I'm currently playing FC5 at 0.7x rendering at 4K, and it seems to work OK, and I get sharp UI).

I wish these clowns would put DLSS in.

No, FSR is applied midway through the rendering pipeline, before the UI is drawn. Whatever UI issues you're seeing aren't FSR's fault unless they really hosed up the implementation.
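
For the curious, here's roughly where a proper integration puts it. A toy Python sketch with made-up names, not any real engine or AMD API:

    # Toy sketch of a frame when the game itself integrates FSR.
    # All names here are illustrative stand-ins.

    RENDER_RES = (2560, 1440)   # internal 3D render resolution
    OUTPUT_RES = (3840, 2160)   # native display resolution

    def render_scene(res):
        return {"scene_res": res}

    def post_process(frame):                  # TAA, tonemapping, etc., at render res
        return frame

    def fsr_upscale(frame, target_res):       # spatial upscale + sharpen
        frame["scene_res"] = target_res
        return frame

    def draw_ui(frame):                       # HUD composited after the upscale,
        frame["ui_res"] = frame["scene_res"]  # so it's rendered at native resolution
        return frame

    frame = draw_ui(fsr_upscale(post_process(render_scene(RENDER_RES)), OUTPUT_RES))
    print(frame)   # {'scene_res': (3840, 2160), 'ui_res': (3840, 2160)}

If the HUD looks jaggy, either the game composites the UI before the upscale (a botched integration) or the scaling is happening somewhere else entirely (display, driver, or the video you're watching).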

Kazinsal
Dec 13, 2011

Dr. Video Games 0031 posted:

No, FSR is applied midway through the rendering pipeline, before the UI is drawn. Whatever UI issues you're seeing aren't FSR's fault unless they really hosed up the implementation.

It's Ubisoft, so really, take your pick.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.

Combat Pretzel posted:

How is AMD's FSR/FidelityFX/whatever supposed to work? Is it applied at the very end of the render? If so, what's the point? I'm just watching some Farcry 6 gameplay supposedly recorded at 4K60, and I can clearly see it's upscaled because the UI is jaggy. It pretty much looks the same as if I were playing 1440p upscaled by my 4K display.

You might as well just run it native 4K, subsample it and hope TAA makes something decent out of it (I'm currently playing FC5 at 0.7x rendering at 4K, and it seems to work OK, and I get sharp UI).

I wish these clowns would put DLSS in.

Sounds like bad video encoding.

FSR isn't as good as DLSS, but on games that already use TAA (to get rid of shimmering) it's legitimately effective as an upscaler when used on quality mode. It's hard to tell native and quality mode apart at higher resolutions such as 1440p and 4K.

Zedsdeadbaby fucked around with this message at 12:58 on Sep 5, 2021

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Zedsdeadbaby posted:

Sounds like bad video encoding
Nah, UI text clearly looks like fractional scaling.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Whatever that may be, it's not FSR, which is applied before the UI, unless you're using unofficial third-party hacks like Magpie.

gradenko_2000
Oct 5, 2010

HELL SERPENT
Lipstick Apathy
speaking of FSR, Lossless Scaling has been updated recently:

* you can now set a completely custom scaling factor, for cases where the resolutions supported by the game don't line up exactly with what would naturally fill the screen. In some cases, you can use this to get an internal render resolution that's even "better" than what FSR allows. For example, FSR's Ultra Quality uses a 1.30x scaling factor, but you can do something like 1632 x 918, with a scaling factor of about 1.17x, to fill a 1080p screen (worked numbers after this post). Of course, with a higher render resolution, the performance gains relative to native will be smaller.

* it implemented "FSR Lite", an open-source modification of the FSR algorithm that cuts the overhead even further than the original; a particular developer made it to get performance good enough for mobile applications.

EDIT: in reference to this latest discussion, if you do try to use these "injection" methods to implement FSR, then the UI will be affected, as opposed to a game that supports it at the in-game setting level
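
To put numbers on the custom-scaling-factor bullet above (plain arithmetic in Python, nothing from the Lossless Scaling app itself; the mode ratios are the published FSR 1.0 per-axis factors):

    # Per-axis scaling factor = output resolution / internal render resolution.
    def scale_factor(render_res, output_res):
        return output_res[0] / render_res[0], output_res[1] / render_res[1]

    def render_res_for(output_res, factor):
        return round(output_res[0] / factor), round(output_res[1] / factor)

    print(scale_factor((1632, 918), (1920, 1080)))   # ~1.176x per axis, the "1.17x" above
    print(render_res_for((1920, 1080), 1.30))        # FSR Ultra Quality: about 1477 x 831
    print(render_res_for((1920, 1080), 1.50))        # FSR Quality: 1280 x 720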

fknlo
Jul 6, 2009


Fun Shoe

Sagebrush posted:

28 minutes for an in-store only sale.

There were some that went out this way a week or two ago locally as well. My store didn't have any, and the guy pulled out his phone to tell me which stores might; I told him he didn't need to bother because I didn't care enough to go anywhere else. I just went to the one in my town because I was already out.

I'll probably end up just grabbing the 3080 Ti my coworker has coming. I'll never find a regular 3080, and I still have $900 on my dresser from when I sold my 1080 Tis late last year/early this year to eat up a ton of the cost.

Salean
Mar 17, 2004

Homewrecker

punk rebel ecks posted:

But I'm still on a 2500k and 970. :(

oh look, it's me

repiv
Aug 13, 2009

gradenko_2000 posted:

EDIT: in reference to this latest discussion, if you do try to use these "injection" methods to implement FSR, then the UI will be affected, as opposed to a game that supports it at the in-game setting level

The sharpening pass will run after post-processing too, which can cause quality issues

If you're using external sharpening (with or without upscaling) you should probably at least reduce or disable film grain in-game, as sharpening filters tend to amplify the noise
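
To illustrate the above (and gradenko's EDIT): with an injector, the upscale and sharpen run on the finished frame, so the HUD and any film grain get processed along with the scene. Another toy Python sketch with made-up names, not how Magpie or Lossless Scaling are actually written:

    # Toy model of externally injected upscaling (Magpie / Lossless Scaling style).
    # The game has already composited everything at the low render resolution.

    def finished_game_frame(render_res, film_grain=True):
        return {"scene_res": render_res, "ui_res": render_res, "grain": film_grain}

    def injected_upscale_and_sharpen(frame, output_res):
        out = dict(frame)
        out["scene_res"] = output_res
        out["ui_res"] = output_res               # UI was baked in at render res, now stretched
        out["grain_amplified"] = frame["grain"]  # the sharpen pass boosts the grain too
        return out

    print(injected_upscale_and_sharpen(finished_game_frame((1632, 918)), (1920, 1080)))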

punk rebel ecks
Dec 11, 2010

A shitty post? This calls for a dance of deduction.

Dr. Video Games 0031 posted:

I know you're looking for a huge upgrade that'll get you very high end performance, but if you (or anyone else here) is willing to settle for something more midrange, then the RX 6600 XT is still available for prices near-ish MSRP on Newegg if you get them in a combo deal with some motherboards: https://www.newegg.com/p/pl?d=rx+6600+xt&Order=1

I would never want to spend $500 on a 6600 XT class of graphics card in a normal market, though I suppose I paid almost that much for my almost identical performing 5700 XT a couple years ago and had no complaints. Still, this is an option worth considering for anyone looking to build a new midrange PC since these are also pretty reasonable motherboards they're being paired with.

edit: though lol at the only itx motherboard combo coming with a triaxial.

Is the 3060 really that much weaker than say the 3080?

Sagebrush
Feb 26, 2012

Objectively yes. The 3080 is around 40% faster.

Compared to your 970, no. Both would be an enormous upgrade.

Endymion FRS MK1
Oct 29, 2011

I don't know what this thing is, and I don't care. I'm just tired of seeing your stupid newbie av from 2011.

K8.0 posted:

Has he capped framerate to 140? I strongly suspect this behavior is related to frames coming too fast and VRR toggling off and on.

Is this still the accepted practice? I mean using the Nvidia control panel to cap the frame rate at 140 on a 144Hz monitor.

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
Yeah, that's still the way to go. It doesn't really matter whether it's via the Nvidia control panel, something like RTSS, or even a game's own framerate limiter; the end result is the same. Some people will tell you input latency is slightly higher or lower with one or the other, but you're not going to notice it, since most of the latency improvement comes simply from using VRR anyway.

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
VRR only works when frames arrive more slowly than the max refresh rate. VRR monitors can slow down, but they can't speed up past their maximum speed. If the time between frames is less than the time between refreshes, VRR is off and you get standard vsync-on or vsync-off behavior.

If you're in a situation where VRR is rapidly toggling on and off because frametimes keep going slightly above and below the minimum refresh time, weird behavior often crops up. Sometimes it's flickering, sometimes it's frame pacing issues, all kinds of stuff depending on exactly what is happening with frame times, the particulars of your monitor, driver revisions, etc.

It's stupid that Nvidia doesn't frame cap by default; hell, the driver could even detect when a game is or isn't capping itself properly and toggle the driver frame cap appropriately. But honestly, no one gives a gently caress about making the VRR experience simple and smooth for the average user.

There are situations where you'd consider doing otherwise, but by default anyone who has a VRR monitor should set the global profile to cap framerate slightly below the refresh rate and force vsync on. Then, if you want to optimize latency in a game that has a good built-in frame limiter, set it to the same value and make an Nvidia profile for that game that disables the driver frame limiter. That's about it for most people.
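
A toy illustration of that on/off toggle, assuming a hypothetical 48-144 Hz VRR range (not how the driver decides anything, just the basic arithmetic):

    # VRR can only track frames that arrive within the monitor's refresh window.
    # Real monitors and drivers add LFC and margins; this is just the basic idea.

    def vrr_tracking(fps, vrr_min_hz=48, vrr_max_hz=144):
        return vrr_min_hz <= fps < vrr_max_hz

    for fps in (40, 60, 141, 144, 160):
        state = "VRR tracking" if vrr_tracking(fps) else "outside VRR range -> vsync on/off behavior"
        print(f"{fps:>3} fps: {state}")

    # Hence the usual advice: cap a few fps below max refresh (e.g. ~141 on a 144 Hz
    # panel) so frametimes never dip below the monitor's minimum refresh interval.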

punk rebel ecks posted:

Is the 3060 really that much weaker than say the 3080?

If we were in a sane market I'd say no one should buy a 3060 because it's the worst value in the Ampere stack at MSRP, but the market is completely hosed and if you want to buy a GPU you do you.

Vasler
Feb 17, 2004
Greetings Earthling! Do you have any Zoom Boots?

Inept posted:

the 970 will be 7 years old in 2 weeks :haw:

I'm right there with you, friend. I've been on EVGA's waiting list since October 2020.

Death On Toast
Aug 2, 2006
The better half of the Brothers Douche.

Sagebrush posted:

Objectively yes. The 3080 is around 40% faster.

I think you got your numbers backwards here. The 3060 is about 40% slower than the 3080; going the other way, that makes the 3080 about 67% faster than the 3060. Per HWUB's 14-game average, the 3080 is 60% faster at 1080p and 73% faster at 1440p, so that all checks out.
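
For anyone who trips over the asymmetry, the conversion is just a change of baseline:

    # "X% slower" and "Y% faster" use different baselines, so they aren't symmetric.
    slower = 0.40                    # 3060 is ~40% slower than the 3080
    faster = 1 / (1 - slower) - 1    # 1 / 0.6 - 1 = 0.666...
    print(f"{faster:.0%} faster")    # -> 67% faster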

slidebite
Nov 6, 2005

Good egg
:colbert:

punk rebel ecks posted:

But I'm still on a 2500k and 970. :(
Mrs. Slidebite is still rocking my ancient 2500k and GTX780 system. For her games/uses, it still has lots of life.

I was going to put my 1080ti into her system when I got my 3090, but I would have been crazy not to sell it to recoup some $$$.

The 2500k was a great processor.

teagone
Jun 10, 2003

That was pretty intense, huh?

K8.0 posted:

Has he capped framerate to 140? I strongly suspect this behavior is related to frames coming too fast and VRR toggling off and on.

Framerate is capped at 141, yeah. Interesting. What's the remedy to that? Uncapping framerate? Blurbusters told me to cap all game framerates to 141 or below if using a 144Hz display with G-Sync enabled.

[edit] Oh, wait, were you assuming that framerates weren't capped and that's what might be causing the issue? I always create 3D profiles for each game and if the game doesn't include an in-game frame limiter, I cap the framerate using Nvidia CP inside the game's 3D profile.

[edit 2] I also know for sure my brother's PC (Ryzen 7 5800X, RTX 3060) can't push more than 140 FPS in Aliens Fireteam at 1440p max settings. The game seems to consistently stay at 100-110 FPS, with dips into the high 70s/low 80s during big horde fights.

teagone fucked around with this message at 21:57 on Sep 5, 2021

NeverRelax
Jul 16, 2021

by Jeffrey of YOSPOS
https://www.amazon.com/EVGA-GeForce-Backplate-PowerLink-08G-P4-3188-Kp/dp/B07Y935CBS/

2080 Super Hybrid for $799
been in stock for 20+ minutes

MSRP pricing for 3070-ish performance is not bad


and it's gone
probably to actual people, not bots. No bot would be watching that.

NeverRelax fucked around with this message at 23:06 on Sep 5, 2021

repiv
Aug 13, 2009

The 2080S is closer to a 3060 Ti than a 3070

https://www.techpowerup.com/review/nvidia-geforce-rtx-3060-ti-founders-edition/35.html

Shipon
Nov 7, 2005

K8.0 posted:

If we were in a sane market I'd say no one should buy a 3060 because it's the worst value in the Ampere stack at MSRP, but the market is completely hosed and if you want to buy a GPU you do you.
dunno how you can call the 3060 the worst MSRP value when the 3080ti and 3090 exist

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
There's inherent value in having the fastest GPU you can possibly get, since it enables scenarios that otherwise don't exist yet. Additionally, products higher in the stack retain relevance longer and often turn out to be better buys than they appear. The 3060 is in the ballpark of a 2060 Super: a marginal card for raytracing, and a product with very limited utility going forward aside from its weak rasterization power. Right now you can make the argument to buy one because a GPU is a GPU, but in a normal market it would look really bad compared to used GPUs or AMD offerings.
