|
punk rebel ecks posted:I guess. I just really want to upgrade my build. It's literally ancient. Ea-Nasir, this GPU only does one triangle per second!
|
# ? Sep 5, 2021 02:17 |
|
punk rebel ecks posted:I guess. I just really want to upgrade my build. It's literally ancient. Then upgrade your build and get vastly increased performance today*, but accept that maybe you won't get 4k 60fps with raytracing on ultra on the newest games that aren't even out yet. Turn down one or two settings though (I recommend using DLSS) and you should be fine well into the future. *you may not actually be able to buy a card today
|
# ? Sep 5, 2021 02:22 |
|
Sagebrush posted:Then upgrade your build and get vastly increased performance today*, but accept that maybe you won't get 4k 60fps with raytracing on ultra on the newest games that aren't even out yet. Turn down one or two settings though (I recommend using DLSS) and you should be fine well into the future. But I'm still on a 2500k and 970.
|
# ? Sep 5, 2021 02:25 |
|
The main issue is that 4K is just stupidly expensive to render. Unless you're outputting to a large TV or something, just stick to 1440p and you'll be able to max out everything without a problem (for now).
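The raw pixel counts back that up (a quick illustrative sketch, nothing more):

```python
# 4K pushes 2.25x the pixels of 1440p, which is most of why it's so expensive.
px_4k = 3840 * 2160      # 8,294,400 pixels
px_1440 = 2560 * 1440    # 3,686,400 pixels
print(px_4k / px_1440)   # -> 2.25
```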
|
# ? Sep 5, 2021 02:25 |
|
My i5-2500k will end up lasting me 12 years. 970 will last me 6 years. Jesus loving christ. That's unheard of mileage.
|
# ? Sep 5, 2021 02:34 |
|
Otakufag posted:I had that happen to me in Destiny 2 and Desmume DS emulator. Was able to fix this by enabling g-sync only for fullscreen, not windowed+fullscreen. From what I remember, G-Sync is only enabled for fullscreen already. He's able to recreate the issue consistently by alt-tabbing though. So weird!
|
# ? Sep 5, 2021 03:40 |
|
Has he capped framerate to 140? I strongly suspect this behavior is related to frames coming too fast and VRR toggling off and on.
|
# ? Sep 5, 2021 05:06 |
|
punk rebel ecks posted:I guess. I just really want to upgrade my build. It's literally ancient. punk, we've been on this train for a while now, and my advice to you is to upgrade your CPU, mobo, and RAM. Get a 5600X, or an i3-10100(F) if you're really on a budget. CPU prices are pretty stable at the moment, and availability is good. worry about the GPU later if you can't make up your mind right now.
|
# ? Sep 5, 2021 05:20 |
|
5600X and a B550 Tomahawk is a really sweet combo that won't hold back any of the top GPUs for some time to come. I built 3 5600X builds for friends and it's a little upsetting how close they come to my 5900X build at a fraction of the price. If I were to do it again, I would go 10th gen Intel; the prices are so good right now and you can get something 8-core and overclockable for the same price as a 5600X
|
# ? Sep 5, 2021 05:27 |
|
punk rebel ecks posted:My i5-2500k will end up lasting me 12 years. 970 will last me 6 years. Jesus loving christ. That's unheard of mileage. the 970 will be 7 years old in 2 weeks
|
# ? Sep 5, 2021 05:32 |
|
Kinda nuts. The GTX 970 released at the same time as the Xbox One, similarly to Ampere releasing at the same time as the PS5/Bone. It's nuts to think that a midrange GPU an entire console generation old is still in demand because of how hosed things are.
|
# ? Sep 5, 2021 06:46 |
|
Inept posted:the 970 will be 7 years old in 2 weeks
|
# ? Sep 5, 2021 07:05 |
|
NeverRelax posted:Up until recently 3060ti's have been super rare, way harder to find than a 3080. 3060 Ti was the best price:perf ratio for mining edit: best for power efficiency too. Basically miners ate almost 100% of the 3060 Ti production until the LHR cards arrived FuturePastNow fucked around with this message at 18:41 on Sep 5, 2021 |
# ? Sep 5, 2021 07:55 |
|
https://twitter.com/VideoCardz/status/1434453987781226496?s=20 i hope so
|
# ? Sep 5, 2021 12:11 |
|
"geek is back baby!" *sheldon noises*
|
# ? Sep 5, 2021 12:16 |
|
punk rebel ecks posted:My i5-2500k will end up lasting me 12 years. 970 will last me 6 years. Jesus loving christ. That's unheard of mileage. I know you're looking for a huge upgrade that'll get you very high end performance, but if you (or anyone else here) is willing to settle for something more midrange, then the RX 6600 XT is still available for prices near-ish MSRP on Newegg if you get them in a combo deal with some motherboards: https://www.newegg.com/p/pl?d=rx+6600+xt&Order=1 I would never want to spend $500 on a 6600 XT class of graphics card in a normal market, though I suppose I paid almost that much for my almost identical performing 5700 XT a couple years ago and had no complaints. Still, this is an option worth considering for anyone looking to build a new midrange PC since these are also pretty reasonable motherboards they're being paired with. edit: though lol at the only itx motherboard combo coming with a triaxial. Dr. Video Games 0031 fucked around with this message at 12:44 on Sep 5, 2021 |
# ? Sep 5, 2021 12:39 |
|
Dr. Video Games 0031 posted:edit: though lol at the only itx motherboard combo coming with a triaxial. I finally settled on a Gigabyte Eagle 6700xt for my ITX build. Finding a strictly 2 slot card was much harder than the length constraints.
|
# ? Sep 5, 2021 12:49 |
|
How is AMD's FSR/FidelityFX/whatever supposed to work? Is it applied at the very end of the render? If so, what's the point? I'm just watching some Farcry 6 gameplay supposedly recorded at 4K60, and I can clearly see it's upscaled because the UI is jaggy. It pretty much looks the same as if I were playing 1440p upscaled by my 4K display. You might as well just run it native 4K, subsample it and hope TAA makes something decent out of it (I'm currently playing FC5 at 0.7x rendering at 4K, and it seems to work OK, and I get sharp UI). I wish these clowns would put DLSS in.
|
# ? Sep 5, 2021 12:53 |
|
Combat Pretzel posted:How is AMD's FSR/FidelityFX/whatever supposed to work? Is it applied at the very end of the render? If so, what's the point? I'm just watching some Farcry 6 gameplay supposedly recorded at 4K60, and I can clearly see it's upscaled because the UI is jaggy. It pretty much looks the same as if I were playing 1440p upscaled by my 4K display. No, FSR is applied midway through the rendering pipeline, before the UI is drawn. Whatever UI issues you're seeing isn't FSR's fault unless they really hosed up the implementation.
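For what it's worth, the ordering works out roughly like this (an illustrative sketch only — the function names and the data they pass around are made up, not any real engine's API):

```python
# Toy model of a frame pipeline with a spatial upscaler like FSR 1.0.
# Each "stage" just records that it ran, so we can check the ordering.

def rasterize(scene, res):
    return {"res": res, "stages": ["raster"]}

def apply_taa(img):
    img["stages"].append("taa")
    return img

def fsr_upscale(img, out_res):
    # FSR runs mid-pipeline: it upscales the anti-aliased image to the
    # output resolution BEFORE post-processing and the UI are applied.
    img["res"] = out_res
    img["stages"].append("fsr")
    return img

def post_process(img):
    img["stages"].append("post")
    return img

def draw_ui(img):
    # the UI is drawn after upscaling, at full output resolution,
    # which is why a proper in-game FSR implementation has a sharp UI
    img["stages"].append("ui")
    return img

def render_frame(scene, render_res=(2560, 1440), output_res=(3840, 2160)):
    img = rasterize(scene, render_res)
    img = apply_taa(img)
    img = fsr_upscale(img, output_res)
    img = post_process(img)
    img = draw_ui(img)
    return img

frame = render_frame(scene=None)
print(frame["stages"])  # -> ['raster', 'taa', 'fsr', 'post', 'ui']
```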
|
# ? Sep 5, 2021 12:54 |
|
Dr. Video Games 0031 posted:No, FSR is applied midway through the rendering pipeline, before the UI is drawn. Whatever UI issues you're seeing isn't FSR's fault unless they really hosed up the implementation. It's Ubisoft, so really, take your pick.
|
# ? Sep 5, 2021 12:55 |
|
Combat Pretzel posted:How is AMD's FSR/FidelityFX/whatever supposed to work? Is it applied at the very end of the render? If so, what's the point? I'm just watching some Farcry 6 gameplay supposedly recorded at 4K60, and I can clearly see it's upscaled because the UI is jaggy. It pretty much looks the same as if I were playing 1440p upscaled by my 4K display. Sounds like bad video encoding. FSR isn't as good as DLSS, but on games that already use TAA (to get rid of shimmering) it's legitimately effective as an upscaler when used on quality mode. It's hard to tell native and quality mode apart at higher resolutions such as 1440p and 4K. Zedsdeadbaby fucked around with this message at 12:58 on Sep 5, 2021 |
# ? Sep 5, 2021 12:56 |
|
Zedsdeadbaby posted:Sounds like bad video encoding
|
# ? Sep 5, 2021 12:57 |
|
Whatever that may be, that's not FSR which is applied before the UI unless you're using unofficial third party hacks like Magpie
|
# ? Sep 5, 2021 12:59 |
|
speaking of FSR, Lossless Scaling has been updated recently:

* you can now set a completely custom scaling factor, in cases where the resolutions supported by the game don't line up exactly with what would naturally fill the screen. In some cases, you can use this to get an internal render resolution that's even "better" than what FSR allows. For example, FSR's Ultra Quality uses a 1.30x scaling factor, but you can do something like 1632 x 918, with a scaling factor of 1.17x, to fill a 1080p screen. Of course, with a higher render resolution, the performance gains relative to native will be less.

* it implemented "FSR Lite", which is an open-source modification of the FSR algorithm to reduce the overhead even further than the original one did, which a particular developer decided to do to make the performance good enough for use with mobile applications.

EDIT: in reference to this latest discussion, if you do try to use these "injection" methods to implement FSR, then the UI will be affected, as opposed to a game that supports it at the in-game setting level
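The arithmetic behind those factors, as a quick sketch (the scale factor applies per axis, not to total pixel count):

```python
def render_res(output_w, output_h, scale):
    # per-axis internal render resolution for a given upscale factor
    return round(output_w / scale), round(output_h / scale)

# FSR's Ultra Quality preset (1.30x per axis) at a 1080p output:
print(render_res(1920, 1080, 1.30))   # -> (1477, 831)

# and the 1632 x 918 example works out to ~1.176x per axis,
# i.e. a noticeably higher internal resolution than Ultra Quality:
print(round(1920 / 1632, 3))          # -> 1.176
```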
|
# ? Sep 5, 2021 13:04 |
|
Sagebrush posted:28 minutes for an in-store only sale. There were some that went out this way a week or two ago locally as well. My store didn't have any and the guy pulled out his phone to tell me what stores might and I told him he didn't need to do that because I didn't care enough to go anywhere else. I just went to the one in my town because I was already out. I'll probably end up just grabbing the 3080ti my coworker has coming. I'll never find a regular 3080 and I still have $900 on my dresser from when I sold my 1080ti's late last year/early this year to eat up a ton of the cost.
|
# ? Sep 5, 2021 15:12 |
|
punk rebel ecks posted:But I'm still on a 2500k and 970. oh look, it's me
|
# ? Sep 5, 2021 15:25 |
|
gradenko_2000 posted:EDIT: in reference to this latest discussion, if you do try to use these "injection" methods to implement FSR, then the UI will be affected, as opposed to a game that supports it at the in-game setting level The sharpening pass will run after post-processing too, which can cause quality issues. If you're using external sharpening (with or without upscaling), you should probably at least reduce or disable film grain in-game, as sharpening filters tend to amplify the noise.
|
# ? Sep 5, 2021 15:27 |
|
Dr. Video Games 0031 posted:I know you're looking for a huge upgrade that'll get you very high end performance, but if you (or anyone else here) is willing to settle for something more midrange, then the RX 6600 XT is still available for prices near-ish MSRP on Newegg if you get them in a combo deal with some motherboards: https://www.newegg.com/p/pl?d=rx+6600+xt&Order=1 Is the 3060 really that much weaker than say the 3080?
|
# ? Sep 5, 2021 18:21 |
|
Objectively yes. The 3080 is around 40% faster. Compared to your 970, no. Both would be an enormous upgrade.
|
# ? Sep 5, 2021 18:23 |
|
K8.0 posted:Has he capped framerate to 140? I strongly suspect this behavior is related to frames coming too fast and VRR toggling off and on. Is this still accepted behavior? I mean using the Nvidia control panel to cap frame rate at 140 on a 144hz monitor
|
# ? Sep 5, 2021 18:47 |
|
Yeah, that's still the way to go. It doesn't really matter if it's via the Nvidia control panel or something such as RTSS, or even games' own framerate limiters; the end result is the same. Maybe some guys will tell you that input latency is slightly higher or lower with one or the other, but you're not going to notice it, since most of the latency improvement comes simply from using VRR anyway.
|
# ? Sep 5, 2021 19:00 |
|
VRR only works when frames come more slowly than max refresh rate. VRR monitors can slow down, but they can't speed up past their maximum speed. If time between frames is less than time between refreshes, VRR is off and you have standard vsync on behavior or standard vsync off behavior. If you are in a situation where VRR is rapidly toggling on and off because frametimes keep going slightly above and below minimum refresh time, weird behavior often crops up. Sometimes it's flickering, sometimes it's frame pacing issues, all kinds of stuff depending on exactly what is happening with frame times, the particulars of your monitor, driver revisions, etc. It's stupid that Nvidia doesn't frame cap by default, hell the driver could even detect when a game is or isn't capping itself properly and toggle the driver frame cap appropriately, but honestly no one gives a gently caress about making the VRR experience simple and smooth for the average user. There are situations where you'd consider doing otherwise, but by default anyone who has a VRR monitor should set the global profile to cap framerate slightly below refresh rate and force vsync on. Then, if you want to optimize latency in a game that has a good built-in frame limiter, set it to the same value and make an Nvidia profile for that game that disables the frame limiter. That's about it for most people. punk rebel ecks posted:Is the 3060 really that much weaker than say the 3080? If we were in a sane market I'd say no one should buy a 3060 because it's the worst value in the Ampere stack at MSRP, but the market is completely hosed and if you want to buy a GPU you do you.
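The "VRR window" logic in that first paragraph, as a tiny sketch (the numbers are illustrative):

```python
def vrr_state(frametimes_ms, max_refresh_hz=144):
    # a frame stays inside the VRR window only if it took at least as long
    # as the panel's minimum refresh interval (~6.94 ms at 144 Hz);
    # faster frames fall back to plain vsync-on/off behavior
    min_interval_ms = 1000.0 / max_refresh_hz
    return [ft >= min_interval_ms for ft in frametimes_ms]

# frametimes hovering right around the refresh interval = VRR rapidly
# toggling on and off, which is where flicker and pacing weirdness live:
print(vrr_state([7.2, 6.8, 7.1, 6.5]))   # -> [True, False, True, False]

# capped slightly below refresh (140 fps -> ~7.14 ms per frame),
# every frame stays inside the window:
print(vrr_state([1000 / 140] * 4))       # -> [True, True, True, True]
```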
|
# ? Sep 5, 2021 19:06 |
|
Inept posted:the 970 will be 7 years old in 2 weeks I'm right there with you, friend. I've been on eVGA's waiting list since October, 2020.
|
# ? Sep 5, 2021 20:56 |
|
Sagebrush posted:Objectively yes. The 3080 is around 40% faster. I think you got your numbers backwards here. The 3060 is about 40% slower than the 3080, going the other way makes 3080 67% faster than the 3060. Per HWUB's 14 game average, 3080 is 60% faster at 1080p and 73% faster at 1440p, so that all checks out.
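Quick sanity check on why the two percentages differ (the asymmetry trips everyone up):

```python
# "B is 40% slower than A" and "A is X% faster than B" are not symmetric:
# if B = 0.6 * A, then A = B / 0.6, i.e. about 67% faster.
slower = 0.40
faster = 1 / (1 - slower) - 1
print(round(faster * 100))   # -> 67
```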
|
# ? Sep 5, 2021 21:13 |
|
punk rebel ecks posted:But I'm still on a 2500k and 970. I was going to put my 1080ti into her system when I got my 3090, but I would have been crazy not to sell it to recoup some $$$. The 2500k was a great processor.
|
# ? Sep 5, 2021 21:47 |
|
K8.0 posted:Has he capped framerate to 140? I strongly suspect this behavior is related to frames coming too fast and VRR toggling off and on. Framerate is capped at 141, yeah. Interesting. What's the remedy to that? Uncapping framerate? Blurbusters told me to cap all game framerates to 141 or below if using a 144Hz display with G-Sync enabled. [edit] Oh, wait, were you assuming that framerates weren't capped and that's what might be causing the issue? I always create 3D profiles for each game and if the game doesn't include an in-game frame limiter, I cap the framerate using Nvidia CP inside the game's 3D profile. [edit 2] I also know for sure my brother's PC (Ryzen 7 5800X, RTX 3060) can't push more than 140+ FPS in Aliens Fireteam at 1440p max settings. Game seems to consistently stay at 100-110 FPS, with dips into the high 70s/low 80s during big horde fights. teagone fucked around with this message at 21:57 on Sep 5, 2021 |
# ? Sep 5, 2021 21:52 |
|
https://www.amazon.com/EVGA-GeForce-Backplate-PowerLink-08G-P4-3188-Kp/dp/B07Y935CBS/ 2080 Super Hybrid for $799, been in stock for 20+ minutes. MSRP pricing for 3070-ish performance is not bad, and it's probably gone to actual people, not bots. No bot would be watching that. NeverRelax fucked around with this message at 23:06 on Sep 5, 2021 |
# ? Sep 5, 2021 23:02 |
|
2080S is closer to 3060ti than 3070 https://www.techpowerup.com/review/nvidia-geforce-rtx-3060-ti-founders-edition/35.html
|
# ? Sep 5, 2021 23:12 |
|
K8.0 posted:If we were in a sane market I'd say no one should buy a 3060 because it's the worst value in the Ampere stack at MSRP, but the market is completely hosed and if you want to buy a GPU you do you.
|
# ? Sep 5, 2021 23:57 |
|
There's inherent value to having the fastest GPU you can possibly have, since it enables scenarios that otherwise don't exist yet. Additionally, products higher on the stack retain relevance longer and often turn out to be better buys than they appear. The 3060 is ballpark a 2060 Super - a marginal card for raytracing, and a product with very limited utility going forward aside from its weak rasterization power. Right now you can make the argument to buy one because a GPU is a GPU, but in a normal market it would look really bad compared to used GPUs or AMD offerings.
|
# ? Sep 6, 2021 00:06 |