Dogen
May 5, 2002

Bury my body down by the highwayside, so that my old evil spirit can get a Greyhound bus and ride

Statutory Ape posted:

I effectively have an extra 1050 Ti (a smaller one). Besides more ports, is there any neat stuff I can do if I add it to my 1080 Ti build?

Tbh I mainly just shitpost and game so I guess that's my angle

I know it won't like boost performance like bolting on a turbocharger or something lol.

It might get you better frame pacing in games that use hardware PhysX (which is like, what, Batman?)

otherwise no


VelociBacon
Dec 8, 2009

DeadFatDuckFat posted:

I'm thinking my 1070 just died. It lights up when I power up, but the fans don't spin and my monitor goes right to power-save mode (nothing shows on screen). The rest of my parts are less than a year old. Anything I should try before buying a new card?

If your CPU has an iGPU, just try plugging your monitor into the motherboard to see if it's a GPU problem or something else.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


ufarn posted:

Make sure you have sufficient power between your computer and the power outlet. Could be an extension cord that's bottlenecking your power. If you haven't already, try connecting the computer directly to a power outlet and see if anything changes.

No dice, unfortunately. Any opinions on EVGA B-stock cards? Is it a huge risk? Saw a 2070 on there for $389.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


VelociBacon posted:

If your CPU has an iGPU, just try plugging your monitor into the motherboard to see if it's a GPU problem or something else.

My mobo doesn't have that unfortunately...

VelociBacon
Dec 8, 2009

DeadFatDuckFat posted:

My mobo doesn't have that unfortunately...

I'm still not convinced this is the GPU; I'd start taking sticks of RAM out and trying again. You just have a no-POST situation. Maybe see if the PC boots at all without the GPU, even without a monitor hooked up.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


Gonna borrow my friend's old 980 today and plug it in to make sure.

repiv
Aug 13, 2009

Happy_Misanthrope posted:

Yeah I'm very skeptical of this, but I guess we'll see? I have a 1660 so I'm glad to hear AMD's solution (which should work on all cards) is good though. But yeah, isn't it just adding a sharpening filter? It's not actually creating new pixels like DLSS is.

Yeah, the CAS upscaler is just a combined sharpening/interpolation filter. It gives better results than scaling then sharpening (or vice versa), but it can't increase detail.
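
If you're curious what "combined" buys you, here's a toy sketch of the idea - one pass that interpolates and sharpens together, with the sharpening amount driven by local contrast. To be clear, this is from-scratch pseudocode in the spirit of CAS, not AMD's actual shader math:

```python
import numpy as np

def cas_like_upscale(img, scale=1.3, sharpness=0.5):
    """Toy combined upscale+sharpen in the spirit of AMD CAS.

    img: float32 grayscale array in [0, 1]; the real shader works on RGB
    and uses different weighting math - this just shows the structure.
    """
    h, w = img.shape
    out = np.zeros((int(h * scale), int(w * scale)), dtype=np.float32)
    for oy in range(out.shape[0]):
        for ox in range(out.shape[1]):
            # Map the output pixel back into source space
            sy = min(oy / scale, h - 2)
            sx = min(ox / scale, w - 2)
            y0, x0 = int(sy), int(sx)
            fy, fx = sy - y0, sx - x0
            n = img[y0:y0 + 2, x0:x0 + 2]  # 2x2 source neighborhood
            # Bilinear interpolation: the "scaling" half
            base = (n[0, 0] * (1 - fy) * (1 - fx) + n[0, 1] * (1 - fy) * fx +
                    n[1, 0] * fy * (1 - fx) + n[1, 1] * fy * fx)
            # Contrast-adaptive weight: the "sharpening" half. Neighborhoods
            # that are already high-contrast get *less* boost, which is what
            # keeps this style of filter from ringing on hard edges.
            amount = sharpness * (1.0 - (n.max() - n.min()))
            out[oy, ox] = np.clip(base + amount * (base - n.mean()), 0.0, 1.0)
    return out
```

The point of fusing the two passes is that the sharpen weights come from the pre-scale neighborhood rather than from already-interpolated (blurred) pixels - but every output value is still just a mix of existing pixels, hence no new detail.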

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

jisforjosh posted:

For the 1080 Ti the cooler just wasn't adequate, at least in my situation. Loud as hell, and overclocking headroom was diminished versus AIB cards strictly due to thermal throttling. Don't get me wrong, the card has been amazing (especially after putting an NZXT G12 on it and mounting an AIO on it), but if I could've been patient and waited and gone with an AIB one, I would have.

Yeah, totally fair. Even if I didn't want the Founders Edition, the fact is I lack the technological restraint to not purchase the first 3080/3090 I see. But I totally respect people who can wait.

I literally don't even care how much it costs... it costs less than the pain of having HDMI 2.1 and not being able to use it.

nelson
Apr 12, 2009
College Slice
My old GTX 970 literally burned out (as in, part of it caught on fire). What is a decent newer-generation card to replace it with (preferably one that uses less power and puts out less heat)? I only need to worry about 1080p for now.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

1660S

nelson
Apr 12, 2009
College Slice

Thank you.

E: Decided to save $80 and go with the 1650S. It’s a downgrade, but I’ve had 2 kids since originally building my PC and have almost no time for gaming. I figure this will be good enough for an occasional game, and when they get to college I can splurge again :)

nelson fucked around with this message at 23:52 on Jul 1, 2020

Fabulousity
Dec 29, 2008

Number One I order you to take a number two.

DeadFatDuckFat posted:

Gonna borrow my friend's old 980 today and plug it in to make sure.

Also be sure to try a different cable if possible. You never know...

DisplayPort cables go sideways if an errant electron within a mile gets pushed to an excited state whilst pointing at your DP cable. It's probably the cheapest point of failure to test so might as well? Also try different output ports on the back of the video card.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


Fabulousity posted:

Also be sure to try a different cable if possible. You never know...

DisplayPort cables go sideways if an errant electron within a mile gets pushed to an excited state whilst pointing at your DP cable. It's probably the cheapest point of failure to test so might as well? Also try different output ports on the back of the video card.

I don't have a DisplayPort cable, or at least I can't find it; I've been using DVI. I switched to a different cable, switched out the 8-pin and connected it to a different spot on my PSU, tried using my friend's 980, and still no luck. Also tried connecting to a different monitor. Here's what's basically happening:


https://www.youtube.com/watch?v=mv0DxA3CT3o

The DRAM light is on, but even if my RAM were bad I should still see the BIOS, right? I put in each stick individually and still nothing.

Sorry if I'm posting this in the wrong thread btw

DeadFatDuckFat fucked around with this message at 23:18 on Jul 1, 2020

DrDork
Dec 29, 2003
commanding officer of the Army of Dorkness

DeadFatDuckFat posted:

The DRAM light is on, but even if my RAM were bad I should still see the BIOS, right?

Not always. Bad RAM can keep a system from even getting to the BIOS sometimes. That the card's fans are twitching a few times but never spinning up makes me think it's caught in a pre-BIOS boot loop.

I would take everything out except your CPU (including all the RAM), reset the CMOS, and turn it on--you won't get to BIOS, but you should get some angry beeping, assuming you have an internal speaker connected or built in to the motherboard. Then add back one stick of RAM and try both your and your friend's video card. Change the single stick of RAM if that doesn't work. You should hopefully get to the BIOS from there, and then add back in components one by one.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


DrDork posted:

Not always. Bad RAM can keep a system from even getting to the BIOS sometimes. That the card's fans are twitching a few times but never spinning up makes me think it's caught in a pre-BIOS boot loop.

I would take everything out except your CPU (including all the RAM), reset the CMOS, and turn it on--you won't get to BIOS, but you should get some angry beeping, assuming you have an internal speaker connected or built in to the motherboard. Then add back one stick of RAM and try both your and your friend's video card. Change the single stick of RAM if that doesn't work. You should hopefully get to the BIOS from there, and then add back in components one by one.

:(

Still nothing. Reset the CMOS both by pulling the battery and by shorting the two pins with a screwdriver. The DRAM light was still the only thing to light up, even with only the CPU in. The board doesn't have a speaker. Tried single RAM sticks in various slots with both my card and my friend's. Does that mean it's a problem with the mobo, or with both sticks?

DeadFatDuckFat fucked around with this message at 00:08 on Jul 2, 2020

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
You need to do some real base-level troubleshooting. Disconnect everything from the motherboard and PSU except the CPU and one stick of RAM, find a speaker to hook up to it, and see if it POSTs. If you can't do that, stick the RAM in your friend's system - bad RAM is not going to hurt a motherboard or anything.

DeadFatDuckFat
Oct 29, 2012

This avatar brought to you by the 'save our dead gay forums' foundation.


K8.0 posted:

You need to do some real base-level troubleshooting. Disconnect everything from the motherboard and PSU except the CPU and one stick of RAM, find a speaker to hook up to it, and see if it POSTs. If you can't do that, stick the RAM in your friend's system - bad RAM is not going to hurt a motherboard or anything.

Aight, I'll try to find a way to test the RAM. I'm gonna stop for today though so I don't tear my hair out. Thanks to everyone for all the help.

tehinternet
Feb 14, 2005

Semantically, "you" is both singular and plural, though syntactically it is always plural. It always takes a verb form that originally marked the word as plural.

Also, there is no plural when the context is an argument with an individual rather than a group. Somfin shouldn't put words in my mouth.

DeadFatDuckFat posted:

Aight, I'll try to find a way to test the RAM. I'm gonna stop for today though so I don't tear my hair out. Thanks to everyone for all the help.

Don’t let that poo poo overwhelm you. When you get frustrated, you’re going to rush over something, miss what’s causing the actual issue, and end up even more upset.

Take a breath, get outside and grab something to eat, then hit it fresh and from a different angle. The advice posted is good and will get you your answer.

shrike82
Jun 11, 2005

It still kinda amazes me that PCs work at all, and last for years, given the complexity of the components.

Paul MaudDib
May 3, 2006

TEAM NVIDIA:
FORUM POLICE

Fabulousity posted:

Also be sure to try a different cable if possible. You never know...

DisplayPort cables go sideways if an errant electron within a mile gets pushed to an excited state whilst pointing at your DP cable. It's probably the cheapest point of failure to test so might as well? Also try different output ports on the back of the video card.

Between the limited range, the sensitivity to ambient noise (remember the video of office chair gas pistons causing a noise spurt and dropping the signal as you stand up), and the general flakiness when you’re even close to the limits of the official spec, I really can’t wait for HDMI VRR to take off in gaming monitors. That, or NVIDIA should just adopt the FreeSync Over HDMI mode for now.

canyoneer
Sep 13, 2005


I only have canyoneyes for you
I'm assembling a new PC right now and holy moly the 2070 Super is gigantic :eyepop:

In awe at the size of this card, the absolute unit

Lackmaster
Mar 1, 2011
I posted this in the monitor thread, but I’m cross-posting here because this is as much a GPU question as it is a monitor question.


I’ve got a question about HDMI 2.1 and another one about (theoretical) Freesync support for the RTX 3XXX series of graphics cards. I also want to get a sanity check to make sure there aren’t any flaws in my reasoning here:

So I currently have a Dell S2417DG. It’s a 24” 1440p 165Hz TN panel. I have a 1060 6GB that lets me run stuff like Apex Legends on high/ultra at ~45 FPS, and if I crank stuff down to medium I can push ~90 FPS.

I’m strongly considering upgrading to an LG 27gl83a‑b. It’s a 27” 144Hz IPS panel. I don’t care about going down from 165Hz to 144, I like having a slightly bigger screen, and IPS is attractive to me because of (what I assume will be) better color and viewing angles. The color banding on dark images is also really bad on the TN panel. I know I’m potentially opening myself up to IPS glow and reduced contrast, but I think the pros outweigh the cons for me. (Anyone made this jump or similar and have thoughts?)

I am also feeling pretty constrained by my 1060. I will very likely get an RTX 3080 soon after they come out, like late this year. I like playing AAA games with the settings turned up - I want Cyberpunk 2077 to look really good.

So, my questions are, basically, assuming I’m on the right track with all the above: what will it be like to have a 3080 with my new LG 27gl83a‑b? Am I right in assuming that as long as I’m staying at 1440p 144Hz, HDMI 2.1 is irrelevant to me, and thus I won’t care that my new GPU will (probably) have it but my monitor won’t?

Probably my bigger concern is, will the RTX 3080 play nice with the freesync/gsync-lite on the LG?

I know that second question probably doesn’t have a definite answer, but y’all are super knowledgeable and I’m wondering if people have guesses or thoughts.

I’m hoping the answer is that it’ll be fine, and then I can be happy with rock-solid 1440p while OLED or micro-LED sorts itself out. If that took 4 years, that’d be fine by me.

huhwhat
Apr 22, 2010

by sebmojo

Lackmaster posted:

Probably my bigger concern is, will the RTX 3080 play nice with the freesync/gsync-lite on the LG?

I run an Adaptive Sync (not validated as G-Sync Compatible) 75Hz 1440p IPS Acer with a 1660 Ti and haven't seen any stuttering yet in fullscreen. Windowed borderless feels like it has a wee bit of stuttering though, despite G-Sync being enabled for both fullscreen and windowed in the Nvidia control panel. https://www.testufo.com/vrr Fullscreen looks like the top scrolling image and windowed borderless like the bottom, but not as bad. May be related to freesync/gsync-lite, may be unrelated :shrug:

-

So, I've always been impressed by game-agnostic post-processing techniques like FXAA, SMAA, and sharpening filters + upscaling, despite their flaws (vaseline smearing for FXAA, and oversharpening producing noisy colors and moire). Not a fan of DLSS and DLSS 2.0. They feel like HairWorks and PhysX to me: platform-specific and requiring developer investment. I can't just go back to older games and boost the image quality with new tech.

After watching Digital Foundry's analysis of Detroit: Become Human's image quality in native 4K on PC versus the PS4's checkerboarding, I'm surprised that such a feature isn't something I can toggle from my control panel. I have zero knowledge here, so tell me: why isn't checkerboarding something that can be toggled on/off? Is it because the devs have to tell the engine to render every frame in a checkerboard pattern, even if the reconstruction of the full frame can be done as a post-process?
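
To illustrate what I mean by the reconstruction half being post-process-able, here's a naive toy I made up (real implementations alternate the pattern each frame and reproject the previous frame's samples with motion vectors - they don't just average neighbors like this):

```python
import numpy as np

def checkerboard_mask(h, w, frame_parity=0):
    """True where the engine actually shaded a pixel this frame."""
    yy, xx = np.mgrid[0:h, 0:w]
    return (yy + xx) % 2 == frame_parity

def naive_fill(rendered, mask):
    """Fill the unshaded half from horizontal neighbors. Real checkerboard
    reconstruction pulls the opposite-parity samples from the previous
    frame instead, guided by motion vectors."""
    out = rendered.copy()
    h, w = rendered.shape
    for y in range(h):
        for x in range(w):
            if not mask[y, x]:
                left = rendered[y, max(0, x - 1)]
                right = rendered[y, min(w - 1, x + 1)]
                out[y, x] = 0.5 * (left + right)
    return out
```

The fill step really could run anywhere, but the performance win only exists if the engine shaded just the masked half in the first place (and exported motion vectors for the reprojection), which I'm guessing is exactly why a driver can't bolt it on.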

huhwhat fucked around with this message at 08:07 on Jul 2, 2020

Zedsdeadbaby
Jun 14, 2008

You have been called out, in the ways of old.
So basically you want to know if it's G-Sync Compatible, which it is. It's certified by Nvidia. Go hog wild.

CaptainSarcastic
Jul 6, 2013



canyoneer posted:

I'm assembling a new PC right now and holy moly the 2070 Super is gigantic :eyepop:

In awe at the size of this card, the absolute unit

Yeah, mine is enormous. I'm using the support arm in my case to prop it up a bit. I should probably pull off the back panel and tighten the screws on it to make sure it's supporting the weight properly.

I got a three-fan version, and it is massive.

repiv
Aug 13, 2009

Facebook/Oculus are working on their own flavor of DLSS

It's early days though, their algorithm takes 18-24ms to generate a 1080p output on a Titan V

Needs some work to get down to DLSS 2.0's ~1ms runtime.
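
For a sense of scale (assuming a 90Hz VR target, which is my guess for what Oculus cares about, not a stated figure):

```python
refresh_hz = 90                        # assumed VR refresh target
frame_budget_ms = 1000 / refresh_hz    # ~11.1 ms for the *entire* frame
upscale_ms = (18, 24)                  # reported Titan V cost at 1080p
# 18-24 ms is roughly 1.6-2.2x the whole frame budget on its own,
# vs ~1 ms (under a tenth of the budget) for DLSS 2.0.
print(frame_budget_ms, [round(t / frame_budget_ms, 1) for t in upscale_ms])
```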

shrike82
Jun 11, 2005

I wouldn't be surprised if FB overtakes Nvidia pretty quickly - they make a point of taking a public research approach with commodity hardware/software. FB AI is a lot more prestigious from a research/talent standpoint these days than Nvidia.

Worf
Sep 12, 2017
Probation
Can't post for 51 minutes!

shrike82 posted:

FB AI's a lot more prestigious from a research/talent standpoint these days than Nvidia.

horrifying

v1ld
Apr 16, 2012

repiv posted:

Facebook/Oculus are working on their own flavor of DLSS

It's early days though, their algorithm takes 18-24ms to generate a 1080p output on a Titan V

Needs some work to get down to DLSS 2.0's ~1ms runtime.

Interesting that they're targeting 16x upscale vs Nvidia's 4x (or was it more - can't really go looking right now). Is there a Nyquist-like criterion limiting reconstruction fidelity? Seems like there's going to be lots of work in this area, very cool.

I for one welcome all our new non-pixellated insect overlords.

E: I meant, does something like the Nyquist criterion even apply to ML techniques?

KillHour
Oct 28, 2007


v1ld posted:

Interesting that they're targeting 16x upscale vs Nvidia's 4x (or was it more - can't really go looking right now). Is there a Nyquist-like criterion limiting reconstruction fidelity? Seems like there's going to be lots of work in this area, very cool.

I for one welcome all our new non-pixellated insect overlords.

E: I meant, does something like the Nyquist criterion even apply to ML techniques?

Not really, because ML just makes poo poo up.

repiv
Aug 13, 2009

v1ld posted:

Interesting that they're targeting 16x upscale vs Nvidia's 4x (or was it more - can't really go looking right now). Is there a Nyquist-like criterion limiting reconstruction fidelity? Seems like there's going to be lots of work in this area, very cool.

I for one welcome all our new non-pixellated insect overlords.

E: I meant, does something like the Nyquist criterion even apply to ML techniques?

DLSS is 2x, 3x or 4x depending on preset, with the 2x one being more or less indistinguishable from native and the others still being pretty good with a few flaws.

I don't think there's a Nyquist-like hard limit on how much you can reconstruct, but there are practical limits on dynamic scenes. There are always going to be cases where a region has no usable history data due to disocclusion, and at 16x I don't see how you could avoid a noticeable trail of low-res shittiness around fast-moving objects. But it would be extremely cool if I'm proven wrong.

e: I suppose you could apply the ATAA concept to selectively brute-force areas with little to no valid history.

https://news.developer.nvidia.com/understanding-the-need-for-adaptive-temporal-antialiasing/

KillHour posted:

Not really, because ML just makes poo poo up.

These realtime ML reconstruction techniques aren't really making poo poo up the way photo/video super-resolution techniques do; they work by amortizing supersampling over multiple frames (same principle as TAA), with an ML model guiding which history data to accept or reject. They accumulate real detail, just in a roundabout way.
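
Here's a hand-rolled toy of that accumulation loop, with made-up fixed heuristics standing in for the learned accept/reject model:

```python
import numpy as np

def accumulate(history, current, motion, alpha=0.1, reject_thresh=0.1):
    """Toy TAA-style temporal accumulation on grayscale frames.

    motion[y, x] = (dy, dx) offset into the previous frame, so motion has
    shape (h, w, 2). A DLSS-style reconstructor effectively learns the
    reject/blend decision below instead of using a fixed threshold.
    """
    h, w = current.shape
    out = np.empty_like(current)
    for y in range(h):
        for x in range(w):
            dy, dx = motion[y, x]
            hy = int(np.clip(y + dy, 0, h - 1))
            hx = int(np.clip(x + dx, 0, w - 1))
            prev = history[hy, hx]
            if abs(prev - current[y, x]) > reject_thresh:
                # No usable history (disocclusion): fall back to the raw
                # sample - exactly where detail drops out around
                # fast-moving objects.
                out[y, x] = current[y, x]
            else:
                # Blend: each frame's real samples refine the estimate.
                out[y, x] = alpha * current[y, x] + (1 - alpha) * prev
    return out
```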

repiv fucked around with this message at 17:40 on Jul 2, 2020

repiv
Aug 13, 2009

quote is not edit

Combat Pretzel
Jun 23, 2004

No, seriously... what kurds?!

Fabulousity posted:

DisplayPort cables go sideways if an errant electron within a mile gets pushed to an excited state whilst pointing at your DP cable. It's probably the cheapest point of failure to test so might as well? Also try different output ports on the back of the video card.

I have one cable that creates insane amounts of fireflies if I plug it in in one direction between GPU and display, but when I flip it around, everything's fine. It's the same goddamn wires that are active. :psyduck:

repiv
Aug 13, 2009

Combat Pretzel posted:

I have one cable that creates insane amounts of fireflies if I plug it in in one direction between GPU and display, but when I flip it around, everything's fine. It's the same goddamn wires that are active. :psyduck:

the audiophiles were right, cable directionality is real :eyepop:

K8.0
Feb 26, 2004

Her Majesty's 56th Regiment of Foot
The one variable I would immediately suspect in that situation is grounding. Like most connectors, DP has multiple ground pins. If one ground at one end of the cable is a bit crudded up, and the device at that end depends heavily on that particular pin for its grounding, you could see that type of signal integrity issue.

Fantastic Foreskin
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.

repiv posted:

the audiophiles were right, cable directionality is real :eyepop:

Now I just need to find a way to push analog audio over DP...

CLAM DOWN
Feb 13, 2007




Statutory Ape posted:

get this information to nvidia and amd stat

Please do the needful and post better, thank you kindly.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

repiv posted:

Facebook/Oculus are working on their own flavor of DLSS

It's early days though, their algorithm takes 18-24ms to generate a 1080p output on a Titan V

Needs some work to get down to DLSS 2.0's ~1ms runtime.

I like how we're going from "is DLSS even going to be a REAL THING on ACTUAL AAA future games" to "Everyone understands how incredible this tech is and wants it in their own stack, even if they're not making GPUs" in like 9 seconds.

Dead technology, folks. May as well pack it in now, Nvidailures.

Worf
Sep 12, 2017
Probation
Can't post for 51 minutes!

CLAM DOWN posted:

Please do the needful and post better, thank you kindly.

lol nah

i appreciate the irony tho

(USER WAS PUT ON PROBATION FOR THIS POST)

Worf fucked around with this message at 17:46 on Jul 2, 2020


Lockback
Sep 3, 2006

All days are nights to see till I see thee; and nights bright days when dreams do show me thee.

Lackmaster posted:


So, my questions are, basically, assuming I’m on the right track with all the above: what will it be like to have a 3080 with my new LG 27gl83a‑b? Am I right in assuming that as long as I’m staying at 1440p 144Hz, HDMI 2.1 is irrelevant to me, and thus I won’t care that my new GPU will (probably) have it but my monitor won’t?


I dunno if this has been answered, and we don't have specs for the 3080 yet, but... yeah. It's probably going to be a great card for 1440p. I run a 2080S at 1440p and it's basically a "set everything to max" card.
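
On the HDMI 2.1 question specifically, the napkin math backs you up (assuming 8-bit RGB and a rough allowance for blanking):

```python
w, h, hz, bpp = 2560, 1440, 144, 24    # 1440p144, 8-bit RGB assumed
raw_gbps = w * h * hz * bpp / 1e9      # ~12.7 Gbit/s of pixel data
signal_gbps = raw_gbps * 1.05          # rough allowance for blanking
# DP 1.4 carries ~25.9 Gbit/s of payload, so 1440p144 fits with plenty of
# headroom; HDMI 2.1's extra bandwidth only matters for things like 4K120.
print(raw_gbps, signal_gbps)
```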
