Jarl
Nov 8, 2007

So what if I'm not for the ever offended?
Problem description:
For some reason my second monitor (a TV) is redetected by my NVIDIA graphics card once in a while (NVIDIA reacts as if the monitor had been detached and then reattached). When that happens, anything that was on my second monitor suddenly appears on my primary monitor and has to be moved back (annoying when you are sitting on the sofa watching a movie and have to go to the desk to fix it).

You could argue that it is due to the TV, but I didn't have the problem before getting an NVIDIA graphics card, and I have had the problem with two NVIDIA cards in a row.

I have a GIGABYTE GeForce RTX 2070 SUPER WindForce OC 8GB GDDR6 and have the newest driver installed.
Before that I had a GeForce GTX 960 2GD5 2GB GDDR5.

Sometimes it happens frequently, and other times maybe once every third week.

Before those two graphics cards I had a Club 3D Radeon HD 6950 (PCIe) and experienced none of these problems.

Attempted fixes:
Only replacing the graphics card itself, as a byproduct of building a new computer.

Operating system:
Windows 10 Pro.

System specs:
MSI B450 TOMAHAWK MAX - AMD B450 - AMD AM4 socket - DDR4 RAM - ATX
AMD Ryzen 5 3600 Wraith Stealth CPU - 3.6 GHz - AMD AM4 - 6 cores - AMD Boxed
G.Skill RipjawsV DDR4-3600 C19 DC - 16GB
GIGABYTE GeForce RTX 2070 SUPER WindForce OC - 8GB GDDR6 RAM
Intel 660p SSD M.2 NVMe - 1TB
WD Red NAS WD40EFRX - 4 TB - 3.5" - 5400 rpm - SATA-600 - 64 MB cache

I have Googled and read the FAQ:
Yes.


I would be thankful for any suggestions on how to solve the problem, or even just an explanation. :confused:
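
In case exact timestamps would help pin this down, here is a rough watcher script I am thinking of running in the background (my own sketch, nothing official: Windows-only, plain Win32 via ctypes, polling once a second). It should log the moment Windows sees the set of attached displays change:

```python
# Rough sketch: poll the attached-display list once a second and log any
# change with a timestamp, to see exactly when the TV gets "redetected".
# Windows-only; uses the Win32 EnumDisplayDevices call via ctypes.
import ctypes
import time
from ctypes import wintypes

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
user32 = ctypes.WinDLL("user32", use_last_error=True)

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

def attached_displays():
    """Return the set of display adapters currently attached to the desktop."""
    names = set()
    i = 0
    while True:
        dev = DISPLAY_DEVICEW()
        dev.cb = ctypes.sizeof(dev)
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
            break
        if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            names.add(dev.DeviceName)
        i += 1
    return names

previous = attached_displays()
print(time.strftime("%Y-%m-%d %H:%M:%S"), "watching:", sorted(previous))
while True:
    time.sleep(1)
    current = attached_displays()
    if current != previous:
        print(time.strftime("%Y-%m-%d %H:%M:%S"),
              "change - gone:", sorted(previous - current),
              "new:", sorted(current - previous))
        previous = current
```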

EDIT:
Removed Danish words from the system specs.


Slayerjerman
Nov 27, 2005

by sebmojo
Which HDMI port on your HDTV is it plugged into? And what brand/model is the TV? That's more important here than your video card info.

Some ports are typically labeled "ARC", "STB", or "DVI", or sometimes even "PC". I would switch your input to a port marked ARC, DVI, or PC if you have one. Or try a different port just for testing purposes... you didn't mention having tried that already, so yeah...

More details here:
https://www.howtogeek.com/306176/what-the-labels-on-your-tvs-hdmi-ports-mean-and-when-it-matters/

Also, make sure you have an HDCP-compliant HDMI cable connected, as TVs want to use a secure video signal these days (filthy pirates). This was a very common issue with PlayStations as well as Sony-brand TVs in the past when not using proper cables that support HDCP (or not turning on the HDCP setting on your shiny new PS3, lol). If you're reusing some ancient HDMI cable on that new 2070, chances are the cable is failing in one way or another...

I would definitely recommend trying other HDMI cables to test whether the cable is at fault. Then check Nvidia Control Panel like so:
https://www.nvidia.com/content/Control-Panel-Help/vLatest/en-us/mergedProjects/nvdsp/To_verify_if_your_system_is_HDCP-capable.htm

Lastly, double-check that your TV's display settings in Nvidia Control Panel are set to the TV's native resolution (1080p, I assume?) and its native refresh rate (most likely 60 Hz), in addition to switching out the cable. Also try changing the output HDMI port on the video card: if the TV is running off the card's port #2, try moving it to port #1 and let your monitor take port #2... that's a fix that works sometimes. You kinda just need to do some more testing, imho, and see what you discover.
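
If you'd rather sanity-check the mode from a script than click through menus, something like this quick Python sketch should print what each attached display is actually running at (untested here; Windows-only, plain Win32 via ctypes, no Nvidia-specific calls):

```python
# Rough sketch: print the current resolution and refresh rate of every
# display attached to the desktop, using Win32 EnumDisplaySettings.
import ctypes
from ctypes import wintypes

user32 = ctypes.WinDLL("user32", use_last_error=True)
ENUM_CURRENT_SETTINGS = 0xFFFFFFFF  # ask for the mode currently in use
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

class DEVMODEW(ctypes.Structure):
    # Only the fields needed to read the display mode; the trailing
    # printer/ICM fields are lumped together as padding.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("_padding", ctypes.c_byte * 32),
    ]

user32.EnumDisplaySettingsW.argtypes = [
    wintypes.LPCWSTR, wintypes.DWORD, ctypes.POINTER(DEVMODEW)]

i = 0
while True:
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    if dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        mode = DEVMODEW()
        mode.dmSize = ctypes.sizeof(DEVMODEW)
        if user32.EnumDisplaySettingsW(dev.DeviceName, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(mode)):
            print(f"{dev.DeviceName} ({dev.DeviceString}): "
                  f"{mode.dmPelsWidth}x{mode.dmPelsHeight} "
                  f"@ {mode.dmDisplayFrequency} Hz")
    i += 1
```

If the TV shows something other than 1920x1080 @ 60 Hz there, fix that first before blaming the cable.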

Slayerjerman fucked around with this message at 06:47 on Mar 31, 2020

Jarl
Nov 8, 2007

So what if I'm not for the ever offended?
Thanks a bunch for giving me something to work with.

My TV is a Grundig (model 47 VLE 9279 BP - nothing special) and has multiple HDMI sockets, but according to the manual there doesn't seem to be any difference between them. I tried switching to a different port, but the problem persisted.
My graphics card only has one HDMI port (the rest are DisplayPorts).

My cable is very old (it's 15 meters long and runs through cable trays), so with this new information you have provided it is probably the culprit. I was only aware that different HDMI cables support different resolutions and refresh rates. That would also explain why getting a new graphics card introduced the problem even though there was probably nothing wrong with the card itself, since the problem persisted with the next graphics card.
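
As I understand it (rough back-of-the-envelope numbers on my part, not a spec quote), even plain 1080p60 needs the cable to carry several gigabits per second end to end, so 15 meters of old cable being marginal would not surprise me:

```python
# Back-of-the-envelope TMDS bandwidth for 1080p60.
pixel_clock_hz = 148.5e6   # standard pixel clock for 1080p60, incl. blanking
tmds_channels = 3          # one channel per color component
bits_per_char = 10         # TMDS encodes each 8-bit byte as a 10-bit character

total_bps = pixel_clock_hz * tmds_channels * bits_per_char
print(f"1080p60 TMDS rate: {total_bps / 1e9:.2f} Gbit/s")  # ~4.46 Gbit/s
```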

NVIDIA Control Panel -> View HDCP status says "Your graphics card and display are HDCP-capable" whether I select my Acer XB270HU or the GRUNDIG WUXGA. I assume that means the cable supports HDCP too.

Still, the HDMI cable seems the most likely cause, so I will get myself a new HDCP-compliant HDMI cable.
It will probably be a while, but I will post the results once I have tested for about a week.

Jarl fucked around with this message at 21:10 on Apr 2, 2020

Jarl
Nov 8, 2007

So what if I'm not for the ever offended?
I have tried so many things.

I tried changing the HDMI input port to the one used by my Chromecast, with which I have never experienced any problems.
I bought a new, good-quality HDMI cable and found a way to reduce the length to 10 meters.
I tried turning hardware acceleration off in Kodi.
I tried using VLC media player instead.

Nothing helped.

However, I have played through Jedi: Fallen Order on my TV using my computer without experiencing the issue once. I can't explain why, but it seems the problem only occurs when watching a movie on my TV through my computer.

I would still be very thankful for any suggestion or explanation.

Jarl fucked around with this message at 09:34 on May 6, 2020

ryangs
Jul 11, 2001

I live in a van down by the river!

Jarl posted:

However, I have played through Jedi: Fallen Order on my TV using my computer without experiencing the issue once. I can't explain why, but it seems the problem only occurs when watching a movie on my TV through my computer.

This means you're only having a problem when HDCP is enabled, which is presumably making your devices more sensitive to signal dropouts.

Ten meters is still pretty long for an HDMI cable. I think you're having signal strength issues. I would try adding an HDMI signal booster (amplifier) near your source (video card). You could verify by temporarily moving your TV and computer closer and using a regular length (one or two meter) HDMI cable.

Jarl
Nov 8, 2007

So what if I'm not for the ever offended?

ryangs posted:

This means you're only having a problem when HDCP is enabled, which is presumably making your devices more sensitive to signal dropouts.

Ten meters is still pretty long for an HDMI cable. I think you're having signal strength issues. I would try adding an HDMI signal booster (amplifier) near your source (video card). You could verify by temporarily moving your TV and computer closer and using a regular length (one or two meter) HDMI cable.

Thanks for the suggestion.

So Kodi and VLC tell my graphics card that they are rendering a movie, which enables HDCP? That's interesting.

I would have imagined a booster would help more at the TV end of the cable than at the video-card end.

I am looking into buying an HDMI booster/amplifier/repeater.

ryangs
Jul 11, 2001

I live in a van down by the river!

Jarl posted:

So Kodi and VLC tell my graphics card that they are rendering a movie, which enables HDCP? That's interesting.

In that case, I actually doubt HDCP is a factor. (:filez: should mean no HDCP.) I assumed you were talking about something with DRM like Netflix streaming. You're still suffering a signal dropout, but it's more of a mystery why it only happens during movies, not gaming.

You want the booster near the source (video card) so it has a cleaner/stronger signal to amplify. Otherwise, it's garbage in, garbage out.

My assumption here is that your new video card is putting out a slightly weaker HDMI signal than your old one. Thus, you need to boost it a bit for that long cable run.

ryangs fucked around with this message at 17:26 on May 7, 2020


Jarl
Nov 8, 2007

So what if I'm not for the ever offended?

ryangs posted:

You want the booster near the source (video card) so it has a cleaner/stronger signal to amplify. Otherwise, it's garbage in, garbage out.

My assumption here is that your new video card is putting out a slightly weaker HDMI signal than your old one. Thus, you need to boost it a bit for that long cable run.

It works. Thanks for the help and explanation.

I got an amplifier/booster/repeater - specifically a "Lindy Premium HDMI Booster/Repeater" - and now have 1 meter of cable from the graphics card to the amplifier and 15 meters from the amplifier to the TV (I have not switched to a 10-meter cable yet, since that means disassembling cable trays behind furniture), and I have not experienced any problems for a couple of months.

I didn't have to add power to the amplifier (the one I bought has the option of adding 5 V DC), which surprised me, because that means the amplifier draws its power from the signal, which I would imagine would make the signal weaker. If I ever experience anything, I will add power to the amplifier.
