Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


K8.0 posted:

16:10 makes no sense, frankly. The likelihood that a slightly different AR from 16:9 makes a real difference to you is insignificant compared to the likelihood that 5:4 or 4:3 is an even better AR for you, yet I see no one campaigning for those to return. At 1080p vs 1200p when vertical space was significantly reduced compared to 4:3/5:4 CRTs, there was something of an argument to be made for every vertical pixel you could get. At 1440p and especially 4k, it's silly.

As an erstwhile 16:10 fan, this is how I feel after upgrading to 1440p. The extra pixels make dividing your screen between multiple windows much more practical, which makes the wider aspect ratio a real advantage.


Splinter
Jul 4, 2003
Cowabunga!

I have a 30" 16:10 monitor (2560x1600) at work as one of my monitors and it's pretty great for coding (I'd take ultrawide over this, though). In the days of 1440p monitors I don't think there's a huge difference anymore, and I wouldn't go out of my way to grab a 16:10 monitor over a 16:9 one, but I think it's a bit off to say 16:10 is wasted space (all else being equal). The only time that's really the case is when consuming 16:9 media.

The web is generally not optimized to make great use of horizontal space past a certain point. Still photo formats are generally taller than 16:9 (and of course you have to deal with portrait-oriented photos as well). When working on 16:9 content, the extra vertical space allows for potentially less cropping of the content once you account for the program's UI. For programming (and I imagine spreadsheet work as well, hell, anything text-related), the extra vertical pixels are noticeable when splitting the screen vertically to show things like console/debug output or additional editor windows.

I should note I think the proper way to compare 16:10 to 16:9 is to have the same physical width and horizontal pixel count, just adding the vertical pixels/height (as this keeps the same desktop footprint), rather than having the same diagonal measurement (which would make a 16:10 screen narrower).
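The difference between the two comparisons is easy to check with a little geometry. A quick sketch of the arithmetic (the helper names here are made up for illustration, nothing standard):

```python
import math

def width_for_diagonal(diag, ar_w, ar_h):
    """Physical width of a screen with the given diagonal and aspect ratio."""
    return diag * ar_w / math.hypot(ar_w, ar_h)

def diagonal_for_width(width, ar_w, ar_h):
    """Diagonal of a screen with the given physical width and aspect ratio."""
    return width * math.hypot(ar_w, ar_h) / ar_w

# At the same 27" diagonal, the 16:10 panel comes out narrower...
w_169 = width_for_diagonal(27, 16, 9)    # ~23.5"
w_1610 = width_for_diagonal(27, 16, 10)  # ~22.9"

# ...but holding the 16:9 panel's width and adding vertical inches
# gives a bigger diagonal with the same desktop footprint.
d_1610 = diagonal_for_width(w_169, 16, 10)  # ~27.8"
```

So a "same footprint" 16:10 counterpart to a 27" 16:9 monitor is really a ~27.8" panel, which is why same-diagonal comparisons undersell 16:10.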

kw0134
Apr 19, 2003



Alright, so is the LG 34" Ultrawide a bad idea? I have the 27" and while I like it, at this price I can use it as the primary upgrade I've been kind of putting off and then finally retire an old 27" 1080p panel.

What vertical resolution are people looking for in their ultrawides?

Shipon
Nov 7, 2005


kw0134 posted:

Alright, so is the LG 34" Ultrawide a bad idea? I have the 27" and while I like it, at this price I can use it as the primary upgrade I've been kind of putting off and then finally retire an old 27" 1080p panel.

What vertical resolution are people looking for in their ultrawides?

I have this monitor and it's quite nice, would recommend.

SCheeseman
Apr 23, 2003



16:10 and squarer ratios in general are picking up steam on laptops again. It makes sense there, since the alternative is either shortening the body leaving less room for the touchpad or a big chin under the display.

Pitch
Jun 16, 2005

Dunno, though


I'm not sure if this is a question about monitors or about Windows. I have an annoying issue with my unusual multiple-display situation. I have a real monitor (ASUS ROG Strix XG279Q) plugged in by DisplayPort and a Samsung TV connected by HDMI. The TV is almost always off if I'm not actively using it, but Windows detects it and considers me to have another 1920x1080 of screen real estate off to one side. I use a program called Dual Monitor Tools to lock my cursor onto the main display and mostly ignore it.

The mild annoyance is that when the PC is idle and the display turns off, any open windows are reset to default positions. This doesn't happen if the monitor is the only display plugged in, and for some reason it didn't happen with a previous monitor that used DVI instead. I don't know if this behavior is a DP thing of fully disconnecting a powered-down display, or if this monitor turns itself off more completely than the old one when it's idle. Is there any way I can get Windows to stop forgetting the monitor exists, without just keeping the TV physically disconnected most of the time?

butt dickus
Jul 7, 2007

top ten juiced up coaches
and the top ten juiced up players

SCheeseman posted:

16:10 and squarer ratios in general are picking up steam on laptops again. It makes sense there, since the alternative is either shortening the body leaving less room for the touchpad or a big chin under the display.
my laptop is 3:2, my monitors are 16:10, my tv is 16:9 and that's the way i like it

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


Pitch posted:

I'm not sure if this is a question about monitors or about Windows. I have an annoying issue with my unusual multiple-display situation. I have a real monitor (ASUS ROG Strix XG279Q) plugged in by DisplayPort and a Samsung TV connected by HDMI. The TV is almost always off if I'm not actively using it, but Windows detects it and considers me to have another 1920x1080 of screen real estate off to one side. I use a program called Dual Monitor Tools to lock my cursor onto the main display and mostly ignore it.

The mild annoyance is that when the PC is idle and the display turns off, any open windows are reset to default positions. This doesn't happen if the monitor is the only display plugged in, and for some reason it didn't happen with a previous monitor that used DVI instead. I don't know if this behavior is a DP thing of fully disconnecting a powered-down display, or if this monitor turns itself off more completely than the old one when it's idle. Is there any way I can get Windows to stop forgetting the monitor exists, without just keeping the TV physically disconnected most of the time?

What I do is switch screens with the Windows Project feature. It's a hassle because it means I have to manually switch screens (Win + P is the shortcut), but I mostly don't have to deal with resolution-difference issues.

Pitch
Jun 16, 2005

Dunno, though


Rinkles posted:

What I do is switch screens with the Windows Project feature. It's a hassle because it means I have to manually switch screens (Win + P is the shortcut), but I mostly don't have to deal with resolution-difference issues.
I didn't know about this, but it doesn't seem to help. If the TV is connected and turned on, even if Project is in "PC Screen Only", then turning off the monitor switches the display to the TV and resets all the window positions to default (I guess because they were taller than 1080, yeah). If the TV is turned off I get the same result so I assume the same thing might be happening but I can't prove it (since I can't see if all the windows have been shoved onto that screen). The result I want is for the monitor to be my main display all the time, and all windows to stay there, unless I specifically want a video to play on the TV.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


Pitch posted:

I didn't know about this, but it doesn't seem to help. If the TV is connected and turned on, even if Project is in "PC Screen Only", then turning off the monitor switches the display to the TV and resets all the window positions to default (I guess because they were taller than 1080, yeah). If the TV is turned off I get the same result so I assume the same thing might be happening but I can't prove it (since I can't see if all the windows have been shoved onto that screen).

Hmm, my TV doesn't do that. As far as I can tell, the pc only outputs to the TV if I tell it to. Idk if it matters, but it's a decade old Sony. My Gigabyte monitor is hooked up via DP, like yours.

Just spitballing, but did you turn off Dual Monitor Tools?

Pitch posted:

The result I want is for the monitor to be my main display all the time, and all windows to stay there, unless I specifically want a video to play on the TV.

I would love a more convenient solution too, but (to my surprise, since I don't think this is a particularly uncommon situation) I haven't found one. It'd also be nice if GPUs had HDMI-CEC support (the thing that, among other things, automatically turns on your TV when you turn on a connected device).

Pitch
Jun 16, 2005

Dunno, though


I suspect (completely amateur opinion) this is related to Displayport Deep Sleep, and Windows can't tell the difference between the monitor going into very-low power mode and the monitor completely disappearing. Unfortunately the XG279q doesn't have an option to disable this, although previous models did. It also takes basically as long to wake from sleep as it does to turn on completely cold, but that never inconvenienced me that much.

I'd like to hear a real solution, but some googling turned up a workaround: for now I'm using Custom Resolution Utility to create an EDID override so Windows doesn't detect any change in the monitor's status at all. It seems to work so far, but I'll have to see if it has any weird side effects.
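For anyone curious what a tool like CRU has to handle when you override an EDID: every 128-byte EDID block ends with a checksum byte chosen so the whole block sums to zero mod 256, so any byte you edit has to be compensated there. CRU does this for you; this is just a sketch of the invariant (the function name is mine):

```python
def fix_edid_checksum(block: bytes) -> bytes:
    """Recompute the trailing checksum byte of a 128-byte EDID block.

    Every EDID block must sum to 0 mod 256; the final byte exists purely
    to make that true, so edit bytes 0..126 freely and then fix byte 127.
    """
    if len(block) != 128:
        raise ValueError("EDID blocks are exactly 128 bytes")
    body = block[:127]
    checksum = (256 - sum(body) % 256) % 256
    return bytes(body) + bytes([checksum])
```

A block with a stale checksum is exactly the kind of thing that makes Windows or a driver reject the override, so tools always rewrite that last byte after any edit.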

wolrah
May 8, 2006
what?


Pitch posted:

I suspect (completely amateur opinion) this is related to Displayport Deep Sleep, and Windows can't tell the difference between the monitor going into very-low power mode and the monitor completely disappearing. Unfortunately the XG279q doesn't have an option to disable this, although previous models did. It also takes basically as long to wake from sleep as it does to turn on completely cold, but that never inconvenienced me that much.
If the monitor is turning off its presence signal and the TV is not, then you are correct: Windows is going to see that and assume everything now needs to be on the TV, regardless of whether it was previously disabled. Given the choice between no monitor and a disabled monitor, the desktop GUI operating system is of course going to pick the disabled one.

Am I correct though in understanding that this only happens when you turn the monitor off, as in pressing the power button on the monitor? And that it works as expected when you just allow power management to put the monitor to sleep?

If that's the case, then why not just do that? The default Windows power management settings aggressively sleep the monitor once the lock screen is up, so if you lock or log out when you leave your computer (like you should be in the habit of doing anyways) it'll be off soon enough one way or another.

If on the other hand the display actually disconnects itself when it goes to sleep through power management, then that's just broke as gently caress behavior and if it's too late to return then you're probably stuck with hacky workarounds.

mom and dad fight a lot
Sep 21, 2006

twenty-six characters long



Well this was apt foreshadowing.

Cygni posted:

As someone who mostly does admin/HR/manager stuff, yeah, I can't recommend multi-monitor enough. I've personally found 2 x 27 to be better than 3 smaller screens, though. I like that 27in at 1440p/4K lets you put two 8.5x11 documents next to each other at full size with full legibility, plus the toolbars and such. Really helps.

I would def recommend dropping by a store (if that's safe/possible to do in your area) and taking a look at a 27in 1440p in person if you haven't.

I took over some financial duties, and now I'm using magnum-sized spreadsheets that could really use some extra monitor width. There doesn't seem to be an appreciable price difference between a 2560x1080 and a 2560x1440 monitor, so I guess I'll invest in the latter. My dining table is running out of room, so I'll have to take a look at some el-cheapo monitor arms too.

Need. More. Canvas.

mom and dad fight a lot fucked around with this message at 09:29 on Mar 21, 2021

Pitch
Jun 16, 2005

Dunno, though


wolrah posted:

If the monitor is turning off its presence signal and the TV is not, then you are correct: Windows is going to see that and assume everything now needs to be on the TV, regardless of whether it was previously disabled. Given the choice between no monitor and a disabled monitor, the desktop GUI operating system is of course going to pick the disabled one.

Am I correct though in understanding that this only happens when you turn the monitor off, as in pressing the power button on the monitor? And that it works as expected when you just allow power management to put the monitor to sleep?

If that's the case, then why not just do that? The default Windows power management settings aggressively sleep the monitor once the lock screen is up, so if you lock or log out when you leave your computer (like you should be in the habit of doing anyways) it'll be off soon enough one way or another.

If on the other hand the display actually disconnects itself when it goes to sleep through power management, then that's just broke as gently caress behavior and if it's too late to return then you're probably stuck with hacky workarounds.
No, it acts the same whether I physically turn it off or it's put to sleep by Windows being idle. Apparently the previous model in this line not only allowed you to disable "Deep Sleep" but recommended it in the instruction manual, but now there's no such option (or it's tied in to disabling FreeSync for some reason, according to forum rumors).

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.


Rollie Fingers posted:

This guy’s been using the LG CX Oled screen as a monitor for six months and hasn’t noticed burn yet:

https://youtu.be/AhV09HD7Ee0

He says he hasn’t had to babysit it, but he’s still hidden the taskbar and all desktop icons and changes backgrounds regularly.

The part that I’d find ultra-irritating is the screen’s burn-in protection auto-adjusting brightness regularly.

You can buy second hand OLED PVMs on ebay https://www.ebay.com/itm/Sony-PVM-2551MD-OLED-Monitor/264898311440?hash=item3dad2b7510:g:aJoAAOSwlBVfhfxl

I have one and it’s outstanding (I paid USD 250)

Shipon
Nov 7, 2005


Huh, why would a medical office need an OLED monitor specifically?

KingEup
Nov 18, 2004
I am a REAL ADDICT
(to threadshitting)


Please ask me for my google inspired wisdom on shit I know nothing about. Actually, you don't even have to ask.


Shipon posted:

Huh, why would a medical office need an OLED monitor specifically?

These are FDA-approved surgical displays. Amazing colour accuracy and no input lag, because you need to see yourself nicking an artery in real time.

KingEup fucked around with this message at 11:32 on Mar 20, 2021

LochNessMonster
Feb 3, 2005

I need about three fitty



I think I saw some talk about calibrating your screen some time ago. Is this something that's necessary for most screens, or only for specific types of screens?

wolrah
May 8, 2006
what?


Pitch posted:

No, it acts the same whether I physically turn it off or it's put to sleep by Windows being idle. Apparently the previous model in this line not only allowed you to disable "Deep Sleep" but recommended it in the instruction manual, but now there's no such option (or it's tied in to disabling FreeSync for some reason, according to forum rumors).
Wow, how did that ever make it out of QA? (answer: QA probably never checked)

CaptainSarcastic
Jul 6, 2013

HAIL SATAN



LochNessMonster posted:

I think I saw some talk about calibrating your screen some time ago. Is this something that's necessary for most screens or only for specific type of screens?

I'd say it depends on what you're doing and what your personal preferences are.

The stock calibration on my monitors seems fine, but if I were doing serious digital darkroom stuff I'd calibrate at least my main monitor. I haven't had a decent camera in years and don't currently own a photo printer, so I'm personally playing faster and looser with my color calibration than I did when I was more actively doing photography.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


Found the DP 1.2 fuzziness.





60, 120, 170 Hz, from top to bottom.

I'm not sure why I sometimes don't see it. The effect is pretty subtle, and in the pictures I took before, the refresh rate didn't seem to make any difference.

Rinkles posted:

but now itunes seems to have issues



RTings profile vs default* respectively (it shouldn't be yellowish)




*sRGB IEC61966-2.1

Still pulling out my hair over this. I didn't realize getting a wide gamut monitor would actually end up being a liability.
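The "liability" here is that untagged sRGB content gets stretched across the panel's native gamut and comes out oversaturated (or, with a correcting profile applied inconsistently, apps disagree about color, hence the yellowish iTunes). A back-of-the-envelope conversion shows the gamut mismatch: mapping a wide-gamut (Display P3) color into sRGB through XYZ pushes saturated primaries outside [0, 1]. The matrices below are the standard published D65 values, rounded to four places:

```python
# Display P3 (linear) -> CIE XYZ, D65 white point (standard values, rounded)
P3_TO_XYZ = [
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
]
# CIE XYZ -> sRGB (linear), D65 (standard values, rounded)
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def mul(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def p3_to_srgb(rgb):
    """Convert a linear Display P3 triple to linear sRGB via XYZ."""
    return mul(XYZ_TO_SRGB, mul(P3_TO_XYZ, rgb))

white = p3_to_srgb([1.0, 1.0, 1.0])  # stays ~(1, 1, 1): same D65 white point
red = p3_to_srgb([1.0, 0.0, 0.0])    # r > 1, g < 0: outside the sRGB gamut
```

White maps to white (both spaces share D65), but the P3 red primary simply has no in-gamut sRGB representation: a monitor that shows sRGB content at its native gamut is effectively running this mapping in reverse, which is the oversaturation people complain about.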

crepeface
Nov 5, 2004

r*p*f*c*

I'm getting that fancy Acer Predator monitor I posted, and imo spending money on something you're not setting up correctly seems like a waste. The kind of people posting in this thread probably spend hours finding that perfect sweet spot of performance vs cost; what's another 30 mins to set it up properly?

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


crepeface posted:

I'm getting that fancy Acer Predator monitor I posted, and imo spending money on something you're not setting up correctly seems like a waste. The kind of people posting in this thread probably spend hours finding that perfect sweet spot of performance vs cost; what's another 30 mins to set it up properly?

You're ignoring the fact that doing so properly requires fairly expensive equipment.

CaptainSarcastic
Jul 6, 2013

HAIL SATAN



Rinkles posted:

You're ignoring the fact that doing so properly requires fairly expensive equipment.

Years ago I spent a lot of time scanning, viewing, and printing cards like this to calibrate all my equipment:



Not expensive but it was a lot of time and hassle.

crepeface
Nov 5, 2004

r*p*f*c*

Rinkles posted:

You're ignoring the fact that doing so properly requires fairly expensive equipment.

I'm pretty obviously talking about doing basic calibration; I assumed that's what the OP was talking about.

Spacedad
Sep 11, 2001

We go play orbital catch around the curvature of the earth, son.

Edit: Nm I was thinking of the wrong display.

Anyway, I was wondering which budget displays (gaming or general use) people here recommend. I might need 1-2 cheap-but-good displays to go with a secondary PC in the near future.

Spacedad fucked around with this message at 17:51 on Mar 21, 2021

Lackmaster
Mar 1, 2011


So why is the Chromebook Pixel the best IPS display on the market?

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE


Lackmaster posted:

So why is the Chromebook Pixel the best IPS display on the market?

It was a pretty neat high-DPI 13" display (wacky 3:2 aspect ratio, though) when it came out in 2013, but the OP and the thread title haven't been updated since then.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


I hadn't checked cause I assumed it was 1.4, but the M27Q only has a DP 1.2 port, not 1.4.

That means it can't go higher than 144Hz at 1440p without chroma subsampling, right? Anything else?

I guess it's also relevant to hooking up a PlayStation 5 (assuming Sony doesn't add 1440p output). Can't take a 120Hz 4K signal.

e:I'm dumb, PS5 would be over HDMI (2.0 in this case).
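Roughly right. DP 1.2 (HBR2, four lanes) carries about 17.28 Gbit/s of payload after 8b/10b encoding, and you can sanity-check what fits with napkin math. The ~10% blanking overhead below is a rough assumption (real CVT-RB timings vary, which is exactly how some DP 1.2 panels squeeze out a bit more than 144 Hz at 8-bit):

```python
DP12_GBPS = 21.6 * 0.8  # 4 lanes x 5.4 Gbit/s HBR2, minus 8b/10b overhead

def required_gbps(h, v, hz, bits_per_pixel, blanking=1.10):
    """Approximate link rate needed, assuming ~10% blanking overhead."""
    return h * v * hz * bits_per_pixel * blanking / 1e9

# 1440p144 at 8-bit RGB fits comfortably (~14.0 Gbit/s)...
rate_144_8bit = required_gbps(2560, 1440, 144, 24)
# ...while 1440p165 at 10-bit (~20.1 Gbit/s) clearly does not,
# hence dropping to 8-bit or chroma subsampling at high refresh.
rate_165_10bit = required_gbps(2560, 1440, 165, 30)
```

By this estimate, high refresh at 8-bit is marginal rather than impossible on DP 1.2, and 10-bit is what's clearly out of reach.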

Rinkles fucked around with this message at 14:45 on Mar 22, 2021

Harik
Sep 9, 2001

From the hard streets of Moscow
First dog to touch the stars




Plaster Town Cop

CaptainSarcastic posted:

Years ago I spent a lot of time scanning, viewing, and printing cards like this to calibrate all my equipment:



Not expensive but it was a lot of time and hassle.

That seems like a lot of effort to go through to calibrate your monitor to the quality control of a color printer cartridge.

Now I'm down the rabbit hole of seeing if you can even buy expired pantone swatches.

E: Yes, like $50-75 for a 10-15 year old formula guide.

https://www.ebay.com/itm/Pantone-Formula-Guide-Solid-Matte-2008/133663327735

I don't think reflective vs emissive will give you a good calibration unless you have known-accurate lighting.

BRB calibrating my LED bulbs

Harik fucked around with this message at 14:46 on Mar 22, 2021

CaptainSarcastic
Jul 6, 2013

HAIL SATAN



Harik posted:

That seems like a lot of effort to go through to calibrate your monitor to the quality control of a color printer cartridge.

It was a lot of effort, partly because the whole calibration loop involved my scanner as well as monitor and printer. I did have a high-end photo printer - it's not like I was calibrating to a disposable HP Deskjet and tricolor cartridges. This was back when film was still in use, so I was scanning negatives a lot.

Rinkles
Oct 24, 2010

What I'm getting at is...
Do you feel the same way?


I just learned that AMD cards have a useful feature Nvidia cards don't: you can force sRGB emulation on the driver level.

    AMD GPU drivers offer an alternative, a flexible sRGB emulation setting that is activated in the driver itself. This setting reads the EDID (Extended Display Identification Data) of the monitor which contains information on the native gamut and corrects based on that. It has actually existed for some time, but the naming never made it clear what it did so even more technically-minded users may have overlooked it. In older graphics drivers there was a ‘Colour Temperature’ toggle that could be set to ‘Automatic’ rather than the default ‘6500K’ to achieve sRGB emulation. In newer drivers it’s done by opening ‘AMD Radeon Software’, clicking ‘Settings’ (cog icon towards top right) and clicking on ‘Display’. You should then ensure that the ‘Custom Color’ slider to the right is set to ‘Enabled’ and ‘Color Temperature Control’ (CTC) set to ‘Disabled’. It may appear to be set this way by default, but the native rather than restricted gamut is likely in play. If that’s the case, simply switch the ‘Color Temperature Control’ slider to ‘Enabled’ then back to ‘Disabled’ to leverage the sRGB emulation behaviour.



    We’ve tested this on a broad range of monitors and have found it does usually offer reliable sRGB emulation. Exactly how closely the gamut tracks sRGB varies between models, but we typically see ~98% sRGB coverage (sometimes a touch below, sometimes a touch above) with very little extension beyond sRGB. The beauty of this setting is that you use it in conjunction with the full native gamut of the monitor, allowing you to use the full array of monitor OSD settings available to you. It’s set universally, too, so isn’t something that is simply ignored by some applications.

https://pcmonitors.info/articles/taming-the-wide-gamut-using-srgb-emulation/

Zarin
Nov 11, 2008

I SEE YOU


Read the last 3 pages, didn't see this answer here:

What are the current recommendations for a 4k gaming monitor if my buddy ends up going with a 3080? Are 4k/120 monitors the standard now for 4k? Or is that refresh rate still considered pretty aggressive for mainstream use?

Edit: It sounds like 4k/60 is too aggressive for a 3070, is that about correct?

Zarin fucked around with this message at 02:31 on Mar 23, 2021

Blorange
Jan 31, 2007

A wizard did it



A 3070 can do 4k on high instead of ultra, it's fine. 4k/144hz monitors exist, they're just double the price. If that isn't a problem, great. If price is a factor, most people would recommend prioritizing refresh rate over resolution.

Hardware Unboxed did a monitor roundup recently that might point you in the right direction:
https://www.youtube.com/watch?v=leaDHuKlke8

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE


Zarin posted:

Read the last 3 pages, didn't see this answer here:

What are the current recommendations for a 4k gaming monitor if my buddy ends up going with a 3080? Are 4k/120 monitors the standard now for 4k? Or is that refresh rate still considered pretty aggressive for mainstream use?

Edit: It sounds like 4k/60 is too aggressive for a 3070, is that about correct?

There's basically only one 4K high-refresh-rate monitor that makes sense right now, and it's the LG 27GN950 (well, unless you want to consider LG's OLED TVs as an option, but that's pretty outlandish for most people). It's about 900€ here in Europe; dunno what it's like in the US, but probably around $1000. There are a handful of other options, like the Acer XV273K and the Asus XG27UQ, but if you're gonna spend this much money on a monitor I don't think it makes sense to try to cheap out by 10% or something and choose one that's worse. If you're even considering 4K gaming you're well into "more money than sense" territory anyway. I've had an XV273K basically since it launched around two years ago; it was 1000€ at launch but has the general shoddiness and panel quality of something in the 1440p 300-400€ segment. Would not really recommend.

bobua
Mar 23, 2003
I'd trade it all for just a little more.



Is there a thread-recommended option on software that sets up monitor partitions for Windows? As in, splits the screen up so that maximizing a window doesn't actually fill the monitor, but just the partition it's in?

I'm currently using the LG software that does it, and it's fine except that when I customize a window's size, it just re-maximizes, with no real options.

No one hates free, but I don't mind paying for an option if it's worth it.

grate deceiver
Jul 10, 2009


I need some help figuring out how to set up a TV for games/movies.

TV is going to be mounted in the living room, I will be using it mainly for 2 things - watching movies and youtube from my desktop computer and playing couch games. The two will be separated by a wall and about 6m in a straight line. What is the best/least annoying way to accomplish this?

I'm considering 3 options:

- Run an HDMI cable from the computer to the TV, probably also powered USB for the controllers or wireless controller dongles, and just treat it like another computer screen. Total length of cable would be a little under 10m, so HDMI should handle it just fine I think.

- Run Steam Link on the TV and connect wirelessly - sounds convenient, but I'm not sure how well the signal will carry through a wall, or if the latency will be ok for gaming. I'm also not sure if I should connect some of the devices to each other by cable. Also, I have no idea how one installs something on a TV, or how compatible it would be.

- Same as above, but have a Raspberry Pi dedicated to Steam Link. I already have one running pi-hole and my DNS server, and it's connected to my desktop by cable, so lag should be less? This would also I guess make it easier to stream to and from other devices.

TheFluff
Dec 13, 2006

FRIENDS, LISTEN TO ME
I AM A SEAGULL
OF WEALTH AND TASTE


bobua posted:

Is there a thread-recommended option on software that sets up monitor partitions for Windows? As in, splits the screen up so that maximizing a window doesn't actually fill the monitor, but just the partition it's in?

I'm currently using the LG software that does it, and it's fine except that when I customize a window's size, it just re-maximizes, with no real options.

No one hates free, but I don't mind paying for an option if it's worth it.

The FancyZones feature of Microsoft PowerToys does all I need.
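For the curious, the core idea behind zone tools like this is nothing fancier than carving the desktop into rectangles and resizing a window into whichever one it's dropped on. A sketch of just the geometry (none of the Windows hooks; the function name is mine):

```python
def grid_zones(width, height, cols, rows):
    """Split a monitor into cols x rows zone rectangles as (x, y, w, h)."""
    zw, zh = width // cols, height // rows
    return [(c * zw, r * zh, zw, zh)
            for r in range(rows) for c in range(cols)]

# Three side-by-side columns on a 3440x1440 ultrawide:
zones = grid_zones(3440, 1440, 3, 1)
# [(0, 0, 1146, 1440), (1146, 0, 1146, 1440), (2292, 0, 1146, 1440)]
```

The "maximize fills only the partition" behavior is then just intercepting the maximize and moving the window to its zone's rectangle instead of the full work area.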

Paul MaudDib
May 2, 2006

"Tell me of your home world, Usul"


grate deceiver posted:

I need some help figuring out how to set up a TV for games/movies.

just fyi but HDMI 2.1 is limited to a max length of 2 meters. 2.0 was the last standard that supported the full 50 meter length. So you will be limited to 4K60 at your length unless you go to a HDMI 2.1 active cable (fiber optic) which look like they're running about $200. And I've heard they don't necessarily work reliably.

if you go the steam link route, you will lose VRR support (if your TV has it). I think maybe you could do it if you had a box running HDMI 2.1 to actually drive the TV (I think Steam supports VRR encoding?) but that in turn means you have to have something to drive it... which means either a 6000 series AMD GPU, a Turing or newer NVIDIA card, or possibly an Intel tiger lake NUC.
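The cable-length pain falls out of the bandwidth math: HDMI 2.0 tops out at 18 Gbit/s TMDS (~14.4 Gbit/s of pixel data after 8b/10b encoding), which 4K60 8-bit RGB just barely fits and 4K120 clearly doesn't, hence needing 2.1's FRL signaling and its much fussier cables. A napkin version (the 4K60 line uses the standard CTA total timing; the 4K120 total is a rough reduced-blanking guess):

```python
HDMI20_PAYLOAD_GBPS = 18.0 * 0.8        # 8b/10b leaves ~14.4 Gbit/s of pixels
HDMI21_PAYLOAD_GBPS = 48.0 * (16 / 18)  # FRL's 16b/18b leaves ~42.7 Gbit/s

def payload_gbps(h_total, v_total, hz, bits_per_pixel):
    """Pixel-data rate for a full timing (active plus blanking)."""
    return h_total * v_total * hz * bits_per_pixel / 1e9

# 4K60 8-bit RGB, standard CTA timing (4400x2250 total): ~14.26 Gbit/s,
# which only just squeezes under HDMI 2.0's ceiling.
rate_4k60 = payload_gbps(4400, 2250, 60, 24)
# 4K120 blows straight past 2.0 even with tight blanking (~25.6 Gbit/s),
# but sits comfortably inside HDMI 2.1.
rate_4k120 = payload_gbps(4000, 2222, 120, 24)
```

Running right at the ceiling is also why marginal or long 2.0 cables that were fine at 1080p start sparkling at 4K60.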


grate deceiver
Jul 10, 2009


Paul MaudDib posted:

just fyi but HDMI 2.1 is limited to a max length of 2 meters. 2.0 was the last standard that supported the full 50 meter length. So you will be limited to 4K60 at your length unless you go to a HDMI 2.1 active cable (fiber optic) which look like they're running about $200. And I've heard they don't necessarily work reliably.

if you go the steam link route, you will lose VRR support (if your TV has it). I think maybe you could do it if you had a box running HDMI 2.1 to actually drive the TV (I think Steam supports VRR encoding?) but that in turn means you have to have something to drive it... which means either a 6000 series AMD GPU, a Turing or newer NVIDIA card, or possibly an Intel tiger lake NUC.

Ok, good point, I should have probably mentioned my graphics card as well. I have an NVIDIA GTX 1060 right now, so on a big screen in 4K I imagine it will be straining. I plan to upgrade, but for a while I will probably be limited to 1080p anyway, so I also probably don't have to worry about HDMI 2.1 or VRR.

And I also have not yet bought the TV, partly because I'm not yet sure what to look for to accomplish what I want, so any suggestions here would be welcome as well.
