|
K8.0 posted:16:10 makes no sense, frankly. The likelihood that a slightly different AR from 16:9 makes a real difference to you is insignificant compared to the likelihood that 5:4 or 4:3 is an even better AR for you, yet I see no one campaigning for those to return. At 1080p vs 1200p when vertical space was significantly reduced compared to 4:3/5:4 CRTs, there was something of an argument to be made for every vertical pixel you could get. At 1440p and especially 4k, it's silly.

As an erstwhile 16:10 fan, this is how I feel after upgrading to 1440p. The extra pixels make dividing your screen between multiple windows much more practical, which makes the wider aspect ratio a real advantage.
|
# ? Mar 18, 2021 23:58 |
|
|
I have a 30" 16:10 monitor (2560x1600) at work as one of my monitors and it's pretty great for coding (I'd take ultra-wide over this though). In the days of 1440p monitors, I don't think there's a huge difference anymore and I wouldn't go out of my way to grab a 16:10 monitor over a 16:9 monitor, but I think it's a bit off to say 16:10 is wasted space (all else being equal). The only time that is really the case is when consuming 16:9 media.

Web is generally not optimized to make great use of horizontal space after a certain point. Still photo formats are generally taller than 16:9 (and of course you have to deal with portrait oriented photos as well). When working on 16:9 content, the extra vertical space allows for potentially less cropping of the content once you account for the program's UI. For programming (and I imagine spreadsheet work as well, hell anything text related where you might want to split the screen vertically), the extra vertical pixels are noticeable when splitting the screen to show things like console/debug output or additional editor windows.

I should note I think the proper way to compare 16:10 to 16:9 is to have the same physical width and horizontal pixel count, just adding the vertical pixels/height (as this keeps the same desktop footprint), rather than having the same diagonal measurement (which would make a 16:10 screen more narrow).
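That last point is easy to sanity-check with right-triangle math: at a fixed diagonal, a squarer panel really is physically narrower. A quick sketch (27" is just an arbitrary example size):

```python
from math import hypot

def panel_dimensions(diagonal: float, ar_w: int, ar_h: int) -> tuple[float, float]:
    """Physical (width, height) of a panel given its diagonal and aspect ratio."""
    scale = diagonal / hypot(ar_w, ar_h)  # inches per aspect-ratio unit
    return ar_w * scale, ar_h * scale

# 27" 16:9 -> ~23.5" x 13.2"; 27" 16:10 -> ~22.9" x 14.3"
print(panel_dimensions(27, 16, 9))
print(panel_dimensions(27, 16, 10))
```

At equal diagonals the 16:10 panel gives up about 0.6" of width for roughly an extra inch of height, which is exactly why comparing at equal width and equal horizontal pixel count is the fairer test.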
|
# ? Mar 19, 2021 00:25 |
|
Alright, so is the LG 34" Ultrawide a bad idea? I have the 27" and while I like it, at this price I can use it as the primary upgrade I've been kind of putting off, and then finally retire an old 27" 1080 panel. What vertical resolution are people looking for in their ultrawides?
|
# ? Mar 19, 2021 03:43 |
|
kw0134 posted:Alright, so is the LG 34" Ultrawide a bad idea? I have the 27" and while I like it, at this price I can use it as the primary upgrade I've been kind of putting off, and then finally retire an old 27" 1080 panel.

I have this monitor and it's quite nice, would recommend.
|
# ? Mar 19, 2021 03:48 |
|
16:10 and squarer ratios in general are picking up steam on laptops again. It makes sense there, since the alternative is either shortening the body leaving less room for the touchpad or a big chin under the display.
|
# ? Mar 19, 2021 09:43 |
|
I'm not sure if this is a question about monitors or about Windows. I have an annoying issue with my unusual multiple display situation. I have a real monitor (Asus ROG Strix XG279Q) plugged in by DisplayPort and a Samsung TV connected by HDMI. The TV is almost always off if I'm not actively using it, but Windows detects it and considers me to have another 1920x1080 of screen real estate off to one side. I use a program called Dual Monitor Tools to lock my cursor onto the main display and mostly ignore it.

The mild annoyance is that when the PC is idle and the display turns off, any open windows are reset to default positions. This doesn't happen if the monitor is the only display plugged in, and for some reason it didn't happen with a previous monitor that was DVI instead. I don't know if this behavior is a DP thing of fully disconnecting a powered-down display or if this monitor turns itself off more completely than the old one when it's idle. Is there any way I can get Windows to stop forgetting the monitor exists without just keeping the TV physically disconnected most of the time?
|
# ? Mar 20, 2021 02:31 |
|
SCheeseman posted:16:10 and squarer ratios in general are picking up steam on laptops again. It makes sense there, since the alternative is either shortening the body leaving less room for the touchpad or a big chin under the display.
|
# ? Mar 20, 2021 02:36 |
|
Pitch posted:I'm not sure if this is a question about monitors or about Windows. I have an annoying issue with my unusual multiple display situation. I have a real monitor (Asus ROG Strix XG279Q) plugged in by DisplayPort and a Samsung TV connected by HDMI. The TV is almost always off if I'm not actively using it but Windows detects it and considers me to have another 1920x1080 of screen real estate off to one side. I use a program called Dual Monitor Tools to lock my cursor onto the main display and mostly ignore it. The mild annoyance is that when the PC is idle and the display turns off, any open windows are reset to default positions. This doesn't happen if the monitor is the only display plugged in and for some reason it didn't happen with a previous monitor that was DVI instead. Is there any way I can get Windows to stop forgetting the monitor exists without just keeping the TV physically disconnected most of the time?

What I do is switch screens with Windows' Project function. It's a hassle because it means I have to manually switch screens (Win + P is a shortcut), but I mostly don't have to deal with resolution difference issues.
|
# ? Mar 20, 2021 02:47 |
|
Rinkles posted:What I do is switch screens with Windows' Project function. It's a hassle because it means I have to manually switch screens (Win + P is a shortcut), but I mostly don't have to deal with resolution difference issues.
|
# ? Mar 20, 2021 03:22 |
|
Pitch posted:I didn't know about this, but it doesn't seem to help. If the TV is connected and turned on, even if Project is in "PC Screen Only", then turning off the monitor switches the display to the TV and resets all the window positions to default (I guess because they were taller than 1080, yeah). If the TV is turned off I get the same result so I assume the same thing might be happening but I can't prove it (since I can't see if all the windows have been shoved onto that screen).

Hmm, my TV doesn't do that. As far as I can tell, the PC only outputs to the TV if I tell it to. Idk if it matters, but it's a decade old Sony. My Gigabyte monitor is hooked up via DP, like yours. Just spitballing, but did you turn off Dual Monitor Tools?

Pitch posted:The result I want is for the monitor to be my main display all the time, and all windows to stay there, unless I specifically want a video to play on the TV.

I would love a more convenient solution too, but (to my surprise, since I don't think this is a particularly uncommon situation) I haven't found one. Also it'd be nice if GPUs had HDMI-CEC support (the thing that, among other things, automatically turns on your TV when you turn on a connected device).
|
# ? Mar 20, 2021 03:45 |
|
I suspect (completely amateur opinion) this is related to DisplayPort Deep Sleep, and Windows can't tell the difference between the monitor going into very-low-power mode and the monitor completely disappearing. Unfortunately the XG279Q doesn't have an option to disable this, although previous models did. It also takes basically as long to wake from sleep as it does to turn on completely cold, but that never inconvenienced me that much. I'd like to hear a real solution, but after some googling I found something that seems to work: for now I'm using Custom Resolution Utility to create an EDID override so Windows doesn't detect any change in the monitor's status at all. I'll have to see if it has any weird side effects.
|
# ? Mar 20, 2021 04:39 |
|
Pitch posted:I suspect (completely amateur opinion) this is related to DisplayPort Deep Sleep, and Windows can't tell the difference between the monitor going into very-low-power mode and the monitor completely disappearing. Unfortunately the XG279Q doesn't have an option to disable this, although previous models did. It also takes basically as long to wake from sleep as it does to turn on completely cold, but that never inconvenienced me that much.

Am I correct though in understanding that this only happens when you turn the monitor off, as in pressing the power button on the monitor? And that it works as expected when you just allow power management to put the monitor to sleep? If that's the case, then why not just do that? The default Windows power management settings aggressively sleep the monitor once the lock screen is up, so if you lock or log out when you leave your computer (like you should be in the habit of doing anyways) it'll be off soon enough one way or another.

If on the other hand the display actually disconnects itself when it goes to sleep through power management, then that's just broke as gently caress behavior and if it's too late to return then you're probably stuck with hacky workarounds.
|
# ? Mar 20, 2021 05:08 |
|
Well this was apt foreshadowing.

Cygni posted:As someone who mostly does admin/HR/manager stuff, yeah, i can't recommend multi monitor enough. I've personally found 2 x 27 to be better than 3 smaller screens, though. I like that 27in at 1440p/4k let you put two 8.5x11 documents next to each other at full size with full legibility plus the toolbars and such. Really helps.

I took over some financial duties, and now I'm using magnum-sized spreadsheets that could really use some extra monitor width. There doesn't seem to be an appreciable price difference between a 2560x1080 and a 2560x1440 monitor, so I guess I'll invest in the latter. My dining table is running out of room, so I'll have to take a look at some el-cheapo monitor arms too. Need. More. Canvas.

mom and dad fight a lot fucked around with this message at 10:29 on Mar 21, 2021 |
# ? Mar 20, 2021 05:58 |
|
wolrah posted:If the monitor is turning off its presence signal and the TV is not, then you are correct Windows is going to see that and assume everything now needs to be on the TV regardless of if it was previously disabled.

Given the choice between no monitor and a disabled monitor, the desktop GUI operating system is of course going to pick the disabled one.
|
# ? Mar 20, 2021 06:15 |
|
Rollie Fingers posted:This guy’s been using the LG CX OLED screen as a monitor for six months and hasn’t noticed burn-in yet:

You can buy second-hand OLED PVMs on ebay: https://www.ebay.com/itm/Sony-PVM-2551MD-OLED-Monitor/264898311440?hash=item3dad2b7510:g:aJoAAOSwlBVfhfxl

I have one and it’s outstanding (I paid USD 250).
|
# ? Mar 20, 2021 11:30 |
|
Huh, why would a medical office need an OLED monitor specifically?
|
# ? Mar 20, 2021 11:57 |
|
Shipon posted:Huh, why would a medical office need an OLED monitor specifically?

These are FDA-approved surgical displays. Amazing colour accuracy and no input lag, because you need to see yourself nicking an artery in real time.

KingEup fucked around with this message at 12:32 on Mar 20, 2021 |
# ? Mar 20, 2021 12:28 |
|
I think I saw some talk about calibrating your screen some time ago. Is this something that's necessary for most screens or only for specific types of screens?
|
# ? Mar 20, 2021 16:34 |
|
Pitch posted:No, it acts the same whether I physically turn it off or it's put to sleep by Windows being idle. Apparently the previous model in this line not only allowed you to disable "Deep Sleep" but recommended it in the instruction manual, but now there's no such option (or it's tied in to disabling FreeSync for some reason, according to forum rumors).
|
# ? Mar 20, 2021 17:24 |
|
LochNessMonster posted:I think I saw some talk about calibrating your screen some time ago. Is this something that's necessary for most screens or only for specific types of screens?

I'd say it depends on what you're doing and what your personal preferences are. The stock calibration on my monitors seems fine, but if I was doing serious digital darkroom stuff on it then I would have calibrated at least my main monitor. I haven't had a decent camera in years, and don't currently own a photo printer, so I'm personally playing faster and looser with my color calibration than I did when more actively doing photography.
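As context for what "calibrating" actually adjusts: a big part of it is getting the panel's tone response to track a target curve, usually the sRGB transfer function. A sketch of that standard piecewise curve (the IEC sRGB formula; a profile's tone correction approximates its inverse per channel):

```python
def srgb_encode(linear: float) -> float:
    """Linear light (0..1) -> sRGB-encoded value (0..1)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded: float) -> float:
    """sRGB-encoded value (0..1) -> linear light (0..1)."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

# Mid-gray: an encoded 0.5 is only ~21% linear light, i.e. roughly gamma 2.2
print(srgb_decode(0.5))
```

A monitor whose measured response deviates from this curve is what a calibrator corrects for; how visible that deviation is depends on the work you're doing, which is the point above.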
|
# ? Mar 20, 2021 22:33 |
|
Found the DP 1.2 fuzziness. 60, 120, 170 Hz, from top to bottom. I'm not sure why sometimes I don't see it. The effect is pretty subtle, but in the pictures I took before I didn't see refresh rate making any difference.

Rinkles posted:but now itunes seems to have issues

Still pulling out my hair over this. I didn't realize getting a wide gamut monitor would actually end up being a liability.
|
# ? Mar 21, 2021 03:01 |
|
I'm getting that fancy Acer Predator monitor I posted, and imo spending money on something you're not setting up correctly seems like a waste. The kind of people posting in this thread probably spend hours finding that perfect sweet spot of performance vs cost, so what's another 30 mins to set it up properly?
|
# ? Mar 21, 2021 03:05 |
|
crepeface posted:I'm getting that fancy Acer Predator monitor I posted and imo spending money on something you're not setting up correctly seems like a waste. The kind of people posting in this thread probably spend hours finding that perfect sweet spot of performance vs cost, what's another 30 mins to set it up properly?

You're ignoring the fact that doing so properly requires fairly expensive equipment.
|
# ? Mar 21, 2021 03:08 |
|
Rinkles posted:You're ignoring the fact that doing so properly requires fairly expensive equipment.

Years ago I spent a lot of time scanning, viewing, and printing calibration target cards to calibrate all my equipment. Not expensive, but it was a lot of time and hassle.
|
# ? Mar 21, 2021 03:16 |
|
Rinkles posted:You're ignoring the fact that doing so properly requires fairly expensive equipment.

I'm pretty obviously talking about doing basic calibration; I assumed that's what the OP was talking about.
|
# ? Mar 21, 2021 06:03 |
|
Edit: Nm, I was thinking of the wrong display. Anyway, I was wondering what budget displays (gaming or general use) people here recommend. I might need 1-2 cheap-but-good displays to go with a secondary PC in the near future.

Spacedad fucked around with this message at 18:51 on Mar 21, 2021 |
# ? Mar 21, 2021 18:47 |
|
So why is the Chromebook Pixel the best IPS display on the market?
|
# ? Mar 22, 2021 01:17 |
|
Lackmaster posted:So why is the Chromebook Pixel the best IPS display on the market?

It was a pretty neat high-DPI 13" display (wacky 3:2 aspect ratio though) when it came out in 2013, but the OP and the thread title haven't been updated since then.
|
# ? Mar 22, 2021 03:43 |
|
I hadn't checked cause I assumed it was 1.4, but the M27Q only has a DP 1.2 port, not 1.4. That means it can't go higher than 144Hz at 1440p without chroma subsampling, right? Anything else? I guess it's also relevant to hooking up a PlayStation 5 (assuming Sony doesn't add 1440p output). Can't take a 120Hz 4K signal. e:I'm dumb, PS5 would be over HDMI (2.0 in this case). Rinkles fucked around with this message at 15:45 on Mar 22, 2021 |
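For what it's worth, the 144 Hz claim can be sanity-checked with back-of-envelope link math. A sketch assuming DP 1.2 HBR2 (21.6 Gbit/s raw, 8b/10b coding) and a flat 5% blanking overhead — both round numbers, real timings vary per monitor:

```python
def required_gbps(h: int, v: int, hz: int, bpp: int = 24, blanking: float = 1.05) -> float:
    """Approximate video bandwidth in Gbit/s; blanking=1.05 loosely models
    reduced blanking (an assumption -- exact timings differ per monitor)."""
    return h * v * hz * bpp * blanking / 1e9

DP12_EFFECTIVE = 21.6 * 8 / 10  # HBR2 after 8b/10b coding: 17.28 Gbit/s

for hz in (144, 170):
    need = required_gbps(2560, 1440, hz)
    print(f"1440p@{hz}: {need:.1f} Gbit/s (fits DP 1.2: {need < DP12_EFFECTIVE})")
```

By this math even 170 Hz squeaks under the limit at 8 bpc (~15.8 vs 17.28 Gbit/s), but at 10 bpc (bpp=30) it needs ~19.7 Gbit/s, which overshoots — that may be where the subsampling requirement comes in, though I haven't verified the M27Q's exact signaling.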
# ? Mar 22, 2021 07:21 |
|
CaptainSarcastic posted:Years ago I spent a lot of time scanning, viewing, and printing cards like this to calibrate all my equipment:

That seems like a lot of effort to go through to calibrate your monitor to the quality control of a color printer cartridge. Now I'm down the rabbit hole of seeing if you can even buy expired Pantone swatches.

E: Yes, like $50-75 for a 10-15 year old formula guide. https://www.ebay.com/itm/Pantone-Formula-Guide-Solid-Matte-2008/133663327735

I don't think reflective vs emissive will give you a good calibration unless you have known-accurate lighting. BRB calibrating my LED bulbs.

Harik fucked around with this message at 15:46 on Mar 22, 2021 |
# ? Mar 22, 2021 15:39 |
|
Harik posted:That seems like a lot of effort to go through to calibrate your monitor to the quality control of a color printer cartridge.

It was a lot of effort, partly because the whole calibration loop involved my scanner as well as monitor and printer. I did have a high-end photo printer - it's not like I was calibrating to a disposable HP Deskjet and tricolor cartridges. This was back when film was still in use, so I was scanning negatives a lot.
|
# ? Mar 22, 2021 21:35 |
|
I just learned that AMD cards have a useful feature Nvidia cards don't: you can force sRGB emulation on the driver level.
We’ve tested this on a broad range of monitors and have found it does usually offer reliable sRGB emulation. Exactly how closely the gamut tracks sRGB varies between models, but we typically see ~98% sRGB coverage (sometimes a touch below, sometimes a touch above) with very little extension beyond sRGB. The beauty of this setting is that you use it in conjunction with the full native gamut of the monitor, allowing you to use the full array of monitor OSD settings available to you. It’s set universally, too, so isn’t something that is simply ignored by some applications. https://pcmonitors.info/articles/taming-the-wide-gamut-using-srgb-emulation/
|
# ? Mar 23, 2021 02:15 |
|
Read the last 3 pages, didn't see this answer here: What are the current recommendations for a 4k gaming monitor if my buddy ends up going with a 3080? Are 4k/120 monitors the standard now for 4k? Or is that refresh rate still considered pretty aggressive for mainstream yet?

Edit: It sounds like 4k/60 is too aggressive for a 3070, is that about correct?

Zarin fucked around with this message at 03:31 on Mar 23, 2021 |
# ? Mar 23, 2021 03:26 |
|
A 3070 can do 4k on high instead of ultra, it's fine. 4k/144hz monitors exist, they're just double the price. If that isn't a problem, great. If price is a factor, most people would recommend prioritizing refresh rate over resolution. Hardware Unboxed did a monitor roundup recently, that might point you in the right direction. https://www.youtube.com/watch?v=leaDHuKlke8
|
# ? Mar 23, 2021 05:25 |
|
Zarin posted:Read the last 3 pages, didn't see this answer here:

There's basically only one 4K high refresh rate monitor that makes sense right now, and it's the LG 27GN950 (well, unless you want to consider LG's OLED TVs as an option, but that's pretty outlandish for most people). It's about 900€ here in Europe, dunno what it's like in the US but probably around $1000. There are a handful of other options, like the Acer XV273K and the Asus XG27UQ, but if you're gonna spend this much money on a monitor I don't think it makes sense to try to cheap out by 10% or something and choose one that's worse. If you're even considering 4K gaming you're well into "more money than sense" territory anyway.

I've had an XV273K basically since it launched around two years ago now; it was 1000€ when it launched but has the general shoddiness and panel quality of something in the 300-400€ 1440p segment. Would not really recommend.
|
# ? Mar 23, 2021 15:03 |
|
Is there a thread recommended option on software that sets up monitor partitions for Windows? As in, split the screens up so that maximizing a window doesn't actually fill the monitor, but just the partition it's in? I'm currently using the LG software that does it, and it's fine except that when I customize a window's size, it just remaximizes, no real options. No one hates free, but I don't mind paying for an option if it's worth it.
|
# ? Mar 24, 2021 16:59 |
|
I need some help figuring out how to set up a TV for games/movies. The TV is going to be mounted in the living room, and I will be using it mainly for 2 things - watching movies and youtube from my desktop computer and playing couch games. The two will be separated by a wall and about 6m in a straight line. What is the best/least annoying way to accomplish this? I'm considering 3 options:

- Run a HDMI cable from the computer to the TV, probably also powered USB for the controllers or wireless controller dongles, and just treat it like another computer screen. Total length of cable would be a little under 10m, so HDMI should handle it just fine I think.

- Run Steam Link on the TV and connect wirelessly - sounds convenient, but I'm not sure how well the signal will carry through a wall, and also if the latency will be ok for gaming. Also not sure if I should connect some devices to each other by cable. Also, I have no idea how one installs something on a TV and how compatible it is.

- Same as above, but have a Raspberry Pi dedicated to Steam Link. I already have one running pi-hole and my DNS server, and it's connected to my desktop by cable, so lag should be less? This would also I guess make it easier to stream to and from other devices.
|
# ? Mar 24, 2021 17:13 |
|
bobua posted:Is there a thread recommended option on software that sets up monitor partitions for Windows? As in, split the screens up so that maximizing a window doesn't actually fill the monitor, but just the partition it's in?

The FancyZones feature of Microsoft PowerToys does all I need.
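Under the hood, tools like this mostly reduce to mapping a dragged or maximized window onto a precomputed rectangle. A hypothetical sketch of just the zone geometry (the actual placement would go through Win32 calls such as MoveWindow, omitted here):

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: int
    y: int
    w: int
    h: int

def grid_zones(screen: Rect, cols: int, rows: int) -> list[Rect]:
    """Split a monitor into a cols x rows grid of snap zones."""
    zw, zh = screen.w // cols, screen.h // rows
    return [Rect(screen.x + c * zw, screen.y + r * zh, zw, zh)
            for r in range(rows) for c in range(cols)]

def zone_for_point(zones: list[Rect], px: int, py: int) -> Rect:
    """Pick the zone under the cursor -- the rectangle a dropped window snaps to."""
    for z in zones:
        if z.x <= px < z.x + z.w and z.y <= py < z.y + z.h:
            return z
    raise ValueError("point outside all zones")

# Quarter a 1440p monitor; a window dropped at (2000, 1000) snaps bottom-right
zones = grid_zones(Rect(0, 0, 2560, 1440), 2, 2)
print(zone_for_point(zones, 2000, 1000))
```

The "maximize fills only the partition" behavior is then just resizing to the chosen zone's rectangle instead of the monitor's work area.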
|
# ? Mar 24, 2021 17:27 |
|
grate deceiver posted:I need some help figuring out how to set up a TV for games/movies.

just fyi but HDMI 2.1 is limited to a max length of 2 meters. 2.0 was the last standard that supported the full 50 meter length. So you will be limited to 4K60 at your length unless you go to a HDMI 2.1 active cable (fiber optic), which look like they're running about $200. And I've heard they don't necessarily work reliably.

if you go the steam link route, you will lose VRR support (if your TV has it). I think maybe you could do it if you had a box running HDMI 2.1 to actually drive the TV (I think Steam supports VRR encoding?) but that in turn means you have to have something to drive it... which means either a 6000 series AMD GPU, a Turing or newer NVIDIA card, or possibly an Intel Tiger Lake NUC.
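The 4K60 ceiling checks out with rough arithmetic. A sketch comparing uncompressed signal bandwidth against the two standards' effective throughput (the 5% blanking factor is an assumption; real timings vary):

```python
def required_gbps(h: int, v: int, hz: int, bpp: int = 24, blanking: float = 1.05) -> float:
    """Approximate uncompressed video bandwidth in Gbit/s."""
    return h * v * hz * bpp * blanking / 1e9

HDMI20 = 18.0 * 8 / 10    # 18 Gbit/s TMDS, 8b/10b coding -> 14.4 effective
HDMI21 = 48.0 * 16 / 18   # 48 Gbit/s FRL, 16b/18b coding -> ~42.7 effective

print(f"4K60:  {required_gbps(3840, 2160, 60):.1f} Gbit/s")
print(f"4K120: {required_gbps(3840, 2160, 120):.1f} Gbit/s")
```

4K60 at 8 bpc needs roughly 12.5 Gbit/s, which fits inside HDMI 2.0's ~14.4 Gbit/s effective; 4K120 needs roughly 25 Gbit/s, which is why anything past 60 Hz at 4K forces HDMI 2.1 (or chroma subsampling/DSC).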
|
# ? Mar 24, 2021 17:29 |
|
|
Paul MaudDib posted:just fyi but HDMI 2.1 is limited to a max length of 2 meters. 2.0 was the last standard that supported the full 50 meter length. So you will be limited to 4K60 at your length unless you go to a HDMI 2.1 active cable (fiber optic) which look like they're running about $200. And I've heard they don't necessarily work reliably.

Ok, good point, I should have probably mentioned my graphics card as well. I have a NVIDIA GTX 1060 right now, so on a big screen in 4K I imagine it will be straining. I plan to upgrade, but for a while I will probably be limited to 1080p anyway, so I also probably don't have to worry about HDMI 2.1 or VRR.

And I also have not yet bought the TV, partly because I'm not yet sure what to look for to accomplish what I want, so any suggestions here would be welcome as well.
|
# ? Mar 24, 2021 18:20 |