papa horny michael
Aug 18, 2009

by Pragmatica
Is the HBO Max app on the LG C1 poo poo for anyone else? It gives me far lower quality than a Roku Ultra or the newest Chromecast on the same TV using their HBO Max apps.


morestuff
Aug 2, 2008

You can't stop what's coming
It sucks on Roku as well; it's slow to load everything, and it'll occasionally just refuse to run Dolby Vision stuff for me despite a fiber connection

kliras
Mar 27, 2021
i used it alright on apple tv, but i think there was the occasional wonkiness with changing quality. i think hbo max in general has pretty busted auto-quality handling

i would definitely assume it's an hbo problem more than anything else

KillHour
Oct 28, 2007


bull3964 posted:

The slow CPUs are annoying, but there's really no reason why a TV needs more than a 100 Mbit network connection. There's zero use case in the smart platform on the TV that would require more than half that bandwidth.

I needed to get a GbE adapter for Steam Link to work in 4K.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


KillHour posted:

I needed to get a GbE adapter for Steam Link to work in 4K.

You are running Steam Link on the Tizen OS?

KillHour
Oct 28, 2007


bull3964 posted:

You are running Steam Link on the Tizen OS?

I have a Sony, so Android.

CatHorse
Jan 5, 2008

bull3964 posted:

There’s zero use case in the smart platform on the TV that would require more than half that bandwidth.

UHD Blu-ray can peak at 144 Mbit/s.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


MikusR posted:

UHD Blu-ray can peak at 144 Mbit/s.

That's irrelevant. The smart platforms on these TVs are not streaming full-bitrate UHD. That's not a valid use case for the TV.

The number of people installing Plex and streaming full-bitrate UHD rips to their TV's smart OS borders on the statistically non-existent.

KillHour posted:

I have a Sony, so Android.

Still, 100 Mbps should be plenty for Steam Link. I think the encoder default is 25 Mbps.
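
As a rough sanity check (the ~6% framing overhead below is a ballpark assumption, not a measured figure):

code:

# Fast Ethernet payload vs. Steam Link's default encode bitrate.
link_mbps = 100
usable_mbps = link_mbps * 0.94      # ~94 Mbit/s after ethernet/IP/TCP overhead
steam_link_mbps = 25                # default encoder bitrate cited above

print(f"usable: {usable_mbps:.0f} Mbit/s, "
      f"headroom: {usable_mbps / steam_link_mbps:.1f}x the default encode")
# -> usable: 94 Mbit/s, headroom: 3.8x the default encode

So on paper the port has nearly 4x headroom; if it falls short in practice, the bottleneck is probably somewhere else in the TV.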

bull3964 fucked around with this message at 22:13 on Jul 19, 2022

fyallm
Feb 27, 2007



College Slice

bull3964 posted:


The number of people installing Plex and streaming full-bitrate UHD rips to their TV's smart OS borders on the statistically non-existent.


I'm thinking about doing that, but I'm looking for a good writeup.

KillHour
Oct 28, 2007


bull3964 posted:

Still, 100 Mbps should be plenty for Steam Link. I think the encoder default is 25 Mbps.

It wasn't. My connection noticeably improved with the additional bandwidth.

Edit: And the difference in hardware cost to do GbE is negligible, so it's just stupid that they don't do it.

KillHour fucked around with this message at 22:19 on Jul 19, 2022

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


fyallm posted:

I'm thinking about doing that, but I'm looking for a good writeup.

Quite frankly, it's a gigantic PITA and I gave up. It's a waste of storage space, and you have to find special hardware that can even do it.

I have all my Blu-rays ripped and I stream them using Kodi via my Shield, but for UHD I just pop in the disc.

KillHour posted:

It wasn't. My connection noticeably improved with the additional bandwidth.

Edit: And the difference in hardware cost to do GbE is negligible, so it's just stupid that they don't do it.

There were likely other factors at play beyond pure bandwidth, because there isn't really any benefit going past 50 Mbps or so with Steam Link. It's probably more down to the SoC and its relationship with the onboard ethernet vs. the USB bus than anything else.

It doesn't really matter how much cheaper it is; it's cheaper, and spending more on the part isn't making the TV any better for the core audience. Likely 90% of them would never even plug in the network port to begin with. I know my C6 has never seen an ethernet connection in six years.

bull3964 fucked around with this message at 22:25 on Jul 19, 2022

KillHour
Oct 28, 2007


bull3964 posted:

There were likely other factors at play beyond pure bandwidth, because there isn't really any benefit going past 50 Mbps or so with Steam Link. It's probably more down to the SoC and its relationship with the onboard ethernet vs. the USB bus than anything else.

It doesn't really matter how much cheaper it is; it's cheaper, and spending more on the part isn't making the TV any better for the core audience. Likely 90% of them would never even plug in the network port to begin with. I know my C6 has never seen an ethernet connection in six years.

Quit trying to prove everyone around you wrong. It's annoying. I used to work in the video streaming industry. I know how poo poo works.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*
Bravia Core on Sony TVs requires at least 115 Mbps to use Purestream. The manual even specifies that you need to use wifi, because the TV's built-in ethernet adapter tops out at 100 Mbps, though you can use a USB 3.0 ethernet adapter with a specific chipset if you must have a wired connection. As already mentioned, it also makes a big difference if you're using Steam Link.
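
The arithmetic backs that up. A minimal sketch, where the overhead fraction is an assumption rather than a measured number:

code:

# 115 Mbit/s of Purestream payload can't fit through a 100 Mbit/s port
# once framing overhead is subtracted; gigabit clears it easily.
def fits(link_mbps: float, stream_mbps: float, overhead: float = 0.06) -> bool:
    """True if a stream's bitrate fits in the link's usable payload."""
    return stream_mbps <= link_mbps * (1.0 - overhead)

print(fits(100, 115))   # False -> the built-in 100 Mbit port won't do it
print(fits(1000, 115))  # True  -> GbE over a USB 3.0 adapter will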

wolrah
May 8, 2006
what?

bull3964 posted:

The number of people installing Plex and streaming full-bitrate UHD rips to their TV's smart OS borders on the statistically non-existent.

While the number of people who use Plex, Jellyfin, etc. is likely a small subset of the total userbase of these devices, it seems like a safe bet that the majority of people who set up such a system are feeding it with automated platforms like Sonarr and Radarr, which prefer full-bitrate remuxes by default. I doubt a lot of people are intentionally looking to feed full-rate rips to their TVs, but I'm sure a lot of people do it just because it's what their system downloaded.

bull3964 posted:

It doesn't really matter how much cheaper it is; it's cheaper, and spending more on the part isn't making the TV any better for the core audience. Likely 90% of them would never even plug in the network port to begin with. I know my C6 has never seen an ethernet connection in six years.

I don't think anyone disagrees that capitalism's gonna capitalism, and if it can be made cheaper without most people noticing, it will be. But that thought can coexist with thinking it's loving stupid that a TV with a four-digit price tag can have worse networking than a $35 Raspberry Pi 3 from 2016.

I mean, if you went to a computer store right now and saw a laptop with a 100 megabit ethernet port, you'd think it was a cheap piece of poo poo, right? Even at a mid-three-digit price point, it would feel cheap. Almost no one who buys a new cheap laptop is ever going to plug it into ethernet, and people who care about wired networking on laptops generally know enough to buy used if they want to save money, but it still feels worse than not having the port at all.

That logic kind of works here too: most people won't care that their TV has an ethernet port at all, but the people who do care are significantly more likely to have opinions about it not being gigabit.

codo27
Apr 21, 2008

There's no reason anything sold (gently caress, I'll be generous and say this decade) should come with less than a gigabit port. Maybe if you're buying some RCA poo poo. Not flagships.

teh_Broseph
Oct 21, 2010

THE LAST METROID IS IN
CATTIVITY. THE GALAXY
IS AT PEACE...
Lipstick Apathy
Cool cool coool, had my father-in-law help wall-mount my 65" KS8000 that's been trucking along for years. It survived another move and was working fine standing on its feet, then we got it mounted and

[image of the damaged panel]

cooool. The kids came kinda near it when it was on its feet on the floor, so maybe something happened there; or part of the mount was leaned up and fell over at some point and maybe hit it; or, once it was up and plugged in, my eyes were on the cords but out of the corner of my eye I saw it go black for a second and come back while he was pushing on the corners to tilt it. So I don't really know the cause, but google says this one's effed.

KillHour
Oct 28, 2007


Yep that TV's hosed. Sorry dude.

Looks like it's OLED time :getin:

KS
Jun 10, 2003
Outrageous Lumpwad

bull3964 posted:

Quite frankly, it's a gigantic PITA and I gave up. It's a waste of storage space, and you have to find special hardware that can even do it.

All I had to do was swap Plex for Infuse on an Apple TV.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


KS posted:

All I had to do was swap Plex for Infuse on an Apple TV.

I'm talking about ripping them.

fyallm
Feb 27, 2007



College Slice

bull3964 posted:

Quite frankly, it's a gigantic PITA and I gave up. It's a waste of storage space, and you have to find special hardware that can even do it.

I have all my Blu-rays ripped and I stream them using Kodi via my Shield, but for UHD I just pop in the disc.


Ugh, I guess it's off to buy physical discs... Since I have the LG G2, should I upgrade my Xbox One X to the Series X? Or should I wait till actual 8K arrives?

morestuff
Aug 2, 2008

You can't stop what's coming

fyallm posted:

Ugh, I guess it's off to buy physical discs... Since I have the LG G2, should I upgrade my Xbox One X to the Series X? Or should I wait till actual 8K arrives?

4K is just now really starting to hit its stride and it's been around for like a decade; I wouldn't hold out for 8K.

ihatepants
Nov 5, 2011

Let the burning of pants commence. These things drive me nuts.



I finally connected my PS5 to my new LG C1 and it's doing something that really bothers me. I've been playing MLB The Show 22 on it, and the brightness just goes crazy depending on what's happening. While I'm in the batter's box waiting for a pitch, it dims a lot, then brightens back up when I actually hit the ball or when I'm pitching/fielding. It's mainly annoying because the dimming happens while I'm hitting, which is exactly when I'd want the screen brighter.

morestuff
Aug 2, 2008

You can't stop what's coming
Try turning off the logo / static image detector, whatever it’s called. Automatic Brightness Limiter?

morestuff
Aug 2, 2008

You can't stop what's coming
Kind of bummed out. I spent years hemming and hawing over whether an OLED would be bright enough for my living room before the price on the A80J was too good to pass up. Delivery came today and even after messing with the settings for a few hours it’s unwatchably dim in everything except for the Vivid preset — and even that’s just OK.

Gonna give it a few days but leaning towards returning it at this point.

Bloodplay it again
Aug 25, 2003

Oh, Dee, you card. :-*

morestuff posted:

Kind of bummed out. I spent years hemming and hawing over whether an OLED would be bright enough for my living room before the price on the A80J was too good to pass up. Delivery came today and even after messing with the settings for a few hours it’s unwatchably dim in everything except for the Vivid preset — and even that’s just OK.

Gonna give it a few days but leaning towards returning it at this point.

The absolute brightest settings (for SDR content):

Picture mode: vivid
Advanced contrast enhancer: high
Peak luminance: high
Brightness: max
Contrast: max
Color temp: cool
Live color: high

This will get it to about 700 nits, but the picture won't be super accurate. The same settings for HDR content will get you to about 900 nits. If it's still too dim with those settings, I doubt even an A95K at three times the price would really be bright enough, and you should look at LED TVs instead.

Edit: On second review, the new QD-OLEDs do get about twice as bright, so I take that last sentence back.

On a full white screen with no other colors, WOLED gets up to about 130 nits and QD-OLED gets up to about 250 nits. The 700/900 values assume you are watching basically anything besides a hockey game.
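
To illustrate the limiter behind those numbers: peak output falls as more of the panel lights up. A toy sketch where only the two endpoints come from the figures above; the straight-line falloff is invented, and real panels follow a steeper curve:

code:

# Toy ABL (auto brightness limiter) curve for WOLED: ~700 nits on a
# small highlight, ~130 nits full-field. Linear falloff is purely
# illustrative.
def peak_nits(window_pct: float, highlight: float = 700.0,
              full_field: float = 130.0) -> float:
    """Estimated peak luminance for a white window covering
    window_pct (0-100) of the screen."""
    t = min(max(window_pct, 0.0), 100.0) / 100.0
    return highlight + (full_field - highlight) * t

for pct in (2, 10, 50, 100):
    print(f"{pct:>3}% window: ~{peak_nits(pct):.0f} nits")
#   2% window: ~689 nits
#  10% window: ~643 nits
#  50% window: ~415 nits
# 100% window: ~130 nits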

Bloodplay it again fucked around with this message at 22:07 on Jul 20, 2022

papa horny michael
Aug 18, 2009

by Pragmatica
Do people trust the rtings calibration settings pages for televisions? I've used them before on different TVs, and they seemed entirely arbitrary in their setup.

morestuff
Aug 2, 2008

You can't stop what's coming

Bloodplay it again posted:

The absolute brightest settings (for SDR content):

Picture mode: vivid
Advanced contrast enhancer: high
Peak luminance: high
Brightness: max
Contrast: max
Color temp: cool
Live color: high

This will get it to about 700 nits, but the picture won't be super accurate. The same settings for HDR content will get you to about 900 nits. If it's still too dim with those settings, I doubt even an A95K at three times the price would really be bright enough, and you should look at LED TVs instead.

Edit: On second review, the new QD-OLEDs do get about twice as bright, so I take that last sentence back.

This is what I settled on and some content looks great, though now that the sun is setting I've been able to find some settings on Custom picture modes that also look good. Probably should have held off panic posting but this was still helpful, thanks.

Wiggly Wayne DDS
Sep 11, 2010



papa horny michael posted:

Do people trust the rtings calibration settings pages for televisions? I've used them before on different TVs, and they seemed entirely arbitrary in their setup.
the settings for calibration are unique per-panel; you can use them as a rough idea of how each setting modifies the colour range, but that's about it

i looked at rtings' calibrations for my qn90a and it was mostly fine with one really bizarre choice:

quote:

Setting: RT / Mine
Picture Mode: Movie / Movie
Brightness: 7 / 7 (Preference, we happened on the same figure)
Contrast: 45 / 25
Sharpness: 0 / 5
Colour: 25 / 25
Tint (G/R): 0 / 0
Local Dimming: High / Low (I tried High but it was trying to be too clever in per-area brightness and it stood out in shifting scenes where there's latency to the video content)
Contrast Enhancer: Off / Off
Colour Tone: Warm2 / Standard (why you would ever move this off unaltered is beyond me; they'd have to royally fuck up the presets for it to make sense)
Gamma: 2.2 / 2.2
i have no idea how they ended up on warm2 as a colour tone for accuracy on their panel. messing with colour tone and claiming accuracy increases is really wild given how aggressively those are usually set, and it was pretty red on my panel. i didn't delve into the white balance settings they used, but they must have had to heavily correct it for a hair better on their colorimeter

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


That's likely. Warm2 probably got them to a better middle-of-the-road starting point where they could get more stuff into spec after white balance adjustment. For most normal-sighted people, red is a strong color, and it probably doesn't take much to tip the image into looking overly red, which also means it doesn't take much to pull that red back in the white balance.

That's really the thing with TV calibration. Brightness and contrast are easy to set with test patterns. Color and tint are also easy to set with test patterns, provided you have a blue filter (and some TVs today have a built-in color filter you can flip on to do the adjustment without anything extra).

Beyond that, though, you should just pick the white point profile you prefer and leave it alone unless you have a meter. Subtle nudges in the white balance can have all kinds of fun knock-on effects that you may not notice at all, except when you REALLY notice.

Warm1 may very well be the more color-accurate white point out of the box, but it may be shifted too far in one direction to get the average deltaE of all the colors as low as possible. So maybe the best Warm1 can do is cyan at a deltaE of 3.2 with red at 0.1, while with Warm2 red is technically worse at 0.7 (but undetectable to the eye) and cyan improves to 1.5, which would be a noticeable gain.
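
Plugging those made-up numbers in shows why the slightly "worse" preset can still win on average:

code:

# Toy average-deltaE comparison; the per-patch values are the
# hypothetical ones from this post, not measurements.
presets = {
    "Warm1": {"red": 0.1, "cyan": 3.2},
    "Warm2": {"red": 0.7, "cyan": 1.5},
}

for name, patches in presets.items():
    avg = sum(patches.values()) / len(patches)
    print(f"{name}: average deltaE = {avg:.2f}")
# Warm1: average deltaE = 1.65
# Warm2: average deltaE = 1.10

Warm2's red error stays below what the eye can detect, so the lower average makes it the better overall pick.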

You can drive yourself crazy trying to calibrate a TV even with all the proper tools.

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

bull3964 posted:

That's really the thing with TV calibration. Brightness and contrast are easy to set with test patterns. Color and tint are also easy to set with test patterns, provided you have a blue filter (and some TVs today have a built-in color filter you can flip on to do the adjustment without anything extra).

Yeah, that's what I did for SDR and it was "good enough" for me without buying tools to fine-tune the blue. As for HDR, I just trust Apple/Amazon/the Blu-ray to do Dolby Vision correctly.

Wiggly Wayne DDS
Sep 11, 2010



bull3964 posted:

You can drive yourself crazy trying to calibrate a TV even with all the proper tools.
absolutely, and there's the obvious red perception issue when calibrating by eye, but i did use a colorimeter. here's the default display values for pure white with the colour tones (cool/standard/warm1/warm2), mixed up:

[image: four colorimeter readings of pure white, in shuffled order]

can anyone guess which one warm2 is?

Sphyre
Jun 14, 2001

bull3964 posted:

That's irrelevant. The smart platforms on these TVs are not streaming full-bitrate UHD. That's not a valid use case for the TV.

glad that's cleared up then lol

filthychimp
Jan 2, 2006
Damned dirty ape

Wiggly Wayne DDS posted:

i have no idea how they ended up on warm2 as a colour tone for accuracy on their panel. messing with colour tone and claiming accuracy increases is really wild given how aggressively those are usually set, and it was pretty red on my panel. i didn't delve into the white balance settings they used, but they must have had to heavily correct it for a hair better on their colorimeter

White point is really hard to judge by eye unless you're really familiar with what to look for. By eye, I figured Warm 20 was right for my LG OLED. When I checked against my calibrated IPS display, I could plainly see that what I'd picked was completely washing out skin tones. Even the max setting, Warm 50, is still too bluish.

And now I'm going to go down the TV calibration rabbit hole. At least I already own a colorimeter...

filthychimp fucked around with this message at 04:59 on Jul 21, 2022

OldSenileGuy
Mar 13, 2001
I started looking into getting a bias lighting setup for my TV, and very quickly fell down a rabbit hole into the world of Hue Sync Boxes, Hue Light Bars, and Hue Play Gradient Lightstrips.

Anyone use any of these? Do they work well? I’m almost certainly going to pick up either the light bars or the light strip just to have some kind of bias light. But I’m wondering if the sync box actually works well or if it’s janky and better in concept than execution.

Also, if anyone does have the light bars: should I be worried about mounting them on the back of my LG OLED? I imagine they're pretty lightweight, but I know the screen is fragile. Costco has a three-pack of the bars that seems like it'd work well (left, right, center top), but is that too much weight to attach to the TV?

KillHour
Oct 28, 2007


OldSenileGuy posted:

I started looking into getting a bias lighting setup for my TV, and very quickly fell down a rabbit hole into the world of Hue Sync Boxes, Hue Light Bars, and Hue Play Gradient Lightstrips.

Anyone use any of these? Do they work well? I’m almost certainly going to pick up either the light bars or the light strip just to have some kind of bias light. But I’m wondering if the sync box actually works well or if it’s janky and better in concept than execution.

Also, if anyone does have the light bars: should I be worried about mounting them on the back of my LG OLED? I imagine they're pretty lightweight, but I know the screen is fragile. Costco has a three-pack of the bars that seems like it'd work well (left, right, center top), but is that too much weight to attach to the TV?

Reviews I've read online say they're a massive pain in the rear end. Also, the Sync Box doesn't support HDMI 2.1 and supposedly it fucks with CEC real bad.

https://www.reviewgeek.com/96220/philips-hue-play-gradient-lights-and-sync-box-too-much-money-even-more-hassle/

Between the Govee stuff having a bewildering app with features unexplained and strewn all over the place, the LIFX stuff being a horrific mess to configure (and, IME, having a surprisingly high failure rate), and the Hue stuff being wildly expensive and requiring all sorts of external hardware, I'm just generally underwhelmed by smart lighting, even though I have a house full of it.

KillHour fucked around with this message at 07:39 on Jul 22, 2022

Enos Cabell
Nov 3, 2004


Colored bias lighting is a gimmick; you want proper 6500K bias lighting, which can be done pretty easily with one of these kits:

https://www.biaslighting.com/pages/about-bias-lighting

kliras
Mar 27, 2021
the ones that sync tend to have pretty horrible latency. i think there are some hacky raspberry pi solutions, but i mostly only care about background lighting for monitors to reduce eye strain. shouldn't be necessary for tvs

Mister Facetious
Apr 21, 2007

I think I died and woke up in L.A.,
I don't know how I wound up in this place...

:canada:

kliras posted:

the ones that sync tend to have pretty horrible latency.

I watched a review of some of the Govee ones, and it was so apparent how delayed they were compared to the screen.

OldSenileGuy
Mar 13, 2001
drat, that's disappointing. Like, yeah it's a gimmick, but it looks like a fun gimmick!

I had found some reviews saying the sync box wasn't great, but they were all pretty old (from 2020), so I had hoped things had gotten better. And they (supposedly) HAVE gotten better: in the 2020 reviews, the box didn't support HDR or Dolby Vision or possibly even UHD, but it does now.

I may pick up the light bars anyway because they're on sale at Costco, and just use them as a static 6500K light until the sync box gets up to snuff.


joedevola
Sep 11, 2004

worst song, played on ugliest guitar
Is it normal for a new LCD panel to show ghosting when an image has only been on screen for a few minutes?

Got a new LG 4K LCD and it's really loving distracting. The afterimage goes away eventually, but the 15-year-old HD set I replaced never had this problem.
