wolrah
May 8, 2006
what?


Are any of the cheap (<=$750 for >=42") 4K HDR options worth bothering with or are they all "HDR Ready" as in able to take the signal but not really able to make meaningful use of it? Primary use would be streaming Amazon/Netflix in my office but it also needs to be usable as a PC monitor.

Obviously plenty of good options if I'm not looking at HDR.

wolrah fucked around with this message at 19:01 on Apr 8, 2017

wolrah
May 8, 2006
what?


GreenNight posted:

poo poo. I meant 2.1

I don't believe any production hardware supports HDMI 2.1 at the moment. It's a huge change from previous versions, so existing hardware is much less likely to gain support through a firmware update than it was with earlier revisions. All the previous revisions were basically clock speed bumps to increase bandwidth plus metadata definitions for new stream formats, whereas 2.1 completely changes the mode of operation.

I have a hunch that HDMI 2.1 adaptive refresh is related to Freesync-over-HDMI, so I won't be shocked if it turns out that modern AMD hardware can be updated to support it. I believe nVidia has a G-Sync over HDMI solution as well, so if it operates similarly the same may apply, but at this point we're deep into assumption land.

wolrah
May 8, 2006
what?


GreenNight posted:

No one ever said "I wish I got a smaller TV".

Correct at the sizes we're talking about here, but not always.

If you have a particularly short viewing distance it's possible to go too big with some of the larger screens available today (75+"), or of course with a projector, unless you're one of those weirdos who likes the front row of the movie theater. I've definitely had situations where I've run my projectors with the zoom backed off a bit because the screen was just too large for where the couch fit in the room.

If I have to move my head to change focus from one side of the screen to another it's too big or too close. Just rough estimating the angles I'd say about 120 degrees of my horizontal FoV is the limit.
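For anyone who wants to rough out that angle instead of eyeballing it like I did, it's just trigonometry. A quick sketch (the 98" screen and 10 ft distance are just my setup as an example, and it assumes a flat 16:9 screen):

```python
import math

def h_fov_deg(screen_diag_in, distance_in, aspect=(16, 9)):
    """Horizontal angle (degrees) a flat screen subtends at the viewer."""
    w, h = aspect
    # width of a 16:9 panel from its diagonal
    width = screen_diag_in * w / math.hypot(w, h)
    return math.degrees(2 * math.atan((width / 2) / distance_in))

# my 98" screen viewed from 10 feet (120"): roughly 39 degrees
print(round(h_fov_deg(98, 120), 1))
```

So even my setup is nowhere near the 120 degree limit; you'd need to nearly halve the distance or double the screen before head-turning becomes a problem.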

wolrah
May 8, 2006
what?


Massdrop has the LG C6 at $1600 with three more people needing to buy in.

https://www.massdrop.com/buy/lg-55-c6-curved-oled-4k-hdr-smart-tv

I've never bought anything from Massdrop so I don't know what the timeframes are like on shipping and such, but they seem to be pretty well respected.

wolrah
May 8, 2006
what?


Microcenter has now beaten Massdrop: the one near me has both the B6 and C6 in stock at $1500 for the 55". I'm really tempted even though I don't need a new TV right now.

wolrah
May 8, 2006
what?


I currently have a 1080p DLP 3D projector shooting 98" 16:9. As much as I want the OLED 4K goodness that's a lot of size to be giving up.

I need to figure out a solution to raise my screen so I can have both...

edit: Also I have a viewing distance of 10 feet to the center seat and regularly used spots up to 40 degrees off center. A curved screen would range between pointless and harmful depending on where in the room you're sitting. If I buy one it'll almost definitely be the B6 unless there's something the C6 does enough better to override the curve.

wolrah fucked around with this message at 03:21 on May 27, 2017

wolrah
May 8, 2006
what?


the good fax machine posted:

I figured you meant 1499 but mine has it for 1599. Still going to get it.

Yeah, it was $1499 at my store. The web site lists $2499 but says "see store for current price".

wolrah
May 8, 2006
what?


Enourmo posted:

Hey thread, I've got a max width of 40" and a budget of about $250. Should I spring for one of the like two options amazon has under their OLED filter, or just get a standard LED from one of the usual suspects?

There are no OLED TVs on the market within that range, just a lot of listings written either by people who don't understand that OLED displays and LED-backlit LCDs are not the same thing, or by people trying to fool buyers in the former category.

Your width limit allows for a 42" diagonal, maybe a 46" thin-bezel model just barely if there's a bit of fudge factor in that number. Your price on the other hand is pretty restrictive. TVs are really cheap these days and there are a surprising number of options below $250, but you're not going to get anything amazing.
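If you want to sanity-check the width vs. diagonal math yourself (this ignores the bezel entirely, which is why I'd knock it down to 42" or so in practice):

```python
import math

def max_diagonal(width_limit_in, aspect=(16, 9)):
    """Largest 16:9 panel diagonal that fits a given width (bezel ignored)."""
    w, h = aspect
    return width_limit_in * math.hypot(w, h) / w

# a 40" width cap allows a bare panel up to about 45.9" diagonal
print(round(max_diagonal(40), 1))
```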

As far as I'm aware the LG B6/C6 at 55" and $1500ish if you're willing to hunt for sales are the smallest and cheapest consumer OLED televisions currently in production.

wolrah
May 8, 2006
what?


bull3964 posted:

The same TVs that everyone has been recommending, because there's no such thing as a good quality non-smart TV.

Yep, the correct answer here is to just ignore the smart features on the TV if you're concerned about them. Don't connect the TV to the internet and it may as well not have those things. I don't know if anything ever actually implemented the HDMI ethernet channel but if you're really paranoid make sure the cables connecting to the TV don't support it.

At that point it's functionally a dumb TV. Neither vulnerable software nor intentional privacy invasions can access or be accessed by the internet. Use it with your favorite boxes and sticks to access your services of choice.

wolrah
May 8, 2006
what?


So it's been made pretty clear that image retention/burn in isn't really an issue on modern OLEDs for normal TV/video game type situations. What about a worst case scenario like desktop use?

My main monitor just started acting up, and the idea of picking up a C6 next time I see them around $1500 is sticking in the back of my head: replacing my current two-over-one triple monitor setup with a single huge panel.

I work from home so it'll be displaying a static-ish desktop and set of applications for potentially 8+ hours straight. My old Dell 2005FPW used to get pretty bad IR on certain things like the Start button and Chrome extension icons that basically don't move. None of my TN panels seem to have an issue with it.

wolrah
May 8, 2006
what?


EAT FASTER!!!!!! posted:

Here's the real question I know the Steam Link box supports the Bone controller, but I wonder whether my TV can sense that and pick up on it too... My gaming PC is downstairs across the house and is on wired ethernet, but my TV is on wifi.

SO MANY QUESTIONS.

In theory the current version of the 'bone controller should work on anything that supports Bluetooth HID gamepads. The older versions only have the proprietary WiFi-based wireless connectivity requiring a dongle which I don't believe is supported on anything but Win10 yet.

WiFi to the TV might be a problem if it's not at least 5GHz 802.11n. My Steam Link works mostly OK on 802.11ac.

wolrah
May 8, 2006
what?


I like watching movies filling my field of view like I'm in a theater.

1080p does definitely look pretty lovely at 110" though. I'm hoping in the next few years 4K DLPs start dropping in price enough that I can afford to upgrade the projector.

wolrah
May 8, 2006
what?


It's also possible for a TV to have shipped with HDR-capable hardware, but with a critical bug in the firmware's HDR support discovered before shipping that got fixed in a later update.

Not saying that's the case here, but it's plausible.

wolrah
May 8, 2006
what?


But what sources exist that only send HDR?

HDMI has automatic negotiation of supported modes. The TV tells the device it doesn't support HDR, the device doesn't send a HDR signal, everything's happy.

There is no technically meaningful reason to "support" HDR without actually being capable of making use of it.

It's a "feature" that exists only to trick people who don't know any better into thinking they're actually buying an HDR-capable set.

wolrah
May 8, 2006
what?


Exactly. Since there's no rational reason for a source device to refuse to provide a non-HDR signal the only purpose for this "feature" other than marketing trickery would be if some lovely source existed that would only send a HDR signal regardless of what the sink tells it.

Considering such a source wouldn't work on the vast majority of HDMI sinks that have ever been produced, I'd be willing to bet against it even existing at this point. Maybe in the future there might be some boutique HDR-only streaming service that sells a dedicated appliance to people with more money than brains which intentionally refuses to downscale for *reasons*, but not yet.

wolrah
May 8, 2006
what?


This is why adaptive refresh rates will be nice, since 120Hz inputs have never really seemed to catch on.

Sending 24 over 60 with 3:2 pulldown and then having the TV unfuck it is a stupid hack in the first place, but because changing to native 24 signaling causes a resync that interrupts things for a few seconds, people don't like it. 120Hz basically bodges around it by allowing 24/30/60 to all be cleanly multiplied in the source, but adaptive refresh means 25/50 will be nice and smooth too.
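The multiples work out like this: 120 is the least common multiple of 24/30/60, but once you fold in 25/50 there's no sane fixed rate that covers everything (you'd need 600Hz), which is exactly why adaptive refresh is the real fix:

```python
from math import lcm  # Python 3.9+

us_rates = [24, 30, 60]       # film + NTSC-derived rates
pal_rates = [24, 25, 50]      # film + 50Hz-market rates

print(lcm(*us_rates))                 # 120: one panel rate covers all three
print(lcm(*(us_rates + pal_rates)))   # 600: no practical fixed rate covers both worlds
```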

Speaking of that, is 100Hz a thing for the 50Hz markets in the same way 120 is here?

wolrah
May 8, 2006
what?


priznat posted:

Would it be straightforward to get an HDHomeRun on a Vizio? No idea what kind of apps they support (all streamed from external devices?)

HDHomeRun can do UPnP DLNA streaming, and that works on pretty much everything that supports network streaming of any kind.

There's also native support in Kodi and Plex, apps for Android and iOS, etc.. If the TV doesn't work with it natively I can basically guarantee you already have some other device that'll work fine.

wolrah
May 8, 2006
what?


uwaeve posted:

Definitely some way to do this, I've recently set one up. Phoneposting but from memory you want to set up an activity, not a device. Like add all your devices and get them working.

Then you'll add an activity or something where you can associate different functionality across devices. You add the devices to the activity and the remote figures out (but you can also customize) the mapping. Like certain buttons map to something like an AppleTV, but the volume and mute buttons map to my AV receiver. Poke around and let me know if this doesn't seem right.

This is correct. Device entries are for one device only, activities are what you actually want to be using most of the time in an AV setup.

edit: Derp, that's what I get for leaving a reply window open.

wolrah
May 8, 2006
what?


bull3964 posted:

You can also just use the RealD glasses from theaters. Just don't recycle them the next time you go.

I thought the passive style did things with polarization of light and such; how can that actually work on a normal non-projection display?

My projector uses the active glasses, which is kind of annoying because they have to be charged before use and are expensive enough that keeping more than four around isn't practical, so I rarely watch 3D anything as a result.

wolrah
May 8, 2006
what?


bull3964 posted:

The standards are set, there is no more standardization to be done.
Mandatory xkcd



IIRC the 2017 LGs support HLG, does anyone else?

wolrah
May 8, 2006
what?


Don Lapre posted:

Digital TV uses the same frequencies analog did. You don't need an antenna made for hd. You just need the proper antenna for your local be it tuned for uhf, vhf, or both. And aim it properly.

Yes and no. The overall TV bands haven't changed, but a lot of stations did move frequencies, so they aren't actually broadcasting on the channel number they display. Lots of stations moved from VHF to UHF and, to a lesser extent, vice versa. What was a proper antenna for your area might not be anymore, but a lot of people don't realize that because the numbers stayed the same.

Here in the Cleveland area for example out of our big four network affiliates only one is actually on the frequency their historical channel number points to. NBC "3" is actually on 17. ABC "5" is actually 15. CBS "19" is actually 10. Only the Fox affiliate on 8 actually kept their frequency when they turned off analog.

wolrah
May 8, 2006
what?


Don Lapre posted:

Still has nothing to do with being a "digital" or "hd" antenna. Nothing about that contradicts my statement.

"You just need the proper antenna for your local be it tuned for uhf, vhf, or both. And aim it properly."

"Digital TV uses the same frequencies analog did." is true on an overall sense but is often untrue in a specific channel sense. That's why I said yes and no.

wolrah
May 8, 2006
what?


Bought a 49" TCL S405 on a whim because Best Buy had it on sale for $360.

1. It won't do HDMI 2.0 unless you explicitly tell it to, but it will still claim 4K60 capability in 1.4 mode, which just displays purple and green garbage if you attempt to use it.
2. It really doesn't like DisplayPort -> HDMI converters.
3. Text has weird pixel artifacts even when I confirmed it was running in 4:4:4 mode.
4. Solid colors like the forums' grey backgrounds had a checkerboard dithering look.
5. HDR mode from a PC is broken as hell, but I hear that's pretty much across the board.
6. Even though 49" 4K is basically the same DPI as my usual 24" 1080p monitor, somehow the pixels stood out badly.
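On point 6, the pixel density really is nearly identical, which is what made it so weird:

```python
import math

def ppi(diag_in, res=(3840, 2160)):
    """Pixels per inch for a given diagonal and resolution."""
    return math.hypot(*res) / diag_in

print(round(ppi(49)))                # ~90 ppi for a 49" 4K panel
print(round(ppi(24, (1920, 1080))))  # ~92 ppi for a 24" 1080p monitor
```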

Contrary to RTings' decent rating I think this is a pretty terrible PC monitor. Also I think any 49" would be too large for reasonable desktop use, at least without an aggressive curve.

The ability to send audio from Netflix or Amazon content to my phone and thus headphones is pretty awesome and has earned this thing a second chance as a bedroom TV, but if it doesn't impress me there it'll be going back to the store.

wolrah
May 8, 2006
what?


EL BROMANCE posted:

It sounds like your requirements outweigh a TV in the $300 range, to be honest. When my budget is around that for something bigger than 40" or so, my needs are basically 'can it display an image that doesn't make my eyes bleed and fill me with hatred when I look at it'.

I'm not really complaining, just throwing out my thoughts. It was just a random buy because I had nothing better to do and felt like trying my luck. The decent rating on RTings for PC use caught me by surprise at the price so I figured why not give it a shot.

It's honestly not bad at all as a TV, just definitely not a good choice for PC use.

wolrah
May 8, 2006
what?


Josh Lyman posted:

A couple noob questions about the last 10 years of HDTVs.

-Was DLP really all that?
-Was plasma really better than LCD?
-I remember when LED backlights arrived, the idea was that LCDs wouldn't have to be edge backlit with CCFLs anymore, but it seems like LED backlighting "behind" the panel never really took off.
-Are OLEDs going to happen anytime soon or is it going to be another 5 years before I can buy a 60" model for under $1000?

1. Yes. Better color and much faster response times than an LCD projector could ever provide. The only downside is rainbowing, which can be minimized with a well-designed color wheel or eliminated with a really expensive 3DLP light engine.
2. Definitely. OLED is basically plasma 2.0, same advantages (better colors, pure blacks) but less prone to burn-in.
3. Most nicer LCDs use full-array backlighting rather than edge lighting. If it has zoned dimming it has a full-array backlight.
4. Cheap OLEDs have been five years away for about 10 years now, but they're working their way down. If someone other than LG gets into the market seriously, prices will start dropping fast.

wolrah
May 8, 2006
what?


Josh Lyman posted:

Is DLP still better than a 2017 LCD? I ask because one of my neighbors is selling a 73". Or should I just wait for OLED?

Rear projection is dead for a reason. I was referring to front projection when I said DLP is better than LCD.

Definitely do not buy a rear projection TV for pretty much any meaningful amount of money. If you have a lot of space in a game room or something and someone offers you one for near free, gently caress it why not, but if you're comparing it against a modern flat screen it's not even in the same ballpark.

wolrah fucked around with this message at 15:39 on Sep 16, 2017

wolrah
May 8, 2006
what?


Number_6 posted:

I'm pretty tired of standards constantly changing in a transparent marketing effort to drive sales, especially while critical aspects of some display technologies are left "unfixed" generation after generation. 4k isn't that important to me; most of my sources are 1080p or worse. HDR sounds nice but is only marginally important to me. LED TV makers should be focusing on core tech issues like panel uniformity, black level, and viewing angle, all of which are much more important to me than 4k support or HDR or smart features.

Those things are nerdy and require knowledgeable salespeople to sell to people who don't know things.

HDR and 4K can be easily demoed with a video loop and a generic blueshirt can point at the detailed, colorful picture.


Also it's not like those things are unsolved; there are plenty of TVs that get them right too, it's just that they're also loaded with other features, because the lower end of the market doesn't care if the backlight bleeds.


Anyways, a bit over a decade ago your same rant would have applied to HDTV. Most of your sources were 480i (TV) or 480p (DVD) with the OTA stations still mostly broadcasting upscaled 480i if they had a digital signal going at all yet. Technology moves forward, and those of us who have the current thing often want something better once we get used to it.

Also the standard screen sizes have grown. When DVD was high tech my 35" CRT was still considered a "big screen" and pretty much anything larger required projection. These days 32" is pretty much the entry level and the mainstream size is somewhere around 55". 70+ inch direct-view screens have gone from specialty equipment that costs more than a car to something you can choose from a half dozen options at Costco. 1080p just doesn't hold up on a screen that's large by modern standards.

wolrah fucked around with this message at 13:16 on Sep 22, 2017

wolrah
May 8, 2006
what?


You are correct about 4K120 but there are definitely a few TVs out there that will take a 1080p120 input and handle it correctly.

http://www.rtings.com/tv/tests/inputs/input-lag

Sort by the 1080p @ 120Hz column and you'll see the 2017 Vizio P series, the LG C7 (not sure if that means just the C7 or all 2017 LG OLEDs), and a bunch of Sonys.

There's also an older model Sony which was known to support 1080p120. I think it was the X850D maybe?

wolrah
May 8, 2006
what?


bull3964 posted:

Ok then, still though, there's not much practical use for it. PC gaming is the only place that comes to mind and without any sort of adaptive refresh, it's likely not going to be a pleasant experience.

Yeah, obviously the primary use would be PC gaming on a TV. It'd probably also be possible for the 4K console variants to enable 1080p120 support in an update, but I'd put pretty slim odds on that happening.

I disagree that it'd be unpleasant without adaptive refresh though, mostly because I've been gaming on a 1080p144 panel running at 1080p120 (144 is not ideal for watching TV) for a few years now without adaptive refresh and it works great.

Adaptive refresh would of course be nice, I'll never argue against it, but it's definitely not necessary to enjoy the benefits of >60Hz refresh rates for gaming. Tearing isn't the worst thing in the world, especially at higher rates.

wolrah
May 8, 2006
what?


BonoMan posted:

Philips ambilight changes to match the screen content though. Much different than self installed kits.

You can DIY this too. Here's a tutorial using a Raspberry Pi: https://www.reddit.com/r/raspberry_pi/comments/6892nf/diy_ambilight_tv_guide_ws2812neopixels/

Here's one of that guy's videos with 266 LEDs:
https://www.youtube.com/watch?v=Wp0LAw1Vgys

wolrah
May 8, 2006
what?


pwn posted:

Maybe it's just an extreme example, but that's way too obnoxious for my tastes.

It's very configurable, and that clip is basically the colorful showoff one.

Here's the same setup with normal TV content:

https://i.imgur.com/8Y7HLdZ.gifv

You can also make the effect "softer" and have it blend the colors together more if you want, or even smooth multiple frames.

wolrah
May 8, 2006
what?


Holy poo poo someone else who has a CED player?

This is only the third time I've seen someone even mention them existing online, and one of the other times was Techmoan who's pretty much all about obscure old stuff like that.

My grandma found it when cleaning out her sister's attic and the way she described it I thought it was a laserdisc player. I was kind of disappointed when I was able to pick it up but it's still pretty cool that a vinyl-based video format actually works.

wolrah
May 8, 2006
what?


Waltzing Along posted:

A dedicated player will take up less space and use less power than a bone. Also, it will boot up and shut down faster and will most likely have an easier remote with a better features set.

No argument on the other parts, but who cares about what remote a home theater device comes with? Buy a nice universal once and never deal with the garbage most vendors include again.

I got 11 years out of my Harmony 880 before the battery wore out enough that it started to get annoying. It still works, but I was able to pick up a Harmony 700 to replace it for $50 on sale so I figured why not.

wolrah
May 8, 2006
what?


Curved TVs are mostly pointless because you generally sit far enough away from them that the curve really doesn't matter all that much. It's when the screen takes up a significant part of your horizontal FOV that curved screens actually start to matter. 34" ultrawide PC monitors for example basically have to be curved otherwise they kinda suck to use at a standard desktop distance. They also typically have much greater curves than a TV. A 65" TV that's 3-5 times further from your eyes isn't going to see nearly the same benefits while still dealing with the potential problems (particularly backlight uniformity issues).

wolrah
May 8, 2006
what?


thechosenone posted:

What about this?:

https://www.google.com/shopping/pro...aAZgQ8wIIpwIwAQ

There are refurbished ones from TheExpressOutlet at around $330. Is getting one of those a bad idea?

For those not wanting to follow a link, it's a TCL 4K smart LED TV (120Hz). I figured it was the suggested brand, but cheaper.

I bought one of the low-end TCLs on a whim a month or two back. It was fine as a TV. Not great at anything, but what do you expect for the $400 I paid for it? Roku was pretty neat, especially the ability to stream audio from the built in apps to my phone. That was absolutely wonderful for watching TV at night.

It was a terrible computer monitor though, text looked like total poo poo no matter what settings I played with. I returned it because that was my main purpose, a large monitor that also worked as a TV.

wolrah
May 8, 2006
what?


Don Lapre posted:

It will take a LONG time for someone to catch up with LG. The only real contender would be Samsung and they don't seem interested at all.

This is something I haven't been able to figure out. Samsung is seemingly the only vendor capable of producing an OLED screen for a phone that isn't a total pile of poo poo, yet their TV division chooses to market the poo poo out of yet another LCD variant rather than capitalize on this.

And of course somehow the reverse is true for LG, their OLED TVs set the bar in a lot of ways while their OLED phones are underwhelming and problematic.

wolrah
May 8, 2006
what?


WhiteHowler posted:

From what I can tell, the only difference between the C7P and B7A is the Dolby Atmos support/speakers and a slightly different stand and bezel on the former?

The panels are exactly the same on all of the LG OLEDs of a given size/model year. If you're using them with a separate speaker system the only differences are in the physical appearance of the unit.

wolrah
May 8, 2006
what?


Regarding digital audio output, what it comes down to is that S/PDIF only has the raw bandwidth for stereo PCM. 5.1 requires that the signal be re-encoded and compressed, which is where Dolby AC3 and DTS come into play.

If the source signal is already in AC3 or DTS the TV can pass that signal straight through to the S/PDIF port. Not all do, some still strip it down to plain stereo, but most good TVs should support passthrough.

If you've been into big screen PC gaming for a while you may remember that before HDMI there was a time when we had to use analog connections to get 5.1/7.1 out of our games, or later buy certain very specific sound cards which included Dolby Digital Live or DTS Connect encoders. The nVidia nForce chipset was famous partly for having this ability built in. This is for the same reason: movies had a pre-encoded soundtrack which could just be sent out the S/PDIF port, but games were generating the audio in real time.

HDMI has shitloads of bandwidth compared to S/PDIF. It supports raw uncompressed 5.1, 7.1, and beyond. A lot of source devices will now decode any advanced surround format to plain PCM before sending it on, because that eliminates compatibility issues with bitstreaming.
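The raw numbers make the S/PDIF limit obvious. Assuming 48kHz/24-bit PCM (and noting that consumer S/PDIF's frame bandwidth is only around 3 Mbit/s at 48kHz), stereo fits but uncompressed 5.1 doesn't even come close, while an AC-3 5.1 track at 384-640 kbit/s fits easily:

```python
def pcm_bitrate_mbps(channels, sample_rate=48_000, bit_depth=24):
    """Raw PCM audio bitrate in Mbit/s."""
    return channels * sample_rate * bit_depth / 1e6

print(pcm_bitrate_mbps(2))  # 2.304 Mbit/s: stereo, fits S/PDIF
print(pcm_bitrate_mbps(6))  # 6.912 Mbit/s: uncompressed 5.1, way over the link's capacity
```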

What this means though is that if your source is sending uncompressed 5.1 to your TV, the TV would need to have an encoder to send that out the S/PDIF port. That's extra cost for something only a small group of users would even potentially encounter, much less care about.

If you don't have any other choice and your TV does support passthrough, you'll just want to set up all the rest of your devices to bitstream the AC3 and/or DTS soundtracks rather than decoding on the device. Note that this will generally lose you menu sounds and such when playing movies, or in some annoying cases it can result in mode switches to stereo when something wants to mix audio.

wolrah
May 8, 2006
what?


bull3964 posted:

Unless your input is DD True HD (which you would only get with blu-ray), there's no encoder needed. DD+ contains a core AC-3 track that's backward compatible with any device that does 5.1. So, no encoders are necessary.

I was referring to how most devices just send PCM 5.1/7.1/whatever over HDMI rather than passing the Dolby or DTS source signal. If the TV's receiving PCM it doesn't matter what compatibility the original track may have had.

quote:

The output issue is purely how they have it configured. LG's OLEDs can pass 5.1 to the optical under the correct circumstances, but it's highly dependent on the device you connect to it, so it may as well not work. Basically, the HDMI handshake presents it as 2.0 capable because it's reporting the internal speakers. If you can't force an audio output mode on your source, you only get 2.0. If you can, you can get 5.1.

For example, I can only get 2.0 output from a Chromecast connected directly to the TV, even with ARC, because I can't configure audio on the Chromecast, so it outputs 2 channel PCM. With a Roku, which I can force to output DD+ instead of auto, I can get 5.1 connected directly to the TV.

Hmm, that is an interesting problem. Now I'm kind of curious what my TVs claim support for.

edit: Just pulled the EDID from my 52LG60 via an attached Raspberry Pi and interestingly it is advertising only AC-3, not PCM of any sort. My Marantz SR7001 has 2 channel PCM, 8 channel PCM, DD, DTS, and one-bit (DVD-A IIRC?).
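If anyone else wants to poke at their own EDID, the audio capabilities live in 3-byte Short Audio Descriptors in the CEA-861 extension block. A rough decoder for a single descriptor is below; the format table is abbreviated, and the 0x15 0x07 0x50 example bytes are a typical AC-3 5.1 entry, not literally what my LG reports:

```python
# Abbreviated CEA-861 audio format codes (bits 6-3 of the first SAD byte)
FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "Dolby Digital+", 11: "DTS-HD"}
RATES_KHZ = [32, 44.1, 48, 88.2, 96, 176.4, 192]  # sample-rate bitmask, bit 0 first

def parse_sad(b0, b1, b2):
    """Decode one 3-byte CEA-861 Short Audio Descriptor from an EDID."""
    fmt = FORMATS.get((b0 >> 3) & 0x0F, "reserved/other")
    channels = (b0 & 0x07) + 1          # low 3 bits = max channels - 1
    rates = [r for i, r in enumerate(RATES_KHZ) if b1 & (1 << i)]
    # b2 is bit depths for LPCM, or max bitrate / 8 kbit/s for compressed formats;
    # ignored here for simplicity
    return fmt, channels, rates

# AC-3, 6 channels, 32/44.1/48 kHz (b2 = 0x50 would mean 640 kbit/s max)
print(parse_sad(0x15, 0x07, 0x50))
```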

GreenNight posted:

On the TV? The receiver? There are no CEC settings on the 4K player. I have 5 other devices hooked up and nothing else turns on.

If only the player is misbehaving that's usually where you'd want to stop it. Some vendors rebrand CEC with their own naming, so check the manual for anything that implies remote control commands being passed over the HDMI wire.

If you can't change its settings there, see if your receiver has any options to block CEC traffic. Most don't but it's worth a shot. If not then you'll have to disable it at whichever device is generating the signal that causes the 4K player to decide it's time to wake up.

wolrah fucked around with this message at 21:48 on Nov 26, 2017


wolrah
May 8, 2006
what?


bull3964 posted:

That's really not the norm though except for game consoles. Pretty much everything else bitstreams DD by default.

That may be true. My device lineup is made up of game consoles, PCs, RPis (pretty much PCs from this standpoint), and streaming sticks.

That said, at this point I think that lineup might be the majority, with standalone disc players and especially cable boxes on the decline.

wolrah fucked around with this message at 23:07 on Nov 26, 2017
