bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


I don't think even the current ones are rendered in 4k, it's just not worth it.

The majority of live action films still don't have a 4K DI, and even if they do you can bet their VFX were rendered in 2K.

Films with a wider aspect ratio still get a boost in resolution on UHD even with a 2K DI, since you don't lose lines of resolution to the black bars. Really though, WCG and HDR are the improvements here.
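(A quick sketch of the letterbox math, using a nominal 2.39:1 scope ratio; the exact figures vary a little with the actual framing, but the point stands:)

```python
# Rough sketch: active picture lines for a 2.39:1 ("scope") film
# letterboxed inside 16:9 containers at Blu-ray and UHD resolutions.
# The rest of each container's height is spent on the black bars.

def active_lines(container_width, container_height, film_aspect=2.39):
    """Lines of the container actually used by the film image."""
    return min(container_height, round(container_width / film_aspect))

bd = active_lines(1920, 1080)   # Blu-ray
uhd = active_lines(3840, 2160)  # UHD Blu-ray

print(bd, uhd)  # 803 vs 1607 lines: UHD doubles the lines even from a 2K DI
```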


Rastor
Jun 2, 2001

Yeah all CGI movies are 2K, their render farms are budgeted for that and are already constantly maxed out without quadrupling the pixel count.

But 2K pure CGI does upscale nicely and such discs tend to get good reviews for picture quality.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
One of these days the Realtek DisplayPort to HDMI 2.1 adapter is going to come out... one of these days...

*mummifies in office chair*

BonoMan
Feb 20, 2002

Jade Ear Joe

Rastor posted:

Yeah all CGI movies are 2K, their render farms are budgeted for that and are already constantly maxed out without quadrupling the pixel count.

But 2K pure CGI does upscale nicely and such discs tend to get good reviews for picture quality.

Edit: I'm going to effort post on this more when I'm not on the road

BonoMan fucked around with this message at 18:48 on Sep 9, 2019

American McGay
Feb 28, 2010

by sebmojo

BonoMan posted:

Edit: I'm going to effort post on this more when I'm not on the road
:munch:

BonoMan
Feb 20, 2002

Jade Ear Joe

Lol nothing exciting. I just made a comment then went to clarify then my two month old woke up and started crying in the car and I just didn't have it in me to keep going.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


LG is adding G-Sync to their 2019 OLEDs.

https://www.engadget.com/2019/09/09/g-sync-lg-oled/

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Wait what the flying gently caress? First of all I just jizzed myself, but isn't GSYNC through DP only? The C9 doesn't have a DP, or at least mine doesn't, maybe the E9 has it or something but I haven't heard anything about that.

I'm basically sitting in disbelief right now. This is exactly what I wanted, and the perfect holdover until true HDMI 2.1. Unbelievable.

Edit: Am I reading this right? It uses HDMI 2.1 VRR and will be compatible with the Nvidia 2000 series? It's just so serendipitous. I just happen to have all of this poo poo. I've wondered at times if getting a 2080 was worth it but goddamn does this put an end to that question.

Taima fucked around with this message at 04:45 on Sep 10, 2019

Ultimate Mango
Jan 18, 2005

Taima posted:

Wait what the flying gently caress? First of all I just jizzed myself, but isn't GSYNC through DP only? The C9 doesn't have a DP, or at least mine doesn't, maybe the E9 has it or something but I haven't heard anything about that.

I'm basically sitting in disbelief right now. This is exactly what I wanted, and the perfect holdover until true HDMI 2.1. Unbelievable.

The post and press release heavily imply that it’s HDMI, but maybe only through a limited series of nvidia cards.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Darn, sorry, I read further and edited before your reply. Looks like it's HDMI 2.1 VRR on 2000 series GPUs!

Edit: So this makes the C9 a 13ms input lag OLED HDMI 2.1 Gsync monitor. Good god. Ok I'm going to stop posting but fuuuuuuuck

Marketing-wise this seems like a great match. LG is facing sagging OLED sales this year, and the 2000 series is also looking for more value-add to spur sales, so this is a nice crossover. It also raises a lot of questions about what other HDMI 2.1 features the 2000 series can support (if any). It's kind of weird because afaik they've never even hinted that this was possible with their current cards.

Taima fucked around with this message at 05:33 on Sep 10, 2019

codo27
Apr 21, 2008

I don't want to see that because then I'd have to have it.

Elysium
Aug 21, 2003
It is by will alone I set my mind in motion.
My father in law keeps changing my tv settings from Movie mode with all the bullshit turned off to full dynamic soap opera eye burning madness. Am I legally allowed to murder him or should a simple maiming suffice?

Ultimate Mango
Jan 18, 2005

Elysium posted:

My father in law keeps changing my tv settings from Movie mode with all the bullshit turned off to full dynamic soap opera eye burning madness. Am I legally allowed to murder him or should a simple maiming suffice?

Public shaming or stockade is entirely warranted. The worst of offenses deserves a harsh punishment.

American McGay
Feb 28, 2010

by sebmojo
Hide the TV remote and only have the cable box remote available for public use.

Bushido Brown
Mar 30, 2011

I bought a C9 through Greentoe earlier on a whim.

It's much better than I expected. I already had a 4K TV, and only moved from 60" to 65", but holy cow. It's just stunning.

WithoutTheFezOn
Aug 28, 2005
Oh no
Help me out, goons. It’s been a long, long time since I bought a TV.

I’m going to buy a 75” TV. I want to spend $1700 tops (before tax). Obviously I’m not concerned about OLED. I don’t care about smart functions or ports, everything (cable box, Apple TV, maybe the PS4) will be piped through either a Denon or Yamaha receiver. Black level doesn’t concern me (as long as it’s not totally crap) as much as brightness — this will be in a pretty bright room. Some of the seating will be at what I estimate is 30-35 degrees off center.

So what should I buy?

Related side question, what is the LG Nanocell stuff?

American McGay
Feb 28, 2010

by sebmojo
You should go to a store and look at a 65" C9 and see if you don't actually care about those things that you said you don't care about.

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

American McGay posted:

You should go to a store and look at a 65" C9 and see if you don't actually care about those things that you said you don't care about.

Hah yeah basically.

We have a Vizio M 75 inch that we got maybe a year ago or less, along with the C9, and we stopped using the 75 inch real quick after the C9 got set up.

That being said the 75 inch is very impressive in its own right on a sheer size basis. I've never seen the 77 inch C9 in person but it's probably awe-inspiring.

Boxman
Sep 27, 2004

Big fan of :frog:


We’re toying with buying a new TV and I’m sure I’ll have more questions, but here’s one to start me off - is input lag a thing to worry about anymore?

I bought my last tv when 3D was appearing to hit a stride, and I've been pretty happy with it (even though the 3D doesn't get a lot of use, obviously), but I did think about playing games on it and it has input lag bad enough to make anything except, like, Jackbox completely unplayable.

Have manufacturers licked that problem? It’s something I’d like to prioritize.

ufarn
May 30, 2009

Elysium posted:

My father in law keeps changing my tv settings from Movie mode with all the bullshit turned off to full dynamic soap opera eye burning madness. Am I legally allowed to murder him or should a simple maiming suffice?
You could configure a separate input channel and move the HDMI cable to that port for when you watch a movie, assuming it's not just plain TV. All the more reason to get an Apple TV I guess.

codo27
Apr 21, 2008

There's definitely a difference set to set in lag. RTINGS.com continues to be the gospel with this stuff: you can go in and they'll tell you what's the best gaming TV, but you can also select exactly which specs you want and how important they are, and it will present the best sets for you based on that.

FunOne
Aug 20, 2000
I am a slimey vat of concentrated stupidity

Fun Shoe
So, uh, does the C9 VRR update make it compatible with the XBONEX's VRR support?

Because that would be a killer feature.

EDIT: Apparently this has been a 'supposed to work' feature but many people are having issues with it. Hopefully this will get it to a more stable configuration.

FunOne fucked around with this message at 14:44 on Sep 12, 2019

ufarn
May 30, 2009

FunOne posted:

So, uh, does the C9 VRR update make it compatible with the XBONEX's VRR support?

Because that would be a killer feature.

EDIT: Apparently this has been a 'supposed to work' feature but many people are having issues with it. Hopefully this will get it to a more stable configuration.
From what I can tell, the VRR range is 40-120 Hz, and given the number of games locked to 30, that could be why. A 30 FPS minimum is decent for a PC game, but I don't know how much heavy lifting it ends up doing for console games.

If the game offers it, you could go with Performance mode instead of Quality, but that's only worth your while if the game has notable FPS dips that also don't go below 40.

It's so situational that you'll probably not end up with LFC (low framerate compensation). Ideally, a TV should at least go down to 24 Hz so you could save power watching movies and TV and basic dashboard UIs on consoles.
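(For anyone unfamiliar, here's a hedged sketch of how LFC works in principle: frames get repeated until the effective refresh lands inside the VRR window. Illustrative numbers only, not LG's actual firmware behavior.)

```python
# Minimal sketch of low framerate compensation (LFC): when a game's
# framerate falls below the display's VRR minimum, the source repeats
# each frame enough times to push the refresh rate into the window.

def lfc_refresh(fps, vrr_min=40, vrr_max=120):
    """Return a refresh rate inside the VRR window, or None if impossible.

    Assumes fps > 0; each frame is shown `multiplier` times in a row.
    """
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    refresh = fps * multiplier
    return refresh if refresh <= vrr_max else None

print(lfc_refresh(30))  # a 30 FPS console game doubles to 60 Hz, inside 40-120
```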

ufarn fucked around with this message at 17:20 on Sep 12, 2019

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
Yeah it's pretty niche, nowhere near the benefits of using VRR on PC games.

FunOne posted:

EDIT: Apparently this has been a 'supposed to work' feature but many people are having issues with it. Hopefully this will get it to a more stable configuration.

I heard that the issues had been ironed out, but as mentioned before, it's not as much of a killer feature as you would suspect (though totally worth it in specific games).

Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know

ufarn posted:

From what I can tell, the VRR range is 40-120 Hz

This is the 2.1 VRR spec, yeah. The only question I have is whether that spec extends all the way up to 120Hz at full 4K. I guess we won't know for sure until the appropriate device(s) come out, but there's no reason yet to suspect that ISN'T the case.

codo27
Apr 21, 2008

It's so sad that graphics are lagging so far behind display tech right now. If HDMI 2.1 is on the doorstep with 120Hz 4K capability, I wanna be able to push it in a game other than Stardew Valley. For me, current games maxed out at 1440p on my current setup don't even look impressive anymore. Eight years ago when I started playing BF3 I used to really wow at it, but I don't feel as if graphics have come that far since, relative to that kind of jump in time in any other era.

Monday_
Feb 18, 2006

Worked-up silent dork without sex ability seeks oblivion and demise.
The Great Twist

codo27 posted:

It's so sad that graphics are lagging so far behind display tech right now. If HDMI 2.1 is on the doorstep with 120Hz 4K capability, I wanna be able to push it in a game other than Stardew Valley. For me, current games maxed out at 1440p on my current setup don't even look impressive anymore. Eight years ago when I started playing BF3 I used to really wow at it, but I don't feel as if graphics have come that far since, relative to that kind of jump in time in any other era.

When the new consoles drop next year you can enjoy a nice round of PC ports that look amazing but run like poo poo.

codo27
Apr 21, 2008

Remember when the current gen of crackerjack boxes came out and they said "it's all x86 now so porting will be easy"

wolrah
May 8, 2006
what?

codo27 posted:

Remember when the current gen of crackerjack boxes came out and they said "it's all x86 now so porting will be easy"

Porting itself is relatively easy compared to generations past, as long as you're using a modern engine and not doing anything too screwy in your code. Performance optimization is the hard part, because in the end these things are just strange variants of Jaguar APUs with limited RAM.

Fedule
Mar 27, 2010


No one left uncured.
I got you.
I’ve asked this before but it didn’t go anywhere and I was more curious than anything else but it’s still a problem after a year of firmware updates and now it’s starting to bug me.

Context: UK Freeview.

Channels in the COM7 band (eg BBC4 HD, BBC News HD) all don’t work when my AV receiver (Yamaha RX v583) is connected (HDMI) and turned on; they just show the TV’s No Signal popup. If I do any kind of rescan while it’s connected the channels will not be found or will be deleted from the list. When I turn off or disconnect the receiver, signal instantly comes back. This happens whether or not the receiver is actually being used by the TV for anything in particular; signal still disappears if I tell the TV to use optical or built in speakers for audio. This doesn’t happen with any other devices I have (PS4, Nintendo Switch, Apple TV), or to any other channels that I know of (including SD versions of these channels).

I have found less than nothing trying to google for this problem. I can’t even find a forum thread with 0 replies.

This is confounding me. How is an HDMI device able to disrupt a TV signal?!

ufarn
May 30, 2009
I think the BBC broadcasts with HLG, an HDR format like HDR10 but designed for broadcast TV. I don't know how the fallbacks work for that. See if there's a setting you can change somewhere. Maybe there's a separate channel you can use.

Heliosicle
May 16, 2013

Arigato, Racists.
From a cursory search for similar things, someone here seems to have the same problem https://www.avforums.com/threads/tv-signal-interference-with-hdmi-cable.1271838/

People there are talking about interference from HDMI, either the interface or the cable itself, or just a poorly shielded aerial cable. I wouldn't think they would interfere, but I don't know anything about the frequencies they both operate on.

I searched for "av receiver disrupting tv signals" fwiw

pwn
May 27, 2004

This Christmas get "Shoes"









:pwn: :pwn: :pwn: :pwn: :pwn:
I’ve had a similar problem in the US. I have my OTA tuner (a PVR) connected to HDMI1 and my Nintendo Switch in HDMI2. Weirdly, like 1 out of 10 times that I am watching Hulu or Youtube through the Switch, the reception on the tuner goes to hell. Sometimes if I switch the TV over to the tuner input, it clears up; other times, it won’t clear up until I turn off the app. Once in a while it won’t stop even then, and I have to shut the Switch off, full-stop. It can also happen while playing a game.

Alternatively, sometimes the Switch is the one that acts up. Say, while watching a show on Hulu, switch over to the tuner, switch back to Hulu, only to see an admonishing message that it can’t even with copyrighted content or whatever.

If it did this every time, it’d actually be less infuriating. But as it’s absolutely weirdly random, with no discernible pattern (every combination of content from both devices [e.g. network programming vs local on tuner] has both worked fine together and been absolutely scrambled, based on ???,) there’s no way to prepare except to just unplug the one I’m not watching.

I can only surmise this is some schmuck’s idea of copyright protection. Like one of these devices- the tuner, the Switch, the TV, or all of them- is programmed to detect the presence of another device and shut it down if so. It feels incredibly frustrating to have to get up and unplug the tuner HDMI cable just so I can record a show and play Mario Kart at the same time.

wolrah
May 8, 2006
what?

pwn posted:

Alternatively, sometimes the Switch is the one that acts up. Say, while watching a show on Hulu, switch over to the tuner, switch back to Hulu, only to see an admonishing message that it can’t even with copyrighted content or whatever.

This is because you switched inputs. HDCP needs to renegotiate every time the connection path changes and things can go weird, especially if you switch back and forth quickly.

The rest, with the signal problems, sounds more like some kind of RF interference. I don't know what frequencies the channels you guys are having issues with are on, but maybe there's some kind of issue similar to the USB 3 2.4GHz interference problems from a few years ago.

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


Good news everyone, micro LED is available for consumers to purchase!

https://www.highdefdigest.com/news/...5-million/44839

quote:

Home installation for Crystal LED systems will be available via select Sony dealers. Here's a rundown of some sample configurations that customers can put together:

Full HD Size (18 Units) at around 110-inches diagonal
4K Size (72 Units) at around 220-inches diagonal
8K Size (288 Units) at around 440-inches diagonal
16K Size (576 Units) at around 790-inches diagonal

Exact pricing for the residential systems has not been confirmed, but reports by TechHive and Engadget indicate that each module costs around $10,000 -- which would make the 220-inch 4K option come in at around $720,000 and the gigantic 790-inch 16K option come in at an absolutely absurd $5.8 million.

The sizes are what they are because there's a minimum pixel pitch for microLED at the moment; 220 inches is the smallest 4K set they can make.
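(Back-of-the-envelope check: if each module has a fixed pixel pitch, panel size scales linearly with resolution, which matches the 110"/220"/440" progression. The pitch below is inferred from the quoted 220-inch 4K configuration, not an official Sony spec.)

```python
import math

# If the pixel pitch (center-to-center LED spacing) is fixed, the panel's
# physical size is just pixel count times pitch, so diagonal size doubles
# every time linear resolution doubles.

def diagonal_inches(width_px, height_px, pitch_mm):
    w = width_px * pitch_mm / 25.4   # mm -> inches
    h = height_px * pitch_mm / 25.4
    return math.hypot(w, h)

# Infer the pitch from the quoted 220-inch diagonal at 3840x2160:
pitch = 220 * 25.4 / math.hypot(3840, 2160)
print(round(pitch, 2))                            # ~1.27 mm per pixel
print(round(diagonal_inches(1920, 1080, pitch)))  # ~110" for Full HD
print(round(diagonal_inches(7680, 4320, pitch)))  # ~440" for 8K
```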

qirex
Feb 15, 2001

Maybe they should rename it to "Fairly Small LED"

codo27
Apr 21, 2008

Yeah, forget about 65", it's time we had 65'. Holy jesus. So glad I'm winning the lottery this week.

The best part is these are still rife with HDMI issues that constantly come up in this thread I bet

wolrah
May 8, 2006
what?
Is "16K" somehow not 8K doubled in both dimensions? The 16K model being 790" doesn't really math out with the previous steps being 110, 220, and 440 inches. Shouldn't it be 880 inches in its largest form? I guess maybe the assumption is that the largest size will be primarily installed at movie theater aspect ratios rather than 16:9?

bull3964
Nov 18, 2000

DO YOU HEAR THAT? THAT'S THE SOUND OF ME PATTING MYSELF ON THE BACK.


That looks to be the case based on when they showed it off in April.

https://www.anandtech.com/show/14204/sony-develops-16k-display-a-783inch-crystal-led-screen

Not sure why it would have exactly double the units from 8K though.
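(One guess at the arithmetic, assuming the modules keep the pitch implied by the 440-inch 8K size: 576 units arranged as a double-wide 15360x4320 wall, twice the 8K width at the same height, gives a diagonal in the right ballpark for the quoted ~790 inches, whereas a full 16K-by-8640 grid would need 4x the units and come out at 880 inches.)

```python
import math

# Guess at why "16K" is only 2x the 8K unit count: a 15360x4320 wall
# (double the 8K width, same height) at the pitch implied by the
# 440-inch 8K configuration.

pitch_in = 440 / math.hypot(7680, 4320)    # inches per pixel at the 8K size
diag = math.hypot(15360, 4320) * pitch_in  # diagonal of the double-wide wall
print(round(diag))                         # lands near the quoted ~790 inches
```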


Taima
Dec 31, 2006

tfw you're peeing next to someone in the lineup and they don't know
It's going to be so long before consumer class 55-75 inch 4K Micro LED displays hit the market.

Not only is the tech difficult to manufacture, but they're still having tons of problems getting the pixel pitch small enough, which is why the first iterations of these panels are like "4K but 900 feet wide".

Samsung has a working 75 inch 4K Micro LED panel but it seems like tremendous leaps will have to be made to get that panel anywhere near even upper-middle class consumer price points. I'm not that optimistic about this technology in the short term for consumer application.

Maybe the short term future for consumer Micro LED is in large size 1080p displays. That would actually work well for people who put their TV way farther back than they should from their sitting position (which is the majority of consumers). They aren't taking advantage of 4K anyway, so the resolution loss isn't much of an issue. For example, you could have a 75 inch 1080p Micro LED display and, as long as it was, say, 12+ feet from where you were sitting, you'd never see the difference.
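(A rough sketch of the acuity math behind that: the standard one-arcminute-per-pixel rule of thumb for 20/20 vision, not a display-industry spec. It puts the 1080p-vs-4K break-even for a 75" set right around 10 feet.)

```python
import math

# Distance beyond which 20/20 vision (about one arcminute of resolving
# power) can no longer distinguish individual pixels, i.e. where a higher
# resolution stops being visible at all.

def resolution_distance_ft(diagonal_in, vertical_px, aspect=16 / 9):
    height_in = diagonal_in / math.sqrt(1 + aspect**2)  # panel height
    pixel_in = height_in / vertical_px                  # one pixel's height
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12         # inches -> feet

print(round(resolution_distance_ft(75, 1080), 1))  # ~9.8 ft for a 75" 1080p set
print(round(resolution_distance_ft(75, 2160), 1))  # ~4.9 ft for a 75" 4K set
```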
