MrMoo
Sep 14, 2000

I recently saw a YouTube video claiming colour depth is not remotely a problem on modern computers, and I'm getting the exact opposite impression from James Cameron wannabes whose output has the quality of a terrible children's cartoon.

Colour sample size
8-bit per channel is the norm: 24 bits total, or 32 bits with an alpha channel.

10-bit is usually for capturing, to leave headroom for colour grading. It's the minimum for HDR.

16-bit+ floating point is the fancy stuff, usually reserved for in-GPU representation.
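
Quick sketch of what sample size actually buys you: quantise a synthetic gradient at 8 and 10 bits and count the distinct steps (pure NumPy; the 1920 width is just an arbitrary example value):

```python
import numpy as np

gradient = np.linspace(0.0, 1.0, 1920)          # ideal smooth ramp, 0..1

q8  = np.round(gradient * 255)  / 255           # 8-bit:  256 code values
q10 = np.round(gradient * 1023) / 1023          # 10-bit: 1024 code values

print(len(np.unique(q8)), "steps at 8-bit")     # 256  -> visible banding
print(len(np.unique(q10)), "steps at 10-bit")   # 1024 -> ~4x finer steps
```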

Linux and its apps are quite buggy at 10-bit; Windows with NVIDIA at least has some override facility:

[screenshot] Desktop sample size and output sample size, not confusing at all.

Colour space
The range of colours; it hasn't changed in ages. UHD bumps from a thing called Rec.709 to Rec.2020, and HDR changes the name to Rec.2100. 8-bit Rec.2020 looks worse than 8-bit Rec.709 though: the same 256 code values get stretched over a bigger gamut, so the steps between adjacent colours get coarser and banding shows.
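
For a feel of how big the gamut jump is, here's a sketch computing the xy triangle area from the published Rec.709 and Rec.2020 primaries (coordinates are from the specs; the area ratio is just a rough size comparison, not a perceptual measure):

```python
def triangle_area(p):
    # Shoelace formula for the triangle (r, g, b) of CIE 1931 (x, y) points.
    (x1, y1), (x2, y2), (x3, y3) = p
    return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

a709, a2020 = triangle_area(rec709), triangle_area(rec2020)
print(f"Rec.2020 covers ~{a2020 / a709:.2f}x the xy area of Rec.709")
# Same 256 (8-bit) code values over a bigger gamut = coarser colour steps.
```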

OSes have colour profiles, but I don't know anything above that, like whether switching to a UHD resolution will change it automatically?

Gamma function
More stuff nobody cared about for ages. HDR has a couple of options because it is not backward compatible with SDR, which is why HDR adoption is nowhere. HLG is a compromise over a thing called PQ that tries to stay backward compatible.
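
Both curves are just formulas you can plot yourself. A sketch using the constants published in SMPTE ST 2084 (PQ) and BT.2100 (HLG); note the HLG lower half is a plain square root, which is what gives it its rough SDR compatibility:

```python
import math

def pq_oetf(y):
    """PQ: y is display luminance normalised so 1.0 = 10,000 nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

def hlg_oetf(e):
    """HLG: e is scene linear light in [0, 1]."""
    a = 0.17883277
    b, c = 1 - 4 * a, 0.5 - a * math.log(4 * a)
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

print(pq_oetf(0.01))   # 100 nits (SDR reference white) -> ~0.51 signal
print(hlg_oetf(0.5))   # mid scene light -> ~0.87 signal
```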

There are things called LUTs to help with SDR renders, and it all seems such wonderful fun that I'm glad someone else does it.

https://partnerhelp.netflixstudios.com/hc/en-us/articles/360025502033-What-is-Color-Management-
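
A LUT at its simplest is just a lookup-and-interpolate table. A toy sketch: the knee curve here is made up purely for illustration (real LUTs ship as 1D or 3D tables in files like .cube, which the tools the Netflix doc above describes will apply for you):

```python
import numpy as np

# Toy 1D LUT with 17 entries: a simple knee that compresses highlights.
lut_in  = np.linspace(0.0, 1.0, 17)
lut_out = np.where(lut_in < 0.5, lut_in, 0.5 + (lut_in - 0.5) * 0.4)

def apply_lut(pixels, lut_in, lut_out):
    # Piecewise-linear interpolation between LUT entries, the same idea
    # a colour pipeline uses (3D LUTs interpolate across three axes).
    return np.interp(pixels, lut_in, lut_out)

frame = np.random.rand(4, 4)            # stand-in for one image channel
print(apply_lut(frame, lut_in, lut_out))
```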

Chroma subsampling
All modern video has reduced chroma (colour) resolution compared to luma (brightness), because our eyeballs are less sensitive to colour detail than to brightness. Compositing and editing need full resolution, otherwise noise and artifacts appear around objects in a scene. The final processing stage can throw it all away though.

So today nearly all delivery video is 8-bit 4:2:0 SDR Rec.709, or 10-bit 4:2:0 HDR Rec.2100. Working content is often Apple's ProRes 4444 (the extra 4 is for alpha).
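
What 4:2:0 means in practice: full-resolution luma, one chroma sample per 2x2 block. A sketch with the real BT.709 luma weights and a random stand-in frame:

```python
import numpy as np

h, w = 4, 4                              # tiny example frame
rgb = np.random.rand(h, w, 3)

# RGB -> luma/chroma (BT.709 weights for Y; Cb/Cr as scaled differences)
y  = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
cb = (rgb[..., 2] - y) / 1.8556
cr = (rgb[..., 0] - y) / 1.5748

# 4:2:0 = one chroma sample per 2x2 block of luma samples
cb420 = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
cr420 = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

samples_444 = y.size * 3
samples_420 = y.size + cb420.size + cr420.size
print(f"4:2:0 keeps {samples_420 / samples_444:.0%} of the samples")  # 50%
```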

I've got Dunning-Kruger designers wanting me to render their multi-gigawatt video files directly, or at least transcode to "only" 10-bit 4:4:4. For low-resolution LED walls where you can see each pixel this has some logic, but pretty much nothing works like this.

[screenshot] :lol: We're not permitted to use this content after this week, because it's so important, or something.

Windows 11 and HDR is so annoying: I've had 8+ hard locks on a new Razer laptop with an HDR display in the first week of use. Chrome, Zoom, WebEx, etc. all have no clue what is happening and frequently fail in different ways.

MrMoo fucked around with this message at 23:29 on Jun 9, 2022
