  • Locked thread
josh04
Oct 19, 2008


"THE FLASH IS THE REASON
TO RACE TO THE THEATRES"

This title contains sponsored content.

NTSC standard definition TV was an infamous shitshow for colour accuracy because they didn't fix a standard for the TVs, which all engaged in a race to the bottom for hot, saturated colours, then had to adjust the studio monitors to approximate the horror the actual TVs available were producing. Rinse and repeat for several decades. As far as I'm aware it basically never recovered until HD came in and they had a complete do-over.
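The chroma side of that race to the bottom is easy to sketch. Below is a toy model, not a calibrated decoder: the matrices are the standard NTSC YIQ encode/decode pair, and the gain knob just stands in for a set with its colour control cranked.

```python
def rgb_to_yiq(r, g, b):
    # Standard NTSC encode matrix (RGB in 0..1)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def yiq_to_rgb(y, i, q, chroma_gain=1.0):
    # chroma_gain > 1 mimics a set with its colour control turned up hot
    i *= chroma_gain
    q *= chroma_gain
    r = y + 0.956 * i + 0.621 * q
    g = y - 0.272 * i - 0.647 * q
    b = y - 1.106 * i + 1.703 * q
    clip = lambda v: max(0.0, min(1.0, v))  # real sets clip to the displayable range
    return clip(r), clip(g), clip(b)

flesh_tone = (0.8, 0.6, 0.5)
faithful = yiq_to_rgb(*rgb_to_yiq(*flesh_tone))                  # ~ (0.80, 0.60, 0.50)
hot_set = yiq_to_rgb(*rgb_to_yiq(*flesh_tone), chroma_gain=1.5)  # ~ (0.88, 0.58, 0.43)
```

With the gain at 1.0 the round trip is a near no-op; at 1.5 the same flesh tone comes back visibly redder and more saturated, which is the effect studio monitors ended up being detuned to anticipate.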


Cemetry Gator
Apr 3, 2007

Do you find something comical about my appearance when I'm driving my automobile?

SpaceDrake posted:

A question from four months and one page ago :v:, but I still own a 17" 4:3 CRT specifically for use with my old game consoles, so that I can, without irony, experience them as I did in my glorious 90s childhood: coming through an RF or composite signal onto a 480i screen.

Surprisingly, for some games, this can actually matter.

Blaster Master is perhaps the best known example, but there are a number of other titles (Mega Man 2, several Castlevanias, Actraiser, etc) which actually had their color balancing done with an eye for what the end product coming through RF/composite would look like. (You can view some comparisons here, though it's a pretty old site.) While I know some folks are obsessive about RGB modding old consoles, that isn't actually what a lot of developers intended as the final display - they knew what kind of actual display color space the vast majority of consumers were going to be working with, and they adjusted the colors in the renderer to produce the actual color balance they desired at the end. Blaster Master is the big example again, but that game actually doesn't look right unless it's "suffering" the color crushing from RF. A number of colors don't even match correctly without it.

So leading back to more film-like chat, on that note: were there any television shows or other made-for-TV filmed media that did something similar? Theatrical film, I imagine, never had this problem because of this magical device called the film projector, as it turns out, but between 1962 and 1995 or so, everyone involved in television must have known that whatever the initial edit looked like was going to go through the hell of RF before reaching people's eyeballs. So are there notable stories of films or shows factoring this in to their production? I would imagine there are, I just don't know of any.

Well, you should take your media format into consideration. A lot of movies knew their special-effects seams would be obscured by the added grain that striking prints introduced.

For TV, you wanted to avoid obvious artifacts as much as possible, so how you dressed the characters and the set mattered. Whether you shot on film or on video made a difference too.

Keep in mind that with videogames, you were dealing with a low number of colors. Back in the early 80s, CGA over composite was a popular choice: by exploiting those NTSC artifacts, you could get 16 colors from a 4-color signal. On an RGB monitor, though, the colors were poo poo.
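For anyone curious how the CGA trick works, here's a toy sketch, not a calibrated NTSC decoder: in the 640x200 1-bit mode the pixel clock runs at 4x the 3.58 MHz colour subcarrier, so each group of four on/off pixels spans one subcarrier cycle, and a composite TV demodulates the group as luma plus a chroma phase, i.e. a colour. 2^4 = 16 patterns, hence 16 colours out of a nominally monochrome signal.

```python
import math

def decode_group(bits):
    """Demodulate a 4-pixel on/off pattern as one subcarrier cycle."""
    luma = sum(bits) / 4.0
    # Crude correlator against the subcarrier's two phases; patterns whose
    # only difference rides at twice the subcarrier frequency come out as
    # near-identical greys here, much as they wash out on a real set.
    i = sum(b * math.cos(2 * math.pi * k / 4) for k, b in enumerate(bits))
    q = sum(b * math.sin(2 * math.pi * k / 4) for k, b in enumerate(bits))
    saturation = math.hypot(i, q) / 2.0
    hue = math.degrees(math.atan2(q, i)) % 360 if saturation else None
    return luma, saturation, hue

patterns = [tuple((n >> k) & 1 for k in range(4)) for n in range(16)]
colors = {p: decode_group(p) for p in patterns}
# e.g. one lit pixel per group reads as a dim but fully chromatic colour:
# colors[(1, 0, 0, 0)] -> (0.25, 0.5, 0.0)   (luma, saturation, hue in degrees)
```

Shifting which of the four positions the lit pixel sits in rotates the hue by 90 degrees at a time, which is exactly the knob artifact-colour programmers were turning.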

Schwarzwald
Jul 27, 2004

Don't Blink

Cemetry Gator posted:

For TV, you wanted to avoid obvious artifacts as much as possible, so how you dressed the characters and the set mattered. Whether you shot on film or on video made a difference too.

My father worked in a television station, and every once in a while a politician or similar would come in to be on camera for another network's show.

The station kept a selection of neck ties on hand, because it was almost a given that whoever it was would be wearing a tie with a design busy enough that it could not be properly filmed.

Instant Sunrise
Apr 12, 2007


The manger babies don't have feelings. You said it yourself.

Cemetry Gator posted:

Well, you should take your media format into consideration. A lot of movies knew their special-effects seams would be obscured by the added grain that striking prints introduced.

For TV, you wanted to avoid obvious artifacts as much as possible, so how you dressed the characters and the set mattered. Whether you shot on film or on video made a difference too.

From what I remember, the big no-nos were:
• Pure black and pure white clothing, because black would lose detail too easily and white would blow out really easily.
• Fine horizontal lines, because they'd end up flickering and dancing around when displayed on an interlaced TV.
• Grid patterns, because that would give you a moire effect on a camera.
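The horizontal-line one falls straight out of how interlacing splits a frame into fields. A minimal sketch, assuming an idealized 480-line frame:

```python
# 1-pixel horizontal stripes on an interlaced display: each 1/60s field
# draws only every other scanline, so a stripe pattern with the same
# 2-line pitch lands entirely in one field and vanishes in the next.

def stripes(height):
    """Alternating black (0) / white (1) scanlines, one pixel tall."""
    return [row % 2 for row in range(height)]

def field(frame, parity):
    """The scanlines an interlaced display actually draws in one field."""
    return frame[parity::2]

frame = stripes(480)                       # a 480i-style frame
even, odd = field(frame, 0), field(frame, 1)
print(sum(even) / len(even))               # even field brightness -> 0.0
print(sum(odd) / len(odd))                 # odd field brightness  -> 1.0
```

The two fields alternate between all-black and all-white, so the striped area pulses at 30 Hz instead of looking like stripes, which is the "dancing" effect on a necktie.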

SpaceDrake
Dec 22, 2006

I can't avoid filling a game with awful memes, even if I want to. It's in my bones...!

Cemetry Gator posted:

Well, you should take your media format into consideration. A lot of movies knew their special-effects seams would be obscured by the added grain that striking prints introduced.

For TV, you wanted to avoid obvious artifacts as much as possible, so how you dressed the characters and the set mattered. Whether you shot on film or on video made a difference too.

Keep in mind that with videogames, you were dealing with a low number of colors. Back in the early 80s, CGA over composite was a popular choice: by exploiting those NTSC artifacts, you could get 16 colors from a 4-color signal. On an RGB monitor, though, the colors were poo poo.

Well sure, I get all that (especially the CGA NTSC trickery), but I meant more along the lines of this specifically (raw rendering vs. intended):

[image: raw renderer output] vs. [image: intended composite output]

Where in "shooting" (the renderer in the videogame case, but filming in any film/television case) the colors were deliberately blown out or otherwise exaggerated so that in final presentation, on an RF/RCA-composite TV or a film projector or what have you, they would actually look how the director wanted them to appear and probably more "natural" to the audience.

It's kind of topic drift, I know, but since 4:3 CRTs came up, this has always been something I found fascinating about the best products of the NES era or so, where the art was designed with the limitations of the assumed actual display in mind, and how viewing it on later devices can get the color balance very wrong. Maybe it's slightly more along the lines of this thread, though that's really more about specific film preservation than the wider presentation of old media on newer devices and the effects it can have.

Anyway, point is: are there any examples of film media that did something along these lines? I would imagine there are, I just don't know of any and would like to broaden my knowledge! And I particularly wonder if films did it at all, given the differences between RF broadcast and "pure" film projection (and then there's nitrate film vs. safety, Technicolor vs. Kinemacolor, etc). You could even link it back to the original discussion of what the best aspect ratio to use is, since depending on how your movie is framed, the color balance of the overall scene can change.

Instant Sunrise posted:

• Fine horizontal lines, because they'd end up flickering and dancing around when displayed on an interlaced TV.
• Grid patterns, because that would give you a moire effect on a camera.

I actually think I remember seeing a couple of examples of these, back in the RF days of my youth before HD broadcasting took over and the news or whatever just couldn't avoid featuring dudes wearing outfits of this nature. The latter, in particular, was spectacular for giving your viewers a migraine.

Magic Hate Ball
May 6, 2007

ha ha ha!
you've already paid for this

Instant Sunrise posted:

From what I remember, the big no-nos were:
• Pure black and pure white clothing, because black would lose detail too easily and white would blow out really easily.
• Fine horizontal lines, because they'd end up flickering and dancing around when displayed on an interlaced TV.
• Grid patterns, because that would give you a moire effect on a camera.

My favorite example of this was Cary Grant's striped sweater in To Catch A Thief:



Besides the fact that he looked like a sun-dried tomato, his shirt played havoc with most TVs, especially early ones - apparently on some he was just a cloud of static with a head. Some sci-fi shows used the same effect on purpose, which I think is neat, a kind of technological-analog special effect that's literally happening in your own home.

Egbert Souse
Nov 6, 2008

Magic Hate Ball posted:

My favorite example of this was Cary Grant's striped sweater in To Catch A Thief:



Besides the fact that he looked like a sun-dried tomato, his shirt played havoc with most TVs, especially early ones - apparently on some he was just a cloud of static with a head. Some sci-fi shows used the same effect on purpose, which I think is neat, a kind of technological-analog special effect that's literally happening in your own home.

2001 was one of the first Blu-Rays I bought for my then-new HDTV. I was impressed by how the "revolving room" right after the Blue Danube sequence didn't strobe like it did on DVD and tape.

FreudianSlippers
Apr 12, 2010

Shooting and Fucking
are the same thing!

Moire can still be a problem even with modern non-interlaced digital cameras. Something about some types of sensors not being able to handle it.


Instant Sunrise
Apr 12, 2007


The manger babies don't have feelings. You said it yourself.

FreudianSlippers posted:

Moire can still be a problem even with modern non-interlaced digital cameras. Something about some types of sensors not being able to handle it.

Moire is a problem on any kind of sensor that's based on a fixed grid of pixels, which includes all digital cameras.

Film doesn't have that problem because the silver halide crystals are distributed randomly and aren't on a fixed grid.
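That grid-versus-grain point can be sketched numerically. A toy 1-D model, assuming an idealized pitch-1 sensor and treating film grain as randomly jittered sample positions:

```python
# Sample a pattern finer than the pixel pitch on (a) a fixed grid and
# (b) randomly jittered positions standing in for film grain, then measure
# how much energy folds down to the moire ("alias") frequency.
import math
import random

random.seed(1)
N = 512
f_pattern = 1.45            # cycles per pixel: finer than the grid can resolve

def pattern(x):
    return math.sin(2 * math.pi * f_pattern * x)

def amplitude_at(samples, positions, freq):
    """Correlate the samples against a sinusoid at `freq` (a one-bin DFT)."""
    c = sum(s * math.cos(2 * math.pi * freq * x) for s, x in zip(samples, positions))
    s_ = sum(s * math.sin(2 * math.pi * freq * x) for s, x in zip(samples, positions))
    return 2.0 * math.hypot(c, s_) / len(samples)

grid = [float(i) for i in range(N)]                        # fixed pixel grid
grain = [i + random.uniform(-0.5, 0.5) for i in range(N)]  # jittered "grains"

alias = f_pattern - 1.0     # the frequency a pitch-1 grid folds the pattern down to
on_grid = amplitude_at([pattern(x) for x in grid], grid, alias)
on_grain = amplitude_at([pattern(x) for x in grain], grain, alias)
# on_grid comes out near 1.0: nearly all the pattern's energy lands in one
# clean low-frequency band (a moire stripe). on_grain is far smaller: the
# same energy gets scattered into broadband noise.
```

Which is why film dodges the artifact rather than beating it with resolution: the aliased energy is still there, it just shows up as grain-like noise instead of a bold interference pattern.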
