I’m a skeptic and a troublemaker. I admit it. I love to learn, but a side effect of learning is that I ask questions. Too many questions, perhaps. Sometimes I’m surprised I’m still alive after asking some of the questions I’ve asked.
I’ve long known that CRI is an inadequate measure of the color response of broken spectrum lighting, particularly when it comes to color reproduction in video. The short story is this:
There are eight CRI test colors. CRI averages a light’s effect on those colors, which are equidistant around the Munsell color wheel and are very desaturated. The less saturated a color, the broader its spectral response, which means it’s not terribly difficult for a light source with a lot of gaps in its spectrum to register a high CRI score, because its gaps are largely hidden by the broadness of the test samples. If no sample is narrow enough to catch a sharp color spike (such as the green in fluorescents), then metamerism may make the light source appear better than it should.
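The arithmetic behind the general index is simple averaging, which is exactly why spikes can hide. Here’s a rough Python sketch of that averaging, using the standard CIE formula (the ΔE values are hypothetical illustrations, not measurements):

```python
# Sketch of how CRI's general index (Ra) is an average of per-sample
# color errors. The delta_e values below are made up for illustration.

def special_index(delta_e):
    """CIE special color rendering index for one test sample: 100 - 4.6 * dE."""
    return 100 - 4.6 * delta_e

def general_cri(delta_es):
    """Ra: the average of the special indices for the 8 CRI test samples."""
    assert len(delta_es) == 8
    return sum(special_index(d) for d in delta_es) / 8

# A spiky source can still score well if its errors on these broad,
# desaturated samples stay small -- a large error on a color outside
# the 8 samples (e.g. saturated red) never enters the average at all.
errors = [1.0, 1.5, 0.8, 1.2, 1.1, 0.9, 1.4, 1.3]  # hypothetical
print(round(general_cri(errors), 1))  # prints 94.7
```

Note that a single bad sample gets diluted by seven good ones: averaging rewards broad mediocrity and forgives narrow failures.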
There is a new standard taking hold, CQS, which shows a lot of promise. It uses 15 color samples and uses additional criteria to judge the quality of color. More samples means more opportunities to register color spikes in LED and fluorescent lights, but there’s some question as to whether 15 samples is enough for motion picture lighting instrument evaluation.
Click here for an excellent short article about why CQS is better than CRI.
This brings us to TLCI, whose intent is to predict how a light will appear on camera. Cameras attempt to mimic the human eye’s response to color but they do so imperfectly, and TLCI is an attempt to quantify those imperfections across a wide range of cameras to create a generic camera “model” against which the spectra of new lights can be tested. This is a great idea, as it helps manufacturers predict, during the manufacturing process, how the average camera will reproduce common colors under their light’s spectrum, and gives cinematographers and gaffers a way to quantify lighting quality without spending a lot of time and effort on extensive testing.
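To make the “generic camera model” idea concrete, here’s a toy Python sketch of the concept — not TLCI’s actual math, and the sensitivity curves and spectrum below are invented Gaussians, not real data: average several cameras’ spectral sensitivities into one model, then integrate a light’s spectrum through it to predict a channel response.

```python
# Toy illustration of a "generic camera model": average hypothetical
# spectral sensitivity curves from several cameras, then integrate a
# light source's spectrum through the averaged curve. All curves here
# are invented Gaussians for demonstration only.

import math

wavelengths = range(400, 701, 10)  # nm, visible band in 10 nm steps

def gaussian(center, width):
    """A made-up bell-shaped sensitivity curve."""
    return [math.exp(-((w - center) / width) ** 2) for w in wavelengths]

# Hypothetical red-channel sensitivities of three different cameras:
cameras_red = [gaussian(600, 40), gaussian(610, 35), gaussian(595, 45)]

# The "model" camera is just the pointwise average of the real ones:
model_red = [sum(c[i] for c in cameras_red) / len(cameras_red)
             for i in range(len(wavelengths))]

# A spiky test illuminant: flat base plus a narrow green spike at 545 nm,
# loosely mimicking a fluorescent tube's mercury line:
spectrum = [0.2 + 0.8 * math.exp(-((w - 545) / 5) ** 2) for w in wavelengths]

# Predicted red-channel response: integrate spectrum against the model curve.
red_response = sum(s * m for s, m in zip(spectrum, model_red))
print(round(red_response, 3))
```

The obvious caveat, and the heart of my complaint below, is that the averaged model only represents the cameras that went into the average.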
In my opinion, however, TLCI may be flawed.
Part of my issue with TLCI is that the test was performed using a 24-color Macbeth ColorChecker. In this article I questioned this chart’s usefulness in video by showing that none of the 18 color targets have any meaning when viewed on common video test equipment. It’s also a fairly inexpensive matte chart, and after working with DSC Labs as a consultant for a number of years I’ve learned how hard it is to make a precision chart. It can’t be done cheaply, and it can’t be done very well in a matte finish. Both, however, are distinguishing characteristics of this chart.
According to Wikipedia, the color references include
“…blue sky, the front of a typical leaf, and a blue chicory flower. The rest were chosen arbitrarily to represent a gamut ‘of general interest and utility for test purposes’, though the orange and yellow patches are similarly colored to typical oranges and lemons.”
None of these are objective references. This chart was invented in 1976 as a way to eyeball match colors between transparencies and offset printed images. It has nothing to do with video.
What particularly baffles me is why this chart would be used as a reference for video colorimetry when its vectorscope pattern resembles a shotgun blast. It means nothing to any common video signal measurement device. The inventors of TLCI apparently understand this: on page 17 of EBU Tech document 3355 that describes TLCI’s colorimetric assessment methods, a table of suggested corrections for colorists gives the following instructions to correct for a sample daylight fluorescent:
The idea is that a + or – means “more” or “less” correction and the number of +’s and -‘s gives the colorist an idea of how far they have to push their wheels or trackballs to achieve the proper result. While I think this chart is well intentioned, I also think it shows little to no understanding of how fast a colorist has to work. Can you imagine a colorist who grades episodic TV at a pace of a shot or more per minute quickly figuring out where “++++++” is on a waveform monitor or vectorscope? I can hear them now: “Plus plus plus plus… damn, too far… minus… minus…”
A much better method would have been to use a video-specific tool like the DSC Labs Dreamcatcher 76 and give the colorist a printout of the resulting vectorscope pattern. This chart creates multiple hexagonal patterns whose outside points should fall directly on the primary and secondary color squares built into every Rec 709 vectorscope in existence, while displaying two smaller hexagonal shapes at different saturation levels. Spectral spikes would have nowhere to hide.
Here’s an example of one of the more common DSC Labs charts viewed on a vectorscope:
The chart displayed on these vectorscope images has 24 samples at one saturation level and, as you can see, it makes a beautiful and easy-to-read pattern on a vectorscope. The Dreamcatcher would create a much more granular outside pattern, with two less saturated patterns inside. It would be easy for a colorist to simply push a hue toward its proper box instead of trying to figure out how far “++++++” is.
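Those primary and secondary boxes aren’t arbitrary — their positions fall straight out of the Rec 709 matrix. As a small Python sketch (assuming 75% bars and the standard non-constant-luminance coefficients; a digital vectorscope plots Cb horizontally and Cr vertically):

```python
# Where the Rec 709 primaries and secondaries land on a digital
# vectorscope (Cb on the x axis, Cr on the y axis), assuming 75% bars.

import math

KR, KB = 0.2126, 0.0722   # Rec 709 luma coefficients
KG = 1 - KR - KB          # 0.7152

def to_cbcr(r, g, b):
    """Convert nonlinear R'G'B' (0..1) to Cb/Cr per the Rec 709 matrix."""
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return cb, cr

bars = {"red":     (0.75, 0.00, 0.00), "yellow":  (0.75, 0.75, 0.00),
        "green":   (0.00, 0.75, 0.00), "cyan":    (0.00, 0.75, 0.75),
        "blue":    (0.00, 0.00, 0.75), "magenta": (0.75, 0.00, 0.75)}

for name, rgb in bars.items():
    cb, cr = to_cbcr(*rgb)
    angle = math.degrees(math.atan2(cr, cb)) % 360
    print(f"{name:8s} angle {angle:6.1f} deg, radius {math.hypot(cb, cr):.3f}")
```

A chart built from Rec 709 primary mixes lands exactly on these points, which is why spectral errors show up as obvious displacement from the boxes rather than as an unreadable cloud.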
Best of all, DSC Labs charts use Rec 709 primaries mixed in various ways and each patch is tested with a spectrometer for accuracy. The ColorChecker is basically paint on cardboard with references like “sky” blue and “lemon” yellow.
The Dreamcatcher 76 costs anywhere from $2,750.00 to $7,084.00, depending on the size. ColorCheckers cost about $70.
That’s less of an issue for me, though, than the kinds of cameras tested.
In this tech article about the TLCI testing process there appears on page 9 a diagram of “the optical elements in a modern TV camera.” The diagram shows a three sensor prism block. On page 11 it is noted that TLCI was derived from tests involving nine cameras from three manufacturers with this exact configuration. The bottom of the page contains this quote:
“It is on this background and on the basis of the calculated colour errors that the standard camera model can be represented by the ‘smoothed’ average of all measured cameras, including data from single chip CMOS cameras…”
The problem, of course, is that the top of the page states that only prism block cameras were tested. Additionally, nowhere is it stated which cameras were tested, which is somewhat crucial information. Every camera manufacturer has their own secret sauce, and every camera model is different. Were only the most popular brands of three-CCD cameras tested (Sony, Panasonic, Ikegami, Grass Valley) or were there some rarer brands included (JVC, Hitachi)? This information is not provided.
If only nine prism block cameras were tested, where did the single sensor CMOS camera information come from?
My experience has shown me that prism block cameras and single sensor cameras respond very differently to color. For ten years I shot corporate video projects for a major healthcare company using their internally-owned cameras, all of which had prism blocks. I shot in hospitals all over the U.S. under every kind of fluorescent light imaginable. The bandpass of these prism block cameras was such that they didn’t see the green spike in common fluorescent tubes very well, if at all. The environment might have looked a little bit yellow or a little bit cool, but not so much that adding a tungsten light was overly noticeable or objectionable.
Years later I shot a project in one of those same hospitals with a Sony F3, and that camera saw the green spike in fluorescents so strongly that I used Lee Half Plus Green gel to match my tungsten light to the background fluorescents. I’ve seen the same thing in all the single sensor cameras I’ve worked with, including REDs, Alexas and Sony F55/F5s. They just don’t see color the same way prism block cameras do. Or, maybe, their overlapping spectral response curves are less forgiving. Fluorescent green spikes seem to fall into a blind spot in prism block cameras, but single sensor cameras see them very well indeed.
The red, green and blue separations in prism block cameras are made using large dichroic coatings on the prism faces in front of each sensor. It’s not difficult to make a large dichroic filter that passes specific wavelengths of light and blocks others, so while the optics of prism blocks can be quite complex, the process of splitting the spectrum apart is not. Single sensor cameras, however, utilize microscopic dye filters that sit atop each photosite, and there are a limited number of pigments available that will come close to doing the same job.
Prism block cameras are largely confined to Rec 709 colorimetry, or so I was told by a highly-placed employee at a major camera manufacturing company. They had made a broad spectrum three-CCD camera that captured a significantly wider color gamut than Rec 709 but the prism block was hugely expensive to make. Single sensor cameras, on the other hand, generally capture a much wider gamut than Rec 709 at much lower costs, as is demonstrated by the Sony F5/F3/FS7 cameras. There’s something about single sensor design that seems to allow for wider color gamuts at much lower prices, at least for manufacturers that make their own sensors. (Off-the-shelf sensors don’t seem to boast the same wide color gamut, but still respond differently to color than do sensors around a prism block.)
It’s clear that each design has advantages and disadvantages. It’s also clear that each one should see color very differently. This article discusses the strengths and weaknesses of each system, and while it’s a little bit dated it’s also very informative.
I recently posted my qualms about the TLCI process on the Cinematography Mailing List, and someone on the list passed my email on to the author of TLCI, Alan Roberts. I’d like to say that we had a constructive conversation, but he repeatedly denied that single sensor cameras and three-chip cameras show any significant disparity in the way they reproduce color. He was completely unwilling to accept the possibility that I’d seen significant differences in color rendering between these two types of cameras, but at the same time he was unable or unwilling to tell me which single sensor cameras he’d tested. He did state that he had tested them and that they were part of the TLCI-2012 spec, but the document detailing the testing procedure states that TLCI was derived solely from testing nine prism block cameras, plus single sensor data from… somewhere.
Given the choice between believing Mr. Roberts or my own eyes, I have to choose my own eyes.
He’s not the only expert with whom I have disagreements. I came across this article that discusses the TLCI testing process in depth. In it, the author states that the reference color R9, a highly saturated hue of red, is not worth testing because television cameras can’t see it. (This article discusses the importance of R9 in flesh tone reproduction.) In fact, R9 has been discovered to be indispensable in the accurate reproduction of flesh tone. It’s the Holy Grail of LED lighting, as saturated red light is the hardest hue to create with an LED light, and yet without a little bit of that hue flesh tone doesn’t come alive.
I’ve seen this myself in my role as a consultant to three different LED lighting companies. Initially I helped develop the Kelvin Tile, the first broad spectrum variable color light invented for use in the film industry. A couple of years ago I shot an LED lighting shootout for PRG in Hollywood, and one of the lights we tested was the Kelvin Tile. (At that point the product line had been sold to Gekko Lighting.) Take a look at the flesh tone test results here.
That was shot on an Arri Alexa (single sensor), and the Kelvin Tile made our beautiful model look like a corpse. When we first developed the Kelvin Tile there were only two single sensor cameras on the market, the Panavision Genesis and the Dalsa Origin (the Viper, though an early log camera, was still a prism block camera), so we confined our testing to the Sony F900R, which was the gold standard at the time. It was a prism block camera, and the Kelvin Tile made skin tone positively glow. It made people look better than tungsten light did, which was one of the major selling points of the light.
Looking at what it did to flesh tones viewed through an Alexa was embarrassing.
In retrospect, the issue seems to be that the Kelvin Tile didn’t put out enough R9 red. While the Sony F900 wasn’t terribly sensitive to that portion of the spectrum the Arri Alexa definitely is.
This is why I’m skeptical about Mr. Roberts’ statements that the spectral sensitivity of prism block cameras and single sensor cameras is essentially the same.
I could be completely wrong, but I don’t think I am. My personal experience has shown me that single sensor colorimetry differs significantly from that of prism block cameras, and as TLCI appears to be derived exclusively from the testing of prism block cameras, I don’t know how well its predictions apply to single sensor cameras. Until I know, without a doubt, that single sensor cameras were profiled, and I have some idea of which cameras were profiled, I can’t put much stock in the TLCI system unless I’m using a prism block camera.
And, at this point, I haven’t used a prism block camera in years.
Disclosure: I’ve worked as a paid consultant to DSC Labs, Element Labs, PRG, Cineo Lighting, and Sony.
Art is a professional cinematographer and part-time target. His website is at www.artadamsdp.com.