Setting up your on-set monitor: what to look for, and what to avoid

About half the monitors I find on my productions are set up improperly. Here’s what I look for.

A few years ago, when high-quality LCD monitors didn’t exist, I found myself in the wonderful position of mentally averaging every monitor on the set. The on-camera monitor would show me a low-contrast, monochromatic look, while the 17″ monitor from one manufacturer would appear slightly yellow-green and clip highlights. A third monitor would show me unclipped highlights, but its white point made the entire image a cool blue. It was almost impossible to know what the camera was truly recording.

In earlier times I’d have a video engineer, a set of scopes, and a paintbox on set. The engineer would carefully tune up the camera, adjust for highlights and open up shadows as needed, and just generally make sure that the image was the best it could be—and I could see what that image looked like on a high-quality CRT monitor. I knew exactly what was going to post, and that was good because video was rarely graded in those days. I had to get it right the first time.

In the early days of LCDs, I had to look at all of the monitors, mentally average them… and guess.

Now we capture everything in log or raw… but I still need to know exactly what I’m getting as I’m rarely invited to color grading sessions, or the “grading” is done by an editor who was pushed into doing it to save money and who slaps on a LUT that roughly preserves what I gave them in the first place.

Fortunately the days of inaccurate monitors have largely come to an end. OLED monitors are commonplace, and they look as good as, or even better than, CRTs ever did. LCD monitors still exist, but I avoid them when possible because they rarely show me deep blacks. That may be a more accurate representation of how my work will be viewed, as most people view media on LCD displays, but there’s a purity of color I get out of OLEDs that I don’t see in most LCDs. As my work may not be graded, or may be “graded” by someone whose eye for color is not as good as mine, I want to see exactly what’s being captured even if no one ever sees it that way again.

About 50% of the time, rented monitors are set up according to someone else’s idea of what looks good. Unless I check every setting, or perform a factory reset, I have no idea what I’m capturing. Here’s what I look at.


White point

I was once on a job where the director asked me which monitor he should trust: mine or his. I hadn’t been over to video village yet so I wandered over to take a look. His monitor was very bright and very cool. I jumped into the menus and found the white point to be D93, or 9300K.

The color of white affects everything. Rec 709 uses a white point of D65, or 6500K. Most monitors have the following built in:

  • D93: Apparently this is a preference in Japan, where cool colors are culturally more desirable in the same way that Americans and Europeans prefer warm colors. I’ve also seen this used by computer graphics artists, although I don’t know why.
  • D65: The Rec 709 standard, and the one you should use.
  • D50: I have no idea why this is even in a video monitor. The only use I’m aware of is for proofing print materials, as ambient window light tends to fall at around 5000K.
  • D63: This is the DCI-P3 white point. It’s a little warmer and greener than D65. There really isn’t a reason to use this on set, as P3 is meant to be viewed at dim light levels (about 1/4 of what you’d see on a TV) and surrounded by black. Both low light levels and a black surround influence your perception of imagery, and there’s little to no use for this on a set. Rec 709 is meant to be viewed, and to look correct, at fairly high ambient light levels.
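To put numbers on “cooler” versus “warmer”: the chromaticity of any D-series illuminant can be computed from its correlated color temperature using the standard CIE daylight-locus polynomials. A minimal Python sketch (the nominal temperatures are scaled slightly, which is why “D65” is evaluated at roughly 6504K):

```python
def daylight_xy(cct):
    """Chromaticity (x, y) of a CIE D-series illuminant from its
    correlated color temperature in kelvin (valid 4000-25000 K)."""
    t = float(cct)
    if t <= 7000.0:
        x = -4.6070e9/t**3 + 2.9678e6/t**2 + 99.11/t + 0.244063
    else:
        x = -2.0064e9/t**3 + 1.9018e6/t**2 + 247.48/t + 0.237040
    # On the daylight locus, y is a quadratic function of x.
    y = -3.000*x*x + 2.870*x - 0.275
    return x, y

# Nominal CCTs are scaled by 1.4388/1.4380 (a revised physical constant),
# so "D50" is evaluated at ~5003 K, "D65" at ~6504 K, "D93" at ~9305 K.
for name, cct in [("D50", 5003), ("D65", 6504), ("D93", 9305)]:
    x, y = daylight_xy(cct)
    print(f"{name}: x={x:.4f}  y={y:.4f}")
```

The falling x and y values as temperature rises are the blue shift in numbers: a D93 monitor sitting next to a D65 monitor will always read cool.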


Gamma

I don’t know how many monitors I’ve seen arrive on set at a gamma of 2.2. This was the old standard in the days of CRTs, when a true black was nowhere to be found (tubes tend to be a dull gray at their darkest), but now that displays can show deeper blacks the new standard is 2.4.

No, that’s not completely true: the new standard is BT.1886. Believe it or not, while a capture standard for cameras has existed for decades (Rec 709), a similar standard for displays has appeared only recently. That’s what BT.1886 is. Many professional monitors have it built in. The question is, should you use it?

That depends on who you ask. BT.1886 has a base gamma of 2.4 but there’s a modification that causes the first few steps of brightness to rise out of black at an accelerated rate. This allows for the preservation of a true black while taking advantage of the fact that modern displays have more dynamic range in the shadows, as this little brightness “bump” increases the visibility of dark shadow detail. In theory, BT.1886 is a function that adapts to a monitor’s maximum white and black values and presents a somewhat uniform experience between displays, but the reality is that there are a lot of complaints that it lifts shadows too much on displays that don’t have very deep blacks. For those, a gamma of 2.4 is almost always better.
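That shadow “bump” can be written down directly from the BT.1886 formula, which derives its curve from a display’s measured white and black luminances. A minimal Python sketch next to a pure 2.4 power law (the 100-nit white and 0.1-nit black here are hypothetical example values, not a recommendation):

```python
def bt1886(v, lw=100.0, lb=0.1):
    """BT.1886 EOTF: maps signal level v (0-1) to luminance in cd/m^2.
    lw = display white luminance, lb = display black luminance."""
    g = 2.4
    a = (lw**(1/g) - lb**(1/g))**g          # overall gain
    b = lb**(1/g) / (lw**(1/g) - lb**(1/g)) # black-level offset
    return a * max(v + b, 0.0)**g

def pure_gamma(v, lw=100.0, g=2.4):
    """A plain power-law display, for comparison."""
    return lw * v**g

# On a display with a non-zero black (lb = 0.1 nits), BT.1886 lifts
# the first few steps out of black relative to a pure 2.4 power law:
for v in (0.0, 0.05, 0.10, 0.20):
    print(f"v={v:.2f}  bt1886={bt1886(v):8.4f}  pure2.4={pure_gamma(v):8.4f}")
```

Note that when the display black is truly zero, the offset term vanishes and BT.1886 collapses to an ordinary 2.4 gamma, which is why the two settings look most different on monitors with elevated blacks.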

There are cases to be made for all of these gamma settings. A setting of 2.2 may show you more shadow detail when the monitor is in an environment with a high level of ambient light. BT.1886 may work well on an OLED with deep, dark blacks. And 2.4 may work somewhere in between. I might toggle between 2.4 and BT.1886 (if both are available) to see what looks “right” on that particular monitor, but more often than not I use 2.4.

The important thing is that once I make a choice, I stick with it. While a gamma of 2.2 might make shadows more visible outdoors, and 2.4 might give me crisper blacks indoors, I don’t have the time or bandwidth to check the monitor calibration every time I change locations. I don’t often work with a DIT, and my camera assistants tend not to be experts in setting up monitors. I’d rather use one overall setting and work to bring down ambient light levels around the monitor than toggle back and forth between gamma settings and occasionally forget what I’m looking at.


Color gamut

Unless you’re monitoring in UHD (Rec 2020) or HDR (Rec 2100), you should always choose Rec 709. It’s kind of a “worst case scenario”: if it looks good in Rec 709, it’s probably going to look good everywhere else.

On one job I walked by video village and saw that one of the OLED monitors was pink. It turned out someone had set it to decode S-Log on a previous job and the rental house hadn’t reset it. While most displays don’t have options like that, both Canon and Sony incorporate settings into theirs that will make sense of a log signal sent to the monitor from one of their own cameras. It’s always a good idea to make sure a monitor’s color gamut is properly set before the client, producer or director sees an image they aren’t expecting, as that can be detrimental to one’s career.


Calibration

A Sony product manager once told me how the factory calibrates the Sony PVM-A170 OLED (my favorite on-set monitor). A robotic arm holds a flat sensor up to the surface of the display and tunes it section by section. It takes a full morning to calibrate a PVM display, and an entire day to calibrate a BVM display. When I told him that a lot of rental houses tune up those displays using off-the-shelf calibration tools, or even by eye, he was horrified.

One reason he was horrified is that many calibration tools aren’t smart enough to apply what is known as a Judd offset. The CIE 1931 study that attempted to spec out how humans see color is flawed in the blue portion of the spectrum, and a researcher named Judd proposed an offset in the 1950s to correct this error. The correction wasn’t necessary until fairly recently because the error doesn’t manifest in CRTs or conventional LCDs: their color saturation isn’t high enough. RGB-LED-backlit LCDs and OLEDs, however, cross that line. If you calibrate your display and it has a bluish-green cast, your calibration tool might not be applying that offset.

I’ve seen LCDs that needed careful post-factory calibration, but few quality OLEDs from major manufacturers need it. If a rental house has custom calibration settings loaded in, I might toggle them on and off to see what’s going on. More often than not I turn them off. Unless a manufacturer has clearly made a mistake, I want to know how a monitor looks in its native mode, because that’s the same no matter where I go. In-house monitor calibrations might be an improvement or they might not be: in my experience, this is highly variable.


User controls

I find that the factory detents (the default click-stop positions for controls such as brightness, contrast and chroma) almost always work best on modern OLEDs.

If you have a monitor you don’t trust, and you don’t know how to set it up using color bars, you should at least learn how to set white and black. Every test pattern has chips that fall just above, just below and at black. You want the chips just above black to be faintly visible, while the others should simply be black and blend together. The white chip should be dimmed, then dialed up until it just starts to appear white to the eye.
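As a concrete reference point, those near-black chips on a PLUGE-style pattern sit at small offsets around code-value black. A sketch of where those levels land in 10-bit narrow-range video, assuming the common −2%/+2%/+4% chip offsets (check the spec of the actual pattern you’re using):

```python
# 10-bit narrow ("legal") range video: black sits at code 64, white at 940.
BLACK, WHITE = 64, 940
SPAN = WHITE - BLACK  # 876 code values from black to white

def pluge_code(percent):
    """Code value of a PLUGE chip offset from black by `percent`
    of the black-to-white range (e.g. -2, +2, +4)."""
    return round(BLACK + SPAN * percent / 100.0)

for p in (-2, 0, +2, +4):
    print(f"{p:+d}% chip -> code value {pluge_code(p)}")
# On a correctly set monitor, the +2% and +4% chips are faintly
# distinguishable steps, while the -2% chip merges with black.
```

The point of the exercise: the visible chips are only a couple of percent above black, so even a small brightness error hides them or lifts the sub-black chip into view.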


Final thoughts

Trust no one. Always look at what’s set inside your monitor. If you’re working with a user-calibrated display, make sure those settings are an improvement over the defaults (if that’s possible to discern without a reference). Certainly don’t trust settings you haven’t put in yourself, and make sure the basics of Rec 709, D65, and 2.4 gamma (or BT.1886) are in place before you roll on anything.

Disclosure: I’ve worked as a paid consultant to Sony, although not for the monitor division.

Art Adams
Director of Photography



Art Adams is a cinema lens specialist at ARRI, Inc. Before that, he was a freelance cinematographer for 26 years. You can see his work at Art has been published in HD Video Pro,…

Comments
Paul Nordin

Nice overview, Art. But, damn, how many articles have you and Adam published here about the same thing: Monitor Calibration over the years?! The topic that keeps on keepin’ on. 🙂


David

“D50: I have no idea why this is even in a video monitor. The only use I’m aware of is for proofing print materials, as ambient window light tends to fall at around 5000K.”

Because D50 is actually more neutral; D65 is a bit more blue. I don’t know why D65 became the standard.