3 frequently repeated myths about HDR

A whole heapin’ amount of info on HDR

In developing the first episode of season two of 5 THINGS, I did quite a bit of research on HDR. During that time, I came across many incorrect assumptions and myths about HDR. Surprisingly, many were from other tech-minded folks in the industry. In honor of the recent Mythbusters swan song, I’ve decided to bust 3 frequently repeated myths about HDR.


1. HDR and 4K are mutually exclusive. They are either/or.


In fact, folks have been shooting in formats that allow for greater dynamic range for almost 160 years – well before video was even invented. We now call that HDRi – High Dynamic Range Imaging – and it involved combining multiple photographic exposures of different lengths of the same scene after the fact. It’s low tech by today’s standards, to be sure, but it certainly produced a viewable image with a much wider range between light and dark than traditional single-exposure film.

“Brig in the moonlight” (above) may have been the first HDR photo. 1856-57 by Gustave Le Gray

Fast-forward almost 130 years, and HDR video became a reality, albeit known only to enthusiasts and folks with deep pockets. It followed the same methodology – shooting the same subject at the same time at different exposure levels, and then combining them downstream of image acquisition.
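The core idea behind combining exposures can be sketched in a few lines. This is a deliberately naive, hypothetical illustration – not any camera's or tool's actual algorithm: for each pixel, keep the sample from whichever exposure landed closest to mid-gray, so blown-out highlights and crushed shadows get discarded.

```python
def merge_exposures(exposures):
    """Naive HDR merge: for each pixel, keep the sample closest to mid-gray.

    `exposures` is a list of images of the same scene shot at different
    exposure times, each image a flat list of 0-255 luminance values.
    """
    MID_GRAY = 128
    return [min(samples, key=lambda v: abs(v - MID_GRAY))
            for samples in zip(*exposures)]

# A bright exposure blows out the window (250) but holds the shadows (120);
# a dark exposure crushes the shadows (10) but holds the window (200).
bright = [250, 120]
dark = [200, 10]
print(merge_exposures([bright, dark]))  # [200, 120]
```

Real merge algorithms weight and blend samples rather than picking one, but the principle is the same: each exposure contributes the part of the range it captured well.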

It was known then that acquisition was possible, but the immediate limitation was the exhibition of said HDR content. That, and the exorbitant price tag. And so it became a video tool available only in high-end industrial applications and to enthusiasts.

It wasn’t until the era of the Y2K bug that sensors capable of capturing a wider dynamic range became somewhat accessible in price, and physically small enough to be built into an affordable acquisition tool.

That puts us squarely in today’s marketplace, which just so happens to be right as UHD/4K acquisition is being pushed (as for who is doing the pushing, we’ll save that for flame wars on the forums, I’m sure).

Thus, 4K and HDR happen to converge at a point in consumer technology time…and they just so happen to coexist in relative harmony.


2. I’ll have to develop brand new workflows for HDR!


Well, you might. But the industry already has a good foundation for this. Remember offline/online workflows? I know, it seems so long ago…

In the past ~12 years, post production has enjoyed a digital luxury in the media creation realm: content encoded at the time of recording that could be edited robustly in post – as is, without any transcoding. News organizations enjoyed the XDCAM formats, and many cinema-type cameras worked with post production in mind, shooting more robust, post-friendly formats in camera. Shooting ProRes and DNxHD flavors gave us quality formats that met and even exceeded broadcast and exhibition specifications and remained usable throughout post. And for a short time, for a good majority of projects, we enjoyed a homeostasis. Sure, the feature film market has always worked the offline way, and many TV shows still followed that model, normally for storage cost reasons…but I digress.

But there was a time. A time…before. A time when folks shot on a physical medium – film! – and had to do an offline edit; that is, edit with a lower-quality version of the raw footage for the sake of ease of use. And with HDR, that’s just what we have to do now.

We’ve been spoiled by the “edit natively” NLE battle cry. Those who edit often are well aware that a smooth edit with high-quality camera originals only happens when all of the post stars (and codecs and CPUs) align. The desire for immediate gratification must be tempered by the reality of the technology if post is to remain a pleasurable experience.

Thus, it’s imperative you remember the post days of yore: shoot – create proxies with a one-light pass – edit – reconform – grade – export. It’s not immediate joy-joy feelings, but you will certainly enhance your calm.


3. All I have to do is put a LUT (Look Up Table) on my HDR image and all will be good, right?


In audio, we have the term “suck knob.” As in: your audio sucks, so let me turn down (“fix”) the amount of suckage with this single knob. A LUT is often treated the same way.



A LUT – whether used in post on footage shot with HDR in mind, or on footage simply shot flat to gain color latitude in post – leaves you with just as many color flaws as the non-flat footage you used to shoot. You simply now have more latitude to fix that color later. It’s not a Staples red button (“That was easy!”). A LUT is not a fix; a LUT is a starting point.
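Under the hood, a 1D LUT is just table lookup with interpolation: an input code value indexes into a table of output values. This minimal sketch (a hypothetical illustration, not any specific LUT format such as .cube) shows why a LUT can only ever be a starting point – it applies the exact same fixed mapping to every pixel, and knows nothing about the flaws in your particular footage.

```python
def apply_1d_lut(value, lut):
    """Map a normalized [0, 1] input through a 1D LUT with linear interpolation."""
    pos = value * (len(lut) - 1)   # fractional index into the table
    i = int(pos)
    if i >= len(lut) - 1:          # clamp at the top entry
        return lut[-1]
    frac = pos - i
    return lut[i] * (1 - frac) + lut[i + 1] * frac

# A tiny 3-entry "lift the shadows" table: 0.0 -> 0.1, 0.5 -> 0.6, 1.0 -> 1.0
lut = [0.1, 0.6, 1.0]
print(apply_1d_lut(0.0, lut))   # 0.1
print(apply_1d_lut(0.25, lut))  # halfway between 0.1 and 0.6 -> 0.35
```

Real LUTs have hundreds or thousands of entries (and 3D LUTs map R, G, and B jointly), but the principle is identical: one fixed transform for all footage, which is exactly why a shot-by-shot grade still has to follow.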

I’m serving up a whole heapin’ amount of info on HDR in the upcoming episode. Check out Episode One of 5 THINGS on HDR, available on Tuesday March 15th!



Michael Kammes

Michael Kammes is the Director of Technology and Marketing for Key Code Media. He consults on and demonstrates digital filmmaking workflows benefiting the post industry, including workflows for stereoscopy, acquisition, storage, editorial, audio, finishing, and encoding. His post audio experience encompasses serving as a Dialogue Editor, SFX Editor, ADR Recordist and Editor, Re-Recording Mixer, and Supervising Sound Editor on many film, television, and internet-based projects. He is currently a member of the MPSE (Motion Picture Sound Editors), an FCP Apple Certified Trainer, and an Avid Certified Support Representative (ACSR). You can catch him as a presenter at colleges, conventions, technology retreats, and symposiums. Want more information about Michael Kammes? You can find it at http://www.MichaelKammes.com

  • Mike Cavanagh

Nice piece. I saw the secret Sony monitors last Friday comparing “regular” 4K to 4K + HDR. The latter looked so real it looked fake. Fluorescent light was totally blown out on the 4K, but was in full glory in 4K HDR. What’s scary about offline/online was 15:1s and online at AVR26. Ahhh, 1994!

    • scottsimmons

AVR … those are three letters together I don’t think anyone wants to see come back

      • Michael Kammes

Certainly not, but what it does signify is “perception of quality”. There was a time when AVR was acceptable broadcast quality. Just like many years from now we’ll look back at non-HDR and think how archaic it was. Video tech is almost as ever-changing and fickle as the pop culture landscape.

        • Andrew Smith

          What is this AVR that you speak of? Never heard of it before and it seems neither has Google. (I searched “AVR video”)

          • Michael Kammes

            AVR – Avid Video Resolution. The first video compression Avid systems used. Via Wikipedia:

            The first-release Avids (US) supported 640×480 30i video, at resolutions and compression identified by the prefix “AVR”. Single-field resolutions were AVR 1 through 9s; interlaced (finishing) resolutions were initially AVR 21-23, with the later improvements of AVR 24 through 27, and the later AVR 70 through 77. AVR12 was a two-field interlaced offline resolution. Additionally, Avid marketed the Media Composer 400 and 800 as offline-only editors.

          • Steve Hullfish

I was one of the first to use this crappy AVR for broadcast… I was editing at Oprah and I think we were broadcasting AVR27, which I think was about 1/4 of uncompressed standard definition video… maybe less. That was in the early-to-mid 1990s. It was one of the first “broadcast resolution” codecs that Avid had… the original resolutions were all offline quality only. Everything had to be sent via an EDL and conformed in an online suite.