John Bourbonais says it all in the title of his recent article, Why 4K is Wrong. In it, Bourbonais looks at the issues 4K presents around quality, post, delivery and future-proofing while laying out why 4K doesn’t add anything to the process or the product. He doesn’t delve too far into the details or even disarm the easy counter-arguments, but he certainly makes his case.
It’s a topic that came up during the PVC Panel at the most recent ETE. Clint Milby and Von Thomas were focused on the logistics of 4K and how it impacts their workflow, while Terence Curren has never been shy about his dislike of 4K. He backs up that position with several of the points Bourbonais talks through in his article, but goes into far more detail about why it doesn’t make sense for the professional or the consumer. Fast forward to 6:50 in the video to hear specifically what the guys had to say at the event.
Bourbonais makes a lot of sense, but it’s pretty clear that 4K is more than the flash in the pan he claims it to be. 4K was all over CES just like it will be at NAB, so not only has it arrived but it’s readily available for both professionals and consumers. So really it comes down to the needs and desires of both groups. Do professionals need to shoot, edit and deliver at 4K because of the perceived benefits? Do consumers want to watch something at 4K because they’re really able (or think they’re able) to tell the difference between 4K and HD? Or should moving to 4K (and eventually 8K) just be accepted as inevitable?
For me, 4K is just another workflow choice. It should be decided just like any other technical decision on a shoot. Shooters now have a plethora of choice in terms of bitrates, resolution, aspect ratio, and framerates. These are decisions we have to make on every shoot, and “to 4K or not to 4K” is just another such decision.
When I’m building a camera package for a project, I talk to the production or editor and determine what the final deliverable requirements are. That helps determine what camera system we use, whether they need a flat or RAW image (or maybe a clean & pretty “pre-baked” look), what resolution would serve the post process, and so on.
When delivering 1080 or 2K, 4K is obviously still useful for repositioning, stabilizing, etc., so that would be one reason to choose it. Most of my work is intended for 1080p delivery, so delivering in 4K hasn’t really been a deciding factor to date. That will change in the coming years, I am sure. But until then, 1080p is my default standard, and 4K is something mostly useful in post, before output.
Of course, budget is always a key factor in these decisions. And a full 4K post workflow adds time & cost to the budget. Oftentimes I shoot content that has a very fast turnaround, and in those cases 4K acquisition can be a major slowdown.
Ultimately this should be factored in like any budget decision…weigh the advantages vs the cost. Most of what I shoot tends to have a relatively short shelf life (non-theatrical), so 4K for me is currently just another feature that I would choose if and when it would be helpful. It’s not something I would choose as a default for music videos, commercials, or web content.
If having a presence at CES justified the future of something, then we would all be doing 3D by now. The reality is that all but a very few home viewers will be unable to perceive the difference between 4K and HD anyway. So we should all invest in brand new gear and handle 4x the bandwidth, with the accompanying slowdowns and data storage issues, for a net result of no improvement to the consumer? And we should do all of this without being able to charge significantly more to cover the costs?
The whole 4K farce boils down to two questions:
- What is the maximum resolution the human eye can perceive at a normal viewing distance for a given screen size?
- Can we call that good enough, or how far beyond it are we willing to go while wasting money and bandwidth?
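The first question has a rough back-of-the-envelope answer. A viewer with 20/20 vision resolves about one arcminute of detail, so the useful pixel count depends only on screen width and viewing distance. Here is a minimal sketch of that calculation; the screen size, couch distance, and one-arcminute figure are illustrative assumptions, not numbers from the article:

```python
import math

ARCMIN_PER_DEGREE = 60  # 20/20 vision resolves roughly one arcminute of detail

def max_useful_horizontal_pixels(screen_width_in, distance_in):
    """Horizontal pixel count beyond which extra resolution is invisible
    to a viewer with 20/20 vision at the given distance (inches)."""
    half_angle = math.atan((screen_width_in / 2) / distance_in)
    screen_width_deg = 2 * math.degrees(half_angle)
    return screen_width_deg * ARCMIN_PER_DEGREE

# A 65" 16:9 TV is about 56.7" wide; a typical couch sits ~9 ft (108") away.
couch = max_useful_horizontal_pixels(56.7, 108)  # ~1765 px: HD already saturates the eye
close = max_useful_horizontal_pixels(56.7, 48)   # ~3668 px: 4K only pays off this close
```

By this estimate, a viewer at normal couch distance can't resolve much beyond HD on a 65" screen; 4K only starts to matter at arm's-length-of-the-screen distances, which is the point the calculator linked below makes interactively.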
A couple of handy resources on the subject:
First, can you see the difference? 4K Calculator – Do You Benefit? – Reference Home Theater
Next, how do you measure the ability to perceive differences? (warning, this one makes my head spin) http://www.mpeg.org/MPEG/JND/
Personally, I don’t believe it’s going to sell to the masses. TV manufacturers are pushing it because their sales are flagging, but why is someone going to dump their perfectly good HD set for a 4K set when they can’t see the difference?
Oh, and if you truly buy into 4K you should be pushing to go 8K. The Japanese are already doing that.
Stunning Good Looks
“Until a producer or client calls and asks specifically for a 4K camera, there appears to be no real reason to have one for most video production professionals.”
Well, guess what: that’s already happening. Why, do you suppose, is that the case? Could it stem from a certain camera company’s marketing tactics, where resolution is one of their key selling points? Could it be that a major studio’s parent company, who also owns subsidiaries that manufacture and sell both cameras and consumer electronics, wants to sell a LOT more TVs?
I’ve written before that new technology has become a lot like storytelling. Hollywood loves to repeat itself because no one ever really knows which stories will take off and which will sink like stones. Rather than take a risk they’d rather go with the tried and true, even if it ends up being boring and predictable, because it will most likely make some money, which is better than taking a chance on another “Ishtar.” Rather than seek to produce a project that succeeds on its artistic and storytelling merits, studios will instead go for the proven formula because it’s quantifiable, whereas art and storytelling are anything but.
The same thing is happening to technology in media production. Rather than hire someone who knows how to make compelling imagery, the producer or client “stacks the deck” by making sure they use whatever tool or format gets a lot of buzz at the moment. They, too, have a lot of money on the line and they want some quantifiable way of guaranteeing that they’ll get it back and then some. Technology is more predictable than artistry, or so they think. (They often confuse artistry with craftsmanship, which IS predictable and which is what really counts, but tends to cost more than they often want to pay.)
The problem is that it’s questionable whether 4K makes any real difference in the average theater environment. I’ve not heard of a lot of studies that show 4K presents a noticeably better experience. A year ago I saw a presentation at a major studio that compared 2K and 4K on a reasonably large screen and I couldn’t see much of a difference. I certainly wasn’t blown away.
What DID blow me away was seeing 4K footage on a 4K TV: the amount of detail was both eye-catching and staggering. If I were an exec at an advertising agency I’d be looking at 4K production for digital signage installations and billboards. Cramming 4K worth of detail onto a 6′ wide OLED display makes a huge difference in image quality, whereas projecting it onto a large cinema screen seems to make less difference.
4K is, of course, tremendously useful for VFX, car spots, aerials… anything that needs stabilization, certainly. But I’m getting requests to shoot 4K for spots that will only ever be seen on YouTube. I’m happy to do that, but budgets are tight and I’d rather see that money spent somewhere that will actually make a difference to how the product looks.
4K for YouTube distribution makes very little sense outside of the exceptions noted above. 4K for big screen presentations and point-of-sale… sure! Makes perfect sense.
Until the 4K Alexa shows up I have to do a lot of selling on such projects to shoot HD or 3.5K with a current Alexa. For me, the look I get out of that camera is vastly more important than oversampling live action footage for broadcast or the web. It also speeds me up considerably as I can let more stuff go lighting-wise that I can’t with other cameras.
The Sony F55 is catching up fast as a low cost 4K alternative, and just in time. I’m sure Arri will release a 4K Alexa eventually—rumors have been flying for a year or more, and Arri prefers to release products later than other companies while getting everything right the first time—so it’s nice to have a low-cost alternative with color science that’s coming along nicely. (I’m anxious to see how Red’s Dragon sensor looks, but I’ll have no idea until it’s released.)
There are solid uses for 4K right now, but unless the content requires “future proofing” (which most of the projects I shoot don’t need) 4K is a waste outside of certain circumstances. But it’s definitely coming, and I’m guessing that 4K origination will be common in five years or less as displays become affordable and distribution channels adopt H.265.
As a cinematographer 4K interests me but it doesn’t excite me overly much. In theory 4K allows me to shoot wider, more epic shots into which viewers can lose themselves as they soak in every detail, but current styles dictate rapid cutting and constant camera movement that completely negate that experience. Textures come alive in 4K but that’s not always a good thing, particularly where faces are concerned. Panning speeds must be extremely slow when shooting for 4K monitors as strobing and loss of detail become severe problems.
While I may sound like I’m wearing my Luddite hat (and it’s a very nice hat, too—made entirely of oak and held together by wooden pins instead of nails) I’d love to shoot more 4K. I just don’t want to shoot it because it’s a fad, or a buzzword, or the thing that clients want because it will magically make their projects successful. I want to shoot it because it’s the right tool for the story. I can think of lots of other things that I would trade for 4K:
- Improved dynamic range. Only a couple of cameras give us the same kind of freedom that film did in this regard.
- Improved color. Several camera companies seem to spend more time looking at charts than at the real world, and more time listening to engineers (who quantify color) than artists (who understand how color is used.)
- Improved ergonomics. Getting the camera in the right place at the right time makes a huge difference in the quality of coverage.
- Simpler recording. I’d love a camera that has maybe five settings and a start/stop button. Less thought about camera setup means more time to shoot.
Any of these will dramatically improve visual quality long before 4K will, particularly because we can see the results NOW. Where can I see a TV show or movie shown in actual 4K resolution today?
We’re probably headed toward a 4K world, and that’s fine. I certainly like HD quality over SD, so I’ll probably enjoy 4K over HD as well. If the experience is overwhelming in some cases (if the grass on the field at the Super Bowl is so dazzling that I don’t notice that there’s a game on) it’s easy to knock the image down a peg or two. Manufacturers are always looking for reasons to sell us new technology and I’m hardly one to complain about an improved visual experience. Still, though… until I see 4K broadcast pick up steam I’m in no hurry to give up shooting 2K on an Alexa. I’ll take stunning color and exceptional dynamic range over ridiculously high resolution any day.
TV sales and viewing are both down, and hi-res tiny mobile screens are way up and not slowing down. So it’s understandable that manufacturers invested in big screens will invent bigger, wider screens. Let’s hope they are serious about contrast and color and not just pixel numbers.
There has been talk of hi-def TV for longer than I can remember. Why did it take so long to arrive, and why has it still not fully arrived on cable? And how long will it be before 8K is preferred, with prototypes already at CES?
From my primary perspective as an editor, my main thought about 4K is that I can use a single camera to get both a wide shot and a close-up on an interview without needing two cameras. That “Duck Dynasty” setup of two F900s, where one is basically on top of the other so you can do that smash-in cut to the close-up on the punchline, is maddening.
I’m out shooting interviews today with two cameras and I’d definitely prefer the 4K option for what I’m doing. Clearly there are aesthetic reasons for shooting an interview with two distinct cameras, whether it’s for a slightly different perspective or the different look that cutting from a 50mm prime to an 85mm or a 28mm will give you. But pretty commonly, all I want is to cut tighter to cover an edit. And if 4K means I don’t have to see any more television interviews with a second camera pointing into someone’s ear, I’m all for it.
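The arithmetic behind that single-camera punch-in is simple: a UHD frame delivered at 1080p can be cropped up to 2x before any upscaling is needed. A quick sketch, with the frame sizes and the 1.5x punch chosen purely for illustration:

```python
def max_punch_in(source_w, delivery_w):
    """Tightest 'close-up' crop that still fills the delivery frame
    without upscaling."""
    return source_w / delivery_w

def centered_crop(source_w, source_h, punch):
    """Pixel rectangle (x, y, w, h) for a centered punch-in of the given factor."""
    w, h = round(source_w / punch), round(source_h / punch)
    return ((source_w - w) // 2, (source_h - h) // 2, w, h)

print(max_punch_in(3840, 1920))        # 2.0 -- UHD buys a 2x "second camera" in post
print(centered_crop(3840, 2160, 1.5))  # (640, 360, 2560, 1440)
```

In other words, one UHD camera covers everything from the full wide to a 1080p close-up that is effectively twice as tight, which is exactly the interview use case described above.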
But 4K for 4K’s sake is going to be a ways off for me.
The Pixel Painter
Getting into 4K is a cost-prohibitive point of entry for a lot of people – from the camera gear to the new computer gear to edit it on. As much as I’d love to get a new pair of 4K BMDCCs and a loaded Mac Pro trashcan and all the peripherals to support it, to do it right I’ve got to come up with a minimum of $40-50K just to get in the game.
I don’t doubt that there are uses for 4K production, but until I get a client willing to pay 4K production prices regularly, I’m happy with 2K and delivering 1080p for all their Vimeo and YouTube hosting and delivery.
As one who works exclusively as an editor, the first things I think about when taking on a 4K job are: How many more gigabytes of footage am I going to get than normal? How much more time is it going to take to transfer that footage from the little portable drive that will come from the shoot? How much longer will it be before I can get to editing, since I have to transcode that footage down to something usable?
Of course I ask those questions because I am still working on an old Mac Pro tower, often in Avid Media Composer, and not a brand new machine designed for 4K. But I also ask them because I have clients who have asked me about 4K and fully intend to shoot upcoming jobs in 4K. And since they are shooting in 4K they want to post in 4K. And since they are gear heads and read the blogs, they know it’s as easy as plugging in a Thunderbolt drive, firing up an NLE and going to work … right?

Now, no one has asked exactly what screen we will be looking at while we edit the footage. And no one has asked who or what will take that 4K master file when we are done. The answer is that we will be looking at the same old screen we have always looked at, and the master file will be the same old 1920 x 1080 one we have always delivered. There may be a time in the future when an all-4K post-production pipeline makes sense for some jobs. By then hopefully we’ll all have new Mac Pros that can cut it like butter and a Chinese TV so we can see some type of 4K image at 4K, but until then it’s way overkill.
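Those storage and transfer questions are easy to put rough numbers on: data rate scales roughly with pixel count, so UHD material runs about four times the size of HD at the same codec settings. A back-of-the-envelope sketch, where the bitrates and shoot length are illustrative assumptions (ProRes 422 HQ runs near 220 Mb/s at 1080p30, and roughly 4x that at UHD):

```python
def storage_gb(bitrate_mbps, hours):
    """Approximate storage for footage at a given data rate (Mb/s -> GB)."""
    return bitrate_mbps * hours * 3600 / 8 / 1000

# Illustrative rates only: ~220 Mb/s for ProRes 422 HQ at 1080p30,
# and roughly 4x that at UHD, since bitrate scales with pixel count.
hd_shoot  = storage_gb(220, 4)   # 4 hours of 1080p interviews -> 396 GB
uhd_shoot = storage_gb(880, 4)   # the same shoot in UHD       -> 1584 GB
```

That extra terabyte-plus per shoot is what has to travel on the little portable drive, get copied to the edit system, and then be transcoded – all before the first cut is made.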
That internet delivery where everyone will see your product will look just fine at 720. That 1080 commercial will be long forgotten in a few weeks. And that music video? I really doubt there’s much need to “future proof” that.
There are some obvious analogies to the transition from analogue TV transmission (PAL/NTSC) to digital 16:9 TV, and also to the transition from shooting film to shooting digitally. However, everything that was learned from those changes seems to be overlooked by marketing departments when discussing 4K!
The introduction of digital TV and HD TV demonstrated that the general public prefers greater choice over better image quality. Broadly speaking, the public would rather have 5 or 6 channels of heavily compressed, SD television than one channel of pristine HD footage. With limited bandwidth, you would have to wonder what the incentive is for broadcasters to push 4K considering that even HD has not proved a strong selling point.
What are the benefits? Quality?
The biggest issue I have with 4K is the difference between the theoretical benefits and the realities of production. Quality is not just a number on the camera and a setting in Final Cut, it’s an entire production chain. If 4K is shot with a high-end camera, using high-end prime lenses, with minimal or no compression, the footage is edited and graded properly etc etc etc then great. But in reality, the majority of us working with 4K will be facing heavily compressed footage shot on prosumer cameras with cheap zoom lenses, and I’m not sure where any benefits will be.
We should clearly have learnt from the smartphone megapixel wars that the ‘bigger number is better’ approach is pretty naive. You can buy a phone with a camera that has more pixels than a DSLR, but no serious photographer would suggest the phone is better. The number of pixels is simply not the main determining factor in image quality, and although I’m hardly a technical expert I’m aware that the physical size of the sensor, the noise in the signal, and latitude are much more significant. That’s before we consider the lens, which outside of a Hollywood shoot is probably the most under-rated component in the production chain. If we look at what happened with smartphones, we should brace ourselves for a flood of cheap prosumer cameras that claim 4K on the sticker but produce inferior pictures to higher quality HD cameras.
If the marketed benefit of 4K is image quality, and the selling point of upgrading to 4K equipment is to improve image quality, then is there a better return on investment than upgrading to 4K gear? In any hypothetical scenario, is the money better spent elsewhere to get a better result? For a typical corporate video production, I would suggest that using conventional HD equipment but spending money on a specialist colour grader will produce a bigger difference in perceivable quality than simply spending the same money on a 4K camera (assuming that in our hypothetical situation colour grading is usually done by a multi-skilled editor). In terms of purchasing, a new set of lenses for an existing camera may yield better results than a new 4K camera that comes with a plastic zoom lens. Or even hiring something like an Alexa and shooting 2K would give a better result than purchasing a cheap prosumer 4K camera.
Lessons from the death of Kodak
It wasn’t that long ago when articles about pixel counts and quality were popping up everywhere, although they were to do with the transition from shooting film to shooting digitally. On one hand you had manufacturers – especially Sony – bandying around pixel numbers and resolution charts, on the other hand you had DoPs claiming it was actually all about latitude, which digital cameras lacked by comparison, and that film would always look better. It would be interesting to look back at some of the articles written from that time to see what can be applied to the 4K debate. Certainly the size and noisiness of the image sensor is critical to the end result, no matter what the pixel count is. Again, we can anticipate a flood of cheap prosumer 4K cameras that may have more pixels, but poor latitude.
Where’s the Quality?
My mother-in-law is still happy to watch VHS. Most of my movie collection is on DVDs – standard definition. The basic quality of the image has little to do with the enjoyment of watching something great. “Leaving Las Vegas” and “The Hurt Locker” won Academy Awards and they were shot on 16mm. “Moonrise Kingdom” was shot on 16mm. They wouldn’t be “better” movies if they had been shot in 4K.
Citizen Kane on VHS will always be a better film than Transformers 5 shot in 4K.