Alexis Van Hurkman is a man of many talents. He is perhaps best known for his expertise in post-production, and as the author of books such as the “Color Correction Handbook” and “Adobe SpeedGrade: Getting Started,” that reputation should come as no surprise. ProVideo Coalition readers also know him from his Answers Occasionally Given column, where he fields specific questions on topics ranging from the various approaches to balancing scenic color to the differences between visual and graphical analysis, to name just a few.
His talents extend to other areas of a production though, as he’s also an accomplished writer and director. He needed to rely on all of these skills for the short that he recently completed, “The Place Where You Live”, which he wrote, directed and edited. The film tells the story of a professor of physics who gets abducted by her counterpart from an alternate dimension. Replaced by her doppelgänger, she struggles to rebuild the machine and reopen the gateway between worlds to regain the life that should be hers.
Alexis lived and breathed this project for a long time, and we wanted to talk with him about what it took to make this film a reality. In this interview, Alexis tells us about his inspirations for the story, what it means to be involved in every aspect of a production and how the tools he used impacted the way in which “The Place Where You Live” came together.
Watch the trailer for the film below…
Your original sci-fi short, “The Place Where You Live,” deals with the many-worlds interpretation of quantum physics. Is this a topic you’re passionate about?
I’ve had a casual interest in physics ever since high school, which probably goes hand in hand with my longstanding enthusiasm for science fiction. I’ve always been fascinated with research delving more deeply into how the universe works, which also happens to provide actionable by-products like faster computer chips, interesting optics technologies, and all manner of other industrial advances. Narratively, I’ve long been a fan of “hard” science fiction, which tries to depict as plausible a use of futuristic technology as its authors know how. In both cases, my interest comes down to a fascination with problem solving, which is an underlying theme in much of what I do creatively. As far as “The Place Where You Live” goes, the many-worlds hypothesis provides a plausible-sounding door for viewers to walk through in order to suspend disbelief about what they’re seeing, but at that point my movie is more interested in the ramifications of these events on the main character. The central story is really about what she’s doing and why she’s doing it; physics is a backdrop.
Not to take us down a rabbit hole, but quantum mechanics could do anything from demonstrate the existence of god to provide a means to destroy the universe. And that’s not even the half of it, is it?
Well, demonstrating the existence of god is a tall order for any line of scientific or philosophical inquiry; destroying the universe also seems a bit grandiose. However, the very question you ask is what makes quantum mechanics so interesting narratively; as a writer of fiction, I’m actually more interested in what kinds of questions this sort of research evokes in the viewer than I am in the science itself. I’m a wretched mathematician, and so I’m condemned to only reading articles aimed at the layman. However, the many-worlds hypothesis suggests so many possibilities that as a writer it’s an embarrassment of riches. And, it allows me to employ vague visual suggestions as a director in order to get viewers to fabricate their own interpretation of what’s going on in the movie, which I find endlessly fascinating. In particular, everyone I know seems to have a different take on the ending, which I love.
Can you talk a little bit about the time period and technology that you showcase throughout the film?
I’m deliberately vague about the time period. I like to think that it’s obviously a near-future scenario, with technologies that may be just around the corner, but I hate to pin a date on anything, since invariably you’re going to be wrong. I try to avoid showing any technology that will be obviously dated in a year or two; showing a cellphone is almost the worst thing you can do in a movie anymore, since they change so rapidly; yesterday’s latest model is tomorrow’s embarrassing brick. But on a short film budget, there are limits. I mulled over faking a car interior, but at the end of the day, I saved the money by just using my own car.
The biggest cue to era is wardrobe, of course. I’m lucky in that my wife Kaylynn Raschke is a stellar art director and costumer, and she pulled wardrobe pieces that are modern and cool, but also have a certain timelessness to them, at least within the last couple of decades. She’s very tuned into what’s happening with fashion and where trends are going, and knows how to avoid wardrobe choices that will appear dated in four years. That’s invaluable when you’re trying to create a “futuristic” movie that will be sitting on the web for who knows how long. She also worked to keep the set dressing “academic” and vague, which is to say the sort of worn, old furniture you’d find in any college professor’s office.
The technology I show is another thing altogether. I’ve become convinced that technology will eventually altogether eliminate physical displays and the limitations they impose. I’ve followed developments in augmented reality for the last twenty years, and seeing what’s happening now in manufacturing with workers using heads-up displays, certain phone-based augmented reality applications, and technology like Google Glass and Oculus Rift, I’m absolutely convinced that in fifteen or twenty years, monitors on tables will be obsolete. Whether you’re wearing a pair of glasses or, as I show in my movie, a pair of contact lenses that are capable of projecting imagery right into your eyes, everything we currently see on a television, computer display or movie screen will then be simply projected into the world. I try to show an office-based version of that sort of technology in “The Place Where You Live” that combines gestural controls (another technology I believe will become more ubiquitous as it matures) with a heads-up display that is motion tracked in place and locked to her desk, car window, or to a console, so that the virtual computer-based displays and controls appear to simply be objects in the world with which she interacts.
The benefit to showing this as the dominant technology of the movie is that I was able to avoid having to actually fabricate props for the futuristic technology shown; it’s all computer HUDs put in during post. The HUDs were created in two stages: the actual animations and graphics were created by Brian Olson and Patrick Burke here in Minneapolis using After Effects, and the composites that marry those elements to the live action in the first scene of the movie were done by Flame artist Aaron Vasquez in New York using Smoke. He did a phenomenal job, especially creating all of the little interactive bits that combined his own animation with the pre-built animation from the graphics package. I composited the other HUDs throughout the movie, based on the style Aaron set with his work.
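The planar-tracking-plus-corner-pin technique behind locking a HUD graphic to a tracked surface boils down to solving a homography from four tracked points each frame. A minimal sketch of that math in numpy, using hypothetical coordinates that are not from the actual production:

```python
import numpy as np

def corner_pin_homography(src, dst):
    """Solve the 3x3 homography mapping four source corners to four
    tracked destination corners (the math behind a corner-pin)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # h33 fixed at 1

def warp_point(H, x, y):
    """Project a single point through the homography."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# A HUD rendered as a unit square, pinned to a quad tracked on the desk.
hud_corners = [(0, 0), (1, 0), (1, 1), (0, 1)]
tracked_quad = [(120, 80), (420, 95), (410, 300), (130, 310)]
H = corner_pin_homography(hud_corners, tracked_quad)
print(warp_point(H, 0.0, 0.0))  # lands on the first tracked corner
```

In a real Smoke or Flame composite the tracker supplies a fresh quad per frame, so the graphic appears welded to the desk, window, or console as the camera moves.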
You wrote, produced, directed and edited this film. What’s the difference between being involved with a project every step of the way versus trying to get your head around it in post-production?
The nice thing about being involved with an effects-heavy project from the very beginning is that you’re able to try and prevent problems early on through intensive pre-planning. You can also control the budget more aggressively by figuring out what’s going to be too expensive in advance, and in my case writing and/or storyboarding around the issue allowed me to cut costs without sacrificing the story. In many cases, it’s simply a matter of substituting a less expensive solution to shoot a visual concept for the original idea. Since I’m intimately involved with both post and pre-production, it’s a fairly easy set of decisions for me to combine. I’m always saying that post-production starts in pre-production, and when directing, I get to put my money where my mouth is.
When you only work on the post side of things for a project, then you get stuck having to deal with every regrettably expensive idea that was acted upon during the shoot, often with little regard to the reality of how long it will take to execute later, whether because of unexpected rotoscoping, or a poor selection of wardrobe that makes keying a nightmare – or any number of endless decisions that can transform an otherwise simple composite or color correction job into a complex, time-consuming chore.
I can’t emphasize that last bit enough. Wardrobe and art direction can be the post-production artist’s best friend or worst enemy. Since Kaylynn and I work together so closely on all of my projects, she now knows enough about digital compositing and color work that she can anticipate the choices that might cause problems, and those that could actually help, in terms of foreground and background colors, dealing with edges and transparency for shots involving keying, etcetera.
Did you ever think to yourself, “I’ll take care of that in post”, or did you make a conscious effort to ensure post wasn’t about “fixing” what you shot?
At 11pm, everyone says “fix it in post”; I don’t care who you are in the crew, you want to go home. That said, we spent a lot of time trying to get things right with the practical components of what we were shooting so that the VFX would go easier later on, and I spent a lot of time overseeing folks as they did things that may not have made sense to them, because I knew how the shots would come together later. That’s the terrific advantage of being a director with hands-on compositing experience. I later went on to composite several of those shots myself, which made me appreciate all the more how small decisions on-set can come with large consequences later.
Perhaps the biggest challenge was aligning two separate sets that join together during the “dimensional doorway” scenes. It was cheaper to set up two small stages rather than one large one, so we ended up shooting the office with a greenscreen, and the lab with a greenscreen, and then joined both locations together digitally by punching out the greenscreen, overlaying the dimensional doorway (a wonderful effect devised by Smoke artist Brian Mulligan) and aligning the opposing set to be visible through the doorway. Cinematographer Bo Hakala did a fantastic job figuring out the math to align locations, cameras and sight lines of actor Dawn Krosnowski. He suggested we use a giant mirror, with even more math, to be able to put the camera at the correct distance from Dawn. This way both locations would line up correctly, despite the tight shooting spaces we had to work with. When I first lined up these shots during the rough cut, I held my breath because I knew they’d either work perfectly, or be a huge hassle; they worked perfectly. By and large, we managed to avoid taking shortcuts that resulted in shots being eliminated because they didn’t work.
On the flip side of things, I’d say the biggest problem I caused myself was not giving enough time to lighting and grip to light certain greenscreen setups well enough. That was totally my fault for packing the entire greenscreen shoot for both the office and lab sets into two days. Fortunately, the Modular Keyer and Master Keyer in Autodesk Smoke are so good that I ended up getting great keys anyway; however, it did take more work. Still, balancing lighting for VFX with scheduling is only solved by implementing a bigger budget to allow for more days of shooting, or by cutting coverage (which the editor in me wasn’t willing to do). I felt I achieved a reasonable compromise between the two.
Tell us about some of the challenges you encountered during filming. How were they resolved?
The biggest challenge was time. I had the budget to hire a proper crew, and build out custom sets. I didn’t want to skimp on lighting and grip because I knew how essential they would be to the composites and final look of the film. I’m a huge fan of having as many practical environments as possible for the actors to interact with, and that requires people. Unfortunately, to have the crew I wanted, I could only afford two and a half days of shooting. I’m convinced that part of being a director is having a hopeless sense of optimism. I was pretty sure I’d be able to get everything we needed within that time, but Assistant Director Molly Katagiri is really to thank for pulling together the schedule that made it all possible. Keep in mind that, for scenes where the doorway is open, one angle of coverage needed to be shot two times – once on each set, and the actor and her double each had two costume changes. Balancing costume changes against the lighting and set transitions was a logistical nightmare. It took careful previsualization on my part to create a complete set of 3D storyboards that Molly was then able to hammer into a shootable schedule.
Our second biggest challenge was the shooting environment – Minnesota in December. The weather had been unseasonably warm and snow-free, which was logistically great; however, the last night of the two-day set shoot, we experienced our first blizzard. We’d originally planned to shoot the third half-day at a frozen yoghurt shop in town, but the owner of the shop was snowed in at home, and had lost his cell phone in a snow drift the night before, so I couldn’t reach him. As I was shivering in front of the shop and the crew began arriving, I figured something had gone horribly wrong, so I immediately sent everyone to my house, where the actors were preparing for that day’s scene. As tightly storyboarded and controlled as the previous two days’ shoot had been, I traditionally work much more loosely when doing live-action, non-VFX shooting. My tendency to block everything on set with the actors the morning of any shoot stood me in good stead, as Bo and I improvised a whole new shooting plan that hour, with all of us up to our knees in snow. The new location worked out well (again, advantages of living with an art director), and I really love how that scene turned out.
Music is a big part of the film and really establishes a tone and mood. Did you have the music you wanted to use already in mind, or was it more of a process to find something that worked after you were done shooting?
John Rake composed the film’s music specifically for this project. We’d talked about it in advance, and I knew I wanted to work with him, so I avoided using placeholder music altogether when doing my initial rough cut. This isn’t much of a hardship for me, since I often find myself humming abstractly as I edit (only in unsupervised environments). It allows me to mull over musical ideas for the timing without committing to anything.
Once I finished a rough cut, I sent it to John first, and we had a long phone conversation about what sort of vibe I envisioned. He then started laying down tracks, and sent me his first “sketch” ideas in about a week, if I recall. I then laid these into the timeline to evaluate them to picture, which led to my first round of notes for him. While he was evolving the music, I had my scratch track to work to, and I continued refining my edits to his musical sketches. As subsequent versions of the tracks came to me, I’d edit them in, refine the timing of the scene, and send him updated rough cuts, along with whatever notes I had. After about three rounds of back and forth, I had the final tracks to lay in.
I’ve used plenty of needle-drop music in the past, as well as using commercial tracks that require later licensing, but at this point I’m completely sold on working with a composer and getting bespoke music tailored to your own project. The process is enormous fun, and the results fit the project so well.
Let’s talk about post-production. You used Autodesk’s Smoke, and the big advantage with Smoke is that it’s editing, color correction/grading, compositing, titling and effects all in one system. Did that fact change the way you approached post for the film?
Smoke’s integration really helped me work fluidly in post. Every time I have to open another application, it makes me wonder whether the thing I want to do is worth it. Obviously, you’re going to move to the right application for the job whenever you need to build a signature composite. But, at the very end of the project, when you’ve reached the point of making small tweaks to various shots that really are debatable, that’s when being able to just open any composite in the edit and do another five minutes’ worth of work makes the difference between whether it’s worth tweaking something or not. Something I’ve learned from years of color grading is that post-production is an iterative process; it’s very much like writing, in fact. Every time you look at something, you’ll probably see something new that you could improve. If it’s easy to dig in and just make the change, then you win. If it’s not, then you second-guess yourself. It’s nice to keep things fluid as you work.
Of course, no application works in a vacuum, and once I locked both editing and VFX, I moved the whole timeline into DaVinci Resolve for color grading. While Smoke has fine grading tools that are entirely appropriate for :30 spots, I’ve got a full DaVinci Resolve suite to work with, and I’m simply more facile with Resolve as a colorist. I could have moved my project into Resolve using EDLs to conform the clips output from Smoke, but instead I simply output the whole Smoke timeline as a ProRes 4444 clip, and used Resolve’s scene detection tool to automatically cut it into individual shots (my edited timeline was almost entirely cuts only).
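Resolve’s scene detection is proprietary, but the core idea it relies on is simple: a hard cut shows up as a sudden jump in the difference between consecutive frames. A toy sketch of that idea on synthetic frames (the threshold value here is an illustrative assumption, not Resolve’s):

```python
import numpy as np

def detect_cuts(frames, threshold=30.0):
    """Flag a cut wherever the mean absolute pixel difference between
    consecutive frames jumps past the threshold -- the basic idea
    behind automatic scene detection."""
    cuts = []
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float)).mean()
        if diff > threshold:
            cuts.append(i)  # frame index where the new shot starts
    return cuts

# Two synthetic "shots": five dark frames, then five bright frames.
shot_a = [np.full((4, 4), 20, dtype=np.uint8)] * 5
shot_b = [np.full((4, 4), 200, dtype=np.uint8)] * 5
print(detect_cuts(shot_a + shot_b))  # [5]
```

This is also why the approach works so well on a cuts-only timeline like the one described above: dissolves and other soft transitions spread the frame difference across many frames and are much harder to detect this way.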
While “The Place Where You Live” was shot on a RED ONE MX camera (with SuperSpeed primes), I transcoded every clip to Apple ProRes 4444 using the REDlog Film gamma profile because I knew that virtually every shot would be a VFX shot, and I wanted all of the compositors using identically debayered media without worrying whether someone would use different settings without telling me (something I wrestle with as a colorist sometimes). I generated these working files, and synced the sound – all using Resolve from the very beginning of the process, before even moving the media into Smoke. I edited using the ProRes 4444 files. In my case, using proxy resolution files didn’t really accelerate my workflow, and editing with the full quality clips meant that I and the other folks on the team could start compositing almost immediately, which was a huge benefit. Since just about every shot had some kind of composite in it, there was no point to reconforming back to the original raw media, so sticking with log-encoded ProRes 4444 worked wonderfully to maintain latitude and highlight detail through each generation of media.
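The reason a log gamma profile preserves latitude through intermediate generations can be shown numerically. Below is a deliberately simplified generic log curve (not the actual REDlog Film transfer function): two bright highlight values that would both clip to white in a naively normalized linear file remain distinct code values after log encoding.

```python
import numpy as np

def log_encode(linear, max_stops=6.0):
    """Simplified, hypothetical log curve: map scene-linear values
    spanning 2**max_stops above and below mid into [0, 1]."""
    return np.log2(np.maximum(linear, 2.0 ** -max_stops)) / (2 * max_stops) + 0.5

def quantize(x, bits=10):
    """Round to the nearest code value of a bits-deep integer file."""
    levels = 2 ** bits - 1
    return np.round(np.clip(x, 0, 1) * levels) / levels

# Two highlight values a third of a stop apart, both brighter than 1.0.
hi = np.array([8.0, 10.0])
clipped_linear = quantize(hi)          # naive linear: both clip to 1.0
log_coded = quantize(log_encode(hi))   # log: the two stay distinct
print(clipped_linear, log_coded)
```

The same logic applies at every generation: as long as each intermediate stays log-encoded, highlight gradations survive the round trips between editorial, compositing, and the grade.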
How is a workflow and even a project as a whole impacted when you’re using one system to take care of almost everything in post?
Having high-end, node-based compositing tools available for every single shot in the timeline was a huge benefit. As a former Shake user, I learned to love nodes long ago, and of course, using Resolve as a colorist means that I use node-based image processing every week of the year. The ConnectFX compositing tools in Smoke are comprehensive, and having them constantly available meant that there was never a time I skipped making a compositing tweak in a shot, no matter how trivial. I used to drive After Effects non-stop from 1996 to 2003. Layer-based compositing is powerful, but I find it’s easier for me to get deeply detailed in my composites using nodes. Having this kind of environment available on top of your editing timeline is fantastic, because alterations to an edit will affect the VFX, and vice versa. I might uncover a key moment that isn’t noticeable until the middle of a composite that would benefit from making tweaks to the timing of the shots around it. That might, in turn, impact how I want to set up certain timings within the composite itself. Being rapidly iterative is easy in Smoke, because it’s all right there. I had a few shots where there’s a flickering light on the wall, which is all CG. This sort of integration let me play with that flicker’s timing against what came in the shot before and in the composite after. It’s a silly little thing, but essentially, it encourages you to edit the timing of your VFX along with the rest of your sequence so that hopefully the rhythm of every element in the movie flows smoothly for the viewer. Smoke’s integration enables the triumph of minutiae.
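The reason a node graph makes that kind of iteration cheap can be sketched with a toy model: each node is an operation applied to the results of its upstream inputs, so changing one node’s parameter automatically flows into everything downstream the next time the tree renders. This is a minimal illustration of the concept, not how Smoke’s ConnectFX is implemented internally:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Node:
    """One processing step in a compositing tree: an operation applied
    to the rendered results of its upstream input nodes."""
    op: Callable
    inputs: List["Node"] = field(default_factory=list)

    def render(self):
        return self.op(*[n.render() for n in self.inputs])

# A toy scalar "image": source -> exposure gain -> flicker offset.
source = Node(lambda: 100.0)
gain = Node(lambda px: px * 2.0, [source])
flicker = Node(lambda px: px + 10.0, [gain])
print(flicker.render())  # 210.0

# Tweak one upstream node; everything downstream picks it up on render.
gain.op = lambda px: px * 3.0
print(flicker.render())  # 310.0
```

Contrast this with a flattened, layer-based result, where a change that deep in the stack can mean rebuilding the layers above it by hand.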
What should an editor who is familiar with a different NLE system keep in mind if they’re looking to learn Smoke or pitch that system to their producer as a way to save time and money?
First, don’t underestimate Smoke’s learning curve. Autodesk has made huge strides in the four versions of Smoke that have emerged while I’ve used it, and it’s much easier to learn now than it used to be, but you still need to be willing to learn how Smoke “thinks” if you’re going to succeed with it. There are many areas where Smoke works differently from other NLEs or compositing applications you’ve used before, and understanding these differences is key. Second, Smoke is a terrific tool for any project where you’re combining effects and editing; this combination is Smoke’s superpower, and it’s in these kinds of projects that Smoke shines.
On the other hand, if you’re only doing editing, and especially if you’re doing long-form editing, Smoke’s integration may not end up being that much of a benefit. However, you shouldn’t think in terms of either/or. If you have Smoke expertise, then even if you’re working on a feature being cut in another NLE, there may be an effects-heavy scene that you can still cut and composite entirely within Smoke, taking advantage of the integration I described previously to eliminate the barrier between craft editing and compositional timing for that scene. You can then export the result back into your other NLE for continued work. Smoke is also an excellent finishing environment, so importing a cut from another NLE, and then using Smoke to finesse it is also a great way to work.
“The Place Where You Live” is currently making the rounds at film festivals, and we’ve heard it’s being received really well. What advice would you give to filmmakers who are looking to put together their own short that they’d be able to submit to festivals?
If you’re making your own movie, focus on making something you’ll enjoy. Don’t try and second-guess what you think festivals will like, because you’ll never know what that is. There are so many festivals already, and I see more emerge every year. I took another feature to the festival circuit in 2006, and between then and now, the overall number of festivals available to submit to has tripled; it’s crazy. Make sure that you’re making the film you want to make, and do it as well as you know how.
Don’t shortchange your casting. It’s easy to put a lot of attention into camera technology, postproduction software, visual effects and color grading to make the prettiest pictures possible, and then forget that the actors are what the audience is really watching. In fact, everything else in the film is there to support their performances, so cast carefully, and work closely with your talent. And while you’re casting, make an effort to diversify the talent on screen. There are all kinds of people in the world, and Hollywood tends to represent only a fraction of the population. This is your film, so cast the most talented, diverse and unique actors and actresses you can find.
Lastly, edit your film down to trim the fat. When I see the work of emerging filmmakers, the principal sin I find is when a cut is too long and drags on. You can take the best script and acting in the world and make it unbearable by not editing the project tightly enough. That’s not to say every movie needs to be rapid-fire (mine sure aren’t), but each moment should be no longer than it needs to be, and each pause should be very carefully considered. Keeping an audience engaged is challenging. Don’t ever take your viewers’ attention for granted.