
ART OF THE CUT with Tiffany Hillkurtz, editor of “The Secret Life of Pets 2”

Tiffany Hillkurtz started working in animation in 1997 as an assistant in the animation department. She continued through the ranks of editorial on films like Space Chimps, Astro Boy, Madagascar 3: Europe’s Most Wanted, Free Birds, Penguins of Madagascar, Minions, and The Grinch. (My favorite IMDb note on Hillkurtz is that she played a Jawa in Star Wars: Episode IV – A New Hope, and the entry includes an adorable photo of her in costume.)

Art of the Cut spoke with Hillkurtz about her work as the editor of The Secret Life of Pets 2.

(This interview was transcribed with SpeedScriber. Thanks to Martin Baker at Digital Heaven)

The Secret Life of Pets 2 Paris editing crew in Hillkurtz’s cutting room, left to right: Gian De Feo – 2nd assistant editor, Christophe Ducruet – 1st assistant editor, Tiffany Hillkurtz – editor, Tifenn Philippot – edit production supervisor, Nico Stretta – associate editor

HULLFISH: I’ve done a couple of interviews with animation editors, so some people who have read those might know something about animation, but let’s really talk about the differences between live action and animation editing and just kind of fill me in on where the process starts for you.

HILLKURTZ: I like to think that editing live action and animated films are both akin to creating a sculpture. With live action, you have a block of marble, all this footage that’s been shot, and you’re chipping away, trying to find the movie. Whereas animation is more of an additive process, like working with clay or plaster, starting from nothing, building something from the ground up. In the beginning, there’s nothing but a script, and sometimes not even that, sometimes you start when there’s only an idea. You’re creating an animatic of the entire film, layer by layer, continually fleshing it out, adding in complexity and elements from each department as they’re created, in order to build a film.

So, in animation, you come in at the beginning and you record the dialogue with just people in the office — this is called “scratch dialogue”. You edit that together while the storyboard artists are drawing boards. Then you time out those boards with the dialogue to see how it plays, and to see if that scene kind of works. It still has to “play” as a real scene, so you add sound effects and temp music.

Then you work with the director to implement any changes he wants and show a pass to the executives and whoever needs to approve it, and then eventually you send it to Layout. Layout places the cameras for the scene in the set that has been built in the computer (this describes the CG animation pipeline; it is different for hand-drawn animation). The story guys don’t always know what the location looks like, so the boards are not like live-action boards where it’s about how you’re going to shoot it. Animation storyboards are more about acting, characters, developing the story, or coming up with gags that aren’t in the script.

When the layout shots come back, they’ve “shot it” but then we might want to change it. Sometimes I have to ask for a new shot, or I extend a shot with a freeze frame or reframing, or I might realize I need a wide shot, so I’ll just ask for it. So it’s more malleable than live action because a lot of times in live action you’re just working with whatever they’ve shot, and if they didn’t shoot it, you don’t get it unless you’re able to do reshoots.

HULLFISH: A lot of people may not understand what the layout process is. But before we get to layout, let’s talk a little bit more about the process that happens even before layout, which is that the story is so much more malleable while you’re still in storyboards. What is your job or what are you bringing to the process while you’re still in storyboards?

HILLKURTZ: Basically, I’m timing out the boards and the dialogue to create the film. Once you get the dialogue from the script, even if it’s scratch dialogue, you’re still trying to decide if it’s sounding right. Sometimes you’ll go back to the writer and ask for a different line, or once it’s all laid out, you’ll discover it’s too long. You tune it as you go, depending on how it feels. In the beginning, you’re just trying to have it play as well as it can, figuring out the characters and how they work with each other.

It changes when you get the actors’ voices cut in, because that’s truly the character and that significantly affects the timing. Also, as an animation editor, you’re free to pick any take you want. The actors are by themselves in the booth recording anywhere from three to — depending on the actor — 40 takes of the same line, and you listen to them all. You choose which one you want. But you don’t have to use the whole line. You can use the first half of this take and the second half of that take, or one word in the middle. You can shape the performance of each character. Then, when you put that performance with the other actors’ performances, you create the pace and the tone of the interaction. There are many things to consider: Is it a sarcastic response or a serious response? Is there a really long pause, which could change the performance between two characters? Or are they talking over each other, which can also change the performance?

When the actor records their dialogue, I note the takes the director and I respond to the most. I will use those first when assembling, but they don’t always work in the context of the complete scene, so I like to re-listen to every take. Because of this careful process of working with the dialogue, a character’s performance is basically a combination of the voice actor, the animator, and the editor.

HULLFISH: I’m interested in the malleability of the script at that stage because when you edit the whole movie together, you can then watch it and see structural and story problems in context and in TIME (instead of just on the page) and STILL have the chance to add things, which you rarely get in live action. Or you can delete whole groups of scenes without the pressure of how much effort it had taken to shoot them.

HILLKURTZ: That’s the great thing about doing this all in storyboards – it’s easier to just move things around. A lot of times the script isn’t even completely finished. So you’re working at the beginning and it informs how the script will end. A lot of places I’ve worked, you work in chronological order. You start at the beginning of the movie and you go through as much as you can.

Obviously, there are some sequences that bounce around, but even if you have a full script, when you’ve done act one and act two, you might think, “Wouldn’t it be better if THIS happened in act three?” And you can — at that point — just change everything without huge consequences. After the first nine months or so we pretty much have a screening every eight to ten weeks just so you can see how everything feels, and how it’s playing together, and how it looks. It is very malleable at almost every stage.

HULLFISH: From a technical standpoint, when you’re in storyboards, how much are you simply cutting from one storyboard to another and how much motion work are you doing WITHIN the frame? After Effects type stuff? Or trying to pan or tilt or zoom on the storyboards? I’m assuming you’re editing in Avid — so how much keying and motion work are you doing in the Avid?

HILLKURTZ: Yes, definitely Avid. It depends on how much time you have. Some studios do a lot of After Effects. On Pets 2, we didn’t do any. I’ll do some moves — push-ins, crops, etc. — in the Avid. Quite often I’ll re-draw eyes or mouths if I have storyboards where the character is looking one direction and I want them to look in another. Instead of getting a new board, I’ll just erase the eyes and draw them in the other direction with the paint tool. It’s amazing the difference changing a mouth shape can have.

The thing to remember is, the whole movie is in different phases throughout the process. You don’t finish all the storyboards and then move everything to Layout.

HULLFISH: Because the studio’s trying to keep everybody so you don’t dump everything off onto one department at the same time.

HILLKURTZ: Exactly. So when we have enough inventory of storyboarded scenes that have been approved, then we send that out to Layout. But then you keep going with the storyboards for the next part while Layout is working on that first part. Then layout shots start coming back into editorial, too. Eventually, you start getting animation shots in, and then you’re in three stages for different parts of the movie.

Editorial is like the hub of a wheel in animation because everything comes through our department. The storyboards come in, you cut them, you send them back out to Layout, who does their thing, then sends that back to you. Then you work on the layout material and send that back out to animation, and then animation sends their stuff back to you. It’s this ongoing process of importing, editing, and then exporting.

HULLFISH: Before we talk about what layout is, you mentioned the transition from editing with scratch tracks to editing with the real actors doing the voices. What is the schedule? When do you start getting actors in to do their lines? I’m assuming that they come in a couple of different times.

HILLKURTZ: Depending on the character and the size of the part, there are anywhere from two to six recording sessions per actor, and then an ADR session at the end for any extra bits that we need to fill in. I always like to have the production dialogue, with the actors’ voices, for when I cut layout because the actors talk at different speeds than the scratch, or they might improv and say some different lines. For storyboards, it’s great if you have the real actor but we don’t always have that luxury because we’re still working through the story and the script, and you don’t want to bring them in for no reason, and then throw everything away. Usually when you have a pretty solid storyboard pass, then you’ll have actors come in and do a recording session.

HULLFISH: Sometimes you’ll go into layout with the real voices, but you can’t go to animation without them. How widely spread out are those recording sessions?

HILLKURTZ: It depends on the availability of the actor. It takes a few years to make an animated movie, so they’re off doing other movies or tours. And then, when we’re ready, we schedule the next session. So it could be like a month or it could be six months in between. It also depends on the quality of the scratch. Every movie I’ve worked on, it’s different.

HULLFISH: So it’s got to be late enough that they’re not being wasteful, but before animation, because they need to animate the mouths and physicality to match the performance.

I worked for an animation studio for two years, but explain to my readers what the term “layout” means and what it entails and what kinds of decisions are being made at that phase.

HILLKURTZ: The actual term comes from the 2D animation days, where the artists did a drawing of the scene’s background. In CG animation, it’s translating a 2D drawing into 3D space. Basically, they’ve built the location for the scene in 3D space in the software, and Layout places the virtual cameras in the scene to create what in live action would be a set-up. The characters are added in their place in the set. In storyboards, you might not have really been able to see how that works. So with a “real” space and a real camera, you can now see things like, “Oh, he can’t get through that window, so we might need to change a character from moving left to right to moving right to left,” or, “This character has to come in through this door, so now we have to change the eye-line to be different than it was in storyboards.” They’re basically looking at how to shoot it: How do we block the cameras and the characters?

Layout has gotten significantly better from when I first started. Back then, the characters might have been represented by tubes or frozen grey versions of themselves, but now they actually look like the characters. They’re moving around, but they’re not animated per se. There’s not a lot of lip sync and they’re not doing their subtle character gestures, but they’re moving around the room in time to the dialogue that we’ve given them to try and figure out how it’s going to be shot. This is another place where you can change things. It’s easy to decide “This should be a closer shot.” Or “This should be a two-shot.” And you don’t have animators spending time animating gestures or things that won’t end up in the shot. It’s the best time to figure out how it all is going to look.

HULLFISH: You mentioned layout has gotten a lot better in recent years, but it’s still a stage where the storyboards have a lot of character and personality and acting to them, and then you get to the layout stage and a lot of the emotion kind of gets sucked out of it, doesn’t it?

HILLKURTZ: A little bit, yes, but certainly not as much as before. I remember on a project years ago we would get notes about the character’s eyebrows or something and I’d think, “But that’s not what we’re looking at! We’re supposed to be looking at the shot.” But we were getting acting notes because the layout stage was starting to look so much more like animation.

It is harder to see emotion or feel emotion in the layout stage, so sometimes for a screening, we’ll actually go back to storyboards instead of showing layout, just to get the emotion and the character moments in the scene. You just have to conform the boards to wherever you are in layout.

HULLFISH: Is that conforming something you’ve got assistants doing or is it something you’ve got to do?

HILLKURTZ: Usually an assistant or an associate editor will do the conforming. I had three different associates through this film because of scheduling issues: Tom Walters, Rachel Brennan, and Nico Stretta. All of them were super helpful to me and the whole process. I like to always carry the boards on a lower track, so if we need to conform storyboards back to layout timing, they’re right there. Eight or nine years ago, we did a lot more conforming of boards than we do now, because layout has come so far. We would probably do that conform a lot more for screenings for people outside the studio. Whereas inside the studio, people can pretty much envision it.

HULLFISH: As the project becomes more locked, what are your responsibilities at that point, since the conforming is mostly left to others?

HILLKURTZ: That would be nice (laughs) but it actually gets so much busier towards the end, because, if there’s time, people will use it. And so we’re still getting everything in pretty much toward the very end. Also, I’m kind of a control freak — which is not very healthy — so I like to look at everything. I’ll have my associate cut in the lighting shots and I still want to look at it before we’re moving on. And, after you have a preview screening, things might change again. So sometimes you go back to boards again, or you’re moving scenes around to see if they work better in a different order.

In Pets 2, there are basically three storylines and the biggest challenge was figuring out how to intersperse those storylines so you weren’t spending too much time with one character and also not too much time away from a different character. So moving scenes around to get that proper placement was really important. So my work definitely continues to the very end.

HULLFISH: I’m really interested in that idea of cutting between A, B, and C story. Did that intercutting change past the storyboard phase?

HILLKURTZ: Oh yes. It changed throughout. I have a big whiteboard in my room with all of the sequence names and numbers on it, and sometimes I would even put scenes on pieces of paper in front of me on my desk, and just move them around and play it back in my head before even trying it in the Avid. Then the director and I would talk about it. Then we’d try it in the Avid and see how a big, long 10-minute segment would work. Then we could see if moving a scene or cutting a scene in half would work. You’re just playing with it pretty much until you can’t anymore. Especially with three storylines, because you really want to get it right. You really want to get it so that it’s the right amount of time with each storyline and with each character, so you still care about everybody and you’re not missing anybody.

Also on that board, I would write notes about what I’m waiting for: Am I waiting for production dialogue? Am I waiting for layout? Or have I received the layout, but I haven’t cut it in yet? I’d keep little notes of what the status of that sequence is.

HULLFISH: Did you color code those status points?

HILLKURTZ: Yes I did: red if I was waiting for it, green if it was in and ready to cut, blue if I had to show it to the director. That way I could glance across the room and see, “Ooh, there’s a lot of red. Where is that?” Or if there’s a lot of green: “I should be getting on that!”

HULLFISH: Did stuff happen where the animation changed the timing of what you had in layout or was there some other way that you had to lock that down or some other reason why you had to lock down those timings?

HILLKURTZ: You don’t really begin “locking” until you start to hand over to the composer or the sound effects team towards the end. Animation changed a lot because they’re coming up with really fun things for the characters to do, or some funny body movement. Then when we get the animation back, my associate will cut it in, but if there are timing changes, they’ll put it on a different track and point it out to me so I can make sure that it will still work. Sometimes if the animators change animation timings, it changes the timing of a joke because the dialogue has now changed. So then I see if it still “works,” and if I think it isn’t working, I’ll point it out to the director and then we have to decide whether to change the animation or move the dialogue so the joke still works. We can basically re-take any shot. Production doesn’t like it, but when the animation changes timing, you really have to look at it to make sure things are still playing. Especially with a comedy, you just want to make sure that the jokes are all still landing. Or, is the action funnier, and now you don’t actually need the line of dialogue?

HULLFISH: Talk to me a little bit about track management. You were talking about how you are carrying storyboards along with you. Can you just describe what your tracks are looking like?

HILLKURTZ: I’m an archivist and I carry everything. (Meaning that as she moves from one phase to another, the previous phases remain on lower video tracks beneath the latest one.) Once the boards are there and I get layout in, I’ll just carry the boards on a track below, and when layout is approved I just have one track of layout, and then when animation comes in — I learned this years ago — I carry the latest two takes of animation because sometimes you want to go back to the last take. But once we get lighting in, we just carry one track of animation.

Screenshots from the Chloe catnip scene:
1) Radio play of just scratch dialogue
2) Storyboard pass with SFX and needle-drop temp music
3) Pass with Chloe production dialogue, playing with the first Layout delivery (orange)
4) With Gidget production dialogue, Layout sorted out, and a few animation shots in (dark pink)
5) Almost fully animated, with Lighting/Compo (yellow) coming in
6) “Final” scene: all lighting, dummy track, and stems from Skywalker for dialogue/SFX/music (with our clips muted in the timeline)

Closer to the end of the process, when we turn over to music or sound effects, we make a dummy track, so track one doesn’t really have anything on it until we have a dummy track. And then a layer of boards, and then layout, and then animation, and then lighting.

HULLFISH: What’s the purpose of the dummy track?

HILLKURTZ: When you turn over a locked sequence and then you change timing, a dummy track is a quick way to see if there was a cut. If there is space, a break, that’s where you changed the timing. So you can let them know, or you know when you get something back if it’s different. For me, it is just an easy visual way to immediately see that the timing has changed. Because if the dummy track is full, then nothing has changed, but if it has splices in it or spaces, then things have changed since you turned it over last time.

HULLFISH: So what’s in that dummy track? Is it just filler?

HILLKURTZ: It’s just a video mixdown of whatever we turned over.
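(A minimal, hypothetical sketch of what that dummy-track check amounts to, for readers who think in scripts rather than timelines: compare the turned-over cut against the current one and flag any shot whose placement or length moved. The shot numbers and frame counts below are invented for illustration; this is the idea, not any studio’s actual tooling.)

```python
# Illustrative only: the dummy track gives an at-a-glance visual diff in the
# timeline; the same question can be asked of two cut lists in script form.
from typing import NamedTuple

class Shot(NamedTuple):
    number: str    # pipeline shot number (fixed once published)
    start: int     # record-in position, in frames
    duration: int  # shot length, in frames

def timing_changes(turned_over: list[Shot], current: list[Shot]) -> list[str]:
    """Return shot numbers whose placement or length changed since turnover."""
    previous = {s.number: s for s in turned_over}
    changed = []
    for shot in current:
        old = previous.get(shot.number)
        if old is None or (old.start, old.duration) != (shot.start, shot.duration):
            changed.append(shot.number)  # would show as a splice or gap in the dummy track
    return changed

# Invented example: shot 0120 was lengthened by 12 frames after the turnover.
before = [Shot("0110", 0, 96), Shot("0120", 96, 48)]
after = [Shot("0110", 0, 96), Shot("0120", 96, 60)]
print(timing_changes(before, after))  # ['0120']
```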

HULLFISH: Ok that makes sense. It’s very interesting that you carry two different animation tracks — the two latest versions of animation. Let’s explain what the difference is between a lighting track and an animation track.

HILLKURTZ: When the animation shot comes in it’s the animator’s performance of the character(s) to the voice and location and actions, but the lighting and the color in that shot is all kind of drab. Then it actually goes through a few departments before we get it back. The shot will go through VFX, CFX (character effects), OCC, Lighting, etc., then finally it’s all composited. Those departments put all the textures on any cloth, costumes, or things in the background, and add the lighting. What in live action would be the lighting rigs, they add that in the lighting stage. And then they composite it all together in a polished final shot. To me, it’s just magical. You see the animation and it works great and then you get Compo back and it’s like, “Wow!” With just the animation pass, you can’t really tell if it’s day or night. You can’t tell if the room is dark except for one spotlight. We know in theory that that’s what it is going to be, but when you get it back you see that it actually works, and it’s kind of magic.

HULLFISH: The difference is that the lighting stage makes the animation look photo-realistic compared to the animation stage.

HILLKURTZ: Yeah, pretty much. The lighting (compo) stage is what goes up on screen in the end.

HULLFISH: What about your audio tracks? How are you managing audio tracks?

HILLKURTZ: I do tracks 1 through 6 for dialogue. That will depend on the size of the cast, but on films like Pets 2 or Madagascar 3, there are a lot of characters. Then maybe 7 through 16 for sound effects. And then about eight tracks for music, because I do a lot of music editing with temp music.

A 4-minute section showing the temp score; there are about 14 different tracks.

Long ago I started to color code each character’s dialogue a different color. Scratch dialogue is all one color – white – because then we know if there’s scratch in there at the end when it shouldn’t be. So, say, Max is blue, and Snowball is yellow. My assistant and I just pick whatever colors feel right for the characters. That way, in the timeline, at a glance, you can see who’s talking, which I also find helpful, because you can look at it and think, “Oh wow, they’re talking a lot.” In my head, when I see the timeline with the different colors of the character’s dialogue it just makes it make more sense.

HULLFISH: Do you do that with Avid’s timeline local clip color or by setting the color in the bin and using source clip color?

HILLKURTZ: Source Clip color. Same thing with sound effects – we make those green – and I personally use blue for temp score and light blue for needle drop temp music. Then, when we get music in from the composer, it’s a different color. If something comes from the music editor, it’s a different color, so I can just see what I need to work on, what’s still temporary and what’s final.
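(As a hypothetical aside for readers who like to see a convention written down, the color scheme described above could live as a simple lookup that an assistant’s prep script consults when labeling source clips. Only the colors mentioned in the interview are taken from it; the composer color and the category names are invented for illustration.)

```python
# Hypothetical sketch of the clip-color convention as a lookup table.
CLIP_COLORS = {
    "scratch dialogue": "white",       # so leftover scratch stands out late in the schedule
    "dialogue/Max": "blue",
    "dialogue/Snowball": "yellow",
    "sfx": "green",
    "music/temp score": "blue",
    "music/needle drop": "light blue",
    "music/composer": "purple",        # assumption: the interview only says "a different color"
}

def color_for(category: str) -> str:
    # Fall back to "none" for anything the convention doesn't cover.
    return CLIP_COLORS.get(category, "none")

print(color_for("dialogue/Max"))  # blue
```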

HULLFISH: Do you have different types of audio tracks? Like, I would think dialogue would all be mono. Do you do stereo? Do you do left-center-right? Do you do 5.1?

HILLKURTZ: All the dialogue is mono. For the project, we just stick with stereo. SFX are mono, and music is dual mono. When we do internal screenings, there’s not a lot of ways to play left-center-right so we play stereo.

HULLFISH: You mentioned how important music is to you. Let’s talk a little bit about temp music and what you used and since there was a Secret Life of Pets 1, was a lot of the temp music from that score or from that composer?

HILLKURTZ: Well I started out using a lot of temp from Pets 1, as it’s so great, but in this movie, there are a few locations other than New York City, so I started using other stuff. Alexandre Desplat did the first one, so I looked through his scores, but his scores are all so different. There are some composers where you can listen to a track and know the composer, but with Desplat, they’re all so different, so I basically just used whatever worked for the scene. We pulled a lot from Planes: Fire and Rescue, which is a fabulous score, and from things like Night at the Museum and Shaggy Dog that work well for animation. And I also temped with Avengers and Eight Legged Freaks. There was some Maleficent in there, probably somewhere between 20-25 different scores in total. I do pay a lot of attention to music, so in a four-minute stretch I could have pieces of ten different scores in there. You wouldn’t necessarily know, because they all work together. I like to score the emotional moments so I spend a lot of energy doing that.

Because animation is such a long process, and we’re carrying this along for two or three years with a screening every couple of months, it has to play like a final movie, basically, so we have a lot of sound effects in there, filling it out: footsteps and dog collars and things that you wouldn’t necessarily put in if you were just going to hand it off soon to the sound effects team. But it’s got to play for two years so you can’t really leave that empty. I have my assistants do a lot of the sound effects passes. Then I’ll listen to it and say, “I want a twing here instead of a twang.” So we’ll look for something else. I get really great sound effects passes from my assistants.

HULLFISH: What Avid resolution were you working in, since you were doing so many screenings?

HILLKURTZ: The project was HD 1920 x 1080. The boards were imported as DNxHD 36, and the shots (Layout, Animation, and Compo) were imported as DNxHD 175.
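(For readers outside a studio pipeline, a roughly equivalent prep step can be sketched with off-the-shelf tools. The snippet below uses ffmpeg from Python to transcode a board movie to 1080p DNxHD 36 before bringing it into an Avid project; it assumes ffmpeg is installed and a 24 fps project, the file names are invented, and it is not meant to represent the studio’s actual ingest process.)

```python
# Generic sketch, not the production pipeline: transcode a board QuickTime to
# 1920x1080 DNxHD 36 so it matches an HD offline project. Assumes ffmpeg is on
# the PATH and a 24 fps timeline.
import subprocess

def to_dnxhd36(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-vf", "scale=1920:1080,fps=24,format=yuv422p",  # DNxHD 36 wants 1080p, 4:2:2
            "-c:v", "dnxhd", "-b:v", "36M",                  # the 36 Mbit/s offline profile
            "-c:a", "pcm_s16le",                             # uncompressed audio for editorial
            dst,
        ],
        check=True,
    )

to_dnxhd36("boards_sq0400.mov", "boards_sq0400_dnx36.mov")  # invented file names
```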

HULLFISH: When I’m cutting live action, I’m always pretty diligent to add room tone, if it was recorded, or to fake it, if not. It just makes the room seem alive and helps smooth out the other audio edits. Do you do anything to provide sonic atmosphere for animation — other than sound effects?

HILLKURTZ: Oh yes. That’s one of the first things that I like to put in, because, as you said, it just keeps everything alive — if you’re in the city it’s nice to hear horn honks, if you’re in the country it’s nice to hear crickets and birds. It gives you a sense of place. Even if you don’t have time to add footsteps or any of that other stuff, I would certainly at least have the benefit of that in there.

HULLFISH: Do you have a bunch of “fake Max’s Apartment room tone?”

HILLKURTZ: I did have the timeline of Pets 1 that I had access to, so I could use that, and it was very helpful. And, yes, as you go, you create a library of each location’s atmos tracks, so for the next scene you don’t even have to think about ‘finding’ it. They replace it all when the film goes to the Skywalker sound team, but I carry it until we get to that point.

HULLFISH: But, as you mentioned, you’ve got two years before you get to that point where you still want it to sound good. What are some of the things your team has to do to handle turnovers?

HILLKURTZ: This is accomplished by my assistants. When we lock a stage — when the director and I decide that we’re ready to move on — it’s actually a different process for every studio. Usually, in our process, the term is “to publish.” We “publish” to another department. So the assistants tell the department how many frames each shot is and if it’s changed from the last time we published. The very first time we publish it from storyboards, that’s when we come up with the shot numbers and how long that shot is, and then it’s forever in the pipeline as that shot number. So if we have a new shot, we give it a new number. But every time we publish it back or turn it over to the next department, the assistants basically have a huge list of each shot and what has changed — or not — since the last time.
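(To make that publish step a little more concrete, here is a hypothetical sketch of the kind of per-shot list the assistants compile: every shot’s frame count, flagged as new, changed, or unchanged against the previous publish. Shot numbers and lengths are invented examples, not production data.)

```python
# Hypothetical publish report: compare this publish against the last one.
def publish_report(previous: dict[str, int], current: dict[str, int]) -> list[str]:
    """previous/current map shot number -> shot length in frames."""
    lines = []
    for shot, frames in sorted(current.items()):
        if shot not in previous:
            status = "NEW"
        elif previous[shot] != frames:
            status = f"CHANGED ({previous[shot]} -> {frames} frames)"
        else:
            status = "unchanged"
        lines.append(f"{shot}: {frames} frames  [{status}]")
    for shot in sorted(set(previous) - set(current)):
        lines.append(f"{shot}: OMITTED from this publish")
    return lines

last_publish = {"sq0400_sh0010": 96, "sq0400_sh0020": 48}
this_publish = {"sq0400_sh0010": 96, "sq0400_sh0020": 60, "sq0400_sh0025": 72}
print("\n".join(publish_report(last_publish, this_publish)))
```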

HULLFISH: And is that kept in like a codebook for a live action film? Is it some kind of cloud-based database so that everyone has access to it?

HILLKURTZ: Yes, we use Shotgun software for tracking, so everybody can see the current version and current notes. Also, the important notes are different for each department. Every department needs to see different information, so that’s all out there in the system for everybody to see.

It’s the department managers’ responsibility to update the tracking system if any shot needs to be put “on hold” for any reason. Say the dialogue will be changing, or the wardrobe is changing color, or the shot is in a scene that is being re-written, etc. Then the rest of the crew know that this is a shot that should not currently be worked on.

My associate Tom says:
As well as supporting the editor creatively and technically, the editorial department is responsible for communicating all cut changes to the other departments. In order to do this efficiently, the film is broken down into sequences of a few minutes each. A sequence can contain one longer scene or several short ones. Within each sequence, every shot has a unique number, much like a VFX workflow in a live action film.

Once the animatic of a sequence is ready, it can be published to go into production.

In the first instance, the assistant editors assign a unique shot number to each shot and send the information to our pipeline via an XML and QT movie.

With each subsequent stage of production – layout, animation, VFX, lighting, etc. – updated versions of those shots are delivered to editorial. In layout, it is common to have timing and shot changes. As the sequence progresses through the stages of production, changes become less frequent and by the time composited lighting shots are sent, the shot timings are fairly stable.

When a sequence moves to the next stage of production, and whenever the editor makes changes to a sequence in production (dial changes or timing changes, etc.), it is the assistant editors’ responsibility to communicate the changes so everyone working on those shots is kept up to date. As with the first publish, this is done using an XML and a QuickTime to update our in-house database, which tracks the status of every shot in the movie and which everyone in the production pipeline can reference to make sure they are up to date or to answer any questions. Also, you cannot overstate the benefits of a face-to-face conversation with the relevant heads-of-department if the changes are subtle or complex!
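(As a loose illustration of the “XML and a QuickTime” handoff Tom describes, the sketch below builds a small publish manifest with Python’s standard library. The element and attribute names are invented; the real in-house schema and its Shotgun integration are not public, so treat this purely as the shape of the idea.)

```python
# Invented schema: a minimal publish manifest listing each shot's cut-in,
# duration, and change status, plus the reference QuickTime for the sequence.
import xml.etree.ElementTree as ET

def build_publish_xml(sequence: str, shots: list[dict], quicktime: str) -> bytes:
    root = ET.Element("publish", sequence=sequence, reference_movie=quicktime)
    for s in shots:
        ET.SubElement(
            root, "shot",
            number=s["number"],
            cut_in=str(s["cut_in"]),      # frames from the start of the sequence
            duration=str(s["duration"]),  # shot length in frames
            status=s["status"],           # e.g. new / changed / unchanged
        )
    return ET.tostring(root, encoding="utf-8")

shots = [
    {"number": "sq0400_sh0010", "cut_in": 0, "duration": 96, "status": "unchanged"},
    {"number": "sq0400_sh0020", "cut_in": 96, "duration": 60, "status": "changed"},
]
print(build_publish_xml("sq0400", shots, "sq0400_editorial_v012.mov").decode())
```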

HULLFISH: Care to share any misperceptions that you think live action editors have about animation editors?

HILLKURTZ: I’ve been amazed at the people who ask me about animation editing and say, “So, all of the shots are there. Do you just have to put them in order?” People don’t think through the process. Years ago, I went to a Q and A with the editors of Where the Wild Things Are and they were describing the steps where they were figuring out the process of working through the effects in the movie and I asked them, “Did you ever think about talking to an animation editor? Because this is basically what we’ve done for years.” And they said that they hadn’t because it never dawned on them that that’s really what it was because they thought of it as mostly visual effects as opposed to animation. They didn’t think about doing an animation pipeline. People just don’t think it through. Maybe because it’s like two different languages, or it’s maybe like the difference between English and American: You think it’s the same, but there are really so many different words and so many different nuances that it can be even more confusing. But I was in live action production for 10 years before I started editing (and in my childhood), so I think I’m bilingual in that way. It’s such a different beast altogether.

HULLFISH: I still laugh at all the people who think that the animation outtakes are real at the end of a PIXAR movie. There are NO outtakes in animation!

HILLKURTZ: Right!? Everything is created.

HULLFISH: I also had a friend who was a live-action director and for some reason, someone gave him a job as an animation director, and he was asking me all these questions about animation and one of the things he couldn’t wrap his head around was that there is no such thing as “coverage.” You don’t have multiple angles of a shot in animation. You have the ONE angle that you choose in storyboards and layout.

HILLKURTZ: Yeah. You don’t have six cameras rolling at the same time. I’ve also talked to live action editors who come into animation and they’re kind of shocked that you don’t have handles.

HULLFISH: Do you have any handles at all on a shot? Like with VFX, you commonly get 8 frames.

HILLKURTZ: We usually have zero handles. Whatever number of frames that we send through boards is the exact number of frames we get back from layout. I worked on Astro Boy and that was different. There was a big battle scene and the layout was much more primitive, but the head of layout did shoot it with four different cameras from different angles, and I had four layers of the whole scene, and I could cut it more like a live action, and that was actually pretty awesome. I haven’t had that since then, though. Animation shows generally don’t do that.

HULLFISH: As long as the render didn’t take too long, it should be easy, right? It’s the same animation, just different “virtual” camera placements. We chatted a bit about sound effects, but you just pulled stuff from the previous movie or you have a library?

HILLKURTZ: Yes. I’ve just been collecting a library since day one as an editor, and every studio has other libraries.

HULLFISH: Talk to me a little bit about working with an animation director.

HILLKURTZ: It’s three years of togetherness in the editing room, so it’s usually a pretty close relationship. It does depend on the people and personalities, as does any movie cutting room. I’ve worked with directors that just absolutely need to hear every take of every dialogue – they just want to control more. And I’ve worked with directors who basically let me do a pass and watch the scene, and then they ask about just one line, “Do we have a more emphatic take on this line?” We’ll go through and listen to some and maybe replace it or maybe not. It’s very collaborative.

I also like for editorial to be a place where the director can just come and “be.” They don’t have to be “on” all the time: just hang out and talk about Star Wars for a few minutes, or whatever. I think that’s important, too. You have to build that relationship so they trust you, and your work. I also like to really learn the director’s language. My first editing job was in TV at Sony, and so every week or every few days I’d be working with a different director on a different episode. Each one of those directors had their own language. Like, “Nudge that a smoodge” for one director meant six frames but for another director it meant two frames. You really had to tune in to that particular director. I learned very early on to learn the language of the director and what they were really saying and what they’re feeling and thinking. Ultimately it’s my job to get their vision on the screen right? There are times when I might disagree, but my rule is that if I bring something up three times and it doesn’t pass, then I’ll let it go.

HULLFISH: The other difference between collaborating with a live-action director and an animation director is that with live action, the director is completely involved during production with shooting and the editor is left largely alone for the initial edit because the director has more important things to do. Then later, the director is kind of all yours after the assembly. But in animation, it’s almost the reverse. You’re building the story with them side-by-side from the beginning, then as they get more involved with layout and animation, they have other things they have to attend to.

HILLKURTZ: That’s very true. And sometimes you have to fight for director time because they’re off in other departments doing other things, but it’s also interesting because they have a global vision of what’s in process and what can be done or not within the parameters of the production. But sometimes the director can be distracted by other things and other departments so they need the editor to have that global vision for them. There might be things that they approve in another department and then you get footage into editorial and you say, “I don’t think this is really working,” but the director couldn’t see that in the other department because it was more specifically about a particular shot or particular scene. Whereas when we have it in editorial, in the grand scheme of things, we need to make sure it all works together for the whole film and not just in individual meetings or reviews or approvals.

HULLFISH: Talk to me about the screenings that you mentioned — every few months for years. I’m assuming that you wanted to be at all those screenings. How much were you learning from your personal interpretation or feeling as the audience watched it, and how much of it was response cards and screening notes?

HILLKURTZ: The every-few-months screenings are just for internal crew and execs, and they are very informative. Then, towards the end of the project, we will have a couple of public ‘preview’ screenings. It was really interesting at the first preview to see it with “real” people. You could ‘feel’ the room and could feel if something was falling flat, or working way better than you thought, or how the structure of the three storylines was playing. Then we would all go talk about it and discuss what we thought worked and what didn’t and how much should change or not.

HULLFISH: So you were in Paris for this edit, but there is also an LA office?

HILLKURTZ: Yes, there’s an LA office, and I have an assistant editor there which is nice because I can time out the storyboards and then go home for the night while he can work on — for example — a sound effects pass. They do a lot of the scratch dialogue recording in LA for the American voices. And he would also attend all the voice records in LA which is very helpful. Also, the writer is there in LA, and the producer is there, and quite a few production people, etc.

HULLFISH: It’s interesting that you’ve got an assistant in L.A. in addition to your Paris assistants. That’s pretty cool.

HILLKURTZ: I had an amazing team on this show, and everyone worked so well together: Christophe Ducruet, my 1st assist in Paris, Gian De Feo, my 2nd assist in Paris, and Sam Craven in LA. It’s really useful because Sam can hand stuff off to the LA crew or marketing, and when we have reviews, he runs it on the LA side. The Paris and LA assistants can split up the work when there is production dialogue, so quite a bit will be ready to go when we arrive in the Paris office after a record or such. It’s very helpful to have him there to liaise with whatever the L.A. office might need from us. He would deal with most of the turnovers as well, to the composer, sound, marketing, etc.

HULLFISH: Tiffany, thank you so much for a really informative interview. Wonderful talking to you.

HILLKURTZ: You too. Thank you so much.

To read more interviews in the Art of the Cut series, check out THIS LINK and follow me on Twitter @stevehullfish

The first 50 interviews in the series provided the material for the book, “Art of the Cut: Conversations with Film and TV Editors.” This is a unique book that breaks down interviews with many of the world’s best editors and organizes it into a virtual roundtable discussion centering on the topics editors care about. It is a powerful tool for experienced and aspiring editors alike. Cinemontage and CinemaEditor magazine both gave it rave reviews. No other book provides the breadth of opinion and experience. Combined, the editors featured in the book have edited for over 1,000 years on many of the most iconic, critically acclaimed and biggest box office hits in the history of cinema.
