
HPA Tech Retreat 2011 Day 1


HPA set the stage for the Super Session with these words:

“With all the options in workflow today, it seems no two projects are exactly alike. Just like snowflakes, our projects begin beautiful, shiny, and a unique wonder to behold. But as they slowly drift to their final destination, they seem to just turn into slush, as we leave them behind searching for our next wonderful, pristine, new way.”

(This and following posts will be barely-edited, stream-of-consciousness note-taking; there’s no other way to grab all this info in a timely manner, get it published, and still get enough sleep for the next day’s sessions!)

“Setting the Stage”

Panel: Next Generation Workflows
– Sean Cooney, VP Advanced Production Tech at Warner Bros.
– Jonathan Smiles, digital production supervisor
– Marco Bario, production & post consultant
– Nathan Gunn, editor/assistant editor.

Sean screened a short program about six-week workflow tests at WB with Jonathan Smiles, using several high-end digital cine cameras (RED, Alexa, Phantom, F35) and some DSLRs for good measure. Workflows included high-end integrated on-set systems (Codex Digital), mid-range multi-vendor kits (1Beyond with CatDV), traditional facility-based systems (“send it to the lab”), etc. 2D and 3D tested. Workflows had to support a data repository, transcoding (digital dailies, screener DVDs), asset management, basic color correction. Sony smart slates; software script notes. Any given shot would be processed 20 times, through 20 different workflows. Over a year in prep prior to the shoot (doing proper testing is hard). Each workflow was taken all the way from shooting through finishing, making sure that there was a uniformity to the results: each workflow delivered a predictable, consistent product.

Questions:

Only one workflow category involved traditional post facilities or labs; what’s the future of the facilities business?
– Facilities are increasingly coming on-set and handling the data wrangling / on-set workflow. Sharing common servers, common crew with the post production folks.
– Maybe the traditional lab’s / post facility’s days are numbered.
– Big challenge with cloud-based workflows is getting the material into the cloud!
– It won’t just be all-facility or all-on-set; it’ll depend on the project.
– Post facilities are becoming repositories of talent, whether they work in-house or on-set. Experienced people are where the value lies, not in the brick-and-mortar structure.
– Still need the post house for finishing. It’s not trivial doing film-out!
– QC is important; consistency is important.
– Jonathan Smiles, in his role as digital production supervisor, takes the pictures all the way through the process, from working with the DP all the way through the DI and to the finished product. He provides that QC and consistency.
– Is it the person, or the technology, that makes the difference?
– Some workflows do work better than others. A lot depends on the needs of the show; will a one-light daily work, or does the material need more comprehensive grading on-set?
– A [re]definition of roles is important (Jonathan’s role is a new one, for example). It needs to be a conversation between production and post. Someone needs to follow the material all the way through the process. “The undefined nature” of the new roles; “I just need to know who to blame!” “Ownership of process from start to finish.”
– “A log workflow is not necessarily the best for a linear-light camera.”
– Big question with any of these digital processes: “It works, but will people adopt it?”
– There’s no longer a cookie-cutter, assembly-line approach with a file-based workflow, the way there was with film.
– Preserving color intent (in both directions, from the DP and from grading), as well as generating and storing set metadata. “All sorts of programs can generate an ALE file. But should they?” (A rough sketch of an ALE’s layout follows this list.) “The [script supervisor’s hand-written] codebook has become a formality; all the important stuff is stored as EDLs in the Avid.”
– The amount of metadata that comes off a 3D rig is hugely beyond what comes from a 2D shoot… and the data may not even be looked at for a year. But it has to be stored until it’s needed.
– How much data stays live on drives, how much can be put on LTOs?
– To test an asset management system, you really need to build it first. Makes evaluation challenging; “we need to shoot in six weeks, what do we do?” Answer: you need to start a LOT earlier. “We used to have a 65-year-old (film-based) workflow; now we feel like we have to design stuff from scratch on every new production.”
– Advice to filmmakers? Productions need to take a deep breath before plunging in. What the most recent article in some magazine said was The Thing To Do may not be the right thing for you. It’s important for filmmakers to get involved in the process; sometimes they’re hesitant to get deep into the tech, but they need to stay involved so that the resulting tools reflect the needs of the creatives.
– What was learned? “It hit home how important conversations are”: talking amongst the crew, talking to vendors. There’s a lot of overlap in functionality, and the only way you suss out what’s really working is to ask questions and explore. The areas where productions are dangerously exposed today are QC, “managing the negative,” and preserving the data. You need a very clear procedure on who does what, and where… test, test, test… supervise through the entire production process. “Treat digital data with respect and you’ll have a better time with it.”
– There’s no sense in having a faster-than-realtime expensive transcoder if the folks working with the files can easily make their deadlines with slower gear.
– Dealing with unions? (laughter). Not as great a challenge as first envisaged. Certainly on large shows with lots of people it’s not a problem; more an issue on smaller shows with smaller crews that want to eliminate/combine duties and roles. There’s still a problem with software that crosses union jurisdictions: who is supposed to push its buttons?
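For readers who haven’t met one: an ALE (Avid Log Exchange) is just a tab-delimited text file describing clips and their metadata, which is why “all sorts of programs” can emit one. Here’s a rough Python sketch of a minimal ALE writer; the Heading/Column/Data layout and header keys are from memory, so treat it as illustrative and check Avid’s documentation before relying on it.

```python
# Rough sketch of writing a minimal ALE (Avid Log Exchange) file.
# The tab-delimited Heading / Column / Data layout is from memory; verify the
# header keys against Avid's documentation before relying on this.
def write_ale(path, clips, fps="23.976", video_format="1080"):
    """clips: list of dicts with Name / Tracks / Start / End keys."""
    columns = ["Name", "Tracks", "Start", "End"]
    lines = [
        "Heading",
        "FIELD_DELIM\tTABS",
        f"VIDEO_FORMAT\t{video_format}",
        f"FPS\t{fps}",
        "",
        "Column",
        "\t".join(columns),
        "",
        "Data",
    ]
    lines += ["\t".join(str(c[col]) for col in columns) for c in clips]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Hypothetical clip entry for illustration only.
write_ale("day01.ale", [
    {"Name": "A001_C001", "Tracks": "VA1A2",
     "Start": "01:00:00:00", "End": "01:00:12:10"},
])
```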

“Introducing Confusion”

Library 3D Mastering – Kari Grubin, Deluxe

Is there a way to create new 3D digital masters from existing 3D films? Sources vary in completeness, 3D format (over/under, two-strip, etc.). It can be done, but can it be done affordably? Amityville 3-D, 1983, 35mm 4-perf over/under IP, scanned on Spirit, CC on DaVinci 2K+. For restoration: used Smoke for cleanup, created separate L/R eye files. Used a consumer over/under plasma for viewing, adjusting the 3D effect (each shot needed individual, minor adjustments). “GOG”, a “classic 3D title”, is a 1954 sci-fi/horror film about an underground lab and experimental robots: 35mm right-eye negative, right-eye IP, but only a 35mm print for the left eye. Restored in Smoke; created over/under for the plasma, tweaked shot-by-shot. Mixing the left print with the right neg or IP worked surprisingly well; mismatches weren’t apparent.

What was learned? Good proof of concept: using these existing tools yielded a decent result, and the source materials were sufficient to the task. Costs comparable to other 2K mastering workflows.

With these affordable processes available to restore old stereo shows, “classic 3D is back!”

“Short Form Flows”

Unscripted TV Workflow – Josh Rizzo, Director of Technology at Wexler Video; Mike Silvon, RDF Post Supervisor

Producing a show in NYC. The deal with reality TV: “imagine an alternate reality” where you have to deal with less than half the budget and less than 1/4 the time. [Note: how does this differ from the universe the rest of us are in these days? -AJW] Everything from REDs down to GoPros. Because of the budget, left with “hand-me-down” tech, such as ENG cameras. “Workflow does not equal workaround!” MPEG2 (XDCAM 50 Mbit/sec) mezzanine format, conforming to DNxHD. “Post begins in preproduction.” 2000-3000 hours of material per season, all needs high availability in post. Josh not sold on LTO for archiving, but it’s the best we’ve got. Can some VC build a camera with an LTO drive on the back end? (laughter) Often set up post in hotels / trailers / some rented space, but we still need all the capability of a facility.

Mike: PI Moms, “docusoap” shot in the SF Bay Area. Plan: 3 EX3s, Sony A1U HDV palmcam, 2 Panny lipstick cams, 1 button cam; 2 Sound Devices 744T 4-track field audio recorders, 1 Sound Devices 788T 8-track, 12 channels of audio across three teams; laptop with ShotPut Pro to copy clips to dual G-Tech drives, one FedExed to NYC, one kept in SF.

Additional backup made in NYC, ingested via AMA at 14:1 into Avid ISIS storage for editing, digitizing any HDV tapes. Loggers transcribe all interviews into Microsoft Word. 8-10 hours/day of interviews! Tried Adobe Soundbooth speech-to-text, sent to Avid ScriptSync. Producers have access to ISIS 5000 via MacBook Pros, using Avid MC Soft to create stringouts, watch cuts. Very flexible, can work anywhere.

MediaSilo used for exec screening: iPhone playback was better than the hardwired connection in a hotel!

Issues: Would have worked better to record sound on camera (avoiding the sync in post, which takes time). Loggers couldn’t get access fast enough, and field audio was too poor for Soundbooth to recognize it; 65% accuracy at best (male voice on a clean track), but mixed results on female voices. Some loggers were faster just typing stuff in than waiting on Soundbooth and then correcting its mistakes.

ScriptSync was invaluable for producers and editors. MC Soft on laptops very useful, depending on user’s experience with Avid.

Shooting ratio? Each week would become an episode, but PI cases spill across weeks, so 5-6 days of shooting would become an episode. Several cameras, 10 hours a day; it adds up. 2.5 or 3 assistant editors just weeding through footage.

What single bit of gear would help? Unscripted TV is “I feel like everyone else has an iPhone and I have two tin cans! … I’d like to see some products focused on unscripted production.” Good scheduling software, for example (currently using an ancient copy of Now Up-To-Date).

On-Set Dailies – Greg Ciaccio, Next Element by Deluxe

MobiLabs – taking the lab on location, in hotel rooms, data facilities, production offices.

SmartSoft, in-house web-based framework for data tracking in the dailies pipeline. Handles multi-format dailies generation, multiple format cameras, automated transcoding where needed, syncing and color correction (non-destructive). Uses BaseLight on the back end for transcoding / color correction, but could plug in something else as needed. Normally a two-man system as deployed on location (colorist and an assistant), an extension of Production from Deluxe Labs. Centralized support from Deluxe, including NOC / service bureau with multiple capabilities (e.g., editorial services, load balancing, services on demand).

Recent projects: “Shameless” (DP Rodney Charters), PhantomHD, Iconix, 7D; ABC’s “Detroit 187”, etc.

The Image Interchange Framework: Report from the trenches – Andy Maltz and Ray Feeney, Academy of Motion Picture Arts & Sciences; Lou Levinson, Laser Pacific

OpenEXR, ACES, IIF – demystifying confusion. “If you’re not living on the edge, you’re taking up too much space!”

Deployments of IIF happening all over. Real work being done; shows being delivered. Standards documents making good progress. “It’s all words until you can see pictures.”

IIF is an HDR picture processing format / environment built on OpenEXR, a means for manipulating images without damaging them. Generated from tests shot with a 12-bit capture (Sony F35), trying to see the difference between 10- and 12-bit captures (the verdict? You can see the difference). Using an ACES pipeline (16-bit floating point) to ensure no losses. Film-outs don’t exactly match the IIF images, but they’re close—and how much longer are we going to be watching film prints?
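To make the “no losses” point concrete, here’s a toy numerical illustration (my own, not the Academy’s actual math) of why a floating-point scene-linear pipeline preserves what a clipped 10-bit display-referred encoding throws away:

```python
# Toy illustration (not the real ACES/IIF transforms): keeping images in
# floating-point scene-linear preserves highlight information that a clipped,
# quantized integer encoding discards early in the pipeline.
import numpy as np

# A few scene-linear values: mid grey, a bright skintone highlight, a lamp
# that sits well above diffuse white (linear 1.0).
scene_linear = np.array([0.18, 0.9, 4.0, 12.0], dtype=np.float32)

# Path A: bake to a display-referred 10-bit code early (values above 1.0 clip).
ten_bit = np.round(np.clip(scene_linear, 0.0, 1.0) * 1023).astype(np.uint16)
back_from_10bit = ten_bit.astype(np.float32) / 1023.0   # 4.0 and 12.0 both become 1.0

# Path B: keep the values as floats (16-bit half, as in OpenEXR) until the end.
half_float = scene_linear.astype(np.float16)            # highlights survive intact

# A later grading decision -- pull exposure down two stops -- can still
# recover the lamp detail on Path B, but not on Path A.
print("10-bit path, -2 stops:", back_from_10bit * 0.25)
print("float path,  -2 stops:", half_float.astype(np.float32) * 0.25)
```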

“It doesn’t look like film, and it doesn’t look like digital. It just looks right.” Most film stocks and modern sensors capture more info than 10-bit log can handle, thus the need for IIF.

IIF is a way of handling multiple image formats output-referenced (referenced with respect to an output device, such as the P3 digital cinema colorspace); separates the lab work from the creative work, to make the colors and tone specified and consistent so that what winds up onscreen looks the way it should, with no surprises.

TV Case Study: “IIF-ACES Workflow” – Pankaj Bajpal, colorist at Encore; Curtis Clark, ASC

Triggered by desire to keep all 12 stops of dynamic range the F35 and SRW9000 are capable of. Clark and Bajpal took existing clips from the first season of TV show “Justified” and were able to demo the difference between traditional Rec.709 color correction and IIF ACES workflows. Traditional workflow wound up clipping highlights and losing shadow detail; IIF ACES kept that range (article in Film & Video / Studio Daily, will be in American Cinematographer March issue). If the cameras can capture these wide ranges, why are we color correcting the same old way? The old way, you need lots of power windows and masks; with ACES those extremes are rendered without special measures. As a side benefit, shadow noise is greatly reduced.

In the sample clips shown, lamps blow out and lose color in linear; ACES holds both detail and color. Outdoors, skies and backlit fur detail that are lost in linear are held in ACES. Skintone highlights, windows, and backlit lampshades render more neutrally.

If I had to characterize differences between the images shown, I might suggest the difference between shooting an EX1 or EX3 in Cinegamma 4 instead of STD3 gamma (though that’s a coarse and imprecise analogy), or the difference between grading RED footage in the old tools without FLUT S-curving, and using newer tools with FLUT. The input and reference transforms used in the ACES workflow capture the characteristics of both the camera and the display device and automate the clean conveyance of a camera’s dynamic range to the output device without a lot of manual fiddling about with S-curves and color correcting: It Just Works. The default image coming through the ACES workflow looks good to start with; it means the colorist’s time is spent enhancing the image and crafting a look, it’s not spent in the tedious rescue-work of keeping shadows and highlights from getting lost.
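Here’s a minimal sketch of that “It Just Works” transform chain as I understand it: an input transform decodes camera code values to scene-linear, grading happens in linear, and an output transform renders to the display at the very end. Both curves below are invented stand-ins, not the real ACES IDT/RRT/ODT math.

```python
# Minimal sketch of the idea described above: an input transform decodes the
# camera's encoding to scene-linear, and an output transform maps scene-linear
# to the display, so nobody hand-builds S-curves per shot. Both curves are
# made up for illustration; they are NOT the real IDT/RRT/ODT math.
import numpy as np

def toy_input_transform(code_values: np.ndarray) -> np.ndarray:
    """Pretend camera log decode: code value (0..1) -> scene-linear light."""
    return (10.0 ** (code_values * 4.0)) * 0.001          # ~4 decades of range

def toy_output_transform(linear: np.ndarray) -> np.ndarray:
    """Pretend display rendering: a gentle highlight rolloff, then display gamma."""
    tone_mapped = linear / (1.0 + linear)                 # Reinhard-style rolloff
    return np.clip(tone_mapped, 0.0, 1.0) ** (1.0 / 2.4)  # display gamma

camera_code = np.linspace(0.0, 1.0, 5)          # shadow .. highlight
scene = toy_input_transform(camera_code)        # the work happens here, in linear
graded = scene * 1.4                            # e.g. a simple exposure trim
print(toy_output_transform(graded))             # highlights roll off, nothing clips
```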

TV Case Study: “Community” – Jake Aust, Producer/Post

“Community” is a half-hour NBC comedy from Sony’s studios. Producers said they wanted to shoot using DSLRs (!), and not just for creative reasons: they wanted to save time and money and have some operational flexibility. But they were convinced to shoot on Arri D21s recording to XDCAM HD disks.

Jake Aust started on “The Office”, shot to tape in HDCAM, no color correction for dailies, still synced using double-system sound. Shot 60 42-minute loads per show, costing $25,000 per episode. Downconverted to DVCAM for editorial, night assistant synced sound overnight. Still relied on a post facility to downconvert; edited on Mac OS 9 Avid Meridians.

For the “Community” pilot, shot D21 recording to tethered HDCAM-SR, downconverted to DVCAM for editorial.

For production, tethered instead to PDW-1600 XDCAM decks. Nice on-set; behaves like a tape deck, so no separate data wrangler needed to shuffle CF or SD cards. But in post, behaves like a disk: mount the deck via FireWire and access its clips. Avid mounts it natively, ingests at 2.5x real time. Long GOP, so very demanding on the NLE’s CPU. 50 Mbit/sec 4:2:2; compressed, but no heavy compositing on this show so it’s OK. Overall, saved $13,000 per episode from cheaper stock (at 75 camera masters per show, that’s a lot of stock), no online time for downconverts, and staying XDCAM native throughout. The savings more than paid for 2 XDCAM decks and allowed purchase of an Avid MC Mojo DX, 4 Mac Pro octocore towers, and a 16 TB Unity storage system (enough for 8-10 episodes online).

(Backup plan, in case things didn’t work: XDCAM disk could have been treated like a tape.)

Night assistant ingests, then syncs sound in Avid. Daily screeners generated from Avid ALE by Sony 24pDailies.

Challenges: Some Avid XDCAM bugs to start with, mostly dealing with Unity. Editing was a bit “laggy” from the long-GOP footage. TC breaks when dubbing HDCAM SR material to XDCAM (e.g., varispeed material, 4:4:4 material) needed to be managed (TC breaks made the Avid upset).

Other advantages. Very loose production style, with cameras always running, so large amount of dailies; ScriptSync helps editors log things quickly. File-based workflow is a huge time saver for editors, exec producers working in editorial. Editing in HD allows blowups, repositions with no guesswork as to sharpness.

TV Case Study: Homeland – Paul Chapman, FotoKem; Tom Vice, nextLAB/FotoKem

Given the changes in the industry, FotoKem formed nextLAB, a mobile dailies system:

Two cameras of interest, the Arri Alexa and the RED ONE. Alexa may be recording ProRes on SxS cards, or externally to HDCAM SR or solid state (S.two, Codex, etc.). REDs shoot R3Ds to CF cards or RED Drives.

Need to capture these materials on location, convert Alexa LogC to something viewable, sync audio, archive everything, track metadata, perform QC (e.g., 24″ monitors not big enough, need 50″ displays to see everything). A big issue with digital is not having eyeballs on the material at all stages the way you do with film.

“Homeland”, 20th Century Fox, Alexa / SxS / ProRes4444, look management on location. Clips viewed on-set, looks applied, drives shipped to FotoKem with color manifest files for the looks (metadata showing all the work done on a source clip; nondestructive editing; basically a dump of a SQL database logging all editing / grading operations). Checksums used throughout. FotoKem finds that you don’t always get a good read on a CF or SxS card, so they require a read/eject/reread process to verify content. Once at FotoKem, clip and manifest are ingested into main storage, and work continues: syncing, color correction, dailies generation, HDCAM SR master generation. Transcoding: getting the color spaces correct is a challenge with QuickTime! FotoKem has created a large wall chart of the various transcode combos and what the pitfalls are (and how to avoid ’em), which they’ll sell you a copy of for a couple of million dollars!
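The read/eject/reread check is easy to picture in code. Here’s a minimal sketch of the idea (my own, not FotoKem’s actual tooling): checksum every file on the first read, then compare after the card has been ejected and remounted. The mount path and hash choice are assumptions.

```python
# Sketch of the read / eject / re-read verification idea described above:
# hash every file on the first pass, then compare hashes after the card has
# been ejected and remounted. Mount path and hash choice are assumptions.
import hashlib
from pathlib import Path

def checksum_tree(root: str) -> dict[str, str]:
    """Return {relative_path: md5_hex} for every file under root."""
    sums = {}
    root_path = Path(root)
    for f in sorted(root_path.rglob("*")):
        if f.is_file():
            h = hashlib.md5()
            with f.open("rb") as fh:
                for chunk in iter(lambda: fh.read(1 << 20), b""):
                    h.update(chunk)
            sums[str(f.relative_to(root_path))] = h.hexdigest()
    return sums

first_pass = checksum_tree("/Volumes/SXS_CARD")     # read #1
input("Eject the card, remount it, then press Enter...")
second_pass = checksum_tree("/Volumes/SXS_CARD")    # read #2, after remount

mismatches = {p for p in first_pass if first_pass[p] != second_pass.get(p)}
print("verified" if not mismatches else f"re-copy these: {sorted(mismatches)}")
```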

Challenges resolved: brings the lab to the set, allows production errors to be caught and corrected earlier, makes sound syncing faster, easier, and more automated, and helps color decisions made on set make it back to the lab.

Case #2, “Glory Daze”, Warner, RED ONE MX, nextLAB set up at KMP (Keep Me Posted). RED files and wavs sent to KMP on drives. Used RED color tools instead of ASC CDL + LUT. Made new MXF master media using RED Rocket. Different workflow; similar outputs.

Next: Social and 3D Obligations; Sony announcements…


“Social Obligation”

Live Action Feature Film Workflow: The Social Network – Michael Cioni, DI Supervisor, LightIron; Katie Fellion, DI Producer, LightIron; Ian Vertovec, colorist, LightIron; Hameed Shaukat, VFX, a52.

Michael: Shot RED ONE MX, REDCODE 42, at 4K 2:1 (4096×2048), to allow reframing. Pix are very locked-down by design; repositioning allowed removal of operator bobbles. Shot T1.3 throughout on Master Primes. Cut in FCP using ProRes LT, conformed in Adobe Premiere, graded on a Quantel Pablo; 1200 VFX shots.

“House-less post.” No dailies post house; all done by Tyler Nelson near-set. No offline post house; offlined in David Fincher’s office. No conform post house: With Adobe’s help, and RockPaperScissors, used Automatic Duck to conform using Premiere and After Effects software deBayering (even VFX like split-screens handled by Automatic Duck). No DI post house, really, to be explained below.

First film shot on R1 MX. Cleaner in the toe; about 13 stops dynamic range; much less fixed pattern noise than original sensor. Opening title sequence (campus, night) was shot pretty much as it looks; it wasn’t lit for the film. No FPN, nice clean toe, but some skies were black; these were replaced with a gradient, and they added some Harvard buildings on the skyline (since it wasn’t shot at Harvard).

Workflow unique to the film; hasn’t been done quite that way since at LightIron. Made multiple transcodes, h.264 high and low-bandwidth, with/without watermarks, and ProRes LT. Multiples are nice; make ’em right off the bat. On SN, the low-bandwidth 264s were sent to editorial to start with; ProRes came later but could be dropped in shot-for-shot since there were parallel versions of every clip.
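As an illustration of the “make multiple versions right off the bat” approach, here’s a rough Python/ffmpeg sketch that spins out parallel proxies of each clip with identical base names, so versions can later be swapped shot-for-shot. The codecs, bitrates, watermarking method, and an ffmpeg build with drawtext are all my assumptions, not LightIron’s actual pipeline.

```python
# Illustrative only: one way to spin out parallel proxy versions of each
# source clip (hi/lo bitrate H.264, with and without a burned-in watermark,
# plus ProRes), keeping identical base names so versions can be swapped
# shot-for-shot later. Codecs, bitrates, and paths are assumptions.
import subprocess
from pathlib import Path

VARIANTS = {
    "h264_hi":        ["-c:v", "libx264", "-b:v", "20M"],
    "h264_lo":        ["-c:v", "libx264", "-b:v", "3M"],
    "h264_lo_marked": ["-c:v", "libx264", "-b:v", "3M",
                       "-vf", "drawtext=text='SCREENER':fontcolor=white@0.4:fontsize=72"],
    "prores_lt":      ["-c:v", "prores_ks", "-profile:v", "1"],   # profile 1 = LT
}

def transcode_all(source: Path, out_root: Path) -> None:
    for name, codec_args in VARIANTS.items():
        out_dir = out_root / name
        out_dir.mkdir(parents=True, exist_ok=True)
        ext = ".mov" if name.startswith("prores") else ".mp4"
        cmd = ["ffmpeg", "-y", "-i", str(source), *codec_args,
               str(out_dir / (source.stem + ext))]
        subprocess.run(cmd, check=True)

for clip in Path("camera_originals").glob("*.mov"):
    transcode_all(clip, Path("proxies"))
```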

RED footage wasn’t denoised at all; it’s as-shot. Reliance was used to sharpen clips (Fincher wanted super-sharp, both for edges and for the buzzed focus shots, which at T1.3 must have been quite a few!). The master is a gamma corrected DCP; DVDs, broadcast, and film masters were made from the DCP.

Katie: data conform and VFX pipeline. Fincher’s thing was “precision.” When DI started, director was off scouting another project and not all of SN had been shot! When material came in, came in as unstabilized log DPX plates, often without vfx. As updated plates came in (stabilized, color-corrected “linear”, etc.) they could be dropped in as needed for finishing. For example, log dpx comes in; linear plate comes in for grading and is graded; original log plate sent to vfx; vfx returns the finished effect, and it’s dropped in and the grading is applied to it. It took 2.5 months to complete the DI. A HUGE FileMaker database was used to track all the versions of all the clips through the process.
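A toy model of that drop-in versioning logic (the real tracking lived in a huge FileMaker database; the field names and ranking here are invented): for each shot, the conform always picks the most “finished” plate available, so an updated plate replaces its placeholder without re-conforming.

```python
# Toy model of the drop-in versioning described above: for each shot, prefer
# the newest, most "finished" plate available (vfx final > graded linear >
# raw log). Field names and the ranking are invented for illustration.
from dataclasses import dataclass

RANK = {"log_dpx": 0, "linear_graded": 1, "vfx_final": 2}

@dataclass
class Plate:
    shot: str
    kind: str       # "log_dpx", "linear_graded", or "vfx_final"
    version: int
    path: str

def best_plate(plates: list[Plate], shot: str) -> Plate:
    candidates = [p for p in plates if p.shot == shot]
    return max(candidates, key=lambda p: (RANK[p.kind], p.version))

plates = [
    Plate("sc042_0010", "log_dpx", 1, "/plates/sc042_0010_log_v001.dpx"),
    Plate("sc042_0010", "linear_graded", 2, "/plates/sc042_0010_lin_v002.dpx"),
    Plate("sc042_0010", "vfx_final", 1, "/plates/sc042_0010_vfx_v001.dpx"),
]
print(best_plate(plates, "sc042_0010").path)   # the VFX final wins
```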

It’s a great workflow to follow if you have a lot of vendors and not a lot of time.

Every single shot was stabilized (and if it’s a three-shot comp, that means all three shots are stabilized individually and then married, and then the camera motion is put back in). Also: multiple splitscreens (all the twins shots); all the computer screens; snow, breath, and mist FX; boom removal; the Henley Regatta with the tilt-shift look.

Ian: FCP and Adobe tools really worked well together; FCP FX translated cleanly to the Adobe tools (e.g., splitscreens and composites) through Automatic Duck. Being able to precomp in FCP and then translate cleanly to After Effects was a huge time (and money) saver. The fact that the creative team can keep this stuff in-house means all the creatives are working for the director, not just for “a client”.

Hameed: The tilt-shift Regatta sequence. DI was already well underway before Henley was shot; a slug was used in the DI at that point. Fincher went to England and shot the race; dailies came back for cutting; then the entire sequence needed VFX (57 shots) in three weeks. All the tilt/shift look was done in VFX. Also: matte paintings, building and boat additions and enhancements; crowd replacement, sky replacement, adding marker buoys and a flying duck… Closeups were shot separately, so entire backgrounds needed replacing. Over 1000 stills were shot for use as VFX elements. In screening, David would stop on every shot, point with his laser pointer, and say, “add boats here, make this building more interesting, add trees, make this in focus, make this out of focus; this entire row of oars needs to stay in focus…” (the latter requiring frame-by-frame roto work). Fincher typically only had an hour to look at stuff, so the LightIron crew would shoot the screen with a Flip camera, recording the laser pointer and Fincher’s narrative, and they had the director’s commentary captured for reference.

Because the VFX just dropped into the DI sequences, Fincher never approved any VFX that weren’t already color-corrected. Being able to drop in new VFX clips and immediately have all the grading work applied was an incredibly efficient way to work.

“Outpost DI:” take the DI tools and mobilize them. LightIron put a Pablo with 2 hours of 2K storage in a road case. They built a prefab cabling system; take the system on the road, unroll the cable harness, plug in control surfaces and monitors and such, and you’re good to go. For SN, used a soundstage as a DI theater. RED Studios had a spare stage, and Fincher didn’t want to go to Culver City where LightIron was, so they spent a weekend converting one of the stages to a grading theater (this looks to be Stage 4 where I saw the HDRx demo, with its little stepped sofa platform). Pablo on-set, FCP nearby for quick editing, a Ki Pro to move files around with. No need for a traditional DI lab.

Fincher would only take his MacBook with him when he traveled, so Ian calibrated his and Fincher’s MacBook Pros together with an EyeOne probe, then built a LUT to send Fincher screeners that worked on his laptop’s display.
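The screener LUT step might look something like this minimal sketch: a per-channel 1D LUT baked into each frame before it goes to the director’s laptop. The calibration itself (EyeOne measurements and LUT generation) is out of scope here, and the curve below is a made-up stand-in.

```python
# Minimal sketch of baking a per-channel 1D LUT into screener frames so they
# look right on a specific laptop display. The LUT values here are a made-up
# stand-in; a real one would come from probe measurements of both displays.
import numpy as np

def apply_1d_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """image: float32 RGB in 0..1, shape (H, W, 3); lut: shape (N, 3)."""
    positions = np.linspace(0.0, 1.0, lut.shape[0])
    out = np.empty_like(image)
    for ch in range(3):
        out[..., ch] = np.interp(image[..., ch], positions, lut[:, ch])
    return out

# Stand-in LUT: slight gamma lift on red/green, small blue gain trim.
N = 33
x = np.linspace(0.0, 1.0, N)
lut = np.stack([x ** 0.95, x ** 0.97, np.clip(x * 0.98, 0, 1)], axis=1)

frame = np.random.rand(1080, 1920, 3).astype(np.float32)   # placeholder frame
corrected = apply_1d_lut(frame, lut)
```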

Also used audio from Fincher’s iPhone to match his review commentary to the screener footage. He’d go “boop” when he saw the two on the leader, and that’s all the LightIron folks needed to match his audio-only remote commentary to the pix on the clip they sent him.

REDColor was really designed for Rec709 output, not P3, but REDColor2 is much better. The DCP DSM (digital source master) was the master, then went out to Fuji Eterna print stock. Fuji was sharper than Kodak; Eterna RDI / 3513 was the best stock, determined after much testing. It was essential to preserve sharpness, as per Fincher’s demands.

Shooting RED MX at ISO 500 yielded “unprecedented clean blacks”. In grading, Fincher wanted shadows at 5%, highlights rarely above the 80s, skintones “with no red”, favoring greens and yellows.

Used the Dolby RGB LED-backlit reference monitor, “highly recommended”.

“DI is the last line of defense before the real world.”

“Raw files let you go back into old clips using new tools and pull out information you didn’t realize was in there.” Even so, you wouldn’t go back into those files for a new release; half the movie is the decisions made in post. So nowadays, yes, you should archive cam orig files, but you also need to archive all the post files (like all the color-correction metadata for each scene) as well.

[I would add that it’s even more complex than that; I finished a RED spot using FCP 6 and Color 1.0.3 a couple of years ago. The FCP / Color round-trip functionality changed for FCP 7 / Color 1.5; those old project files don’t work in the newer software. If I hadn’t archived the older code, as well as the software environment it ran in, I’d be S.O.L. Nowadays, you need to consider putting the media, the software, and even the platform it runs on in the archive, if you want to be able to re-create it later!]

“3D Obligation”

3D Live Action Feature Film Workflow: Tron: Legacy
– Steve Gaub, co-producer
– Andrew Loshin, VFX editor
– Dylan Firshein, first assistant
– Eric Barba, VFX Supervisor
– James Haygood, Editor

“TRON: Legacy” shot 400 TB of 3D data; that’s 300 million floppy disks or 16,000 Blu-Ray disks to hold it all. 1500 VFX shots (twice that, really, for 3D). 5 theatrical versions created: 3D digital, 2D digital, 3D IMAX, 2D film, 3D film; 5.1, 7.1, and IMAX audio; 30 different country day-and-date releases; DVD, Blu-Ray, 3D Blu-Ray, iTunes, and electronic sell-through.
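A quick back-of-the-envelope check of those equivalences, assuming 1.44 MB floppies and 25 GB single-layer Blu-Ray discs:

```python
# Back-of-the-envelope check of the comparison above, assuming 1.44 MB
# floppies and 25 GB single-layer Blu-Ray discs (decimal units throughout).
captured = 400e12                    # 400 TB
floppies = captured / 1.44e6         # ~2.8e8, i.e. roughly 300 million
blu_rays = captured / 25e9           # 16,000
print(f"{floppies:,.0f} floppies, {blu_rays:,.0f} Blu-Rays")
```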

Yes, it was a complex process. Just trust us…

All digital capture to Codex Digital recorders, brought through a Pablo system in a trailer on set for dailies. Both 2D and 3D dailies generated.

Tremendous amounts of previz in the form of basic animatics, mostly for the CGI bits (i.e., most of the show). Budget-wise there were too many shots to execute for the finished film (a single shot costs $25K – $250K). Spent a lot of time economizing in previz; it’s a great place to experiment and work out shots before committing to high-cost production. Even some stereo decisions were made in previz, and the stereo previz was used on-set to validate setups.

Started in 2008, so there weren’t a lot of 3D precedents: the Hannah Montana and Jonas Bros movies, not much else. First 3D show for everyone on the panel. Lots of questions about process. In editorial, they’d get the “fixed eye” image as well as an anamorphic side-by-side, and made them a multi-clip in the Avid.

That way, could edit as a “normal” 2D show and pop into 3D as needed to check things. Occasionally they would screen the show in 3D, but mostly worked in 2D; the 3D was very conservative in this show. Besides, before all the compositing was done, there wasn’t that much to see in 3D in the first place, just a lot of blue screens. Editing with previz proxies was a critical part of putting the show together.

Metadata: The Pace 3D rigs generated a lot of metadata, and the Codex recorded it via its UI. This resulted in a text file, one copy of which went to VFX and one of which went into the Avid. LTOs created in tar format (the o.neg, never accessed again) and in tina format (sic) for actual use.

PIX used for collaborative production, and Google Docs for text-based stuff. It would be nice if PIX and Avid were more closely integrated, so it didn’t require a separate posting step to get stuff online. CineGrid (discussed at HPA last year) also used for live collaboration.

The 3D digital version was the “master” grade; other versions were graded with reference to it. But, as with sound, they found that most real-world theaters didn’t render the show like their grading theater or the Skywalker Sound mixing stage did.

In the future, it would be nice to have a more integrated 3D production format, instead of the 2D-plus-side-by-side Avid multiclip. Also, more previz, more people, a bigger office!

“Every workflow is a snowflake.”


So what did we learn? Leon Silverman asked the attendees for feedback.

We’re in the “digital forest fire” period, where some workflows will die and some will be left standing. Yes, it’s been disruptive before; now it’s more disruptive on more different levels. It’s not gonna stop and it’s not gonna settle down.

The big takeaway: plan early… and test, test, test. The real skill set for today is being comfortable with being uncomfortable; with being open to learning and reinventing stuff every day.

There aren’t any more lines between preproduction, production, and postproduction. The lines aren’t just blurring, they’re merged, and there aren’t these different departments any more, there’s just filmmaking.


After the Super Session, Sony’s Alec Shapiro announced next-gen 17″ and 25″ Sony Pro OLED monitors, delivering April/May 2011. Full res, 10-bit displays. Also a prototype HDCAM-SR solid-state “memory camcorder”: “SR 3.0”. 5 Gbps, 1 TB memory cards. He also mentioned that Sony is planning for true 4K, 8K origination in the future.

Sony had a 25″ OLED displayed in a dark room, between a BVM-series CRT and a current top-end Sony LCD. The OLED had the color and vibrance of the CRT (without the CRT’s flicker) and the full 1920×1080 resolution of the LCD (without the LCD’s washed-out blacks and narrow viewing angle).

LCD, OLED, and CRT monitors (flicker, off-axis viewing angles not shown!).

Many attendees I spoke with said the 25″ OLED was the most impressive bit of new gear at the Tech Retreat. It’s rumored to cost about $1000 per diagonal inch, though; not cheap! We’ll just have to wait and see…

The OLED panel for the 25″ monitor. The 17″ is sitting just to the left.

Disclaimer: I’m attending the HPA Tech Retreat on a press pass, which saves me the registration fee. I’m paying for my own transport, meals, and hotel. No material connection exists between myself and the Hollywood Post Alliance; aside from the press pass, HPA has not influenced me with any compensation to encourage favorable coverage.

The HPA logo and motto were borrowed from the HPA website. The HPA is not responsible for my graphical additions.

All off-screen graphics and photos are copyrighted by their owners, and used by permission of the HPA.
