
OnLocation CS4 Review

Last year I wrote a review of various software-based waveform/vectorscopes. In that review I included my praise and criticism of Adobe OnLocation, though technically it’s a lot more than just simple wave/vector software.
 
Since that time, Adobe has released the CS4 collection, upgrading most of their apps with significant features and – probably more importantly – increasing the interoperability of the apps in the overall suite. One of the apps that got a huge overhaul in CS4 was OnLocation, so I wanted to revisit it and share a little of what has changed and why you should consider using this app on your next project.

One big consideration for many is that OnLocation used to be a PC-only application. With CS4, you can run OnLocation on a Mac.

Before I get too far into a review of the new and improved OnLocation, I should point out what this app is designed to do, for those who are unfamiliar with it. OnLocation’s name is a pretty good indication of what it does. You generally load OnLocation onto a laptop, though you could have it on a desktop computer, to monitor and/or capture footage in the field from your camera. The monitoring is the big thing, though capturing direct to disk is obviously a huge timesaver in itself. Basically OnLocation turns your laptop into an inexpensive version of a feature film’s “Video Village.” You can monitor the image itself and see a waveform, vectorscope and VU meters. I’ve seen plenty of footage come in from the field that would have looked a LOT better if the DP had had access to a waveform monitor instead of relying on their viewfinder. Having a waveform monitor and vectorscope on a greenscreen shoot is especially useful in eventually being able to pull a good key from the material. A minute or two adjusting lighting can save hours of compositing time and additional rendering.

Another obvious advantage to capturing video in the field through OnLocation is that you can review footage in the field without having to rewind the tape – and risk forgetting to play all the way through your footage, or to fast forward back to the blank area of the tape, before rolling again. Of course reviewing footage in the field is also a great way to waste a LOT of time. But if you can save even one shot that has a technical issue that wasn’t noticed during the shoot and grab a reshoot before leaving the location or setting up for another angle, then the review process will “pay for itself.”

In my previous review of OnLocation, I complained that the interface was actually TOO cool. It was designed to look like gear in a rack. It was a really interesting and fun interface, but it wasted a lot of space and the vectorscope display was very “retro” which actually detracted from its true purpose as an important, high tech tool. The new interface sheds all of the “fun” and assumes that you want to get some business done. The interface is much more modern and adopts many of the same features and operability that Adobe users are used to in apps like After Effects and Photoshop, like tabbed, draggable windows. That’s probably the biggest single improvement for me. How a user interfaces with an application is critical, and the change in the UI signals that this is a tool that can be taken seriously. I will miss some of the kitsch of the former UI, but the renovation is a welcome improvement.

The main thing of interest to me is the quality of the scopes. The trace – the part that actually shows the video levels – is very nice. Some software scopes are coarse and short on detail: all you see are big pixels indicating the levels instead of a nice smooth, wave-like trace. The scopes in OnLocation are very detailed and pleasing to read.

I do have a couple of issues with the scopes. The first is that there is no way to zoom the scopes to get a better look at either the top or bottom of the waveform or at the center of the vectorscope. This is a feature of most hardware-based scopes and some software-based scopes, and the ability to zoom in is really important for critical monitoring of the signal.

Another issue is with the scale of the waveform monitor. When watching the waveform monitor in Y (luma) mode, the graticule (the part of the monitor that gives you a reference for how high or low the video is) only displays a 0-255 scale, with video black sitting at 16. However, if you go to RGB Parade mode, black is displayed at 0, while in YUV Parade mode it is again at 16. Most scopes have a user preference to choose the type of scale you want to view – for example 0-100% or 0-100 IRE – and they let you choose where video black is indicated, either at 0 or 7.5 IRE. And, while you can set OnLocation to show “zebras” to indicate various levels, it would also be nice to have illegal blacks, gamut and gain flagged on the waveform and vectorscope as well.

Another small annoyance is that if you change the display of one of the scopes from one type of waveform to another, or from viewing the full raster to only a single line (a very nice feature), the scope does not update until you hit play on the clip again. I would prefer if you could make these changes and have the scopes update live, or almost live, without the need to restart the video. This is particularly a pain if you are viewing a scope in line mode, because the scope doesn’t update the information as you drag the line to different positions.
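
To put that scale complaint in concrete terms, here’s a bit of arithmetic of my own (not anything from Adobe’s documentation) showing how an 8-bit luma code value maps onto the 0-100 scale most scopes offer, assuming studio-range video where black sits at code 16 and white at code 235:

```python
def luma_code_to_percent(code, black=16, white=235):
    """Map an 8-bit luma code value onto a 0-100 scale.

    Assumes studio-range ("video level") encoding, where black is code 16
    and white is code 235; full-range material would use 0 and 255 instead.
    """
    return (code - black) / (white - black) * 100.0

# Black (16) reads 0, white (235) reads 100, and a value around code 126
# lands at roughly 50, the kind of reading a 0-100 graticule gives you.
for code in (16, 126, 235):
    print(code, round(luma_code_to_percent(code), 1))
```

A scope that only offers the 0-255 code values makes you do that conversion in your head, which is exactly the preference I’d like to see added.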


As I mentioned before, interoperability between apps is one of the main ways that Adobe is able to distinguish its products and add value. OnLocation is the first link in the suite’s video workflow, because it’s meant to be used in pre-production and production. The added support for complete XMP metadata in OnLocation allows it to pass all kinds of important and time-saving data on to the other applications that will touch the media later, like Premiere, After Effects, Encore and even Bridge – allowing for easier browsing for the perfect shot.
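
To give a rough idea of what that hand-off looks like under the hood, here’s a small Python sketch that reads a few fields out of an XMP sidecar file using only the standard library. The file name and the specific Dynamic Media properties pulled out here (scene, shot name, log comment, good take) are illustrative assumptions on my part, not a statement of exactly what OnLocation writes:

```python
import xml.etree.ElementTree as ET

# Namespace used by Adobe's Dynamic Media XMP schema.
XMPDM = "{http://ns.adobe.com/xmp/1.0/DynamicMedia/}"

def read_shot_metadata(sidecar_path):
    """Pull a few illustrative fields out of an XMP sidecar file."""
    root = ET.parse(sidecar_path).getroot()
    fields = {}
    for tag in ("scene", "shotName", "logComment", "goodTake"):
        # This sketch only looks for properties written as child elements;
        # real XMP can also store them as attributes, which is ignored here.
        elem = root.find(f".//{XMPDM}{tag}")
        if elem is not None and elem.text:
            fields[tag] = elem.text
    return fields

print(read_shot_metadata("clip_0001.xmp"))  # hypothetical sidecar file name
```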

Creating this metadata in the field is always a challenge, because you’re working quickly and should be more interested in the quality of what’s going on in front of the lens than how the pixels are getting tagged at the other end. However, OnLocation has a solution, albeit one that requires a pretty anal production team: you can enter shot data for every shot BEFORE you go into the field. This also allows you to kind of create a “to do” list for your shoot. If you use OnLocation in the field with a laptop, you can enter the information for every shot that you know you want to get as “placeholders” in a shot list. These placeholders can be arranged – using the numbering in the Sequence column – so that you can grab shots in order of location or time of day or availability or in script order. And if you have a chance to pick up a shot that’s out of your prearranged order, you can do that too.
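
To make the placeholder idea concrete, here’s a toy sketch of a pre-built shot list; the fields and the ordering scheme are my own stand-ins, not OnLocation’s actual data model:

```python
# A toy shot list: each placeholder holds the metadata entered ahead of
# time, plus a sequence number used to order the day's work.
shot_list = [
    {"sequence": 3, "scene": "12A", "shot": "CU hands", "location": "kitchen"},
    {"sequence": 1, "scene": "12A", "shot": "wide master", "location": "kitchen"},
    {"sequence": 2, "scene": "7", "shot": "exterior establishing", "location": "porch"},
]

# Re-ordering by sequence is how you'd shoot by location, time of day or
# script order: renumber the placeholders and sort.
for placeholder in sorted(shot_list, key=lambda s: s["sequence"]):
    print(placeholder["sequence"], placeholder["scene"], placeholder["shot"])
```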

As you’re shooting you can capture the footage for the shot – including multiple takes of the same shot – directly into the shot list, linking it to all of the metadata you entered earlier in the placeholder. This helps ensure that you don’t miss any shots that you planned on capturing AND speeds up the process in the field, because you’ve already entered much of the metadata you need. I am not the kind of producer to take advantage of this kind of stuff myself, but if you are incredibly meticulous and a serious planner, or you have the budget to have someone who is, then these tools are really great.

The process of logging footage is also sped up with highly customizable keyboard shortcuts. And if you are fortunate enough to have someone on set to log and capture using OnLocation, they can type in comments live while recording is taking place, and their notes will be attached to the exact timecode, making it easy to find those great moments – or to steer clear of the mistakes. If you are capturing with OnLocation but don’t have the personnel to log during the shoot, you can also add comments live during playback in the same way. So, if you review shots with the director or a client on set, you can capture their specific comments about exact moments of a take.

Capturing footage into OnLocation is meant to be done through FireWire. A useful feature – though not new – is the ability to record before you record. In other words, you can set a “pre-roll buffer” to a certain length – the default is 5 seconds – so that you are actually recording 5 seconds BEFORE you hit the record button. You can also grab stills throughout the recording process. There are also preferences for warnings in case audio or video levels are exceeded, which can help alert you to potential problems during a take so that you can record a safety take.

OnLocation doesn’t alter the video stream coming over FireWire, but you can choose whether to wrap it as AVI or QuickTime. You can record DV or HDV using OnLocation. When recording HDV, it stores the footage as HDV’s native M2T, which is an MPEG-2 transport stream; most applications will have to transcode this format to something else for editing. As always, it’s a good idea to test a complete roundtrip BEFORE a real shoot: make sure that you are able to capture and then get the files into and out of your editing application in the desired format/aspect ratio/framerate.
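
That pre-roll buffer is essentially a rolling buffer that never stops capturing; pressing record just stops it from throwing away old frames. Here’s a minimal sketch of the concept (my own illustration, not Adobe’s implementation), sized to hold the last five seconds of frames:

```python
from collections import deque

FPS = 30               # assumed frame rate for this sketch
PRE_ROLL_SECONDS = 5   # matches OnLocation's default buffer length

# The buffer always receives frames; the deque silently drops the oldest
# frame once it holds five seconds' worth.
pre_roll = deque(maxlen=FPS * PRE_ROLL_SECONDS)
clip = []              # frames of the take currently being recorded
rolling = False

def on_frame(frame, record_button_down):
    """Feed every frame the camera delivers through here (one take at a time)."""
    global rolling
    if record_button_down and not rolling:
        # Record was just pressed: start the clip with the buffered five
        # seconds so the take begins before the button press.
        clip.clear()
        clip.extend(pre_roll)
        pre_roll.clear()
        rolling = True
    elif not record_button_down:
        rolling = False
    if rolling:
        clip.append(frame)
    else:
        pre_roll.append(frame)
```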


OnLocation provides a few tools to help with composition in the field. You can have a “rule of thirds” grid superimposed over the image if you want, as well as a customizable “safe action” area. You can view the full raster of the image in OnLocation, or change to see what it looks like “overscanned.” This is good for me because the viewfinder on my DVX100a cuts off a little of the edges, so I think I have things off-frame, but I don’t. You can also zoom in 10:1 on the image in OnLocation’s monitor, or set it to fit the image to the monitor or to ensure a 1:1 ratio of actual pixels to monitor pixels, which is how it should be set for focusing. OnLocation also has a handy function that lets you preview different aspect ratios, so you can see what your 4:3 image would look like in 16:9 or vice versa.
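
The aspect-ratio preview comes down to simple cropping math. As a quick sketch of the arithmetic (mine, not Adobe’s), previewing a narrower frame as a wider one keeps the full width and masks off part of the height:

```python
def visible_height_fraction(source_aspect, target_aspect):
    """Fraction of the frame height kept when a narrower display aspect
    ratio is previewed as a wider one with a centered crop."""
    assert target_aspect >= source_aspect, "crop the width instead for narrower targets"
    return source_aspect / target_aspect

# Previewing 4:3 material as 16:9 keeps (4/3) / (16/9) = 75% of the picture
# height; the top and bottom 12.5% are masked off.
print(visible_height_fraction(4 / 3, 16 / 9))  # 0.75
```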

Another super useful feature in OnLocation is the split screen button beneath the field monitor. You can superimpose or split the screen between either two previously shot clips, or between a previously shot clip and the live camera. You can set the position of the split screen and/or the opacity of the overlay of one scene onto the other. This allows you to do things like match eye-lines or position actors from different scenes relative to each other. This could even help monitor continuity between scenes. A big additional asset to this feature is that it doesn’t just apply to the field monitor, but to the waveform monitor as well, so you can see a split screen of the levels between two different shots. This is critical in matching lighting and video levels and can save lots of time in color correction and rendering in post production.
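
Under the hood, an opacity overlay like this is just a weighted blend of two frames, and the split is a simple side-by-side composite. Here’s a minimal sketch of both ideas using NumPy arrays as stand-in frames; this is my own illustration, not anything from OnLocation:

```python
import numpy as np

def overlay(live_frame, reference_frame, opacity=0.5):
    """Blend a reference clip over the live frame at a given opacity."""
    live = live_frame.astype(np.float32)
    ref = reference_frame.astype(np.float32)
    return (opacity * ref + (1.0 - opacity) * live).astype(np.uint8)

def split_screen(live_frame, reference_frame, split_x):
    """Show the reference clip left of split_x and the live camera right of it."""
    out = live_frame.copy()
    out[:, :split_x] = reference_frame[:, :split_x]
    return out

# Dummy grey frames standing in for video (height x width x RGB).
live = np.full((480, 720, 3), 64, dtype=np.uint8)
ref = np.full((480, 720, 3), 192, dtype=np.uint8)
blended = overlay(live, ref, opacity=0.3)
split = split_screen(live, ref, split_x=360)
```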

OnLocation even comes with little cards to help with white balance, exposure and focus. Though I don’t know any DP who would consider doing it, you can place the included SureShot card at the plane of focus in your shot and focus the camera while watching the Focus Meter in OnLocation until the meter reading is as high as possible. I think the potential for misfocusing a shot is pretty high using this, though, because if the card is not exactly even with the part of the image you want to have in focus – the talent’s eyes, for example – then the Focus Meter will actually screw up the shot. Better to leave this one to a professional’s tried-and-true methods.
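
For the curious, focus meters like this are generally driven by some measure of local contrast or high-frequency detail, which peaks when the image is sharpest. I don’t know what Adobe actually uses, so treat the following purely as a sketch of the general idea:

```python
import numpy as np

def focus_score(gray_frame):
    """A simple sharpness measure: the variance of horizontal and vertical
    pixel differences, which rises as edges in the frame get crisper."""
    f = gray_frame.astype(np.float32)
    dx = np.diff(f, axis=1)
    dy = np.diff(f, axis=0)
    return float(dx.var() + dy.var())

# Racking focus while watching this number, you'd stop where it peaks,
# which is how the SureShot card workflow is described.
```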

It would also be cool to have some way of recording P2 and XDCam or even RED files in the same way, possibly even transcoding them or providing access to proxy files. I wouldn’t doubt that this is in the works.

Getting the footage from OnLocation to Premiere is pretty simple. You can drag and drop from OnLocation’s Project Folder directly, you can import in Premiere using the standard import dialog, or you can Export from OnLocation which allows you to only export clips you’ve marked as “good” or only selected clips.

Overall I was very impressed with the redesign of OnLocation. As most of you may know, OnLocation was previously known as DVRack. I was pretty pumped when DVRack came out originally because I had always wanted an inexpensive way to monitor with scopes in the field. OnLocation has certainly improved on DVRack at each step. Personally, my interest in using OnLocation is pretty much limited to direct recording and monitoring, but the myriad other features will no doubt appeal to other users.

While I’m on the subject of features that some users would use and others may not, I’d like to put on my Product Designer hat and offer a feature request or three that would definitely make OnLocation even more valuable.

One of the features I’d really like to see in OnLocation is some kind of media management/back-up scheme, where OnLocation either records the media to multiple locations simultaneously, or can be set to back up the media automatically to another drive in between takes and check the validity of the backup file. (I’ve sketched out roughly what I mean at the end of this piece.)

Another great asset would be a simple chroma keyer built in to OnLocation (or even a nice one, like from After Effects) so that you could test out lighting and composition of greenscreen work live while monitoring the results in OnLocation.

It would also be cool to have some kind of Bluetooth capability so that someone not sitting at the keyboard could place marks in the timeline during a live recording. For example the director or interviewer could mark certain moments that were either great or problematic or needed to be reviewed, or even mark the best take without stopping the production.

And in the “wouldn’t it be WAY cool” category: How about the ability to stream the OnLocation field monitor over the internet, so if there’s some interested party who couldn’t be on-set, they could monitor the shoot and provide feedback remotely! Just some food for thought for my friends at Adobe.
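
As a rough illustration of that first request, here’s what an automatic between-takes backup with verification might look like, sketched with Python’s standard library. The folder paths are made up and this is just my idea of the feature, not anything Adobe has announced:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path, chunk_size=1 << 20):
    """Hash a file in chunks so large clips don't have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_and_verify(clip, backup_dir):
    """Copy a captured clip to a second drive and confirm the copy matches."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    copy = backup_dir / Path(clip).name
    shutil.copy2(clip, copy)
    if sha256(clip) != sha256(copy):
        raise IOError(f"Backup of {clip} failed verification")
    return copy

# Hypothetical usage between takes:
# backup_and_verify("D:/OnLocation/clip_0001.mov", "E:/backup")
```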
