NAB Show

Do Away with the Array: RE:Lens for After Effects brings clarity to extreme fisheye scenes

Ultra-wide-angle lenses solve camera limitations and will make real use of 8k (and beyond).

RE:Vision Effects, best known for its Academy Sci-Tech award-winning motion-estimation-based effects such as Twixtor and ReelSmart Motion Blur, demonstrated RE:Lens for After Effects at NAB 2016. I was their invited guest artist, demonstrating the technology in their booth, and this article goes beyond press-release coverage to give a first-hand account of what I learned working with the technology. RE:Lens goes far beyond After Effects' built-in (and now rather ancient) Optics Compensation effect: it works with accurate FOV data and converts extreme-wide-angle, or "superfish," images with fields of view commonly up to 280 degrees. The result can be a single undistorted image in virtually any format, from 16:9 all the way to ludicrous formats like ultra-wide 9:1, and there are also tools to round-trip to and from the 2:1 equirectangular format that is the standard for virtual reality on platforms such as YouTube 360.
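The 2:1 equirectangular layout mentioned above is simple to sketch: each pixel column maps to a longitude and each row to a latitude. The following Python snippet is my own illustration of that mapping (not RE:Lens code), taking a 3D view direction and returning pixel coordinates in an equirectangular frame:

```python
import math

def ray_to_equirect(dx, dy, dz, width, height):
    """Map a view direction (any nonzero 3D vector) to pixel coordinates
    in a 2:1 equirectangular frame, the layout YouTube 360 expects.
    Longitude spans -180..180 degrees across the width; latitude spans
    -90..90 degrees down the height. Illustrative sketch only."""
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    lon = math.atan2(dx, dz)      # yaw: 0 means looking down +z
    lat = math.asin(dy / length)  # pitch: +90 degrees is straight up
    u = (lon / math.pi + 1.0) / 2.0 * width
    v = (1.0 - (lat / (math.pi / 2) + 1.0) / 2.0) * height  # row 0 = up
    return u, v
```

Looking straight ahead lands in the exact center of the frame, and looking straight up lands on the top row, which is why the poles of an equirectangular image stretch a single point across the entire frame width.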

Even if you’ve simply been frustrated working with GoPro footage in Optics Compensation (whose FOV controls don’t relate to real-world numbers, instead requiring that you measure straight lines to derive an accurate setting, as I have shown you how to do), you may find these plug-ins helpful. They also make it possible to work with footage from exotic lenses that have been out of bounds for the effects that ship with After Effects. The demo video below includes shots derived from the 280 degree hemispheric Entaniya Fisheye lens from Entapano, mounted on a GoPro via a RIBCAGE setup from Back-Bone Gear Inc. The entire setup, including camera and plug-ins, costs less than $1500 and was promoted in the RE:Vision booth.

Those are the basics, and you can stop reading here if you’re just looking for industry news. What follows is what I learned working on documentation, development and demos with RE:Vision, and why handling of extreme lens distortion could have far-reaching benefits even for that most mundane of video productions: the talking-head round-table discussion.

Before we get into applications of these lenses, let’s take a look at camera arrays, the current standard for capture of ultra-wide images all the way to a full 360 degree view. Keep in mind that arrays are useful for more than just VR: they can cover a scene that a moving camera can’t, and they provide the means for Matrix-style bullet-time effects.

Arrays are also a bit of a pain, for two main reasons. One is that they don’t provide a finished image; you have to stitch the results, which means averaging together overlapping areas of frames (which also happen to be the most lens-distorted areas of those frames, the edges). Many of the rigs are custom-built and don’t provide a means to sync color or frame capture, as can be learned the hard way with custom GoPro arrays. Short summary: arrays have to be fixed in post, sometimes expensively so.

The bigger problem with the array? In some ways it has thus far doomed virtual reality to be forever stuck in the wide shot, holding action 15 feet or farther from camera. Sure, you can have the talent hit and hold a mark that is perfectly framed by one of the cameras and move in closer, but they’d better stay on-axis or risk being ripped apart in the resulting image. For an example, take a look at the fun Diamond brothers interview posted to this very site earlier this week. Use your mouse to turn the virtual camera around and look at interviewer Neil Smith standing below the rig, just after the 2:00 mark. Or rather, look at the fraction of his image that is displayed. Welcome to the future.

What if you could capture a scene, including close-ups, with a single camera, without the need for stitching? What I found most fascinating among our RE:Lens demos was not the VR capture and display process, but high-definition scene capture: covering a scene with a superfish lens with the intention of pulling standard, undistorted-looking HD images out of it. Specifically, there are a couple of intriguing setups:

  • A 280º lens is attached to a camera laid on its back, pointing directly upward. The result covers the full 360º around the camera, up to the apex of the sky or ceiling (but omits the nearby floor, which, if you’re looking at it, may mean it’s a pretty boring scene). Forget about VR for a moment; imagine instead a conversation between three people seated around a table. With one camera on a stand at around head height, you can grab a clean single of each talking head that looks no different than if you had brought three cameras, and if a fourth person joins in, no problem.
  • A 180º Canon EF 8–15mm f/4L Fisheye USM lens is attached to a camera pointed at the action, the normal way of shooting. Because removing distortion from a fisheye view retains dimensional perspective, when you animate a virtual camera in post, pans and zooms don’t look like an old Ken Burns documentary; they appear to have the dimensionality of an actual camera in 3D space, as if the shot were captured with a crane or drone.
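To see why an undistorted extraction behaves like a real camera, it helps to look at the remapping itself. The sketch below is my own illustration, assuming an equidistant-projection fisheye (r = f·θ, a common model for such lenses; RE:Lens’s actual math is not public): for each pixel of the desired rectilinear output, it finds the source pixel in the fisheye image.

```python
import math

def rectilinear_to_fisheye(x, y, out_w, out_h, out_hfov_deg,
                           fish_w, fish_h, fish_fov_deg):
    """Map an output (rectilinear) pixel to its source pixel in an
    equidistant-projection fisheye image centered on the frame.
    All parameters are illustrative assumptions, not RE:Lens internals."""
    # Focal length (in pixels) of the virtual rectilinear camera.
    f_rect = (out_w / 2) / math.tan(math.radians(out_hfov_deg) / 2)
    # Ray direction for this output pixel (camera looks down +z).
    dx = x - out_w / 2
    dy = y - out_h / 2
    dz = f_rect
    # Angle off the optical axis, and azimuth around it.
    theta = math.atan2(math.hypot(dx, dy), dz)
    phi = math.atan2(dy, dx)
    # Equidistant fisheye: radius grows linearly with theta.
    r_max = min(fish_w, fish_h) / 2  # radius of the image circle
    r = theta / (math.radians(fish_fov_deg) / 2) * r_max
    return (fish_w / 2 + r * math.cos(phi),
            fish_h / 2 + r * math.sin(phi))
```

Animating the virtual camera is just a rotation applied to the ray before the theta/phi step, which is why post-production pans from a fisheye master read as true camera moves rather than flat image pans.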

The resulting master image in either case contains no gaps or seams, and there is no need for stitching or for managing color and frame sync. In the example dance scene, viewable starting at 1:51 in the product overview video above, the 180 degree image was captured with an 8k sensor, so even the softness apparent in 4k captures is avoided.


The After Effects virtual camera, animated in post. Covering a scene like this makes effective use of an 8k camera that might not otherwise make sense in an HD video production.

Not deciding is a decision, and leaving camera moves to be solved in post is not the future of cinema, unless the future is pans and zooms from a static location (since moving a VR rig is problematic enough that I’m not even considering it here).

And at 4k, extreme wide shots will appear soft, as the simplest math predicts. 3840 divided by 1920 is 2, so a 4k frame is only two HD shots wide; even dividing by 1280 makes the field of view just three shots wide. With a 360 degree master view, there are 9 standard 40 degree shots to be extracted, and for all 9 to be HD, the sensor would need to be, let’s see, 17,280 pixels across.

But that math also explains why the dance scene looks pretty great cropped to undistorted HD pans and even zooms: 8k through a 180 degree lens is, in fact, enough.
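The back-of-the-envelope math in the last two paragraphs can be written out explicitly. This is simply the article’s own arithmetic as a Python sketch; the linear estimate ignores projection nonlinearity toward the edges:

```python
def required_master_width(master_fov_deg, shot_fov_deg=40, shot_width=1920):
    """Width in pixels a master frame needs so that every extracted
    shot_fov_deg-wide shot is shot_width pixels across (linear estimate)."""
    return master_fov_deg / shot_fov_deg * shot_width

print(3840 // 1920)                # 2: a 4k frame is two HD shots wide
print(3840 // 1280)                # 3: even at 1280 wide, only three shots
print(required_master_width(360))  # 17280.0: the 17,280 figure above
print(round(7680 / 180 * 40))      # 1707: a 40-degree crop from 8k over 180 degrees
```

The last line suggests a 40 degree crop from an 8k/180 degree capture lands close to HD width, which is why the dance scene holds up where a 4k capture would not.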

If you’re starting to feel sold on the hemispheric fisheye/single sensor setup, be aware that there are currently a few associated annoyances and minor limitations that I haven’t mentioned:

  • The source image looks strange. If you’re shooting a lot of takes, it helps to be able to review and evaluate fisheye-lensed images in full motion, without removing distortion from each one, but that takes a little getting used to.
  • It’s difficult to keep the lens clean. Superfish lenses protrude, have a large surface area, and may not even have a lens cap, so you have to remember not to touch them and to wipe them every so often if they’re left exposed for a while.
  • Chromatic aberration is often significant near the edge of frame, where heavy light-bending means lens and sensor are no longer perfectly aligned; RE:Lens includes a Chromatic Aberration effect to remove this color offset.
  • About half of the image area from a rectangular sensor contains no useful data whatsoever. Since the image a hemispheric lens projects onto the sensor is perfectly circular, it would be fantastic to see a camera with a square (or even circular) sensor designed specifically for these lenses. By placing pixels only where light actually lands, a denser sensor with the same throughput could theoretically capture 50% or more extra useful data.
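The "about half" figure in that last bullet is easy to check. Assuming the image circle's diameter equals the sensor's shorter side, a quick sketch of my own arithmetic compares a 16:9 sensor against a square one:

```python
import math

def useful_fraction(aspect_w, aspect_h):
    """Fraction of a sensor's area covered by a circular fisheye image
    whose diameter equals the sensor's shorter side."""
    w, h = max(aspect_w, aspect_h), min(aspect_w, aspect_h)
    circle_area = math.pi * (h / 2) ** 2
    return circle_area / (w * h)

rect = useful_fraction(16, 9)   # ~0.44: over half of a 16:9 sensor is wasted
square = useful_fraction(1, 1)  # pi/4, ~0.785
print(rect, square, square / rect)
```

The ratio works out to 16/9, so for the same pixel count a square sensor would carry roughly 78% more useful image data, comfortably past the "50% or more" estimate above.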

Nevertheless, ultra-wide-angle lenses are a huge part of post-production today, and the set of plug-ins that makes up RE:Lens makes working with these images possible in After Effects. For the time being, this toolset is After Effects-only; for Premiere Pro VR production tools, check out SkyBox Studio from Mettle, which was featured in Mettle’s booth in the VR pavilion area at NAB 2016.


Designer of effects and experiences, author at LinkedIn Learning and Adobe Press (After Effects Studio Techniques).

1 Comment

Pierre Jasmin:

Thx Mark, actually I rented a GH4 (with firmware 2.0 or over) and it can shoot 2880×2880 (1:1) square video at 30p. The Sigma 4.5mm (using the EF mount with a Kipon adapter) is probably the best natural fit for that size on the m43 sensor (the part it then records at that res). It does bin/crop the sensor differently, though: the image circle when shooting 1:1 is only something like 10% (by eye) larger than when shooting 16:9, as opposed to 25% more height. I wished they did not do that as it would allow one… Read more »