Ncam, a company known for its work in augmented reality for television and film production, will reveal ground-breaking advances in creative capabilities at NAB 2016. Ncam will present a live demonstration of its photorealistic augmented reality solution based around the Unreal Engine by Epic Games.
Combining Ncam’s technological capabilities with Epic Games’ Unreal Engine, augmented reality takes a massive leap forward into photorealism. The Unreal Engine was first showcased in 1998 to support a first-person game that became an industry reference: Unreal. Although developed for first-person shooters, the engine soon proved adaptable to other types of games and became a tool used by many developers. Its design to run on multiple platforms made it even more attractive. In 2009 a free version of the Unreal Engine 3 SDK – named the Unreal Development Kit (UDK) – was released, contributing to widespread adoption of the tool.
The Unreal Engine has been used beyond games, from TV series like Lazy Town, where it was used to generate virtual sets for real-time integration of actors and puppets, to a training simulator for the FBI. Successive versions of the engine have been launched, each taking the possibilities one step further. The Unreal Engine is associated with many popular games on the market, from Tom Clancy’s Rainbow Six to Assassin’s Creed, as well as games tied to Hollywood films such as Harry Potter, Star Trek, Star Wars and Batman.
“It is clear from discussions with customers, studios and productions, especially around episodic television, that there is demand in augmented reality for set extensions, virtual environments, previsualization and finished visual effects,” said Nic Hatch, CEO of Ncam. “To do that convincingly calls for photorealistic graphic elements that audiences believe to be real. And to get photorealism in real-time means tapping into the power of games engine technology.
“That is why we are working with Epic Games’ Unreal Engine and can now demonstrate real-time photorealism from multiple, freely-moving cameras,” he said. “Our demonstrations at NAB this year are going to be absolutely stunning, and a real game-changer for creative television and movie-making.”
A separate demonstration area will show the work that Ncam is doing on matching natural lighting. Ncam’s Relight measures and models the light in a scene, allowing virtual elements to cast shadows on actual objects and to respond to lighting changes based on the surrounding environment in real time.
A third presentation will show the latest addition to the data output of the Ncam camera tracking system: depth data. Armed with this, broadcast augmented reality graphics such as sports analysis will be able to allow presenters to walk around and through virtual graphics.
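The idea behind per-pixel depth output can be illustrated with a minimal compositing sketch: wherever the virtual graphic is closer to the camera than the real scene, the graphic is drawn; wherever a presenter stands in front of it, the camera pixel wins, so the presenter appears to walk through the graphic. This is an illustrative sketch only, assuming per-pixel depth maps exist for both the camera feed and the CG render; the function and parameter names are hypothetical and not Ncam's API.

```python
import numpy as np

def composite_with_depth(camera_rgb, camera_depth, cg_rgb, cg_depth, cg_alpha):
    """Depth-keyed compositing over a camera image.

    camera_rgb / cg_rgb : (H, W, 3) float arrays in [0, 1]
    camera_depth / cg_depth : (H, W) per-pixel distances from the camera
    cg_alpha : (H, W) opacity of the virtual graphic
    """
    # A CG pixel is visible only where it is nearer than the real scene,
    # so real foreground objects (e.g. a presenter) occlude the graphic.
    cg_in_front = cg_depth < camera_depth
    alpha = cg_alpha * cg_in_front
    # Standard alpha blend, with the occlusion-aware alpha.
    return alpha[..., None] * cg_rgb + (1.0 - alpha[..., None]) * camera_rgb
```

Real systems do this blend on the GPU per frame, but the per-pixel depth comparison is the same principle.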
“So far, Ncam has been recognized for its camera tracking expertise, which is still the most accurate and fastest way to track any camera, anywhere,” said Hatch. “That remains our foundation and what is really exciting is that we are now combining our camera tracking with Ncam’s new relighting and depth technology, delivering something no-one else can do: photorealistic augmented reality.”
Ncam can be seen on booth C10345 at NAB 2016. Demonstrations of Ncam camera tracking technology can also be seen in conjunction with graphics from Vizrt (booth SL2417), Orad (part of Avid, booth SU902) and Brainstorm (booth SL4617).