
NVIDIA DLSS takes Virtual Production to the next level

Deep Learning Super Sampling is an asset for the content creation industry, enhancing all kinds of workflows, including virtual production, animated films and more.

The gaming industry quickly adopted NVIDIA DLSS with titles such as Minecraft RTX, Fortnite and Cyberpunk 2077 seeing up to 2x frame-rate boosts from the technology. DLSS is like magic: the technology uses AI to turn lower resolution rendered images into higher resolution ones. It gives users the performance of rendering at lower resolution while maintaining the visual quality of higher resolution.

AI is taking visual fidelity in computer graphics to a new level, and spearheading this revolution is NVIDIA Deep Learning Super Sampling technology. This exclusive asset works in tandem with NVIDIA RTX GPUs, which feature dedicated AI processors, called Tensor Cores, that enable DLSS to run on rendered images in real time. Deep Learning Super Sampling opens doors for performance-hungry processes. Now, the content creation industry is deploying this technology to enhance all kinds of workflows, including virtual production, architectural visualization, animated films, product design, simulations and data generation.
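To make the idea concrete, the sketch below shows where an upscaler like DLSS conceptually sits in a render loop: the engine shades a reduced internal resolution, and the AI model reconstructs the full output resolution on the GPU's Tensor Cores. This is a minimal illustration only; the per-axis scale factor is the commonly cited ratio for the "Quality" mode, and the upscale call in the comment stands in for the real DLSS/NGX SDK API, which it does not reproduce.

```cpp
// Conceptual sketch only -- the quality-mode scale factor below is the commonly
// cited per-axis ratio (an assumption), and upscale_with_dlss() in the comment
// stands in for the real DLSS/NGX SDK call.
#include <cstdio>

struct Resolution { int width; int height; };

// Pick an internal render resolution from the target output resolution.
Resolution internal_resolution(Resolution output, float per_axis_scale)
{
    return { static_cast<int>(output.width * per_axis_scale),
             static_cast<int>(output.height * per_axis_scale) };
}

int main()
{
    const Resolution output = { 3840, 2160 };   // what the LED wall / display needs
    const float quality_scale = 0.667f;         // "Quality" mode, roughly 2/3 per axis (assumption)

    const Resolution internal = internal_resolution(output, quality_scale);

    // The engine shades only internal.width x internal.height pixels...
    std::printf("Render %dx%d, output %dx%d (%.0f%% of the output pixels shaded)\n",
                internal.width, internal.height, output.width, output.height,
                100.0f * (internal.width * static_cast<float>(internal.height)) /
                         (output.width * static_cast<float>(output.height)));

    // ...and the AI upscaler reconstructs the full-resolution frame in real time:
    // upscale_with_dlss(internal_frame, motion_vectors, depth) -> output_frame
    return 0;
}
```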

Leading creative developers such as 51 World, Goodbye Kansas Studios, Hyundai Motors, Lucasfilm’s Industrial Light & Magic, Orca Studios and Surreal Film Productions are taking advantage of DLSS to boost performance and realize their creative visions. Exciting new experiences are being made as film, TV and YouTube projects increasingly adopt virtual production techniques, with studios and artists under immense pressure to iterate as quickly as possible and keep the cameras rolling.

HYPER BOWL: How we killed the greenscreen

HYPER BOWL and Orca Studios are two examples of studios exploring the new options that Deep Learning Super Sampling brings to the table. The team at HYPER BOWL says that The Mandalorian opened the door to a new era of filmmaking, and that this revolutionary Hollywood technology is now available in Munich at the HYPER BOWL. With it, the company says, “we killed the greenscreen”.

In the HYPER BOWL, the team adds, “you are shooting in a 100% controllable environment! Totally independent of light or weather, you can go to different places around the globe with the push of a button, without traveling. Even the cast can shoot several scenes within one day without changing locations.”

By adding NVIDIA DLSS to the virtual production pipeline of HYPER BOWL, NSYNK has seen big increases in render performance, nearly doubling frame rates even with real-time ray-tracing effects, a previously impossible combination. DLSS also provided a cleaner and more stable image compared to even full-resolution anti-aliasing techniques.

A revolutionary system based on Unreal Engine

“DLSS is game-changing, as it improves render results while lowering performance cost. It has become an integral part of our render pipeline,” said Dennis Boleslawski, head of virtual production at HYPER BOWL.

Another example of the power of NVIDIA Deep Learning Super Sampling technology comes from Orca Studios, where the team believes “this is the perfect time to explore and apply the potential of VR to the film and television industry. We developed a revolutionary system based on Unreal Engine, which could radically change the way in which filmmakers approach pre- and post-production. An Unreal virtual camera is plugged into an AR kit or tablet, which feeds camera position to the engine, enabling the filmmaker to use that device as a portal into the virtual world of the project.

“The previz system is just part of the whole story. Imagine a virtual storyboard inside which it is possible to move around freely. For a fraction of the price of the current popular systems, we make it possible to pre-visualise entire virtual environments, exploring them as if they were real. It is possible to experiment with camera angles, camera movements, grip and lighting before the actual shoot takes place, not only saving time and money but allowing directors and cinematographers full control over the storyboard and the shoot.”
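Orca’s description suggests a straightforward shape for such a previz camera: a tracked handheld device streams its pose into the engine, and an Unreal actor applies that pose to a cine camera every frame. The sketch below is a minimal, hypothetical version of that hookup; FetchTrackedDevicePose and the stage-origin offset are stand-ins for whatever tracking backend (ARKit device, tablet, LiveLink source) a stage actually uses, not Orca Studios’ implementation.

```cpp
// VirtualCameraRig.h -- hypothetical sketch of a tracked "portal" camera in Unreal.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "CineCameraComponent.h"
#include "VirtualCameraRig.generated.h"

UCLASS()
class AVirtualCameraRig : public AActor
{
    GENERATED_BODY()

public:
    AVirtualCameraRig()
    {
        PrimaryActorTick.bCanEverTick = true;
        CineCamera = CreateDefaultSubobject<UCineCameraComponent>(TEXT("CineCamera"));
        RootComponent = CineCamera;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Pull the latest pose reported by the handheld device each frame.
        FTransform DevicePose;
        if (FetchTrackedDevicePose(DevicePose)) // assumption: supplied by the tracking backend
        {
            // Offsetting by the stage origin turns the physical device into a
            // "portal" that frames the virtual set from wherever the operator stands.
            CineCamera->SetWorldTransform(DevicePose * StageOrigin);
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    UCineCameraComponent* CineCamera;

    // Where the physical stage sits inside the Unreal level.
    FTransform StageOrigin = FTransform::Identity;

    // Placeholder for the real tracking integration (network packet, LiveLink, etc.).
    bool FetchTrackedDevicePose(FTransform& OutPose) const
    {
        OutPose = FTransform::Identity;
        return true;
    }
};
```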

Orca Studios: DLSS is our go-to render configuration

Orca Studios has also developed its use of LED screens. Static images or pre-rendered footage are played back on giant LED screens, which allows the camera to capture realistic interactive lighting and more “fly-on-the-wall” angles, while immersing the actors in the scene and allowing more shots to be captured in camera. The team adds that “recently this system has proved incredibly successful for features such as First Man, where it proved key in providing visor reflections and even reflections on dials inside spacecraft.”

At Orca Studios, NVIDIA DLSS has also become an essential part of the real-time rendering setup for virtual production. With the high resolution of their LED walls and the ray-tracing-based renders needed for realistic lighting, DLSS is Orca Studios’ go-to render configuration.

“DLSS is a really powerful way in which we’re bringing the more demanding Unreal Engine renders to a high enough performance/noise ratio needed for live virtual production,” said Adrian Pueyo, virtual production supervisor at Orca Studios.


Download DLSS for Unreal Engine

At Lucasfilm, Stephen Hill, principal rendering engineer, says that “NVIDIA DLSS helps supercharge ILM StageCraft 2.0’s Helios renderer, allowing us to drive higher-fidelity datasets at greater resolutions and framerates within our real-time LED shooting environments.” Zhengzhi Cao, technical director at Surreal Film Productions, notes that “DLSS gives a great boost in performance. It’s like a free hardware upgrade. With DLSS, we can use high fidelity film assets with little to no optimization in a real-time game engine, enable real-time ray tracing features, or render ultra high definition images on gigantic LED screens with only a couple of render servers.”

These are just a few examples of how the rapid adoption of NVIDIA DLSS is revolutionizing content creation. DLSS is accelerated by a broad range of NVIDIA products, from professional design and visualization solutions for enterprises to Studio PCs and laptops for creators.

For app developers, NVIDIA DLSS is now available in mainline Unreal Engine as a plugin, compatible with UE 4.26 on the Unreal Engine Marketplace. To access the full DLSS SDK for custom engine integrations, visit the NVIDIA DLSS developer page.
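For projects already on the plugin, a hedged sketch of toggling DLSS from game code is shown below. The console variable names are assumptions about what the plugin exposes and may differ between plugin versions; in practice, the plugin’s own Blueprint library and project settings are the documented path.

```cpp
// Hypothetical sketch: toggling DLSS at runtime through Unreal's console variable
// system. The cvar names ("r.NGX.DLSS.Enable", "r.NGX.DLSS.Quality") are assumptions
// about the plugin's exposed variables and may vary by plugin version.
#include "HAL/IConsoleManager.h"

static void SetDLSSEnabled(bool bEnabled, int32 QualityMode)
{
    if (IConsoleVariable* EnableVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Enable")))
    {
        EnableVar->Set(bEnabled ? 1 : 0, ECVF_SetByCode);
    }

    if (IConsoleVariable* QualityVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("r.NGX.DLSS.Quality")))
    {
        // Assumed mapping of quality modes to integers; check the plugin
        // documentation for the real values.
        QualityVar->Set(QualityMode, ECVF_SetByCode);
    }
}
```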
