
Epic Games: real-time production is the future of content creation


Real-time production is the future of content creation, according to Epic Games, and Unreal Engine is putting that front and center at Siggraph this year.

The affirmation above, made by Marc Petit, Enterprise GM, Epic Games, suggests the direction the company is taking. We’ve seen the presence of the Unreal Engine grow in recent years, “bridging” different segments of the imaging industry, and this edition of Siggraph represents a bold step forward, with the software associated with areas as diverse as architectural and design visualization, an animated short film rendered entirely in real time using UE4, and Michelangelo’s “Statue of David” in virtual reality, which allows you to walk around the 17-foot (5-meter) statue and use a virtual scaffold to see the work up close as few have seen it before.

It’s astonishing what software and computers can do these days, and the examples above open the door to a new perception of what can be created in the future. Experiments like Virtual Mike, a photorealistic virtual human modeled on Mike Seymour (FXGuide & University of Sydney) and demonstrated in the VR Village at Siggraph 2017, reveal the power of the Unreal Engine, here used to render the human figure in real time.

On a more classical note, we have “A Hard Day’s Night,” an animated short film created for Epic’s game Fortnite and rendered entirely in real time using UE4. Indistinguishable from a traditionally rendered piece, the short demonstrates how Unreal’s advanced cinematic tools empower storytellers of all experience levels to realize the benefits of real-time workflows.

In fact, the goal is more ambitious. Epic Games believes that “the future of film is real-time” and has already demonstrated as much in a project developed with The Mill, “The Human Race,” an interactive film made possible by real-time VFX that blurs the line between production and post. Powered by UE4, The Mill Cyclops and The Mill BLACKBIRD, the film revolutionizes the conventions of digital filmmaking to create a film you can play.

Epic Games has been at the forefront when it comes to exploring new technologies for storytelling, as ProVideo Coalition documented in an article last May, and at Siggraph its presence goes in different directions, from the VR Jam at the VR Village, where finalists learn how to transform their cinematic assets into VR experiences, to simulations of autonomous vehicle driving and urban design visualization.

“Real-time” is a key word for Epic Games now, so it comes as no surprise that this year’s keynote presentation at DigiPro was titled “Real-Time Technology and the Future of Production.” The keynote presenter was Epic CTO Kim Libreri, and the event also featured a talk about creating a three-minute cinematic trailer for the video game Fortnite, 100% in Unreal Engine and 100% in real time.

Kim Libreri, one of the original creators of the now-legendary “Bullet Time” technique from The Matrix trilogy, joined Epic Games in 2014, having previously served as Chief Technology Strategy Officer at Lucasfilm, where he led the development of the highly regarded Star Wars: 1313 prototype. He has worked as a Visual Effects Supervisor at Industrial Light & Magic, Digital Domain and ESC Entertainment, supervising visual effects and digital technology on more than 25 films, including Super 8, Speed Racer, Poseidon and Space Jam.

“We are committed to simplifying the integration of Unreal Engine in content creation workflows across animation production, AR/VR and visualization,” said Marc Petit, Enterprise GM, Epic Games. “Unreal Engine is powering Siggraph’s most impressive new AR/VR showcases along with the first ever real-time CG shorts at the Computer Animation Festival. Real-time production — with the ability to iterate interactively without render farm delays — is the future of content creation and Unreal Engine is putting that front and center at Siggraph this year.”

Datasmith, which will be available as a private beta in August (unrealengine.com/beta), is designed to help artists and designers simplify the process of importing data into Unreal Engine from over 20 CAD/DCC sources, including Autodesk 3ds Max. Datasmith provides a high-fidelity translation of common scene assets such as geometry, textures, materials, lights and cameras. This results in significant time savings, as content creators don’t need to re-create this work in the engine.

This technology was publicly previewed for the first time at the Unreal Engine User Group at Siggraph on Monday July 31 at the Orpheum Theatre. Using data from Italian architects Lissoni and designers at Harley-Davidson Motorcycles, Epic’s Chris Murray showed how Datasmith’s 3ds Max plugin and CAD importers enable users to import files that retain visual fidelity to the source — saving hours in data transfer and preparation time — taking the user most of the way to creating a fully interactive, photo-real, real-time visualization experience.

The new Alembic workflow enables the integration of high-quality animated mesh transformations with skeletal deformations in the engine, in real time. Developed for the Fortnite cinematic trailer “A Hard Day’s Night,” the tools are available in Unreal Engine 4.17, due in early August.

The advancements and new tools revealed by Epic Games at Siggraph enable content creators to easily add Unreal Engine to existing production pipelines and reinforce Epic’s leadership in real-time workflows, blurring the line between production and post-production.
