
MetaHuman Animator: digital actors will never replace human actors

The use of digital humans in the film industry may not be as digital as some believe: capturing the performance of a human actor and then using tools like MetaHuman Animator may be the best option.

Last June, when ProVideo Coalition shared the news that MetaHuman Animator, from Epic Games, was finally available, we mentioned that, to show how it all worked, Epic Games had shared a short film, Blue Dot, created by Epic Games’ 3Lateral team in collaboration with local Serbian artists, including renowned actor Radivoje Bukvić, who delivers a monologue based on a poem by Mika Antic. The performance was filmed at Take One studio’s mocap stage with cinematographer Ivan Šijak acting as director of photography.

Now Epic Games has revealed more about the process, information that arrives as the discussion about the use of digital humans or digital actors extends to an industry with strong links to the movie industry: video games. In a recently published article, Starfield, Baldur’s Gate 3, videogames, AI and IBC2023, we mentioned how Baldur’s Gate 3 has set a new high bar by exploring the use of cinematic cut-scenes in ways never tried before.

Baldur’s Gate 3, from Larian Studios, sets new records, with 174 hours of cinematic video capture – more than the whole Game of Thrones TV series – and the use of 248 human actors, who not only recorded the voices for the game but also put on mocap suits so their movements could be recorded for Baldur’s Gate 3 cut-scenes. As Aliona Baranova, Performance Director on Baldur’s Gate 3, wrote on Twitter (now X): “ALL the NPCs and not just the companions put on a mocap suit and their movements, gestures and physical choices were recorded & sent along with the audio files for the animators to use in game. Which is why the performances feel so *alive*”.

Digital actors cannot perform

The experience Baldur’s Gate 3 offers those who play the game is unlike anything else, and raises a question: will fully digital actors ever replace humans? The answer is NO, and it’s not just the human actors who say it. In fact, Epic Games, in the article revealing how Blue Dot was made, now notes that “In the world of computer graphics, creating realistic digital humans has been something of a holy grail for a couple of decades now. Many have achieved the goal of making a still image of a digital human that is indistinguishable from reality. But often, it’s when these characters are called on to perform—especially if they are required to be rendered in real time—that the ‘Uncanny Valley’ creeps in.”

So, all the technology serves one purpose: to capture the visual aspects of real human emotions in such a way as to faithfully translate them to a digital actor… one that is based on the mocap recording of a human actor. That is probably the right way to go, and that’s where the recently announced MetaHuman Animator comes in: the tool, which allows you to use a smartphone to capture an actor’s movements, brings stunning fidelity to cinematics. MetaHuman Animator is one more tool in a toolset that includes MetaHuman Creator, which made the creation of realistic digital humans accessible to everyone, and Mesh to MetaHuman, which takes the technology a step further by enabling you to create a MetaHuman based on a sculpt of a character or a scan of an existing person.

To push MetaHuman Animator to its limits during its development, the Serbia-based 3Lateral team collaborated with local artists and filmmakers to produce Blue Dot, the short film about which ProVideo Coalition wrote before, which brings traditional filmmaking techniques to digital productions.

Create cinematics of stunning fidelity

The film demonstrates how MetaHuman Animator unlocks the ability for teams to create cinematics of stunning fidelity and impact by using a creative on-set process typical of traditional filmmaking to direct and capture a performance. What’s more, the quality of animation delivered straight out of the box was so high, only a small team of animators was required for final polishing.

To get the project underway, Bukvić’s likeness was captured at 3Lateral, using the company’s custom 4D scanning techniques. While the 3Lateral team created a bespoke MetaHuman rig from this data, the animation data created by MetaHuman Animator from a captured performance can be applied to any digital character whose facial rig uses the control logic corresponding to MetaHuman Facial Description Standard—including those created with MetaHuman Creator or Mesh to MetaHuman.

Even though the piece was to be entirely digital, Šijak and his team drew heavily on their traditional filmmaking experience throughout the process. For added realism, real-world movie cameras, complete with dolly tracks, were brought into the mocap studio. These were tracked, along with Bukvić’s body, and of course, his face. All this enabled the team to precisely recreate the camera motions in Unreal Engine, with Bukvić’s MetaHuman acting directly to the camera.

Lighting setup recreated digitally in Unreal Engine

To design the lighting exactly as they would for a live-action shoot, they brought in physical lights and adjusted them to get the look they wanted on Bukvić. With the chosen lighting setup recreated digitally in Unreal Engine, they could quickly preview how the lighting was working with Bukvić’s animated MetaHuman while the actor was still on the motion capture set, and get another take right away if required. And of course—unlike with physical lighting—the lighting could be tweaked after the fact.

It’s the immediacy of these results and the iterative process that this facilitates between artists, animators, and actors—combined with the fidelity of the capture—that makes MetaHuman Animator such a powerful tool for creating cinematics. Animation studios can now work with an actor on set, using their creative intuition to direct a performance that will faithfully translate into the animated content to create emotional cinematic pieces.

“You cannot distinguish the quality of the output that was done through this technology or shot on set with a real camera,” says Šijak. “The camera work and the performance of the actor and everything that gets the audience involved in the shot, it’s there. Nothing’s lost.”

A comment on YouTube about the BTS video says it all: “soon movie studios would have to clearly specify that their films are either live-action or animated because we would not be able to tell the difference just by looking at it.”

Follow the link to read the whole story “Behind the scenes on Blue Dot: MetaHuman Animator brings stunning fidelity to cinematics”.
