DGene’s new development and production facility in Baton Rouge, Louisiana, features technology for creating 3D holograms of people and objects.
Founded to offer a variety of artificial intelligence solutions for Hollywood filmmakers, DGene made the news last April when the company announced the start of its operations in Los Angeles, along with the introduction of Amber, the perfect influencer: a virtual character tailored to match the values of any brand. The company’s AI actors are, says DGene, “convincing portrayals of historical figures, living people, and unique characters. They can appear photoreal or stylized; integrated into movies, streaming content, or advertising; used as digital influencers; or employed to preserve personal legacies.”
Virtual Amber is just part of what DGene brings to Hollywood, as the Silicon Valley- and Shanghai-based developer of AI technology launched operations in Los Angeles. The company’s portfolio, as ProVideo Coalition noted last April, offers AI solutions for content creation, including virtual actors, virtual production, visual effects, and film restoration.
After introducing Amber, DGene has announced the opening of a research, development, and production facility in Baton Rouge, Louisiana. The centerpiece of the new site is a 900-square-foot volumetric capture stage that leverages AI technology to create “holograms” of humans and objects for use in augmented reality (AR), virtual reality (VR), holographic displays, mixed-reality glasses, and framed video.
DGene’s first volumetric stage in North America
Volumetric video is projected to become a billion-dollar industry over the next few years, with applications in entertainment, interactive gaming, marketing, digital advertising, training, education, and other areas. DGene is developing tools to make the production of 3D imagery practical and efficient. It is the leading provider of volumetric capture services in China, with four stages there and a fifth planned. The Baton Rouge stage is its first in North America. DGene chose the site for its proximity to Louisiana’s deep film and television production infrastructure, as well as the state’s technological research resources and tax incentives for media production.
“We are excited about the potential of volumetric capture, and working to make it affordable and routine,” says DGene CTO Jason Yang. “We want to work with content producers to create compelling, new forms of immersive experiences.”
DGene is currently collaborating with Edward Bilous, composer, artistic director, and founding director of the Center for Innovation in the Arts at the Juilliard School, on a concert event blending live performances with virtual reality. Titled The Story of Awe and scheduled to appear next year, it will feature an ensemble of actors, musical soloists, a digital sound artist, and dancers from different locations performing together in a virtual environment.
A holographic Alice in Wonderland
Additionally, DGene recently teamed with the emerging media production company zyntroPICS Inc. on their Volumetric Wonderland production. Wonderland is designed to demonstrate how volumetric production can be applied to new, story-driven content and delivered on webAR (mobile browser augmented reality). A new spin on Alice Through the Looking Glass, shot on the DGene Baton Rouge stage, the initial AR release features a holographic Alice, DGene-scanned objects as AR set dressing, and a guest appearance from the first holographic rabbit. https://wonderland.zyntropics.com/
“Working with the DGene teams in Baton Rouge and California has been an exceptional experience,” says zyntroPICS producer Eric Weymueller. “They helped elevate this project and broaden the scope of what is possible with volumetric video production. In addition to AR/VR, the emerging holographic displays that are coming to market will change the very nature of digital content consumption. We are evolving from framed 2D content to spatial 3D content and DGene’s tools are helping us get there.”
DGene’s volumetric capture stage is a dome-shaped structure equipped with a 90-unit array of color and infrared cameras. It employs proprietary AI-driven software for 3D capture and reconstruction and can capture people and objects in motion at a rate of up to 60 frames per second. Captured images are turned into 3D holograms, viewable from any angle at any moment in the timeline. Along with the stage, DGene has developed ultra-scanning systems for creating precise, detailed, 3D scans of objects and human faces.
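A back-of-envelope calculation gives a sense of why DGene emphasizes dataset management and compression. The sketch below combines the stage’s stated specs (a 90-camera array capturing at up to 60 frames per second) with an illustrative per-camera resolution and bit depth that are assumptions for this example, not figures published by DGene.

```python
# Rough aggregate data-rate estimate for a multi-camera volumetric rig.
# The 90-camera array and 60 fps rate come from DGene's stated specs;
# the 2048x1536 resolution and 24 bits per pixel are illustrative
# assumptions chosen only to show the order of magnitude involved.

def raw_data_rate_gbps(cameras: int, fps: int,
                       width: int, height: int,
                       bits_per_pixel: int) -> float:
    """Uncompressed aggregate capture rate in gigabits per second."""
    bits_per_second = cameras * fps * width * height * bits_per_pixel
    return bits_per_second / 1e9

rate = raw_data_rate_gbps(cameras=90, fps=60,
                          width=2048, height=1536, bits_per_pixel=24)
print(f"~{rate:.0f} Gb/s of raw image data")  # ~408 Gb/s under these assumptions
```

Even at modest assumed resolutions, the uncompressed stream runs to hundreds of gigabits per second, which is why efficient reconstruction and compression pipelines are central to making volumetric capture routine.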
Results as realistic as possible
The Baton Rouge facility is led by Lead Scientist Yu Ji, who holds a PhD in Computer Science from the University of Delaware and has a diverse background in computer vision and computational photography. He is the recipient of awards and research grants from the IEEE Computer Society PAMI Technical Committee, the University of Delaware, and Huazhong University of Science and Technology.
DGene’s current focus is to employ AI to improve the quality of volumetric capture and make the process more efficient. “We want the results to be as realistic as possible,” says Ji. “We are also developing software to manage and process large volumetric datasets so that production and post-production time is reduced. We are continually improving our compression algorithms to deliver the highest quality images across AR, VR, holographic displays, and other formats.”
DGene is harnessing the power of AI and other emerging technologies for content creation. The company offers groundbreaking solutions for AI actors, virtual production, visual effects, digital influencers, real-time holograms, 3D reconstruction, and other applications. Its AI-driven platform simplifies and accelerates the process of producing breakthrough content, empowering artists, expanding creativity, and enhancing storytelling.
DGene, one reads on the company’s website, “was founded by the brightest minds in artificial intelligence, computer vision, and computer graphics. We created the world’s first dynamic light-field shooting and cloud processing system. We also developed the first system for capturing dynamic 3D human models. Our technology has been applied to film and television production, mobile phone applications, cultural events, and more.”