Announced a few months ago under the code name Project Animal and still without an official launch date, Adobe Character Animator is the newest addition to the Adobe Creative Cloud family of applications.
Presented at NAB 2015 alongside current and upcoming updates to Premiere Pro, After Effects, and other Adobe software, Adobe Character Animator may have slipped by unnoticed for some, but it points to the dawn of a new era in character animation. Put simply, Adobe Character Animator tracks your facial movements, lets you record dialogue or a voice performance, and enables you to trigger actions from your keyboard that bring to life characters created in Illustrator CC or Photoshop CC.
Yes, Adobe Character Animator brings still artwork from Photoshop or Illustrator to life by capturing your performance with a camera and microphone, reproducing your facial expressions, synchronizing mouth movements to your speech, and giving you control over every aspect of a character’s movement through the mouse, keyboard, and programmable behaviors. You can simply perform to animate a character you’ve acquired from someone else, or you can rig your own characters based on your own Photoshop or Illustrator artwork. You can even write your own programmable behaviors or plug in behaviors from elsewhere.
Lip sync, as the demonstration video shows, is still not up to the standard some need for their work, but being able to do it without earning a degree or spending hours chasing the desired effect will be appreciated by everyone who needs to animate characters. Although the program seems limited to 2D animation, at least for now, the ability to work with characters created by each user opens new avenues for commercial use of the program. The workflow starts by creating your character in Illustrator or Photoshop. You can build your character from scratch or import artwork such as eyes, moustaches, and limbs via Creative Cloud Market, for example.
According to Todd Kopriva, from the After Effects team at Adobe, Adobe Character Animator (still in Preview 1) is installed with the upcoming version of After Effects. Todd Kopriva says: “I have never had more fun with a piece of software, and I can’t wait to see how you all make use of this utterly delightful creative application.”
Adobe believes that Character Animator is ready to be used in real animation workflows, but, as Todd Kopriva said, “we also know that we need more input from you to bring it to the level of completeness and quality that you’ve come to expect from Adobe creative applications. When you use Adobe Character Animator, you’ll be able to submit feedback through a forum linked to directly from the application itself. We want to hear from you so that we can build the best possible experience for you.”
One interesting aspect of this new program is that Adobe has taken the face-tracking technology from Adobe Character Animator and integrated it into After Effects as a highly accurate face tracker that can generate masks to isolate faces or, in detail mode, generate effect control points for every major facial feature. This makes attaching effects, layers, or other objects to specific facial features precise and straightforward, simplifying workflows such as digital makeup and creative eye replacement.
A program like this will surely not make everybody happy. Making accessible a process that was previously achievable only by a few always creates some tension. But that’s what technology does: as we’ve seen before with digital video, photography, and many other areas where suddenly everybody could try things that were not commonly accessible, Adobe Character Animator gives more people the chance to attempt lip sync and character animation. Still, the quality of the content will continue to be paramount. And that’s what really matters.
Follow the link for more information about Adobe Character Animator.