ProVideo Coalition

Lip Sync in After Effects

Last year Angie Taylor posted Lip Sync in After Effects, which demoed a possible solution to a task that most animators will be asked to do at some point (admittedly, one that rarely involves a talking dog or space coyote). Note the references to Papagayo, After Effects, and phonemes. Newer automation tools can be found toward the end of the roundup.

The techniques in the videos below from Angie (Adobe After Effects CS5: Learn by Video) and from Robert Powers automate the process using time-remapping and other built-in features of After Effects.

 

Mettle sponsored a 9-part series, Build Me Some Hope: Lip-Synced Character Animation Series, by David Legion. See also Adding variety to lip-sync animations, a 2014 video from Angie Taylor's Getting Started with After Effects Expressions.

 

Here are a few more lip sync video tutorials for After Effects, with updates to follow. Most rely on time remapping, basic expressions, or converting audio to keyframes, rather than older approaches built on CC Split, Reshape, and similar effects. Remember: older scripts, expressions, and filter effects may not work the same in newer versions of AE.
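To make the "convert audio to keyframes" idea concrete, here is a minimal sketch, outside of AE, of how the amplitude slider that command generates could drive a mouth layer's openness. After Effects' expression helper linear(t, tMin, tMax, v1, v2) maps and clamps a value; the reimplementation below shows the same logic. The amplitude numbers and the 0–20 input range are made-up illustration values, not from any of the tutorials above.

```javascript
// Reimplementation of AE's linear() expression helper: clamp the
// input to [tMin, tMax], then interpolate between v1 and v2.
function linear(t, tMin, tMax, v1, v2) {
  if (t <= tMin) return v1;
  if (t >= tMax) return v2;
  return v1 + (v2 - v1) * (t - tMin) / (tMax - tMin);
}

// Fake per-frame amplitude values, standing in for what
// "Convert Audio to Keyframes" produces on its Both Channels slider:
const amplitude = [0, 4, 12, 20, 25, 8, 2];

// Map amplitude 0–20 onto mouth openness 0–100 percent:
const mouthOpen = amplitude.map(a => linear(a, 0, 20, 0, 100));

console.log(mouthOpen); // amplitudes above 20 clamp to 100
```

In AE itself this would be a one-line expression on the mouth layer's Scale or a shape path, referencing the generated slider instead of a hard-coded array.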

For more on audio in AE, see Animating to Sound with After Effects, about a course from Chris Meyer.

Back to AE, here's Ryan Boyle on lip sync:


 

◊ See also Animate a character in After Effects: A survey of resources. By the way, if you're trying to sync up audio recorded separately, Premiere has an Audio Units mode for its Timeline that allows for easy fine-tuning.


UPDATE: Lipsyncr, new for AE from aescripts.com in October 2014, offers “automatic high quality viseme–based mouth animations. In a matter of seconds.”

 

Earlier, Mamoworld released Auto Lip-Sync, a tool that leverages built-in features of After Effects to help you animate a mouth in sync with audio. Auto Lip-Sync features a step-by-step wizard that analyzes a voice recording and helps you customize and fine-tune the mouth for your project, for example, how wide the mouth opens and how it deforms for certain syllables. It also provides null objects to animate user-provided teeth and tongue. It won't solve all your problems, especially if prominent jaws need animating, but should pay for itself before the first job is finished.

VinhSon Nguyen posted an early review, and on AEtuts there's A Look At How To Use “Auto Lip-Sync” by creator Mathias Mohl, which does lip sync on a moving car: tracking in MochaAE, importing with MochaImport+, and stepping through the wizard. Here's the intro tutorial, Talking Tree with Auto Lip-Sync:

 

Andrew Devis added Animation Lip Sync, a two-part tutorial that shows the basics of a character animation lip-sync technique in AE, using time remapping to select which mouth shape to show at any moment. While a good basic method, it requires the animator to remember which frame each mouth shape sits on, so in part 2 Andrew shows how to create a simple rig that makes the whole process far easier, with good visual feedback. Here are parts 1 and 2:
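The time-remap frame-picking logic behind this kind of rig can be sketched as follows, outside of AE. Assume a mouth precomp holds one mouth shape per frame at 24 fps; in AE you would enable Time Remap on that precomp and drive it from a slider with an expression along the lines of framesToTime(effect("Mouth Shape")("Slider")). The effect and slider names, the frame rate, and the shape list here are illustrative assumptions, not taken from Andrew's tutorial.

```javascript
const FPS = 24;

// AE's framesToTime(): convert a frame index to seconds.
function framesToTime(frames, fps = FPS) {
  return frames / fps;
}

// The rig: the animator keyframes one slider (0, 1, 2, ...) instead
// of remembering raw frame numbers for each mouth shape.
const mouthShapes = ["rest", "A-I", "O", "E", "U", "M-B-P", "F-V", "L"];

function timeRemapFor(sliderValue) {
  // Clamp to the shapes that actually exist, then pick that frame.
  const frame = Math.max(0, Math.min(mouthShapes.length - 1, Math.round(sliderValue)));
  return framesToTime(frame);
}

console.log(timeRemapFor(5));   // the "M-B-P" frame, i.e. 5/24 s
console.log(timeRemapFor(99));  // clamps to the last frame, 7/24 s
```

Clamping is the part that makes the rig forgiving: an overshot slider keyframe holds the last mouth shape instead of jumping to an empty frame.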

 

A bit similar to the Andrew Devis tip above but easier in some cases, Mikey Borup shares a tip he learned at AE World — how to assign Rotation as a frame picker for mouth animation (and use the quick +/- shortcut):
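Here is a minimal sketch of that Rotation-as-frame-picker idea, outside of AE. Rotation is handy as the picker because the +/- shortcut nudges it in whole-degree steps, giving a quick keyboard way to flip through mouth shapes. The one-degree-per-shape mapping and the wrap-around behavior below are assumptions for illustration, not details from Mikey's tutorial.

```javascript
const SHAPES = 8; // mouth shapes in the (hypothetical) precomp
const FPS = 24;

// Expression-style mapping: rotation degrees -> Time Remap seconds.
// Wrapping with modulo means rotating past the last shape cycles
// back to the first, so the picker never runs off the end, and the
// double-modulo keeps negative rotation values in range too.
function timeRemapFromRotation(deg) {
  const frame = ((Math.round(deg) % SHAPES) + SHAPES) % SHAPES;
  return frame / FPS;
}

// Simulate tapping "+" three times, then "-" once:
let rotation = 0;
rotation += 1; rotation += 1; rotation += 1; rotation -= 1;
console.log(timeRemapFromRotation(rotation)); // frame 2, i.e. 2/24 s
```

In AE the function body would live as an expression on the mouth precomp's Time Remap property, reading the layer's own Rotation value.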
