Adobe Character Animator 2020 v3.4

Adobe Character Animator 2020 (v3.4) was a major update that introduced sophisticated automation tools to the performance-based animation platform. This version bridged the gap between manual rigging and AI-driven movement, making it significantly easier to create expressive 2D characters.

Core Functionality

The software uses your webcam and microphone to track facial expressions and voice in real time, instantly mapping them onto a 2D puppet. Characters are typically designed in Adobe Photoshop or Illustrator; a specific layer-naming convention lets the software automatically assign behaviors such as eye blinks and mouth movements.

Key Features of v3.4

Improved lip sync: Refined algorithms provide more accurate matching between mouth shapes (visemes) and audio, resulting in higher-quality dialogue sequences.

Merge takes: This workflow improvement allows users to consolidate multiple lip-sync or trigger takes into a single, manageable track on the timeline.
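To illustrate the idea behind viseme matching, the sketch below collapses a phoneme sequence onto a small set of mouth-shape names. This is a toy illustration, not Adobe's algorithm: Character Animator derives visemes internally from the audio, and the phoneme labels (ARPAbet-style) and the exact mapping chosen here are assumptions for demonstration.

```python
# Toy phoneme-to-viseme mapping, illustrative only. The viseme names
# follow Character Animator's commonly documented mouth-shape labels;
# the phoneme groupings are a simplified assumption.
PHONEME_TO_VISEME = {
    "AA": "Aa", "AE": "Aa",
    "IY": "Ee", "EH": "Ee",
    "OW": "Oh", "AO": "Oh",
    "UW": "W-Oo", "W": "W-Oo",
    "F": "F", "V": "F",
    "M": "M", "B": "M", "P": "M",
    "L": "L",
    "S": "S", "Z": "S",
    "R": "R",
    "D": "D", "T": "D",
}

def visemes_for(phonemes):
    """Map a phoneme sequence to viseme names, defaulting to Neutral."""
    return [PHONEME_TO_VISEME.get(p, "Neutral") for p in phonemes]

# The word "map" as phonemes M, AE, P:
print(visemes_for(["M", "AE", "P"]))  # ['M', 'Aa', 'M']
```

Many phonemes share one mouth shape (e.g. M, B, and P all close the lips), which is why a dozen visemes suffice for convincing dialogue animation.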
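The layer-naming convention described under Core Functionality can be sketched with a small validation helper. The group name "Mouth" and the twelve viseme layer names below follow Character Animator's commonly documented defaults; treat them as an example of the convention rather than an authoritative specification.

```python
# The 12 default mouth-shape layers Character Animator looks for inside
# a group named "Mouth" (assumed here from common documentation):
VISEMES = [
    "Neutral", "Aa", "D", "Ee", "F", "L",
    "M", "Oh", "R", "S", "Uh", "W-Oo",
]

def missing_visemes(mouth_layers):
    """Return viseme layer names absent from a puppet's 'Mouth' group."""
    present = set(mouth_layers)
    return [v for v in VISEMES if v not in present]

# A puppet whose Mouth group lacks two shapes:
print(missing_visemes(
    ["Neutral", "Aa", "D", "Ee", "F", "L", "M", "Oh", "R", "S"]
))  # ['Uh', 'W-Oo']
```

Because rigging is driven by these names, a misspelled or missing layer simply fails to trigger, so checking the artwork against the expected names before import saves debugging time.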