Epic Games has just released a beta of Live Link Face for Android, adding to the app's existing support on iPhone and iPad. The tool enables real-time facial motion capture, an essential piece of high-end character animation in virtual production pipelines.
Hitting record (literally)
The app generates animation data directly from the device's front-facing camera and streams it to Unreal Engine via the MetaHuman Live Link Plugin. That means performers and animators can capture expressive facial performances and drive MetaHuman characters with them in real time, with no offline processing step in between. As with any beta, studios should verify stability before relying on it on a tight schedule.
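Epic does not publicly document the wire format, but conceptually the stream is a series of per-frame blendshape weights sent over the local network to the Unreal workstation. A minimal Python sketch of that idea follows; the address, port, JSON encoding, and blendshape names are illustrative assumptions, not the actual Live Link protocol:

```python
# Schematic illustration only: the real Live Link stream uses Epic's own
# format. This sketch shows the general shape of per-frame streaming:
# named blendshape weights sent over the local network each frame.
import json
import socket

# Hypothetical address of the Unreal workstation on the shoot's Wi-Fi
# network (localhost here as a stand-in).
HOST, PORT = "127.0.0.1", 11111

def send_frame(sock: socket.socket, blendshapes: dict, frame: int) -> None:
    """Send one frame of named blendshape weights as a UDP datagram."""
    packet = json.dumps({"frame": frame, "weights": blendshapes})
    sock.sendto(packet.encode("utf-8"), (HOST, PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(sock, {"jawOpen": 0.42, "eyeBlinkLeft": 0.9}, frame=120)
```

In a real pipeline this loop would run at the capture frame rate, and the receiving end would retarget each weight onto the matching MetaHuman facial control.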
Network sync without network hassle
Timecode support is built in: users can sync via the device's system clock, an external NTP server or a Tentacle Sync master clock. That makes it possible to coordinate multiple devices on the same shoot, so every take carries accurate, matching timecode for alignment in post-production.
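Whichever source is used (system clock, NTP, or Tentacle Sync), the principle is the same: every device derives its timecode stamps from one shared clock, so frames captured on different devices line up later. A hedged illustration of that derivation in Python; the helper below is hypothetical, not Epic's implementation:

```python
# Illustrative sketch: once devices agree on a clock, each can stamp
# frames with SMPTE-style timecode derived from that shared time.

def to_timecode(seconds_since_midnight: float, fps: int = 30) -> str:
    """Convert a shared-clock timestamp to HH:MM:SS:FF timecode."""
    total_frames = int(seconds_since_midnight * fps)
    frames = total_frames % fps
    total_seconds = total_frames // fps
    s = total_seconds % 60
    m = (total_seconds // 60) % 60
    h = total_seconds // 3600
    return f"{h:02d}:{m:02d}:{s:02d}:{frames:02d}"

# Two devices reading the same synchronized clock produce identical stamps:
shared_time = 13 * 3600 + 37 * 60 + 5.5   # 13:37:05.5 on the shared clock
print(to_timecode(shared_time))            # 13:37:05:15 at 30 fps
```

Because both devices compute the stamp from the same clock value, their recordings can be conformed frame-accurately in the edit without manual slating.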
Snapdragon power matters
To run the Android beta, you’ll need a Qualcomm Snapdragon 8 Gen 1 or newer. Epic notes enhanced performance on Gen 3 silicon. Supported reference models include Samsung S23 Ultra (Gen 2), ROG Phone 7 Ultimate (Gen 2), Samsung S24 (Gen 3), S24 Ultra (Gen 3) and Sony Xperia 1 VI (Gen 3). If your device meets that spec, you’re good to go—just install from Google Play, grant camera and mic permissions, join the same Wi‑Fi network as your Unreal workstation, and connect as a Live Link Face source.
Final take
For VFX artists, animators and virtual production teams, this release makes facial capture on mobile hardware more accessible and portable, cutting down capture-prep work and letting artists focus on the performance itself. But remember: this is beta software, and its usability and stability deserve a thorough trial before being trusted in production pipelines.
Always perform a real-world test before integrating new tech into your projects.