Epic Games has launched a new iOS app that uses Apple's ARKit technology to capture real-world facial movements and translate them onto character models in Unreal Engine.
While Live Link Face is designed to benefit studios large and small alike, the free iOS app could be an especially welcome boon for smaller teams that otherwise wouldn't have access to facial animation capture tech.
“The app’s tracking leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting this data directly to Unreal Engine via Live Link over a network,” reads an explainer.
“Designed to excel on both professional capture stages with multiple actors in full motion capture suits as well as at a single artist’s desk, the app delivers expressive and emotive facial performances in any production situation.”
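To make that per-frame streaming idea concrete, here is a minimal, hypothetical Python sketch of serializing ARKit-style blendshape weights and firing them over UDP toward a listening machine. This is an illustration only, not Unreal's actual Live Link protocol: the subject name, port, host, and JSON payload format are all invented for the example.

```python
import json
import socket

# Hypothetical sketch: the real Live Link Face app speaks Unreal's Live Link
# protocol, not this simplified JSON-over-UDP format.

def encode_frame(subject, blendshapes):
    """Serialize one frame of facial capture data as JSON bytes.

    subject: name identifying the performer/device.
    blendshapes: mapping of ARKit-style blendshape names to weights in [0, 1].
    """
    payload = {
        "subject": subject,
        "blendshapes": {name: round(float(w), 4) for name, w in blendshapes.items()},
    }
    return json.dumps(payload).encode("utf-8")

def send_frame(sock, frame_bytes, host, port=54321):
    """Fire one UDP datagram per captured frame toward the Unreal machine."""
    sock.sendto(frame_bytes, (host, port))

if __name__ == "__main__":
    # One frame: the performer smiling slightly with the jaw open a little.
    frame = encode_frame("iPhonePerformer01",
                         {"jawOpen": 0.12, "mouthSmileLeft": 0.55})
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_frame(sock, frame, host="127.0.0.1")  # loopback so the demo is harmless
    sock.close()
```

In a real pipeline a datagram like this would go out roughly once per tracked camera frame, which is why a connectionless transport and a compact payload matter.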
Epic has built ARKit-based facial capture features for Unreal Engine before, but those earlier options also required devs to have a Mac and an Apple developer account so they could compile for iOS themselves. Live Link Face, meanwhile, skips that step entirely, letting developers stream facial animation directly to properly configured models in Unreal.
That said, there is some setup work devs will need to take on before their captures are given life as animations, particularly in how characters are rigged before being imported into Unreal Engine. More details on those steps, and on Live Link Face as a whole, can be found in the app's documentation.