The world of gaming and animation is undergoing a rapid transformation, with new advancements creating a whole new experience for users and artists. Many of you are surely aware of the popular video game Fortnite. Well, Epic Games, the publisher of that game, has recently launched an iOS app for Unreal Engine that captures users' facial expressions in real time. Exciting, isn't it?
Named Live Link Face, this new app streams facial animation in real time from an iPhone directly onto any Unreal Engine character, visualized with live rendering. Needless to say, the animation is of extremely high quality. Epic Games believes this will make facial capture far more accessible and easy for creators. Beyond that, one can record the facial tracking data and fine-tune it further with animation tools to achieve a superior final output, which can then be assembled in Unreal Engine's Sequencer. The app also offers a well-integrated stage workflow for shooting professional-level performance capture.
So how does the app work? Live Link Face uses Apple's ARKit and the iPhone's TrueDepth front-facing camera to track the user's face interactively. The captured data is then transmitted via Live Link over a network directly to Unreal Engine. According to Epic Games, the app has been designed so that, on one hand, it can be used on professional capture stages with multiple performers wearing full motion-capture suits, while on the other hand it works just as well for a single artist right at his or her desk. Hence, no matter what the production setup is, the app can easily generate expressive, emotive facial animation.
One of the key features of the app is that it can be controlled remotely, which is especially useful in the current global pandemic. This is made possible by support for Open Sound Control (OSC), which lets external applications operate the app and trigger recording on multiple iPhones with a single click. Being able to start recording externally gives artists the freedom to focus on their performance. The app also includes head and neck rotation data as part of its facial tracking stream, giving an artist's digital avatars greater freedom of movement with nothing more than an iPhone.
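To make the remote-record idea concrete, here is a minimal sketch of sending an OSC message over UDP from a controlling machine. The command address `/RecordStart`, the port, the IP addresses, and the integer take argument are all illustrative assumptions, not Live Link Face's documented command set; only the OSC wire encoding (null-padded strings, big-endian int32 arguments) follows the OSC 1.0 specification.

```python
import socket
import struct

def _pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    data += b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def build_osc_message(address: str, *int_args: int) -> bytes:
    """Encode a minimal OSC message with optional 32-bit integer arguments."""
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "i" * len(int_args)).encode("ascii"))  # type-tag string
    for value in int_args:
        msg += struct.pack(">i", value)  # int32, big-endian
    return msg

def send_osc(host: str, port: int, address: str, *int_args: int) -> None:
    """Fire one OSC message over UDP to a listening device."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(build_osc_message(address, *int_args), (host, port))

# Hypothetical usage: tell two phones on the set to start recording take 3.
# for phone_ip in ("192.168.1.21", "192.168.1.22"):
#     send_osc(phone_ip, 8000, "/RecordStart", 3)
```

Because OSC rides on plain UDP, a single controller can fan the same message out to every iPhone on the stage network, which is what makes the one-click multi-device recording described above possible.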
Another advantage is that the app automatically collects and archives data. A specific feature worth mentioning: when there are multiple takes, Live Link Face lets the user delete unwanted takes, share selected clips via AirDrop, and play back the reference video directly on the phone. Cool, isn't it? Beyond the features mentioned above, the app also offers timecode support to keep multiple devices in sync, and the ability to record both the front-facing reference video and the raw facial animation data.
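The timecode synchronization mentioned above boils down to every device stamping each frame against a shared clock. As a rough illustration of the underlying arithmetic, here is a sketch converting between SMPTE-style `HH:MM:SS:FF` timecode and absolute frame counts; the 30 fps rate and non-drop-frame counting are simplifying assumptions for the example, not a statement about what the app uses.

```python
def timecode_to_frames(tc: str, fps: int = 30) -> int:
    """Convert a non-drop-frame HH:MM:SS:FF timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_timecode(frames: int, fps: int = 30) -> str:
    """Convert an absolute frame count back to HH:MM:SS:FF."""
    ff = frames % fps
    seconds = frames // fps
    return f"{seconds // 3600:02d}:{seconds % 3600 // 60:02d}:{seconds % 60:02d}:{ff:02d}"

# Two recordings line up when their timecodes map to the same frame count:
# timecode_to_frames("01:00:00:15") == 108015 at 30 fps
```

Matching footage from several phones (and a reference camera) then reduces to aligning identical frame numbers, which is why shared timecode matters on a multi-device capture stage.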
Revolutionizing the animation space, the Live Link Face app is at present available only for iPhones. An Android version seems a distant possibility, as the developers feel that supporting the varied front-camera setups of Android phones would be a challenge. With photorealism a key area of focus for Epic Games, this app is sure to make waves in the world of animation. It is also an example of Unreal Engine's efforts to democratize real-time tools for virtual performance production.