By Jon Porter, a reporter with five years of experience covering consumer tech releases, EU tech policy, online platforms, and mechanical keyboards.
Today, Epic is releasing a new tool designed to capture an actor’s facial performance using a device as simple as an iPhone and apply it to a hyperrealistic “MetaHuman” in the Unreal Engine in “minutes.” The feature, dubbed MetaHuman Animator, was detailed at the Game Developers Conference in March but is now available for developers to try out for themselves. Epic has also released a new video today produced by one of its internal teams to show what the tool is capable of.
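On the capture side, the iPhone's role in this kind of pipeline is well established: Apple's ARKit exposes per-frame facial blendshape coefficients from the TrueDepth camera, and capture apps stream those values into the engine. As a rough sketch of that input stage, using Apple's real ARKit API (this is illustrative, not Epic's code or a description of MetaHuman Animator's internals):

    import ARKit

    // Illustrative sketch: read ARKit's per-frame facial blendshape
    // coefficients from the TrueDepth camera. Capture apps stream
    // values like these to the engine many times per second.
    final class FaceCapture: NSObject, ARSessionDelegate {
        private let session = ARSession()

        func start() {
            // Face tracking requires a TrueDepth-equipped iPhone.
            guard ARFaceTrackingConfiguration.isSupported else { return }
            session.delegate = self
            session.run(ARFaceTrackingConfiguration())
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let face as ARFaceAnchor in anchors {
                // Each coefficient is a normalized 0...1 weight.
                let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
                let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
                // A real capture app would forward the full coefficient
                // set, with timestamps, over the local network here.
                _ = (jawOpen, smile)
            }
        }
    }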
While Epic’s short film shows off some impressively subtle facial animation, the big benefit the company is emphasizing is the speed with which MetaHuman Animator produces results. “The animation is produced locally using GPU hardware, with the final animation available in minutes,” the company’s press release reads. That has the potential not only to save studios money by making performance capture more efficient but also, Epic argues, to free them to experiment and be more creative.
“Need an actor to give you more, dig into a different emotion, or simply explore a new direction?” Epic’s press release asks. “Have them do another take. You’ll be able to review the results in about the time it takes to make a cup of coffee.” Facial animation can be applied to a MetaHuman character “in just a few clicks,” Epic says, and the system is even smart enough to animate a character’s tongue based on the performance’s audio.
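As a loose illustration of what “applying” a captured performance involves (all types and names below are hypothetical, and Epic’s actual solver is certainly more sophisticated): retargeting maps each frame’s captured expression weights onto the corresponding animation curves of the target character.

    // Hypothetical sketch of retargeting captured expression weights
    // onto a character rig's animation curves. Names are illustrative.
    struct CapturedFrame {
        let time: Double
        let weights: [String: Float] // expression name -> 0...1 weight
    }

    struct AnimationCurve {
        var keys: [(time: Double, value: Float)] = []
    }

    // mapping: capture expression name -> target rig curve name.
    func retarget(frames: [CapturedFrame],
                  mapping: [String: String]) -> [String: AnimationCurve] {
        var curves: [String: AnimationCurve] = [:]
        for frame in frames {
            for (expression, weight) in frame.weights {
                guard let curveName = mapping[expression] else { continue }
                curves[curveName, default: AnimationCurve()]
                    .keys.append((time: frame.time, value: weight))
            }
        }
        return curves
    }

A production solver also filters noise and solves against a facial model rather than copying weights one-to-one, but the data flow is the same: timestamped weights in, keyed animation curves out.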
Performance capture using iPhones has been possible in the Unreal Engine since at least 2020, when Epic released its free Live Link Face app, which streams ARKit facial tracking data into the engine in real time.