At State of Unreal 2025, Epic Games revealed that its impressive MetaHuman tech is no longer exclusive to, or tethered to, Unreal Engine 5. A new licensing update announced at Unreal Fest in Orlando opens the door to using MetaHumans in other 3D modelling software and creative apps, including Blender and Maya.
The FAB marketplace (read our report on FAB for details) is now packed with MetaHuman-compatible outfits, hair grooms and accessories ready for drag-and-drop use. You can also sell your MetaHuman content directly on FAB or third-party marketplaces.
For Maya artists, Epic Games has launched MetaHuman for Maya, a powerful plugin that unlocks direct mesh editing, high-end rigging controls, and groom export tools, enabling artists to push past the limits of the default MetaHuman look while staying fully pipeline-compatible.
In short: MetaHumans are more flexible, accessible, and powerful than ever. With cross-platform animation, marketplace support, and pro-grade rigging in Maya, bringing digital characters to life just got a whole lot easier, no matter what your pipeline looks like.
Animation made easy
MetaHumans are stepping into the spotlight with Unreal Engine 5.6. Epic Games has refreshed MetaHuman Animator, enabling real-time facial animation from standard webcams, Android phones, and just about any mono camera that works with Live Link.
You no longer need expensive stereo HMC rigs, or even an iPhone, to get high-fidelity, on-the-fly animation. I saw this in action at Unreal Fest in a number of demos, which showcased how you can simply stand in front of a camera or camera phone and start recording live, saving the animation to hand-tweak afterwards.
Whether you’re capturing live performances on set or just want instant visual feedback, your MetaHuman can now keep pace with the actor in real time. I got to take part and watch as the face-mapping tech in MetaHuman Animator matched my expressions and lip-synced perfectly.
It even works with audio alone, so you can now animate a MetaHuman with no camera at all. Epic’s latest tools analyse vocal input in real time to generate lifelike facial motion, including emotion-aware performance and automatic head movement. You can even fine-tune the emotional tone manually, giving you full control to match your project’s mood or message.
Visit the Epic Games Unreal Engine website for more details, and read our guides to the best laptops for 3D modelling and best camera phones to gauge the hardware you may need.

Ian Dean is Editor, Digital Arts & 3D at Creative Bloq, and the former editor of many leading magazines, including ImagineFX, 3D World and the video game titles Play and Official PlayStation Magazine. Ian launched Xbox magazine X360 and edited PlayStation World. For Creative Bloq, Ian combines his experience to bring the latest news on digital art, VFX, video games and tech, and in his spare time he doodles in Procreate, ArtRage and Rebelle while finding time to play Xbox and PS5.