Unity gets Wētā's Hollywood VFX tools, and "assisted artistry"

Unity and Weta tools at Siggraph; a close up of a woman's face
(Image credit: Weta / Unity)

When Unity bought Wētā, the VFX studio behind some of Hollywood's biggest movies, I suddenly became very interested in this unsung real-time game engine. The idea that these advanced VFX tools will someday be available to everyone could really change how we create.

We outlined some of the next-gen features already in Unity in our previous feature, but the plans revealed today at Unity's Siggraph keynote could be revolutionary. The aim is simple: to bring Wētā's industry-leading tools to Unity, an approachable real-time game engine used for everything from game creation to architectural visualisation and animation. This humble platform will now be home to some of the most powerful hyper-realistic character creation and creature effects tools, and it's very exciting.

The tools announced today, which are coming to Unity in beta now and in full in October, include the new real-time digital human technology Ziva (for realistic character design) and Wig (for creating complex hair and fur). Alongside these, Wētā's Deep Compositing, a toolset for Unity that enables you to combine CG renders more efficiently, and SyncSketch, for real-time feedback, will really push what can be done in Unity. (Watch the Unity Siggraph keynote for more demos and details.)

Natasha Tatarchuk, vice president of Unity Wētā Tools, tells me why this marriage of tools and accessibility will change how 2D and 3D artists work, and how it addresses the workflow issues facing studios and artists. For Tatarchuk, bringing together Unity's real-time tools and the best of Wētā is about speeding up the "time to decision", enabling artists to "make better decisions about the choices that they need to own […] I think it is the fundamental shift".

Tatarchuk says: "Unity is known for making things accessible. And there are multiple strengths that we're actually building from. One, ability to reach a wide audience of creators who dream themselves to be creators, but may not have come from many, many, many years of a specifically trained skill, right? They have a vision, and Unity was there to make it easier for them to achieve that vision, no matter what their background."

That principle of accessibility, building tools that put creative power within reach of the widest range of artists and creatives, is a driving factor behind bringing Wētā's proven VFX tools to Unity. Today's announcement essentially puts some of the most advanced professional VFX and CG tools in everyone's hands, and that could prove transformative. (We've already shown how Unity and Procreate can work together in a workflow tutorial.)

This convergence of high-end tools with a platform aimed at all creators also comes with new AI features. Yes, Tatarchuk confirms, Unity will make use of AI to help "accessibility to a wider range of non-experts". It's what she refers to as "assisted artistry", and the aim is to ensure everyone can use the new tools and for content creation to "be scalable".

It sounds like Unity's application of AI won't be doing the work; artists and creators will remain in control and own their output. But, as with the introduction of procedural tools, Unity will now enable more people to achieve better results, opening up the platform to more sectors, from traditional indie game studios to fashion designers and beyond.

Tatarchuk name-checks some of the biggest movies in recent years, including Avengers, and explains the pitch is to bring that level of Wētā expertise to everyone through Unity. (Wētā tools have also been used in The Last of Us for HBO.) 

"So [Wētā] is known to be able to deliver on the vision, no matter how ambitious the vision is, and now we combine this strength with the accessibility of Unity and the scalability of the content to effectively bring this out, this deep expertise, in a much more accessible way, an assisted way, so that we can assist the artists, the creators that have the vision."

She continues: "[It's] to make their own content easier and quicker through the assistance of configurable content, proceduralism and, of course, AI assisted workflows where you own the content that you're creating. And I think that that's an incredibly empowering moment in time because what we're trying to do is remove the repetitive tasks, the tedium, and really just let the creator sit in a space where they get to express their vision."

Unity and Wētā represent a marriage of approach and technology that could really change how fluidly artists work. In an era where 3D and real-time content is set to become more important, and crucial to every sector from game development to web design, this coming together feels more timely than ever. As Tatarchuk says: "Our philosophy is very simple, we want to meet artists where they are; we really don't want to dictate what pipeline people are using."

Tatarchuk offers the coming together of game development and film production as an example, highlighting how the team from Wētā's character and creature departments adopted Ziva (Unity acquired Ziva Dynamics in January 2022) and "pushed what real-time technologies are capable of doing," she tells me, adding: "That was an example of the flow from the elite expertise from film really empowering what we're able to do, sort of push the bleeding edge of what's possible in games."

It's worked the other way, too. Tatarchuk tells me how recently artists from Guerrilla Games, developer of PS5 game Horizon Forbidden West, joined Wētā and brought their real-time know-how to the film VFX team. They helped "push things in terms of performance and in terms of the ability to handle constrained data sizes".

By bringing Wētā's high-end tools to Unity, Tatarchuk is getting closer to her goal of "assisted artistry", as she explains how the real-time platform is "a place where there's a lot of workflows that help the artists create more of their own content better and faster".

Unity and Weta tools at Siggraph; a character render from a Marvel movie

The Wētā tools used in Marvel's Shang-Chi and the Legend of the Ten Rings are coming to Unity (Image credit: Marvel)

A recent Adobe survey reveals how 3D skills will become essential in the coming years, whether you work in game or graphic design, film VFX or illustration. Tatarchuk reflects on this, saying: "I think regardless of what particular goal or industry you're working in, the reality is we're at a kind of inflection point in the evolution of how people ingest content."

Tatarchuk picks up her mobile phone and tells me how this device is more powerful than fifth-generation consoles, how you can create on smartphones (and we've seen how Unreal Engine's MetaHuman Animator works on iPhone), and how the next generation, such as her daughter, now use the internet expecting deeper levels of interaction and storytelling.

"And that's why," she begins, "we have come to a realisation that this is about 3D that is interactive. This is a 3D where we're going to create emergent behaviours; the experience will be created with us rather than for us. And this is the fundamental shift. I think all creators should be thinking about how they create experiences that rise up to that expectation."

Game development is breaking beyond the confines of the medium, and the convergence of Unity's real-time platform and Wētā's advanced character and creature tools, and workflow applications, will enable a greater number of people of all skill levels to create in deeper ways.

A real-time platform like Unity, offering the power and scale of Wētā's best tools combined with an approachable workflow for everyone, could power a new era of design across the board. "Humans want to engage," Tatarchuk tells me, and Unity will offer the tools to make that happen.

For more, visit Unity's Siggraph page for further content and deep dives into the tools and topics discussed here. If you're new to these real-time game engines, read our Unity versus Unreal Engine feature, as well as our feature that poses the current question, 'Are game engines the future of 3D art?'.


Ian Dean
Editor, Digital Arts & 3D

Ian Dean is Editor, Digital Arts & 3D at Creative Bloq, and the former editor of many leading magazines, including ImagineFX, 3D World and leading video game title Official PlayStation Magazine. In his early career he wrote for music and film magazines including Uncut and SFX. Ian launched Xbox magazine X360 and edited PlayStation World. For Creative Bloq, Ian combines his experiences to bring the latest news on AI, digital art, video game art and tech, and more. In his spare time he doodles in Procreate, ArtRage and Rebelle while finding time to play Xbox and PS5. He's also a keen Cricut user and laser cutter fan, and is currently crafting on Glowforge and xTools M1.