The motto of LA-based studio MastersFX is 'Art Imitating Life'. For over 30 years, the MastersFX team has been creating creatures and practical effects for the entertainment industry. We spoke to lead digital makeup designer Johnathan Banta to find out more about the studio's effective techniques.
MastersFX is known for its practical effects work, so how does digital make-up fit into its plan?
This is a brand new service that the studio is providing, although we've been heading toward it for some time, with work on shows like Fringe and True Blood. It's very much driven by JJ Abrams and other directors' desire for practical work. They want the interaction and ability to shoot on set.
But what would previously happen in these cases was the digital element would then be sent out to another bidder. We're trying to offer a way to save money by saying, "You've got the people doing the practical make-up, why not have the same creative team handling the digital and making use of decades of experience and knowledge?"
Do you think there's still a place for practical effects work?
With a near-unlimited budget it's possible to create a photoreal CG creature, and the necessary tools are becoming more democratised. But declaring everything else null and void would be premature. Physical effects have advanced an incredible amount. Directors and actors want it on set, and combining physical with digital delivers real financial savings.
What were the reasons for using the practical/CG approach with the alien character of Cochise in Falling Skies?
With Cochise, what we have underneath three inches of make-up is a very accomplished actor. Doug Jones defines the character on set and interacts with the other actors. But because of the limitations of a mask you can't put a lot of animatronics in, so our job is to read his performance, analyse his speech patterns and use that to drive procedural animations.
Presumably you rely quite heavily on on-set reference?
We actually capture as little as possible. We want a very low footprint. Our main focus is to apply the physical mask and keep the actor as comfortable as possible. We then go back to the shop and get 3D scans of our sculpt - an immediate cost-saving compared to splitting the work between two different studios.
Once we've matched the motion we then hook up our postproduction 'animatronics'. Rather than trying to simulate muscles as they would at somewhere like Weta Digital we emulate our make-up process, making virtual cable controls, the digital equivalents of air bladders and all the other things you'd get on a puppet.
Where next for this hybrid approach?
One thing we've been exploring in the last month is taking a practical effect and then applying it digitally to an actor. Another area involves renting out baby models for mid-ground shots. What we're now starting to do is transfer performances onto the rubber puppets, to make it look like they're really alive. And then, of course, you cut to the 'expensive' real baby for the close-ups.
Words: Mark Ramshaw
This article originally appeared in 3D World issue 179.