The Lion King CGI has been much talked about since the release of the film last year. The film has hit headlines for its use of animation, from criticism that the animation was too realistic to show emotion, to deepfake technology being utilised to correct CGI mistakes. But controversy aside, it remains an incredible achievement in terms of photorealistic effects.
When we spoke to Elliot Newman, VFX Supervisor on MPC's Lion King production, he was all too aware of the epic creative venture he had completed. This interview explores how the film was shot, including the approach to animal inspiration and the tools utilised during the process. (For more inspiration, explore our guide to the best 3D movies, or browse our roundup of the 3D models you can use in your projects.)
Based at MPC (opens in new tab) in London, Newman begins our conversation by noting that it served as the centre of the company's work on the movie. Assets for the film were built at the London studio and then made available to the production base in Los Angeles, where the virtual production work of 'shooting' the movie took place (under the lead of director Jon Favreau and production visual effects supervisor Rob Legato). Animation for the movie was then undertaken by Newman's team in London with input also provided by MPC's Los Angeles and Bangalore studios.
Having broadly defined the setup for the work to be done, Newman starts by addressing the fundamental challenge in the appropriately epic journey of bringing this new version of The Lion King to the screen, saying: "It was probably just the scale of it and that it's a remake of such a classic movie. It was enormous for us, just managing the fact that we're the sole facility. The expectations were incredibly high and it doesn't get much bigger. But it was super exciting and it's a special thing to be a part of. But there's always the pressure of expectations that were so high."
Of his schedule on the project, Newman recalls, "I started about two and a half years ago, in January 2017, and we did some pitch work and had conversations about the process of shooting approval. My involvement began with preparing a teaser, comprising 25 shots, for D23 [the Disney fan club convention] which took place in August 2017. That teaser was 90 seconds of the opening scene, and every shot was a different location and involved different lighting conditions and was all done on a very rapid schedule."
Newman explains his day-to-day schedule on the production. He would typically meet with two or three production staff taking notes, one CG supervisor and perhaps a lighting lead, for a series of regular agenda-based conversations. Additionally, at the start of the studio's work on the film, Newman would talk every day to lighting leads and to Legato.
"It's quite challenging to work out how to manage resources," Newman observes, breaking down the scale of MPC's work on the movie. The volume of data being organised, shared and iterated between MPC and the production base in Los Angeles was immense.
Critically, Newman explains that no motion capture was undertaken in making the movie and that the characters are all key-frame animated. As such, the film's foundation in long-standing traditions of animation has been reset within the context of virtual production. "The camera and focus-pulling moves were recorded from the virtual camera," Newman explains. "We built the master scenes and then Jon [Favreau] would put VR goggles on and they'd then work out their shots. Pre-animation was handled in Maya and then exported into Unity and, in converting reality into a render, we were always concentrating on simulating depth of field in the composition of a shot."
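Simulating depth of field for a virtual camera ultimately comes down to standard lens optics. As a rough illustration only (textbook thin-lens formulas, not MPC's actual code), the near and far limits of acceptable focus can be derived from focal length, f-number, focus distance and a chosen circle of confusion:

```python
# Hypothetical sketch: classic depth-of-field limits, the kind of
# calculation a virtual camera needs when mimicking a real lens.
# Function and parameter names are illustrative, not MPC's API.

def dof_limits(focal_mm, f_number, focus_mm, coc_mm=0.03):
    """Return (near, far) distances in mm of acceptable sharpness."""
    # Hyperfocal distance: focusing here renders everything from
    # roughly half this distance to infinity acceptably sharp.
    h = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_mm * (h - focal_mm) / (h + focus_mm - 2 * focal_mm)
    if focus_mm >= h:
        far = float("inf")
    else:
        far = focus_mm * (h - focal_mm) / (h - focus_mm)
    return near, far

# A 50mm lens at f/1.8 focused at 5m gives a narrow band of focus
# bracketing the subject, which is what sells the "shot on a real
# camera" feel in a fully rendered frame.
near, far = dof_limits(50.0, 1.8, 5000.0)
```

The near and far limits tighten as the aperture opens up or the focal length grows, which is why shallow-focus wildlife-documentary framing reads as photographic.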
Of the virtual production process used for the project, Newman says, "It was fun to watch the filmmakers realise this freedom, that the physical constraints are gone." That said, the production would impose certain limits on creative choices in pursuit of consistent believability. Newman explains that MPC built the assets that were then imported into the game engine, within which layout and staging were then determined. Regarding the flexibility offered by the virtual production approach, Newman notes that it allowed MPC and their team in Los Angeles to make the most minute and subtle adjustments.
"If they shot something and weren't happy with part of a camera move," Newman explains, "they can now work just with layers. It's like visual dubbing. You can correct just one part of a camera move. If a move was too exaggerated you could adjust it."
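The "layers" idea can be pictured as blending a correction onto the recorded move only within a chosen frame window, so the rest of the take stays exactly as shot. A minimal sketch, assuming a simple per-frame curve (illustrative only, not MPC's tooling):

```python
import math

def apply_correction_layer(base, offset, start, end):
    """Blend a correction offset onto part of a recorded camera curve.

    base       -- list of per-frame values (e.g. pan angle in degrees)
    offset     -- the adjustment to blend in
    start, end -- frame range of the correction window
    """
    out = list(base)
    for frame in range(start, end + 1):
        t = (frame - start) / (end - start)
        weight = math.sin(math.pi * t)  # 0 at window edges, 1 mid-window
        out[frame] = base[frame] + weight * offset
    return out

# Soften an over-exaggerated pan between frames 10 and 20, leaving
# the rest of the recorded move untouched:
pan = [float(f) for f in range(30)]
fixed = apply_correction_layer(pan, -2.0, 10, 20)
```

The smooth falloff at the window edges is what makes the correction invisible, much like dubbing a single line of dialogue without re-recording the scene.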
When talking about MPC's toolset, Newman catalogues the studio's use of Maya (see our rundown of the best Maya tutorials), Nuke (for compositing), Katana (for lighting and lookdev), RenderMan, and also the latest iteration of their proprietary fur-simulation tool, Furtility.
As with MPC's work on its previous collaboration with Favreau, The Jungle Book, Newman notes that he and his team on The Lion King "realised that we had to research the colour of hair and fur, right down to the melanin." Newman continues, "We aren't a software company, but we do build and interface around the software that we use. We've written lots of deep compositing toolsets, for example. With Maya you open it up and there, within that, will be MPC-built stuff for how to get data in and out of our pipeline."
When it comes to accessing and organising material to review, discuss and develop further, Newman explains, with a wry laugh, that "it's all about data management." When reviewing shots, "I can filter a clip by shot number or discipline or artist," he states. Discussing the film's photorealistic visual language, Newman says, "Jon Favreau's modus operandi was 'don't fall into the trap where you over-beautify everything.' Sometimes the sky is blown-out, sometimes it's overcast and so on. We didn't overwork the shots and we made sure that Jon's quest for realism and a documentary feel was backed up with Caleb Deschanel [director of photography] and Legato's visual sensibilities."
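Review filtering of this kind is, at heart, querying clip metadata. As a toy illustration (the field names and values here are assumptions for the example, not MPC's actual schema):

```python
# Hypothetical sketch of filtering review clips by metadata, in the
# spirit of "filter a clip by shot number or discipline or artist".
# All field names and values are invented for illustration.

clips = [
    {"shot": "PR_0120", "discipline": "lighting",  "artist": "ab"},
    {"shot": "PR_0120", "discipline": "animation", "artist": "cd"},
    {"shot": "PR_0340", "discipline": "lighting",  "artist": "ab"},
]

def filter_clips(clips, **criteria):
    """Return clips whose metadata matches every given key/value pair."""
    return [c for c in clips
            if all(c.get(k) == v for k, v in criteria.items())]

lighting_only = filter_clips(clips, discipline="lighting")
one_shot = filter_clips(clips, shot="PR_0120", artist="ab")
```

On a production with thousands of iterations in flight, being able to slice the review queue by any combination of fields is what keeps daily reviews manageable.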
Newman then proceeds to detail some of the nuances that MPC brought to the plates that simulated environment and natural light, indicating the kinds of detail to which they worked. "If we wanted to, we could emulate real sun falloff and exposure and we put a virtual camera on it (the sunlight) to get the right kelvins. We got quite 'techy' and when we went to Africa we worked on capturing the feeling of the landscape there, and correctly profiled and calibrated our cameras to capture the exposure values of the sun."
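Capturing "the exposure values of the sun" is a measurable exercise: aperture and shutter speed define a photographic exposure value (EV), which the standard incident-meter calibration maps to scene illuminance. A rough sketch using the textbook relations (not MPC's pipeline):

```python
import math

def exposure_value(f_number, shutter_s, iso=100):
    """EV referenced to ISO 100: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

def illuminance_lux(ev100, meter_constant=250):
    """Approximate scene illuminance from EV at ISO 100.

    Uses the incident-meter relation E = C * 2^EV / ISO with the
    common calibration constant C = 250.
    """
    return meter_constant * 2 ** ev100 / 100

# Sunny-16 rule: f/16 at 1/100s, ISO 100 lands around EV 14-15,
# i.e. tens of thousands of lux -- consistent with direct sunlight.
ev = exposure_value(16, 1 / 100)
lux = illuminance_lux(ev)
```

Profiling real cameras against values like these is how a rendered sun can be made to "expose" the virtual frame the way the real one exposed the reference footage.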
The Kenya shoot provided Newman and his team with motion reference material, still images and records of animal behaviour. Additional reference footage of animals was then captured at Disney's Animal Kingdom. "No animal was put on a scanning stage," Newman adds, explaining that, for Favreau, it was essential not to interrupt the activity of the animals as they documented them.
Given that a film like The Lion King has the potential to inherit and then push the envelope of earlier films' creative and technological achievements, MPC's work on the movie marks another watershed in the long-standing relationship between animation and VFX. The movie is a step towards a kind of filmmaking that continues to dissolve the lines between pre-production, production and post-production.
Are you ready for Vertex? Join us this Feb at Olympia London for the ultimate conference for 2D and 3D artists. The top artists working in film, games, VFX, illustration and animation will be there to show you how to create your best ever art.
This article was originally published in 3D Artist.