Why you should know about real-time cinematography

There’s a new form of entertainment media emerging, one being explored in 3D movies as well as by VFX studios and game developers. Unreal Engine, for example, has seen non-game makers begin to experiment with the render engine.

“Because the engine is really accessible to anybody, we’re finding that loads of people are using Unreal in all sorts of different ways,” says Kim Libreri, CTO at Epic Games. “In the movie industry, people are going to be using it for pre-visualisation and a little bit of virtual production here and there, and some companies are really embracing the future of where media is going to go.”

One of those companies is Ninja Theory, whose Hellblade cinematic demo is seen as defining real-time cinematography and raising many questions about the future of real-time production.

“Real-time cinematography is quite analogous to traditional cinematography, where you have actors, actresses and cameras. You film the scene and then end up with different camera angles and shots that you can then edit into a sequence. That’s the way you’d do it in the normal world,” explains Libreri. “The difference is that everything that we were doing on Hellblade was being generated in the virtual world. So through the process of motion capture and performance capture, not just the performance of our actress Mel [Melina Juergens], but the performance of the camera, we were able to digitise the motion of what we call the virtual camera.”

Game engine technology gives you a wealth of creative possibilities that enable you to do live performance

Kim Libreri, CTO, Epic Games

The virtual camera is a handheld camera rig minus a camera. Instead of the camera, it has motion capture markers so the director knows where the camera is in space and where it’s pointing. “We can use that as a portal into the virtual world that the actress is basically driving her body motion through,” says Libreri.

For Hellblade, the entire environment and the character of Senua don’t exist in the real world. What Epic and Ninja Theory are doing is taking motion capture data and information from the real world, and using that to puppet the virtual world.
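Conceptually, the loop is simple: each frame, read where the tracked rig is in the capture volume and render the virtual scene from that pose. Below is a minimal plain-C++ sketch of that idea; it is not the actual Unreal Engine API, and PollMocapRig and RenderFrame are hypothetical stand-ins for what a real capture and rendering pipeline would provide.

```cpp
#include <chrono>
#include <thread>

struct Vec3 { float x = 0, y = 0, z = 0; };

// 6-DOF pose of the handheld rig, reconstructed from its mocap markers
struct Pose { Vec3 position; Vec3 rotation; };

// Hypothetical stand-ins: a real pipeline would stream marker data in and
// hand the pose to the engine's camera before rendering the frame.
Pose PollMocapRig() { return Pose{}; }
void RenderFrame(const Pose& /*camera*/) {}

int main()
{
    // Each frame: sample where the physical rig is in the capture volume,
    // then render the virtual world from exactly that position and angle,
    // so the operator's real-world framing becomes the in-engine shot.
    for (int frame = 0; frame < 600; ++frame) // ~10 seconds at 60fps
    {
        const Pose cameraPose = PollMocapRig();
        RenderFrame(cameraPose);
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
}
```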

Libreri says he’s been playing with performance capture and digital humans for a long time, ever since his time spent as VFX supervisor on the Matrix trilogy, where, back in 2003, he created a digital version of Hugo Weaving’s Agent Smith for the groundbreaking film franchise.

“I’ve been following this technology a lot, so it was only a matter of time for GPU technology to get to a point where they can do this in real-time,” he says.

A recent Hellblade demo at SIGGRAPH showed a glimpse of what this technology can do: “We showed the world that game engine technology isn’t just about making pretty pictures akin to what you’d see in a modern visual effects movie, but that it gives you a wealth of creative possibilities that enable you to do live performance,” Libreri explains.

Rethink your workflow

Hellblade is pioneering real-time mocap, bringing postproduction into a live take

“The whole drive behind Hellblade is to find more efficient ways to do the things that we’ve been doing in the past,” says Tameem Antoniades, Ninja Theory’s chief creative director. “As we start new projects, I think our approach to how we build our worlds will change. So we’ll build our characters and our locations first, and then shoot them like we would a film,” he adds.

When working on past games such as Heavenly Sword, Antoniades followed the standard workflow: motion capture was used to shoot the scenes and it would take up to six months to get that data into a finished scene. Up until that point, Antoniades was always unsure if what he had would be good enough.

There are many benefits to real-time cinematography: “With this system, you shoot and it’s there instantly. From there you can still tweak things. So you can still take the facial data and take it to an offline package and tweak it, but the guesswork and finger-crossing has gone,” says Antoniades.

“The way we shoot has been the same since Heavenly Sword, but what this allows us to do is have the game and the tools catch up with the way a director would want to shoot on set with actors. And so it just makes sense. It’s the last piece of the puzzle as you can actually see what you’re shooting rather than hoping that it will turn out.”

During live demos the team use Xsens MVN suits for live performance capture

Libreri agrees: “No longer are you constrained. I grew up in the visual effects business, and you would wait weeks or months to be able to see a result, and now it’s totally possible to produce amazing, interesting visuals interactively. It opens up the possibilities of new types of art and reaching audiences with new types of spectacle.”

The system also helps with creating more dynamic scenes: “Good quality live feedback means things like the character’s eyes can move around,” says Libreri. “If you do a normal, traditional MotionBuilder preview you’re going to have a pretty bad idea where the eye line is; you won’t know where she’s looking.”

Libreri explains that in the Hellblade demo, actress Mel is performing against a digital version of herself and there are many times where it’s really important for her to get her eye line correct. “The fact we had a high-quality rendition out of the engine that looks like it’s going to look in the game has changed the kind of capture you’re going to get. You’re now confident that when you leave the set or the mocap stage, you’ve nailed it, whereas before it would be pretty easy to do a two-pass take and say we nailed it. And then you would get it in post and keyframe animators would have to animate it.”

Game changer

“It was decided from the beginning to make a detailed recreation of the character… Animators or mocap artists can drive the performance: pucker the lips, open the jaw, blink and rotate the eyes in Maya,” says Epic Games’ Kim Libreri

Creating cinematics in real-time in Unreal in this way means a director can react to the scene, as a film director would. Antoniades explains: “[In the past] with two characters there in live action, you’d never see both characters at the same time, so you’d be unable to frame the shots. Now, when shooting I am able to frame the shots with both characters, and I can move each character around live to get the best framing I can, and get the environment in the frame as well. I am also able to check things like lighting; this would have been impossible to do in any other way.”

Just the start

Actress Mel performed against a digital version of herself in order to capture details such as her eye line

According to Libreri and Antoniades, filmmakers will use real-time tools as an advanced form of pre-visualisation, so more quality footage can be created early in the production process to save costs, but there’s so much more on offer.

“By creating a purely digital set, digital performances and storing everything as digital data, you are setting that data free to do with it what you want, and that magic ingredient is to do things like interactive experiences. You’re not just shooting linear scenes, you’re shooting scenes that have multiple outcomes, multiple possibilities,” says Antoniades.

“You can add procedural systems layered over the top of what you’re shooting, so if you’re in an experience like in VR, for example, you can have a person there,” he continues. “In this case, while they’re talking they’ll look around you, and they’ll follow you and if you make a choice or an action in the room, they’ll react to you and branch off into new performances. This is something that is completely impossible to do with anything other than game technology – I think ultimately it will define a new form of entertainment that hasn’t existed before.”
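To make that layering concrete, here is a hedged sketch in the same plain C++ as above of how a procedural layer might choose which captured performance to branch into based on what the viewer does. The clip names and triggers are invented for illustration, not taken from Ninja Theory’s actual system.

```cpp
#include <iostream>
#include <string>

// Hypothetical pre-captured performance clips (names invented for illustration)
enum class Clip { IdleTalking, ReactToViewer, BranchLeft, BranchRight };

const char* ClipName(Clip c)
{
    switch (c)
    {
        case Clip::IdleTalking:   return "idle_talking";
        case Clip::ReactToViewer: return "react_to_viewer";
        case Clip::BranchLeft:    return "branch_left";
        case Clip::BranchRight:   return "branch_right";
    }
    return "unknown";
}

// The procedural layer: choose the next captured performance from the
// viewer's action, instead of playing one fixed linear scene.
Clip NextClip(Clip current, const std::string& viewerAction)
{
    if (viewerAction == "approach")   return Clip::ReactToViewer; // look at / follow the viewer
    if (viewerAction == "pick_left")  return Clip::BranchLeft;    // branch into a new performance
    if (viewerAction == "pick_right") return Clip::BranchRight;
    return current;                                               // otherwise keep talking
}

int main()
{
    Clip current = Clip::IdleTalking;
    // Simulated viewer input; in a real VR experience this would come from
    // head tracking, controllers or in-scene choices.
    for (const std::string action : { "none", "approach", "pick_left" })
    {
        current = NextClip(current, action);
        std::cout << "playing: " << ClipName(current) << '\n';
    }
}
```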

Who's it for?

The character Senua within the game’s atmospheric environment

Questions remain about how accessible this new entertainment format will be. Though Hellblade is being produced by a team of 16, Ninja Theory’s real-time approach is an industry-facing project using high-end technology. It is a collaboration: face scanning and animation were handled by 3Lateral and Cubic Motion, House of Moves provided the mocap rig, and IKinema handled the body motion capture.

However, Unreal Engine is free to download and use, something Libreri encourages: “Download the Unreal Engine and start playing,” he urges, adding: “Unreal is not a toy. It’s definitely built for high-end professionals… Crossing the uncanny valley and making an effects-based facial rig is never going to be the domain of simple hobbyists, but those with the hunger to master it can achieve great things.”

This article was originally published in 3D World magazine issue 213.


Ian Dean
Editor, Digital Arts & 3D
