How real-time rendering will change the way you work forever

Artists and TDs are turning to real-time engines to speed up the process of content generation.

We've talked at length about the advent of GPU-accelerated ray-tracing with renderers like Octane, Indigo and the upcoming V-Ray RT. But the graphics card in most workstations is also capable of producing quality imagery using the latent power of real-time engines – videos like this from Benoit Dureau using Unreal Engine 4 amply demonstrate what the latest technology is capable of.

Architectural visualisation is already making good use of the graphics card with apps such as E-on's LumenRT. Powered by modified versions of DirectX on PC and OpenGL on the Mac, its Dynamic Immersive Visualisation Engine provides a simple way for architects and urban designers to envision their creations in real-time, with physical effects such as sunlight, shadows and reflections, plus realistic water, vegetation, traffic and even changes in weather.

The app is just as capable of recreating a designer apartment as it is an entire cityscape, but unlike games, where the bulk of the scene is compiled beforehand, LumenRT does everything on the fly and remains editable at all times. However, the more realistic lighting schemes can take some time to calculate before being baked into the simulation.

For companies like The Third Floor, real-time engines provide a way of quickly generating previs animations: there's no need for a lengthy ray-trace render when a Maya playblast will suffice. Improvements to Maya's Viewport 2.0 – in conjunction with more powerful GPUs – have helped the situation.

"We are finding that we can handle scenes with more polygon density with better shaders applied than ever before," says The Third Floor supervisor Albert Cheng, "And together with the other higher quality lighting model, which has become a significant motivator for the switch to VP2.0, the visual quality of our work has gone up."

However, in a desire to improve its offering, the studio is currently evaluating Unreal Engine 4.5 as a replacement. "I think the output speaks for itself," says executive producer Duncan Burbridge, "but it provides much better flexibility with shaders, lighting and shadows. However the workflow is very different; we're still reviewing what the ongoing process will be, but we've been able to bring assets and animation from Maya into Unreal."

Watch The Third Floor promo video below…

Unreal speed gains

Epic's engine is also used in RenderDigimania, a standalone app that employs the Unreal technology to render animations created in Maya or Max. You simply model and animate your meshes, then export via FBX or Alembic and import them into RenderDigimania. There you can arrange your assets, add materials and set up the lighting, with instant and near-render-accurate feedback. It can also output a number of render passes for fine-tuning your animation in post, and is capable of 4K output.

While the output lacks the niceties of true ray-tracing – like refraction, global illumination and so on – there are many studios that don't need this level of realism, as Paul Collimore, commercial director of Glasgow-based Digimania, explains: "In terms of where we're targeting, we feel the software is ideal for large production studios, the animation factories, the guys that are churning out hours and hours of medium quality animation."

And what RenderDigimania lacks in realism, it more than makes up for in speed. In early tests, scenes rendered with mental ray took 210 seconds per frame, while RenderDigimania chewed through them at a rate of five frames per second. When you're rendering a 15- or 20-minute episode every week, it's easy to see how this pipeline is both efficient and affordable. In a typical test case, Digimania calculated that on a 52-episode production, its GPU solution could save a studio over £340,000 in rendering costs.
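To put those figures in perspective, here's a back-of-the-envelope sketch of the per-episode difference. The 210 s/frame and 5 fps rates come from the article; the 24 fps frame rate and 15-minute episode length are illustrative assumptions, not figures from Digimania.

```python
# Rough render-time comparison based on the speeds quoted in the article.
# Assumptions (not from Digimania): 24 fps playback, 15-minute episode.

FPS = 24                       # assumed broadcast frame rate
EPISODE_MINUTES = 15           # episode length from the article's example
frames = FPS * EPISODE_MINUTES * 60

mental_ray_seconds = frames * 210        # 210 seconds per frame
renderdigimania_seconds = frames / 5     # five frames per second

print(f"Frames per episode:  {frames}")
print(f"mental ray:          {mental_ray_seconds / 86400:.1f} days")
print(f"RenderDigimania:     {renderdigimania_seconds / 3600:.1f} hours")
print(f"Speed-up:            {mental_ray_seconds / renderdigimania_seconds:.0f}x")
```

On these assumptions a single episode drops from roughly 52 days of single-machine mental ray rendering to a little over an hour – a four-orders-of-magnitude gap before a render farm even enters the equation, which is where the six-figure savings come from.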

And it's not an either/or proposition for companies already invested in a ray-tracing pipeline, suggests Collimore. "If you have a project that potentially you'd turn away because it's not cost effective for you, this solution may make it cost effective. There's nothing that breaks somebody's heart more than turning away business, so it gives studios another option. We've been talking to studios who've been saying 'well we'll render this part of the series in this way, but these bits, I need to get it out quickly so I'll use RenderDigimania.'"

Open standard

For cross-platform support, OpenGL is still the way to go and one app really leveraging its power is Element 3D, the plug-in from Video Copilot, which enables the loading of OBJ and C4D files directly into After Effects. It provides a high-quality preview in real-time, and then renders in near-real-time, depending on the complexity of the scene and effects like anti-aliasing, motion blur and so on.

The idea came from seeing the quality of videogames like Call of Duty, says Andrew Kramer, founder of Video Copilot. "It originally started as a way of creating helicopters and jets, specifically in 3D. It was kind of a dual thing: we were working on a particle plug-in, and then we were working on the helicopter plug-in, just kind of for fun. And as we started to play with it and realise how good it looked and how fast it was, we thought, gosh, there must be a way to build this into something even more useful than something that's just helicopters or jets."

The end result is a tool that provides a simple way of adding 3D elements to a 2D composite, without any prior knowledge of 3D modelling or rendering. "I think there are so many projects and jobs where there's motion graphics or added visual effects – add a '3D something' into a scene," says Kramer.

"You have editors and compositors and people who are not 3D artists. They probably don't have time to jump into a large, dedicated application – not to mention the cost involved. Element is meant as an intermediate place, and even really advanced 3D users have told us how much they love it because, hey, they can do the thing that they need to do really, really fast."

Version 2.0 was released in November of 2014, bringing real-time shadows and physically-based shaders, plus a whole new level of realism and with it, potential new users. "With version 2 we've finally stepped up into that realm that focuses on visual effects," suggests Kramer, "where you have shadows and real, ray-traced ambient occlusion and the things necessary to get higher quality composites and integration. So we have a lot of interest from some really big studios that just have compers who add elements into shots, and they're really excited by it."

Like other developers we spoke to, Kramer suggests that ultimately, cloud-based ray-tracing solutions will provide real-time output, but for now Video Copilot's focus is on OpenGL. "Right now, we think it's the fastest way," he comments. "It doesn't have all the benefits of ray-tracing, but to be able to get to 75 per cent or 80 per cent, to be able to do what it does now with the speed it has, we think it's an amazing advantage. And of course, we have to see where OpenGL gets; we can't just rule it out. I can potentially see OpenGL taking advantage of some other ray-based processing technologies."

To see what Andrew Kramer has to say about the future of Element 3D and upcoming Video Copilot tools, read this interview.

Words: Steve Jarratt

Steve Jarratt has been in CG for many years. He's a regular contributor to 3D World and, at one point, edited the magazine for two years.
