The third dimension

3D movies have been around since the 50s, and 3D monitors with shutter glasses were being sold 15 years ago. But 3D didn't make it into the mainstream until recently, when a major push for 3D movies put the technology in the news. Now that 3D is moving into TV, online video and the web, 3D motion graphics and video are starting to generate new opportunities for designers and creatives.

The bad news is that 3D is still loosely defined, and there's no single standard or technology. The good news is that you can create and edit 3D content fairly cheaply with the tools you already own. You'll need to expand them with new input hardware, new plug-ins and new output options, but you should be able to do this without completely overhauling your workflow or spending a fortune on upgrades and new tools.

3D starts with two types of content - live footage, and rendered video. The easy way to capture live 3D action is with a single 3D camera. Panasonic has led the market for the last few years with its professional AG-3DA1 camera. But the five-figure price is eye-watering for anyone who doesn't work with video full-time, and cheaper options are becoming available. Panasonic, Sony and JVC are all launching affordable (£1,500 or less) prosumer cameras this year, with adequate quality for web video and basic broadcast. For example, the Sony HDR-TD10E offers full 3D in a handycam-sized unit. JVC's GS-TD1 is slightly larger, slightly more expensive and - arguably - produces slightly better images. Either one is good enough for professional content production.

If you already own a couple of camcorders, you can create a basic 3D rig by mounting them side by side on a frame. A dual-camera system is big and heavy, and you can't zoom or use other standard video techniques because the two cameras must have exactly the same settings for the 3D effect to work. Even basic movement can be iffy, and you'll need a clapperboard or some other sync tool to guarantee that the footage is in sync when you edit it later. But as long as you can sync the footage and remember to keep left and right views clearly labelled, it's possible to create true 3D with two cameras. ikan manufactures adjustable dual-camera rigs that make it easier to assemble and work with a stable dual camera set-up.
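If you're curious what that sync step actually involves, here's a minimal Python sketch (function name and numbers are my own, not any product's API) of how two takes can be lined up automatically by cross-correlating their audio tracks - the same job a clapperboard's sharp spike makes easy to do by eye:

```python
import numpy as np

def find_sync_offset(reference, delayed):
    """Estimate how many samples `delayed` lags behind `reference`
    by cross-correlating the two audio tracks. The correlation
    peaks at the shift where the tracks line up - a clapperboard
    clap gives the correlation a strong, unambiguous peak."""
    corr = np.correlate(reference, delayed, mode="full")
    # Convert the peak's index into a lag in samples.
    return (len(delayed) - 1) - int(np.argmax(corr))
```

In practice you'd feed this the decoded audio from each camera, then trim the later-starting clip by the reported number of samples before pairing the streams.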

For rendered sources, Autodesk's Maya includes built-in support for stereoscopic 3D. But creating 3D output with other 3D tools is easy - you simply create two separate render passes, with a slight horizontal camera offset. You'll need to experiment with the offset to match other virtual camera features such as focal length, but you can use this technique to get 3D output from any 3D modelling, rendering and animation package.
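As a rough illustration of the two-pass technique, here's a tiny Python sketch (the function name and the 65mm default are assumptions for illustration, not any package's API) that turns one virtual camera position into a left/right pair:

```python
def stereo_camera_positions(center_x, interaxial=0.065):
    """Split one virtual camera into left/right eye positions for
    two render passes. The interaxial distance (in scene units)
    controls the strength of the depth effect; ~65mm roughly
    matches average human eye separation at real-world scale."""
    half = interaxial / 2.0
    return center_x - half, center_x + half
```

Render the scene once from each position, keep the views labelled left and right, and you have a stereoscopic pair ready for editing.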


All 3D sources produce the same content - two video streams, one for each eye. Some video-editing tools, such as Sony Vegas and Avid Media Composer, include built-in 3D support. You can import the two streams either individually or pre-paired, and the software lets you edit the footage in the usual way, with optional 3D preview.

But other tools, such as Final Cut Pro and Adobe Premiere or After Effects, don't yet have native 3D support. The solution is a set of plug-ins and standalone tools from CineForm, whose products have become an industry standard. The Neo series of tools includes the FirstLight importer, which can combine two video streams into a single 3D stream with optional control of depth, offset, colour and other features. The other Neo tools integrate with Final Cut Pro, Premiere and After Effects to offer a full set of editing, preview and output options.
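To make the idea of combining two streams into one concrete, here's a Python/NumPy sketch of half side-by-side packing, one common 3D frame layout: each view loses half its horizontal resolution and the two halves share a single standard frame. This illustrates the principle only - it is not CineForm's actual processing:

```python
import numpy as np

def pack_side_by_side(left, right):
    """Pack left/right eye frames into one standard-sized frame.
    Each view is halved horizontally (by keeping every other
    column), then the two halves sit side by side. A 3D-aware
    display splits the frame back out and stretches each half."""
    half_left = left[:, ::2]    # keep every other column
    half_right = right[:, ::2]
    return np.hstack([half_left, half_right])
```

The trade-off is obvious from the code: you keep a normal frame size and data rate, at the cost of half the horizontal detail per eye.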

Editing can be tricky because standard 2D effects such as transitions, dissolves and animations may not have a 3D equivalent. CineForm includes a very basic selection of dissolves and other 3D effects, but popular eye-candy plug-ins such as the Trapcode series don't yet support 3D. So some standard techniques and effects may not be available; some design ideas won't work at all, while others may need a work-around. On the other hand, 3D makes alternative effects possible, including true depth motion for titles and animations. Creatively, it's important to be aware of both the limitations and the possibilities before you start designing visuals.

You'll need to monitor the 3D effect as you work. This isn't difficult, but you will need a mid- or high-end graphics card and a 3D monitor. The only difference between standard and 3D monitor technology is a higher refresh rate, typically 120Hz, which lets the monitor alternate left- and right-eye frames while each eye still sees the usual 60Hz. Hyundai, Samsung and many other manufacturers supply suitable monitors and glasses. But Nvidia's graphics cards and 3D glasses have become a standard, and it's worth considering compatibility.
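The 120Hz trick is simply frame interleaving. Here's a minimal Python sketch (illustrative only) of how two 60fps eye streams become one 120fps frame-sequential stream:

```python
def frame_sequential(left_frames, right_frames):
    """Interleave two 60fps eye streams into one 120fps
    frame-sequential stream: L0, R0, L1, R1, ... The shutter
    glasses open and close each eye in step with the alternating
    frames, so each eye sees only its own 60fps stream."""
    out = []
    for left, right in zip(left_frames, right_frames):
        out.extend([left, right])
    return out
```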

Optionally, you can also buy a standard widescreen domestic 3D TV, with glasses, for a bigger 3D effect. CineForm can output video previews in most of the formats supported by 3D TVs, and send them out over HDMI. If your graphics card supports HDMI, you're good to go.

For final output, the CineForm codec has become a recognised video delivery format, so you can render your 3D project to a CineForm file and send it to your client. Unfortunately, web video is still playing catch-up. It would be useful to deliver to a standard Flash video container, but the current version of Flash doesn't support 3D video. There are unconfirmed rumours this will change later this year, but there's been no final announcement yet from Adobe.

Do viewers want 3D? In sales terms, 3D TV hasn't become the big draw it was supposed to. But for designers, the cost of entry is so low, and the learning curve is so shallow, that it's well worth experimenting with 3D and offering it as a client option. Even if 3D doesn't conquer the world, it's unlikely that it will disappear altogether - and adding depth to your creative output is never a bad plan.

3D technology jargon-buster
Confused by the different 3D options? Here's a no-nonsense guide

Active 3D
Various schemes for monitors and TVs that combine interlacing and page flipping with 3D LCD shutter glasses that alternate the view between the eyes. The TV or monitor sends out an IR signal that synchronises the glasses to the 3D content.

Anaglyph
A complicated word for 'coloured glasses'. Blue/yellow glasses are generally preferred, because they don't trash the colours in a video. Cheap cardboard specs with plastic film lenses are better than more expensive rigid plastic lenses because they let more light through.
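As an illustration only, here's how the classic red/cyan anaglyph variant can be built in Python/NumPy - red channel from the left eye, green and blue from the right (the function name is my own):

```python
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """Build a red/cyan anaglyph frame from a stereo pair: the
    red channel comes from the left-eye image, green and blue
    from the right. Red/cyan specs then route each mix to the
    correct eye - which is exactly why colour fidelity suffers."""
    frame = right_rgb.copy()
    frame[..., 0] = left_rgb[..., 0]  # channel 0 = red
    return frame
```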

Nvidia 3D Vision
Nvidia's active 3D technology, combining monitors, graphics cards and active glasses; it's becoming an industry standard of sorts.

Passive 3D
A cinema system that uses passive specs with polarised lenses, dual projectors and a high-quality screen. Passive 3D was used almost exclusively by Imax theatres until the rest of the industry discovered it a few years ago. Unless you own a cinema, you'll probably never work with this system directly.

SDI (Serial Digital Interface)
High-end 3D video interface and connection system, used exclusively on professional TV and film equipment.

Stereoscopic 3D
This is how most current 3D technology works. It uses two stills or two video streams, and sends one to each eye. Your brain then combines them to create a 3D scene with real depth. Don't confuse stereoscopic 3D with 'normal' 3D rendering or output, which creates a 3D scene with 3D objects displayed in a single flat 2D view with no depth.
