With 3D, cinema has changed forever.
Well, not really, but we are seeing a resurgence in 3D feature films and, for the first time, live 3D TV. The difference this time around is that 3D imaging and playback are no longer the preserve of large studios: The ability to capture and view 3D footage has been incorporated into consumer electronics.
Right now, capturing, editing and outputting high-quality 3D footage is still expensive on the whole, but costs are coming down fast. Several consumer-level 3D cameras are on the market, and their presence has driven software companies to create plug-ins and updates for nonlinear editing (NLE) systems.
Shooting 3D video
At the lower end of the market, you now have several options for shooting 3D video. These camcorders retail from $200 to more than $1000, and most come with software that allows you to create your 3D film easily. You can also choose between a camcorder designed primarily for shooting 3D footage and a mainly 2D camcorder that lets you shoot 3D as well.
For example, the Panasonic HDC-SDT750 ($1999) is a full HD consumer camcorder that has a detachable 3D lens, while the Sony Handycam HDR-TD10 ($2199) and JVC GS-TD1 ($2199) both have fixed dual lenses and glasses-free 3D screens. Because of those screens, you don't need 3D glasses to see the 3D images while you're filming with the Sony and JVC camcorders, which is a huge bonus.
A popular professional 3D video camera for indie filmmakers at the moment is the Panasonic AG-3DA1. Priced at around $20,000, it certainly isn’t the cheapest camera, but the results are impressive. It has two lenses and two sensors, it records in AVCHD at full HD, and it provides a dual HD-SDI output for 3D monitors.
The way cameras handle 3D varies from model to model. Some record a single file that contains the video from both channels, though the file format differs by camcorder. The Panasonic AG-3DA1, in contrast, captures two separate video channels on two separate cards, so when you come to the edit, you bring in two separate channels of video. You can edit this video in any NLE system (such as Avid or Final Cut Pro), but you first need to install plug-ins that will allow you to layer the image and view it in 3D by outputting it to a 3D-capable monitor.
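To make the layering idea concrete, here is a toy sketch (not any NLE plug-in's actual code) of the side-by-side frame packing that many 3D displays accept: each left-eye frame and its matching right-eye frame are joined into one double-width frame. Frames are modeled as 2D lists of pixel values for simplicity; real footage would come from decoded video.

```python
def pack_side_by_side(left_frame, right_frame):
    """Concatenate each row of the left-eye frame with the matching row
    of the right-eye frame, producing one double-width packed frame."""
    if len(left_frame) != len(right_frame):
        raise ValueError("left and right frames must have the same height")
    return [l_row + r_row for l_row, r_row in zip(left_frame, right_frame)]

# Two tiny 2x2 'frames', one per eye
left = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
packed = pack_side_by_side(left, right)  # [[1, 2, 5, 6], [3, 4, 7, 8]]
```

A real pipeline would repeat this per frame and hand the packed video to a 3D-capable monitor, which splits it back into two views.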
Options for editing 3D video
A popular stand-alone application for professionals is a program called Cineform Neo3D (US$999). The software takes the left and right channels of video information and combines them into one AVI or MOV video file, which makes the dual-channel video compatible with most video-editing programs. Once the conversion is done, you open the file with your NLE suite; active metadata allows for all kinds of adjustments within both channels of video. Cineform Neo3D files work with most popular professional video suites, including After Effects, Avid Media Composer, Final Cut Pro, Premiere Pro and Vegas.
Although it’s expensive, Neo3D is great software that helps speed up 3D editing. It allows you to make adjustments to the image with fine accuracy so that even if the lenses are slightly out of alignment, the software can correct that and make sure that you get the proper 3D effect.
With 3D, however, you still have to do a lot of things twice, even if you are using Cineform Neo3D. For example, if you are rotoscoping a section of video (manually tracing over live-action footage, frame by frame), you have to do so for the left channel and then again for the right.
That isn’t the only headache. Three-dimensional filming can be incredibly complicated, as you are not only looking at focus but also at the depth and the spatial relationship between objects across several planes. If your setup is only slightly wrong, your foreground image will quickly resemble a blurry, ghostlike object floating somewhere in space (not in a good way).
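The depth relationships at work here can be sketched with the standard pinhole-camera disparity formula (a simplified model for illustration, not any camera's actual firmware): the horizontal offset between where an object appears in the left and right images grows as the object gets closer, which is why a slightly wrong setup turns foreground objects into double, ghostlike images while the background still looks fine.

```python
def disparity_pixels(baseline_mm, focal_px, depth_mm):
    """Stereo disparity, in pixels, for a point at a given depth.

    baseline_mm: distance between the two lenses (the interaxial)
    focal_px:    focal length expressed in pixels
    depth_mm:    distance from the cameras to the object
    """
    return baseline_mm * focal_px / depth_mm

# Assumed example numbers: a 60 mm interaxial and a 1000 px focal length.
near = disparity_pixels(60, 1000, 1000)   # object 1 m away  -> 60.0 px
far = disparity_pixels(60, 1000, 10000)   # object 10 m away ->  6.0 px
```

A one-pixel alignment error is invisible against the 6-pixel background disparity but glaring against the 60-pixel foreground disparity, which is why near objects fall apart first.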
3D output options
All of that time-consuming, expensive editing is for naught unless you can watch the results of your work. You will need a 3D TV and some glasses to get the most out of your 3D film. You can watch it on a standard-definition TV with red-and-blue glasses, but the effect is pretty disappointing: The quality isn’t there, and instead of the image being crisp and popping out of the screen, it comes out more blurry and hazy.
What’s more, you have no easy way to move 3D films from your editing suite to your TV on disc (Blu-ray or DVD) without spending a fair bit of cash. While 3D technology seems to be progressing with the cameras, the editing systems, and the TVs, the actual output options are pretty limited at present.
At the moment, your best bet for 3D Blu-ray creation is to take your project to a finishing studio for mastering. The technology necessary for the task is so expensive right now that it would make very little sense for anyone to try it in their home.
On the much lower end of the quality equation, YouTube 3D is a quick and easy way to watch and share your 3D videos; it will even convert 2D videos into anaglyph 3D footage that people can view with colored glasses.
By uploading your footage and adding ‘yt3d:enable=true’ as a tag for your video, you can view your 3D footage in a number of different ways: with red-blue glasses, with amber-blue glasses and even through glasses-free methods that involve crossing your eyes. Of course, some methods work better than others and none of them truly approach the 3D effect you’d get from active-shutter or polarised glasses in a cinema.
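One common way anaglyph conversion works (a simplified sketch under that assumption, not YouTube's actual pipeline) is to take the red channel from the left-eye image and the green and blue channels from the right-eye image, so that each tinted lens of the glasses filters out the other eye's view. Pixels here are modeled as (r, g, b) tuples.

```python
def anaglyph_pixel(left_px, right_px):
    """Combine one left-eye and one right-eye pixel into a single
    anaglyph pixel: red from the left eye, green and blue from the right."""
    return (left_px[0], right_px[1], right_px[2])

def anaglyph_image(left_img, right_img):
    """Apply the per-pixel combination across two equal-size images,
    each represented as a list of rows of (r, g, b) tuples."""
    return [
        [anaglyph_pixel(lp, rp) for lp, rp in zip(l_row, r_row)]
        for l_row, r_row in zip(left_img, right_img)
    ]

left = [[(255, 0, 0), (10, 20, 30)]]
right = [[(0, 255, 255), (40, 50, 60)]]
combined = anaglyph_image(left, right)  # [[(255, 255, 255), (10, 50, 60)]]
```

Because the depth information is carried entirely by color separation, any color in the scene that matches a lens tint leaks into the wrong eye, which is one reason anaglyph never approaches the quality of shutter or polarised glasses.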
If you have a networked 3D TV, you can also stream your 3D YouTube clips to the set—and in my experience, today’s 3D TVs display the footage properly. Streaming to a 3D TV won’t work miracles, though: This is an easy way to view 3D in your home, but the quality isn’t great, to say the least. The trade-off is that your video is instantly accessible to the world, and you don’t need to spend nearly as much money.