BURBANK, Calif.--Back in the dark ages of modern 3D filmmaking -- meaning more than a year or two ago -- the process of aligning imagery coming from the two cameras required to shoot in 3D could be slow, methodical, and frustrating work. But one company thinks it has done away with those days forever.
At 3ality, an outfit here that is among the leaders in the nascent era of 3D filmmaking -- as opposed to the horrible 3D films dating back a few decades -- there's no reason to labor over the optimization of such imagery after the fact. Not when the company's technology is designed to give directors and their 3D engineers and stereographers the tools to digitally align and optimize what they're shooting in 3D in real time, and to do so from a laptop computer -- leaving the camera operator to worry only about shooting.
The approach seems to be working. 3ality's technology has been adopted on a number of films; this year alone, it has been used by Ridley Scott on "Prometheus," on the upcoming "Amazing Spider-Man," and even on singer Katy Perry's "Part of Me" documentary. Past hits have included "U2 3D."
The company has been at it for several years, and its approach hasn't changed much since 2007, when CEO Steve Schklair told CNET that 3ality's technology was all about efficiently taking footage from the two cameras and manipulating it after the fact so the images match. The technology was also designed to smoothly blend the shot-by-shot depth of scene, so that as a film cuts from one scene to another, viewers aren't taken on the roller coaster ride of perception that marked the 3D films of old.
"On a cut-to-cut basis in the past, the depth of each shot was different," Schklair said at the time. "But now, we're using these new tools that hand off the depth for each shot. So we transition the depth from shot to shot, so your eyes are led though the movie. So your eyes don't even notice."
And the goal has never changed: to make it easy for filmmakers to produce a 3D movie that offers depth and richness while never entering the dreaded "throw-up territory" that comes when the separation between imagery from the two cameras exceeds 2 percent.
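As a rough illustration of the arithmetic behind that limit, the check below measures how far apart the same on-screen feature sits in the left-eye and right-eye images, as a percentage of frame width. This is a toy sketch for this article, not 3ality's actual code; the function names and the simple threshold test are illustrative assumptions.

```python
# Toy sketch of the "2 percent" comfort rule described above.
# All names are hypothetical, for illustration only.

def separation_percent(left_x: float, right_x: float, frame_width: float) -> float:
    """Horizontal parallax between the two eyes' images,
    as a percentage of the frame width."""
    return abs(right_x - left_x) / frame_width * 100.0

def is_comfortable(left_x: float, right_x: float, frame_width: float,
                   limit_percent: float = 2.0) -> bool:
    """True if the separation stays at or under the comfort limit."""
    return separation_percent(left_x, right_x, frame_width) <= limit_percent

# Example: a feature at x=960 in the left eye and x=995 in the right,
# on a 1920-pixel-wide frame, is separated by about 1.8 percent of the
# screen -- inside the limit. A 50-pixel separation (about 2.6 percent)
# would not be.
```

On a 1920-pixel frame, the 2 percent rule works out to roughly 38 pixels of separation; anything beyond that is what the article calls throw-up territory.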
But now, 3ality is bringing even more cutting-edge technology to the table, both on the hardware and software sides.
It starts with the company's latest 3D stereoscopic system, a new set of precision tools meant to offer filmmakers sub-pixel accuracy for the imagery coming from their two cameras. Where previous versions of its technology were state-of-the-art, giving 3D engineers and stereographers the ability to quickly integrate their footage, the newest tool, the Helix, does so with the highest level of automation yet, explained Jill Smolin, 3ality's director of education.
With Helix, Smolin explained to me when I stopped by the company's offices as part of Road Trip 2012, filmmakers have full automation at their fingertips, with motors operating all the lens functions that previously had to be adjusted manually -- including focus, zoom, and iris.
A 3D camera setup works like this: Two cameras are mounted on a rig like the Helix, one facing forward and the second pointing either straight down or up. The second camera shoots off a half-silvered mirror, while the first shoots through it; the mirror acts as a beam splitter, so both cameras capture the same scene from slightly offset viewpoints. The distance between the two lenses -- the interaxial distance -- determines the depth of the 3D image.
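The relationship between interaxial distance and depth can be sketched with a simple converged-camera model: objects at the convergence distance land at the same spot in both images (zero parallax), while nearer and farther objects separate in opposite directions, and widening the interaxial distance scales that separation up. This is a textbook toy model, not 3ality's math; the function and its parameters are assumptions for illustration.

```python
# Toy converged-stereo model: on-sensor parallax for an object,
# given interaxial distance and focal length. Hypothetical names,
# for illustration only.

def parallax_mm(interaxial_mm: float, focal_mm: float,
                convergence_m: float, object_m: float) -> float:
    """On-sensor horizontal parallax, in millimeters.

    Zero at the convergence distance; negative for nearer objects
    (they appear in front of the screen), positive for farther ones
    (behind the screen)."""
    return interaxial_mm * focal_mm * (
        1.0 / (convergence_m * 1000.0) - 1.0 / (object_m * 1000.0)
    )

# With a 65 mm interaxial and a 35 mm lens converged at 5 m:
# an object at 5 m has zero parallax, an object at 2 m pops out
# of the screen, and one at 20 m recedes behind it. Doubling the
# interaxial to 130 mm doubles the parallax -- and the depth effect.
```

This is why the stereographer's choice of interaxial distance is such a sensitive one: it directly scales how much the two images separate, and therefore how close a shot gets to the 2 percent limit.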
The second half of 3ality's high-tech equation is its stereo image processing (SIP) technology, as well as its Intellesuite software.
The SIP tool offers a deep set of features. First is an on-screen advice system for the settings -- vergence, interaxial, and infinity point measurement -- that determine the quality of the final 3D film. Next is geometric analysis that gives "vertical alignment, rotation, focus, and image size matching [to] preserve the highest level of quality in the imagery," according to 3ality's Web site. A third tool provides depth analysis via an on-screen bar graph "that visualizes the total depth distribution of the image."
At the same time, the company's Intellesuite software -- which works hand-in-hand with the SIP and the beam-splitter -- gives a variety of options to filmmakers, from quick automatic setup to real-time 3D control and composition and more.
And what used to be a manual process requiring a 3D engineer or stereographer to eyeball whether both cameras were seeing the same thing is now automated at the push of a button. "If [the system] sees that I've got two cameras, and they're vertically misaligned -- throw-up 3D," Smolin explained, "the stereo image processor brings it down to...the tolerance of one-one-hundredth of a pixel of vertical misalignment."
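The idea behind such an automatic check can be sketched in miniature: compare a brightness profile from each camera (here, one number per scanline) and search for the row shift that makes them line up best. This is a crude, integer-pixel illustration of the concept -- 3ality's processor works to a hundredth of a pixel and in real time -- and every name in it is hypothetical.

```python
# Toy vertical-misalignment estimator: brute-force search for the
# integer row shift that best aligns two per-scanline brightness
# profiles. Illustrative only; real systems work at sub-pixel
# precision on full images.

def estimate_vertical_offset(left_rows: list, right_rows: list,
                             max_shift: int = 5) -> int:
    """Return the row shift (in whole rows) that minimizes the
    mean squared difference between the two profiles."""
    best_shift, best_err = 0, float("inf")
    n = len(left_rows)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left_rows[i], right_rows[i + s])
                 for i in range(n) if 0 <= i + s < n]
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# If the right camera's image is shifted two rows relative to the
# left's, the search recovers that offset -- which a rig's motors
# could then correct.
```

Once the offset is known, correcting it is the easy part; the engineering feat Smolin describes is doing this detection continuously, on moving footage, to within a hundredth of a pixel.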
State of 3D
For 3ality, its future obviously depends on there being a market for 3D feature films, TV shows, documentaries, and so on. But according to head of production Ted Kenney, that future doesn't rely solely on the silver screen.
Instead, he said, 3ality is seeing growing demand for 3D content for devices like smartphones, tablets, and 3D TVs. Still, Kenney knows that viewers aren't yet sold on 3D. Some people find 3D films uncomfortable to watch, though he attributes that more to filmmakers who haven't yet mastered the medium than to fundamental flaws in the technology: if a film doesn't keep to the 2 percent limit for image separation, it may be because the filmmakers don't fully understand those limits.
Still, if moviegoers get nauseated watching a 3D movie, they may not be interested in seeing another one, especially since they have to pay a premium for a ticket, Kenney admitted. That's why it's essential for new movies using the technology, like Baz Luhrmann's "The Great Gatsby" or "Prometheus," to convince people that it's worth their time and money. And Peter Jackson's "The Hobbit" may be next to boost 3D the way "Avatar" did a couple of years ago.