
Playing Iron Man for a day

To celebrate the hit film's release on DVD, ILM showed off some of the technology behind the blending of 3D animation and live-action footage.

Daniel Terdiman Former Senior Writer / News
While wearing a motion-capture suit, CNET reporter Kara Tsuboi shows how her movements are translated instantly to an 'Iron Man' character on the screen behind her. The technique is used in an increasing number of films to mix live-action footage with digital, 3D sets. Daniel Terdiman/CNET News

SAN FRANCISCO--On Tuesday, the DVD version of the mega-hit film Iron Man will be released, and to celebrate, the visual effects superstars at Industrial Light & Magic decided to show off just a little bit more magic behind the movie.

Back in April, ILM invited me and a couple of my colleagues to their fantastic facilities here for a look at the technology behind the famous suit used in Iron Man. Recently, they invited us back to see how the seamless animation in some of the film's scenes--such as one famous shot involving the throwing of an Audi--was produced.

In particular, they wanted to give us the inside scoop on the motion-capture technology used to create a number of the film's scenes--a technology, increasingly common today, that allows directors to see in real time, while the actors are performing, what the animated sequences will look like.


That's why we--myself, CNET reporter Kara Tsuboi, and a cameraman--spent several hours on an ILM image capture stage last week: So that Tsuboi could don a motion-capture suit and we could all see how footage of her would translate instantly into an animated Iron Man scene.

The idea is that George Lucas--who owns the effects studio--wants to give filmmakers advanced technological tools that provide them with flexibility and efficiency. And so he staffs ILM with the kinds of people who can make that happen.

"We understand the entire process," said ILM digital supervisor Michael Sanders, "from writing code to animating creatures to even shooting live elements. So we know each layer in the process. We understand the vision of the key creatives and understand" what the actors are going to do.

The technology used at ILM--and elsewhere, as well--allows directors to mix real filmmaking and virtual spaces, but with full camera control, depth of field, tracking, and panning. The upshot? A filmmaker can have an entire digital set created, then have an actor perform on the image capture stage wearing the motion-capture suit, and see, as the filming is happening, how the actor's character looks superimposed on the digital background.

And that means that a director--using a special wireless device that lets him or her see the entire mixed-media image in real time--can move around on the stage, looking for the angles he or she wants and creating new compositional choices on the fly.

This device is used by directors to arrange new camera angles on the fly as they film live actors and have the footage mixed with digital backgrounds. Daniel Terdiman/CNET News

Further, because a production spends fewer resources moving a crew to and from physical locations, it has more left over to experiment with different ideas.

"There's more time to play with it in the virtual space," said ILM spokesman Greg Grusby, "because you're not burning money on set."

In addition, because the process is entirely digital--save for the actor's role, of course--results are available much faster. Often, said Sanders, within a couple of hours of shooting a performance, animators have already worked on the footage and can deliver a fully formed scene to the producers.

To be sure, this technology has been around for some time. But now, according to Sanders, the wizards at ILM and other effects houses are working on new systems that could, finally, allow animators to create photo-real human faces, doing away once and for all with the dead eyes so familiar to fans of video games and animated movies.

But this kind of advancement might still be as much as five years off, Sanders said.

From impossible to possible
For now, though, the current technology is still hugely valuable to the filmmakers who hire ILM to do their visual effects. For one thing, the technique allows for putting things on film that might not otherwise be possible, or at least efficient.


For example, the real-time animation and motion-capture technology could be used to create a scene where a character jumps off a skyscraper. By animating the background, and having an actor jump inside the image capture studio, the two different sets of images can be mixed for the desired effect.

"It crosses the line of what you can actually do," said Sanders. "You can do battle scenes where there are thousands of people battling."

In part, this process is done by using not just actors with motion-capture suits, but also proxy props adorned with their own set of reflective sensors. And that means that the footage also captures the movement of the props, allowing the animators to build them into the scenes.

For Tsuboi, taking part in a demonstration of this technique meant putting on a spandex motion-capture suit and having several dozen reflective sensors placed all over her body, sensors that are used to capture her exact motions and translate them to the Iron Man figure being displayed on several screens around the image capture facility.

On the screens, the ILM crew had pre-positioned a generic Southwestern desert background. And as Tsuboi began to do several poses needed to capture her "skeleton," each pose was instantly mirrored by an identical pose from Iron Man.

According to Sanders, the motion-capture system is "not forgiving at all": it picks up every little thing, so actors need to be very precise in their motion. Similarly, someone with, say, an injury might find that the slightest limp shows up in the capture.

Back in the control room, ILM associate research and development engineer Spencer Reynolds was manning the computers. Taking the digital skeleton captured from Tsuboi's poses, he clicked, with a mouse, on a series of dots on the skeleton--the several dots that make up her arm, her leg, her head, and so on--as part of a process that tells the computer exactly how her specific shape corresponds to the model of a human body it can understand. Once he's done with that, Reynolds told me, the computer can interpret Tsuboi's motion as that of a human, and not be confused by irregular movements.
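The labeling step Reynolds performed can be pictured, very loosely, as assigning each captured dot to a named joint, after which later frames of raw dots can drive the skeleton automatically. Here is a minimal Python sketch of that idea; every name, coordinate, and structure below is invented for illustration and does not reflect ILM's actual software:

```python
# Hypothetical sketch of marker labeling: each reflective dot from one
# calibration pose is assigned to a named skeleton joint, so future
# frames of raw dots can be mapped onto the character rig.

# One calibration frame: raw marker positions in stage coordinates.
calibration_frame = {
    "marker_01": (0.0, 1.7, 0.0),   # top of head
    "marker_02": (-0.4, 1.4, 0.0),  # left shoulder
    "marker_03": (0.4, 1.4, 0.0),   # right shoulder
}

# The operator "clicks" each dot and names the joint it belongs to.
labels = {
    "marker_01": "head",
    "marker_02": "shoulder_l",
    "marker_03": "shoulder_r",
}

def retarget(frame, labels):
    """Map a frame of raw marker positions onto named skeleton joints."""
    return {labels[m]: pos for m, pos in frame.items() if m in labels}

skeleton = retarget(calibration_frame, labels)
print(skeleton["head"])  # the head joint now follows marker_01
```

Once the mapping exists, every new frame from the stage can be run through the same lookup, which is why, after calibration, the crew could remove many of Tsuboi's sensors.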

Here, after the hat--adorned with motion-capture sensors--fell off Tsuboi's head, on screen it looked as though Iron Man's head had come off. Daniel Terdiman/CNET News

And after Tsuboi finished the set of calibration poses, the crew came out and removed many of the sensors, because only a few were still needed once her form was fully understood by the computer.

Tsuboi, in the meantime, was playing around inside the image capture stage, and her every move was being instantly translated onto the screen. Except she had become Iron Man.

Perhaps the funniest moment of all was when she accidentally dropped the sensor-studded hat she was wearing; on the screen, Iron Man's head suddenly fell off. And when she picked up the hat, it appeared as though Iron Man's head was in his hands.