New mo-cap tech renders CG in real time

An upcoming short film called Construct finds new ways to use the current CG animation technology to produce mind-blowing results.

Michelle Starr, Science editor

Take a look at this.

This is a teaser for a short film called Construct, coming out later this year. We don't know a lot about the plot, but we know that the name is clever ("construct" being both a verb meaning "to build" and a noun meaning "a physical object that is deliberately built", which seems appropriate for a film about robotic builders) and that the animation looks so much more lifelike than any other CG we can think of right now.

It's the work of Kevin Margo, whose CG oeuvre with Blur Studio includes work on Batman: Arkham Origins, Halo 4, Lost Planet 3 and Mass Effect 2. Construct will be his second solo project, and not only the first of his projects, but possibly the first CG project to date, to use a new technique that allows photorealistic rendering in real time.

"What you're seeing in the Construct making-of video [embedded below] is a re-appropriation/assembly/revising/porting of a bunch of existing tech/hardware/software to establish a new rendering workflow that will streamline the virtual production process geared towards filmmaking," Margo explained to CNET Australia.

"It's really the workflow surrounding the rendering approach that's the key new thing. It uses the same photorealistic rendering software, coarsely, in real time, that scales to a final film-quality frame you'd see in Avatar or Pacific Rim. It's a specialised form of ray tracing called path tracing. It's what most modern rendering software used in VFX/feature films uses to generate photorealistic images. However, up until now those images have taken minutes or hours to produce. We've extended these rendering concepts to a real-time context while capturing performances."

What this means is that the CG can be overlaid on the mocap actor in real time. This produces a fairly coarse, grainy rendering, but with some extra processing time it resolves into a final frame. The upshot, Margo said, is that redundant, dead-end assets are effectively eliminated, streamlining the animation process and letting the creative team see almost immediately how lighting and shading behave in each scene.
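Margo's pipeline itself isn't public, but the progressive idea he describes is easy to sketch: in a path tracer, a pixel's value is just the running average of many random light-path samples, so a grainy real-time preview and the resolved final frame come from the same computation, only with different sample counts. A minimal, hypothetical Python illustration (`shade_sample` stands in for a real light-path evaluation):

```python
import math
import random

def shade_sample(rng):
    # One Monte Carlo light-path sample for a hypothetical pixel:
    # the true brightness is 0.5, with noise from the random path.
    return 0.5 + 0.3 * math.sin(rng.random() * math.tau)

def render_pixel(samples, seed=0):
    # Progressive path tracing: the estimate is the running average
    # of all samples, so a coarse preview and the final frame use the
    # same maths -- only the sample count differs.
    rng = random.Random(seed)
    return sum(shade_sample(rng) for _ in range(samples)) / samples

preview = render_pixel(16)     # grainy, real-time-budget estimate
final = render_pixel(16384)    # same estimator, resolved with more samples
```

Because the preview and the final frame share one estimator, nothing done while blocking a shot is thrown away; the renderer simply keeps accumulating samples toward the finished image.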

Although real-time mocap CG has been seen before (the PrioVR gaming rig is a good example), the difference is that those systems tend to use rasterisation rather than ray tracing, which produces a much less realistic result.

"Up until now...any/all of these types of mo-cap/vr tech are based on the same openGL/directX/rasterization hacks that game engines have used for years," Margo said. "And games noticeably don't yet look of feature film quality. You can only squeeze 75 per cent of the quality of those approaches compared to what a ray traced CG feature film looks like."

The software Margo is using is something called V-Ray, which Chaos Group is developing in collaboration with him. It lets Margo achieve the results he's after, while the film's production needs drive the software's development in turn.

But, as you can probably imagine, the kind of processing power required to render CG in real-time is pretty hefty. It makes one wonder: just what kind of power is behind the project?

"Top of the line GPUs made by Nvidia, and a tower with a bunch of PCI-Es capable of supporting multiple GPUs," Margo said. "That's another newish thing we're doing... harnessing the insane power of multiple GPUs to render these images in real-time. Traditionally, these ray-trace renderings have been done on CPUs only. But recent advances in GPUs have opened up the possibility to ray trace/path trace on them. I was getting up to 70x rendering speed improvement on the GPU over a CPU-based approach. This approach is not new or unique to Construct, but hasn't been taken really seriously until recently, in part because of the example Construct is setting."
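The multi-GPU scaling Margo describes works because path tracing is embarrassingly parallel: every pixel's light paths are computed independently, so a frame can simply be carved into one batch of pixels per GPU with no communication between them. A toy sketch of that partitioning (the function and the four-GPU figure are illustrative assumptions, not details from Margo's actual setup):

```python
def split_pixels(width, height, num_gpus):
    # Path tracing has no inter-pixel dependencies, so a frame can be
    # split into near-equal contiguous batches, one per GPU.
    pixels = width * height
    base, extra = divmod(pixels, num_gpus)
    batches = []
    start = 0
    for gpu in range(num_gpus):
        count = base + (1 if gpu < extra else 0)  # spread the remainder
        batches.append(range(start, start + count))
        start += count
    return batches

# A 1080p frame divided across a hypothetical four-GPU tower.
batches = split_pixels(1920, 1080, 4)
```

Each GPU then runs the same sampling loop over its own batch, which is why adding cards scales throughput almost linearly, and why the move from CPU to many-core GPU hardware yields the kind of speedup Margo cites.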

It opens up some tremendously exciting possibilities for filmmakers — and for viewers, too, if it's producing content as stunning as Construct.

You can read more about the technical aspects of the project here, and watch the Construct making-of below.