Projection mapping creates 'living' digital makeup
A combination of motion tracking and projection mapping creates animated "makeup" on the face of a moving model.
What if, instead of spending an hour perfecting your cyborg makeup, you could have a face not only ready to go, but animated as well? Let's face it -- the equipment to do so would likely be way too unwieldy. But we've seen a glimpse of what it could look like, and it is glorious. Omote -- the work of Japanese producer and technical director Nobumichi Asai, whose previous projects include projection-mapped buildings, dockyards and stages -- combines projection mapping and motion tracking for real-time "living" makeup.
Taking his inspiration from the Noh mask, Asai first laser-scanned his model's face and created a 3D mesh. This formed the basis of the projection map. Discreet dots placed on the model's face allow motion tracking so that her face can be transformed, from makeup, to an animated cyborg face, to abstract patterns shifting and changing as she moves her head.
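Asai hasn't published the details of his pipeline, but the core idea behind this kind of projection mapping can be sketched in a few lines: a point on the scanned face mesh is transformed by the tracked head pose, then projected through a pinhole model of the projector so the animated "makeup" pixel lands on the moving face. Everything below -- the function names, the yaw-only rotation, and the projector parameters -- is an illustrative assumption, not Asai's actual system.

```python
import math

def rotate_y(p, theta):
    """Rotate a 3D point about the vertical axis (the head turning).
    A real system would track full rotation plus expression; yaw-only
    keeps the sketch minimal."""
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

def project(p, focal, cx, cy):
    """Pinhole projection of a projector-space point to pixel coordinates.
    focal, cx, cy are hypothetical projector intrinsics."""
    x, y, z = p
    return (focal * x / z + cx, focal * y / z + cy)

def face_point_to_pixel(p_model, yaw, t, focal=1000.0, cx=960.0, cy=540.0):
    """Map a point on the scanned face mesh to a projector pixel, given
    the head pose (yaw + translation t) recovered from the tracking dots."""
    x, y, z = rotate_y(p_model, yaw)
    tx, ty, tz = t
    return project((x + tx, y + ty, z + tz), focal, cx, cy)

# A dot on the tip of the nose, head facing straight on, one metre away,
# lands dead centre of a 1920x1080 projector frame:
print(face_point_to_pixel((0.0, 0.0, 0.05), 0.0, (0.0, 0.0, 1.0)))
# → (960.0, 540.0)
```

In practice the pose would be estimated each frame from the tracked dots (a classic perspective-n-point problem), and the whole textured mesh, not a single point, would be rendered from the projector's viewpoint.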
Asai hasn't detailed which camera he used, although given his past work with Microsoft Kinect, it's possible one is integrated somehow. It's also not clear how much movement the system can handle: the model remains relatively stationary, seated in place and moving only her head.
Nevertheless, it certainly looks spectacular. We imagine -- given Asai's previous work in theatre -- that it could be developed for stage productions. Stay tuned to Asai's website for more information about the project.