Kinect hacks have been used for many a grand feat, from a tool that helps the blind navigate more easily to hands-free questing in World of Warcraft and virtual cat brushing.
So why not integrate the powers of Microsoft Kinect with a mirror to teach such subjects as basic anatomy?
For the past year, a team out of the Technical University of Munich in Germany has been working on just that. The researchers use Kinect to estimate the position of a person in front of an augmented-reality mirror in order to create the illusion that the user can see inside her own body.
Researchers Tobias Blum and Nassir Navab say the tool, which they call Mirracle--for "mirror miracle," I suppose--is largely educational, and report that they installed a prototype of their Mirracle system in the Academic Medical Center in Amsterdam in September 2011.
The Kinect provides the tracking, while the OpenNI framework and PrimeSense's NITE middleware extract the user's skeleton so the system can overlay anatomical images on the person standing in front of the mirror. The Kinect sits next to the screen, so the user can interact with the Mirracle system using touch-screen-style gestures in midair, without having to actually touch anything.
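The core geometric step behind an overlay like this is projecting a tracked 3D joint position onto 2D display coordinates, so anatomy graphics can be drawn at the right spot on the user's reflection. Here is a minimal sketch of that step, not the authors' actual code: it uses a standard pinhole-camera projection, and the intrinsic parameters (fx, fy, cx, cy) are illustrative values in the rough range of a Kinect depth camera, not calibrated ones.

```python
# Hypothetical sketch: project a skeleton joint from 3D camera space
# (metres) to 2D pixel coordinates on a 640x480 display.
# fx, fy, cx, cy are assumed, uncalibrated intrinsics for illustration.

def project_joint(x, y, z, fx=585.0, fy=585.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point in front of the sensor to pixels."""
    if z <= 0:
        raise ValueError("joint must be in front of the camera")
    u = cx + fx * x / z
    v = cy - fy * y / z  # image y axis points down
    return u, v

# A torso joint 2 m from the sensor, centred horizontally,
# 0.2 m above the optical axis:
u, v = project_joint(0.0, 0.2, 2.0)
```

In a real system like Mirracle, these joint positions would come from the skeleton tracker each frame, and the anatomical image would be scaled and anchored between joints (say, neck and pelvis) before being composited over the live camera feed.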
While the end result is a little crude--for instance, it overlays a generic CT image rather than a scan of the subject's own body--the Mirracle system makes anatomy noticeably more interactive and easier to visualize. Those are two big perks for studying the subject.