If you ever feel like robots are getting the upper hand on humanity, consider using your own hands to put them in their place.
Researcher Akihiro Nakamura from the Nara Institute of Science and Technology (NAIST) in Japan has developed a motion controller for iRobot's Roomba vacuum bots that recognizes his gestures and posture.
The system is yet another Kinect hack, built using Microsoft's sensor and the OpenNI API. The gestural interface eliminates the need to bend over and push Roomba's buttons. It also allows you to lord it over the overgrown hockey puck.
First, to calibrate the Kinect you have to assume a hands-up stance (either humiliating or all-powerful, depending on your perspective). Then the system starts recognizing gestures, as seen in the demo above. To make Roomba clean a spot on the floor that it missed, assume a scolding stance: left hand on your hip, and right hand pointing at the offending dirt. Roomba scoots over to the spot and does a thorough hoovering.
Similarly, lifting your left hand makes Roomba turn in circles. No word yet on what happens if you give it the finger.
If you leave the room, Roomba goes into docking mode and searches for its recharging station. Rather sheepishly, it seems.
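The control logic described above boils down to a posture classifier: skeleton joints in, Roomba command out. Here's a minimal sketch of that idea. The joint names, coordinates, and thresholds are all illustrative assumptions, not details from Nakamura's actual implementation; in a real setup the joint positions would come from a skeleton tracker such as OpenNI.

```python
# Hypothetical rule-based posture-to-command mapper, in the spirit of
# the gestures described in the article. Joints are (x, y) tuples in
# image coordinates, with y increasing downward. Thresholds are
# made-up pixel values for illustration.

def roomba_command(user):
    """Map a tracked user's posture to a Roomba command string.

    `user` is a dict of joint name -> (x, y), or None if nobody
    is visible in the frame.
    """
    if user is None:
        return "dock"  # user left the room: go find the charger

    head = user["head"]
    left = user["left_hand"]
    right = user["right_hand"]
    hip = user["left_hip"]

    # "Scolding" stance: left hand resting near the hip while the
    # right arm is extended away from the body, pointing at the
    # missed spot -> spot-clean.
    if abs(left[1] - hip[1]) < 30 and abs(right[0] - hip[0]) > 80:
        return "spot_clean"

    # Left hand raised above the head -> turn in circles.
    if left[1] < head[1]:
        return "spin"

    return "idle"
```

A frame loop would call this on every tracked skeleton and forward the resulting command to the robot (e.g. over iRobot's serial Open Interface); debouncing across several frames would keep a twitchy arm from spamming commands.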
MIT's Philipp Robbel has also used Kinect to hack a Roomba, turning it into a gesture-controlled 3D mapping robot.
The Kinect tinkering possibilities seem endless. I'd love to see a dancing Roomba hack.