Engineer unlocks Wii's hidden potential

Chalk up another victory for the Wii: it's a good foundation for virtual-reality helmets, a fashionable multitouch user interface, and a virtual whiteboard.

Correction 7:45 a.m. PST: I got the sensor bar and the Wiimote's duties mixed up. Names notwithstanding, the sensor bar has the infrared LEDs, and the Wiimote actually has the cameras that detect the signals.

I support the hardware-hacking philosophy on principle, but most of the movement's labors have left me uninspired. That all changed when I started seeing the uses that Carnegie Mellon researcher Johnny Chung Lee has found for the Nintendo Wii's infrared remote control.

In a collection of videos, notable for their lucid explanations, the Ph.D. graduate student from CMU's Human-Computer Interaction Institute shows exactly how versatile the "Wiimote" system can be. Among the uses he convincingly demonstrates: a virtual-reality head tracker; a virtual whiteboard on a wall, tabletop, and laptop screen; and a Minority Report-style arm-waving and finger-pointing multitouch user interface.

The Nintendo console includes a bar-shaped sensor, ordinarily placed atop a TV screen, with two LEDs, or light-emitting diodes. The bar emits infrared light that the Wiimote's camera can detect within a 45-degree field of view. Lee uses a computer to process data from those components and dramatically expand their utility.

By attaching the sensor bar to his head and the Wiimote to a TV, he was able to construct a system that knows where his head is located. That information is then fed into an algorithm that changes the perspective of an image on a monitor. The result is a very convincing 3D feel that looks like it would be a great fit for video games.
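The article doesn't show Lee's code, but the geometry behind the head tracker can be sketched. The Wiimote's camera reports the pixel coordinates of the two infrared dots it sees; because the physical separation of the LEDs on the head is fixed, the apparent pixel separation gives depth, and the dots' midpoint gives horizontal and vertical position. A minimal sketch, assuming a 1024x768 camera resolution, a 45-degree field of view, and an illustrative LED separation:

```python
import math

CAMERA_WIDTH_PX = 1024
CAMERA_HEIGHT_PX = 768
FOV_RADIANS = math.radians(45)
LED_SEPARATION_MM = 205  # assumed physical distance between the head-worn LEDs

def head_position(dot1, dot2):
    """Estimate head position (x, y, z in mm) relative to the Wiimote
    from the pixel coordinates of the two tracked infrared dots."""
    separation_px = math.hypot(dot1[0] - dot2[0], dot1[1] - dot2[1])
    # Angle subtended by one pixel, derived from the field of view.
    radians_per_px = FOV_RADIANS / CAMERA_WIDTH_PX
    # The farther the head, the smaller the apparent dot separation.
    z = (LED_SEPARATION_MM / 2) / math.tan(separation_px * radians_per_px / 2)
    # Midpoint of the dots, re-centered on the camera's optical axis.
    mid_x = (dot1[0] + dot2[0]) / 2 - CAMERA_WIDTH_PX / 2
    mid_y = (dot1[1] + dot2[1]) / 2 - CAMERA_HEIGHT_PX / 2
    x = math.sin(radians_per_px * mid_x) * z
    y = math.sin(radians_per_px * mid_y) * z
    return x, y, z
```

Feeding the resulting (x, y, z) into the renderer's projection each frame is what produces the perspective shift Lee demonstrates.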

The whiteboard application relies on a pen with an infrared LED in its tip. After a quick calibration--the subject of Lee's thesis--a computer can track where Lee is "drawing" on a wall, tabletop, or laptop screen.
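One standard way to do that calibration--a sketch of the likely approach, not Lee's actual code--is to touch the pen to four known screen corners and fit a homography that maps the Wiimote camera's coordinates to screen coordinates:

```python
import numpy as np

def fit_homography(camera_pts, screen_pts):
    """Solve for the 3x3 homography H mapping four camera-space
    calibration points to their known screen positions."""
    A = []
    for (x, y), (u, v) in zip(camera_pts, screen_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # H is the null vector of A: the singular vector with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(np.array(A))
    return vt[-1].reshape(3, 3)

def apply_homography(H, point):
    """Map a camera-space pen position into screen space."""
    u, v, w = H @ np.array([point[0], point[1], 1.0])
    return u / w, v / w
```

With H fitted once, every subsequent pen position the camera reports can be converted to a cursor position in real time.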

Perhaps the most mainstream potential comes with Lee's Wiimote-based multitouch user interface.

Lee attaches small reflectors to his fingertips, which the Wiimote's camera can track. The result is a user interface that can respond to gestures such as pinching and swiping. And by tracking four points, it enables the "multitouch" abilities that are all the rage with Apple's iPhone and MacBook Air as well as the Microsoft Surface "Milan" project.
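Once the fingertips show up as tracked points, a gesture like pinching falls out of the frame-to-frame change in the distance between them. An illustrative sketch (the function names and threshold are assumptions, not Lee's implementation):

```python
import math

def _spread(points):
    """Distance between two tracked fingertip points."""
    (x1, y1), (x2, y2) = points
    return math.hypot(x2 - x1, y2 - y1)

def detect_pinch(prev_points, curr_points, threshold=5.0):
    """Compare fingertip spread across consecutive frames and report
    'pinch-in', 'pinch-out', or None for no gesture."""
    delta = _spread(curr_points) - _spread(prev_points)
    if delta < -threshold:
        return "pinch-in"   # fingers moving together
    if delta > threshold:
        return "pinch-out"  # fingers moving apart
    return None
```

Swipes work the same way, but on the points' shared direction of motion rather than their separation.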

Lee's open-source work has traveled beyond his own domain. Cynergy Labs' Maestro project shows the Wiimote-based multitouch system in action. And his work has spawned a discussion site called Wiimote Project.

Lee is also notable for another practical design, a poor man's steadycam.

About the author

Stephen Shankland has been a reporter at CNET since 1998 and covers browsers, Web development, digital photography and new technology. In the past he has been CNET's beat reporter for Google, Yahoo, Linux, open-source software, servers and supercomputers. He has a soft spot in his heart for standards groups and I/O interfaces.

 
