Smart shoes step up the wearable-computing pace

A human-computer interaction conference explores novel gestures for operating computers -- including ShoeSense, a system that puts a smartphone sensor in a shoe.

This prototype demonstration shows from above how a person could use pinch gestures to operate a smartphone. In actual use, the phone would not need to be held in the other hand, and the shoe sensor would be small enough to fit in a shoe. Screen capture by Martin LaMonica/CNET

A group of researchers says shoes may be the next thing in the busy field of wearable computers and gesture interfaces.

Computer scientists from the Telekom Innovation Laboratories, the University of Munich, and the University of Toronto this week published a paper on ShoeSense, a wearable computing system for a smartphone.

It's one of many gesture interface-related papers being presented this week at the Conference on Human Factors in Computing Systems (CHI 2012), which is sponsored by the research arms of Microsoft, Google, eBay, and other tech companies.

What a ShoeSense sensor would look like. Telekom Innovation Laboratories

Wearable computing got a high-profile plug when Google introduced Project Glass, a set of glasses that does what a smartphone can but is apparently operated by eye gestures, head motions, and a button for taking photos, according to a demonstration from one of its makers.

Developing alternative inputs for smartphones makes sense when a person is moving or engaged in other tasks, such as driving, or when it's inappropriate to pull out a smartphone, such as during a family dinner, the ShoeSense developers write in their paper.

Its developers envision a shoe-mounted sensor that recognizes customizable hand and arm gestures. In a video, a user slides a finger along his forearm to turn up the volume on a music player in his pocket, pinches to select the next track, and then pinches with three fingers to send an "I will be late" e-mail to his wife.
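
In software terms, that kind of customizable vocabulary amounts to a lookup table from recognized gesture labels to phone actions. Here's a minimal Python sketch of the idea; the gesture names and the phone object are hypothetical stand-ins, not identifiers from the paper:

    # Hypothetical sketch of the gesture-to-action mapping the video implies;
    # gesture names and the phone object are illustrative stand-ins.
    class PhoneStub:
        def adjust_volume(self, delta): print(f"volume {delta:+d}")
        def next_track(self): print("next track")
        def send_template_email(self, text): print(f"e-mail: {text}")

    GESTURE_ACTIONS = {
        "forearm_slide_up": lambda phone: phone.adjust_volume(+1),
        "two_finger_pinch": lambda phone: phone.next_track(),
        "three_finger_pinch": lambda phone: phone.send_template_email("I will be late"),
    }

    def on_gesture(phone, gesture_name):
        # Dispatch a recognized gesture label to its configured phone action.
        action = GESTURE_ACTIONS.get(gesture_name)
        if action is not None:
            action(phone)

    on_gesture(PhoneStub(), "three_finger_pinch")  # prints: e-mail: I will be late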

Having a sensor device in a shoe has advantages over glasses in that it allows for eyes-free interaction, and it doesn't constrain body motions. ShoeSense's designers also think it can be more socially acceptable to operate a smartphone through arm and hand gestures than via glasses. Potentially, the sensor could even be powered by the wearer's walking motion.

"ShoeSense introduces a novel and unique perspective (from the shoe), making it possible to recognize discreet and relaxed, as well as large and demonstrative, gestures without the need for cumbersome hats or body-mounted sensors," according to the paper.

For a working demonstration, the sensor was actually a Microsoft Kinect game controller, which includes a depth camera able to recognize gestures, but the researchers envision shoe sensors small enough to be strapped onto shoelaces.
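
To give a sense of how a depth camera enables this, here's a deliberately simplified Python sketch that distinguishes hand poses by counting blobs of pixels within arm's reach of the sensor. It illustrates the general approach only; it is not the recognizer the researchers built:

    # Deliberately naive depth-frame classifier: segment pixels nearer than a
    # threshold and count the significant blobs. Illustrates the general idea
    # of depth-based hand sensing, not the ShoeSense recognizer itself.
    import numpy as np
    from scipy import ndimage

    def classify_hand(depth_frame_mm, near_mm=600.0):
        near_mask = (depth_frame_mm > 0) & (depth_frame_mm < near_mm)
        labeled, num_blobs = ndimage.label(near_mask)
        # Ignore tiny speckle blobs caused by sensor noise.
        sizes = ndimage.sum(near_mask, labeled, range(1, num_blobs + 1))
        significant = int(np.sum(sizes > 50))
        if significant == 0:
            return "no hand in range"
        return "fingers together (pinch-like)" if significant == 1 else "fingers apart"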

As sensors get smaller and less expensive, computer scientists are exploring a wide range of gesture-based interfaces. These can be used to interact with existing devices or with other objects in a building.

Also at CHI, Microsoft Research announced SoundWave, a way to operate a laptop using hand gestures by measuring how a moving hand shifts the frequency of inaudible sound waves emitted from the computer's speakers. Another paper covered Touché, a touch- and gesture-sensing system from Disney Research and Carnegie Mellon University that enables "smart doorknobs" and gesture-operated tabletops.
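
In outline, that sensing loop looks something like the following Python sketch: emit a tone, record the microphone, and check the spectrum for energy shifted away from the tone by a moving hand. The tone frequency, window length, and thresholds here are assumptions for illustration, not values from Microsoft's paper:

    # Rough sketch of Doppler-based sensing; parameters and thresholds are
    # guesses, not Microsoft's values. Uses the third-party sounddevice package.
    import numpy as np
    import sounddevice as sd

    FS = 48_000        # sample rate (Hz)
    TONE_HZ = 18_000   # pilot tone near the edge of human hearing
    DURATION = 0.5     # seconds per sensing window

    def sense_motion():
        t = np.arange(int(FS * DURATION)) / FS
        tone = 0.3 * np.sin(2 * np.pi * TONE_HZ * t).astype(np.float32)
        recorded = sd.playrec(tone, samplerate=FS, channels=1)  # play and record at once
        sd.wait()
        spectrum = np.abs(np.fft.rfft(recorded[:, 0]))
        freqs = np.fft.rfftfreq(len(recorded), 1 / FS)
        pilot = spectrum[np.abs(freqs - TONE_HZ).argmin()]
        # Energy 20-200 Hz away from the pilot suggests Doppler-shifted reflections.
        offset = np.abs(freqs - TONE_HZ)
        shifted = spectrum[(offset > 20) & (offset < 200)]
        return shifted.max() > 0.1 * pilot  # arbitrary detection threshold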

There were also a number of papers presented at CHI on touch screens, which, with the soaring popularity of tablets and smartphones, have become an active field of research.

  • Google Research introduced Gesture Coder, software designed to speed the development of applications with multitouch interfaces. From example gestures a developer demonstrates, it automatically generates recognition code, which can later be modified (see the sketch after this list).
In experiments, a customizable touch-screen keyboard lets people type faster than on a static touch keyboard. University of Washington, University of Maryland

  • Researchers from the University of Maryland and the University of Washington presented a paper on designing personalized touch-screen keyboards that let people type more quickly than on static touch-screen keyboards. Another study looked at how to augment touch-screen keyboards with user-defined gestures that could serve as shortcuts.
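
To give a flavor of what by-example gesture tooling could emit, here's a hypothetical Python sketch of generated recognizer code: a small state machine over touch events with empty callbacks for the developer to fill in. The structure and names are illustrative, not Gesture Coder's actual output:

    # Hypothetical flavor of generated recognizer code; the structure and
    # names are illustrative, not Gesture Coder's actual output.
    class TwoFingerPinchRecognizer:
        def __init__(self):
            self.state = "idle"
            self.start_distance = None

        def on_touch_event(self, touches):
            # `touches` is a list of (x, y) points currently on the screen.
            if len(touches) != 2:
                self.state = "idle"
                return
            (x1, y1), (x2, y2) = touches
            distance = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            if self.state == "idle":
                self.state, self.start_distance = "tracking", distance
            elif distance > 1.2 * self.start_distance:
                self.on_pinch_open()   # developer fills in the app's response
            elif distance < 0.8 * self.start_distance:
                self.on_pinch_close()  # developer fills in the app's response

        def on_pinch_open(self): pass
        def on_pinch_close(self): pass
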
A lot of the research in computing interfaces and wearable computers reflects the new possibilities for bridging the digital and physical worlds with sensors such as Kinect. But whereas multitouch devices have caught on rapidly, gesture interfaces outside of video games still face the challenge of finding real-world uses and applications.

     
