Lumus smartglasses to get EyeSight gesture recognition

Israeli gesture-control company EyeSight believes hand and finger motions are a more versatile way to control wearable computers than voice or touch, and Lumus agreed.

Ari Grobman, Lumus' director of business development, demonstrates gesture control on his company's smartglasses. Stephen Shankland/CNET

BARCELONA, Spain -- One of the difficulties with wearable computing is that it's hard to control a device that has no keyboard or touch screen attached. That difficulty is how gesture-control company EyeSight Mobile won a place in Lumus' smartglasses.

With the technology, a person can hold out a finger to tap on icons or swipe away notifications in the virtual view the Lumus glasses present. EyeSight later plans to add the ability to drag items around the display.
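EyeSight hasn't published its developer interface, but the interaction model it describes maps naturally onto event-driven code: the recognizer reports a gesture and a position, and the app decides what that means for whatever is drawn there. Here's a minimal Python sketch of that idea; every name in it is hypothetical, not EyeSight's actual API:

```python
# Hypothetical sketch of gesture-to-action dispatch; the event type, fields,
# and Icon class are invented for illustration and are not EyeSight's API.
from dataclasses import dataclass

@dataclass
class GestureEvent:
    kind: str        # "tap", "swipe", or "drag"
    x: float         # fingertip position in display coordinates
    y: float
    dx: float = 0.0  # motion since the last event, used by swipe/drag
    dy: float = 0.0

@dataclass
class Icon:
    name: str
    x: float
    y: float
    radius: float = 40.0

    def hit(self, x: float, y: float) -> bool:
        return (x - self.x) ** 2 + (y - self.y) ** 2 <= self.radius ** 2

def dispatch(event: GestureEvent, icons: list) -> None:
    """Route a recognized gesture to the icon under the fingertip."""
    for icon in icons:
        if icon.hit(event.x, event.y):
            if event.kind == "tap":
                print(f"open {icon.name}")
            elif event.kind == "swipe":
                print(f"dismiss {icon.name}")
            elif event.kind == "drag":  # the capability EyeSight plans to add
                icon.x, icon.y = icon.x + event.dx, icon.y + event.dy
            return

icons = [Icon("notifications", 100, 100), Icon("maps", 300, 100)]
dispatch(GestureEvent("tap", 105, 98), icons)            # prints "open notifications"
dispatch(GestureEvent("swipe", 298, 110, dx=80), icons)  # prints "dismiss maps"
```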

"You can actually touch the icons in the air with your fingers," EyeSight Chief Executive Gideon Shmuel told CNET.

The companies revealed the partnership here at Mobile World Congress. EyeSight expects its gesture-recognition software to ship with the glasses, Shmuel said.

EyeSight CEO Gideon Shmuel at Mobile World Congress 2014. Stephen Shankland/CNET

The battery-powered Lumus glasses mount a transparent 640x480 display onto the lens; the wearer sees information overlaid on the real world, and head tracking changes what's shown according to the wearer's orientation.
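Neither company has detailed how the head tracking drives the display, but the basic pattern is simple: read the wearer's heading from an orientation sensor and show the content pinned to that direction. A rough Python sketch under that assumption, with the panel names and headings made up for illustration:

```python
# Hypothetical sketch of orientation-driven overlays; Lumus hasn't documented
# this plumbing, and the panels and headings below are made up.
PANELS = {0: "navigation", 90: "notifications", 180: "calendar", 270: "music"}

def panel_for_yaw(yaw_degrees: float) -> str:
    """Pick the panel whose compass heading is closest to the wearer's gaze."""
    yaw = yaw_degrees % 360
    nearest = min(PANELS, key=lambda h: min(abs(yaw - h), 360 - abs(yaw - h)))
    return PANELS[nearest]

# On real hardware the yaw would come from a head-tracking sensor, such as
# Android's rotation-vector sensor; here we just feed in sample angles.
for yaw in (10, 95, 200, 355):
    print(f"{yaw:3} degrees -> {panel_for_yaw(yaw)}")
```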

The glasses have a camera and an OMAP 4 processor, and they run Android 4.1.2. That's enough horsepower to run EyeSight's gesture-recognition software, which is smart enough to recognize fingers and hands even against a cluttered, moving backdrop.
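EyeSight's algorithm is proprietary, so the following is only a generic illustration of the problem it solves: one textbook way to pick out a hand against a busy scene is to model the background, pull out the largest moving blob, and treat its topmost point as the fingertip. A short sketch in Python with OpenCV 4, emphatically not EyeSight's method, and naive about the fact that a head-mounted camera moves too:

```python
import cv2

# Generic illustration, not EyeSight's algorithm: segment moving foreground
# with an adaptive background model and treat the topmost point of the
# largest blob as the fingertip. Assumes a mostly steady scene; a real
# recognizer on moving, head-mounted glasses needs far more than this.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
capture = cv2.VideoCapture(0)  # webcam stand-in for the glasses' camera
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # drop speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if contours:
        hand = max(contours, key=cv2.contourArea)
        if cv2.contourArea(hand) > 2000:  # ignore small flickers
            x, y = min(hand.reshape(-1, 2), key=lambda p: p[1])  # topmost point
            cv2.circle(frame, (int(x), int(y)), 8, (0, 255, 0), 2)
    cv2.imshow("fingertip", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```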

Ari Grobman, Lumus' director of business development, said the glasses are scheduled to ship later this year.

With the EyeSight software, a person wearing the Lumus glasses could do things like browse Facebook, play games, or control navigation instructions shown in a head-up display.

Ari Grobman, Lumus' director of business development, wearing his company's smartglasses at Mobile World Congress. Stephen Shankland/CNET

Shmuel said EyeSight has other wearable-computing partnerships as well, but declined to detail them at this stage.

People might feel a bit silly waving their hands around in front of their glasses, but it's possible to add a second, downward-pointing camera that would let people control the system with a hand held more discreetly at waist level, Shmuel said. And it's better than voice control, which fails in noisy environments, or touch control, which is mostly limited to taps and linear swipes, he argued.

Of course, Google might disagree -- its Google Glass computerized eyewear uses voice and touch controls. But it's pretty clear that whatever the future of wearable computing holds, the interfaces used to control these devices have yet to settle down into something as ordinary as PC mice and phone touchscreens are today.

Ari Grobman, Lumus' director of business development, showing his smartglasses' EyeSight gesture-control interface. Stephen Shankland/CNET