Lumus smartglasses to get EyeSight gesture recognition

Israeli gesture-control company EyeSight believes hand and finger motions are a more versatile way to control wearable computers, and Lumus agreed.

Stephen Shankland
Ari Grobman, Lumus' director of business development, demonstrates gesture control to run his company's smartglasses. Stephen Shankland/CNET

BARCELONA, Spain -- One of the difficulties with wearable computing is that it can be hard to control devices that don't have a handy keyboard or touch screen attached. And that's how gesture-control company EyeSight Mobile won a place in Lumus' smartglasses.

With the technology, a person can hold out a finger to tap on icons or swipe away notifications in the virtual view the Lumus glasses present. Later, EyeSight plans to add the ability to drag items around the display, too.

"You can actually touch the icons in the air with your fingers," EyeSight Chief Executive Gideon Shmuel told CNET.

The companies revealed the partnership here at the Mobile World Congress. EyeSight expects its gesture recognition software will ship with the glasses, Shmuel said.

EyeSight CEO Gideon Shmuel at Mobile World Congress 2014. Stephen Shankland/CNET

The Lumus glasses mount a transparent 640x480 display onto the lens of the battery-powered, head-tracking eyewear; the wearer sees information overlaid on the real world, and the glasses change what's shown according to the wearer's orientation.

The glasses have a camera, an OMAP 4 processor, and Android 4.1.2. That's enough horsepower to run EyeSight's gesture recognition software, which is smart enough to recognize fingers and hands even against a cluttered, moving backdrop.

Ari Grobman, Lumus' director of business development, said the glasses are scheduled to ship later this year.

With the EyeSight software, a person wearing the Lumus glasses could do things like browse Facebook, play games, or control navigation instructions shown in a head-up display.

Ari Grobman, Lumus' director of business development, wearing his company's smartglasses at Mobile World Congress. Stephen Shankland/CNET

Shmuel said EyeSight has other wearable-computing partnerships as well but declined to detail them at this stage.

People might feel a bit silly waving their hands around in front of their glasses, but it's possible to add a second, downward-pointing camera that would let people control the system with a hand held more discreetly at waist level, Shmuel said. And it's better than voice control, which fails in noisy environments, or touch control, which is mostly limited to taps and linear swipes, he argued.

Of course, Google might disagree -- its Google Glass computerized eyewear uses voice and touch controls. But it's pretty clear that whatever the future of wearable computing, the interfaces used to control these devices have yet to settle down into something as ordinary as PC mice and phone touchscreens are today.

Ari Grobman, Lumus' director of business development, showing his smartglasses' EyeSight gesture-control interface. Stephen Shankland/CNET