
BARCELONA, Spain -- One of the difficulties with wearable computing is that it can be hard to control devices that don't have a handy keyboard or touch screen attached. And that's how gesture control company EyeSight Mobile won a place in Lumus' smartglasses.
With the technology, a person can hold out a finger to tap icons or swipe away notifications in the virtual view the Lumus glasses present. Later, EyeSight plans to add the ability to drag items around the display, too.
"You can actually touch the icons in the air with your fingers," EyeSight Chief Executive Gideon Shmuel told CNET.
The companies revealed the partnership here at Mobile World Congress. EyeSight expects its gesture recognition software will ship with the glasses, Shmuel said.
The battery-powered Lumus glasses mount a transparent 640x480 display onto the lens; the wearer sees information overlaid on the real world, and built-in head tracking changes what's shown according to the wearer's orientation.
The glasses have a camera, an OMAP 4 processor, and Android 4.1.2. That's enough horsepower to run EyeSight's gesture recognition software, which is smart enough to recognize fingers and hands even against a cluttered, moving backdrop.
Ari Grobman, Lumus' director of business development, said the glasses are scheduled to ship later this year.
With the EyeSight software, a person wearing the Lumus glasses could do things like browse Facebook, play games, or control navigation instructions shown in a head-up display.
Shmuel said EyeSight has other wearable-computing partnerships as well, but declined to detail them at this stage.
People might feel a bit silly waving their hands around in front of their glasses, but it's possible to add a second, downward-pointing camera that would let people control the system with a hand held more discreetly at waist level, Shmuel said. And it's better than voice control, which fails in noisy environments, or touch control, which is mostly limited to taps and linear swipes, he argued.
Of course, Google might disagree -- its Google Glass computerized eyewear uses voice and touch controls. But it's pretty clear that whatever the future of wearable computing looks like, the interfaces used to control these devices have yet to settle down into something as ordinary as PC mice and phone touchscreens are today.