If you need to tell a digital device what to do but can't use a keyboard, voice is the first option that springs to mind. You can use it for phones, digital assistants and smart speakers.
But some devices are getting digital eyes in addition to digital ears. That's the vision, so to speak, of Israeli startup EyeSight Technologies. EyeSight has mostly dropped its previous efforts to build its gesture-recognition tech into phones, TVs and PCs. But it's got a new angle on the market: connected devices in homes and cars.
On Monday, Sony and EyeSight plan to announce that the Japanese electronics maker has built the vision technology into an interactive portable projector called the Xperia Touch. The device already can project an image onto a wall or other surface and then use infrared light sensors to see how you're interacting with the image on that surface -- slicing up fruit in Fruit Ninja or playing a virtual piano keyboard, for example.
EyeSight's upgrade will now let you control the content from a distance, without touching the projected image's surface. Think Tom Cruise in "Minority Report."
You may already be familiar with gesture tech from video game peripherals like Microsoft's Kinect, but the technology has yet to reach other computing devices in any widespread way. The addition of gesture control to the Xperia Touch shows that the tech industry is still hard at work trying to find a better, more natural way for us to interact with our devices. Keyboards and mice show no signs of disappearing, but they can't work in every situation.
EyeSight is particularly fired up about its gesture-detection interface for use in our increasingly tech-laden cars. Hand positions and movements can be easier to use than a textureless touchscreen that forces a driver to look away from the road, EyeSight Chief Executive Gideon Shmuel said.
"With your finger, you can do a small circle motion to the right or left to turn volume up and down," he said. You could flash one finger to call home, two fingers to call your office or hold your palm flat to answer an incoming call.
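The gesture-to-action mapping Shmuel describes could be sketched as a simple dispatch table. This is a hypothetical illustration only: the gesture labels and action names below are invented for the sketch, not EyeSight's actual API.

```python
# Hypothetical dispatch table mapping recognized gesture labels to
# car-system actions, loosely following the examples Shmuel gives.
GESTURE_ACTIONS = {
    "circle_right": "volume_up",
    "circle_left": "volume_down",
    "one_finger": "call_home",
    "two_fingers": "call_office",
    "palm_flat": "answer_call",
}

def handle_gesture(gesture: str) -> str:
    """Return the action for a recognized gesture, or 'ignore' if unknown."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```

The point of such a table is that the recognition layer only needs to emit a label; the mapping from label to action stays a small, easily changed configuration.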
EyeSight is working with LG Electronics, which supplies some car computing technology, to build gesture recognition into vehicles. And once you have brainy cameras watching the driver, there are other things you can do, too -- like watch for drooping eyelids and fast blink rates that indicate a dangerously sleepy driver.
"Driver monitoring we'll see in aftermarket products later this year," Shmuel said. But for it to be built directly into the car, we'll have to wait until 2019 or 2020, he said.
But it's in the Xperia Touch today. EyeSight obviously likes gestural interface technology, but Shmuel says it can also work well in conjunction with voice controls. That's how Sony sees it.
Combines with voice control
"We aim to offer our customers the most intelligent and advanced user experiences, and with the synergy between our touch and voice-based interaction and EyeSight's touch-free gestures, we're able to fulfill that promise," said Hiroshi Ito, deputy head of Sony Mobile's Smart Product Business Group.
Just how successful the idea will be isn't clear yet. It didn't catch on with TVs or PCs, even though manufacturers are desperate for a way to get their products to stand out from the pack.
Shmuel, though, said the new strategy is working.
"There is such a strong pull from the automotive space," Shmuel said, and there's strong interest from makers of connected digital devices for the home. "Between those two markets, we are flooded," he said.