
What's behind next-gen mobile gestures? Ultrasound (Hands-on)

Touchless gestures like scrolling and advancing a photo or music track are about to get a lot more interesting, according to Elliptic Labs, which showed off its ultrasound technology at CES 2014.

Jessica Dolcourt

LAS VEGAS -- Waving your hand over a phone or tablet instead of touching the screen is cool in theory, but often jerky or temperamental in practice. Elliptic Labs is rewriting the rules of touch-free gestures using the sense of sound.

When you turn on gesture control, ultrasonic speakers begin emitting sound waves above 20kHz, outside the range of human hearing. These waves hit your hand, then bounce back to the listening microphones. From there, software turns signals into action.
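Elliptic Labs hasn't published its algorithms, but the basic echo-ranging idea is straightforward enough to sketch. Here's a minimal, illustrative Python example (the sample rate, tone frequency, and function names are all hypothetical) that estimates how far away a hand is from the round-trip delay of an ultrasonic ping:

```python
import numpy as np

FS = 192_000            # sample rate in Hz (hypothetical; must exceed 2x the tone frequency)
TONE_HZ = 40_000        # ultrasonic carrier, well above the ~20 kHz limit of human hearing
SPEED_OF_SOUND = 343.0  # meters per second in air at room temperature

def make_ping(duration_s=0.002):
    """Generate a short ultrasonic ping to play through the speaker."""
    t = np.arange(int(FS * duration_s)) / FS
    return np.sin(2 * np.pi * TONE_HZ * t)

def estimate_hand_distance(ping, mic_recording):
    """Estimate hand distance from the echo delay in the microphone signal.

    Cross-correlating the recording against the transmitted ping finds the
    sample offset where the echo lines up; half the round-trip time, times
    the speed of sound, gives the distance to the hand.
    """
    corr = np.correlate(mic_recording, ping, mode="valid")
    delay_samples = np.argmax(np.abs(corr))
    round_trip_s = delay_samples / FS
    return SPEED_OF_SOUND * round_trip_s / 2

# Example: a simulated echo arriving 1.5 ms after the ping (~26 cm away)
ping = make_ping()
echo = np.concatenate([np.zeros(int(FS * 0.0015)), 0.3 * ping, np.zeros(1000)])
print(f"Estimated distance: {estimate_hand_distance(ping, echo):.2f} m")
```

A real implementation would run continuously on a stream of microphone samples rather than one ping at a time, but the principle is the same: the delay of the echo encodes the distance.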

Touchless input systems generally rely on camera optics or infrared to read gestural commands, which means you need to be fairly close to the screen, and fairly precise, to trigger an action. Sound waves, however, give you a much larger field of action to work with, and ambient light, or the lack of it, makes no difference at all.

Elliptic Labs says it can give device-makers up to 180 degrees of sensitivity, which means you can wag your fingers and hands all around the device from inches away from the screen. There is such a thing as too much sensitivity, though: OEMs can rein in and fine-tune the parameters so users get fewer unintended results, a must for people who talk with their hands.
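What might that fine-tuning look like in practice? Elliptic Labs didn't share specifics, but here's an illustrative sketch of the kind of tunable thresholds an OEM could expose; every name, parameter, and default below is hypothetical:

```python
from dataclasses import dataclass
import time

@dataclass
class GestureTuning:
    """Hypothetical OEM-tunable parameters for suppressing accidental gestures."""
    max_range_m: float = 0.3     # ignore motion farther than this from the screen
    min_speed_m_s: float = 0.15  # ignore slow drift, e.g. a hand resting nearby
    cooldown_s: float = 0.5      # minimum gap between two accepted gestures

class GestureFilter:
    def __init__(self, tuning: GestureTuning):
        self.tuning = tuning
        self._last_accepted = 0.0

    def accept(self, distance_m: float, speed_m_s: float) -> bool:
        """Return True only for motion that clears every tuned threshold."""
        now = time.monotonic()
        if distance_m > self.tuning.max_range_m:
            return False
        if abs(speed_m_s) < self.tuning.min_speed_m_s:
            return False
        if now - self._last_accepted < self.tuning.cooldown_s:
            return False
        self._last_accepted = now
        return True

# A phone-maker might ship tighter defaults for people who talk with their hands:
conservative = GestureFilter(GestureTuning(max_range_m=0.15, min_speed_m_s=0.3))
```

Tightening any one of these knobs trades reach and responsiveness for fewer false triggers, which is exactly the balance the company says OEMs will tune.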

Even with a more constrained field of motion, sound waves allow for more elaborate gestures. Spinning your hand up or down could control volume, for example, and spinning to the side could launch another action.
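The company hasn't detailed how it classifies motion, but a standard trick for reading direction from ultrasound is the Doppler shift: a hand moving toward the device raises the echo's frequency, and a hand moving away lowers it. Here's a minimal, hypothetical classifier along those lines, reusing the carrier tone from the earlier sketch (the thresholds and action names are illustrative):

```python
import numpy as np

FS = 192_000
TONE_HZ = 40_000

def dominant_frequency(mic_samples):
    """Find the strongest frequency near the carrier via an FFT peak."""
    spectrum = np.abs(np.fft.rfft(mic_samples))
    freqs = np.fft.rfftfreq(len(mic_samples), d=1 / FS)
    # Look only near the carrier so speech and ambient noise don't dominate
    band = (freqs > TONE_HZ - 2000) & (freqs < TONE_HZ + 2000)
    return freqs[band][np.argmax(spectrum[band])]

def classify_motion(mic_samples):
    """Map the Doppler shift of the echo to a coarse gesture.

    A hand moving toward the device compresses the reflected wave
    (higher frequency); a hand moving away stretches it (lower).
    Needs a window long enough (tens of ms) to resolve small shifts.
    """
    shift = dominant_frequency(mic_samples) - TONE_HZ
    if shift > 50:
        return "volume_up"    # hand approaching
    if shift < -50:
        return "volume_down"  # hand receding
    return None               # no deliberate motion

# Example: a 100 ms echo shifted up 120 Hz, as if a hand were approaching
t = np.arange(int(FS * 0.1)) / FS
echo = np.sin(2 * np.pi * (TONE_HZ + 120) * t)
print(classify_motion(echo))  # -> "volume_up"
```

Combining several microphones would let software tell up-and-down from side-to-side motion, which is presumably how gestures in different directions can launch different actions.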

Another advantage? These integrated ultrasound speakers are low power, which means they'll sip at your battery stores rather than gulp.

Elliptic Labs unveiled its demo smartphone and tablet at CEATEC in Japan this past October, and says it's actively working with device-makers to incorporate the technology in 2014. Since the Samsung Galaxy S4 was the demo smartphone, the forthcoming flagship Samsung Galaxy S5 could very well be a debut device.