If the Samsung Galaxy S4 rumors pan out, Samsung's newest smartphone may let people interact with the screen using just their eyes.
Eye-tracking uses the camera to lock onto the motion of a user's peepers, following wherever they move. With it, the phone can perceive where the user is looking and respond to a set of behaviors -- say, a deliberate movement to scroll a Web page up or down, or a long, purposeful blink to click.
If your eyes have reached the bottom of a page, eye-tracking software could automatically scroll you down the following paragraphs of text.
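The scroll-on-gaze behavior described above can be sketched in a few lines of logic. To be clear, this is a hypothetical illustration, not Samsung's or Umoove's actual implementation; the normalized gaze coordinate, threshold, and scroll step are all assumptions.

```python
# Hypothetical sketch of scroll-on-gaze logic. A real eye-tracker would
# supply "gaze_y", an assumed normalized vertical gaze position
# (0.0 = top of screen, 1.0 = bottom).

SCROLL_ZONE = 0.9   # assumed threshold: gaze in the bottom 10% triggers a scroll
SCROLL_STEP = 120   # assumed scroll distance in pixels per trigger

def auto_scroll(scroll_offset, gaze_y, page_height, viewport_height):
    """Return an updated scroll offset, advancing the page when the
    reader's gaze reaches the bottom of the visible area."""
    if gaze_y >= SCROLL_ZONE:
        # Don't scroll past the end of the page.
        max_offset = max(0, page_height - viewport_height)
        return min(scroll_offset + SCROLL_STEP, max_offset)
    return scroll_offset
```

The same pattern generalizes to the other behaviors mentioned here: map a tracked signal (gaze position, blink duration, looking away) to a threshold, then fire the corresponding UI action when it's crossed.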
This type of technology -- which had been researched for desktop computing long before it was conceived of for the smaller smartphone screen -- has been demoed for a variety of actions: zooming in or out, pausing a video by looking away from a screen, and playing games.
One company, Umoove, has already posted a demo video showing how eye-tracking navigation could work (below).
That's not to say this is the exact implementation Samsung would use, if it integrates eye-tracking software at all, but the video does help us visualize the pros, cons, and use cases of "perceptual computing" with this type of gesture-based software.
Software that's hard to perfect
We'll be the first to admit that weaving and bobbing your head to interact with the screen looks a little silly, but there are a few practical use cases, particularly if you're the type of person who's often busy with your hands. It's also a potentially useful accessibility feature.
From a business perspective, eye-tracking software also has interesting ramifications for advertising, potentially allowing companies to tailor ads based on the parts of a story or screen where people actually look.
However, there are also plenty of possible cons. Since the technology is still in its early days, commanding the screen with a come-hither look won't always be accurate. Just think of the issues users have had with Apple's Siri and Samsung's S Voice assistants.
Movements could look awkward in public, and distractions could easily keep your orbs darting this way and that, interfering with the tracking software's behavior. Battery life is also an issue, since the phone would have to be awake to keep an eye on you.
Just how likely is this?
We don't have any insider information on this, but eye-tracking is just the kind of feature Samsung would include in its handset.
Why not? The Galaxy S3 has SmartStay, which, if enabled, keeps the screen from dimming while you're looking at it. Like other Android phones, the Galaxy S3 also includes rudimentary facial recognition to unlock the screen.
All that's in addition to a long list of optional physical gestures that use sensors like the accelerometer to pan and zoom when you move the phone, and mute a call or song when you flip the device over.
For Samsung, a company all about staying ahead of the pack, being one of the first to use a feature like eye-tracking would be a big win -- whether anyone really uses it or not.
"Innovation is quite difficult to achieve in these devices when you're effectively using the same software platform as everyone else and the same underlying hardware," Ovum analyst Tony Cripps said. "These investments are perceived as important in that they provide some kind of differentiation from rival devices in the market."
A recent Bloomberg article reports that eye-scrolling (one application of eye-tracking) won't make it into the Galaxy S4, but there's a strong chance that future devices could feature it.
As more and more handset-makers look for ways to innovate, expect to see more visual gestures creep into a smartphone's bag of tricks.