Google developing Lookout app to aid the visually impaired

The Android app will offer users spoken clues to the objects, text and people around them.

Steven Musil, Night Editor / News
Google Lookout aims to tell the blind and visually impaired more about their surroundings and the text around them. (Image: Google)

Google is developing an app it hopes will help the millions of blind and visually impaired people in the world become more independent.

The web giant said Tuesday it is working on an Android app called Lookout that offers users auditory clues to the objects, text and people around them. The app is designed to be used with a device worn in a shirt pocket or hanging on a lanyard around a user's neck, with its camera pointing away from the body.

Lookout is the latest in a string of smartphone apps that in recent years have replaced the expensive dedicated devices blind users once relied on to magnify computer screens, speak navigation directions, identify money and recognize the color of clothing. Today, smartphones paired with apps and accessories help the blind navigate their physical and online worlds.

"Lookout delivers spoken notifications, designed to be used with minimal interaction allowing people to stay engaged with their activity," Patrick Clary, product manager for Google's Central Accessibility Team, wrote in a company blog post.

Lookout operates in four modes based on the user's current activity: Home, Work & Play, Scan and Experimental. After selecting a mode, the user will be told about objects the app senses around them, such as the location of a sofa at home or the elevator at work.


The app's Scan feature can also read text, such as a recipe from a cookbook, while its Experimental mode allows users to tinker with features in development.

The app doesn't require an internet connection to operate and uses machine learning to learn what people are interested in hearing about.
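Google hasn't described Lookout's internals, but the general pattern the article mentions, on-device recognition paired with spoken output, can be sketched with publicly available Android tools. The Kotlin snippet below is a minimal, hypothetical illustration that reads a camera frame aloud using ML Kit's on-device text recognition and Android's built-in TextToSpeech; the class and method names are invented for this example and are not Google's Lookout code.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Hypothetical helper: recognizes printed text in a camera frame entirely
// on-device and speaks it aloud. Illustrative only, not Lookout itself.
class SpokenTextReader(context: Context) {

    // ML Kit's Latin-script text recognizer runs without a network connection.
    private val recognizer =
        TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

    private var ttsReady = false
    private val tts = TextToSpeech(context) { status ->
        ttsReady = (status == TextToSpeech.SUCCESS)
    }

    // Pass a camera frame (e.g. captured via CameraX) plus its rotation.
    fun readAloud(frame: Bitmap, rotationDegrees: Int) {
        val image = InputImage.fromBitmap(frame, rotationDegrees)
        recognizer.process(image)
            .addOnSuccessListener { visionText ->
                val text = visionText.text
                if (ttsReady && text.isNotBlank()) {
                    // Speak the recognized text, replacing anything queued.
                    tts.speak(text, TextToSpeech.QUEUE_FLUSH, null, "lookout-demo")
                }
            }
            .addOnFailureListener {
                // Recognition failed for this frame; simply skip it.
            }
    }

    fun shutdown() {
        recognizer.close()
        tts.shutdown()
    }
}
```

Because both the recognizer and the speech engine live on the device, a pipeline like this can keep working offline, which matches the article's point that Lookout doesn't need an internet connection.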

The app is expected to be available in the Google Play Store later this year.
