Google is developing an app it hopes will help the millions of blind and visually impaired people in the world become more independent.
The web giant said Tuesday it is working on an Android app called Lookout that offers users auditory cues to the objects, text and people around them. The app is designed to be used with a device worn in a shirt pocket or hanging on a lanyard around the user's neck, with its camera pointing away from the body.
Lookout is the latest in a string of smartphone apps that have in recent years replaced the expensive assistive devices blind users once relied on to magnify their computer screens, speak navigation directions, identify their money and recognize the color of their clothes. Today, smartphones paired with apps and accessories help the blind navigate their physical and online worlds.
"Lookout delivers spoken notifications, designed to be used with minimal interaction, allowing people to stay engaged with their activity," Patrick Clary, product manager for Google's Central Accessibility Team, wrote in a company blog post.
Lookout operates in four modes based on the user's current activity: Home, Work & Play, Scan and Experimental. After selecting a mode, the user will be told about objects the app senses around them, such as the location of a sofa at home or the elevator at work.
The app's Scan feature can also read text, such as a recipe from a cookbook, while its Experimental mode allows users to tinker with features in development.
The app doesn't require an internet connection to operate and uses machine learning to learn what people are interested in hearing about.
The app is expected to be available in the Google Play Store later this year.