For Mike May, who is blind, navigating new spaces can be a challenge. A few weeks ago, he went to a work event at a brewery and had a hard time figuring out where to go.
Thankfully, he had a pair of Envision smart glasses on him, which use artificial intelligence to help people who are blind or visually impaired better understand their surroundings. Using a small camera on the side, the glasses can scan objects, people and text, then relay that information via a small built-in speaker. Envision can tell you if someone is approaching, for instance, or describe what's in a room.
May was using a feature on the glasses called Ally, which lets him start video calls with friends and family to get help.
"I called up one of my colleagues, Evelyn, and said, 'What do you see?' and she described the environment to me," said May, chief evangelist at accessible navigation company Goodmaps. "She told me where the tables were and just gave me the lay of the land."
Envision Glasses are built on the enterprise edition of Google Glass. (Yes, Google Glass is still alive.) Google unveiled the smart glasses back in 2013, touting them as a way for users to take calls, send texts, snap pictures and look at maps, among other things, right from the headset. But after a limited -- and unsuccessful -- release, they never hit store shelves.
A few years later, Google started working on an enterprise edition of the glasses, which is what Envision is built on. Their wearable design makes them ideal for capturing and relaying information as a user would see it.
"What Evision Glasses essentially does is takes in all the visual information that's around, tries to process that information, and then speaks it out to the user," says Karthik Kannan, Envision's co-founder.
There are a handful of other apps designed to help people who are blind or low-vision, including Google's Lookout app, which can identify food labels, find objects in a room and scan documents and money. Be My Eyes is another app that connects people who are blind or visually impaired to sighted volunteers, who can help them get around via live chat.
But Envision's goal is to make those experiences more intuitive. The headset design frees up people's hands so they can more easily hold a cane or walk a dog, and the camera is conveniently right next to your eyes, so you don't have to hold a phone up to scan your surroundings.
"It's narrating nonstop when you're walking down a busy street about signs on the side of a bus or a taxi or on the side of a building or on the ground," May says. "There's this whole stream of information."
Envision Glasses cost $3,500, and you can order them on the company website or from a distributor. Alternatively, you can opt to use the Envision app, which also scans text and tells you about your surroundings using your phone's camera. The app costs $20 for a one-year subscription, or $99 for a lifetime subscription.
To use the glasses, you'll need to open the Envision app and link the glasses via Bluetooth. Then, connect the glasses to Wi-Fi, and you're good to go. You only have to do this once. After that, you won't even need to carry your phone around for the glasses to work. To teach Envision to recognize faces, have people take selfies in the app and then enter their name. After that, the glasses will speak aloud that person's name when they're in frame.
The company's goal is to bring other apps, like Aira, to the glasses. Aira is a service that connects people to trained agents who can see what's around them using their phone's camera. An integration with Envision would mean users could connect to Aira directly from the glasses instead. Envision is also in talks with navigation apps to try to bring their services onto the glasses.
"Anyone who is in the assistive technology space and they're building apps, they can easily come onto the Envision glasses and build as well," Kannan says.
May, who says tech makes him feel like "a kid in a candy store," loves the independence Envision affords.
"I do really like the feeling that a kid gets, which is, 'Aha, I did it myself.'"