Google Lens' future could be discovery, maps and AR glasses

The head of Google’s new visual search tool, which lets you find information using your phone’s camera, talks about what could be next.

Richard Nieva

Lens, Google's visual search tool, is hitting shelves with the Pixel 2 this week.

Google

Google Lens is here, and it promises to do no less than change the way you find information about the world around you.

Announced in May by the company behind the world's biggest search engine, Lens lets you pull up details about things in the real world by pointing your phone's camera at them. Take a picture of a book, and up pops info about how to buy it, who published it and recent reviews.

There's one hitch: The software only works -- at least for now -- on Google's new Pixel 2 phones, which hit store shelves on Thursday. Google is calling it a "preview" and says the software will eventually work with other phones running its Android software, but it's not saying exactly when. And it will also work -- someday -- on Apple's iPhone, Aparna Chennapragada, who heads up Google Lens, said in an interview last week.

This isn't the first time Google has tried to develop a camera-based search product. Google Goggles, a visual search app for Android, debuted in 2009 but hasn't been updated since 2014. Google Glass, introduced in 2012, layered information and graphics right in front of your eyes when you wore the headset, but the $1,500 device failed spectacularly before it left the prototype stage because people worried about privacy and piracy.

But Google Lens feels like the search giant's first true entry into augmented reality, the superimposing of digital graphics over the real world. Lots of things have changed since those earlier, ill-fated attempts. Voice and image recognition have gotten a lot better. Smartphones can handle more intense computing demands.

"We thought, can you have the camera be the browser for the world around you?" Chennapragada says.

Google's rivals are in the fray too. Last week Snapchat, which pioneered AR for teens and young adults, unveiled Context Cards, a sort of visual search that links people with restaurant reviews or Lyft rides based on what's in their snaps. Apple has an AR platform called ARKit for software makers to develop apps for iPhones. And Facebook has a similar platform, called Camera Effects, for developers to create AR experiences for its social network.

But while the rest of the tech giants are trying things out, Google hopes its 19-year history as a search company will give it a leg up. "Google arguably has the best track record to have something to build on," says Jan Dawson, principal analyst at Jackdaw Research. "It looks very compelling."

When Google first showed off Lens, one of the big crowd-pleasers was a video of someone taking a picture of a Wi-Fi password on a router, with Lens automatically connecting the phone to the network. Chennapragada says that while her team had big ambitions for image recognition and visual search, many users were most excited about having an easy way to cut and paste anything they see through the phone's camera. Another Lens feature lets you take a photo of a business card and extract the name, email and phone number from it.

"It's the first time you can actually bridge the real world to your phone in a really interesting way," she adds.


To activate Lens in Google Photos, tap the icon at the bottom of the screen.

Screenshots by Jason Cipriani/CNET

How it works

Lens isn't an app or product on its own but a feature that's going to be built into several Google services. The first app to get Lens is Google Photos. Here's how it works: First, take a photo. When you view the picture, you'll see the Lens icon at the bottom of the screen (it looks kind of like the Instagram logo, but in black and white).

It works well with books. I took a photo of "The Facebook Effect" by David Kirkpatrick, and Lens showed me author and review information.

If you take a picture of something with little or no text on it, the software will do its best to figure out what's in the picture. I took a photo of the unfinished Salesforce Tower outside my office window in San Francisco (admittedly, probably an unfair test, since the tower is still under construction). Lens couldn't identify it specifically and instead pulled up pictures of other skyscrapers.

In another test, I took a photo of the Bart Simpson figurine on my desk. Lens identified the object as "figurine" and surfaced pictures of other toy figures with similar colors, including two elves and Foghorn Leghorn from Looney Tunes. But the list did include one photo of a Bart Simpson figurine.

The software still needs to improve, but its attempts were valiant.
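
Again, Lens' own models aren't open to developers, but the behavior in these tests -- a generic label like "figurine" plus a grid of visually similar pictures -- resembles what Cloud Vision's label detection and web detection return. A rough sketch under that assumption, with a hypothetical filename:

```python
# Sketch of the kind of result seen in the figurine test: generic labels
# plus visually similar images from around the web. Uses the public Cloud
# Vision API as a stand-in for Lens; "bart_figurine.jpg" is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("bart_figurine.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Generic object labels, e.g. "figurine", each with a confidence score.
for label in client.label_detection(image=image).label_annotations:
    print(f"{label.description}: {label.score:.2f}")

# Visually similar images found on the web -- the equivalent of Lens
# surfacing other toy figures with similar colors.
web = client.web_detection(image=image).web_detection
for match in web.visually_similar_images:
    print(match.url)
```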

And even though it's still early, there's an imperative for Google to get it right. "People get jaded," Dawson says. "They try something that doesn't work, and trying to convince them to come back is difficult."

Google knows how crucial that is. Chennapragada says Google is taking its time with the rollout -- both with the types of devices Lens will be on and the number of Google services Lens will be built into. "The rollout is in proportion to the capabilities," she says.

What's next?

The second Google product Lens will be added to is Assistant, the search giant's digital helper, akin to Amazon's Alexa and Apple's Siri. That update is happening "in coming weeks." Google demoed it for me last week. The biggest difference is that instead of taking a picture and filling up your camera roll, you'll be able to point your camera at an object and do a visual search in real time. Then you'll be able to ask follow-up questions.

So what's next? Chennapragada won't say but hints that Google Maps could be a good contender. For example, you could point your camera at a storefront and see ratings and menu information. (Google already teased this in a video in May but didn't give live demos.)

Google also eventually wants to make Lens a tool for discovering new content, instead of just figuring out what's in front of you. "It's not just to say, 'What is this?' But, 'Give me ideas related to this. What else can I do? Give me ideas and inspiration,'" Chennapragada says.

Of course, while all of this is centered on your phone -- for now -- the real promise of AR is when it might come to smart glasses. Facebook says it's working on a pair, though it'll take years. Google has a project called Aura, sort of a reincarnation of Google Glass.

When I ask about AR glasses, Chennapragada downplays the question. She says there's still a lot that needs to happen to realistically enable that type of form factor -- techspeak for the physical design of a device.

"Being able to easily overlay reality in a seamless, frictionless way -- augmenting using voice and vision -- is a very key component of anything in that form factor, but there are other challenges in terms of that falling into place," Chennapragada says.

Still, she acknowledges where Google Lens and its technology could eventually go.

"But anything we do here will be a building block," she says. "These will be the building blocks for anything we do in terms of future form factors as well." 
