
Google makes its AR search tool Lens available for your phone, too

Google Lens moves beyond Pixel and into phones from Samsung, LG and more -- making it easy to learn more about your world just by pointing your phone’s camera.


Google Lens is coming to more phones besides the Pixel.

Josh Miller/CNET

When Google unveiled Google Lens last May, it was billed as the future of search.

Typing a search was outdated. Now you could just point your camera at, say, a landmark and instantly learn about it.

Take a picture of a book, and there's all the info you need about where to buy it, who published it and what reviewers say about it. Point the camera at a painting, and Google Lens will tell you about the artist or the artwork's historical era.


Google Lens lets you point your camera at objects and get information about them.

Google

Whatever your eyes -- and by extension, your camera -- could see, Google could help you make sense of it. Even if you couldn't describe what you were looking at.

There was only one problem: Very few people could actually use it. That's because the feature was exclusive to the Google Pixel, the search giant's flagship branded phone. That's a remarkably small slice of the smartphone market: Google shipped only 3.9 million Pixel phones last year, according to IDC. For context, Apple sold 77.3 million iPhones in the last quarter alone.

Now Google is trying to broaden that reach. On Friday, the company announced it's bringing the product to more devices.

"When it comes to products based on algorithms and data and feedback, you've got to start somewhere," Aparna Chennapragada, who heads Google Lens, said in an interview Thursday. "But we thought, how can we expand it to more users or devices?"

Google Lens isn't an app. Instead, it's a capability Google is building into other apps -- so far, Google Photos and the Google Assistant, its digital helper akin to Amazon's Alexa. In Google Photos, Lens uses machine learning and image recognition to identify the object in a photo and serve up information about it. With the Assistant, Lens can give you that same sort of information just by pointing your phone at an object -- no photo required.

Google said Friday that Lens will be available on the latest version of Google Photos on Android and on iPhones running iOS 9 and newer. (For now it only works in English.)

Lens is also coming to Google Assistant on more devices, including flagship phones from Samsung, Huawei, LG, Motorola, Sony and HMD/Nokia. (It's coming only to certain devices from those companies because Lens requires camera specifications that not all phones have.) It's available only in English, and only in the US, UK, Canada, Germany, Australia, India, France, Italy, Spain and Singapore.

Google Lens is one of the search giant's biggest investments in augmented reality, in which digital images are overlaid on what you see through a camera. Google's biggest rivals have made big pushes in AR, too. Facebook has a platform called Camera, which lets developers make AR apps and games for the social network. Apple offers ARKit, which lets software makers create AR apps for iPhones. Google's counterpart for building AR apps on Android is called ARCore, which the company is officially releasing today.

With Google Lens, though, the company aims to go beyond Snapchat photo filters or Pokemon Go creatures -- what most people think of as AR today.

"The camera is a medium for understanding what the world is," Chennapragada said. "It's an evolution of information, discovery and search." 
