
Google's Lens-Led Multisearch Tool Goes Global

The company will also soon let Android users search for things they see on their screens without leaving the app they're in.

Katie Collins, Senior European Correspondent
[Image: A phone camera viewing a flower via Google Lens. Google Multisearch will soon be available globally. Screenshot by CNET]

Google might be the longtime leader in internet search tools, but it's not resting on its laurels. On Wednesday, it announced the global rollout of some of its newest image-led search features, as well as a new tool for Android users.

Google first introduced Multisearch, which allows people to search using a combination of an image and text, to US users last year. It harnesses the Google Lens tool so that people can search for things they see, while adding a text query to get more specific and helpful results. The Multisearch family of features also includes "Multisearch near me" and "Multisearch on the web," which let you use the tool to find things locally or add text to images you've already found while searching.

As part of its announcement on Wednesday, Google said that all Multisearch features will be rolling out in the coming months in all countries and languages where Google Lens is currently available.

Training computers to understand what they're looking at has been a major part of Google's AI strategy for years now. Google first introduced Lens, its visual AI tool, in 2017. According to the company's stats, it's now used more than 10 billion times per month.

Google is still developing new use cases for Lens, including a feature the company announced on Wednesday for Android phone users. The feature, called "search your screen," will allow people to search for things they're seeing in images and videos on their phone screens without having to leave the messaging app or web page.

If someone sends you a picture of an intriguing mystery location, you'll be able to long-press your phone's home or power button, and Google Assistant will offer to search your screen and bring up information about that location. It will do the same for clothing and other items it recognizes, pointing you toward places where you might be able to buy what you're looking at.

Unlike previous Lens features that were US-only at first, "search your screen" will come to all languages where Lens is currently available when it rolls out in the coming months.