Speaker 1: It's more natural than ever before. Just last month we launched multisearch, one of our most significant updates to Search in the Google app. You can now search by taking a photo and asking a question at the same time. You can snap a pic of a spill-proof water bottle and ask for one with rainbows on it to brighten your kid's day. Or, in my case, I was able to take a photo of my [00:00:30] leaky faucet and order the part to fix it. The funny thing is, I still don't know what that part is called.
Speaker 2: <laugh>
Speaker 1: But this is just the beginning of what we can do with multisearch. Later this year, we'll add a new way to search for local information with multisearch near me. Just take a picture, or long-press one you see online, and add "near me" to find what you need from [00:01:00] the millions of local businesses we serve on Google. Near me will work on multisearch for everything from apparel to home goods to, my personal favorite, food and local restaurants. So let's say I spot a tasty-looking dish online. I don't know what's in it or what it's called, but it's making me hungry. With this new capability, I can quickly identify that it's japchae, a Korean dish, find nearby restaurants that serve it, and [00:01:30] enjoy it in no time. While this all seems simple enough, here's what's happening under the hood. Google's multimodal understanding recognizes the visual intricacies of the dish and combines it with an understanding of my intent: that I'm looking for local restaurants that serve japchae. It then scans millions of images and reviews posted on webpages [00:02:00] and from our active community of Maps contributors to find results about nearby spots. Multisearch near me will be available globally later this year in English and will come to more languages over time.
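The under-the-hood flow described above can be sketched as a toy pipeline: recognize the dish in the image, combine that with the "near me" intent, and rank matching local places. Everything here, the class, function names, and sample data, is illustrative and assumed; Google's real multimodal and ranking systems are far more sophisticated.

```python
from dataclasses import dataclass

@dataclass
class Restaurant:
    name: str
    dishes: set          # dishes mentioned in the place's photos and reviews
    distance_km: float   # distance from the user

def recognize_dish(image_labels):
    """Stand-in for multimodal recognition: pick out the dish-like label."""
    known_dishes = {"japchae", "bibimbap", "pad thai"}
    for label in image_labels:
        if label in known_dishes:
            return label
    return None

def multisearch_near_me(image_labels, restaurants):
    """Combine visual understanding with local intent: keep places that
    serve the recognized dish, nearest first."""
    dish = recognize_dish(image_labels)
    matches = [r for r in restaurants if dish in r.dishes]
    return dish, sorted(matches, key=lambda r: r.distance_km)

places = [
    Restaurant("Seoul Kitchen", {"japchae", "bibimbap"}, 2.1),
    Restaurant("Noodle Bar", {"pad thai"}, 0.8),
    Restaurant("Hanok House", {"japchae"}, 0.5),
]
dish, results = multisearch_near_me(["noodles", "japchae"], places)
# dish is "japchae"; Hanok House (0.5 km) ranks ahead of Seoul Kitchen
```

The key design point the talk describes is the fusion step: neither the image alone nor the "near me" text alone is enough; the dish label from vision becomes a filter over local business data.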
Speaker 1: Today, this technology recognizes objects captured within a single frame, but sometimes you might want [00:02:30] information about the whole scene in front of you.
Speaker 1: In the future, with an advancement we're calling scene exploration, you'll be able to use multisearch to pan your camera, ask a question, and instantly glean insights about multiple objects in a wider scene. Let me give you an example. Let's say you're trying to pick out the perfect candy bar for your friend, who's a bit of a chocolate connoisseur. You know they like dark chocolate and have an aversion to [00:03:00] nuts. And of course, you want to get them something good. If you went to the store today to find the best nut-free dark chocolate, you'd be standing in the aisle for a while. You'd look at each bar, figure out which type of chocolate it is and whether it's nut-free, compare and contrast the options, and maybe even look up reviews online. But thanks to scene exploration, you'll be able to scan the entire shelf with your camera and see helpful insights overlaid in front of you, [00:03:30] so you can find precisely what you're looking for. Try doing that with just keywords.
Speaker 1: Here's how it works. Scene exploration uses computer vision to instantly connect the multiple frames that make up the scene and identify all the objects within it. [00:04:00] Simultaneously, it taps into the richness of the web and Google's Knowledge Graph to surface the most helpful results, in this case, which bars are nut-free, dark chocolate, and highly rated. Scene exploration is a powerful breakthrough in our devices' ability to understand the world the way we do. And it gives us a superpower: the ability to see relevant information overlaid in the context of [00:04:30] the world around us. You could imagine using this in a pharmacy to find a scent-free moisturizer, or at your local corner store to find a Black-owned wine label to support. This is like having a supercharged Ctrl+F for the world around you.
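The mechanism described here, merging detections across camera frames and then filtering them against attribute data, can be sketched minimally. The attribute table is a hypothetical stand-in for the web and Knowledge Graph lookups the talk mentions; the object names and thresholds are made up for illustration.

```python
def merge_frames(frames):
    """Union detections from overlapping frames, dropping duplicates,
    mimicking how multiple frames are connected into one scene."""
    seen = []
    for frame in frames:
        for obj in frame:
            if obj not in seen:
                seen.append(obj)
    return seen

# Toy stand-in for attribute lookups against the web / Knowledge Graph.
ATTRIBUTES = {
    "BarA": {"dark": True,  "nut_free": True,  "rating": 4.6},
    "BarB": {"dark": False, "nut_free": True,  "rating": 4.8},
    "BarC": {"dark": True,  "nut_free": False, "rating": 4.2},
}

def scene_explore(frames, min_rating=4.5):
    """Return scene objects to highlight: dark, nut-free, highly rated."""
    hits = []
    for obj in merge_frames(frames):
        attrs = ATTRIBUTES.get(obj)
        if attrs and attrs["dark"] and attrs["nut_free"] and attrs["rating"] >= min_rating:
            hits.append(obj)
    return hits

# Two overlapping frames of the shelf; BarA appears in both but is counted once.
highlighted = scene_explore([["BarA", "BarB"], ["BarA", "BarC"]])
```

The "Ctrl+F for the world" analogy maps directly onto the final filter: the scene is the document, and the attribute predicate is the search string.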
Speaker 2: <laugh>
Speaker 1: [00:05:00] Looking further out, this technology could be used beyond everyday needs to help address societal challenges, like supporting conservationists in identifying plant species that need protection, or helping disaster relief workers quickly sort through donations in times of need. From multisearch near me to scene exploration, the advancements we've talked about today are in service of our broader vision to make Search even more natural [00:05:30] and helpful, so you can search your whole world, asking questions anytime and anywhere.