Speaker 1: It's amazing how quickly voice is becoming such a common way to access computing. Every month, over 700 million people around the world get everyday tasks done with their Assistant. They can just say "Hey Google" to get help on the go, in their homes, and even in the car. But these interactions are still not as natural as they could be. First, you should be able to easily initiate conversations [00:00:30] with your Assistant. So today we are introducing two new options, so you don't have to say "Hey Google" every time. First is a new feature for Nest Hub Max called Look and Talk, which is beginning to roll out today. You can simply look directly at your device and ask for what you need, like when you make eye contact to [00:01:00] start a conversation with another person. Once you opt in, Look and Talk is designed to activate when both Face Match and Voice Match recognize it's you, and video from these interactions is processed entirely on device, so it isn't shared with Google or anyone else. Let me turn the camera back on and show you how it works.
Speaker 2: [00:01:30] The mic and camera are back on.
Speaker 1: Walking into the kitchen to start a weekend with my family, I can simply look over and ask: show me some beaches in Santa Cruz.
Speaker 2: I found a few beaches near Santa Cruz.
Speaker 1: Pretty cool, right?
Speaker 3: Yes.
Speaker 1: [00:02:00] How long does it take to get to that first one?
Speaker 2: By car, the trip to Natural Bridges State Beach is 51 minutes.
Speaker 1: That's so much easier than saying the hotword over and over. <laugh> The ability to distinguish intentional eye contact from a passing glance requires [00:02:30] six machine learning models processing over a hundred signals, like proximity, head orientation, and gaze direction, to evaluate the user's intent, all in real time. We've also tested and refined Look and Talk to work across a range of different skin tones, using some of the same principles of inclusion behind Real Tone on the Pixel 6 camera and the Monk Skin Tone Scale. [00:03:00] We're also excited to expand Quick Phrases on Nest Hub Max. Quick Phrases already lets you skip the hotword for things like answering calls on Pixel 6 and stopping timers on Nest devices. And in the next few months, you'll be able to ask your Assistant for many common requests, like setting alarms, asking for the time, and controlling lights, from your Nest Hub Max, all without saying the hotword. [00:03:30] All right, check this out. Turn off the living room light.
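The activation logic described above can be sketched as a toy decision function. This is only an illustration under stated assumptions: the real feature fuses roughly a hundred signals with six on-device machine learning models, and every name, threshold, and field here is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    """Toy stand-ins for the kinds of signals mentioned on stage."""
    face_match: bool        # enrolled face recognized
    voice_match: bool       # enrolled voice recognized once speech starts
    proximity_m: float      # estimated distance to the user, in meters
    head_yaw_deg: float     # head orientation relative to the camera
    gaze_on_device: bool    # gaze-direction model says user looks at the screen

def should_activate(s: Signals,
                    max_distance_m: float = 1.5,
                    max_yaw_deg: float = 20.0) -> bool:
    """Fuse the signals into one opt-in activation decision.

    An AND over a few thresholds is only a sketch; the production system
    evaluates intent with learned models, entirely on device.
    """
    return (s.face_match
            and s.voice_match
            and s.proximity_m <= max_distance_m
            and abs(s.head_yaw_deg) <= max_yaw_deg
            and s.gaze_on_device)

# A passing glance from across the room should not trigger the device,
# while direct, nearby eye contact from a recognized user should.
glance = Signals(face_match=True, voice_match=False,
                 proximity_m=3.0, head_yaw_deg=45.0, gaze_on_device=False)
intent = Signals(face_match=True, voice_match=True,
                 proximity_m=1.0, head_yaw_deg=5.0, gaze_on_device=True)
```

The key design point carried over from the talk is that no single signal is sufficient: identity (face and voice) and intent (proximity, head pose, gaze) must all agree before the device listens.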
Speaker 1: That was so easy. I just said what I wanted. Designed with privacy in mind, you choose which Quick Phrases are enabled for your Nest Hub Max. So those are two ways that it's getting easier to start talking to the Assistant. We're also improving how the [00:04:00] Assistant understands you by being more responsive as you just speak naturally. If you listen closely, people's conversations are full of ums, pauses, and corrections, but that doesn't get in the way of understanding each other. That's because people are active listeners and can react to conversational cues in under 200 milliseconds. Humans handle this so naturally, but doing this for open-ended conversations across the Assistant [00:04:30] is a really hard problem. Moving our speech models to run on the device made things faster, but we wanted to push the envelope even more. The breakthrough comes from creating more comprehensive neural networks that run on the Google Tensor chip, which was built to handle on-device machine learning tasks super fast. Let me show you a preview of how this will all come together. For example, I might tap and hold on my Pixel [00:05:00] Buds and say, play the new song from
Speaker 2: Mm-hmm <affirmative>
Speaker 1: Florence and the something
Speaker 2: Got it. Playing Free by Florence and the Machine on Spotify.
Speaker 1: You heard how I stumbled at the beginning, [00:05:30] but my Assistant gently encouraged me to complete my thought.
Speaker 3: <laugh>.
Speaker 1: And then, even when I messed up the artist name, Google Assistant correctly figured out the song I wanted. It's amazing how these improvements change the way it feels to talk to your Assistant. You can stop worrying about the right way to ask for something and just relax and talk naturally. This is how we are pushing computing forward with natural conversation, [00:06:00] letting you easily initiate conversations and making it so you can just speak naturally, all so you can be truly understood. I'm excited to see how our voices will become a faster, hands-free way to get things done across many types of devices, including a growing Android ecosystem that you'll hear about in a few minutes. Thanks, and back to you, Sundar.