-Answer on the left here.
-I'm Robert Scoble, one of the first Glass holes, so thank you.
-Yeah.
-Thank you for getting my Glass.
-Robert, Robert, I really didn't appreciate the shower pictures.
-You-- here at Google I/O, several contextual things started coming out.
We've started seeing an API that's gonna tell us whether we're walking or running or whatnot.
Where are you gonna take that in the future now that we have more sensors, and are you gonna talk about the little sensor inside the Google Glass that watches our eye?
-Yeah, that's a great question.
I mean, this is a bigger area of focus.
I think you saw that in the presentations.
I think it's about really being able to get computers out of the way and focus on what people really need.
Mobile has been a great learning experience,
I think, for us and for all of you.
You know, the smaller screens, you can't have all this clutter.
I think you saw on the new Google Maps how we got all sorts of stuff out of the way.
You know, there's like a hundred times less things on the screen than there was before, and I think that's gonna happen with all of your devices.
You're gonna understand the context.
You know, just before I came on stage, I had to turn off all of my phones, right?
So I'm not interrupting all of you.
That's crazy.
That's not a very hard thing to figure out.
So all that context that's in your life, all these different sensors are gonna be able to pick that up and just make your life better, and I think that we're, again, only at the very, very early stages of that.
It's very, very exciting.