Hi, my name is [UNKNOWN], and I'm a product manager at Google working on Android.
What we're going to demo today is the Android in-vehicle [UNKNOWN] concept. What we have here is a concept vehicle with two different screens: a 4K screen in the center console and a 720p screen located in the cluster.
They're both powered by a Qualcomm Snapdragon chipset.
Everything runs on Android. What we wanted to demonstrate is that you can use Android to power the displays as well as the overall infotainment ecosystem in any vehicle.
So, for example, if you were to turn on the car, you would be greeted with an interface [UNKNOWN], which is a home screen. You would have access to different functionality, such as recent history (for example, a restaurant you're going to, or directions to SFO), recent phone calls, or a podcast you were playing from your favorite application, such as Pocket Casts.
You would also have access to functionality such as a microphone, so you can quickly talk to the system in case you have any questions, want to set reminders, or want to control the system by voice.
Lastly, what you have here, and this is accessible across the whole system, is control of the air conditioning in the vehicle. That includes features such as the temperature for the passenger or the driver, and the different ways the air can blow.
If we quickly take a look at Google Maps and go here, we'll see that Google Maps has been redone to make it more friendly on such a large display, and to make it function well there.
So we added features such as buildings, which help users orient themselves, and users can also access satellite mode, which lets them see buildings and other landmarks.
Because the user is logged in, they also have access to things like home, work, and other recent destinations they've gone to.
So if we were to navigate to a hardware store and press navigate, what happens is that the cluster actually starts showing navigation as well. And, for example, if we were to dial somebody and go quickly here to the phone screen, which connects via Bluetooth, you get access to your recent phone calls.
Navigation continues over here, and navigation also continues in the cluster. So if we look at the call history, we can scroll and select somebody to dial.
So, for example, if we were to dial Dylan, we quickly get a notification in the cluster that Dylan is being dialed.
You'll also see that all of these functions continue: you're able to access your phone call right here, you'll also see the navigation, and you also have the media application. They're all still available, including the microphone and [UNKNOWN] functionality.
So if you wanted to end a phone call, all you have to do is press the little button here, and it ends the call. The call screen goes away, and you can simply continue navigating.
All of this functionality is always accessible via voice as well, because we understand that a lot of people might want to interact with parts of the system that way.
We also wanted to showcase some of the other capabilities we're still developing, such as controlling the windows or the doors.
And again, our goal here is primarily to showcase some of the APIs that are coming with Android, which will enable partners to develop the experiences they want.
We're obviously not saying that experiences have to look this way. We understand that different car manufacturers are going to have different experiences they want to build, and we want to build a platform that enables them to do so.