Google's mobile operating system already tries to predict which app you'll want to use next. Its next update, Android P, will move on to "predicting the next action you want to take," said Dave Burke, Google's VP of engineering for Android.
You'll experience this in a variety of ways as you're using a phone that runs Android P, Burke said. For example, plug in your headphones, and you'll see an "action" on your screen that lets you press play on music you were listening to earlier.
Actions will eventually appear as you're interacting with the launcher and Google's Play Store. They'll also appear as part of smart text selection and Google search.
"The phone is adapting to me and trying to help me get to my next task more quickly," Burke said. It's one way Google is applying artificial intelligence to phones, he added, saying the move will make "Android smart by teaching the operating system to adapt to the user."
Android P will also show you what it calls "slices," or a small section of an app's interface, when you might need it. For example, Burke said, if you look up Lyft in Google search, a slice of the Lyft app will appear on the screen, giving you the option to start a ride request.
Slices will eventually show up in a variety of places on your Android phone, but they'll first appear in search, Burke said. The goal is to "enable a dynamic two-way experience where the app's UI can intelligently show up in context."
Early access to the feature will begin in June.