Google Has Big Plans for AR. Google Maps Could Be the Key
The company is harnessing its GPS power to build new AR features.
Scott Stein, Editor at Large
Before Google has its own pair of augmented reality glasses, it'll need AR to work everywhere. World-spanning AR that blankets the real world using map data has been a goal for several companies lately, and Google is layering its AR using Google Maps.
The toolkit, announced at Google's I/O developer conference on Wednesday, could leap ahead of competing efforts from rivals such as Niantic, Snap and Apple by using swaths of existing Google Maps data to generate location-specific AR anchors. Google is doing this with the same technique behind Live View, the AR layer it introduced on top of Google Maps in 2019.
The new ARCore Geospatial API, as the developer toolkit is called, could let developers quickly place augmented reality content at specific locations around the world, where many people can see and interact with it at the same time. It will work in more than 87 countries, according to Google, without requiring any location scanning.
Google is evolving its own Maps app to become more AR-infused over time, including adding an Immersive View that creates ever-more-detailed scans of indoor and outdoor spaces at certain locations. These new moves look like they'll also let app developers build those kinds of experiences themselves, leaning on Maps data.
Microsoft, Apple and Meta, among others, are already working to combine AR with map data, but not all initiatives are the same. Some recent initiatives by Snap, Apple and Meta have used lidar or depth-scanning cameras to map locations, which also requires regions to have been prescanned in order to work. Other location-mapping tools, such as Niantic's world-scanning AR in its Lightship platform, don't need lidar. Still, Google's existing maps look to be a huge starting set of mapped locations that could work with location-specific AR very quickly.
According to Google, the AR effects can appear in any location where Google Street View is also available, which could give it a big edge in working quickly across a lot of places.
Google has already begun working with early app partners, including the NBA, Snap and Lyft, to use the phone-based AR tech. It also seems like a clear stepping stone toward the tools a future set of AR glasses would need. According to Google, Lime is using the feature to explore how to show available parking spots in AR in certain cities.
A few open-source demo apps were announced as well, which show off collaborative location-specific AR: a balloon-popping app that could be used by lots of people at once in various places, and a multiperson interactive gardening game that's reminiscent of a collaborative AR demo we tried at Google I/O years ago.