Google's new AR update adds depth without needing lidar like Apple's iPad
ARCore's depth scanning could open up world-recognizing possibilities.
Scott Stein, Editor at Large
This week's AR news has been focused on Apple's augmented reality updates to iOS 14, many of which lean on the depth-scanning hardware only on the recent iPad Pro. Google announced its own AR news this week, too, and you won't need specialized hardware to use its depth-sensing tools.
A Depth API update to ARCore, announced and available today, can build 3D meshes of environments and use them to place virtual objects more realistically. Virtual objects will even appear to hide behind real ones, through a technique called occlusion.
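At its core, occlusion is a per-pixel depth comparison: a virtual object's pixel is only drawn where the object is closer to the camera than the real-world surface behind it. The sketch below is a simplified illustration of that idea in plain Python, not Google's actual implementation; the grids, depth values, and function name are all hypothetical.

```python
# Simplified sketch of depth-based occlusion (illustrative only, not
# ARCore's actual rendering code). A virtual object's pixel is drawn
# only where it sits in front of the real-world surface measured by
# the depth map; otherwise the camera pixel shows through.

def composite(camera_px, virtual_px, real_depth, virtual_depth):
    """Per-pixel occlusion test over a 2D image.

    camera_px / virtual_px: 2D grids of pixel values.
    real_depth: estimated distance (meters) to the real surface.
    virtual_depth: distance to the virtual object (None = no object here).
    """
    out = []
    for y in range(len(camera_px)):
        row = []
        for x in range(len(camera_px[y])):
            vd = virtual_depth[y][x]
            # Draw the virtual pixel only if the object is closer to
            # the camera than the real-world geometry at this pixel.
            if vd is not None and vd < real_depth[y][x]:
                row.append(virtual_px[y][x])
            else:
                row.append(camera_px[y][x])
        out.append(row)
    return out
```

For example, a virtual object 2 meters away is hidden behind a real wall at 1 meter but visible in front of a real surface at 3 meters.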
Google first announced this feature last December, but it's only becoming available now. Google has already announced a number of apps that will use the depth functions in AR, including an update to a key Samsung app: the Quick Measure AR app on the Samsung Galaxy Note 10 Plus and Galaxy S20 Ultra will gain depth-based improvements to its measurements "in the coming months," according to Google. A demo reel of what the depth-sensing AR can do is embedded below; a lot of it looks impressive.
A few games and apps already use some of these features, including Five Nights at Freddy's AR: Special Delivery. Snapchat has integrated them into two AR lenses: Dancing Hotdog (which is a dancing hotdog) and Undersea World, which turns your space into an aquarium full of fish. (With this update, Snapchat is also rolling out depth support to other Lens developers on Android, so expect more world-aware lens experiences in the future.)
The ARCore update also comes with a Google ARCore Depth Lab app, which lets developers experiment with the new depth features.
It's unclear how precise Google's Depth API is compared with other tools. Apple's lidar sensor on the iPad Pro, much like Google's now-discontinued line of depth-sensing Tango phones and tablets, can create a physical depth map with tangible measurements. But the future of computer vision will undoubtedly move toward enabling more world scanning with less hardware. Android devices with time-of-flight sensors, which also measure depth directly, will get faster and better depth results, according to Google.