Google's new AR update adds depth without needing lidar like Apple's iPad Pro
ARCore's depth scanning could open up world-recognizing possibilities.
See that little critter hide? That's occlusion.
This week's AR news has focused on Apple's augmented reality updates to iOS 14, many of which lean on depth-scanning hardware found only on the recent iPad Pro. Google announced its own AR news this week, too, and you won't need specialized hardware to use its depth-sensing tools.
A Depth API update to ARCore, announced and available today, can make 3D meshes of environments and use them to place virtual objects more realistically. Virtual objects can even appear to hide behind real ones through a technique called occlusion.
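For developers, opting in looks straightforward. Here's a rough sketch, not Google's official sample code, of how an Android app might enable the Depth API and pull a depth image each frame; the `session` and `frame` objects and the helper names are assumptions for illustration:

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Hypothetical helper: turn on depth if this phone supports it.
// Depth runs on supported phones without dedicated sensors, so check first.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}

// Hypothetical helper: called once per rendered frame.
fun readDepth(frame: Frame) {
    try {
        // Returns an android.media.Image in DEPTH16 format. Occlusion works
        // by comparing a virtual object's depth against these per-pixel values.
        frame.acquireDepthImage().use { depthImage ->
            // ... hand depthImage to the renderer / occlusion shader ...
        }
    } catch (e: NotYetAvailableException) {
        // Depth data takes a few frames of camera motion to become available.
    }
}
```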
Google first announced this feature last December, but it's only becoming available now. Google has already named a number of apps that will use the depth functions in AR, including an update to a key Samsung app: the Quick Measure AR app on the Samsung Galaxy Note 10 Plus and Galaxy S20 Ultra will gain depth-based improvements to its measurements "in the coming months," according to Google. A demo reel of what the depth-sensing AR can do is embedded below; a lot of it looks impressive.
A few games and apps already use some of these features, including Five Nights at Freddy's AR: Special Delivery. Snapchat has integrated them into two AR lenses: Dancing Hotdog (which is, yes, a dancing hotdog) and Undersea World, which turns your space into an aquarium full of fish. With this update, Snapchat is also opening depth support to other Lens developers on Android, so expect more world-aware lens experiences in the future.
The hotdog hides.
Many companies are exploring tools for camera-based world scanning, including Pokémon Go creator Niantic, which bought collaborative 3D meshing company 6D.ai earlier this year.
The ARCore update also comes with a Google ARCore Depth Lab app for experimenting with the new depth features.
It's unclear how precise Google's Depth API is compared with other tools. Apple's lidar sensor on the iPad Pro, much like Google's now-discontinued line of depth-sensing Tango phones and tablets, can create a physical depth map with tangible measurements. But the future of computer vision will undoubtedly move toward doing more world scanning with less hardware. Android devices with time-of-flight sensors, which also measure depth, will get faster and better depth results, according to Google.
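To give a sense of what "tangible measurements" means in practice: each pixel of the depth image ARCore returns is a 16-bit distance in millimeters, so reading one out takes only a few lines. This is an illustrative sketch assuming the standard DEPTH16 image layout, with a hypothetical helper name, not code from Google:

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Illustrative helper: reads the distance in millimeters at pixel (x, y)
// of a DEPTH16 depth image, such as one returned by Frame.acquireDepthImage()
// in the earlier sketch.
fun depthMillimetersAt(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0] // DEPTH16 images have a single plane
    val byteIndex = y * plane.rowStride + x * plane.pixelStride
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    // Each sample is an unsigned 16-bit distance in millimeters.
    return buffer.getShort(byteIndex).toInt() and 0xFFFF
}
```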