
What does iPhone 12 Pro's lidar feature actually do? Here it is in action

House-scanning app Canvas works with lidar on the iPhone 12 Pro, but also scans without it. The difference is accuracy.

Scott Stein, Editor at Large

The Canvas app 3D scans homes using the iPhone 12 Pro's lidar. Expect a lot more of this.

Occipital

The iPhone 12 Pro's depth-scanning lidar sensor looks ready to open up a lot of possibilities for 3D-scanning apps on phones. A new one designed for home scanning, called Canvas, uses lidar for added accuracy and detail. But the app will work with non-pro iPhones going back to the iPhone 8, too.

The approach taken by Canvas indicates how lidar could play out in iPhone 12 Pro apps. It can add more accuracy and detail to processes that are already possible through other methods on non-lidar-equipped phones and tablets.

Read more: iPhone 12's lidar tech does more than improve photos. Check out this cool party trick

Canvas, created by Boulder-based company Occipital, originally launched earlier this year for the iPad Pro to take advantage of that tablet's lidar scanning. When I saw a demo back then, it struck me as a sign of how Apple's depth-sensing tech could be applied to home-improvement and measurement apps. The updated app takes scans that are clearer and crisper.

Since the lidar-equipped iPhones debuted, a handful of optimized apps have emerged offering 3D scanning of objects, larger-scale scanning of spaces through photogrammetry, and augmented reality that can blend meshed-out maps of spaces with virtual objects. But the sample Canvas scan Occipital captured on an iPhone 12 Pro looks sharper than the output of the 3D-scanning apps I've played with so far.

Apple's iOS 14 gives developers more raw access to the iPhone's lidar data, according to Occipital product VPs Alex Schiff and Anton Yakubenko. That has allowed Occipital to build its own algorithms to put Apple's lidar depth map to best use. It could also allow Occipital to apply the depth-mapping data to future improvements in its app for non-lidar-equipped phones.
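
Occipital's own reconstruction algorithms are proprietary, but for the curious, here's a minimal Swift sketch of the raw lidar access ARKit exposes in iOS 14. The class and print statement are illustrative, not part of Canvas.

```swift
import ARKit
import CoreVideo

// A minimal sketch of the lidar depth access iOS 14 added to ARKit.
// This is not Occipital's pipeline; it just shows the raw data an app
// like Canvas can start from on a lidar-equipped iPhone or iPad.
class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth is only available on lidar devices (iPhone 12 Pro, iPad Pro).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame carries a dense depth map (32-bit float distances in meters)
        // plus a per-pixel confidence map.
        guard let depth = frame.sceneDepth else { return }
        let map = depth.depthMap
        print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}
```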

Read more: iOS 14.5 is on the way. What we know about a release date and new features

Scanning 3D space without a dedicated lidar or time-of-flight sensor is possible, and companies like 6d.ai (acquired by Niantic) have already been doing it. But Schiff and Yakubenko say lidar still offers a faster and more accurate upgrade over that technology. The iPhone 12 version of Canvas takes more detailed scans than the first version on the iPad Pro earlier this year, mostly because of iOS 14's deeper access to lidar information, according to Occipital. The newest lidar-enabled version is accurate to within 1%, while the non-lidar scan is accurate to within 5% (quite literally making the iPhone 12 Pro a pro upgrade for those who need the extra precision).
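
To put those percentages in perspective, here's a quick back-of-the-envelope sketch in Swift. The 20-foot wall is a hypothetical figure, not one from Occipital.

```swift
// Rough illustration of what 1% vs. 5% accuracy means for a room measurement.
let wallLengthFeet = 20.0
let lidarError = wallLengthFeet * 0.01      // up to ~0.2 ft (~2.4 inches) off
let nonLidarError = wallLengthFeet * 0.05   // up to ~1 ft (~12 inches) off
print("Lidar scan: ±\(lidarError) ft, camera-only scan: ±\(nonLidarError) ft")
```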

Yakubenko says that, by Occipital's earlier measurements, Apple's iPad Pro lidar captures 574 depth points per frame on a scan, but the depth maps iOS 14 exposes to developers jump up to 256x192 points, with AI and camera data filling in the extra detail.
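
As rough math, that resolution jump works out like this (an illustrative comparison based on the figures Yakubenko cites):

```swift
// Back-of-the-envelope math on the per-frame depth data.
let sparsePoints = 574                  // raw lidar points per frame (Occipital's earlier measurement)
let depthMapPoints = 256 * 192          // dense depth map exposed to developers in iOS 14
print(depthMapPoints)                   // 49,152 points
print(depthMapPoints / sparsePoints)    // roughly 85x more depth samples per frame
```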

Canvas room scans can be converted into workable CAD models, in a process that takes about 48 hours, but Occipital is also working on making that conversion closer to instant and on adding semantic data (recognizing doors, windows and other room details) with AI.

As more 3D scans and 3D data start living on iPhones and iPads, it'll also make sense to have common formats for sharing and editing those files. iOS 14 uses Apple's USDZ format for 3D files, while Occipital has its own format for its more in-depth scans and can output to .rvt, .ifc, .dwg, .skp and .plan formats when converting to CAD models. At some point, 3D scans may become as standardized as PDFs. We're not quite there yet, but we may need to get there soon.
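
For a sense of where sharing stands today, here's a small Swift sketch showing how a 3D scan exported as USDZ can be previewed with iOS's built-in QuickLook viewer. The file name and class are hypothetical, and Occipital's CAD exports would still need dedicated desktop software.

```swift
import UIKit
import QuickLook

// A sketch of viewing a USDZ 3D scan with the system QuickLook previewer.
class ScanPreviewController: UIViewController, QLPreviewControllerDataSource {
    func showScan() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        // Assumes a bundled sample scan; swap in a real exported file URL.
        let url = Bundle.main.url(forResource: "RoomScan", withExtension: "usdz")!
        return url as NSURL
    }
}
```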