
The iPad Pro can scan your house, and future iPhones might too

Canvas is the latest app to use the iPad Pro's lidar sensor. Depth sensing and 3D scans could be the real killer tools of Apple's newest hardware.

Scott Stein, Editor at Large

A 3D room scan from Occipital's Canvas app, enabled by depth-sensing lidar on the iPad Pro. Imagine this on iPhones next.

Occipital

It's been a few months since Apple's AR-enhanced iPad Pro became available, but relatively few AR apps have taken advantage of the device's new lidar depth sensor so far. That rear sensor could be a sneak preview of the tech that makes its way into the next iPhones, which means early iPad Pro apps offer a window into the territory Apple ventures into next. With the company's remote WWDC developer conference approaching next week, new AR tools will likely get heavy emphasis. Occipital's new house-scanning app, Canvas, may be the clearest indicator of what the next wave of depth-enhanced apps will look like.

Occipital is a Boulder, Colorado-based company that has made depth-sensing camera hardware and software for years, similar to the tech on Google's early Tango tablets and phones. Its Structure camera could measure depth in a room and, combined with image capture, build a 3D-mapped mesh of a space. In the years since, companies such as 6D.ai (recently acquired by Pokemon Go maker Niantic) have figured out how to measure and map spaces without extra depth-sensing hardware at all, and Occipital, too, has moved to a software-based approach for its scanning tools.

Occipital's latest app, Canvas, uses regular iPhone and iPad cameras to mesh and map rooms quickly. But the company is also adding support for the iPad Pro's lidar sensor because of its extra depth-sensing range, and sees that extra depth data augmenting the tools it already has.

Few developers have targeted the iPad Pro's lidar depth sensor specifically so far, likely because the 2020 iPad Pro models are a relatively small subset of Apple's AR-ready devices. As that depth-sensing tech expands across the line to phones and other iPads, other companies will likely pursue the same path Occipital is on.

"It's a reasonably good assumption that you're going to start seeing this on higher end iPhones as well," Occipital's Product Manager Alex Schiff says, referring to Apple's likely iPhone upgrades later this year, which are reported to have depth-sensing rear camera capabilities much like the iPad Pro.

You can look at a Canvas scan embedded below. Room scans can sometimes look rough, but the data can be upgraded over time, and scans can be converted into CAD models for professional designers to use.

The iPad Pro's lidar isn't a camera: it's a separate, camera-free sensor, and the depth maps it collects are more like arrays of points in space, forming a mesh or a 3D map. Occipital's app can also use camera data to stretch photos into a simulated 3D mesh, but the two processes are separate. Future AR cameras could blend both functions more closely, and Occipital sees the depth data as useful for training cameras to estimate where depth might be, too.
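For developers, ARKit 3.5 already exposes that lidar-derived geometry: a session configured for scene reconstruction delivers the room as mesh anchors, arrays of points stitched into triangles, independent of any camera image. Here's a minimal Swift sketch of the idea (illustrative only, not Occipital's code; the RoomScanner class is made up):

```swift
import ARKit

class RoomScanner: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene reconstruction is only supported on lidar-equipped devices,
        // such as the 2020 iPad Pro.
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
            print("No lidar sensor on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.sceneReconstruction = .mesh  // build 3D geometry from lidar depth
        session.delegate = self
        session.run(configuration)
    }

    // ARKit hands back the room as ARMeshAnchor chunks: vertices and
    // triangle faces in world space, not camera pixels.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            print("Mesh chunk: \(geometry.vertices.count) vertices, \(geometry.faces.count) faces")
        }
    }
}
```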

"Instead of just recognizing basic features such as a floor or wall, you're able to see the whole geometry of the scene," Occipital's VP of Product Anton Yakubenko says. "It also accounts for semantic recognition. Joining the geometry and image information. It's not perfect, but it could enable a new generation of applications which are not only about geometrical information, but also about semantic information. And that's where we're moving." Occipital converts home scans into CAD files, but, "the next step is to know that this piece of a CAD model is a window or a door or a baseboard."

Schiff sees the scans becoming more valuable as algorithms keep improving: "You can capture a home and have that data forever to revisit, and that data is going to become better or more interesting as you reprocess it with different kinds of algorithms that become available over time. It's the revisitability of the data that is the thing that makes 3D mapping of space different than AR."
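One way to read that "revisitability": persist the raw scan geometry in a portable format so better algorithms can chew on it later. As an illustrative sketch (exportOBJ is a made-up helper, and it assumes ARKit's usual 32-bit face indices), this writes a single mesh anchor out as a Wavefront OBJ file:

```swift
import ARKit

// Write one lidar mesh chunk to a minimal Wavefront OBJ file so the raw
// geometry can be archived and reprocessed later.
func exportOBJ(from meshAnchor: ARMeshAnchor, to url: URL) throws {
    let geometry = meshAnchor.geometry
    var obj = "# room scan chunk\n"

    // Vertices: three packed floats each, read via the source's stride.
    let vertices = geometry.vertices
    let vertexBase = vertices.buffer.contents().advanced(by: vertices.offset)
    for i in 0..<vertices.count {
        let v = vertexBase.advanced(by: i * vertices.stride)
            .assumingMemoryBound(to: Float.self)
        obj += "v \(v[0]) \(v[1]) \(v[2])\n"
    }

    // Faces: triangles of vertex indices (assumed UInt32; OBJ is 1-based).
    let faces = geometry.faces
    let indexBase = faces.buffer.contents()
    for f in 0..<faces.count {
        var line = "f"
        for j in 0..<faces.indexCountPerPrimitive {
            let offset = (f * faces.indexCountPerPrimitive + j) * faces.bytesPerIndex
            let index = indexBase.advanced(by: offset)
                .assumingMemoryBound(to: UInt32.self).pointee
            line += " \(index + 1)"
        }
        obj += line + "\n"
    }
    try obj.write(to: url, atomically: true, encoding: .utf8)
}
```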

As someone who's still staying at home and avoiding home repair visits, I can see the appeal of scanning my home's rooms to get analysis and expert opinions remotely. Apple is likely to announce more major changes to its AR tools at next week's remote WWDC developer conference, but making more of the lidar sensor's functions, and of how its depth data relates to camera data, could be the next step for iPhones.