
Lidar is one of the iPhone and iPad Pro's coolest tricks: Here's what else it can do

Lidar sensors add depth scanning for better photos and AR, but in the future the tech could power mixed-reality headsets and more.

Scott Stein, Editor at Large

The iPhone 12 Pro's lidar sensor -- the black circle at the bottom right of the camera unit -- opens up AR possibilities and much more.

Patrick Holland/CNET

Apple is bullish on lidar, a technology that's in the iPhone 12 Pro and iPhone 13 Pro, and Apple's iPad Pro models since 2020. Look closely at the rear camera arrays on these devices and you'll see a little black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it delivers a new type of depth-sensing that can make a difference in photos, AR, 3D scanning and possibly even more.

Read more: iPhone 12's lidar tech does more than improve photos. Check out this cool party trick

If Apple has its way, lidar is a term you'll keep hearing. It's already a factor in AR headsets and in cars. Do you need it? Maybe not. Let's break down what we know, what Apple is using it for, and where the technology could go next. And if you're curious what it does right now, I've spent hands-on time with the tech, too.

What does lidar mean?

Lidar stands for light detection and ranging, and has been around for a while. It uses lasers to ping off objects and return to the source of the laser, measuring distance by timing the travel, or flight, of the light pulse. 
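
The math behind that timing is simple, even if the engineering isn't. Here's a rough Swift sketch of the round-trip calculation; the function name and the example numbers are just for illustration, not Apple's specs:

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// A time-of-flight sensor measures the round-trip time of a light pulse,
// so the one-way distance is half the round trip.
func distance(fromRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a pulse that comes back after about 33 nanoseconds bounced off
// something roughly 5 meters away.
let meters = distance(fromRoundTripTime: 33e-9)
print(String(format: "%.2f m", meters))  // ≈ 4.95 m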

How does lidar work to sense depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar tech sends waves of light pulses out in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
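
If you want a feel for what a phone does with all of those dots, here's a simplified Swift sketch, not Apple's actual pipeline, of how a single depth reading at a pixel becomes a point in 3D space, assuming a basic pinhole-camera model. The Intrinsics type and unproject function are illustrative names, not a real API:

```swift
import simd

// Simple pinhole-camera parameters (illustrative).
struct Intrinsics {
    let fx: Float, fy: Float   // focal lengths in pixels
    let cx: Float, cy: Float   // principal point in pixels
}

// Back-project one depth sample (in meters) at pixel (u, v) into a 3D point
// in camera space. Doing this for every dot yields the "field of points"
// described above.
func unproject(u: Float, v: Float, depth: Float, k: Intrinsics) -> SIMD3<Float> {
    let x = (u - k.cx) / k.fx * depth
    let y = (v - k.cy) / k.fy * depth
    return SIMD3<Float>(x, y, depth)
}
```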

Isn't this like Face ID on the iPhone?

It is, but with longer range. The idea's the same: Apple's Face ID-enabling TrueDepth camera also shoots out an array of infrared lasers, but can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar's already in a lot of other tech

Lidar is a tech that's sprouting up everywhere. It's used for self-driving cars, or assisted driving. It's used for robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering 3D virtual objects into them. There's even a VR headset with lidar. But it also has a pretty long history. 

Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera that had infrared depth-scanning, too. In fact, PrimeSense, the company that helped make the Kinect tech, was acquired by Apple in 2013. Now, we have Apple's face-scanning TrueDepth and rear lidar camera sensors.

The iPhone 12 Pro and 13 Pro cameras work better with lidar

Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro did the same. Apple promises better focus in low light, with autofocus up to six times faster in dim conditions. The lidar depth-sensing is also used to improve night portrait mode effects. So far, it makes an impact: read our review of the iPhone 12 Pro Max for more. With the iPhone 13 Pro, it's a similar story: the lidar tech is the same, even if the camera technology is improved.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps, and third-party developers could dive in and develop some wild ideas. It's already happening.

It also greatly enhances augmented reality

Lidar allows the iPhone and iPad Pros to start AR apps a lot more quickly, and build a fast map of a room to add more detail. A lot of Apple's core AR tech takes advantage of lidar to hide virtual objects behind real ones (called occlusion), and place virtual objects within more complicated room mappings, like on a table or chair.
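
For developers, switching these lidar features on comes down to a little ARKit and RealityKit configuration. Here's a minimal sketch (the startLidarSession function name is mine, not Apple's) that enables room meshing, per-frame depth and occlusion on devices that support them:

```swift
import ARKit
import RealityKit

// Minimal sketch of opting in to lidar-backed features in an AR app.
// Scene reconstruction and scene depth only work on lidar-equipped devices,
// so each is checked for support first.
func startLidarSession(in arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Build a live triangle mesh of the room, used for placing objects
    // on surfaces like tables, chairs and stairs.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Ask for per-frame depth maps from the lidar sensor.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    // Let RealityKit hide virtual objects behind real ones (occlusion).
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```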

I've been testing it out on an Apple Arcade game, Hot Lava, which already uses lidar to scan a room and all its obstacles. I was able to place virtual objects on stairs, and have things hide behind real-life objects in the room. Expect a lot more AR apps to start adding lidar support like this for richer experiences.


Snapchat's next wave of lenses will start adopting depth-sensing using the iPhone 12 Pro's lidar.

Snapchat

But there's extra potential beyond that, with a longer tail. Many companies are dreaming of headsets that will blend virtual objects and real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's a possibility that people's own devices could eventually help crowdsource that info, or add extra on-the-fly data. Again, AR headsets like Magic Leap and HoloLens already prescan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 and 13 Pro and iPad Pro are like AR headsets without the headset part... and could pave the way for Apple's first VR/AR headset, expected either this year or next. For an example of how this would work, look to the high-end Varjo XR-3 headset, which uses lidar for mixed reality.


A 3D room scan from Occipital's Canvas app, enabled by depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could open up these lidar-equipped phones and tablets to be 3D-content capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
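
As a rough example of that last point, here's a small Swift sketch (the centerDistance helper is hypothetical, not from any shipping app) that reads ARKit's lidar depth map to measure the distance to whatever sits at the center of the frame. It assumes a session configured with the sceneDepth frame semantic, as in the earlier example:

```swift
import ARKit
import CoreVideo

// Read the lidar depth map and return the distance, in meters, to the
// object at the center of the frame.
func centerDistance(from frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // The depth map stores one 32-bit float distance per pixel.
    let centerRow = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return centerRow[width / 2]
}
```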

I've already tried a few early lidar-enabled 3D scanning apps on the iPhone 12 Pro with mixed success (3D Scanner App, Lidar Scanner and Record3D), but they can be used to scan objects or map out rooms with surprising speed. The 16-foot effective range of lidar's scanning is enough to reach across most rooms in my house, but in bigger outdoor spaces it takes more moving around. Again, Apple's front-facing TrueDepth camera already does similar things at closer range. Over time, it'll be interesting to see if Apple ends up putting 3D scanning features into its own camera apps, putting the tech more front-and-center. For now, 3D scanning is getting better, but remains a more niche feature for most people.

Watch this: Our in-depth review of the iPhone 12 and 12 Pro

Apple isn't the first to explore tech like this on a phone

Google had this same idea in mind when Project Tango -- an early AR platform that was only on two phones -- was created. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that estimate depth from standard cameras without needing the same hardware. This time, however, lidar is already finding its way into cars, AR headsets, robotics, and much more.