iPhone 12 Pro, iOS 14.2 let people who are blind detect others around them

A new People Detection feature uses lidar to alert iPhone 12 Pro, 12 Pro Max and iPad Pro users to how close other people are.

Shara Tibken

The lidar scanner on Apple's new iPhone 12 Pro and 12 Pro Max enables new AR features -- and the ability for people who are blind or low vision to detect others around them.

James Martin/CNET

Apple's iPhone 12 Pro and 12 Pro Max have a new feature for users who are blind or low vision -- the ability to essentially see other people coming. The feature went live for iPhone and iPad users on Thursday with the launch of iOS 14.2.

The devices make use of the new lidar sensor on the back of the phones to detect how close other people are to the user, something Apple has named People Detection. Lidar is a type of depth sensor that helps with augmented reality apps and serves as the eyes of self-driving cars. Now, Apple is applying it to accessibility in an effort to help people who have vision problems better navigate the world around them. 

When someone who is blind is grocery shopping, for instance, they'll be able to turn on People Detection on their iPhone 12 Pro to let them know when they need to move up in the checkout line. Or someone walking down a sidewalk will get alerts about how close other people are as they pass by. People who are blind or low vision can use the feature to figure out if a seat is available at a table or on public transit, and they'll be able to maintain proper social distance when going through health screening or security lines at an airport.

People Detection can tell the user how far away another person is, in feet or meters, and it works at distances of up to 15 feet (about 5 meters). Anyone in the iPhone 12 Pro's wide-angle camera view can be detected by the feature. If there are multiple people nearby, People Detection gives the distance to the one closest to the iPhone user.

Apple had released a beta version of the iOS 14.2 software to developers on Friday before rolling out the full version to all users on Thursday.

Globally, at least 2.2 billion people have a vision impairment or blindness, according to a World Health Organization report from last year. In the US, over 1 million people over the age of 40 are blind, according to the Centers for Disease Control and Prevention. By 2050, that number could skyrocket to about 9 million because of the "increasing epidemics of diabetes and other chronic diseases and our rapidly aging US population," the CDC said. 

Apple has made accessibility a focus for decades. It builds features into its technology to help people with low vision navigate the iPhone's touch screen and to allow people with motor impairments to virtually tap on interface icons. Four years ago, Apple kicked off one of its flashy product launches by talking about accessibility and showing off its new, dedicated accessibility site.

"Technology should be accessible to everyone," Apple CEO Tim Cook said at the time. 

Apple has long built features to help people who are blind or low vision, and People Detection takes that a step further.

Lidar sensing

The technology makes use of the new lidar scanner built into the camera array of the iPhone 12 Pro and 12 Pro Max. It's also on the newest iPad Pro and is likely to come to other devices in the future. The scanner itself is a tiny black dot near the camera lens on the back of the new, highest-end iPhones.

People Detection won't work on older iPhones, the iPhone 12, the 12 Mini or even the new iPad Air. None of those devices has a lidar scanner, which is essential for the people-sensing technology.

People Detection uses Apple's ARKit People Occlusion feature to detect whether someone is in the camera's field of view and to estimate how far away the person is. The lidar scanner makes the estimate more accurate: it sends out a short burst of light and measures how long the light takes to bounce back to the scanner. Because the camera-based detection still needs light to see people, the feature doesn't work in the dark or in low-light environments.

All of the sensing happens in real time to give feedback on how far away a person is from the iPhone 12 Pro user. 
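
Apple hasn't published how People Detection is built, but the ingredients it describes -- person segmentation plus depth -- are exposed through ARKit's public API. The Swift sketch below is a hypothetical illustration of a similar nearest-person estimate, not Apple's actual code; the class name and the 5-meter cutoff are assumptions for the example.

```swift
import ARKit

// A minimal, hypothetical sketch (not Apple's code) of estimating the
// nearest person's distance with ARKit's person segmentation plus depth.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Person segmentation with depth requires a recent A-series chip;
        // on lidar devices like the iPhone 12 Pro the depth is more accurate.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .personSegmentationWithDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // segmentationBuffer marks which pixels belong to a person;
        // estimatedDepthData holds per-pixel depth in meters for those pixels.
        // The two buffers are assumed to share a resolution here.
        guard let mask = frame.segmentationBuffer,
              let depth = frame.estimatedDepthData else { return }

        CVPixelBufferLockBaseAddress(mask, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(mask, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }

        let width = CVPixelBufferGetWidth(depth)
        let height = CVPixelBufferGetHeight(depth)
        let maskRow = CVPixelBufferGetBytesPerRow(mask)
        let depthRow = CVPixelBufferGetBytesPerRow(depth) / MemoryLayout<Float32>.stride
        let maskPtr = CVPixelBufferGetBaseAddress(mask)!.assumingMemoryBound(to: UInt8.self)
        let depthPtr = CVPixelBufferGetBaseAddress(depth)!.assumingMemoryBound(to: Float32.self)

        // Find the closest person pixel, mirroring the feature's behavior
        // of reporting the nearest person when several are in view.
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<height {
            for x in 0..<width where maskPtr[y * maskRow + x] != 0 {
                let d = depthPtr[y * depthRow + x]
                if d > 0, d < nearest { nearest = d }
            }
        }
        if nearest <= 5.0 {  // the feature's stated range: ~15 ft / 5 m
            print(String(format: "Nearest person: %.1f m", nearest))
        }
    }
}
```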

The user gets feedback from People Detection in four possible ways, which can be used in any combination and customized in settings. The first is an audible readout: the phone says the distance out loud -- "15, 14, 13" and so on -- as a person approaches, when the unit is feet. For users who choose meters, it reads out the distance in half-meter increments.
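
That unit handling -- whole feet, half-meter steps -- amounts to a small rounding rule. A hypothetical Swift helper, with AVSpeechSynthesizer standing in for however the phone actually voices the readout:

```swift
import AVFoundation

// Illustrative only: rounds a measured distance to the spoken units the
// article describes (whole feet, or half-meter steps for metric users).
let synthesizer = AVSpeechSynthesizer()

func announceDistance(meters: Float, useMetric: Bool) {
    let text: String
    if useMetric {
        // Nearest half meter: 2.3 m reads as "2.5 meters".
        let halfMeters = (meters * 2).rounded() / 2
        text = "\(halfMeters) meters"
    } else {
        // Nearest whole foot: 4.0 m reads as "13 feet".
        let feet = Int((meters * 3.28084).rounded())
        text = "\(feet) feet"
    }
    synthesizer.speak(AVSpeechUtterance(string: text))
}
```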

iPhone 12 Pro users can also set a threshold distance marked by two distinct audio tones: one plays when people are beyond that distance, and another when they come closer. The default threshold is 6 feet, or 2 meters.
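
In code, that threshold behavior reduces to a single comparison. A minimal sketch; the tone file names are hypothetical placeholders, since Apple doesn't document which sounds it plays:

```swift
// Sketch of the two-tone threshold logic described above.
struct ThresholdAlert {
    var thresholdMeters: Float = 2.0  // default: 2 meters (about 6 feet)

    // Returns a placeholder tone name: one sound while the nearest person
    // stays beyond the threshold, a different one once they come within it.
    func tone(forDistance meters: Float) -> String {
        meters > thresholdMeters ? "tone_far.caf" : "tone_near.caf"
    }
}
```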

The third type of alert is through haptic feedback. The farther away a person is, the lower and slower the physical pulsing of the haptics. The closer the person gets, the faster the haptics buzz. Currently, the haptics are only through the phone, not through the Apple Watch. 
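
Haptics paced by distance can be approximated with UIKit's UIImpactFeedbackGenerator. The sketch below maps distance onto a pulse interval; the constants are illustrative guesses at the behavior the article describes, not Apple's actual tuning.

```swift
import UIKit

// Illustrative distance-paced haptics: the closer the person,
// the shorter the interval between pulses.
final class DistanceHaptics {
    private let generator = UIImpactFeedbackGenerator(style: .medium)
    private var timer: Timer?

    func update(distanceMeters: Float) {
        // Map 0-5 m onto a pulse interval of roughly 0.1-1.0 seconds.
        let clamped = min(max(distanceMeters, 0), 5)
        let interval = TimeInterval(0.1 + (clamped / 5) * 0.9)

        timer?.invalidate()
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.generator.impactOccurred()
        }
    }

    func stop() { timer?.invalidate() }
}
```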

There's also the option to get a visual readout on the screen itself. In that case, it will say how far away the person is, and a dotted line will point out where the person is on the screen. 

People Detection lives in Apple's Magnifier app. Users can launch it with Apple's Back Tap setting or through the triple-click side-button accessibility shortcut. Siri can launch Magnifier, but users then have to enable People Detection from there.

It's designed to be a situational tool that people turn on when they need it, rather than an always-on feature. Running it for a significant amount of time will consume a lot of battery life. 

For now, the iPhone only detects people, but developers could use the lidar technology to make apps for detecting objects. 

Other iOS 14.2 features include 13 new emoji characters and a music recognition feature via Shazam in the Control Center.