Apple Short Film Highlights Accessibility Features

Detection Mode, Voice Control and AssistiveTouch have made navigating iPhones, iPads and Macs more seamless for people with disabilities.

Abrar Al-Heeti Technology Reporter

Apple showcased a handful of accessibility features that it's released in recent years in a new short film.

James Martin/CNET

Apple on Wednesday released a short film highlighting the company's efforts to make its products more accessible, ahead of International Day of Persons with Disabilities on Dec. 3. The iPhone maker showcased features like Detection Mode, Voice Control and AssistiveTouch, which it's released over the last few years to help users with disabilities better navigate iPhones, iPads, Macs and Apple Watches.

Detection Mode rolled out earlier this year and is built into the Magnifier app on iPhone and iPad. It gives blind and low-vision users a description of their surroundings using information from the camera, the LiDAR scanner and machine learning.

As part of Detection Mode, Door Detection helps blind and low-vision users find a door and know how far they are from it. It can also describe whether the door is open or closed and, if closed, how to open it. Door Detection can also read signs and symbols such as a room number.

Another component of Detection Mode is People Detection, which lets blind and low-vision iPhone and iPad users know how close someone is to them. Detection Mode also speaks aloud general image descriptions of someone's surroundings, similar to Google's Lookout app.

Apple's film also showcased a feature called Voice Control, which lets users speak voice commands to navigate and control their devices. People with physical and motor disabilities can also use facial expressions to control their iPhone, iPad or Mac. Facial expressions like sticking out your tongue or raising your eyebrows can simulate pointer actions like clicking with a mouse or trackpad.

And finally, Apple featured AssistiveTouch, which helps users with upper body limb differences control their devices. On iPhone and iPad, AssistiveTouch shows an onscreen button to help adjust volume, take a screenshot or lock your device without needing to touch a physical button. The feature also lets users control their Apple Watch with gestures like a pinch or clench, without needing to tap the screen.

Apple's short film features photographers, content creators and musicians who use these accessibility features in their work and everyday lives. It's available to watch on YouTube. There's also an audio described version of the film, which is linked in the YouTube description.

Apple is one of many companies working to make its products and services more accessible. In recent years, other tech giants like Google, Meta and Microsoft have launched features to cater to users with various disabilities. Last month, Google unveiled Guided Frame on the Pixel, which helps blind and low-vision users take selfies. In the last couple of years, Meta has added automatic captions to feed videos, IGTV and Instagram Stories. And in October, the University of Illinois at Urbana-Champaign spearheaded the Speech Accessibility Project, with support from Amazon, Apple, Google, Meta and Microsoft. The initiative aims to make voice recognition technology more useful for people with a range of diverse speech patterns and disabilities.