Apple Previews iPhone, iPad and Mac Accessibility Features Ahead of WWDC

Live Speech, Point and Speak and other software updates are slated to roll out later this year.

Abrar Al-Heeti Video producer / CNET

Features like Assistive Access are designed to lighten cognitive load, while Live Speech lets users type out text that's then spoken aloud.


Apple on Tuesday previewed a handful of new features for the iPhone, iPad and Mac designed to boost cognitive, vision, hearing and mobility accessibility, ahead of Global Accessibility Awareness Day. The features are slated to roll out later this year. This comes as Apple gears up for its Worldwide Developers Conference, which kicks off June 5. 


One feature, called Live Speech, is geared toward users who are nonspeaking or who have diverse speech patterns or disabilities. Live Speech lets someone type what they want to say and have it spoken aloud. The feature can be used for in-person conversations as well as over the phone and on FaceTime. It works on iPhone, iPad and Mac, using built-in device voices like Siri's. You could type, "Nice to meet you, I'm ..." to introduce yourself, for example, and you can also save favorite phrases such as, "Can I please get a black coffee?"

Taking that feature a step further is Personal Voice, which lets users at risk of speech loss create a voice that sounds like them and then have it speak aloud their typed-out phrases. Personal Voice uses on-device machine learning. To train the feature, a person spends about 15 minutes speaking a series of text prompts aloud on iPhone or iPad. 


The iPhone's Magnifier app is also getting a new feature called Point and Speak, which allows users with vision disabilities to point to objects with text labels and have their device read that text aloud. For example, someone could use this to identify buttons on a microwave. Point and Speak uses your phone's camera, lidar scanner and on-device machine learning to find and recognize text as you move your finger across different objects. Point and Speak can be used alongside other Magnifier features like People Detection, Door Detection and Image Descriptions, which help blind and low-vision users navigate and identify their surroundings.

Assistive Access is designed for people with cognitive disabilities and offers a more focused device interface to lighten cognitive load. This includes large text labels and high-contrast buttons on the iPhone's home screen and across Calls, Messages, Camera, Photos and Music. The experience can be tailored to different preferences. For instance, someone who prefers visual communication can use an emoji-only keyboard in Messages or can record a video message to send.

"These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways," Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives, said in a statement.

Other accessibility updates coming this year include the ability to pair Made for iPhone hearing devices directly to Mac and to more easily adjust text size across Mac apps like Finder, Messages, Mail, Calendar and Notes. Voice Control is also adding phonetic suggestions, so users who type with their voice can choose the correct word among similar-sounding options, such as "do," "due" and "dew."

Apple is also launching SignTime in Germany, Italy, Spain and South Korea on Thursday, which lets Apple Store customers communicate with staff via sign language interpreters. The service is already available in the US, UK, Canada, France, Australia and Japan.

Apple is one of many companies boosting its accessibility offerings. Other tech giants like Google have rolled out features like Lookout, which helps blind and low-vision users identify objects and read documents using their phone's camera. Last year, Google added a feature called Guided Frame to its Pixel phones, which uses audio and haptic cues to give users exact guidance for framing their selfies.