
Apple Is Bringing Eye Tracking, Vocal Shortcuts, Music Haptics to the iPhone

Accessibility updates are also coming to iPad, VisionOS and CarPlay.

Abrar Al-Heeti, Technology Reporter
Live Captions are coming to VisionOS. (Image: Apple)

A handful of Apple accessibility updates, coming to the iPhone, iPad, Vision Pro and CarPlay, will make it easier to navigate devices, communicate with Siri and connect with loved ones. The updates, announced by Apple on Wednesday, are slated to arrive later this year. There's also a feature that allows those who are deaf or hard of hearing to more fully experience music, and another that can help alleviate motion sickness.

The updates come ahead of Global Accessibility Awareness Day, which this year falls on May 16. Apple, along with other tech giants, has leaned into promoting digital accessibility across its platforms and products in recent years, launching features like Detection Mode, which gives blind and low-vision people a description of their surroundings; Voice Control, which lets you speak voice commands to navigate and control your devices; and Live Speech, which lets you type what you want to say and then have it spoken aloud.

The latest batch of accessibility features will make Apple's suite of products and services even more accessible to a wider range of people. It's possible the company will highlight some of these features during its Worldwide Developers Conference on June 10. In the meantime, here's what's in the pipeline.

Eye Tracking

In the coming months, Apple will introduce Eye Tracking to the iPhone and iPad. The feature will let you control your device using just your eyes, including navigating apps and selecting options and commands. No additional hardware or accessories are needed for it to work, eliminating the need for pricey add-on eye-tracking gadgets.

Read more: I Controlled Honor's Magic 6 Pro Phone With Just My Eyes

Eye Tracking works with both Apple's native apps and third-party apps. It uses the device's front-facing camera and on-device machine learning to decipher eye movements and carry out commands. 
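Apple hasn't detailed how its system-level implementation works. As a rough, developer-facing illustration of the underlying idea, ARKit already exposes a gaze estimate from the front-facing camera that an app can read; the Swift sketch below uses real ARKit APIs, but how Apple's feature turns gaze into navigation and selection isn't public and isn't shown here.

```swift
import ARKit

// A minimal sketch (not Apple's Eye Tracking feature) showing how an app can
// read a gaze estimate from the front-facing camera with ARKit. The types and
// properties below are real ARKit APIs; mapping gaze to UI selection is left out.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is the estimated point, in face coordinate space,
            // that the user's eyes are converging on.
            let gaze = face.lookAtPoint
            print("Gaze estimate:", gaze.x, gaze.y, gaze.z)
        }
    }
}
```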

Vocal Shortcuts

Vocal Shortcuts lets you carry out actions with custom utterances. (Image: Apple)

Vocal Shortcuts uses built-in, on-device speech recognition to let you carry out actions on your iPhone or iPad. You can set a custom utterance -- it doesn't need to be an intelligible word or standard voice command -- to launch shortcuts through Siri, like setting a timer, calling someone, getting directions, taking a screenshot or scrolling. Like Eye Tracking, Vocal Shortcuts relies on on-device machine learning.
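Apple hasn't published a developer API for Vocal Shortcuts itself. For context, the actions Siri and the Shortcuts app can trigger today are typically exposed through Apple's App Intents framework; the sketch below is a minimal, hypothetical example of such an action (the StartTimerIntent name and its behavior are illustrative assumptions, not part of Apple's announcement).

```swift
import AppIntents

// A hypothetical app action that Shortcuts -- and, by extension, a custom
// spoken utterance -- could trigger. Real apps would perform work in perform().
struct StartTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Focus Timer"

    @Parameter(title: "Minutes")
    var minutes: Int

    func perform() async throws -> some IntentResult {
        // A real app would start the timer here; this stub just succeeds.
        return .result()
    }
}
```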

Siri can better detect atypical speech 

Siri will have a new feature called "Listen for Atypical Speech." This will allow the voice assistant to better decipher a wider range of speech, using on-device machine learning to recognize those patterns.

The update was built with the help of the Speech Accessibility Project, an initiative spearheaded by the University of Illinois at Urbana-Champaign and supported by companies like Apple, Amazon, Google, Meta and Microsoft, with the goal of improving speech recognition for people with diverse speech patterns and disabilities.

Music Haptics

Using the Taptic Engine in the iPhone -- the hardware that enables haptic feedback for actions like clicking and typing -- people who are deaf or hard of hearing will be able to experience music through a series of taps, textures and vibrations that play along with the audio on their device. Music Haptics works across millions of songs in Apple Music, and it'll be available to developers as an API so they can bring the feature to other apps, too.
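Apple didn't share details of the Music Haptics API in its announcement. As a hedged illustration of the hardware capability it builds on, here's a minimal Core Haptics sketch that plays a single tap on the Taptic Engine; it isn't the Music Haptics API, just the way apps already trigger haptics today.

```swift
import CoreHaptics

// Plays one sharp, strong tap on the Taptic Engine using Core Haptics.
// This illustrates the haptic hardware Music Haptics uses, not Apple's new API.
func playTap() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // A single transient haptic event at time zero, at full intensity.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
        ],
        relativeTime: 0
    )

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```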

Sound Recognition and Voice Control in CarPlay

You'll be able to receive Sound Recognition notifications in CarPlay. (Image: Apple)

Apple is adding Sound Recognition to CarPlay, so drivers or passengers can be alerted to any important sounds picked up by their iPhones, like horns or sirens. Sound Recognition, which first became available in iOS 14 and iPadOS 14, is designed to notify those who are deaf or hard of hearing about any sound-based alarms, alerts and notifications, such as smoke alarms or doorbell chimes. (Google has a similar feature for Android and Wear OS, called Sound Notifications.)

Now, Sound Recognition will be integrated into CarPlay, so drivers and passengers can see notifications appear on that interface, too.
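For developers, the underlying building block -- on-device sound classification -- is already available through Apple's SoundAnalysis framework. The sketch below illustrates that technique with the built-in classifier; it is not the Sound Recognition feature or its CarPlay integration, which Apple handles at the system level.

```swift
import SoundAnalysis
import AVFoundation

// Receives classification results (e.g. "siren", "smoke_detector") from the analyzer.
final class SirenListener: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        print("Heard \(top.identifier) with confidence \(top.confidence)")
    }
}

// Builds a stream analyzer that classifies incoming audio with Apple's
// built-in sound classifier. The caller feeds it audio buffers from a mic tap.
func makeAnalyzer(format: AVAudioFormat, observer: SirenListener) throws -> SNAudioStreamAnalyzer {
    let analyzer = SNAudioStreamAnalyzer(format: format)
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    try analyzer.add(request, withObserver: observer)
    return analyzer
}
```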

Apple is also bringing Voice Control to CarPlay. This will allow drivers to simply speak commands to navigate CarPlay and control apps, such as saying "Swipe right" on menus, "Tap play" on music controls and "Tap go" on their GPS.

Color Filters are coming to CarPlay as well, to make the interface easier to see for people who are colorblind, along with options like Bold Text and Large Text.


Vehicle Motion Cues

This feature can help reduce motion sickness when you're riding in a moving vehicle. Research suggests motion sickness is often caused by a sensory conflict between what you see and what you feel, which is why it can be difficult to look at your phone's screen while in a moving car.

Vehicle Motion Cues reduces that sensory conflict by displaying animated dots at the edges of your phone's screen that reflect changes in the vehicle's motion. If the car turns left, the dots shift to the right, and vice versa. When the car accelerates, they move down, and when it brakes, they slowly drift forward and come to a stop.

The feature works as an overlay across any app, and it can be set to turn on automatically when your iPhone senses you're in a moving vehicle. It can also be switched on and off manually in Control Center.
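Apple hasn't said how the system-wide overlay is implemented. As a rough sketch of the general idea, an app could read device motion with Core Motion and translate acceleration into offsets for its own animated dots; the scaling and mapping below are assumptions made purely for illustration.

```swift
import CoreMotion

// Illustrative only: reads device motion and converts acceleration into an
// offset a view layer could apply to animated cue dots. Not Apple's implementation.
final class MotionCueModel {
    private let motion = CMMotionManager()
    /// Horizontal and vertical offsets, in points, to apply to the cue dots.
    private(set) var dotOffset: (x: Double, y: Double) = (0, 0)

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let accel = data?.userAcceleration else { return }
            // Push the dots opposite to lateral acceleration, and shift them
            // vertically with forward acceleration; the factor of 40 is arbitrary.
            self?.dotOffset = (x: -accel.x * 40, y: accel.z * 40)
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```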

Vehicle Motion Cues is designed to reduce motion sickness when you're looking at your screen. (Image: Apple)

Live Captions on VisionOS

When the Vision Pro launched earlier this year, it already came packed with accessibility features like Voice Control, Audio Descriptions and braille support. Soon, an updated version of VisionOS will also enable system-wide Live Captions, so you can follow along with what someone is saying, or audio from apps, using real-time transcriptions. This will work in FaceTime in VisionOS, so more people, including those who are deaf or hard of hearing, can connect using their Persona.

Other updates

Apple will roll out a handful of other accessibility updates, including more voices in the VoiceOver screen reader, a Hover Typing option that shows larger text (in a user's preferred font and color) when typing in a text field, and the ability to create a Personal Voice using shorter phrases.
