Apple Plans for Exciting New Accessibility Features, Including Eye Tracking

Apple has confirmed that several new accessibility features are coming later this year, including Eye Tracking, Music Haptics, Vocal Shortcuts, and Vehicle Motion Cues. The features are designed to serve a wide range of accessibility needs, from users who are hard of hearing to those with limited or no vision to those with speech challenges. There’s even a new feature intended to help people who suffer from motion sickness while using their devices in moving vehicles.

Eye Tracking will let those with physical disabilities control their iPhone or iPad with their eyes. With Music Haptics, those who are deaf or hard of hearing will be able to experience music using the Taptic Engine in iPhone. Meanwhile, Vocal Shortcuts lets users perform tasks by making a custom sound, and Vehicle Motion Cues helps reduce motion sickness when using iPhone or iPad in a moving vehicle. Additional accessibility features are coming to visionOS as well.

All these new accessibility features combine Apple hardware and software, harnessing Apple silicon, AI, and machine learning. Here’s a rundown of how each will work.

Eye Tracking

Powered by artificial intelligence, Eye Tracking makes it possible to navigate iPad and iPhone with just your eyes. Designed for users with physical disabilities, it uses the front-facing camera to set up and calibrate in seconds. With on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.

Eye Tracking works across iPadOS and iOS apps and doesn’t require additional hardware or accessories. With Eye Tracking, you can navigate through the elements of an app, use Dwell Control to activate each element, and access additional functions like physical buttons, swipes, and other gestures solely with your eyes.

Music Haptics

Those who are deaf or hard of hearing can leverage Music Haptics which, when turned on, plays taps, textures, and refined vibrations along with the audio of the music. Music Haptics works across millions of songs in the Apple Music catalog and will be available as an API for developers to make music more accessible in their apps.
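
Apple hasn’t published the Music Haptics API yet, so there’s nothing concrete for developers to code against today. As a rough sketch of what driving the Taptic Engine currently looks like, here’s a minimal Swift example using the existing Core Haptics framework, not the forthcoming Music Haptics API; the beat timings and intensities are made-up placeholders rather than anything derived from a real song.

```swift
import CoreHaptics

// Sketch: play a short run of haptic "taps" with Core Haptics.
// This stands in for the kind of beat-synced feedback Music Haptics describes;
// it is NOT the Music Haptics API, which Apple has not yet documented.
final class HapticBeatSketch {
    private var engine: CHHapticEngine?

    func playTaps() throws {
        // Core Haptics only works on hardware with haptic support (e.g. iPhone's Taptic Engine).
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        engine = try CHHapticEngine()
        try engine?.start()

        // Three transient taps, half a second apart, fading in intensity.
        let events = (0..<3).map { beat in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0 - Float(beat) * 0.3),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
                ],
                relativeTime: Double(beat) * 0.5
            )
        }

        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```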

Speech

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks. Listen for Atypical Speech, another new feature, offers an option for enhancing speech recognition for a wider range of speech; it uses on-device machine learning to recognize a user’s speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features provide a new level of customization and control, building on features introduced in iOS 17 for users who are nonspeaking or at risk of losing their ability to speak.
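
Since Vocal Shortcuts trigger existing shortcuts, the piece developers can already influence is which actions their app exposes to the Shortcuts system, typically through the App Intents framework. Here’s a minimal, hypothetical sketch of such an intent; the “Start Workout Log” name and dialog text are invented for illustration, and nothing in it is specific to Vocal Shortcuts itself.

```swift
import AppIntents

// Hypothetical app action exposed to Shortcuts via App Intents.
// A user could then map a custom utterance to it with Vocal Shortcuts.
struct StartWorkoutLogIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout Log"
    static var description = IntentDescription("Begins logging a new workout session.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real app logic would start the session here; this sketch only returns a confirmation.
        return .result(dialog: "Workout logging started.")
    }
}
```

Once an app ships an intent like this, it appears as an action in the Shortcuts app, which is presumably where a custom utterance would be pointed.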

Vehicle Motion Cues

Tend to suffer from motion sickness in the car when you look at an electronic device? The best solution is to enjoy your surroundings instead of burying your face in your phone, but on long road trips a short reprieve to watch a movie or TV show, or play a game, can help the time pass. Vehicle Motion Cues is designed to help reduce motion sickness for passengers in moving vehicles.

According to Apple, the feature builds on research suggesting that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel. To reduce that conflict without interfering with the main content, animated dots on the edges of the screen represent changes in vehicle motion. Using sensors built into iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on iPhone or can be turned on and off in Control Center. It will be interesting to hear feedback from folks who suffer from motion sickness once this feature is officially available.
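
Apple hasn’t said which sensors or APIs sit behind Vehicle Motion Cues, and the feature needs no code from developers, but for a sense of how an app can already tell it’s riding in a car, here’s a minimal Swift sketch using Core Motion’s activity classification. Treat it as an assumption-laden illustration, not Apple’s stated implementation.

```swift
import CoreMotion

// Sketch: watch Core Motion's activity classification and report whether the
// device appears to be in a moving vehicle. Illustrative only; Apple has not
// said Vehicle Motion Cues works this way.
// Note: apps reading motion activity need the NSMotionUsageDescription key in Info.plist.
final class VehicleMotionWatcher {
    private let activityManager = CMMotionActivityManager()

    func start(onChange: @escaping (Bool) -> Void) {
        // Motion activity data isn't available on every device.
        guard CMMotionActivityManager.isActivityAvailable() else { return }

        // `automotive` is true when the system classifies the motion as vehicle travel.
        activityManager.startActivityUpdates(to: .main) { activity in
            onChange(activity?.automotive ?? false)
        }
    }

    func stop() {
        activityManager.stopActivityUpdates()
    }
}
```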

CarPlay Voice Control & More

Additionally, there are accessibility updates coming to CarPlay, including Voice Control, Color Filters, and Sound Recognition. With Voice Control, you can navigate CarPlay and control apps with just your voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. For those who are colourblind, Color Filters make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.

visionOS

This year, accessibility features coming to visionOS will include systemwide Live Captions to help everyone, including users who are deaf or hard of hearing, follow along with spoken dialogue in live conversations and in audio from apps. With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will add the capability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone hearing devices and cochlear hearing processors. Updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.

More Updates

There will be several other updates for those with accessibility needs. VoiceOver, for example, includes new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac. Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.

Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing. There will also be Japanese language availability for Braille Screen Input, support for multi-line braille with Dot Pad, and the option to choose different input and output tables.

For users with low vision, Hover Typing shows larger text when typing in a text field, in a user’s preferred font and colour.

For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.

For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.

For users with physical disabilities, Virtual Trackpad for AssistiveTouch allows you to control your device using a small region of the screen as a resizable trackpad.

Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.

Voice Control will offer support for custom vocabularies and complex words.