Apple announces new accessibility features, including eye tracking and music haptics

Apple unveils accessibility features for upcoming software updates. Eye Tracking allows iPhone and iPad users with physical limitations to control devices through eye gaze, while Music Haptics enhances music appreciation for users who are deaf or hard of hearing.

Image: 9to5mac

Apple unveiled new accessibility features scheduled to debut later this year across its operating systems.

The marquee addition is Eye Tracking, which lets iPhone and iPad users with physical limitations control their devices using only their eyes. The feature is built into iOS and iPadOS and relies on the front-facing camera with on-device machine learning, so it requires no additional hardware or accessories.

Additional features include:

  • Music Haptics: Offers users who are deaf or hard of hearing a new way to experience music. The iPhone's Taptic Engine plays taps and vibrations synchronized to a song's audio (a brief, hypothetical Core Haptics sketch of the general idea follows this list).

Video: Apple

  • Vehicle Motion Cues: Aims to reduce motion sickness for iPhone and iPad users riding in moving vehicles. When enabled, animated dots along the edges of the screen move with the vehicle's motion, easing the sensory conflict between what a user sees and what they feel (a short Core Motion sketch of how such motion could be sampled also follows the list).

Video: Apple

  • Vocal Shortcuts: Enables users to assign custom spoken phrases for triggering Shortcuts and executing complex multi-step actions.
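
For readers curious how audio-synchronized haptics can be produced on an iPhone, here is a minimal sketch using Apple's public Core Haptics framework: it plays one sharp transient tap at each supplied beat time. This is only an illustration of the general idea, not Apple's Music Haptics implementation; the playBeatHaptics function and its beatTimes parameter are hypothetical.

```swift
import Foundation
import CoreHaptics

/// Illustrative sketch: play one transient haptic tap per beat time.
/// Not Apple's Music Haptics implementation.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    // Skip devices without a supported haptics engine.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp, full-intensity tap per beat.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

The hard part, which this sketch skips entirely, is deriving the tap timings from a song's audio; Music Haptics performs that analysis for the user on supported songs.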
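
The overlay described under Vehicle Motion Cues needs a live motion signal to drive its dots. A minimal sketch of sampling such a signal, assuming Core Motion's device-motion updates are a reasonable source, is below; MotionCueSource is a hypothetical name, and Apple has not described its actual implementation.

```swift
import CoreMotion

/// Illustrative sketch: sample device motion at 60 Hz and pass the
/// gravity-free acceleration to a callback that could drive on-screen
/// motion cues. Not Apple's Vehicle Motion Cues implementation.
final class MotionCueSource {
    private let motionManager = CMMotionManager()

    func start(onUpdate: @escaping (CMAcceleration) -> Void) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // userAcceleration excludes gravity, so it reflects how the
            // vehicle (and the device with it) is actually accelerating.
            onUpdate(motion.userAcceleration)
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```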

Accessibility enhancements are also slated for CarPlay, with existing iOS features like Voice Control, Color Filters, and Sound Recognition becoming available within the CarPlay interface.

The new Sound Recognition feature in CarPlay alerts a user to a potential siren sound.
Image: Apple

Apple announced a range of further accessibility improvements, including:

  • New VoiceOver narration voices.
  • Hover Typing: Displays larger text while the user is typing in a text field.
  • An option to launch the Magnifier app's Detection Mode with the Action button on iPhone 15 Pro.
  • Braille input refinements.