Apple announces new accessibility features, including Eye Tracking and Music Haptics
The marquee addition is Eye Tracking, which lets iPhone and iPad users with physical disabilities control their devices using only their eyes. The feature relies on the front-facing camera and on-device machine learning, so it works without additional hardware or accessories.
Additional features include:
- Music Haptics: Gives users who are deaf or hard of hearing a new way to experience music. The iPhone's Taptic Engine plays taps, textures, and vibrations synchronized to the audio of a song.
- Vehicle Motion Cues: Aims to reduce motion sickness for iPhone and iPad users riding in moving vehicles. When enabled, the feature displays animated dots on the edges of the screen that shift with the vehicle's motion, which Apple says helps reduce the sensory conflict that can cause motion sickness.
- Vocal Shortcuts: Enables users to assign custom spoken phrases for triggering Shortcuts and executing complex multi-step actions.
Accessibility enhancements are also slated for CarPlay, with existing iOS features like Voice Control, Color Filters, and Sound Recognition becoming available within the CarPlay interface.
Apple announced a range of further accessibility improvements, including:
- New VoiceOver narration voices.
- Hover Typing: Shows larger text when typing in a text field.
- The option to launch the Magnifier app's Detection Mode with the Action button on iPhone 15 Pro.
- Braille input refinements.