Apple announces accessibility features

Ahead of its annual developer-centric event (WWDC) on June 10, Apple announced new accessibility-focused features for iPhone and iPad. Coming later this year, the new accessibility features include Eye Tracking, Music Haptics, Vocal Shortcuts, and Vehicle Motion Cues. Apple said these features are the result of its strides in artificial intelligence and machine learning, combined with its advancements in hardware and software. Below are the details:

Eye Tracking

Apple said this feature will allow users to navigate their iPhone or iPad with just their eyes, which are tracked by the front-facing camera without any supplementary hardware. The feature is powered by on-device machine learning, and the data it uses is stored on the device itself and not shared, even with Apple. With Eye Tracking, users can explore the elements of an app and use Dwell Control to activate each element, accessing physical buttons, swipes and other functions with their eyes alone.

Music Haptics

Aimed at users who are deaf or hard of hearing, the Music Haptics feature is said to let them experience music on iPhone through haptic feedback, powered by the Taptic Engine. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations that follow the audio of the music, said Apple. The feature works across millions of songs in the Apple Music catalogue and will also be available as an API for developers to make music more accessible in their own apps.

Vocal Shortcuts

This will allow iPhone and iPad users to assign custom utterances that Siri can understand to launch shortcuts and complete actions. A related feature, Listen for Atypical Speech, is said to widen the scope of speech recognition for users with a broader range of speech patterns. According to Apple, this feature identifies vocal patterns through on-device machine learning.

Vehicle Motion Cues

To reduce motion sickness for passengers in moving vehicles, animated dots will appear on the edges of the iPhone or iPad screen, representing changes in vehicle motion. This reduces the sensory conflict between what the user sees and feels, without interfering with the content on the screen. Apple said the feature uses the device's built-in sensors to recognise when the user is in a moving vehicle, and it can be turned on or off from Control Center.


Apple is also bringing more accessibility features to CarPlay. These include Voice Control, Colour Filters and Sound Recognition. Voice Control will let users navigate CarPlay and control apps with their voice; Sound Recognition will allow drivers or passengers who are deaf or hard of hearing to turn on alerts that notify them of car horns and sirens; and Colour Filters will make the interface easier to use for users with colour blindness.

First Published: May 16 2024 | 12:40 PM IST