Apple has announced a new set of assistive features to enhance the accessibility of its devices, namely iPhones and iPads. These include eye tracking, customisable vocal shortcuts, music haptics, and more.
Among the highlights is the introduction of eye-tracking support for iPhones and iPads with A12 chips or later, allowing users to navigate their devices using only their eyes through the front-facing camera and on-device AI. Users can move through apps and menus by looking at the screen and selecting items by lingering on them, a process called “Dwell Control”, which is already available on the company’s Mac devices.
Apple is also improving its voice-based controls with customisable vocal shortcuts. These can be set up using on-device AI to recognise unique commands, making it easier to launch apps or perform tasks without needing to activate Siri. Additionally, a new feature called “Listen for Atypical Speech” will help devices better recognise atypical speech patterns, improving voice control for users with speech impairments. This development is part of the company’s collaboration with the Speech Accessibility Project at the Beckman Institute.
For users who are deaf or hard of hearing, Apple is introducing music haptics to its Music app, allowing them to experience music through taps, textures, and vibrations that sync with the audio. According to the company, millions of songs on Apple Music will support this feature at launch. Music haptics will also be available as an API, so third-party developers can incorporate it into their own apps, broadening accessibility.
Furthermore, Apple is enhancing CarPlay with voice control, colour filters, bold text support, and sound recognition, making it easier for drivers with disabilities to interact with their vehicles safely. Another notable feature is Vehicle Motion Cues, which is designed to help users who experience motion sickness while using their devices in moving vehicles. This feature uses onscreen dots that sway in response to the vehicle’s motion, aligning sensory inputs to reduce discomfort.
Alongside these updates, other accessibility features announced by Apple include Live Captions in visionOS for the Vision Pro headset, a new Reader mode in Magnifier, as well as support for multi-line braille and a virtual trackpad for AssistiveTouch users. All of these are expected to arrive in upcoming iOS and iPadOS updates, likely to be detailed at WWDC 2024 in June.
(Source: Apple Newsroom)