Apple is bringing new accessibility features to iPads and iPhones, designed to cater to a diverse range of user needs. These include the ability to control your device with eye-tracking technology, create custom voice-based shortcuts, experience music through the device's haptic engine and more. The company unveiled the announcements ahead of Global Accessibility Awareness Day on Thursday.

Apple already supported eye tracking in iOS and iPadOS, but it required dedicated eye-tracking hardware. This is the first time the company has offered a way to control an iPhone or iPad without extra hardware or accessories. The new built-in eye-tracking option uses the front-facing camera to let people navigate through apps, leveraging AI to understand where the user is looking and which gesture they want to perform, such as swiping or tapping. There's also Dwell Control, which senses when a person's gaze pauses on an element, indicating they want to select it.

"Vocal Shortcuts," another useful new feature, improves on Apple's voice-based controls. It lets people assign custom sounds or words to launch shortcuts and complete tasks; for instance, a user can say something as simple as "Ah!" to have Siri launch an app. The company also developed "Listen for Atypical Speech," which uses machine learning to recognize a wider range of speech patterns and is designed for users with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS) and stroke.

Other speech improvements Apple has made in the past include "Personal Voice," which launched last year to give users at risk of losing their ability to speak a synthesized voice that sounds like their own.

For people who are deaf or hard of hearing, "Music Haptics" is a new feature that lets users experience the millions of songs in Apple Music through a series of taps, textures and vibrations. It will also be available as an API, so music app developers will soon be able to offer their users a more accessible way to experience audio.

Apple also announced a new feature to help with motion sickness in cars. Motion sickness is often triggered by a sensory conflict between what a person sees and what they feel, as when reading stationary content in a moving vehicle. The new "Vehicle Motion Cues" setting addresses this by putting animated dots on the edges of the screen that sway and move in the direction of the vehicle's motion.

CarPlay is getting an update as well, including a "Voice Control" feature; "Color Filters," which makes the interface easier to see for colorblind users, alongside bold and large text options; and "Sound Recognition" to notify users who are deaf or hard of hearing when there are car horns and sirens.

Apple also revealed an accessibility feature coming to visionOS, which will enable live captions during FaceTime calls. 

