Apple recently announced a new wave of accessibility features coming later this year to iPhone and iPad. The most exciting addition is Eye Tracking, which will let users control their devices entirely with their eyes.
The feature is designed for users with physical disabilities, but anyone will be able to use it. Eye Tracking relies on on-device machine learning to calibrate to each individual user for a personalized experience, and it is likely the first of many AI-powered features coming to Apple's mobile devices.
Selection is handled by Dwell Control, a mechanism within Eye Tracking: when your gaze rests on an element of the screen for a set amount of time, that element activates. This lets users navigate apps, select items, and perform gestures such as button presses and swipes using only their eyes.
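Apple has not published how Eye Tracking works internally, but the dwell mechanic itself is easy to picture. The Swift sketch below is a hypothetical illustration: it assumes some gaze-tracking source feeds in screen coordinates, and it fires an activation once the gaze has rested near one point for a threshold duration. The threshold and tolerance values are assumptions, not Apple's.

```swift
import Foundation
import CoreGraphics

/// Hypothetical sketch of dwell-based activation. Apple has not
/// published Eye Tracking's internals; this only illustrates the
/// dwell concept: a gaze resting near one point long enough triggers it.
final class DwellDetector {
    private let dwellThreshold: TimeInterval = 1.0  // assumed dwell time in seconds
    private let tolerance: CGFloat = 40             // assumed jitter radius in points
    private var dwellStart: Date?
    private var anchor: CGPoint?

    /// Feed gaze samples in; calls `onActivate` once the gaze has
    /// stayed within `tolerance` of one spot for `dwellThreshold`.
    func process(gaze point: CGPoint, onActivate: (CGPoint) -> Void) {
        if let a = anchor, let start = dwellStart,
           (point.x - a.x) * (point.x - a.x) + (point.y - a.y) * (point.y - a.y)
               <= tolerance * tolerance {
            // Gaze is still resting near the anchor; check elapsed time.
            if Date().timeIntervalSince(start) >= dwellThreshold {
                onActivate(a)
                anchor = nil
                dwellStart = nil
            }
        } else {
            // Gaze moved to a new spot; restart the dwell timer there.
            anchor = point
            dwellStart = Date()
        }
    }
}
```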
Setup is quick and straightforward: Eye Tracking uses the front-facing camera to calibrate in seconds, with no additional hardware required. Apple also emphasizes that the data used to set up and control the feature is kept securely on the device and is not shared with Apple.
Eye Tracking is just one of several accessibility features Apple announced. Other highlights include:
1. Music Haptics: Designed for users who are deaf or hard of hearing, this feature uses the iPhone's Taptic Engine to play taps, textures, and refined vibrations in time with the music, adding a tactile dimension to listening (see the Core Haptics sketch after this list).
2. Personal Voice: This feature lets users at risk of losing their ability to speak create a synthesized voice that sounds like them; this year's update allows users who have difficulty pronouncing or reading full sentences to create a Personal Voice using shortened phrases.
3. Vehicle Motion Cues: This feature aims to reduce motion sickness for passengers by displaying animated dots at the edges of the screen that shift with the vehicle's movement, easing the sensory conflict between what you see and what you feel (a conceptual sketch follows below).
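Music Haptics itself is a system-level feature (Apple has said it will also be available as an API for developers), but the underlying idea of driving the Taptic Engine from audio-derived values can be sketched with the existing Core Haptics framework. In this hypothetical example, `loudness` is assumed to come from your own audio analysis:

```swift
import CoreHaptics

/// Conceptual sketch only: Music Haptics is a system feature, but
/// Core Haptics shows the underlying idea of driving the Taptic
/// Engine from audio-derived values. Here `loudness` (0...1) is
/// assumed to come from your own audio analysis.
func playTap(engine: CHHapticEngine, loudness: Float) throws {
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: loudness)
    let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
    let event = CHHapticEvent(eventType: .hapticTransient,
                              parameters: [intensity, sharpness],
                              relativeTime: 0)
    let pattern = try CHHapticPattern(events: [event], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}

// Usage sketch: create and start the engine once (on hardware that
// supports haptics), then call playTap on each beat or loudness peak.
// let engine = try CHHapticEngine()
// try engine.start()
// try playTap(engine: engine, loudness: 0.8)
```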
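Similarly, Vehicle Motion Cues is built into the system and Apple has not described its implementation, but Core Motion shows one plausible way on-screen dots could track a vehicle's movement. The scale factor and axis mapping below are assumptions made for illustration:

```swift
import CoreMotion
import CoreGraphics

/// Conceptual sketch of the idea behind Vehicle Motion Cues: map
/// device motion to on-screen dot offsets. The scale factor and axis
/// mapping are assumptions made for illustration.
final class MotionCueModel {
    private let manager = CMMotionManager()

    /// Starts motion updates; the callback receives an offset (in
    /// points) to apply to the cue dots on screen.
    func start(onUpdate: @escaping (CGVector) -> Void) {
        guard manager.isDeviceMotionAvailable else { return }
        manager.deviceMotionUpdateInterval = 1.0 / 60.0
        manager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let accel = motion?.userAcceleration else { return }
            // Shift the dots against the acceleration so their motion
            // mirrors the vehicle's, reducing visual/vestibular mismatch.
            let scale: CGFloat = 120
            onUpdate(CGVector(dx: CGFloat(-accel.x) * scale,
                              dy: CGFloat(accel.y) * scale))
        }
    }

    func stop() { manager.stopDeviceMotionUpdates() }
}
```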
The introduction of these features underscores Apple's commitment to accessibility and its continued use of cutting-edge technology to make its devices work for everyone in the Apple ecosystem.