Apple reveals Eye Tracking coming to iPhone and iPad later this year, likely with iOS 18
![](https://www.theapplepost.com/wp-content/uploads/2024/05/933D8E55-DBA9-4F79-86ED-AE8A827D4F98.png)
Apple has shared a sneak preview of some of the new Accessibility features coming to iPhone and iPad later this year, likely with the release of iOS 18 in the fall.
In a post on the company's Newsroom, Apple says it is adding Eye Tracking, a way for users with physical disabilities to control iPad or iPhone with their eyes; Music Haptics, which offers a new way for users who are deaf or hard of hearing to experience music using the Taptic Engine in iPhone; and Vocal Shortcuts, which let users perform tasks by making a custom sound on their device.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Describing the new Eye Tracking feature, Apple says: “Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.”
“Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.”