Apple brings eye tracking to iPhone and iPad in accessibility update

Apple is to enable eye tracking on the iPhone and iPad as part of a new range of accessibility tools aimed at helping those with physical disabilities to more easily use their devices.

Using artificial intelligence, it will allow users to navigate their Apple device using just their eyes.

The new feature is joined by a new Music Haptics tool, which uses the taptic engine in the iPhone – the component that powers the device’s vibrations – to enable those who are deaf or hard of hearing to experience music through vibrations synchronised with the audio.

The eye-tracking tool uses the front-facing camera on the iPhone or iPad to set up and calibrate, and does not require any additional hardware or software.

The AI processing to enable eye tracking also takes place on-device, Apple said.

“We believe deeply in the transformative power of innovation to enrich lives,” Apple chief executive Tim Cook said.

“That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software.

“We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

Also among the suite of new features announced by the tech giant was a feature the firm says could reduce motion sickness for passengers in moving vehicles.

Apple said research has shown that motion sickness is commonly caused by sensory conflict between what a person sees and what they feel. Its new Vehicle Motion Cues feature, which places animated dots on the edges of the screen to represent changes in vehicle motion, can help to reduce that sensory conflict and therefore motion sickness, the firm said.
