Apple is taking major strides in accessibility technology. Soon, people will be able to control their iPhones and iPads with just their eyes. This new feature, designed for people with physical disabilities, will arrive later this year with iOS 18 and iPadOS 18. Here’s a closer look at how the technology works and what other changes are coming to make Apple devices easier to use for people with disabilities.
Eye Tracking: A New Way to Navigate
Apple’s new eye-tracking feature uses artificial intelligence to let people navigate iOS and iPadOS with their eyes alone. Setup and calibration take only seconds and rely on the front-facing camera; once calibrated, users can control their devices without any extra hardware or accessories. With Dwell Control, users can move through apps, activate on-screen elements, press buttons, swipe, and perform other gestures simply by holding their gaze.
People with physical disabilities stand to benefit the most from eye tracking, since it removes the need for touch input entirely. For privacy and security, all the data used to set up and control the feature is stored securely on the device.
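Apple has not published how Eye Tracking works internally, but the front-facing TrueDepth camera already exposes a gaze signal to developers through ARKit. The sketch below is purely illustrative, not Apple’s implementation; it only shows how an app can read that per-frame, on-device gaze estimate:

```swift
import ARKit

// Illustrative sketch: reading a gaze estimate from ARKit face tracking.
// This is NOT Apple's Eye Tracking feature, just an example of the kind of
// on-device signal the front-facing camera can provide.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint is a gaze estimate in face-anchor space; mapping it
            // onto screen coordinates and adding dwell timing is omitted here.
            print("gaze estimate:", face.lookAtPoint)
        }
    }
}
```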
Enhancing the Music Experience
Alongside eye tracking, Apple is introducing Music Haptics, a feature that uses the iPhone’s Taptic Engine to make music more immersive. Through taps, textures, and refined vibrations, it lets users who are deaf or hard of hearing experience the rhythm of a song. Music Haptics will work across millions of songs in Apple Music, and developers will be able to bring it to their own apps through a new API.
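The new Music Haptics API has not been detailed publicly yet, so the sketch below instead uses the existing Core Haptics framework, which drives the same Taptic Engine, to show what beat-like haptic playback looks like in code:

```swift
import CoreHaptics

// Illustrative only: this is Core Haptics, not the new Music Haptics API.
// It plays four sharp, beat-like taps on the Taptic Engine.
func playBeatPattern() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
    let engine = try CHHapticEngine()
    try engine.start()

    // One transient tap every half second.
    let events = (0..<4).map { i in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: Double(i) * 0.5
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: 0)
}
```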
Reducing Motion Sickness
Apple has also added a new Vehicle Motion Cues feature for people who experience motion sickness. Animated dots at the edges of the screen reflect the vehicle’s movement, reducing the sensory conflict between what a person sees and what they feel, a common cause of motion sickness, without getting in the way of the main content. The feature can be toggled manually in Control Center or set to turn on automatically when the device detects it is in a moving vehicle.
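Apple has not said exactly how the automatic detection works, but Core Motion already lets apps recognize an automotive context on-device. The hypothetical sketch below shows only that detection step, not Apple’s actual overlay logic:

```swift
import CoreMotion

// Sketch of the "automatic" half of the feature: Core Motion can report
// when the device appears to be in a vehicle. Apple's on-screen dot
// rendering is not public; this shows detection only.
let activityManager = CMMotionActivityManager()

func watchForDriving() {
    guard CMMotionActivityManager.isActivityAvailable() else { return }
    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity, activity.automotive else { return }
        // A real implementation would enable the motion-cue overlay here.
        print("Vehicle detected; motion cues could be shown.")
    }
}
```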
Vocal Shortcuts and Atypical Speech Recognition
Vocal Shortcuts are another useful addition. They let users assign custom utterances that Siri recognizes to launch shortcuts and complete complex tasks. This hands-free control can be especially helpful for people with limited mobility.
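Vocal Shortcuts themselves are configured by users in Settings rather than by app code, but the actions they trigger can be exposed through Apple’s existing App Intents framework. A minimal, hypothetical intent (the “Start Workout” action is invented for illustration) might look like this:

```swift
import AppIntents

// Hypothetical example: an App Intent a user could attach a custom
// utterance to via Vocal Shortcuts. "Start Workout" is illustrative only.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Kick off the app's workout flow here.
        return .result(dialog: "Workout started.")
    }
}
```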
Apple is also introducing Listen for Atypical Speech, which uses on-device machine learning to recognize a wider range of speech patterns. This makes it easier for users with acquired or progressive conditions that affect speech, such as cerebral palsy or ALS, to be understood by their devices.
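There is no public API for Listen for Atypical Speech; the feature works at the system level. For context, the sketch below shows the existing on-device recognition pipeline in the Speech framework that such a feature conceptually builds on:

```swift
import Speech

// Baseline on-device speech recognition with the Speech framework.
// (Real use requires SFSpeechRecognizer.requestAuthorization first.)
func recognizeOnDevice(fileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(), recognizer.isAvailable else { return }
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    // Keep processing on the device, as the accessibility features do.
    if recognizer.supportsOnDeviceRecognition {
        request.requiresOnDeviceRecognition = true
    }
    _ = recognizer.recognitionTask(with: request) { result, _ in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```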
Apple’s Commitment to Accessibility
Apple is always working to make technology easier for everyone to use, and these new features are part of that effort. Apple’s CEO, Tim Cook, emphasized the company’s commitment to inclusive design: “We believe deeply in the transformative power of innovation to make lives better. That’s why, for almost 40 years, Apple has pushed for inclusive design by making our hardware and software accessible by design.”
Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, underscored how significant the updates are: “Each year, we break new ground when it comes to accessibility.” These features will give many users new ways to communicate, control their devices, and get around.
Looking Ahead
These new accessibility features, including Music Haptics, Vocal Shortcuts, Listen for Atypical Speech, and Eye Tracking, will be available later this year. They mark a significant step toward making Apple products more usable for people with a wide range of needs. As technology evolves, Apple continues to push for products everyone can use. These features will not only improve the experience for all users but also give people with disabilities new, practical ways to interact with their devices.