Apple on Tuesday previewed innovative software features designed to help users with disabilities navigate and connect, aimed at people who are blind or low vision, deaf or hard of hearing, and those with physical and motor disabilities.
Apple says that using advances across hardware, software and machine learning, people who are blind or low vision can use their iPhone and iPad to navigate the last few feet to their destination with Door Detection; users with physical and motor disabilities who rely on assistive features such as Voice Control and Switch Control can fully control Apple Watch from their iPhone via Apple Watch Mirroring; and the deaf and hard of hearing community can follow Live Captions on iPhone, iPad and Mac.
Apple is also expanding support for its screen reader, VoiceOver, with more than 20 new languages and locales. These features will arrive later this year with software updates across Apple platforms.
“Apple embeds accessibility in all areas of our work, and we’re committed to designing the best products and services for everyone,” said Sarah Herrlinger, Apple’s senior director of Accessibility Policy and Initiatives. “We’re excited to introduce these new features, which combine innovation and creativity from teams across Apple to give users more options to use our products in ways that best suit their needs and lives.”
Using a combination of LiDAR, the camera and on-device machine learning, Door Detection will be available on iPhone and iPad models equipped with the LiDAR Scanner. The feature lives in a new Detection Mode within Magnifier, Apple’s built-in app that supports blind and low vision users.
It can help users locate a door upon arriving at a new destination, understand how far away it is, and describe the door’s attributes – whether it is open or closed and, if closed, whether it opens by pushing, turning a knob, or pulling a handle. It can also read signs and symbols around the door, such as a room number or an accessible-entrance sign.
For the deaf and hard of hearing community, Apple is bringing Live Captions to iPhone, iPad and Mac. Whether on FaceTime, in video-conferencing or social media apps, or elsewhere, users will be able to easily follow any audio content and adjust the caption font size for easier reading.
Live Captions in FaceTime attribute the automatically transcribed dialogue to each call participant, making group video calls even more convenient for users who are deaf or hard of hearing. When Live Captions are used for calls on Mac, users have the option to type a response and have it spoken aloud in real time to others in the conversation.
VoiceOver, Apple’s screen reader for blind and low vision users, is adding support for more than 20 additional locales and languages, including Bengali, Bulgarian, Catalan, Ukrainian and Vietnamese. Users will be able to choose from dozens of new voices optimized for assistive features across languages. These new languages, locales and voices will also be available in the Speak Selection and Speak Screen accessibility features. Additionally, VoiceOver users on Mac can use the new Text Checker tool to discover common formatting issues such as duplicated spaces or misplaced capital letters, making it easier to proofread documents and emails.