Apple has announced that it is bringing even more features to those with disabilities. There are several things to highlight, including SignTime, AssistiveTouch for Apple Watch, eye-tracking hardware support for iPad, support for bi-directional hearing aids, and improved VoiceOver support.
SignTime
SignTime is a new service, available starting today, May 20th, 2021, that allows users to communicate with Apple Retail and Customer Care representatives through a sign language interpreter, accessible right from within the browser. If a customer is in an Apple Store and needs an interpreter, they will be able to use SignTime to communicate as necessary. Three languages are available: American Sign Language (ASL) in the US, British Sign Language (BSL) in the UK, and French Sign Language (LSF) in France.
AssistiveTouch for Apple Watch
Another new feature, for those with limited mobility, is AssistiveTouch for the Apple Watch. AssistiveTouch lets users interact with the Apple Watch without touching the screen: the watch detects subtle differences in muscle movement and tendon activity to sense the intended gesture.
This will be a significant leap in inclusivity, allowing those with limb differences to use the Apple Watch the same as anybody else.
Those who have limb differences are not the only ones seeing some improvement.
Eye-Tracking Support for iPad
There are those who have motor impairments but would still want to be able to control an iPad. Later this year, the iPad will support hardware eye-tracking devices through the Made for iPad (MFi) program.
With eye-tracking support, users will be able to move the pointer on the iPad using only the eye-tracking hardware. This, like AssistiveTouch, will allow users with limited mobility to use an iPad just like other users.
Bi-Directional Hearing Aids
Some users may have difficulty hearing certain things. Apple already provides features like Live Listen, which lets you use your iPhone or iPad as a remote microphone to amplify sound.
While this may work for some users, others need different types of auditory assistance, such as hearing aids. Apple has updated its MFi program to support bi-directional hearing aids. According to Apple, “The microphones in these new hearing aids enable those who are deaf or hard of hearing to have hands-free phone and FaceTime conversations”.
This will be a great way of allowing those with hearing loss to have hearing aids that let them natively use their phone hands-free and participate in FaceTime conversations.
Exploring Images with VoiceOver
Over the last few years, Apple has been improving its machine learning, which has powered object detection, facial recognition, and much more. One benefit of all this work is a new feature that Apple will be releasing: the ability to explore images with VoiceOver.
Those who are visually impaired will be able to use VoiceOver to detect objects within an image. This goes beyond what users have experienced thus far, describing details about people, like their position within a photo. Beyond this, for things like receipts, VoiceOver will be able to read column by column and row by row, including header information.
These new initiatives will be great additions, making Apple's products more inclusive of people with mobility, hearing, and vision impairments. It is good to see Apple add these features, but I hope that Apple adds more assistive technologies over time.