What can humans do alone? What can computers do? And what can humans and computers do when they work together? Apple's latest accessibility features give us a sense of the new ways in which tech and humans can augment each other.
This week, Apple is celebrating Global Accessibility Awareness Day with news of these upcoming improvements, as well as a range of special sessions, curated collections, and more.
The company will also introduce SignTime to Canada on May 19; the service connects Apple Store and Apple Support customers with on-demand sign language interpreters. SignTime was introduced in time for Global Accessibility Awareness Day 2021 and is already available in the US, UK, and France.
But to my mind, what's most interesting is that Apple plans to extend its existing accessibility features with machine learning, providing sophisticated solutions that help people with disabilities get around, communicate, care for their health, and more.
It seems a useful illustration of how computers can augment human beings.
"Apple embeds accessibility into every aspect of our work, and we are committed to designing the best products and services for everyone," Sarah Herrlinger, Apple's senior director of Accessibility Policy and Initiatives, said in a statement. She noted that these new features reflected contributions from teams across the company.
They also reflect a long-standing commitment from the company, which has built accessibility solutions into its products from day one. Apple CEO Tim Cook remains committed to that; in 2018 he described it as:
"...A basic core value of Apple. We don't make products for a particular group of people; we make products for everybody. We feel very strongly that everyone deserves an equal opportunity and equal access."
Apple's accessibility enhancements continue to close the gap, empowering people with disabilities with tools every business should also employ. Resilient, hybrid workforces are strengthened by the shared insight you gain when your teams come from a diversity of outlooks and backgrounds.
Apple has announced four main features, which it says will be introduced "later this year":
The first, Door Detection, helps people who are blind or have low vision navigate the last few feet to their destination. It can locate a door, tell users how far away it is, whether it is open or closed, and how to open it. The system uses LiDAR on supported devices and can also read signs and symbols, such as door numbers.
Door Detection will be available in a new Detection Mode within Magnifier. In Detection Mode, users with vision disabilities can use a set of customizable tools to help them get around and access rich descriptions of their surroundings. Apple Maps will offer sound and haptic feedback designed to help VoiceOver users identify the starting point for walking directions.
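Apple hasn't published how Door Detection works under the hood, but one piece of it, reading signs and door numbers from the camera feed, can be sketched with the public Vision framework. A minimal, illustrative sketch (the function name and callback shape are my own, not Apple's):

```swift
import Foundation
import Vision
import CoreGraphics

// A minimal sketch, not Apple's implementation: recognize sign text
// (such as a door number) in a single camera frame using the public
// Vision framework's text recognizer.
func recognizeSignText(in frame: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the highest-confidence candidate for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate // favor accuracy over speed for small door numbers

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The distance-to-door measurement would draw on LiDAR depth data (via ARKit's scene depth, for instance), which this sketch does not cover.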
Apple Watch Mirroring lets users with physical and motor disabilities control an Apple Watch using Voice Control or Switch Control on their iPhone; this extends to external Made for iPhone switches, head tracking, and voice commands. The idea is that users with such disabilities can still use the health-supporting tools the watch provides, including Blood Oxygen.
Apple has also improved Quick Actions on Apple Watch: Now a double-pinch gesture can answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout. This builds on Apple's existing AssistiveTouch technology, which empowers people with upper body limb differences with gesture controls on Apple Watch.
Live Captions will be made available on iPhone, iPad, and Mac. A universal feature, it basically turns your Apple product into a subtitling machine for your life, and it is intended to benefit the deaf and hard-of-hearing community.
In use, your device can listen to any form of audio and deliver a real-time transcript. It will work with any audio content: a WebEx call, social media, a phone conversation, or even an in-person conversation.
When used with FaceTime, Live Captions can also attribute dialogue to specific speakers, while Mac users can type responses to what is said and have their Mac speak those responses to others in the conversation. Font sizes are adjustable, and all information remains on the device.
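Apple hasn't said which APIs power Live Captions, but the general shape of on-device live transcription is visible in the public Speech framework. A minimal sketch, assuming microphone and speech-recognition permissions have already been granted (the class name is mine, purely illustrative):

```swift
import Speech
import AVFoundation

// Illustrative only: live, on-device transcription with the public Speech
// framework. Apple has not said Live Captions is built on these APIs.
// Assumes SFSpeechRecognizer.requestAuthorization has already been granted.
final class LiveTranscriber {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onCaption: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.requiresOnDeviceRecognition = true // keep audio on the device
        self.request = request

        // Tap the microphone and stream buffers into the recognizer.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }

        task = recognizer?.recognitionTask(with: request) { result, _ in
            // Each partial result replaces the previous caption text.
            if let result = result {
                onCaption(result.bestTranscription.formattedString)
            }
        }

        audioEngine.prepare()
        try audioEngine.start()
    }

    func stop() {
        audioEngine.stop()
        audioEngine.inputNode.removeTap(onBus: 0)
        request?.endAudio()
        task?.cancel()
    }
}
```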
A big boost for proofreaders: VoiceOver on Mac gains a new Text Checker tool that identifies common formatting issues such as duplicated spaces or misplaced capital letters.
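Apple hasn't documented Text Checker's exact rules, but the two issues it names are simple to illustrate. A hypothetical sketch of that kind of check (the rules here are assumptions based on Apple's description, not its actual rule set):

```swift
import Foundation

// Illustrative only: the kind of formatting checks a tool like Text Checker
// might run. Both rules are assumptions based on Apple's description.
func formattingIssues(in text: String) -> [String] {
    var issues: [String] = []

    // Two or more consecutive spaces.
    if text.range(of: "  +", options: .regularExpression) != nil {
        issues.append("duplicated spaces")
    }

    // A sentence starting with a lowercase letter, one reading of
    // "misplaced capital letters".
    if text.range(of: #"(^|[.!?]\s+)[a-z]"#, options: .regularExpression) != nil {
        issues.append("sentence starting with a lowercase letter")
    }

    return issues
}

print(formattingIssues(in: "this sentence has  two problems."))
// ["duplicated spaces", "sentence starting with a lowercase letter"]
```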
Apple will add support for an additional 20 languages and locales to VoiceOver, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. Users can select from dozens of new voices optimized for assistive features across languages. The new languages, locales, and voices will also be available for the Speak Selection and Speak Screen accessibility features.
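Whether Speak Selection shares the exact same stack is Apple's detail, but apps can reach the device's pool of system speech voices through the public AVSpeechSynthesizer API. A small sketch (the Ukrainian sample string and the assumption that a uk-UA voice is installed are mine):

```swift
import AVFoundation

// A small sketch of the public speech-synthesis API. The Ukrainian sample
// string ("Hello, world") and the uk-UA voice are illustrative; actual
// voice availability depends on what is installed on the device.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Привіт, світе")
utterance.voice = AVSpeechSynthesisVoice(language: "uk-UA") // nil if not installed
synthesizer.speak(utterance)

// List every speech voice available on this device, with its language code.
for voice in AVSpeechSynthesisVoice.speechVoices() {
    print(voice.language, voice.name)
}
```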
There are other improvements:
Apple's retail stores will be hosting live sessions to explain existing accessibility features, and the company is introducing an Accessibility Assistant shortcut in the Shortcuts app on Mac and Apple Watch to help recommend accessibility features based on user preferences.
Apple's online services and portals are also focusing on accessibility, including movies and shows, fitness workouts, apps, books, music, podcasts, and other content showcases.
Apple Maps will also feature new tools to help users discover accessible features and services in parks, while guides from Gallaudet University will be made available. Cook recently delivered the commencement address at Gallaudet. (It's around 1:57 into the video here.)
Together, these improvements will make a valuable difference to many people's lives.
Please follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.