With Global Accessibility Awareness Day (GAAD) coming up on Thursday, Apple has rolled out a variety of new accessibility features to celebrate. While that’s noteworthy on its own, industry watchers also know that what the company reveals in the weeks before its big Worldwide Developers Conference (WWDC) often hints at what it will announce at the event.
Celebrating Global Accessibility Awareness Day
The latest features aim to improve cognitive, vision, hearing, and mobility accessibility. They include new tools for individuals who are unable to or are at risk of losing the ability to speak. The features for cognitive accessibility are slated to appear later this year.
The improvements sound very much like the Custom Accessibility features Apple briefly discussed in 2022, news of which seemed to peter out in November. They include some fantastic implementations of machine vision intelligence:
- Individuals can type to speak during calls and also in conversation with Live Speech.
- Users will be able to create a synthesized voice that sounds like them so they can speak with family and friends. The feature is called Personal Voice.
- Live Speech will also let users save commonly used phrases so they can use them swiftly during a call. (Assistive Access combines Phone and FaceTime into a single Calls app.)
- Users can access those tools along with Messages, Camera, Photos, and Music using large, high-contrast buttons.
- Messages gains an emoji-only keyboard and can share video messages while in this mode.
- Users and trusted supporters can select a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for those who prefer text.
Detection Mode in Magnifier
This powerful use of machine intelligence and the camera means users will be told when their device is pointed at text, and that text will be read aloud to them. This is intended to help them use objects around the home, and relies on LiDAR. These tools are available within the Magnifier app, work with iPhone and iPad, and can be used in conjunction with the quietly impressive accessibility features Apple has deployed in recent years: People Detection, Door Detection, and Image Descriptions.
There’s a lot wrapped up in this, but it is interesting that Apple has effectively invented a new visually aware user interface and further improved its devices’ capacity to understand and interact with surrounding objects. It’s also interesting that Switch Control can now turn any switch into a virtual game controller. Importantly, users will be able to enter and exit Assistive Access with a triple-click of the side button.
Even more assistive improvements
There are some additional improvements, including Mac support for Made for iPhone hearing devices, much more natural-sounding Siri voices in VoiceOver, and more adjustable text size.
The company is also shedding light on accessibility with special events at its retail stores, and themed content across all its services, including Fitness+ and music videos featuring American Sign Language (ASL).
Apple: ‘Accessibility is part of everything we do’
“At Apple, we’ve always believed that the best technology is technology built for everyone,” said Apple CEO Tim Cook. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love.”
“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”
On the road to WWDC
All these improvements are important in their own right, but with Apple preparing for arguably the most important WWDC it has hosted in a decade, it’s quite telling to see the many ways in which it is exploring alternative user interfaces and sensor-based technologies within accessibility support. Perhaps that’s precisely what you might expect from a company building a new user interface for mixed reality — one that exploits the unique computational power of its processors and integrates with products and services from across its ecosystem.
WWDC opens with an Apple keynote on June 5.
Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
Copyright © 2023 IDG Communications, Inc.
This story originally appeared on Computerworld