Assistive Access on iPhone and iPad helps people with cognitive disabilities


Apple recently previewed several valuable new accessibility features coming soon to its devices, including specific accommodations for mobility, hearing, vision, and cognitive function.

The preview comes ahead of Global Accessibility Awareness Day on 18 May, highlighting the importance of accessibility, which Apple has a history of factoring into its devices. Due later this year, the new features include synthesised speech, simplified device layouts, and greater assistance for blind users reading labels.

As with any accessibility feature, each addition will not only help specific disability groups but also benefit many others in their day-to-day iPhone and iPad use.

Assistive Access to help with cognitive function

Aimed at people with cognitive disabilities, Assistive Access streamlines the visual layout of an iPhone or iPad to help reduce cognitive load. For some, looking at a screen overflowing with app icons and wallpaper imagery can be overwhelming and difficult to parse. With Assistive Access, you can display apps in a simplified grid or row-based layout, making it easier to navigate the device.

Assistive Access helps reduce strain for people with cognitive disabilities.

Another Assistive Access feature customises apps like FaceTime, Messages, Camera, and Music to aid with navigation. Similar to the adjusted iPhone and iPad home screen layout, these apps benefit from a high-contrast interface with large buttons and text labels. There’s also an emoji-only keyboard for anyone who prefers communicating visually.

It’s great to see more customisation options available to people with cognitive disabilities. Whether someone needs an easier way to filter on-screen information or feels overwhelmed by a busy screen, these accessibility features will help many people.

Live Speech helps you communicate

For those who have difficulties with speech, like people with amyotrophic lateral sclerosis (ALS), more commonly known as motor neurone disease (MND) in Australia, Live Speech can handle the talking for you. Using the feature, you type what you want to say and the device speaks it aloud during phone and FaceTime calls. If you have some go-to phrases, you can save them for quick access, too.

Another big innovation in this space is Personal Voice, which produces a synthesised version of your voice. To set it up, you record 15 minutes of audio on an iPhone or iPad, reading aloud a set of randomised text prompts. Once completed, you can use Personal Voice alongside Live Speech to communicate with others with a more personal touch. Naturally, privacy is a concern, but Apple reassures users that all information is kept private and secure using on-device machine learning.

What it looks like when setting up Personal Voice.

I can also see this as being a helpful tool for people with cognitive disabilities. Additionally, some neurodivergent people are nonspeaking or encounter periods of nonspeaking, so speech technology may help make communication easier.

Detection Mode will soon read labels for you

Apple previously introduced detection-based features for people with vision disabilities, helping them navigate doors and read text printed on them. Soon, a feature called Point and Speak will come to the Magnifier app on iPhone and iPad. It’s intended to make it easier to read multiple sets of labels on physical objects.

In the example provided by Apple, Point and Speak uses the camera and LiDAR scanner to read out the buttons on a microwave as you move your fingers across them. It’s something that will have many different use cases and assist in navigating various environments.

More accessibility features to come

Apple points to accessibility as a core part of its design philosophy, and there are more features on the way. Community consultation drives many of the new features, according to Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives.

You can also use your voice to control your MacBook.

“Accessibility is part of everything we do at Apple,” Herrlinger said. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”

In addition to everything discussed above, here are some of the incoming features from the official announcement:

  • Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customise them for their hearing comfort.
  • Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like ‘do’, ‘due’ and ‘dew’.
  • Voice Control Guide gives users tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad and Mac.
  • Switch Control enables users to turn any switch into a virtual game controller to play their favourite games on iPhone and iPad.
  • Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar and Notes.
  • Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
  • For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customise the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.

According to Apple, the upcoming accessibility features are coming “later this year”. It’s been a big week of announcements from the tech giant, following Tap to Pay contactless payments and the SOS satellite feature launching in Australia.
