Apple’s next frontier: AI and the future of accessibility

Guest blog: Colin Hughes

Colin is a former BBC producer who campaigns for greater access and affordability of technology for disabled people. Colin is a regular contributor to Aestumanda.

There is little doubt that Apple will emphasise AI (artificial intelligence) in its 2024 releases. Apple's CEO Tim Cook has reiterated this point twice this year. But what could AI mean for disabled people and their access to technology?

Imagine being able to control your iPhone, iPad, or Mac without touching them. Sounds like science fiction, right? Well, for millions of disabled people around the world, this is a reality thanks to Apple's Voice Control feature.

It’s a built-in feature that lets users type with their voice and control their Apple devices hands-free. It’s a lifeline for millions of disabled people like me, but it has some flaws that need fixing.

Some of the problems are:

  • Mishears or ignores commands, especially in noisy places
  • Inaccurate dictation, such as confusing “for” with “four”, or the verb “will” with the name “Will”
  • Doesn’t support some common tasks, such as editing text in every app
  • Challenges supporting proper nouns, such as capitalising names, places, and organisations

These issues affect our user experience, independence, and dignity. We can’t just use the keyboard to correct them. 

FREE WEBINAR: How can AI help disabled people?
In our upcoming free webinar on Wednesday 17 April at 1pm BST, Microsoft and AbilityNet share how AI could (and probably already does!) improve your life.

AI enhancements for Voice Control

The prospect of integrating AI to enhance Voice Control presents a beacon of hope for addressing these shortcomings. Through AI, Voice Control could undergo substantial improvements, including:

  • Leveraging natural language processing to understand and transcribe speech more accurately
  • Harnessing machine learning to learn from user feedback and preferences
  • Advanced speech recognition to reduce errors
  • Interoperability with other apps and services
  • Personalised speech recognition for people with non-standard speech, who face challenges making themselves understood

AI can learn to recognise and transcribe speech more accurately and naturally, regardless of the factors that affect speech, such as speech impediments, breathing difficulties, or muscle disorders.
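
Some of this personalisation is already within reach at the app level. Apple's Speech framework lets a developer bias the system recogniser towards a user's own vocabulary, the names, places and organisations that generic models so often get wrong. The Swift sketch below is illustrative only: it shows the developer-facing API rather than how Voice Control itself works, and the vocabulary list is just an example.

```swift
import Speech
import AVFoundation

// Minimal sketch: biasing Apple's speech recogniser towards a user's own
// vocabulary (names, places, organisations). This is an app-level
// illustration only, not how the system Voice Control feature works.
// Speech recognition authorisation must be granted before use.
final class PersonalisedRecogniser {
    private let recogniser = SFSpeechRecognizer(locale: Locale(identifier: "en_GB"))

    func transcribe(buffer: AVAudioPCMBuffer, userVocabulary: [String]) {
        let request = SFSpeechAudioBufferRecognitionRequest()

        // Keep audio on the device where supported.
        request.requiresOnDeviceRecognition = true

        // Bias recognition towards words this user actually says, e.g.
        // proper nouns that a generic model often mis-transcribes.
        request.contextualStrings = userVocabulary

        request.append(buffer)
        request.endAudio()

        _ = recogniser?.recognitionTask(with: request) { result, _ in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```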

For severely disabled people, dictating accurately can make or break a day. It’s not just a convenience - it’s a way to connect with the world, from messaging friends to posting on social media.

These issues transcend niche concerns. The World Health Organisation estimates that 3.5 billion people will need assistive technology like Voice Control by 2050. 

Rumours and revelations

Reports suggest that iOS 18 is poised to be the most significant update in Apple's history, with a pronounced focus on AI features. Anticipated enhancements include a more intelligent Siri, leveraging advanced AI systems incorporating large language models.

Given Apple's longstanding commitment to accessibility, there is palpable anticipation for the extension of AI advancements to bolster Voice Control and other accessibility features across Apple's product lineup.

The Apple Watch challenge

It's not just Voice Control where AI can help with accessibility. Turning to the Apple Watch, it’s clear that Siri’s accessibility remains a bottleneck for users with severe mobility issues.

Apple’s innovation has made Siri smarter in recent times. With the voice assistant on your wrist, you can do everything from replying to texts to setting alarms and reminders. Despite these advancements, the reliance on wrist movement to wake Siri creates unintended barriers for those facing mobility challenges.

Double Tap: a double whammy

Last autumn, Apple brought Double Tap to the latest Apple Watches. The feature lets you tap your index finger and thumb together twice to perform common actions like answering a call or replying to a message.

There was hope that Double Tap could also be used to wake Siri with the same finger-and-thumb gesture.

However, Double Tap doesn’t support Siri activation, and it also falls short for people with limited mobility, who find it impossible to use because of the same prerequisite: the wrist must be raised to wake the screen before the gesture will register.

Ideally, Siri activation on the Apple Watch should transcend physical limitations. A first step could be implementing Double Tap without requiring a raised wrist movement, providing a more accessible trigger for Siri activation. However, the goal, shared by many in the disability community, is an always-listening Siri on the Apple Watch, akin to the functionality on the iPhone and Mac.

Apple should consider introducing an accessibility setting that lets users with extremely limited mobility train the built-in motion sensor that activates Siri when a wrist is raised, so it can recognise even very subtle movements, far smaller than those a typical Apple Watch user would make.
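
There is no public way for an app to retrain the Watch's raise-to-wake gesture, so the following Swift sketch is purely illustrative. It uses Core Motion to show roughly what a "subtle raise" detector with a user-adjustable sensitivity could look like, the kind of setting being proposed here, rather than anything Apple actually exposes.

```swift
import CoreMotion

// Illustrative sketch only: there is no public API to retrain the Watch's
// built-in raise-to-wake / Siri gesture. This shows the general idea of a
// motion detector whose sensitivity a user could tune to very small movements.
final class SubtleRaiseDetector {
    private let motion = CMMotionManager()

    /// Smaller value = more sensitive (radians of pitch change per update).
    var sensitivity: Double = 0.05
    private var lastPitch: Double?

    func start(onRaise: @escaping () -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 30.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self, let data else { return }
            let pitch = data.attitude.pitch
            defer { self.lastPitch = pitch }
            guard let last = self.lastPitch else { return }

            // A small upward change in wrist pitch counts as a "raise"
            // when it exceeds the user's chosen sensitivity.
            if pitch - last > self.sensitivity {
                onRaise()   // e.g. surface a shortcut or start dictation
            }
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```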

While I don’t have inside information on Apple’s design decisions, one reason for Siri not always listening on the Apple Watch could be battery optimisation and resource management. Due to the Watch’s limited battery, the Siri wake-word detector operates only when the built-in motion sensors detect a raised wrist. Constantly listening for a wake word requires ongoing processing power, potentially draining the battery faster.

The role of AI

The integration of AI advancements holds the potential to overcome these accessibility barriers with the Apple Watch. By leveraging AI algorithms, Apple can optimise the functionality of Siri on the Apple Watch while mitigating concerns related to power consumption.

Advanced AI algorithms can refine wake word detection mechanisms, minimising resource usage and preserving battery life. Additionally, personalised AI models can enhance Siri's contextual understanding of voice commands, fostering a more intuitive and responsive user experience.
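
To illustrate the trade-off, a common pattern is to keep only a very cheap check running continuously and invoke a heavier model when there is something worth analysing. The Swift sketch below is hypothetical, the wake-word detector in it is a placeholder rather than a real API, but it shows how gating on simple audio energy can cut the work done while idle.

```swift
import AVFoundation

// Hypothetical sketch of duty-cycled listening: a cheap energy check runs on
// every audio buffer, and an expensive wake-word model (represented here by a
// placeholder closure) runs only when there is actually something to hear.
final class LowPowerWakeWordListener {
    private let engine = AVAudioEngine()

    /// Placeholder for a real wake-word detector; not an Apple API.
    var detectWakeWord: (AVAudioPCMBuffer) -> Bool = { _ in false }

    func start(onWake: @escaping () -> Void) throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)

        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            // Cheap gate: compute RMS energy and skip the heavy model on silence.
            guard let samples = buffer.floatChannelData?[0] else { return }
            let n = Int(buffer.frameLength)
            var sum: Float = 0
            for i in 0..<n { sum += samples[i] * samples[i] }
            let rms = (sum / Float(max(n, 1))).squareRoot()

            if rms > 0.02, self.detectWakeWord(buffer) {
                onWake()
            }
        }

        engine.prepare()
        try engine.start()
    }
}
```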

Need advice about how to make your device easier to use?
Our simple guides on our My Computer My Way website give you free, step-by-step instructions on how to adapt your phone, computer or tablet to meet your needs. You can search for a specific need (e.g. making text larger) or filter the guides based on your symptoms (e.g. hand tremor) or condition (e.g. dyslexia).

Life-Saving Potential

Beyond the accessibility challenges, it’s important to recognise the transformative potential of the Apple Watch for disabled people in terms of personal safety and health monitoring. The constant presence on the wrist allows for seamless tracking of vital health metrics, enhancing overall well-being. With the release of watchOS 10.2, Siri can now help users access and log their Health app data. For example:

“Siri, what’s my heart rate?”

“Siri, what’s my blood oxygen?”
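
For context, the figures Siri reads out here live in HealthKit, and any app the user authorises can query the same data. The Swift sketch below shows a generic way to fetch the latest heart rate sample; it is not a description of how Siri itself answers these questions.

```swift
import HealthKit

// Generic HealthKit sketch: fetch the most recent heart rate sample, the same
// figure Siri reads out for "what's my heart rate?". The user must grant
// read access to heart rate data first.
func fetchLatestHeartRate(store: HKHealthStore,
                          completion: @escaping (Double?) -> Void) {
    guard let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate) else {
        completion(nil)
        return
    }
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)

    let query = HKSampleQuery(sampleType: heartRateType,
                              predicate: nil,
                              limit: 1,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        guard let sample = samples?.first as? HKQuantitySample else {
            completion(nil)
            return
        }
        // Heart rate is stored as a count/time quantity; convert to beats per minute.
        let bpm = sample.quantity.doubleValue(for: HKUnit(from: "count/min"))
        completion(bpm)
    }
    store.execute(query)
}
```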

Since the Apple Watch launched, there have been stories of how it has saved lives: detecting falls and car crashes, aiding mountain rescues, and alerting people to heart problems. For disabled people, the Apple Watch should serve as a lifeline, providing instant access to emergency services, caregiver contacts, and real-time health data that can be invaluable in critical situations. Without access to Siri, these important features are next to useless.

As someone who lives with a disability that sometimes requires blood oxygen monitoring, it’s great that the Watch has this feature built in (outside of the USA, at least), but if I can’t trigger a check with Siri the feature may as well not be there.

Siri access to these features transcends convenience. It could quite literally save your life.

Breaking new ground with tech

As Apple charts a course toward AI-driven innovations, the potential to enhance accessibility features holds immense promise for disabled people, of whom only 1 in 10 have access to the tech they need. By leveraging AI advancements, Apple can pave the way for a more inclusive and empowering technological landscape, where every user can navigate and engage with technology seamlessly.

Tim Cook’s commitment to “breaking new ground” with AI this year hints at a future where technology adapts to serve all users, regardless of their physical abilities. As we look forward, the question isn’t just about how AI will change technology, but how it will transform lives, making everyday tasks more accessible and empowering disabled people to connect, create, and communicate on our own terms.

I certainly hope Voice Control and Siri access on the Apple Watch are on Apple’s AI road map!

This article was written by Colin Hughes. All views are his own freely expressed opinions. Colin is a former BBC producer who campaigns for greater access and affordability of technology for disabled people.

Additional related content