Why artificial intelligence needs to overcome its ‘evil’ image and embrace accessibility

No one wants to be remembered as the creator of something defined as "evil" to accessibility. So how can you make sure you aren't?

The 'evil' thing Robin Christopherson, Head of Digital Inclusion at AbilityNet, is referring to when he talks about building accessible tech is the CAPTCHA security box: the puzzle you sometimes complete online to prove, for security reasons, that you are not a robot. By default (and even with audio options) these nasty little critters make themselves inaccessible to many people with disabilities. You can take the '2 minute Captcha Challenge' on YouTube to experience for yourself what we're talking about.

At TechShare Pro we know that creating interfaces which refuse to make themselves accessible is a big no-no. We want to bring together experts, developers and designers to make sure the next generation of artificial intelligence (AI) is accessible. You can find out more about TechShare Pro on our website.

Artificial intelligence, once the realm of sci-fi, is fast becoming the norm as devices become smarter. Currently, chatbots help us online and intelligent office assistants help us manage our lives and homes. In the not-too-distant future there will be driverless cars to contend with.

How can developers and designers using AI make sure their products are accessible to everyone?

"It's about choice. It's about developing artificial intelligence that will give you choice..." said Robin.

If you are going to build a website, bot or driverless car you need to make sure that it can be different things to different people. That it has choice built-in. That anyone can use it.

When Siri first came onto the market it could only be operated by Voice Control, which meant that it was no good for people who couldn't speak. The latest version has the option to type questions and instructions and if a person can't hear the audio response, they can read it in a conversation thread on the screen.
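
To make that "choice built-in" idea concrete, here is a minimal illustrative sketch in Python. The names (`Assistant`, `Response`, the handler methods) are invented for this example rather than taken from any real assistant's API; the point is simply that the same question can arrive typed or spoken, and every answer is available both as on-screen text and as speech.

```python
from dataclasses import dataclass


@dataclass
class Response:
    text: str  # always readable in an on-screen conversation thread

    def speak(self) -> str:
        # A real assistant would send this to text-to-speech;
        # here we just return the string that would be spoken aloud.
        return self.text


class Assistant:
    def handle_typed(self, typed_input: str) -> Response:
        """Entry point for people who type rather than speak."""
        return self._answer(typed_input)

    def handle_spoken(self, transcribed_input: str) -> Response:
        """Entry point for people who speak; input arrives already transcribed."""
        return self._answer(transcribed_input)

    def _answer(self, query: str) -> Response:
        # Placeholder logic: a real assistant would call its language
        # understanding back end here.
        return Response(text=f"You asked: {query}")


assistant = Assistant()
print(assistant.handle_typed("What's the weather?").text)      # read it on screen
print(assistant.handle_spoken("What's the weather?").speak())  # hear it spoken
```

However the answer is generated, offering more than one way in and more than one way out is what keeps the door open for everyone.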

So can you retrospectively fix accessibility problems?

Accessibility features like the ones I just mentioned were worked into the software retrospectively. But there is a real danger that if you haven't built accessibility in from the beginning, things end up fundamentally flawed. For instance, if you used Flash (notoriously inaccessible) to build your website there was no easy fix - you just had to start again from scratch.

Accessibility is about making sure no one is left behind. Once it was just thought of in terms of helping people with disabilities, but as technology has developed it's become about creating inclusive design for everyone.

We all carry our smartphones and tablets around 24/7. Because we use them in public spaces, we all need to be able to adjust the screen contrast to make information clearer, or to switch on subtitles, just like people with visual or hearing impairments.

One of the accessibility issues flagged up early on with AI tech, as reported on wired.com, was that Siri, Alexa and many other virtual assistants struggle with non-American accents. For a global company like Apple, that massively reduces the number of customers you can sell to.

And how would they cope with computer-generated voices like that of Stephen Hawking?

"Don't think about accessibility in terms of disability, flip it 180 degrees and think about inclusive design..." Robin told us. "You don't need to be just asking if your AI can understand people with speech impairments. You need to think wider. For instance, does it understand someone with a strong Glaswegian accent?"

Now let's flip it.

"Why AI is going to be massively useful to accessibility?"

It's all to do with simplicity.

Robin described virtual assistants as the "pinnacle of simplicity." People are going to be able to do things more easily and accessibly because AI requires it to be so.

If mobile phone apps made things cleaner and simpler than the full desktop experience, personal assistants like Alexa are another step ahead. You don't need to physically open your phone or computer to do things; you can just ask your assistant, without lifting a finger, to do them for you. Layer on top of that the ability to do your online banking or check when the next bus is coming, and life just got a whole lot easier. Artificial intelligence has huge potential to make all of these things more accessible, especially for people who find conventional channels challenging.
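
As a rough sketch of what that layering might look like under the hood, here is an illustrative Python intent router. The intents, phrases and handlers (`next_bus`, `account_balance`) are invented for this example and return fixed stub answers; they are not tied to any real assistant platform, transport feed or banking API.

```python
import re
from typing import Callable, Dict


def next_bus(_: str) -> str:
    # Stubbed answer; a real skill would query a live transport API.
    return "The next bus is due in 6 minutes."


def account_balance(_: str) -> str:
    # Stubbed answer; a real skill would call the bank's secured API.
    return "Your current balance is 250 pounds."


# Map simple phrase patterns to the handler that can answer them.
INTENTS: Dict[str, Callable[[str], str]] = {
    r"\bnext bus\b": next_bus,
    r"\b(balance|banking)\b": account_balance,
}


def route(utterance: str) -> str:
    """Match a recognised utterance against the intent patterns and run its handler."""
    for pattern, handler in INTENTS.items():
        if re.search(pattern, utterance.lower()):
            return handler(utterance)
    return "Sorry, I can't help with that yet."


print(route("When is the next bus?"))
print(route("What's my bank balance?"))
```

The user never sees any of this plumbing: they just ask, and the assistant quietly hands the request to whichever service can answer it.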

Developers won't need to reinvent the wheel

The incredible thing is that the big tech giants, like Google and IBM, let developers tap into their AI research and development via APIs (Application Programming Interfaces). So if you are developing an app and need to link to Google Translate, that is possible and free, provided you're not reaching more than, say, 10,000 people. You can just focus on the user experience and on making sure that the interface you create is inclusive.
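
As an illustration, here is a minimal Python sketch of calling Google's Cloud Translation REST API (v2). The endpoint and request shape follow Google's public documentation, but you would need your own API key, and current quotas, free tiers and pricing should be checked with Google rather than taken from the figure quoted above.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: create a key in the Google Cloud console
ENDPOINT = "https://translation.googleapis.com/language/translate/v2"


def translate(text: str, target_language: str = "fr") -> str:
    """Send text to the Cloud Translation API and return the translated string."""
    response = requests.post(
        ENDPOINT,
        params={"key": API_KEY},
        json={"q": text, "target": target_language},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["data"]["translations"][0]["translatedText"]


if __name__ == "__main__":
    print(translate("Where is the nearest step-free entrance?"))
```

A few lines of glue code like this is all it takes to borrow the heavy lifting, leaving your effort for the inclusive interface itself.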

If you would like to find out more about accessibility and AI, meet the experts and speak to other like-minded colleagues, you can still book tickets for our TechShare Pro event in November.
