3 ways students can reduce stress and create better essays

Now the festivities are over, it's time for us all to start thinking about the year ahead. For most students, it's time to get to work. Whether you've got a thesis, dissertation or a simple report to write, these student apps will help to maximise productivity, reduce procrastination and even improve the eloquence of your writing.

1 Zotero

This is particularly good for science students who need to reference, but useful for anyone writing essays or a thesis. Use it to collect, organise, speedily cite, and share your research sources.

Get Zotero here

See this useful short video guide for a Zotero demo
 

student looking relaxed writing in cafe with hot drink

2 Dragon Dictation

Dragon Dictation is a voice recognition app that listens to you speaking and automatically converts your words into digital written text. It's obviously useful for essays, but you could also try it for capturing notes and ideas. If you prefer to speak rather than type, are dyslexic, or find writing physically difficult, dictating a stream of thoughts and words can take the pressure off getting words onto the page.

Get Dragon Dictation for iOS here

There’s a super quick demo of Dragon Dictation here
 

3 Evernote

A popular organisational tool that lets you keep different notes and subjects in order in separate sections, add to and subtract from them whenever you wish, and sync across your devices. Handily, it's also an audio recorder for lectures or verbal notes and ideas. You can share notes with course mates too, though this might involve a charge. One extra really cool feature - scan and search - means that if you take pics of whiteboard content or handouts, you can search them using any of the words in the image, because Evernote recognises content within images. You can also write or draw on those PDFs on screen. Try Trello too if you want a project management board where you can see your projects’ workflow really easily and clearly.

Get Evernote here

Find out more about how Evernote could help you in this video

How tech can help disabled people: AbilityNet's top 10 blogs of 2017

AbilityNet's website saw more traffic than ever before in 2017. Here are the top 10 most viewed articles published last year. 

man wearing VR glasses in a mountain scene simulation


1 Virtual reality 
From trying out-of-reach experiences to aiding muscle recovery, we looked at 8 ways virtual reality could enhance the lives of disabled people 

2 iPad / iPhone settings and tremors
How the latest version of the software that drives iPads and iPhones (iOS 10) offers significant improvements for people with tremors due to conditions such as Parkinson's, cerebral palsy or old age

3 Government and disability
The Equality and Human Rights Commission (EHRC)'s Disability report says government has failed to support disabled people over the last 20 years. 

4 Windows 10 Fall Creators Update
The new accessibility features which came with the Microsoft Windows 10 Fall Creators Update

5 AI and blind people
The artificial intelligence app which helps blind people hear the world

6 Dementia and digital design webinar 
Advice for developers on creating online services which are dementia friendly 

7 Robin's MBE
Our tech guru Robin Christopherson picks up a well deserved MBE for services to digital inclusion

8 BBC producer on life with dyslexia
Ed Booth opened up about his life, from struggles with dyslexia to writing top BBC programmes

9 Dementia-friendly websites
An AbilityNet accessibility consultant's top 6 tips for a dementia-friendly website

10 Microsoft's SEEING AI app
Robin, who is blind, blogs about the updates to Seeing AI. He was excited that it can now read colours and handwriting


What else was popular?

 

How Image Recognition and AI is Transforming the Lives of Blind People

A demo of the Orcam MyEye 2.0 was one of the highlights at the AbilityNet/RNIB TechShare Pro event in November. This small device, an update to the MyEye released in 2013, clips onto any pair of glasses and provides discreet audio feedback about the world around the wearer. It uses state-of-the-art image recognition to read signs and documents as well as recognise people, and does not require an internet connection. It's just one of many apps and devices that are using the power of artificial intelligence (AI) to transform the lives of people who are blind or have sight loss.

the new Orcam MyEye clips onto a standard pair of glasses and can recognise everyday products

Last week, we took a look at Microsoft’s updated free app Seeing AI and its amazing new features for people who are blind or have sight loss, including colour recognition and handwriting recognition. The app proved popular with AbilityNet’s head of digital inclusion, Robin Christopherson. 

And it's not the only innovation that is helping blind people. In the last few years we’ve seen popular, well-loved apps such as TapTapSee, powered by Cloudsight.ai image recognition. This app allows users to take a photo; the details of what and who is in the photo are then spoken to the user. Similarly, the Aipoly Vision app gives real-time image recognition using deep learning. 

New smaller Orcam MyEye

At TechShare Pro, Orcam, the makers of AI vision tech MyEye who recently launched MyEye 2.0, gave delegates an advance look at the updated tech before its launch (6 December). The MyEye 2.0 consists of a very small camera and microphone attached to a pair of glasses, linked to a smaller processor that can be clipped onto the body. A user can point to text, for example on a menu or notice board, and will hear a computerised voice read out the information. The device can also recognise faces, money and other objects.

Presenting the technology, Leon Paull, Orcam’s international business development manager, said: “You can teach it to identify certain items and it will find those in a supermarket. Its ability to find products has been enhanced. The device is being used all around the world, and the new version understands multiple languages, can read barcodes and has colour recognition." 

He used simple hand gestures to work the technology, such as pointing a finger towards a page to have the text on the page read discreetly into his ear. With a wave of his hand, the system then stopped reading out text. He looked at his wrist to mime that he wanted to know the time, and MyEye 2.0 spoke the time. 

The MyEye 2.0 builds on the previous model for blind people, offering a more discreet and portable device with no wires. It currently costs around £3,000, but the creators say they are hoping funders will come forward so the devices can be provided at a cheaper cost or for free. 

Useful links

Five top tips for building accessible conversational interfaces

The Amazon Echo and Google Home are top of many Christmas lists this year. Both of these amazing devices can use our verbal instructions to play music, turn our lights on and off, exchange the basics of a short conversation with us and, of course, tell us the weather. Amongst the highlights of our TechShare Pro conference in November was a talk by Leonie Watson of the W3C, who offered her five top tips on creating accessible conversations with our machines.

Legends of talking machines

“There are reports that as far back as 1,000 years ago people were thinking about the concept of artificial speech,” says Watson, director of developer communications for the Paciello Group, a member of the W3C (World Wide Web Consortium) Advisory Board, and co-chair of the W3C Web Platform Working Group.

Legends of talking “machines” go back 1,000 years to Pope Sylvester II (950–1003 AD), the first French Pope, who supposedly created a very basic first dialogue system including components of speech recognition, understanding, generation and synthesis - according to the essay Gerbert, the Teacher by Oscar G. Darlington in The American Historical Review.

Steve Jobs introduces the first talking Apple Mac

Fast forward to the 1980s, when DEC introduced its DECtalk text-to-speech synthesiser. 1984 also saw Steve Jobs demonstrate the Apple Mac’s talking capabilities for the first time with Apple MacInTalk. See the video below from 3 minutes 20 seconds.


"There’s been some good marketing around such technology", said Watson, who has sight loss. "But I've found that talking to tech has been a laborious process - with a person having to speak very, very clearly and with specific phrases for machines to understand. Even then, the interaction has ended up bearing little resemblance to an actual conversation", said Watson.

Siri

“The thing that really changed that was Siri in 2011. For the first time we could have something that felt a lot more like a conversation with technology. In 2014 the Windows Cortana launch followed, giving us another digital assistant that would talk back to us.”

“The same year, with the Amazon Echo, we started to see digital assistants be able to do practical things around the house, but we still needed very structured language and very carefully phrased demands to get it to do things," explained Watson. "A further leap forward came in 2015 with Google making its technology more context aware - meaning, for example, that if a song was playing, you could ask your Google device ‘where’s this artist from?’ or ‘what’s his real name?’ without having to specifically state who you were talking about.”

How to build accessible conversational interfaces

Watson laid out five ways that developers could make interactions with machines as clear as possible for a wide-range of people.

1 Keep machine language simple

  • Think about the context of how people might be using the device. They might be driving or cooking and need short, simple interactions.
  • Offer choices but not too many choices.
  • Suggest obvious pointers to useful information.

2 Avoid idioms and colloquialisms

  • Eg, terms like “it’s raining cats and dogs” or “coffee to go” might only be understood by certain audiences and so lack inclusivity.

3 Avoid dead-ends in conversation

  • Give the users cues around what to say or ask next to get what they need.

4 Use familiar, natural language

  • Eg, for the time, say ‘three thirty in the morning’ for a UK or US audience. Don’t say ‘zero three three zero a.m.’

5 Provide a comparable experience

  • Users of such technology will generally require speech and hearing to talk to machines.
  • For those with hearing loss, conversational transcripts could be posted on screen.
  • For those without speech, the only obvious option at the moment is using simulated speech, like Stephen Hawking does, for example.
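The five tips above can be sketched in code. The snippet below is a minimal, hypothetical example of how a voice-skill response might apply them; the function names and response shape are illustrative assumptions, not part of any real assistant SDK:

```python
# A hypothetical sketch of a voice response applying Watson's five tips.
# Names and the response shape are invented for illustration.

HOUR_WORDS = ["twelve", "one", "two", "three", "four", "five", "six",
              "seven", "eight", "nine", "ten", "eleven"]

def natural_time(hour, minute):
    """Tip 4: say 'three thirty in the morning', not 'zero three three zero a.m.'"""
    period = ("in the morning" if hour < 12
              else "in the afternoon" if hour < 18
              else "in the evening")
    h = HOUR_WORDS[hour % 12]
    if minute == 0:
        return f"{h} o'clock {period}"
    if minute == 30:
        return f"{h} thirty {period}"
    # A full implementation would spell out every minute value in words.
    return f"{h} {minute:02d} {period}"

def next_alarm_response(hour, minute):
    # Tip 1: keep the machine's language short and simple.
    # Tip 2: no idioms or colloquialisms.
    speech = f"Your next alarm is at {natural_time(hour, minute)}."
    # Tip 3: avoid dead ends - cue what the user can say next.
    speech += " You can say 'change it' or 'cancel it'."
    # Tip 5: comparable experience - also return a transcript that a
    # companion app could display for users with hearing loss.
    return {"speech": speech, "transcript": speech}
```

Calling `next_alarm_response(3, 30)` would produce speech beginning "Your next alarm is at three thirty in the morning", with the same text available as an on-screen transcript.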

Learn more

Microsoft Seeing AI - the best ever app for blind people just got even better

Microsoft’s revolutionary app Seeing AI took the blind world by storm when released earlier this year. Now it's been updated, with several new functions including handwriting and colour recognition - and it’s still free. It's launching today and I’m personally wriggling in my office chair with excitement, whilst simultaneously hitting refresh in the iOS app store.

Take a look at the features below to see why it's such an amazing step forward for disabled people.

Seeing AI just got even better

Since it was launched in mid-2017, Seeing AI has been downloaded more than 100,000 times and has assisted users with over 3 million tasks. It was released with features such as the ability to identify a product audibly using its barcode, as well as being able to describe images, text and the faces of friends and family as they come into view.

Today (14 December) Microsoft announced new features including currency, handwriting and colour recognition, as well as light detection. It's also now available in 35 countries, including across the European Union.

Seeing AI in action

As you can see from my video I've been using Seeing AI for reading magazines as well as handwritten notes left by my family.

A whole new set of features

New features in Seeing AI v2 include: 

  • Colour recognition: Getting dressed in the morning just got easier with this new feature, which describes the colour of objects, like the garments in your closet.  
  • Currency recognition: Seeing AI can now recognize the currency of US dollars, Canadian dollars, British Pounds and Euros. Checking how much change is in your pocket or leaving a cash tip at a restaurant is much easier.
  • Musical light detector: The app alerts you with a corresponding audible tone when you aim the phone’s camera at a light in the environment - a convenient tool so you don’t have to touch hot bulbs to check whether a light is switched on, or whether a battery pack’s LED is lit.
  • Handwriting recognition: Expanding on the ability of the app to read printed text, such as on menus or signs, the newly improved ability to read handwriting means you can read personal notes in a greeting card, as well as printed stylized text not usually readable by optical character recognition.  
  • Reading documents: Seeing AI can read a document aloud without VoiceOver, with synchronised word highlighting. It also includes the ability to change the text size in the Document channel.
  • Choose voices and speed: Personalisation is key, and when you’re not using VoiceOver, this feature lets you choose the voice that is used and how fast it talks.

Raise your glasses to a brighter future for the blind

As amazing as it is, the development of the app is also a sign of the power that AI will bring in the future.

The video above shows a prototype that uses the Seeing AI engine in a pair of smart glasses. While the Google Glass smart spectacles fell a little flat, there’s no doubt that head-mounted cameras (with or without a screen) are going to play a major role in the future of wearable tech, and it’s only a matter of time before this functionality is added.

For the users of Seeing AI, it makes perfect sense to have the camera that is working so hard to tell us about our surroundings mounted on our heads and looking in the direction most important to us. This is important in terms of warning us about upcoming obstacles, people we’re interacting with (or wanting to avoid physically or possibly even socially) and also in terms of street signs, shop fronts, notices and hoardings etc (all of which will soon be able to be automatically and effortlessly read out to us).

It is also possible to combine this breathtakingly useful level of awareness of our surroundings with the spoken cues of navigation apps - my personal top GPS app with added accessibility sparkles is BlindSquare. This means that people who are blind, severely dyslexic or have a learning disability will now have a whole new world of support wherever we go.

Thank you, Microsoft

I just want to end this quick post by spelling out my gratitude to Microsoft for bringing real cutting-edge machine learning to a group of users with such evident needs in this area. While none of the smarts within Seeing AI are solely or even primarily intended for blind users, it takes a company as acutely aware of the importance of accessibility as Microsoft to do such an excellent implementation that brings the best of AI to those who benefit most.

Robin Christopherson is head of digital inclusion for AbilityNet

Related links

Paralympic gold medallist urges businesses to use tech to support disabled people

Sophie Christiansen CBE is a para-equestrian dressage rider who spoke at AbilityNet's TechShare pro conference in November.


As someone who was born with quadriplegic cerebral palsy, technology helped me access education. Simply having a typewriter and then a computer enabled me to do my school work independently. Writing by hand would’ve been far harder. Tech allowed me to show the part of my body that did work properly – my brain. Because of that I knew that I could go on to get a job and live independently like everyone else.

Sophie Christiansen was a guest speaker at AbilityNet's TechShare Pro

I know my value in the workplace in my role as tech analyst at Goldman Sachs. When you have a disability you really focus on your abilities. You think differently – outside the box. It’s this mentality that makes disabled people good employees and also inspires able-bodied co-workers to do the same - to make their product and business accessible for everyone.

Voice recognition issues

One thing that would make my life even easier is if voice recognition worked better for me. My voice is obviously different to other people’s and I find voice recognition a bit hit and miss. For example, Siri on iPhones doesn’t understand me, but Google does. Amazon Alexa gets me when she is online and processes on Amazon’s server, but to switch it on using the specific ‘wake up words’ doesn’t work because at that point she’s offline and her local processing skills are less clever. I did a little experiment with her to show you what I mean.

There’s lots more that can be done to keep improving things. I spend a lot of time on trains, and feel companies could do more to make it easier for disabled people. At the moment, to catch a train, I have to phone to book assistance 24 hours in advance (because disabled people can never be spontaneous, right?). But phone at peak times and this normally involves being put on hold for 15 minutes.

Reasonable adjustments in the transport sector

I’d say for about one in 10 journeys, the member of staff forgets that I have booked assistance, so in the absence of a ramp, I have to rely on kind members of the public to lift my wheelchair down so I don’t end up in Portsmouth.

As a reasonable adjustment there could be an app which I could use to quickly book assistance half an hour or more before my train, get reassurance that the staff know about me, and send data back to the train operator on their performance. I can guarantee that if this was in place more disabled people and their families would have the confidence to travel by train.

The world of tech is still in its infancy, so why not think about accessibility in the embryo stage of an idea? You don’t have to just do this out of the goodness of your hearts – there are 13.3 million disabled people in the UK alone with a household spending power of £249bn. That’s a lot of business that you are missing out on by not being accessible. A little extra thought goes a long way for everyone.

Visit Sophie's website for more information

Follow Sophie on Twitter @SChristiansen87

How AI could transform the lives of disabled people

“The Echo Dot makes me feel included," says Ellie Southwood, chair of The Royal National Institute of the Blind (RNIB). “I spend far less time searching for things online; I can multi-task while online and be more productive. Microsoft’s Seeing AI app (which narrates the world for people with sight loss) means I can recognise people and scenarios and make up my own mind about what’s going on.”

Southwood (pictured right with Hector Minto of Microsoft and Sharon Spencer of IAAP), who has sight loss, made the comments at AbilityNet/ RNIB’s TechShare Pro conference in November, which focused on AI, disability and inclusive design. News about Artificial Intelligence (AI) can often present a dystopian version of the future - cars let loose on the motorway driving themselves; robots controlling our thoughts and taking jobs. But this sold out London conference looked at the many ways that advances in AI could transform the lives of disabled people.

The power of AI

AI is fast becoming part of our lives. The technology is behind the likes of Siri, Alexa, Cortana and other similar services. It powers speech-to-text service and is getting better at understanding different voices. It’s responsible for the suggested responses on GMail, auto-captions on Facebook and picture library searches for a specific location or person.

Opening the event, IBM’s evangelist for its Watson AI platform, Jeremy Waite, told delegates that the company's survey of 1,200 UK executives found 28% plan to invest in AI in the next year. Watson can search 10 million records a second - for example, it can find specific words in the whole back catalogue of TED talks in seconds.

AI and accessibility

The biggest tech companies’ use of AI within their products is making new tech features far simpler for everyone to use, including disabled people. And by investing in inclusive design, those companies are reaching bigger audiences - accessible designs are more popular with every customer.

“Hopefully AI means that, rather than expecting people to provide something in an accessible format, everything will become accessible in the future,” said Hector Minto, head of accessibility and assistive tech at Microsoft, who sat on a future-gazing panel at the event. He spoke about the new Microsoft Seeing AI app, which enables blind people to recognise faces, scenes, money, text and more, and also about a range of other Microsoft features which use AI to reduce or remove the technology barriers that disabled people face.

For example, Windows Hello uses biometric login - fingerprint, face or iris - which can work for people with physical disabilities or those with dyslexia who might struggle to remember passwords. Subtitles in PowerPoint mean people can save and keep a transcript of the narration that accompanied the slides.

Minto added that the big opportunity for AI - with advances in translation capabilities and free apps - is that it could help assistive technology 'go global' and reach parts of the world where there are more disabled people and fewer services and support.

Biometric logins and disability

Delegates also heard from Kiran Kaja, technical programme manager for search accessibility at Google.

“Everyone wins when we harness AI,” he said. “Voice recognition was developed for disabled people, but it’s the hot item at the moment and is useful for everyone. The same with speech-to-text technology, which is completely based on neural networks. This uses AI, and predictive text is also AI. Google wants to use intelligent tech to improve the customer experience,” said Kaja, who has sight loss.

Kaja spoke about Google Home’s connection to Smart devices and the potential of home automation technology to support disabled people. In particular, people with physical disabilities or sight loss can more easily do things like turn lights on and off, alter a thermostat and turn on home appliances using such technology. 

TechShare pro: How Barclays uses latest tech to offer more accessible services to disabled people

One in five people in the UK is disabled, whether by sight loss, hearing impairment, a motor or cognitive disability or other conditions - and 90% of Barclays customers now interact with the bank through a screen. As the bank's Head of Digital Accessibility Paul Smyth explained at TechShare Pro, unless those services are accessible the business risks losing 20% of its customers.

“We became the first bank to offer talking cash machines when we signed up to the RNIB’s Make Money Talk campaign in 2011. We immediately saw customers vote with their feet and recognised it was good for business and good for society,” said Paul Smyth, who was a keynote speaker at the AbilityNet/RNIB TechShare Pro in London In November.

Paul Smyth is Head of Digital Accessibility at Barclays

“It was a huge catalyst for change. We now have virtual sign language interpreters and high-visibility bank cards... People think accessibility is a small market, or that accessible design is boring, but the purple pound (the estimated household spending power of disabled people in the UK) is worth £265 billion."

Listening to disabled customers 

Barclays keeps a range of people in mind when developing its services, said Smyth, and uses surveys, social media engagement and stories to listen and share disabled people’s experiences.

“As someone with a visual impairment, I can choose to interact with my accessible smartphone rather than the bank kiosk when requesting cash withdrawals, highlighting the benefits of offering multiple ways to do the same things… We aim to be the most accessible business in the FTSE 100.”

The popularity of mobile banking has meant that banks are having to provide information in a more accessible way, he said. “Many of us are on a smaller smartphone screen these days, so banks are forced to distil down and display only the core information that the customer wants, with less of the generic marketing blurb that the bank would want - simplifying the interface, the language used and the ways of interacting.”

AI and Chat Bots

Smyth sees the newest technologies as a way to make banking safer and simpler. “AI and chat-bots are helping customers wade through bank sites and make sense of the information that they want through conversational interfaces rather than reading lengthy FAQs." 

"Simpler information means such services will be more accessible to a wide range of people, including those with dyslexia or cognitive impairments, as well as those using a screenreader. The industry is ripe for a revolution of simpler, safer and smarter tech, powered by predictive AI and presented in a personalised way that works for everyone,” said Smyth.

More information

Accessibility Professionals: UK Chapter launched at TechShare pro 2017

The UK branch of the International Association of Accessibility Professionals (IAAP) launched at the TechShare Pro conference in London on 23 November. The IAAP started in 2014 with 31 founding members, including AbilityNet and Microsoft, has now grown to include members in 40 countries, and recently launched chapters in the UK, India and the Nordic countries. It is a membership organisation for all people involved with digital accessibility, working with websites, software, hardware, content and services. IAAP offers programmes to support the advancement of skills and ways to demonstrate achievement of those skills, and can be of particular interest to those working in web development and UX.

Barclays offered a chance to try new VR solutions at AbilityNet's TechShare pro event

TechShare Pro was a sold-out one-day conference organised by AbilityNet and RNIB and sponsored by Barclays, IBM, Microsoft, OrCam and Storm. As well as IAAP UK members, it featured experts from Google, Barclays, IBM and the BBC. Alongside practical sessions about accessible design, much of the focus was on Artificial Intelligence and Machine Learning, which could transform the lives of the world’s one billion disabled people (Unicef figure).

Employing qualified web accessibility experts

Launching the IAAP, managing director of the body Sharon Spencer (pictured below second from left), said:

“The aim is to help organisations integrate accessibility into their products and infrastructure and provide a professional qualification for accessibility professionals. We offer certification and these qualifications are something organisations can look for when employing experts to work on their websites to ensure they comply with the law.”

The majority of websites in the UK are not fully accessible to people who have sight or hearing loss and other disabilities. Accessibility is not taught as part of mainstream digital education at any level, and it is common for web developers to be unaware of the Web Content Accessibility Guidelines (WCAG).

AbilityNet CEO Nigel Lewis (Twitter: @NigelLewis18) has been involved in IAAP since its inception and sees a huge opportunity to recognise the skills of accessibility professionals: 

"I am personally very proud to be a founder of the IAAP and AbilityNet fully supports its mission to help grow and develop the accessibility profession. We want accessibility to be recognised as a profession on a par with User Design & Experience, Development, Testing, Security and other disciplines in the IT profession. This will be a key factor in driving the delivery and uptake of accessible and inclusive technology, which in turn will help millions of disabled and older people around the world."

"IAAP provides a place for accessibility professionals around the world to gather, share experiences and enrich their knowledge of accessibility. The certification programme aims to better define what accessibility professionals are expected to know and increase the quality and consistency of work in this space."

Candidates for accreditation will be given details of the skills they need in order to pass multiple-choice tests showcasing their knowledge. There are several certifications available:

  • The Certified Professional in Accessibility Core Competencies (CPACC)
    This represents broad, cross-disciplinary conceptual knowledge about disabilities, accessibility and universal design, accessibility-related standards, laws, and management strategies.
  • The Web Accessibility Specialist (WAS)
    This represents an individual’s detailed technical knowledge about the WCAG guidelines and other related web accessibility topics.
  • Those who pass both the CPACC and the WAS exams will receive the designation of Certified Professional in Web Accessibility (CPWA).

IAAP President Sharon Spencer with other accessibility professionals at TechShare pro

Pankaj Bhasin (pictured second from right), an IT expert who volunteers with AbilityNet’s ITCanHelp service, said he would be looking to become accredited. “I have a Master’s Degree in Business and Information and at no point did we look at accessibility. This is something I would definitely be interested in to increase my career prospects."

Find live recordings of expert speakers from TechShare Pro here.

Find slides and presentations from speakers at TechShare Pro here, including BBC, Microsoft, Google and more.

Follow us on Facebook for more live videos and updates about next year's TechShare Pro.

Alexa vs Google Home vs Cortana: The battle to reach every user intensifies

There's been an explosion of Echos! We’re not talking the sort of effect that we’d get if Captain Caveman went wild in his mountain dwelling, we’re talking about an absolute explosion of Amazon Echo models in recent weeks. From a new incarnation of the big black column to the tiniest of cute (and very smart) bedside clocks, there’s something for every ear, every location and every budget.

The cute clock that is the soon-to-be-released Echo Spot is probably my favourite - here’s a sneak peek courtesy of the nice people at the Verge.

And of course the arms-race that is ambient computing contains several other horses - to horribly mix metaphors (and split infinitives). The Cortana-driven Invoke speaker is also in the running and, at the time of Amazon’s Echo announcements, you can’t have missed the simultaneous release of a similarly large range of Google Home speakers.

Last out of the gate, due out early next year, will be Apple’s HomePod.

Ambient computing is about to change everything

I’ve discussed voice assistants in several recent posts and shown how simply speaking to the air and getting useful information, being entertained and even performing sophisticated tasks is the next significant chapter in computing.

But how well are these new devices keeping up with inclusive design?

Again, in many recent posts (you really should follow that link above), I’ve explored the imperative that is inclusive design. In this rapidly changing world - where most people on mobiles are temporarily impaired by extreme environments on a daily basis, and a proliferation of platforms means that your content and functionality needs to morph to fit any number of devices and use-cases - inclusive design is the only real way of ensuring that you’re reaching the broadest possible audience and future-proofing your projects.

Yes, I am talking accessibility here but, as I’ve said so many times before, accessibility is now for everyone so let’s give it a new name for a new reality.

Anyone who has experienced ambient computing knows it is here to stay. It represents an entirely new use-case (or whole range of use-cases) and accessibility will play its part in weeding the winners from the also-rans.

Showing the way with VoiceView

These smart speakers will only truly be inclusive when everyone’s needs are taken into account. Just as we have the excellent ‘type to Siri’ in iOS 11 (making the virtual assistant available to those without speech, or to anyone who finds themselves in a noisy environment), the ability to review a text version of everything that an Echo speaks out within the Alexa app (or on the screen of models such as the Echo Show) makes the A-lady accessible to people without hearing.

The Echo Show, however, also includes a full ‘screen reader’ (software to help blind users access screen text and functions) meaning that the addition of a screen does not suddenly exclude a group of die-hard fans from a whole new range of features.

VoiceView is the name of this screenreading ability and, just like Microsoft’s excellent advancements in the built-in screenreader in Windows 10, Amazon should likewise be applauded for bringing inclusion to their latest models out-of-the-box. Here’s a full break-down of all the accessibility features found in the Echo Show.

Google’s smart speaker – accessibility home run?

We know an awful lot about the accessibility of the various Amazon Echos, but what about the Google Home? Is it a home run or a rookie batter wildly swinging at the plate? Well, the jury is still out (I’ve decided to see how many metaphors I can mix and mangle in one article).

We know that Google can make accessible products (a good example is the screenreader built into Android) but we also know that they aren’t averse to releasing products without a whiff of inclusion, such as Android Wear, the version of Android that runs on smartwatches.

The good news is that the companion app used to set up and control your so-far screenless Google Home is nicely inclusive, and this represents a vital component of the overall accessibility of each solution. We also know that, whilst the Echos are chock-a-block with accessibility features, Amazon has some way to go before its Echo companion app, again so vital to every Echo user's experience, is truly inclusive.

As a screenreader user myself, I can attest to just how awful the Alexa app is on both iOS and Android.

There is increasing evidence that a Google Home with a screen is on the way. Will it be as accessible as the Echo Show or a strike-out like the Android watch? When it lands we’ll line up the jury, present the evidence and let them deliver their verdict. Baseball bats may or may not be involved.

Cortana and the Halo effect

Microsoft has entered the voice assistant market with the Invoke. It’s natural to assume that, as Microsoft has been a long-time champion of accessibility, the new Invoke speaker with its built-in virtual assistant Cortana will be inclusive. We’ll again have to see when it falls into the hands of hundreds of eager users with a range of impairments.

Microsoft has produced a huge number of truly inclusive mobile apps in recent years (not least the all-important Office suite) and so I’m confident the Cortana companion app will be accessible.

For my money, the acid test will come with the first model to include a screen. I’m rooting for a home run…

Related links