Next up then is Jonathan Hassell, who I’m sure most of you know. Jonathan’s been with the Beeb for quite some time now. Before that he did a PhD in HCI, Human Computer Interaction, and he’s been pivotal in accessibility in all areas of the BBC for ages now. At the moment he’s Acting Head of Audience Experience and Usability and leads the Usability and Accessibility Team, who are basically responsible for every single aspect of accessibility right across the online content, and probably beyond as well. And for the last three years he’s been heavily involved in BBC Jam, which I think we’re going to see a bit of today. And he’s talking about UGC, a hugely challenging area with real opportunities for disabled people. So a round of applause for Jonathan.
Well, good afternoon. Antonia got the graveyard slot after lunch, so hopefully you’re waking up. Thanks, Antonia, wherever you got to. People with learning difficulties often get missed out at things like this, so it’s good that that was on the agenda. Probably the only other thing to add to what Robin said is that the eagle-eyed amongst you, if you can see the slide, will see that I’m not acting anymore; I got the job yesterday. Lovely. So, moving on.
What am I going to talk to you about for the next twenty-five minutes or so? I’m going to give you a little bit of history, the BBC and accessibility, what we’ve been doing for a while. Web2 really changes the rules in a lot of ways and, as for most speakers this afternoon, a lot of our thunder was stolen this morning. So I’ve been re-jigging my presentation all morning, saying, ‘I just need to reinforce that. Don’t need to say that anymore’. So I’m going to reinforce various things that people have already said to make sure you got them, and hopefully give you a few more examples and some more detail on UGC, multimedia and what I call beyond inclusion. I’m going to show you some stuff that, unless you’ve been at the various conferences I’ve been at over the last few months, you may not have seen anything like before. So I just want to take us on.
So, a brief history of the BBC and accessibility. As I say, the BBC has been around for quite some time. We started off on broadcast media, so if you like we’ve been looking at the needs of disabled people for a while. There are going to be three themes in my talk as I go through, and I think we’ve touched on a number of these already, but I just wanted to try and make them a little bit concrete: inclusion, personalisation and beyond inclusion. I’ll say more about each, and I’ll just reference them as I go through.
So, the BBC, TV and radio: we’ve been in these areas for ages, trying to make TV accessible, that sort of thing, subtitles and audio description. It’s what half of the BBC spends a lot of time on. And also creating programmes specifically for disabled audiences, things like See Hear for deaf audiences and In Touch on the radio. We’re also trying to move those towards the mainstream now, so programmes like Desperados, which my nephew who plays wheelchair basketball absolutely loves, and which was all about a wheelchair basketball team. As I said, we’re trying to get disabled people not just to be one of those accessibility issues, if you like, but actually, fundamentally, on screen, so that they can see themselves; it’s an identity thing. And then online, because I guess that’s what we’re all interested in here.
As I say, it goes back to about 1998: Betsie was a tool for creating text-only pages and it’s very, very long in the tooth now. Christian was saying earlier that if you put something good on the website you have to make sure you keep it going; Betsie is pretty much on its ten-year anniversary and we’re just about to retire it and replace it with something better. In 2002 we really got into accessibility properly: training courses run by Robin here, and I think it’s the first time that AbilityNet and us worked together. We created Ouch, our disability website on bbc.co.uk. Through 2005/2006, again, I think Christian alluded to it earlier, My Web My Way. I completely agree with him: it’s more important that we understand that users use the Internet rather than just our site when trying to do things like text re-sizing or whatever. My Web My Way was, again in collaboration with AbilityNet, our attempt to give people the tools to get what they needed, not just from the BBC site but from everywhere. And we’re doing a huge amount of work on Web2 at the moment, so things like iPlayer, the new BBC homepage, all sorts of things that you may have seen if you’ve come to the BBC. I’m going to skip on.
Well, it’s good that we’re here and it’s good that AbilityNet wrote the State of the eNation report earlier in the year. As I say, they’ve already illuminated a lot of the problems that face disabled people, but unfortunately that’s only the start of the challenge really and, again, a bit of my fire was stolen this morning, but fundamentally it’s about the content. If the interface enables you to get there, that’s great, and actually a lot of those sites at the moment have real problems there. But if the content itself isn’t accessible you have a real problem. So increasingly, because the content is the users’ creation, we have a new world.
So I just want to show you, because as Robin said, I’ve been doing this sort of thing, standing up at these sorts of events, for years, and this was a slide that I probably created in about 2003, about accessibility being a partnership. All of these different people, all the website creators, assistive technology, all the way through to operating systems, need to be working together to get an accessible experience. And we’ve had some of that debate going backwards and forwards this morning about whose responsibility it is for a site, for instance, to provide speaking facilities for people who might require that. So, in general, if we have all of those working well together you have an accessible site. And because of the DDA, and I guess the work of various people that I can see around here, we’ve had a huge number of successes.
So we’ve got Legal & General and Tesco up there, just a couple of instances of sites that have really put accessibility fundamentally on their road maps and have had a lot of success from that. But how do we persuade the everyday users of blogs, MySpace, Bebo or YouTube to make their content accessible? And this is the thing here: it’s the content that I’m talking about. We’ve had a lot of other things today. So what we have here is a few of those sites. How much ALT text do you think is on the one on the left there, the blog page? Any ideas? Ah, no, actually it’s not that bad: it’s one. Right in the top left corner is a Blogger logo. That has ALT text; nothing else does. Similarly there’s nothing on the Bebo page. I can’t remember how much was there on the YouTube one. But really what we’re talking about is: when it comes to the users creating this stuff, how do you work that through?
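As an aside, the kind of spot-check described here, counting which images on a page actually carry ALT text, can be scripted. This is a minimal sketch using Python’s standard-library HTML parser; the sample markup is made up to mirror the blog page on the slide, where only the logo has ALT text:

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Count <img> tags with and without a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.with_alt = 0
        self.without_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:  # present and non-empty
                self.with_alt += 1
            else:    # missing, or alt=""
                self.without_alt += 1

# Hypothetical markup standing in for the blog page on the slide:
sample = (
    '<img src="logo.gif" alt="Blogger logo">'
    '<img src="photo1.jpg">'
    '<img src="photo2.jpg" alt="">'
)
audit = AltTextAudit()
audit.feed(sample)
print(audit.with_alt, audit.without_alt)  # → 1 2
```

A real audit would treat `alt=""` on purely decorative images as acceptable; the sketch lumps it in with missing ALT text to keep the counting simple.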
Now I think I probably don’t need to show you this slide. We all know this is really, really important. Blogs were identified as changing election results over in the States. Many sites have got a Have Your Say type thing, increasingly now with imaging. AbilityNet have an accessibility channel on YouTube, and pretty wonderful it is too. The BBC’s Born an Island, and Second Life: this stuff is really, really crucial to where the Internet is going, so it is important that we get this sort of thing right. So if disabled people need accessibility support in content, how do we enable it, and fundamentally, where does the responsibility lie? Whose job is it to do this stuff? There are two things here: the tools, and the moderation.
So the responsibility of the site, and I think this can be nobody else’s, is to make sure that the tools enable this to happen. Does the content-building tool include a mechanism for enabling users to enrich pages with the right sort of things? As was mentioned earlier, this really comes down to the ATAG guidelines, which sit within the WAI guidelines and are pretty much the lesser known ones unless you’ve been creating content management systems. So, a quick example of the state of play.
This is a site I created for myself last year, just to do a little bit of a test, because there are a lot of site builders out there now and I wanted to see how quickly I could turn jonathanhassell.co.uk into something I’d be proud of. Only one out of the four site builders that I used enabled me to add any ALT text to any images on that site at all. The tools themselves wouldn’t allow me to do it, and I was trying to say, ‘Hey guys, I do accessibility’; that site was trying to put that message across. So only one tool would allow me to do that and unfortunately it didn’t make the page look very good, so I just gave up. Now, that’s because I was actually looking for those things, but imagine if I wasn’t and didn’t even know that I should be doing that. You can imagine people using these sorts of tools; they come free with domains a lot of the time. If you buy a hosting package from your web company you get something like this thrown in free. A lot of people are putting these out there, and what do you do? So I think it is the responsibility of the site creators to put that in.
But after that it starts getting even worse, because firstly, is it up to the site owner to let users know that accessibility supports exist? And I’m just looking at ALT text here for a second; I’m going to move on from there. But whose responsibility is it? The site owner can let people know that it’s useful. They can give them the tools to do it. Will they monitor whether users are actually enriching their content with that? And if the users aren’t doing it, then what happens? Will the site owners do the enrichment themselves through moderation? We know a lot about moderation at the BBC. A lot of people got out of message boards and things like that a number of years ago because it was so expensive to moderate the stuff. We’re still doing it, so we know how much it costs, and those people who got out of it knew how much it cost as well.
So it’s incredibly costly, and when you’re looking at ALT text it’s almost impossible to automate. If you think it is possible to automate, I’d just ask you to look at CAPTCHAs. Everyone knows that CAPTCHAs in general are not very accessible. But the whole reason CAPTCHAs are there is that people don’t think a computer can look at an image and work out what it is. So we can’t automate this. So where does that ALT text come from? Should there be, for instance, a difference in responsibility between social networking sites, so things like Facebook and YouTube, and more general site creation tools, so Blogger, Google Page Creator, things like that? I’m guessing that Facebook are slightly more responsible, potentially, for stuff that appears on their site than WordPress or anybody like that. And the problem, I think, really is that the DDA isn’t clear here.
In the past, at the worst case, if a disabled person went to a site and found it wasn’t accessible, they would at least know who to contact to see what their rights were and to actually start that conversation. Now, if we’re pushing out a lot of our stuff and we’re saying, ‘Actually, let’s get the users to do it’, who do you go to? So to a certain extent I’m just giving you a problem here. It’s really, really difficult and we’ve got to try and come up with things that are going to help.
What are the BBC doing? As I say, we’re looking at moderation as the accessibility approach. We’re trying to be strategic about it. We’re trying to enrich the most popular content on our sites with this sort of stuff because, as I say, to try and do everything is incredibly expensive. But it’s not just about consumption. Web2.0 is all about creation. So what about contributing? Can disabled people actually get their stuff out there? Will their tools allow that? Or will disabled people be left without a voice? For many this is very easy. For others it’s really not.
I’m going to give you two examples. One for people with literacy difficulties and one for people whose first language is BSL.
So, literacy difficulties. These people have probably wanted to contribute to things like Wikipedia for ages, but words are the problem for them. They don’t like communicating with words; it’s not their strong point, and so if they want to be out there in public making their opinions heard, they feel like they’re going to be embarrassed. So if you, for instance, compare the task of emailing your friends in Facebook with doing it in Outlook, you can see a couple of things there. They’re a little bit difficult to see; the only thing you really need to look at is the little red underline. In my Outlook it does me an automatic spell check, so I get my underline. When I’m in Facebook, in just a normal form, I don’t get my underline. Now, for me that could just be slightly annoying, but for someone who’s dyslexic it could be something they use to make sure that when they communicate via the electronic medium they’re doing it in a way that makes them feel okay and good about themselves, with their spelling right. So you could put a spell checker on all of your form boxes; you can find ones that AJAX gives you now. And fundamentally you don’t need to do things with text. The BBC’s Upstaged site was all about video contribution, so that really doesn’t place anybody with a literacy difficulty at a disadvantage. Moving on quickly.
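The spell-checker suggestion can be sketched very simply. In the browser itself, setting the `spellcheck="true"` attribute on a form field asks the browser to draw that red underline; as a server-side illustration, here is a minimal Python sketch, where the tiny word list is a stand-in for a real dictionary:

```python
import difflib

# Stand-in word list; a real checker would load a full dictionary.
DICTIONARY = {"hello", "grandma", "i", "am", "a", "little", "shy"}

def check_spelling(text):
    """Return a suggested correction for each word not in the dictionary."""
    suggestions = {}
    for word in text.lower().split():
        if word not in DICTIONARY:
            close = difflib.get_close_matches(word, DICTIONARY, n=1)
            suggestions[word] = close[0] if close else None
    return suggestions

# The misspellings get flagged with their nearest dictionary word:
print(check_spelling("helo grandma i am a litle shy"))
# → {'helo': 'hello', 'litle': 'little'}
```

The point is not this particular algorithm but that the check lives with the form, so a dyslexic user gets feedback before their words go public.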
People who use BSL may be happy to use written English in a closed forum with their peers. Again, these are people for whom English is not their first language; BSL is their first language. They use written English to communicate with the rest of us. Now, they may feel very intimidated because of their language skills in English; if you look at the statistics, the average reading age of a deaf person in this country is around about 9. Not everybody has a reading age of 9, but that’s about the average. So text isn’t really where they want to be, but here’s where Web2 really comes in. They may put BSL video on YouTube. It’s great; it fundamentally allows them to be at the same party as the rest of us. They may even put subtitles or transcripts on it to make it accessible to non-BSL users. You’re not used to this.
Here we are. Here’s the other side of the equation. Suddenly, everybody who doesn’t have this particular disability (and some of them don’t feel it is one; they think of it as a linguistic thing) is on the other side. Effectively you have some content there that, I’m guessing, pretty much everyone in the room could not access unless they had a transcript. So they may decide to do that. Maybe they won’t. Which moves us on to multimedia.
So far, so mostly good. As I say, Web2.0’s very cool. We’ve had a lot of technical stuff this morning, but there’s so much multimedia content out there; how’s that changing things? My contention, if I go back to that slide, concerns the assistive technology part of the equation. So, some threats and challenges there.
I don’t think the accessibility chain works anymore. Find me an assistive technologist that can make video accessible to blind people, or podcasts accessible to deaf people, or, for that matter, multimedia online games fully accessible to either. The problem really is that modern ATs can’t crack things that pretty much aren’t text. As I say, I’ve got a still of Harold Lloyd hanging off a clock there. That’s a silent movie. To make that accessible to a blind person: no screen reader on the planet can do it. And it’s not just about the controls; it is about the content. That’s why we need to move into new territories and we need to look at content creators who have actually been doing this sort of thing for a long time.
Before I do that I just want to reinforce that this is not all bad news. The opportunities for disabled people from multimedia are absolutely massive, and especially for a lot of people who sometimes get forgotten. As I say, WCAG, which everybody still uses as a touchstone, doesn’t really say much about people with learning difficulties, people who might be autistic, people who might be deaf. But video for many is miles more accessible than text. Put simply, why have plain English when you can have TV? Any person with a learning difficulty, I think, would pretty much say they’d always prefer watching a video. As I say, multimedia can give us new opportunities, and for deaf people who use BSL it’s really the first time they’ve ever been able to get their language out on the Web.
But what do we do for all of those threats, all of those difficulties? How do we personalise multimedia? As a number of people have said, it feels great to be standing here in 2008 rather than in 2005, when most people were thinking, ‘Accessibility: that’s those screen readers, isn’t it?’ It’s not just about blind people. I think we need to get rid of the phrase ‘it isn’t accessible to…’ I don’t think anything’s accessible, really, because I don’t know what audience you’re talking about. I’d like to replace it with: it is, or isn’t, usable by people with a particular disability. And as I think Christian said this morning, we can understand that sometimes you can’t please all of the people with the same product. It’s always been true, and I’ve been having this battle for years. Web2.0 just makes it even more so.
So what you really need to do is make things multimedia. I love ARIA, but that rich internet, rich media, multimedia, multi-modal stuff is where I’m coming from, because you don’t want to throw the old media out with the new stuff. Access services for TV: we’ve known this for years. Subtitling, audio description and signing. As I say, that’s a still from the BBC iPlayer; about two weeks ago now we got subtitling working in it for download. We’re still working on the streaming. I think we’re the first broadcaster in the UK to have got that done. As I say, it’s possible; you can do it. And that’s what I would term personalisation, because not everybody wants that. It’s not one size fits all, especially when it comes to signing. How many people have decided to watch that repeat of a programme at one o’clock in the morning only to find that the programme is in the top left corner and there’s a signer in the bottom left-hand corner? How off-putting is that? As I say, these things are not one size fits all.
I think personalisation is something that’s been alluded to; there were a number of preferences in some of the stuff this morning. Personalisation for me is really, really key to where we’re going. And if you look at virtual environments and games, you take all of those things and you really, really push it: 3D immersive environments, all of that sort of stuff. How are assistive technologies really going to help us with that? The only people I know who really, really know what they’re doing are the people on the games accessibility SIG. And with user-generated stuff: it’s okay if it’s a broadcaster creating these things, but if it’s user generated, well, then we have all of those costs doubled. I won’t tell you how much it costs to do subtitling for off-schedule content.
That’s content that hasn’t gone out on TV at the BBC, so it’s content where we can’t reuse the subtitles from the telly, and I mean the cost is pretty large. So you can look at moderation as the accessibility approach; you have to be strategic about it. You can also ask other motivated users.
As I say, we were talking earlier; I did a little bit of a look at Read On while I was sitting up there. They’ve done 500 clips. One of the services that I was looking at from the BBC yesterday was getting 200 hours of clips every day, and that’s all user generated content. So it’s really costly; it’s incredibly expensive. Even the people who are trying to do it this way, trying to get the community in on it, are going to be behind where the programmes are. And I’ll just reference the point from earlier: we could get the podcast of this up tomorrow, but if you want the transcript it’s going to take a couple of weeks. There’s nothing wrong with that at all; that’s just the reality of the situation. That’s where we’re at, at the moment. This is a huge accessibility challenge. And porn for the blind was referenced earlier; as I say, that was in the Metro yesterday. I wouldn’t advise you to go and see it, but it’s quite an interesting community initiative.
Okay I’ve bamboozled you enough. I think I’ve probably made the whole situation sound rather worrying. I wanted to give you some demos of things because sometimes even personalisation really isn’t enough when a particular audience needs something specific to their needs. I’ve spent my last three years looking at how to use games to educate children, and specifically children with disabilities. Now the sort of stuff we’ve been doing is very R&D and it can potentially suggest new techniques, new technologies that can be used in the future.
So I wanted to show you a couple of things. One is a creation tool for children who are deaf and use British Sign Language, which might also help with crossing the barriers between people who use sign language and people who use written English. And if I’ve got time I may wave my arms about a lot, trying to help you understand how we can make games accessible for blind children. So I’m just going to start up the first one. Just a little bit of context about the project that I’m about to show you.
We were trying to find a way of enabling primary-age deaf children to learn British Sign Language and English at the same time. Ah, here we are. This always happens when you’re trying to do things live. Let’s try this way again. Oh no, don’t worry. We’re there. Good. Okay. This is something called Performing Hands. It still isn’t out there yet; we had a few technical and political problems with the project that I was working on. I won’t bore you with it, but this is the product that we created and there are a couple of things on it I want to show you. So I’m just going to bring up a signing avatar.
You may have seen these before in various places. They came out of an EU project initially, a few years ago, and people have been struggling to know what to do with them for a while. So we have a little genie here, and what I’m going to do, very, very quickly because I know we’re running out of time, is give you an example of the type of thing we can do. So, I don’t know how your BSL is, but BSL is a performed language. It does not exist written down. It’s also incredibly difficult if you think about it: if you want to communicate with that language on the web, the only way really you can do it is to…well, you can’t type; it just doesn’t exist in written form. You can maybe video everything, but that assumes you’ve got a video camera, and it’s also not particularly fun. So we invented these little guys, and they sign. What we’ve got here is potentially the start of a means of going between the languages in a very, very restricted set of vocabulary, and I’ll put that together with this.
So that was our first try at ensuring that deaf children could write, because this is a literacy product; it was all about reading and writing. What we’ve got here is a very, very simple playmaker. In true panto style I can choose which actor plays which part, so I’ve put the man as the woman and the woman as the man, because that’s what you do in panto. And what we have here is a very simple way of putting together a little script. I’m just going to do a couple of things. ‘Hello, Grandma. I’m a little shy’. And so we have it there in English and we also have it there in BSL. It’s not perfect but, as I hope you’re getting, it looks kind of cute, and actually for deaf kids it was pretty much the first software they’d really had to help them in this sort of area. And you can imagine where something like that could go, because of the eye-candy-ness of it. If you think of chat rooms, literally you can put these avatars in a room. Think of Second Life and all of those deaf people who are saying, ‘Well, why can’t I use my language in there?’ We’re heading in that direction.
And the last thing I wanted to show you, I think I may have to waggle my arms about a lot. Have we got time for this? Okay. So we had a bit of trouble over lunchtime trying to get this…I’m just going to tell you what we’re about here because this is a really unusual project.
So this was trying to get blind kids to use games to learn, in this case, science. This is for 7 to 11 year old blind kids, none of whom at the start of this project had ever used a computer before. So we wanted them to understand what that funny computer thing was in front of them, and we wanted them to understand, pretty much, three decades’ worth of the grammar of video games and how to have fun with a computer. And we then wanted to use that to teach them things about science. As I say, we don’t do things by halves at the BBC.
So what we’ve got here is something which is all about friction, and which is rather like a number of those golf games that people may have played. I’ve just turned the volume down for a while so that I can scroll through a lot of the instructions. As I say, these kids will not be seeing what you’re seeing; this was for anybody who was working with them. Someone asked earlier about that difficult situation where you have a blind person using something and they’re trying to communicate with a sighted person who’s trying to help them. For a lot of these blind kids their parents would want to know what they were doing, but they probably weren’t into video games themselves.
So the stuff that you’re seeing on the screen is for anybody sighted who is working with them. The stuff you can hear, or will be able to hear soon, is for the blind kids. So what we’ve got, for those who can see the screen, and I will just talk you through it, is a game where you have to get mine carts from the left to the right of the screen. They are of various different weights and you have a power meter. So this game is all about how hard you need to push something and how far it goes. You need to know, fundamentally, how hard you’re pushing, and then the state of play at all times: where those carts are.
So I’m just going to give you a demonstration of how we did that. All ogres are from Huddersfield. Now, this is why I didn’t really eat much lunch: we were trying to get stereo working in here, and unfortunately I’m guessing, unless any of you start waving your arms at me, that you didn’t get that in stereo, which is a real shame.
As I say, we’ve actually got this in three dimensions with the next product. We have carts that start off from over there. They move this way and they arrive at a certain point. So everything works in stereo. So they start on the right and when they move across you can hear them moving on the stereo field, if the stereo were working today. I feel like actually I’m sort of tweaking every sort of presentation space in the UK at the moment. I’ve done this presentation about six times now and I’ve only had stereo once and every time we’ve tried.
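The stereo field being described here comes down to a pan law: map the cart’s position to left and right channel gains. This is a minimal sketch of constant-power panning in Python; the 0-to-1 position scale is an assumption for illustration, not the game’s actual coordinates:

```python
import math

def stereo_gains(position):
    """Map a position (0.0 = hard left, 1.0 = hard right) to
    (left, right) channel gains using a constant-power pan law."""
    angle = position * math.pi / 2
    return math.cos(angle), math.sin(angle)

# A cart crossing the stereo field from left to right:
for pos in (0.0, 0.25, 0.5, 0.75, 1.0):
    left, right = stereo_gains(pos)
    print(f"pos={pos:.2f}  L={left:.3f}  R={right:.3f}")
```

Constant-power panning keeps the sum of the squared gains equal to 1, so the cart sounds equally loud at every point of its journey; a naive linear crossfade would dip in loudness at the centre.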
So I’m used to moving my arms about, but if anyone’s got a bit of time later and you want to hear this with headphones, you get the real experience. So things move in the stereo field, but there are also three bits of feedback there. Let’s just listen to the power meter; I’ll turn him down for a bit. So, two bits of feedback there: I was pressing my key down and trying to see how far up it was. We have tones going up, da, da, da, da, and we have one, two, three and four, because some people can’t actually tell the difference between notes. So that gives you an idea there.
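That double encoding, rising pitch plus a countable number of beeps, is what lets players who can’t tell notes apart still read the meter. A minimal sketch of the mapping; the base frequency and the ratio between steps are made-up values for illustration:

```python
def power_feedback(level):
    """Return the beep frequencies for a power level: one beep per level,
    each a fixed ratio higher than the last, so the information is carried
    redundantly in both pitch and count."""
    base_hz = 440.0   # assumed starting tone
    step = 1.25       # assumed pitch ratio between successive beeps
    return [base_hz * step ** i for i in range(level)]

beeps = power_feedback(4)
print(len(beeps))    # one beep per level: 4
print(beeps[-1])     # highest tone, in Hz
```

Redundant channels like this are a general pattern: the same value is recoverable by counting events or by judging pitch, whichever the listener finds easier.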
Now, what has this got to do with anything that you’ve got back at the office? Well, think back to that meter earlier, the Twitter thing, the number of characters counting down: it was telling you thirty characters, fifteen characters, ten characters. That will be getting in the way of what you are typing in on a screen reader; the two will be interfering.
The whole point for us was to try and get, almost for the first time, sound actually being the feedback for a blind person from a bit of software, rather than just this interminable babble of the screen reader. And the other great thing about it is you can layer sounds on top of each other, and as long as you don’t do that badly you can actually get a hell of a lot more information across. And just in terms of how the carts go along (I’m going to end now): effectively, yes, you hear them coming through. You also then hear the ogre doing a number of steps to get to them, and again you can count the steps.
So it’s about trying to give people a multi-modal way of getting the information. And think about if you were blind and you were trying to use Second Life: the cocktail party effect is always the thing. You have fifty people in the room with you and somehow you’re still managing to hear exactly the person you want to hear. Why? Because you can locate them in space, you can turn yourself to them and you can say, ‘Right, you’re the only person here I really care about hearing’. In Second Life at the moment you have about five avatars in the same place; well, how do you do that with them? And the good thing is that IBM and Second Life are looking at this sort of technology. That was all I really had. I don’t know if we have time for questions, but thank you.
Thank you very much indeed, Jonathan. Is this on? Great. Yes, so we’ve got a chance for a few questions while the next speaker, Stephen, is swapping over. Is that all right? Great. So hopefully that won’t be too off-putting, Jonathan. Sorry about that. So, fantastic. Really, really good. Thank you very much. Any questions? The angels will descend upon you.
Well, can I ask you a question? So, did that do it for you? I get the feeling that you had a lot of very, very ‘do this, do that’ type stuff this morning, very, very useful stuff. I’ve probably given you loads of worry and loads of stuff that may not seem very close to where you are. I mean, I guess what I would say is that if Gartner is to be believed, then by 2010 80% of all internet users will have a presence in some form of virtual environment.
So you may not be thinking about this stuff at the moment, but if you think of yourselves as accessibility professionals, and I guess pretty much everybody here is somewhere in that, then the game is moving and the rules are changing, and I hope I’ve just given you a glimpse of some of the solutions that can maybe help with that. Any…yes?
Hi. My name is Mandy and at the moment I work for Riverside Housing, the Riverside Group up in Liverpool. But my previous position was Online Projects Manager for FACT in Liverpool, which is the Foundation for Art and Creative Technology, which is based up there. And one of my roles there was to take artists’ projects and give them an online presence. I’m just giving you a bit of background.
While I was there I did a project with Chris Watson, who is the sound recordist for David Attenborough. And he created a project, a sound-based project, for the children of Alder Hey Hospital. He was taking birdsong and putting it into the hospital, so instead of hearing bleeps and clicks the children were hearing birdsong, and my job was to create a site that basically had that online. But one of my problems was that the accessibility of that was almost zero for people with hearing problems, because getting birdsong across to someone who’s never heard a bird and doesn’t understand what birdsong is, is incredibly difficult.
Now, we kind of sourced hardware from Europe and we did a sort of vibration booth. So we created a booth where people, with a touch-screen interface, sat on a vibrational panel and could feel the birdsong come up, and we could turn up blackbird and turn down sparrow and all that kind of thing. We did a sort of vibrational walk-through part.
It’s very long-winded, but my question to you is: we were talking before about software and getting people to be more aware about accessibility when they’re developing software packages. How did you find it when you were creating things like this with hardware? Because hardware is just as big an issue, and as developers and project managers we can only work with the tools we’re given, hardware-wise. How are you finding the development stages of people creating hardware?
Yes, I mean, we didn’t create any hardware. Actually there’s a load of really good stuff out there. Brain-computer interfaces have come a hell of a long way, and Europe are putting a lot of money into it at the moment if you look at what’s happening in terms of tactile things.
So what I showed you in terms of the mine carts and things is almost a year and a half old now. We’ve got a whole new generation on from there, but that uses 3D sound that comes through headphones, and that would really kill the room. It also uses a number of haptic devices that we’ve been playing around with. A lot of them are really expensive.
Sure. I’ll just give you one example from the start of one of my projects. So, Braille displays exist and they’re generally used by teenagers in schools as an aid when they are doing English, maths and all the rest of it. But most children who are learning Braille in the UK have no electronic support for that at all. They are literally using an old-fashioned Braille machine, where they don’t know if they’re doing it right or not without somebody telling them. So the hardware was out there, but it’s very expensive, and there was no software, until we created what we did, to give anybody a reason to buy it.
If you compare that to Holland where there are some wonderful, wonderful blind schools they buy in bulk and because they buy in bulk the hardware costs come down immensely. So a lot of these things come down to economies of scale. At heart I’m a bit of a visionary and hard-edged businessman all at the same time and I believe that if we create the demand, if we let people know that there is stuff out there that will work for them, they will go out and buy it. As I say, no blind child in this country is using any sort of e-learning software at all until at least about the age of 11, and we wanted to change that.
So yes, it’s all about economies of scale really in the end but thanks for your question and it sounds like a great project.
Thanks, Jonathan. Yes, one example of economy of scale is that the RNID lobbied long and hard with the Government to get digital hearing aids delivered through the NHS. The Government’s argument was that they cost £2,000 apiece, but they pushed hard with the suppliers, and now they are delivered through the NHS at £50 apiece, so that’s a really good example. Great.
Thank you very much indeed. A round of applause please for Jonathan.