Latest version of Smart Specs is sleeker with a focus on users who have tunnel vision

Around 85% of blind people retain some perception of light, but this has historically been dismissed as visual 'noise' and has been of little use to a blind person in comprehending their surroundings.

(Image: Smart Specs being trialled at a demonstration)

Could this light be made more meaningful with a bit of technological intervention, changing the lives of some of the 285 million people worldwide who are blind or visually impaired? According to OxSight, a finalist in the 2016 AbilityNet Tech4Good Accessibility Award, the answer is yes.

OxSight created Smart Specs to do just this. Using augmented reality, the invention enables the wearer to make more sense of their surroundings by simplifying the ambient light, translating it into shapes and shades that help the wearer discern objects and judge depth in the physical environment.

The Oxfordshire-based company uses see-through Epson projection lenses in conjunction with three infrared and depth-sensing cameras to 'project' the outlines of people and objects into the wearer's remaining eyesight. Wearers can then navigate through space with depth recognition of up to four metres (see the TechCrunch video below).

 

Dr Stephen Hicks, research lecturer at the University of Oxford and co-founder and head of innovation at OxSight Ltd, talked to AbilityNet about what's been happening at OxSight since last year's Tech4Good Awards...

How did it feel to be a finalist in the Tech4Good Awards?

It was a great way to get our young team to recognise the importance of working in an area where we are building technology to change people's lives. We also met lots of other people in similar spaces, particularly the team from Wayfindr (winners of the Tech4Good Accessibility Award 2016), and that's helped us build up collaborations and, hopefully, better products. Since Tech4Good we've also been awarded first prize in the Industrial and Life Science Design Competition by Thinkable.

Have the Smart Specs developed or changed in the last year?

Yes, we've spent time working on making them more discreet and less bulky (photo above shows an older version), as well as making them run more efficiently with less power. It's really important that they look good and don't make people feel conspicuous.

Has the technology behind the Smart Specs changed?

Yes, we're pushing hard on machine learning so that future versions of the glasses can detect and recognise different objects in the environment. We've got a nice mixture of research and application and are testing internally. Currently, deep learning requires very serious hardware, so we are putting effort into reducing that.

(Image: A girl drinking tea, rendered in black and white with outline shapes showing depth of field. OxSight uses depth to determine what level of enhancement to apply to a scene: anything very far away is turned black, in the middle distances OxSight enhances object boundaries, and close objects are treated with an algorithm that essentially turns everything into a high-contrast cartoon.)
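As an illustration of the depth-banded approach described in the caption, here is a minimal sketch in Python/NumPy. The band thresholds, the gradient-based edge map and the posterisation step are assumptions chosen for clarity, not OxSight's actual pipeline.

```python
import numpy as np

def render_scene(gray, depth, near=1.5, far=4.0):
    """Illustrative depth-banded enhancement (not OxSight's real code).

    gray  : 2-D array of image intensities in the range [0, 1]
    depth : 2-D array of per-pixel distances in metres
    """
    out = np.zeros_like(gray)

    # Beyond the far limit: treat as background and leave it black.
    middle = (depth > near) & (depth <= far)
    close = depth <= near

    # Middle distance: enhance object boundaries with a simple
    # gradient-magnitude edge map.
    gy, gx = np.gradient(gray)
    edges = np.hypot(gx, gy)
    edges = edges / (edges.max() + 1e-8)
    out[middle] = edges[middle]

    # Close range: quantise intensities into a few bands for a
    # high-contrast, cartoon-like rendering.
    levels = 4
    cartoon = np.round(gray * (levels - 1)) / (levels - 1)
    out[close] = cartoon[close]

    return out
```

In practice the real system would work on a live depth-camera feed, but the same idea applies: each pixel's distance decides whether it is blanked out, reduced to an outline, or simplified into high-contrast blocks.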

Are you focused on blind people in the UK?

Yes, but we are also very keen to develop markets in areas of the world that really need such a product, such as India and China, which have huge populations of people with low vision.

What's next for the Smart Specs?

We're starting a new trial with the Guide Dogs Association next month in London. A number of people will get the glasses over several weeks. We're mostly focusing on people with tunnel vision as this may suit the new type of display better. We still want to work with people who have macular degeneration (loss of central vision, usually associated with ageing). We're doing further work at Oxford Uni, funded by the National Institute for Health Research i4i (Invention for Innovation programme), to look at displays and algorithms that should be able to make a big difference to people with macular damage.

When do you think Smart Specs will be on the market?

We expect to have our first product out before the end of the year.

 
