Wearable tech that interprets sign language

Wearable technology is poised to bridge the communication gap between the 70 million people in the world who use signing as their first language and the rest of the hearing world. A research team at Texas A&M University, led by associate professor of biomedical engineering Roozbeh Jafari, is developing technology that can detect hand and finger movements and translate them into spoken or written text, as we see in this short video.

The wearable technology combines sensors that detect hand motion with sensors that measure the electrical activity of the wrist muscles. It is able to recognise the wearer's hand gestures and then translate the individual's sign language into text or even spoken words using text-to-speech synthesis.
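To make the idea more concrete, here is a minimal, purely illustrative sketch in Python of how readings from motion sensors and muscle-activity sensors might be fused into a single feature vector and matched against learned sign templates. This is not the Texas A&M team's algorithm: the sensor channels, window length, sign labels and the simple nearest-centroid matching rule are all assumptions made for the sketch.

```python
import numpy as np

# Assumed set-up for the sketch: each gesture is captured as a short window of
# samples from two sensor streams - motion (e.g. accelerometer/gyroscope
# channels) and muscle activity (EMG-style channels at the wrist).
WINDOW = 50  # samples per gesture window (invented value)

def extract_features(motion: np.ndarray, emg: np.ndarray) -> np.ndarray:
    """Summarise one gesture window as the mean and spread of every channel."""
    feats = []
    for channel_block in (motion, emg):           # each has shape (WINDOW, n_channels)
        feats.append(channel_block.mean(axis=0))  # average level per channel
        feats.append(channel_block.std(axis=0))   # variability per channel
    return np.concatenate(feats)

class NearestCentroidSignClassifier:
    """Tiny stand-in for a real gesture recogniser: one template per sign."""

    def __init__(self):
        self.centroids = {}  # sign label -> mean feature vector

    def fit(self, examples):
        """examples: list of (label, motion_window, emg_window) tuples."""
        grouped = {}
        for label, motion, emg in examples:
            grouped.setdefault(label, []).append(extract_features(motion, emg))
        self.centroids = {label: np.mean(vecs, axis=0)
                          for label, vecs in grouped.items()}

    def predict(self, motion, emg):
        """Return the sign whose template is closest to this gesture window."""
        features = extract_features(motion, emg)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(features - self.centroids[label]))

# Toy usage with random numbers standing in for real sensor streams.
rng = np.random.default_rng(seed=0)
train = [
    ("HELLO", rng.normal(0.0, 1.0, (WINDOW, 6)), rng.normal(0.0, 1.0, (WINDOW, 4))),
    ("THANK-YOU", rng.normal(2.0, 1.0, (WINDOW, 6)), rng.normal(2.0, 1.0, (WINDOW, 4))),
]
classifier = NearestCentroidSignClassifier()
classifier.fit(train)
print(classifier.predict(rng.normal(2.0, 1.0, (WINDOW, 6)),
                         rng.normal(2.0, 1.0, (WINDOW, 4))))
```

A real system would replace this toy matcher with far more sophisticated signal processing and machine learning, but the basic flow is the same: sense the motion and muscle activity, turn each gesture window into features, and map those features to a sign that can then be spoken or written out.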

“We decode the muscle activities we are capturing from the wrist,” says Roozbeh Jafari. “Some of the activity is coming from the fingers indirectly, because if I happen to keep my fist like this versus this [slightly changes hand shape], the muscle activation is going to be a little different.”

A breakthrough in translation

[Image: the British Sign Language sign for 'computer', made with both open hands held next to each other in front of the body, palms facing down and fingers wiggling.]

Computers are getting ever better at translation from one written or spoken language to another. There are a huge number of apps, such as Google Translate on your smartphone, that make communicating with others who speak a different language much easier.

Even Skype on your computer now comes with an option to instantly translate what you have just said into another language, which is then spoken out loud to the person on the other end of the call – and vice versa.

Sign language, however, presents a completely different challenge, not least because it varies considerably from English as we know it. Although it is used alongside English, it has quite a different grammar (sentence structure) and a smaller vocabulary: in British Sign Language, for example, question words usually come at the end of a sentence rather than at the beginning. It is in effect another language, and sign language varies across the world. Translating from British Sign Language into spoken English therefore presents much the same challenge as translating from, say, French into English.

Actions speak louder than words

So the technology for translating from one written, or even spoken, language to another is maturing nicely. Translating from hand gestures, however, has its own unique challenges, and this breakthrough technology will help give deaf people a voice and put their actions into words.

The team at Texas A&M University hope eventually to shrink this prototype into something the size of a smartwatch – reading tendon and muscle contractions and variations in electrical signals directly from the wrist and translating them into everyday spoken English.

More information