It's a simple idea, yet a big one: a smart glove that tracks sign language gestures with bend sensors on the fingers and a gyroscope at the base, then either reads the sentence aloud or shows the text on a display.
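To picture how those two sensor types might combine, here is a minimal, purely illustrative sketch: five flex-sensor readings plus a gyroscope angle form a feature vector, which is matched against a small library of known gestures by nearest distance. The gesture names, values and threshold are hypothetical, not taken from the Re:Voice implementation.

```python
import math

# Hypothetical gesture library: name -> (flex values for 5 fingers, palm roll in degrees).
# 0.0 = finger straight, 1.0 = fully bent. Values are illustrative only.
GESTURE_LIBRARY = {
    "hello": ([0.1, 0.1, 0.1, 0.1, 0.1], 0.0),
    "yes":   ([0.9, 0.9, 0.9, 0.9, 0.2], 0.0),
}

def classify(flex, roll, tolerance=1.0):
    """Return the library gesture closest to the current sensor frame,
    or None if nothing is close enough."""
    best, best_dist = None, float("inf")
    for name, (ref_flex, ref_roll) in GESTURE_LIBRARY.items():
        # Euclidean distance over finger bends, plus a normalised roll penalty.
        dist = math.dist(flex, ref_flex) + abs(roll - ref_roll) / 90.0
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= tolerance else None

print(classify([0.85, 0.92, 0.88, 0.9, 0.15], 2.0))  # -> yes
```

A real glove would sample these frames continuously and smooth out sensor noise, but the core matching idea is the same.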
That's exactly what Hadeel Ayoub, a PhD researcher in assistive technology at Goldsmiths' department of computing, is working on. She is demoing her IBM Watson-powered Re:Voice data glove prototypes at the TechXLR8 show in London this week, where I saw the glove in action. Ayoub signed "One day I hope to give a voice to those who can't speak" and, as I bent down to put my ear closer to the built-in speaker, I heard the words even over the noisy show floor. Pretty remarkable.
People were so emotional during the demos, especially family members of people who use sign language every single day
Now, there are a few similar projects in the works, notably a duo at the University of Washington working on the SignAloud gloves. What's exciting about Ayoub's project is that once she raises enough funding, her Re:Voice gloves could be ready to order within nine months, built with product development company Nine Degrees. So by "next Christmas" they could be a real product. Ideally, they would be priced affordably, at a target of around £500 (versus the typical £2,000 for assistive communication tablets), and bought by schools, offices, airports and other institutions.
"My Masters was in computational arts and I was designing a wearable called Air Draw which you wear and you draw your sketches in the air," she explains. "I used the same tech for model building. Now, to see who did each painting, I wanted them to write their names but I didn't want to bring in the keyboard. I wrote three or four lines of codes to write A, B, C. The letters became words, words became sentences, sentences became speech and now it's translating."
Ayoub still teaches in design, so she will continue to develop her model-making wearable alongside this PhD research into assistive tech. The feedback from the start has been overwhelmingly positive.
"People were so emotional during the demos, especially the family members of people who use sign language every single day," she says. "I've had a lot of emails from parents, waiting for this type of tech to come out. They're really invested."
The translation happens in real time, but while Ayoub experimented with producing speech as the wearer signs, early feedback from testers suggested it is less confusing for the glove to start speaking only once the phrase is completed, so that's how the current prototype works. "I was testing this with kids with it translating at the same time and it wouldn't know when to stop," she says. "It was running all the time. But this one [the latest prototype] I tell it I've started and I've stopped with a button."
There is also a training mode in which the wearer can add their own signs and gestures. These are built into a personal library with the help of machine learning, which has so far worked better than fixed, pre-built libraries because people sign slightly differently. With the training mode there are almost no errors, unless there's a connectivity problem, as it all works via the cloud. Plus, the translation works in any language - French, Italian and so on.
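One simple way such a training mode could work, shown here as a hedged sketch rather than Ayoub's actual method: the wearer repeats a new sign a few times, and the recorded sensor frames are averaged into a personal template, so the library adapts to that individual's signing style. The function and data names are hypothetical.

```python
def train_sign(name, samples, library):
    """Average several recorded sensor frames into one template for `name`.

    samples: list of frames, each a list of per-finger flex readings
    library: dict mapping sign name -> stored template
    """
    n = len(samples)
    # Element-wise mean across all recorded repetitions of the sign.
    template = [sum(vals) / n for vals in zip(*samples)]
    library[name] = template
    return template

lib = {}
train_sign("thanks", [[0.25, 0.75, 0.75, 0.75, 0.75],
                      [0.75, 0.75, 0.75, 0.75, 0.75]], lib)
print(lib["thanks"])  # -> [0.5, 0.75, 0.75, 0.75, 0.75]
```

Averaging a handful of repetitions is the crudest possible personalisation; a cloud-based system like the one described could train a proper classifier per user instead.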
The plan now is another round of testing the glove with 15 to 20 people who are either hearing impaired or have non-verbal autism, Ayoub's two main target groups locally. She is also actively looking for investment as, although she has some collaborators to call on, it's very much a one-woman show at the moment.
The final design of Re:Voice is still in flux, though. To make it more practical day to day, she is making the glove itself insulated, waterproof and washable, with removable hardware. She is also experimenting with both batteries and external power: "Last week I was in training mode and I decided it would be more helpful to display on the screen that it's recording, rather than the text, for instance. Every day I sit in the lab and I try new things."