Researchers at Texas A&M University's biomedical engineering department have created a connected armband that converts sign language into text.
While Google and co focus on developing Babelfish-style software to translate speech from one language to another, this new wearable focuses on turning wrist motions and muscle activity into words.
The signer wears the armband, which includes sensors that track hand and wrist gestures as well as electromyography (EMG) signals produced by muscles in the wrist. The sensor data is then decoded and sent as text, via Bluetooth, to a PC or smartphone.
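To make the pipeline concrete, here is a minimal sketch of the capture-decode-transmit flow. The team's actual signal processing has not been published, so the feature values, the gesture templates, and the nearest-template classifier below are all invented for illustration; UTF-8 bytes stand in for the Bluetooth packet.

```python
import math

# Hypothetical per-gesture templates: averaged (wrist-motion + EMG) feature vectors.
TEMPLATES = {
    "hello": [0.9, 0.1, 0.4],
    "thanks": [0.2, 0.8, 0.6],
}

def classify(features):
    """Return the word whose stored template is closest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda word: dist(TEMPLATES[word], features))

def to_bluetooth_payload(word):
    """The decoded word is sent as plain text; encode it for transmission."""
    return word.encode("utf-8")

reading = [0.85, 0.15, 0.35]  # one fused sensor sample (made-up values)
word = classify(reading)
payload = to_bluetooth_payload(word)
```

The real device fuses motion and EMG channels and decodes word by word; this sketch only shows the shape of that loop, not its actual features or model.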
"We decode the muscle activities we are capturing from the wrist," said Roozbeh Jafari, associate professor of biomedical engineering. "Some of it is coming from the fingers indirectly, because if I happen to keep my fist like this versus this, the muscle activation is going to be a little different."
Ultimately, the team aims to miniaturise the technology to smartwatch size and to improve the software so it can translate whole sentences, not just the individual words the proof-of-concept device currently recognises.
Such a smartwatch could become far more powerful if it also included a speaker, giving deaf users a synthetic voice. They would sign what they want to say; the sensors would capture the movements and translate them into text, which would then be read aloud through the speaker. Conversation might be slower at first, but since the user would be signing anyway, the device could simply enhance it.
"When you wear the system for the first time the system operates with some level of accuracy," Jafari continued. "But as you start using the system more often, the system learns from your behaviour and it will adapt its own learning models to fit you."
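Jafari's description of a system that "learns from your behaviour" can be sketched with a simple online update. The actual learning model has not been disclosed, so the running-average template adjustment below is only a stand-in for how a stored gesture profile might drift toward an individual user over repeated use.

```python
def adapt(template, observed, rate=0.1):
    """Nudge a stored gesture template toward the user's latest sample.

    rate controls how quickly the model forgets its generic starting
    point; both the update rule and the rate are assumptions here.
    """
    return [t + rate * (o - t) for t, o in zip(template, observed)]

template = [0.9, 0.1, 0.4]  # initial generic "hello" template (invented values)
for sample in ([0.8, 0.2, 0.5], [0.78, 0.22, 0.52]):
    template = adapt(template, sample)
# After a few confirmed recognitions, the template reflects this user's signing style.
```

Each confirmed recognition pulls the template a little closer to the wearer's own signal, which matches the quoted behaviour: low accuracy out of the box, improving with use.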
Recently Thalmic Labs' Myo gesture control armband was used to help a man control his robotic prosthetic arm by thinking. The Myo band replaced stick-on electrical sensors and communicated with the robotic arm via Bluetooth.