The Myo armband has been used to allow a man named Johnny Matheny to control his prosthetic forearm and hand with gestures.
It's an impressive use of Thalmic Labs' wearable, which reads electrical activity in the muscles and uses a nine-axis motion sensor to detect rotation and movement. Thalmic Labs and the Johns Hopkins Applied Physics Laboratory have created a setup in which Matheny has a prosthetic arm attached directly to his skeleton and wears two Myo armbands on his upper arm.
The gesture-control wearables then read electromyography (EMG) signals and send them via Bluetooth to a computer for analysis, which relays the results to his prosthetic arm. No buttons, no levers, no voice control. When he thinks about moving his arm, his arm moves: opening his hand, shaking hands, making a fist, picking up objects and rotating his arm are all possible.
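To make the pipeline concrete, here is a minimal toy sketch of that EMG-to-command flow. This is not the actual Myo SDK or the Johns Hopkins analysis software; every function name, threshold and command string below is hypothetical, standing in for the real signal classification that happens on the computer before a command reaches the prosthetic.

```python
# Toy sketch of an EMG -> analysis -> prosthetic-command pipeline.
# All names, thresholds and command strings are hypothetical illustrations.

def classify_emg(window):
    """Map a window of EMG samples to a gesture label using a simple
    mean-amplitude threshold (a stand-in for real signal analysis)."""
    mean_amp = sum(abs(s) for s in window) / len(window)
    if mean_amp > 0.6:
        return "make_fist"   # strong muscle contraction
    elif mean_amp > 0.2:
        return "open_hand"   # moderate contraction
    return "rest"            # little activity

def to_arm_command(gesture):
    """Translate a gesture label into a stand-in command string that a
    prosthetic controller might consume."""
    return {"make_fist": "CLOSE", "open_hand": "OPEN", "rest": "HOLD"}[gesture]

# Simulated EMG windows (arbitrary normalised units).
windows = [
    [0.05, -0.03, 0.04, -0.02],  # resting
    [0.30, -0.25, 0.35, -0.28],  # moderate activity
    [0.80, -0.75, 0.90, -0.70],  # strong contraction
]

for w in windows:
    print(to_arm_command(classify_emg(w)))  # prints HOLD, OPEN, CLOSE
```

In the real system, the classification step is far more sophisticated, but the shape of the flow is the same: muscle signals in, gesture out, command to the arm.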
The potential for gesture controls
What's notable about this Myo band project is that the wearable has been marketed as a tool for all sorts of gesture-based computer interactions, with 100 apps on the Myo Market. It is designed to be worn on your upper arm, even under your clothes, weighs less than 95g and is also expandable.
Now there's a chance not only to give us new ways to play with tech toys, control our smart homes or play Iron Man, but to change the relationship between prosthetics and their users.
"If Johnny's case shows it is possible to directly turn thoughts into actions, then the future of human-computer interaction can achieve a new reality," said Thalmic Labs CEO Stephen Lake. "While each person's arm and mind may be different, this is an incredible example of how scientists, developers and engineers around the world have transformed lives using the Myo armband. That is why we have opened the SDK to third party developers."
The Myo armband is on sale now in black and white and its app store is live. Expect to see more innovative uses pop up over the course of 2016.