
What if your smartwatch could detect exactly how you're feeling? Researchers at MIT have built wearable software to do exactly that, detecting a person's emotions based on their speech patterns.
The software uses algorithms to monitor audio, text transcripts and bio-signals to gauge, at least roughly for now, how a person truly feels. The researchers at MIT's Computer Science and Artificial Intelligence Laboratory say they can currently detect the overall tone of speech with 83% accuracy.
They found that certain signals, such as a monotonous tone, long pauses, increased heart activity and fidgeting, indicated that a sad story was being told, as you might expect. The software can also take a reading every five seconds of a conversation, with an accuracy 18% above chance.
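To give a rough sense of the idea, here is a minimal sketch of how one might score five-second windows of a conversation using cues like the ones the researchers mention. This is not the MIT team's code: the feature names, thresholds and labels are all invented for illustration, and the real system uses trained machine-learning models rather than hand-set rules.

```python
# Illustrative only: label 5-second conversation windows using simple
# multimodal cues (tone, pauses, heart activity, fidgeting). All feature
# names and thresholds are hypothetical, not from the MIT system.

def classify_window(pitch_variance, pause_ratio, heart_rate_delta, fidget_score):
    """Score one 5-second window; more 'sad story' cues -> 'sad' label."""
    score = 0
    if pitch_variance < 10.0:    # monotonous tone
        score += 1
    if pause_ratio > 0.3:        # long pauses
        score += 1
    if heart_rate_delta > 5.0:   # increased heart activity (bpm above baseline)
        score += 1
    if fidget_score > 0.5:       # fidgeting picked up by motion sensors
        score += 1
    return "sad" if score >= 2 else "neutral"

# Each tuple: (pitch_variance, pause_ratio, heart_rate_delta, fidget_score)
windows = [
    (25.0, 0.1, 1.0, 0.2),  # lively speech, few pauses
    (6.0, 0.4, 7.0, 0.8),   # monotonous, long pauses, raised heart rate
]
labels = [classify_window(*w) for w in windows]
print(labels)  # one label per 5-second window
```

In the actual research, a classifier learned from labelled conversations would replace the hand-written rules, but the window-by-window structure is the same idea behind the "reading every five seconds".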
Right now they've got the software running on a Samsung Simband, the company's Tizen-based wearable development platform, but the researchers plan to get it working on commercially available devices like the Apple Watch.
They also plan to take more nuanced readings, picking out moments when the speaker is tense or excited, for example.
Their hope is that one day this type of software could be used in wearable devices as an AI social coach for people with anxiety or Asperger's.
"Imagine if, at the end of a conversation, you could rewind it and see the moments when the people around you felt the most anxious," said Tuka Alhanai, who co-authored a related paper with PhD candidate Mohammad Ghassemi.
"Our work is a step in this direction, suggesting that we may not be that far away from a world where people can have an AI social coach right in their pocket."