Artificial intelligence is no longer something confined to the pages of pulpy sci-fi novels; the rise of smart machines is here and now, with everything from smartphones to speakers gaining a degree of intelligence through machine learning algorithms.
Thanks to the use of powerful machine learning techniques, trained on supercomputers yet capable of running on tiny processors, we now have virtual assistants that can serve up personalised and contextual information in a slick fashion, while ordering you an Uber or even a pizza on demand.
The limited space for displays and controls means that tapping into smart algorithms, whether they are built into a smartwatch or piped from an iPhone, makes a lot of sense for wearable tech. Prompting a smartwatch to answer a call with a flick of a wrist is a lot more logical than trying to tap a tiny icon on a watch face or twiddling a crown.
So it's no surprise that the latest generation of virtual assistants are hopping over from smartphones into wearables; Alexa can now be wrist-mounted thanks to the Martian mVoice watch and a few other smartwatches, and she's certainly not alone.
A helping hand
But dig a little deeper and we start to see that AI and machine learning systems are also being built specifically for wearables. Ian Hughes, a connected tech analyst from 451 Research, believes AI in wearables will be designed to excel at certain tasks and not just act as a jack-of-all-trades assistant.
"You're not treating it as a generalist computing device, but you're treating it as something that knows what it's doing. It can be trained to run machine learning or decent levels of analytics to work out what to do, and how to switch what it's doing to meet your needs without having to interface with it," he explains. "It doesn't have to be very clever AI really, it has to have good design."
One rather cutting-edge example of this can be found at MIT's Computer Science and AI Laboratory. Researchers at the lab took Samsung's Simband health-orientated concept smartwatch and created an AI algorithm that uses onboard sensors to collect the wearer's biometric data, like movements, blood pressure and skin temperature, and combine it with audio analysis to detect the tone of conversations the person is having.
The idea is the wearable AI could act as a way to help people with conditions such as Asperger's better understand and detect social cues in everyday conversations that they might otherwise struggle to spot. "Our work is a step in this direction, suggesting that we may not be that far away from a world where people can have an AI social coach right in their pocket," notes MIT researcher Tuka Alhanai.
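The core idea — fusing biometric signals with audio features to estimate the tone of a conversation — can be sketched in a few lines. This is purely illustrative: the feature names, baselines and weights below are invented, and the real MIT system uses trained machine learning models rather than hand-picked coefficients.

```python
# Toy fusion of biometric and audio features into a conversation-tone
# score, loosely in the spirit of the MIT Simband work. All baselines
# and weights are hypothetical.

def tone_score(heart_rate_bpm, skin_temp_c, speech_energy, pitch_variance):
    """Combine biometric and audio features into a rough tone score
    in [-1, 1], where higher means a more positive-sounding exchange."""
    arousal = (heart_rate_bpm - 70) / 50          # physiological stress proxy
    warmth = (skin_temp_c - 33.0) / 3.0           # peripheral temperature shift
    liveliness = pitch_variance * speech_energy   # animated speech reads positive
    raw = 0.5 * liveliness + 0.3 * warmth - 0.2 * max(arousal, 0)
    return max(-1.0, min(1.0, raw))

# A lively, relaxed exchange should score higher than a tense, flat one
relaxed = tone_score(72, 34.0, speech_energy=0.8, pitch_variance=0.9)
tense = tone_score(110, 32.5, speech_energy=0.3, pitch_variance=0.1)
```

Even this crude version captures the key design point: no single sensor is decisive, but combining several weak signals gives a usable estimate.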
But this form of AI tech is not just confined to concept projects and research labs. Take the Bragi Dash Pro, a pair of smart earphones with custom AI tech called the Contextual Engine. It harvests data collected by sensors on the headphones to recognise gestures and movements and control various features in the hearable without needing to tap at a connected smartphone app.
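Gesture control of this kind typically boils down to spotting patterns in a stream of motion-sensor samples and mapping them to actions. The sketch below is a hypothetical illustration in that spirit — the thresholds, gesture vocabulary and action names are invented, not Bragi's actual implementation.

```python
# Hypothetical sketch of sensor-driven gesture control: detecting a
# "double nod" from vertical-accelerometer samples and mapping it to a
# playback action. Thresholds and names are invented for illustration.

NOD_THRESHOLD = 1.5  # acceleration spike (in g) treated as one nod

def detect_gesture(accel_z_samples, window=10):
    """Return 'double_nod' if two spikes occur within `window` samples."""
    spikes = [i for i, a in enumerate(accel_z_samples) if a > NOD_THRESHOLD]
    for a, b in zip(spikes, spikes[1:]):
        if 0 < b - a <= window:
            return "double_nod"
    return None

ACTIONS = {"double_nod": "skip_track"}

samples = [0.1, 0.2, 1.8, 0.3, 0.2, 1.9, 0.1]  # two spikes close together
action = ACTIONS.get(detect_gesture(samples))
```

The appeal of doing this on the device itself is that the gesture-to-action loop never has to leave the wearable, so there is no round trip to a phone app.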
The AI also uses elements of the powerful natural language processing found in IBM's Watson cognitive computing technology, which allows it to provide real-time language translation through the third-party iTranslate app.
It has some way to go, but Darko Dragicevic, Bragi's vice president of partners and solutions, told us that the Dash Pro will at some point offer a form of smart universal translation.
"We're not that far away from being able to walk into a board meeting in South Korea, or Milan, or Los Angeles and having everyone in the room understand everyone else seamlessly, regardless of their native tongue," he says.
Wearables are great at vacuuming up health, fitness and biometric information; AI software loves data. So it's logical to see companies working on fitness tech exploring a combination of the two.
Born out of sensor-laden helmets designed to monitor the physical exertion fighter pilots are put under when pulling multiple Gs, Vi is a pair of headphones from AI wearables firm LifeBEAM. It's built around an AI that crunches the wearer's health and fitness data to fuel smart voice coaching features, aimed at keeping users motivated to achieve their fitness targets.

Another fitness-focused firm, PIQ, tapped into AI tech to create the PIQ Robot. This is a wrist-mounted sensor kit paired with an onboard AI which analyses the wearer's boxing workouts to intelligently find strengths and weaknesses in their technique.
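At its simplest, this kind of workout analysis means summarising per-punch sensor readings and flagging outliers. The sketch below is a toy version under invented assumptions — real systems like the PIQ Robot use trained models rather than a fixed threshold.

```python
# Illustrative only: toy punch analysis from wrist-sensor data.
# The 0.8 "weakness" threshold and the speed figures are invented.

def analyse_punches(speeds_mps):
    """Summarise a list of per-punch hand speeds (metres per second)."""
    if not speeds_mps:
        return {"count": 0}
    avg = sum(speeds_mps) / len(speeds_mps)
    # Flag punches well below the session average as a possible weakness
    weak = [s for s in speeds_mps if s < 0.8 * avg]
    return {"count": len(speeds_mps), "avg": round(avg, 2),
            "fastest": max(speeds_mps), "weak_punches": len(weak)}

session = analyse_punches([6.1, 7.4, 5.9, 3.2, 7.0])
```

The value for the wearer is in the summary, not the raw stream: five numbers become one actionable observation about technique.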
Then you have the Supa Powered Sports Bra created by AI startup Supa, which uses biometric sensors to feed data into an AI-powered app to track the wearer's activity and serve up personalised health recommendations.
None of these examples necessarily makes use of cutting-edge AI tech; instead, they demonstrate that AI in wearables is less about the breadth of what a device can do and more about surfacing information and making specific tasks easier for people.
Getting smarter all the time
"I think the devices themselves are going to have more responsibility to do what they need to do," says 451 Research's Ian Hughes. When it comes to mixing AI and fitness tech, he suggests, the goal is to give the wearer usable information and advice more quickly and easily, rather than requiring them to make their own judgements on the data their device kicks out.
But while fitness trackers get smarter, Hughes says their future is stymied a little by having lots of devices gathering and processing information yet struggling to play nice with each other. "The trouble is we've got lots of devices that are all trying to be the entire hub, whereas really we want lots of distributed devices all talking to each other where we can pick and choose the ones we want," he explains. "At the moment those things don't integrate very well."
Boltt is one startup that's looking to overcome this problem by creating an AI-powered fitness platform that can not only pull data from its own fitness trackers, but also sync information from other wearables like the Apple Watch and Garmin Forerunner.
The mixed reality angle
AI has a lot of potential to make accessing information and features in smartwatches and fitness wearables easier. But Hughes reckons it's with mixed reality, the blending of virtual and augmented reality, that AI in wearables will really make a difference.
The current crop of mixed reality headsets mostly rely on external sensors, a smartphone, or a tether to a powerful PC to deliver even semi-convincing virtual experiences. But headsets like Microsoft's HoloLens have all the kit needed to power them onboard, essentially presenting a head-mounted computer that has to balance creating a convincing mixed reality experience with not putting the processors under so much strain they end up baking the wearer's skull.
"That device has to interact with the person and understand what they are doing and what they want to do, and understand the world around it, which is really complex stuff anyway, and understand any of the extra information it's supposed to be displaying for you," highlights Hughes.
But he also points out that AI can make things a lot easier by smartly adjusting a headset's performance to suit each wearer and how they are using mixed reality, all on-the-fly. Hughes predicts AI will make all the difference here, as it can intelligently tailor a mixed reality experience to cope with factors such as different room layouts as well as different users' gestures and movements, all without hitting the latency encountered when sending data from a wearable to an external AI system.
"It may be that it's learning what you're doing or the type of person you are so it knows how much back-end communication to do versus how much processing to do [on the device]," adds Hughes.
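The trade-off Hughes describes — how much to compute on the device versus how much to hand off to a back end — can be framed as a simple routing decision per workload. The sketch below is a hypothetical heuristic; the latency budgets and cost figures are invented for illustration.

```python
# Hypothetical sketch of the on-device vs back-end split: offload a
# workload only when the network round trip still fits its latency
# budget and beats local compute. All figures are invented.

def choose_processor(latency_budget_ms, network_rtt_ms,
                     on_device_cost_ms, cloud_cost_ms):
    """Return 'device' or 'cloud' for a single workload."""
    cloud_total = network_rtt_ms + cloud_cost_ms
    if cloud_total <= latency_budget_ms and cloud_total < on_device_cost_ms:
        return "cloud"
    return "device"

# Head tracking must stay under ~20 ms, so it runs on the headset...
tracking = choose_processor(20, network_rtt_ms=40,
                            on_device_cost_ms=8, cloud_cost_ms=5)
# ...while a bulky object-recognition pass can tolerate the round trip
recognition = choose_processor(500, network_rtt_ms=40,
                               on_device_cost_ms=300, cloud_cost_ms=60)
```

An adaptive system along these lines would tune the budgets and cost estimates per user and per task, which is exactly the kind of learning Hughes alludes to.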
Microsoft is looking to lead the way here as its HoloLens 2 will have a dedicated AI coprocessor chip to better analyse objects around it, helping it improve how it tracks individual wearers' movements and generally perform faster. But that's just the beginning. The smarter the device, the more it could potentially do to make using mixed reality a more expansive experience, such as adapting to the type of content a person has shown interest in and serving up more of it.
On the horizon
And it's experience that's really the crucial element when it comes to AI in wearables. Hughes points out that machine learning and AI will need to focus on making access to information on wearables easier and bypass the need for people to learn new apps or physical interactions. After all, it's no good if smart wearables alert you to something that you can only view or access on your smartphone.
"Blending [information] and changing it and being able to see it in situ is going to make the difference to people," predicts Hughes. "If it's completely and utterly hands-free and you just see it; you don't have to learn how to navigate it."
It may still be a while before AI and wearables are smart enough to present information in Minority Report-style augmented reality, but Bragi's Dragicevic is confident there's more to come.
"We have only started to touch the surface of AI's potential with wearable technology," he says. "You've got decades of computing that suggest chips, antennas and devices will get smaller, faster and more effective. It's going to happen, but just not tomorrow."