While the wearables industry grapples with next-generation glucose, blood pressure, and ketone sensors, AI may enable us to get medical-grade insights without the need to wear anything.
That’s the promise of January AI, which uses natural language processing and generative AI to learn how what you eat affects your body, predict glucose levels, and train us to eat smarter.
It points to a future where AI can make wearable data more accessible, smarter, and more effective in creating better health outcomes:
“Wearable data is continuous and conducive to machine learning. Using AI to learn people’s baseline, recognize patterns and partially hidden variables, and alert for irregularities will change how we monitor health,” Noosheen Hashemi, founder and CEO of January AI, told Wareable in an interview.
Glucose tracking is coming. In the next 5 years, wearables will offer everyone insights into how food affects our well-being.
Apple is rumored to be working on a non-invasive blood glucose sensor, and we’ve heard from the likes of Afon, Fitbit, and Movano – all of which are working on bringing glucose insights to both type 2 diabetics and healthy people looking to improve their health.
Add to that Abbott’s Lingo, a forthcoming consumer-oriented CGM that could also track ketones, lactate, and other biometrics, and Supersapiens, which uses an Abbott CGM to help athletes plan their readiness for endurance sessions.
But January is already looking to do all this using AI.
“At January AI, we have invested five years into generative AI for human biology and, today, we predict glucose. Tomorrow we can predict any biomarker like lactate, ketone, cortisol, and so on,” Hashemi told us in an interview.
How it works
To train January, users still need to wear a CGM for between 4 days and 2 weeks (the longer the better), and log food intake using January AI’s 32 million-strong directory of meals, so that it can learn how your body responds to food.
After the training period, January says its AI can estimate the effect of any food on an individual’s glucose and produce the same data as a CGM.
And using heart rate data from smartwatches makes these estimates more accurate still.
January will inform users when their blood sugar is too high or too low, learn which foods trigger the biggest responses, and show how exercise affects glucose curves.
And unlike a normal CGM, it can predict how food will affect your body, not simply report what's already happened:
“Think of it as like the weather report. Imagine if you were given a report of what had already happened. That wouldn't be that useful to you. Think of this as your glucose forecast to let you plan,” Hashemi said.
However, users will still need to wear a CGM periodically to validate the AI. For diabetics, this could be every month; for those using the data for general wellness, it could be once a year.
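January AI’s actual model is proprietary generative AI, but the core idea described above – learn an individual’s food response during a short CGM wear, then forecast glucose for new meals without the sensor – can be illustrated with a deliberately simplified sketch. The carb-sensitivity model, the function names, and the synthetic data here are all invented for illustration:

```python
# Illustrative sketch only; this is NOT January AI's model.
# Toy version of the idea: use a short CGM training window to learn a
# personal "carb sensitivity", then forecast the response to a new meal.

def fit_carb_sensitivity(meals):
    """meals: (carbs_g, observed_glucose_rise_mgdl) pairs from the CGM period.
    Returns the least-squares slope through the origin:
    mg/dL of glucose rise per gram of carbohydrate."""
    return sum(c * r for c, r in meals) / sum(c * c for c, _ in meals)

def forecast_rise(sensitivity, carbs_g):
    """Forecast the post-meal glucose rise for a newly logged meal."""
    return sensitivity * carbs_g

# Meals logged during a two-week CGM wear, with observed rises (synthetic data)
training = [(60, 48), (30, 26), (80, 62), (45, 35)]
k = fit_carb_sensitivity(training)
print(round(forecast_rise(k, 70), 1))  # forecast for a 70 g-carb meal
```

A real system would of course model far more than carbs – meal composition, timing, heart rate, sleep, and individual history – but the shape of the workflow is the same: fit on a short sensor-wear window, then predict without the sensor.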
AI glucose tracking and ‘food as medicine’
Wearable glucose monitors – and January AI – aim to help the hundreds of millions of type 2 and pre-diabetics, who can benefit from learning how glucose affects their body.
“In the US there are 133 million people that have diabetes and prediabetes. Of those, only four and a half million are intensive insulin users. One in three people in the US has pre-diabetes; the numbers are pretty staggering. So we are here for that group,” Hashemi said.
But it’s not just about those with diabetes. It’s also about ‘healthy people,’ who can feel better, live healthier, and perform at their best when they’re in control of their glucose.
“Food as medicine is picking up steam as a movement. That’s recognizing the role of food in your health for a higher quality of life, mental acuity, and better microbiome,” she continued.
“We want people to practice mindful eating and think before they eat. And our prediction model enables you to see what you should eat and what it’s going to do.”
AI and food logging
The idea of logging food brings back some unpleasant memories of serious admin in MyFitnessPal.
But January’s food tracking also uses AI to make things simpler.
January says it has integrated every food database available, including the menus of big chain restaurants and individual items, and meals can be logged using simple voice input.
“We have tried to make things really simple. So we have NLP (natural language processing) logging, so you could say, ‘I had chicken tikka masala with coke and vanilla ice cream,’” explained Hashemi.
Counterfactuals and training behavior
Once food has been logged, the AI kicks into gear. Unlike offerings that simply report glucose levels, January’s AI can offer insights into what could have happened if you’d chosen alternative foods or added in exercise.
“We can give you food counterfactuals. If you’d moved after 10 minutes this would be your curve. If you had walked for 25 minutes after eating, this would have been your curve,” Hashemi said.
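To make the counterfactual idea concrete, here is a minimal sketch of what “if you had walked for 25 minutes, this would have been your curve” could look like in code. The curve values and the 1%-per-minute-walked damping factor are invented for illustration and bear no relation to January AI’s actual model:

```python
# Hypothetical sketch of a behavioral counterfactual; not January AI's model.
# Given a predicted post-meal glucose curve, re-simulate it under an
# alternative behavior (a post-meal walk).

def counterfactual_curve(curve, walk_start_min, walk_minutes, step=5):
    """curve: glucose rise above baseline (mg/dL), sampled every `step` minutes.
    Damp each sample by an assumed 1% per minute walked so far."""
    out = []
    for i, rise in enumerate(curve):
        t = i * step
        walked = min(max(0, t - walk_start_min), walk_minutes)
        out.append(round(rise * (1 - 0.01 * walked), 1))
    return out

predicted = [0, 15, 35, 50, 45, 30, 15, 5]    # 5-minute samples (synthetic)
print(counterfactual_curve(predicted, 10, 25))  # "if you had walked 25 minutes"
```

The point of the sketch is the comparison: the user sees the predicted curve alongside the flattened counterfactual, which is what turns a raw glucose reading into an actionable suggestion.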
Just like our conversation with Kristen Holmes about the role that AI can play in coaching, January claims it can train users to have a healthier relationship with food.
And Hashemi also claims this is where other implementations of consumer CGMs, such as Supersapiens, have fallen short.
“I looked at Supersapiens, it basically says ‘your blood sugar is going down, eat something.’ It doesn't tell you what to eat. It doesn't tell you how much to eat. There is no AI. There's no prediction whatsoever,” she said.
But it shouldn’t be forgotten that while consumer glucose monitoring could help all of us reimagine our relationship with food and health, for millions of people it could be life-changing.
And AI and wearables together have the power to deliver the kind of insights we’ve been promised for a decade.