Oura is preparing a new food logging feature that could use AI to analyse how your meals affect your circadian rhythm.
The feature could use AI to identify your food from a simple photo, rather than requiring you to add each meal manually – and then analyse it against your smart ring data.
The information comes via an “APK teardown” of the Oura app by Android Authority, which involves digging into the code of beta software for strings that reveal upcoming features.
The code revealed strings that clearly show a meal-tracking component within the Oura app.
It also shows that an AI component will enable users to photograph their meals, with the app estimating calorie content and reducing manual data entry. Other strings point to natural language input, via either text or voice.
One string of code read: “Logging your meals consistently with an AI-assisted analysis can help you understand the effects of eating habits on your circadian rhythm and internal processes of your body.”
Circadian rhythm is the next big wearables focus
We just published a deep dive into how circadian rhythm tracking is set to be a new frontier in wearables.
Circadian rhythm refers to your body’s natural daily cycle – how aligned you are with your natural sleep and wake times. It’s affected by your exposure to light and what you eat, and a growing body of research points to the importance of staying aligned with your natural rhythm.
Studies have shown that disruptions in circadian rhythms can lead to metabolic syndrome, which includes conditions like obesity, diabetes, and cardiovascular diseases.
So the potential for Oura to advise when to eat, in tune with when you wake and sleep, could be really compelling. It could boost mood and energy levels, as well as improve sleep and longevity.
But logging food manually is hugely frustrating – it takes time, is easy to forget, and isn’t a precise science. We’ve experienced this recently using services such as Abbott Lingo (above) and Nutrisense, which require tagging meals to make sense of CGM data. It’s a total pain, and makes it tough to generate the best data.
However, the act of logging food can be a big driver for awareness and accountability. Simply taking a picture of what you eat would be a huge time saver. So having AI do the work puts Oura in a strong position.
Wareable says
One of the big opportunities of AI is to take away pain points such as food logging, which can unlock the potential of wearables. Smart rings such as Oura offer huge potential for tracking our data longitudinally – amassing data over months and years. When that kind of data is combined with information such as what we eat, wearables can start to produce revolutionary insights.
The problem is that asking users to collect data via food logging or surveys is annoying. If AI can step in and take the friction away, then wearables have a better chance of offering truly actionable insights.
But getting users to photograph every meal will mean giving them a reason to do so. It will be fascinating to see what kind of data Oura can generate.