Identify landmarks and get AI answers
The Ray-Ban Meta smart glasses are gaining AI powers via their built-in cameras, with new features released into private beta.
The update is part of the multimodal AI features Meta touted in demos last year.
Multimodal AI lets users ask about what they're seeing and get spoken answers. Of course, the current Ray-Ban Meta glasses have no augmented reality (AR) display, so all interactions with the AI happen through the Meta voice assistant.
The smart glasses feature forward-facing cameras, and these can now be used to interact with your surroundings: asking for information about a landmark, for example, or translating text on a menu while traveling.
Meta CTO Andrew Bosworth showed off some of the features in a post on Threads in which he gets information on landmarks in San Francisco, and Mark Zuckerberg has been demonstrating the capabilities on Instagram as well.

There are plenty of potential applications, and this kind of feature is a big part of the roadmap to genuinely useful consumer smart glasses.
We talked about that roadmap on our podcast with the founder and CEO of Brilliant Labs, which is about to launch the Frame smart glasses, combining AI and AR. It's worth checking out if you're interested in how wearables are leveraging AI.
We’re big fans of the Ray-Ban Meta glasses, which surprised us during our review period by being genuinely useful, discreet, and wearable.
Access to AI insights takes these excellent smart glasses a step further. We’ll be trying the feature out in detail soon.
Via: Engadget