Google is continuing to explore next-gen augmented reality glasses by developing a system to detect connected technology around the home and control it in the virtual space.
In the new patent filing spotted by Wareable, Google explains how an image sensor within a pair of smartglasses could map and identify smart home and entertainment devices within the home.
The patent describes how, upon recognition of a device – such as a Nest speaker, a Nest Learning Thermostat, an Android TV, or a smart bulb – a pop-up user interface would appear above it. Options presented in the user interface could be selected via pointing gestures, as shown in the patent diagrams below.
So, when the user looks at their Nest speaker, for example, options to play and search for music would appear in the eyeline. If tunes are already playing, the pop-up will display a playback interface featuring album art and controls.
In another example depicted within the patent, wearers could turn their gaze to a Nest thermostat and see their average monthly energy use.
The technology could also be used to turn smart bulbs on/off (as long as they’re not too close to each other!), or open apps and select content on a smart television, Google says.
As is often the case with these filings, the language focuses less on the exciting functionality that could unlock greater freedom in the smart home than on the intricacies of how the headset discovers the physical position of a connected device on a 3D map (via six-degrees-of-freedom information) and then identifies it using an object recognition model.
Google writes: “The method includes obtaining a position of the first controllable device in a physical space based on visual positioning data of the first 3D map and rendering a user interface (UI) object on a display in a position that is within a threshold distance of the position of the first controllable device”.
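Taken at face value, that passage describes a simple proximity check: look up the device's position from the 3D map's visual positioning data, then only anchor the UI object if it sits within a threshold distance of the device. A rough sketch of that logic, in which all names and the threshold value are illustrative assumptions rather than details from the patent, might look like this:

```python
import math

# Illustrative threshold; the patent does not specify a value.
THRESHOLD_METRES = 0.3

def distance(a, b):
    """Euclidean distance between two 3D points (x, y, z)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def place_ui_object(device_position, candidate_anchor):
    """Return the anchor point for the pop-up UI object, or None if it
    would fall outside the threshold distance from the device."""
    if distance(device_position, candidate_anchor) <= THRESHOLD_METRES:
        return candidate_anchor
    return None

# e.g. a speaker at (1.0, 0.5, 2.0) with a pop-up anchored 10cm above it
anchor = place_ui_object((1.0, 0.5, 2.0), (1.0, 0.6, 2.0))
```

In practice the hard part is the first half of the claim – resolving where the device actually is from the headset's visual positioning data – which the sketch simply takes as given.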
However, when explaining its diagrams, Google does venture into more detail about the end goal of the technology. Figure 8 for example “illustrate(s) examples of a display depicting a UI object positioned in a location that is proximate to (e.g., close to) a detected controllable device.”
Google goes on: “In some examples, the controllable device includes a smart speaker. However, the controllable device may include any type of controllable device discussed herein. In some examples, the information depicted in the display is the visual information shown through the lens of smartglasses.
“The UI object may be considered a virtual object that is positioned in a physical space as shown through the smartglasses. Referring to FIG. 8, a UI object includes UI controls for controlling the smart speaker and information about what is playing on the smart speaker.”
In Figure 4C and 4D (below) Google also explains “the UI objects may include a visual indicator that indicates an area in which a hand (or finger) of the user is positioned in reference to the other visual information shown in the display and multiple UI controls that permit the user to control the controllable device. In some examples, the UI controls may include actions such as playing music or searching the web.”
Whether this technology ever appears in a consumer product available to the public remains to be seen. Google retired the consumer version of Google Glass many years ago and, this year, discontinued it for the remaining users in the enterprise realm. It has yet to hint publicly at a successor.
However, numerous patent filings show the search and mobile giant retains a keen interest in the sector and is actively developing technology that could be featured as part of a return for a more advanced set of mixed reality glasses.
In the decade since the original Google Glass arrived – dogged by privacy fears related to the onboard camera – attitudes have significantly softened towards such devices, and many observers view smart glasses as an eventual replacement for the smartphone.
It would seem a safe bet that Google – along with Apple, Meta, and the rest – would want in on that.