Millimetre-accurate micro-motion gesture tracking could ease touchscreen blues
Google is working on a new gesture recognition system for future wearables – such as the Pixel Watch 3 – that could enable millimetre-accurate motion tracking.
The recent patent filing explains how a radio frequency-based system could be used to track hand and finger movements on a micro scale, with the end goal being less pawing at a tiny touchscreen.
We’ve already reported on a previous patent from Google that uses pressure sensors around a bezel to lessen the need for interaction with touchscreens, but this latest filing could eliminate the need for touch in some instances.
While radar-based gesture tracking tech does exist, Google says conventional methods are cumbersome, expensive and inaccurate, tracking hand movements only to the centimetre. As a result, they haven’t solved the usability issues of tiny touchscreens, which is why we don’t see them deployed in many wearables.
However, the tech described in the patent can overcome the hardware limitations of conventional radar systems thanks to a simpler single radar-emitting element and antenna. Such an approach would enable tracking of much smaller motions, while being less expensive, smaller and less complex than conventional setups built around multiple radar-emitting elements.
Google writes: “This radar system can be simpler, less costly, or less complex than conventional radar systems that still cannot, with conventional techniques, determine micro motions in the millimetre. The micro-motion tracking module is configured to extract relative dynamics from a radar signal representing a superposition of reflections of two or more points of a hand within a radar field.”
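The filing doesn’t go beyond that description, but as a rough sketch of the general idea, the toy Python example below simulates a radar return as a superposition of reflections from three hand points moving at different radial velocities, then recovers those velocities from the Doppler spectrum. The carrier frequency, sample rate and peak-picking approach are all assumptions made for illustration, not details taken from the patent.

```python
# Toy sketch only: not Google's implementation, just an illustration of
# recovering per-point velocities from a superposition of reflections.
import numpy as np
from scipy.signal import find_peaks

C = 3e8            # speed of light, m/s
F_CARRIER = 60e9   # assumed 60 GHz carrier; the patent doesn't specify a frequency
PRF = 4000.0       # assumed slow-time sample rate, Hz
N = 2048           # number of slow-time samples (~0.5 s of observation)

# Hypothetical radial velocities matching the patent's worked example:
# index finger toward the antenna, thumb away, knuckle immobile.
velocities_m_s = [2.1, -1.7, 0.0]

t = np.arange(N) / PRF
signal = np.zeros(N, dtype=complex)
for v in velocities_m_s:
    f_doppler = 2 * v * F_CARRIER / C              # Doppler shift of this point
    signal += np.exp(2j * np.pi * f_doppler * t)   # superposed reflection

# Doppler spectrum across slow time (windowed to reduce leakage)
spectrum = np.abs(np.fft.fftshift(np.fft.fft(signal * np.hanning(N))))
freqs = np.fft.fftshift(np.fft.fftfreq(N, d=1 / PRF))

# Find the three strongest, well-separated peaks and map them back to velocity
peaks, _ = find_peaks(spectrum, distance=10)
strongest = peaks[np.argsort(spectrum[peaks])[-3:]]
recovered = sorted(freqs[strongest] * C / (2 * F_CARRIER))
print("recovered radial velocities (m/s):", [round(v, 2) for v in recovered])
# -> approximately [-1.7, 0.0, 2.1]
```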
Diagrams accompanying the patent show various swipes, pulls and pinches away from a touchscreen with multiple elements of the hand moving in different directions.
Google explains that the micro-motion tracking module would work by tracking the direction and speed of motion and displacement between, say, the index finger and the knuckle point.
“This hand has various points of interest, some that move toward the radar antenna element, some that move away, and some that are immobile,” the filing adds, before showing how it works out the sums.
“Assume that for a micro-motion gesture the thumb point is moving away from the antenna element, that the index-finger point is moving toward the antenna element, and that the knuckle point is immobile.
“For each of these points the micro-motion tracking module may determine their relative velocity and energy. Thus, assume that the velocity of the thumb point is 1.7 meters per second away, the index-finger point is 2.1 meters per second toward, and the knuckle point is zero meters per second. The micro-motion tracking module determines a velocity profile for these points of the hand using the radar signal.”
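Purely as illustration, the numbers in Google’s worked example boil down to simple arithmetic: the relative velocity between two points, multiplied by the length of the observation window, gives their relative displacement in millimetres. The window length and sign convention in the sketch below are assumptions, not figures from the filing.

```python
# Toy illustration of the velocity-profile arithmetic in the patent's example.
THUMB_V = -1.7    # m/s, moving away from the antenna (negative by this convention)
INDEX_V = +2.1    # m/s, moving toward the antenna
KNUCKLE_V = 0.0   # m/s, immobile

GESTURE_WINDOW_S = 0.02  # assumed 20 ms observation window

def relative_displacement_mm(v_a: float, v_b: float, dt: float) -> float:
    """Relative motion between two hand points over dt seconds, in millimetres."""
    return (v_a - v_b) * dt * 1000.0

# How far the index finger moved relative to the static knuckle...
print(relative_displacement_mm(INDEX_V, KNUCKLE_V, GESTURE_WINDOW_S), "mm")  # 42.0 mm
# ...and relative to the thumb, which is moving the opposite way.
print(relative_displacement_mm(INDEX_V, THUMB_V, GESTURE_WINDOW_S), "mm")    # 76.0 mm
```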
The filing shows that most major smartwatch manufacturers are looking at gesture control as a major change to the way we use wearable devices.
With Apple bringing the feature to the mainstream, we predicted it wouldn’t be long until similar, or even expanded, versions of gesture control appeared elsewhere. And that vision is one step closer.