Smartwatches feel relatively settled now. Apple, by its own mysterious metric - which could be revenue - says it's the number one watchmaker in the world. Those of us always looking for what's next could, perhaps, turn our heads toward smartglasses.
When you think smartglasses, you mostly think AR. They're a perfect match, like peanut butter and jelly, but there's a problem: AR smartglasses aren't ready yet, and even if they were, they're not something regular people are going to want to put on their faces.
In the meantime, smartglass manufacturers are doing something a little different: They're rethinking what it means to use them.
...but don't you just put them on your face and look through them?
Yes, but many smartglasses - or ideas for smartglasses - must balance technology with looks, and that's just very hard to do right now. There's no denying that fashion has to be the priority, so companies are thinking differently about what it even means to be a pair of smartglasses.
For example, some of the glasses we've already seen use touch panels on the arms that allow you to scroll through menus or select items. It works, but it's also kind of a pain. Every time you want to interact with it, you've got to move your hand up to your temples.
So there's a better way?
Maybe. Computer scientists from KAIST in South Korea, Keio University in Japan, the University of St. Andrews in Scotland and the Georgia Institute of Technology are looking at a way for you to control your smartglasses with your nose.
They've built electrooculography sensors into what looks like a regular pair of glasses, which can tell the difference between flicking, rubbing and pushing your nose. Different actions can be assigned to different gestures - so, for example, you can flick your nose to change tracks.
And I can use these to control augmented reality?
Well, no. For now, these are built for things like your phone or PC. You know, devices that you can actually use on an everyday basis. But it's not hard to imagine how these gesture-based actions could eventually make it into AR smartglasses.
Ah, so it's kind of like a fun side thing.
Right. The accompanying tech to make augmented reality on smartglasses happen just isn't ready, so why not let manufacturers figure out other uses - like controlling your existing gadgets, or even acting as a hearable.
Wait, what? A hearable?
I know, right? Vue, for example, figures that people are more accustomed to wearing things like glasses than wearing small earphones - like AirPods or Here Ones. Instead of having you put something near your eardrum, Vue's glasses rely on bone conduction for audio. You can take phone calls, listen to music, use Siri or Google Assistant and check in on some basic fitness metrics, like steps. There are touch panels on the side, but you can also do a lot with your voice.
It turns out that glasses are pretty fashionable, and people will often wear them for style - not just to see better. Vue CEO Aaron Rowley tells us that the fastest-growing lens type in the US is non-prescription. If you make them look good, people will wear 'em.
All right, that's pretty cool. Why haven't the big players gotten in on this?
Actually, it sounds like Amazon is planning to do something similar. It's working on a pair of glasses with Alexa built in. So not only will you be able to use all of those skills she has built up, but you'll be able to shop on Amazon too. Just imagine if Amazon ever puts a camera on that thing. "Alexa, buy that milk." Ugh. But also wow.
Maybe they should team up with Snap.
Snap Spectacles are another good example of a company trying to do something different with smartglasses. Turns out they're pretty good at being the ultimate vacation camera.
And that's exactly the thing. As we're dreaming of what feels like the inevitable AR smartglass future, companies are rethinking what's possible with the technology that's feasible now. In the short term, that gives us more specialized wearables that are potentially really good at certain things. In the long term, it makes it easier for companies to eventually pool all these features together and build toward that AR future.