Some of them came to show off groundbreaking stuff, while others showed how they're elbowing their way into segments that are quickly becoming saturated (hello, AR for various construction projects). Other companies have nothing but a booth and a dream, hoping to draw enough interest to keep things going.
A lot of these things were cool, and some of them were just plain weird. Follow us, won't you, as we take you on a tour of some of the highlights.
Nod's universal touch controller
If you've invested in a VR system like a Vive or a Rift, you know that each platform requires its own unique controllers. You can't use Vive controllers with a Rift, for instance. Nod is looking to create a common controller platform across all of VR.
Basically, you'd get two six-degrees-of-freedom controllers and a tracking camera that you stick on whatever headset you're using, whether it be a Vive, Daydream or Rift. Once you do that, you can use the same pair of Nod controllers with the platform of your choice. In our demo with the prototype, we were able to try out the same controllers with experiences on both Daydream and Vive. While they mostly worked well and were pretty responsive, they did suffer from some lag once in a while, though Nod staff members said it was mainly due to extreme interference from the many Lighthouse sensors in the area.
UC Davis' AR sandbox
When you think of AR, you probably think of a pair of glasses you wear on your head that changes the world in front of you. UC Davis, on the other hand, is looking at a different way to play with AR.
It's created a literal sandbox that uses a Kinect and a projector to display an augmented topographical map on top of actual sand. It's not just there for show either; it's interactive. If you hold your hand out over the sandbox, you create rain and can see how it trickles down your sandy mountains. If you take a little shovel and demolish mountains and re-route valleys, your virtual water will flow in different directions.
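The way the virtual water follows the reshaped sand can be illustrated with a toy steepest-descent routing over a heightmap. This is purely a sketch of the idea; the sandbox's actual simulation (and its function names and data layout) is not described in detail here, so everything below is an illustrative assumption.

```python
# Illustrative sketch only: route a drop of "rain" downhill on a sand
# heightmap by always stepping to the lowest neighboring cell. Not the
# sandbox's real simulation, just the general idea of terrain-driven flow.

def flow_path(heights, start):
    """Follow the steepest downhill neighbor until reaching a local minimum."""
    rows, cols = len(heights), len(heights[0])
    path = [start]
    r, c = start
    while True:
        best = (r, c)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    if heights[nr][nc] < heights[best[0]][best[1]]:
                        best = (nr, nc)
        if best == (r, c):  # local minimum: the water pools here
            return path
        r, c = best
        path.append((r, c))

# A tiny "mountain": rain dropped on the peak flows to the lowest cell.
terrain = [
    [3, 3, 3],
    [3, 9, 1],
    [3, 2, 0],
]
print(flow_path(terrain, (1, 1)))  # -> [(1, 1), (2, 2)]
```

Dig a trench in the sand (lower some cells in `terrain`) and the returned path changes, which is exactly the interaction the sandbox makes tangible.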
Dr. Oliver Kreylos, a researcher at UC Davis, tells Wareable that the system could be used in classrooms and museums to teach people, especially children, how geography works. It can also be adapted for other uses, like firefighters looking at how forest fires could affect national parks or Dungeons & Dragons games.
InteraXon's Muse-powered VR
InteraXon put together a custom HTC Vive headset with sensors that use its Muse brain-reading technologies to sense your mood. It then teamed up with Microdose VR to create a special VR experience that adjusts to how you're feeling.
The sensors, which sit where your brow and cheeks meet the headset, as well as on your earlobes and chest, relay your readings to the software, which analyzes how you're feeling and changes the experience accordingly. For instance, it can sense if your jaw is clenched or if you're excited and give you a faster-paced experience. If you're calm, you'll get something a little slower.
The actual experience is like falling into an empty kaleidoscope. You're given two controllers to fill the space with psychedelic imagery. While our demo didn't have the brain-sensing powers enabled, we did get a preview of how fast and slow the experience could go, and it turned out to be an entrancing, pulse-pounding time.
Graeme Moffat, scientific and regulatory director for InteraXon, tells us that, while other companies are looking to use brain-reading tech to do things like typing or interfacing with computers, what will happen much sooner are systems that can adapt to how you're feeling. "You can measure brain responses in such a way that you can tell whether someone is habituating to stimulus presentation, whether they're getting distracted, whether they're getting tired and you can adapt mixed reality, augmented reality or virtual reality in that way." You would even be able to check whether someone is getting nauseous and then use machine learning to adapt the environment to make it better.
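At its simplest, this kind of adaptation is a mapping from a physiological signal to a pacing parameter. Here is a minimal sketch of that idea; the scoring, thresholds and function names are illustrative assumptions, not InteraXon's actual algorithm.

```python
# Illustrative sketch: map a normalized arousal score (0 = calm,
# 1 = excited) to an experience-speed multiplier. The linear blend and
# the 0.5x-2.0x range are assumptions for demonstration only.

def pace_multiplier(arousal, slow=0.5, fast=2.0):
    """Linearly blend between a slow and a fast pace based on arousal."""
    arousal = max(0.0, min(1.0, arousal))  # clamp to [0, 1]
    return slow + (fast - slow) * arousal

print(pace_multiplier(0.0))  # calm viewer: slower experience -> 0.5
print(pace_multiplier(1.0))  # excited viewer (e.g. clenched jaw) -> 2.0
```

A real system would smooth the signal over time and, as Moffat suggests, could fold in signals like habituation or nausea, but the core loop is the same: sense, score, adjust.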
Zenka's AR and VR art
While VR and AR can be used for a whole bunch of industrial and consumer reasons, artist Zenka wants to make sure you realize they can also bring a different dimension to art. One of her installations featured a series of paintings that come alive when you point the companion AR app at them.
While that was all well and good, they were placed above a series of sculptures of odd faces wearing what appear to be concept designs for different types of VR headsets. They're weird and fascinating in a way that draws you over to them, which was borne out by how many people couldn't resist stopping at her booth to gawk at the faces gawking at VR.
The Tap keyboard
Tap is a wearable keyboard for AR, VR and smartwatches. It's like a flexible set of brass knuckles that you slot onto your fingers, with an accelerometer in every finger slot to track when you tap. Once you learn the alphabet, you can tap on any surface - including your head or leg - to type.
How it works is this: vowels are a single tap with one of your five fingers. Other letters require more advanced taps, like tapping your thumb and pinky finger at the same time to type Y. The company says it takes about an hour to fully learn how to type with its TapGenius game, which aims to give you the muscle memory to tap easily. The Tap strap can also be used for other things, like games or as a mouse pointer of sorts.
Tap's Guy Bendov says the company has seen great interest from blind people who want a simpler way to type without relying on the potential misses of voice commands.
Rokoko Smartsuit Pro
Full-body motion capture is nothing new, but it usually requires a room set up with cameras to track the body's every movement. Rokoko has instead made a suit that does the whole job itself, giving the user full freedom to move around as they please and still be captured. More interestingly, it could work as a full-body VR controller for the ultimate level of immersion, as Rokoko CEO Jacob Balslev explained when he came to give us a demo. It's $2,500 a suit, but it's already generated a lot of interest, and Balslev says Sony has even ordered one, although the reason why remains a mystery.
Vuforia collaborative AR
One of the more interesting demos at AWE was Vuforia's Project Chalk, an AR collaboration system. You play mission control, and you've got an astronaut on Mars who has to fix their ship before they're lost to the void of space forever. As mission control, you can see what your astronaut is seeing and annotate what they need to do in real-time.
In our demo, we had to rely on an iPad with a camera and real-time augmented annotations, but it's not hard to imagine this tech jumping to a nifty pair of smart glasses. And yes, our astronaut was saved.
Epson's drone AR goggles
Epson revealed two new pairs of smartglasses at AWE, both of which are targeted at the workplace/enterprise market. However, the company also gave us a look at a pair of its glasses being used to control a drone, the result of a partnership with drone maker DJI. Putting them on, we were suddenly able to see what the drone could see, and... it's weird. But also incredibly neat. The idea is that the glasses still allow a full line of sight, including letting the user easily look up at the drone itself. Although it took us a while to adjust to having essentially two feeds of the world beamed into our brain at once, the picture was surprisingly sharp.
Puttview's AR golf instruction
One of the more fun parts of golfing video games is being able to turn on magical assists that can tell you where to hit a ball and, when you do hit that ball, what direction it'll travel in. That's essentially what Puttview's technology is aiming to do.
Managing director Lukas Posniak tells Wareable the system scans the green, determines the best ball path for your potential hole-in-one, and then displays an overlay that lets you put things in motion. There are also two versions: an indoor one that relies on a mounted camera and projector to instantly overlay the information on a green, allowing multiple people to see it at once, and an outdoor one that relies on an AR headset.
Massless VR pen
Massless had one of the more fully realized demos on the floor of AWE. It's a highly precise system that lets you use a pen to draw or write in virtual reality. You use Massless' pen alongside a camera tracker and your regular Rift controller. Together, you can quickly and simply draw and write in 3D space as easily as you can draw or write on a piece of paper.
It's fast, accurate and, somehow, didn't make our signature look hideous. It's also a far better feeling than using a regular VR controller to do something in, say, TiltBrush. Being able to use a pencil-like object to draw and write in VR just feels far more natural than anything else.
Reflekt's AR platform
Reflekt is a company that builds AR platforms for enterprises to create the apps they need, but one of its demos was so easy and simple to use that it feels like it could easily benefit all of us. Using a Microsoft HoloLens, you look at a mechanical object and can instantly see an X-ray view of it.
From there, you can highlight different parts and see all the information you need to know. If you need to make repairs, it'll walk you through the process with an animated, detailed guide right on top of the object. If you're unsure what any particular part is or why it's there, you can just look at it and it'll tell you. It all works really well, and was so impressive that even Microsoft was saying Reflekt had a better demo on the AWE floor than its own HoloLens demos.
Global Imagination's Magic Planets
The tough part of VR is that it's one of the few things you don't get until you actually experience it. It's just too difficult to put into words what you're seeing until you put on that headset. Global Imagination is hoping to change that with Magic Planets, video globes that can essentially display VR content. In this way, CEO Mike Foody tells Wareable, you can give people a hint of what VR is like before they actually try it on. Except, you know, in reverse, since the content is experienced on the "outside" rather than the "inside."