The first day of AWE 2016 in sunny Santa Clara, California has kicked off with a bang. The giant roster of speakers provided plenty of personal observations from industry veterans about the world of augmented and virtual reality.
One thing they were all certain about? 2016 is ripe for VR, and 2017 is the year touchless interfaces through AR will shine.
Here are a couple of highlights from the day's keynotes, including insight from Meta, the team behind the HoloLens contender, the Meta 2, and from ODG, a company that's been in the business for nearly 20 years.
ODG and better mixed reality
We recently chatted with Ralph Osterhout, CEO of ODG, about the state of the company's AR smartglasses, but he had a bit more to add about the R-7s. Specifically, he expects the next six months will bring another iteration of the glasses with a wider field of view and a deeper focus on mixed reality.
The company is no stranger to the idea of mixed reality, as the glasses are able to darken, immersing you in a different experience - then bring you back out when you want to use the clear lenses to see, effectively switching into AR.
A new partnership with cloud graphics company OTOY Inc. has helped create the experience that allows ODG's hardware to switch between VR and AR modes, delivering 4K 3D at up to 120fps.
The platform will also enable full adaptive opacity for occlusion and realistic lighting in mixed reality experiences.
Because OTOY's cloud-based light field content is interactive, live and unlimited in scale, the viewer will be able to enjoy complex games, photorealistic VR/MR media and shared experiences. These experiences can be streamed to the device and re-projected with extremely low latency and minimal battery drain on the glasses.
There's still no word on a mass release but it sounds like the company is nearly ready.
Transcending input devices
Meta has a team of neuroscientists in its offices and CEO Meron Gribetz has made it clear that the Meta 2 has been designed with the brain at the forefront. In his talk simply titled '1965-2016,' he discussed how flat input devices like the mouse, trackpad and touchscreen have been our main form of interaction with digital interfaces.
He promises that in spring 2017, no one in the Meta office will be tied down to a normal computer set-up or tethered to a mouse, saying: "We won't need to use input devices anymore. We're transcending them."
Gribetz went on to show off the Meta 2's progress with hand tracking, developed using the company's neuro-interface guidelines (which Gribetz discussed during a TED Talk in March). The fundamental point of the guidelines is using your hands naturally to pick up objects more seamlessly - something I personally felt the Meta 2 needed to work on. It looks like the team has already been making adjustments, as Gribetz's demo showed an improved process where he used his hands to knock about cubes, carefully pick them up and sculpt.
At this rate, his statement about interfaces may just come true.
VR can still be better
With the Oculus Rift and HTC Vive out, plus PS VR on the way, you'd think that's it for VR - but of course, there's always room for improvement. It's definitely the year for VR, with HMDs flooding the scene, but as Nvidia general manager Zvi Greenstein said in his keynote, "All the pieces are falling into place...there's much more to do." He went on to say that better latency, physics, audio and more are needed for the ultimate VR experiences.
Greenstein then showed off Nvidia's first VR game (first revealed in May), VR Funhouse, to demonstrate how its open source availability will allow developers to learn and make their own experiences even better.