That's mixed reality and the advancement of Oculus Insight, Oculus' new inside-out tracking system. When Quest launches, Insight will be limited to room-scale tracking, but that isn't stopping Oculus from pushing what's possible.
At Connect, Oculus set up a custom demo of Dead and Buried using Quest headsets and three new technologies it's working on: mixed reality, arena-scale inside-out tracking via Insight and co-location multiplayer. And we were able to try it all out.
The Quest's mixed reality uses the four wide-angle camera sensors on the front of the headset to scan the room and the objects in it. In our demo, this meant seeing a version of the world that looked like a giant sketch of the room. It's fairly primitive, but the headset is essentially reading what it can from its surroundings and recreating it.
The result literally looks like a moving sketch, and it's surprisingly detailed for prototype technology. I was able to see sketches of the people around me, the arena I was standing in, the lights on the ceiling, the people in the line for The Void across the convention space and even the Modular watch face on my Apple Watch.
It was enough visual information to let me navigate the environment, which was dotted with boxes, without feeling like I was going to trip. The game uses this same information to render everything. So when I looked down at my character, the game was drawing on the headset's tracking data to render it in real time.
It all works really, really well. There were moments when I lost tracking on my Touch controllers, like kneeling behind a box and trying to peek around a corner, but for the most part I was able to rack up a pretty decent score for my team, so it was good enough.
What was even more impressive was the arena-scale tracking. I was able to walk around and duck and hide behind boxes with no problem at all. I never felt restricted and the quality never felt like it eroded. More importantly, it was a blast. I was so into shooting the other team that I sometimes forgot I was testing out prototype technology.
My only question about arena-scale tracking is its real-world use cases. It could help enable a new category of low-cost location-based VR studios. It could also let groups of people go out and have fun playing VR together. Or, you know, serve people who have large living rooms and want to take advantage of them. The possibilities are endless, and that's what's so exciting.
Oculus didn't actually let us roam the whole 4,000-square-foot room. We only got access to half of it and weren't allowed into the opposing team's base. That could be a safety precaution, but before the match we were free to wander the space, so it doesn't feel like a technical limitation Oculus is hiding.
Finally, there's co-location multiplayer. Oculus made a master spatial map of the play area. A local Wi-Fi network syncs the headsets to that map, but the headsets are doing the work, each understanding where it is in the map and where everyone else is, too. Similarly, Oculus rigged up an iPad with Insight technology in an app that let it act as a window into the game world the rest of us were playing in. So not only can you and your friends use Quests to play in a shared play space, friends with other devices can get in on that same space, too.
Individually, the three technologies that Oculus is working on are well-executed and necessary. Some other companies, like Microsoft with its Windows Mixed Reality, have similar things either out or in the pipeline. What makes Oculus' version impressive, though, is that they all work together, and do so in a way that doesn't feel like a prototype, on a standalone headset that's just around the corner. There's no telling when this technology will be ready and become a final product, but it's all still incredibly exciting.