When I tried Tobii's virtual reality eye-tracking tech at GDC 2017 last March, all I wanted to do was catch it out. For the most part it accurately followed my stare, but there were a few moments where I tripped it up. But this time, ten months on, it was ready for me. Eye tracking is going to be the next big thing for virtual reality. Why? Because it just doesn't make sense to go forward without it.
Higher resolutions and better fields of view are always going to be welcome, and necessary, advances for VR, but they don't solve some of the fundamentals of human interaction. When you think about hanging out with friends in VR in the future, do you all have glazed, lifeless stares, or do your eyes follow each other around the room, like they do in real life? Exactly.
Not only that, eye tracking could lower the hardware requirements of VR, leading to more affordable machines, thanks to a technique called foveated rendering.
Swedish company Tobii has been building eye-tracking tech for many years, but only recently has it turned its attention to VR. It's not the only one in the game: HTC announced it's bringing an eye-tracking accessory to the Vive from one of its accelerator program startups; and Fove is shipping developer kits for its own eye-tracking system.
A natural paradigm
But nothing has impressed me quite like Tobii's latest demo. Once again it had integrated the tech into an HTC Vive, the only giveaway being a ring of sensors around each lens. After a brief calibration, which required me to follow a dot with my eyes, I was thrown into a series of demos.
For the first I was warped into a home cinema, where I could use my eyes to scan across a library of videos and land on the one I wanted. The idea was to demonstrate how current VR technology will have you point your face at something to select it, but eye tracking meant I could simply look at a movie to pick it out instead.
Similarly, in a demo that reminded me a little of Star Trek: Bridge Crew, I could use my eyes to carefully select different buttons on a dashboard in front of me. It was here I tried to catch it out again, but it worked with fantastic precision. Then a third demo put me in a stranger's living room and had me put on a (virtual) augmented reality headset (yeah, it was getting pretty meta at this point), which projected planets and spaceships around the room. I could control the movement of these either by pointing my face at them or by using my eyes, the purpose being to show how much easier the latter option was.
"When you think about it, humans never point their heads when they want to interact with something," said Tobii president Oscar Werner. "But that's actually what many headsets do today."
He makes a good point. We've almost become accustomed to this new interface in VR, but it's not natural. "This pointing interaction is an artificial construct we created, just like a mouse pointer," said Werner. "But that's very artificial." With eye tracking, a three-step process becomes a two-step one: move head, point, click becomes look, click. It's the same reason the smartphone touchscreen paradigm works so well.
"Where you look is a very good approximation of what you think," says Werner. "This device is no longer blind; it understands things about you, your intent. And developers can use that in many ways."
Lowering the bar
There's definitely a lot more potential beyond what I was shown here. For example, interactions with other characters in games, whether controlled by real people or AI, suddenly become more interesting. An NPC might react simply because you look them in the eye, or in a shared social environment a glance could let someone know you're speaking to them and not the other 20 people in the room. "I think it's impossible to do good social interaction without eye tracking in VR," said Werner. "Otherwise every single multiplayer character will be staring the thousand mile stare."
What I hadn't initially noticed during the space dashboard demo was that it was using foveated rendering. This is where the computer reduces the image quality in your peripheral vision, so only the spot you're looking at is fully rendered, cutting the overall workload. As soon as Werner pointed out that foveated rendering was on, I became aware of some warping at the edges of my vision, but what struck me most was that I hadn't noticed it until then.
This could play out in a few different ways. You could get a higher-resolution experience, or one with a higher frame rate, but it could also mean running demanding VR on cheaper machines, as some of the heavy lifting is removed. Werner says Tobii has seen between a 30% and 50% reduction in GPU demand, though it's hard to say how big the impact will be once this goes to market.
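To make the idea concrete, here's a minimal sketch of the logic at the heart of foveated rendering: choosing a render quality for each screen tile based on how far it sits from the user's gaze point. The function name, thresholds, and quality tiers are all illustrative assumptions for this article, not Tobii's actual pipeline or API.

```python
import math

def tile_quality(tile_center, gaze, fovea_radius=0.1, mid_radius=0.3):
    """Return a relative render resolution scale (1.0 = full quality).

    Coordinates are normalized screen positions; radii are hypothetical
    cut-offs for the foveal and mid-peripheral regions.
    """
    dist = math.dist(tile_center, gaze)
    if dist <= fovea_radius:
        return 1.0   # foveal region: render at full resolution
    elif dist <= mid_radius:
        return 0.5   # mid-periphery: half resolution
    return 0.25      # far periphery: quarter resolution

# The GPU saving comes from shading far fewer pixels: a tile rendered at
# 0.25 scale needs only 1/16th of the fragment work of a full-quality tile.
gaze = (0.5, 0.5)  # user looking at the centre of the screen
print(tile_quality((0.5, 0.52), gaze))  # in the fovea: 1.0
print(tile_quality((0.9, 0.9), gaze))   # far periphery: 0.25
```

Real implementations work in angular distance across the headset's field of view and blend between quality zones to hide the seams, but the principle, spending GPU time where the eye actually is, is the same.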
Then there's something I hadn't thought about: security. Werner says this tech will recognise individual users by their retinas, so you could put on the headset and it would load up your individual profile or avatar.
"Why is Apple putting a user facing sensor on the iPhone X? Why is Microsoft putting Windows Hello sensors in all its computers? Why is Huawei announcing a Face ID? Why are Google and Facebook buying eye-tracking companies? Because devices should not be blind to the users."
So when will we see it? Tobii says it's in conversations with around 10 headset manufacturers right now, and Werner says we can expect to see the technology in systems towards the end of 2018 and early 2019.
I certainly hope so; eye tracking doesn't feel like an optional upgrade, but rather a necessity for VR to evolve. So much so that when I went from Tobii's demo to trying the new HTC Vive Pro, I couldn't help but feel, despite the crisper resolution, that I was taking a step back. Eye tracking won't just be a huge deal for users, but for developers who want to make VR feel more natural and lower the barrier to entry. For all of those reasons, it can't come soon enough.