What AR needs to do to succeed, according to one of its biggest investors

The AR future is still far away, but how do we get there?

Back in 2015, Amitt Mahajan, Paul Bragiel and Phil Chen, the founder of HTC Vive, looked at the tech landscape and saw VR and AR emerging. Having spent a lot of time watching how platforms grow and develop, they decided to start a $10 million fund called Presence Capital, which would invest only in early-stage VR and AR companies – getting in on the next wave of computing early.

Now, Presence Capital counts companies like Upload, Meta, Nomadic, The Rogue Initiative and Harmonix among its portfolio. As you might notice, these companies are mostly focused on content and experiences, and that's no coincidence: Presence Capital is more interested in software- and content-based VR and AR companies than in hardware makers, because hardware makers operate in a more fragmented market, compete against much bigger companies and will eventually come to look more like peripheral manufacturers.

The VR world is maturing now, and Mahajan says Facebook is in the best position to take an Apple-style leadership role in the industry because of Oculus. Not only does he think Oculus is probably the best VR headset from a user experience perspective, he thinks the standalone Oculus Go headset is going to make VR much more appealing for mass audiences.

But what about the world of AR? How do we get to the dream of AR smartglasses and contact lenses? And more importantly, who takes us there?

How about these apples?


Apple, for its part, looks to be sitting out the VR race. As CEO Tim Cook has repeatedly said, the company thinks VR is too isolationist; he believes AR is much more versatile, both inside and outside the home.

But how do we actually get to that AR future? If you look at what Apple and Google are doing with ARKit and ARCore, you'd think the solution would be to build an ecosystem that developers can play around in. But that might not be enough on its own, according to Mahajan. Apple needs to be a better leader.

I call it the Shazam for everything… you point your camera at anything in the world and get information on it

"They haven't really demonstrated any first party applications that are really great utilities, the things that you want to use every single day," Mahajan says. "The animated emojis and stuff are cool, but they're demo pieces for the technology. Apple's in a position where if they launched a better version of Photos – or a better version of one of their core applications as a part of ARKit – they can demonstrate to everyone the potential of the technology."

Mahajan has a thesis – one he admits he hopes is wrong – that most ARKit apps at the moment are novelties that only blow people away because they're fascinated by the technology. There are not yet any great utilities built on ARKit, he argues, with the possible exception of AR rulers, which can be genuinely useful. Most of the apps are the kind of thing you open up, play with for a few minutes and then close.

"It's not the sort of thing that a week later or a month later I'm gonna open up the app again and be like 'lemme see the dragon again' or 'I really need to use this app every day,'" he says. "If you look at your home screen these are the apps you use all the time."

So, until we get those AR smartglasses, the app ecosystem needs to be working on AR apps that have repeatability. That doesn't necessarily mean launching another Ikea app, either. As Mahajan says, most AR apps that try to be utilities are actually features rather than real apps. He points to Amazon recently putting AR into its mainline Amazon app, letting people place objects in their home, as a good example.

"If you have an existing app with distribution, why would you launch a new app you have to send people to? You don't have a different app for search, right? It's not like Amazon search is a separate app, it's just a little search button in the app."

You can see why Mahajan thinks Apple could do a better job of asserting its style of leadership in AR. If Apple were to create its own AR utility app that was genuinely useful, it could help steer other companies in a similar direction.

Additionally, he argues that Apple is in the best position to lead augmented reality because it's fully in control of the user experience in a way that other companies aren't. That goes from the hardware down to the software and services. That's perfect for one of the most important windows into our augmented reality future – the camera.

A Shazam for everything


While creating a vibrant, mature ecosystem of augmented reality apps is important for when we have AR smartglasses on us at all times, the other important – and more often forgotten – part of the equation is the camera. Mahajan looks at this as an input-output situation.

The fancy graphics we picture when we think of augmented reality are the output: your computer placing things all over your environment, giving you layers of information over what you're looking at. The less sexy, but just as important, part is the input: your camera scanning your environment, working out where you are and how you relate to the world.

Until we get to smartglasses, the camera is our point of entry into the augmented world – more specifically, our camera apps. Mahajan points out that when you aim your camera at a QR code, a little bubble pops up telling you where you can get more information. The next step, he says, is an app store built into camera apps, a bit like the iMessage app store, that lets you easily bring up AR apps within your camera.

So you could open up your camera, tap Yelp and point at a building to get its Yelp reviews. Or, while in a Best Buy, you could bring up the Best Buy app, point at a barcode or product and get the final price – after taxes.

"I call it the Shazam for everything," Mahajan says. "The idea that you can point your camera at anything in the world and get information on it is a huge part of what augmented reality's potential is and it starts with the camera."

While he thinks Apple is best set up to lead us into this future, he also says Facebook is a quiet competitor because of how much image data it's collecting from Facebook photos and Instagram. These allow it to feed its machine learning algorithms with a whole bunch of photos of, well, everything, which means Facebook could learn to build out one of the two halves of AR better than anyone else: the camera that understands the world.

On your head, in your eyes


The intelligence of the camera and the app ecosystem are parallel developments; both need to mature and grow at the same time for us to get to our augmented reality future. Still, they're not what's holding AR headsets back.

"Both of these technologies are necessary for us to get the headsets," Mahajan says. "But the problem with the headsets is less of a computing software problem – I mean there is some of that too – but today it's more of a miniaturization and physics problem"

There are three major factors keeping AR headsets from arriving right now. The first is the display. Some companies are working on devices that essentially bounce the image off glass and into your vision – the classic Pepper's Ghost effect – while others, like Magic Leap and Avegant, are working on light field displays that beam high-quality 3D imagery straight into your eyes.

Once you figure that out, you've got to work on positional tracking, which Mahajan says Microsoft is probably the best at right now. How does your headset know where you are in relation to the world? How does it know where you're standing in relation to the augmented reality objects floating around in your vision?

And finally, on top of all that, you have to find a way to miniaturize the computer itself and whatever powers it. While Mahajan thinks a mass-market AR headset is still five years away, he points out that the VR equivalent, Oculus Go, is only coming out next year. If VR is only getting to that point now, imagine how far away the AR equivalent is.
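Of those three problems, positional tracking is the one phone-based AR already solves in miniature: ARKit's world tracking attaches a six-degree-of-freedom camera pose to every frame, which is the same question a headset has to answer continuously. A small, illustrative sketch – the delegate class below is hypothetical – looks like this:

import ARKit

// Log the device's position in the mapped world as ARKit tracks it frame by frame.
final class PoseLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // A 4x4 transform: the camera's rotation and translation in world space.
        let position = frame.camera.transform.columns.3
        print("Device at", position.x, position.y, position.z)   // metres
    }
}

// Usage: attach the logger to a running world-tracking session.
// let session = ARSession()
// let logger = PoseLogger()
// session.delegate = logger
// session.run(ARWorldTrackingConfiguration())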

Here's one you might not have thought of


Mahajan tells me that one of the most important long-term developments for augmented reality is something called the AR Cloud. While we worry about creating good AR apps and building comfortable headsets we can wear every day, the infrastructure of that future also needs to be figured out.

"If you use an ARKit app and you have some virtual objects on a table in front of you – if you close that app and come back tomorrow those virtual objects are gone. There's no persistence. And if you and I want to share an experience there's no way to do that."

You may already be familiar with the cloud: it's small things like taking a bunch of pictures on your phone, uploading them to Google Photos or iCloud, then opening your laptop or tablet and finding them there automatically. AR needs that for persistent experiences, but it also needs something even bigger: something more like the internet.
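For a sense of what even single-device persistence involves, here's a minimal sketch using ARKit's ARWorldMap: save the mapped space and its anchors to disk, then relocalise against it on the next launch. The function names and file handling below are illustrative, and the shared, cross-user AR Cloud Mahajan describes would have to do this across devices at vastly larger scale:

import Foundation
import ARKit

// Persist the current session's world map (the mapped space plus its anchors).
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)   // anchors survive until the next launch
    }
}

// Reload a saved world map so virtual objects reappear where they were left.
func restoreWorldMap(into session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data)
    else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map   // relocalise against the previously mapped space
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}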

If you log onto Facebook and post an update and then log off, your friend can log on later and see it. That's the kind of thing that needs to happen for a shared AR world. Mahajan uses the sci-fi novel Rainbows End, in which everyone has AR glasses, to paint a picture of a future made of layers of reality that you can turn on or off like layers in Photoshop.

There's no persistence. And if you and I want to share an experience there's no way to do that

For instance, he explains, say you go to a restaurant. The restaurant owns that space, so it owns the AR experience, and when you enter, your AR glasses or contacts automatically adjust to show you what the restaurant wants you to see. That could be virtual menus on the wall, or dietary information popping out of the food. Whatever it is, it could be the default view in that restaurant for most people. If a firefighter walked in, however, they'd get the emergency-personnel view, which would point out fire exits, extinguishers and smoke alarms.

The AR Cloud is the only way to get to that point, and Mahajan says we're still "really, really early in that process." He does emphasise that if we're going to get to that AR future, the one we dream of, the AR Cloud has to be fully mature and ready to go, which will obviously raise a whole host of complications and societal questions that we can't even begin to ponder. Either way, it's up to the companies in the AR race to take us there.

"All of these companies – Apple, Google, Facebook, all of them – need to build something or have something in this space or they're not going to be able to build this AR future they're hinting at."




By Husain Sumra

Husain joined Wareable in 2017 as a member of our San Francisco-based team. He is a movies expert who runs his own blog and contributes to MacRumors.

He has spent hours in the world of virtual reality, getting eyes on Oculus Rift, HTC Vive and Samsung Gear VR. 

At Wareable, Husain's role is to investigate, report and write features and news about the wearable industry – from smartwatches and fitness trackers to health devices, virtual reality, augmented reality and more.

He writes buyers guides, how-to content, hardware reviews and more.

