While Apple isn't too big on virtual reality, it's actually quite interested in augmented reality. In fact, every once in a while CEO Tim Cook just has to mention how bullish Apple is on AR, and how profound the company thinks it could be.
In 2017, Apple stopped talking the talk and started walking the walk. The company announced ARKit at its Worldwide Developer Conference, letting the world know that when iOS 11 drops on 19 September, hundreds of millions of iOS device owners will be able to dive into high-quality mobile AR.
Apple is setting AR up to be a cornerstone of its business in the future, so that begs a couple of important questions: What exactly is ARKit, how does it work and who is working on it? Let's take a look.
How it works
ARKit is a new software framework that allows developers to create augmented reality experiences for iOS devices. The idea here is to stick to a common Apple talking point: Apple can uniquely blend software and hardware to create a more complete, high quality experience than competitors.
The software works in conjunction with iPhone and iPad hardware to get a better sense of the world around you. The heart of this is something Apple calls Visual Inertial Odometry (VIO), which is a fancy and complicated way of saying that it combines camera-sensing data with data from sensors like gyroscopes and accelerometers. Together, these allow apps tapping into ARKit to better sense their relation to the environment they're in.
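Getting an app into that VIO-powered tracking loop takes only a few lines of Swift. A minimal sketch, assuming a view controller with an ARSCNView wired up in a storyboard (the class and outlet names here are illustrative):

```swift
import ARKit

// Minimal sketch of starting an ARKit session. ARViewController and
// sceneView are assumed/illustrative names, not from the article.
class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // ARWorldTrackingConfiguration is the mode that uses Visual
        // Inertial Odometry: camera frames fused with gyroscope and
        // accelerometer data to track the device's position in space.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }
}
```

That's essentially all an app needs before it can start pinning virtual content to real-world coordinates.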
Meanwhile, the camera is constantly sensing the light in the room and looking for flat surfaces. It can locate and track horizontal surfaces that virtual objects can be placed on. It also estimates the ambient light in the environment, so virtual objects can be lit to match their surroundings and blend into the scene.
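In ARKit's API, both behaviours are switches on the session configuration. A sketch of how an app might enable plane detection and read the per-frame light estimate (SessionCoordinator is an illustrative name, not from the article):

```swift
import ARKit

// Sketch: horizontal plane detection plus ARKit's per-frame light estimate.
class SessionCoordinator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal    // track flat surfaces
        configuration.isLightEstimationEnabled = true // on by default in iOS 11
        session.delegate = self
        session.run(configuration)
    }

    // Called every frame; the estimate says how bright the scene is
    // (roughly 1000 lumens is a neutral, well-lit scene), so virtual
    // objects can be lit to match the room.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let estimate = frame.lightEstimate {
            print("Ambient intensity: \(estimate.ambientIntensity)")
        }
    }
}
```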
Apple makes some of the more powerful and energy efficient mobile processors in the game, and ARKit gets to fully access them as needed. Specifically, ARKit will run on the A9, A10 and new A11 processors. All of those special software frameworks that Apple built for graphics, like Metal and SceneKit, can also be taken advantage of – on top of third-party options like Unity.
On the newly announced iPhone X, ARKit gets a bit of a boost from the X's TrueDepth front camera, which is able to take an intricate 3D scan of your face – intricate enough that Apple says even masks from Hollywood special effects artists and photographs can't spoof it.
What this means for AR is that it can enable highly advanced facial mapping. It's kind of like a Snapchat filter – but on steroids. In fact, Apple used the TrueDepth camera to show off some new, advanced filters that were mapped much better than current Snapchat face filters.
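For developers, the TrueDepth camera shows up in ARKit as a dedicated face-tracking mode. A hedged sketch of the shape of that API (class and outlet names are illustrative):

```swift
import ARKit

// Sketch: face tracking via the TrueDepth camera. Only supported on
// hardware with the front depth camera, hence the runtime check.
class FaceFilterViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit delivers an ARFaceAnchor per tracked face: its geometry is the
    // live 3D mesh a filter would be textured onto, and its blend shapes
    // report expressions (smiles, brow raises) that can drive a puppet.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        _ = faceAnchor.blendShapes[.mouthSmileLeft]
    }
}
```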
What you can do
On the iPhone X, you can use the new TrueDepth camera for advanced Snapchat filters, but you can also use it for something Apple has dubbed Animoji. Basically, these are the emoji you know and love with a twist. You can use that camera to control how they move, and you can basically do a motion capture session like you're in a James Cameron movie – but for iMessage.
If you don't have – or don't want – an iPhone X, there are going to be plenty of other augmented reality adventures for you to take with iOS 11. The biggest of them is IKEA Place, an app that'll let you place true-to-scale IKEA products in your home with augmented reality via ARKit.
While IKEA is currently the biggest name releasing an ARKit app, it's not the only one. There have been plenty of developers this summer experimenting with ways to utilise augmented reality in Apple's ecosystem. There's stuff in the fashion world, like ModiFace, which lets you see how your hair would look in different dye colours.
Of course, AR is perfectly suited for navigation. Or, say, how about finding your friends at a crowded public event? That's what the Neon app is aiming to use ARKit for. You just pop open the app, look around and a glaring neon marker will point out where your buddy is. It's incredibly simple, and incredibly useful.
There are also mapping apps, so instead of looking down at a map and trying to situate yourself, you just point your camera somewhere and let the overlaid navigation graphics show you the way. Of course, there are also measuring apps, which feel like they could spark the first big AR consumer app boom.
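The trick behind those measuring apps is straightforward: hit-test two screen taps against the geometry ARKit has detected, then take the straight-line distance between the resulting 3D points. A sketch under that assumption (helper names are illustrative):

```swift
import ARKit
import simd

// Sketch of a measuring app's core: convert a screen tap into a 3D
// world position by hit-testing against ARKit's detected feature points.
func worldPosition(of point: CGPoint, in sceneView: ARSCNView) -> simd_float3? {
    guard let result = sceneView.hitTest(point, types: .featurePoint).first else {
        return nil
    }
    let t = result.worldTransform.columns.3 // translation column of the 4x4 transform
    return simd_float3(t.x, t.y, t.z)
}

// ARKit works in metres, so the distance between two tapped points
// is a real-world measurement.
func distanceInMetres(from a: simd_float3, to b: simd_float3) -> Float {
    return simd_distance(a, b)
}
```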
Developers keep finding incredible ways to push ARKit, and sometimes it's difficult to tell what's real and what's augmented. Take Kabaq, for instance – an AR app that's supposed to help you choose what to eat at restaurants. Instead of relying on words alone to convey menu items, it visualises them for you right there on your table. As you can see from the video demo, it's hard to tell whether that's a real burger or a virtual burger. It's virtual, but this is the power of ARKit.
And yes, there are games. At its iPhone X event, Apple showed off a new augmented reality multiplayer game called The Machines, which seems heavily inspired by MOBA games like League of Legends. Basically, teams fight to destroy each other's bases, and you can physically move around the battlefield to find vantage points that give you an advantage. Even better, moving around changes your relation to the game. For instance, the closer you are to the action, the louder the volume is.
How it feels
Back at WWDC, we were able to check out some ARKit demos and were impressed by what we saw. There's a remarkable fidelity to objects in ARKit. For instance, in our demos of Google's AR, and even in the rudimentary AR in Snapchat, there's a way to break the immersion. At some point, the virtual object gets misplaced or acts weird and starts looping around, the AR software confused about what to do. However, as hard as we tried to break ARKit during our hands-on session, we couldn't do it. The objects stayed in place as if they were really there.
For some new technologies, you have to really feel the difference to know there's a difference. That's not quite the case with ARKit: as you can see from the numerous demos, this is a high-quality AR product on the way. The only question is where it's going.
Apple isn't the only one playing in augmented reality. This is Apple, of course; it's rarely first. Companies like Microsoft, Google, Snap, Facebook and Magic Leap have been sowing the AR future for quite a while now.
Google is probably Apple's most direct competition. The search giant has embarked on AR glasses before with Google Glass, which, um, failed, but more recently it's announced ARCore, which is basically the Android version of ARKit.
ARCore is based on Project Tango technology, but how it works in practice is going to be crucial. Google is aiming for 100 million test devices, which will include its own Pixel phones and the Samsung Galaxy S8, but the company also has to work with third parties to make sure ARCore is fully supported during the test phase and beyond. Google may also have a more difficult time ensuring comparative experiences across the wide array of Android devices, which sometimes don't receive the newest operating system upgrade for quite a while.
Microsoft has also been in the AR world for a while with HoloLens, but that's currently mostly aimed toward enterprise use cases. For consumers, Microsoft is much more focused on the potential of mixed reality, and that currently doesn't overlap too much with Apple's more pure AR.
Snap is definitely looking at AR glasses, and has thus far been the biggest mainstream force in AR that's not Pokémon Go thanks to Snapchat filters. Snap has the talent to put together a compelling AR team, and it has the fashion sense that understands how to make something cool and hip and wearable, but right now it's still just a camera company and budding social network. It would be difficult for Snap to ramp up AR into a full-fledged way of living. Plus, Apple and Snap are friendly at the moment, with Apple announcing that it partnered with Snap on high-end new filters for iPhone X.
Mark Zuckerberg's Facebook has made no secret of the fact it's working toward AR smartglasses. Facebook has the social networks and messaging platforms to make an augmented reality platform appealing, but its forays into hardware haven't been so good – outside of Oculus, which it bought.
Finally, there's Magic Leap. The mysterious Florida-based startup is developing what seems to be a complete AR solution, with CEO Rony Abovitz calling it an "operating system for reality." Beyonce wasn't impressed, but plenty of other people have been, and on pure technology and potential alone it's hard to count out Magic Leap in the AR brouhaha. But, well, Magic Leap is also just a well-funded startup that hasn't actually shipped anything yet.
Apple is maybe the best positioned to push AR forward. It has an ecosystem, it has hardware experience, it has a focus on consumers and it knows how to make exciting hardware products that people want to wear (hello, AirPods and Apple Watch). But the problem with Apple is that it also moves slowly.
The future of ARKit
During its iPhone X event, Apple continually referenced how the cameras in both the iPhone 8 and iPhone X were calibrated for AR. Augmented reality was on Apple's mind when it was making these phones, which is obvious when you realise Apple is working on AR glasses.
The play here is simple. Apple is using its mammoth iPhone audience to build the biggest AR ecosystem in the world, so that when everything is ready – when there are games and productivity apps and a whole bunch of great AR experiences – its AR hardware can launch with its usefulness already proven.
The future of AR is in front of our faces, not in metal slabs in our hands, and ARKit's future is powering that experience. Just when ARKit will power this experience is a different story, though, as Apple is currently having trouble finding a compelling application of AR for smartglasses.
When and how
ARKit makes its big debut when iOS 11 drops on 19 September. The requirement to run ARKit on iOS 11 is simple: You need an iOS device with at least an A9 chip running at its heart.
So if you've got an iPhone 6S or later, an iPhone SE, an iPad Pro (any size) or a 2017 iPad, you're good to go. You'll get a front-row seat to the building of Apple's future smartglasses ecosystem.
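Developers don't need to hard-code that device list, incidentally: ARKit exposes a runtime check that comes back false on anything older than an A9 chip. A small sketch:

```swift
import ARKit

// Sketch: gate AR features at runtime instead of maintaining a device list.
// isSupported is false on devices without at least an A9 chip.
func presentExperience() {
    if ARWorldTrackingConfiguration.isSupported {
        // Show the full AR experience.
    } else {
        // Fall back to a flat, non-AR version of the feature.
    }
}
```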