Apple's reality: All the latest details on Apple's AR exploits

Tim Cook ain't playing around

Apple's AR exploits jumped into high gear back when it announced ARKit in June 2017. That was only the beginning, as the company is also working on a pair of AR smartglasses, with reports saying it'll unveil them in late 2019 and release them in early 2020.

But how will it all work? Who's behind it, and what kind of hardware can we actually expect? Will there be an app store at first? Will Apple be too late to take on rivals like Magic Leap? Well, let's start off with taking a look at hardware.

Apple's AR hardware


Apple appears to be leveraging established partnerships to build the hardware for its glasses. For instance, Nikkei reports that Catcher Technology, which makes metal framing and cases for the iPhone, is going to start making lightweight framing for augmented reality devices. Catcher isn't saying it's doing this for Apple, but the fact that it's a key Apple partner makes it an easy leap.

By the way, "lightweight" framing indicates a form factor more like actual glasses, rather than a headset or visor. This would fit in with Apple's obsessiveness with building the most svelte gadgets possible. It would also match with Apple's desire to make wearables that are also fashionable, like the Apple Watch.

The display, according to Digitimes, will use MicroLED, a technology that draws half the power of a regular AMOLED display while delivering higher brightness and contrast. It's also thinner, which makes it easier to fit into lighter frames. However, this is far from a sure thing: Apple wanted to debut MicroLED on the Apple Watch but wasn't able to. Still, MicroLED appears to be the go-to tech for AR smartglasses, and it's likely to make its way into Apple's at some point.

Back in early 2017, Robert Scoble said an anonymous Carl Zeiss employee told him the company was working with Apple on a light pair of augmented reality smartglasses. Carl Zeiss experiments with lenses for augmented reality, so if this is true, that's likely where Apple's lenses would come from.

As for the glass, that will likely come from Corning, a big Apple supplier. Corning has patented something called a Wide Field Display for AR, which enables a field of view between 40 and 70 degrees. A good number of AR glasses sport a field of view of around 45 degrees, so that's not bad, though there's room for improvement.

Read next: Best ARKit apps and games

Apple is looking to power this new device with a custom system-on-a-chip, much like it does with the Apple Watch, according to Bloomberg. These chips, according to CNet, would be able to power an 8K display for each eye. The glasses would also be wirelessly tethered to a box that houses these chips, kind of like the Magic Leap One.

There is a bit of a wrinkle. Reliable Apple analyst Ming-Chi Kuo says Apple is readying an Apple Watch-like pair of smartglasses for that early 2020 release. These glasses would be an accessory that leans heavily on your iPhone for power.

It's possible that Apple looked at the success of the Apple Watch and decided the best way to launch AR smartglasses was to make them an iPhone accessory first. Then, as it gets easier to miniaturize parts, it could build an all-in-one standalone device when the time is right. And if there's one thing Apple is good at, it's timing.

And then there are the control methods. According to Bloomberg, as of late 2017 Apple was reportedly unsure how users would actually control the device. It looked at Siri, touch panels and head gestures as ways to navigate the interface, but at the time it appeared Apple hadn't settled on any of them.

That Bloomberg report partially corroborated a 2017 leak from alleged Foxconn insiders on Reddit. The leak claimed the device would have a microphone, accelerometer and magnetometer, use bone conduction for audio, and sport a 428 x 240 resolution in champagne and black designs for both men and women. The interaction methods matched the Bloomberg report: a capacitive touch strip on the arm for volume and call functions, head gestures to control apps, and Siri.

The Siri claim seems natural given AirPods, and it's backed up by updated patents that reference the voice assistant as a remote for smartglasses. Also in the world of patents, Apple has described a finger-mounted device that would let you interact with objects, kind of like North's ring for its Focals.

Apple's AR software


And then there's the software. The device is reportedly going to run a new fork of iOS called rOS, which stands for Reality Operating System, and Apple is currently prototyping AR applications for this new headset, or so goes the Bloomberg report.

These apps are both rebuilt versions of Apple apps, like Maps and Messages, and brand-new apps like virtual meeting rooms and 360-degree video playback. We don't yet know how the interface will look or what kind of things rOS will be capable of, though.

We're told that rOS will have an App Store where you can download applications built with ARKit. And interestingly, Apple is internally using an HTC Vive and developing a Gear VR-like device to test AR applications for the headset.

This matches up with an earlier Bloomberg report, which said Apple had been in talks with potential suppliers for components of a glasses-like device in late 2016, and "has ordered small quantities of near-eye displays from one supplier" for testing purposes. The device would connect to the iPhone and overlay images and other information on the wearer's field of vision, possibly using augmented reality, the anonymous sources said.

Obviously, if Apple is looking to build an AR device, it's going to need test devices to work out interface and app development. Using virtual reality, a more mature medium at this stage, is a good step toward doing that.

Who's behind Apple's AR?


That would be the same team behind ARKit, led by former Dolby hardware head Mike Rockwell, who has assembled a team of veterans from Oculus, Microsoft's HoloLens team, Amazon's Lumberyard VR platform, Google Earth, and, perhaps unexpectedly, Hollywood special effects studio Weta Digital, which was behind movies like The Lord of the Rings trilogy and Avatar.

Apple has also moved over people from its camera team to work on the project, which makes sense, because the camera is vital to AR. As AR-focused VC Amitt Mahajan told us, AR not only needs to be good at displaying virtual objects on top of real ones; it needs to see and understand the world around you to do it properly.

Apple has also been snapping up augmented reality companies to bolster its talent. In November 2017, it bought startup Vrvana, which had developed a mixed reality headset called the Totem.

Must-read: Meet Apple's augmented reality dream team

Though it went unreleased and was mostly used for demos, the Totem used pass-through cameras on the outside to capture the world and display it on OLED screens inside the headset, overlaying virtual objects on top and even switching to VR if needed. It's likely that Vrvana and the Totem are being used to develop Apple's AR glasses rather than becoming a consumer product.

It also purchased augmented reality company Metaio, which was reportedly so sold on Apple's pitch for AR that it threw the bankers out of the room and offered to sell itself to Apple for cheap. More recently, Apple has been picking up patents that Metaio had been working on.

Apple is also beefing up ARKit's abilities in iOS so that developers can do more, like letting apps remember where you placed virtual objects – so that if you close and reopen them, everything is exactly where you left it.
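For developers, that "remembering" is exposed through ARKit's world-map persistence. As a rough sketch (the class name and file path here are our own, not Apple's), saving and restoring placed objects looks something like this:

```swift
import ARKit

// Hypothetical helper showing ARKit 2's world-map persistence, the feature
// that lets an app restore virtual objects where you left them.
class WorldMapStore {
    // Illustrative storage location; a real app would pick its own.
    let fileURL = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("worldMap")

    // Capture the session's current map (including its anchors) and archive it.
    func save(from session: ARSession) {
        session.getCurrentWorldMap { map, _ in
            guard let map = map,
                  let data = try? NSKeyedArchiver.archivedData(
                      withRootObject: map, requiringSecureCoding: true)
            else { return }
            try? data.write(to: self.fileURL)
        }
    }

    // Reload the archived map into a fresh session configuration, so
    // previously placed anchors reappear in the same real-world spots.
    func restore(into session: ARSession) {
        guard let data = try? Data(contentsOf: fileURL),
              let map = try? NSKeyedUnarchiver.unarchivedObject(
                  ofClass: ARWorldMap.self, from: data)
        else { return }
        let config = ARWorldTrackingConfiguration()
        config.initialWorldMap = map
        session.run(config, options: [.resetTracking, .removeExistingAnchors])
    }
}
```

ARKit relocalizes against the saved map's feature points, so restoration only succeeds when the user returns to roughly the same physical space.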

Apple's AR challenges


Augmented reality is not easy, and there are a number of technical challenges that need to be overcome in software, AI, graphics and manufacturing before a true mass-market pair of glasses is available. CEO Tim Cook has been very up front about Apple's challenges in the space. Speaking with The Independent, he explained them.

"There are rumors and stuff about companies working on those – we obviously don't talk about what we're working on," Cook said. "But today I can tell you the technology itself doesn't exist to do that in a quality way. The display technology required, as well as putting enough stuff around your face – there's huge challenges with that. The field of view, the quality of the display itself, it's not there yet."

Cook continued: "We don't give a rat's about being first, we want to be the best, and give people a great experience. But now anything you would see on the market any time soon would not be something any of us would be satisfied with. Nor do I think the vast majority of people would be satisfied." He did conclude, however, that "Most technology challenges can be solved" but said it's "a matter of how long."

For a while, Apple was still trying to figure out compelling use cases for AR, according to the Financial Times and Wired. It also has multiple prototypes floating around, some as simple as Snap's Spectacles, recording video and relying on the iPhone as a display. One thing is for sure: Apple has a real pair of AR glasses, according to AR-focused VC Matthew Miesnieks, who says he's spoken to people who have held them.

Even if Apple has figured out use cases, the Foxconn leak revealed there could be trouble behind the scenes. Chiefly, the employees noted there was a 65% chance that "Project Mirrorshades" would be completely scrapped. The 2017 leak said the smartglasses could be delayed until 2018 or 2019. It's 2019 now, and we haven't seen anything yet.

However, that timeline was even more aggressive than the reported "aggressive" plan of revealing the glasses in 2019 and releasing them in 2020. Even that seems ambitious, as the tech doesn't appear fully ready, as per Cook's comments.

With that said, it's been two years since that report and Apple still seems committed to AR. AppleInsider reported that Apple was among the many industry giants meeting with AR suppliers at CES 2019, which perhaps means that a 2020 release isn't completely out of the realms of possibility.

While it's clear Apple is still working things out, it's worth noting that the Cupertino company typically doesn't "lock" device designs in until a year before they ship. So it's likely those decisions are being made now if Apple really is releasing in early 2020.

Now, it's no secret that Tim Cook is excited about AR, and the launch of ARKit proves that Apple is serious about this space. As does the fact that Apple has purchased a number of companies well-versed in AR software, as well as firms like SensoMotoric Instruments, which specializes in eye tracking. Wouldn't that be helpful for a pair of AR glasses?

Apple has clearly invested a lot into this project, but Apple is also not afraid of dropping a project or scaling back if it's not working out. Take a look at Apple's big self-driving car project, which saw it hire a number of car engineers only to see it sputter out eventually.

Apple does have further plans for AR, though, that hint at a larger AR ecosystem. In November 2017 we reported on word that Apple could launch a mixed reality TV within the next year, using the TrueDepth camera system from the iPhone X to perform Kinect-like sensing in your living room.

If we do see the glasses launched – whatever the case with Foxconn, glasses seem like an eventuality anyway – it would be the third major new hardware category launched under Tim Cook, after the Apple Watch and HomePod. The Apple supremo has also gently nudged away speculation that Apple would be going first to VR, saying that he sees more value in augmented reality. Ol' Timmy Cook: all about the wearables.
