As part of Wareable's AR Week, we're republishing some older features around augmented reality. This is one of them.
Ten years ago, Dr. Allen Yang hadn't read a single academic paper about augmented or virtual reality. He couldn't find any. His only exposure to AR and VR came from Hollywood: Star Trek's Holodeck, Iron Man's Jarvis system and Princess Leia's message in Star Wars. Then, in 2012, a switch flipped: Google Glass was announced and the Oculus Rift launched on Kickstarter.
Now at Berkeley, Yang helps run two programs for AR and VR, aimed at getting undergraduate students invested in the technology. The VR lab he and his colleagues set up had five members when it opened. Two years later, it has more than 200.
The influx is simply down to letting the tech speak for itself, and Yang tells us that the VR lab's booth during Cal Day, Berkeley's annual open campus event, was regularly the most visited.
About half a year ago, Immerex, a VR company that wants to bring virtual reality to the movie business, approached Berkeley about a partnership and eventually donated money to help fund Berkeley's VR and AR program.
The program is four-pronged: using VR across Berkeley's diverse campus, applying it in the medical field, building an open-source AR project, and developing a new way to control and communicate with robots. The team plans to use the funding to create facilities that can support all four.
The 21st century computer lab
Some Berkeley disciplines have already started using AR and VR to teach classes. There's an Art History class that teaches the art of VR, and one of the architecture professors uses VR to teach students how to design buildings. The problem, unfortunately, is that hauling all that equipment to different classes is a pain.
So some of that sweet Immerex funding is going toward building a VR classroom that teachers can rent out. Dotted around the main VR room, which can hold 20 students, will be separate break-off rooms that can be used by students to develop their own VR and AR projects. The goal is to create a 21st century computer lab, one that makes it as easy as possible for students to get access to VR and AR.
Yang and his students are looking at ways to expand VR and AR into other disciplines too. They want to help medical students study medicine in 3D, offering a real sense of presence rather than relying on 2D materials like textbooks. They also want to replace expensive, single-use medical dummies with AR and VR alternatives, which can be programmed to be adaptive and reusable, and can simulate surgical complications more realistically.
They're even looking to figure out a new cinematic language, since filmmakers have told them introducing VR to filmmaking is difficult. Movies are linear, with a single perspective, the filmmaker's, while VR allows the user to look around whenever they want. The filmmaker loses control.
By now, you've probably seen one of the hundreds of unsettling Boston Dynamics videos. The company's robots can be kicked off balance and will readjust and steady themselves. It's all pretty terrifying, but part of that terror, according to Yang, stems from a lack of real communication between humans and robots.
"If there's a robot over there - scary or not - you look at this robot and have no idea what this robot is trying to do. You have no idea what this robot can do. You don't know its task list. You don't even know its battery level."
By contrast, consider how humans communicate. When you meet a person for the first time, you observe their facial expressions to judge their intentions. You can quickly figure out whether they're tired or sad or angry. We can't do that with robots. Enter ISAACS, which stands for Immersive Semi-Autonomous Area Command System, a way for humans to understand the intentions of robots.
The system would use AR with headsets like Microsoft HoloLens to tell you what a robot's task is and what it plans on doing next; you would be able to read a robot simply by looking at it. The second part of ISAACS is control. Yang thinks that controlling drones and robots is in the same place computers were in the 60s: you need an advanced understanding of what to do and how. Just as graphical user interfaces and touch-based interfaces transformed computing, something better needs to come along for robotics.
"There's no graphical intuitive interface to control robots," Yang says. There's no easy way to teach parents, grandparents and regular people how to control them in simple ways. "Otherwise, fundamentally, there's going to be a barrier for robots to coexist with people in the same space."
ISAACS has a long way to go, largely because of safety. It's one thing to get humans to understand robots, it's another to get robots to understand humans, so that human error doesn't result in humans getting injured by a rogue drone's blades. Open Ark, the lab's open-source AR platform, however, doesn't have a long way to go.
Open Ark grows out of Berkeley's tradition of embracing open source projects, from Unix in the 1970s all the way to more recent deep learning frameworks.
"What we want is a fully open research program," Yang says. "We give you the source code, we give the source code to other universities and we give the source code to companies." Since corporations are some of the driving entities behind AR and VR at the moment, Yang believes open source at the academic level is important to push the technology forward, and growing understanding.
Currently, Open Ark technology allows Berkeley to use a single device for AR without trackers or beacons, though that AR is limited to a seated, tabletop experience with a small augmented area. Later this year, the students hope to finalize a version that allows for more mobility.
One of the problems with healthcare, according to Yang, is that the best doctors and hospitals tend to be concentrated in metropolitan areas. "This imbalance has been true in the US, and also true in developing countries. So one way to address that, and also lower the cost of healthcare and provide better healthcare, is telemedicine."
Augmented and virtual reality are two pieces of tech that can help make telemedicine easier for people. A doctor in San Francisco could help a patient in Des Moines, Iowa, for instance. And further, if medical resources and patient information are in two different locations, they could be brought together easily for better analysis.
Physical healthcare isn't the whole story, though, and Yang says there's potential for mental rehabilitation using the technology. He points out that many dentists already play music in the background to attempt to distract you from procedures, and others use headsets that play movies for the same reason. But what if a dentist could use VR to fully immerse someone in another world, better distracting them from the pain of a root canal?
Berkeley is launching new VR classes this fall, so the new VR classroom and an upgraded VR lab will need to be ready this summer. There will also be a second, new VR lab, named after Immerex. It'll be fully transparent, with glass walls that allow students walking by to peek in at what's being worked on.
"We also want to inspire the future," Yang remarks.