The atmosphere of last week's NYU Jubilee, a showcase of virtual reality projects employing everything from simple mobile VR to cordless Vive rigs to Oculus Touch controllers, could only be described as, well, jubilant.
One student developer proudly confessed to spending 90 hours over two weeks to finish his build in time for the event. Telling the creators you enjoyed their experiences usually evoked beaming smiles and sighs of relief.
But despite the hobbyist atmosphere - developers kept sneaking away from their stations to try out their peers' work - the projects were anything but amateur. Many of these people, a mixture of NYU students, graduates and professors, are actively seeking funding and exposure to take their work outside of academia.
The event was sponsored by Oculus and Time Inc, with reps from VR game and film studios in attendance looking for the next big idea in VR - among them Schell Games and Lord of the Rings' VFX head Ollie Rankin.
These innovative uses of virtual and mixed reality could shake up the VR market in the near future.
To Be With Hamlet
Robert Corey (left) towers over Zachary Koval (right) as Hamlet's Father.
Shakespeare claimed that "All the world's a stage"; NYU's Virtual Reality Lab is using VR to turn the stage of Hamlet into an immersive, interactive world for theater-goers to enjoy from home. At the Jubilee, I met Javier Molina, NYU Tandon adjunct professor and director of NYU's MAGNET VR Lab, who co-heads the To Be With Hamlet project with David Gochfeld.
I donned a Vive headset and watched the motion-captured performance of the two actors above, portraying Act I Scene V of Hamlet: the encounter between Hamlet and his father's ghost. Using Vive controllers, I could select a portion of the stage and teleport to it; as the Ghost described his own murder, I jumped in between the two actors to get a close-up of his monologue, then switched vantage points as he paced across the stage.
The pre-recorded performance used OptiTrack motion capture technology and a Sense 3D scanner for the actors' movements, as well as spatialized audio. Eventually, Molina said, they intend to host free, live motion-capture performances where Vive-wearing audience members across the globe can observe Shakespearean scenes and interact with their fellow theater-goers. To achieve this, they use the M3diate platform, another NYU alumni creation that puts up to 15 simultaneous Vive users into one environment.
Of course, some technical challenges must still be solved. The team has yet to implement expensive facial tracking to match the Unreal models to the actors' expressions in real time - a gap that's especially noticeable when you teleport up close and listen to the actors' passionate performances projecting from unmoving lips.
The actors themselves don't wear VR headsets during their performances in a black box studio, so they can't connect with the audience as they might on stage. And, as Molina noted, they don't expect viewers to sit in VR for a four-hour play like Hamlet, and are toying with performance lengths.
SMAKK Down
Talal Choudhury's SMAKK Down takes a simple concept - VR boxing, using Oculus Touch sensors coded to register punch direction and velocity - and adds some fun flourishes. Using Occipital's 3D structure sensor, Choudhury scanned random people, including himself, into the game as enemies for the user to battle in the ring. Up to two players can fight cooperatively at a time, while other users can spectate and add hazards or effects to prank players.
My brief experience beating up the smiling scanned avatars was hilariously fun in a low-strategy Wii Sports boxing kind of way (just swing your arms really fast). Their appearances reminded me of HD versions of GoldenEye's low-res actor scans: just realistic enough to immerse you without being graphically intensive.
A fan of retro side-scrolling co-op fighting games like Final Fight, Choudhury wants to take his proof of concept and develop it into an online co-op experience. Players could wander the 3D streets of New York or Central Park, battling a series of "real" people along the way instead of the generic, generated enemies you typically see in video games.
His next planned step is to use a VR camera to register the user's entire body movement and reward good boxing form in-game.
The Myth Machine
Virtual reality lends itself to the "walking simulator" style of game that lets you slowly explore a world. But Myth Machine, a "tech-gothic immersive mystery set in a near-future world," eschewed spoon-feeding me a narrative in favor of slow, haunting world-building during my thirty-minute demo.
"[We] have to be careful with psychological effects in VR"
You feel a lot like Portal's Chell, executing seemingly innocuous tasks that slowly reveal something more sinister, and learning about the world through a tech source that describes your eerie surroundings with a blasé, deceptive cheeriness.
Without spoiling the secrets the defunct startup hides, the narrative leaves room for psychological effects à la Amnesia or Eternal Darkness to play out in VR, though I didn't run into any. The non-VR demo has some elements that haven't been ported to VR yet, and the team programmer I spoke to said that they "have to be careful with psychological effects in VR." Personally, I'm hoping the final version lets loose and scares my pants off.
Making shapes in 4D
Myth Machine's team wants funding to develop and sell their game, but Wenbo Lan and NYU professor Ken Perlin want to use VR to make educational concepts more accessible.
Lan, an NYU grad student in computer science, helped me strap on a wireless Vive backpack rig and inserted me into a virtual space where I could manipulate objects in four dimensions using the Vive controller. Specifically, I could change the size and dimensions of "hypercubes" and "aerochorons", geometric shapes I had trouble comprehending until I could observe them in a "real" space.
The team's next project moves beyond manipulation into actually building structures - a little like Minecraft. VR's unlimited dimensional space lets them create experiences that other mediums cannot simulate.
Perlin and Lan also work on Holojam, a mixed-reality program that uses attachable sensors to turn any mobile device into an advanced VR tracking device. In essence, their team seeks to democratize VR: bringing motion tracking to smartphone VR instead of restricting it to owners of $500+ Oculus and Vive rigs, and making augmented reality feasible on low-end devices.
Holojam's optical markers attach to your mobile headset, allowing tracking cameras to accurately register your position and body movement, then render an avatar that fellow mobile users in the room can see. Holojam has used this tech to host massive social experiences: Flock, a sandbox dancing party depicted above that can host up to thirty people at once, and Dia de los Holos, an interactive parade of ghosts and skeletons.
Unfortunately, I didn't get a chance to try Holojam's demos out myself, but they may have the most commercial potential of any of the projects at the Jubilee. They are one of five finalists for the AR and VR technology award at the South by Southwest Accelerator Pitch event this March, where companies often receive millions in funding and buyout offers. If they impress investors, Holojam's team could receive the funding they need to distribute their mobile tech on a large scale.