Pushing objects around with your mind is practically everyone's childhood dream, and during the Tribeca Film Festival mine was fulfilled. All it took was a Kickstarted Emotiv Insight EEG headset, a Raspberry Pi and one Disney Store BB-8 Sphero (no midi-chlorians required).
IBM engineer Joshua Carr used the Watson Internet of Things platform and commercially available hardware to develop technology for controlling Bluetooth-enabled devices via brainwaves. He showed off this tech to the public at the Tribeca Festival Hub where I was able to try out controlling the little Star Wars droid for myself.
Once I put on the headset, Carr had me stand still for 30 seconds as the Emotiv Insight SDK transmitted my brainwaves to the Pi, registering my baseline thought responses to the noisy festival surroundings. Then Carr told me to maintain a specific thought for seven seconds, which the software would register as my mental command to "push" the BB-8 forward.
Doing my best Jedi impression, I pushed my hand forward and thought "PUSH! PUSH! PUSH!" over and over, trying not to feel silly. After a couple of agonizing minutes, the Pi finally registered the thought command. It ran a Python script that seized control of the BB-8 over Bluetooth, ordering it forward into the opposite wall.
The Force was too strong with this one.
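The calibrate-then-trigger flow Carr described (30 seconds of baseline readings, then a sustained seven-second thought) can be sketched in Python. Note this is a minimal illustration of the logic, not Carr's actual code; the Emotiv SDK's real API and the signal values are assumptions, simulated here as plain lists of a "push" metric sampled once per second.

```python
from statistics import mean, stdev

def calibrate(baseline_samples):
    """Derive a trigger threshold from ~30s of resting 'push' metric samples."""
    mu = mean(baseline_samples)
    sigma = stdev(baseline_samples)
    # Only fire when the live signal sits well above resting noise.
    return mu + 3 * sigma

def detect_push(live_samples, threshold, hold=7):
    """Fire once the metric stays above threshold for `hold` consecutive
    samples (one sample per second approximates the seven-second hold)."""
    streak = 0
    for value in live_samples:
        streak = streak + 1 if value > threshold else 0
        if streak >= hold:
            return True
    return False

# Simulated run: quiet baseline, then a sustained "PUSH! PUSH! PUSH!" burst.
baseline = [0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.08, 0.11]
threshold = calibrate(baseline)
print(detect_push([0.1, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9], threshold))  # True
```

The hold requirement is what made my couple of minutes so agonizing: a momentary spike of concentration resets to zero the instant the signal dips.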
Overall, the experience was more frustrating than empowering. The moments where BB-8 moved forward felt random instead of a natural progression of thought, and I wasn't sure if I could replicate it before I had to pass the headset off to the next person. Was I not concentrating hard enough until the end? Or had I been concentrating too hard on "nothing" during the baseline readings, throwing off the measurements?
Carr reassured me that the cognitive delay was natural. It took him 45 minutes to successfully send his first command, as the software analyzed his specific way of thinking and he got accustomed to better visualizing and concentrating on his desires.
He did joke that his niece had no trouble moving BB-8 with her imagination on her first try. She pictured riding a unicorn on a rainbow, and BB-8 rolled forward with gusto. Evidently, the subject of the thought matters less than its strength and specificity.
Once users get past that first hurdle, Carr assures me, they can focus on much more than commanding a children's toy.
"Pushing BB-8 is just 'generic action A,'" said Carr, "but with time you will be able to add multiple actions. You can point to the ceiling and have the software translate that into turning your lights on and off. Or your headset can register your bad mood on the drive home and pour you a cup of wine as you pull into the driveway." All this could be done through the cloud.
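Carr's "generic action A" idea maps naturally onto a lookup table that translates a classified thought or gesture into a cloud message. The sketch below is purely illustrative: the action names are Carr's examples, but the device names and the topic scheme (loosely modeled on IBM Watson IoT's `iot-2/cmd/<id>/fmt/json` MQTT layout) are my assumptions, not his implementation.

```python
import json

# Hypothetical mapping from a recognized mental command to a device action.
ACTIONS = {
    "push": {"device": "bb8", "command": "roll"},
    "point_up": {"device": "lights", "command": "toggle"},
    "bad_mood": {"device": "wine_dispenser", "command": "pour"},
}

def to_iot_message(thought):
    """Translate a classified thought into an MQTT-style topic and JSON
    payload, ready to publish to an IoT cloud broker."""
    action = ACTIONS[thought]
    topic = f"iot-2/cmd/{action['command']}/fmt/json"
    payload = json.dumps({"device": action["device"],
                          "command": action["command"]})
    return topic, payload

topic, payload = to_iot_message("point_up")
print(topic)  # iot-2/cmd/toggle/fmt/json
```

Routing everything through a cloud broker like this is what lets one headset drive many unrelated devices: each gadget only needs to subscribe to its own command topic.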
The future software behind your smart home?
Carr's first priority for improving the technology is eliminating false positives. Once the headset is fully attuned to your thought processes, turning off your smart TV or adjusting the thermostat could be literally as easy as thinking about it. But that also means an idle thought or a stray hand wave could trigger your devices, which is exactly what Carr wants to avoid. For now, getting your thought commands mapped to the cloud takes around eight hours in total.
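One common guard against the stray-thought problem is a confirmation step: the system only acts when the same command is recognized twice in quick succession. This is my own hedged sketch of that pattern, not anything Carr said he uses; events are modeled as (frame, command) pairs from the classifier.

```python
def confirmed(events, command, window=5):
    """Act only if `command` is recognized twice within `window` frames,
    so a single idle thought doesn't flip the TV off."""
    last_seen = None
    for frame, evt in events:
        if evt == command:
            if last_seen is not None and frame - last_seen <= window:
                return True
            last_seen = frame
    return False

# Two "toggle" thoughts three frames apart: confirmed.
print(confirmed([(0, "toggle"), (3, "toggle")], "toggle"))  # True
```

The trade-off is latency: every deliberate command now takes two recognitions, which is a reasonable price for a thermostat but would make the BB-8 demo even slower.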
He wasn't specific about whether or when his code will be released to the public, but this tech demo appears to be less about selling IBM products and more about getting software developers to adopt IBM's Internet of Things cloud for their own programming - with their own commercially available hardware.
"It took me 2 nights of coding and 5 beers to program this rig," said Carr, suggesting that other programming experts could and should try to replicate his work to control their smart devices.
As a non-programmer, I don't intend to dive into his code, but I can definitely see mind-controlled tech moving beyond demos in the near future. We'll have to wait and see what IBM decides to focus on after BB-8.