Neurable is building a brain-computer interface that integrates directly into a virtual reality headset. Neurable uses dry EEG sensors from Wearable Sensing, which means it doesn't require gel to get a good signal, making it a lot more user-friendly than gel-based BCIs. I had a chance to try the demo at SIGGRAPH 2017, which was showing off what Neurable refers to as "Telekinetic Presence." It's the closest thing I've ever experienced in VR to having the technology read my mind. It runs a calibration phase to learn to detect the brainwaves associated with intentional action. Once it's trained, it's a matter of looking at specific objects in a virtual environment, and then experiencing a state of pure magic when it feels like you can move objects around with your mind alone.
Neurable CEO Dr. Ramses Alcaide suspects this type of magical, BCI mind-control mechanic is going to be a huge affordance for what makes spatial computing unique. He said that the graphical user interface plus the mouse unlocked the potential of the personal computer, and that capacitive touch screens unlocked the potential of mobile phones. He's hoping that Neurable's BCI can help unlock the potential of 3DUI interactions in virtual and augmented reality. I had a chance to catch up with Alcaide at SIGGRAPH 2017, where we talked about the design decisions and tradeoffs behind their BCI system, their ambitions for building the telekinetic presence of the future, and their work on an operating system for spatial computing environments that aims to create a world without limitations.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.