Craig Chapman is a movement neuroscientist at the University of Alberta, and he's received an EEG in the Wild grant from the Canadian Institute for Advanced Research in order to study the combination of movement tracked by VR technologies, eye gaze, and EEG data. Chapman is collaborating with fellow CIFAR scholars Alona Fyshe and Joel Zylberberg to use the motion and eye-tracking data from immersive technologies to automatically label the EEG data, which will then be used to train machine learning models to see if they can start to detect the neural signatures that can predict future behavior. I had a chance to talk to Chapman at the Canadian Institute for Advanced Research workshop on the Future of Neuroscience & VR that took place May 22 and 23, 2019. We talked about his research, embodied cognition, and how virtual reality is revolutionizing his neuroscience research by enabling him to correlate the trifecta of movement data, eye gaze, and EEG data.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.