Stanford University researchers have just published an important research paper that shows how motion-tracked data in VR can be used to identify specific users. The paper, titled "Personal identifiability of user tracking data during observation of 360-degree VR video," was published in Scientific Reports on October 15th by Mark Roman Miller, Fernanda Herrera, Hanseul Jun, James A. Landay, and Jeremy N. Bailenson.
I had a chance to catch up with Miller on October 12th to summarize their major findings, which included 95% accuracy in identifying one of 511 different participants from a 20-second sample drawn from a 10-minute session of watching a 360-degree video and then rating their emotional reactions using the HTC Vive's hand-tracked controllers. Even though participants were only watching a 360-degree video, the researchers had access to a 90Hz feed of 6DoF information from the head pose in addition to two 6DoF-tracked hands. From this basic motion-tracked data, they were able to extrapolate a unique signature of someone's body size, height, and the nuances of how they hold and use the controllers, which ends up being enough information to reliably identify someone given the right machine learning algorithm.
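To give a feel for how stable body signatures in tracking data can single someone out, here is a toy sketch in Python. It is purely illustrative and does not reproduce the study's actual features or classifier: it assumes each user has a characteristic per-feature profile (standing in for height, arm span, controller-holding habits), summarizes each 20-second, 90Hz window by its feature means, and identifies users with a simple nearest-centroid rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each user has a stable "signature" across a few
# motion-derived features (e.g. head height, controller offsets).
n_users, n_feats = 8, 4
profiles = rng.normal(size=(n_users, n_feats))  # per-user signature

def sample_window(user, noise=0.1):
    # One 20-second window at 90 Hz = 1800 frames; summarize by feature means.
    frames = profiles[user] + rng.normal(scale=noise, size=(1800, n_feats))
    return frames.mean(axis=0)

# "Enrollment": average a handful of windows per user into a centroid.
centroids = np.stack(
    [np.mean([sample_window(u) for _ in range(5)], axis=0) for u in range(n_users)]
)

def identify(window):
    # Nearest-centroid classification: predict the closest enrolled user.
    return int(np.argmin(np.linalg.norm(centroids - window, axis=1)))

correct = sum(identify(sample_window(u)) == u for u in range(n_users))
print(f"{correct} of {n_users} test windows identified")
```

Because averaging 1800 frames washes out frame-to-frame noise while the per-user signature persists, even this crude rule identifies every simulated user; the study's machine learning models exploit the same basic idea at much larger scale.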
I talk with Miller about the experimental process and analysis, as well as some of the implications of this study. Currently, this type of motion-tracked data is typically considered to be de-identified, but research like this may start to reclassify motion-tracked data as personally identifiable and potentially even as biometric data. We also talk about how specific medical information can be inferred from recordings of this motion-tracked data. There are more ways to make this type of research robust across multiple contexts over time, but it generally points to the possibility that there are some immutable characteristics that can be extrapolated and inferred from this data.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.