LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
On May 20th, Oculus will be releasing a “My Privacy Center” web interface that will allow users to download a copy of the personal data that Oculus has collected, view the information that Oculus collects when you use their platform, and configure privacy settings: who can see your real name, whether you appear in real-name search, whether your Oculus apps & activity are shared, and who can see your friends list. Hall and Cohen told me that Oculus is really committed to transparency, and these automated privacy tools will be a huge step toward actually allowing users to audit what data are being collected.
Both the current and new privacy policies are more likely to grant Oculus permission for what data they can collect than to detail obligations for how Oculus plans to capture and store that data. Hall and Cohen described to me how Oculus takes a tiered approach to privacy, with at least three major tiers of collected data: data that are collected and tied back to personal identity (which they try to limit), data that are de-identified and shared in aggregate (things like physical movements sampled at a low frequency), and personal information that is useful for VR but only stored locally on your machine (like the height of the player).
None of the de-identified data that’s captured is going to show up in the new My Privacy Center, which means that there is currently no way for users to audit what types of de-identified data are being captured. There’s also no mechanism for users to see whether the sample frequency of recorded physical movements increases, and no disclosure obligation for Oculus to let users know if they do increase the frequency or start capturing new types of physical movements. If Oculus is truly committed to full transparency, then they should provide a master list of all the different types of data being collected, in a table format, detailing which storage tier each type falls into and what information is shared with other Facebook-family services.
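As a hypothetical illustration of the kind of disclosure manifest being proposed here, such a master list could be as simple as a small machine-readable table mapping each data type to its storage tier. Every entry below is an invented example for the sake of the sketch, not a claim about what Oculus actually collects:

```python
# Hypothetical disclosure manifest illustrating the proposed "master list"
# of collected data types. Tier names follow the three tiers described
# above; all entries are invented examples, not actual Oculus categories.

TIERS = ("identified", "deidentified_aggregate", "local_only")

manifest = [
    # (data type, storage tier, notes)
    ("email_address", "identified", "account contact info"),
    ("head_movement", "deidentified_aggregate", "sampled at low frequency"),
    ("player_height", "local_only", "never leaves the device"),
]

def by_tier(tier):
    """Return all data types stored under a given tier."""
    assert tier in TIERS, f"unknown tier: {tier}"
    return [name for name, t, _ in manifest if t == tier]

print(by_tier("local_only"))  # e.g. ['player_height']
```

A table like this, published and versioned, would also make it auditable when a new data type is added or a sampling frequency changes.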
The new GDPR law also says that “it must be as easy to withdraw consent as it is to give it,” but there is no indication that Oculus will provide ways to opt out of specific types of data capture and recording; this granularity of control was not shown in the initial screenshots of the new My Privacy Center.
But both the old and new privacy policies say that all data collected by Oculus can also be shared with Facebook: “Sharing Within Related Companies. Depending on which services you use, we share information within the family of related companies that are legally part of the same group of companies that Oculus is part of, or that become part of that group, such as Facebook.” It also says that they can use information to “market to you on and off our Services,” which may have been intended to mean e-mail, but it can also be read to mean that Oculus data can be used to advertise to you on Facebook.
All of the biometric data experts I’ve talked with have raised concerns about biometric data privacy. Behavioral neuroscientist John Burkhardt warns that there’s an unknown ethical threshold between predicting and controlling behavior given access to biometric data streams like eye tracking, facial tracking & emotion detection, galvanic skin response, EEG, EMG, and ECG.
Privacy advocate Sarah Downey warns that VR could turn out to be the most powerful surveillance technology ever created if companies start recording biometric data, or it could be the last bastion of privacy. She also points out that the more data companies record, the weaker Americans’ Fourth Amendment protections become, which can make people less likely to speak freely and exercise their First Amendment right to free speech.
Jim Preston warns against the dangers of performance-based marketing companies like Facebook or Google having access to biometric data, arguing that we’re mortgaging our right to privacy in exchange for free services. He says that privacy is a really complicated topic, and that it’s going to take the entire VR industry being engaged in these discussions.
Advanced Brain Monitoring CEO Chris Berka says that some biometric data should be considered medical information protected by HIPAA regulations, and that commercial companies will have to navigate some sensitive issues in how they store and treat biometric data. Tobii’s VP of Products and Integrations Johan Hellqvist says that companies should ask for explicit consent before they consider recording eye tracking data.
So I’ve had many conversations with biometric data experts warning about how this data from your body reveals whole new levels of unconscious information about what you value, what you’re paying attention to, and perhaps even what you find interesting. Biometric data will be a gold mine for performance-based marketing companies like Google and Facebook, so it’s not terribly surprising that Oculus is leaving the door open on how they will treat it. But it’s also quite disappointing that Oculus is not being more proactive in participating in a larger conversation about biometric data, and that they seem to discount it as a concern that is far off in the future, even as I’m seeing mobile VR prototypes at GDC 2018 from Qualcomm with Tobii eye tracking technology built in. I expect to see eye tracking and facial tracking technologies released in VR and AR hardware within the next 1-3 years, which is not so far off.
There may also be issues with recording biometric data that is presumed to be de-identified, since unique biometric signatures could de-anonymize it. Open BCI’s Conor Russomanno warns that EEG data may turn out to carry unique biometric signatures, which would mean the data can never be fully anonymized.
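To make the re-identification concern concrete, here is a minimal sketch of a linkage attack, using invented toy numbers rather than real biometric data: if each person’s movement or EEG patterns carry a stable, distinctive signature, then a “de-identified” record can be matched back to a named individual simply by finding the closest known signature.

```python
# Minimal linkage-attack sketch: "de-identified" records are re-identified
# by nearest-neighbor matching against known biometric signatures.
# All signature vectors below are invented toy values for illustration.
import math

# Known individuals and their (toy) biometric signature vectors,
# e.g. summary statistics of head movement or EEG patterns.
known = {
    "alice": (0.9, 0.1, 0.4),
    "bob":   (0.2, 0.8, 0.7),
}

def distance(a, b):
    """Euclidean distance between two signature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def reidentify(anonymous_record):
    """Match an anonymized signature to the closest known identity."""
    return min(known, key=lambda name: distance(known[name], anonymous_record))

# A record stripped of any name or ID still carries its owner's signature:
print(reidentify((0.88, 0.12, 0.41)))  # closest to alice's signature
```

Real de-anonymization research is far more sophisticated, but the core mechanism is this simple: stripping identifiers doesn’t help if the data itself is the identifier.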
When I asked why they removed this security section, Hall said that they’re not trying to claim that data is 100% secure, but they also didn’t see the passage as necessary; it also happened to scare people. I don’t think it should have been removed, because it’s actually honest about the reality that any data collected isn’t 100% secure and can never be guaranteed to be. People should be scared, because we should be trying to limit what data are being captured and recorded.
Any data provided to third parties should be assumed to be hackable and capable of leaking onto the dark web. So when I expressed concern to Cohen that the de-identified data being collected could be unlocked with the right biometric key, his response was that you’d need access to the full set of data, and that this data is stored securely on their private servers. But that information could still be hacked and leaked, and there could be a lot of unintended consequences of allowing biometric data to be captured and recorded in what is presumed to be a safe vault, but turns out to get hacked, leaked, and into the wrong hands.
So Cohen’s response to my concern implies that data are completely safe in their hands, and that we shouldn’t worry about this scenario. Perhaps it’s low probability, but I’d argue that we should be thinking about the real risk that decades’ worth of biometric data could eventually be leaked onto the dark web and unlocked with biometric signatures, and about what could happen if a bad actor wanted to manipulate us with access to the most intimate data about our unconscious behaviors, values, and beliefs. Engineering the future involves all sorts of risks and tradeoffs, and it may turn out that some of these dystopian worst-case scenarios are so low-risk as not to be worth worrying about. But perhaps we should be imagining these worst-case scenarios in order to think deeply about the risks of what data is being collected, and whether or not biometric data can ever be fully de-identified.
So overall, the impression that I got from Hall and Cohen is that Oculus is earnestly trying to be on the right side of transparency, and that they’re really trying to build trust with users in order to grow the VR and AR ecosystem. The problem I have is that there is still a lack of full transparency and communication about the types of data that are collected and how they’re stored, as well as about what types of data may prove interesting and valuable for Facebook to use for advertising purposes.
Both Hall & Cohen emphasized that they’re taking the most conservative interpretations of these types of passages, that they’re trying to build trust with users, and that their new privacy tools will provide new levels of transparency and accountability. A lot of these tools seem to have been implemented as compelled by the new GDPR laws, and an open question is whether it will take laws like these to push Oculus to keep implementing privacy best practices, or whether they’ll go above and beyond what these policies require and start to provide even more detail on what exactly is being recorded and tied to identity, what’s being recorded as de-identified information, and what’s stored locally on your computer.
I’m also happy to start a deeper dialogue with people who are directly on the Privacy XFN team at Facebook/Oculus who are starting to think about these deeper issues about privacy in VR and AR, and some of the privacy challenges that come with biometric data. It’s been difficult to have an embodied conversation with privacy experts at Facebook or Google, and I’m glad that the cultural conversation has changed to the point where I’m able to have an in-depth conversation about these topics. And hopefully this marks a change in how Oculus is engaging with press after not taking any press interviews at either Oculus Connect 4 or GDC 2018.
From this conversation, I was happy to hear how much consideration is being given to how this data is collected, and I hope that Oculus finds better ways to share this type of information in a more comprehensive and up-to-date fashion. The GDPR catalyzed a lot of great progress here, and I hope that Oculus doesn’t wait for more laws and regulations before continuing to improve and update their privacy practices.
This is a listener-supported podcast through the Voices of VR Patreon.
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip