#641: Oculus’ Privacy Architects on their Open-Ended Privacy Policy & Biometric Data

Oculus will be releasing a new Privacy Policy and Terms of Service tomorrow that will go into effect on May 20th, just five days before the EU's General Data Protection Regulation (GDPR) privacy law enforcement deadline of May 25th. I had a chance to review the new privacy policy and terms of service as well as talk with the lead privacy policy architect Jenny Hall and privacy cross-functional team member Max Cohen, who leads product for the Oculus platform. Generally, both the old and new Oculus privacy policies are written in an open-ended way that provides Oculus great leeway to capture and record many different types of data, and the new privacy policy actually adds a number of new passages that allow for new types of data to be collected. Hall & Cohen emphasize that Oculus is committed to transparency and building trust, and that they need this flexibility to account for future applications that haven't even been imagined yet. But as the line between Oculus and Facebook continues to blur, there are still many open questions about what types of data or biometrics gathered from VR are going to prove useful for Facebook's advertising bottom line.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

In talking with Hall and Cohen, they were able to detail how Oculus is taking a much more conservative approach than a worst-case-scenario interpretation of what the privacy policy affords, but up to this point their limited implementations have relied upon a "just trust us" approach without a lot of transparency on the full range of data that is actually being captured and how it is being stored. Oculus will soon be releasing more GDPR-inspired transparency tools so that users will be able to audit for themselves what personal data are being recorded, but these tools still will not reveal everything that Oculus is capturing and recording.

On May 20th, Oculus will be releasing a "My Privacy Center" web interface that will allow users to download a copy of the personal data that Oculus has collected, view the information that Oculus collects when you use their platform, and set privacy settings around who can see your real name, whether you can be found by real-name search, whether your Oculus apps & activity are shared, and who can see your friends list. Hall and Cohen told me that Oculus is really committed to transparency, and these automated privacy tools will be a huge step in actually allowing users to audit what data are being collected.

The current privacy policy allows users to request to download and review their data, but I found the previous process to be both unreliable and non-responsive. Oculus did not respond to the email requests that I sent to privacy@oculus.com in January and March 2017, and so I'm happy to see that the GDPR obligations have catalyzed an automated web interface that will provide immediate access to the private data Oculus has captured. When asked if all of the GDPR obligations will be provided to all users around the world, an Oculus PR representative responded, "We are making sure everyone has the same settings, controls, and privacy protections no matter where they live, so not just Europe but globally. The GDPR's penalties and notification policies are specific to EU law."

Both the current and new privacy policies are more about granting Oculus permission for what data they can collect than about detailing the obligations for how Oculus plans on capturing and storing that data. Hall and Cohen described to me how Oculus takes a tiered approach to privacy, with at least three major tiers of data: data that are collected and tied back to personal identity (which they try to limit), data that are de-identified and shared only in aggregate (things like physical movements sampled at a low frequency), and personal information that is useful for VR but is only stored locally on your machine (like the height of the player).
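To make this tiered model concrete, here is a minimal sketch in Python of how different data types might be routed to those three tiers. The names and the mapping are hypothetical illustrations assembled from the examples Hall and Cohen gave in this interview; Oculus has not published any such schema.

```python
from enum import Enum

class Tier(Enum):
    IDENTIFIED = "collected and tied back to personal identity (limited)"
    DEIDENTIFIED_AGGREGATE = "de-identified and shared only in aggregate"
    LOCAL_ONLY = "stored only on the user's local machine"

# Hypothetical routing table based on the examples from this interview;
# Oculus has not disclosed an authoritative mapping like this anywhere.
DATA_TIERS = {
    "friends_list": Tier.IDENTIFIED,
    "physical_movements": Tier.DEIDENTIFIED_AGGREGATE,  # sampled at low frequency
    "play_space_dimensions": Tier.DEIDENTIFIED_AGGREGATE,
    "player_height": Tier.LOCAL_ONLY,  # never transmitted to their servers
}

def tier_for(data_type: str) -> Tier:
    """Look up the tier for a data type. The privacy policy itself never
    makes this mapping explicit, which is the core criticism below."""
    return DATA_TIERS[data_type]
```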

However, Oculus does not disclose in the privacy policy at which tier a given type of data will be captured. For example, in the "Information Automatically Collected About You When You Use Our Services" section, Oculus only says that they collect "information about your environment, physical movements, and dimensions when you use an XR device." Oculus doesn't specify that their current recordings of physical movement data are not tied to your identity, that the sample frequencies are too low to fully reconstruct movements, and that the data are only presented in aggregate form. This is the type of information that Hall and Cohen provided to me when I asked about it, but Oculus hasn't disclosed it in any other way.
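As a rough illustration of what "low sample frequency, aggregate only" could mean in practice, here is a hedged Python sketch. The 90 Hz tracker rate and the one-sample-per-second downsampling are my assumptions for illustration; Oculus has not disclosed its actual pipeline or rates.

```python
import statistics

def downsample(positions, keep_every_n=90):
    """Keep every Nth tracked-position sample, e.g. roughly one sample per
    second from a hypothetical 90 Hz tracker -- too sparse to reconstruct
    someone's actual movements."""
    return positions[::keep_every_n]

def aggregate_play_spaces(per_user_extents):
    """Report only population-level statistics over (width, depth) extents,
    never per-user values, and never alongside any identifier."""
    widths = [w for w, d in per_user_extents]
    depths = [d for w, d in per_user_extents]
    return {
        "median_width_m": statistics.median(widths),
        "median_depth_m": statistics.median(depths),
        "n_users": len(per_user_extents),
    }
```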

The way the privacy policy is written implies that physical movements could indeed be tied to personal identity at as high a sample frequency as they would want. It's this level of vague, open-ended language that would allow Oculus to capture data at a much higher fidelity than they currently do. Because Oculus doesn't commit to any specifics in the privacy policy, they don't have to commit to notifying users if their implementation changes. Currently Oculus isn't tying physical movements to identity, but that could change next month, and there are no notification obligations specified in the privacy policy. The privacy policy merely states that Oculus can record physical movements, without being at all prescriptive about how Oculus decides to implement it.

It is worth pointing out that both Hall and Cohen emphasized over and over again that they're really committed to transparency, and that most of their interpretations of the privacy policy are very conservative. They're not trying to scare users, but rather build trust with them. In May, users will have tools to verify what data are actually being recorded, and if there is a mismatch of expectations, with way more data captured than users were expecting, then users will lose trust in Oculus. It takes a lot of time to build trust, but it can be lost in a moment, and Cohen emphasized that losing trust would be detrimental for Oculus. So I took on good faith the message that Oculus' privacy policy needs to be flexible enough for them to provide the services they are providing, but the privacy policy still only specifies limited obligations for what Oculus is committed to providing.

It is likely that this is because Oculus is trying to keep their privacy policy simple in response to GDPR obligations to have human-readable privacy policies that give concrete examples. Hall also said that they're trying to prevent the policy from exploding into hundreds of pages. Downloadable access to the exact data that are actually collected and tied to identity will also likely solve some of the problems of having open-ended and vague language in the privacy policy, but it won't solve all of the transparency issues about what exactly is being recorded.

None of the de-identified data that’s captured is going to show up in the new My Privacy Center, which means that there is currently no way for users to audit what types of de-identified data are being captured. There’s also no mechanism for users to see if the sample frequency of the recording of physical movements increases, and there’s no disclosure obligation by Oculus to let users know if they do increase the frequency or start capturing new types of physical movements. If Oculus is truly committed to full transparency, then they should provide a master list of all of the different types of data that are being collected in a table format with details about the different tiers of how that data are being stored, and what information is being shared with other Facebook-family services.
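To illustrate, a disclosure table like the one I'm suggesting might look something like this (the rows are my own guesses assembled from this interview, not an actual Oculus disclosure):

| Data type | Storage tier | Sample rate | Shared with Facebook-family services? |
|---|---|---|---|
| Physical movements | De-identified, aggregate only | Low (undisclosed) | Undisclosed |
| Play space dimensions | De-identified, aggregate only | n/a | Aggregates shared with developers |
| Player height | Local device only | n/a | No |
| Friends list / social connections | Tied to identity | n/a | Undisclosed |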

The new GDPR law also says that "it must be as easy to withdraw consent as it is to give it," but there is no indication that Oculus is going to provide ways to opt out of having specific types of data captured and recorded, as this granularity of control was not shown in the initial screenshots of the new My Privacy Center.

One of the most concerning new passages in the new privacy policy is this statement: "We collect information about the people, content, and experiences you connect to and how you interact with them across our Services." This could potentially open the door for Oculus to start correlating what content you're specifically looking at within a VR experience, and then feed that data to Facebook for advertising purposes. One of the passages in the "How do we use information?" section says that the information they gather is used "To market to you. We use the information we collect to send you promotional messages and content and otherwise market to you on and off our Services." When I asked Hall about reading these two passages together, she said that the marketing passage currently refers to sending promotional emails about VR experiences that you might like, and that Oculus doesn't have any current plans to do any more sophisticated advertising.

But both the old and new privacy policies say that all data collected by Oculus can also be shared with Facebook: "Sharing Within Related Companies. Depending on which services you use, we share information within the family of related companies that are legally part of the same group of companies that Oculus is part of, or that become part of that group, such as Facebook." The policy also says that they can use information to "market to you on and off our Services," which may have been intended to mean e-mail, but it can also be read to mean that Oculus data can be used to advertise to you on Facebook.

So even if Oculus doesn't have any plans to do any advertising, Oculus has set up the legal framework to be able to send data over to Facebook where it can be used for advertising purposes. Nowhere has Oculus committed to disclosing what specific information is ever shared with Facebook, or what type of data might prove to be useful for advertising purposes. Even if Oculus isn't currently sharing any data with Facebook, and even if they don't have any near-term plans to do so, they have granted themselves this right in their privacy policy with no further obligations to disclose what data are being shared with other services.

UPDATE: Oculus' blog post has a FAQ with the question and answer of "Is my Oculus data used to target ads to me on Facebook? We don't share data with Facebook that would allow third parties to target advertisements based on your use of the Oculus Platform." So they're saying that they're not currently sharing data that would be used by third parties for advertising, but their privacy policy technically allows this to happen in the future. This is another example of how open-ended their policy is: a close reading of the policy would allow this to happen in the future, and there are no commitments made in the privacy policy to disclose to users if this changes, nor any transparency on what specific data (de-identified or identified) will ever be shared with Facebook. Also, does not sharing Oculus data directly with third-party advertisers mean that Facebook won't be using data from Oculus to create more specific psychographic profiles? That could indirectly benefit advertisers. Again, there is no obligation that Oculus has made anywhere to fully disclose what information might be shared between Oculus and Facebook.

The other big open question that I have for Oculus and Facebook is what their philosophical stance on recording biometric data is going to be. I was disappointed to hear that they are not taking any stance on biometric data yet, which means that they're still leaving the door open to potentially capturing and recording biometric data in the future. Cohen said that there aren't any Oculus platform technologies released yet that are recording biometric data, and so they're currently having those discussions internally on the Privacy XFN team. Hall said that these questions about biometric data seem to be way off in the future, and that they are not prepared to make any statements on it yet. But just because Oculus hasn't yet released any products that directly capture biometric data doesn't mean that Oculus can't have an opinion about biometric data and how they plan on treating it. Hall did say that they would likely update their privacy policy to account for biometric data, but it's also possible that this privacy policy will be unchanged once products that can capture biometric data are released in the near future.

All of the biometric data experts that I've talked with have warned about biometric data privacy. Behavioral neuroscientist John Burkhardt warns that there's an unknown ethical threshold between predicting and controlling behavior once there is access to biometric data streams like eye tracking, facial tracking & emotional detection, galvanic skin response, EEG, EMG, and ECG.

Privacy advocate Sarah Downey warns that VR could turn out to be the most powerful surveillance technology ever created if companies start recording biometric data, or it could be the last bastion of privacy. She also points out that the more data companies record, the more Americans' Fourth Amendment protections are weakened, which can make it less likely that people will speak freely and exercise their First Amendment rights to free speech.

Jim Preston warns against the dangers of performance-based marketing companies like Facebook or Google having access to biometric data, arguing that we're mortgaging our rights to privacy in exchange for free services. He says that privacy is a really complicated topic, and that it's going to take the entire VR industry being engaged in these discussions.

Advanced Brain Monitoring CEO Chris Berka says that some biometric data should be considered medical information protected by HIPAA regulations, and that commercial companies will have to navigate some sensitive issues around how they store and treat biometric data. Tobii's VP of Products and Integrations Johan Hellqvist says that companies should be asking for explicit consent before they consider recording eye tracking data.

So I’ve had many conversations with biometric data experts warning about how this data from your body reveals whole new levels of unconscious information about what you value, what you’re paying attention to, and perhaps even what you find interesting. Biometric data will be a gold mine for performance-based marketing companies like Google and Facebook, and so it’s not incredibly surprising that Oculus is leaving the door open for how they will treat it. But it’s also quite disappointing that Oculus is not being more proactive in participating in a larger conversation about biometric data while also seemingly discounting it as a concern that is really far off in the future when I’m seeing mobile VR prototypes at GDC 2018 from Qualcomm that have Tobii eye tracking technology built in. I expect to see eye tracking and facial tracking technologies released in VR and AR hardware within the next 1-3 years, which is not so off into the future.

The fact that Oculus has said that they can record physical movements may already mean that they've created the legal framework to capture other types of biometric data. When I asked whether or not "physical movements" could be interpreted to include eye movements or facial movements, Hall wasn't willing to provide a definitive answer and said that they currently had not been thinking about it in that way. But the way that the current privacy policy is written is open-ended enough that it could already give Oculus the right to record eye movements or facial movements, and to tie them to our identity if they chose to do so.

There may also be issues with recording biometric data that is presumed to be de-identified but contains unique biometric signatures that can de-anonymize it. OpenBCI's Conor Russomanno warns that EEG data may turn out to have unique biometric signatures, which would mean that the data may never be able to be fully anonymized.
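Here is a hedged sketch of how that kind of de-anonymization could work in principle: reduce a "de-identified" trace to a behavioral signature and match it against a gallery of labeled signatures. This is an entirely hypothetical illustration, not a demonstrated attack on any Oculus data, and real attacks would use far richer learned features.

```python
import math

def signature(samples):
    """Reduce a biometric trace (e.g., head-motion or EEG samples) to a
    crude feature vector: mean, variance, and mean absolute change."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    delta = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (n - 1)
    return (mean, var, delta)

def reidentify(anonymous_trace, labeled_gallery):
    """Match a supposedly de-identified trace to the nearest known
    signature. If signatures are stable per person, the 'anonymous'
    data was never really anonymous."""
    target = signature(anonymous_trace)
    name, _ = min(labeled_gallery,
                  key=lambda item: math.dist(target, item[1]))
    return name

# labeled_gallery would be a list like:
#   [("alice", signature(alice_trace)), ("bob", signature(bob_trace)), ...]
```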

This has implications for presumably de-identified biometric data: there may be a unique biometric key that unlocks the identity behind it. Oculus assures us that they use state-of-the-art security practices, but data can never be completely guaranteed to be safe and secure. Oculus is actually removing the security disclaimer from their privacy policy that used to read, "Please note that no data transmission or storage can be guaranteed to be 100% secure. As a result, while we strive to protect the information we maintain, we cannot guarantee or warrant the security of any information you disclose or transmit to our Services and cannot be responsible for the theft, destruction, or inadvertent disclosure of information."

When I asked why they removed this security section, Hall said that they're not trying to claim that data is 100% secure, but they also didn't see that this passage was necessary. It also happened to scare people. I don't think it should have been removed, because it's actually honest about the reality that any data that's collected isn't 100% secure and can never be guaranteed to be 100% secure. People should be scared, because we should be trying to limit what data are being captured and recorded.

We should assume that any data provided to a third party can be hacked and potentially leak out onto the dark web. So when I expressed concern to Cohen that the de-identified data being collected could be unlocked with the right biometric key, his response was that you'd need to have access to the full set of data, and that this data is stored securely on their private servers. But information always has the potential to be hacked and leaked, and there could be a lot of unintended consequences of allowing biometric data to be captured and recorded in what is presumed to be a safe vault that later turns out to get hacked, leaked, and into the wrong hands.

So Cohen’s response to my concern implies that data are completely safe in their hands, and that we shouldn’t worry about this scenario. Perhaps it’s low probability, but I’d argue that we should be thinking about the real risk that decades worth of biometric data could eventually be leaked out onto the dark web, unlocked with biometric signatures, and what could happen if a bad actor wanted to manipulate us if they had access to the most intimate data about our unconscious behaviors, values, and beliefs. Engineering the future depends upon all sorts of risks and tradeoffs, and it may turn out that some of these dystopian worst-case scenarios are so low risk as to not to worry about them. But perhaps we should be imagining these worst-base scenarios in order to think deeply about the risks of what data is being collected, and whether or not biometric data will be able to be fully de-identifiable.

So overall, the impression that I got from Hall and Cohen is that Oculus is earnestly trying to be on the right side of transparency, and that they're really trying to build trust with users in order to grow the VR and AR ecosystem. The problem that I have is that there is still a lack of full transparency and communication about the types of data that are collected and how they're stored, as well as about what types of data may prove interesting and valuable for Facebook to use for advertising purposes.

The line between Oculus and Facebook continues to blur, and so I can't help but read the privacy policy through the lens of the worst-case scenario of how Facebook might want to gather biometric data about people to feed into their advertising systems. Oculus has promised a lot of transparency around the data that are being collected, and hopefully the My Privacy Center will help with that. But there are entire classes of data, and specifics of how the data are captured and stored, that remain completely opaque. And on top of that, there are no notification or disclosure obligations written into their privacy policy, so whatever is happening today doesn't mean that the same will be true tomorrow.

Just because Oculus isn’t living into the full extent of what their privacy policy affords, it’s written open-ended enough for them to grow into it and create new products that weren’t even imagined or implemented at the time of the writing. This allows them flexibility, but this also means that there are many passages in their privacy policy that are written in such an open-ended and vague way as to be possibly interpreted to mean a lot of scary things. Hall claimed that the new privacy policy isn’t trying to gain new rights, but the passage of “We collect information about the people, content, and experiences you connect to and how you interact with them across our Services” could open the door to allow Oculus to more precisely track how you interact with specific content within a VR experience.

Both Hall & Cohen emphasized that they're taking the most conservative interpretations of these types of passages, that they're trying to build trust with users, and that their new privacy tools will provide new levels of transparency and accountability. A lot of these tools seem to have been implemented as compelled by the new GDPR laws, and an open question is whether it will take more laws like these to get Oculus to continue implementing privacy best practices, or whether they'll go above and beyond what these policies require and start to provide even more details on what exactly is being recorded and tied to identity, what's being recorded as de-identified information, and what's stored locally on your computer.

I’m also happy to start a deeper dialogue with people who are directly on the Privacy XFN team at Facebook/Oculus who are starting to think about these deeper issues about privacy in VR and AR, and some of the privacy challenges that come with biometric data. It’s been difficult to have an embodied conversation with privacy experts at Facebook or Google, and I’m glad that the cultural conversation has changed to the point where I’m able to have an in-depth conversation about these topics. And hopefully this marks a change in how Oculus is engaging with press after not taking any press interviews at either Oculus Connect 4 or GDC 2018.

From this conversation, I was happy to hear how much consideration is going into how these data are being collected, and I hope that Oculus finds better ways to share this type of information in a more comprehensive and up-to-date fashion. The GDPR catalyzed a lot of great progress here, and I hope that Oculus doesn't wait for more laws and regulations to keep improving and updating their privacy practices.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to The Voices of VR Podcast. So both virtual reality and augmented reality are going to introduce all sorts of new challenging privacy questions as we move forward. This technology is going to have the capability to record and capture biometric data about us that is going to be incredibly revealing. As soon as you start to tap into different aspects of our biometric data, then you have the potential where some of these companies that are gathering and recording this data could start to know us better than we know ourselves, which in talking to John Burkhardt from iMotions, he said, look, as a behavioral neuroscientist, there's this unknown ethical threshold between predicting and controlling behavior. So you have this line that gets blurred between advertising and behavior modification. And I think that we're starting to see some early indications of that with Facebook and Google, just different ways that algorithms are being able to impact our individual lives, but also us collectively. So one solution to all of this, as suggested by Sarah Downey, a privacy advocate, is for companies just to simply not record data that they don't need. Because anytime that you are allowing a company to record data on you and tie that to your identity, that is weakening your protections for that data to remain private. According to the third party doctrine, any data you hand over to a third party has no reasonable expectation of privacy. It basically erodes the Fourth Amendment rights for that data to remain private. So the more that we collectively decide to let companies record data on us, then that is, in some real sense, collectively and culturally and legally weakening our legal rights to privacy, and as that continues to erode away, that has a direct impact on how we are able to speak freely. The weakening of the Fourth Amendment in the United States then weakens the First Amendment. So with virtual reality, there's going to be all sorts of new types of data that we have to collectively look at and make a decision as to whether or not we want companies to be recording this data and what the implications of that are going to be. So privacy in VR is a challenging topic. And in talking to Jim Preston, he said, you know, we actually need the entire industry to be engaged in this conversation. And it's actually been very difficult for me to engage some of the largest players to sit down and to have an embodied conversation about privacy. And I'm happy to say that I've actually had the opportunity to sit down with some of the architects of privacy policy at Oculus, as well as someone who's on the Privacy XFN team, the cross-functional team at Facebook and Oculus that has been starting to think about these larger privacy issues. You tend to see in the privacy policy something that's very open-ended with widely varying interpretations, and without the capability to actually see what data is being recorded, we in some sense have to assume the worst, or we just have to trust that they're not going to actually be recording things to the extent of what is described in their privacy policy. So with GDPR, the EU's General Data Protection Regulation, these are new privacy laws that are going to take effect for all EU citizens and that are requiring these international corporations to update their privacy tools and policies.
And so these new regulations are going to take effect on May 25th, and on May 20th, Oculus is actually going to be implementing their new privacy policy as well as a new terms of service. Oculus is going to be releasing some new tools that are going to allow people to have some insight as to what data are actually being collected. But at this point, we haven't seen much transparency as to what has or has not been recorded. So in some ways, this is the first opportunity that I've had to actually talk to different people that are involved in these privacy discussions at Facebook and Oculus to be able to actually walk through the privacy policy and call out specific passages that, you know, I find concerning, as well as these larger issues around biometric data and how they're thinking about that. So that's what we're covering on today's episode of the Voices of VR podcast. So on today's podcast, I had a chance to talk to both Jenny Hall, she's on the Oculus legal team, as well as Max Cohen, he leads product for the Oculus platform. And I had a chance to sit down and talk with them on Tuesday, April 17, 2018. They were in Menlo Park, California, and I was in Portland, Oregon. So with that, let's go ahead and dive right in.

[00:04:37.389] Jenny Hall: Great. Thanks, Kent. My name's Jenny Hall. I'm on the Oculus legal team. I head up our privacy programs on the legal side here in the US.

[00:04:47.331] Max Cohen: And I'm Max Cohen. I lead product for Oculus platform.

[00:04:52.152] Jenny Hall: And I'll take you through a little bit of an intro, and then we can go right into Q&A to discuss the topics that are important to you.

[00:04:58.434] Kent Bye: OK.

[00:04:58.574] Jenny Hall: Sounds great. Awesome. So we here at Oculus, we know we're at the forefront of new technology. We're at the beginning phases of VR, beginning phases of AR. We know that there are sensitive categories of data that we're thinking about. People care about it. People want to know about it. So we're super committed to transparency and letting users know exactly how we're treating their information. So we're rolling out some changes over the next few days and months that we hope will make it even clearer to users, to people about how we collect, use, and share their data. So I'll provide a quick summary of these, and then we can get more into the details in Q&A. So the first thing that we're announcing is My Privacy Center. My Privacy Center is a tool that's going to be launching in late May that will be a centralized repository where people can go and learn about their privacy settings, set their communication preferences, and access the new tools that we're building. These new tools were inspired by GDPR, which is the new data protection regulation that is coming into effect in the EU in late May, but we are rolling them out globally. We want to do what's right by our global community, not just what is the baseline required by law. So these tools will allow you to access data about you in the Oculus system. They will allow you to download data, and they will allow you to delete your account if you want. We hope you don't want to. The second thing that we're announcing is updates to our privacy policy and terms of service. These updates are not about us taking new rights. They're totally about providing more transparency to people on our platform. We've heard feedback. We understand people want more tangible examples of the types of data that we collect, better descriptions of how we use that data, and that's what we've tried to implement in the privacy policy and the terms of service. We've updated them to include things like our new social features, and some new advancements we're making in abuse reporting. So that's announcement number two. And number three is we're incorporating our Oculus code of conduct into the terms of service. Just as respecting people's privacy is critical to our success, so is keeping people safe in VR. We've had a code of conduct here at Oculus since the inception of our social product, but we're taking this opportunity to make it even more visible to people how we expect them to behave in VR, so that we can create a safe and vibrant VR community. So with that, I'll hand it over to Max to talk about it from a product perspective.

[00:07:37.353] Max Cohen: Great. Thanks, Jenny. Yeah, I'll just build briefly off of what Jenny said and talk a little bit about how we think about either new software or new hardware and where privacy fits in. So there's really two core principles that we think about that drive our product development process. The first one is that we only want to collect data that is necessary for a good VR experience. And the second principle is that we have to involve privacy thinking right at the outset. It can't be an afterthought. It has to be something that we're talking about even when we're at the initial PRD phase where we're thinking about what are the actual requirements that are going to go into the product that we're building. So in terms of that first principle, there's going to be some data elements that we have to collect just to make VR function. A good example of this is movement data. So if you turn your head in a headset or if you're using your touch controllers, we need to know the position in space of those devices so that VR can function. And this is analogous, in my mind, to mouse or keyboard input. If you're in VR, that movement data is that actual input. But there's also an element of what data the developers need. So in order for a VR ecosystem to thrive, developers are going to have to know specific things like what kind of hardware people have, the average size of their play spaces, and so on. We're never going to share individual data about this, but we do aggregate and de-identify this data and make it available to developers and the public alike as part of our Oculus Platform Stats hardware survey. There are also developer dashboards so devs can know how their apps are doing. What kind of revenue are they making? How is their app performing? How many installs are they getting? But again, that's always at the aggregated and anonymized level. In terms of that second principle, we've taken a step of explicitly creating what we call a privacy XFN team that brings together people from policy, legal, program management, product, and if appropriate, security, and they get involved right at the beginning of the product development process. So as we think about new products like inside-out tracking that's on our Santa Cruz prototype, we're going to be faced with new privacy challenges and we want to make sure that we stay true to those principles right from the start. And we're going to be open and transparent about the data that we collect and make sure that people know and are able to access that data wherever possible. So with that, Kent, we're happy to engage on any questions you have.

[00:09:43.302] Kent Bye: Great. Yeah, I've got a number of questions here. The first question is, I think there's a difference between using information ephemerally and then recording it in the long term. So why is it that you need to record physical movements and store them and tie them to identity?

[00:10:00.032] Max Cohen: Yeah, so one good example is how we can generate data on the play space. So it's something that by knowing where people are going in terms of the averages of how far they might range from one side to side, we can then generate the ability for developers to know that people who use that application or that experience, that it might be a space that's three feet by five feet or eight feet by eight feet, or maybe people are generally staying seated and not going too far, and that helps them add new updates to those particular applications. It also helps with us knowing things about how often the product is dropped, and it can help inform even the hardware process and give us insights as we develop future iterations of the hardware.

[00:10:39.587] Jenny Hall: And just to build off that a little bit, I wanted to clarify that we actually don't tie this information to your personal identity. When this information is transmitted to our servers, we divorce that from any kind of identifying information. So we may know that like 500 people have a certain play space, but we don't know that Max Cohen's play space is two and

[00:11:07.437] Kent Bye: So I guess one thing that is already possible with physical movement is something like gait detection. That is that even though you're saying that you're de-identifying it, I think already it's possible to identify people as they move through a space. I can identify my partner as she's walking across the hallway. Just the same in these different VR experiences, you can start to identify people just by the way they move. And with AI algorithms on top of that, I think the risk of any information that's recorded, it may turn out there are biometric data markers. And so what is being treated as de-identified data may actually turn out to be personally identifiable. I think that's the tricky thing here with any physical movements. I guess the question is, like, what happens if this information does have a unique biometric identifier and that what has been decided to be de-identified now actually turns out to be personally identifiable within the next two to five to 10 years with AI training algorithms?

[00:12:09.145] Max Cohen: I think there's two pieces of that. The first piece is that we're not recording the entirety of the data. We're just taking samples that get us what we need, which is to generate those averages in the play spaces. So I think that when you're talking about gait detection, that is actually looking at video or your own eyes of watching someone walk and move. That is not what we are recording. We're recording a small fraction of the amount of movement data, which enables us to generate some of these averages. The second piece of that is around information security. So the aggregated, de-identified information is stored on our Oculus servers. And without physical access to that data, you wouldn't be able to run any AI algorithms or anything like that. So that comes back to also, we take security extremely seriously here. We do get to leverage a lot of the best practices that the industry knows. And so that is a separate component. I think that's almost outside the privacy side to make sure that we are treating your data responsibly.

[00:13:06.399] Kent Bye: I guess another thing is that, the way the privacy policy is written, though, you may only currently be taking sample data. But I guess the question is, is there anything in the privacy policy that would prevent you from taking samples of data at a higher rate? Because as it's written now, it just says you're taking information about your physical movements. It's not actually specifying the frequency or anything. So at any point, you could change that. Is that correct?

[00:13:30.900] Jenny Hall: So we are not planning to do anything crazy with anybody's data here. Our interests are aligned with yours. Users are paying attention and users care about this stuff. If we do something that people aren't going to like, if we do something that is scary and we don't inform people about it adequately, they're not going to want to use our services anymore. That's why we're trying to be super transparent in these policies and make sure that people understand what our services entail.

[00:14:04.886] Max Cohen: We know that trust is continuously earned, but it can be lost in a moment. And so one mistake where users do not feel like things were appropriately disclosed to them, or where we did something that wasn't above board, would be incredibly damaging to our business.

[00:14:20.978] Kent Bye: So I guess one of the things that I was looking at as well is the interchange that happened between Oculus and Al Franken. And he was asking about recording of voice-over-IP information. And currently they said, well, all the information that's coming over voice-over-IP is cached. However, in the future, we or the developer community may develop new experiences or improve existing products that will require us to store communications. So I think I took from that response that, hey, you know, we're not recording voice-over-IP now, but if in the future we or the developer community develop new experiences to improve existing products, the way that the privacy policy is written is that we could turn that switch on at any moment. And I think that's the fear: the way that the privacy policy is written, there's no obligation for you to disclose that now all of a sudden voice-over-IP communications are being recorded.

[00:15:13.175] Jenny Hall: So I think that's actually a really great example of why our privacy policy is written the way it is, specifically with respect to audio. So at the time we responded to that letter from Al Franken, we had some minimal voice services in our product. We still adhere to the commitments that we made in the letter in response to Al Franken, where if Max and I are interacting in an Oculus room, Oculus does not store audio information except for the temporary caching that's required to enable that communication between Max and myself. But since then, we've developed some new products based on what we know that users want. So for example, we have a new audio command feature that allows people to have simple commands to their devices and launch software. So I can say, hey, Oculus, launch Robo Recall. These are voluntary services. People have to affirmatively interact with them in order to turn them on. And in those contexts, when a user has turned them on, we only listen for a small period of time. And we have to record that information transmitted to our servers in order to process that scan. We also will store sample splits in a de-identified format in order to improve our learning algorithms. So that if somebody is saying, launch Robo Recall, and we're actually launching Robo Apocalypse, then we can go back and fix things to make sure that they're working correctly. So I think this is a good example of why we have to be a little bit flexible in our privacy policy, but also illustrates how we're very thoughtful about privacy. Max mentioned the Privacy Cross-Functional Review Team when he was beginning to chat with you. And I think that's a good example of how the Privacy Cross-Functional Review Team added value and thought about the privacy of people on our platform when we were developing new products. So this team, as Max mentioned, it is comprised of legal, policy, program management, product teams from the Oculus side. We sit down with the product team and we understand the product and we start building in privacy by design. So we think about things like, do we need to store the data? Do we need to collect it at all? If we have to bring it up to our servers, do we need to keep it in identified format? Can we provide better user educational experiences? Can we provide users with controls? Are we appropriately securing this data? So, you know, the privacy protections that I talked about with respect to the voice command were effectuated as a result of this cross-functional review. And I think, hopefully, illustrate how we're being super thoughtful about this.

[00:17:56.675] Kent Bye: So I think another area that I have a lot of concern around is biometric data. That could be anything from eye tracking and facial movements to, eventually, in the next two to five to ten years, galvanic skin response, EMG, EEG, and ECG: all of this is data that is on the technological roadmap for where I see VR going. Maybe in the short term it's eye tracking and facial movements with sort of stress sensors that are built into the head-mounted displays, which was part of Oculus Research's work to be able to get more facial expressions. And so I think that in the short term, we don't have as much of that biometric data now, but physical movements are sort of the first indication of that. And so are you considering like the movement of an eye to be physical movement, or the movement of your face to be physical movement? Because the way that the privacy policy is written now, you could say physical movement is encapsulating some of that biometric data.

[00:18:54.354] Max Cohen: So as we design products that may include some of those technologies, those are the questions that the privacy XFN team will have to wrestle with. I don't have answers to those yet because we don't have products that are shipping with any technologies like that. But these are the types of things that we are chatting about internally. I can say one thing. There's a piece of data, which is your height. I can tell you kind of how we handle that right now, which might be illustrative about how we think about these types of problems. So for the Rift to operate properly, we need to have a good sense of how tall you are so that everything is rendered correctly on screen. We actually just store that data on the person's computer. It's on the client side. It's not transmitted back to our servers. Because by doing that, we're able to make the system function, but we are not storing every individual's height. So that's one example as we think about this type of information, the way Oculus has treated this in the past and how we intend to think about these types of things in the future.

[00:19:49.915] Kent Bye: Well, I think the concern is with the third party doctrine. And maybe, Jenny, you can speak a little bit to this specifically in terms of how that third party doctrine is related. My understanding of the third party doctrine is that any time an individual lets a third party, say, Facebook or Oculus, record data, then there's no reasonable expectation of that data to remain private, which means that if the government comes to Facebook without a search warrant and says, hey, we want all of this, eye tracking data, we want the emotional data, we want the facial movements of this individual, then as long as there's the legal jurisdiction, then there's no reasonable expectation for that information to remain private. So there's kind of like this relationship that I see that the more that we have these opportunities to record and track and store this biometric data, then in the long term, the third party doctrine says there's no reasonable expectation of that to remain private and that reduces our Fourth Amendment protections to privacy.

[00:20:45.592] Jenny Hall: So we're totally with you on this one. Oculus and Facebook are about connecting people and fostering authentic communication. We certainly do not want to be the impetus for any sort of chilling effect that would hamper those kinds of authentic communications. So we think about this on the front end and the back end. On the front end, we think about this through some of the privacy protections that we have talked about previously. Do we need to collect this information? If we collect it, do we need to store it in an identifiable format? And then on the back end, we have just a really amazing team of lawyers that makes sure that we are cooperating with law enforcement, but we are fighting hard against overbroad warrants and requests that are not authorized. So I think on the front end and the back end, we are really trying to stay, we're staying consistent with, I think, the sentiment that you feel, which is, we don't want to hamper authentic communications by, you know, reducing people's reasonable expectations of privacy.

[00:21:47.791] Kent Bye: Well, I think I guess the question for you, Jenny, is legally, is the movement of the eyes or the movement of the face, like facial movement, considered physical movements?

[00:22:01.402] Jenny Hall: So, you know, we haven't been considering that in terms of actually thinking about that in connection with our privacy policy or how we would treat it, because it is pretty far out in the future. I will say we're starting to have internal discussions. I'm going up to our Seattle facility in the not-too-distant future to have discussions with the team as we're building out new technology. But that's not a question that I could answer for you here today. I can say that we want to be super transparent with people that use our platform. Our privacy policy is not the only place where we provide transparency. For example, when we launched new social features, we didn't necessarily expect people using our services to go refer back to the privacy policy. We announced privacy settings. We asked people to review and change their privacy settings or confirm privacy settings when we launched the ability for you to link your Facebook and Oculus accounts. We provided prominent in-product notice about how that worked and the implications because it makes more sense there than in the privacy policy. It makes more sense to do it in the product when the person is actually interacting with the feature that you're talking about. So we have a layered approach to privacy and we're committed to being transparent in the privacy policy in our product. Anything we can do to make sure our users understand what we're doing.

[00:23:27.129] Kent Bye: Right. Yeah, I guess the thing that I would say is that, you know, if there are new products that are coming out within the next two or three years, I would imagine that this privacy policy that's being released would be accounting for that. Or let's say if there's eye tracking in the next version of the Oculus headset, Santa Cruz or whatever, it hasn't been announced or anything, but I can imagine something within the next two to five years that eye tracking is going to be a part of the technology that's being delivered. Illinois, for example, has passed laws that have said, okay, there's certain biometric identifiers that include retina, iris scans, things like that, that, you know, eye tracking technology would be able to presumably take pictures of our eyes. Then that's another layer of sort of biometric privacy that if this is the privacy policy that is being released now, you know, I kind of want to see like, well, what is the plan for the future over the next two to five years? And at this point, I don't see much of a plan at all, or I guess what I'm hearing from you is that you're going to come back and like update this privacy policy once you actually have the physical technologies.

[00:24:28.736] Jenny Hall: Yes, absolutely. When we cross that bridge, we will look through our disclosures and make sure it's totally transparent to people how we're using information. We don't want to be seen as hiding the ball. We're super proud of our privacy policy. I think we're the only people in the industry that have a VR specific privacy policy, which I hope is evidence of the fact that we really care and want to make people really aware of the information that we need to collect in order to make this product function.

[00:24:56.697] Max Cohen: And that's part of the Privacy XFN process. One of the tools is the ability to update the policy. But as Jenny said, we don't have to be reliant on that. So we can provide disclosures right in the UI, in companion apps. And so we're going to use everything at our disposal to be as transparent as possible so people don't get surprised.

[00:25:15.427] Kent Bye: Cool. There's at least two very important questions that I want to get in here before we run out of time. One is that there seems to be a new passage in here. It says, we collect information about the people, content, and experiences you connect to and how you interact with them across our Services, which this is the first time that I've seen that you're basically saying, OK, whatever you're doing and looking at within these VR experiences, we're going to now pay attention to what you're looking at and how you're interacting with different aspects of an experience, which as written is pretty broad in terms of like, you know, what I'm looking at, what I'm paying attention to. So maybe you could unpack that a little bit in terms of what your intention there was to be able to correlate what people are doing and how they're interacting with experiences and then what you're doing with that data.

[00:26:03.117] Jenny Hall: Absolutely. So I think the first example here is with our social experiences, right? So if Max and I are connected, if we're friends on Oculus, then we, Oculus, store data in order to make that connection and allow us to have that connection across different experiences. So, for example, if Max and I are in a party, Oculus has that data. And if we say, oh, we want to launch into Antar Wars together, Oculus can take that data and the information about the connection between Max and myself and launch us into that application. Another example here is content, right? We have thousands of applications in our store and it's oftentimes difficult for people to find apps that they connect with and that resonate with them. So we can do things like understand the types of applications you're typically interacting with. So if you're somebody who's interacting with sports games, we can surface those to you as a priority and not maybe a first person shooter that might not be interesting to you. We also have opportunities for you to designate interests for yourself that can work in the same way.

[00:27:20.493] Kent Bye: And the other question that I had that seemed like it was new within this privacy policy is the correlation of the abuse reports, the code of conduct. And like, if you violate the terms of service, then people will be able to submit an abuse report that may contain video of you. And so I'm curious, like what that means in terms of like these abuse reports. And it sounds like if I do something that violates the terms of service code of conduct, there may be a recording of that, that people are going to be submitting for review.

[00:27:50.628] Jenny Hall: So this already happens today. My understanding, and Max may be able to provide more color on this, is people can record things on existing technology that's not provided by Oculus, and they send us videos of abuse happening in VR. And we want to be really thoughtful about how we enforce upon abuse reports. If we don't have evidence of abuse, then that becomes an ability to abuse people and troll people in its own right. I could submit a bunch of abuse reports against Max if I don't like Max and hopefully get him banned from the service.

[00:28:24.781] Max Cohen: I did have friends that got me kicked off the turntable.fm stage every time I got up because they kept on doing that. So this is actually a real-life thing that happened to me.

[00:28:32.827] Jenny Hall: I mean, you probably deserved it. The music wasn't very good. But so that's the type of experience that we're talking about there. We think it's really important. I'm sure, you know, you're a VR aficionado. You've been watching the news and you've seen reports of women having bad experiences in VR. And if that continues to happen, we're not going to be able to create a thriving VR community where people feel welcome. So that's what we're talking about there.

[00:29:00.681] Max Cohen: This isn't surreptitious, persistent video recording that Oculus is doing and using to enforce. These are user-generated abuse reports that then go to a team that can review them, so that we can accurately be an arbiter of whether or not abuse happened. And the reality is that we want to be very responsive to this and do it in a way where we take these reports seriously. And by having video and audio recordings of the abuse, if it took place, that is really helpful for us to take action quickly and make sure that VR is safe for as many people as possible.

[00:29:34.386] Kent Bye: I see. So it's self-recording. And then how do you prevent someone from fabricating a false report? Because, you know, in this day and age, you can create basically any video you want and submit it. So I guess you're not doing anything on your end, but this is all up to people to record on their end. And then they could potentially edit it if they wanted to.

[00:29:52.750] Max Cohen: So this allows us to build tools that we run and that people can use to capture that video. So, again, it would still be initiated as part of the abuse prevention process. But this isn't only video submitted by someone else that's just being sent in through a contact form. We do have interest in developing tools right in the user interface that people can use to capture this, which makes it much less subject to the type of editing that you're talking about.

[00:30:21.498] Kent Bye: Okay, and it sounds like on May 20th, these new tools that are being launched, will there be all of the data that you record that is stored? Because I think at this point, it's been a little bit of like a not knowing what's been recorded. And so it's a bit of like assuming the worst until we actually see what's recorded. So I'm guessing from everything, from the physical movements and everything else? Or is that de-anonymized to identity? So there's things that you're recording of me, but isn't it tied to my identity? So I guess everything that is tied to my identity, I'm going to have access to with these privacy tools. Is that correct?

[00:30:53.276] Jenny Hall: So what you've, you've kind of outlined one of the many reasons why this new access tool is great for privacy. You know, I think as you've seen with our privacy policy, we always deal with the challenge of making the privacy policy comprehensive, but also readable, where we don't want to draft a 100-page privacy policy that nobody will be able to read. So we focus on the categories that are most important to people. The access tool will certainly supplement that and allow you to access data that's associated with your account. I will say there are a couple of categories that won't be included in the access tool. One of them you've hit the nail on the head with is the movement information because we de-identify that. We've done de-identification for privacy-protected purposes in there. And then there is information that is stored on the device locally, like Max mentioned, height earlier. We don't actually collect that. We don't need to collect it. So we don't have it about you and can't deliver it to you in the tool. There will also be some information that we can't provide for legal or safety or trade secret reasons. So, for example, like payment information for security reasons, we can't provide that in our access tool. But I think you're going to be pleased to see the amount of information that we're putting in these tools. And it's really going to unpack a lot of the things in the privacy policy.

[00:32:21.085] Kent Bye: I guess one of the things I also saw in this new privacy policy is that you have this ability to provide third-party developers with information about me now. So I'm assuming that with this new privacy tool, I'll have all the information that you have on me, and some of that information you're now going to be providing to third-party developers. Maybe you could elaborate on what that information is that you're going to be providing.

[00:32:43.531] Jenny Hall: So we provide a variety of tools that allow developers to get data in order to make their products function. We have APIs; of course, developers need to understand where you're positioned in order to deliver you the content. And we have some social APIs that give developers information in order to build social experiences on their end as well. And we have a robust system of protections surrounding the information that we provide to developers. On the front end, we have an app review process where our team scans apps to make sure there are no security vulnerabilities that could impact user privacy. We've rejected a number of apps from the Oculus Store because we thought they would create these vulnerabilities. We also have contractual protections in place with our developers to make sure they're using data appropriately. We surface developer privacy policies in the Oculus Store so users have the opportunity to review them and make informed decisions about whether they want to interact with the content. And then, after third-party apps are in distribution, we also periodically audit our APIs to make sure we're not seeing any evidence of nefarious behavior.

[00:34:08.216] Max Cohen: And two more things worth flagging. We don't actually provide developers with the email addresses of people who have their application, which is different from a lot of other stores and platforms out there, because we don't want to make it so that developers can market based off email addresses. That's information that we protect. We are also auditing the permissions requested by applications. So when apps are submitted to us, we will push back at times if we feel that a permission being requested on a mobile device isn't actually necessary for that app to run.
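To illustrate the kind of permission audit Cohen describes, here is a rough sketch: compare what an app requests against what its declared features plausibly need, and flag the excess for human review. The manifest format, feature names, and permission names here are all hypothetical.

```python
# Hypothetical feature-to-permission map: which permissions a declared
# feature plausibly justifies. All names are made up for illustration.
NEEDED_BY_FEATURE = {
    "voice_chat": {"RECORD_AUDIO"},
    "social": {"READ_FRIENDS"},
    "media_export": {"WRITE_EXTERNAL_STORAGE"},
}

def audit_permissions(manifest):
    """Return the requested permissions that no declared feature justifies;
    a non-empty result would trigger reviewer pushback."""
    justified = set()
    for feature in manifest.get("features", []):
        justified |= NEEDED_BY_FEATURE.get(feature, set())
    return set(manifest.get("permissions", [])) - justified

# Example: an app declaring only voice chat but also asking for fine
# location would get flagged.
app = {"features": ["voice_chat"],
       "permissions": ["RECORD_AUDIO", "ACCESS_FINE_LOCATION"]}
print(audit_permissions(app))  # -> {'ACCESS_FINE_LOCATION'}
```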

[00:34:41.811] Kent Bye: Great. And finally, I'd be curious to hear from each of you what you think the ultimate potential of virtual reality is.

[00:34:51.576] Max Cohen: Sure, I'll start. One of the reasons I came to Oculus and got involved is that I had a newborn son at the time. And I was thinking about the way I learned when I was growing up, and how, in education, there are always new studies coming out, but the fundamentals haven't really changed that much. I was thinking about how inefficient it is to read a textbook or to look at a static webpage to understand how the pyramids were built in Egypt, or to understand ancient Rome, or to understand what's going on in the world today, where it's hard to build that empathetic connection. And so I would love for my son, when he's in high school, to be using VR as a primary interface to educate himself about culture, about history, and to learn subjects like languages and math much more efficiently. The passion that brought me here was creating this enabling technology that allows developers to figure a lot of these things out. And I will say that my expectations have been surpassed by all of the new and innovative ways that people have been using VR. If you look 5, 10, 15 years into the future, I do think that VR, while not yet an inevitability, is going to be something that all of us will be using all the time.

[00:36:10.372] Jenny Hall: And I will say my example here is not one that I had initially when I started at Oculus; it has evolved over time. I have an 18-year-old son who recently, unforgivably, left me to go off to college. They always do. I know, it's so rude. But I love the potential for VR to enable us to have meaningful connections, even though I'm in California and he's in Colorado.

[00:36:39.505] Kent Bye: Hmm. Great. Yeah. There are lots more little nuggets here that change the nuances. I'm going to be talking about things I noticed, like the passage that was taken out about security not being 100% safe, and other things that I'll be unpacking in the takeaway. But I just want to see if there's anything else left unsaid that you two would like to say.

[00:36:58.755] Jenny Hall: Is there anything else that you want to dive into? I don't want to leave any of your questions unanswered.

[00:37:05.651] Kent Bye: No, I think... well, the other thing that I saw was that there was a passage in the original policy that said security is not 100% safe, that there's no guarantee that we can protect your information. And that was taken out. I still think that's true; I don't think anything that's online can be 100% secure. And I'm just wondering why that caution, that this data can never be 100% safe, was taken out.

[00:37:29.477] Jenny Hall: So we're not representing that data is 100% safe. We are absolutely with you on that one. We do have state-of-the-art security systems here that make every attempt to keep data safe. The reason that we took that out was just because it seemed unnecessary. It seemed like it was scaring people.

[00:37:56.235] Kent Bye: Okay. I mean, to me it feels like it's a reflection of the truth, but at the same time, yeah, I can see how it could do that. The other thing I'd say is that there seems to be more of a connection between Oculus and Facebook now: Oculus is a Facebook company, and some of the information gathered on us is going to be used to market to us and to measure how users respond to different marketing campaigns. So there's this inherent tension between the incentive to take information and data from me that's going to be used for marketing, and that information being, in some sense, unconscious data. For example, biometric data, I think, is a huge thing that in the future is going to provide huge insights into my unconscious. I was not completely satisfied with where things are going to be for how that data is going to be used. There's this underlying tension written into the privacy policy: there are data that are useful for marketing and advertising, and I don't know whether that's going to be somehow tied to my identity in the future. So I guess I'm a little confused as to what information would be useful, or if you have any idea of what information would currently be useful, for advertising or marketing.

[00:39:11.394] Jenny Hall: So we're not currently doing any advertising on the Oculus platform. And we don't have any plans to in the near future. So I don't think we have thought through that.

[00:39:26.695] Kent Bye: Well, I guess it's in the privacy policy. Why is it in the privacy policy, then? The permission to do it is there. So I'm confused: why is it in the privacy policy if you're not doing it? Is it going to be there eventually?

[00:39:40.748] Jenny Hall: So we have provisions in the privacy policy that talk about providing you with marketing and promotional opportunities. If that's what you're referring to, that provision in the privacy policy refers to things like our notifications and our marketing emails that allow us to get you content that we think you might be interested in. So, again, if you are saying that you're interested in sports games, we can send you an email when something new comes out.

[00:40:13.252] Max Cohen: There's a live stream of the Marshmallow, for example. But you're also allowed to opt out of that, so you can choose to turn it off if you want.

[00:40:19.057] Jenny Hall: And for our in-product notification features, we also have very granular opt-outs that allow you to opt out of getting notifications from specific apps or services.

[00:40:30.142] Kent Bye: Yeah, but if you combine that with the provision about collecting information on the people, content, and experiences you connect with and how you interact with them, that's basically saying you're going to look at how I'm interacting with any experience: what I'm paying attention to and what I interact with. That's opening the door to taking what I'm paying attention to, and having that in combination with the marketing provision implies that, all of a sudden, Facebook can have access to anything I'm looking at in a VR experience. If you're collecting information about how people interact with content and experiences, that means you're looking at what I'm looking at in VR, and you're able to make that correlation and potentially tie it into a profile about me in order to advertise to me.

[00:41:17.611] Jenny Hall: So we talked a little bit before about how we try to focus on the most important categories in the privacy policy. And just because provisions are there does not mean that we're necessarily using data in that way. I totally understand your point about how, read together, it could be read to imply that we are doing those things. But this brings us back to the purpose and key tenets of our privacy cross-functional review team. All of us are thinking about people on the Oculus platform and making sure that when we launch new products, we're not doing things that are outside the scope of their expectations, and that we're providing controls and transparency. So I would say we are committed to providing that transparency, and our interests are aligned: if we do something that users don't expect, if we do something that gives people the impression that we don't respect their privacy, they're going to lose trust in us and they're not going to use our services.

[00:42:19.506] Kent Bye: OK, well, that makes sense. And I think that once the tools are made available so that users can actually see what's recorded, that will help alleviate some of the fears that have been there. Because there was a provision in the privacy policy to email you if you wanted to see the data, and I did that last year and never heard a response. So having web-accessible tools for people to automatically look at what's there will, I think, alleviate some of these deeper fears about what is and isn't being recorded. Great. Awesome. Well, thanks again for joining me. I think you addressed all my big questions, and I just want to thank you for taking the time to chat with me today, to really go over your new privacy policy, and to answer all these questions I had about it. I really appreciate your time today. So thank you.

So that was Jenny Hall, who is on the Oculus legal team, as well as Max Cohen, who leads product for the Oculus platform.

I have a number of different takeaways from this interview. First of all, I'm just really happy to have had the opportunity to talk to the people who are directly involved with the privacy policy. The frustrating thing for me has been that I haven't had access to this type of embodied conversation where I can ask these direct questions. As virtual reality and augmented reality move forward, there are going to be all these really complicated and challenging privacy issues. That's something Max admitted to with even the Santa Cruz headset, a mobile headset that may have some augmented reality capabilities as well, which could imply that it may be used in public contexts and not just private ones. VR tends to be a little more private and AR tends to be a little more public, so there are more privacy challenges when it comes to AR, because there's more of a question of what it means to have other people included within various experiences.

But the biggest challenge that I see for virtual reality is biometric data, and I'm not sure I got very many satisfactory answers on how Facebook is currently thinking about it. Max Cohen says there haven't been any technology products released yet, and Jenny said this is something she perceives to be way out in the future. But when I went to GDC, I saw that Qualcomm had a mobile headset with eye tracking technology from Tobii integrated as a supplemental kit, not something built into the headset itself. This is something I've started to see within the last couple of years, and in the next official releases of these headsets, maybe Santa Cruz and maybe the second version of the Oculus Rift, I expect more of these types of technologies to start to be built in. It's something I see within the next one, two, or three years for sure, even being able to capture facial expressions. In some ways, capturing your emotional data is a huge thing that is missing from virtual reality right now: you basically have to push buttons and abstract your emotions, when what we really need is no abstraction at all, just a direct recording.
Now, the challenge is that whenever you start to record facial movements, all of a sudden this is very intimate emotional profile data that is being made available through the technology. As the technology progresses, there are going to be more and more capabilities to capture things like galvanic skin response (GSR), EMG (your muscle movements), EEG (your brainwaves), ECG (your heart rate), eye tracking, and facial movements. As we move forward, more and more of this biometric data is essentially a key to your unconscious. It's like the Rosetta Stone to what you find valuable: what you're paying attention to and what is important to you. That could be a goldmine of data for companies like Facebook or Google, which are basically performance-based marketing companies trying to figure out what our deepest interests and values are so that they can create psychographic profiles to target us with advertising.

Facebook at its core is an advertising company, and Oculus is a subsidiary of Facebook, but in some sense the line between those two companies is becoming more and more blurred over time. Even in the privacy policy, there are indications that that line has essentially disappeared: it's in the privacy policy that any data collected by Oculus can be shared with the other companies in the Facebook family. So even if Oculus isn't doing advertising, some of the data collected by Oculus could be shared with Facebook, and that is given as a right within the privacy policy. I think this is one of the most concerning things moving forward: what data are going to be interesting and important as you're in these virtual reality experiences, and is this biometric data going to be made available in the future?

At this point, the overwhelming takeaway that I got from both Jenny and Max is: hey, the privacy policy is written in an open-ended way to give us flexibility, so that it allows us to build products that we didn't even think about when the privacy policy was first written. The social VR experience of Facebook Spaces is a good example of something that was accounted for in the privacy policy before it existed at all. At the time they responded to Al Franken, Franken was saying, hey, look, your privacy policy is actually saying that you could start recording voice-over-IP data at any moment, so what's the deal? And they said, well, we're not recording anything yet, everything's just cached, but sometime in the future we may be developing products that actually use those provisions of the privacy policy. And that's exactly what has happened, as I learned talking to Max in the course of this interview. At this point, they're not recording anything, but they are going to be doing some type of features for abuse reporting. So they're adding to their terms of service the code of conduct that was, I guess, more of an informal policy for their specific social VR experiences, and they're applying it to all of their social VR experiences, and to all of their experiences in general.
That means that if you have a social VR application, a gathering that is not being controlled by Facebook, all of a sudden their code of conduct against trolling, harassing, and all the other conditions meant to create safe spaces applies, and if you violate those terms of service, it sounds like there's an opportunity for you to get banned. At this point, the only way people have been able to document and provide objective evidence of violations is to record themselves within social VR, capture video, and send it in. And Oculus needs to know that this isn't a forgery in some way. So they will potentially have to start recording various aspects of your interactions, or at least have a running cache, so that if you hit the record button, it captures, say, the last 30 seconds or a minute or five minutes, like the rolling buffer I sketched earlier. It's not necessarily being recorded automatically, but in the case of a specific violation, it could go back and capture specific events, so they can start to have some way of moderating these different spaces.

I guess the open question is: is it possible to have a purely technological solution for creating safe spaces? Or is it always going to be some combination of culture and technology? And what are the other cultural things that need to happen in order to cultivate these safe community spaces? Because at some point, if you only take a technological approach, that means they're going to have to record and capture every interaction and automate artificial intelligence to detect these things, and there are various privacy trade-offs to that approach. There may be other sociological and cultural aspects: how do you build a culture, a community, and rituals? We don't have a really great example of how to strike the perfect balance of culture and technology to create safe online spaces, and as these spaces increase in scale, some of these challenges become even harder. Twitter is probably on the front lines of having dealt with a lot of these issues over the last couple of years, and it's a really challenging problem. So I guess we'll see what their approach is going to be. But at this point, they're not automatically recording your interactions, though it seems like it could be on the technological roadmap, given that they're implementing this code of conduct within their terms of service, as well as perhaps creating more sophisticated tools for reporting abuse.

So overall, their privacy policy is very open-ended, but they're not using the full extent of what it lays out; they're leaving that door open for the future. For me, I want to see the trade-offs between what they're allowing themselves within the privacy policy and what protections they're providing users in terms of disclosure and consent. Anytime they make these changes, I would love to know: hey, now we're going to go out of our way to inform you and to ask for your consent for this.
Now, they're saying that's what they're going to do, but it's not actually written into the privacy policy as a legal obligation they're holding themselves to; they're going above and beyond what the privacy policy dictates. So that's the frustrating thing about reading a privacy policy: it's basically everything they're allowing themselves to do, with very little in terms of their obligations. The GDPR is forcing a lot of obligations on them to live into the things I would love to see. For example, I'd love to have a user interface to see what data are being recorded, because whatever data I give over to Oculus is technically my data, so I want to see what data I'm giving them. Up to this point, the mechanism was to email them: if you want to see your data, email us and we'll send it to you. When I did that, I didn't get a response. The tools weren't there, and it should be much easier: anybody should be able to just log on, go into their privacy center, and see what data are there. That's what they're going to be launching in response to the GDPR, the EU's General Data Protection Regulation, which spells out the data privacy protections that companies should already be implementing; it's really just best practices that they need to live into. Oculus said they were inspired by it to roll this out to all of their users. For citizens of the EU, there's actual legal backing and protection; for people in the United States and around the world, there's an open question as to whether Oculus will actually implement all of these things, provide all the same data protections spelled out in the GDPR, and follow those same rules for everybody.

So I guess we'll wait and see what those tools look like on May 20th. We'll see these new My Privacy Center settings, and we'll be able to see the data that are actually being collected, or at least some of it. There's some data being collected that's de-identified and aggregated, and we won't have access to that. And there's information that's stored locally on your computer that they never see, which is great, because it shows they have these different tiers and layers of privacy: data that's super private that you want to keep on your own machine; physical movements that they're recording and de-identifying at a low sample frequency (which, notably, isn't specified in the privacy policy, so at any moment they could increase it and use that data to train AI algorithms or whatnot); and then the information that they're collecting and connecting to your identity. So the thing I wonder, as things move forward, is: what is the advertising interest of Facebook when it comes to the data being collected, and what is going to happen with the biometric data?
Because biometric data that is connected to your personal identity starts to blur the lines. The information from your body is pretty ephemeral; it's meant to come and go. You have an emotion or a feeling, and it's not meant to be captured and stored forever. The more these technologies start to capture that, store it forever, tie it to our identities, and use it to create psychographic profiles to sell us advertising, the more that technological roadmap leads us into a surveillance type of society, eroding something like a Fourth Amendment right to have private feelings and private thoughts. If anything, I was a little disappointed that Facebook wasn't coming out with a hard line saying: you know what, biometric data should be considered health data; we are not going to capture it; we have no plans to ever even consider capturing it; this is a red line for us, because capturing and recording this information could weaken privacy protections for our users. In the absence of closing that door, they're leaving it open, as if to say, we're going to check it out and think through these various issues.

I would encourage people to listen to episode #517, on biometric data streams and the unknown ethical threshold of predicting and controlling behavior, because the behavioral neuroscientist John Burkhart of iMotions spells out the different biometric data streams that are available and the various implications of each. Up to this point, that data has really only been used for science, research, and medical purposes. Once this data starts to be made available to performance-based marketing companies with a model of surveillance capitalism, you start to have even more of a blurring of the line between predicting behavior and controlling behavior.

But overall, Oculus is telling a story that they are really committed to transparency, that they're going to have more and more of these tools, and that we shouldn't worry that they're going to go to the extremes of what their privacy policy allows. They're trying to build up trust within the community, and anything they do that is perceived as sneaky is going to kill that trust. Anytime they do something that violates users' trust, it becomes even more difficult for them to build that trust in the future. Trust is accumulated slowly over time, but it can be lost in a moment, as Max said.

So that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoy the podcast, then please do consider becoming a member. This is a listener-supported podcast, and I rely upon individual donations in order to continue to bring you this coverage. $5 a month is a great level to help me continue this type of coverage and these types of in-depth conversations. So if you like that and want to see more, then please become a member of my Patreon. You can join today at patreon.com/voicesofvr. Thanks for listening.
