The VR Privacy Summit happened at Stanford University on November 8, 2018 to explore pressing issues around biometric data privacy in virtual reality. Around 50 people from across the VR industry attended, including representatives from nearly all of the major companies, and I was the only journalist present for the event. The gathering was held under the Chatham House Rule, meaning that participants may talk about what was discussed at the meeting without identifying who said it.
I talked with the Electronic Frontier Foundation's senior investigative researcher Dave Maass, who attended the VR Privacy Summit, to learn more about some of the VR-specific biometric data concerns. Maass is a VR enthusiast, and talked about the EFF's VR experience Spot the Surveillance, which had just been released. I talked with Maass about the EFF's perspective on what's happening with privacy on the web, some of their broader privacy initiatives, how the EFF and other consumer privacy nonprofits are still ramping up on the privacy issues in VR, and some of his takeaways from the first VR Privacy Summit.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So on November 8th, 2018 was the VR Privacy Summit that happened at Stanford University. And it was organized by Philip Rosedale and High Fidelity, Jeremy Bailenson at Stanford University, Jessica Outlaw, an independent researcher, as well as myself, and I was involved with some of the logistics. But mostly I've been trying to draw awareness to the larger issue of VR privacy. So back in 2016 is when I first started to hear about privacy in virtual reality as being an issue. I was at the Silicon Valley Virtual Reality Conference and I just started to hear it emerging from different people within the community as a concern. And I started doing more and more interviews about this, and what I found is that it's a very complicated issue when it comes to the types of data that are going to be able to be collected within a virtual reality experience. They are going to be able to provide information about ourselves that is way more intimate than other types of information that we are consenting to give. When you're talking about your body and your unconscious processes, you're talking about a level of your unconscious psyche, which essentially could be like a Rosetta Stone that goes way beyond any type of explicit information that you're giving on the web now. A lot of this information up to this point has been captured under the context of medical information. And so what does it mean to then take this same type of information and provide it to a major technology company whose primary business model could be something along the lines of surveillance capitalism? So this to me was a huge issue and I've been talking about it quite a lot. And I think that there's been a general awareness within the larger VR industry to be able to actually talk about some of these issues. 
And so there was the VR Privacy Summit that happened at Stanford with a number of people from all across the industry, and it was a private, invite-only event. And I was actually the only journalist that was there, but in the capacity of not only being a journalist and covering this issue, but also just as a community advocate speaking on behalf of the individual privacy interests of the consumer. In some ways, I've been on the bleeding edge of this topic, and the other entities and organizations such as the ACLU, EFF, Center for Media Justice, the CDT, or EPIC have not necessarily had either virtual or augmented reality on their radar. But I was happy to see that Dave Maass is a VR enthusiast that is from the EFF, and he was actually there to be able to start to talk about how the EFF is just now starting to look at some of these issues. And so I had a chance to talk to Dave, who was at the VR Privacy Summit, speaking about the larger context of what's happening with privacy on the web and with technology companies in general. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Dave Maass happened on Thursday, November 8th, 2018 at the VR Privacy Summit at Stanford University in Palo Alto, California. So with that, let's go ahead and dive right in.
[00:02:56.013] Dave Maass: My name is Dave Maass, I'm a senior investigative researcher at the Electronic Frontier Foundation, and I tend to describe myself as a muckraker and a noisemaker. I often investigate the police and tech companies and then make noise about it until something is done about it. The reason I'm here, though, is that a few years ago I started getting interested in VR and pushing EFF to start looking at this emerging technology. And that's kind of gone in two directions for us. One is looking at the privacy implications of VR. And the other is looking at how VR could be an activism tool for us. So in addition to coming here, it's just sort of serendipitous that there's this event here that's really great to talk about from a privacy perspective. But I'm also here as a kind of VR creator because on Monday, EFF launched its VR project for the first time called Spot the Surveillance, which is nowhere near as sophisticated as most of the things being offered by people here. But we're very proud of it. And it's a very nice little training tool for people.
[00:03:47.425] Kent Bye: Maybe you could describe what happens in the experience.
[00:03:49.526] Dave Maass: So Spot the Surveillance VR, it immerses you in a street scene in the Western Addition in San Francisco. It's actually like a photo, a 360 degree photo that's not CGI or anything like that. And it's not video. It's just a still photo. And it's in front of a police department where a police stop is occurring, where two cops are speaking to a young black man. And you look around the environment for various pieces of surveillance technology, and as you see one, there's a pop-up that shows you a closer-up picture of it, and gives you information about what it does, how it's used, and just various basic things you need to know about it. And for us, there's a few reasons for this. It's not like a fun game. This is not something that's like, ooh, I scored a bunch of points, and I'm really, really excited about it. It's more of a tool, a different way to ingest the information that we have on our website, where instead of you reading it and trying to piece together in your head how this stuff on our website or this stuff in our legal briefs affects your daily life, you're actually in the environment and can look around and see like, oh, you're talking about license plate readers. This is what it looks like and this is where I would find it. And, you know, the utility is not just like training people to understand the technology, but also if they are in a situation like a police stop or they're a witness to a police interaction, those go by very quickly. There's a lot of adrenaline involved, and sometimes you're not noticing the things in the environment. And so by being able to slow it down, freeze it in time, that maybe gives you the opportunity to practice those sorts of things. Noticing whether the cop has a body camera, noticing whether there's a pan, tilt, zoom camera above you. And that's kind of the utility as well.
[00:05:22.400] Kent Bye: OK, yeah. So today we're here at Stanford for the VR Privacy Summit. And I know that privacy online is a vast issue. It's not like it's a new thing. It's been happening for a long time. The metaphor that I use is surveillance capitalism, which is that we're essentially mortgaging our privacy in order to get access to free services. And that most people are OK with that. And they're OK with being able to not have to worry about paying for things. But yet, I think that we're, just over the last year or so, we're starting to see some of the unintended consequences or the externalities or the costs of what that means to be able to have this business model based upon collecting all sorts of data and then using it for different advertising purposes. And so I'm just curious if you could say from your perspective at EFF or how you think about this larger issue of privacy on the web.
[00:06:10.595] Dave Maass: I would disagree with the idea that people are okay with it as much as they are okay with trying not to think about it. People who smoke cigarettes aren't okay with getting cancer. They just don't think about getting cancer as they're doing the smoking. They compartmentalize that elsewhere. I think that's what's happening with a lot of these websites. They know there's something going on, but if they just don't look at it too closely, they can just pretend it's not there. And until something catches up with them, then, oh, well, now I've had a data breach. Now I've had my credit card compromised. Then that's when it's like, oh, I could have taken these steps in the past, but now it's coming and getting me. And I do hope that people do start paying more attention to privacy. I think people are. I think since, you know, probably around the era of Edward Snowden, it really started taking off. And then I think with Cambridge Analytica, it got its second giant boost. But just sort of more generally, I guess, to answer your question about how we're looking at it, it's a very complex area because lawmakers get things wrong so badly. And when Congress does something wrong, it lasts for decades and decades and decades. I think the good example of that is the Computer Fraud and Abuse Act, badly designed in the 1980s, still badly designed in 2018, still causing problems for all sorts of people. Same with the Digital Millennium Copyright Act, same with the Electronic Communications Privacy Act. These are all things that are in dire need of updating. So we want to make sure that if legislation is going to get passed to address issues of privacy, that they are designed looking forward to the future and they're not full of badly written pieces of policy that didn't go through a good debate. We have on our website various things that we think should go into a privacy law. 
A lot of it is related to having a right to know what data is being collected, who it's being shared with, who has access to it, having the right to export your data or to take it with you. These are sort of the general principles that we're looking at. There's a lot more, I just don't have our own website in front of me right now while talking to you to go down the long, like, you know, 3,000-word blog post we have on this. But, you know, we do a lot, and we've tended to address this issue often by designing technological tools. Privacy Badger is probably our most common one that people use, where you have a plugin that goes into your browser and blocks third-party trackers and lets you know who's tracking you on a particular website. I don't know how that really transfers over into VR. Can you build a Privacy Badger for VR? That's, like, a question that we're going to probably have to face. Not probably today, because VR is not in quite mass adoption yet, but we will probably have to face that in the future. We really try to figure out how to deal with mobile first, you know.
[00:08:42.716] Kent Bye: Yeah, so for me, there's a number of different lenses that I try to make sense of this through, and I use Lawrence Lessig's framework for the different knobs that you can turn to be able to have collective change on a society. So first of all, there's technology and the ways that technology changes communication, but also just the different ways that you can build these types of technological tools to be able to inform people so that they can start to use them and have a little bit more transparency. Then there's the legal domain and the different lawsuits that could happen if there's some sort of violation, bringing about lawsuits that actually go through the courts to be able to challenge these things. And then there's the culture, which is essentially just educating people so that they can have the culture and the values that's going to drive the other aspects of creating markets or creating technology or creating laws. And then finally, there's the economic markets. And so either you're making an economic solution or there's market dynamics that are playing into all of this. At this point, surveillance capitalism is basically the model by which a lot of the major companies are operating. And so are there alternative models, whether it's the Basic Attention Token or subscription models or other ways you pay for access? So I'm just curious, from the EFF's perspective, whether you are addressing different issues of privacy across the technological, legal, cultural, and economic or market dimensions?
[00:09:58.193] Dave Maass: I think the first three tend to be more within our mission statement. We are a nonprofit and we have a mission that's very clearly defined. We're not an industry association that is trying to come up with new ways to make money or not make money, so it's kind of a bit difficult. But we are concerned with this idea you described as surveillance capitalism, however you want to put it. We don't like the idea. With things related to privacy, like broadband privacy or online privacy, we want people to be able to opt in and opt out of it, but we don't want people to be penalized because they opted out. We don't want to see them charged more because they opted out, because paying somebody to opt in is almost the same thing as charging them more to opt out. That is definitely a concern. I think what I often talk about is that privacy is a feature that you can sell. People want privacy, people want security. And with a lot of these other things that people are marketing, saying you need this feature, you need this feature, most of the time people don't need that feature. But people actually do need privacy and people do need security, and you can give that to them and market it in a way that people actually pay for it, if that's how your model works. You know, I'm just in general concerned with this idea that, you know, Facebook claims that it's OK that it's collecting all of our data because it's not actually selling our personal information to advertisers and that advertisers will never know who you are because they're only buying the ability to put stuff in front of a certain set of filters. I mean, to me, they're taking something from us and putting it together at this giant aggregated scale, and they're still turning us into a commodity. And I do find that problematic. 
I mean, the idea that if you haven't bought something, then maybe you're the product is like, I think, a fair way to say it. And I think that that is how a lot of companies look at us. But I don't think that's sustainable in the long term. And I don't necessarily know that we're going to be that valuable to them in the long term. The more data that's being collected on us, the less valuable it necessarily becomes.
[00:11:55.889] Kent Bye: Well, I totally agree with all of that. But at the same time, when I talk to people like Vint Cerf from Google and say, hey, maybe you should stop doing this surveillance capitalism thing, his response is like, well, how do you propose giving universal access to all human knowledge to everyone in the world? And that's sort of the mission statement. And the story is that there's enough people that are buying ads that it allows for these services to be free. And I don't have an answer for that, to say how you operate at that scale, swap out this existing business model, and do something else. I mean, I went to the Decentralized Web Summit. There's some promising philosophical ideas. But pragmatically, it's not ready for primetime. It's not ready to swap out and to create this whole decentralized blockchain future where everybody's on the blockchain. And maybe there's more of an equal exchange where you can sell your data somehow through a blockchain mechanism. But what do you propose? What's the alternative?
[00:12:44.175] Dave Maass: I should say that my role at EFF is usually just filing lots of FOIA requests and giving people a hard time. I'm not an economist and I'm not a venture capitalist. If I was, I would be wealthy and I probably wouldn't be here right now talking to you and fighting the good fight. I will say that that rhetoric is definitely kind of troubling. Certainly almost all these tech companies have found ways to give all their employees free food every day. They found ways to find money to spread it far and wide to members of Congress and to local legislatures. They have found ways to give lots of free things to people in the world without necessarily trading people's privacy on it. I think that it's weird for me to hear someone, and I'm not going to call out Vint Cerf personally because I don't know him and I haven't had this discussion with him. But from a community that prides itself on being able to nerd its way out of anything and that everything is disruptible, surely its own assumptions about how you give things to people free in the world is also disruptible. So why not actually start thinking about that, how we can do all of these things? The one sector that I don't expect to hear something can't be done from is the tech community. That's how I would respond to them. You're saying, how are we going to get stuff for free? Figure it out. You figured out everything else. You think everything is doable. You can fix transportation. You can fix housing and hotels. Whatever you think you can fix, let's start looking at fixing this. On the grand scheme of things, providing education and materials to people in the world is a doable, fixable task.
[00:14:16.923] Kent Bye: So what, for you, are some of the either biggest takeaways that you're taking away from the VR Privacy Summit, or deep insights that you had from the day?
[00:14:24.309] Dave Maass: So a lot of the things that were discussed in the first half of the day were things that I started raising about two or three years ago, and I didn't have the language to describe what it was. And at the time, nobody else at EFF, or really nobody else I knew, had ever tried VR and realized how far it had come, that it wasn't just a Nintendo Virtual Boy anymore. It was actually like real powerful experiences that were cheap enough that people can have. If you have a PlayStation, you can have a PSVR. And so it was good to see that I'm not the only one worrying about these things, that there are people who've given it a lot of thought and who have all seemed to have come to the same conclusions independently. And that's really powerful. The idea that everything it's collecting is a biometric, and that there are way more kinds of biometrics that are being opened up. The privacy implications of VR are not exactly the same one-to-one as just browsing the internet. There are just whole new things that are being opened up. And I think that's a lot of the takeaways. I learned a lot more about the capabilities of what some of these things can do and what some of the problems are over the horizon. I was also really interested that, you know, in the sort of second half, people were proposing solutions. Some of them are just not within the realm of what EFF works on or would have a strong position on one way or another. Some of them were things that I thought that, like, we could really get behind. 
And it was exciting to hear that not coming from me and the ACLU and the Center for Media Justice and CDT and Epic all sitting around a room deciding what we think VR should do, but it was actually from people working in VR saying, like, I work in medicine in VR, I work in advertising in VR, I work for these whatever companies, and we're under Chatham House rules, so I can't say especially who said what, but I can tell you that there was like a wide variety of people all throwing out ideas and thinking these were good ideas and interesting ideas.
[00:16:03.567] Kent Bye: So just to kind of dive into some of the technical bits, what were some of those things that you think that the EFF could get behind?
[00:16:09.837] Dave Maass: One of the things I immediately put on the board, which I thought got a lot of other votes, was that people should opt into sharing their data, not opt out of it, which is just a pretty simple concept. And it really also was very similar to a lot of the other things people had put up, but maybe articulated in a different way. But I think that that is just a really important function, that if you're making people opt out, you're assuming people are going to read through and figure out what they're opting out of. But a lot of people will actually opt into something that they haven't read. Like, I mean, I guess they do. I don't know. I'm not drawing from, like, research here, like scientific research. I'm just going on the sort of general common understanding of how people work. Like, people are more likely to passively do whatever the status is. And if the thing is opt into this, people are going to be like, no, I don't want to opt into it because I don't want to read this whole thing. But the same thing as if you want to opt out of this, people are going to be like, I don't know. I don't want to read what I'm opting out of either. So.
[00:17:04.579] Kent Bye: Yeah, there was a mention of the importance of default settings, of maybe it's 4% of the people that change their default settings. So 96% of the people are not going to change the default settings, and how important those default settings are.
[00:17:15.182] Dave Maass: We heard a really powerful story about the implications of somebody who's using a service for the first time and doesn't know what the default settings are. And I can't go into it much more beyond that without giving away necessarily who it was the story was coming from. But it did really hit hard for me on why that is so important. And I mean, I think in my head, I thought like, yeah, we need to make sure we know what the settings are, or we've often phrased it as like, you shouldn't need to be a wizard to understand how to control Facebook settings. But now thinking about it, it's like, okay, I am going to be paying more attention to what the default settings are. And then that is a very important policy decision that's being made. And certainly Facebook is giving it a lot of thought, and Google's giving a lot of thought what the default settings are, and everybody else in the space should be as well.
[00:17:58.745] Kent Bye: So what's next for you? And as you go back and tell EFF, what do you see as kind of the next steps for privacy and VR, and how the EFF could potentially be involved in that future?
[00:18:10.250] Dave Maass: Well, I have a lot of notes to type up, and it'll take me a few days, mostly from the first half of things. I want people to start looking at how this needs to be integrated into our work on medical privacy right off the bat. Really, though, I think the short term is that we just launched our VR project, and we're demoing it in public for the first time at Aaron Swartz Day International Hackathon this weekend. Everybody, if you're not familiar with Aaron Swartz, go look up his work. He was an amazing human being. You know, it's hard to say. It'll be interesting to see what comes out of this. It's really difficult for us at the moment because there's just so much going on in the world in terms of technology. Ever since the 2016 election, the trajectory of what we could work on has changed dramatically. And this is not necessarily a political thing. This is just under the Obama administration, there were a lot of things that we didn't need to worry about. And so we were able to go and start spreading out and thinking, OK, well, let's look at these issues that we didn't have time to work on before because we don't have to worry about these other issues. Or we're making progress here so we can actually not go get the entire army out to go fight this thing. We can actually have some good solid negotiations with Congress or with the President. Now there's just been all these new fronts opened all over the place. Nobody would have thought that we were gonna have to start talking about whether, you know, you're allowed to burn the flag in America, and nobody, you know, we didn't think we were gonna have to deal with discussions of opening up the libel laws. Like, these are things that suddenly have to draw our concern from other areas. And so if this had been three years ago, I'd have been like, oh, I'm gonna run back, we're gonna start, like, hammering out white papers on VR. 
We're gonna start doing all these things. But I'm not sure that we're gonna be able to, like, jump on it as quickly as I would under other circumstances. I think it's important that we start looking at this more, and I do think we probably have a little bit more time to see how things are developing, because, and you can totally disagree with me on this, but from my perspective VR hasn't had quite as large adoption as quickly as people thought it was going to, and that may change next year with some of the new devices that are coming out. And then we'll see. So if it starts, like, taking off way more, then maybe we can start working on it. I worry, though, that what's going to happen is we're going to have to deal with it reactively. We're going to have to have the violent video game battles all over again because some legislator in Georgia will have a story about a local kid who got, like, too obsessed with whatever the virtual reality Grand Theft Auto is. And then we're going to have to have a really tedious thing, battles that we won over and over and over again, and we're going to have to do that again with VR. But I don't know. I mean, it's hard to say. I wish I had a better answer for you on, like, where exactly to go with this. I think that this is one of the first meetings that I've even heard occurring like this, and I think that's, like, good progress. I mean, I'm just excited to see what kind of document they produce from all the information that came out today. Certainly, I've talked to a lot of people here that hopefully we will connect with again later and maybe see if there's ways to collaborate or, you know, I can help them think through some of the things that they were thinking about doing.
[00:21:06.034] Kent Bye: Great. And finally, what do you think is the ultimate potential of virtual reality and what it might be able to enable?
[00:21:17.821] Dave Maass: The ultimate potential of virtual reality. I mean, I think there will hit a point where there are people who live most of their lives in virtual reality. I think that would be a big area for it, that it becomes not just an edge culture, but an actual full-on recognizable way that people live their lives. That's potentially where it could go. I think more interestingly will be the way that it moves away from games into all other facets of industry, and not even like education. We've heard a lot about education and medicine here; those are like two fields that people keep talking about. But I will be interested when this becomes a thing that every architect uses, like you can't build a building anymore without putting it in VR first and examining it and taking people on tours through that. Where government has VR as part of, not a novelty, like, let's show members of the public and dazzle them with our virtual rendering of what this park is going to look like, but where they can't actually do the park because they have to model it like that and they have to explore it in and out in that kind of way. I think that will be interesting, when VR becomes a more adopted tool just like a drawing table had been. I think that's probably a good potential for it. It's hard to say though, because I hear people saying AR is going to be it, we're going to just hop over VR straight into AR. Which, to be honest, on a policy level I'm really interested in AR, but in a personal area I'm not super interested in AR. I do have several VR systems at home and I spend a lot of time in them. So I am, on a personal level, very interested in VR.
[00:22:50.609] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the immersive community?
[00:22:54.830] Dave Maass: One of the things that I think is interesting about immersive experiences is that it's becoming more than just technology. It's becoming more than just a way that we do things with a headset. I think that you have to look at it in a greater context where you have immersive theater and you have escape rooms. I think this is all the same genre, and it's all meeting the same desires and needs of a 21st century population. And I find that really fascinating, that, you know, we tend to think about it as a technology, but there's the VR, the technology, but like immersiveness is a greater concept that people are looking for in their existence.
[00:23:35.338] Kent Bye: Well, thank you so much. Hey, no worries. Thank you very much. So that was Dave Maass. He's a senior investigative researcher at EFF. So I have a number of different takeaways about this interview. First of all, it was just great to see somebody else from this space of nonprofits that are looking at different privacy issues. He's from the Electronic Frontier Foundation, and there really wasn't a lot of other representation from the ACLU, Center for Media Justice, CDT, or EPIC. The intention of this gathering was to try to get a lot of the major players from the virtual reality industry there to be able to start to talk about these things. And I think that a lot of these nonprofits aren't up to speed as to what's happening. And as we move forward, I hope that some of this initial information that comes out of this can start to pull in these other groups to start to think about these issues. Because privacy in VR is a very, very complicated issue that is bringing in so many different disciplines. I mean, there's a legal dimension, there's medical issues, there's the technological issues. And then there's all of the ways that you could use this data for amazing things for medicine and education and learning. But also, like, what happens if this data gets into the wrong hands? And so a lot of what was happening at the VR Privacy Summit in the afternoon was these workshops where we started to think about the disaster scenarios for what could possibly go wrong with the gathering and capturing of some of this data. So there's a number of different ideas and suggestions of ways to potentially address this issue of privacy within virtual reality. But at the heart of it, I think there's this fundamental tension between the existing business models of a lot of these major tech companies and the interests of privacy. They're pretty much the exact polar opposite interests of each other. 
Until there's a completely new business model that arises, then we're going to continue to see the same behavior. And no matter what happens with either legislation or something like GDPR or some of these privacy interest groups, I think that at the end of the day, until the economic business model issue is solved, we're still going to find this tension of the companies wanting to encroach on this different data as much as they can. At the same time, though, I think that there's a lot of great ideas that are coming out of the VR Privacy Summit. I'm going to be unpacking them over the next number of different interviews. And so the number one topic was this idea of an institutional review board for the privacy policy. So within the medical community, in order to do any medical research, which is dealing with issues of privacy and consent and informed consent with users, they have an institutional review board, similar to how a lot of academic research that involves human beings has to go through this process to make sure that you're not going to be doing things that are going to be unethical in any way. And we don't have anything that's equivalent to that for the context of a privacy policy for these companies. And so there's no accountability, there's no one who's looking over it and saying, hey, is this actually in the interest of users? And it's basically like the company saying, this is what we want to do for the terms of service, where you have to like opt out of doing it. And then actually you can't even use a service unless you opt in. And so that was one of the big things that Dave Maass was saying: what would it mean for us to actually opt in and consent to the data that we're providing? And you'll have that sometimes with using your phone, where you're like, oh, I don't know if I necessarily want to share my location data with Google. 
But then at the same time, when you try to use some of the features on the maps, if you don't have location services turned on, then you can't use that service. So there's a very real connection of, oh, well, actually, this is the benefit that I get from turning on this service. And I think one of the challenges with a lot of this type of data is that companies are just collecting and hoarding as much data as they possibly can. A lot of times they don't even know what they're going to be doing with it. They're just thinking that sometime in the future they could start to use it and harvest it to have some sort of insider benefit. And so they're just gathering all of it. A lot of times they might not even be able to tell the story of the deeper intention of why they're gathering it; they just see that it's there and that it might be useful in the future, and so they record it. But the problem, I think, is this concept of the third-party doctrine, which essentially says that any data that you're giving over to a third party has no reasonable expectation of remaining private. And I think this is one of the things that I tried to speak about a little bit at the VR Privacy Summit, but I'm not quite sure if the implications of that necessarily came through. The way that I would say it is that whenever you do a search on Google, because they're recording that, it essentially means that your search data is no longer reasonably expected to remain private. The ISP companies then went to Congress and said, hey, we have access to someone's navigation history on the web. We should be able to sell that, because Google is able to record it. And so because it's being recorded, there's no reasonable expectation for it to remain private.
And so as we start to collect more and more of this data, the third-party doctrine actually changes the legal definition of what is expected to be private. We're basically collectively saying as a society that we're kind of okay with companies taking all of our navigation history on the web and selling it to the highest bidder. And once we start to allow companies to record and save this biometric data, that actually changes the legal baseline for information that should arguably be classified as medical information, whether it's our eye tracking data, our galvanic skin response, or our heart rate variability. All of these are very intimate biometric data markers. One of the discussions that was happening at the VR Privacy Summit was, should we let people record this at all? Or should there be a limitation on how long they record it? And I think there are a lot of amazing applications, and I'll be diving into some of the medical applications this is going to be used for. But I was a little disappointed that people wanted to capture it and use it, because I'm not sure that people were fully aware of what that would mean for the third-party doctrine, basically saying that we no longer have a reasonable expectation that this information remains private at all. So I was definitely taking some of the most idealistic positions in the room on some of these issues and topics. And I am looking forward, as this moves forward, to pulling in more of the legal resources from the ACLU to talk about some of these deeper legal issues, because I think the legal issues are an open question.
I mean, one of the things that I actually told the people in the room is that if we want to fix this, we have to address it at the legal level and change the legal definitions around the third-party doctrine, so that if we want to capture this biometric data, then we'd better pass some laws saying that this is private information and that we don't want it shared in specific contexts. And so there's going to be a complete reevaluation of what it means to have this biometric data available and what companies should ethically be responsible for doing or not doing with it. There was certainly consensus within the VR Privacy Summit that we definitely should limit it, but there weren't any definitions in terms of what that even means. Whether that means you do real-time processing on it, or you save it for a day, or you aggregate it over a rolling window so you have some memory but the raw data actually goes away. So to me, the biometric data issue is probably the thing that concerns me the most. I'm thinking about it philosophically, and from a pragmatic perspective, a lot of what I'm talking about is still going to be two or five years down the road in terms of what's going to be possible. But as the science continues down that path, I think we're going to be continually surprised by what's going to be possible with this type of biometric data and what it's going to be able to tell about us. Another thing that Dave had said within the context of this interview is that essentially, because VR isn't at this mass level yet, we actually have more time to start to figure this stuff out. And I didn't disagree with him in the moment, because he was right in the sense that VR hasn't taken off yet, but I would disagree with the sentiment that we have all the time in the world to figure this out.
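To make that rolling-window idea concrete, here's a minimal sketch in Python (the class and method names are hypothetical, not from any actual VR SDK) of how a client could retain raw biometric samples only for a short window while still exposing aggregate statistics, so the raw data genuinely goes away:

```python
from collections import deque
from typing import Optional


class RollingBiometricWindow:
    """Retain raw biometric samples (e.g. heart-rate readings) only for a
    fixed time window, so older raw data is discarded while recent
    aggregates remain available."""

    def __init__(self, window_seconds: float):
        self.window_seconds = window_seconds
        self._samples = deque()  # (timestamp, value) pairs, oldest first

    def add(self, timestamp: float, value: float) -> None:
        """Record a new sample and expire anything outside the window."""
        self._samples.append((timestamp, value))
        cutoff = timestamp - self.window_seconds
        while self._samples and self._samples[0][0] < cutoff:
            self._samples.popleft()  # raw sample is dropped, not archived

    def mean(self) -> Optional[float]:
        """Aggregate over whatever raw samples are still retained."""
        if not self._samples:
            return None
        return sum(v for _, v in self._samples) / len(self._samples)
```

With a 10-second window, a sample recorded at t=0 is dropped as soon as a sample arrives at t=12, so only the recent readings feed the average: the service keeps "some memory" without holding on to the full raw stream.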
Because in some ways, if we do that, then we're definitely going to move into a more reactive position on it. And if there's a time to really start to look at this and address it, it's right now, before a lot of these privacy policies are widely distributed amongst all the major tech companies. I think Oculus and Facebook definitely have the most liberal policy in terms of allowing them to gather biometric data. It's completely within their privacy policy now for them to capture biometric data and to correlate that to whatever's happening within the context of an experience. And so they can presumably already take what you're looking at within an experience and then extrapolate some sort of information in terms of whether or not you've seen or paid attention to different things. And if there is going to be some sort of body that's looking at this, or some sort of legislation along the lines of HIPAA, then we're going to need a lot more of the consumer privacy advocacy organizations to take virtual and augmented reality seriously and to look deeply into some of these issues. Because these organizations, from the ACLU to the EFF, Center for Media Justice, CDT, and EPIC, are all looking at the broader privacy implications of what's going to be in the best interest of consumers. And I think it would be good to have their voices participate in this larger discussion as it moves forward, because this first VR Privacy Summit was the first meeting, but it's certainly not the last. There were a lot of great ideas generated, but I think as we move forward, it's going to be a process of trying to broaden the coalition, getting even more of the VR industry there, and starting to really be educated and think about some of these broader issues.
So that's all that I have for today, and I'm going to be diving into more coverage from the VR Privacy Summit over the next number of episodes. I talked to eight people total across four interviews. There were about 50 people there altogether, so I just tried to take a sampling of some of the big highlights that came out of the conference, and I'll be diving into those in the next number of episodes. And I just wanted to send a shout out to my Patreon supporters, because I wouldn't be able to do this type of journalism and this type of coverage without your support. I was the only journalist at this VR Privacy Summit. And one of the things that I took away from the Privacy Summit is that the coverage that I've been doing on this topic has, I think, made a huge difference in setting the context to be able to have a meeting like this, and then to have it grow even further beyond this and gain more momentum moving forward. And I feel optimistic that this is a great first step towards opening that dialogue up, because it's an interdisciplinary conversation with a lot of people involved. So the support that you're giving here to the Voices of VR podcast is making a huge difference. And I would invite anybody who's not already a supporter to become a member of my Patreon and to support the work that I'm doing here, as well as this type of coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.