#493: Is Virtual Reality the Most Powerful Surveillance Technology or Last Bastion of Privacy?

Virtual Reality has the potential to enable so many amazing utopian futures, but it also has the potential to become one of the most intimate surveillance technologies that could create a Big Brother dystopia of political and economic control. But if privacy considerations are built into virtual reality technologies from the beginning, then Accomplice investor Sarah Downey argues that the metaverse could actually be one of the last bastions of privacy that we have in our lives.

Downey has a legal background in privacy and previously worked at a privacy start-up. She's currently investing in virtual reality technologies with privacy implications, such as Neurable, a brain-computer interface company whose technology can detect user intent.

Privacy Facilitates Authentic Free Speech

Downey believes that privacy is a fundamental requirement for freedom of expression. To realize the full potential of the First Amendment, which guarantees freedom of speech in the United States, you need the protections of the Fourth Amendment, which protects a reasonable expectation of privacy. She makes the point that our digital footprints are starting to bleed into our real lives, and that this will lead to less authentic interactions in the real world.

The advertising business models of Google and Facebook rely upon creating a unified profile that connects your online behavior to your actual identity, which creates a persistent digital footprint that follows you wherever you go. As search histories have been subpoenaed in child custody cases and social media posts have gotten people fired, it has created an online environment where shared content is considered public and permanent, not ephemeral.

This has a chilling effect that creates what Downey calls a "fraudulent shell that limits authenticity." The erosion of a truly private context has created a stale and boring environment that has limited her authentic expression on sites like Facebook, and she warns that our unified digital footprints will follow us into the real world as augmented reality technologies with facial recognition become widespread. As we start to lose the feeling of anonymity in public spaces, we'll all be living out "Nosedive," the first episode of season three of Black Mirror, where every human interaction is rated on a five-star scale.

Weakening the Fourth Amendment by Sharing Private Data

Downey also argues that the Fourth Amendment is based upon a culturally reasonable expectation of privacy, which means that our collective use of mobile and web technologies has had a very real legal effect on our constitutional rights. There's a subjective interpretation of the law that's constantly evolving as we use technology to share the more intimate parts of our lives. If we feel confident enough to share something with a third-party company, then it's not really legally private, in the sense that it can be subpoenaed and used in a court of law.

Creating Unified Super Profiles

There are different classes of private information, so companies like Google and Facebook are able to collect massive behavioral histories of individuals as long as they don't share access to the personally identifiable information. They can anonymize and aggregate the collective behavioral information provided to their advertising customers, which enables a business model based upon detailed surveillance of all of our online behavior.

Google recently and quietly created an opt-in that links all of your historical web browsing from DoubleClick to your user identity, and Facebook buys external financial and mortgage data to merge with your web browsing and social media interactions, creating a super profile that contains demographic data you never explicitly shared.

The Intimate Data We’ll Share with VR

Now, with virtual reality technologies, Oculus' privacy policy says the company can capture and record "information about your physical movements and dimensions when you use a virtual reality headset." This starts with head- and hand-tracking information, but if that data is correlated with the VR experience, then the content of your gaze and what you're looking at could already start to be recorded, stored forever, and combined with other public or commercially available information in your Facebook super profile.

As of right now, none of the information gathered by virtual reality technologies has been definitively classified as "personally identifiable information," which enables VR hardware companies and application developers to capture and store whatever they like. But once there are eye-tracking technologies with more sophisticated facial detection, or eventually brain-computer interfaces, then VR technology will be able to capture and store extremely intimate data: facial expressions, eye movements, eye gaze, gait, hand and head movements, engagement, speech patterns, emotional states, brainwaves from EEG sensors, as well as attention, interest, intent, and eventually, potentially, even our thoughts.

VR Biometric Data is Not Personally Identifiable (Yet)

There are existing biometric identifiers, information gathered from your body that can personally identify you, including facial features, fingerprints, hand geometry, retina, iris, gait, signature, vein patterns, DNA, voice, and typing rhythm.

Right now, your gait, your voice, or your retina or iris as captured by an eye-tracking camera could be biometric data that proves to be personally identifiable. It's also likely that the combination of other factors, like your body, hand, and head movements taken together, may form a unique kinematic fingerprint that could also personally identify you, given the proper machine learning algorithm. This could mean that data being stored anonymously today could eventually be aggregated to personally identify you, which would make it a special class of PII requiring special legal protections.
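To illustrate the idea, here is a minimal, entirely hypothetical sketch of how "anonymous" motion summaries could act as a kinematic fingerprint. It is not based on any real headset SDK or published system: the simulated features (characteristic head height and hand speed) and the nearest-centroid matcher are illustrative stand-ins for whatever features and model a real adversary might use.

```python
# Hypothetical sketch: re-identifying users from "anonymous" VR motion data.
# Each simulated user has a characteristic head height and hand speed;
# a nearest-centroid match links a new session back to a known user.
import math
import random

def session_features(user_profile, rng):
    """Simulate one session's motion summary: the user's characteristic
    head height (m) and hand speed (m/s), observed with sensor noise."""
    head_height, hand_speed = user_profile
    return (head_height + rng.gauss(0, 0.01),
            hand_speed + rng.gauss(0, 0.05))

def enroll(users, sessions, rng):
    """Average past sessions into a per-user centroid: the 'fingerprint'."""
    centroids = {}
    for name, profile in users.items():
        obs = [session_features(profile, rng) for _ in range(sessions)]
        centroids[name] = tuple(sum(col) / sessions for col in zip(*obs))
    return centroids

def identify(centroids, sample):
    """Re-identify a supposedly anonymous session by nearest centroid."""
    return min(centroids, key=lambda name: math.dist(centroids[name], sample))

users = {"alice": (1.62, 0.8), "bob": (1.80, 1.3), "carol": (1.70, 0.5)}
rng = random.Random(0)
centroids = enroll(users, sessions=20, rng=rng)
probe = session_features(users["bob"], rng)  # a new, "anonymous" session
print(identify(centroids, probe))  # prints "bob"
```

The point of the sketch is that no single field here is PII on its own; it's the combination of stored measurements over time that becomes identifying.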

OpenBCI co-founder Conor Russomanno told me that EEG brainwave data may turn out to have a unique fingerprint that can never be fully anonymized and could potentially be traced back to individuals. What are the implications of storing massive troves of physical data gathered from VR headsets and hand-tracked controllers if that data turns out to be personally identifiable? Downey suggests that the best answer from a privacy perspective is to not record and store the information in the first place.

VR Companies Are Not Being Proactive with Privacy

There’s a set of self-regulatory principles for online behavioral advertising that companies have collectively agreed to follow to help with the Federal Trade Commission’s oversight of companies protecting the privacy of individuals. But up to this point all of the major virtual reality companies have not taken a proactive approach to educate, be transparent and provide consumer controls to opt-out of what may be recorded and stored from a VR system.

Google has the most detailed privacy dashboard for reviewing and controlling what it has recorded from your regular account (including interactive maps of your location history and voice recordings of your conversations with Google Assistant), but it doesn't have any information specific to virtual reality yet. You can see which ad preference categories Facebook has selected for you, but its privacy policy explanation page shows very little of the raw data it has collected. The HTC Vive links to HTC's privacy policy, which hasn't been updated since September 29, 2014; it predates the Vive, so there's no specific VR information. And there's no specific indication of VR data being captured or tracked in Valve's or Samsung's privacy policies.

Oculus’ Privacy Policy is the only one to call out any specific VR data being collected, which means that the other companies aren’t recording any head or hand tracked information yet, or they’re not properly disclosing it if they are.

Oculus’ Independence from Facebook is Fading

The VR Heads site did a great comparison of the different privacy policies of VR companies, pointing out some of the commonalities and differences. They also flagged concerns with Oculus' privacy policy, saying, "The company states that all of that information is necessary to help make your game experience more immersive; they also use the data to make improvements on future games. But permanently storing that data, and then sharing it? That's a bit invasive."

Oculus made this statement about privacy in response to an UploadVR report from April 2016:

We want to create the absolute best VR experience for people, and to do that, we need to understand how our products are being used and we’re thinking about privacy every step of the way. The Oculus privacy policy was drafted so we could be very clear with the people who use our services about the ways we receive or collect information, and how we may use it. For example, one thing we may do is use information to improve our services and to make sure everything is working properly — such as checking device stability and addressing technical issues to improve the overall experience.

Lastly, Facebook owns Oculus and helps run some Oculus services, such as elements of our infrastructure, but we’re not sharing information with Facebook at this time. We don’t have advertising yet and Facebook is not using Oculus data for advertising – though these are things we may consider in the future.

Just because Oculus hadn't shared information with Facebook as of early 2016 doesn't mean that it won't, or that it doesn't plan to. In fact, it likely will; otherwise, it wouldn't have included the legal language allowing it to do so.

The boundaries of independence between Oculus and Facebook have been fading lately. Facebook has been taking a more and more active role in running Oculus, as shown by the Oculus logo now including a mention of Facebook, by Brendan Iribe recently stepping down as CEO (leaving Oculus without one), and by Mark Zuckerberg giving a much more in-depth demo of the future of VR and Facebook at the recent Oculus Connect 3.

Any early comfort that Oculus would be run as a company independent from Facebook is starting to fade, and the bottom line is that there's nothing stopping Oculus from feeding intimate data about body movements into Facebook's unified super profiles of personally identifiable users. It's starting with physical movements, but it's likely that future generations of VR technology will have eye tracking built in. Oculus' privacy policy lays down the legal framework to capture and store everything you look at and interact with in virtual worlds, and then tie that back to your actual identity.

The Metaverse as the Last Bastion of Privacy?

As these online profiles start to merge into our real world through augmented reality technologies, our sense of privacy will be vastly reduced. Yet Downey is optimistic that a virtual reality metaverse could become one of the last bastions of privacy we have, if VR technologies are architected with privacy in mind.

She encourages VR application and hardware developers to minimize data collection and to retain as little data as possible. She also suggests not personally identifying people and using decentralized payment options like Bitcoin or other cryptocurrencies so that information isn't tied back to a single identity. Finally, she advises avoiding social sign-ins so that people's actions aren't tied to a persistent identity that's permanently stored and shared forever.
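A minimal sketch of what this data-minimization advice could look like in practice, under stated assumptions: raw gaze and motion samples stay on the device, and only a coarse summary with a random per-session ID is ever emitted, so nothing links one session to another or to a person. All field names and numbers here are illustrative, not taken from any real VR SDK.

```python
# Hypothetical sketch of privacy-by-design telemetry: aggregate on-device,
# coarsen the values, and use a random per-session ID instead of a
# persistent account ID. The raw sample stream is never transmitted.
import secrets
import statistics

def summarize_session(dwell_times_s):
    """Reduce raw per-object gaze dwell times (seconds) to a coarse report;
    the raw stream itself is never transmitted or stored."""
    return {
        "session_id": secrets.token_hex(8),  # random: no persistent identity
        "samples": len(dwell_times_s),
        "mean_dwell_s": round(statistics.mean(dwell_times_s), 1),  # coarsened
    }

raw = [0.31, 2.75, 0.44, 1.9, 0.52]  # simulated seconds of gaze per object
report = summarize_session(raw)
print(report["samples"], report["mean_dwell_s"])  # prints: 5 1.2
```

The design choice mirrors Downey's point: if the detailed data is never collected, it can't be subpoenaed, breached, or merged into a super profile later.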

Open Questions for Regulators to Ask VR Companies

Virtual reality technologies will face increased scrutiny from policymakers in 2017, and there has already been a Senate Commerce hearing about augmented reality in November 2016.

Some of the open questions that should be asked of virtual reality hardware and software developers are:

  • What information is being tracked, recorded, and permanently stored from VR technologies?
  • Is this information being stored with the legal protections of personally identifiable information?
  • What is the potential for some of this anonymized physical data to end up being personally identifiable through machine learning?
  • Why haven’t Privacy Policies been updated to reflect what VR data is being tracked and stored? If nothing is being tracked, then are they willing to make explicit statements saying that certain information will not be tracked and stored?
  • What controls will be made available for users to opt-out of being tracked?
  • What will be the safeguards in place to prevent the use of eye tracking cameras to personally identify people with biometric retina or iris scans?
  • Are any of our voice conversations in social VR being recorded?
  • Can VR companies ensure that there are private contexts in virtual reality where we are not being tracked and recorded? Or is recording everything the default?
  • What kinds of safeguards can be imposed to limit the tying of our virtual actions to our actual identities, in order to preserve our Fourth Amendment rights?
  • How will VR application developers be educated about, and held accountable for, their responsibilities regarding the types of sensitive personally identifiable information that could be recorded and stored within their experiences?


The technological trend over the last ten to twenty years has been that our behaviors with technology weaken our Fourth Amendment protections of a reasonable expectation of privacy. As we provide more and more intimate data, and as VR and AR companies record and store everything, are we yielding still more of our rights to a reasonable expectation of privacy? If we completely erode our right to privacy, it will have serious implications for our First Amendment rights to free speech.

As virtual reality consumers, we should demand that VR companies DO NOT record and store this information, in order to protect us from overreaching governments or hostile state actors who could capture it and use it against us.

In order to have freedom of expression in an authentic way, we need to have a container of privacy. Otherwise, we'll be moving toward the dystopian futures envisioned by Black Mirror, where our digital footprint bleeds over into our real life and constrains all of our social and economic interactions.

Is VR going to be the most powerful surveillance technology ever created or the last bastion of privacy? It’s up to us to decide. We need to make these privacy challenges to VR companies now before it’s too late.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to the Voices of VR podcast. So virtual reality is a paradoxical technology that I think has the potential to be one of the most amazing and transformative technologies that we've seen in our lifetimes. But it also has the potential to become the most intimate surveillance technology and the enabler of a dystopian future. So I tend to spend a lot of time on the Voices of VR podcast looking at the utopian optimistic futures, but today I'm going to be looking at some of the risks of virtual reality when it comes to privacy and as we start to integrate more and more technologies that are able to read our brainwaves and detect our intent and our attention and our emotions. What's it mean to be able to give all this intimate information to these companies that are essentially trying to create these unified profiles on us to send us advertisements? So on today's episode, I have Sarah Downey. She's an early stage investor with Accomplice Ventures, and she's actually invested into Neurable, which is a brain control interface which can detect intent as to whether or not you're intending to take action. And so Sarah makes the argument that the Fourth Amendment, which gives us all of our rights to privacy, is a crucial element to freedom of expression in the First Amendment. That once we start to make everything into the public domain, then the decreasing amount of private spaces in our lives is going to have that side effect of essentially making us live within a fraudulent shell of inauthenticity and that the metaverse may actually be one of the last bastions of privacy that we have in our lives if it's architected with privacy in mind. So we're going to be doing a deep dive into privacy and virtual reality on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Voices of VR Patreon campaign. 
The Voices of VR podcast started as a passion project, but now it's my livelihood. And so if you're enjoying the content on the Voices of VR podcast, then consider it a service to you and the wider community and send me a tip. Just a couple of dollars a month makes a huge difference, especially if everybody contributes. So donate today at patreon.com slash Voices of VR. So this interview with Sarah happened at the Upload VR Party at CES that was happening in Las Vegas from January 5th to 8th. So with that, let's go ahead and dive right in.

[00:02:47.309] Sarah Downey: I'm Sarah Downey. I'm an early stage investor at Accomplice, which is a fund in Boston. And I'm really building out our AR, VR practice. And just personally, I've been into this stuff my whole life. I've always been a big video gamer and sci-fi fan. So I'm excited it's real life, finally.

[00:03:05.897] Kent Bye: Great. So we were just talking about some of your investments in brain control interfaces. And maybe you could start there and talk a bit about what you're seeing in the BCI space and virtual reality.

[00:03:16.648] Sarah Downey: Yeah, there's a company called Neurable, still very early, but we put $250K in them. They're from Michigan originally. The CEO did his PhD in BCI, Brain-Computer Interface, and essentially his algorithm reduces the amount of time that's required to actually control things with your thoughts. Because that's important, before you would have to think very hard and calibrate this, and it was very specific to very different people. So this makes it time effective, and they're really targeting the virtual reality industry, especially gaming. So they're making BCI faster, they're making it easier to calibrate, and they're targeting the virtual reality gaming industry specifically. So I think that's a really, really cool thought, right? Like, imagine if you were playing a Star Wars game, you'd really have the Force. I think that's something everybody wants, right?

[00:04:10.444] Kent Bye: I imagine they're using some sort of, like, AI training, or what is it that they're doing in order to reduce the noise to be able to get a discrete signal enough to actually perform some action.

[00:04:20.284] Sarah Downey: Yeah, so that's actually what the CEO's PhD was all about. That was his thesis. They worked specifically with a brain wave called the P300, which is all around intent. So it fires when you look at an object and you intend to do something about it. And it's kind of standard across everybody. We all have it. It's always firing. And so they're focused there, but they do have a lot of machine learning and AI experts in-house to really build out their SDK and get it in the hands of developers and make this a real thing, right? Because they have the technology, but the big business problem is like, how do you get it in the hands of all these developers and thus in the hands of consumers?

[00:04:57.803] Kent Bye: Yeah, that's exciting and also terribly scary that for me the implications of having a technology that can read our intent and be able to discern what we're looking at. I think that there's a lot of privacy issues and you mentioned that you actually have a background in the law around privacy. Maybe you could just sort of give a little background in terms of your study or expertise in this realm.

[00:05:21.304] Sarah Downey: Yeah, so I'm an attorney, although I'm not technically practicing now because I'm a VC. I started at a consumer privacy startup actually in Boston and they made a browser add-on that would block web trackers because there are just hundreds and hundreds of these things following you around. And that's concerning, but it's not the be-all and end-all of privacy. They can tell what kind of sites are you going to, there's various identifiers on the browser that can kind of target you versus me. But when you're talking about a VR headset or an AR headset, there is so much more data collected at such a more personal level. I don't mean to be a pessimist about this, but when you look at Facebook and their purchase of Oculus, the first thing I thought was their product is the user. They're selling the user. They're selling the user's data. And right now, they're making a ton on ad revenue. And really, the best they can do is to say, we showed this ad. We had this number of impressions. But now they own an environment where they own the user's gaze. They can see what you're looking at and for how long and is that interesting to you. And then, you know, that is definitely a source of a lot of money for them and ad revenue. And it's also a big privacy concern because, you know, I think to me, one of the most interesting and powerful points about the metaverse and virtual reality is you can be anyone. You can be whoever you want to be. You are your avatar. Think about Second Life times a million, right? So I don't know that consumers want to be identified and tracked and followed in that way. And we'll see. Facebook's making a big bet on owning the social virtual layer. But my gut is that it's not going to work out that way. I think that people want to be free. 
Like, to me, the privacy issue is a free speech issue more than anything else, because when you know that you're being watched, when you know you have an audience, you don't say the same things that you would. Like, I say different things around my grandma than I do to my best friend, than I do to my boss. And we all do, and that kind of allows us this sphere of privacy to be who we are and say what we want to say. And I think that the virtual reality world offers the biggest benefits in terms of self-expression, but also some of the biggest risks if you look at the amount of data and the personalization of that data.

[00:07:35.332] Kent Bye: Yeah, I really see this as a paradigm shift in a number of ways in that we're doing this context switch into having a lot of biometric data, whether it's our body movements, our eye gaze, you know, eventually with EEG, our intent, our emotional states, our facial expressions, what we're looking at, what we're paying attention to, you know, all of this data. It feels like the first iterations of the web were really based upon this model of explicit consent, that if you're going to yield your birthdate, for example, you would actually have to take an action to type that in. But then with all the cookies and the tracking, there was this passive layer of our behavior that was being tracked that I think was starting the blurring of that transition from explicit consent to implicit consent.

[00:08:22.649] Sarah Downey: That's what worries me a lot about augmented reality in particular, is that, you know, you can get off the shelf facial recognition and detection software right now, and imagine that in an AR headset, if that becomes ubiquitous, which it will, I could be walking down the street, look at you, not know you, but just be able to scan your face, match that against a database, you know, public database, LinkedIn, Facebook, Google image search, whatever, and all the associated data, like if you've ever done a reverse image search on Google with your picture, you'd be shocked at the amount of information that comes up and how accurate it is. Think about that applied in AR in the real world. Like, you'd never really have a blind date, you know? You'd never really go into a coffee date or a work negotiation without an insane amount of information on that person if you wanted it. Which I think is both extremely useful and extremely scary. Like all technology, right? It's not the tech that's the problem, it's the people and what they do with it.

[00:09:19.289] Kent Bye: Well, I think that part of this shift into these, we call them experiential age immersive technologies, is that you do have all these higher fidelities of information that Conor Russomanno, back in episode 365 of the Voices of VR, was basically saying that the EEGs could potentially have a digital fingerprint, such that your brainwaves are unique, just like your fingerprints are unique. And so you can't really ever truly anonymize that data. So if you're collecting it and sending it over the wire and it's being stored somewhere, then that can be traced back to you. Even if you try to disconnect your explicit identity, it's sort of like your DNA or your fingerprint. And so to me, I've been trying to find somebody who is a biometrics expert in privacy to see what is needed here in terms of the law. The actual technology implementation is happening faster than the laws can keep up with it. And you know that's always been the case, but yet we're just sort of diving in headfirst and I'm thinking about these things, but I don't know what kind of institutions are required to be able to actually have some sort of counterweight to these huge companies that are just moving forward with this plan, but yet we don't really have a voice in how this actually plays out.

[00:10:31.622] Sarah Downey: Yeah, you know, it's interesting because we have PII, personally identifiable information, which is a legal category when it comes to information protection. And that's very specifically laid out, you know, what counts as PII, what doesn't. There's whole industries of people, privacy professionals, whose jobs are around what counts as PII, what doesn't, how we protect that. So I think there has to be, regarding virtual reality, explicit characterization of things like EEG activity, brainwave activity, things that are personally identifiable and unique that haven't really been contemplated previously. And you're right. The law is always so behind technology. And we're on this exponential curve where the faster these things grow, the further behind the law is. So I don't really think the law is going to be the place to solve this stuff. It's going to have to react, not be proactive. It's really up to the companies developing this technology to build privacy by design from the beginning. That's the easiest way to do it, because if you start thinking about it from the beginning, it incorporates itself into every aspect of the design and product process. It's really hard to throw it on at the end. You know, so like, there's a company in our portfolio called T-Vision Insights. They're trying to replace the Nielsen rating system for TV. They use a Kinect box; a person would opt into this, right? So the Kinect box is passively watching in the home, detects how many people are in the house in front of the TV, detects how many of them are watching, and what their genders are, because the skeletal system is visible in the Kinect, so they can tell if you're male or female by the way your hips are. They can tell if you're happy, you're smiling, or you're not smiling, if you're wearing glasses, different characteristics, and how interested you are. And they can do this in a very segmented way. 
So if you're watching the Oscars, they know this segment of the award was the most interesting, or this Super Bowl commercial was the most interesting, and people left the room during this part. But that's an example of something that's basically augmenting reality, and could be very scary from a privacy perspective. They've built it in such a way from the ground up that it's non-personally identifiable. They only identify, you know, this is female of household number one. This is male of household number one. It's all binary, non-identifiable. So I think that's a good example of how you can anonymize from the beginning. You know, they don't care that it's Kent Bye who's watching this show. They just care that you're a male of this age and you're happy at this point and you're sad at this point and you left the room at this point.

[00:13:01.522] Kent Bye: Yeah, and I think that if you look at both Facebook and Google, it seems like these are these big companies that are playing a big part in the future of virtual reality. And I actually had a chance to do an interview with engineers at Google Earth. To me, that was really the first experience that I've seen from Google that made me really question, OK, what are you tracking? Are you tracking where I'm looking at, where I'm going? And I wanted to know, what are you doing with this information? And they basically said, our terms of service and our privacy policy are the same ones that we have for existing services. And they said you can look at your privacy policy and then also that our users who use our services trust us, which is saying that people who don't trust them don't use them. So people who use them, they're saying they trust them.

[00:13:44.598] Sarah Downey: I mean, that's bullshit. As an attorney, I can tell you that it is. The reason is, if you ever do read those policies, which nobody would because it would amount to the equivalent of, like, I saw a study saying 45 hours a year, right? I mean, you cannot do that. You can't afford to do that. And even if you did, if you read most of them, there's always a clause that says we can change this at any time, for any reason, with or without your consent. So even if you do read it, There's no reason for them to not change it, and they do, and they often do it at shady times like Friday nights, when even if you were watching, you wouldn't be paying attention right then. It's just the reality is, you know, there are trade-offs that come along with using technology. You're not going to be educated to a lot of those trade-offs, so I think it's important that journalists shed light on that. There are some great privacy journalists out there, like Julia Angwin at ProPublica is one of my favorites. She's great. She really goes after the facts. It's tough. It's getting tougher and tougher for people to really educate themselves on the data trail that they're leaving. And also, correspondingly, the decisions that are being made about the digital you, you know, the real you based on the digital you, the trail that you leave. You know, I've seen back when I worked at this privacy company, Abine, I saw people's credit limits being dropped, people losing jobs or not getting hired because of what was out there on their digital footprint. It's becoming more and more of a thing, like you're seeing people's Google search histories being used in litigation as examples in child support cases for why you should or shouldn't have access to your kids. Just recently, the first example of an Alexa search warrant. There was a murder involving a hot tub at this guy's house. They ended up doing a warrant for the Alexa data that had come in previously. 
So basically anything digital that's recorded, that's maintained, can be accessed by law enforcement and often is. So really one of the best things you can do for privacy is not collect the information to begin with. If you don't have it, you can't turn it over. So that's why an app like Signal is an amazing private messaging app. They just don't store the data on their servers. And because they don't, and because it features end-to-end encryption, you can be very certain that there isn't anything to retrieve if the FBI comes asking for it.

[00:16:09.996] Kent Bye: Yeah, and the other dimension here is Mark Finchley, one of the originators of VRML. One of the things he said to me in my interview here at CES is that we live in a hostile environment online. I mean, wars are really happening with hacking and identity theft. It's not just about trusting these companies and trusting the government; it's also the fact that if this is stored somewhere, then this type of information could get into the wrong hands, and who knows what they might be able to do with it. Right now it's social security numbers and credit card numbers, so there's a financial impact. But I don't know what the implication is of the lack of security as a top priority, from the top down, within our institutions of government, where for whatever reasons they want to leave these back doors open to exploit them instead of fixing them. It just creates this environment where there are so many different people who are able to get access to this privileged information, and then what does that look like? What if it's an enemy state or a hacking group? It seems like lives could potentially be destroyed.

[00:17:16.703] Sarah Downey: Yeah, I think that hacking attacks like that are the biggest risk to our way of living. The more immersed we are in anything connected, the greater the risk becomes. And if we're heading towards a Ready Player One-esque future where we're all plugged into the metaverse and that's where we live, basically, most of our lives, then we're leaving a footprint that unfortunately could be exploited without proper controls. That's why encryption is so important, right? There's an interesting series of Carnegie Mellon studies on this where, using facial recognition software, again, off-the-shelf stuff, really easy to use, this was a couple of years ago, people could basically take a picture of a person on the street with a mobile phone, figure out their identity, their name, that kind of stuff, and then go on something like LinkedIn and basically pull the right amount of information to get a social security number. Because social security numbers aren't random; they're based on patterns involving things like where you were born and your date of birth. So you can figure out a surprising amount of information from things like that, which is why I very closely guard my LinkedIn connections. And it feels like a lot of people don't do that. But think about it. Think about how many security questions online are like, where did you go to school? What's your mom's maiden name? Those kinds of things. Those are very easy to get. And when you combine the digital world with the real world in the way that augmented reality or virtual reality do, you're really crossing privacy and security borders in ways that are pretty threatening.
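Sarah's point that social security numbers are predictable from birth state and date refers to the Carnegie Mellon work she mentions. The figures below are made-up illustrative assumptions, not numbers from those studies, but they show the back-of-the-envelope logic: structure collapses a nine-digit brute-force problem into a short candidate list.

```python
# Rough arithmetic on how structure shrinks the SSN search space.
# All specific figures here are illustrative assumptions.

TOTAL_SSNS = 10 ** 9  # nine digits, naive search space

# Pre-2011 SSNs were structured AAA-GG-SSSS:
#   AAA (area) was tied to the state where the card was issued,
#   GG (group) was assigned in a fixed, published order,
#   SSSS (serial) ran sequentially within each group.
area_codes_for_state = 10      # assumed: a small state used ~10 area numbers
groups_active_that_year = 5    # assumed: few group numbers active near a DOB
serials_per_group = 10_000     # four digits, ~10^4 serials per group

candidates = area_codes_for_state * groups_active_that_year * serials_per_group
reduction = TOTAL_SSNS // candidates

print(f"{candidates:,} candidates instead of {TOTAL_SSNS:,}")
print(f"search space cut by a factor of {reduction:,}")
```

Under these assumed figures, knowing someone's birth state and date of birth cuts the guessing problem by a factor of two thousand, which is why the combination of facial recognition and public profile data is so dangerous.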

[00:18:55.149] Kent Bye: Yeah, and it seems like there's a lack of ethics driving the values within our technology, that it's a little bit of a means to the end of increasing the bottom line of profits, and that the users are kind of the last in that chain of consideration. And I wonder whether or not this surveillance business model, which is essentially what we have, using corporate surveillance to increase private profits, might change in the experiential age. Maybe we'll move away from this mindset of wanting to experience everything for free. If you're having a real experience, you may treat it the same as going to a movie or a concert and actually pay for it up front, rather than going into an environment where you're going to be surveilled and then potentially have all this information on your digital footprint forever.

[00:19:48.576] Sarah Downey: Yeah, you know what's interesting is that the private space in the physical world is shrinking, often through technology, right? There are light posts in various cities that are also cameras; there are cameras everywhere; there are tiny microphones that are surveilling people everywhere, often for things like gunshot detection, right? But the Fourth Amendment, which protects people from unreasonable searches and seizures, is based on a reasonable expectation of privacy. So there's a subjective component to that. It asks: what is reasonable? What do people think is reasonable? And that's been eroded a shocking amount over the last 10, 20 years, especially with things like social networks. So there's an actual legal effect of these kinds of things creeping in on us. To me, that's why the metaverse and virtual reality are so much more important, in a way, because if somebody designs it correctly, it could be the only really, truly private place that we have. If somebody is designing that and building that, I want to talk to you and I want to give you money. Because to me, that's where the First Amendment can really reach its highest potential. That's where people can be themselves, even if they're not being themselves. I forget who said it, but there's a great quote about how people show their real selves when they put on a mask, and I really believe that to be true, and that's why I think the metaverse is so important. We take for granted where we are in the United States, but there are so many countries and so many societies where you're persecuted for things like just being a woman, and I think the metaverse could offer a truly open, expressive, and safe place, assuming the privacy controls are there. But you're right. I mean, you frame it correctly that these big tech companies are monetizing based on data.
They have almost no incentive to do the right thing when it comes to privacy, because they're all fighting for their bottom line and their revenues and their shareholders as well. That's why it's really important that privacy and usability go hand in hand, which they often don't. Like I mentioned Signal before: that's a great example of a truly terrific app with a great UI. But unfortunately, that's often not the case with privacy tools. Things like PGP are really hard to use. And we have to get to a point where it just becomes second nature. Or I think if people realized the downside to not having privacy, they might be willing to pay. It's hard to say, you know; people, especially millennials, are really used to getting things for free, especially content. And privacy erosions are often insidious. They're small, they're slow; it's like the lobster in the pot coming to a boil, he doesn't realize it. So I think if people were aware of the kinds of negatives that can come out about them, the kinds of things like, again, not getting hired, losing a job, losing custody, getting arrested, whatever the negative is, if they're aware of that, like, hell, I'm willing to pay for privacy today, but unfortunately I think that's not the case for the majority of consumers.

[00:22:50.015] Kent Bye: What do you think are some of the key components that are really required to have a truly anonymous metaverse where people can perhaps have a persistent identity, so imagine that there's some identity component that's not centralized in a Facebook or a centralized social media site, but perhaps using technologies like the blockchain to have some sort of distributed identity verification. That seems to be one component, but what are all the different things that you think need to come together to really have a truly anonymous metaverse?

[00:23:18.483] Sarah Downey: Well, like I said before, the most important thing is data collection being minimal. If you don't collect it, you don't have to turn it over to law enforcement. If you're keeping logs, records, sign-ins, whatever, and the FBI comes to you and wants something, you have to hand it over in most cases. And they'll often do a gag order so you can't even talk about the fact that you had to turn it over. So maintaining as little data as possible is key. That's hard, though, because at the same time, if you're a product manager, you want to know how people are using the product and what they're doing, what's the sticking point, what's the activation point, all these things. So you have to work harder to use analytics that aren't personally identifiable. Often you have to build your own system for doing that, but there are ways to do it; there are experts in this. So I think there's a way to get analytics without identifying who people are personally. The key is you don't want to personally identify them. I also think payment is important. If you use things like Bitcoin or other cryptocurrencies, that goes very far toward protecting identity. And I think not requiring a social sign-in is key. Yes, it's easier because you don't have to create a new username and password, but again, the downside is it's tied to this account that knows an insane amount about you. So the key is making a UI and an experience that is fun and easy for the user without pulling in too much data from elsewhere. And I think you can get around that, honestly, if you make a product that is just so good and engaging. A lot of these things depend on there being an audience, right? If I go into the metaverse and no one's there, I'm probably not going to stick around. But if people are there and it's a great experience, my friends are there and I have a good time, I'll stick around, and I don't need a Facebook sign-in to do that.
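One minimal sketch of the kind of non-identifying analytics Sarah describes, assuming a hypothetical `AnonymousAnalytics` service: events are counted with no user field stored at all, and daily unique users are counted through a salted one-way hash whose salt is discarded at the end of the day, so the tokens can't be linked back to people or across days.

```python
import hashlib
import secrets
from collections import Counter

class AnonymousAnalytics:
    """Counts product events without storing who performed them.

    A per-day random salt (discarded at day's end) lets us count unique
    users within a day, while the salted hashes are useless for linking
    activity across days or back to a real person.
    """
    def __init__(self, daily_salt: bytes):
        self.event_counts = Counter()
        self.daily_uniques = set()
        self._salt = daily_salt

    def record(self, user_id: str, event: str):
        self.event_counts[event] += 1
        # One-way and salted: unrecoverable once the salt is thrown away.
        token = hashlib.sha256(self._salt + user_id.encode()).hexdigest()
        self.daily_uniques.add(token)

    def report(self):
        return dict(self.event_counts), len(self.daily_uniques)

stats = AnonymousAnalytics(daily_salt=secrets.token_bytes(16))
stats.record("alice", "entered_world")
stats.record("alice", "sent_message")
stats.record("bob", "entered_world")
counts, uniques = stats.report()
assert counts == {"entered_world": 2, "sent_message": 1}
assert uniques == 2
```

A real deployment would add more safeguards (rate limiting, noise in the aggregates), but the core design choice is the same: the analytics store never contains a personal identifier, so there's nothing identifying to hand over.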

[00:25:04.967] Kent Bye: Yeah, and it seems like that's sort of the dynamic between the closed walled gardens and the open ecosystem: oftentimes those premium experiences are going to be in those walled gardens, and over time they're going to have more capital and the better content creation tools. But I feel like there's a value shift happening in the people who are actually using the technology. In some ways, I don't know if the solution is to have the market decide by having viable alternatives with just as compelling content, or at what point people are going to realize, hey, I'm tired of being surveilled and having all this happen. You mentioned some of those real-life costs that have come from people having their digital footprint come back and either prevent them from getting custody of their children or cost them a job. But it's not to the point where it's in the mass consciousness because, like you said, there's the social water cooler. People go online; they want to see the pictures of their family. It's almost like these companies have hijacked the parts of our brain that are connected to our family and been able to exploit that for their own bottom line of profits.

[00:26:12.751] Sarah Downey: There are a couple of interesting points to that. One is that people generally post the things that reinforce the image they want. There might be an increased amount of content that people are posting, but it's not authentic, right? That's one of the reasons why Snapchat has really taken off. Yeah, of course, they're still storing everything on their servers. Do not believe for a second that they're not, right? They have it. But the fact that it's ephemeral to most people is a really good selling point, because you can be who you are for 24 hours, and that kind of mitigates the risk, right? Whereas on Facebook, everyone and their mom and their dad and their gym coach are on there, and you can't be yourself because you know that the audience is watching, right? So that's an important thing. I also think of Black Mirror. Do you watch that show? I was just watching the episode "Nosedive," where the main character lives in this world where everybody rates every interaction they have with each other, and it's a dystopia. It looks like a utopia because everyone's smiling and giving each other five stars, but it's an absolute nightmare. And to me, it all comes back to this issue: privacy means freedom of speech and freedom of expression. It gives you a safe sphere to be who you are. Without that, you're just self-censoring, or you're putting out an image of yourself: you only post the most attractive photos, or the great, beautiful photos of food, like hashtag blessed, look at me, look at my life, everything is great. And it's just this fraudulent shell that limits authenticity. That's why I like Snapchat, because you see the ugly but real side of people.

[00:27:53.255] Kent Bye: Well, the thing you're saying that really resonates with me is that Facebook started with a certain amount of privacy, and then over time they've slowly eroded that. And I've noticed myself that that's constrained what I'm saying on there. In fact, I hardly post anything to Facebook at all. I've experimented with Snapchat, and it's a medium that I've used to actually become more authentic in the moment, which is a really interesting use case. But if we look at what you're saying here, privacy is really a key part of our freedom of expression, and its erosion is now bleeding into our real lives through these augmented reality and virtual reality technologies. The same vibe that we get from Facebook is the trajectory of that Black Mirror episode, "Nosedive," where people are rating each other, and you essentially start to quantify things that are subjective, qualitative experiences and put a number on people. People often say right now that there are more use cases for augmented reality, and there's an assumption that people are just going to want to wear their augmented reality glasses around. But the downside is that it's going to turn everything into a public place, essentially. And you're right that virtual reality may actually be the one last bastion of privacy.

[00:29:10.161] Sarah Downey: That's kind of what I'm betting on, and that's why, you know, I've always looked forward to that as a sci-fi fan. I really look forward to the day when I can go into a totally different world and be a totally different person, because I think that's healthy. So much of the technology and the companies and the things that we do every day are trying desperately to tie us to one unified identity and follow us everywhere that we go. That is the business model. And I think it's healthy for us to do the opposite and escape. You either have to physically escape to get off the radar these days, go to a faraway place, or you have to do something like disappear into the metaverse and just be a totally different person. But yeah, I think it's healthy. And I agree that there are more use cases for augmented reality. For Upload, I wrote a piece that was every use case I could think of, and it was hundreds, right? It's exciting, and it grows every single day. But there's a safety in virtual reality that comes with that enclosure and that total immersion, where even in the middle of a conference floor like CES, you can suddenly be somewhere completely different. It's relaxing. It's escapism. And that's, you know, a thing that sci-fi and gamer fans have in common: that escapism expands your mind and allows you to be somewhere else, and it feels good. That's, to me, why we're going to have to have a solution that allows us to truly do that, and doesn't take our online identity and just clip it onto us and follow us wherever we go. I mean, that's not fun. There will be cases for that, absolutely. We're going to need to do conference calls. We're going to need to socialize. We're going to want to see our friends who already know who we are. But I'm interested in the weird hangouts with people that I never knew and will never know outside of the metaverse.
That's, in some ways, the way that you can have the most authentic and real interactions.

[00:31:08.004] Kent Bye: Great. And finally, what do you see as the ultimate potential of virtual reality, and what it might be able to enable?

[00:31:15.926] Sarah Downey: Oh man, there's so much. It's hard to say. I think there are huge use cases for education. I really love the idea that a kid in the middle of Africa, as long as they have a solid enough internet connection, can go and take a class with the best professors in the world. And then when you add AR capabilities to that, they could actually see themselves, like, in the stands, right? And then the professor could know, okay, I haven't made eye contact with that student in a while and they seem distracted; let me look at him or her and engage them. So I think it makes education scalable, and education is so important. Just from a purely, like, selfish, hedonistic standpoint, I love the fact that I could basically go be in the Matrix. It's one of my favorite movies, for so many reasons, but I love the fact that I could go and just have these weird one-on-one interactions with people, because you never know who's there and you never know what you could get out of people. I feel like the more technology has invaded our lives in the real world, the less easy it is, almost, to talk to people, because you always have this phone with you that's kind of a shield; you can go into it and isolate yourself and detach yourself. And I look forward to just having these totally honest, open conversations with people in these safe, virtual environments. Of course, I'm really excited about gaming. I mean, I have every system that's been out forever. I've got a tattoo of the Triforce from The Legend of Zelda. I've spent a lot of my time in virtual worlds, you know, on the flat screen, but I'm really excited for games to go beyond where they are right now in VR, which kind of feels like a demo. You go in, and you're like, oh, that was fun, and you come out. You're not like, oh, shit, I spent 10 hours in that, the way that you do with, like, The Witcher 3, which I'm playing right now.
And holy god, like, I'll look at the clock and it's four in the morning and I haven't showered and I'm not a person. So I'm really, really looking forward to VR games getting to that point, because of the potential. I mean, if I spend 10 hours on The Witcher 3 on my couch looking at a TV screen, imagine how dangerous and amazing it could be if I'm in that world and I'm fighting those fiends right there.

[00:33:32.731] Kent Bye: Is there anything else that's left unsaid that you'd like to say?

[00:33:37.432] Sarah Downey: I guess, I don't know, I was walking around the floor at CES all day looking at all the stuff that was out there, and I was, I guess, let down by the VR portion. It just didn't seem like there was anything I hadn't seen. And I was kind of heartened by the AR portion, because although it's still nowhere near where we want it to be, there were a lot more headset manufacturers than I thought would be there, and they were using some interesting tech. It's not perfect, right? We're years away on AR. But it was cool to try that stuff and see the sheer volume of companies that are really betting on this. To me, it's the biggest paradigm shift that's going to happen in our lives. I feel like we're lucky to be alive right now, right at that inflection point. I mean, this is only going to happen once, and it's so cool to be here right now.

[00:34:29.421] Kent Bye: Yeah, just a comment on CES, because I had a similar thought going into the Convention Center South and seeing a lot of the official virtual reality marketplace. There are a lot of middle-tier and lower-tier headsets from different countries, and I think the issue that we have in virtual reality right now is that you have the biggest top five VR headsets, and you have the content ecosystem. So if you're not already playing into that ecosystem, are you going to be able to actually generate both the content and the audience for that? So the things that I'm looking at are the things that are feeding into the next generation, whether it's audio or haptics and all these other devices that are still in the DK1 era, moving toward full production quality. There are going to be more devices fleshing out what we have right now, which is a very good visual system, audio that's a little half-baked, and very minimal haptics. As we get those three major things really fleshed out over the next five to ten years, then, to me, the most compelling stuff here has been the content in terms of what people are doing with motion platforms and haptics and devices that let you fly in the air. So those are the little gems that I've been finding so far.

[00:35:46.210] Sarah Downey: You know, it's funny, because from an investment perspective, I totally agree with what you just said: it's just saturated in terms of headsets. I'm never going to make a bet on a headset. It takes an insane amount of capital to get where you need to go, and there are so many of them, right? And then the content, although it is what's going to define success and really bring this to a consumer mass market, is also hard to invest in, because it's such a hit-driven business, even more so than venture already is, right? So the things that I'm looking at and interested in are the middleware, the software: what allows you to stream VR, because it takes an insane amount of data; the things that make it more real, technologies like foveated rendering, right? Smart, creative ways to reduce data consumption without harming the user experience. Or you're like, damn, I wish I had invested in Unity, right? Because they have this store that's going to be like the Apple store in terms of assets. Or 3D modeling, 3D content creation, the glue, the software that holds it together. And you're right, 3D audio is super important, because without that, it really takes away from the experience. I'm interested to see how the haptics and the other peripherals you mentioned develop; that's going to be really important. But I also hope we just get to a point where we almost have the NerveGear, like in Sword Art Online, the anime, something that just plugs right into you and gives you all the sensory input you need to believe something is truly, truly real. Because it's either that or you have a super high-end virtual reality room with an omnidirectional treadmill, and a body suit, and a wind machine, and smell-o-vision, and all these things that cost many hundreds of thousands of dollars. It's basically like turning your house into a VR arcade.
So I hope we just go the neural direct input route and make it believable that way.

[00:37:38.461] Kent Bye: Awesome. Well, thank you so much.

[00:37:39.863] Sarah Downey: Yeah, thank you. This is super fun.

[00:37:42.388] Kent Bye: So that was Sarah Downey. She's an early stage investor with Accomplice Ventures. So I have a number of different takeaways from this interview. First of all, I think there's just a lot of really canny insight that Sarah's bringing to the issue of privacy within virtual reality. She's essentially saying that the Fourth Amendment is actually based upon this reasonable expectation of privacy. So our cultural behavior, how we use technology and what we share on mobile and web technologies, actually has a very real legal impact on our constitutional rights. The more that we voluntarily use these technologies and services to record and share these intimate moments of our lives, the less reasonable expectation we have to that privacy, and the more likely that data could just be collected by the government at any moment. So Sarah's making the point that our privacy and those private spaces are actually super critical to having free speech and this safe container to be able to really freely express yourself. And I've certainly noticed this within Facebook. Whenever I post something on Facebook, I just have this expectation that it could end up on the front page of The New York Times. And Sarah's really bringing out the point that we have this self-censoring behavior where we're only putting our best face forward, and it's just this fraudulent shell of inauthenticity. So when you live in a digital environment where anything that you write down could be subpoenaed and collected by the government and used in official court cases, like your search history being used against you in a child custody case, it starts to have this chilling effect, and it becomes a little bit of a ghost town of inauthentic expression.
And with augmented and virtual reality technologies, the risk is that that same type of behavior could potentially spread to everywhere that you go, because the biometric information of your face could connect you to your singular online identity, which could essentially make your social media profiles and your entire history follow you everywhere. A lot of the technological trends are desperately trying to tie us to this singular unified identity so that they can follow us everywhere, take your entire behavior, everything you ever do, and tie that back to you, so they can do super detailed advertising targeting. That's the default business model for companies like Facebook and Google. So that trajectory and those pressures are such that, as we start to add more and more of these virtual reality technologies, it really raises the question of what can be captured and what they can then tell about you. In listening to Sarah, she's essentially laying out a number of distinct types of information related to privacy. The most sensitive is personally identifiable information: anything that can be used to identify, contact, or locate a single person. That's your full name, your address, phone number, email address, billing information, a whole long list of things that are essentially sacred and cannot really be shared. There's another category of sensitive personal information: confidential medical facts, racial or ethnic origins, political or religious beliefs, sexuality. These are essentially the things that you're not legally allowed to ask about in a job interview, for example. It's taboo to discriminate against someone based on any one of these types of sensitive information, whether for a job or housing or anything else.
So some of these technology companies may have access to that information, and there are different laws and regulations that they have to follow regarding it. Then there's a whole class of non-personally identifiable information. This is basically everything else that they're collecting on you: things that can be anonymized, where if you looked at a list of the information, you may not necessarily be able to determine who that person is. So they collect as much as they possibly can, and whenever they make that correlation between your personal identity and all this aggregate information about your entire history and your digital footprint, all that information is churned through algorithms that discern different categories that you're in, so that advertisers can pick those categories and send ads that are very specific to you. And because Google and Facebook know who you are, they're able to take the money from the advertiser and deliver the ad to you, with a little bit of a church-and-state separation between the advertiser and your actual identity and digital footprint. But at the same time, if the government comes to any of these companies and asks for any of that big dossier of information, they have to turn it over, and they often can't tell anybody about it. Or it could be hacked, and that information could be used against you in different ways. So I spent some time taking a look at the different privacy policies, and there was one line in particular from Oculus that really jumped out at me. It says that they collect information about your physical movements and dimensions when you're in a virtual reality headset. So essentially all your head movements, your hand movements, your gaze, all these things that they can already start to put into a database.
And so Oculus, in their privacy policy, is basically saying that they are starting to record your physical movements and dimensions when you're moving around in a virtual reality headset, and likely specifically your head gaze and what you're looking at and engaging with. It takes the application developer to extrapolate the content of what you're actually looking at, but because they're recording it, they have the potential to tie that back to the rest of your personal profile. And one of the things Facebook is doing is saying in their privacy policy that they have all these different services that feed back into your singular unified identity. Facebook actually goes out and buys other private data, like your financial data and your mortgage information, and ties that in to build very specific demographic information, because part of Facebook's policy is to have your actual name and identity. So they're able to tie that personally identifiable information to information that you didn't even explicitly give them, working with other data partners to create this super profile of who you are and what your financial standing is. And as you move into virtual reality, you're starting to give up more and more intimate information. So I see all sorts of open questions as to whether some of the information being collected in virtual reality will turn out to be part of this class of personally identifiable information. Right now, that connection has not been made, and so they're essentially able to collect whatever they want and tie it to your profile. And then there's a whole class of biometric identifiers: things that can be tied back to your identity.
So there are things like your facial features, your fingerprints, your hand geometry, your retinas and your irises, your signature, your vein patterns, your DNA, your voice, as well as how you type and your typing rhythm. All of these biometric identifiers are ways that they've already been able to take information and tie it back to your singular identity. The things that really jump out here are the eyes and the retinas. Once you get eye tracking, they essentially have access to the information to do a retinal scan, and once you have that retinal scan, it could potentially be tied back into your personally identifiable information. At this point, no eye tracking technology is shipping in virtual reality headsets, but I expect in the second generation we're going to start to see this. Is that going to change the class of information that is connected? And will it change the way some of this information is gathered and stored? Then there's all sorts of other information that's going to be super intimate, that's going to be collected by virtual reality in the future. That's going to include your facial expressions, your eye movements, your eye gaze, your gait if you're walking around, your hand and head movements, and your body movements as more and more things get tracked; your engagement with different objects within a VR experience; your speech patterns if you're talking to conversational interfaces; and your emotional states, which they may be able to extrapolate from your facial expressions or other methods such as EEG. So if we start to use brain control interfaces, then they start to actually have access to your brainwaves, what you're paying attention to, whether or not you're interested in it, whether or not you have specific intent, and potentially even be able to extrapolate your thoughts.
A lot of the different things that I'm listing here would essentially be a bunch of raw numbers that no human could make any sense of, unless you were actually seeing it embodied within an avatar. There's already anecdotal evidence from people who have been using High Fidelity that, as you watch somebody move around, you can identify the different team members you're working with based upon how they move. So that right there is telling me that it's already possible for other humans to take this information in aggregate and connect it back to your personal identity. Once you add more and more AI algorithms into all of this information, then who knows what's going to be able to be extrapolated, and whether there's going to be a unique biometric signature based upon a lot of this information. A lot of it is going to start to be captured and stored, and I think it's an open question whether it'll eventually prove to be a different class of personally identifiable information. Aside from Oculus, none of the other virtual reality companies have explicitly mentioned in their privacy policies any special considerations for how this information is being gathered and collected. So right now, these other companies could already be collecting information that they haven't disclosed yet. But overall, one of the big points that Sarah is making is that as we move forward, it may be more and more difficult to escape our digital footprint as more and more immersive technologies are out there, and that the metaverse may actually be one of the last bastions of privacy on the web if the infrastructure is architected correctly.
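To make the biometric-signature idea concrete, here is a minimal, purely illustrative sketch of how aggregate motion telemetry could be matched back to a person. Nothing here comes from the episode or any real VR platform: the users, numbers, and feature choice (mean and spread of per-frame head speed) are all invented, and a real system would use far richer features and models.

```python
# Hypothetical sketch: aggregate motion telemetry as a biometric signature.
# All users, numbers, and feature choices here are invented for illustration.
import random
import statistics

def motion_features(samples):
    """Reduce a stream of per-frame head speeds to summary statistics."""
    return (statistics.mean(samples), statistics.stdev(samples))

def identify(profiles, session):
    """Nearest-centroid match: pick the enrolled user whose stored
    feature vector is closest to this session's features."""
    f = motion_features(session)
    return min(profiles,
               key=lambda user: sum((a - b) ** 2
                                    for a, b in zip(profiles[user], f)))

random.seed(0)

def simulate(speed, jitter, n=500):
    """Each simulated user habitually moves with a characteristic
    average speed and amount of jitter."""
    return [random.gauss(speed, jitter) for _ in range(n)]

enrolled = {
    "alice": motion_features(simulate(0.20, 0.05)),
    "bob": motion_features(simulate(0.60, 0.15)),
}

# A fresh, unlabeled session from someone who moves like "bob".
new_session = simulate(0.60, 0.15)
print(identify(enrolled, new_session))  # prints "bob"
```

Even this crude two-number signature separates the two simulated users reliably, which is the worry raised above: data that looks like meaningless raw numbers can still function as an identifier in aggregate.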
Sarah talks about her sci-fi dreams of being able to escape her reality and go into these different worlds, and how, as you put on a different mask, you're able to become more of yourself. But there are a number of things that have to come together in order to have this truly anonymous metaverse. One is to minimize the amount of data collection that's happening: if you don't need to collect it and you're not really doing anything with it, then just don't even record it, and maintain as little data as possible. Another is to not personally identify people in any way, and to not constantly try to tie everything that somebody does back to a singular identity. Payment is one of those things that can typically personally identify people, so a more decentralized system like Bitcoin or some other type of cryptocurrency would give a layer of anonymity as well. And finally, don't require any social sign-in from social media sites, which would immediately link everything that you do there into these huge social media profiles. Overall, I just really appreciated a lot of the deep insights that I got from this interview. It helped me to actually dive into a lot of different privacy policies and to put a framework around some of the interactions that I've had on Facebook, as well as my experimentations with Snapchat, whose ephemeral nature was really allowing me to be a lot more authentic there. But we overall need technological solutions that allow us the ability to escape and to expand our minds without having to worry about everything that we do being recorded onto a unified identity that will follow us around forever. Even if no advertisers get direct access to it, it's still recorded in a database. It could be subpoenaed. It could get hacked.
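The first of those principles, data minimization, can be sketched in a few lines of code. This is a hypothetical telemetry handler, not anything Sarah or any VR company described: the field names (`user_id`, `gaze_target`, `scene_id`, `frame_rate`) are invented. The idea is that identifying and intimate fields are stripped on arrival, and only aggregate counters are kept, so there is nothing linkable to subpoena or hack later.

```python
# Hypothetical sketch of data minimization in a telemetry pipeline:
# keep only allowlisted fields, fold events into aggregates on arrival,
# and never persist raw, linkable telemetry. All field names are invented.
ALLOWED_FIELDS = {"scene_id", "frame_rate"}

store = {"sessions": 0, "frame_rate_sum": 0.0}  # aggregate counters only

def ingest(event):
    """Strip everything not on the allowlist, then fold the event into
    aggregate statistics instead of storing it."""
    minimal = {k: v for k, v in event.items() if k in ALLOWED_FIELDS}
    store["sessions"] += 1
    store["frame_rate_sum"] += minimal.get("frame_rate", 0.0)
    return minimal  # identity and gaze data from the raw event are discarded

event = {
    "user_id": "u-123",           # identity: dropped
    "gaze_target": "ad_panel_7",  # intimate behavioral data: dropped
    "scene_id": "lobby",
    "frame_rate": 89.5,
}
print(ingest(event))  # {'scene_id': 'lobby', 'frame_rate': 89.5}
```

The design choice worth noticing is that the allowlist is the default-deny posture Sarah is arguing for: any new data a developer wants to keep has to be added deliberately, rather than everything being collected and filtered later.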
So there are all these different risks, and the more that we live our lives within these digital realms, privacy may actually prove to be something that we're willing to pay for so that we're not subjected to the surveillance business models of all these different companies. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and become a donor. Just a few dollars a month makes a huge difference. Go to patreon.com slash Voices of VR. Thanks for listening.
