#833 XR Ethics: Open AR Cloud Panel Discussion on XR Privacy

Open AR Cloud held its first symposium before the start of Augmented World Expo this year, and I participated in a panel discussion on XR Privacy & Security alongside Brian D. Wassom, Kavya Pearlman, Lisa Watts, and Damon Hernandez, moderated by Open AR Cloud co-founder Colin Steinmann.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to The Voices of VR Podcast. So continuing on in my series on XR ethics and privacy, this is a panel discussion that I was on right before Augmented World Expo. So there's this group called OpenARCloud. It's a distributed group that's trying to look at the open way of doing the AR cloud and also think about these larger issues like ethics and privacy. And so they held a privacy and security panel discussion right before Augmented World Expo, and this was the last panel of the day. So on this panel with me was Brian Wassom. He's a lawyer who's looking at the intellectual property and other legal aspects of augmented reality. I know he's been doing a lot of work with First Amendment speech rights for augmented reality. We have Kavya Pearlman, who is the founder of the XR Safety Initiative. We have Lisa Watts. She's the founder of One Slash 21. And then Damon Hernandez. He's a product manager for Samsung Research. So this was just an open-ended discussion and brainstorm about privacy and security in mixed reality. So that's what we're covering on today's episode of the Voices of VR podcast. So this panel discussion happened at the 2019 State of the AR Cloud Symposium and Showcase that was put on by OpenARCloud, and that happened on Tuesday, May 28th, 2019 in San Jose, California. So with that, let's go ahead and dive right in.

[00:01:38.743] Colin Steinmann: Good afternoon, everybody. My name is Colin. I'm one of the co-founders of OpenARCloud. And today, we're going to have a little panel discussion about privacy and security in the era of constant reality capture. So let me introduce you all to our esteemed panel. Going down the line, we'll start with Damon Hernandez. Damon works in many areas of the 3D industry and has been active in virtual environments and the 3D web for almost 20 years. He is actively involved with the 3D web's convergence with other technologies including IoT, GIS, CAD, BIM, CAM, AR, VR, and mobile, and advises a variety of entities that use these technologies. He is currently at Samsung Research America making the web browsing experience awesome. Next we have Lisa M. Watts. Lisa is a storyteller and technologist known for her business insight, creativity, and relationship building. She's behind some of the most innovative projects in the immersive media space in the last few years. At Fortune 50 company Intel Corporation, she led global marketing strategy and storytelling for virtual reality. She invested in groundbreaking independent projects such as Leviathan, Spheres, Tree, and Zero Days VR, and helped launch the VR League with ESL and Oculus. Next up, we have Kent Bye. Kent has conducted over a thousand Voices of VR podcast interviews featuring pioneering artists, storytellers, and technologists driving the resurgence of virtual and augmented reality. He's a philosopher, oral historian, and experiential journalist helping to define the patterns of immersive storytelling, experiential design, and the ultimate potential of mixed reality. He was a co-organizer of the VR Privacy Summit at Stanford University in 2018. Next up, we have Kavya Pearlman. Kavya is a co-founder of the nonprofit effort, the Mixed Reality Safety Initiative, XRSI for short. XRSI promotes privacy, security, and ethics, and works towards developing standards around application security for virtual reality, augmented reality, and mixed reality. And last but certainly not least, we have Brian Wassom. Brian is a globally recognized thought leader in the brand new legal issues raised by augmented reality and other cutting-edge emerging media. Through his blog, AugmentedLegality.com, and by publishing the first book on AR law, he has achieved victories for his clients in federal and state trial and appeals courts across the country, including in the first case to apply First Amendment freedoms of speech to location-based augmented reality games like Pokemon Go. Wonderful to have you all here. Today we're going to talk a little bit about privacy and security. And so I've got a few topics prepped. I thought we might start, since this is OpenARCloud, by talking about the burdens and responsibilities of running an open AR cloud platform and balancing the needs of all the different parties who are interested in the data that goes into and out of an AR cloud platform. And by that, I mean, how do you balance the needs of all these different parties: property owners, location scanners, people who build augmented reality experiences at that location, consumers who come to consume AR content in that location, businesses, organizations, governments with their own legal requirements about what AR or reality capture is allowed to happen at that location. How are we going to balance this? What's being done today? What do you guys think is coming in the future? I just want to leave it real open-ended. Let any one of you jump in.

[00:05:32.826] Brian D. Wassom: Well, you want a single-sentence answer to all those different parts? To answer that entirely, I think, will take the whole hour and all of us speaking, but at least to get it started, you mentioned property owners. That, to me, is the topic I'm actually speaking on at the Expo later this week, because it's inspired by my experiences in the past couple of years with clients coming to me; it's the number one question they ask. And from another angle, a lot of the developers, even folks that I know through this community, have raised the idea at different times. And everybody sees the issue of, well, how do I own augmented space? How do I prevent things that I don't want in augmented space from being there in certain locations? What makes AR different from any other digital medium is that it's superimposed on the physical world, right? Well, the people that own those locations in the physical world sometimes have opinions about what goes there and what is perceived to be augmented on top of their space. And so the question that I get a lot is from business owners, from property owners, saying, how do I keep that stuff off my building? How do I prevent content that I don't want from being there? My background is in representing media companies, and so I've been a First Amendment purist on these things for much of my career and much of the commentary that I've had on this. And my answer, reflexively anyway, has been that you can't, because property rights control what you can do in physical space. You can prevent someone from physically trespassing onto your property. You can prevent someone from maligning your property in a physical way, from drawing physical graffiti on it with a spray paint can, but you can't prevent someone from putting AR content "on" there, and now we're talking in air quotes, because it's not really there. And that's the mantra that I've kept coming back to: it's not really there, and that matters for purposes of property law, because property law only governs physical things, physical intrusions. It's not meant to govern digital space just because a cleverly designed two-dimensional display on a mobile device creates the illusion that it's really there. It's not really there, and that matters. Now, in recent years, we've had experiences that kind of test the boundaries of that. And I think that, frankly, this is going to be a more nuanced question going forward. Overnight, in July 2016, Niantic rolls out Pokemon Go, and then all of a sudden, people are behaving in ways they never did before. They're walking on property and using property in ways they never did before. That tests the bounds of trespass, of nuisance. Niantic just settled, a month ago, a series of class action lawsuits that were filed right after that game came out. It's been litigated in the courts this whole time, and the answer was not clear. The courts were not as receptive to this argument that, oops, sorry, no trespass, First Amendment issue. They didn't see it in quite such stark terms. Are there arguments to be made for why property law should extend that way? There are certainly a number of law professors who think so now. But yeah, that's my answer to that question: property rights are a real question.

[00:08:44.496] Kavya Pearlman: Yes, some of my thoughts. Really, today is a significant day. That's something that I feel all of us as an industry need to do, from top to bottom, from the organization level to the consumer level. We really just need to come together as the mass adoption happens. We need awareness, stuff like the thing that we launched today, the Ready Hacker One platform, for awareness and media awareness, and for bringing people and contributors together to talk about these issues. Well, it will take a while to get to the legal changes and hopefully introduce AR guidelines into CCPA-type stuff, California privacy law-type stuff. But what we can do now is conduct research together and put guidelines and policies in place. Because I think we can very easily say "evil empire," but the evil empire itself really sometimes doesn't know. Having worked at Facebook on their third-party security during the US presidential election time, I can tell you that the intention of an individual is not to wake up and say, I'm going to violate XYZ today. No, everybody wants to do great work. But they really do not have trickle-down enforcement or accountability. So, coming together, holding some of the organizations accountable. Coming together, doing research to put these policies and guidelines in place. One of the early works, our co-founder Ibrahim Baggili's work, is to find novel attacks in VR, things that have not yet really happened. And the key is staying ahead of the intelligence curve. How do you do that? Do the research, find those hacks, and then verbalize it, standardize it, and let the community know that, hey, as you are going into this domain, these are the possibilities. So they can take that into consideration, make those decisions, and then have that trickle down. Making a statement like "the future is private" is very easy. But then what does that actually mean to the developer who's coding? That's something that needs to be translated, especially in this domain, because there is very little research. So who does the responsibility fall upon? I think it falls upon all of us. And that's something that we need to acknowledge and just really internalize. Some organizations are like, oh, yeah, I'm counting money right now, I'll talk to you after a year or two. But that's fine. Bring them together. I think that's the important piece here. We're in this together.

[00:11:25.261] Kent Bye: Yeah, and from my perspective, I've been really concerned about biometric data and what is going to be made available about people, whether it's our galvanic skin response, what we're looking at, what we're paying attention to, our facial expressions, our emotions. I was just at a neuroscience and VR conference where there was some of the latest research in which they were able to look at the brainwaves in someone's mind and use natural language processing and AI to essentially read your thoughts. And the person presenting was like, within five years, this is going to be possible with non-invasive EEG technology. We're going to be able to know what you're thinking. And then there was this chilled silence in the room. It was like, oh, shit. Do we really want Facebook and Google knowing what we're thinking? And they're working on all this stuff, too. They're actively moving all this forward. And on the one hand, it's going to be an amazing user interface when technology can just read your mind. But where is that data going? Where is it being stored? Think about having your thoughts recorded and stored, along with all of the ephemeral emotions that you're having in a moment. You can imagine you're looking at a screen, you see something in the content, someone distracts you, you look up, and then you're looking at something else, you get really excited, and all of a sudden you have a camera that's been passively turned on that's recording all of your emotions about what you're looking at, and all of a sudden you have a positive emotional reaction to something that you may not want to be associated with. And they're trying to quantify our lives, creating these psychographic profiles to describe who you are so they can sell stuff to you. Now, the thing that I'm concerned about is the fact that there's a third party doctrine that says that any information you give to a third party is no longer reasonably expected to be private. There's the Carpenter case that has some good indications that maybe it's not going to be a blanket universal. But at this point, it's pretty much, yeah, anything you give to a third party, you can just assume has no Fourth Amendment protections. The government can go and get that information. So, if you're a company that is storing a whole bunch of really intimate biometric data about somebody, that means that a totalitarian government could go to you and say, we want all of the information on this person. What were Kent Bye's emotional reactions for the last decade? And the situation that we're creating is essentially the infrastructure for a totalitarian state. Now, on the other hand, there's all of these amazing applications of what we can do with all of this biometric data. I just went to the Awakened Futures Summit, which was looking at the cross-section of psychedelics, virtual reality, immersive technologies, and meditation. So, there's really interesting intersections for how you can use VR as a digital drug. Adam Gazzaley is a neuroscientist who's getting FDA approval for an experience, a game to treat ADHD. And so, we have experiential medicine that's coming. And the healing capacity of what this is going to be able to do: you'll have deep insight into what is happening in your consciousness, you're going to be able to assess yourself and keep track of yourself and do things that we never imagined possible.
All of that at the same time. Those same biometric data technologies are going to be enabling all of this. And yet, on the other hand, that data in the wrong hands could be used not only to predict what you're going to do, but to control what you're going to do. You know, a behavioral neuroscientist told me that, and it felt like hyperbolic fantasy when he told me this before Cambridge Analytica. But to see what is possible with centralized repositories of all this psychographic data at the scale of 2.7 billion people, we're talking about national security implications here in terms of what we're doing, what we're creating. And I think the people that are in this room are going to be at the forefront of creating the alternative, which is the open alternative: the decentralized architectures, homomorphic encryption, differential privacy, finding ways that you can still do real-time processing on this information and get some insight. But do we really need to be storehousing all this data in a repository so they can retrain their AI algorithms on you over decades? And I think that's what they want. They want to capture all this information to be able to train their AI, which is going to continue to get a lot better. As long as they get it now, that's the end game. And so, after the Quest is out, this is a little window that we have to actually put forth alternatives, and to have the people that are in this room create a competitive alternative to these closed, walled-garden platforms. And so I'm excited to be here and discuss this with this panel, but also to be talking to people in this room to see what you think about how to design something that's a lot different.

[00:16:09.864] Colin Steinmann: It's not mandatory that everybody weigh in on every question, but I really do want to hear from everyone. Please, please.

[00:16:17.450] Lisa Watts: So if we think about the past and the current situations, we need to start informing the future. We actually live in a pretty significant augmented reality world already. And so Brian's talking about property rights and projecting an augmented reality visualization on a business or a property; you have a condition today where Yelp is already doing that same thing. They have these algorithms where the negative feedback is more likely to see the light of day. Their justification for that is that, well, that person's feedback is more valid because they give feedback in more places. But if you log on to your business profile, you'll see 10, 12, 15, 20 very positive reviews that will never, ever be posted to your profile, because that person just isn't on those platforms a lot and their feedback is deemed not valid because they're not an active person on that platform. You might see that with Google reviews, you might see it with other things. So we are already having these projections onto the physical world, onto things, onto people, onto all of these, and they're already skewed. Right? So we have to take a look. And I think Christine and I had some very interesting late night conversations about this as we were prepping for today, which were around: we have to have an honest conversation about where we are right now. Because we always kind of think it's over the cognitive horizon. It's always kind of the next thing. The foundations for the future are today. And that's why I'm excited to be here with all of you, because I think we have to acknowledge that data is data, and however it gets projected, just because it's moving from two dimensions to three dimensions, et cetera, it's still the same issues. And we have to make sure that we understand how to correct in the future the sins of the past and the sins of the present. So, you know, I don't have the magic bullet, but just to bring that to bear: AR isn't the next thing. AR is now. And augmented reality isn't just something you put on your face. It's what we're being fed moment on moment, day on day. And at some point in time, we're all going to be in a car that's going to drive itself. We're going to be driven down a street, and we're going to be told what to think. So we need to seriously think about that, because we're already being told what to think. And we're getting very complacent about that. We're kind of getting used to being told what to think. So the next step isn't that far away. So we really need to have a serious conversation and think about what's happening now and how we want to correct that for the future.

[00:18:54.115] Colin Steinmann: All right, next question. Well, Kent, I feel like you read my mind or something, because my next topic that I wanted to bring up was that of anonymization. You talk about collecting quantified self-information and how it's going into these large databases. In the case of augmented reality, that can also be people who are scanning the world and inadvertently giving away behavioral information. For instance, if they have a tremor or something, that can be diagnosed through their shake and inadvertently sent up to this data set, where you thought you were just contributing a little bit of street map data, something like that. So what a lot of platforms seem to be talking about these days as their workaround for collecting this data is anonymization. And you hit on some terms that I'm hoping to get some feedback on: homomorphic encryption versus differential privacy. This is kind of the new privacy debate I've been hearing in the last few weeks. Apple's going around saying, oh, we can still learn from the crowd through differential privacy, but we're not collecting so much information that that information could be reconstructed later. Meanwhile, other big tech platforms are going, oh, well, we're using anonymization techniques, and we're using homomorphic encryption to change the IDs of all the identifying information in this data set. Unfortunately, what we've seen is that researchers at Carnegie Mellon have successfully taken anonymized medical data, cross-referenced it with voting records, and reconstructed the identities of the people in that data set. And so I want to hear from this panel: do you think that it is possible to ever truly anonymize a data set? Is this just a talking point for the tech industry to keep suckling off of data that they probably shouldn't have for a few more years, or is this a genuine solution that we need to embrace, be it differential privacy, anonymization, encrypting IDs, or some other technique? Is there some practice there where we can have our cake and eat it too: have these rich data sets and yet keep them anonymized in such a way that we're not giving away psychographic profiles to totalitarian governments when they come knocking?
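To make the distinction Colin is raising concrete, here is a minimal, hypothetical sketch, not something presented on the panel, contrasting naive identifier-stripping with a differentially private query. The records, field names, and the epsilon value are all invented for illustration.

```python
# Illustrative sketch (not from the panel): stripping identifiers versus adding
# calibrated noise, as in differential privacy's Laplace mechanism.
import numpy as np

records = [
    {"user_id": "u1", "zip": "94103", "birth_year": 1984, "gaze_dwell_ms": 412},
    {"user_id": "u2", "zip": "94103", "birth_year": 1990, "gaze_dwell_ms": 655},
    {"user_id": "u3", "zip": "95112", "birth_year": 1978, "gaze_dwell_ms": 198},
]

# "Anonymization" by dropping the obvious identifier: the remaining quasi-identifiers
# (zip + birth year) can still be cross-referenced with public records to re-identify people.
pseudo_anonymized = [{k: v for k, v in r.items() if k != "user_id"} for r in records]

def dp_count(values, threshold, epsilon=0.5, sensitivity=1.0):
    """Differentially private count of values above a threshold.
    Adding or removing one person changes the true count by at most
    `sensitivity`, so Laplace noise with scale sensitivity/epsilon
    bounds what any single record can reveal about an individual."""
    true_count = sum(v > threshold for v in values)
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

dwell = [r["gaze_dwell_ms"] for r in records]
print("noisy count of long gazes:", dp_count(dwell, threshold=400))
```

The design difference is that the first approach releases the raw rows and hopes nobody can link them back, while the second releases only noisy aggregate answers whose privacy loss is bounded by the epsilon parameter.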

[00:21:03.470] Damon Hernandez: I would ask, who's going to enforce that? Right? I mean, who's going to enforce that someone actually goes that route? That, to me, would be the question I would ask to follow up on that.

[00:21:17.124] Colin Steinmann: I ask the questions here.

[00:21:18.245] Kent Bye: At the VR Privacy Summit, one of the insights that came out was that we kind of need an institutional review board for privacy, but one that's more interfacing with these companies, because there is no real auditing or any accountability there. Now, in talking to different people in the industry, I know that the founder of OpenBCI, Conor Russomanno, told me a couple of years ago, you know, with this EEG data, it could turn out that there's a unique fingerprint in pretty much anybody's biometric data. Now, at this point, a lot of it is data that cannot be identified. But I suspect it's possible that every single piece of biometric data that you're collecting could potentially have some sort of unique biometric identifier, especially if you combine it with other things as well. You may be radiating a very specific signature. Imagine if you're in a virtual world and someone goes in there and is looking at you; they may be able to record your behaviors and what's being radiated. And then, if all this biometric data is out on the dark web, it's anonymized, it's de-identified, and so that de-identified data is a different ontological class than personally identifiable information. PII has very specific privacy laws around it, but if it's de-identified, then that's the thing that they're using.

[00:22:38.057] Colin Steinmann: Can you explain PII to those in the audience that don't know the term?

[00:22:41.318] Kent Bye: Personally identifiable information is any information that can be tied back to your identity. So it's things like your name, your address, your financial records, your social security number. There's lots of privacy laws, and people who are specifically focused on PII and privacy would be able to give a much more in-depth explanation. I'm not a lawyer. I'm just talking to lots of different people. But my sense is that biometric data at this point is being treated as if it is de-identified. So they're basically recording as much as they want, and they're like, well, this is not personally identifiable, so we're good. And my argument is that a lot of this stuff, even gait detection, so as you're moving around, how you move your body and how you walk has a very specific signature, that's PII, that's personally identifiable, that's very easily translated and able to unlock information. So even if you're walking around and you're not trying to hide the gait information, that within itself is PII. That's one thing. I'm not so sure about the differences between differential privacy and homomorphic encryption; I'm looking forward to talking to more experts about that to get more information. But I do want to say one more thing, which is that if you read through the privacy policy for Oculus and Facebook, they have in there all of these things that are essentially identifying you. Like, you're never going to be anonymous when you use the Oculus Quest. They know what your IP is, they know where you are. They have kind of standard stuff in there to locate and identify you. But the thing at F8 that really gave me pause was they said, oh, you know, just like your iPhone, we're going to be able to ensure that you have your own identity and no one spoofs your identity. And you're going to have a security feature of scanning your eye and having a fingerprint. So once they have that, you're never going to be anonymous ever. They're always going to know who you are, even if you're using a friend's headset. They're going to be able to look at your personally identifiable information through your biometric data, which they're going to pass off as a security feature. But be warned, it's not a security feature. It's to be able to identify you.
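As a concrete illustration of the gait point, here is a hypothetical sketch, not from the panel, in which a few crude gait statistics are enough to match an "anonymous" head-height trace back to an enrolled user. The traces, feature choices, names, and frame rate are all invented for illustration.

```python
# Hypothetical sketch of why "de-identified" motion data can still act like PII:
# simple gait features plus nearest-neighbour matching re-identify a trace.
import numpy as np

def gait_features(head_heights, dt=1/72):
    """Reduce a head-height trace (metres, one sample per frame) to a tiny
    feature vector: mean height, bounce amplitude, and dominant step frequency."""
    h = np.asarray(head_heights)
    centered = h - h.mean()
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(h), d=dt)
    step_hz = freqs[spectrum.argmax()] if len(h) > 1 else 0.0
    return np.array([h.mean(), centered.std(), step_hz])

def identify(unknown_trace, enrolled):
    """Match an unlabeled trace to the closest enrolled feature vector."""
    f = gait_features(unknown_trace)
    return min(enrolled, key=lambda name: np.linalg.norm(f - enrolled[name]))

# Enrolled users: feature vectors previously computed from labeled sessions.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1/72)
alice = 1.62 + 0.03 * np.sin(2 * np.pi * 1.8 * t)   # shorter user, ~1.8 Hz bounce
bob   = 1.75 + 0.05 * np.sin(2 * np.pi * 1.5 * t)   # taller user, slower cadence
enrolled = {"alice": gait_features(alice), "bob": gait_features(bob)}

# A new "de-identified" trace with sensor noise still matches its owner.
unknown = alice + rng.normal(0, 0.005, len(alice))
print(identify(unknown, enrolled))  # -> "alice"
```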

[00:24:45.111] Brian D. Wassom: You asked if there was a magic bullet. The magic bullet is consent. These rules only matter if you're gathering data and the subjects of that data don't know about it. But consent is easier said than done. It's a tricky thing. So that isn't necessarily the most practical solution, but that's the answer to your question.

[00:25:04.711] Kent Bye: But they have adhesion contracts with privacy policies that are pretty bunk. I mean, it seems like you sign a contract in the terms of service that says you are consenting whenever you sign on. So that seems to be a pretty big hole.

[00:25:20.152] Kavya Pearlman: Remember, we talked about it at Stanford's Privacy Summit. Consent. What is consent? Do I agree to this? Click here. What if I don't agree? Go to hell. What is consent? Do I really have a choice here? So it's interesting that the consent comes from our legal friend. But it is really just a tool to collect more. And then let's say if you want to withdraw consent, let's say I want to say, hey, I don't want to be on Facebook anymore or something. Oh, your friends are going to miss you. Here is your friend's picture. So all of that collectively breaks consent. I don't think consent is really a good way to protect us or inform us. It's not even well informed. We don't know what we're getting into. There's like a 70 page EULA that you have to look up. I'm sure you could do a good job, but what about the rest of us?

[00:26:24.418] Colin Steinmann: We have a question from the audience. Let me bring you the mic.

[00:26:27.665] Questioner 1 : So consent, I think, is interesting. Let me play devil's advocate for a second with the idea of a paparazzo. I'm taking a picture of a celebrity, and the celebrity does not consent to me taking their picture, but I'm exercising my First Amendment free speech. There's light that is radiating off of their personally identifiable facial features that I am recording on my digital device, and I'm going to profit from that without that person's consent in any way. Does that relate in any way? I just feel that's got to be somewhat tangential.

[00:26:57.129] Brian D. Wassom: Well, consent only matters if it's something you have the right to withhold consent of. We already have more fundamental laws than that, like the First Amendment that say things that you put out there in the public sphere belong to the public.

[00:27:09.557] Questioner 1 : So my gait, is that in the public sphere or not? Because it is, as much as my face is, right? What's that? Your gait? For instance, something like my gait or my EEG, my brainwave scan or whatever, whether or not we had a good way to measure it earlier doesn't mean it's not freely putting it out there in the public sphere just by being me.

[00:27:31.907] Brian D. Wassom: Where we draw the lines is open to debate. I don't disagree with most of what you said. Like I said, consent conceptually is the answer. How we define that, what it really means, where the lines around it are, those are hard questions. But we appropriately give a lot of credence and pay a lot of attention to privacy, especially now that we have an unprecedented capability to gather information, to hoover it all up, right? These were questions that didn't need to be asked a generation ago. So it's appropriate that we give them more attention now, but privacy isn't the only consideration.

[00:28:06.505] Kent Bye: I just want to jump on the consent point and say that when you install like Instagram, you say that I'm consenting to let this app use my camera, but there's also patents that Facebook has that says they can use passive data capture to be able to look at your face as you're looking at content. I don't know if they're doing that or not, but everything that's in their privacy policy allows them to do that. And they have patents that say that we've got technology to be able to, like, capture your emotions as you're looking at content. So there's, like, levels in which they're asking us to give consent for all these things, but yet the degree to which they're taking it is to the extreme. We kind of have to assume the worst, honestly. If it's written in the privacy policy and we consent to it, then we just have to assume that they're doing that. So we have to assume that they're listening to us or looking at our emotions. And so it would be nice to be like, oh, do you consent to having your face secretly recorded and having your emotions harvested? I'm sure there'd be a lot of people that would be like, no, I'm not really that OK with that.

[00:29:10.057] Brian D. Wassom: It's not an assumption. We know they're doing that, right? All the revelations of Facebook putting intentionally negative content out there to manipulate user reaction and see if negative content was more interactive than positive content, that kind of stuff, we know this.

[00:29:25.499] Kent Bye: The recording, the secretly recording you with your camera. I don't know if they're doing that. They have patents for it, but it's super sneaky if they are, but it wouldn't surprise me.

[00:29:33.365] Colin Steinmann: I'm seeing a lot of excitement from the audience. I have four hands that have gone up, so I'm, I have more canned questions, but I think this audience has even better questions. So you were first, sir.

[00:29:44.204] Questioner 2: So it feels like it's getting to a point where constitutional rights are being infringed upon, or perhaps even impeded. So what would the government's role be? Would they have one in playing a part in governing this, from the first question about actual space and real estate through to privacy? To me it seems like that would be a necessity, because they would be the only ones that would have the authority to lay down some sanctions. But I'd love to hear your thoughts.

[00:30:21.318] Damon Hernandez: Well, on this one, I'll default to the lawyer to chime in. Some of the work that I've done in the past was dealing with actual policymakers, people who are on the Cybersecurity Council and some of these others. Because when we talk about the consumer space, what we forget is that someone owns this property. And for them, Congressman Himes was very freaked out about the fact that his kid could come in and play a HoloLens game and be mapping the interior of a federal building. Now, that's not talked about. Sure, that's cached locally, but that's going to go to the cloud one day. And what was interesting is that the person who initially went down this rabbit hole is not in the AR/VR space; they're just kind of a security nut. And so I think that one part of the value of organizations like this, people like Brian and others who are going forward, is looking at how we start to put these things in place, to have the discussion around guidelines before the pendulum swings to the extreme other way, to where Congress is calling people in because now it's a PR disaster, okay? I went to a facility, and while I'm trying 6D.ai or whatever app, I'm mapping the inside of it, right? And so I think that from a policy perspective, those people need to be informed for damn sure, right, of what's going on. And then they just need to have confidence in the folks whose rabbit hole that is, I personally feel, to manage it. But to me that's something that no one is talking about, because I deal a lot with the built environment sector, and they give a shit if you're going into their space and scanning it. And so that's a whole different thing from "I'm putting a Pokemon in your backyard." It's saying that you're either collecting the IP of my design that, as the architect, I don't want to share, or you're collecting the interior of my space that, as an owner or developer, I don't want to share. There's a lot of these things. For me, from the physical environment side, it's not so much "I'm throwing some digital litter on your property" as "I'm actually acquiring your space." The policymakers do need to be somewhere in that conversation.

[00:32:26.639] Questioner 2: I guess the big question, from a bigger-picture standpoint, is that it'll just be a lot easier to connect internationally, and internationally there's a lot of different cultures that perceive different actions differently, what is lewd and what is not. Is that something that has to go to, I don't know, the UN or, you know?

[00:32:54.212] Damon Hernandez: I think other places in Europe are going to do it first. We'll do it last, to be quite honest, I feel, as a nation. Even then, they'll leapfrog a lot of that legacy. So I think that definitely Europe will follow a lot there, much like GDPR. And I think that in these smaller countries, like the Scandinavian countries, the size of the Bay Area, you can definitely do it. And then also, I think that we'll see it in more foreign-type governments, right? I mean, look at China. You think they're just going to be having scans and AR cloud going all willy-nilly to Facebook and these others? Hell no, right? That's another great example. So I think that we'll see it in these other places where they care about that more, and then hopefully we can take those best practices, those best processes, and then bring those back home.

[00:33:37.188] Kavya Pearlman: I feel like the whole government piece is very interesting because haven't we seen already what the government can do? They hold congressional hearings after people move fast and break democracy. So that's something that is to be expected that, of course, we'll have some congressional hearing after multiple disasters, but I think what we can do is, as technologists, as stakeholders, we can potentially bring the policy makers to the table, have these conversations, and better inform them that, hey, some people are moving fast and breaking things. And these are the implications. And let's do something about it before it's too late and then you organize these congressional hearings, five or ten of them. But what's the point of it all? We have already broken things. And that's why for XR domain, I mean, oh my gosh, I'm so thrilled and I absolutely am honored to do this. And that's what we are committed to. That's why this collaboration is so significant. We're gonna partner up and we're gonna bring these policy makers to the table and have these conversations and at the global level. Because the US itself, okay, there are loopholes everywhere. So if this country doesn't listen to your voice, yes, we'll go to the UN. We will talk at the global level. Somebody has got to listen to it because the implications are for the humanity. It's like, how can you possibly ignore this? We can't. So bring the government to us and have these tough conversations.

[00:35:21.997] Colin Steinmann: I see that Lisa has something to say too. So I want to hear that. And then we're going to try and get a few more questions in before we're out of time.

[00:35:27.722] Lisa Watts: Yeah, I mean, again, in a role to anchor it to what's happening now, what you guys are saying is the equivalent of the stop sign not going in until somebody gets killed, right? So we need something like Red Hat, you know, an independent global body that is looking for security issues, that can operate with some amount of clout to expose things like the EULA giving permission for whatever it's giving you permission for, without having some way to balance the scales. Because right now, there's a lot of great things that can be done online, and we all want access to those things. But in a lot of ways, you're held hostage to the EULA. You want the ice cream, then you have to do all these terrible things to get it, right? And there has to be some balancing of the scales. And whether it's the equivalent of Red Hat, an organization like this, or some other body that actually builds the international clout to do that and goes to the UN and exposes these things for what they are, so that Europe or whoever is going to operate first can act, we need that sort of inertia around personal privacy and safety and all those sorts of things.

[00:36:41.716] Kent Bye: Yeah, I just wanted to jump in quickly and say that there actually isn't a consistent, constitutionally given right to privacy. It's kind of a patchwork of multiple amendments, and there's different laws in different contexts as well. So children have COPPA to protect their privacy. Then the FTC, ultimately, is the arbiter of enforcing privacy violations with these FTC consent orders, which, you know, Facebook and Google both have. But there isn't a consistent entity. It's not a singular vision that's holistically taking a look at everything. GDPR, I think, is a good first step in Europe, but that's not the United States. And so, in a very specific legal context, I don't know if you have any more to say, from the lawyer, the one single lawyer on the panel. But we should probably go to the next question.

[00:37:33.623] Colin Steinmann: Yeah, let's get a few more in here.

[00:37:35.044] Questioner 3: I was just going to ask a related question. But also, it's strange to me that this whole AR Cloud thing is not just about private companies. The government itself will be spatialized, traffic and ordinances and all that. And I think the spatial rules and permissions and contracts on those are, in some ways, the most critical. Because the private sector we can change; we can do things. But what gets actually baked in is where you don't have consent. The government doesn't ask for consent to be the government; you have to be here. So I was wondering if there have been any thoughts about how we can start to frame the spatialization, or the AR cloud, of the government, because I think that's the one that our society needs to get right the most accurately, and then everything else, I think, will almost be an echo of how that one is structured.

[00:38:24.225] Kavya Pearlman: What just happened in San Francisco, you know, the surveillance ban, that law, gives me hope that there is still some time to really bake these things into the whole ecosystem. But we have to talk about it. We have to be bold. We can't be wishy-washy. We have to make some tough decisions and bring people to the table. But I'm just hopeful. That's all.

[00:38:50.741] Kent Bye: Not necessarily specific to the government, but I did an interview with Mark Pesce, who was talking about the Mixed Reality Service, which was some method to try to mediate permissions in a decentralized, blockchain manner, if you wanted to give consent on your property for a lot of people to augment it or do specific things, or if you wanted to maybe restrict it in some way. The Holocaust Museum and people playing Pokemon Go is an example where, you know, there is maybe a free speech right for you to do that, but if it's private property, then maybe you also want to create a certain culture around a shared agreement. And is there going to be some way to mediate that? That's what comes to mind when I think of shared space and how to control it; there could be some method like that. I haven't thought too much about the governmental side. The first thing that comes to mind is that the FCC regulates the wavelength bands for the different frequencies. But this seems like it's just spatial and nothing that would be specific; it's kind of open to anybody to do whatever they want. So I don't know what would need to be mediated.

[00:39:56.932] Damon Hernandez: I would split it between two, just real quick, right? Go to the small, like I said earlier, right? Look at what the Scandinavian countries would do. Your Finland, your Denmark. I mean, the whole 3D model of Helsinki is available, right, for people, along with some of Espoo and some of the surrounding area. That's fascinating, right? So you have these small countries who are already progressive and thinking that way. Now go to the other scale: China. And so that's where I think what's going to be interesting from a municipality perspective is, how does a government do this? Because a government can't just do it, especially here in the US. If you ever want to deal with it, go to five different cities and ask them for their 3D city data. First of all, see what you get, right? And so, given the way that they've already organized their data and how they don't know what to do with it, unfortunately, again, I think you're going to have to go to these other countries to look at it from a country perspective: this is how they did this well, Singapore and their Smart Nation initiative, or Virtual Singapore, and then try to bring that back. Or then look at the people, like Mayor Steve Adler in Austin and some of these other places, where the cities themselves are progressive, like what Gavin Newsom did with DataSF and some of these others. Start there. But I think there are those places where it's more fertile ground for that type of discussion.

[00:41:10.752] Colin Steinmann: All right. Tony, I saw a question from you earlier.

[00:41:16.335] Questioner 4: I don't know how this is going to go over, and I can't believe it's coming out of my mouth, but I'm hearing a bunch of people really complaining, and we're all in this tizzy and getting all anxious. And so, the way my mind works, I started going, well, what are we really concerned about now? Right? Like, number one, I don't want people to know my stuff because I'd be embarrassed, or I'm concerned that you would try to exploit me. Or maybe my security?

[00:41:47.328] Damon Hernandez: I care about, as an enterprise, my IP first. That's the thing I care about, the stuff that's going to impact my budget. What concerns me, and I can't speak for the rest of the panelists, because everyone who knows me knows that I couldn't care less about business-to-consumer. So, from the enterprise perspective, when I'm augmenting a piece of content into a space, I care about who's looking at that content. That CAD model that's coming out of that CAD package into Unity, which already freaks me out, and then into an AR solution being stored in a cloud, those are the things that concern me. Because it's intellectual property. It's intellectual property, yes.

[00:42:28.695] Questioner 4: And you're concerned, but there are already laws about somebody using your intellectual property.

[00:42:34.298] Damon Hernandez: But where do I find the weak link in that? So like upstairs in that conversation, right? We were talking about that. Now, if I'm gonna use one of these solutions and that data leaks out, do I go after the Unity developer who developed it? Do I go after Unity because the weak link is there? Do I go after the cloud service that they used, right? Where do I even know where to go, right? And so for me, I care most about one, the intellectual property. The other thing I care about, which isn't too much, but we're starting to see more of it because of awesome engineers like one of the ones in this room, is where people are hooking up the AR to IoT and sensors, and not just one-directional, but bi-directional. So now I'm pulling up an AR experience where I could shut off the ventilation in this room, and now you're controlling the physical space. So those are the two things that I personally care the most about, is IP, and then the control of that back into the physical environment.

[00:43:31.799] Kent Bye: And I could spin off a lot of blackmail scenarios that would really freak you out. The ones that come to mind are, like, OK, there's eye-tracking data. I could look at what your pupils are doing. I know what you're interested in. I know what you like. I know what your sexual preference is.

[00:43:47.251] Questioner 4: But what I'm saying, like, I run the security and privacy group.

[00:43:53.499] Kent Bye: So I'm just trying to say that the issue is that once you start to storehouse this data, you can have 10 years of data on people, and you can say anything you want about anybody. There's a lot of technical debt with machine learning, so it's not like it's perfect and going to understand everything. And if some filters get run on that data, and that data has no reasonable expectation of being private, you have a potential totalitarian government that could go to these companies and basically profile anybody. And this is happening in China. You have the social scores on people; they're putting a number on you, they're quantifying whether or not you can travel, whether you can get on the bus, whether you have access to education. This is actually happening right now. There's social scores that are implicit within any sort of social VR or AR application to be able to mediate trust. Well, what happens if the government gets that and starts to dictate what you do and do not have access to as a citizen? I mean, if you look at what's happening in China, it's really scary. They're having, like, re-education where they're... It's a credit score, man.

[00:44:52.624] Damon Hernandez: It's a different kind of credit score.

[00:44:54.524] Kent Bye: It's a different credit score, but it has real impact on what you have access to in life. And they have the ability to put you in a VR headset, stress test you, and really ask, are you faithful to the communist government? And if you're not, then... I mean, this is the implication of looking at the biometrics. It's like a polygraph test.

[00:45:14.386] Questioner 4: You're concerned. The conversation is that the government's going to get hold of this data and they'll know you intimately.

[00:45:21.864] Kent Bye: It doesn't have to be the government. It could be anybody who's storehousing it. It could be on the dark web. And then anybody could watch you in a VR experience and get your biometric data just from watching you. And if they have access to that, that's basically the most intimate information about you for a decade. I mean, we already have that amount of information about us on the web right now. So you go forward 5 to 10 years, and that's just an enormous amount of information that's essentially a Rosetta Stone to your psyche.

[00:45:51.341] Questioner 4: I don't mean to be argumentative, but the concern, to distill it, is that I'll be exploited if I expose myself to that degree.

[00:46:03.903] Kent Bye: It's like, right now we have social media filter bubbles where Facebook can make you have certain very specific emotions. Well, extrapolate that out to every dimension of your reality. If that gets into the wrong hands, if they can predict your behavior, they can control your behavior.

[00:46:23.123] Damon Hernandez: If it makes you feel better, as a hardware manufacturer, when it comes to putting EEGs in these headsets and others, unless I can have a dedicated million-plus units committed to that, it's not going to happen. That's why you're seeing these third-party peripherals. We had a conversation at Stanford where people were wanting all this biometric data, and the response to them was, hey, for your entire industry, which is pretty damn small, from just Samsung, how many millions of units can you commit to in order to put in one sensor? Because now I have to change a whole manufacturing line and a whole bunch of other things. In time, if there is enough monetization incentive for the hardware manufacturers to put that stuff in there, they will, but I don't think that's going to happen any time soon. Eye tracking, yes, but some of these others, that's all going to be third party, I feel, in the near future.

[00:47:11.187] Lisa Watts: If we come back down to the privacy and consent that we started on, some of it is really about just being a human and having those choices and understanding what choices you're making, and how we make that more obvious and easier so that we can stay in control. And maybe we don't get to choose where we live under a government; we have to make hard choices to live, to survive. But as humans, I think we would love to live a very enriched life that has a lot of choice. And I think that comes down to, fundamentally, as a human race: can we support and celebrate and give humans the tools to let them be human? And you can see totalitarian governments are not on that same page. Are we going to let the rest of the world go that direction? What are we going to do about it?

[00:48:03.235] Brian D. Wassom: I would throw out though, just that this is along the lines of what you're suggesting. These are questions of degree, not kind. Speaking of being human, humans have gathered information on other humans since the dawn of time. Governments have gathered information on their subjects from the dawn of time. We're just now talking about they can do that even more, more efficiently. And yes, that's significant, it has consequences, but it's also inevitable. And so the question is, if that happens in China, that's probably more scary than if it happens here. And it's also the consequence of living in an open society. We get the benefits of an open society, but we put a lot of information out there as a result. But the answer to that, though, is more openness, more accountability, holding the companies and the governments that have this information and are in the position to do something nefarious with them accountable. and not letting them get away with it. That's why we do things like this. But there are plenty of places in the world where we might not be able to have this conversation. Here, we're fortunate enough to be able to do that. And so we need to be vigilant about holding our government accountable and making sure that slippery slope doesn't happen.

[00:49:05.552] Colin Steinmann: All right, we're a little over time, folks. Got two questions in the audience. If the panel's OK with it, we'd love to try and squeeze them in.

[00:49:13.479] Damon Hernandez: Traffic for two more hours.

[00:49:14.460] Colin Steinmann: All right, all right.

[00:49:16.119] Questioner 5: Yes, so my question is from a while back, but it kind of maybe will help wrap it up. When you were talking about consent, something popped into my mind: a law in the US about attractive nuisance, which is a property law. It basically says that even if you have a fence, if there's something attractive in there and somebody climbs over the fence and gets hurt on it, you're actually still liable. It was very interesting, because I read every legal document I've ever signed, except for on the internet now, because they're 200 pages long or 70 pages long, and you just want to try the app out anyway, right? So my question is, what are the tools that are in place now that we can use to have a fighting chance? I think IP, if you look at television, there are some rules there for IP. Kent mentioned the FCC. I've always thought that's where the power is, because all these companies are using public airwaves. What tools do we have available? What law sets do we have available to help us out?

[00:50:16.174] Brian D. Wassom: Sure. Well, I think AR actually offers some solutions in this regard. So for one thing, attractive nuisance applies to children and people of diminished mental capacity, where you have a heightened accountability for drawing in people to something that's dangerous and they don't have the capacity to judge that it's dangerous. That's an issue in AR, but that's a different issue, I think, than what I think you're intending to talk about.

[00:50:38.678] Questioner 5: But you don't actually know what they're going to do. You don't know how you're going to get hurt by what they're trying to do.

[00:50:45.730] Brian D. Wassom: Yeah, well, yes, nuisance is different from the kind of torts that we're talking about. The FTC, the Federal Trade Commission, I think is one of the most important agencies that we have to govern this stuff, because they have a very flexible power to control what they consider to be deceptive trade practices. And we've seen that evolve even over social media. So with social media ad campaigns, they've been very vigorous about enforcing the disclosure of paid content. If a celebrity tweets about a product and they're getting paid for it and it doesn't say hashtag sponsored or hashtag ad, that company is going to get in trouble, and the same for influencers doing the same thing, because we want to be able to trust that what we read doesn't have an ulterior motive that we're not aware of. And there's nothing in the law that said that until they interpreted it that way. So they have that flexibility. I think when we're moving towards an augmented future where we have the capacity to throw data up in our faces in a way that isn't 12 pages of text, but rather right up in our eyeballs, it's a lot easier then for consumer advocates to push for regulations, or for the FTC to interpret the law, in a way that requires you to put disclosures up in your face in a way that makes sense. If you walk into this room or you're in this area and you're going to be video recorded, or your EEG is going to be recorded as a result, you can mandate that that kind of information be thrown up in your face, a big red flag that says, hey, do you consent to your brain waves being recorded right now? Rather than having to click a box that says I agree.

[00:52:17.154] Kavya Pearlman: I feel like there is no tool. However, there is an opportunity, and it's building. I think Colin was mentioning it earlier: this opportunity is to build sort of an MPAA-type rating system, where when you walk into these spaces you can immediately, from the labeling, tell whether it is super secure, semi-secure, or whether I'm just out in an open space. And that is an opportunity for, I would say, all of us to come together and build that sort of framing and that MPAA-type rating system, so that immediately I should know. Okay, those kinds of tools would be helpful, but we don't have them yet.

[00:52:52.282] Kent Bye: Yeah, two quick thoughts. One is that you can look to social VR experiences like Rec Room, where they try to give you an experiential tour of the culture and teach you what the code of conduct is for how to interact with people. And I feel like that's more about how you're interacting with the other people within the experience itself, but there's nothing equivalent for the privacy policy or the terms of service. It'd be nice to see that handled with an experiential initiation into using the app, so you were fully informed. Another point is that for anything that's recorded or captured, there's no sort of auditing trail. Companies are essentially saying, we can record as much as we want, and we can change it at any moment. So I will do an interview with, like, Facebook, and I'll be like, are you doing this? Are you recording conversations? And they'll be like, no. But right there in the privacy policy, it says they can. So the next day, the very next day, they could start recording it, and there's no accountability for them to disclose, like, oh, by the way, we are now starting to record this. So there's a lack of transparency and accountability around what the companies are actually doing. And the last point I'd make is that there's the third party doctrine, which I think is the most egregious law out there, in terms of saying that any information you give to any third party is basically no longer private. And that needs to change. So in terms of a tool, there's the ACLU and other entities like that that are trying to make changes to these types of underlying laws that are the fabric of how things work. And they need to change and evolve to match how people are actually using technology today, because that doctrine, and the way it's being practiced right now, doesn't take into consideration all the things that have changed over the last 30 years with technology.

[00:54:38.122] Brian D. Wassom: Just a couple of years ago, the Supreme Court said the third-party doctrine no longer applies to cell phone location data, right? They can't warrantlessly search your cell phone location data, even though it's technically in the hands of a third party. So it's moving in that direction.

[00:54:51.449] Colin Steinmann: How's that working out so far with the cops and the warrantless location data?

[00:54:56.772] Brian D. Wassom: There are limits.

[00:54:58.355] Kent Bye: Yeah, the Carpenter case is a good trajectory, but I think there needs to be a lot more momentum and awareness within the larger industry, and a lot more help and support to bring more cases that challenge these things.

[00:55:11.298] Colin Steinmann: All right. Final question of our panel. Please, sir, take it away.

[00:55:15.439] Questioner 6: I didn't realize it was going to be the last question. I didn't want to end it on such a harrowing note. But I think there's an important distinction we have to make. When you were talking about legal protection against corporate espionage, that implies the assumption that people won't do it just because there's a law against it. And I think when we look at power between government and people, it's always a struggle between the ability of the people to defend themselves and the ability of a government to encroach on what is against the law. So my question for the panel is: as individuals, when it comes to defending ourselves from larger entities, whether corporate or government, misusing our data, do you think our vulnerability is increasing or decreasing? And if you do think it's increasing, is it increasing at an accelerating rate? And also, what legislation would help us defend ourselves from that?

[00:56:12.803] Damon Hernandez: Did we just get put on the spot? That's a great question. As far as the legislation goes, it's still so new, right? I think that's one of the issues. The whole point of the Open AR Cloud group is to define this, right? So how can you start to set rules or legislation or anything else when a lot of people don't even know what it is? So I think we have a lot to do going forward. What would be nice to see is the precedent set by the companies and platforms that are in this room, right, addressing the concerns people have raised. I'd love to know, when I come and visit some of these platforms in six months, or just hear their narrative, or as a developer, how are you now saying, hey, I'm going to let you know I'm reading your brainwaves and all this other stuff, and I'm not burying that in a TOS that you click through because you want to get into the experience, right? That is a great question, but I am very much freaked out, because even though I did not say it was about embarrassment, I think a lot of it is, right? I mean, let's try that as a thing after this, right? Everyone grab one of those meetup name tags, write down the top three things you're most embarrassed about, and walk around the room wearing it, right? So I think in one way that could be a key thing. But again, not to go too far, I think it comes back to how the companies involved set the precedent, how guidelines grow out of that, and then how those companies can be involved with associations like this one to provide that governance and direction, and then have that conversation with the attorneys, the policymakers, and the others. That would be something I might suggest as a way of looking at this going forward.

[00:58:08.447] Kent Bye: My opinion is that we're definitely more vulnerable, which may not be a surprise based on what I've been saying. There are many different vectors for addressing this variety of issues. First, I'll say that Edward Snowden, with the release of all those documents, made it clear that the fire hose the U.S. government has into these social media companies is essentially a goldmine of intel on everybody who is not in the United States, and potentially even on people who are in the United States, who knows what is actually happening. There's this sinister connection between the government and these companies that is getting them all sorts of information about what's happening around the world, and I don't feel like there's going to be a regulatory impulse to stop that; it's only going to grow. So Lawrence Lessig has a very nice way of looking at these different vectors. There are laws and regulation, but there's also the culture. There's awareness, with people deciding to delete Facebook or boycott it, but people are still very happy to make this trade-off between getting free stuff and mortgaging their privacy. People get free stuff and they like the stuff, and sometimes when you cut off the data, the services arguably get worse, so the data is improving their direct experience. That's the sort of contract we have. So culturally, I think there's still a long way to go. There has to be a lot more direct impact on people's lives, more than just having our democracy hijacked by information warfare, for people to actually wake up and say, this is something where I'm going to change my behaviors day to day. But there's also technology, architecture, and code, so that you can actually build something different. I think that's what people like Mozilla are doing with Hubs: they're building something that's architected with privacy in mind, and that's something to look at. But then there's also the economics, the business case. Are the people in this room going to be able to create a viable alternative to the powerhouse of the Facebooks or the Googles or the HoloLens or whatever entities are out there? I feel like Microsoft is more or less on the better side than the other entities. Apple seems to be in favor of privacy, but Apple, at least, is still to some extent a closed walled garden. So I think each company is taking a different vector. There's always going to be what Neil Trevett of the Khronos Group told me a few years ago, and I say this all the time: for every successful open standard, there's a proprietary competitor. So I feel like there's going to be that. The closed platforms are going to stay closed, and they're going to do stuff way better than anybody else can; they're going to be that console. But we need those open alternatives out there. So if anything, it's not going to be, hey, let's have the government save us in this case. I think it's going to have to come from all those different vectors.

[01:00:58.228] Lisa Watts: And I would just say there's not one ring to rule them all. There's not one thing we can do that's going to solve all our problems. It's everything together, it's everyone's perspective. We just have to remember that this is not for the faint of heart, and there's no lazy way out. We've got to pull together as a community, and everybody's got to do their part, starting with the developer who's making that choice. I might make the choice to recycle, or reuse containers, or not consume so much paper, or reduce my plastics. It's exactly the same kind of decision that can be made at an individual level, a developer level, a policymaker level. It's on all of us to make it, and we've got to do the work. That's the only choice we have.

[01:01:40.955] Colin Steinmann: We got some of the right minds in the room today and I hope that's a good start. Guys, that was an amazing panel. I wish we could run it for another two hours, because I have so many more questions. But I think it's time for closing remarks, so I'm going to hand it off to Jan here.

[01:01:54.491] Lisa Watts: Wow. Did you like the panel?

[01:02:02.666] Kent Bye: So that was the State of the AR Cloud Symposium and Showcase panel on privacy and security, featuring Brian Wassom, a thought leader on augmented reality law; Kavya Pearlman, founder of the XR Safety Initiative; myself from the Voices of VR; Lisa Watts, founder of One Slash 21; and Damon Hernandez, a product manager at Samsung Research. So instead of going through a lot of my specific takeaways from this panel discussion, I'm going to defer most of the synthesis to the final talk in this series, which is a distilled-down, 30-minute run through all the takeaways. These open-ended discussions, I think, are good for ideating and brainstorming, and this one really helped me to map out the landscape even more. The big new thing in this discussion for me was the conversation around the legal aspects: the law, and the First Amendment right to put augmented reality content into virtual space because it's quote-unquote not real. I'm not a huge fan of the argument that relies on saying these immersive experiences aren't real and that's what makes it okay. I think that's in contradiction with other things like harassment, for example. If you take the stance that immersive experiences aren't real, then what do you say about harassment? Because that certainly feels just as real as anything else. So I do think it's a real experience, but it's not real tangible property. When it comes to property rights, it's this etheric realm that is able to be overlaid, like a bubble of your own personal meaning: as you walk around, you're able to associate your meaning onto things and put it into this etheric virtual world. So I think there is a difference between the digital and the real in that sense. Experientially, I think it's just as real as anything else, but when it comes to a different class of how that's handled legally, Brian is starting to define that a little more clearly. So anyway, it was just interesting to hear some of those perspectives. And like I said, there's going to be a deep dive into my XR ethics manifesto in a video that goes into a whole synthesis, so I'll defer a lot of the other stuff to that. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoyed this podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. I rely upon donations from people like yourself in order to bring you this coverage. I like the independence that gives me, and I'd like to be able to cover all these topics and do the work it requires to do these deep dives. It takes quite a lot of time, effort, and energy, and if you value that as a service to you and the larger community, then become a member of my Patreon to help support these types of conversations. You can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
