#832 XR Ethics: SXSW Panel on XR Ethics

For the past seven months, I've been investigating the topic of ethics in XR by participating in a number of panel discussions, conducting interviews, and presenting talks. This year I was invited by Rori DuBoff, who leads content innovation focused on extended reality at Accenture Interactive, to participate in a sponsored panel discussion at SXSW on March 10, 2019. I participated in this panel on Designing Ethical Virtual Worlds along with Jessica Lauretti, who is the founder of This Is Laurels, the former head of RYOT Studio, and the person who built the VR news arm of The Huffington Post.


LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So ethics and privacy in XR is something that I've been talking about for over three years now. But for the last seven months, I've been doing this deep dive of going around to different locations, participating in panel discussions, talking to lots of different people in the industry, giving a number of different talks, and just trying to synthesize all these different conversations. Ethics is a little bit like trying to figure out your own sense of moral intuition and what the threshold is of what's okay and what's not okay. And because XR is mashing together all these different contexts, it kind of requires you to look at this whole variety of different issues and then try to come up with an ethical framework to make sense of how you engineer specific experiences that take into consideration all these different ethical trade-offs. The thing about ethics is that there are certainly going to be benefits, but there are also going to be harms. And you have to weigh the benefits against the harms as we're creating these immersive technologies. And there's no perfect answer. It's not like you're going to completely figure it out. It's just going to be the least-bad solution. And that's challenging for people who want to have those clear answers. So when I talk to someone like Diane Hosfelt, she talks about privacy engineering as something that has no perfect solution. I was trained as an engineer, and engineering is this practice of trying to weigh all these different trade-offs and then trying to design a system that serves all of them the best that you can. And then sometimes with ethics, it's like looking at the whole picture and seeing there are so many different aspects of life that you're now starting to try to integrate. And so it's actually a very challenging problem, and going around and talking to all these different people has helped me start to cultivate my own sense of moral intuition when it comes to the vast landscape of virtual reality. And so this is going to be a 14-part series. There's also episode 831 with HTC, which dives into HTC's approach, but this series has over 11 and a half hours of actual interview content, and I'm going to be going through and trying to introduce and contextualize this whole journey that I've been on for the last seven months. So where we're going to begin with this is at South by Southwest. Back in March, I was invited by Accenture to speak on this panel. It was a sponsored panel that they did at South by Southwest, and they wanted to have this conversation about ethics and virtual humans. And I think this is a good place to start, just because it starts to lay out the landscape. This is something that I had been covering for a while, but this was the first panel discussion that I had given about it, and I was trying to find these different dialectics and trade-offs and see what those equivalence classes are and what you have to trade off. And so this features Rori DuBoff. She is the managing director of innovation at Accenture, focusing on content innovation and extended reality. And Jessica Lauretti, she's the founder and principal of This Is Laurels. She's the former head of RYOT Studio, and she helped build out the VR news program at The Huffington Post.
So that's what we'll be covering on today's episode of the Voices of VR podcast. So this panel discussion happened on Sunday, March 10th, 2019 at the South by Southwest Festival in Austin, Texas. So with that, let's go ahead and dive right in.

[00:03:22.963] Rori DuBoff: Good afternoon, everyone. So I'm really excited to have this opportunity today, because we're going to have a really great discussion. So I'm going to introduce myself, then let Kent and Jessica talk. Then I'm going to speak a little bit. And then we just wanted to give the opportunity for each of us to give a little bit of a POV. And then we're going to have a discussion, and then, if we want, some Q&A. So my name is Rori DuBoff. I run content innovation focused on extended reality, XR, VR, AR, all of that, at Accenture Interactive. I've been at Accenture for the last two years, been in this space for the last five or so years, and come from a background in the digital world of all sorts. So that's me.

[00:04:05.282] Kent Bye: Yeah, my name is Kent Bye, and for the past five years, I've been traveling to a lot of the major VR and AR gatherings. And at this point, I've done over 1,000 interviews with a lot of the pioneers of this modern resurgence of AR and VR, really tracking the evolution of this communications medium. And so ethics and privacy is something that I think has organically emerged from that community and has been a topic that I've been covering pretty closely over the last three years or so.

[00:04:30.271] Jessica Lauretti: Hi everyone, my name is Jessica Lauretti. I'm the founder and principal of Laurels, which is a consulting practice that works with media and tech CEOs and startups. In my previous life, I was the global head of RYOT Studio, which is the creative studio for Verizon Media. And as a part of that piece of work, I built the virtual reality news arm of The Huffington Post. So my background is more in documentary filmmaking, and I come to VR from the perspective of a content producer.

[00:05:07.806] Rori DuBoff: Great. So I'm going to just kick it off here. On that cover slide, by the way: I actually had a baby four months ago, and I got to spend the last three or four months trying not to focus on technology and just hang out with him. It forces you, when you're like, I'm so into VR, AR, this whole world, and then you have a child, to try to reconcile that with a little baby. And it really strikes at a lot of what we're going to talk about today, which is: what's good for the next generation? Because I believe in technology, but I want to believe in it in a way that can benefit the future. When this idea of having this discussion came up, and I think there are a lot of panels going on at South By right now about ethics, so this is a big topic, it reminded me of the Star Wars analogy of the forces of dark and light, or good and bad. And we've been seeing that in the last year with what's happening on the web, with the dark web, or deepfakes, or these bots where you don't even know if they're human or not. And there's so much controversy around: has this technology been doing good for us? Has social media been benefiting us? And as we move into this space around VR and AR, we are facing these challenges very much so. And we'll talk about them more, but I frame it up as this world in which we need to figure out that balance. There are a lot of companies right now talking about what that future world is. So you hear Magic Leap call it the Magicverse, or you hear Microsoft referring to the Edge, or you have the Mirrorworld, or the Mesh, or the AR Cloud is another big area, or the Oasis from Ready Player One. And everybody's trying to claim or coin that future state that merges the physical and the digital worlds together. And what is that vision? What do we want that space to look like? How do we learn from what happened with the web as it is now, or social media? And how can we possibly make this new space achieve the right principles that we want to achieve? So for me, when I think about designing ethical immersive worlds, the key points that I think about are this notion of being human-centered, that it all comes from having a purpose and a real need that people have for using technology, so not tech for tech's sake. I think about accessibility, the ability for not just a certain class of people or a certain group of people to use technology, but the ability to make it affordable and accessible to all populations. I think about authenticity, and that's a big one, right? With so much of the discussion around deception in the news and media, how do we make communications and content authentic? I think about accountability, especially when we talk about data and how companies and brands can be accountable. And I think about empowerment, because for me, the XR space is enabling us to do things in ways that we never thought possible in 2D media. It empowers you with these superpowers, a lot of new opportunities, to go places and see things that you couldn't up until now. So we're going to dive deeper into that. That's kind of how I frame it out for how we're looking at this at Accenture Interactive and how I'm approaching it.
I'm gonna now let Kent talk a little bit. As he mentioned, he's done thousands of interviews and had the chance to talk to some spectacular people, and I imagine you'll now be able to share some of the things you've been discussing in this space.

[00:08:37.167] Kent Bye: Great, thanks. Yeah, I'm going to hop down here. All right, so yeah, my name's Kent Bye. I do the Voices of VR podcast. And I'm going to be exploring these four major aspects of the open questions of XR ethics: privacy and biometric data, the ethics of virtual humans, virtual harassment, and trauma and consent. So privacy and biometric data, I think, is probably the thing that concerns me the most, because what VR affords is being able to capture all sorts of information that you're not even aware of. And that information could actually tell someone else more about you than you know about yourself. So what does it mean when you start to give over that information to someone who may be potentially trying to control and manipulate you? There are all sorts of different biometric data streams, whether that's eye tracking (what are you paying attention to?), facial expressions (what are the emotions you're feeling in that moment?), EEG, ECG, EMG, galvanic skin response, your heart rate variability. So there's all this information we're radiating from our bodies, and it's information that is so intimate that it could actually give us deep insight into ourselves, to be able to do amazing things for self-awareness and healing. But at the same time, it could also create this trove of information that could be exploited by information warfare. So the way that I like to think about it is as a polarity, where both of those are true at the same time. Yes, we can do real-time analysis of biometric data for insight, but at the same time, it could be harvested for profit. And it's also going to weaken our ability to keep that information private. Anything we give to a third party has no reasonable expectation of privacy. And because of that, you're basically weakening the Fourth Amendment protections, which means that you're basically agreeing that the government could have access to all this information as well. So do we want the government to have access to what we're feeling, what we're looking at, what we're paying attention to? It's basically a dystopian Big Brother scenario that we're maybe unconsciously starting to create if we don't start to think about the third-party doctrine. Okay, so that's one. The next is the ethics of virtual humans. So Mica was presented at LeapCon for the first time, and I had an experience with it. I actually was on a panel at Sundance with Alice Roe. And one of the things that they were trying to do with Mica was to not call Mica a virtual assistant. They're trying to get away from taking technology and treating it as if it was something that you're just speaking really horribly to, frankly, and instead have an embodiment that is training us to look at the embodied AI, to speak with it with empathy, and to treat it as if it is a human. So that's a good aspect. But the negative aspect was explored in an experience at Sundance called A Jester's Tale, which is basically like, what does it mean if AI is trying to manipulate you? And you're trying to convince the AI that you're human by doing this Turing test. In the end, the experience has you decide whether or not you're going to essentially sacrifice yourself to be killed or to be subjected to a lab experiment. So what does it mean for AI to start to manipulate us through all those different aspects of humanity and emotional connection?
Are we going to be surrendering our sense of our own agency and essentially letting companies control us? So you're either going to practice building those empathetic skills with virtual humans to improve our relationships, or those emotional connections to virtual humans are going to be exploited to manipulate and control us. So the next one that I think comes up a lot in the VR community is virtual harassment. When you're in VR and people start to mess with you and harass you, people often have the experience that it feels just as real as being harassed in real life. So how do you create safe online environments when people can be really horrible to each other? There's, I think, a philosophical assumption by some people that these virtual interactions aren't real. And if they don't believe that they're real, they're going to be more likely to treat other people's emotions as if it's a video game: it's not real, so I can do whatever I want. But if you say that these virtual experiences are just as real as any other experience, then you start to treat it more as a face-to-face interaction. Like, would you do the same thing if you were accountable with your identity and didn't have that anonymity? So there's this dialectic between the reality of these experiences. And I think we're leaning towards this: these are just as real as any other experience, and therefore we should start to have these different codes of conduct and ways of actually cultivating these positive behaviors. But that's also really difficult. And so I think a lot of the methods that people like Facebook have suggested are to basically surveil everything that happens within the experience, which has all sorts of privacy implications. So you're fixing the toxic culture through technology and surveillance, rather than cultivating a positive culture through sociological best practices. So how do you do that? How do you cultivate a culture so that people are nice to each other? That, I think, is a big open question for how that is actually done. And finally, this is really intense, the most intense experience I've ever had in VR: Killing Floor: Incursion, extremely bloody and gory. I think in the future, we're going to be very careful about what kinds of experiences we are putting into our bodies, especially when they're in our own home. Do we want a murder scene playing out in our home? Because that creates a permanent memory. But also, I think we're going to need to map out the cartography of trigger points of trauma so that, if you're going to be exploring that, we have some method of consent, so that you're able to understand the trauma risks and cultivate a culture of that consent. But at the same time, we don't want to dampen the free speech rights of people who want to have those types of experiences. And so there's actually free speech versus the cultivation of that community. But the major point that I have is that all these things are happening at the same time, and there are these dialectics, and part of the ethical discussion is to make these moral judgments as to what's more important: the rights of the individual or the rights of the collective. So, that's it.

[00:14:34.405] Jessica Lauretti: OK, thanks. Take that clicker. So hi, everybody. I'm just going to tell you quickly the story of how I got into VR and what I learned through that experience. So my background, as I mentioned, was in documentary filmmaking, and I've worked in digital media throughout the course of my career. About three years ago, I got a phone call from a friend and former colleague, Bryn Mooser, who was the founder and CEO of RYOT. They had started off as a social good media platform, and in one of their pivots as a startup, they really embraced VR as a first adopter in the space. And their whole thing was like, let's take everything that we know about run-and-gun DIY doc filmmaking and apply it to VR. So they got a lot of attention in the space because of that approach. They were acquired by The Huffington Post. Bryn calls me and says, Jessica, I'm sitting here in the Huffington Post office. Will you come and build my New York office? And I said, sure. Sounds great. Sounds fun. I had never made VR before. I knew about it, but I thought it was sort of a kitschy thing that people were doing for PR stunts, to be totally transparent. I'm from New York, so I'm kind of cynical, a little jaded about new trends that are popping up. But I was like, cool, let's do this. I'm going to learn about this new thing. And he said to me, the only thing is you have to become a VR expert overnight. And I said, no problem. I got this. So done. So I start two weeks later. And, you know, I'm thinking it's like the first week of a new job: you're going to go out to coffees, meet some people, kind of see what's going on. And my first day on the job was the day of the Orlando Pulse nightclub shooting. So I walked into the Huffington Post newsroom. I remember distinctly walking through Astor Place in New York, texting the news director, like, what are we going to do? So basically I had to figure out in literally 24 hours everything that I needed to know about VR and what we were going to make to handle a pretty profound and tragic event that needs a lot of sensitivity and thoughtfulness around it. So we get into the office with the team, and I'm like, OK, guys, what should we do? I told the one producer, just go get in a cab, go to JFK, get on the next flight to Orlando. By the time you land, we'll tell you where you're staying and what you're doing. Just go. So she leaves, and we're sitting there trying to think of what to do. So the first thing we thought was like, oh, I know what we could do. Why don't we do the candlelight vigil? And keep in mind, too, that we're trying to come up with something that was a good use of the format, a good way to tell the story. It had to be in a breaking news environment, so we had to make it in 24 hours. And it had to be different than what the video team was doing and different than what all the other verticals across The Huffington Post were doing. And there's an expectation for it to go viral, right, and be a popular piece of content on the site. So first we're like, let's do the candlelight vigil. Then we're like, I don't know. It's just a bunch of people kind of standing around. Is that really a good use of VR? Then we're like, oh, well, let's do the protest in New York, and kind of the same problem. We're like, I don't know. It's just sort of a bunch of people standing around. Is that really worth looking at in 360?
Then we were like, OK, let's go back to square one and come up with something. And we had an insight in the middle of the day where we were like, wow, 49 people were lost. That's a lot of people. That's a huge number. There's something really profound in that amount. And so we had this idea, this concept, to visualize what 49 people looked like. So what we ended up doing was putting the camera in the middle and putting 49 people in a circle. And all we did was say, today, we lost these people in this horrible trauma and attack. Take a look around and see what that means. And you start to kind of look through, and it's like hair raising on your arms. Honestly, you start to think about, wow, each one of these people had a mom and a dad and a dog and hopes and dreams and a whole life of things. So the end result is that the user in the audience has an experience of what it's kind of like to be at a candlelight vigil, but without showing them a candlelight vigil, right? It's the same kind of reflective concept. It was really powerful, really worked, and that was kind of my intro into VR. And so, just to say quickly some of the things that I've learned: we know that this is powerful, and we know that with that comes a great amount of responsibility. So the first consideration: this is participatory content, so it takes a different kind of approach when you're making it. For example, a lot of people who work in VR come out of the film industry, right? And as a director, you have a vision, you have a story you're trying to tell. This is really different. It's not about you. It's about your audience. And it's about how they're going to experience it and how they're going to think and feel about it. And that's a really different creative process on the front end. Also, it's really undefined, right? We don't really know what it is. Every time somebody makes a new piece of content, it's sort of changing and growing and pushing the industry forward. People often ask me, what's the opportunity? To me, it's an existential one. Imagine if you could go back in a time machine and somebody said, you could be a part of the invention of cinema. That's pretty cool. To have a seat at the table, to help define something that doesn't even exist, and to be a part of that conversation, I think is very profound and a great honor. And then lastly, it's really important to get involved. I always thought of myself as not really a tech person, and I actually think that's why it's been so interesting for me to work in this space, because I think I bring more of the lens of a philosopher, an artist, a dreamer, those kinds of profiles, as opposed to whatever the stereotype is of who works in tech.

[00:20:55.041] Rori DuBoff: So, thank you both, that was great. I'm thinking, there's lots of good stuff here, and I guess the question becomes: we're Accenture Interactive, and there are lots of other brands and marketers here, and this is a lot of stuff to think about. So what of all of this should we start thinking about in relation to the clients, the companies, the brands out there doing this? Because on one hand, we're trying to convince all of our clients that this is a really powerful medium and you should invest in it, and it's great. And I think we are seeing, especially this year, more companies investing, and we're getting really good results. But once you start seeing those results come, it's like what happened with social. Suddenly, the brands realize, oh my god, look what happens when I am a social success. And you unleash the floodgates. And then you go in, and now we're like, oh, crap. We didn't moderate it properly. We didn't do this properly. And there's a lot of backlash. And so what I'd like to discuss or figure out is: is there anything we can think about now, in advance, and be more strategic about, so that as we start getting more traction, and it's still a new space, like I said, we have to create demand, we're more prepared? So I'll start with one question. So Mica is Magic Leap's avatar, or, no, it's not, I don't even know. Virtual human. So while you were speaking, I'm thinking, huh, virtual humans, this is a growing thing, just like chatbots, but I'd say virtual humans are much cooler. Should every brand now be thinking about virtual humans? All my clients, should I be talking to them and saying, you need a virtual human? Are we all going to have to have our own virtual humans? And what is a smart way to think about that? Because are we going to have replicas of ourselves? Is that something a brand, a consumer-facing brand, should be thinking about?

[00:22:51.955] Kent Bye: Well, I wanted to take a step back to put an ethical framework around it, using Facebook as an example of a company that's really trying to quantify very specific things into numbers, in terms of what they can see. They can see attention on a website, and that's what they were optimizing for. But in the process of optimizing for that quantified number, they've missed all the different externalized costs that happen to society. And so that's where the ethical implication comes in: we're not thinking about the whole ecosystem of that community. And I think there's a challenge that comes with that. It's very easy to measure success by turning things into numbers, but a lot of this experiential technology cannot be reduced down to a number. It's a qualitative experience. I just came back from the Immersive Design Summit, where there were all sorts of different brand activations, including the people from Giant Spoon who were behind both the Westworld and the Game of Thrones activations. Experiential marketing is all about giving people a direct experience that communicates something deeper about that brand. But getting away from how you measure whether or not you're successful, it's almost like: did people have a good time? Did it spur a lot of conversations? Did people feel connected? All those things are not things you can boil down into numbers. And so that's why I would first take a step back and say that that's a larger trend.

[00:24:13.831] Rori DuBoff: Yeah. Sorry. One thing to comment: yes, sentiment analysis. At Accenture Interactive, we have a cantina here with a bunch of demos, and we have a Disney experience. It's basically tracking your facial expressions and serving up content based on whether you're happy or surprised. So I think you're right. We're going to see a shift toward people looking more at sentiment as a way to create communications and measure success. It kind of happened a little bit. They tried to with social sentiment, but it always turned back to numbers. Numbers will always be important, but I hope we continue to see sentiment. Sorry, go ahead.
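
To make that kind of demo concrete, here is a minimal sketch of expression-driven content selection. It is a hypothetical illustration, not the demo's actual pipeline: the classifier, the labels, and the content table are all assumed stand-ins.

```python
# Hypothetical sketch of expression-driven content selection, in the
# spirit of the demo described above. detect_expression() stands in
# for a real facial-expression classifier.

def detect_expression(frame):
    """Stand-in classifier; returns an (emotion_label, confidence) pair."""
    return "neutral", 1.0

CONTENT_BY_EXPRESSION = {
    "happy": "upbeat_spot.mp4",
    "surprised": "behind_the_scenes.mp4",
    "neutral": "default_loop.mp4",
}

def pick_content(frame) -> str:
    """Choose a content variant based on the viewer's current expression."""
    label, confidence = detect_expression(frame)
    if confidence < 0.6:  # low confidence: fall back rather than personalize
        label = "neutral"
    return CONTENT_BY_EXPRESSION.get(label, CONTENT_BY_EXPRESSION["neutral"])
```

Note that even a sketch this small runs into the measurement question Kent raises above: the emotion label is a number-friendly proxy, not the qualitative experience itself.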

[00:24:46.909] Jessica Lauretti: That's OK. When I think about avatars, I'm going to answer with two different hats on. So the first hat is someone who works in digital media and brand marketing and those kinds of professions. Like, yes, absolutely, brands need to start thinking about that. It's interesting you use the comparison to social, because I also kind of talk about it like that. There was a time, I don't know, like five years ago or something, where everyone was kind of like, do I really need a Facebook and an Instagram and a Twitter and a Pinterest and a this and a that? And the answer was what? Yes, you do. And the way that I think about it is that because this is so new and we don't totally understand how to use it yet, we have to practice. And so brands specifically have to practice and start with small projects so that they can learn. And a lot of brands, too, are really just now fluent in video, fluent in social. To add a new thing is such a huge, heavy lift of education in terms of a client relationship. To put my other hat on, more of the philosopher one: when I think about avatars, I think about, who do we look to as our heroes, right? So I was recently watching this documentary, Generation Wealth, by Lauren Greenfield. It's really great; if you haven't seen it, check it out. What she talks about in that is how, because there's not real upward mobility in America, there used to be this concept that you were keeping up with the Joneses. You'd emulate what your neighbors were doing and what your community was doing. But now what people do is they keep up with the Kardashians. And so that has become our kind of baseline for who we're looking to for a reflection of what our society values, who we're supposed to look like, what we're supposed to buy, all of these things. And so when we start to think about avatars and making avatars and what those avatars look like, it's going to be a similar experience, right? It's like, who do you look up to? Who's your idol? Those are the people that you're going to be pulling your creative references from, and you're going to be trying to emulate. So I think, more generally, that's a little scary to me.

[00:27:11.412] Rori DuBoff: Yeah, no, I mean, it's a good point. With the rise of AR filters and lenses, everybody's now all about transforming themselves into how they've always wanted to look. And so there's the notion of a role model, when you think about the XR space and how it's all about transformation: how do people want to transform themselves, and are brands and companies putting the right avatars or role models out there? I mean, there's a lot of discussion around Mica and how she looks, and whether that was even, you know, is it the right thing that she's an attractive-looking female avatar? Should she not be? I don't know. So I think in the social media space, it was all about the personality of a brand. Should the brand be funny? There was all that discussion about the tone and what you should say. And I remember when brands didn't even want to open up to have a dialogue, because they were afraid of what people would say. So now that you have to manifest that into an actual human or something, what does that look like? For a Unilever or all these different brands, what does that become?

[00:28:18.674] Kent Bye: One way that I think about this is that in traditional storytelling, you have very fixed authorship, where you're in complete control of the message and the story that goes out there. As you go into the participatory, experiential age, it's more about inviting the audience to be a co-creator of a living story, in collaboration. So it's less about putting forth all of the story that you want to get out, and more like starting a conversation. And I think, with virtual humans, it's moving more to that conversational mode, where you're now having a personality that is, in some ways, a representation of the personality of the brand, that you're then able to interact with in different ways. But I think there's a tension between the individual and the collective. The individual would be: what does the brand want? What does the company want? This is the agenda that we're thrusting upon you. Whereas the other side is: how do you start to let go of that control and bring in that participation in different ways?

[00:29:17.947] Rori DuBoff: So another thing I wanted to talk about: I've been framing it as, we're moving from this era of transactional data to more social data, and now human data, because we have the ability through biometrics, and we're starting to see gaze-based activations and the ability to monitor people's heart rate. I think you referenced it: it's exciting, there's potential, but there's also this question: do we have some thoughts about how we should approach that? It's a lot to sort of think through. I don't know.

[00:29:53.792] Jessica Lauretti: I'm not really confident that the majority of the population even knows enough about the data collection that's happening currently. I always tell a story about my mom, which is that a couple years ago, she said to me, why am I getting Facebook ads for the show Hamilton that we're going to see for my birthday in six months? And I said, well, mom, it's because they read your email and they pull things from it and they serve you ads based on that. And she was like, they do what? You know what I mean? And my mom's a high school history teacher in the suburbs in Connecticut, and she didn't know. So to me, I don't even think people really understand what's happening currently. And I would just double, triple, quadruple down on how important the educational factor is, and that it's the responsibility of folks working in this space to make sure it's not just some 5,000-page consent form that's like a legal document, but that people actually understand what they're taking from you, what they're doing with it, why they're doing that, and what they're getting out of it. Like really having a more holistic education component to it.

[00:31:11.785] Kent Bye: Yeah, I've been covering this, and some people are doing a lot of cutting-edge work. Adam Gazzaley of UCSF has been looking at the neuroscience aspect of how to integrate the biometric data that you're emitting and how to turn that into different gameplay elements. And so I feel like it's still very early stages. Other people, like Robin Hunicke, have also been looking at the passive types of data that are coming from your body: how can you start to categorize the audience member into certain temperaments, so that you can design specific interactions based upon what they're feeling? And so the dream is that you go into an interactive narrative experience, and based upon how your body's reacting, it's able to modulate that experience to give you a more intense experience and really control that dramatic and narrative tension. And where it gets a little bit into the ethical issues is: where is that data going? Is it being captured? Is it being stored? In my mind, all biometric data should be ephemeral and not recorded, and there should be more real-time processing of that data rather than capturing it. Because once you capture it, it's almost like, oh, I had this emotion and feeling, and then all the AI is basically trying to extrapolate intention from that. And it can only go so far. To me, it just feels like a slippery slope to start down. The Canadian Institute for Advanced Research is also going to be holding an event. I helped to moderate a panel last year at GDC with neuroscientists who were trying to interact with the game community, because the neuroscientists don't understand how to do game design, and the game designers don't know a lot about neuroscience. And so there's a lot of overlap. And the big question was: how do you share the data between the two of them without introducing all sorts of tricky issues? So they're going to be having a thing in May where they're going to be bringing together game designers, neuroscientists, and a lot of people from the medical community to have this discussion. So I see VR and AR as this cross-disciplinary melting pot that's starting to bring together people from these different disciplines so they can start to collaborate with each other. That, to me, is exciting. But the data, and what happens to the data, I think, is the biggest open question.
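
As a concrete illustration of the "ephemeral, real-time processing" idea, here is a minimal sketch in which raw biometric readings modulate an experience frame by frame and are never logged, stored, or transmitted. The sensor read and the scene hook are hypothetical stand-ins, not any shipping SDK.

```python
# Hypothetical sketch of ephemeral biometric processing: raw readings
# drive the experience in real time and are never persisted.
# read_heart_rate() and set_scene_intensity() stand in for a real
# sensor SDK and a real game-engine hook.

def read_heart_rate() -> float:
    """Stand-in for a real sensor read, in beats per minute."""
    return 72.0

def set_scene_intensity(level: str) -> None:
    """Stand-in for however the experience modulates its drama."""
    print(f"scene intensity: {level}")

def run_adaptive_scene(frames: int = 1000) -> None:
    smoothed = 70.0  # exponentially smoothed signal, held only in memory
    for _ in range(frames):
        raw = read_heart_rate()
        smoothed = 0.9 * smoothed + 0.1 * raw  # derive a coarse signal
        # Only the coarse, derived signal ever touches the experience...
        set_scene_intensity("calm" if smoothed < 90.0 else "intense")
        # ...and the raw reading goes out of scope here: nothing is
        # written to disk and nothing leaves the device.
```

The design point is that the control signal is both coarse and transient, which is what makes "insight without a stored trove of biometric data" possible.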

[00:33:16.313] Jessica Lauretti: Yeah, just one more quick story. I recently did an Ancestry.com test, one of those DNA tests. I don't know if any of you have done that. And I was sitting there thinking for a minute before I did it, like, I mean, they say they don't do anything with it, but... or what if they change their mind later? And I just couldn't even go down that road, because it was such a spiral to think about what they were going to do with my DNA. I couldn't really handle it. But I think there are things happening now, and as it's constantly moving, and it's moving so fast, it's important to stop and ask some important questions.

[00:33:53.053] Rori DuBoff: Well, I remember when you used to get a GPS notification on your phone, and it was like, no, I'm not going to let them know where I am. Now it's like, whatever, I always have GPS on. So I actually have a question for the audience. How many people here in the last, let's say, 30 days have been asked to give permission for the camera to look at you? So there you go. I mean, a year ago that would be, are you crazy? You're not accessing the camera on my phone. So you see how that's happening. As much as we say we'll never do it, no way, never gonna give your DNA, that sounds crazy too. But especially with the camera, because so much is happening now through the apps on the camera, and I'm sure people were like, nobody's ever gonna give permission, and now they are. These things are coming quicker and faster, and we have to think about having a camera data strategy, and I hate to throw strategy behind everything, or a camera data plan, because somebody is going to say, I didn't give you access to that. I mean, we're hearing that with Alexa and all the voice activations listening in. So we crave it. We want it. And I think we're getting benefits and good experiences out of it. But there's also this potential, right?

[00:35:09.827] Kent Bye: Yeah, well, I was a co-organizer of the VR Privacy Summit with Jeremy Bailenson of Stanford, High Fidelity, and Jessica Outlaw. And we brought together 50 of the big players in the VR industry to have a whole day-long workshop on all the different privacy risks. And the goal was to come out of there with some sort of thing that we could give back to the community at large. What happened, though, was that it's such a huge topic that you can't wrap your mind around it within eight hours. And so we did a lot of good brainstorming, and I did some interviews that captured a lot of those conversations. But then I was at the American Philosophical Association, a philosophy conference. The founder of the philosophy of privacy is Dr. Anita Allen, and she's the current president of the APA. And she got in front of the entire philosophical community and was basically like, we need to really get on what's happening in the world right now with privacy. And she gave a rousing speech about the open questions for a comprehensive framework for privacy. So here's the founder of the philosophy of privacy saying, we do not have a current comprehensive framework for the philosophy of privacy. And I was sitting there, and I was like, oh, that means that Google and Facebook and Amazon, all these companies, are the ones making those decisions. And the ethical issue is that those decisions are being made for their own profit, not thinking about the larger costs and what the risks to our privacy are. And so I think it's actually one of the deepest existential crises of our time that we don't have this framework by which we can make these decisions as to what should be private and what should be public. It's kind of left up to these companies to make that decision. So I think there's a bit of a backlash that started to come with Cambridge Analytica, seeing what can happen when this data, that we think of as being in this protected vault, actually gets into the wrong hands. I mean, it's actually a national security risk. And there are so many other different dimensions. So yeah.

[00:37:06.640] Jessica Lauretti: Yeah, I just want to plus-one on that, because, if I can segue into one of the other questions, about accessibility and access: the way that I think about that is, similarly, just follow the money, right? Because whoever is paying for this is ultimately making decisions about how it ends up. And if we try to approach virtual worlds as a product, like we're selling a toaster to a consumer, I don't think that's the right way to do it. And so I think there are two things. One, I think that brands in a contemporary society have to take responsibility for their place in society. And if they're going to give us all of these things and they're going to make money off of it, then they also have to give back, and they have to be considerate about how they're making it. There has to be some kind of double bottom line, where it's not just about making money, but it's also about creating something of value to society, giving back to human civilization and the way that we move forward as a society. And then I forgot the other thing I was gonna say.

[00:38:17.431] Rori DuBoff: Well, I will say, the reason to have this discussion now is because I think too many people are just like, technology bad, don't want to partake. And, you know, at Accenture, we work with all the clients we're talking about. And I think that, yes, there's obviously money at play, but those brands have a responsibility, and we work with them to actually address this conversation. Part of our role is trying to work with them to address it. But I think it's so important that this conversation be had, versus just, most of the time, it's too much, I don't understand it. Not that your mom is at fault; I don't expect her to have to pay attention to all of this. But I think a lot of people have that reaction. And with this new space, I'm trying to get smarter and more knowledgeable, and I encourage others to as well, so that we can actually have a discussion and get to a better place. Because Facebook, and I don't want to get too much into Facebook, but I don't think they wanted what has happened to happen. I think the intention was that it was supposed to be, it is supposed to be, a social network, and I don't think they were interested in having it overtaken by any fake stuff. So we need to figure it out and help companies like that, and the Googles and all these others that are going to be around, and make it right for this whole new XR space, which is so new. I've been on panels where, in the end, everybody always has a question like, so what are the answers? And I never know how to answer it, because we haven't formulated them, to your point. So we do have to prioritize it. We need to keep talking about it and actually put some of that money towards these questions.

[00:39:54.452] Jessica Lauretti: Yeah, sorry, I remembered the other thing I wanted to say, which is that we also need cultural institutions and governments to counterbalance what the brands are doing, right? We need something with some autonomy that at least pretends it has some higher purpose, or at least strives to achieve that, also working in this space and also funding these kinds of development programs, just so we can balance out what the brands are doing.

[00:40:25.143] Kent Bye: I was going to say that I was at the Decentralized Web Summit, and I had a chance to do an interview with Vint Cerf.

[00:40:30.844] Rori DuBoff: You get around. I'm getting the sense you get around.

[00:40:33.085] Jessica Lauretti: He's on the circuit.

[00:40:33.945] Rori DuBoff: I was at this, I was at that. I know it's your job, so it's okay. It's cool.

[00:40:39.266] Kent Bye: So Vint Cerf is one of the co-inventors of the internet, and I was at the Decentralized Web Summit, which was basically this group of people trying to figure out the future of the web. There's a pendulum that swings between centralization and decentralization, and so they were looking at a lot of the blockchain technologies, trying to architect what the future of the decentralization of the web is. And so I'm talking to him, and I'm like, hey, maybe Google should stop doing surveillance capitalism. And his response was essentially, well, how are you going to pay for access to all of human knowledge for everybody in the world for free? And I'm like, well, OK, I can't answer that. And I think nobody in the world can answer that right now. And so there's a bit of: yes, these surveillance capitalism business models are terrible, but there's nothing out there currently that's going to supplant them and provide something that's better. Now, you look at something like Netflix or Spotify, which is a subscription model, where you pay and you say, hey, this is the amount where I'm going to have access. You don't have to own the experience. You just have to have access to the experience. And so as we move from the information age to the experiential age, it's about access to the experiences, and I feel like the subscription model is a key. But to operate that at the scale of a Facebook or a Google, there's a question in my mind, which is: should we have networks that are that big? And if we do, is there a better, more ethical way to sustain them, and to have people say, I'm not going to mortgage my privacy for this service that I'm getting for free right now; I'm actually going to pay for it? Rather than giving everything away for free, where people don't necessarily understand how that happens. And as we have more Cambridge Analyticas, more data breaches, more of your data getting out there on the dark web, all sorts of nightmare scenarios: how many more of those do we have to have before we switch over to a business model that nobody has figured out yet?

[00:42:33.587] Jessica Lauretti: Well, I totally agree with all that. I think there's that saying, well, if it's free, then you're the product. And I think it just comes down to, the way that I think about it is: what do we value as a society? Because that's often the way that we show value, through money, right? We say, this is valuable to me, I will pay for it. And it's a similar conversation, I think, to the one happening in the media space around fake news. We have corporate media, and this is one of the things that happens when you have a media that is paid for by advertisement and not paid for by the people. And so I think that subscription model, the membership model, it's sort of like, at what point do we decide we want to take this into our own hands, right? I often think of it like, a lot of people are kind of sitting around being like, oh, you need to go do this. It's like, no, no, you need to go do this. Like, we all need to go do this. This is our responsibility, to hold up a mirror to ourselves and say, what kind of world do we want to live in? What do we believe in? What's important to us? What are we passionate about? Because if we're not passionate about it, then capitalism will run amok.

[00:43:49.565] Rori DuBoff: But I will say, I think that both of you are bringing up a very good point, which is that we are coming off a period where free was the driving model for brands and companies, and it was all about advertising for free. I think there's now going to be an opportunity in this new space, companies will need to make money, that's just the way, to charge for content, because people will say, I'm willing to pay. I'm willing to pay for experiences, especially in VR or AR, that are so intimate, where you're getting in a headset, you'll pay, you know? And I think that's probably going to be a shift we see. So instead of the model where eyeballs are what you're going after, you're talking about quality content, engagement, and it might be just a different model. So to me, and then we'll open it up to Q&A, I don't think we can ever get in a situation where we're just like, corporate is bad, and it's going to destroy the XR space. No. We need to figure it out together and do it in a way that we provide guidelines and recommendations and set out policies so that it actually benefits everyone. So that's what I'm hoping we can figure out. So let's do some Q&A.

[00:45:02.275] Kent Bye: I just want to share one thing. So one of the themes that I saw come up both at the Immersive Design Summit and talking to Kathy Bisbee, who runs the Public VR Lab: she was at the Future of Storytelling, and she was struck by how many brand managers were thinking about what she essentially does, which is community organizing. So there seems to be this shift towards focusing on the cultivation of community. And I feel like with these immersive theater experiences, with VR and AR, you have the capability to not just think about the projection of your story onto your audience, but to actually facilitate experiences that people can participate in. That's going to be the process of people creating a culture that they actually want to believe in. And so there's an organization called 13EXP that was announced at the Immersive Design Summit, which is going to be creating these 13 different IP properties that try to encourage people to do the world-building of a culture that they want to live into. There were people from WakandaCon and the Black Panther Lab Project, which essentially looks at the world that was created in Black Panther and actually creates the world that was depicted in Wakanda. And so this whole 13EXP is going to be doing all these different experiences that try to encourage people to participate in the co-creation of culture. So in Ready Player One, we get this image of all these branded IPs that people are participating in. I think that's going to happen, but I think there's going to be a lot more from the grassroots, the types of world-building culture that we're going to be able to actually co-create and participate in. And I think, in terms of aligning with these big companies and brands, it's about trying to create all those sociological, cultural artifacts that are going to create a culture that you want to live in. And that's going to happen through that community building.

[00:46:46.349] Rori DuBoff: That's awesome. That's very cool. So yeah, how are we doing on time? We've got about 10 minutes. Does anyone have any questions?

[00:46:56.346] Questioner 1: I'm a graduate student at UT, and I'm currently building a civil discourse app that inspires people to talk civilly about civil discussions. So my question touches on community building, empathy, and accountability. When you're trying to get people to adopt a concept or use your app or go to your VR experience, one that inspires them to be a better version of themselves than Facebook has trained them to be. We have trolls now, we have people shouting at their friends and unfriending each other because of differences in opinions. We didn't have that before social media, I feel like, or at least not as rampant. So once you've gone to that place, is there hope to bring people back from that, or do you have to adapt? And if so, how do you go about it?

[00:47:42.783] Jessica Lauretti: I mean, I think we always have to hope, right? That people can come back. I do think we are living in a very interesting time that is definitely not the healthiest way to be. Actually, right before I came here, a friend of mine was telling me that she went to see Trevor Noah. And he talked about that, that the thing that we don't see at home is that all of these senators get up there and yell and scream and disagree, and then they go out and get drunk together afterwards. And they're all best friends and have worked together for 20 years, but you don't see that at home. You just see the argument. And I thought that was a really interesting way to talk and think about it. I think there's definitely been a reaction where people are like, oh wait, okay, I'm only seeing one side of this. And actually, this is the way your subconscious brain works: whatever your narrative is about your life, the way that you perceive the world, you go out and find situations like that to legitimize your worldview. So if you're like, dating sucks, you always meet the worst guy. Always, because you're like, this sucks. I was right. It sucks. Right? So I've been very interested in psychology, and I've been reading a lot about that, really thinking about how we understand the way human beings are, and not be mad at it or try to change it, but try to use it to bring people back into communities. But I do think there's just a general feeling where people are like, I'm exhausted from digital life. You can't even read an article anymore; it never loads, there are like 5,000 ads. It's hard to do anything. And I think people are sick of it. They want to read a book. They want to go to a coffee shop and talk to other human beings. They want to watch a movie, because they're so sick of serialized content; because of the numbers and the eyeballs, everything's a series. Why did I just spend 50 hours watching a story that could have been a half an hour long? So I think society, people, shift. We get into one thing, and then we're like, okay, we pushed that too far. We need to go back this way a little bit.

[00:49:56.727] Kent Bye: Yeah. And I would say that this is a huge open question in terms of how you cultivate community, how you create a culture. Because culture is a bunch of individuals making decisions. And so how do you have, at that small level, people making the decisions that aggregate into the culture that you want to create? Every startup, every company, anybody that has a group of people and tries to cultivate that culture faces that issue. And so Jessica Outlaw, I have an interview with her where she dove into the sociological aspects: who are the heroes? What are the stories? What are the jokes? What are the taboos, the things you can't talk about? So looking at it from the sociological lens and seeing how you set the rules, and how, by embodying those different actions, you can do that in a way that scales up. I think you can do it in small communities, and I think that's been done. The challenge is how you take that to mass scale. But I think you should start by asking, well, how would you do that at a small scale? And look at what happens in immersive theater and these different experiences that try to set a magic circle, which is like: these are the rules, this is what we're going to do. And then from there, creating those small-scale experiences and figuring out how to scale them up. But I think it comes down to, in the end, face-to-face interactions, with everybody consenting to the missions and the visions that the entity has, and then all the actions at a small, microcosmic scale embodying that deeper vision and purpose that you have.

[00:51:20.330] Questioner 2: Thank you so much. Hi, I have a two-parter, so I'll take either part. The first is: I'm working on my first VR project, related to trying to create empathy and understanding around the migrant experience, and in particular some of the aspects that don't get covered, like levity, tenderness, some of the dilemmas and decisions parents face. And I'm trying to anticipate what the ethical questions are that I need to be grappling with as we begin the project. A big one on my mind is triggering. In a lot of this, especially when people are fleeing conflict, there's a lot of fear, and how do you know the line of when not to trigger? So I'm curious what questions you think I need to be considering as I go into the project. And then the second question is, I'd love to copy your homework, since you are the subject matter experts: where do you draw the line on privacy? Because I'll just do what all of you are doing, because that's just easier.

[00:52:20.998] Jessica Lauretti: I'll take the second part. I mean, I think that I actually don't really do that much. I've read that Mark Zuckerberg even has a piece of tape over the video camera in his laptop, so you might want to try that. I know some people who are really into, what's that messaging app? Signal. Signal, yeah, yeah, end-to-end encryption. So that's a good one if you're really passionate about it. I maybe should care a little more, but I'm pretty free with it all.

[00:52:51.397] Kent Bye: Yeah, so you can go through your privacy settings and sort of turn things off. Sometimes I turn stuff off, sometimes I leave things on. Sometimes I like to see what's happening. I advocate a lot about privacy, but I also want to have the experience, because there's a lot of things that are specialized and tuned for that. So I sort of go back and forth on different aspects. But in terms of the triggering of the content: there are trigger warnings and consent disclosures, but there are no existing review boards to say this is rated R or whatever. But more than that, there are going to be things in VR that we've never seen before, and so there are going to be things that are triggering that we don't know are triggering. Because VR is so small, we haven't scaled it up to see, okay, here's the cartography of all human trauma and the things that could have associative connections to these aspects. People who have gone through a lot of PTSD have to be very diligent, and sometimes you're in that experience and it's too late, and you have to eject. So the experience to look to is Testimony by Zohar Kfir, which is about testimony from women who have experienced sexual assault. If it gets too intense, you look away, and it stops. So figuring out ways that, if you're immersed in an experience, you can stop whatever may be too intense while maintaining that immersion, but allowing yourself to go into a safe space. So if you are dealing with really intense subject matter, is there a way you can give the audience a chance to take a break and pause, and have the opportunity to come back to it, so they don't feel like they're penalizing themselves by missing a part of the narrative or having to rewatch everything? If they want to come back, they can. So that's some of the stuff that I've seen that is handling that. But yeah, I think in the future we're going to have a lot more ways of reviewing and having a discussion and disclosing and having that consent. But we're not there yet.
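[Editor's note: to make the look-away-to-pause pattern Kent describes concrete, here is a minimal sketch of how such a mechanic could be wired up. This is purely illustrative and is not Testimony's actual code; the class name, thresholds, and update loop are all assumptions.]

```typescript
// Illustrative sketch only; not Testimony's actual implementation.
// Pause playback when the viewer's gaze stays far enough away from the
// content for a short dwell time; looking back resumes where they left off.

type Vec3 = { x: number; y: number; z: number };

const dot = (a: Vec3, b: Vec3) => a.x * b.x + a.y * b.y + a.z * b.z;
const len = (a: Vec3) => Math.sqrt(dot(a, a));

// Angle in degrees between the gaze direction and the direction to the content.
function gazeAngleDeg(gaze: Vec3, toContent: Vec3): number {
  const cos = dot(gaze, toContent) / (len(gaze) * len(toContent));
  return (Math.acos(Math.min(1, Math.max(-1, cos))) * 180) / Math.PI;
}

class LookAwayPauser {
  private lookingAwaySince: number | null = null;
  paused = false;

  constructor(
    private thresholdDeg = 45, // how far off-axis counts as "looking away"
    private dwellMs = 500      // dwell time before pausing, to ignore glances
  ) {}

  // Call once per frame with the current gaze vector, the vector from the
  // viewer to the content, and a timestamp in milliseconds.
  update(gaze: Vec3, toContent: Vec3, nowMs: number): void {
    if (gazeAngleDeg(gaze, toContent) > this.thresholdDeg) {
      this.lookingAwaySince ??= nowMs;
      if (nowMs - this.lookingAwaySince >= this.dwellMs) this.paused = true;
    } else {
      this.lookingAwaySince = null; // looking back resumes from the same point
      this.paused = false;
    }
  }
}
```

The key design choice Kent points to is that pausing happens inside the medium: the viewer never has to rip the headset off, and nothing in the narrative is lost while they take a break.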

[00:54:52.390] Rory DuBoff: I would also say that the context matters. If you're a VR creator and an artist, you're held to a different type of responsibility than a corporation or a brand. So I think that you should be free to express your view, and to me that's different than a company that's going out there trying to sell something; there's a different set of principles that they need to adhere to. It's not that anyone is unaccountable, but it's different. For me, I think it's different.

[00:55:21.918] Jessica Lauretti: I'd agree with that and also just add quickly, I think it's like, just use your best judgment as like a human, you know? Like, what if you were gonna tell a friend a really traumatic story about something that happened to you? You might say at the beginning, this is kind of a crazy story. Just wanna give you the heads up on that, you know? And it's like, maybe you want people to feel scared. Maybe you want them to feel sad. Like, those are good things, but I think just, you know, be responsible with how you're making other people feel.

[00:55:49.948] Rory DuBoff: So we have three minutes, so probably just one last question, I guess, if you want to go. OK.

[00:55:56.135] Questioner 3: Hi. So I'm a student from Singapore, and I'm writing a paper on the ethics surrounding immersive storytelling methods. And my question has to do with personal consent. Yesterday I attended a talk about consent, and someone mentioned the illusion of control that people in immersive storytelling experiences have. So how do you relate that to the issue of harassment and consent?

[00:56:22.539] Rory DuBoff: I don't think I understood this. Can you just say the end of it again? Somebody said what? What was that?

[00:56:29.003] Questioner 3: Someone mentioned that people in these experiences have the illusion of control. They don't actually have control over what they're going through in these experiences. So my question is: how do you relate this illusion of control to the issue of harassment and consent?

[00:56:51.557] Kent Bye: Well, I'll do the consent part. There's an experience called SoundSelf by Robin Arnott, who has done all this sound design. And what he realized was that there's this spectrum between control, where you feel like you are able to speak something and get instant feedback, and chaos, where you speak and have no idea what's happening. The sweet spot is somewhere in the middle, where it's not like an instrument, but you're doing stuff where you can see the traces of your agency without being able to control it completely. That has this magic sweet spot. Buffalo Vision has been doing a lot of this stuff as well. And what he found was that if he took that slider toward chaos, and people were starting to scream, it would actually be very traumatizing, because it takes someone's control out of the experience completely. And that was interesting for me to hear: okay, if you are playing with how much control people have or don't have, and you slide it too far so it suddenly gets chaotic, it can be very triggering and traumatizing. And like I said before, it's difficult to describe this is the experience you're going to have, because even if I told you that, you wouldn't realize that you may be having a visceral, embodied reaction until you actually are in the experience. A lot of times, you don't know what's going to trigger you until you are there and you see it. So I think it's a tricky issue. The harassment part, I don't know if I can make a connection to that.
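[Editor's note: the control-to-chaos slider Kent describes can be boiled down to a simple blend between a user's input and uncorrelated noise. A minimal, hypothetical sketch follows; it is not SoundSelf's actual code, and the function name and mapping are assumptions.]

```typescript
// Hypothetical illustration of the control <-> chaos spectrum; not
// SoundSelf's actual implementation. At chaos = 0 the system echoes the
// user's input exactly (an instrument: full control); at chaos = 1 the
// response is unrelated noise (full chaos). In between, traces of the
// user's agency remain visible without being fully steerable.
function respond(input: number, chaos: number): number {
  const noise = Math.random() * 2 - 1; // uncorrelated component in [-1, 1]
  return (1 - chaos) * input + chaos * noise;
}

// respond(x, 0.0) -> x itself: instant, legible feedback
// respond(x, 0.5) -> the "sweet spot": your agency leaves traces
// respond(x, 1.0) -> pure noise: as Kent notes, this can feel traumatizing
```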

[00:58:23.035] Rory DuBoff: Yeah, I mean, I think they're struggling right now. But it's the same issue of harassment on Facebook. It's the same problems we're facing in social media that we're facing in virtual media. So you dealt with the issue of control, I think. It's a different type of control. But moderators, all of that, are sort of things we need to figure out. So anyhow, I think we're done. So thank you both. Thank you for joining. Thank you for participating.

[00:58:53.511] Kent Bye: So that was a panel discussion at South by Southwest with Accenture, and it also features Rory DuBoff. She's at Accenture, focused on content innovation and extended reality, as well as Jessica Lauretti. She's the founder and principal of This Is Laurels, the former head of RYOT Studio, and helped build the VR news arm of Huffington Post.

So I have a number of different takeaways from this panel discussion. First of all, for me, this was the first time to really start to dive into this. I had given just a five-minute presentation trying to focus on a variety of these different dialectics. I focused on the issues of biometric data and privacy; virtual humans, and whether AI is going to be able to control and manipulate us versus the amount of empathy that you can start to generate; virtual harassment and creating and cultivating safe online environments versus the surveillance implications; and also talking about what's real and what's not real, and then trying to map out trauma and consent and free speech and some of those different trade-offs.

When I heard Rory, it was interesting to hear that she's taking more of a principles-based approach. There's this concept called virtue ethics, looking at the higher-level virtues, and if you look at a lot of codes of conduct across different disciplines and industries, sometimes they'll start with this higher level of moral virtues that you're trying to strive for, and then they get a little more prescriptive in terms of the rules that you're trying to follow. So some of the things that Rory is looking at are human-centeredness, accessibility, authenticity, accountability, and empowerment; those are the ways that she was framing this whole thing.

But throughout this conversation, we were trying to hit upon these different contexts and map it out a little bit, just through the process of conversation and dialectic. So talking about the dark web; deepfakes, fake humans, and bots; what is real and what's not real; the dialectic between free speech and trying to be trauma-aware; diving into a bit of AR filters and our own sense of embodiment and role models and Keeping Up with the Kardashians, and the different social implications there in terms of what it means to be connected to all these people who have a certain lifestyle, and what that means in terms of setting new normative standards for what people are looking up to. Then moving away from transactional data into human data and biometric data, and how you have experiences that are reacting to what's happening within your body, some of the work from Robin Hunicke or Adam Gazzaley, trying to be responsive in the different experiences that you create, versus how much autonomy and control is that taking away once you start to do that. And then looking at neuroscience and gamers and how they're trying to collaborate with each other, and what are the data sharing implications of that.
Accessibility and access, I think, was a big topic that Rory was bringing up as well, and also these companies trying to have a double bottom line: not just completely focusing on their profit, but also looking at how they can be a public benefit and add to society, not take away from it. With the business model right now of surveillance capitalism, we have to sacrifice different aspects of our autonomy and our privacy in order to have this free access to information, and a new business model is yet to be determined. However, Vint Cerf is saying, hey, once you start to go to that metered approach, don't underestimate the impact of having access to all this free information. We've really gotten used to it and enjoy it. But at the same time, it's been done through mortgaging our privacy, which has downsides both to our own individual autonomy and, as a collective, these other downsides that we have to figure out. This is something that I talk about later in the series: moving away from surveilling individuals and toward paying attention to the context that people are in, and then trying to do more contextually aware advertising, so ads fit the content that you're seeing rather than targeting you as an individual and following you around into all these other contexts and places.

Then lots of discussions about community building, and how there are big shifts in cultivating community. What does it mean for these brands to start to engage audiences and build up audiences? Once you build a community, you're responsible for moderating harassment and hate speech, and all these additional problems get introduced once you are providing new capabilities. I think with immersive technologies, that's no different. You're introducing higher-bandwidth ways of giving people direct experiences, to really try to translate what is happening with your brand and put that into an experience that unfolds over time, but then there are all sorts of other ethical implications that start to come up with that. So, different aspects of what Jessica Outlaw is talking about in terms of the sociological aspects of cultivating community. And there's technology that can be polarizing: different filter bubbles of reality trying to give people what they want, but at the same time starting to have these other trade-offs of a lack of civil discourse, a lack of healthy debate, and a lack of being introduced to the diversity of new ideas.

And then finally, just talking about empathy and content warnings and being trauma-aware. Giving some sort of heads-up, a trigger warning or content warning or informed consent, is going to be a little bit different within virtual reality, just because people may not know fully what they're consenting to. There could be a ratings board that starts to do that, but we still don't know the full extent, and it's also impossible to know what the trauma profile for any individual is going to be.
And so, things like Testimony VR: if you are doing intense content, is there a way for people to quickly eject out of that experience without feeling like they have to rip the headset off, maybe a software mechanism where turning your head away stops the experience if it becomes a little too intense? So trying to build into the software architectures ways of allowing people to eject out of experiences easily if they need to.

So this was a great opportunity to have an open-ended conversation and start to map out the landscape. I think at this point, it was still very early for me in trying to pin all these different things down, and it was the start of a whole seven-month journey of diving into this topic: having lots of interviews, and then I would get these talks scheduled and need to present, so I was talking to different people, trying to digest as much information as I could. I went to a virtual world think tank, and then Augmented World Expo, the Decentralized Web Camp, SIGGRAPH, the View Source conference, and then the Greenlight XR Strategy week. So just really a series of different panel discussions and talks and interviews. I've aired a number of those conversations already, but this is really a distillation of a lot of those different conversations in this 14-part series that I'm kicking off now.

So, yeah, I'm excited to dive into this, and that's all that I have for today. I just wanted to thank you for joining me here on the Voices of VR podcast. And if you enjoy this podcast and enjoy this coverage, I am an independent creator. I rely upon donations and support from my listeners. And so if you enjoy this coverage and want to see more, then please do become a member of the Patreon. Just $5 a month is a great amount to give and allows me to continue to bring you this coverage. So you can donate and become a member today at patreon.com. Thanks for listening.
