On August 7, 2020, I participated in a panel discussion about Ethical Design in Immersive Media organized by the University of Oregon as a part of Design Week Portland. The panel featured Stanford University Virtual Human Interaction Lab’s Jeremy Bailenson, who talked about biometric data privacy in XR. Oregon Reality Lab Director Donna Davis talked about accessibility & diversity of identity representation. I gave an overview of ethical issues that designers face, and the conversation was moderated by University of Oregon Assistant Professor of Media Psychology Daniel Pimentel.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. So in today's episode, I'm going to be airing a webinar panel discussion that I was a panelist on. I was invited by Donna Davis from the University of Oregon. She's the director of the Oregon Reality Lab there, and she was collaborating with Design Week Portland. There's lots of different creative design agencies here in Portland, and they have a whole design week with a bunch of discussions and panels. And so it was all happening virtually this year with the pandemic, of course. So I was able to participate on the panel with Donna Davis, as well as Jeremy Bailenson from Stanford, and we're talking about the topic of ethical design in immersive media. So another panel discussion about ethics. For my part, I was trying to give a very quick overview with some specific things to take into consideration. Jeremy Bailenson talked a lot about personally identifiable information: what kind of information can you glean from the data that you have in these different immersive environments? And a lot of the work Donna Davis is focusing on is accessibility, working in Second Life and making sure that there's diversity, inclusion, and avatar representation within these virtual worlds. So that's what we're covering on today's episode of the Voices of VR podcast. This panel discussion with myself, Jeremy Bailenson, Donna Davis, and the moderator, Daniel Pimentel, happened on Friday, August 7th, 2020. So with that, let's go ahead and dive right in.
[00:01:36.175] Erika Berardi: Okay. Hi everyone. Welcome. My name is Erika Berardi. I'm the project manager with the University of Oregon School of Journalism and Communication here in Portland. You are at the webinar for Ethical Design in Immersive Media. We're happy you guys are here, and we really wanted to thank Design Week Portland. We've been a part of the festival since the beginning, I think. It's really an honor to be a part of the community with creatives, innovators, and activists just like you, who are here to make the world a better place. And so we're hoping we can do that through this webinar and give you some innovative ideas on things that are coming down the pipe, how to stay ethical, and what that looks like in the immersive world. I wanted to introduce Danny Pimentel. He is a great incoming professor for the University of Oregon here in Portland. He is spending his summer moving from Florida to the West Coast, and so we're excited to have him be part of Portland. He got his PhD in advertising from the University of Florida, where he was also the lab coordinator for the Media Effects and Technology Lab, so he's very well versed in immersive technology and immersive media. He also has a master's degree in global strategic communication from Florida International University, and he is the former director of a nonprofit called URA and the Director of VR for Changeville, which is a social change festival in Florida. He's also done TED Talks, and as he's made his waves through Florida, we're excited to have him here on the West Coast. So I'll go ahead and toss it over to Danny, and he'll introduce the speakers and get on with the show.
[00:03:15.728] Daniel Pimentel: Thank you so much for setting us up, Erika. Like she said, my name is Daniel Pimentel, Assistant Professor of Immersive Media Psychology at the University of Oregon. Today, though, I have the distinct pleasure of moderating this panel with some of the world's leading thinkers with regards to immersive media, everything from 360 video to virtual and augmented reality. These panelists really are at the forefront of understanding how these platforms can influence all facets of the human experience, both now and in the future. So I'd like to kick it off with some introductions, and I'll start with Dr. Jeremy Bailenson, our first panelist. Dr. Bailenson is a cognitive psychologist by trade and the founding director of Stanford's Virtual Human Interaction Lab, where he and his team have produced hundreds of academic papers over the past two decades exploring the psychological effects of immersive media. Specifically, his work describes the ways in which our virtual experiences, whether we're talking AR or VR, can influence how we think about ourselves, how we think about others, and encourage positive social and environmental change in the process. So you'll find him in the popular press, Wired, Wall Street Journal, Nat Geo, among many others, and his 2018 book, Experience on Demand, which got great reviews and I really enjoyed, is a great resource if you're wanting to understand how far the technology has come, as well as the promise and the potential dangers, which we'll talk about today, associated with it too. So thank you so much for joining us, Jeremy. Our second panelist is Kent Bye. Kent is a journalist, a VR historian, and the founder and host of the popular Voices of VR podcast. Kent has been documenting the evolution of VR for many years, and he's conducted over, I think, 1,400 interviews with many thought leaders in the immersive media space. I'm personally very excited to have him on as well, because he's someone who has a wealth of knowledge and insight as it relates to some of the ethical considerations in these emerging media platforms. His proposed XR Ethics Manifesto is a great resource, I think, for understanding some of the ethical considerations that we should have as users, designers, corporations, and governments, and what we need to keep top of mind as we shift more towards virtual spaces. He's spoken at events like South by Southwest and the 2019 Augmented World Expo, among many others, and I'm excited to get his thoughts and perspectives on how we as creators and designers can move towards more equitable and safer immersive worlds. So Kent, thank you so much for joining us.
[00:05:36.145] Kent Bye: Yeah, it's great to be here. Thank you.
[00:05:38.642] Daniel Pimentel: And certainly, last but not least, Dr. Donna Davis, someone I am very proud to call my colleague. She's an associate professor at the University of Oregon School of Journalism and Communication here in Portland, where she's also the director of the Oregon Reality Lab, leading really cool cutting-edge research initiatives on how immersive media can serve as an ethical tool for social innovation. Her work really examines the transformative benefits of virtual worlds, whether we're talking about Second Life or Sansar, and also how humans can embody virtual avatars and how these embodied experiences can be used to improve the quality of life for people with disabilities. So her work is really at the forefront of this stuff and has been funded by the National Science Foundation. She's produced a documentary called Our Digital Selves, which highlights a lot of this, sort of the uses and benefits of virtual worlds for people with disabilities. So I'm really excited to get her thoughts regarding access, accessibility, and inclusive design in the context of immersive media. So Donna, thank you for joining us. My pleasure, thank you. All right, so to provide some context for this panel discussion, we're obviously here to discuss ethical design considerations within immersive media, the whole suite, whether we're talking about mixed reality or virtual reality. But I think it's important to step back and acknowledge how, in a post-COVID-19 world, immersive platforms have become increasingly relevant as we adapt to this new normal, right? This perfect storm of social isolation and restrictions on human experiences has led the popular press to dub this as VR's moment, right? It's a solution to a problem, and maybe that's not entirely merited, but in many ways I've come to experience this over the past couple of months. So for example, like many of you, my summer conferences were canceled, and organizers shifted to virtual worlds like AltspaceVR and Mozilla Hubs. As my concerts were canceled, I found myself in Oculus Venues, in a virtual mosh pit. As my gyms were closed, I found myself using VR games like Supernatural and Beat Saber to stay active. And obviously VR lends itself to these use cases because you have the ability to feel present in those spaces, with those people. But in entering these spaces, we have many ethical dilemmas to take note of, certain problems that crop up as a function of shifting towards that, from privacy considerations to health implications, things designers should keep top of mind. So my first question, with that backdrop, is directed to Kent. What are some of the key ethical considerations when we think about XR design, AR, VR, mixed reality, that designers should really be particularly keen on and focusing on moving forward?
[00:08:15.123] Kent Bye: Sure. Uh, let me just, uh, share my screen. So yeah, I'm going to give just a brief overview, because any one of these topics in ethics we could dive into for hours. I gave a talk on ethical and moral dilemmas at AWE, I gave a whole half-hour XR Ethics Manifesto, and my pinned tweet on my Twitter page at @kentbye has a list of over 40 interviews I've done on this topic. But one of the things that I found in talking to lots of people is that there's lots of different contexts that VR is in, whether it's your home and family, medical information, entertainment, or your sense of your body and embodiment. And one of the things about ethics is that you're blurring together a lot of these different contexts. So a lot of what I try to do in my XR Ethics Manifesto is map out each of these different contexts and what the specific considerations are. And since today we're talking about design and designers, I'm just going to quickly go through some of the specific concerns as you're creating virtual worlds, what things you should be taking into consideration. The first one is just motion sickness. VR has the potential to create motion sickness, so there's a bit of an ethical responsibility to make comfortable VR experiences, to not induce motion sickness, and to have proper teleportation options. There's lots of implications of avatars. I know Jeremy's done a lot of work on this, in terms of what types of direct experiences you can modulate with different avatars, but also just having a diversity of different avatars available that can accurately represent the diversity and plurality of expression that you want. Biometric data is a huge issue, both in terms of what these platforms have access to in terms of our ad-tracking data and what they can extrapolate in terms of personally identifiable information, but also really intimate information: our sexual preferences, what we are paying attention to, what we value. Lots of implications here that are mostly at the platform side, but also as consumers we need to be aware that if we're giving over access to biometric data, what kind of information are we giving access to? And I know that Jeremy is going to be diving into that in more detail. Hyper-Reality is a video by Keiichi Matsuda. It's an augmented reality speculative future where he's showing a future where we just have advertisements everywhere. So this is a bit of the ethics of how much clutter we have in our world and what the relationship is between this type of surveillance capitalism and us in the future. Volumetric privacy: if you're capturing somebody's home, then that could also be revealing information. So if you are doing volumetric captures and storing them, then there could be very private information that's being leaked out there. So just take into consideration the rooms and what should be public and what should be private. Then there's this larger consent to augment: is this up to the owners of the property? There's an example of Pokemon Go augmenting within a Holocaust museum. There may be contexts in which it's inappropriate to do a game like that. And so is that a free speech right for people to be able to augment whatever they want? Or if you're on somebody's physical property, is it up to that property owner to determine that? Negotiating that free speech versus the consent to augment, how do we navigate that in the future?
One of the strengths of VR is that it could help cure PTSD. But if you can cure PTSD, then there's also the implication that you could potentially induce and cause trauma in people. And so there's this whole question around virtual violence. Hitman 3, the trailer just came out yesterday, and I watched it. There's a scene at the end where you're literally strangling another virtual being. So what does it mean to have these video game experiences where you're starting to have more and more visceral, embodied experiences of violence? What is that doing to our psyche? But there's also skill acquisition, the beneficial use of VR to train skills for soldiers that actually need to do some of these things. There's also distraction, people walking off of cliffs playing Pokemon Go. So what are the ethics around making sure that people are still aware of their environments? There's virtual being influencers. If we are going to be interacting with these virtual beings, what does it mean to have anthropomorphized representations that are going to be trying to influence us? Virtual harassment is a huge issue. So how do you mitigate it? Personal space bubbles, being able to mute people and block people, but also just dealing with harassment in general: what's the code of conduct if you have a social VR experience? Algorithmic biases: making sure that whatever algorithms you have are not unduly biased. And then accessibility. I know Donna's going to be diving into that, but just having accessible experiences and making sure that you're not creating experiences that are going to be inaccessible for people, also thinking about flat displays, whether you have a portal display or a virtual reality display, and really thinking about design across multiple modalities and making sure that it's as accessible as you can make it. So here are just some references if you want to look into some more information. But yeah, I think that's an overview, and I think from there we'll kind of be diving into each of these.
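To ground the harassment-mitigation point above in something concrete, here is a minimal sketch of a personal space bubble of the kind Kent mentions for social VR. It is not taken from any particular engine or platform; the types, radii, and function names are assumptions for illustration only.

```typescript
// Sketch of a personal space bubble: fade out or hide other avatars that
// come too close, unless the local user has explicitly trusted them.
// All names (Vec3, AvatarState, the thresholds) are hypothetical.

interface Vec3 { x: number; y: number; z: number; }

interface AvatarState {
  id: string;
  position: Vec3;
  opacity: number; // 0 = fully hidden, 1 = fully visible
}

const BUBBLE_RADIUS_METERS = 1.2; // comfortable interpersonal distance
const FADE_START_METERS = 2.0;    // begin fading before the full hide

function distance(a: Vec3, b: Vec3): number {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Called every frame: adjust the opacity of every remote avatar based on
// distance to the local user, plus the user's allow and block lists.
function applyPersonalSpaceBubble(
  localPosition: Vec3,
  others: AvatarState[],
  allowList: Set<string>, // friends the user has opted in to close contact
  blockList: Set<string>, // users muted or blocked outright
): void {
  for (const avatar of others) {
    if (blockList.has(avatar.id)) {
      avatar.opacity = 0; // blocked users are never rendered
      continue;
    }
    if (allowList.has(avatar.id)) {
      avatar.opacity = 1; // trusted users are unaffected by the bubble
      continue;
    }
    const d = distance(localPosition, avatar.position);
    if (d <= BUBBLE_RADIUS_METERS) {
      avatar.opacity = 0; // inside the bubble: hide entirely
    } else if (d < FADE_START_METERS) {
      // Linear fade between the bubble edge and the fade-start distance.
      avatar.opacity =
        (d - BUBBLE_RADIUS_METERS) / (FADE_START_METERS - BUBBLE_RADIUS_METERS);
    } else {
      avatar.opacity = 1;
    }
  }
}
```

The design choice this sketch illustrates is that the bubble defaults to on for strangers and is only relaxed for people the user has explicitly trusted, which keeps consent in the user's hands rather than the other avatar's.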
[00:12:51.166] Daniel Pimentel: Perfect. Thank you so much, Kent. I think that really does a great job of painting a nice picture of the landscape of some of the things designers need to be thinking of. And it kind of segues nicely into privacy and data concerns, because in the examples you're giving, obviously, when we dive into these immersive environments, users are handing over uniquely rich sets of data, right? We're talking physiological, conversational, proxemic; it's almost like the human experience is becoming more quantified, to a degree, as a function of that. But obviously these data points are necessary to create a rich user experience, right? So Jeremy, I was hoping you could speak a bit about some of the ethical considerations as it relates to privacy in VR, user data, anything related to that trade-off between privacy and presence, I guess is a good way to articulate it.
[00:13:33.331] Jeremy Bailenson: Yeah, thank you, Danny. And thanks, Kent, for setting the table so wonderfully there. And Danny, congratulations on a great job at Oregon. Oregon, congratulations on finding an amazing scholar. Fantastic, just wonderful. So my name is Jeremy Bailenson, and I arrived at Stanford in 2003 to build a virtual reality lab there. I'd been studying VR since 1999 at UC Santa Barbara. And one of the first non-academic grants I ever got was in 2003; 2004 is when the money came. It was a conglomeration of car companies that were donating money to the lab as gifts, and they wanted me to study virtual reality and instantiate it as a car simulator. So we had this amazing car simulator built at Stanford, which you can think of as a CAVE-style virtual reality setup using projection screens, and we had a steering wheel and a whole setup back then. And the car companies wanted to see if we could use machine learning to detect what we called the pre-accident phase. The notion here is that if there are cameras in an automobile, you can detect someone who's about to have a bad driving experience, meaning they crossed the line or they ran a light or they got in a car accident, by evaluating their body language. And the approach that we took to study this is we brought hundreds of drivers into the simulator. We used the virtual reality tracking system to track their body and also their face, given this was a CAVE that wasn't goggles-based, and we were using face tracking. And we'd have people drive in the simulator for about 40 minutes. And every once in a while, a driver would make a bad driving decision, and we would try to capture the data before that decision. So the notion would be that the car, if it knew this, would be able to correct that behavior or tell the driver that you're getting tired, et cetera. We ended up publishing this paper many years later, I think not until 2009, and we could reliably predict, above chance, if somebody was going to make a driving mistake simply by looking at their body language in the moments before they made that mistake. But an interesting thing occurred. In the hundreds of subjects that we ran, what we learned is that just from evaluating their body language and their gestures, you could actually predict if somebody was going to be a bad driver, not in the moment, but in general. In other words, there's variance in everybody on this call. Some of us drive better than others. There's variance in our skills. And we could actually pick that up from the way somebody moves their body and face during driving. And so my question to the people that can say yes or no in chat here: do you think insurance companies should have access to that data? And before you quickly say no, because we're all privacy fans, realize that last year 40,000 people died in car accidents. And if you could find out somebody was a really bad driver, and their insurance would have to be really high in order to get behind that wheel, do you want insurance companies to have that information? So, interesting quandary there. We then began a quest, both in my lab and with other labs, to really study what the tracking data that virtual reality gives you can predict, and without going into too much detail, I'm happy to send people papers. Scholars have shown that you can predict if someone has ADHD. They can predict if someone has autism. They can predict if you've had a stroke or not.
They can predict which stage of early dementia you may be in based on how you're moving your body in VR. They can predict if you like content that you're seeing. And so we know VR is indicative of movement, and movement is indicative of mental states. You think about social media: when you're typing something, you're doing it consciously, you're thinking about what you want to do, you're self-presenting. But very few of us have been taught how to regulate our nonverbal behavior, which is why nonverbal scholars often refer to this as nonverbal leakage, not something you can manage consciously. So in many ways, the nonverbal tracking data you get from VR is more telling than the things that you say, because we have all been trained since a very early age how to regulate our speaking, not so much our gestures. And so I want to read you an anecdote that I wrote. This is fictional, but I think it really sets the frame here of what can happen when very large companies have access to all of our tracking data. It seemed like a game when Riley first started the virtual reality maze. He used a room-scale setup, so by physically walking around his room, he could solve puzzles and visit different parts of the virtual maze. His friends were networked into the game, so even though they were in their own living rooms, when Riley turned his head, he could make eye contact with them. He could even give them virtual high fives; slapping avatar hands gave him the sensation of haptic feedback from his controller. What Riley didn't know was that the startup that created this game had decided to sell its users' tracking data. Riley also didn't know that a 20-minute VR game session recorded 2 million points of data about his body movement, and that an insurance company was one of the customers buying the game data. A month after playing the game, Riley was turned down for a new life insurance policy. Given his excellent health, he couldn't understand why. Several appeals later, the insurance company disclosed that Riley's tracking data from the VR maze game revealed behavioral movement patterns often seen among people in the very early stages of dementia. Later, Riley's sister, who had not played the VR maze game, was also rejected for life and long-term care insurance policies, as dementia tends to run in families. And if you, as a homework assignment, want to go and read the different privacy policies of the different AR and VR companies, many of them are very transparent about that. They own this data, and some of them are transparent that they will sell the data. So something to think about. The other thing I want to talk about is I want to push a little bit on Kent's point about distraction. Danny mentioned my book, Experience on Demand. When I was on book tour in 2018, I talked to most major software and hardware companies, and I went and spoke to a lot of phone companies. I would speak to the C-suite of Samsung and I would say to them, imagine you had a time machine and you could go back 10 years. Okay, it's 10 years ago, and smartphones are just coming out. They're not yet ubiquitous, and you can make a decision. And so the question I would ask everybody in that room is: how many people do you think are killed by people
driving while texting each day, driving while using their smartphone each day, in the United States? Conservatively, 10 people a day are killed because somebody is using their phone while driving a car. You know, pause on that for a second: 10 people a day are killed because we feel it's more important to use our phones than to drive properly. And so I say now we're in this special moment with VR. VR is just about to come out. And why I like this anecdote for this audience is you guys are designers. Imagine you could go back in time 10 years ago and you could put a feature in the phone that didn't allow it to work properly if you were going more than 10 kilometers an hour. It only worked if you were going at walking speed or maybe running speed, but certainly not at 60 miles an hour. Would you design your phone differently? And I asked them that; I didn't want to make them feel bad in the moment. But now, as we're designing VR, it's up to us to make it so VR doesn't work in a moving car, because people will be killed. Now, those of you listening are probably laughing and saying nobody would use VR while driving, that sounds insane. But we have two data points for this. The first is, as Kent brought up the game Pokemon Go, there are many documented cases of people getting into car accidents because they were playing Pokemon Go, meaning they were looking through a camera on a smartphone as opposed to the road while playing the game. The second data point is you may have seen an advertisement by an HMD company called Varjo, where they were basically showing that the resolution of their HMD was so high you could literally drive a car while using it, simply priming designers to think that way. So my call to designers, and I'll close on this sentence here, is make it so your app or your hardware device will not run in a moving car. And of everything I can tell you today, that's going to be by far the thing you can do that saves the most lives.
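For designers who want to act on that suggestion, below is a minimal sketch, not any vendor's actual API, of gating an immersive session on estimated travel speed. The PositionFix shape, the threshold, and the function names are all assumptions for illustration; a real implementation would also have to handle passengers, trains, and sensor noise.

```typescript
// Sketch: estimate travel speed from two position fixes (e.g. GPS or
// whatever locator the platform exposes) and block the immersive session
// above walking speed. All names and thresholds here are hypothetical.

interface PositionFix {
  latitude: number;   // degrees
  longitude: number;  // degrees
  timestampMs: number;
}

const WALKING_SPEED_LIMIT_MPS = 2.5; // roughly 9 km/h; faster suggests a vehicle

// Haversine distance in meters between two latitude/longitude fixes.
function metersBetween(a: PositionFix, b: PositionFix): number {
  const R = 6_371_000; // Earth radius in meters
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const lat1 = toRad(a.latitude);
  const lat2 = toRad(b.latitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(lat1) * Math.cos(lat2) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

function estimatedSpeedMps(previous: PositionFix, current: PositionFix): number {
  const seconds = (current.timestampMs - previous.timestampMs) / 1000;
  if (seconds <= 0) return 0;
  return metersBetween(previous, current) / seconds;
}

// Callers pause or refuse to launch the immersive session when this is true.
function shouldBlockImmersiveSession(previous: PositionFix, current: PositionFix): boolean {
  return estimatedSpeedMps(previous, current) > WALKING_SPEED_LIMIT_MPS;
}
```

The 2.5 meters per second cutoff is roughly the 10 kilometers an hour Jeremy describes for phones; the open design question, which the panel leaves unresolved, is how to distinguish a driver from a passenger without locking out legitimate use.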
[00:21:47.990] Daniel Pimentel: Thank you so much, Jeremy. Yeah, you brought up some really interesting points, and you identified several aspects of design that we need to take into account. You just talked about distractions being a big one, but I also want to focus a little bit on what you said with regards to body movement, right? The physiological data that's collected, that's being used, and in some cases can be misused. And that made me think a lot about the role of body movement and gait as it relates to self-presentation. And I know we have Donna here who I want to ask this following question to, as it relates to, you know, self-presentation and accessibility in VR. So one of the things I thought about was, you know, while we may not want certain nonverbal leakage to leak out, in some ways other people might want a particular gait, for example, on their avatar when they're walking through social VR as a way of expressing themselves accurately and showing their true self. But how do you reconcile that with some of the issues that Jeremy brought up in terms of extrapolating that data and arriving at certain conclusions? So just top level, Donna, what are some considerations that you have personally as it relates to accessibility in virtual spaces, as it relates to XR and the full suite of immersive media platforms? Any ethical considerations related to that that you'd like to highlight?
[00:23:03.439] Donna Davis: To that point, before I jump into that, I also noticed there was a question from the audience that ties back to that for Jeremy specifically, where they were asking: will VR, AR, or XR be the horse or the wagon being pulled in predicting things like data for autism? One of the big dangers is surveillance scoring. And if car and insurance companies don't update their own behaviors, will XR just be another appendage of an outmoded but very dangerous monster? That's a super interesting question.
[00:23:34.693] Jeremy Bailenson: Would you like me to answer now?
[00:23:36.215] Donna Davis: Yeah.
[00:23:36.735] Jeremy Bailenson: Yeah. Look, I think it's important that people talk about this stuff early and often. And, you know, circling back to the car accident example, we can predict who's a bad driver. If you can predict someone has early-onset dementia, and you can let them know three years earlier than it would have been detected, their lives are going to be improved immensely. And so all of this stuff is super complicated and has two sides, and it's really important to just be transparent about it. To your example of the cart or the horse, I like that, because I think the answer is we have to make our content so good that people want to buy it, as opposed to making it a vehicle that is selling advertising. In other words, if your thing is good enough, people pay for it, and you don't have to give it away for free so that others can advertise. And the point of the advertising is taking the data. I will say one more thing, and this takes a little while to wrap your head around: the way we do advertising currently is by product placement. Okay. There's a new way of advertising I believe will emerge with VR. And I'm not anti-advertising, by the way. If you're using a free product, someone's got to pay for it, and advertising is how we get free stuff. And a good advertisement connects somebody with a product they want. So I'm not anti anyone that works in marketing. It's a system that works for all of us. However, there's going to be a new mode. And the mode is someone's going to build a VR scenario, and it's going to be really interesting, and you're going to want to do it. It's part of the game, and you're doing it. The point of the scenario is to elicit a pattern of body movements from a person, okay? In other words, you give somebody some scenario and it gives them a certain mental state, and then you see what they do. Maybe they grab a red drink as opposed to a blue drink based on what they're feeling, or they go to the red wrapper. In other words, what you're doing is building models that are designed to predict behavior. Now imagine you're later on in the physical store. You've got a simple 3D camera that does 3D vision on the body. It detects a pattern that you demonstrated previously in the game and then alters something. It could be audio that comes over the supermarket speakers. It could be a digital panel that's there. But the big idea here, which is ethically challenging and an opportunity for those that are trying to build applications, is when the point of the advertisement is not to simply show you a product, but to see what you would do in a certain situation so that you can do the same thing later on. It takes a little while to wrap your head around that, but that's something to think about.
[00:26:01.439] Donna Davis: I've been making that point now for many, many years, because I work in social virtual worlds. These are spaces where people can build their own content, create their own experiences, and interact with others in a very similar way to what we do in the physical world. One of the biggest things that happens because of that is fashion: people like to accessorize their avatar through fashion in social virtual worlds especially. I mean, I have worked with what I now refer to as the legacy medium, which is Second Life. It's now 17 years old, so it has become legacy content. But watching the evolution of that space and the capabilities of that space, watching people buy clothes or buy fashion or wear fashion in the virtual world: I lived in Italy many years ago, and I always remembered that what I saw happen in Italy showed up in the U.S. a year to three years later. I'm seeing the same thing happen now in a social virtual world, where people are purchasing things in VR and I see it showing up a year or so later as popular fashion in the physical world. So, from an advertising and marketing perspective, what a wonderful focus group. It goes to the point you were just making, Jeremy. Kent, did you want to add to that before I move into answering Danny's question? No, go ahead. Okay. In responding to that issue of accessibility and design, I am also going to share my screen to take you into the Oregon Reality Lab. My avatar is Trady Felicimo. If you've been in the OR Lab in Portland, you will recognize this space, complete with the images of the city outside the windows, which I captured in photographs. Here I am in the lab, and as I think about what we do with creating content as designers, what I want you to do is think about the person that may not fit the norm of ability, especially as we're thinking right now about diversity, equity, and inclusion, and the ethics of design when you're creating these virtual spaces and bodies. So as an example, think about a person who is deaf. In a virtual environment, I can type in the local chat box, and voila, it shows up in this local chat box in the corner. So whenever we host events in this space, we always make sure that as someone is speaking, we are transcribing in the local chat box. That makes it very accessible for those who are hearing impaired. Think also about the blind. What happens when the blind are using screen readers? If I'm at an event and I use a gesture, you see applause show up. And if I had my volume turned up, you would have also heard loud applause. So if I'm at a music event, which happens very regularly in this platform, you will see this kind of chat show up, just like that. And on the screen, if a blind person is using a screen reader, what they're then hearing is asterisk, star, asterisk, plus, period, comma, period, et cetera, et cetera, et cetera. Before it ever even gets to the word applause, it's almost impossible for them to experience the space using these very, very fun visual creative gestures. When that translates to someone in audio, it's untenable. So as we think about the design of those sorts of creative and fun things that we can do in these platforms, think about voice and about the use of text. And then what I'd also like to challenge you to do is think about it from the perspective of those who are body diverse. I'm going to show you an image. This is an image of me outside of Linden Lab's headquarters in San Francisco a couple of years ago with Cody, who has cerebral palsy.
When we traveled with Cody to Linden Lab, we were trying to talk to the designers and developers, and I know we have a lot of developers in the audience as well, about how we're designing both the hardware and the software of these technologies. For somebody like Cody, you can see from this piece that's mounted on the bar of his wheelchair, that's his cell phone attached to it. He types with his nose on that cell phone, and that's how he can communicate in the virtual world. So that becomes his keyboard, if you will. When he and Tom Boellstorff, who is my co-investigator on this particular project, were on this trip, one of the things that we discovered is that he had just had hip surgery, and his caregiver who traveled with us couldn't get him to work on his exercises. He wasn't doing his leg exercises. And Tom said, well, what if we put the hand controllers of the Oculus on his feet? If he moves his feet, will he be watching his avatar's hands move? And in fact, we put them on his feet, he started kicking, and he was watching his avatar in Sansar, not in Second Life, but in Sansar, which was VR-enabled. He was watching his avatar's arms wave with his kicks. And he can't physically do that with his body in the physical world. So for him to watch himself, if you will, waving his arms through his own motor function, the joyful noise that Cody shared could be heard blocks away. It was a life-changing moment for him. So as we think about body diversity, think about how the hardware that we're using can also open doors and opportunities for people who don't walk or look like many of us. What I also want to address is that as we think about design, we also have to think about identity. And one of the things that is a really interesting and challenging piece, and I know that Jeremy has quite a few different avatars that he uses in different platforms: if you were to create an account in Second Life, you can choose an avatar, and these are the options that will come up for us. So if you're looking at the new avatars, this is their definition of old female. I'm sorry, I'm going to say mature female. I consider myself a mature female. Mature male, and of course they're in a tux and a gown. Then we have, this is use. And really fascinating, I was just in here with a brand-new person yesterday who was an African-American woman, a Black woman. And when you first open up the screen, this is all you see. So she didn't even see the option that she could choose a female avatar that looked like her. So as we think about avatar design and identity, and we think about diversity, equity, and inclusion, think about not only the body diversity and the way that we all live through our own physical bodies, but how that translates to our embodied beings in digital and virtual spaces. Actually, it's really fascinating: if you look at fantasy avatars, they're a little more diverse. Not a whole lot, but they're a little more diverse. We can choose a vampire, and we have more diverse vampires than we have regular avatars. And then these are the classic avatars, which have become far more diverse in the last year than they have ever been. So these are the basic avatars that you can choose when you create your new account. But even as you look at these avatars, I'm going to also point out that while you have diversity in skin and hair color, they're all young and they're all thin. So even as we think about body diversity and avatar design, what about people who have different shapes?
It's really fascinating, as I've spent almost 13 years in this environment, to watch the bodies change over that time. It has been amazing, because they always used to be tall and skinny. Now you see very curvaceous women very commonly. And again, I'm going to go to the cart and the horse question, or the chicken and the egg. Is the avatar influencing what we see as the norm in the physical world? Does the physical world translate into what we find as the norm in the virtual world? As we think about avatar design, the challenge is to think about all of those visual ways that we think about identity and how we represent those in avatars. As we think about avatar design, I'm going to flip around here to the back wall in the lab. I've taken some images here. These are the avatars that you can choose from in your basic account on AltspaceVR. So, interesting choice in the way they represent race. Also, really interesting choice in the way they represent age. So, if that's what I'm supposed to look like and that's the only option I have, how is that going to influence people's adoption of VR as they look for themselves or look for something other than that? I think it's also really interesting that, of course, in VR, we still don't have arms and legs. Right now we have these blocks with little heads on top of them, but we're starting to actually have hands. This is an avatar design for Requim. So as we think about it, again, there's diversity in skin color and hair style, but it's super limited. You see way more diversity in the way we look as human beings than you see in the avatars, which I'm just going to pose as a question, or a prompt, to all of us that are working in these spaces: AR, XR, VR, virtual reality, and social virtual worlds. Think about people like Fran, who was a woman with Parkinson's disease who passed away just a year and a half ago at 92 years old. These are people that have been part of a Parkinson's support group with me in the virtual world for the last eight, almost nine years. We all flew to San Diego to celebrate Fran's 90th birthday. And where we think about photorealism being important with avatars, especially as we think about work, what do we have to think about when we think about the way we look at work? Am I supposed to look like me? Or, as Fran would say, when we said, wow, you could do photorealism now, Fran's immediate response was, why the heck would I want to look like me? I prefer for people to think of me as the 30-something-year-old me who loves to dance and swim and play, not the me that's trapped inside this broken old body. So even as we think about identity and visual identity, how do we handle all the isms that we have in the world, especially as we're thinking about DEI right now? Think about how we put people in a box based on: do they have gray hair? Are they round? All of those things that are about people with ability and disability. Think about the ethics of design when we're thinking about identity, and how important flexibility is. Working with the folks at the Mayo Clinic, one of the things that Brian Cahoy used to say is that they had to really work through, from an HR perspective, what it meant to have people from all around the world who were coming to meetings as avatars, and they had a dress code. So what about the doctor who was working in the rainforest in Brazil, where the dress code was not much? And they had to come to agreement that, you know, we want you to come as authentic you.
And if authentic you is in shorts and bare feet in the middle of the forest, because it's hot and that's the only way you can be, then your avatar can look like that. So they still had to have some limits on what they would allow and not allow, but generally speaking, they had to open up what would have been the norm for them.
[00:38:55.727] Daniel Pimentel: Right. That's a lot to unpack there, Donna. Especially, you know, you sort of emphasized the importance of virtual identities and how important they are to users of these virtual worlds. I don't know if anyone wants to chime in on this topic really quick before we pose some other questions and jump into the Q&A. Otherwise, I have a question of my own that I want to lob back. There's a really good question in the Q&A about children in VR that I'll get to in a second. But as it relates to virtual identities, you sort of touched on how having this greater diversity of choice, greater freedom of expression of ourselves in the virtual world, influences the user experience. It can be very rewarding, like some of the examples you gave, but it could also backfire in certain cases when we think about how our virtual selves not only influence the amount of information that can be extracted from the user, but also influence our behavior in the virtual environment. And so I know, Jeremy, for example, you did a lot of work around the Proteus effect and things like that, how our avatars influence behavior and decision-making. What are some of the concerns when we think about the designers in the audience that may be interested in experiential marketing, that are looking at creating experiences that are essentially built to persuade, that might put you in another body without your decision or without your control, and how that might have implications if it's an advertisement to buy, you know, a car or a soda or whatever it is? What are some of those concerns, and maybe best practices or suggestions you have for people that might be encountering that?
[00:40:20.267] Jeremy Bailenson: So I'm not going to come up with much on best practices, because I think no one's figured that out yet. I will say, between the work that we did 15 years ago on the Proteus effect, which basically says that when you wear an avatar body, you slowly and surely take on the characteristics of that body, so when you wear attractive avatars, you're more social, and when you wear taller avatars, you negotiate more aggressively, and that work's been replicated for 20 years, and newer research on something called body transfer, which is a more neuroscience technique, where by putting someone in an avatar and having them do certain movements, they can take on the characteristics of a child perceptually, there's a lot of work that does say that wearing an avatar changes your behavior, even long after the VR experience goes on. Now, that being said, there's just not a ton of work on this yet. I'm going to show you one avatar here that I was just playing around with. This is an avatar from Lumi, and this is very much in tune with what Donna was just talking about. But what are the implications of wearing an avatar that's a totally different age from you? The point that Donna was raising, which I liked, which designers can take home, is that we know that avatar use does change the way you talk, change the way you gesture, change the way you think about yourself. The choices you make do matter. And so maybe my colleagues can come up with a better solution on tangible things for designers, but I can say from the psychology of it, we know that the body you wear does make a difference.
[00:41:52.945] Kent Bye: Yeah. And I'd just say that for a lot of the experiential marketing that I see, well, there's the virtual market that happened within VRChat. That was a market where you see a lot of different experiential marketing experiments happening. I think they've had like four of them so far, and the next one's coming up here in the winter, and they are selling avatars. And so you go in and you embody different avatars, you have a mirror, and so it's more of a market for you to try on the avatars and to buy them. So it's more specifically selling avatars. But I haven't seen as much experiential marketing where you are really embodying a certain identity, because there's all sorts of storytelling implications there in terms of, like, who are you? What character are you? That's harder to do from a storytelling perspective. So I tend to see more of you embodying a sidekick, and there are other protagonists there that are driving the narrative of the story. Or if you think about identity in terms of experiential marketing, you think of facial filters on Snapchat, where that tends to be more about leaving it up to the user to share a modulation of their own identity, and that then becomes a part of the brand expression. But the brand has to be willing to give the users quite a lot of latitude to remix their brand, and brands tend to be resistant to giving that much freedom, because who knows what is going to come of that. And so I think the identity piece is maybe something that will be coming a lot later, because of all these various issues that I just talked about there.
[00:43:16.689] Donna Davis: And it's a really interesting time when you think about it from that marketing and identity perspective, because it all goes back to what your purpose is in that space. So gamers are about role play, and your avatar takes on the role that you play in the game. And even in social VR, it can take on the role that you want to present yourself as in the game. But as we're talking more and more on these kinds of screens, where we've got to sit here doing the Hollywood Squares thing, or can you create these immersive experiences? We did a meeting in here last week in the OR Lab, virtually, with a group of people who haven't been able to meet in there for six months. And we all went, oh, it just felt so good, we all felt like we were sitting together in the lab again, in a way that we can't do right now. Think about that from a psychological perspective, from an immersive perspective, and the influence that that can have on the way that we interact with each other in these kinds of meetings. Or, or, or. But as we think about it from a professional and a marketing perspective, what can we learn from these experiences that might guide the way we think about each other as we look at each other as avatars, or the way we present ourselves as an avatar? How important is customization? I think Jeremy's done a lot of work that says customization is really important.
[00:44:45.359] Jeremy Bailenson: On the advertising front and identity in particular, Grace Ahn is a professor at the University of Georgia, and she did a lot of work in graduate school on something called self-endorsing. And this is what happens when, instead of product placement in the scene, the products are placed on you. And there's a way to think about that. So her research showed that you like the brands better, you remember the brands better, that you're influenced more. But you know, VR goggles typically cost 500 bucks. What if you were to get them for free, but every time that you saw your hand in the scene, it had a Pepsi logo on it? Would you save 600 bucks to have a virtual tattoo of Pepsi on your hand forever? I don't know. I bet you a lot of people would. I mean, that's the Facebook model, right? You pick the new product and you see advertisements basically on your stream. So I think there would be hunger in the market for free stuff with a little bit of product placement.
[00:45:34.918] Kent Bye: I just want to jump in about the advertising because I know there was a question about advertising. And I think there's a distinction that I think is important to make between just advertising in general and surveillance capitalism, which I think is an extension of the existing ecosystem of advertising that's so pernicious that I also think is extremely problematic when you start to extend out surveillance capitalism business models into the virtual worlds when you have access to all this biometric data, galvanic skin response, what you're looking at, eventually EEG data that could literally read your mind. We're talking about a pathway that is leading into like this dystopic future of being able to read our thoughts and to predict our behaviors and model ourselves perhaps more accurately than the stories that we tell about ourselves. So I think it's important to say that we do actually need new business models that go beyond surveillance capitalism, because that trajectory is extremely problematic. And I don't know if that's going to come just from the culture of people saying, no, I don't want to have all this stuff tracked, or if we actually need to have some level of policy to either give privacy as a human right that's embedded into a fundamental part of our legal structure, rather than something that's kind of piecemealed together. So I think that's, at the policy level, something that is still yet to be determined. But in the short term, as consumers, we have to be aware of the risks of melding immersive technologies with the existing business models of surveillance capitalism.
[00:46:57.063] Daniel Pimentel: Right. I think that's a great point, Kent. Before we wrap up, as we're sort of inching towards the end here, I do want to address some of the questions also that relate explicitly to what you all are talking about. In particular, I want to focus on Breanne Baker. She brought up, do you think it would be ideal to keep avatars for young children extremely abstract, given what Jeremy was just talking about? She meant the Proteus effect. It's sort of the same idea of gender-neutral toys, brandless Montessori toys. So just getting at this notion of the developing brain in children and how powerful these immersive experiences can be, how do we take into consideration younger audiences when we're crafting these immersive experiences? And feel free to chime in as you want.
[00:47:36.357] Jeremy Bailenson: I posted in the chat a report we wrote for Common Sense Media. That's a lot of what I know about VR and kids. On the avatar side, I really like your suggestion. I want to think through it. I haven't thought about whether they should be specific or general, depending on what Piagetian phase they're in. I really like the question. I don't know the answer, though.
[00:47:54.380] Kent Bye: I'd say you could look at what's happening in Rec Room. For me, I think that's probably an environment where you do have a lot of teenagers that are experimenting with that, and also Minecraft and Roblox. These are just environments to see what's already happening. And I do think a lot of the business model for folks like Rec Room is all about selling different ways to modulate your identity within these experiences.
[00:48:17.940] Donna Davis: It's also going to be super interesting because I know there was also a question about implicit bias, even as we think about children being more engaged in these environments. As you say, they've been in Minecraft, they're in Fortnite. Gen Z has lived in the screen from the beginning. So what have they learned and how does that influence the way they're thinking about each other? And as we think about implicit bias, is it different because they've grown up in a screen with avatars? And as we talk about policy and, you know, there's currently very little policy to protect children from the choices they make in these spaces and the effect that these spaces have on children, much less us old folks.
[00:49:02.977] Daniel Pimentel: Right, so great point on that. Yeah, Wes did mention, you know, thoughts about sharing, like you mentioned, VR as a tool to combat implicit bias, with all the events happening now, police training and the like. I don't know if anyone wants to touch on that very briefly as we reach the end here. Another question we have posted is by Keith, who says, decisions about surveillance capitalism need to be made first before XR decisions can be made safely. So I don't know who wants to tackle that.
[00:49:30.699] Kent Bye: I don't think that's anything that any one individual can make a decision on. I think they happen and they coexist. Something like GDPR is something that's fighting back, so there are legal structures that do start to try to address more privacy concerns, but it's basically the economic engine that's running Facebook that's really driving it. I think the question is how far they go with all this biometric data and how much is integrated into the underlying business model. A lot of the way it is right now is you pay to get access. But, you know, there's an argument, like Jeremy was saying: would you be willing to give up some of your privacy or identity expression agency to be able to bankroll and fund some of this? I think a lot of that has yet to be fully determined, but I feel like we're kind of just letting the big companies do what they're doing. And there's not a lot of recourse we have to stop it, other than if there's going to be a big massive movement culturally, or there are laws that are passed that say, hey, you know, this biometric data, maybe we shouldn't be using that for surveillance capitalism.
[00:50:26.455] Jeremy Bailenson: I'll just quickly say, on the policing front, we just made public that for the past few years we've been working with Jennifer Eberhardt, a professor at Stanford and a MacArthur Genius Award winner who studies racial bias. We have been using VR, not on race in particular, but on de-escalation techniques for police, working with a very large West Coast law enforcement agency, where we have virtual reality scenarios in which the officers are getting extra repetitions to interact with various people and to work on de-escalation techniques. So it's something I've been working on for a while. There's still not much to say on it, other than we've been really knee-deep in working on the partnership with the law enforcement agencies, which is going really well, and on content development. But it's going to be something we're excited to continue to work on, and yeah.
[00:51:17.214] Daniel Pimentel: Sweet. Thank you, Jeremy. Donna, did you want to jump in?
[00:51:20.094] Donna Davis: I've always said that currently, immersive technologies don't come with a Surgeon General's warning. Sometimes I think they should. But as we think about, you know, what kind of effect they're going to have on us, I mean, there's been some really interesting research on too much time with a lens that close to us: is it elongating our eyeballs? You know, physiologically, how is it changing our body? Does it actually change our body? Is it changing the way our mind is processing data? The image you showed, Kent, about the new killer game, you know, these are things to think about, and the World Health Organization last year finally recognized gaming as an addiction issue. These are things I think we need to be having conversations about as we think about the power of being able to get into that screen, where it takes you that much deeper. Danny could have even addressed some of these things, as he just put people inside a sea turtle in the study that he did and took them into the ocean, where their hands were the sea turtle's flippers. You know, what are the big questions we should be asking each other, and our industry, and researchers, that help us think about how we protect people from potentially harmful consequences while we're also looking at the potential gifts that can come from this kind of immersiveness, and creativity, and escapism, and entertainment, and, and, and, and, and connection? Don't throw the baby out with the bathwater, but let's make sure the bathwater isn't too hot.
[00:52:58.337] Daniel Pimentel: Lastly, Kent, did you want to add anything?
[00:53:01.242] Kent Bye: Well, as designers, I think that ethics is not something that has a clear line. There's never a clear line with ethics. It's always kind of blurry and ambiguous, and there's always trade-offs. When I put together my XR Ethics Manifesto, I laid out, in the perfect situation, here's what you do. And then I realized, if you were to make anything, you could never do everything perfectly. There's always trade-offs between these different contexts. I think that's the challenge: whatever you're doing, you can't do it perfectly, and you have to figure it out, because we don't have perfect information. We don't know about all this stuff. There's going to be potential harms that could be done with this technology. And I think the role of designers is to try to mitigate what those harms are, and to try to, as best you can, be aware of all the potential risks, and rather than move fast and break things and just put things out and figure it out later, do a little bit of work ahead of time to see, okay, if you're going to be doing social VR, what are the policies around harassment? What about avatar representations? I mean, there's all these additional questions that you have to ask yourself around algorithmic bias and diversity and inclusion and representation. But my takeaway is that there's no perfect answer and that we're all just trying to figure it out. And each iteration is helping to perhaps give important information to other people that are also pioneering in this space.
[00:54:15.868] Daniel Pimentel: Right, I completely agree. I think there aren't necessarily concrete solutions, just really important conversations that designers need to be having with all kinds of stakeholders. With that being said, hopefully the people who attended today can leave with a greater awareness of some of these considerations in immersive media. They might not be working with a VR platform specifically, but they may in the future. So having actionable steps, things they can think about collectively or individually in order to design with ethical integrity, I think is the main takeaway. And I think all of you did a wonderful job of laying that out. So I just want to say thank you to such an amazing panel. Thank you for sharing your insights.
[00:54:54.440] Erika Berardi: Yes. Thank you to our panelists. Thank all of you guys for coming. Thank you, Danny, for moderating. You did fantastic. We all are giving you a virtual round of applause right now. So we will be emailing this recording to everyone. And have a wonderful Friday and the rest of Design Week Portland. Thank you.
[00:55:14.086] Kent Bye: Thank you so much. Thanks, everybody. Bye, everybody.
[00:55:17.427] Donna Davis: Take care.
[00:55:18.878] Kent Bye: So that was the panel discussion on Ethical Design in Immersive Media that happened during Design Week Portland, organized by the University of Oregon. It featured Jeremy Bailenson, the founding director of Stanford University's Virtual Human Interaction Lab; Donna Davis, an associate professor at the University of Oregon, director of the Strategic Communication Master's Program, and director of the Oregon Reality Lab; and Daniel Pimentel, an assistant professor of media psychology at the University of Oregon.

So I have a number of different takeaways from this panel discussion. First of all, I was really struck by a lot of what Jeremy was talking about in terms of the specific things you can glean from the data. There's a lot of research that I hope to dive into in more detail at some point on the specifics of personally identifiable information that could be extracted from different behaviors within virtual reality, and I know he's continuing to work on those issues. Jeremy shared this fictional anecdote: you go into a VR maze experience, you do a bunch of different things, and then the people creating that experience take that data and sell it to someone like an insurance company, which extrapolates whether or not you're at risk of developing dementia or something like that. And then family members could also be denied coverage based on data that's been extrapolated about you. So it starts to ripple out to people beyond just your own decisions; the decisions that other people make end up having an impact on your life as well. A nice little Black Mirror-esque scenario, but something that is not so implausible.

One of the things Jeremy said, to follow along with that, is that when we have experiential marketing experiences, there could be ways of tracking information about our behavior, what we're paying attention to, what we're valuing, and maybe they're able to profile us in different ways. So if we're getting access to these free experiential marketing experiences, then what is happening to the data on the backend, and what are they going to be able to do with the data they're capturing on us? It could be a real benefit for them in terms of what they're able to glean about us based on the different scenarios they're putting us in.

And then Donna Davis: a lot of the work she's talking about here is diversity, inclusion, and identity in these virtual representations online, but also different ways to make virtual worlds accessible. Because Second Life has been around so long, and such a large portion of its users are not fully abled, there's been quite a lot of focus on accessibility issues there. Second Life is screen-based, so people can type text on their computers to communicate, and for a long time that was the primary mode of communication within Second Life. If you're deaf, that's actually a much more accessible interface for a virtual world. What I've been seeing with virtual reality technologies is that those kinds of considerations have been thought of last, because we spent so many years just trying to get VR to simply work for people who are able-bodied.
Now that it is working for people who are able-bodied, how do we go back and start to reconsider these things? If we are going to have these text-based mediums, text-based interaction in VR is actually a terrible interface; screen resolutions are so low that you can't even read the text anyway. As resolutions get higher and we potentially have more AI to transcribe speech, maybe there will be more accessible options there. But generally, I think that's still a big issue within the larger immersive technology sphere: there's a lot of complicated technology involved in making it all work, and it's already such a niche of a niche of a niche that a lot of the folks creating these virtual worlds have just been trying to focus on building viable business models with the small groups of people who are already using them. But as we move forward, I think this is absolutely crucial, and there are a lot of lessons to be learned from things like Second Life. An example she gave was these visual communication languages, emojis, or Unicode characters strung together into a pretty picture that people post. Aesthetically it's very pleasing for people who can see, but for people who can't see, when a screen reader reads it, it basically turns into garbage and disrupts any viable communication that's happening there. So there's the question of how to actually make things accessible for people who have impaired vision. I actually have a whole interview that I did with Donna over three years ago, and I'll be diving into that after I put this episode out, because there are a lot of other aspects, considerations, and lessons we can learn from Second Life about how to bring more accessibility into these virtual worlds.

The last point that jumped out at me was this idea that Jeremy shared: what does it mean to be able to brand yourself with virtual tattoos? Would you be willing to have that tattoo on yourself in order to get access to things for free? I've already started to see that a little bit within VRChat. There's the Virtual Market, where you can try on different avatars, and either you buy the avatar outright, or a QR code comes up that's plastered all over your body. So you're essentially walking around in an avatar, marketing it to other people who may want to buy it without that QR code. If people are comfortable with advertising that way, then there are different ways the artists could tag those avatars to mark them as samples. Maybe there's a legitimate version that's a little less intrusive than a giant QR code on your chest, something like just the name of the artist, different versions of what Jeremy described as this metaphorical tattooing of Pepsi onto your wrist. He's saying overall that he's not anti-advertising, and I'd say I'm not anti-advertising either; I'm just anti-surveillance-capitalism. I think there could be a convergence of those two things: undue surveillance of all of our behaviors, tied into predictive models that undermine our agency within these environments. I think that's certainly problematic.
I think Jeremy's point is that there could be legitimate market mechanisms where advertising gives you access to free experiences, with the trade-off that you express that brand as a part of your own identity. More generally, with these avatar representations, whether or not there's a diversity of age and race, you can start to take on the characteristics you project onto the meaning of those avatars. So if your avatar is a child, or a different race, or more or less attractive, you start to take on those characteristics. That's the Proteus effect, along with body transfer, which is some of the research that Jeremy and the wider community have been doing, looking at the impact of having these avatar representations within virtual worlds.

So again, overall, there are so many different issues of ethics and design. For my part, I tried to condense things down: I put together some slides, and I'll maybe put together a short five-minute video as a little taster, but that in itself could be a whole hour-long talk, really diving deep into all these different dimensions of ethics. As designers, what can we actually do about it? And what are some of the things that are out of our hands, where maybe we need help at the policy level, like this question around privacy? Is that something the market's going to take care of? Is that something the culture is going to be aware of? It looks like the cultural and market dynamics are leading down a path where nothing's going to change anytime soon, unless there are external laws and regulations, like Europe's GDPR, to shift those economic behaviors. So, yeah, I think there's still a lot that's yet to be determined on those larger issues, but I was happy just to be able to participate in this and to cover some of the different issues that we talked about here.

So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So, you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.