#1057: What Parents Should Know about Social VR, Understanding Social VR Harassment, & Parental Guidance for the Metaverse with Lance G. Powell, Jr.

parental-guidance-for-the-metaverse
There are a lot of unsupervised children in different social VR applications, and hopefully this interview with Lance G. Powell, Jr. about his article Parental Guidance for the Metaverse can help inform parents about some best practices for navigating virtual reality with their children.

Many social VR spaces are still largely unmoderated, which means that there is a lot of harassment, trolling, and intolerant behavior happening. Powell wrote his Master’s Thesis on A Framework for Understanding and Detecting Harassment in Social VR, where he started to map out signatures of harassing and abusive behaviors including “sounds related to sexual activity, presence of sexual or hate-related imagery, incitation to violence, large quantity and repetition of vulgar language, large variety of taboo or controversial topics, proximity of vulgar language and behavior to login time, physically mimicking sexual activity, and suprasegmental features of speech.”
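To make these signatures concrete, here is a minimal, purely illustrative sketch of how such signals could be combined into a rule-based harassment score. This is not Powell’s actual implementation; the feature names, weights, and thresholds are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SessionFeatures:
    sexual_audio: bool           # sounds related to sexual activity detected
    hate_imagery: bool           # sexual or hate-related imagery present
    incites_violence: bool
    vulgar_word_count: int       # vulgar tokens in the session transcript
    taboo_topic_count: int       # distinct taboo/controversial topics raised
    seconds_since_login: float   # vulgarity right after login is suspicious
    mimics_sexual_activity: bool # avatar physically mimicking sexual acts

def harassment_score(f: SessionFeatures) -> float:
    """Combine the signature features above into a rough 0..1 score.

    Suprasegmental (prosodic) speech features would need a separate audio
    model and are omitted from this toy example.
    """
    score = 0.0
    score += 0.25 * f.sexual_audio
    score += 0.25 * f.hate_imagery
    score += 0.30 * f.incites_violence
    score += 0.15 * min(f.vulgar_word_count / 20.0, 1.0)
    score += 0.10 * min(f.taboo_topic_count / 5.0, 1.0)
    if f.vulgar_word_count > 0 and f.seconds_since_login < 60:
        score += 0.15  # vulgar behavior in close proximity to login time
    score += 0.25 * f.mimics_sexual_activity
    return min(score, 1.0)
```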

Parental supervision in VR is a hot topic right now, and Powell wrote up a list of 10 points to help parents start to navigate the risks within social VR applications.

  1. Don’t Talk to Strangers (if you’re under 13)
  2. Don’t Go to Other People’s Private Worlds [of people you don’t know]
  3. Learn the Safety Tools
  4. Record the Incident
  5. Only Go to Private Worlds
  6. Understand the Social Norms of a World / Environment
  7. Be Cautious about How You Report Harassment
  8. Avoid Conflict Avoidance
  9. Plot Twist — Your Kid Might Be Part of the Problem
  10. Play Half+Half on Quest

I had an extended discussion with Powell about his list in this podcast interview, and overall I believe this is a great start in helping people have a safe and enjoyable experience in social VR spaces. I don’t agree with all of his points or perspectives, and we deliberate on those within our extended discussion.

For parents, one of the most important things to know is that the minimum age for using VR is 13 years old. According to the Oculus Quest Terms of Service:

Oculus Products are intended solely for users who are 13 or older. Any registration for, or use of, Oculus Products by anyone under the age of 13 is unauthorized, unlicensed and in violation of these Oculus Terms. You certify that you are of the legal age of majority in the jurisdiction in which you reside or, if you are between the ages of 13 and the legal age of majority, that you are using the Oculus Products with the supervision of your parent or legal guardian who agrees to be bound by the Terms, and that you have reviewed the Terms with your parent or guardian so that you both understand all of your rights and obligations.
[Italics added for emphasis]

There’s a clause in there that 13-to-18-year-old teenagers should be “using the Oculus Products with the supervision of your parent or legal guardian.” This type of parental supervision is not currently happening, and there’s no way for Meta to monitor or enforce this. Age verification has been a hot topic in the VR community, and Andrew Bosworth (aka “Boz”) responded in his Instagram AMA on 2/10/22 (mirrored here) to the question “How are you going to handle the “kids in VR” problem long term? Age Verification?”

“You know, we’ve been very clear, [VR] is a product that’s for people who are 13+, and that parents should be monitoring even children ages 13+ and their usage, which you can do through casting and streaming on the app to a phone or to the TV. All of our content is rated. All of our developers have to follow our content guidelines and policies. This is an issue that parents have to take seriously. I don’t know of any system that is going to verify ages that can’t be circumvented if you have unsupervised youth. Listen, I’m a parent myself. I understand the importance of this stuff. I really do. It’s important to me too on a personal level. But that’s why I’ve got to take responsibility for it myself. So, you know, we have the systems and policies in place. We’re very clear that is a product for 13+, and that parents should be monitoring kids between ages 13 to 18 for their usage.”

Powell’s reaction to Boz’s statement is that Rec Room has implemented junior accounts for children 12 and younger, and that there are ways to start to mitigate this, such as having accounts run credit card transactions to verify age. Boz is correct that there are no perfect solutions that will always prevent circumvention. But even Meta’s Horizon Worlds is allegedly only available to “people 18 years or older in the US and Canada,” and I say allegedly here because Boz is saying that Meta has no pragmatic way of enforcing this 18+ requirement, and is instead putting the burden onto parents to monitor everything that their children aged 13-18 do within virtual reality. Powell says that it’s probably unrealistic and unreasonable to watch everything your teenagers are doing in VR, but at least if they’re in the same room with you, then you can start to hear what they’re saying and what’s happening.

This is XR engineer Avi Bar-Zeev’s reaction to Boz’s take on age verification: “This is BS. Apple implements TouchID, FaceID, ScreenTime, age checks to make their phones safer.” He goes on to say, “We’re expected to believe that a company built on identifying us across the entire World Wide Web (even when we delete our accounts!) based on cookies, trackers and tiny unique differences in our phone sensor data can’t tell kids from registered adults?”

https://twitter.com/avibarzeev/status/1496659779548184577

There are other privacy implications of Meta always knowing who is using a headset, but there are certainly ways in which this type of biometric identifying information could be used to help protect children and prevent the widespread presence of underage kids within VR. For the moment, though, Meta is passing the buck to parents to responsibly monitor their children. That isn’t currently happening at scale, and many VR applications are flooded with kids younger than 13 as well as unsupervised teenagers.

The minimum age for VR is 13 years old also in part due to Meta’s required compliance with the Children’s Online Privacy Protection Rule (COPPA) in the United States. Companies are not allowed to collect personal data from children under 13 without verifiable parental consent, and Meta has not announced any plans for putting in the safeguards for children 12 or younger that would bring it within COPPA compliance.

Another reason for the 13+ requirement is probably that a lot of social VR applications have an Entertainment Software Rating Board (ESRB) rating of Teen (i.e. 13+), and some apps like VRChat and Meta’s Horizon Worlds should probably have an ESRB rating of Adults Only (18+).

Rec Room actually has an ESRB rating of Everyone 10+, and they do have junior accounts available for children who are 12 years old or younger, which restrict who they can speak with. But it’s also not recommended that children under 13 use VR at all yet, as the minimum age for the Quest is 13+. The content of Rec Room is technically 10+ because you can access it via 2D platforms like Xbox, PlayStation, iOS, and Android, but the VR version should probably be rated Teen to make this clearer to parents, especially because the visual systems of kids under 13 may still be developing, and the long-term effects of the vergence-accommodation conflict on their vision are currently unknown. Early studies indicate that there are potential risks of harm for children, and so for a variety of these reasons, 13 years old is the recommended minimum age for using VR.

There are also other social VR platforms like VRChat where there are additional risks. The BBC published an article on February 23rd, 2022 titled “Metaverse app allows kids into virtual strip clubs,” where they reported that Andy Burrows from the National Society for the Prevention of Cruelty to Children said that children are “being exposed to entirely inappropriate, really incredibly harmful experiences” within VRChat. The BBC reports that a “researcher posing as a 13-year-old girl witnessed grooming, sexual material, racist insults & a rape threat in the virtual-reality world.” There is a lot of sexual harassment, grooming, and generally toxic trolling behavior happening within public instances of VRChat, and Powell’s list offers some ways to mitigate these risks. However, there are many people within the VRChat community who are suggesting that perhaps it is time for VRChat to become an 18+ application, as there are a lot of things happening within the platform that make it not suitable for teenage kids.

You can check out the comments and Quote Tweet comments on this Tweet for more discussion on that front.

There probably is some responsibility left to platforms like VRChat to enforce their Terms of Service and Community Guidelines via moderation. In looking at their policies, VRChat’s Terms of Service say that users should not “post, upload, or distribute any User Content or other content that is unlawful, defamatory, libelous, inaccurate, or that a reasonable person could deem to be objectionable, profane, indecent, pornographic, harassing, threatening, embarrassing, hateful, or otherwise inappropriate.” This is mostly about the avatars and worlds being uploaded, which do go through different layers, such as living within Community Labs before being made more widely available through search.

The harder thing to moderate is violating behaviors, especially because there are up to 90,000 concurrent users within VRChat at peak hours, which makes it extremely difficult for any moderation system to prevent behaviors that violate the Terms of Service prohibition against using “the Platform in any manner to harass, abuse, stalk, threaten, defame, or otherwise infringe or violate the rights of any other party.” The community guidelines in VRChat do not accept intolerance such as “Hate speech, including language, symbols, and actions” nor “Discrimination towards spiritual beliefs, gender, sexual orientation, sexual identity, disability, and/or any other personally identifying factors.” They also do not permit harassment such as “repeatedly approaching an individual with the intent to disturb or upset” or “going through other individuals and channels such as social media to continue to harass an individual after being blocked.” But again, because these violating behaviors occur on an individual basis in real time on the platform, it is difficult to use automated technologies to police individual behaviors. In order to enforce these policies, users have to rely on the built-in safety tools for blocking or reporting violating behaviors.

All of this reiterates the importance of parental oversight if your teenager is spending time on these social VR platforms.

But what makes VRChat different from other social VR platforms is that they do have some tolerance for sexually explicit behaviors on private instances. Public instances are open for anyone from the public to enter, while private instances restrict access based upon knowing someone who is within that instance, which is metaphorically like being behind closed doors within someone’s private residence.

In the inappropriate conduct section of VRChat’s community guidelines, VRChat does not allow any type of pornography or nudity anywhere on their platform, whether within public or private instances. However, they also say that “live-streaming, advertising or publicly sharing content that is sexually explicit in nature or simulates sex acts is not permitted.” Notice the “publicly sharing” clause here. This implies that sexually explicit content and the simulation of sex acts are only prohibited if they are publicly livestreamed, advertised, or publicly shared, which means that these things are okay as long as they stay within a private instance.

The implication here is that sexually explicit behavior is okay as long as it happens between consenting adults on private instances. The issue is that the minimum age for using VRChat is 13 years old, and because there’s no age verification, there’s no guarantee that the sexually explicit behavior happening on private instances is actually between consenting adults. The presence of sexual predators and sexual grooming is a whole other issue that the BBC article points out, as the researcher went undercover as a 13-year-old and personally experienced sexual grooming.

Part of VRChat’s Terms of Service is that teenagers from 13 to 18 need to get consent from their parents, but it is unclear how the platform is currently enforcing this. Here’s what their Terms of Service say:

2. Eligibility
You must be at least 13 years of age to use the Platform. By agreeing to this TOU, you represent and warrant to us that: (a) you are at least 13 years of age; (b) you have not previously been suspended or removed from the Platform; and (c) your registration and your use of the Platform complies with all applicable laws and regulations. If you are at least 13 but are under the age of 18, you may only use the Platform with your parent’s or guardian’s consent and your parent or guardian must provide consent after having read this TOU.

There’s an implication here that parents need to consent to their teenager being on VRChat, and according to Meta’s Boz, they should also be more directly supervising their teens’ activities in VR. Because there doesn’t seem to be much moderation happening in VRChat’s private instances at all, it is pretty much the Wild West when it comes to what may be happening behind closed doors on the platform, especially when it comes to sexually explicit activity.

It was unclear from the BBC article whether they were on a public or private instance, but either way it shows the possibility of teenagers hanging out in a virtual strip club. It is for this reason that a lot of VRChat users who do engage in 18+ behaviors on the platform are in favor of increasing the minimum age for VRChat from 13 years old to 18 years old. It is also for this reason that Powell, myself, and others have a hard time recommending to parents that they let their teenager hang out on VRChat unsupervised, or potentially even be allowed onto VRChat at all.

Boz said that all of the content on their platform is rated, and VRChat is currently rated by the ESRB as “Teen.”
ESRB-teen-VRChat

However, as the BBC article points out, a researcher posing as a 13-year-old was using VRChat and reports, “The BBC News researcher – using an app with a minimum age rating of 13 – visited virtual-reality rooms where avatars were simulating sex. She was shown sex toys and condoms, and approached by numerous adult men.” Either VRChat needs better moderation strategies for violating content, or its ESRB rating needs to be increased to Adults Only (18+).
ESRB-ratings

The ESRB rating and Terms of Service for Meta’s Horizon Worlds send mixed messages about the minimum age requirement. There’s a blog post where Meta declares that the age requirement for Horizon Worlds is 18+, yet a minimum age is not listed as part of their Horizon Worlds Terms of Service, and Horizon Worlds is rated Teen (13+) by the ESRB.
horizon-worlds-esrb

Figuring out how to deal with children in VR is a hot topic right now because there are a lot of unregulated and unmoderated risks without any proper age verification systems, and the ESRB ratings don’t seem to reflect the current recommended practices. There are Terms of Service obligations and community guidelines, but again there seems to be a different enforcement philosophy between public and private instances on VRChat, and there are not robust enough automated systems or manual oversight to monitor and enforce their guidelines. So I can’t recommend VRChat as a safe space for children and teenagers to hang out in, and it’s probably best avoided, which re-emphasizes the need for proper parental supervision.

That all said, it is also possible for teenagers to have enjoyable experiences on social VR platforms with some parental supervision, and Powell’s Parental Guidance for the Metaverse should be helpful not only for parents, but also for teenagers who follow the best practices listed in the document. And listen in to our discussion, which breaks it all down even more, along with other things to consider as your children explore what’s possible within virtual reality.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR podcast. So on February 23rd, 2022, the BBC wrote an article titled Metaverse App Allows Kids Into Virtual Strip Clubs. So the tweet that I made was that the BBC reports that children are being exposed to entirely inappropriate, incredibly harmful experiences within VRChat. A researcher posing as a 13-year-old girl witnessed grooming, sexual material, racist insults and a rape threat in the virtual-reality world. So I posted this on February 23rd, early in the morning, and it ended up getting over 100 quote tweets and nearly 100 different comments. It was hitting a lot of hot-button issues here. Number one, that there are a lot of underage kids that are in virtual reality. 13 is supposed to be the minimum age. But even from 13 to 18, a lot of the comments that we were getting from VRChat users was that VRChat is not really a great application for kids who are from 13 to 18, and there's a lot of unsupervised kids that are running around, and it's just not a great situation for them. There is the potential of sexual predators and other situations and contexts, which is kind of like the equivalent of sending your kid out into a city unsupervised. So that's one big hot-button issue. The other issue is around harassment and the types of situations you get into in these public instances of social VR worlds. So there's a lot of people that were then commenting on, you know, what the best practices were, and there's ways of blocking or muting people. So there was a lot of discussion that was happening as a result of this article. And one person, Lance Powell, who has done quite a lot of exploration of social VR worlds, actually wrote his master's thesis about a framework for understanding and detecting harassment within social VR, just putting himself into social VR spaces to understand the types of harassment that were happening and ways to mitigate it. He saw this BBC article and wrote up a whole article called Parental Guidance for the Metaverse. I'm just going to quickly run through his top 10 points, and then we'll dive in and unpack it in much greater detail within the context of this interview. So, number one is don't talk to strangers. Number two, don't go into other people's private worlds, especially of people that you don't know. Number three, learn the safety tools. There's a lot of different tools that you can use to have a better experience in terms of blocking and muting, and if you do encounter these different situations, there's certain tools that are built into the software. There's not a really great onboarding for teaching users how to use those, and so that was a discussion point that we brought up. Number four is to record the incident, so if there are things that are happening, to record them and be able to report that back to the social VR platforms. Number five is to only go into private worlds, because when you go into these public worlds, you're facing lots of these different types of toxic, harassing, trolling behaviors. Number six is to understand the social norms of a world and the environment. This is about how there are ways in which people are having good experiences within these social VR worlds and what the best practices are for doing that. Number seven was to be cautious around how you report harassment. I think this one is actually less advice for parents and more a general commentary about the discussion around VRChat and the people who are really identifying with it.
If there's any one of these points that I disagree with, it's aspects of No. 7 that I have the most resistance to, which we discuss here in the podcast. No. 8 is to avoid conflict avoidance, so to be able to step up to people when there's harassing behaviors. No. 9 is the plot twist of what happens if your kid might be a part of the problem, and is actually the one who's doing the harassing. And then No. 10 is just a suggestion to play a game called Half + Half on Quest. So in the context of this conversation, we talk about each of these different points, but also explore a little bit about his framework for understanding and detecting harassment within social VR, just to understand the landscape and some of the different technological solutions that are there, but also how this is not a problem that can be completely solved from an engineering and technology perspective alone. I mean, these are human beings engaging with technology, and there's always going to be some aspect of abusive behavior. And so the question is then how do you mitigate that, not only for yourself as you're going around these social worlds, but especially if you're a parent, how to either supervise your kid who is going into social VR, or potentially not even have them go into applications like VRChat, which is a big discussion that we had as well, along with the challenge of having a lot of unsupervised kids within VR in general and all the different variety of problems that come up, but also just generally these larger issues around harassment and trolling that we start to unpack here on this episode. So that's what we're covering on today's episode of the Voices of VR Podcast. So this interview with Lance happened on Thursday, February 24th, 2022. So with that, let's go ahead and dive right in.

[00:04:43.550] Lance G Powell Jr: Hi, my name is Lance Powell. I am the CTO of VEDX Solutions, which is an immersive education company. I also, in my free time, work with Euromersive Turkey, which is a local community helping support VR, because currently I'm in Istanbul. I also have a very unpopular YouTube show called VR Special that's been going on since essentially the pandemic began. And I guess I'm here today because I have a research background in harassment within social VR. I did that as part of my master's study at Boğaziçi University, where I graduated with a degree in cognitive science.

[00:05:25.875] Kent Bye: Okay. Yeah. Maybe you can give a bit more context as to your background and your journey into VR.

[00:05:31.800] Lance G Powell Jr: So my journey into VR began at the beginning of 2016, when a VR First lab opened up at a university in Istanbul called Bahçeşehir University. I tried out VR for the first time meaningfully, and I was immediately hooked. I begged them to let me become a member, and I became one of the more active participants there, helping the community grow, helping support events such as hackathons, and over time developing an eye for design and so forth. But I was immediately interested in networked experiences within VR. So at that time, AltSpace was just coming out, along with VRChat and Rec Room, among some others, and I became active there as well, because even though they were new, there weren't many people, and the types of activities they offered were limited at the time, I saw a lot of potential there. And I've been happy to see all three of those platforms survive until now.

[00:06:33.228] Kent Bye: Okay. Yeah. Well, I guess there is an article by the BBC that came out just a couple of days ago that was investigating a number of different incidents of harassment, sexual harassment, racist language, and a lot of sexually explicit content that was in various different public rooms of VRChat. And so there was a lot of reaction from the community. There's certainly a lot of kids who are in VR. The minimum age is supposed to be 13 years old, and so there's lots of kids who are less than that age in VR and these different social VR platforms, but also larger discussions around whether or not kids who are from 13 to 18 should even be in VRChat and whether or not it should be 18-plus. And so a lot of discussions around supervision of children within VR if they are teenagers. And Meta's official position is that parents should be streamcasting whatever's happening in VR and monitoring what their kids are doing in VR. So maybe we could just start there, with the article that was reported by the BBC shining a light on some of the culture that's happening within these different platforms. What is your initial take on the current situation with harassment within VR? And then your take on what the most viable solutions are, not only at a platform level, but also at a parental level, and for users, what they should know.

[00:07:56.976] Lance G Powell Jr: So I can begin with the BBC article, just because I also had a strong reaction to it, which is why I ended up writing a blog in one sitting about parental guidance within social VR. I don't know completely what happened as far as the journalist; as I remember, she claimed to be undercover posing as a 13-year-old within VRChat. And I have a lot of questions about how somebody poses as a 13-year-old. Like, did she announce it to one person and that's it? Because it seems like it was a long, drawn-out experience. That said, the problem that she's describing is certainly plausible, and I've seen some sexual content there, and underage people could often be involved, which is why we need to take steps to protect them. But I think there's a wider disagreement about what those steps are and who should be taking those steps. Because either you need the platform to do something for you, which can be restrictive to your experience within VRChat and other platforms, or you can appeal to a wider community, allowing them to clean up their own backyard, basically. And I personally would prefer that we help support communities so that they can monitor and take care of the people around them, so that it is a safe and welcoming place for everybody and people get the experience they want out of being there. But a lot of other people have a different approach. Like, I believe some people suggested legislating activity within social VR, and I respectfully disagree with that because it's a legal nightmare when you think about it in practical terms.

[00:09:44.470] Kent Bye: Yeah, well, it seems like there's certainly no lack of harassing and trolling types of behaviors that are happening on these platforms. And I think one thing that I appreciate from the BBC article is that it shines a light on this as an issue within the larger social VR ecosystem. And from there, what are the things that parents should be aware of in terms of the types of things that can happen? Certainly there are some worst-case scenarios that are plausible from my perspective. I'm not questioning whether or not this had actually happened. It's more about the ways that you should react to it. The number one response that I got from the VRChat community was, number one, that VR should not be used as babysitting, that parents should not be throwing their kids into any social VR platform without any type of parental supervision. And then, number two, there are questions around whether or not some of these experiences that are in a platform like VRChat should just be age-gated. Age verification was a topic. What Andrew Bosworth of Meta said was that any system that he knows of, at least, could be circumvented as long as you have parents who are not properly supervising their kids, and so they're putting forth what the parents should be doing. But then the reality is that there's a lot of unsupervised kids who are running around in these social platforms, creating not only havoc but trolling and disruptive behavior for other kids, and also for the adults as well, overall just disrupting the platform. So I guess the number one blame that I saw is that parents should be taking more responsibility. I don't know if that's the total story, because I do think there are probably some things like, you know, the more that I was listening to this reaction, the more I was like, well, maybe kids under 18 shouldn't be in VRChat. Maybe this just is not a good platform for them. Maybe there's other platforms like Rec Room or other experiences that they could do that are maybe more suitable for them. So I'm just curious to hear some of your initial takes on some of those reactions, as well as where you stand on some of those issues.

[00:11:37.503] Lance G Powell Jr: Right. So to give some context, I am actually a father of a seven-year-old child, and some of these platforms are accused of having children as young as seven running around in them. So when I talk about what I would do in this situation, I'm not speaking hypothetically. I fully intend to follow my own advice with regard to social VR. So I believe the parents have a lot of responsibility here, and I think you would have to be just completely ignorant or crazy to put an underage child into VRChat, just because of the level of abuse that can happen. So I agree that parents should be watching how their children use VR. And additionally, the policy of keeping children under 13 out completely is a valid one. For 13 to 18 years old, I think 18 years old is the American perspective, because different cultures have different ideas about what the proper level of maturity is. But for 13-to-18-year-olds, I would still take precautions with my own child to make sure that they aren't going into situations that are abusive and are going to make them kind of question the soul of humanity because they saw something dark and disturbing that they can't forget. So yeah, parents have responsibility, but also platforms need to have mechanisms for reporting if somebody is thought to be underage. So I believe I mentioned Rec Room in the blog, and how they have a system of reporting people who are clearly children for being children, and once that happens, they aren't shut out of the experience. Someone on a Rec Room junior account is still able to go to all of the different worlds, but they cannot speak to anybody or hear anybody, and they also have a protective bubble around them, so that if they get close enough to another player, they'll just disappear. So it's virtually eliminated the ability to harass that young person, but they can still go in there, they can see the different worlds, they can have narrative experiences and enjoy the game. So they're still getting all that they want from the platform and the environments there. And when they're old enough, they'll be able to connect with other people there and have some meaningful social relationships as well. But yeah, I completely agree: under 13, you're not really ready for that. And people talk about circumventing these different security measures with regard to age. And this is true, but I talked to Mama Monkey about this particular topic in Rec Room. And what they do is, if they find that somebody is probably underage, very likely underage, they'll ask them to do a credit card transaction and verify through that credit card transaction to Rec Room that they are in fact old enough to be playing. And very often parents will not just give their credit cards to a young child. So it works for them. But yeah, you'll never eliminate it completely. But if your community is strong enough, they'll understand that they need to work to protect children. And sometimes protecting children means reporting that they're underage and just shouldn't be in the experience, because people who are serious about their communities in VRChat or Rec Room do not want to see them taken apart because they couldn't solve these problems.
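As a rough illustration of the protective-bubble mechanic Powell describes here, the core logic could be as simple as a distance check at render time. This is a toy sketch, not Rec Room's actual code: the radius, class shapes, and the reading that players inside the bubble are hidden from view are all assumptions.

```python
import math
from dataclasses import dataclass

BUBBLE_RADIUS = 1.5  # meters; hypothetical value

@dataclass
class Player:
    name: str
    position: tuple  # (x, y, z) in world coordinates

def visible_players(me: Player, others: list, bubble_active: bool) -> list:
    """Return the players to render; bubbled accounts hide anyone too close."""
    if not bubble_active:
        return list(others)
    # Anyone inside the bubble simply isn't rendered, so they "just disappear."
    return [p for p in others
            if math.dist(me.position, p.position) >= BUBBLE_RADIUS]
```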

[00:15:05.557] Kent Bye: Okay, well, you have 10 points of parental guidance that I want to dive into at some point, but maybe first, before we get to that: you did write a thesis about harassment in VR. And so it sounds like you had spent quite a lot of time across these different social VR platforms and created a bit of a taxonomy of the types of harassment that you were seeing. And so maybe before we start to dig into what the guidance for parents might be, I'd be curious to hear a little bit more about how you describe the types of harassment that you see within these social VR platforms, and what people typically do to be disruptive or to try to harass other people.

[00:15:44.131] Lance G Powell Jr: Okay. So actually, before even talking about the types of harassment, I dove into what harassment is within the thesis itself. And some definitions, especially legal definitions, talk about how harassing behavior is behavior that limits your access to an activity or experience. So the earliest forms of harassment were workplace harassment, because if you're being harassed at work, it limits your ability to keep a job because your boss is insisting on dating you or something like that. And the same for education: if you have a teacher who is sexually harassing you as a student, it limits your ability to get an education. And the type of harassment we're talking about within social VR platforms limits the ability to have a good experience within that platform in that moment. So I do like to bring in that context, because there's not just one type of harassment, and I think people who only focus on this single form of harassment are forgetting a wider context and also implying the same degrees of victimization. Because not getting an education can be very impactful. Not being able to work can as well. It can really harm your life. But not being able to enjoy that platform within that moment, or feeling stressed or anxious because of what's happening, is unpleasant, but it's something that you should be able to recover from. Because one thing that people forget, surprisingly, is that in the end, you can just take off the headset and it's all over at any moment. You can pull the kill switch. So it's unfortunate that we have to do that, and to build a strong platform, of course, we don't want people pulling off their headsets, but in the end, you can.

[00:17:36.463] Kent Bye: I did want to jump in and just say that a common theme that I saw come up in a lot of discussions was: this is the internet, this happens everywhere. And so, yes, it's true that some of these harassing types of behaviors happen all over online, but there's also a tendency to say that just because it's happening within a virtually mediated environment, it isn't real or genuine in any way. But I think that within VR, it's just as real as other types of harassing behaviors. The different situations and contexts can feel immersive to the point where people have the same type of embodied reactions: intrusions into people's physical space, verbal abuse, and the different types of real-time reactions that people have within a virtual context can actually be amplified, or be just as intense or real as the types of harassing behaviors that can happen in physical reality, like the type of bullying that might happen within, say, a school context. So I do think that there is a tendency to diminish the impact on people being immersed within these environments. And there's a different quality to the immersion that happens with the embodiment within these virtual spaces, which I think probably needs to be taken more seriously than, say, allowing your teenager to have a phone and to go onto any other corner of the internet, where there are all sorts of things that could be way worse in terms of content; but in terms of the immersion, I do think that there's something that's different. So I'm not sure if you have any reactions to that.

[00:19:00.628] Lance G Powell Jr: No, I absolutely do. And I completely agree that that type of harassment does trigger a response, because your brain has essentially been tricked into experiencing those virtual spaces as real spaces. But if you go into virtual reality understanding that you are actually in control as an individual user, and you can end this experience whenever you want, or you have the security tools available to you and you can limit how you interact with different people, then you'll be better prepared for what happens. A first-timer, an early user, won't have that level of sophistication. So they walk into it, they have a really negative experience. They're applying real-world social rules to what's happening, so they don't want to seem rude or confrontational. But despite this, some kind of verbal or sexual harassment is happening, and they don't know what to do. So if you go into these platforms prepared, and I do highly encourage preparedness, then you're going to know how to get out of that situation, whether it's something as extreme as taking your headset off, or just muting or blocking that person. And those muting or blocking tools are available in essentially every platform. So as long as you can access them quickly, you can end it in a way. So I'm mostly concerned about those early users, because it's heartbreaking to have somebody go into an experience that is somehow triggering and so forth. But if you learn from that, it should only happen that first time. It does, unfortunately, tarnish the experience from then forward, but hopefully you can overcome that.

[00:20:41.258] Kent Bye: Well, one of the things that you lay out in your point number three of learning the safety tools, which we're kind of talking about here, is that there's a number of different countermeasures that you can take in terms of muting and blocking and kicking, which is a little bit of a voting procedure that happens with everybody in an instance, who decide to get rid of people because they're doing behavior that goes against the code of conduct. And for each social platform, there's a certain set of rules that you can violate, and people can choose to kick you out if there's a collective consensus that someone is being disruptive. There's also personal space bubbles and ways of reporting abusive behavior, or having moderation and admins who are dropping in and keeping tabs on what's happening. And so there are some different layers of moderation, because if there's collusion, you can certainly have the situation where there are groups going around engaging in abusive behavior, and some of the different countermeasures are then undermined if there's a critical mass of people who are not self-policing in that way. The thing that I found with these different platforms is that there's not necessarily a great onboarding for how to use these tools, to put you into situations that help train you how to protect yourself. And so I think part of the tenor of the complaints that I saw within the VRChat community was almost blaming the victim for not taking more proactive control over putting themselves into situations in these public worlds, or for not being familiar with what the safety tools are. But my response to that is that even myself, I don't know if I would be proficient enough to use them even as a regular user, because I'm sort of mitigating my risk by just not going into a lot of these public spaces, situations, and contexts. But also it's pretty overwhelming if someone is coming into VR for the first time, with all the different menu systems and all the things that are happening in there, and it's not so clear as to how to get out of a situation. I know that Meta's Horizon Worlds has a safety bubble that allows you to escape from whatever the situation is, and you go into a private space. But in a lot of these other places, you're still immersed within that world. And like you said, part of the harassment or trolling behavior is that you're being disrupted from whatever the activity is. And so if you're playing a game and you start getting harassed, you basically have to leave that situation and stop whatever you're doing, and people have a certain amount of flow or investment in that. And so it could be a situation where they're not just going to immediately move. So part of what I'm taking away from that is that there are certain ways that people can protect themselves, but there's a certain amount of fluency there, and it takes time to even learn those tools. And if you're throwing yourself into these different public instances with people who are acting in an abusive way, then you're very likely going to be running into these different situations.
And so then the question becomes, well, to what degree is it the responsibility of the platform to help teach people that that's the situation, to help them learn what these social norms are, and to ensure that people are familiar enough with these safety tools that they can at least do a baseline level of protecting themselves as they're going into these social VR spaces.

[00:23:49.958] Lance G Powell Jr: Yeah, so you're right that there are a few factors that make preventing harassment or using the tools more difficult. You brought up one, a group ganging up on you, which can happen. And I do describe one instance that happened in 2017 in AltSpace of one person radically harassing everybody in the room with racist slurs and homophobic remarks and sexist remarks, who started physically mimicking sexual acts with different people. And he was alone for the first minute or so, but a couple of men in the room saw the behavior, and they thought it gave them clearance to pitch in and do the same thing. So this kind of behavior doesn't happen with just one person; it can be a group of people. So that complicates things as far as muting or blocking. But when I see these things personally, as an experienced user, I know it's time to leave the room, even if I'm invested in an experience. For example, I might be in Rec Room playing a specific game and the members on my team suddenly say something incredibly racist. And as much as I want to finish the current game that I'm in, I know it's time to leave. So knowing when it's time to leave is important, and having the ability to do that quickly is important as well.

[00:25:09.692] Kent Bye: And when you leave, I guess in Rec Room, it keeps a log of who you were recently nearby, so you can then follow up and report them. In other social VR spaces, sometimes you have to do it in real time. You have to point at them, and if people are running around, sometimes it can be difficult to actually follow up and report that type of behavior. Is that something that you typically do, report trolling behaviors?

[00:25:30.221] Lance G Powell Jr: So, yeah, I report everybody. Um, like, I don't hold back with reporting if I see something wrong. Because I think part of being a strong community is drawing the line and saying that if you cross the line, you don't belong here, basically, and we have to limit your access to this community. So if somebody uses a racial slur that they shouldn't be using, I will report them. If somebody is using homophobic slurs, I'll report them. If the person is clearly too young and they don't understand what they're saying completely, I'll report them for being too young. This is advice that members of Rec Room gave to me back in 2017. It's like, if you see this kind of behavior, we want to have a good community here, so be a good citizen and report it. And I know they have volunteer moderators as well. And it's kind of their way of having soft power over what happens: if you have a dedicated group of people who are reporting this behavior, then when they make a report and they're not report-spamming, you know to take the offenders out, either by suspending their accounts, putting them on junior accounts, or blocking them completely. So yeah, I firmly believe that communities, if they want to continue, can't do what they're calling victim blaming in this case. You need to point out the tools that are there so the experience isn't repeated, and also target the individual who is responsible for the harassing behavior, because I want to see the harassers removed more than anybody, right? Because some people have the argument, it's like, oh, it's just VRChat, people are having a bit of fun. And yeah, that's what bullies say when they bully people: they're having a bit of fun. It's not acceptable. You need to remove those people from the platform, or at least suspend them for long enough so that they fall in line and don't do it again. With the immediate situation of harassing behavior, one other complicating factor is something like microphone spamming. Very often you'll go into rooms that have a microphone where you broadcast your voice across the room, and you might be quite distant from them, and this also makes it hard to report who is doing it, because actually you don't know. And another kind of soft power that a platform could have is, for example, recording all public broadcasts, so that you know who is speaking into a microphone for everyone to hear. And in those cases where there is specific language that is flagged, you remove them as quickly as you can as a platform.
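To make that last idea concrete, here is a minimal sketch of what logging and flagging public-microphone broadcasts could look like. This is purely illustrative and not any platform's actual system: the blocklist terms are placeholders, the speech-to-text step is assumed to happen elsewhere, and a real system would need far more nuance than substring matching.

```python
from collections import deque
from dataclasses import dataclass
import time

BLOCKLIST = {"slur1", "slur2"}  # placeholder terms, not a real lexicon

@dataclass
class Broadcast:
    user_id: str
    text: str        # assumed output of speech-to-text on the public mic feed
    timestamp: float

# Keep only the most recent broadcasts, since this is about rapid response
recent_broadcasts = deque(maxlen=1000)

def escalate_to_moderator(b: Broadcast) -> None:
    # Stand-in for filing a report with the audio clip attached
    print(f"FLAGGED: user {b.user_id} at {b.timestamp:.0f}: {b.text!r}")

def log_public_broadcast(user_id: str, text: str) -> None:
    """Record who spoke on a public mic and flag blocklisted language."""
    b = Broadcast(user_id, text.lower(), time.time())
    recent_broadcasts.append(b)
    if any(term in b.text for term in BLOCKLIST):
        escalate_to_moderator(b)
```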

[00:28:04.588] Kent Bye: Yeah, as I've been in different Rec Room rooms, whenever there's a public microphone, it's only a matter of time before someone gets onto the microphone and starts saying any number of horrible racist things. And like you said, it is difficult: you showed me actually in Rec Room that you can go in and see who's speaking at any moment, but at the same time, if you have two guns in your hands in Rec Room, you have to actually drop one of your guns to even open up the menu, and then go in there and basically find a safe space to be able to do that. So it can be difficult sometimes to even know who is speaking. But I think that's a good point, that there are things at the platform level where they can start to say, hey, if you're broadcasting to everybody in an instance, then that should be treated as something that's on the public record and can be recorded and looked at later, to ensure that it's not a way for people to amplify trolling behaviors and racist language in a way that's at this point pretty unaccountable and creates a toxic situation for the entire community.

[00:29:06.017] Lance G Powell Jr: Right. And yeah, it was discouraging to learn that even you, who's an experienced Rec Room user, didn't know the icons to look for to find out who was speaking and using that racist or homophobic language, which is the fault of the platform, actually. So they do bear some responsibility for their onboarding process, and it's important to identify exactly the types of harassment that can occur and to show them to new users, to make them aware of the types of harassment that they can experience, so that, one, they're not surprised by it or caught off guard, and two, they know how to counter it. So for example, as part of the onboarding, I could say that when you are in VRChat, you might hear racist or homophobic slurs. This is the microphone icon, and you press this to mute them and report it. So this is how you solve the situation. Someone might invite you to a private instance. Be aware that some private instances have sexual content, so if you don't know that person, don't go with them. If that person continues to harass you and invites you, you can block that person. Here's the block icon, and you walk people through how to use it. It's a bit of a contradiction, because you want to make people feel secure within the platform that they're visiting and also make them aware that there are specific dangers that can occur. So you have to walk a fine line to make them aware of the problems but feel confident enough to deal with those problems. So I think most of the effort should go into onboarding. Because I know several of these platforms had user policies at the beginning which were incredibly sparse. I believe for both Rec Room and VRChat, they said, be excellent to each other. And that was it. In the early days of VRChat, that's all they had as a policy: be excellent to each other. And to me, that was an absurd thing to say, because it is open to a lot of ambiguity; people just have different ideas of what excellent is and when you cross the line from excellent into non-excellent. So there needs to be a lot of specificity in revealing these problems to new users and a lot of help in helping people counter them. So for example, when we have a new user come into VRChat, we can say, okay, this is how you teleport. Show them how to teleport. This is how to access your menu. Then they access the menu. And look over there, that person's harassing you. This is how you block them. I want you to block them now. So that way, you can't, as a user, come into the platform and say, I wasn't aware of these tools. You have to use these tools to be able to get through the onboarding process and access the platform. So I think that would be a very good step towards solving this problem.
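As a thought experiment, the "you must use each safety tool before you can enter" onboarding that Powell describes could be modeled as a simple completion gate. Everything here is hypothetical (the step names, the class structure); it just illustrates the requirement that each safety action be performed once before platform access unlocks.

```python
# Hypothetical onboarding gate: the tutorial tracks which safety actions the
# new user has actually performed, and access unlocks only when all are done.

REQUIRED_STEPS = {"teleport", "open_menu", "mute_user", "block_user", "report_user"}

class OnboardingGate:
    def __init__(self) -> None:
        self.completed = set()

    def mark_done(self, step: str) -> None:
        """Call when the user performs a tutorial action (e.g. blocks the demo avatar)."""
        if step in REQUIRED_STEPS:
            self.completed.add(step)

    def can_enter_platform(self) -> bool:
        # No skipping: every safety tool must have been used at least once.
        return self.completed >= REQUIRED_STEPS

# Example: a user who never practiced blocking stays gated.
gate = OnboardingGate()
for step in ["teleport", "open_menu", "mute_user"]:
    gate.mark_done(step)
assert not gate.can_enter_platform()
```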

[00:31:59.701] Kent Bye: Yeah, whenever I've had people who are in VRChat for the first time, it's not necessarily intuitive that in order to select people to friend them, you actually have to open up your menu, and then from there, instead of pointing at the menu, actually point at a person. And then when you point at the person, it brings up a whole other submenu that allows you to friend them. But in this case, it would also mean that's how you block people. So it's actually kind of a pretty advanced user interface to friend people, and then to block people. Whether that's made easier or whether that's something that is part of the onboarding process, there are tools there to start to mitigate this from the front line and from the platform level, but making sure that the tools are easy to use and that people know how to use them is part of it. But I'd love to hear you describe the different types of abusive and trolling behavior that you started to categorize in your thesis, just from your observation: what are the different types of things that you see, so that if parents are going to be sending their kids unsupervised into these different areas, they're aware of what type of things their kids may be encountering?

[00:33:06.522] Lance G Powell Jr: Well, the first rule is don't send your child into these platforms unsupervised. So if you break that rule, then, I mean, God help your child because you didn't as a parent. So speaking as a parent, again, I would never do that.

[00:33:19.537] Kent Bye: So. And just to follow up on that point is that Monboz was talking about this in his Instagram story on February 10th. He said that they have casting abilities so that you can mirror cast either what is happening on the VR platform, either into your phone or a TV so that you could potentially be listening into a session or watching a session. But there are at least ways that When you're in VR, especially on the quest, it's difficult to see what's happening, but there are ways to get a window into that, which requires the parent to be able to even know how to set that up, which is yet another potential technological barrier for parents to be able to do proper supervision of what's happening in. Cause you could have things that are happening with your child in VR chat. And unless you have that window, you actually don't have much insight as to what's actually happening. within them. So that was at least the recommendation that was coming from Andrew Bosworth was that if your child is from 13 to up to 18, that you should really be supervising everything that's happening of what they're doing in VR.

[00:34:17.028] Lance G Powell Jr: I mean, to get into the head of a 13-to-18-year-old, I don't think they would appreciate their parents watching everything they do inside the platform. So I don't think that's entirely realistic. But if you don't want to broadcast to your iPad and watch everything your child is doing, just being in the same room is probably sufficient. Because when they're using a Quest, you can hear everything that's happening and you can hear what your child is saying. And that's probably sufficient. So you can continue checking emails while your child is playing, and you're not intruding on your teenager's personal space too much, because they should be allowed to also form their own friendships and follow their own interests within these spaces, which can be completely benign. Yeah, so as for other things that we can be doing to protect our children, I'll actually go back to the list. We should encourage them not to engage with strangers, first of all, because you don't know what their motivations are, especially people who just suddenly approach you and start asking questions, because you could be meeting someone who does not have your best interests at heart. Even though they might seem friendly or social, people can fool you. And one of the main places that you can see this problem is in hub worlds, which I also mentioned in the list, because there are hub worlds that are supposed to be general areas. In Rec Room, it would be the Rec Center, and there's nothing really to do there aside from meeting with other people. And this is most of the time where I see trolling or toxic behavior happen, because people entertain themselves by abusing others, unfortunately. And this was also true of the campfire scene in AltSpace. The campfire was the de facto hub for AltSpace for years, and when I was doing the research for my thesis, this is where I spent most of my time. As I said, I would go there on Friday and Saturday nights, late enough so that I knew the Americans had been drinking, and just watch what happens. And more times than not, there would be one guy or a group of guys engaging in harassing behavior, because that's how they enjoy their weekend: getting drunk and abusing people. And again, this is really unfortunate, but I kind of selected for this behavior as well, because I knew that's where the problem would happen. Over time I became sophisticated enough to know when and where to expect harassment, but a new user wouldn't have that sophistication, so they might just end up there through bad luck. So yeah, one, don't talk to strangers. Two, I would say if you're sensitive about harassing situations, or you just want to become familiar with the platform itself without having to worry about the behavior of other people, you should exclusively go to private worlds, so that you can gain that level of comfort and still be able to see different places and experience different things. And when you're secure enough with the platform, yes, you can go to public areas and start meeting other people as well, but you're not doing it before you're ready. But I know many very experienced users of Rec Room, VRChat, and AltSpace who never go to public areas, because they have a good network of friends who also use the platform, and there's never a need to go out into public. So when I go to VRChat, when I go to Rec Room, I'm with my friends and I'm only with my friends. And we're friends because we show mutual respect.
We help each other have fun. We play games together. We have amazing conversations. And it's an overall good experience and completely free of risk. So when you enter into public areas where you receive harassment, just be aware that there's a whole swath of other experienced users who are having a great time on a different instance, and they never have to worry about harassment because they did the work to build up a network and bring those people into the same space.

[00:38:21.129] Kent Bye: Yeah, if you read through your top 10 list of suggestions, number two is don't go into other people's private worlds, especially of people that you don't know, and number five is to only go into private worlds. So in some ways you have to get to the point of meeting people, either offline or in Discord or other contexts. I think that's probably one of the challenges: if you're just being thrown into some of these different social VR worlds and you're spending a lot of time in public instances, like you said, that is the one area where you see the highest risk of people experiencing these things. And the people that I know who spend a lot of time within these social VR platforms have often built up their social networks to the point where they're spending most of their time in private instances, to stay away from a lot of the more toxic trolling behavior that happens in those public instances. So in some ways it's just knowing that this risk is part of the social norms: if people want to be horrible people, they're going to be horrible people, and there's nothing technological that you can do to completely eliminate that. So it's always going to be a cat-and-mouse game between people who are going to engage in that type of transgressive behavior and what the platform tools are going to be able to do to prevent it. And so it seems like a part of human behavior that if you're going to be in these social VR platforms, it's never going to be completely safe, and you're assuming some level of risk; but at least, as you go through this list, here are some things to help ensure that you might have a little bit better experience than if you're just throwing yourself into these situations. But at the same time, there are other things that still probably need to happen at the platform level in terms of moderation and of cultivating these safe online cultures and spaces, because maybe it has also gone too far in terms of how often people who go into these public spaces have this type of experience.

[00:40:07.085] Lance G Powell Jr: Yeah, well, again, I think a heavy-handed approach tends to alienate your core users, and you always want to balance the needs of your core users against your new users. So again, I think a very sophisticated onboarding process would be a big step in helping set expectations for, one, what type of behavior is accepted, and two, how to combat unacceptable behavior. Another point that they should focus on is empowering communities. You could, for example, deputize people that you trust who've been on the platform for a long time. They've built up worlds and they have a vested interest in keeping a safe environment for everybody. So I think those are the two main things that they can do. But there are also other solutions you could introduce as well. My thesis gets a bit into using natural language processing to detect, with some probability, whether specific language is harassing, and also into image recognition. Within these platforms, I believe all of them have a native camera where you can take pictures that appear on the website or on your personal account. And why not see what kind of pictures people are taking, just to make sure it's not somehow illicit? As I was preparing my thesis, a lot of the research went into how people abuse content creation tools, which is actually something we haven't talked about. Because whenever you give somebody the ability to build something or draw something, you're very quickly going to find that a lot of people drew or built dicks, essentially. Within the trolling community, there's something called the time-to-dick ratio: how long it takes them to put some representation of a dick into the virtual world. And as a response to that, I looked into image recognition tools through machine learning, and started using the maker pen inside of Rec Room to draw hundreds and hundreds of sample images of dicks within Rec Room and then taking pictures of them. But yeah, that was part of the study process.

[00:42:15.539] Kent Bye: So you were trying to draw them to see if you could train an AI model to build automatic moderation tools and stuff like that.

[00:42:22.111] Lance G Powell Jr: Exactly. So I was seeing how viable it was to do that. And even with my very limited machine learning experience, I was able to get it to a high level of certainty. And then, if the level of certainty or probability is high enough, like above 80% or 90%, that the image you're getting is actually a dick, you can send it on to a human moderator, who would then confirm it and flag the person for harassing behavior, because you're essentially creating sexual images within a public space, which again is against the policies of the platform. So those tools need to exist as well. But as a casual user of these platforms, you don't ever need to be aware of these tools. This is what I mean by soft power: when you're suddenly limiting how people interact with each other, either through a bubble or muting, or just not allowing them to speak to each other, then people are very aware of what's happening, and they might not appreciate their experience being restricted in such a way. But finding ways that people aren't aware of for flagging unwanted behavior would be a very good approach to maintaining the experience that people want to have.
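
To make the triage step Powell describes concrete, here is a minimal sketch of the idea: a classifier assigns a probability to each snapshot, and only high-confidence detections get escalated to a human moderator. Everything in it, the class names, the threshold value, the scores, is an illustrative assumption rather than Rec Room's actual pipeline.

```python
# Hypothetical sketch of the moderation flow Powell describes: an image
# classifier scores a snapshot, and only high-confidence detections are
# escalated to a human moderator. Model, scores, and threshold are all
# illustrative assumptions, not any platform's actual pipeline.
from dataclasses import dataclass

@dataclass
class Detection:
    image_id: str
    probability: float  # classifier's confidence that the image is explicit

ESCALATION_THRESHOLD = 0.90  # Powell suggests roughly 80-90% before human review

def triage(detections: list[Detection]) -> list[Detection]:
    """Return only detections confident enough to queue for a human moderator."""
    return [d for d in detections if d.probability >= ESCALATION_THRESHOLD]

# Example: three snapshots scored by a (hypothetical) trained model.
queue = triage([
    Detection("img_001", 0.97),  # escalated to a human for confirmation
    Detection("img_002", 0.42),  # dropped: too uncertain to act on
    Detection("img_003", 0.88),  # dropped at 0.90; would pass at 0.80
])
for d in queue:
    print(f"{d.image_id}: flag for human review ({d.probability:.0%})")
```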

[00:43:37.202] Kent Bye: Yeah, there seem to be a number of things, as we're talking through a lot of this, that are happening at the platform level. As an example, there's an implicit social score that I think each of the different social VR platforms is using to help navigate the different reports of harassing behavior they're receiving, to get a sense of what the trust and safety level is: how many hours someone has played, whether it's a new account, which might make it more likely to be a trolling account, or whether they're connected to a social network of people who have a lot of reports of harassing behavior, which could be an indicator. So there are a lot of ways in which, on the back end, there are these implicit social scores that are invisible. These companies don't like to talk about how they calculate those invisible social scores, because that can then be gamed by people trying to counteract and subvert detection. So as you have gone into these different spaces, you're listing out a number of different types of trolling behavior. In your list you have things like sounds related to sexual activity, the presence of sexual or hate-related imagery, inciting violence in different ways, large quantity and repetition of vulgar language, a large variety of taboo or controversial topics, proximity of vulgar language and behavior to login time, physically mimicking sexual activities, and suprasegmental features of speech. So as you start to look at these different types, maybe you could do a run-through of what some of these mean, in terms of what you've seen and observed and how you categorize it: the common themes and patterns that you see whenever trolling behaviors start.
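
As an aside on those implicit scores: none of the platforms publish how they weight these signals, precisely because a published formula can be gamed. A toy heuristic built only from the signals named above (account age, hours played, harassment reports, flagged connections) might look something like the sketch below, where every weight and cutoff is invented for illustration:

```python
# Toy illustration of an implicit trust/safety score of the kind described
# above. The signals (account age, hours played, harassment reports, flagged
# friends) come from the conversation; the weights and formula are invented
# here and deliberately simplistic -- real platforms keep theirs secret
# precisely so they can't be gamed.
def trust_score(account_age_days: int,
                hours_played: float,
                harassment_reports: int,
                flagged_friends: int) -> float:
    score = 0.5  # neutral starting point for an unknown user
    score += min(account_age_days / 365, 1.0) * 0.2   # older accounts earn trust
    score += min(hours_played / 100, 1.0) * 0.1       # time invested on platform
    score -= harassment_reports * 0.1                 # each report costs trust
    score -= flagged_friends * 0.05                   # ties to reported users
    return max(0.0, min(1.0, score))                  # clamp to [0, 1]

print(trust_score(account_age_days=3, hours_played=2,
                  harassment_reports=4, flagged_friends=5))   # new + reported: low
print(trust_score(account_age_days=400, hours_played=300,
                  harassment_reports=0, flagged_friends=0))   # veteran: high
```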

[00:45:15.195] Lance G Powell Jr: So yeah, I can go through them. The most prominent one is probably hate speech that shows prejudice, encourages people to violence, or marginalizes people based on their gender, ethnicity, or country of origin. It can even be their age, if you're abusing children as well. A lot of times these things are easily flagged through specific language, and we know what those words are, but that is not always the case. Sometimes it's only flagged through your behavior: being unnecessarily aggressive with somebody, or just overall rude. Or you can even be ambiguous about it and say, I don't like your type of person. And while you didn't explicitly say what that type is, it's implicitly understood what you mean by that. So that kind of behavior can occur, and it makes it hard to report or take care of as a platform. But again, you should be using the security measures that you have as a user to take care of that. There can be images as well, because there are a lot of symbols, such as swastikas, that can appear in different places, and a swastika alone would count as a kind of hate speech, because it represents hate and the imagery of Nazism. People sometimes use them to be transgressive, but they need to be made aware that we take these things very seriously, because people are threatened by it; they might have a history of violence directed at them under this symbol. There is microphone spamming, which we mentioned as a broadcasting measure, but it's not only that, because I've more than once met people who are just playing porn videos into their microphone, so the person walking around has the sound of, like, female orgasm coming out of them. Again, they're not saying anything in particular, but it's still wildly inappropriate, even though there's no image to look at. So we need ways of muting those people and, if necessary, suspending them too. There are also types of harassing behavior that are not related to language. We're all perfectly aware of the jerk-off motion. So if you go into a public space and just do that jerk-off motion, which sometimes is genuine, unfortunately, because some people actually go into these spaces and pleasure themselves, that kind of motion is also wildly inappropriate, but very difficult to monitor as well. So for this one, people again need a blocking feature, because muting is completely ineffective in these cases. Because yeah, it's just a motion.
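
One note on the "we know what those words are" case: a simple deny-list catches exactly that, and nothing more, which is why the implicit phrasing Powell describes has to be handled by behavioral signals and human review instead. A minimal sketch of the gap (the word list is a placeholder, not a real lexicon):

```python
# A minimal lexicon-based flagger of the kind alluded to above. It catches
# explicit deny-listed tokens but, by construction, misses the implicit case
# ("I don't like your type of person") -- which is exactly why platforms layer
# behavioral signals and human review on top. DENY_LIST is a placeholder.
import re

DENY_LIST = {"slur1", "slur2"}  # placeholder tokens, not a real lexicon

def flag_utterance(text: str) -> bool:
    """True if the utterance contains an exact deny-listed token."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return any(t in DENY_LIST for t in tokens)

print(flag_utterance("You people are all slur1"))          # True: explicit match
print(flag_utterance("I don't like your type of person"))  # False: implicit, missed
```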

[00:47:56.173] Kent Bye: I think that is a good overview of a lot of those. And I think part of the other dynamic that was happening in the BBC article was that there are potential sexual predators grooming children into certain sexually explicit situations within a platform like VRChat. And in the terms of service, it says there's no pornographic material, but then the community guidelines say that sexually explicit behavior is not allowed in public spaces, which gives the implication that if you're in a private space with consenting adults doing erotic role play or sexually explicit types of behaviors, then, as long as it's between consenting adults and in a space that's not being publicly broadcast in any fashion, it's a little bit of the Wild West in terms of what is being allowed on the platform. But you have this situation where you have a culture of some of that happening. Maybe people are ending up in a private space that they don't realize they're in, or there's this boundary between sexually explicit behaviors that violate the community guidelines versus what's happening between consenting adults. So that seems to be another angle, especially around the BBC article. There's certainly another area where it's just pure trolling, in terms of violating a space with threats of rape and things that are clearly inappropriate and not consented to at all; that's just the trollish manifestation. But I feel like there's another flavor in those gray lines, especially in the context of VRChat, which has this weird situation where some things are implicitly allowed in the private instances, versus what is a pure terms of service violation regardless of whether it's in a private or public instance, and then the whole aspect of whether it's a community guidelines and code of conduct violation depending on whether it's happening in a public versus private space.

[00:49:48.211] Lance G Powell Jr: Right. Well, with sex in particular and how it manifests within social VR platforms, it's helpful to imagine that, for example, VRChat is just a big building with many rooms, and behind one of the doors there is sexual behavior that is intended only for consenting adults who are over 18 years old. And VRChat seemingly wants to encourage them to have that experience, if that's what they would want from it. Because if you try to police that type of behavior, it becomes rather alienating to the community that wants it, and they don't see a good argument for why they shouldn't be allowed to do that. And the argument against it is the what-about-the-children argument, which is a good sign that there's a moral panic happening. That's where the argument comes in that parents should be watching their children so they don't just walk into places like this, because largely they don't want that. So that's one set of people. And I'm sure these people who are members of groups oriented towards sex and having orgies, they hate sexual predators more than anybody, because along with the children who suffer, they suffer as well, because they're grouped in with these predators under the umbrella of just general sexual behavior. But they're doing it in, one would argue, a healthy and fulfilling way, whereas the predators are trying to actually victimize somebody. And yes, predators do use every social VR platform to do that, which means we need to watch out for them and really flag that behavior as much as possible. Because it can do some very real and lasting damage, even along with everything I said before. Even if the child goes into the place and understands that they have control over the experience, they're just a child; they can get confused by what they're seeing, they don't know to leave the experience, and real damage can result from that. And that's what everybody wants to avoid, which is why we need to take steps early on to make sure that the kids don't get in there, and to put as much responsibility as possible on the parents' shoulders. Because, and I'll say it a third time, I would never let my child go into VRChat. Even at 13 or 14 years old, I'd be quite hesitant and would probably stay in the room just to make sure a negative experience doesn't happen. That said, there are people who are setting up separate VR platforms around sex culture, essentially. Maybe you've heard of Raspberry Dreamland, which is trying to do this. They are that 18-plus platform, and many of their meetings are about sexual behavior: talking about group sex, meeting with porn stars, talking about what it's like to be a sex worker or in the porn industry. And because they're explicitly an 18-plus platform, they can dismiss the arguments against it, because they said this at the beginning and you've agreed to it: the terms of service clearly state that you cannot be here if you're under 18 years old. So if they grow, they would hopefully escape that criticism, because they would say this platform is explicitly about sex, so please don't claim otherwise.

[00:53:01.805] Kent Bye: Yeah, there was a lot of discussion happening, at least on Twitter, where a lot of the people in VRChat were saying, look, VRChat is basically like throwing your kid into a city unsupervised. Would you let your kid just go to a whole wide range of different types of nightclubs in the city without knowing what they're doing? And VRChat is kind of like that. And I didn't see a lot of really compelling arguments that kids 13 to 18 should be in VRChat. Most of the people I saw were saying, this is not a platform that is meant for young children. But I'm sure there are certainly a number of different experiences that are acceptable and okay, so it's hard for me to say that there should be a hard line either way. That seemed to be a big part of the debate. But maybe with the time we have left, I'd love to get through the top 10 list that you have here and just chat through each of these. We've talked about number one, don't talk to strangers if you're under 13. Number two, don't go into other people's private worlds, especially with people that you don't know. And then number three is learn the safety tools. We've talked a little bit about this, in terms of muting, blocking, kicking, bubbles, reporting, and admins, and how there should be better onboarding from these different social VR platforms. But what are some other things that you want to say about learning the safety tools?

[00:54:11.664] Lance G Powell Jr: Those are the basic safety tools, and they will cover most situations, but as far as mechanisms for stopping harassing behavior go, there aren't that many on offer. It's more, at a certain point, about having the awareness, the social awareness, to know what types of people you're talking to and interacting with. Because you might, along the way, see some red flags that show you that this is not a person you would want to speak to, either because they're going to make unwanted sexual advances or use language that makes you uncomfortable or anxious. So social awareness is also helpful, and young people might not have that level of social development, so they don't know what they're getting into. Again, I think the use of private worlds is the most important, because it's foolproof. If you go to a private world as a young user, nothing bad is going to happen to you. You're not accidentally going to be harassed, because nobody is allowed in there. It's a locked door and only you have the key. So that's quite important.

[00:55:13.816] Kent Bye: One thing I just wanted to jump in and say, in terms of the other types of social tools that are available, especially in, say, Rec Room as well as in VRChat, is that in Rec Room you can change who you're hearing. If you only want to listen to your friends or your close friends, you can basically be in these situations and have the games, but not have to hear people harassing you in different ways. They may still be able to get up and physically harass you in certain ways, but in terms of the audio, there are ways of at least controlling who you're hearing in these worlds, which can make for a better experience in a lot of situations. Also, in VRChat there are ways that you can control what avatars you're seeing and what trust level you require in terms of what you're seeing and hearing. So I think there are a lot of ways of dialing in the type of experience that you're having on each of these platforms, which have different ways of giving you additional levels of control over what you're seeing and what you're hearing.

[00:56:06.393] Lance G Powell Jr: Yeah, this is also true for VRChat, by the way. You mentioned Rec Room, but VRChat does the same thing. They have comfort levels, I think they call them, where you can be open to any type of experience or do something very restrictive, as you said. Even with the avatars, like not being able to see other people's avatars. They just show up as a generic robot in some cases. So yeah, those tools are available as well.

[00:56:28.912] Kent Bye: And in Meta's Horizon Worlds, they have a button that's a shield that basically takes you into a private space, from which you can then report. So it gives you a nested context that is a little bit of an escape hatch if you're in a bad situation. I think that's a good idea that might be useful in other situations as well.

[00:56:47.537] Lance G Powell Jr: Yeah. So again, to bring it back to the BBC article, if somebody isn't aware of these tools, then that is the fault of the platform, actually, because they didn't introduce them in a way that made them a knowledgeable user. But at the same time, reporting it in the way that they did, while it does for some of us start a conversation, as it does here, can also serve as a flag to other sexual predators that this is a place where they can go and thrive and groom young people. So I think there are some unintended consequences to going public with this type of behavior, because now people see this BBC article and they're aware that that's an option for them. Because what you do in social media is monitored, right? You're always watched when you use Facebook and so forth to target young people, but social VR platforms might not be monitored in the same way, so it is quite easy to get away with it. So I think being responsible about choosing an audience is helpful as well.

[00:57:50.621] Kent Bye: Yeah, where I land on that is that I think it's just good to have this type of report, because it is happening, and that's part of journalism's job: to shine a light on this. There are certain aspects where it's the responsibility of the community and the individuals to have the tools to protect yourself, but also of the platform to make sure there are things it can do. For me, I see this combination of, first, the culture: some people are going to be bad, and if you're going to be in these online spaces there's a certain amount of risk, just like there's risk in any experience you're having online. Then there are the policies within these companies and social platforms, but also the laws in terms of what is legal or not legal. COPPA compliance is an example of trying to protect children: part of the reason the minimum age is 13 is COPPA compliance within the United States, because if you're collecting data on children who are less than 13, it's a violation of the COPPA law. So there are laws, but then there are also the market dynamics, in terms of what kind of competition there is between, say, Meta and Rec Room and AltSpace. Each of these has its own focus of what it's trying to do and the different problems it's trying to solve, which gives different experiences. So there are different market dynamics. And then the last is the technological architecture and the code: a lot of the things that we're talking about in terms of how to ensure that these are safe spaces. So there seems to be a combination of all those things when I start to think about it. Whenever a report like this comes out, then yes, there are always going to be gaps in the culture in terms of abusive behavior, but there can be other mitigating factors to create situations that are enjoyable. It may end up that in the long game, as we look at VR 20 to 30 years from now, most of the activities do happen in these private spaces, because of all the various risks that happen in the public space. The public internet is different in that way; it doesn't have those same dynamics. And so maybe, as we move forward with the future of these immersive worlds, a mitigating factor is the social network of the people that you know, as well as more of these private contexts that avoid the public commons that invites this type of behavior, which is hard to really control if you're letting anybody come in from the public.

[00:59:59.420] Lance G Powell Jr: Yeah, no, that's an excellent point. But I guess with this particular case, the BBC article is painfully short, because I wanted to know a lot more about what happened: how you announced your age to this group of people and what the result was, because they describe what happened in very broad terms, which was negative enough. But when somebody is a journalist who is going into VRChat specifically to find a story, there's a lot of selection bias that happens as far as where you decide to go and who you decide to talk to. Because it wasn't an accident that they talked to specific people who had bad intentions and so forth. So explaining that in long form would have been much more satisfying. But I gave some attention to speaking and engaging with the communities within VRChat and other platforms, just because there are communities that really believe in the platform and its potential for fulfillment and creativity and real, genuine friendship. And when somebody goes outside of the community to criticize what's happening within VRChat, it's not that they're only attacking the platform itself, even though there are things it can do better; it's an attack on the community, which is just trying to survive. Because there have been a number of social VR platforms over the last six years that have come and gone just because they couldn't gain traction. And when some platforms survive for a while and find some success, that success is undercut by people saying it's a haven for predation. That's not the image most regular users have of these platforms, and it's not one that they would like to project. So when somebody does that, the community is, one, embarrassed, and also disempowered, because the platform is sometimes taking autonomy out of their hands just by saying, OK, everybody has a bubble, and now you're going to get these warnings constantly; I don't care if you've been here for five or six years, this is what life is now. So community development is the most important thing to help these platforms grow and be healthy. People use the example of sending a child out into a city and expecting them just to be safe, and of course they won't be, because opportunistic people with bad intentions are on the lookout for a child to abuse. But we should make it more like sending your child out into a small town where everybody knows each other and they recognize the social norms. And if somebody crosses the line and starts abusing that, they'll know how to self-correct for that, rather than relying on the platform or a legal system to decide what is right and wrong in this case. I mean, that's the underlying message: supporting communities.

[01:02:49.747] Kent Bye: Yeah, I think if you go back in history, you'll probably find moral panics happening with every new technology that comes about, where there's a moment of people saying, what about the children? But there's the freedom of autonomy that comes with different types of situations and contexts, versus ways that you're trying to either create spaces that are safe for children, or, if some of these spaces just aren't safe for children, then children shouldn't be there. Which gets into a whole other aspect of age verification, and then the privacy implications of verifying that people are a certain age, which impinges on people's privacy about who they are. So then that is another layer, in order to have anonymity; it starts to have all these different trade-offs. So no matter which way you go at this issue, it sort of hits all these hot-button issues that come up in terms of how to really viably address these things, whether it sits solely on the shoulders of the parents or whether there are other things at the platform level that can be set up. But I think the point is to have these deep, nuanced conversations like we're having, to really dig into the nuances of the story, which is why, for a lot of reasons, I am grateful for that article: it's catalyzing these conversations that need to be happening anyway. And you wrote up this top 10 list to be able to say, OK, as a parent who has spent quite a lot of time in these social VR platforms, here is how I would navigate it myself, and to give other parents some guidance as to how to navigate this.

[01:04:10.930] Lance G Powell Jr: Yeah, absolutely. And I'd like to quickly talk about number nine and number 10 on the list, because number nine was about how sometimes the child is the problem. They may meet people in their real life who are bad influences on them, or they understood specific hateful behavior as being normal within social media, and they import that behavior into social VR as well, which is another reason why access for young children should be restricted, just until they learn what behavior is allowed and what isn't. By the time they're a teenager, very often they choose to continue with this behavior, and at that point, yes, we do need to penalize it as well, and kind of section the adults off from the teenagers in some cases. But yes, it's not only predatory behavior by adults; sometimes it is the children who are making the experience worse for everybody. Number 10 might have seemed tongue in cheek, but I mentioned a platform called Half + Half, which came out, I want to say, three or four years ago. This is a platform in which everybody is a cute blob monster thing, and you can't understand each other's speech. Whenever you speak, it's filtered into nonsense gobbledygook, so you're not actually saying anything; they just understand that you're making noise. And I was giving this as sincere advice for people who want to expose their kids to social VR platforms without any of the risk involved, because there's no potential way to abuse anybody, even through suprasegmental cues, because you're just a wavy blob monster. So it's completely innocuous, and a fun experience as well. If I wanted to add something to that list, I would add: stay anonymous as a child, because generally what happens in social virtual worlds stays in social virtual worlds. You might be harassed within those social VR sessions, but it won't follow you into the real world unless people can identify you. So if you want to avoid harassment following you to your Twitter account or email and so forth, just use a pseudonym. Unfortunately for myself, I have always used my own name, and I know the same is true for you, Kent. So we're kind of on the hook for all of our behavior. But I would not recommend it for other people. Especially if you're worried about trolling and so forth, just find a nickname that you like.

[01:06:39.395] Kent Bye: Yeah, that's a decision that each person has to figure out in terms of their identity. But I do want to make sure that we hit all your major points. So we had number one, don't talk to strangers. Number two, don't go into other people's private worlds, especially people you don't know. Number three, learn the safety tools, which we talked a little bit about. Number four is record the incident. So if something does happen, there is an ability to record what happened, especially if it's ongoing, so that you have a little more objective evidence to submit. I know that in, say, Meta's Horizon Worlds, they have a kind of running recording happening, a buffer of maybe the last 30 or 60 seconds, so that when you do report, that recording is automatically included. But I guess on other platforms you kind of have to use the built-in tools on the Quest to record, and then either submit that directly as evidence to the platform, or share it offline in Discord. I'm not sure if there are other things you want to say in terms of documenting abusive behavior, to ensure that if there is stuff happening, there's some additional record beyond just a flag that something happened, which may not have any other record otherwise.
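
The rolling recording Kent describes is essentially a ring buffer: the client keeps only the most recent window of frames and freezes a copy when a report is filed. Here is a minimal sketch of that data structure; the frame rate, window length, and API shape are assumptions for illustration, not Meta's implementation:

```python
# Minimal sketch of a rolling evidence buffer of the kind described above:
# keep only the most recent N seconds, and snapshot the buffer when a report
# is filed. Frame rate and window length here are illustrative assumptions.
from collections import deque

FRAMES_PER_SECOND = 30
WINDOW_SECONDS = 60

class RollingRecorder:
    def __init__(self) -> None:
        # deque with maxlen silently drops the oldest frame as new ones arrive
        self.buffer: deque[bytes] = deque(maxlen=FRAMES_PER_SECOND * WINDOW_SECONDS)

    def capture(self, frame: bytes) -> None:
        self.buffer.append(frame)

    def snapshot_for_report(self) -> list[bytes]:
        """Freeze the last ~60 seconds as evidence to attach to a report."""
        return list(self.buffer)

recorder = RollingRecorder()
for i in range(10_000):                  # simulate several minutes of frames
    recorder.capture(f"frame-{i}".encode())
evidence = recorder.snapshot_for_report()
print(len(evidence))                     # 1800 frames = 60 s at 30 fps
```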

[01:07:46.152] Lance G Powell Jr: Yeah. So if you want to be an especially good citizen and clean up the platform, you can go through the work of recording harassing behavior, especially prolonged behavior. And if you upload that to YouTube as an unlisted link, the platforms are really on the hook to do something about that specific person. So you can expect a very expedited response if you do something like that, and I encourage people to do it. I don't encourage people to make that available to the general public, because there can be some backlash from people, again, who don't want that negative view of their community to be available to everybody. But then again, it spurs conversation, so it's very debatable. But yeah, if you do that, you can expect a speedy response. And if you do report people, I know that at least within Rec Room, they do, as a platform, thank you for doing that. They'll inform you if action was taken, and they will send you a thank-you note saying that was very helpful to the platform: thank you for reporting, we took action, see you again. And that's gratifying to receive. So congratulating people who report negative behavior can incentivize them to continue doing that.

[01:08:57.605] Kent Bye: Yeah, and one of the things I wanted to also mention is that, at least for Horizon Worlds, there's a connection to your Facebook account, and so some violating behavior may put your Facebook account in danger. And if you lose your Facebook account, then you could also potentially lose access to your VR hardware. So behavior that's violating could actually end up meaning that you lose access to all VR experiences. I guess it's technically true on, say, Android or iOS that something similar could happen, and if you're in a third-party app, I'm not sure if it will eventually get to that point, but it's possible that some of these things could go to the platform scale of losing access to the VR hardware. I think that creates another dynamic: in order to really ensure that these are safe spaces, people who are egregiously violating face real consequences. You can always create fake accounts to circumvent some of that, but at least there are things that put a little more at stake. I know that, at least when I'm in a situation, I think about, well, what if people filed a bunch of fake reports? Would that mean that I'm potentially at risk of losing access to my VR hardware, because I got myself mixed up with a gang of trolls that had a backlash and filed false reports in some way? I don't know if that's a legitimate concern, or how to deal with the types of dynamics you get in terms of recording and, like you were saying, the public versus private and creating a target for yourself, or other people vote-kicking you or having other ways to damage your implicit social score and put you at risk of losing access to these spaces.

[01:10:36.503] Lance G Powell Jr: Yeah, no, that is a valid concern, which is another reason why I don't like the idea of having the platforms police every type of behavior unless they can reach a very high burden of proof. Because, and I can't believe we talked this long without mentioning the metaverse, with a large interoperable metaverse where your identity is persistent across many different spaces, if the platforms also join forces on their security, that can have quite a large effect on your ability to operate within that metaverse. And in the worst case, you can be blocked from it completely. I think there are certainly some cases where that should happen, especially with sexual predators who are grooming young people for their purposes. But you're right that there are false reports. I've been reported on by people who didn't like me confronting them about negative behavior, and I don't know what happened to them, but I'm quite well known on the Rec Room platform, so I would expect, if they're doing their due diligence, that they are blocking the person who reported me, because I've spoken at both of the Rec Room conferences by now and I've known the developers since they started. So they don't know who they're dealing with, by the way. But the tools that we use for security can also be used against us, so we need to be careful about taking any overly punitive action based on them, because there can be a lot of false positives in these cases. So, in my opinion and that of others as well, it's unfortunate, but it's better to go with a false negative rather than a false positive. If somebody is guilty, but for whatever reason the platform decides not to take action because they didn't reach a certain burden of proof, then somehow that is preferred to unjustly persecuting somebody who actually didn't do anything wrong. But yeah, that's a balance we have to strike. And cold, hard AI fails sometimes, and this can happen. Maybe you've had the experience on social media like Facebook of having a post flagged for something that you said wrong. I had this experience very recently, and I immediately appealed it, because I actually wasn't inciting arson, as they said I was in the post. And on appeal they corrected themselves: oh, sorry. And if Facebook is a major player within social virtual worlds, I have to consider this as well. Are you going to block me from Horizons because I was wrongly targeted by the platform or a group of people? So yeah, that's a big risk.
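
Powell's preference for false negatives is the same knob as the classifier threshold discussed earlier: raising the bar for taking action punishes fewer innocent users, at the cost of missing more guilty ones. A toy illustration, with all scores and labels fabricated:

```python
# Toy illustration of the trade-off Powell describes: a moderation classifier
# can be tuned to prefer false negatives (missed offenders) over false
# positives (innocent users punished) by raising its action threshold.
# Scores and guilt labels below are fabricated for illustration.
cases = [  # (classifier_score, actually_guilty)
    (0.95, True), (0.85, True), (0.70, False),
    (0.60, True), (0.55, False), (0.30, False),
]

def confusion(threshold: float) -> tuple[int, int]:
    """Count false positives and false negatives at a given action threshold."""
    fp = sum(1 for score, guilty in cases if score >= threshold and not guilty)
    fn = sum(1 for score, guilty in cases if score < threshold and guilty)
    return fp, fn

for threshold in (0.50, 0.75, 0.90):
    fp, fn = confusion(threshold)
    print(f"threshold {threshold:.2f}: {fp} innocent punished, {fn} guilty missed")
# Raising the threshold drives "innocent punished" toward zero at the cost of
# letting more guilty users slip through -- the false-negative preference.
```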

[01:13:13.636] Kent Bye: Yeah, I had a similar experience where I was going on Meta's website, where they were announcing a Black History Month series of events. I clicked through like three links, and because I clicked through three links, they thought I was doing too much activity, and I got auto-blocked so that I could no longer even look at it. So I feel like there's this AI moderation that makes me a little hesitant to even spend much time on the Meta Horizon Worlds platform, because if I'm on that platform and I somehow get triggered on these reports, that could have me lose access to the VR hardware. It's sort of the fear around the AI overlords that are overly trying to police things and create this safe online space, which could then, like you said, if there are too many false positives, get people banned who weren't legitimately doing anything wrong. There's a balance they have to create between the risks you take to be on a platform like this and making it perfectly safe; having something perfectly safe is going to be kind of boring, and so you have to have some freedom for people to act. But I think that's part of the tension here. So that was recording the incident. Number five is to only go into private worlds, which we've talked about a little bit: when you go into these public spaces, that's where you're going to see most of the toxic trolling behaviors. So if you want to avoid that, you have to in some ways either be pulling in your social network from elsewhere, or go into friends-plus instances, or into these situations that are like going to a cocktail party, building up a network of spaces that are a little more trusted. So it's kind of like the process of building friendships in real life, how you gather and meet people at events. But only going to private worlds seems like a big mitigating factor to avoid a lot of the most egregious trolling and toxic behaviors that are happening on these platforms.

[01:14:59.537] Lance G Powell Jr: Yeah. Well, if it's a private instance and you are the host of that instance, the only people that get in are people you allow in. So you're the gatekeeper for your own experience. If you're really sensitive to uncomfortable or anxiety-producing encounters, then it's highly recommended, because everybody there is there because you allow it. And if you do tire of one of your guests because they said something harassing, you can immediately remove them, and you're not going through the voting system to kick them. You as the host are allowed to force them out, because they don't belong there.

[01:15:36.845] Kent Bye: Yeah, so there's a lot more control that you have when you're running these instances and controlling who comes in and out. Sometimes you're in other people's rooms, and that can be just as good. When I look at the general culture within VRChat, most of the stuff that I end up going to is in these private instances anyway. And there are different ways, at least in VRChat, that you can join people. Sometimes it'll be frustrating when you open up VRChat and everybody's in a private instance and you don't know what they're doing, but there are ways that you can flag to what degree you're available for people to join you. So yeah, there are different layers of privacy there that I think help. But number six is understanding the social norms of the world and environment. Maybe you can speak a little bit about what you mean by that.

[01:16:17.282] Lance G Powell Jr: So this goes back to what I was saying about hub areas, which are often the first place that people go. It's where people get their first impression of a platform, but because it has a lot of new users who don't really understand the platform, and also people who enjoy being toxic in public, you're going to have a lot of harassment as well. Just understanding that these specific hub worlds, with not a lot to offer as far as activities, are quite likely to have harassing behavior will help you realize, one, that you shouldn't go back there, and also that that's not what the experience is. This is just a waiting area for newbies, but to get to the real experience of a platform, you need to start exploring. And there, if you're seeking out good narrative experiences or games or artistic creations, you're going to start having a lot more positive encounters. But you never know sometimes; part of being a victim of harassment is just being in the wrong place at the wrong time. And to address the victim-blaming accusation from Twitter, I wouldn't blame somebody for being in the wrong place at the wrong time, because you didn't know. How could you know? And you don't completely understand the tools to combat that until you need them. For example, to bring it back to my friend who ended up at a KKK rally: she didn't mean to do that. That was completely by accident, and that is really a worst-case scenario. But that kind of experience never happened again, and by now she's been using VRChat for a few years. So this is why education is important, and the platforms need to support that, and the communities need to help out with somebody who is new as well.

[01:18:04.136] Kent Bye: Yeah, the victim blaming, or rather the allegations of potential victim blaming in number seven, be cautious about how you report harassment. Maybe you could explain what you meant in this point, because I do think that as people were reading through your top 10 list, this is probably the one that brought up the most resistance or comment, in terms of people questioning what exactly you're trying to say in number seven.

[01:18:24.688] Lance G Powell Jr: No, that is very, very understandable, because number seven is also the longest. What I'm presenting is kind of a peek inside the comments I've seen from people in different Facebook groups oriented towards a social virtual world, and also from people I've talked to on the topic. And they see this situation very differently than the general public does. Because if someone has a harassing experience and they talk about it within the community, the community will be entirely supportive, if it's a strong community, and I do believe that platforms like Rec Room and VRChat, and to some extent AltSpace, do have good communities. But when you go outside those communities, you start to get a lot of questions, questions that aren't often asked out loud, but if you press people, they'll start asking about what your motivations are for sharing this experience. So if you're hoping to bring awareness and enact change within the platform, then we start to ask: OK, so why did you go to the BBC? Or why did you put it on a public blog that you then shared with your group and got retweeted by a number of people? Because you're not exactly reaching the platform. Instead, you're putting the platform and the community on the defensive, while reaching a lot of people who have no vested interest in the VR platform and aren't really interested in what happens to it. For most people, if VRChat disappeared overnight, it wouldn't matter to them. But for many of my friends, it's their friend network, it's their creative outlet, and it's their livelihood, because people are profiting from their time there. And VRChat becomes part of their identity. Some people are educated in one department, they have a specific job, this type of family life, this type of belief system; but for others, the people they engage with are in VRChat, so VRChat is absorbed into their identity. It's part of who they are. So if you are suddenly seen to be attacking who somebody is, if I'm a community member of VRChat and you are attacking VRChat, then a person in that community may do genuine victim blaming in that case. And it can be harsh, and you're likely to create a lot of unintended consequences, because you were trying to get an audience on your side that has nothing to do with VRChat and doesn't really understand the context of what's happening. So that's what I was trying to express: you're creating a lot of doubt about what your real intentions are when you spread something so widely, because they feel like you're just drawing attention to yourself by putting yourself at the center of this harassing behavior. And it has a potential for negative effects. Again, this is arguable, but this is also what I've seen other people express: it has negative effects for the platform and the communities with a vested interest in the platform. So that's just what I see, and careful consideration is important. And if people say, yes, I've honestly thought about each of these things, and I still think it's the right thing to do, then OK, do it. You're perfectly free to do it. But just expect that these different groups are not going to agree with your conclusion, and you might receive online attacks on top of the harassment you received within the social virtual world.

[01:22:03.015] Kent Bye: Yeah, if I were to try to summarize some of what is coming up here: for one thing, your article is titled as parental guidance about how to supervise what's happening within these social VR spaces, but this number seven, in some ways, is directed not so much to parents as to journalists, because it's switching context. What I saw happen, at least, was that you have the BBC reporter, who may not be fully ramped up on the full context, coming in and reporting on stuff that's actually happening, and that I think is worth shining a light on: that these types of toxic behaviors are happening. But the reaction that I saw from a lot of people in VRChat was, well, this person should have done this, they should have known how to use the tools, they should have known there are privacy settings, they should have known this or that. And there is a sort of reaction that came from the community that does have victim blaming in it. So be aware that if you do go public with some of this stuff, you're tapping into other social dynamics that complicate things, whereas you could perhaps use the internal tools to report things. But in this case, it's a journalist trying to point to these larger dynamics, which I think is a different situation, a different context. And your observation is that people within VRChat don't want a moral panic stirred up that takes away their social connections just because a bunch of parents are allowing their kids, unsupervised, to have these toxic behaviors potentially amplified. Part of the problem is not having good enough parental supervision of how kids are even using social VR, and the response is to say, well, we should ban all kids from this platform so we can keep our platform, which then introduces other things. And there's this other aspect of recognizing that with any platform there are going to be certain types of risks, and there are certainly additional impacts when you're immersed in VR, but there are these other sociological and cultural dynamics of people who are invested in these communities and find a lot of value there, and who, at the end of the day, don't want that to go away because of toxic elements that are perhaps impossible to completely eliminate from human behavior, while finding ways, at a platform level, to mitigate them.

[01:24:13.874] Lance G Powell Jr: Yeah, no, absolutely. And I'm not saying that the opinions listed in number seven are somehow correct. Personally, I think this is really something you should look at case by case, because there could be situations, I agree, where writing a public article or a blog could be the right call. But I can also imagine situations where that is not the right call. So I wouldn't say either that the victim is always right, or that the victim is always wrong, or that the victim should be blamed. And again, if you're walking in as a new user, you don't know what the tools are, and you're basically unaware of what's happening, you can't be blamed for that. It's an absurd thing to do. And I think within the community response within VRChat, a small shift in tone would help things a lot. So instead of saying, yeah, you should have known the tools, try to be helpful. Just say: oh, if this happens again, there are these tools. I don't like people using third conditional statements like that, the should-haves and could-haves. Just try to be helpful to people. Everybody has their own way of processing a negative experience, and some people need to do it out loud, and that's perfectly fine. But support them in giving them a better experience. So the next time you see somebody who had a harassing experience and they post about it on Twitter, contact them. Say: hey, I'm really sorry this happened to you. I hope this never happens again. If you want, join me in VRChat, this is my username, and I'll show you around some places and teach you how to use the tools, just so we can avoid this. That's a very restorative approach to this experience, and we can take something that was quite negative and turn it into a positive. Because I don't like when the conversations themselves become toxic on both sides, with one side saying, wow, you're victim blaming, and the other saying, no, you're ignorant. It just becomes a back-and-forth of name-calling, and it really pollutes the experience.

[01:26:25.402] Kent Bye: Yeah, I think that makes sense. Well, the last one that we haven't talked about yet is number eight. We've talked about nine and 10, number nine being the plot twist that your kid actually might be the problem, and number 10 being playing Half + Half. But number eight, avoid conflict avoidance. What do you mean by that?

[01:26:40.989] Lance G Powell Jr: So, avoid conflict avoidance. I guess at this point I need to give a shout-out to Jessica Outlaw, who you interviewed on your podcast as well, because she talked a lot about what you can do as a bystander. If you see somebody who crossed the line or who is harassing, jump in and do something to help the situation, either by initiating a vote kick so they leave the room, or by just supporting the person who is being harassed. If that person doesn't want your help, then there's nothing you can do about that, really. But make sure you do everything you can to help them have a good experience. That's what I meant by that: don't be afraid to help out and jump in when you know something is wrong. It's something I do personally, and I encourage others to do as well.

[01:27:31.986] Kent Bye: Yeah, like we talked about earlier with doing the reporting, making sure that you're reporting people. But I guess this is also more along the lines of directly intervening in conversations with other people, trying to step in or come to someone's defense. I feel like there is a bystander effect that happens in psychology, where you see things happening and just kind of assume that other people are going to do something about it, and collectively a lot of people end up just standing by, watching it happen, without stepping up and doing something. So I guess it's about overcoming that. I think it would be nice to have training for that, to give people some embodied experiences of it, because if people are not familiar with doing that, it's hard to know how to do it in a way that feels within the cultural norms. I know a lot of Jessica Outlaw's approach is saying that you can never fully technologically engineer a solution to this type of social trolling behavior; there's a part of culture-making and place-making that comes from different rituals: who are the heroes, what are the stories we tell, what are the other aspects of the culture that's trying to be generated. Because it's not a situation where you can just have AI moderation or technology deterministically coming in and solving these very real human problems. At some level, there's a responsibility for us to participate in these communities, and in some ways VR is a reflection of our own limitations as humans; it's a mirror to the shadow aspects of ourselves. If there are really toxic environments, that's almost a reflection of the people in that community, and there's a collective responsibility for people to try to create the culture they want to be in. And if you see things going against that, you get the broken windows effect: if you see one person violating the norms, there's an encouragement for things to spiral out of control. So how do you, as a community, take responsibility for participating in that culture-making? At the same time, you don't want to just go in and essentially become a moderator, because then it's not fun to be a part of. So yeah, there are these trade-offs between the collective responsibility and what you personally want to get enjoyment out of as you go into these experiences.

[01:29:50.540] Lance G Powell Jr: Right. Actually, something that happened very recently in Rec Room is they decided to give everybody on the platform superhero capes. And they did it with the express purpose of organizing different activities, and specifically designing rooms based on these superhero themes. But I very suspiciously think that somebody read a book on enclothed cognition and decided to give out superhero capes as a way of helping people form the attitude that they should have within the platform. If you're a superhero, you do the right thing: you don't harass or troll people, you treat people with respect and justice. Also, as a superhero, if you see somebody in trouble, you do what you can to help them. In the end, I don't know if it works, and I'd love to look at their internal data to see what happened because of this, but I'm fairly confident that that was the motivation behind it: a cheap way of turning each of your users into a vigilante for justice. And you can't say that they did nothing, because they're trying to nudge people in the right direction. So maybe that's the book they read, Nudge, I think it's called. I'd be curious to see how that develops and if it really helps. But I think there are creative solutions out there that we haven't talked about yet that could have very positive effects for these communities.

[01:31:12.600] Kent Bye: Yeah, there's also Jeremy Bailenson, who coined the phrase the Proteus effect, which is that when you are a superhero, you tend to have more pro-social types of behaviors. I suspect a big part of it was probably creating these in-groups that allow people to have more ad hoc collaboration amongst people in the same superhero tribe that they've been automatically sorted into. The entire population of Rec Room was put into one of these four groups, which then allows social dynamics to emerge where group behavior can be more emergent, based upon this implicit group that you've been put into. So I think there's probably a game design element that may have been the underlying motivation, to explore more social gaming. That's what I suspect, at least, but that's an interesting thesis, that part of being a superhero is taking more responsibility for participating in the culture. Now, just as we start to wrap up, is there anything else that we haven't mentioned that is worth mentioning before we wrap up the conversation?

[01:32:11.477] Lance G Powell Jr: No, I just want to compliment other people who are working in this area. Somehow we didn't mention Kavya Pearlman, who has also been on your podcast, and who is working a lot towards security. And I could be wrong, but a part of the security she's interested in is securing users from the platform. Because when we start strictly monitoring all user behavior, that information can be used for other purposes, such as advertising or trying to change our market behavior. So that's a risk as well. When you're asking your platform to incorporate more surveillance tools, you might be losing control of how the data from that surveillance is being used. So be aware that that dynamic is at play as well. It's another reason I'm kind of skeptical about platform responses.

[01:32:59.402] Kent Bye: Yeah, that's Kavya Pearlman of the XR Safety Initiative, and she's certainly been looking at both online safety for kids and other privacy issues as well. Yeah, there are trade-offs there: if you want the safest platform, then you have literally everything that you say or do being recorded, which is an incursion on our privacy. So using a moral panic around the kids to lead towards dystopic surveillance is not necessarily the reaction that we want to see go down. But we've covered a lot of the aspects of social VR and harassment, and I'm just curious, from your perspective, what do you see as the ultimate potential of VR, and what it might be able to enable?

[01:33:37.041] Lance G Powell Jr: So I'm, despite our conversation, very optimistic about the future of VR, because I see it as a way to really give people fulfilling social experiences that they might not have access to in their immediate environment. Especially two years into the pandemic, a lot of people have felt lonely and desperate, and they've had a terrible time of it. But if you speak to people who are daily active users within social VR spaces, they're having a completely separate experience, because they have their network of friends available and they're able to express themselves creatively and have fun. And it's a beautiful experience that I see growing once we can create healthy communities to mitigate a lot of the risks that we discussed. I know depression is on the rise. Suicide is also on the rise. And I see VR as a way of reversing those trends, because people very often feel suicidal because they lack a sense of purpose and also feel isolated. And social VR is a way to counter that completely. So that's what I see the ultimate potential being: just creating a social experience that is accessible to many more people.

[01:34:54.267] Kent Bye: Great. Is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[01:34:59.271] Lance G Powell Jr: No, I'd just like to say that I look forward to continuing the conversation within these social VR platforms. And I want to hear about positive experiences that you have as well. So don't just sound an alarm every time there's a fire, but also highlight some of the great and beautiful experiences that are out there. For example, one person I wanted to mention was Joe Hunting. He just released a film shot in VRChat called We Met in Virtual Reality that was featured at Sundance. So definitely look into that too, because if the first time you hear about VRChat is because of child predators, then maybe see the beautiful side as well by learning about how people have formed real relationships through this medium.

[01:35:42.043] Kent Bye: Yeah, Joe Hunting's We Met in Virtual Reality, which premiered at Sundance, and I had a chance to see it and do an interview with him. And yeah, I highly recommend checking that out just to see a lot of the culture that's happening and the different communities that he was tracking there. That's probably another thing worth mentioning: just as there are individuals that you can meet within these social platforms, there are entire communities that have meetups and gatherings, and those spaces feel a little bit more controlled in terms of the policing; it's more of an established social network and graph. And over time, it'll be interesting to see how Meta's Horizon Worlds is using people's established social networks as they're coming into these VR spaces, coming in with those established relationships that may create a different tenor. I've already noticed that there's a different vibe that happens on Horizon Worlds, just because a lot of times people may already know each other from physical reality. But yeah, if people want to take a look, you have a Medium article that you wrote, Parental Guidance for the Metaverse, that you just published in response to the whole discussions that are happening, as well as your thesis, called A Framework for Understanding and Detecting Harassment in Social VR, which is linked from your Medium article, and which is a pretty extensive deep dive into more of the platform-level things and the history of harassment that we were able to talk about. So if people want more information, they can have access to that there. And Lance, yeah, thanks for taking the time to dive in and unpack this a little bit. I know this is something where you've spent quite a lot of hours within social VR to really understand the culture and the tools and some of the different dynamics, and I just appreciate you taking the time to unpack this very complicated issue and to do a little bit of a deep dive. And yeah, thanks for taking the time and being on the podcast.

[01:37:18.405] Lance G Powell Jr: All right. Thank you very much.

[01:37:21.235] Kent Bye: So that was Lance Powell. He's the CTO of Phaedrus Solutions, an immersive education company. He also wrote his master's thesis, titled A Framework for Understanding and Detecting Harassment within Social VR, which is linked in the show notes down below, and then also a piece called Parental Guidance for the Metaverse, with ten different points: don't talk to strangers; don't go to other people's private worlds; learn the safety tools; record the incident; only go to private worlds; understand the social norms of the world and the environment; be cautious about how you report harassment; avoid conflict avoidance; plot twist, your kid might be the one who's part of the problem; and the last one, play Half+Half on the Quest.

I have a number of different takeaways from this interview. First of all, I think these are generally a lot of really good pieces of advice. There is kind of a weird claim, though, in some of the things that Lance would say, like, for example, that if you learn the tools, then harassment should only happen to you once. I don't think that's universally true. I do think that there are certain populations that are targets of harassment, and it's a larger systemic situation that goes above and beyond what any of the technological solutions can address. Certainly, with the types of hate speech that are happening, if you are a part of a marginalized community, then you will experience a lot more harassment within these social VR spaces. So I think there are a lot more complicated dynamics there. I do think that there's a part of Lance that's identifying with these communities, and he's listening to what a lot of these communities are saying. As I look through a lot of the quote tweets and reactions, there are a lot of people whose reaction is: well, if you go into VRChat, you should expect to learn the tools, because there are ways that the tools can prevent some of this stuff. But my pushback to that is that the tools are not intuitive, they're complicated, and there's not great onboarding. If people are going into these situations and they're facing this type of harassment, then it's a larger issue that needs to be addressed either at the platform level, the education level, or at some level, so that we don't just have these really super-toxic environments where people just accept that that's reality. I do think that people who are really hardcore users have understood that those different types of public spaces are not great places to be hanging out. So they end up cultivating their own social network, and they're spending most of their time in these private instances. That's probably a big way in which people are mitigating the darker side of things that are happening on these public instances. And so I'm just thinking generally about, well, how do you clean that up, or cultivate a situation where it's not like that? Are we destined to only have these private instances moving forward, or are there different types of public spaces where people can actually come together in a way that feels more like going into a city? What's the baseline of decorum that they want to have in these different types of environments? That's a larger question for how to cultivate and really generate that. Meta has their own approach, with a lot of moderation. There doesn't seem to be as much moderation within VRChat.
But I think there are a number of different dialectics, I'd say, that are coming up. The big one is around kids in VR. A lot of people that I've heard from in the Twitter reaction to that tweet, which, by the way, you should check out and read just to see the range of different perspectives, were saying, number one, that there are a lot of unsupervised kids, and so there's the parents' role in terms of being educated and understanding what's happening in these VR spaces. Andrew Bosworth is suggesting that parents should be monitoring everything that their kids are doing in their VR sessions through casting and streaming. Lance's reaction to that was that it's not really reasonable for most teenagers, if you're going to ask them whether they would mind that their parents are going to be watching them play. I just don't think it's necessarily reasonable to have that type of oversight. Lance did say he thinks it's probably good enough for the parents to be in the same room, which I think is probably right. That's obviously not always going to work out, in terms of where the adults are at and where the kids are at, and whether the kids are going to have to be supervised any time they go into these VR spaces. But I do think that it is an issue and a problem: either there needs to be more supervision, or some of these different experiences like VRChat just need to be age-gated. There is this kind of weird public-private difference in the terms of service and the community guidelines from VRChat, where there are a lot more acceptable things happening behind closed doors with consenting adults versus what is happening generally in these public worlds, like the type of harassment or trolling, or people getting up in and violating personal space, or rape threats and stuff. As for the video that was shown by the BBC, Lance was skeptical about the intention behind it, but I'm not as skeptical at all. I just watched the footage, and the footage is there, and it shows the different types of interactions that are happening. It was only about a minute's worth of footage, so there is indeed a lot of context that's missing. But the BBC researcher Agnes Crawford was mostly muted a lot of the time, it looked like, but was sometimes speaking, saying, you know, stop that, when there was someone putting a beer bottle in her private areas and transgressing her personal space. There are a lot of different things happening that are not necessarily suited for kids. There are other questions around age gating and whether or not there should be age verification. Andrew Bosworth was skeptical that there would be any type of viable age verification that couldn't be circumvented in some fashion. There doesn't seem to be any larger effort within Facebook to systemically address some of these different issues. They seem to be happy to have this as an open problem, passing the buck over to the parents and saying, it's out of our hands, there's nothing else that we can do. That said, there is a lot more of a locked-down feeling on Meta's Horizon Worlds, but it's also not as interesting. There doesn't seem to be as much creativity, the lighting isn't great, and the aesthetics aren't so great.
They did announce recently that they've had over 300,000 visitors since they opened the doors, as well as people going to Horizon Venues, but that's tough to interpret, because some people may have just gone in there to check it out, to see what's going on. I think over time, as we see what the monthly recurring users are going to be, we'll have a little bit better sense. But like Lance was saying, when you have established social networks of people, you know, people who already have their social graph within Meta's Facebook, that may actually help in terms of having those cohesive relationships where people go and do different things within the different worlds within Horizon Worlds, although there don't seem to be nearly as many interesting things to go do in Horizon Worlds as there are, say, in VRChat. The age gating and the different privacy implications are still a big open question in terms of where that's going to go. If there are going to be more reports like this that bring more attention, that is potentially going to bring legislators to come and start to say, hey, look, it doesn't seem like it's necessarily a good idea to have teenagers running around these different spaces. I do think that there probably are a lot of different use cases where it's totally fine. There seem to be a number of different situations and contexts within Rec Room where it doesn't seem to be as much of a problem, just because there are more games and things for the teenagers to do. And the whole issue of harassment and trolling and sexual predators, that's a whole other situation. One of the things that Lance said is that, as an adult, you have a sense of being able to read people and know when to leave a situation and know what can and cannot be trusted, and it's possible that kids who are 13 to 18 just aren't as aware of some of those different issues and may find themselves in bad situations. He said something very interesting, which is that most of the time when we find ourselves in these social situations, there's a certain amount of etiquette we have where we don't just eject or remove ourselves, but there is a certain amount of personal responsibility that we have within these virtual spaces, in that we do actually have control: we can take off the headset, or we can block people, or whatnot. But blocking people and all that stuff, from a user interface perspective, is not always clear. There's not just a button that you can push, and it's difficult. The thing that Lance was saying around harassment was that it's disrupting your normal flow. If you're having a normal flow of playing a game within Rec Room or doing something, and you get harassing behavior, then it is often quite disruptive to the thing that you want to be doing. In some of these different maps within Rec Room, for example, it can be difficult to get access to certain guns or certain things. It can be quite an investment to get to a point where you're having fun, and if you run into that type of harassing behavior, then you're forced to give up on that. So there are a lot of dynamics that don't always make it so that it's just a no-brainer to take the headset off, especially if you're immersed within the situation.
And I think just saying that you can rip off the headset may diminish the reality of being in an embodied situation where you have transgressions of your personal boundaries, or harassment happening that can feel just as real as other types of physical harassment or bullying, or that goes above and beyond the different types of things that are already happening on the wider Internet. A lot of reactions are like: this is the Internet, what do you expect? Just get used to having to navigate some of those things. There's a degree of truth to that, but there's also a degree to which there's something distinctly different about virtual reality, such that we shouldn't just dismiss it as being lesser than other types of abuse or harassment. You know, for me, I don't see it as much of a hierarchy, because especially if people have already gone through that physically, it could be stirring up a lot of memories that are just as traumatic. And the type of PTSD and physiological reactions sometimes don't have any differentiation between the virtual and the physical. Certainly, there's a difference between physical assault and things that happen to your physical health, but in terms of the emotional and psychological impact, it can be just as visceral and just as real. I think that's probably another area where I have a little bit more disagreement with Lance's take on some of this stuff. But I think the perspective that he's really trying to give is that of people who have a lot of time invested into these social VR platforms, when they see someone who's relatively uninformed about the dynamics of the platform come in and produce what's perceived as a little bit of a hit piece. Although, if you want to go in and try to find some of this stuff, it's certainly not that hard to find. And for the people who are members of the VRChat community, they have a lot of established relationships and careers and just a whole social life in many different contexts where this is a big part of their life, and it's painful to see it attacked in that way without showing the other side of it, or the different ways to mitigate it. I think there's a challenge here of moving into this next phase of new people coming in who are not really ramped up on the social norms and how to navigate all these different aspects, like how to block people or use the safety tools that are actually there. I think that's part of this need that Lance and I were talking about: to have new phases of onboarding, just to ensure that people are able to understand how to protect themselves as they go into navigating the metaverse, but also to understand the cultural norms in terms of where you can go to have a good experience and where not. But all that said, there are certainly people who don't know any of that, and you can't expect people to be ramped up on that knowledge. These situations are there, so it's a larger question as to whether or not it's solely up to the parents to sort that out, or whether there are other things from a moderation perspective on the platform, or other things that need to be age-gated. And like Lance was saying, there are a lot of ways in which the community is also responsible for co-creating and cultivating a healthy ecosystem and culture. And if all of these new people are coming in and they're not as interested in that, then it can be challenging to have a cohesive culture.
And so there are these different sub-pockets of the culture that you're exposed to; certainly that's already happened with the fractionalization of these different types of sub-communities within VRChat. So, anyway, that's some of my take, just trying to map out some of the different cultural dynamics and dig into a little bit more of the nuances of this story, because I do think that it's an important story to shine a spotlight on this stuff that's happening. I don't have a good solution for how to change these things. But I think if we're just talking about it and finding ways for each individual to have a better experience, then for the most part, a lot of these ten guidelines are a good roadmap that I agree with in terms of how to have a good experience on a social VR platform like VRChat or Rec Room or Altspace. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon donations in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
