I spoke with Harry X about potential gaps in VRChat’s moderation of NSFW avatars that he details in a series of articles here:
- The Dark Reality of VRChat: How Public Sexual Avatars Are Slipping Through the Cracks (2025, April 29)
- VRChat’s Dangerous Oversight: A Breeding Ground for Public NSFW Avatars (2025, May 4)
- VRChat’s Complete Failure in Avatar Moderation – Over 300 TOS-Violating Avatars Reported, Zero Action Taken (2025, May 10)
I did have a chance to follow up with VRChat's new trust and safety lead in a subsequent interview to go into more detail on how VRChat is responding to these potential gaps. You can see more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So I'm going to be diving into a number of different interviews broadly around VRChat, and then I'm going to be diving into Raindance Immersive, which is covering lots of different artists who are working within the context of VRChat. So I got an email from Harry X on Friday, March 13th, and he is a concerned VRChat user. He was seeing that there were a lot of these not safe for work adult avatars that were publicly available, so anybody in VRChat could start to use them. He was reporting these avatars and finding that there wasn't any fast action happening. And sometimes the action that was taken was to make them private rather than public, but they were still available to the people who were already using them, so they weren't being removed from the platform. And so he made a series of different public posts, and he was filing different tickets. And he reached out to me because he wasn't getting a lot of response from VRChat, and he asked me if I wanted to do an interview about it. I had just gotten back from Augmented World Expo, and I had been in touch with VRChat. I was going to do an interview around the avatar marketplace that they were doing. And then after I did that interview, I actually did a conversation with VRChat's new trust and safety lead, Joon Young Ro. So this will be like a trilogy of interviews, where I'm listening to these claims and allegations that are being made by a concerned VRChat user around some of the cracks and downfalls of the trust and safety processes at VRChat. Then I'll be diving more broadly into how VRChat is starting to get into the avatar marketplace. And then I have a conversation with the trust and safety lead, who's a new hire, has been there for a couple of months, and has lots of plans that are going to be implemented over the course of two to three years. And so there's kind of a reevaluation of some of these existing trust and safety processes underway, which I think is starting to be pointed out in this conversation with Harry. So to get the full context, listen to the next three interviews to hear some of the different claims and allegations, and then I'll be diving into them. So just a bit of context for the evidence that Harry had sent to me: he sent me a big file folder of specific avatars that had genitalia being exposed and different prefabs that were enabling different types of interactive sexual acts. And, you know, these were just public avatars that he was coming across.
So I did go through and see all these different photos, and then I actually went into VRChat to see if some of these were available. You know, there are hundreds of different avatars, and I didn't check every one, and I wasn't able to see a number of them. So from my spot check, it seemed like a lot of these were no longer public, but they may still be private, which I think is part of what he was saying: if you already have access to these avatars, they wouldn't be available to other people, but people who do have them may make them available to others, because you can toggle that. So anyway, that's the larger context for this conversation, and I'm going to be diving into more detail around some of these different claims and allegations in the next couple of interviews, specifically in episode 1636 with Joon Young Ro. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Harry X happened on Tuesday, May 27th, 2025. So with that, let's go ahead and dive right in.
[00:03:24.964] Harry X: Hello there. My name is Harry X, formerly known as Harry V, and VR is definitely a passion of mine. I started with vTime XR, went on to Sansar, then Neos, and then at the very end of 2022, into 2023, I discovered VRChat. So VRChat's been the one that I've spent most of my time in. I mean, I'm on there like, oh, three hours a day, probably too long. And there are many players that sort of just live, you know, and sleep in VRChat. I have over 2,000 hours, over 2,000, I think, two point... Let me give you an exact amount. Going to my Steam library... VRChat, 2,771.4 hours in VRChat, quite a lot of hours. And I'm a trusted user and I'm a VRChat Plus subscriber, which is a $10 subscription, $9.99 a month. So I'm very well invested within VRChat and the whole ecosystem. I know it very well. And I've met the different types of player bases. So yeah, that's a bit about me.
[00:04:55.282] Kent Bye: So I know that lots of folks within VRChat may be anonymous or may want to maintain certain barriers around their private life. But I do like to ask folks if they're willing to provide a little bit more context as to your background and your journey into VR.
[00:05:10.245] Harry X: Oh, of course. Yeah. So I discovered and got into VR during the pandemic. I started off with the Oculus Go. For me back then, I had some money in one of my savings accounts and I was like, hmm, I either get VR or I get a 3D printer and go down that route. And I chose virtual reality. So, yeah, back when I first started, I was living with my parents and had the Oculus Go, and that's when I played vTime XR. And that really did push me to be comfortable with myself and talking to new people and, you know, just meeting people and making friends across the globe. And that's sort of how I started my journey and got to where I am today. When I started, I think I was 19 at the time. So yeah, I started kind of young; I'm 24 now, turning 25 this year. Also, my work background is that I used to work for a charity that specialized in helping people understand technology, but it was a market where our clients would be 60 plus, right? And VR has really helped me. Like I said, it's opened me up. I now live in Canada, you know, with my partner who I met in VRChat. So yeah, VRChat has good points, but we can't just focus on the core good points of VRChat like everyone else does. We have to address the issues that no one is actually repairing or focusing on or making properly known.
[00:07:00.981] Kent Bye: Yeah, so you reached out to me because you have been tracking how there are a lot of not safe for work avatars that are a violation of their terms of service or code of conduct. And so my understanding is that if you are in a private instance, you can basically kind of do whatever you want, although there are probably limitations in terms of broadcasting, so you can't stream it. If you read through their terms of service, there are certain things that are carved out where you can't do anything in public, and you shouldn't be in public instances with certain types of behaviors. But when you're behind closed doors in private instances, then my understanding is that even if it's perhaps technically against some of the terms of service, if it's not interfacing with the public, then they're either turning a blind eye or not moderating certain standards within the context of what their terms of service are. So maybe you could elaborate from that point, from my understanding, if that's generally correct, or add some more specific nuance in terms of these different types of violations of not safe for work avatars that you've been trying to report, but where you're still finding instances of some of these avatars that are public that should not be public. And just describe a little bit of how you first came across this and why you wanted to really push it forward and what some of the reaction has been so far.
[00:08:18.629] Harry X: Yeah, of course. I mean, I always knew that these avatars existed. I used to sometimes use them, but I didn't really know of the dark side back then. So I sort of just, like, had one of those avatars. So the avatars have been out there for years. I mean, essentially what it is, is it's something called an SPS or a DPS toggle avatar. And essentially, that's genitalia, be it anthropomorphic, if you're in the furry field, mechanical, if you're in the robotics genre, or humanoid, if you're in the human field. So essentially, what their terms of service say is you're not allowed to wear an avatar with those toggles in public worlds. I find that it's okay to wear those avatars. I mean, I don't necessarily myself, because I have my own and I don't really have those toggles, but I'm like, you're okay to wear it in a public world, as long as you don't get naked or do anything private to really make that avatar stand out. But that's essentially what VRChat say: they say you shouldn't be wearing any of those avatars in any public instances. However, they do tend to bend the rules a little bit for certain folk, and I'm seeing it with their moderation team as well, which I'll touch on later, but I haven't checked the streaming side. Essentially, with the other legalities in place as well with these public models, people tend to wrap their whole identity around one public model. So essentially how it works is a creator makes an avatar, you know, puts all the toggles on, and then they list it for sale on places like Gumroad or Jinxxy and the like. And essentially what a client or user is meant to do is maybe try on a light version of that avatar that might not have all the toggles and go, hmm, I really like this, I want to make it my own, or I would like to buy this and have it as my own model. So often some folk go out and they buy the model from Gumroad and then they have to upload it to their account. However, that's not always the case. And streaming content with those avatars not only goes against the terms of service, but it also goes against the creator's terms of service. If the person hasn't bought that model, they're not allowed to make money off it or give it airtime under Creative Commons, because they don't have the rights to do that. However, that's not the dark side. The dark side that I've been focusing on is essentially the children. And again, it is very much a VRChat issue, and it's also a creator issue as well. And the issue is, so a lot of the children, pretty much 95% of the children on VRChat's platform... And remember, VRChat's average player base to this day is over 100,000 people online a day. Okay, so with that in mind, like I said, you've got different percentages, but 95%.
[00:11:45.381] Kent Bye: Just to clarify, are you saying 95% are children, or what do you mean by 95%?
[00:11:51.122] Harry X: Just want to clarify that. 95% of the children's player base, sorry, play on Quest is what I was going to end that statement with. They're Questies, they go on Quest, which means they don't need a computer to run VRChat. They can run it on their Meta Quest. But there are also a lot of adults who play on the Meta Quest. For me, I'm PC and PC VR. I have a Meta Quest 3 and I link it up to my PC, so that opens up more avatars and all the worlds and everything else. But the issue with the creators making avatars with those toggles Quest compatible is a child can essentially find that avatar in, let's say, Prismic's avatar world or any avatar search world. Or if you go into a public instance and someone has their cloning on, because they don't always turn it off, if it's a public model and someone's left their cloning on, you can clone their avatar. There's nothing stopping me from cloning someone's avatar if their cloning is on. And it's not just me who can do that. It's children who can do that. And within the first three months of playing VRChat, I was having a bit of an avatar identity crisis, trying to find an avatar that could work for me. I went into an avatar search world, just a public instance, and I was sort of looking at some of the male avatars, and I overheard a bunch of children in the corner, and they were saying something along the lines of, look at what my avatar can do. And I sort of turned around thinking, what on earth? And what they did is they had male human genitalia on that avatar, and they had access to that, and by their voice they sounded young, like they sounded below 10, and they were just giggling and laughing, and I was thinking, oh my god, you're a child, you shouldn't have access to any of this. But like I said, they can get into VRChat; it's really easy. You don't even have to put in your age to get into VRChat; there are no limitations like that. So yeah, that was essentially my experience. So I knew children existed, and learning more and more about the dark side of VRChat has not just made me aware that children exist, but also made me aware that there are dozens and hundreds of paedophiles that go on VRChat, because they know that there are kids on the platform and they know where to look. I mean, if I open up my menu, I know that most of the worlds that children hang out in are worlds like the Black Cat. You can easily go in there and there'll be a group of people, and over half of them will be children. So like I said, it's just a whole ground for predators to, you know, find their next victim. And sadly, VRChat is the grey area. And what I mean about the grey area is this, and this is what I've learned from doing my own research: with so many of the victims in VRChat, the predator will ask the child to go on to another platform. As soon as a child registers any form of interest or whatever, the predator will ask the child to go on another platform called Discord, right? Normally it's Discord, right? So the child goes on Discord and they basically talk off platform. They talk on Discord via voice call, a video call, text message, you know, anything. And it's on Discord where, you know, photos are shared, videos are shared. Again, you've got video calling. So sadly, VRChat sort of becomes this grey area. So all the evidence is essentially stored in a Discord chat, right?
But then you go into VRChat, and then, you know, as I've been looking into it, what the predator then does is they'll ask the child, you know, do you want to meet up at said time, at said place? And then the child comes on, and then what the predator normally does is pick from a list of avatars that they know match their criteria, can get fully naked, and have genitalia toggles, and then turn on their cloning and allow the child to copy their desired avatar. And then they will engage in something called ERP, which is erotic role play, and it's sexual. So with these toggles, what it allows you to do with DPS, SPS, TPS, and the Lollipop system is, for example, penis genitalia will be able to magnetically lock into the other player's orifice, be it a vagina or something else; it would magnetically lock into that orifice, and then they can engage in the sexual activity. And that's what these functions are, and again, they're tailored for adults of consenting ages. But sadly, because they're made public, not only do the predators find them and allow children to clone them, but like I said, I can go into Prismic's avatar world. They've got this new button that says, show recently updated avatars. And then it brings up hundreds and hundreds of avatars. And then it's just a matter of clicking on a few avatar thumbnails, turning into the avatar, and then opening up the radial menu and finding out what toggles it has, you know, and seeing if it can get naked. You know, like I said, these avatars are sadly the majority. That's what I've learned. And I can't exactly target the predator issue as a whole. So what I've been doing, as soon as I learned about the dark side of VRChat, was, again, reading the terms of service, getting an understanding of it, and also finding these avatars and reporting them to VRChat. And normally VRChat have been... You know, they sent me an email. Because what you do when you report an avatar is you basically have to get naked in the avatar, show the pathway via the radial menu for how you got naked. There's a text box above it, and you need to say what toggles it has, and then attach the images, the screenshots, above. And then enter the user ID of the person who made that avatar public. And then also enter your user ID, obviously, so it can be linked to your VRChat account, because you're the one making the report. And then an email, and then obviously the reason for reporting, right? So essentially, if you want an avatar dealt with or a user dealt with, you have to utilize their ticketing system. You can't report inside of the app. If you do, your reports don't get taken seriously. There's no ticket number. There's nothing. VRChat will ignore the reports, and it's really bothersome. So I learned that I had to go off platform and go to their ticketing system. Anyway, most of the time I get an email saying a moderation action was dealt with. And I was like, great, the avatar is taken down, you know, and there was a penalty applied. That's what I thought at the start. But what I was doing when I was reporting them, I have like a VRC report list, because I've got VRChat+. It enables me to have five more groups that I can group these avatars into. And what I was doing is I was saving them under those VRC report lists, because I was thinking, right, I reported it today. In two days' time, it will be taken down. Then I can make sure it's actually taken down and dealt with, and I can check the list.
When I checked it inside of VRChat itself, it had the avatar thumbnail, and then it had like a little question mark on the corner. And I was like, great, that's taken down. But when I went on the website and I saw my list, what I realized was that question mark just meant the avatar is now private. It wasn't taken down. And my issue with that was you can easily make an avatar public again, if it's uploaded to your account, by going to the VRChat website. You just click on the selected avatar, and then there's a blue button on the right-hand side, and you can set it back to public. So essentially, the ones who are still playing VRChat with those offending avatars can re-publicize those avatars and make them public again, which would sort of undo the moderation action. Not only that, but it also means that no one has to abide by VRChat's terms of service. And that really, again, it really does bother me, because the one thing that I can do, like I said, is make these avatars harder to get. Because a child will never have an avatar uploaded to their account. They won't have their own avatar. They will use other people's models, because they don't have the money themselves to hire someone to upload it to their account and have their own model, you know. Because an upload is something between 10 to 20 dollars, and then you've got the avatar prefab price, which is normally 30 to 40 dollars, so you're spending 60 to 70, you know, just for the whole thing, which makes it less enticing for a child. So that's why they rely on these public models. And these models being so prevalent within avatar search worlds is a pretty big red flag, you know?
[00:22:38.817] Kent Bye: Yeah, so it sounds like you were discovering these. You sent me over a drive of a lot of these reports that you were making. Yes. So I've seen the evidence of the stuff that you're reporting. You also sent me a number of different articles, both public and private, in the sense that some were issues that were submitted to VRChat's ticketing system. And so on April 29th, 2025, you wrote an article on Medium titled The Dark Reality of VRChat: How Public Sexual Avatars Are Slipping Through the Cracks. Then on May 1st, 2025, and this is from a Google cache, because when I actually try to go to ask.vrchat.com, it's the VRChat Canny, there's a post that's called Ongoing Silence from VRChat Leadership. When I click through, it actually says, oops, that page doesn't exist or it's private. And so either that's standard practice, or with these different types of reports, they don't... It's not standard practice. Okay, let me just finish the other two and then we'll pass it back over to you. So you have the Ongoing Silence from VRChat Leadership post submitted to the Canny that is now no longer online, but it's still in Google's cache; I could see evidence of it. Then you have VRChat's Dangerous Oversight: A Breeding Ground for Public NSFW Avatars, which was published on May 4th, 2025. That was a Hacker Noon article. And then another Canny post that you posted on May 5th, 2025, that says 80 Plus Avatars Reported with Evidence, Zero Action from Leadership. Again, that's not resolving, because that page doesn't exist or it's private. So it was deleted. Then you went to Reddit, to r/virtualreality, back on May 10th, 2025, in a post called VRChat's Complete Failure in Avatar Moderation: Over 300 Terms of Service-Violating Avatars Reported, Zero Action Taken. And so it sounds like you were reporting things, maybe you were doing things privately, and then discovered this and then decided to go public with all these articles, and then even then continued to submit different tickets that we can see evidence of within Google but that have since been deleted. So maybe go back to when you started. The first post that you posted here was on April 29th, 2025. Talk a bit about what you were doing leading up to that, and then what you were seeing that made you decide to go ahead and continue to post these issues and these other articles and reach out to me to start to cover this more.
[00:24:59.111] Harry X: So, like I said, I was watching a few YouTube videos that sort of highlighted the dark side of VRChat, and this is how I learned about it in greater detail. But I initially started with trying to reach out to VRChat itself for a moderator. Now, there's a list of moderators, you know, they're commonly known. You've got Tupper, you've got Strands. Those are the two big ones that have really big airtime, and everyone knows who they are. But I was trying to get a hold of just one moderator, because obviously they're not the only moderators; VRChat has a bunch of different moderators. And I was trying to get a hold of the moderators via the ticketing system, because I knew that the tickets were looked at. So I filled out some of the fields, because I wanted to talk to a moderator about this issue and my findings. And I wanted to know what was going on, and I wanted to know if they were made aware of this issue. Right? Like I said, I'm paying for VRChat+, which is the cost of a Netflix subscription. And, you know, I don't get much for that, but I do acknowledge that I'm supporting their system. So I was like, right, I want to make sure that the platform is accessible and morally good going forward. You know, I don't want it to be a breeding ground for predators. I want it to be a nice, safe platform for us adults to meet new people. And I wanted to convey my feelings and my concerns to a moderator, but I had a couple of weeks where just no one would even respond. They would still be looking at my tickets, because they responded to my avatar takedown requests, you know, for those violating toggles, but they wouldn't respond to me just wanting to talk to a moderator. And I didn't even want, like, a face-to-face conversation. Just, you know, maybe some email correspondence, just someone to talk to from their staff where I could relay some concerns. And I was ghosted. So that was essentially... You know, I was like, right, okay, they're not going to listen to me. I'm not a big person. I'm not Phia, who has a virtual reality show. Love their show, by the way. Or Thrill, who, again, has a virtual reality show. I'm not these people with a big audience influence. You know, I'm just a user. Don't get me wrong, I'd love to have influence if I could, but as a standard user, I'm just noticing issues and not seeing them get addressed. So to be treated in this way, especially because, like I said, I was paying for this premium, and to be treated like I was just not even worth a correspondence... it made me so angry, especially because this isn't just something where, you know, I've been hard done by. This is a really core issue that, you know, touches federal law, goes against several different international laws, and raises online safety concerns. I was just wondering how something so big and prevalent can just be brushed under the carpet. So, of course, that's what led me to write the article on Medium and also Hacker Noon, which was another one. And I reached out to a few people. The other people that I reached out to were VRChat's managing team. So I managed to find direct emails for VRChat's team, and I sent them all an individual email disclosing what I saw and saying, you know, I'd like to have a chance to talk to someone about this issue, I've requested a moderator but I was ignored, and obviously how something like this would affect their position, you know, and federal law.
And I messaged six people, and no one responded. All of them ignored my email, which was, again, quite scary, because I'm like, well, why are they ghosting me? So again, that sort of made me confused. It's like, why are they not wanting to sort out this issue? Why do they keep brushing it under the carpet? So then, again, I was really concerned. And then, like I said, I discovered you, and I was like, oh, please, I'll message him. Hopefully I can get an interview and just share, because I'm not going to be able to do anything. I don't have enough time. I don't have any audience. I'm not really in this field, you know, I'm just some guy playing VRChat, you know, who likes to dance in VRChat. I like the MMD worlds. But like I said, I'm just a player. I'm not anyone of importance. I'm not in this field, but this has to be dealt with. This can't just be brushed under the carpet. So I discovered you, like I said, and I wanted to see if you were able to at least interview me and I could tell you what I found, because this is your field. I also, like I said, wrote the Hacker Noon article, and that didn't have as much traction as I'd hoped. Because with all of those articles, my hope was that they would have enough traction to reach, I don't know, a parent who maybe subscribes to those blogs and be like, oh, I have a child in VRChat, my child is in danger, maybe I shouldn't give them access to the headset. Because the issue I keep seeing is children are getting these standalone headsets because parents think that their headset is some kind of Xbox or gaming console, and it couldn't be further from the case. You know, there are many good games targeted for children. But there's no restriction stopping a child from downloading VRChat and creating a VRChat account. And like I said, it's an unrestricted environment, and it's unmoderated too. Obviously, you've got Facebook's Horizon Worlds, which is better moderated, and then you had AltspaceVR, where in every large public world, like the Campfire, there would always be a moderator there. So if there was a child in that world, you know, they would be allowed to hang out there, but if there was a predator trying to prey on that child, then, you know, a moderator would be dealing with it. VRChat doesn't have anything like this. So all they rely on is the ticketing system. The other thing that I discovered is the moderators hang out in their own groups. So they're not really going out into the actual platform and hanging out in these worlds, actually moderating. They're just sitting in their small groups, you know, so they're not actually dealing with people inside of the platform. They're only dealing with people by looking at their tickets via Zendesk. So, yeah.
[00:32:40.936] Kent Bye: Yeah, so it sounds like you were reporting these and weren't getting any response. And just as I was Googling for some of these Ongoing Silence from VRChat Leadership posts, I see at least three or four posts that I'm presuming are all from you. Going back three weeks ago, there's a post called 80 Plus Avatars Reported with Evidence, Zero Action from VRChat Moderation. And then another one a couple of days later saying Two Days, 70 Reports, Still No Action. What's Going On? And then on May 30th, 2025, Ongoing Silence from VRChat Leadership. And then May 1st, 2025, Ongoing Silence from VRChat Leadership. And then May 5th, 2025, Over 300 Avatars Reported, VRChat Moderation Still Silent. I'm assuming these are all from you, and they're all the same message. When I click on them and log into the Canny, you have to go through and get authenticated through your VRChat account, and then once you get there, it says, oops, that page doesn't exist or is private. And so you'd mentioned that these aren't typically made private or deleted. And so have you been able to document that, or maybe just elaborate on what the standard practice usually is? Because it seems as though the evidence of this is being deleted, but yet I could still see it on Google.
[00:33:55.154] Harry X: Yeah, so I never made those private. I would never make them private, because that's within VRChat's Ask forum. I want people to see that. I want people to be like, oh, it's an issue, you know, maybe we need to rectify this. Right? And this is internal; only VRChat members see this. So again, this wasn't for publicity or anything like that. I was just trying to get it out there internally for them to say, please, can you just respond to my email or, you know, do something. Yesterday, my forum account got suspended for the ask.vrchat forum, because, again, I was pointing out issues with some of the moderation, and they were like, nope, not having that, and they suspended my account. So I can't log back into it, but I can still access the forum. But yeah, I can't log in anymore. It just won't let me.
[00:34:54.269] Kent Bye: Okay. So you had mentioned that what they told you was something like, it looks like you're an AI bot that was spamming their reports.
[00:35:02.111] Harry X: That was the first one. The second time, they permanently suspended me. While the page was still active, I managed to capture a message that was saying: user is suspended, reason: spam reporting other user and general hostility reported. So I managed to get that one. I'll share that in the chat. So that happened yesterday. I also uploaded it to the Ice Drive folder. And that's what will happen if someone finds my account on the ask.vrchat forum.
[00:35:37.648] Kent Bye: Okay, so this user is suspended, reasons: spam reporting, other user and general hostility. So the generous explanation is that there is some AI threshold that, you know, once you exceed it, automatically bans you algorithmically and then takes all of your posts down. The more malicious interpretation is that they're deliberately deleting it and trying to hide the evidence that you're reporting. The answer might be somewhere in the middle: some sort of weird automated AI algorithm that is being triggered because you are submitting all of these, you're saying like 80 to 300 different reports, but you're just trying to document it through their system, and you're getting flagged and then being reported for spam. That may be an automated system, or it could be a malicious decision. Just a quick clarification: it sounds like you were using large language models to some extent, like for AI reports. What were you using AI for, and why was it a problem? Was it hallucinating? Was it including fake information? Or maybe just describe what you were doing with the AI and what the input and output was.
[00:36:42.404] Harry X: I was just using it to reword my issues and concerns to make those posts, in terms of bringing to people's attention that this is the issue, and that these avatars violate the terms of service and there isn't proper moderation for it. And then when I would have a response on that thread from someone, I would then use ChatGPT to respond to the person, to convey a message, you know, saying, this is incorrect because of this, you know, just to reword what I would say so it came across in a more professional light and it wasn't too angry. You know, it was professional. So just for that reason.
[00:37:28.089] Kent Bye: Okay. Yeah. And you know, for me personally, I'm not a huge fan of using AI for communications like that, just because I like to keep it in my own voice. And so I can understand if, stylistically, they don't like it. But I think the issue here is more along the lines of: the context and the content of what you're reporting is actually independent of whatever AI communication helpers you may be using to add elaboration or make it longer or whatever the AI was doing. There's still an issue there that goes above and beyond that, regarding these violating avatars that you're addressing.
[00:38:02.580] Harry X: That's correct. Yeah.
[00:38:05.182] Kent Bye: Okay. Okay. Well, I typically do a lot of oral history, which is covering people's stories and what's happening. And whenever there's conflict, it starts to push me a little bit more into journalism. And so I'll likely need to check in with VRChat, see if they have any comments, maybe even bring them on to have a more in-depth conversation about these issues and release these at the same time. But what would you like to see happen? Like, should there be a system that does, like, a checksum to check all the uploaded avatars so that they could ban them? And then, if they are banned from being public, people could always make a slight change and re-upload it, and it would change the checksum. And so it could be yet another type of cat-and-mouse game. But it sounds like even if they are just making them private, then people can remake them public, and the issue is still there, especially if people already have these avatars and they could share them with others. And so what do you think the best solution is? What would you like to see?
[00:39:01.756] Harry X: I mean, ideally, what I would like to see is a file checking system. So when you upload an avatar, you have to download their plugin app called VRChat Creator Companion. And what this does is it communicates between Unity and VRChat's servers to upload the avatar to your account. So from their own app, VRChat Creator Companion, I would like them to implement a file checking system that scans the files and looks for things like DPS, SPS, TPS, PCS, and the Lollipop system, because it's those files with those file names that can then get flagged to VRChat's server, and then they can say, well, we can't upload this avatar as a public avatar because it has this system on it, because there's always a prefab; it's never its own system. So I would like to see that, and that would also help really minimize the prevalence of these avatars in public worlds and public searching worlds. The other thing I would like to see moving forward is around the age requirement, because VRChat's age requirement for its users is only 13 plus. And we have to remember that the legal age of, let's say, intercourse between consenting adults in the UK is, you know, 16, and it's 18 everywhere else, right? And I think that's how we have to approach it, because those people are going to immediately see that kind of content and be put into an environment where, you know, they're going to be faced with certain people coming after them because of their gender, trying to get with them, XYZ. So ideally, I would like to see VRChat's age requirement being 18 plus, because it really is. You know, you are going and talking to strangers, and you are going to have stuff happen to you. So you need to be, you know, at least mostly matured so you can be able to handle all of this and deal with it. So I would like to see the age requirement go up. The other thing that I want to touch on, it was actually Dara Fogg who pointed this out when I was watching one of the videos. They said, when you look at the branding of VRChat, be it how they choose to promote VRChat, it's all bright, vibrant colors. And we know bright, vibrant colors are aimed more at children, because they see it as exciting. When you look at their initial advertisement, if you go on YouTube and type in VRChat trailer, you will see VRChat advertised as some big playground. All of the avatars are cartoony. All of the avatars are bright, vibrant colors. All they're showing is how much fun you can have in VRChat, and they're not showing the worlds that are PC only. They're not showing how realistic avatars can get, because children don't want that. They want something, you know, bright, vibrant, and anime-like that they can have fun with, right? Because, again, with us adults, we want to see realism, but the children don't want to see that, right? They want a big playground. So essentially, I would like them to make all of their advertisements less child friendly and more targeted towards, you know, adults who want to see that realism, because then it, again, removes that appeal, or at least makes it a lot less persuasive to children. I don't want to see children on here. It's not safe to be a child on here. There are a lot of mind games going on. There are a lot of cults on VRChat that no one talks about. There is paedophilia that nobody talks about. There are strip clubs that you could go into, and you can see two adults grinding on each other on the stage with a huge audience. There's so much. Even the groups are heavily sexualized.
Everything from kinks to what you can do. There are groups that literally organize orgies to have sex with anyone in that instance. You know, it's just crazy. And you have to be an adult. You need to be an adult if you're going to be immersed in somewhere like that. There needs to be a firm 18 plus, you know, because essentially, even if a child doesn't go there, they're still going to see it. That's the issue. VRChat did take a bit of a step in the right direction where they introduced something called age verification. And what that meant is you had to upload a picture of yourself and your ID so VRChat could verify that you were 18 plus. And if you were 18 plus, you would get this tag on your profile that you are verified 18 plus. So I have that. But the issue with that is they gated it behind a paywall. You have to pay to do that. And then it's like, well, why bother? Because the other issue with the age verification is it doesn't do anything. If I'm not age verified, I can still wear those avatars. If I'm not age verified, I can still go into those strip clubs. Age verification has no barrier at the moment. It's simply a tag. There is nothing implemented on it. And the other thing I would like to see going forward is, if there are going to be kids on the platform, then I want a firm requirement where the child, or the person, must enter their age and date of birth, and then, you know, they must do some kind of age verification with their ID. So then what they do is they have different servers: one server that will only put them in communication with other kids, and then they'll have another server that will only put us adults in communication with other adults, with no overlap. That's what I want to see going forward. If the kids are going to be there, I'm not happy about it, but if you're going to put them in there, there has to be a way that stops me from talking to a child, that stops a predator from talking to a child, because that should never be the case. You know, if they're not my child, if I don't know them outside of VRChat, I should not be talking with them, period, right? This is why I'm confused why stuff like this hasn't been done before. This idea isn't outlandish. It's just that they have millions coming in from their investors, and they just choose to update the app in terms of adding certain features like, you know, a camera dolly and more camera add-ons, rather than addressing issues that can actually make the platform safer to be in.
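To illustrate the kind of pre-upload file check Harry is proposing, here is a minimal sketch in Python. It is an assumption-laden illustration, not VRChat's actual pipeline: the flagged marker names, file extensions, and project layout are all hypothetical, and a real scanner would need something more precise than substring matching to avoid false positives.

```python
# Illustrative sketch only: scan a Unity avatar project for asset files whose
# names or contents mention known NSFW toggle systems before allowing a
# "public" upload. The marker list and file types are assumptions for this
# example; VRChat's real Creator Companion upload flow is not public.
from pathlib import Path

FLAGGED_MARKERS = {"sps", "dps", "tps", "pcs", "lollipop"}  # hypothetical markers
SCANNED_SUFFIXES = {".prefab", ".asset", ".controller", ".cs"}

def find_flagged_assets(project_root: str) -> list[Path]:
    """Return asset files that appear to reference a flagged toggle system."""
    hits = []
    for path in Path(project_root).rglob("*"):
        if path.suffix.lower() not in SCANNED_SUFFIXES:
            continue
        # Check the file name first, then fall back to a crude content scan.
        if any(marker in path.stem.lower() for marker in FLAGGED_MARKERS):
            hits.append(path)
            continue
        try:
            text = path.read_text(errors="ignore").lower()
        except OSError:
            continue
        if any(marker in text for marker in FLAGGED_MARKERS):
            hits.append(path)
    return hits

if __name__ == "__main__":
    flagged = find_flagged_assets("./MyAvatarProject")  # hypothetical project path
    if flagged:
        print("Public upload would be blocked; flagged assets:")
        for p in flagged:
            print(" -", p)
    else:
        print("No flagged assets found; public upload would be allowed.")
```

As Kent notes above with the checksum idea, a name-based check like this is easy to evade by renaming files, so in practice it would raise the bar rather than solve the problem outright.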
[00:46:22.642] Kent Bye: Yeah, VRChat did have layoffs last year. And so I think they're in the mode of... 30% layoffs. Yeah, 30% of the staff. So I think they're at the same time trying to scramble to come up with a good monetization and their long-term strategies of figuring out what the features are that are going to make them profitable and viable in the future. So I imagine that's what they've been focusing on, especially if they're shorthanded. But all these issues are important. And what usually drives companies to actually implement these is regulation that forces their hand so that they don't have any choice. And so we're in this kind of Wild West period where these platforms can kind of do whatever they want, and it's still small enough relative to other platforms that it hasn't really caught the attention of regulators, or isn't on their radar enough, to pass specific laws. So one quick follow-on: you had mentioned earlier, from some of the research that you had done, some of the mechanics of the grooming and predator behaviors, this idea that there would be an initial connection within VRChat, and then it would switch context over to Discord. By that point, it's kind of off the platform and kind of beyond the scope of what VRChat can even do. Was that from what other videos have been covering, or is that something that just came up in your own research? Or maybe just elaborate on how you came across that information.
[00:47:38.621] Harry X: No. When you type in VRChat on YouTube, you'll get one of two videos. You'll get trolling, which obviously is going to be everywhere, and then you'll get, I caught this predator. There are a few channels that focus on catching VRChat predators, but the common theme is always that the predator finds the victim in VRChat, and then they disclose and discuss and send messages on Discord, and then they go back to VRChat to do the ERP, and then they go back to Discord to keep exchanging messages. So again, VRChat hosted the environment that got the predator in the same environment as the victim. It facilitated it. It's always the same theme. But again, VRChat isn't liable, because the evidence is only on Discord, because obviously you've got the written communication and you've got the photo exchange there. The victim will never just sit there and record evidence of paedophilia inside of VRChat, because they're a child; they're not going to think, oh, this is a paedophile, let me continue to engage with this paedophile and record the whole thing. It never works like that. Because with VRChat, they rely on an evidence system. So let's say, for me, if someone's being really rude to me and they're saying slurs and stuff, in order for me to write a report for them to be dealt with, I need to basically record that interaction, like via recording my screen, which is ridiculous, because I'm not going to think of that when they're being really rude to me; I'm just going to be engaged in the conversation. But that's the issue that I have. And because these ERP interactions happen in private worlds, there isn't really a way for a moderator to get in or force their way in, you know. So again, there's a lot happening in private instances of worlds. And like I said, the children aren't going to be like, oh, let me record this interaction, because they're not going to know what's happening. They're not going to understand how bad it truly is. And obviously, there was a case that was made; I think I might send you a YouTube video. Someone basically made an hour-and-a-half-long video documenting how someone's child was groomed, and that child ended up committing suicide, and the paedophile got away with it because they contacted the child first. They found the child on VRChat, and legally, VRChat is considered this grey area, right? So the issue was, the paedophile got away with it because all the actual evidence was on Discord, and none of the physical evidence of, like I said, the grooming itself was recorded in VRChat. So even if VRChat was responsible for hosting that sort of thing, they got away with it. And that's not the only case. You know, there are dozens and dozens and dozens of other cases, and VRChat gets away with it, because, like I said, the evidence is on Discord. So it makes Discord responsible and not VRChat responsible.
[00:51:14.011] Kent Bye: Yeah. Rather than content moderation, we're in a phase of needing to have, like, behavior and conduct moderation, where it's in real time, and it's a much different problem where, you know, either you're going to just start recording everything and then potentially have other privacy implications there, or, you know, there are some real-time systems that can start to detect different keywords, in terms of trolling types of behaviors, that have been experimented with on different social VR platforms. And I don't know to what extent they're being deployed. But yeah, there's this kind of problem where any sort of one-on-one interaction could have a potential terms of service violation, and unless you're recording it, then that's just not going to be documented. And so it's a much broader issue with VR and social VR: how to create these safe online spaces in a way that makes it easier for the capturing and reporting of these violations, and also the enforcement of the terms of service. The terms of service can be there, but if it's not being enforced, then it's not really setting up those boundaries. There's a difference between a terms of service and a code of conduct: a code of conduct is basically, like, the cultural behaviors, and if the terms of service isn't enforced, then it functionally gets turned into more of a code of conduct that, again, is contingent on whoever happens to be witnessing it in the moment. So anyway, there are some broader issues there that are not unique to VRChat, but that I think are coming to a head, because VRChat is certainly one of the biggest social VR platforms out there. And so it's really important to at least start to document some of these gaps in the enforcement of their existing terms of service and code of conduct. And finally, what do you think the ultimate potential of virtual reality might be, and what might it be able to enable?
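For reference, here is a rough sketch of the keyword-based flagging approach Kent alludes to, where snippets of in-world chat would be checked against a watchlist and queued for human review rather than punished automatically. The watchlist terms, data structures, and review flow are illustrative assumptions, not any platform's actual system.

```python
# Illustrative sketch only: flag transcribed chat snippets that match a
# moderator-maintained watchlist and queue them for human review.
# The terms and workflow are assumptions for this example.
from dataclasses import dataclass
from typing import Optional

WATCHLIST = {"example-slur", "example-grooming-phrase"}  # placeholder terms

@dataclass
class Flag:
    user_id: str
    snippet: str
    matched: list

def review_snippet(user_id: str, snippet: str) -> Optional[Flag]:
    """Return a Flag for the moderation queue if the snippet matches watchlist terms."""
    tokens = snippet.lower().split()
    matched = [t for t in tokens if t in WATCHLIST]
    return Flag(user_id, snippet, matched) if matched else None

# Example usage: a flag goes to a human moderator instead of triggering automatic action.
flag = review_snippet("usr_123", "a transcribed sentence containing an example-grooming-phrase")
if flag:
    print(f"Queued for human review: {flag.user_id} matched {flag.matched}")
```

This kind of system runs straight into the privacy trade-off Kent raises: it only works if some portion of in-world speech is transcribed and retained, at least transiently.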
[00:52:59.631] Harry X: Oh, I love it. I mean, like I said, I think it's amazing. And if VRChat can be moderated properly, these mediums would allow the person to discover who they are virtually and also help them discover who they are outside of VR. So for me, before I started my VR journey, I was really closeted, right? And since doing VR, talking to random people, seeing how just okay it is to be gay, it's like, wow. And it's really got me out of my shell. VR has so many great factors. And like I said, if it wasn't for VR, I wouldn't be living in Canada right now with my partner. You know, it had such a big impact. Everything from discovering who you are to, you know, finding groups of people, communication, making friends with people you would never encounter in the daily world. There are just so many things you can do with it. You've also got the knowledge sharing. I mean, when I first started my VR journey, I didn't know a thing about anything. I was just like, oh, it's VR, you know. But I learned so much through it. And now, you know, I've built worlds, I've made avatars, you know, stuff like that. And that's how I can be proficient in this field and understand, like, oh, this is an issue. So the potential is massive. And I expect VR to keep going forward and become more mainstream. And it's become more mainstream because of Facebook, you know, with their Meta Quest 2 and then their Meta Quest 3 and the 3S. And hopefully, it won't be this year, it'll be next year, 2026, with the four, the Meta Quest 4. It's going to keep getting popular, and it's going to keep getting popular because people want to go out of their comfort zone. They want to explore who they are. And that's what VR can facilitate. That's what it can allow you to do. And not only that, I mean, there are a lot of disabled people who have, you know, let's say, physical disabilities. So, you know, to be able to have no disabilities in VR could really open it up for more people. So, like I said, I hope and expect it to, you know, keep going on the rise. And VRChat isn't the only metaverse, but it's certainly one of the most popular metaverses. So, you know, with that rising... so yeah, you know.
[00:55:23.456] Kent Bye: Yeah. Nice. Very cool. And yeah, there are certainly a lot of things happening in the VRChat culture and community, and with over 2,700 hours in there, I can certainly tell that you've seen quite a lot of the potentials. Oh, yeah.
[00:55:41.555] Harry X: That's just VRChat. That's ignoring my No Man's Sky or my Neos or Sansar. That's just VRChat in itself. I use VR more than I'd like to admit. I think I have a problem. But no, like I said, it allows me to be more free and talk to people that I never would talk to or never would have the opportunity to talk to because of country locks or being region bound. Right. Yeah.
[00:56:10.963] Kent Bye: Great. Is there anything else that's left unsaid that you'd like to say to the broader immersive or VRChat community?
[00:56:21.894] Harry X: I just hope that they're going to take a step in the right direction and VRChat can be a much safer place going forward. So for VRChat, yes. And to the wider VR industry: please give us more pancake lenses on headsets. I'm so fed up of aspheric lenses or Fresnel lenses. Give us more pancake lenses. But that's hardware, sorry. But yeah, going forward with VRChat, I would love to see a safer environment. Like I said, I just want to go online, you know, and be free, have fun, and for it to be safe, not just for me, but for everyone, how it should be, you know?
[00:57:07.156] Kent Bye: Awesome. Well, Harry X, thanks so much for doing all this research and reaching out to share some of the challenges you've been having in trying to report all of this to the moderators, with the goal of trying to make VRChat a safe online place. And it sounds like there are some more systemic infrastructure issues that also need to be addressed at some point in order to deal with this. I think with things like this, you're never going to be able to have a complete technological fix; it's going to have to be a mix of the culture and the technology and the economic incentives, and maybe even laws that are passed that force the hand of VRChat to implement specific architectures or protections. Yeah, I think this is probably one of the areas where, as things move forward, I could see that there might be legislation enforcing specific architectures. Or, just like, you know, there are rules that technology companies can't collect data and information on kids who are less than 13 years old, and so there are these COPPA protection laws that are trying to protect their privacy. And, you know, when you have these shared virtual spaces, then what are the long-term implications of going one step beyond just COPPA compliance? Are there other types of architectures that need to be considered? Which is something that you were proposing may need to be implemented. I could see connecting to friends and family as a context, but then in public instances, maybe I could see exceptions to that. But generally, I think it's something to consider as we move forward. Although, if you think of the internet, there's no tiered system of the internet, but that's only 2D. So yeah, it would be kind of a new precedent, and I'm a bit skeptical that you'd be able to actually pull it off and then enforce it. But I think it's one of those areas... What's that?
[00:58:55.945] Harry X: Well, they could just have something at least. Yeah. Yeah.
[00:59:01.146] Kent Bye: Yeah. Well, anyway, thanks again for joining me here on the podcast to help share what you've been finding and yeah, talk a little bit more about your journey as well. So thanks again for joining me here on the podcast.
[00:59:11.648] Harry X: Thank you very much.
[00:59:13.595] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
