#391: Social Dynamics within Sony’s PlayStation VR GDC Demo

At GDC this year, Sony revealed some of their first experiments in social VR with a four-person demo shown at their press event. The avatars looked like South Park characters, and thumbs-up or thumbs-down gestures made with the Move controllers would trigger matching emoji-like expressions on your avatar’s face that were visible to the other avatars in the experience.

I had a chance to catch up with Ellie of Sony’s Online Technology Group to learn more about their iterative design process around discovering a locomotion technique that was comfortable, some of the most surprising and joy-inducing social interactions that they discovered, cultivating a sense of shared spectacle, and creating objects that optimize communication between players. I also share some of my own memories and reflections on the experience, and there are some positive signs that Sony is both exploring and innovating in the social VR space.

LISTEN TO THE VOICES OF VR PODCAST

Here’s an audience member’s recording of the same social VR demo that I experienced at GDC. This was presented during a VRDC session:

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip



Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to The Voices of VR Podcast. On today's episode, I'm going to be talking about the Sony PlayStation VR social experience that premiered at GDC. So at the Sony press event, they had anywhere from 25 to 40 different VR experiences that you could try out, one of which was a station where you could go into this social VR experience with four other attendees. So it was a bit of a guided tour where you're walking around and doing these different social interactions, and it was actually really fun and engaging, and Sony was able to tweak a lot of little small details to make the engagement feel very expressive as well as engaging and surprising and fun. So I'll be talking with Ellie of the Sony Online Technology Group about the social VR experience and some of their lessons learned on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by Unity. Unity is a great way to get involved in virtual reality development, even if you don't want to become an expert on every dimension of creating a VR experience. The Unity Asset Store has a lot of different 3D models and scripts to get you started. For example, Technolust's Blair Renaud has won artistic achievement awards using a lot of the assets from the Unity Store: "I'm not actually doing a lot of modeling and art for the game. It's a lot of kit bashing, taking Unity assets, tearing them apart and putting them back together." Get started in helping make your VR dreams come true with Unity and the Unity Asset Store. So this interview with Ellie happened at an exclusive press event for the PlayStation VR during GDC on March 15th, 2016. So with that, let's go ahead and dive right in.

[00:02:02.414] Elli Shapiro: My name's Ellie, I work at the Online Technology Group at Sony, and we have built a social VR experience which is all about social and communication. So the way it works is there's a microphone in the headset, so there's very high quality voice chat going on, and then because the controllers and the headset are both tracked, there's an awful lot of body language that you can put through. And so that's where we had to start: very abstract avatars. Because you can get quite realistic body movement, to have unrealistic avatars would be worse than having completely abstract avatars. So we have these abstract cartoon avatars, and the body language comes across from the people. You find that if you know someone, then you can identify them from their body language.

[00:02:41.787] Kent Bye: Interesting. And it's pretty, you know, like, cartoonish in terms of, like, you have huge hands and you have your head, and your body and your feet aren't really being tracked, but just the hands and the head alone, I guess, is enough for you to determine who's who.

[00:02:53.772] Elli Shapiro: Absolutely. In some VR experiences you have a body that kind of hangs off the head, so you have a body that's suspended from the head, whereas here we have the feet attached to the floor and the body will bend according to the way you move your head, so it actually gives you more communication in the way you can pose. There's also a gesture system, so as well as the default palm, you can do a thumbs up and you can point. And that affects your face, so if you pull a thumbs up, you pull a happy face. If you do a thumbs down, you pull a sad face. There are multiple other expressions that you can pull. If you make fists, you become angry. So it's all about communication. Anything that we could do to communicate in there, we have done. All the toys in there are about communication. We've got ping pong, football. We've got communal block building. There's also the sense of shared spectacle, so in there we have four buttons that you can press. You need all four players to be in there pressing the buttons at the same time in order to release a monster, and then you have this sense of shared spectacle, so you all stare at the monster together, and that's something that you're experiencing together.
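
To make the hand-to-face mapping Elli describes a bit more concrete, here is a minimal sketch in Python. Only the pairings she mentions (thumbs up for happy, thumbs down for sad, fists for angry, and later a shrug for a "meh" face) come from the demo; the enum names, the lookup table, and the precedence between hands are assumptions for illustration.

```python
# Minimal sketch of a gesture-driven avatar expression system. Gesture and
# expression names are illustrative; only the pairings mentioned in the
# interview (thumbs up -> happy, thumbs down -> sad, fists -> angry,
# shrug -> meh) are taken from the demo itself.

from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    PALM = auto()        # default open hand
    THUMBS_UP = auto()
    THUMBS_DOWN = auto()
    POINT = auto()
    FIST = auto()
    SHRUG = auto()


class Expression(Enum):
    NEUTRAL = auto()
    HAPPY = auto()
    SAD = auto()
    ANGRY = auto()
    MEH = auto()


# Hand-to-face pairings described in the interview; anything else stays neutral.
GESTURE_EXPRESSIONS = {
    Gesture.THUMBS_UP: Expression.HAPPY,
    Gesture.THUMBS_DOWN: Expression.SAD,
    Gesture.FIST: Expression.ANGRY,
    Gesture.SHRUG: Expression.MEH,
}


@dataclass
class AvatarState:
    left_hand: Gesture = Gesture.PALM
    right_hand: Gesture = Gesture.PALM

    def face(self) -> Expression:
        """Return the expression for the current hand gestures.

        Checks the right hand first, then the left, and falls back to a
        neutral face when neither hand is doing a mapped gesture.
        """
        for gesture in (self.right_hand, self.left_hand):
            if gesture in GESTURE_EXPRESSIONS:
                return GESTURE_EXPRESSIONS[gesture]
        return Expression.NEUTRAL


# Example: a double thumbs up produces a happy face on the avatar.
avatar = AvatarState(left_hand=Gesture.THUMBS_UP, right_hand=Gesture.THUMBS_UP)
assert avatar.face() is Expression.HAPPY
```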

[00:03:55.326] Kent Bye: Yeah, one of the interesting things that you have in there is a mirror where you can start to learn how these different gestures work. And, you know, from a first person perspective, you can see your hands and you can see whether or not you're giving a thumbs up, thumbs down. And then you look in the mirror and you see this kind of amplification of your emotion by it being reflected in your face through kind of like these emoticon expressions on the face. And so it's interesting because you can't necessarily see what your face is doing, but other people can and you can see your own hands.

[00:04:24.745] Elli Shapiro: Absolutely. And we hope that that's something people will learn. And you find that people do, if you watch their faces in real life, they will mirror what they're trying to do in VR. So if they do the thumbs up, you'll see they do a massive grin. If they do the fists, you'll see them baring their teeth in real life, which I think is really amazing. The mirror also helps because some of the gestures are more subtle. So they're not just about the fact that it's a thumb, it's the fact that it's a thumbs up. You can do a shrug and you pull like a meh face. There's quite a few in there that aren't all demonstrated in the demo today.

[00:04:54.152] Kent Bye: Yeah, so this is kind of like an open world sandbox that we saw today with the social VR experience where you're kind of going around in it. You actually have a little bit of like a guided tour to have like this experience with these different types of physics interactions, whether you're kicking a ball across the area or you're throwing blocks at each other or building blocks or playing ping pong. But I'm curious, like where you see this going in terms of like, is this going to be an experience where it's going to be just kind of like this open world playground for people to get together with their friends and kind of just mess around in VR?

[00:05:23.812] Elli Shapiro: There are multiple options for the future. It doesn't have a definite future at the moment. It is very much a demo. It's to inform what is possible with communication in VR. But yeah, it's basically a place to hang out, to express yourself, to be with your friends, or also to meet people.

[00:05:38.076] Kent Bye: What were some of the big lessons that you've learned and have taken away from this tech demo that you've put together here for GDC?

[00:05:45.334] Elli Shapiro: That VR is amazing when it's social. There is so much that you can put across due to the tracked controllers, so much body language. Also, we've included locomotion, which a lot of VR things don't. And that's because that is a very social signal. Standing right up close to someone is very different from backing away. Also, the sense of abstraction. We found that even with very abstract figures, you can still get across an awful lot about a person.

[00:06:10.284] Kent Bye: Is there anything that you found that really didn't work that you had to take out?

[00:06:14.663] Elli Shapiro: We've iterated through at least 22 different movement schemes. The one that we've got in, we're happy with. There's probably more room for improvement. So the Move controller has these two amazing buttons, a big thumb button and the trigger, which are just so easy to find with your hand. You don't need to see your hands to use them. So we've been trying to put as much as we can on those buttons. So yeah, rejigging how that works to enable the gesture scheme and the movement scheme at the same time has been quite tricky. But after 22 attempts, we've got something we're quite happy with.
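
Elli doesn't say how they resolved the conflict between the gesture scheme and the movement scheme on the Move controller's two buttons, but as a purely hypothetical illustration of one way to multiplex them, the sketch below treats a short press as a gesture and a sustained hold as locomotion. The threshold, the callbacks, and the tap/hold split are all assumptions, not Sony's actual scheme.

```python
# Purely hypothetical sketch: multiplexing gestures and locomotion onto the
# Move controller's thumb button and trigger by distinguishing taps from holds.
# The threshold and the tap/hold split are illustrative, not Sony's scheme.

import time

HOLD_THRESHOLD_S = 0.25  # presses longer than this count as a "hold"


class ButtonRouter:
    def __init__(self, on_tap, on_hold_start, on_hold_end):
        self._on_tap = on_tap
        self._on_hold_start = on_hold_start
        self._on_hold_end = on_hold_end
        self._pressed_at = None
        self._holding = False

    def press(self, now=None):
        self._pressed_at = now if now is not None else time.monotonic()
        self._holding = False

    def update(self, now=None):
        """Call every frame; promotes a long press to a hold (locomotion)."""
        if self._pressed_at is None or self._holding:
            return
        now = now if now is not None else time.monotonic()
        if now - self._pressed_at >= HOLD_THRESHOLD_S:
            self._holding = True
            self._on_hold_start()

    def release(self, now=None):
        if self._pressed_at is None:
            return
        if self._holding:
            self._on_hold_end()
        else:
            self._on_tap()  # short press: treat as a gesture toggle
        self._pressed_at = None
        self._holding = False


# Example wiring: tap the trigger to point, hold it to move forward.
router = ButtonRouter(
    on_tap=lambda: print("gesture: point"),
    on_hold_start=lambda: print("locomotion: start moving"),
    on_hold_end=lambda: print("locomotion: stop moving"),
)
router.press(now=0.0)
router.update(now=0.3)   # exceeds threshold -> locomotion starts
router.release(now=0.5)  # -> locomotion stops
```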

[00:06:44.585] Kent Bye: Yeah, I was pretty animated in my experience of the social VR, and I found myself wanting to turn in real life rather than, you know, use the controller, and a lot of times I ended up kind of occluding my Move controllers because I just got so immersed. Moving and turning in VR is very natural and intuitive, but using the controllers doesn't necessarily feel as intuitive, and yet you kind of have to be facing forward in order to have the camera be able to see the controllers. So it feels like an interesting trade-off that you have to make in order to have the best tracking that you can but also allow people to have intuitive movements.

[00:07:19.480] Elli Shapiro: Yeah, that's been a very big concern with choosing the movement scheme. So the incremental snap rotate has been the most intuitive thing we could find to encourage people, particularly when they're moving around, to use the controller to orient themselves rather than orienting themselves in real space. It is important that they can turn a little, and if someone's standing next to you, you need to be able to turn to face them and then turn back the way you want to face. We're also experimenting with methods of incentivizing people to face forward and alerting them when they're not. We don't want people to have to take the headset off to work out where the camera is and to work out when they're turned around, so we're building things in to help them do that in-game. So in this demo today, if you're turned around more than about 90 degrees, then some arrows on the floor will appear to show you which way you should be facing. Also, your feet in-game will always be facing towards the camera, so you can always look down at them for a visual cue of which way you should be facing.
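
Here is a minimal sketch of the two facing aids Elli mentions: incremental snap rotation, and floor arrows that appear once the player has turned more than about 90 degrees away from the camera. The 30-degree snap step, the function names, and the angle bookkeeping are assumptions for illustration; only the roughly-90-degree threshold and the arrow cue come from the interview.

```python
# Minimal sketch of incremental snap rotation plus the "face the camera" cue:
# arrows appear once the player's body yaw drifts more than ~90 degrees away
# from the camera direction. Step size and thresholds are illustrative.

SNAP_STEP_DEG = 30.0       # assumed size of one snap-rotate increment
FACING_WARN_DEG = 90.0     # arrows appear beyond this offset (per the demo)


def wrap_angle(deg: float) -> float:
    """Wrap an angle into the range (-180, 180]."""
    deg = (deg + 180.0) % 360.0 - 180.0
    return 180.0 if deg == -180.0 else deg


def snap_rotate(yaw_deg: float, direction: int) -> float:
    """Rotate the avatar by one discrete step (direction is +1 or -1)."""
    return wrap_angle(yaw_deg + direction * SNAP_STEP_DEG)


def facing_arrows_visible(player_yaw_deg: float, camera_yaw_deg: float) -> bool:
    """True when the player is turned away from the camera by more than ~90 degrees."""
    offset = abs(wrap_angle(player_yaw_deg - camera_yaw_deg))
    return offset > FACING_WARN_DEG


# Example: after four right snaps from facing the camera, the player is 120
# degrees off-axis and the guidance arrows switch on.
yaw, camera = 0.0, 0.0
for _ in range(4):
    yaw = snap_rotate(yaw, +1)
print(yaw, facing_arrows_visible(yaw, camera))  # 120.0 True
```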

[00:08:07.866] Kent Bye: I missed that. So I had to have the minder come up to me two or three times and turn me around to face the right way, because I was just turning and dancing and all sorts of crazy stuff. But yeah, to me, I think social in VR is one of the most compelling use cases. And I guess, are there existing social friend network integrations that you foresee being integrated into the Sony PlayStation VR experiences?

[00:08:37.585] Elli Shapiro: Well on the platform level at the moment you have the PlayStation Party, the voice chat, so foreseeably there could be some VR application of that. That would be something that would be nice.

[00:08:49.236] Kent Bye: So what are some of your either favorite memories or stories of having social experiences in VR?

[00:08:55.204] Elli Shapiro: So in the demo we've got today, there's a space hopper. So you hold the space hopper and you wave it by your feet and you go flying into the air. So that actually came out of a bug. When we first introduced the beach balls, which are slightly less interesting, we had a physics collision issue with your feet. So if you were holding this beach ball and accidentally waved it at your feet, it would accelerate you off in some random direction and spin you all over the place. And it was really fun. It was a bit disorientating and quite unpredictable, but it was super fun. And we left it a long time before we fixed that bug because everyone just loved it. And then the Space Hopper is the toned down, much more predictable version of that.
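
The beach-ball bug and the toned-down space hopper that Elli describes can be pictured with a small sketch like this: waving a held object within range of your own feet applies a launch impulse to the player. The distances, speeds, and the random-direction model of the original bug are all assumptions for illustration; only the behavior itself (the bug was unpredictable, the hopper launches you predictably into the air) comes from the interview.

```python
# Sketch of the space hopper launch described in the interview: waving a held
# object close to your own feet applies an impulse to the player. The original
# beach-ball bug launched you in a random direction; the shipped space hopper
# is the toned-down, predictable (straight up) version. All numbers here are
# illustrative.

import math
import random

FEET_RADIUS = 0.4          # how close the held object must pass to the feet (m)
HOPPER_IMPULSE = 6.0       # predictable upward launch speed (m/s)
BUG_IMPULSE_MAX = 12.0     # the old bug's unpredictable launch speed (m/s)


def hopper_launch_velocity(held_obj_pos, feet_pos, buggy=False):
    """Return a launch velocity if the held object is waved at the feet.

    With buggy=True this mimics the original beach-ball collision bug
    (random direction and magnitude); otherwise it is the tuned space
    hopper: a fixed, purely vertical impulse.
    """
    if math.dist(held_obj_pos, feet_pos) > FEET_RADIUS:
        return (0.0, 0.0, 0.0)  # no launch
    if buggy:
        speed = random.uniform(0.0, BUG_IMPULSE_MAX)
        yaw = random.uniform(0.0, 2.0 * math.pi)
        pitch = random.uniform(0.0, math.pi / 2.0)
        return (speed * math.cos(pitch) * math.cos(yaw),
                speed * math.sin(pitch),
                speed * math.cos(pitch) * math.sin(yaw))
    return (0.0, HOPPER_IMPULSE, 0.0)


# Example: the hopper passes within range of the feet and the player is
# launched straight up at a fixed speed.
print(hopper_launch_velocity((0.1, 0.1, 0.0), (0.0, 0.0, 0.0)))  # (0.0, 6.0, 0.0)
```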

[00:09:31.681] Kent Bye: Yeah, I've noticed in talking to different people who develop social VR, they've often said that sometimes the bugs are the most compelling things, because they end up being things that you could never see in reality.

[00:09:41.348] Elli Shapiro: Yeah, absolutely. I mean, everything you can think of, you should just try. And sometimes things you can't think of happen, and they're the best things.

[00:09:50.053] Kent Bye: What's your favorite thing to do in social VR?

[00:09:52.895] Elli Shapiro: It also involves the space hoppers. So something that probably isn't shown in the demo here is that you can use the space hopper on someone else. So you can creep up behind someone, wave the space hopper at their feet, and then send them flying.

[00:10:05.268] Kent Bye: I think someone may have accidentally done that to me, because I hadn't done it and all of a sudden I was shooting through the air, and I just had to shut my eyes because I get a little motion sick from that. What type of sensitivities have you found with people? Is this a comfortable enough experience for people, or do you find that some people are feeling a little motion sickness?

[00:10:24.163] Elli Shapiro: On the whole, it's been pretty fine, pretty well received. The space hopper and the trampoline, we call them the super user features. So at the moment, with the kind of curated demo, we'll warn people. And if they seem a bit sensitive, then we recommend they don't use the space hopper.

[00:10:39.028] Kent Bye: So what do you want to experience in VR then?

[00:10:41.429] Elli Shapiro: I mean, shared spaces are amazing. And being in a social place, working with communication is brilliant. And I think a lot of the team would agree that the ideal thing to build would be the metaverse.

[00:10:53.929] Kent Bye: And finally, what do you see as the ultimate potential of virtual reality and what it might be able to enable?

[00:11:00.769] Elli Shapiro: It has a lot of applications outside of gaming as well. I've seen some amazing experiments inducing empathy in people, which is really brilliant. There's that experiment where two people wore headsets with cameras that showed them what it was like from someone else's viewpoint and that induced a lot of empathy with that other person. I think there's a lot of really interesting kind of outside the box, not necessarily gaming related things, a lot of kind of health related, mental health related things that will also be really super awesome.

[00:11:26.382] Kent Bye: Great. Well, thank you so much.

[00:11:27.843] Elli Shapiro: Thank you.

[00:11:29.425] Kent Bye: So that was Ellie of Sony's Online Technology Group, and she was a part of the team that worked on the social VR experience that premiered at GDC this year. So a number of different takeaways from this interview are that, first of all, I think that social VR is going to be one of the most compelling use cases that is going to be driving people to engage with VR. And I think that Sony had a lot of really interesting small but subtle things that they were doing in this VR experience that made it a little bit more fun than some of the other social VR experiences that I've had. So in social VR, you have one of a couple of options, one of which is that you basically prevent people from moving around. On the one hand, that will prevent any sort of motion sickness that may result for some subsection of people, but it also kind of creates this closed container where it's really good to have intimate conversations. This is a little different because it's a little bit more of an open world. And they actually did a really good job of doing things to minimize motion sickness in terms of having not really complicated textures, so there wasn't a lot of extra optical flow, and like Ellie said, they had iterated through at least 22 different locomotion schemes, and the one that they came up with was actually pretty comfortable, and I'm pretty sensitive when it comes to VR locomotion. I'll be airing an interview that I did with Jason Jerald at the IEEE VR this year where he really goes into a lot of the different motion sickness triggers and the research into that. But suffice to say that Sony did a really good job with this social VR prototype.

But the other thing that this open world approach gives you is this sense of exploration and having a bit of a shared activity with complete strangers. There are certain moments that really stand out for me, one of which is that we were standing around this table and the tour guide from Sony was kind of trying to describe things, and I just like to really try to disrupt the social dynamics just to see what people's reactions are. I mean, you could kind of call it trolling a little bit, but I was just trying to have fun with the situation. So I picked up a ball and I just threw it right at the head of the speaker, and he took out his hand and he just caught it. And it was such a surprising thing because it was one of those delightful things where I didn't know what was going to happen. I was just testing the ball to see if it was going to bounce off of him or how he would react. But he actually just had lightning quick reflexes where he flipped up his hand and caught the ball, and everybody kind of broke up in laughter because we weren't expecting that was even possible. So I think it kind of gets to this principle in social VR where if you can find these little things that are surprising and delightful, it actually makes a huge difference in the experience. And having these little trampoline and exploding mushroom things that shoot you up in the air, it's kind of a unique feeling that you've never actually had in real life. And it starts to get at some of the strengths of VR to be able to give you some of these fantastical experiences. But they also had this other really simple dynamic, which is you're essentially picking up paintballs and throwing them at people, and they're kind of splattering all over the place.
Now, again, in real life you could do that, but most people don't because it's just annoying when people are throwing paintballs at your face, but in VR you could totally do it and it doesn't really matter all that much. So it was a bit of a shared safe environment for people to kind of play around with this. I imagine that in other contexts in social VR, it wouldn't necessarily be appropriate to be throwing paintballs in each other's faces. It could feel like trolling behavior. But in this context, I think it really worked and it was just a lot of fun. The other thing that I want to mention is that they had this kind of interesting combination of being able to do hand gestures, and your hand gestures were able to express that similar type of emotion on your face as kind of like this emoticon. Now, you can't really see what your face is doing, so it's less of a feedback loop in that way, and it's more about how you're relating to other people. They did have a mirror where you could actually look at yourself, and I thought it was a really interesting comment when Ellie said that what they notice is that people start to mirror what they're doing in VR. So for example, if they start to make the avatar look really angry, then Ellie's basically saying what they found is that people tend to have this embodied cognition where they start to identify with and mirror that similar emotion that they're expressing. And so even though you may not always be aware of the emotion that you're putting out, sometimes other people's reactions to you can help amplify whatever your hand gestures may be trying to express.

So in the long run, I think we're going to get to the point where we can actually track a lot more of this stuff in terms of facial expressions and eye tracking and eye gaze and all this stuff. But at this point, they really need to add a lot of different technological things, in terms of stress sensors within the virtual reality HMD in order to start to measure some of the muscle tension, and also just have cameras that are a little bit more sophisticated in terms of being able to record the occluded part of your lower face and extrapolate the rest of your face's expressions, as well as kind of doing a sensor fusion in terms of whatever an internal camera may be able to detect about what kind of facial expressions and emotions you're exhibiting. So I did an interview with Hao Li, who's an assistant professor at USC, back in episode 234, and it's worth taking a listen to if you're interested in the future of facial tracking within VR. I think he's kind of one of the leading researchers in that field, and he's actually been working with Oculus in terms of trying to integrate some of this latest technology. But the comment that he made to me is that you're really going to want some sort of both facial expression tracking as well as eye tracking to really take those social interactions to the next level. That said, you can actually get pretty far with just hand tracking and being able to track the location of the head, and add some specific art styles and maybe some other saccades and other body language cues to kind of amplify some of these social cues that we expect and make it a little bit less uncanny. But this cartoon style was kind of like in the style of South Park, and it actually worked pretty well.
And you know, the other thing that I just wanted to mention is that at this press conference, they were broadcasting these field trips with four different people at the same time, up on a large screen, and there was a surprising number of people that were just watching it. And so, you know, a lot of times people may want to watch somebody playing VR, but sometimes it's really not all that interesting, especially if it's a first-person perspective and they're kind of having their own experience. But whenever you start to have many different people within VR and they're kind of playing with each other and throwing stuff in each other's faces and, you know, shooting them up in the air, you kind of have this thing where you are actually more interested in watching it, because you know that there's people behind those avatars, that they kind of represent some part of our human nature that is being expressed in these virtual worlds. And it just makes it a lot more interesting for us to watch. If we knew that they were just artificially intelligent robots, then maybe we'd watch it for a little bit, but it wouldn't be nearly as compelling as knowing that they're actual real humans, because there's this element of unpredictability and surprise. And that type of novelty makes it fun to actually be in the VR experience, but it also makes it pretty interesting for people to watch. There's a clip from one of the VRDC sessions that happened at GDC where they had some of the PlayStation VR developers up on stage talking about this experience and kind of running through the demo. And you can kind of hear the audience reaction, just kind of laughing as they're giving high fives and doing different silly things within the experience. So it's worth checking that out if you want to get a little bit more context as to what this specific experience was like. But overall, I'm just happy to see that Sony is kind of moving in this direction, at least creating some of these demo prototypes to see what social dynamics and interactions can be like. There are no specific plans as to whether or not this is going to be a self-contained type of experience, but I would be very surprised if, of all the 50 different PlayStation VR experiences that are supposed to come out by the end of the year, there aren't any social experiences that are kind of like this in some way.

So if you are interested in sharing a social VR experience with me, as well as other fellow Voices of VR listeners, then I'm actually going to be having a virtual meetup within Altspace this Thursday, June 30th, at 7 p.m. Pacific time. We're actually going to be talking about machine learning in VR. I've been doing a little bit of a deep dive into AI and machine learning this past week, and I'll just be kind of sharing some thoughts. And I'm actually really curious to hear what other people are doing or thinking about this cross-section between creative AI, machine learning, and virtual reality. This week, I also happened to start up a Voices of AI Twitter. So if you're interested in learning more about machine learning and AI, go check out Voices of AI on Twitter. And I hope to perhaps start a new podcast if I get all my sponsors lined up over the next couple of months, as well as more Patreon contributors. So if you like the idea, then vote with your dollars and drop on by to patreon.com slash Voices of VR.
