#143: Cymatic Bruce on building community with AltSpaceVR, & experiments in gesture input, VR locomotion, & multi-player web content in VR

Cymatic Bruce Wooden talks about the latest developments in the AltSpaceVR social application now that they’ve opened it up to a public beta. One of the things that Bruce mentioned is that people often think about social media when they hear about social VR, and he suggests that perhaps a more descriptive term would be “Community VR.”

As AltSpaceVR prepares for the consumer launch of virtual reality HMDs, building communities is one of the areas where they’re focusing their attention. They’re also going to continue adding features and functionality to push the envelope on different interactions within virtual environments.

AltSpaceVR has been implementing more expressive gesture controls using the Leap Motion and Kinect, and they’re starting to implement the Perception Neuron suits as well. I’ve personally noticed that there can be a power differential, with more social capital going to those who have access to more technology, because it enables them to be more expressive and command the conversation. Bruce’s observation is that having more expressive gestures seems to improve the experience for everyone involved, but he agrees that the power differential is something to watch out for. He suggests that in the future, special guest speakers might come to the AltSpaceVR headquarters and get geared up with all of the latest technologies.
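As Bruce explains later in the interview, today's sensors reward gross motor gestures like the "double palms" volleyball push rather than fine finger manipulation. Here is a toy sketch of what that kind of gesture recognition can look like; this is purely illustrative and not AltSpaceVR's actual implementation, and the joint names and thresholds are assumptions:

```javascript
// Toy gross-motor gesture detector (illustrative; not AltSpaceVR's code).
// Given tracked joint positions in meters (y up, z forward), recognize a
// "double palms forward" push: both hands raised above the shoulders and
// extended well ahead of the chest.
function isDoublePalmPush(joints) {
  const { leftHand, rightHand, leftShoulder, rightShoulder, chest } = joints;
  const handsRaised = leftHand.y > leftShoulder.y && rightHand.y > rightShoulder.y;
  // Require hands roughly 30 cm ahead of the chest (assumed threshold)
  const handsForward = leftHand.z > chest.z + 0.3 && rightHand.z > chest.z + 0.3;
  return handsRaised && handsForward;
}

// A hypothetical tracked pose with both hands up and extended forward:
const pose = {
  chest: { x: 0, y: 1.3, z: 0 },
  leftShoulder: { x: -0.2, y: 1.4, z: 0 },
  rightShoulder: { x: 0.2, y: 1.4, z: 0 },
  leftHand: { x: -0.2, y: 1.5, z: 0.5 },
  rightHand: { x: 0.2, y: 1.5, z: 0.5 },
};
// this pose satisfies both conditions, so a push is detected
```

The coarse thresholds are the point: a detector like this tolerates the joint jitter that makes fine pinch gestures unreliable on current hardware.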

One of the other innovations that AltSpaceVR has been pioneering is their teleportation locomotion technique. This is a very elegant solution for people who are susceptible to motion sickness caused by VR locomotion. Yet Bruce warns that there are downsides, and new social norms are developing, because it’s weird and awkward to be in a group conversation and then just phase out and disappear without a trace.
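The core of the click-to-teleport mechanic Bruce describes is a ray-floor intersection: cast a ray from the camera through the clicked point, and where it hits the ground becomes the teleport destination. A minimal sketch of that math, assuming a flat floor at y = 0 (not AltSpaceVR's actual code):

```javascript
// Minimal click-to-teleport sketch (illustrative, not AltSpaceVR's code).
// A ray p(t) = origin + t * dir hits the floor plane y = 0 when
// t = -origin.y / dir.y; that intersection is the teleport target.
function teleportTarget(rayOrigin, rayDir) {
  if (rayDir.y >= 0) return null; // looking level or up: no floor hit
  const t = -rayOrigin.y / rayDir.y;
  return {
    x: rayOrigin.x + t * rayDir.x,
    y: 0,
    z: rayOrigin.z + t * rayDir.z,
  };
}

// Example: camera 1.7 m up, looking down and forward at 45 degrees.
const target = teleportTarget({ x: 0, y: 1.7, z: 0 }, { x: 0, y: -1, z: 1 });
// target lands on the floor, 1.7 m ahead of the camera
```

Because the user previews the destination marker before clicking, the avatar can snap there instantly with no intermediate motion, which is why this technique avoids the vection that causes simulator sickness.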

Bruce talks about the evolution of the user flow, and how they initially hid the action bar based on 2D design standards, but it was difficult for people to find the controls, and so they exposed it. They’ve also been optimizing the sound design in order to find levels that are comfortable and have the right amount of decay.

Bruce also talks about the choice to go with robots instead of more human-like characters. They experimented with avatars ranging from quite photorealistic to more abstract, and they felt more emotionally connected to the abstract avatars. The more photorealistic avatars had creepy dead eyes and unexpressive faces.

There was also a recent internal 48-hour hackathon using their Web SDK, which allows you to bring interactive, 3D web content into virtual reality via JavaScript and three.js. They developed a Dungeons & Dragons tabletop application, a hand puppets app, and a tone garden. They also brought in some external developers who created a multi-player Floppy Bird clone called Floppy Dragon, where others can try to crash the dragon. They’ll also be searching for more developers to come on board and make multi-player experiences with their Web SDK.
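The multi-player twist in Floppy Dragon, where spectators move the pipes in real time, boils down to shared state that every client applies in the same order. The following is a hypothetical sketch of that idea, not the actual Web SDK API; the class and message shapes are invented for illustration:

```javascript
// Hypothetical sketch of synchronized shared state, as in the multiplayer
// Floppy Dragon demo where any player can move the pipes. Every client
// applies the same ordered stream of updates, so all views agree.
class SharedPipes {
  constructor(count) {
    // gapY: vertical position of each pipe's gap, in scene units
    this.gapY = new Array(count).fill(0);
  }
  // Apply one update message; in a real app these would arrive over a
  // socket, already serialized into a single agreed-upon order.
  apply(update) {
    this.gapY[update.pipe] = update.gapY;
  }
}

// Two "clients" receiving the same update stream stay in sync:
const updates = [
  { pipe: 0, gapY: 1.2 },
  { pipe: 2, gapY: -0.5 },
  { pipe: 0, gapY: 0.8 },
];
const clientA = new SharedPipes(3);
const clientB = new SharedPipes(3);
for (const u of updates) {
  clientA.apply(u);
  clientB.apply(u);
}
// both clients now see pipe 0 at 0.8 and pipe 2 at -0.5
```

Ordering is what makes this work: if two spectators move the same pipe, whichever update the server sequences last wins on every client, so the dragon's world never diverges between players.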

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.120] Cymatic Bruce: I'm Bruce Wooden, also known as Cymatic Bruce, Head of Developer Relations at AltspaceVR. And we are happy to celebrate our transition to open beta. So you can get into Altspace anytime you want. It is open from here on out.

[00:00:26.493] Kent Bye: And so you've been having kind of like some private beta weekends where people can kind of check things out. What were some of the feedback and lessons that you've learned? I guess it's been about a year since you've formally announced the existence of AltSpaceVR. So what were some of the big lessons along the way for actually having users come use it?

[00:00:44.769] Cymatic Bruce: Yeah, I think there were definitely a lot of lessons as far as just the user flow. There was a lot of focus on, you know, let's make it as easy as possible to start up AltSpace and get rolling. So from a UI perspective, a lot of feedback there. We started out with a design crew that was coming from the professional 2D design space, and there's kind of a tendency to hide things. So we started off with our action bar: you right-click to bring it up, and otherwise you don't see anything, because we were like, oh, we don't want to clutter the view. But then no one was discovering the action bar. So we went through a process of changing some things to make them a little more discoverable, a little more intuitive based on looking at your environment. And there have been some changes and additions to the input system based on user feedback. One of the biggest things is that a lot of people want continuous turning. They're not VR noobs anymore. So we've got that on the controller. We're still working out how to do it on a mouse when you also have a mouse cursor that's browsing the web, so there are a lot of intersecting things that make it a little complex. But yeah, we're working on that. And then volume, you know, really getting a lot of feedback as far as what's a comfortable volume, hearing people, steps we can take to make sure that the whole sound experience is very satisfying. And then new features on our roadmap have come from feedback from the users during our beta access weekends. It's really, really valuable to get that feedback from users that are in the product for hours and saying, hey, we really want such and such.

[00:02:22.808] Kent Bye: Yeah, maybe you could talk a bit about the locomotion scheme that you have and the different options that are available for people.

[00:02:29.870] Cymatic Bruce: Sure, so the main locomotion scheme is mouse only. It's basically a click-to-teleport system that we pioneered, this was summer last year actually, before we'd announced anything, but it was part of it right from the get-go. We were like, hey, this is a great solution for locomotion to get people around: they can visualize where they're going to be, then click and be in that spot. So that's been pretty effective. Then you can use the keyboard to get around with WASD and use Q and E to do comfort turning. So that's available. We have the Xbox controller; you can get around using that. You hold the trigger and it moves you continuously through the space. So that's what we have going on as far as locomotion. And there are input solutions we're experimenting with, Leap Motion and Kinect, which are not necessarily locomotion, though a little bit with Kinect, because you actually can walk around in real life and your avatar will move in virtual space. So that's really cool, to move in physical space one-to-one with your avatar's movement in virtual space. And we're looking forward to Vive development, where with the click-to-teleport system you'll essentially be moving your bubble of influence: you're able to walk around in a Vive and physically move one-to-one, and if I want to move over there, I teleport, and now I'm able to influence the area around that spot, right? So I think we have some really exciting stuff on the horizon there.

[00:03:57.418] Kent Bye: Yeah, I think the click to teleport locomotion solution is really elegant from the perspective of the user. I guess the thing that I wonder is if other people are looking at you and then suddenly you just like phase out and disappear, is that something that kind of breaks presence for other people to see people kind of going in and out like that? I'm curious if you've gotten any feedback about that.

[00:04:17.081] Cymatic Bruce: We have, we have got feedback about that, and it comes up all the time with our internal use too, or when we're doing a demo, even to the point where people will apologize for being rude for teleporting away during a conversation. It's weird how all these social norms are carrying over to VR. But yeah, we're really thinking about how we're going to handle this. Maybe a light trail that interpolates between the positions, so you get a sense of where the person went. So yeah, that's on the roadmap too, to introduce some effects that are going to show, hey, this is where a person teleported. Hey, this person has their private display up, don't bother them. There are other kinds of very simple graphical cues to let you know how a person is maneuvering in the space and what they're doing.

[00:05:02.590] Kent Bye: Yeah, and talk a bit about the experiments that you've done with actually getting your hands within social VR and what being able to gesture with more than head nods is bringing you.

[00:05:12.415] Cymatic Bruce: Yeah, so we started with the Intel RealSense. That was really our first big run. Intel really liked the demo. They were like, hey, we'd love to have you at CES if you'd like to try to incorporate RealSense and have an experience. So we were like, hey, let's tackle this and see what we can do. I like to think that with a lot of the sensors where they are now, you have to think in terms of gross motor skills, like a toddler, because a lot of people think, VR, oh, I've got my fingers in there, and they immediately want to start creating fine art and pinching things very carefully, and it's not there yet, in a lot of cases. So we're focused on doing some big, gross motions. We have this volleyball experience where you basically put both hands up, boom, double palms, and the ball goes away from you, right? We've really been big on experimenting with these big, gross motions and trying to make it as accurate to the joints as we can. What's interesting with our Leap Motion integration is that it'll actually move the joints of the little robot hand slightly based on where your joint positions are, which is kind of neat. So if you compare your hand to a smaller person's hand, it should be a different size, which is kind of cool to do in virtual reality. So that's been interesting. We're still experimenting with how we're going to do UI, again with these gross motor motions, while keeping people comfortable. The Kinect is probably going to be a little bit better. There were some experiments that were very, very encouraging and very cool that CCP showed off at their EVE conference. So that was really neat. And then we just got our Perception Neuron kit, so we'll be implementing that probably within the next couple weeks.
So that's going to be interesting too, to see what we can do there just from a nonverbal perspective first, and then how do we get the UI functioning with gestures, with something that feels natural that's not going to tire you out.

[00:06:57.087] Kent Bye: Yeah, one of the interesting points that came up at the IEEE VR is the phenomena of having asymmetrical power dynamics whenever someone can kind of express themselves more because they have more money or resources versus someone who doesn't have all the fancy technology like a leap or a Vive or a perception neuron. So they're just having their head. So have you experienced that or seen that at all in terms of like people who have their hands in the game? Maybe they get to be more expressive or actually have more power in social situations in that way.

[00:07:26.184] Cymatic Bruce: Yeah, it's kind of interesting. So part of it is that we've tried to not represent what we don't track. So if you're on a 2D client, you're kind of a statue, just turning your whole body like old-school Batman, and then all the way up to having a Kinect, where you're really expressive. And what we're actually finding is that the experience is much richer both for the people watching the person using the Kinect and for the person using it. When I was on a Kinect and very few other people had one, they were getting a lot of entertainment out of me walking around and pretending to hug people, or putting my arm around people, or rubbing a person's head, and it's heightening the experience for everyone, right? And because this tracking hardware is going to be kind of scarce, what we're looking to do is have a lot of one-to-many stuff like that, where we have someone come to our office who's going to be kitted out with a Perception Neuron, and they'll be giving a presentation in higher fidelity than everyone else, and everyone gets to see the expressiveness of that interesting VIP, right? And I think that's a really cool stopgap, I guess, until everyone has that solution. And the other thing, to get that cool factor even if you just have a Rift, for example, is maybe changing the size of the body parts. If you only have a Rift, most of your motion is coming from your head. Just make the head bigger, make it cooler or something, I don't know. So we've had some preliminary discussions about that. But in general, I think it hasn't been a negative thing, like, oh man, that guy has a Rift, I feel bad. It's more like, whoa, do this! Walk over there! So it's a lot more like, oh, this is cool, I'm seeing this thing. And it's been additive, I think.

[00:09:10.060] Kent Bye: Yeah, just from my own personal experience of being in RiftMax with someone who had Hydras, I did feel like, oh man, wah wah, I don't have the full expressiveness that I want. And so the people who don't have it may not get enough attention or respect in these social dynamics. So I guess it's something to look out for.

[00:09:30.993] Cymatic Bruce: Yeah, it is. I think that is something that could end up being a concern, where a person that has more could express more, could actually command the conversation better, right? Especially if we're talking about using it for a business meeting or something like that. But yeah, I think that could come into play. So that's why we're looking at doing something a little different. So if you only have this type of technology, we can automatically detect it and then have something cool around that, so you don't feel like, oh man, I'm left out of this experience.

[00:10:02.611] Kent Bye: And in terms of the avatars, I guess it's an interesting design choice to go with robots, which are humanoid, but not really human. Was that a design decision to avoid falling into the uncanny valley? Or maybe you could talk a little bit about the range of different stylized avatars that you've experimented with.

[00:10:19.247] Cymatic Bruce: Very much so. Very, very intentional on that front. So we've iterated on things ranging from kind of photorealistic to very, very abstract. And we ended up starting pretty abstract, because that was the point where we could feel emotionally connected to the other avatars we were looking at. And right now, I mean, there are some other solutions on their way that will probably change this, and in the span of a year, who knows? But right now, the photorealistic stuff that we tried was pretty creepy: the dead eyes and the unexpressive face. And it's not just the graphics themselves, but also the animation, if it's not rigged like you expect a normal human to move. I mean, humans are hardwired to see a certain type of motion and recognize it as human, and if you have something that's 97 or 98% there, it just puts you off so much. So that is definitely something that we've always been guided by: let's make sure that you can emotionally connect with the person that you're talking with. So far, we've taken the tack of, well, we can't really get the photorealism we want, but let's make sure we have all the animation that we can get, and that position data and tracking data are represented well. So that's been the focus from the outset. But I think we'll eventually get to a point where we'll have some really cool solutions that'll be animated well, that'll look either stylized or photorealistic, that'll be more human, and I think that will be really, really cool. That's where we want to be, but it's going to be a slow, iterative process, and we'll be trying a bunch of solutions out, talking to partners, and seeing what we can come up with.

[00:11:56.810] Kent Bye: And Altspace recently had an internal hackathon where you're sharing some of these kind of more game-like or engaging type of interactions within Altspace. So maybe talk a bit about what the intent of that hackathon was and what came out of it.

[00:12:10.538] Cymatic Bruce: Sure, so we have this SDK, a web app SDK, that basically makes the JavaScript and three.js from a web page manifest itself in VR, in full 3D, and be interactive. So a web page is no longer this flat surface; it's a thing that you can interact with in full 3D in VR. There are lots of cool applications doing this, changing our definition of what the web is, which is great. So we had an internal jam, really to find out what's missing from our SDK, what types of things that we would like to do should be put in there. Out of that came V20, kind of a tabletop D&D gaming app. It was a 48-hour period, and we put that together, which was neat. And we had a hand puppets app and a tone garden, which was very interesting. It was really cool, lots of lessons learned, and we went into the lab, worked more on the SDK, and then we brought in TMEC and SM Sith Lord as some MVP developers to help us out, to come in from an external point of view and see if they could make something in a couple weeks and have something ready to show off for SVVR Expo, which they did. So there's Floppy Dragon, the multiplayer Floppy Bird clone that is possibly the most frustrating thing you can ever see. Not only is it Floppy Bird-ish, it's a dragon, but also people in the space can manipulate the pipes in real time to jack you up. That has been fantastically interesting. And then SM Sith Lord did a great thing called Interstellar Defense, kind of this love letter to Galaga and Gradius and these old-school shooters. It's simple, but it's a lot of fun. And to see this huge turret in front of you, and a planet hanging in midair, and ships that are actually flying towards you in 3D. It's awesome. And this is all JavaScript and three.js. Really cool.
So we're looking to get more of that, and we'll be looking for developers to come on board and make more cool experiences with us. We've got a list of developers that are chomping at the bit to get in and do some cool stuff, so we're excited to see how that goes.

Kent Bye: And finally, with the launch windows for the consumer versions of all these virtual reality headsets now kind of falling into place, what is AltspaceVR doing in order to prepare for the big consumer launch of VR?

Cymatic Bruce: Yeah, I think really what we're doing right now, with this transition to open beta, the focus is going to be on getting that roadmap knocked out, with features that are requested and features that we envision that we would like to add, and content. We want to build a community. I've been thinking about this term social VR, and how so many people outside of the VR bubble just automatically think social media. They think, oh, you guys are doing VR Facebook. And it's like, well, no, I think it's more community VR. We're trying to build communities, right? I think that may be a better term. I think the way we view AltSpace is that you can share a couch with anyone around the world, enjoy each other's company, enjoy experiences together, and we build that community by really encouraging events. We have people that can create their own events and form a community around that, but also we want to have cool stuff.
We want to have some cool dude that scanned a volcano freaking put that volcano in AltSpace, which is there now, and have him give a tour of what he scanned, right? We want some guy to do a nuclear non-proliferation course and have a 3D web app that's taking apart a nuclear warhead, and you can watch that, right? We want to have dance parties, we want to have presentations, we want to have people that are going to be meeting up and learning different languages. We want all of those things to be happening on a continual basis, movie nights, whatever it is, so that you build a community around being in AltSpace and you have a reason to come back. So when the novelty wears off and you get over the cool factor, you're like, well, this is a fulfilling experience, and I had a great time doing this, and I'm going to keep doing it because it's really cool, and I've met some people, I've made new friends, I've had some fulfilling discussions, and the whole experience has been worth it. So I think that's really what we're focused on. So by the time we get to those consumer launches, we'll be the place to be. It'll be the thing, like, oh man, have you heard about the social VR thing, hanging out with people in VR? As these people get their Vives and their Rifts, this is what we want: people, after they're done geeking out over Elite Dangerous, to come talk about it and hang out in AltSpace, right? So yeah, it'd be nice.

[00:16:36.081] Kent Bye: Awesome. Well, thank you so much. Yeah. Thank you. Appreciate it. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voices of VR.
