#214: Using three.js to Create Social VR applications with AltspaceVR’s JavaScript SDK

The new JavaScript SDK from AltspaceVR is going to allow front-end web developers to quickly and easily create social VR applications. The AltspaceVR SDK is based upon three.js, which means that developers can create a WebVR-enabled application on the web, then bring up that experience from a Chromium browser within AltspaceVR and have a fully immersive VR experience that is aware of the other people within Altspace.

WebVR still suffers from a lack of the optimization needed to hit its target latency specifications, while Altspace provides the user with a native Unity application that is performant enough to run at the desired framerate and latency. The available APIs also allow for the exchange of information with the web application, including “natural social interactions, synchronized multiplayer capabilities, networking, VOIP and immersive virtual environments.”

I talked with AltspaceVR co-founder and director of engineering Gavan Wilhite about this new SDK, what it will enable front-end developers to do, and some of the implications of having a cross-platform VR environment that supports the Vive, the Oculus Rift, and the mobile Gear VR headset.

Gavan also talked about the new live coding capabilities and the CodePen integration, which will enable some really interesting interactive and social construction of VR experiences. Blind typing is still a barrier to entry here, but it will likely be a useful skill to develop in order to quickly and easily experiment with different VR experiences while in VR. And as Gavan noted, it's often the accidents and glitches that end up being some of the most entertaining and fun things to happen within VR.

Gavan also mentioned that AltspaceVR is offering up to $150,000 in grants to developers to kickstart these types of multiplayer, open web VR apps that can be used within AltspaceVR.

You can visit the AltspaceVR Developer Portal to download the new JavaScript SDK or apply for the AltspaceVR Developer Initiative Program.

Become a Patron! Support the Voices of VR Podcast on Patreon.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.039] Gavan Wilhite: My name is Gavan Wilhite and I'm one of the co-founders and director of engineering here at AltspaceVR. So what we're announcing is the release of our JavaScript SDK. What that means is that you can use web technology. So you can basically create a web page that is a virtual reality application. You can load it up in Altspace, and it has access to be able to render 3D objects all around you inside of VR, as well as being able to access VR-specific APIs that we're providing, like the ability to know where your hands are in space, or to know how big objects are in real-world units. For example, it's really difficult on the web to render something life scale on a screen, because you never know exactly how big their screen is. You can't get a sense of those units. But in VR, we know exactly how big the screens are, and so we can let you render sort of life scale things. There are also abilities that we give for people to understand more information about the size of their windows, because we have sort of 3D windows now rather than just 2D windows. And it really just allows you to very rapidly create social, multiplayer, holographic VR apps using web technologies.

[00:01:23.144] Kent Bye: Great. And so I know that, you know, I've done a number of interviews on WebVR, kind of looking at how you can integrate virtual reality with the web browser as a portal into that. And so with AltspaceVR, you're releasing a Unity application, but yet you're opening up your development to be able to include some of the same web stack technology. So maybe you could talk about how you see that coming together and what this means.

[00:01:52.051] Gavan Wilhite: Yeah, definitely. So it's definitely very interesting stuff, the WebVR, and what it is, it's just a slightly different way of approaching it initially. I think the two implementations will probably converge at some point. And the way that I usually think about it is that WebVR is sort of, it literally does use this, and it's also sort of spiritually this, where it's full screen mode, right? Where WebVR is going to be taking over your entire VR experience. When you're making a WebVR app, you're going to be implementing all the environment yourself, all the avatars yourself, all that kind of stuff. Whereas the VR web apps that you'd be building in Altspace are much more like they're running inside of a web browser that is inside of Altspace. And that web browser just happens to be holographic. So when you're making web apps for Altspace, you're not worrying about avatars and environments and stuff like that. So that's more of a traditional web browser model. Whereas Firefox, and rightly so, because it worked well with their technology stack, went first for the full screen mode, if that makes sense.

[00:02:53.766] Kent Bye: I see, yeah. So it sounds like Altspace is adding this social dimension, where you may still have this Unity wrapper for right now, at least. But it sounds like you're kind of opening up VR development to WebVR, or people who do HTML, JavaScript, Three.js, you know, people who are native web developers. It sounds like this SDK may give them sort of a doorway or an entryway to get into being able to create some immersive environments, but have kind of like the performance benefits of a Unity wrapper.

[00:03:26.301] Gavan Wilhite: Absolutely. Yeah. And, you know, there's a number of aspects to that, right? Because, you know, you can use the SDK to create an entirely new application. You can use it to take an existing 3D web app, so there's like lots of WebGL games and stuff like that. Or you can just take an existing web page, and say that you have a product preview window on one of the pages, but you want it so that when people have it open in Altspace, they can see a 3D model floating in that preview window. That's another way that you can use the SDK. So you can either augment existing web pages, you can port 3D web apps, or you can make things from scratch.

[00:04:02.375] Kent Bye: And one of the other things that I guess I find interesting about your approach is that you've already been working with the Vive and HTC and starting to experiment with what it means to have your hands within a social VR experience. And when you look at the web standards, there's not really a standardized way for how anybody could write a WebVR app to be able to get two-handed interactions in there. And so where do you see that interaction between what you're doing, I guess, probably more at the Unity layer, but also perhaps making this WebVR, let's just call it that, technology for your JavaScript SDK? How are people going to be able to do these two-handed interactions within that JavaScript interface?

[00:04:47.915] Gavan Wilhite: Yeah, so one of the decisions that we made was, rather than try to target WebGL, and we may still do something in that direction, we rather targeted Three.js. Three.js is a JavaScript 3D engine that runs in the browser. It's the most popular one, and it's great because it's pretty lightweight. And so what that allowed us to do was, rather than worrying directly about browser standards, we thought more about what makes sense in the Three.js ecosystem. And Three.js is good enough where I wouldn't be totally surprised if somehow it starts becoming more of a standard. But what that allows us to do is create functions that are namespaced and named and done in a Three.js convention so that it works well with that. And then I think down the road we can worry a little bit more about how this comes back into full web standards.

[00:05:40.218] Kent Bye: So does that mean that you're kind of releasing a Three.js plug-in back to that open source project?

[00:05:45.740] Gavan Wilhite: Of a sort, yeah. The SDK sort of interoperates with Three.js. There's some pull requests that we probably want to make to Three.js. There's some things around event bubbling and stuff like that that we've worked on that we think would be valuable to other folks. But yeah, you can think of it sort of as a Three.js plug-in. So Three.js is renderer agnostic. It can render to Canvas, it can use a software renderer, and it can render to WebGL. And so we basically just implemented our own renderer, which allows it to render to Altspace and then also provide these other APIs.
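
Because Three.js separates scene description from rendering, the same scene code can drive either a stock WebGL renderer or a custom one. Here is a minimal sketch of the pattern Gavan describes; the altspace.getThreeJSRenderer() entry point is an assumption drawn from this conversation rather than confirmed API, while the rest is standard Three.js.

```js
// A minimal Three.js scene that hands rendering off to a custom renderer.
// NOTE: altspace.getThreeJSRenderer() is a hypothetical name used for illustration.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(45, 1, 0.1, 1000);
camera.position.z = 5;

var renderer = altspace.getThreeJSRenderer(); // assumed SDK call: render into the Altspace enclosure

var cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshBasicMaterial({ color: 0x00ff00 })
);
scene.add(cube);

function animate() {
  requestAnimationFrame(animate);
  cube.rotation.y += 0.01; // simple motion so the object is visibly live
  renderer.render(scene, camera);
}
animate();
```

Because nothing above the renderer call is Altspace-specific, the same scene and render loop also work with THREE.WebGLRenderer in an ordinary browser.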

[00:06:17.274] Kent Bye: I see. And so, you know, I remember a while back within Altspace, you had this internal jam to play with this JavaScript SDK. And so maybe you could talk a bit about some of the things that you've already developed internally with this SDK and what type of things you might expect to come out of it.

[00:06:34.744] Gavan Wilhite: Definitely. Yeah, so, and again, back to your point, one of the cool things about it that I didn't touch on is that obviously you're in Altspace, right? So we have some synchronization utilities and stuff like that, so it's very easy to make it so that multiple people can use it at the same time. So that's really one of the things that we are focusing on in the game jams, and also something that we're really looking for in these multiplayer apps. And one of the things that really surprised us was that when you're making these JavaScript web apps, it's very easy, especially in game jams, just to do them off the cuff and make them work, and the code's a little bit dirty, but it just functions. And so you end up with a number of bugs, but a lot of the time, for some reason, VR makes them into features, just because it's so ridiculous, and you can play with them. So like we had this puppets app that one of the teams made during the game jam, and you know, a few of us were skeptical initially, but as soon as you had that on your hand, and I think we had a puppet of Eric, our CEO, and you could sort of waka your hand and it would waka his mouth, and there are just so many amusing things, and when it would glitch out, it was even more funny. And so embracing the bugs and glitches and stuff like that was something that we really learned and enjoyed. And then just the crazy amount of things that you can do, even with the limited functionality that we have right now, has also really blown me away. In the last game jam, and I don't know if we've actually launched this yet or let people see it, but we built a laser tag app, and that is a ton of fun. Just being able to use the Altspace environment as more of this full game, and being able to, you know, hide around corners and lean out and try to tag people, that was incredibly fun.

[00:08:11.238] Kent Bye: So yeah, that kind of brings up a point in terms of, you know, you have this, I kind of think of it as this self-contained, web browser-esque thing where we're writing stuff into Three.js. And in order to make a laser tag game, I would imagine that you would need to know some information about where those Altspace avatars are located within that space, maybe even X, Y, Z coordinates for position. And so what type of information are you able to feed back into that SDK, based upon what inputs it can take from the Altspace environment?

[00:08:43.646] Gavan Wilhite: Definitely. So to speak on this one directly, we have an API that's called getTrackingSkeleton, or getThreeJSTrackingSkeleton. And so what that does is, at Altspace, we have to support a large number of input devices. And so what we do is we've created this unified tracking skeleton that we can feed all of that information into. And even if you're just looking around with your head, we'll take that as assumed information about where your head is looking in 2D mode, say. But that all gets fed into a single unified tracking skeleton, and you can get that as a Three.js object representation inside of the JavaScript context. And the nice thing there is that those objects are automatically updated by Altspace. So you could just poll those locations every frame. So every JavaScript requestAnimationFrame, you could ask for where the head is positioned and rotated. Or what's even better is you can just child an object under that Three.js object, and then it's just automatically updated. It's just as if you'd childed it in Unity. But we can let people do that in Three.js, and that allows us to do even more optimizations as we go into the future to make sure that the latency is near zero when you move your head and stuff like that.
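
Here is a sketch of that parenting pattern, continuing from the scene in the earlier example. The getThreeJSTrackingSkeleton() name, the promise-style return, and the getJoint('Head') lookup are assumptions based on this conversation, not confirmed API.

```js
// Fetch the unified tracking skeleton once, then parent an object under a joint
// so Altspace keeps it in sync every frame (analogous to parenting a transform in Unity).
// NOTE: the SDK call and the joint accessor below are hypothetical names used for illustration.
altspace.getThreeJSTrackingSkeleton().then(function (skeleton) {
  scene.add(skeleton);

  var head = skeleton.getJoint('Head'); // assumed joint lookup

  // A small cone "hat" that will follow the user's head with no per-frame code of our own.
  var hat = new THREE.Mesh(
    new THREE.CylinderGeometry(0, 0.15, 0.3, 16),
    new THREE.MeshBasicMaterial({ color: 0xff00ff })
  );
  hat.position.y = 0.25; // sit just above the head joint
  head.add(hat);
});
```

The alternative Gavan mentions, polling head.position and head.quaternion inside the requestAnimationFrame loop, also works, but parenting lets the client apply its own low-latency updates.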

[00:09:51.372] Kent Bye: So it sounds like you kind of have two environments. So you have the Altspace environment, and it's almost like within that you have a child Three.js environment that is mirroring those objects, almost like a null object or an object that's just mirroring those objects that you can see within the Altspace environment. Is that correct?

[00:10:11.051] Gavan Wilhite: Yeah, absolutely. We've been calling them symbolic objects, exactly to your point there, because that's exactly what they are. So it allows you to have multiple applications that are able to interact with these objects and not collide. And yeah, it's a great way of expressing the power of both Unity and Altspace to these JavaScript web applications.

[00:10:30.333] Kent Bye: Yeah, and I'm curious about your kind of long-term strategy when you think about this, because if we think about the idea of the metaverse, then it's going to be more about being able to host applications on a web server and then kind of being linked together. And because you are doing all these integrations with WebVR technologies and Three.js, do you foresee a time, once the browsers get optimized to the point where you may not need that Unity wrapper, when you just go straight to a web stack technology so that you can start to spin up these instances on different websites?

[00:11:07.630] Gavan Wilhite: Yeah, it's an interesting question, and a lot of it comes down to where in the stack AltspaceVR is. And at this point, I think we're always going to be pretty low in the stack. And so, you know, I think the metaphors start to break down a little bit, but I would think of Altspace as either being or having a web browser more so than being web content. And so rather than opening up a web page in Chrome, you would open up that web page in Altspace. So it's unlikely that the main Altspace client will be, and things can change, but it's unlikely that you will enter the Altspace experience by booting up Chrome. But the web experiences will be running inside of Altspace. And I do think, to your point, that a lot of VR experiences will have a very web-centric stack to them, just because that's the way we know how to have lots of users and lots of persistent data. And there's lots of good models from gaming, too, that we'll need to use. But I think people will be surprised about how much of that stack is based on the web.

[00:12:04.620] Kent Bye: I see. And because you are using a browser, there are a number of different browsers that are out there, including kind of like the Chrome open source version and then the Mozilla Firefox version. I'm curious if you're using more of a WebKit or Mozilla based foundation for your browser, and if there is a specific browser that you're using within Altspace.

[00:12:24.197] Gavan Wilhite: Yeah. So right now we're using a customized version of Chromium. And as of right now, just if anybody's building an app, it is Chromium 28. So it's a little bit older, but we will be upgrading that hopefully in the next couple of weeks or months, but it's a pretty modern version of Chrome. And, you know, I've been following what they've been doing with the stuff at Mozilla, including Servo, which looks pretty cool. But as of right now, Chromium is just the easiest to implement, so.

[00:12:52.423] Kent Bye: Okay. So that's what people should be targeting. If they're building something that works in that Chromium build, it should work in Altspace then.

[00:12:59.386] Gavan Wilhite: Yeah, yeah. And another point that's good to make here is that because this is just Three.js, and because you can just swap renderers, if you're making an app for Altspace, you can basically just check to see if you're in Altspace, and if not, you can just use a WebGL renderer. So your app should work both in Altspace and then also on a mobile phone and on a tablet and on a desktop machine. And it's not going to have access to the tracking skeleton data, but all the other logic and rendering and stuff like that should work.
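
A sketch of that fallback check might look like the following. The window.altspace test and the getThreeJSRenderer() name are assumptions used for illustration, not confirmed API; the WebGL branch is standard Three.js.

```js
// Pick a renderer based on where the page is running, so one app serves both cases.
var renderer;
if (window.altspace) {
  // Hypothetical: running inside Altspace, render into the enclosure.
  renderer = altspace.getThreeJSRenderer();
} else {
  // Ordinary browser: fall back to a plain WebGL canvas in the page.
  renderer = new THREE.WebGLRenderer({ antialias: true });
  renderer.setSize(window.innerWidth, window.innerHeight);
  document.body.appendChild(renderer.domElement);
}
// From here on, the scene setup and render loop are identical in both cases;
// only Altspace-specific extras such as the tracking skeleton need their own guard.
```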

[00:13:24.515] Kent Bye: I see. So because it's got that Three.js wrapper, you're a little less concerned about the browser-specific differences or features, I guess.

[00:13:32.396] Gavan Wilhite: Yeah, and it's also great because, obviously, we're super excited about Altspace and we want it to succeed, and we definitely believe that it's going to be this amazing thing, but developers want to make sure that if something goes wrong, they're not locked into a single company. And so if you're building an app for Altspace, you can know that the vast majority of this code is completely portable. So that's very important to us.

[00:13:53.226] Kent Bye: I'm curious about Altspace's strategy, or the target demographic that you're going after, because obviously Facebook bought Oculus, so we're going to be expecting a lot of really big social applications within that container of Facebook. Oculus is potentially going to be including some social components within their SDK, even. So I'd imagine that more and more social components are going to be included within these other apps. And so for Altspace, what target demographic are you going after in this social space? And how are you going to really differentiate yourself from, like, the Facebook social applications?

[00:14:33.505] Gavan Wilhite: Yeah, and it'll definitely be interesting to see what they release here, but a couple points I would make is we think that basing this on the open web is super important. And so when you're making an application, a VR web app for Altspace, you can host it anywhere. We are not gatekeepers for this stuff. You can make whatever content you want. Any restrictions on that will basically be somebody's permission settings for their space. But if you want to make an app, you can do whatever you want. I think that's a super important point for VR, just to keep it open and make sure that there's not this person telling you what you can and cannot make. I think the other really important point is that Altspace is going to be on as many platforms as possible. You know, who knows what Oculus is going to release here, but at least with Altspace, you know that it's going to work on the Vive, it's going to work on the Gear VR. You know, you're not locked into a specific brand, if that makes sense.

[00:15:27.029] Kent Bye: Yeah, and just today, as we're recording this, you announced that you have official support for Gear VR and for the HTC Vive. And so maybe you could just say a few words in terms of, you know, some of the challenges, or what makes it so special, what type of things you're able to do in Altspace with this kind of cross-platform approach to all the different virtual reality hardware that's out there.

[00:15:48.416] Gavan Wilhite: Yeah, you know, it's exciting because, I mean, there's obviously interesting challenges around input. Take those three examples: with Oculus right now we're primarily using keyboard and mouse, with Vive we're using the motion controllers, and with Gear VR we're using the touchpad on the side. And there's other options for each platform, but that's the direction we've been going for the last couple months. And each of those has special benefits and has its unique drawbacks. And so when you're in a space with people with different devices, they can do different things, which is kind of neat. So, you know, if you're working on, say, a game together with people all around the world, and you're working on the same game at the same time, because we have these synchronized coding environments, the person with the Vive might be the artist who's positioning things around in space, where the person with the Oculus sitting at his desktop is the programmer who's scripting them out. And somebody from marketing might be looking on the Gear VR because he's traveling and doesn't have a big setup. So you get to have these heterogeneous interactions with all these different devices, and I'm super excited to see how that pans out.

[00:16:53.547] Kent Bye: Yeah, this is, I guess, something I asked Cymatic Bruce last time too, and it kind of comes up again: I'd imagine that what might happen with these different devices, from the Gear VR to the Oculus to the Vive, is that as you increase the fidelity of having your hands and potentially even your full body moving around, you have a little bit more expression and power in a certain way, and so there's a power differential in terms of how much emotion or expression you can give. And so I'm curious if you've thought more about that, and that sort of power differential that may happen if someone has access to better VR technology, and what that means for the social dynamics of having different levels at which people can really express themselves in VR.

[00:17:36.379] Gavan Wilhite: Yeah, that's a tricky question. A lot of that's going to come down to, I think, rather subtle things around avatar design. And, you know, I think there's a general sense that it would be nice if there was incentive for people to have awesome VR technology, because it just makes the experience better for everyone, but we know that that's not going to be available to everyone. So I think there's going to be a fine line there where, you know, it'd be nice to nudge people towards, like, it'd be awesome if we were all tracked in here, but you don't want to make it overwhelming. I think playing up the unique advantages of each platform will be important. You know, even Gear VR, you have the ability to rapidly spin around, and you just have this freedom of expression with your head that I think you don't have with some of the other devices. And so I think we'll be able to find unique things about all of them to play up, but it would be nice to give people a nudge towards better hardware.

[00:18:23.801] Kent Bye: And going back to the JavaScript SDK, what are you doing with audio? Is there an audio component? Because I know Altspace is kind of doing its own 3D positional audio, but is that something that would even need to be passed into the JavaScript layer?

[00:18:38.772] Gavan Wilhite: Right, yeah. So right now, if you use Web Audio, it will just work. It will come from the center of the browser in Altspace. So it's not going to be... We call these things enclosures, the 3D window that your web content is inside of. So if you have a very large enclosure that's covering the entire space, it is going to be coming from the middle of that, I think. And I think it is spatialized. I might have to double check on this. But hopefully, in the future, we can make it so that you can position audio elements. But for right now, Web Audio is totally supported. You can do all sorts of sound effects. You can even theoretically use Flash for your audio, though it will only work on the desktop. And please don't do that. But we know that sometimes there is trickiness around perfect loops and multiple audio sources. But if you use Web Audio, it should just work.
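
For reference, a simple one-shot sound effect with the standard Web Audio API looks like the sketch below; nothing here is Altspace-specific, and per Gavan the sound will simply come from the app's enclosure when running inside Altspace. The 'laser.mp3' path is a placeholder, and the callback form of decodeAudioData is used so it also works in older Chromium builds.

```js
// Play a one-shot sound effect using the Web Audio API.
var AudioCtx = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioCtx();

function playEffect(url) {
  var request = new XMLHttpRequest();
  request.open('GET', url, true);
  request.responseType = 'arraybuffer';
  request.onload = function () {
    audioCtx.decodeAudioData(request.response, function (buffer) {
      var source = audioCtx.createBufferSource(); // one-shot source node
      source.buffer = buffer;
      source.connect(audioCtx.destination);
      source.start(0);
    });
  };
  request.send();
}

playEffect('laser.mp3'); // placeholder file name
```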

[00:19:23.178] Kent Bye: Great, and what type of applications do you hope to see people start to make? Is it games? Is it sort of enterprise-y stuff? Or what type of things do you want to see people make?

[00:19:33.070] Gavan Wilhite: I think the one thing that has not been built yet that I'm super excited to see is prototyping and creation tools. So, sculpting tools, game building toolkits, construction sets. I think those will be really amazing. There's some challenges there around dynamic mesh manipulation, but with the new version of our SDK, which supports a larger gamut of Three.js geometry types and dynamic meshes and textures, that stuff should become more possible. So that's really exciting to me, just to see people create more stuff inside of VR. One of the things that we're announcing on Thursday is support for CodePen, which allows you to do live editing inside of VR, which is great. So that's one. Obviously games are going to be incredible. That's most of what we've seen so far, and those are only going to get better. You know, there's going to be things that none of us have thought of so far. I don't think that anybody has figured out what the killer app is going to be, and so when that comes out, that's obviously going to be the super exciting thing. But, yeah, and even to the point where you can start to make avatar accessories with this stuff, because you can parent things to, like, your center eye or your head position. So seeing what people come up with, creating crazy wizard apps that you can use to cast spells, it's just going to be so, yeah.

[00:20:48.771] Kent Bye: Yeah, maybe you could say a few more words about this CodePen integration and what it means to be able to do coding, I presume, using the JavaScript SDK. You know, what type of IDE is this within the VR environment?

[00:21:01.453] Gavan Wilhite: Yeah. And so, you know, the nice thing, since this is just JavaScript code, is that you can use a wide variety of different online IDEs to write your code. I've tried out Cloud9, and that works. I haven't used it in a few months, but it should still work. And what we recently did was look at CodePen. So CodePen is a great way of doing small, quick applications. And you can share them and fork them and all that kind of stuff right in the browser, which is great. And you can just type out code and it will live execute it, which catches you by surprise, because as soon as you write in your render loop, it immediately pops up there. But it is that promise of, okay, I can finally start coding and designing and building applications for VR inside of VR. And it has live reloading, so as you change the position of something, it will move. And it has the challenges that we all know; blind typing in VR is a little bit challenging. Altspace hasn't released any real solutions for that yet. But, you know, I personally have gotten pretty good at it. I have to make sure that I use a keyboard that doesn't have a super weird layout. I used one of those Das Keyboard Ultimate keyboards for a while that has completely blank keys, so I learned all the symbols. But, yeah, you can go in there, you can live code. Yeah, it's great. So Brian Peiris was behind RiftSketch, and he joined us here at Altspace a few months back, and he's been helping to make a lot of that stuff happen, so that's been really exciting as well.

[00:22:21.271] Kent Bye: Awesome, and finally to kind of wrap things up, I'm just curious from your own perspective, what do you think is the ultimate potential of virtual reality and what it might be able to enable?

[00:22:31.017] Gavan Wilhite: Oh, man. You know, I think it's the first time that we get to address all the pixels, right? It's the first time that we get to write to every part of the reality that we're experiencing around us. And it will be this fluid thing, where things will exist in VR and things will exist in AR, but I have a hard time believing that most stuff will not be, and I use this term, holographic, because I really think it's important to not make too big of a distinction on what the exact output device is. But just this sense that you can control everything that you're experiencing around you, and how much is being overwritten, I think is going to be the obvious future of technology. And this is a slightly weird one, but we built this D20 tabletop gaming experience in the SDK. And one of the coolest things about it for me was that when I'm playing Dungeons and Dragons, I don't like being reminded that I'm using technology. And this is using the most technology you could possibly have: VR devices, tracking skeletons, all this kind of stuff. But I was able to suspend the disbelief, because we were able to design things so that the dice felt like they were physical. There were no glowing screens. And I think the ability of VR to push the technology behind the scenes is going to be one of the coolest things about it in the future. Awesome.

[00:23:51.384] Kent Bye: And is there anything else that's left unsaid that you'd like to say?

[00:23:54.878] Gavan Wilhite: If you're interested, check out developer.altvr.com. It's a great place to find documentation and examples. People are proposing projects and trying to organize teams. We have the developer initiative program, so we've set aside $150,000 to help fund teams to make VR web apps. You own all the stuff that you make, and you can use it on other platforms, but we just want to help kickstart the virtual reality web. So details on that are at the developer portal also. So I'm super excited to work with people to build some awesome stuff. And thanks so much for talking with me.

[00:24:26.379] Kent Bye: Awesome. Thank you, Gavan. Yep. You have a good one. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash Voices of VR.
