#362: Future of VR on the Open Web: WebVR, WebGL 2, & WebAssembly

Brandon Jones has been one of the lead developers on the WebVR API over the past couple of years as part of his 20% project at Google. He announced this week that he’s now going to be working on WebVR full-time, which is a great indicator that Google is putting more resources into supporting VR on the open web. I had a chance to catch up with Brandon at GDC to talk about all of the web technologies enabling web browsers to drive room-scale Vive experiences and WebGL exports from Unity & Unreal Engine. Some of the highlights include a new WebVR 1.0 draft spec, the Gamepad API, WebGL 2, and WebAssembly.

I expect that there will be more announcements about what Google is doing in VR next week at Google I/O. Google is definitely investing in the future of VR and the open web with Brandon working on this full-time, as well as with their recent hiring of Josh Carpenter to the WebVR team.


Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR podcast. Today, I talked to Brandon Jones of Google, and he's been one of the leaders of WebVR over the last couple of years. And he was actually on the Chrome team working on WebGL, and WebVR has been his 20% side project for the last couple of years. But just recently, he has now moved on to working on WebVR full-time. And so at the time of this interview at GDC, he was not yet working on WebVR full-time, but we still kind of covered some of the biggest developments that happened in WebVR over the last year, as well as what's coming in the future, such as room-scale experiences in WebVR, as well as being able to export your Unity or Unreal Engine project and be able to drive it on the web. So we'll be covering a lot of where WebVR has been and where it's going on today's episode of The Voices of VR. But first, a quick word from our sponsor. Today's episode is brought to you by the Virtual World Society. The Virtual World Society wants to use VR to change the world. So they are interested in bridging the gap between communities in need and researchers, creative communities, as well as a community of providers who could help deliver these VR experiences to the communities. If you're interested in getting more involved in virtual reality and want to help make a difference in the world, then sign up at virtualworldsociety.org and start to get more involved. Check out the Virtual World Society booth at the Augmented World Expo, June 1st and 2nd. So this interview with Google's Brandon Jones happened on the Expo floor at GDC, right next to the Mozilla WebVR demos that were happening there. So with that, let's go ahead and dive right in.

[00:01:56.009] Brandon Jones: I'm Brandon Jones. I work on the Google Chrome team. And I've been working on exposing a set of VR APIs to the web, called WebVR.

[00:02:05.186] Kent Bye: Great. So I've heard that there's been some either new specs or some new changes that are coming along. Maybe you could tell me a bit about like what's new.

[00:02:12.108] Brandon Jones: Yeah, absolutely. So when the specs first started, it was me and some guys from Mozilla. At that time, the state of the art in VR was the DK1 and Cardboard. Like those are the devices that everybody had access to. And obviously VR has come a long way since then. The devices look quite a bit different. The SDKs look quite a bit different. And so, just before we really unleashed this on the world and got it into the stable browsers, we wanted to make sure that the APIs we were exposing actually matched the hardware that was available and the capabilities that they exposed. And so, at the beginning of this year, there was a group of us from Google, Mozilla, Microsoft, Samsung, you know, a bunch of people who had some interest in it, all got together and talked about, well, how can we change this API to make sure it's going to be future-proof, to make sure it's going to be meeting the needs of the upcoming sets of VR hardware. And we announced that just about a month ago and released what we're hoping to be the 1.0 spec online. And we're still doing some minor revisions to it, but we feel pretty comfortable that that's going to meet everybody's needs. And this gives us stuff like a much better set of capabilities for testing: am I on something that has the capabilities of a Cardboard device, or something that has the capabilities of, like, a Vive? And so we can do things like room-scale VR experiences on the web.
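The capability testing Brandon describes can be sketched in a few lines of JavaScript. This is a minimal, hedged sketch based on the WebVR 1.0 draft spec (`navigator.getVRDisplays()`, `VRDisplayCapabilities`, `stageParameters`); the tier names and the `classifyDisplay` helper are this sketch's own invention, not part of the spec.

```javascript
// Classify a VRDisplay-like object into a content tier, using the
// capability flags from the WebVR 1.0 draft spec. The tier names
// ('room-scale', 'seated', etc.) are illustrative, not standardized.
function classifyDisplay(display) {
  const caps = display.capabilities || {};
  if (!caps.hasOrientation) return 'none';            // not a usable HMD
  if (display.stageParameters && caps.hasPosition) {
    return 'room-scale';                              // e.g. a Vive with a configured stage
  }
  if (caps.hasPosition) return 'seated';              // positional tracking, no stage
  return 'orientation-only';                          // e.g. Cardboard
}

// In a browser with one of the experimental builds, you would enumerate
// displays asynchronously and pick content accordingly:
// navigator.getVRDisplays().then(displays => {
//   displays.forEach(d => console.log(d.displayName, classifyDisplay(d)));
// });
```

The point of a tiering helper like this is exactly what the 1.0 redesign was after: one page can serve Cardboard, seated, and room-scale hardware by branching on advertised capabilities rather than sniffing device names.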

[00:03:35.683] Kent Bye: Wow, yeah, that's pretty mind-blowing to think about. Last year at GDC, when Valve and Vive were putting the room scale out into the world in such a bold way, I never thought about trying to drive something like that on the web. So how far away do you think we are for actually driving a comfortable experience at a room scale on the Vive?

[00:03:56.368] Brandon Jones: we're much closer than you would think. Like, there are still some technical details to work out, but we have proof of concepts right now that show that if you have a machine that is VR-spec, it's what the manufacturers of these devices are recommending, which of course is not a trivial machine, but if you have one of these headsets, that's really what you should be running on, then yeah, you can do a totally comfortable experience on the web. And it is not going to look like your AAA Unreal Engine titles. I feel strongly anyways that if that's the type of content that you want to do, then you're really better served by going with Unreal Engine or Unity or Crytek or any of the big names that are out there right now. What we're really trying to target is kind of the long tail of content where you've got things like data visualization, or real estate, or mapping, or just art. That it's not the type of thing that you'd really want to bundle up into an executable, but it makes a lot of sense if you just link to it on Twitter. And those usually are not going to be the type of things that require quite as much graphical oomph. And they run really, really comfortably on the web. You can totally hit 90 frames per second. You can totally walk around in a room, and it feels good. There's a little bit more latency than we'd like, but it's really only maybe a frame off of a native experience right now. And it's something that's very acceptable.

[00:05:22.781] Kent Bye: Is that because you're moving towards having binaries that are able to control the runtime rather than going directly through the browser?

[00:05:31.469] Brandon Jones: Well, we still want to go through the browser. There's an awful lot of capabilities that that brings with it. The new revision of the API does kind of make some explicit remarks about how we're only targeting WebGL for this first version. We might expand that in the future. And we're only allowing certain types of presentation pipelines to work. And that's because that gives us the most direct path from the JavaScript doing the rendering to the GPU. And yeah, there's still bits and pieces of the browser in there that, obviously, up until this point, browsers have not been geared towards VR content. They're dealing with a very different type of content day to day. And so there's just a lot of it that's not really optimized for this quick throughput. And we have to work around that a little bit, and that's where you get some of these one-frame-of-latency type things: you produce something, and then it goes through a compositor, and the compositor's trying to do a very different type of rendering than your average VR scene. We're pretty confident that we can get past that and skip over it and just have a fast path explicitly for VR that ignores a lot of the rest of what the browser is trying to do.

[00:06:46.894] Kent Bye: Yeah, I think one interesting thing about, in terms of art styles, I think that there is a little bit of an aesthetic. If you go with a low-poly art style in a VR experience, even if it's a native app, it kind of has this otherworldly feel, like you are just transported into this magical place. And that it doesn't have to be photorealistic in order to be a really good and compelling VR experience. And so I think that the web could really start to drive these low-poly experiences that are super compelling and have really interesting content.

[00:07:16.162] Brandon Jones: Yeah, definitely. I totally agree. Some of my favorite experiences in VR thus far have been the kind of low-intensity ones that you're talking about, like Oculus' Paper Town demo or, like, the little animals around the fire, you know, some of the games that I've played where they actually use, like, 8-bit style pixel art, but in VR it can actually be really compelling. The other thing to realize is that there's a lot of people who are looking at VR content with a focus on 360 panoramas or videos or stuff like that. And that is typically not content that is very intensive to render. It can oftentimes require high bandwidth or other limiters like that, but the actual rendering, like the processing of the system, is not the bottleneck there. And so you can totally do 360 panoramas or videos on the web, and it's not really all that far off from the native experience. It's really perfectly acceptable for that kind of content. And I believe that that kind of content makes a lot more sense on the web than it does within some specific store from a particular vendor.

[00:08:27.537] Kent Bye: I think probably one of the challenging positions that you would inevitably find yourself in with trying to define a standard API across all the VR systems right now is that the inputs are so in flux in terms of the capabilities, the buttons, and so even developers within different experiences may have to tune or tweak, whether it's on the HTC Vive with their motion-tracked controllers and the Lighthouse, or if it's on the Oculus Touch, you know, with the different types of button configurations. From the perspective of the web and defining this API, are you just saying, you know, you have motion-tracked controllers and you get a trigger and that's it? Or are there other buttons that you're adding as well?

[00:09:04.398] Brandon Jones: Yeah, so for one, I should point out that the WebVR version 1 spec explicitly does not include how controllers work. And that's mostly because we believe very strongly that that shouldn't be part of a VR spec specifically; that should be part of the gamepad spec, which is something that already exists. And we're actually pretty enthusiastic about expanding that to take into account these kinds of controllers, because that also goes beyond just VR. It's totally feasible that you could get a Wii controller and sync that up to the browser and actually get the accelerometer working and get some basic position and orientation out of that. And, you know, that wouldn't make sense if it was attached to a VR spec, but as a standalone, like, gamepad thing, we can totally do that. But that being said, yeah, we want to make that part of the gamepad spec so that it can track the six-degree-of-freedom pose of all these controllers. And the gamepad spec is actually already pretty robust in this respect. It exposes an array of axes and an array of buttons and doesn't particularly assign much meaning to them. There's some standardized mappings that say, well, if your gamepad looks like a standard PlayStation or Xbox controller, then we can put the buttons in a certain order so that you know what does what, but the API itself doesn't actually prescribe what those mean. And so we should be able to say, okay, well, it has a pose, it has a position and orientation in space, and it has N buttons, and what those do is up to you. And you'll be able to identify, is it an Oculus Touch or is it a Vive, probably just by looking at the name of the gamepad itself. And then we just want developers to be able to decide what that means within the space of their content.
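What Brandon describes (an array of axes, an array of buttons, a name to sniff, and a pose extension) maps onto the Gamepad API roughly as follows. This is a hedged sketch: `navigator.getGamepads()` and the `id`/`buttons`/`axes` fields are in the existing Gamepad spec, while the `pose` field follows the draft extension discussed here and may change; the `describeController` helper and its return shape are purely illustrative.

```javascript
// Summarize a Gamepad-like object the way a WebVR app might, without
// assigning meaning to any button: that's left up to the content.
function describeController(gamepad) {
  return {
    id: gamepad.id || '',                                   // e.g. "OpenVR Gamepad" (vendor-chosen string)
    looksLikeVive: /vive|openvr/i.test(gamepad.id || ''),   // crude name sniffing, as Brandon suggests
    buttonCount: (gamepad.buttons || []).length,            // N buttons; semantics are app-defined
    axisCount: (gamepad.axes || []).length,
    hasPose: !!(gamepad.pose &&                             // draft extension: 6DoF pose
                gamepad.pose.position &&
                gamepad.pose.orientation),
  };
}

// Browser usage (experimental builds): poll each frame, since the
// gamepad array is a snapshot rather than an event stream.
// for (const pad of navigator.getGamepads()) {
//   if (pad) console.log(describeController(pad));
// }
```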
Talking to your other point about how it's difficult to kind of address the full spectrum of inputs, this is not something that's new to VR as far as the web is concerned, you know, unlike consoles or even a lot of PCs to a certain degree. The web can't rely on any particular form of input. I could be working on a desktop with a mouse and keyboard, I could be working on a mobile device with touch-only inputs, or I could be working in a VR headset. The legacy of the web is that you have to be able to account for this huge wide range of experiences. And it's a perfectly valid choice to say, look, there's just certain devices that I don't support. So if I'm going to build a website that's explicitly designed for something like the Vive or Oculus with touch, then yeah, that's probably not going to work on a mobile device with a cardboard harness, and that's okay. But you have the tools at your disposal to be able to make that choice and to be able to support the full spectrum if that's what you want.

[00:11:48.842] Kent Bye: And so it's been about a year since I think I last saw you at the last GDC. And so there's been a lot that's happened in terms of the world of web VR, in terms of content that's been out there. It seems to have at least some momentum within the larger web development community. But also there was A-Frame that was released. And I'm just curious if you could comment on some of the big milestones that you've seen over the last year.

[00:12:10.560] Brandon Jones: Yeah, A-Frame is definitely one of the big ones. The fact that we were able to all come together and put out this new revision of the API is another one. It just shows that within the community that's working towards this stuff, there's a lot of solidarity and momentum towards getting this to a point where it's gone out to the public beyond just experimental binaries. And I'm really excited that we're moving that way. We've also just started to see a lot more content developers pick it up and start to experiment with it, and do some really interesting creative stuff. Most of it's been very artistic content that I've seen, like Cabbibo and people like that within the community just going crazy with it and saying, well, what can I do when I have VR on the web? And then, of course, outside the web, there's been the VR community as a whole. It's a huge milestone for the web that the New York Times is sending out Cardboard to all of its subscribers. That's several million new headsets out there in people's hands. And then you've got the Vive and Oculus Rift that are actually going to be shipping to people within a few weeks now. And getting consumer hardware into people's hands and then giving them the opportunity to test it on a platform like this, that's a huge milestone. Because right now it really is restricted to just the developers and the enthusiasts. And we need more widespread VR in order for it to be a compelling experience for both the developer and the consumer.

[00:13:46.055] Kent Bye: Great. And at this point, if someone wants to actually watch and see some of this WebVR content, they have to kind of get a nightly or a development build of Chrome or Firefox. And so I'm just curious if you see that the 1.0 API has to get a little bit more solidified before it's going to be put into the mainline, or at what point do you see that this is going to start to be integrated into the mainline web browser?

[00:14:12.152] Brandon Jones: Well, the fun part is that if you're really interested in this, you can go try it out right now. The best way to do it: go to webvr.info, and it's a very sparse web page right now, but it'll give you the basic links that you need to get the experimental desktop builds for both Chrome and Firefox. And you can try that out if you have a DK2 or a Vive. Or you can go on your mobile phone right now. If you go to webvr.info, there's a link down in there that's WebVR samples. And this is just a GitHub repo of code that we've put together to demonstrate how the API works. And even if your phone does not have a WebVR-enabled browser, you can go to that page, there's links that say use polyfill, and we've created a JavaScript shim that actually emulates the API, covers most of the important functionality, and you can go on your phone right now, click that link, and see a WebVR experience that you use with a cardboard device. And it works, it runs fast. It's not the top-of-the-line, high-end experience that we can deliver, but it's perfectly good enough for somebody to get a taste of what's possible.
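The polyfill fallback Brandon mentions usually comes down to a simple feature check: if the browser doesn't expose the WebVR entry point, load the JavaScript shim instead. A minimal sketch, assuming the 1.0-style `navigator.getVRDisplays` entry point; the script filename below is illustrative, standing in for wherever you host a copy of the shim.

```javascript
// True when the browser has no native WebVR support and the JavaScript
// shim Brandon describes should be loaded instead. Taking `nav` as a
// parameter keeps this testable outside a browser.
function needsWebVRPolyfill(nav) {
  return typeof nav.getVRDisplays !== 'function';
}

// In a page you might wire it up like this (filename is an assumption):
// if (needsWebVRPolyfill(navigator)) {
//   const s = document.createElement('script');
//   s.src = 'webvr-polyfill.js';   // local copy of the shim
//   document.head.appendChild(s);
// }
```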

[00:15:26.557] Kent Bye: And I've also heard that there's been a little bit of buzz of a lot faster rendering engines that have been coming out there in terms of being able to drive performance a lot more. Because I think one of the biggest concerns that I've seen with web VR content is that the latency, all the experiences that I've seen, at least last year, were pretty slow and not really tolerable for high-end VR experiences. And so I'm just curious, what are some of the innovations that need to happen in order to take those last steps?

[00:15:53.592] Brandon Jones: So there's a couple of things. One, with the new API, we've designed it in such a way that we're able to much more tightly control the render loop. And so it allows us to sidestep some of the problems that we've had around running at the appropriate frame rate and pushing out frames at the right time and everything. And so the experimental builds that you can get right now actually run quite a bit better than the previous ones. Like I said, there's still some technical issues to work out. They're certainly not what you would expect to see in a final stable build, but they're enough to give you that proof of concept and say, yeah, this works, it's stable, it doesn't make me sick, that's important. And then on the other side, you know, completely disconnected from WebVR, but something that will be impacting it, is that there's a lot of new tech coming to the web community in the form of WebGL 2. So we're going to be exposing a whole bunch of new graphics capabilities, the ES 3.0 feature set, to the web, and that should be coming sometime this year. Chrome, Mozilla, and Microsoft are also all working towards a technology called WebAssembly, which is an evolution of asm.js, if you're familiar with that. And it will allow us to do cross-compiled builds from things like Unity or Unreal, deploy them to the browser in really nice, compact bytecode formats, and run very, very quickly at just a very small performance impact from what the native code would otherwise be. And so all of these technologies are coming together, and of course the browsers just continuously improve anyway. We make the JavaScript engines faster, we make the memory management better, browsers are constantly improving. So all of this coming together will give us the boost we need to get rid of a lot of the issues that we're seeing with VR and latency in the browser and allow people to build really not-dumbed-down experiences.
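The "much more tightly control the render loop" point is that, under the 1.0 draft, the headset rather than the page compositor paces frames. At a Vive's 90 Hz that leaves roughly an 11 ms budget per frame. A hedged sketch follows: the loop structure (`requestAnimationFrame`/`getFrameData`/`submitFrame` on the display) follows the 1.0 draft spec, while `frameBudgetMs`, `vrDisplay`, and `drawScene` are this sketch's own placeholder names.

```javascript
// Per-frame time budget implied by a headset's refresh rate.
function frameBudgetMs(refreshRateHz) {
  return 1000 / refreshRateHz;   // 90 Hz -> roughly 11.1 ms per frame
}

// Display-driven loop per the WebVR 1.0 draft (browser usage, with a
// VRDisplay already obtained from navigator.getVRDisplays()):
// const frameData = new VRFrameData();
// function onFrame() {
//   vrDisplay.requestAnimationFrame(onFrame);  // paced by the headset, not the page
//   vrDisplay.getFrameData(frameData);         // per-eye view/projection matrices
//   drawScene(frameData);                      // app-provided WebGL rendering
//   vrDisplay.submitFrame();                   // hand the finished frame to the HMD
// }
// vrDisplay.requestAnimationFrame(onFrame);
```

Missing that ~11 ms budget means dropping to a lower effective rate, which is exactly the latency and comfort problem discussed above; pacing off the display instead of `window.requestAnimationFrame` is what lets the page hit 90 Hz even when the regular compositor runs at 60.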
Like I said, you probably don't want to try for AAA games in the browser. You're going to be better served by the existing tools and the ecosystems around them. But we don't want you to feel like you have to dumb your content down just because you want to do it on the browser.

[00:18:08.242] Kent Bye: Now, I know the evolution of Google Cardboard: it started as a 20% project and then kind of grew into an entire VR division. So I'm just curious if you've stopped working on, like, WebGL-specific stuff, and if you're doing, like, WebVR full-time, or if you're in that VR division.

[00:18:24.632] Brandon Jones: I'm not part of Google's VR division. Not that I haven't been tempted. But I do work with them. And they're doing some really awesome stuff over there. But I feel like my goals for WebVR are actually going to be much better accomplished from within the Chrome team. I am not working on WebVR full-time, although I've spent a lot more time on it recently as we've been building up the 1.0 API. I'm still technically on WebGL full-time and I am very excited to get WebGL 2.0 out because it will give people a lot more capabilities, a lot more higher-end rendering capabilities that will benefit VR in the end. But I can definitely see myself transitioning to doing VR full-time once we get to the point where it's really going to be landing in stable Chrome and getting out to developers. Because these things do need support going forward, and I want to make sure that somebody's there to do it.

[00:19:24.690] Kent Bye: So is WebVR still kind of technically your 20% project then?

[00:19:28.472] Brandon Jones: In technical terms, yes. For the last few months, it's been more of an 80% project. Yeah, it's still technically my side thing, but I can maybe see that changing in the future.

[00:19:41.101] Kent Bye: So, what type of experiences do you want to have in VR then?

[00:19:44.363] Brandon Jones: What do I personally want to have? Oh my goodness. I want to be surprised. Like, I want to have experiences that I didn't know I wanted. And I've had some of those so far, and it's been really fun. Like I mentioned before, playing kind of like an 8-bit game, I never would have thought that that would be a compelling experience in VR. And yet, you know, when you put it in front of me, it's like, oh, well, yeah, obviously this is awesome. And I've wanted this all my life, and I just didn't know it. I also want to be able to, like, have a lot more kind of mundane experiences. It's a weird way to put it. But so much of what I've seen in VR is just absolutely mind-blowing. And that's awesome. It's these big special effects and giant scenery and huge action and all this stuff. But I recently was house shopping. And one thing that I really would have loved, going through all these home sites, is to be able to just say, OK, I want to do a VR walkthrough of this place. This is the most mundane thing you can possibly imagine. It's not going to be the type of thing that shows up on any of the big screens here at GDC. But it would be so useful. And it would have real value to being in VR rather than just, it's cool because it's VR. No, you get a good sense of the space that you're in and stuff like that. It would transform that home shopping experience. I would love to see more of that kind of thing, where it's just the basic day-to-day utilities that everybody wants to use that are made better by the fact that they're in VR.

[00:21:28.677] Kent Bye: I've heard that playing VR is so active and gets you moving your body in a new way that you may have actually, like, lost some weight in VR. Is that true? Just a little bit.

[00:21:39.384] Brandon Jones: Like it's definitely not something where, you know, I've been shedding the pounds like crazy. But yeah, I tell you what, Space Pirate Trainer and Ninja Trainer for the Vive are quite the workout. Like, you can really get into them. And yeah, I've lost a little bit of weight, you know, a few pounds. Yeah, I've actually kind of started to try and work that into my day-to-day routine just to say, yeah, I want to, like, jump in here and do just half an hour of some experience like that just because it's active and just because, you know, it gets me up out of my chair that I've been programming in all day and gets me moving around the room. And I think it's great.

[00:22:23.388] Kent Bye: What was that moment like when you realized that doing VR could be a way to get in better shape?

[00:22:29.192] Brandon Jones: Exciting and exhausting. There comes a point where you realize, man, I'm just a little too fat for VR. I got to fix that. I can foresee all these really awesome experiences coming down the line that I'm only going to be able to play for 10 minutes at a time because I'm not in the shape that I want to be. And so to that degree, it's actually a pretty good motivator to say, you know, hey, the future of computing is going to be a lot more active. And, you know, for people like me who have unfortunately grown accustomed to like my job is sitting in a chair coding all day, you know, It's actually an interesting motivation to kind of change that routine and say, you know, I'm really looking forward to being able to make computing a more full body engaging experience.

[00:23:22.559] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable? Wow, that's a really good question.

[00:23:31.925] Brandon Jones: So, I guess I'll answer this in a little bit of a cliche way by referring to a book, which is not Ready Player One, by the way. I was extremely impressed by the vision of actually augmented reality that was presented in a book called Rainbows End. Vernor Vinge, I think it is, where it goes into a society that is not dystopian, refreshingly, where AR and VR are an everyday part of everybody's life. They're used for education, they're used for the workplace, they're used for just navigating around, social interaction and everything. And it's a really interesting look at just how society might function when these technologies just become part of the everyday. And, I mean, it's a little dystopian. Like, it's not all, you know, rainbows and roses. But I would love to see a little bit more of that shine through into our lives where, you know, we have access to these things just on a day-to-day basis. And it's not something special. It's not something that draws the big crowds at GDC anymore. It's just, yeah, this is just as ubiquitous as our mobile phones are now. To be able to see how that will enable new experiences and new capabilities in our lives is, I think, the most exciting thing to me.

[00:24:52.184] Kent Bye: Awesome. Well, thank you so much. Thank you. So that was Brandon Jones of Google, who is now working on WebVR full-time since the recording of this interview at GDC. Also since this recording at GDC, Josh Carpenter, formerly of Mozilla, is now working at Google on WebVR. So I just did an interview with Josh on A-Frame and WebVR in episode 350. So go check that out to get a little bit more details of some of the big breakthroughs of rendering technologies. And we're going to be able to see experiences on the open web being run at 90 frames per second, as well as in room scale, which was one of the big takeaways from this interview: to really just think about the potential of doing a room-scale experience that is driven by the web. And so, to me, it just opens up my mind in terms of the possibilities of what does this mean for the future of the metaverse when you can start to have room-scale experiences driven through a web browser. You start to have a little bit more of a vision of how we can be moving towards an open metaverse rather than one that's more of an app-driven walled-garden approach. And so this is very exciting news, and I'm looking forward to hearing if there's any more announcements that are being made next week at Google I/O, which I am going to be at next week covering the event and being able to do some interviews. I'm sure that there's going to be some different announcements that are going to be made there. There's been rumors about Android VR, and there's likely going to be a little bit more of an update of something beyond Google Cardboard, perhaps something more akin to the Gear VR. I would be really surprised if it was something beyond that, because I think that Google really is trying to operate at scale, and so they're not as interested in having just a small adoption rate.
They're looking for on the scale of millions, which is why at this point they have over 5 million Google Cardboard headsets out there in the wild. And so if you're interested in hearing a little bit more of the past, present, and future of Google and what they're doing in VR, go back and listen to episode 304. And next week at Google I/O, we're going to be hearing a lot more information as well. So just a couple other quick takeaways from this interview. First of all, you know, just the concept of responsive web design: the existing web has the ability to detect what type of device you have. And so whether you're on a PC or a tablet or a mobile phone, the website kind of adapts its content to be able to create the best form factor for that platform. And so applying those principles to virtual reality, then you can start to think of ways that content could change and scale whether or not it's for the Google Cardboard, the Gear VR, desktop PC, or even room scale. And so this concept of responsive web design, I think, is going to be applied to the open web and WebVR. And there'll be different design patterns to detect and change and adapt your experience depending on whatever device you have. And the final point that really stood out to me in this interview was just this little discussion here at the end about the future of immersive computing and how it may inspire a lot of people to get more in shape and to be more active, not just standing in front of the computer all day, but actually moving your body. And there's this concept of embodied cognition, which talks about how moving your body can actually help the way that you think. And so I think there's going to be a lot of really interesting applications of this active computing with how we're moving our body and how we actually remember things. Because it turns out we don't just remember from our brains, but we actually remember from our full bodies.
And so when we start to engage this concept of embodied cognition, then I think there's going to be a lot of really interesting applications and implications of that. So just to wrap things up here, I will be in San Francisco next week at the Google I/O conference, as well as, before that, the Experiential Technology and Neural Gaming Conference and Expo, as well as the Rothenberg Founders Day. So I'll have a busy week next week doing a lot of different interviews. And so if you see me, flag me down and let's talk about what you're doing in VR. So if you have been enjoying the Voices of VR podcast, then please consider becoming a donor at patreon.com slash Voices of VR.
