#112: Brandon Jones on WebVR for Chrome

Brandon Jones is a WebGL developer who started implementing WebVR in the Chrome web browser as part of his 20% project at Google. He’s been working on it for the past year in order to get VR into the browser. You can find the latest WebVR-enabled Chrome build here.

Brandon talks about the momentum and growth of WebVR over the past year, and how he sees VR as an area of computing that people are very interested in. WebVR and WebGL are closely interrelated, so in a sense he’s working on WebVR all the time.

He talks about the Khronos Group standards committee for WebGL, and the fact that WebVR is currently homeless in terms of a standards body; it’s still uncertain whether the W3C or the Khronos Group will end up being the governing body. You can check out the latest WebVR spec here.

Reducing latency is the number one focus for work on WebVR in Chrome. The latest measurement with a DK1 was 64 ms of motion-to-photon latency, and it is likely lower with the DK2 because of its faster refresh rate. They’ve also integrated timewarp into WebVR in Chrome to help reduce perceived latency. He talks about some of the ongoing work in Chrome to make real-time WebGL rendering a lot faster, as well as other browser optimizations that could be made if it’s known that the output display is a VR headset.

Google is not targeting Gear VR with WebVR at first, because Gear VR is meant to be an end-to-end VR experience and the browser isn’t there yet. In other words, they’re not creating browsers that work in VR, but rather making VR experiences that work in the browser.

Brandon talks about Google Cardboard, and some of the initial reactions to it and the growing interest around it. His own personal reaction to seeing Cardboard for the first time was to laugh in his manager’s face, but he very quickly went from “This is crazy!” to “This is brilliant!” after trying it out and seeing its potential. He talks about some of the more compelling Cardboard experiences he’s had, and how he sees it filling a need for consuming 360-degree videos, 360-degree photos, and other ephemeral content.

He talks about some of the future optimizations that Unity and web browsers could make in order to streamline the viewing of WebGL content on the web. The current downloads are fairly monolithic and could be made more web-friendly by dynamically streaming assets and content.

Finally, Brandon doesn’t see the web as a platform for AAA VR gaming, since it isn’t optimized to make maximal use of native resources. Instead, he sees the web being great for sharing ephemeral content that nobody would download an app for. He also sees that a lot of the framework for the metaverse is already here with the Internet, and cites Vlad Vukicevic, who said, “We already have the metaverse; it’s the Internet in 2D!”

For more information on WebVR, be sure to check it out here.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.998] Brandon Jones: I'm Brandon Jones. I work on the Chrome team, primarily as a WebGL implementer. But as of late, I've been doing WebVR as my 20% project for about the last year or so, in cooperation with Vlad from Mozilla, who's been doing the same thing. Put simply, our goal is to get VR into the browser. We want it to work with as many headsets as possible. We want it to work with as many types of content as possible, and as soon as possible. But it's going to be a little bit of a process to get there.

[00:00:43.862] Kent Bye: Nice. And so, yeah, I didn't realize that was your 20% project. And so it's not sort of into the official full-time yet. But do you feel that this is something that may have legs to be able to turn into something that's more full-time for you?

[00:00:57.145] Brandon Jones: I definitely do. You look at something like the Cardboard team over at Google. They're hiring right now. They've got a decent number of people who are working on that full-time. It's obviously an area of computing that people feel is worth putting time into. So, yeah, I definitely think that this is something that could evolve into something more. Although I should point out that I do say that this is my 20% project. And it is. I literally spend about 20% of my time on it. But working on WebGL alongside it, they're very complementary technologies. VR on the web I don't think can exist in a significant way without the ability to do high quality, fast 3D rendering along with it. And as such, you know, making WebGL better makes WebVR better. And so in that sense, I could be considered to be spending 100% of my time there.

[00:01:51.756] Kent Bye: So yeah, we're in San Francisco, and GDC is happening in the background, and we're at a WebGL meetup. And at the beginning of the meetup, there's a number of people that came up and sort of were representing the WebGL Standards Committee. Maybe, you know, are you a part of that? And what's it involved with? You know, what type of stuff you guys are doing?

[00:02:09.283] Brandon Jones: Yeah, so WebGL is a standard that is managed by the Khronos Group. This is the same group that does OpenGL and OpenCL and now the new Vulkan API. And so Google and Mozilla and even Microsoft and Apple now are all part of the Khronos working group for WebGL. We all try to push that forward in tandem. Right now, that's primarily because it's actually a mirror of the mobile APIs for 3D, OpenGL ES. WebGL 1 mirrors ES 2.0, and WebGL 2, which is upcoming and which we just presented on, will mirror the OpenGL ES 3.0 standard. And then where we go from there is not yet determined. But thus far, it's been pretty profitable to work under the Khronos banner. They're pretty good about bringing the right people together to get these things done. I should mention that the WebVR standard is somewhat homeless right now. We haven't gotten it to the point where we want to shop it around and figure out the right working group for it. It's not under Khronos. I don't know that that would be the appropriate place for it. It's probably something that fits better under the W3C or the WHATWG. We haven't determined where we want to develop that under yet.

[00:03:25.009] Kent Bye: I see. And so what's the difference between the Khronos Group and something like the W3C?

[00:03:30.418] Brandon Jones: Well, they're both standards groups. I think the primary difference is that Khronos has traditionally focused a lot more on APIs that allow you to interface with hardware. W3C is something that focuses much more on standards for the web. WebVR could be considered to be both. And so, you know, once again, it's not quite clear where that's going to land. I guess another way of putting it is that Khronos is in the web, or Khronos has its fingers in the web, because the web adopted 3D, not because they are a web standards group. So it's a little bit of an outlier there, which is why I say that WebVR might fit better under one of the other groups, but we'll see.

[00:04:15.797] Kent Bye: I see. And so, you know, I was at SVVRCon last May and had first talked to Vlad about WebVR, and, yeah, it actually hadn't even launched yet. It was just sort of this initial idea of integrating VR with the web. And so where would you say that we're at in terms of latency and performance and plugging hardware into the browser, and having an experience that is performant enough to not cause nausea?

[00:04:41.763] Brandon Jones: That's a really good question, and that is probably the number one technical focus that we have on the VR side right now. I can tell you that at this point, the last I measured with Chrome, and the numbers are a bit different between Chrome and Firefox, I don't know what Firefox's numbers are, we were getting about 64 milliseconds latency from, you know, motion to photons. And that was actually measured with a DK1. I'd imagine that the same number would come up in the 50s with a DK2 because of the higher frame rate. What it turns out to be is that Chrome has a buffer of frames that it works through. And it's usually about three frames long. And so when you produce the frame in JavaScript, you actually have to cycle through like three more frames on the GPU before that finally reaches the screen. And this is because the browser has traditionally focused on web content, you know, your traditional New York Times or CNN or Amazon or Google or stuff like that, where latency doesn't matter quite so much as smoothness. You know, you want to be able to put your finger on your phone and flick it and have it be a nice smooth experience. For VR, on the other hand, latency is king. And so by integrating these APIs into the browser, we're actually fighting against the grain just a little bit. But this has turned out to be something that people who are working on the browsers are actually encouraging of. They're excited to see that we're working on VR specifically because it will shine a light on areas of performance that are not doing well for this kind of content. And there's a variety of different tricks that we're discussing about how to get around that. The most prominent one that I found on Chrome, and I don't know if Firefox has implemented this yet, but on Chrome we actually do use timewarp when you're using the Oculus Rift SDK. And so even though the actual latency is 64-some milliseconds, the apparent latency is far, far better. It's not a perfect solution. It doesn't account for head translation. But you can put on the headset and look around and feel like it's really good, responsive rendering, and it doesn't make you sick right away. You know, it's always going to be better to reduce actual latency and not have to rely on those tricks so much. But it's a decent stopgap solution for making sure that nobody's throwing up along the way.
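
For readers curious what this looks like in code, here is a minimal sketch of a WebVR render loop against the early, pre-1.0 API that the experimental Chrome and Firefox builds exposed around this time (names like getVRDevices, HMDVRDevice, and PositionSensorVRDevice changed repeatedly before the spec stabilized, so treat this as illustrative rather than authoritative). The latency-relevant idea is simply to sample the head pose as late as possible, right before issuing draw calls:

```javascript
// Illustrative only: based on the early (pre-1.0) WebVR proposal, in which
// navigator.getVRDevices() returned paired HMD and position-sensor devices.
// updateScene() and renderEye() are hypothetical placeholders for app code.
var hmd = null;     // HMDVRDevice: eye parameters, field of view, etc.
var sensor = null;  // PositionSensorVRDevice: orientation and position

navigator.getVRDevices().then(function (devices) {
  hmd = devices.filter(function (d) { return d instanceof HMDVRDevice; })[0];
  // Pick the sensor that belongs to the same physical headset as the HMD.
  sensor = devices.filter(function (d) {
    return d instanceof PositionSensorVRDevice &&
           d.hardwareUnitId === hmd.hardwareUnitId;
  })[0];
  requestAnimationFrame(drawFrame);
});

function drawFrame(timestamp) {
  requestAnimationFrame(drawFrame);

  // Run animation, physics, and other app logic first...
  updateScene(timestamp);

  // ...then read the head pose as the very last step before rendering, so the
  // submitted frame trails the user's real head motion as little as possible.
  var state = sensor.getState();
  renderEye('left', state.orientation, state.position);
  renderEye('right', state.orientation, state.position);
}
```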

[00:07:09.824] Kent Bye: And so is this a matter of going back and looking at how the basic browser is architected at a foundational level and sort of doing a re-architecture in order to make it more efficient to be able to get latency down to more of the 20 millisecond range?

[00:07:26.540] Brandon Jones: Very much so, although describing it as a re-architecting is maybe not the right way to go. What we absolutely don't want to do is say that we need to discard a decade's worth of browser architecture experience because we want VR to go fast. You know, there's a lot of work that's gone into making sure that standard web content, which still makes up the vast majority of what you see online, performs as well as possible. And that is a continuously ongoing job. I mean, we have a great many people at Google who that's the only thing they do, day in and day out, is think about how to make performance on the web better. What's probably more accurate to say is that we're going to find the areas of the browser that we can successfully sidestep and take them out of the process. Because when you're doing something in VR, most of the time, and I think there are going to be edge cases, but most of the time you're going to have a full-screen WebGL canvas that is presenting a scene in its entirety, and you're not going to have an awful lot of interaction with the DOM or with some of these other traditional web constructs. And in doing so, we can say, well, if the only thing that we're doing is showing a full-screen WebGL canvas on the HMD, maybe we don't need the compositor, and maybe we don't need this pipeline, and maybe we can use that as a flag to shortcut our way directly to the display. And so those are some of the possibilities that we're looking at right now to improve the overall latency.

[00:08:58.397] Kent Bye: And what do you see as sort of the mobile experience? Because I know that with Google you have Cardboard, and you have Gear VR, and the screens for those are mobile phones. It seems like it's pretty much a natural thing to just use your cell phone and put it into a simplified HMD of some sort to be able to jump into virtual reality. And so where do you see that sort of combination with taking web content, or being able to go to a URL and start to dive right into a VR experience?

[00:09:30.821] Brandon Jones: This is a really good question. So I should first point out that something like Gear VR, while I find it to be a fascinating platform, is not necessarily the platform that we're targeting right now. Simply because, for one, it's a closed ecosystem. And in order to get into that ecosystem, I don't think that they would turn away something like Chrome or Firefox. I think they'd actually be thrilled with it. But we would have to present an experience that is VR end-to-end. That's the goal of Gear VR: once you put it in, you don't have to leave that experience until you're ready to step out of VR. And that's not something that we're prepared for yet. We are very explicitly not creating browsers that work in VR. We're creating VR experiences that work in the browser. And so for the time being, you're still going to have to navigate between them in a more traditional way. For Cardboard, that would mean taking the phone out of the harness, navigating a web page like normal, and then probably clicking a button somewhere to indicate I want to go into a VR experience again. On the Cardboard side of things, yeah, I mean, it's generally going to work exactly like I just described. For Chrome, and I don't think Firefox has really investigated this too much yet, but for Chrome, we are working with the Cardboard APIs. And so once again, like I said, we have people that are dedicated to working on that. And as it improves, our integration with the Chrome browser is going to improve. So any tricks that they come up with to make mobile VR better is something that WebVR will mostly automatically inherit. It's not always a clean-cut fit. You know, the VR APIs that exist right now across the board, not just on mobile, are all very oriented towards your traditional game engines, you know, something that looks like Unreal or Unity. The browser is definitely not that. It's about the farthest thing away from that that you can get. But there are still enough similarities that we can typically use the same frameworks and the same methods.
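
As a rough sketch of what that "click a button to go into a VR experience" step looked like in the early experimental builds: the page requests fullscreen on its WebGL canvas and passes the headset device along, so the browser knows to present it on the HMD. The vendor-prefixed call names below reflect those early builds and have since changed, and the button id is just an assumption for the example:

```javascript
// Hypothetical "Enter VR" handler for the early experimental WebVR builds.
// `canvas` is the page's WebGL canvas, `hmd` the HMDVRDevice found earlier.
var enterVrButton = document.querySelector('#enter-vr');  // assumed element id

enterVrButton.addEventListener('click', function () {
  // Fullscreen can only be requested from a user gesture such as this click.
  if (canvas.webkitRequestFullscreen) {
    canvas.webkitRequestFullscreen({ vrDisplay: hmd });   // early Chrome builds
  } else if (canvas.mozRequestFullScreen) {
    canvas.mozRequestFullScreen({ vrDisplay: hmd });      // early Firefox builds
  }
});
```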

[00:11:32.723] Kent Bye: Yeah, and I heard that Cardboard was also started as a 20% project. Maybe you could talk a bit about the history of where that started and where it's at now at Google.

[00:11:42.966] Brandon Jones: Yeah, so Cardboard was started as a 20% project. So this is my personal opinion. This is nothing official from Google. But I feel like the reaction to it at Google I/O when it was announced officially kind of took some people by surprise. It was such a simple platform. But it was really popular. People loved it because it was so simple. I mean, it was the most scaled-down solution for VR you could possibly think of. And as a result, it was really attractive to people who don't want to plop down the money for an Oculus Rift and a system that's able to run it. And so I think that seeing the popularity of this platform, that enabled it to grow. But yeah, it's definitely moved beyond that at this point. We've blogged about it before. Google has. We're hiring people for it. We're moving forward. It's something we're taking seriously as a platform. And exactly which form that takes in the future, I don't really know. But you can be sure that it's something that there's a lot of interest in.

[00:12:52.356] Kent Bye: Yeah, when it first came out, my first thought was, you know, is this a joke? Or is Google trying to troll, like, the VR community by kind of making fun of it? But it does seem like out of that has come at least a developer ecosystem and a lot of apps. And, you know, it seems to really have legs within Google and within the Android ecosystem.

[00:13:12.973] Brandon Jones: Yeah, so I gotta say, I remember before it was actually announced at I/O, I had heard some whisperings that Google was doing something with VR, and I was already working on WebVR at this point. It wasn't really publicized, but I was definitely working on the code for it. So I was really excited about this, and my manager called me into his office one day and handed me this little cardboard slab and said, this is our VR solution. And I laughed in his face. I just thought it was so absurd, you know, holding this little cardboard platter. And then I started, you know, peeling it apart and putting it together. And I quickly went from, oh my gosh, this is crazy, to oh my gosh, this is brilliant. You know, just seeing how the little thing folded up and came together and the little magnet trigger and everything like that. Very, very quickly I looked at it and realized this is going to be, you know, people are going to love this. And yeah, so it's been fun to see that that intuition was right on because it really is something that's caught on and it's kind of had a life of its own. People really have been building some awesome experiences for it.

[00:14:18.537] Kent Bye: Yeah, I haven't spent a lot of time in the Google Cardboard realm, so what type of experiences have you seen out there that you really enjoy or that you think people should check out?

[00:14:28.061] Brandon Jones: So most of the ones that I've seen that are really good tend to be things that cater to the limitations of the platform. So they're not usually super heavy on graphics. A lot of the time it's just simple 360 video. But that can be very compelling in that scenario, like there's the Paul McCartney concert that I think Jaunt VR did. The original Cardboard demos, the Windy Day and the Earth demo that they had in there and stuff like that. These are all really, really cool examples of what you could do. I recently saw one, and I cannot remember for the life of me what the name of the application was. It was effectively a two-stick shooter, but with no sticks. It was set in this little miniaturized bedroom, and you were a wizard that went around blasting these little toy dolls. And it used a somewhat awkward scheme where you were either constantly shooting or constantly moving, and it was constantly following your gaze. But, you know, despite the awkwardness of the controls, it was a really fun experience because the environments were well built. It was cute. It, you know, really played with stereoscopy in a way that was effective and noticeable. And it was just doing something that was different and that was outside the realm of just looking at a video. And I honestly, I don't remember too many names of different apps, but I have definitely been surprised by the ingenuity of what people can do in such a limited situation. And it really is, like, Cardboard is limited in interesting ways. It's limited not only in processing power, because it is a mobile phone, but in input. You really only have, like, one bit of input with a little magnet trigger, and not even that is reliable. So a lot of times people find different ways of getting around that. And so you're really just stuck with, well, I can look in different directions. People do such cool stuff with even that limitation. It's really inspiring to see. And so you just think, my goodness, if this is what we can do when the only input you have is turning your head, when we start to ramp up to Valve's solution with the controllers in hand and everything like that, how much better is this all going to be? It's really exciting to project forward and think about where we're going to go.
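
To make the "all you really have is gaze" point concrete, here is a small sketch of how gaze picking is commonly done in a Three.js scene (Three.js is assumed here for illustration, not named for any particular Cardboard app): cast a ray from the center of the camera's view each frame and treat whatever it hits as the object being looked at.

```javascript
// Gaze-picking sketch using Three.js: the "cursor" is just the screen center.
var raycaster = new THREE.Raycaster();
var screenCenter = new THREE.Vector2(0, 0);  // normalized device coordinates

function pickGazeTarget(camera, selectableObjects) {
  // Aim a ray from the camera straight through the middle of the view.
  raycaster.setFromCamera(screenCenter, camera);
  var hits = raycaster.intersectObjects(selectableObjects, true);
  return hits.length ? hits[0].object : null;  // closest object under the gaze
}

// Called once per frame from the render loop.
function updateGaze(camera, selectableObjects) {
  var target = pickGazeTarget(camera, selectableObjects);
  if (target) {
    // e.g. start a dwell timer here, or trigger an action when the
    // Cardboard magnet switch fires.
  }
}
```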

[00:16:42.802] Kent Bye: Yeah, and being at GDC this week, there's been a lot of big announcements with virtual reality. But also, Unity has finally come out with Unity 5. And I know that one of the big features that people have been waiting for with Unity 5 has been to be able to export into WebGL. So with that capability now, and being able to take WebGL content from a Unity program and put it into the browser, what are the implications of that when it comes to what you're doing, and also with how that plays into WebVR?

[00:17:12.784] Brandon Jones: So, I think the big implication is that it provides tools into the space that are of a caliber that just haven't been present thus far. You know, WebGL is a fairly low-level API, and while we have a myriad of really interesting libraries that have been built up around it, the primary one being Three.js, which makes it very easy to throw scenes together, there's still just been something lacking from the tooling world. Where, you know, in the native space we've had years and years of evolution of these tools to build really high quality graphics content and games and scripting and everything like that, that's just not an ecosystem that has had the chance yet to evolve on the web. So being able to take the cumulative experience of the native space and with a click of a button bring it down into the web is really going to be huge. I mean, it's such an awesome way to bridge that gap and allow people to use the tools that they're already familiar with to create really, really incredible content. You know, it's not completely a magic bullet. It has some downsides to it. The primary one being that the downloads can be a little monolithic right now. And that's something that, you know, people are aware of, and we're trying to work on it from various different directions. But the quality of these exports is only going to get better. They're only going to become more friendly to the web as time goes on. And it's an incredible platform. Unity and Unreal are incredible platforms. And it can only mean good things to have them accessible to the web.

[00:18:55.506] Kent Bye: Yeah, because at the moment when you do a WebGL plugin or a browser plugin from Unity, then you have to, like, sit there with the Unity loading screen waiting for everything to download. And so, you know, do you foresee it eventually being a little bit more dynamic or instantaneous, or, you know, at least having the first part of it available right away? What are some of the ways that you would imagine to get around that process of having to get that many assets down? Is there a way to do that without having to sit there and preload it? Or what are some of the solutions that would be proposed to get around that?

[00:19:29.235] Brandon Jones: So, the big thing that is going to have to happen there is that people are going to have to migrate towards more of a streaming mentality, which is something that the web has been kind of fostering for a long, long time, but the gaming space, not so much. I mean, it is there, but especially with a lot of mobile-focused games, which tend to be the ones that port over best to the web, the technical capabilities are really well aligned between mobile and the web. On a mobile device, you've got flash storage and you can just dump a bunch of resources into it and then read them on the fly. And it works really well. That's not something that works well on the web, though. And so the reason why you see these monolithic downloads is because when you take these games that have been designed to load assets that way, and you just do a straight export, the export can't fix that for you. And so it serializes all of your assets into effectively a giant data array that gets downloaded at the beginning, when you visit that Unity page, and can then be read from on the fly as if you were reading it from a hard disk. That's certainly not the only way to go about loading these resources, and I'm not fully aware of what capabilities Unity has in this direction, but, you know, the ability to go out to the web and say, all right, I'm coming up on this position in the map, so I'm going to need the traffic light resource and the mailbox resource and the chapel resource, and pull these things down as you're actually going through the environments is, I think, ultimately going to be the direction that we need to go in order to get content that performs really well on the web from a download perspective. Because what you really want is to be able to start up the web page, get into the basic content as fast as possible, and then have the browser just begin streaming down the resources that you want in the background. Because browsers are really good at that. They've spent a long time fostering that capability. And so the more we can take advantage of that for things like Unity and Unreal, the better.
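
A minimal sketch of that streaming mentality in plain JavaScript: keep a manifest of assets keyed to positions in the world, and as the player approaches a location, fetch the nearby assets in the background instead of downloading everything up front. The manifest entries, URLs, and the addAssetToScene helper are all hypothetical, just to show the shape of the approach; this is not how Unity's exporter actually works.

```javascript
// Hypothetical manifest: each asset records where in the world it is needed.
var manifest = [
  { url: '/assets/traffic-light.glb', position: { x: 120, y: 0, z: -40 } },
  { url: '/assets/mailbox.glb',       position: { x: 125, y: 0, z: -38 } },
  { url: '/assets/chapel.glb',        position: { x: 300, y: 0, z:  10 } }
];

var loaded = {};             // assets already requested or fetched
var PREFETCH_RADIUS = 150;   // how far ahead to start streaming (world units)

function distance(a, b) {
  var dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Call periodically (say, once a second) with the player's current position.
function streamNearbyAssets(playerPosition) {
  manifest.forEach(function (entry) {
    if (loaded[entry.url]) return;
    if (distance(playerPosition, entry.position) > PREFETCH_RADIUS) return;

    loaded[entry.url] = true;  // mark immediately so we never fetch it twice
    fetch(entry.url)
      .then(function (response) { return response.arrayBuffer(); })
      .then(function (buffer) {
        // Hand the raw bytes to whatever loader the engine uses (hypothetical).
        addAssetToScene(entry.url, buffer);
      });
  });
}
```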

[00:21:42.023] Kent Bye: I see. Great. So in the context of WebVR and where you see that going, what do you see as the ultimate potential for virtual reality and what it might enable people to do?

[00:21:54.428] Brandon Jones: Well, so the answer that I give to this is kind of boring, I think, and I give it to a lot of people. But I think it's worth reiterating that I don't actually see the web in relation to VR as a platform for AAA gaming or anything like that. I think it will be tried. I think it will be tried to some degree of success. But I think that you're always going to have a class of applications that will just need every last drop of performance that you can give them. And the web is not that platform. And I don't think it has to be that platform. We always want to make it better. But it's never going to quite match up to the bare-metal grit that you can get from native. But on the flip side, native is not very good at delivering quick, piecemeal experiences. I mean, imagine a YouTube of VR for a moment. Somebody sends you, hey, I want to show you this awesome 360 stereoscopic cat video that I found. And you click a link and it pops up and says, all right, go to Steam and download the 360 cat viewer. And then navigate to this resource. Native is just not really the right place for that. But if you can go to that website and click the link that your friend just sent you on Facebook, and have it pop up and say, all right, put on your headset now, put on your Oculus Rift, laugh at the cat in stereoscopic 3D, and then pull it off and move along your merry little way, this is the type of content the web excels at. It's the ability to deliver quick pieces of content, this type of stuff that nobody would download an app for. But if you can view it by just clicking that link, clicking that play button, you will absolutely do it. I think video is going to be huge. You know, video is massive on the web. And I think that that will continue through into VR. I think things like architectural visualization will be huge. My father-in-law works in real estate, and I showed him the Oculus Rift and Cardboard, and the very first thing out of his mouth is, how can I use this for my business? Because he sees the potential in being able to hand clients even just a Cardboard viewer and say, here's a preview of one of the properties that we have on the market. If you like it, we can go walk through the real thing, and be able to cull the options really quickly that way. That's the type of content that may not be a good fit for the real world. You can imagine, like, a Zillow where some properties have a button next to them that says, walk through this property in VR. And, you know, that'll be a great experience. This is something that already exists. There's a company called Matterport that does basically Google Street View for real estate. And they have demoed to me, you know, WebVR versions of their viewer. And it's incredible. And so I can't wait to get the technology to the point where they can actually put that on the web as a product, and not just some experiment that they're doing in the back room. These are the type of things that I think the web will excel at. So it doesn't have to, you know, fill the same shoes as native to be successful.

[00:25:09.723] Kent Bye: Would you imagine something like the metaverse coming through the medium of a web browser? Because this starts to get into how the internet and the metaverse both have this sort of interconnected network of different spaces: in VR it would be virtual spaces, and on the web it's web pages. So there's that overlap there. But do you foresee something like the metaverse being mediated through a browser?

[00:25:30.059] Brandon Jones: Yeah, so I have mixed feelings on this. I do think that if we ever get something like the metaverse, I kind of feel like of course it's going to be through the web. Because, you know, you go back and read, like, Snow Crash or Ready Player One or something like that, and basically what they're describing is the internet in 3D. And Vlad had a great quote. He's like, you know, we already have the metaverse. It's the internet. It's just in 2D. And I agree with that totally. And so, yeah, if we're going to have something that looks like the metaverse that we've all read about, I think it's going to be on the web. I don't think it's quite going to be the way that, you know, it's described in sci-fi. I have a very hard time envisioning people wanting to spend a lot of time in a world that is, you know, virtual but human scale. I guess that's maybe an awkward way of putting it. But you think about it: if you went to Amazon, and in order to find a product you actually had to walk down physical aisles, nobody would do that. It's insane. And so I think you're going to see an increasing mashup of those two worlds, but we're going to keep the best of both of them. We're going to take the times where it makes a lot of sense to actually be physically in a space and use that to our advantage, like the real estate walkthroughs, or games, or chat, or watching sports, or anything like that. And then we're going to, in some way, and I don't know what these interactions are going to look like yet, I don't think anybody does, but in some way we're going to take all of the experience that we've gained from desktop, from mobile, from these years and years and years of UI experience, and then apply them to the infinite canvas that will be our digital world, where we're no longer confined by, you know, the size of your screen and how big a phone you want to keep in your pocket. But those lessons aren't going to just go away. Those interactions aren't going to become instantly obsolete. Some of them are, but not all of them. And we're just going to have a much, much bigger canvas to paint them on. And so I think you're going to get a mashup of those two types of interaction. And, you know, it sounds awkward now. I'm sure we'll find a way to make it fairly natural. And I don't really know what that's going to look like. I'm really excited to find out, though.

[00:27:56.529] Kent Bye: Great. Well, thank you so much. Thank you.
