#114: Yuval Boger on Open Source Virtual Reality

Yuval Boger is the CEO of Sensics, a founding partner of the Open Source Virtual Reality (OSVR) project in collaboration with Razer.

The vision for OSVR is to create a standardized middleware layer that helps VR developers integrate a wider range of VR peripherals and input devices, and to provide a hackable platform to which VR hardware developers can add their own hardware customizations or implement specialized VR algorithms. An open source VR HMD is a compelling vision that has gathered a lot of interest from a variety of industrial VR manufacturers as well as VR peripheral makers. As the VR ecosystem grows, I think there will be an increasing need for something like OSVR.

Boger talks about some of the features of OSVR including:

  • Implications of having open hardware
  • Difference between OSVR’s abstraction layers for sensors and rendering
  • The negligible latency tradeoff of using OSVR, and its benefits
  • Will OSVR standards limit or encourage innovation?
  • Support for over a dozen different HMD manufacturers and many different input controllers
  • Android mobile integration and using OSVR for the sensor integration for Gear VR and the Oculus SDK for the rendering layer

Compared to the other VR HMDs on the market, it feels like OSVR is about where Linux was a year or two after it was first released. But in the long run, an open source model is something to keep your eye on. You can check out the OSVR GitHub repos via their GitHub landing page: http://osvr.github.io.

It looks like a lot of the software code is licensed under the Apache License, Version 2.0. You can also sign up to download some of the OSVR hardware schematics from their website, which are licensed under Google's Project Ara Module Development Kit License, since open source hardware licenses are less well defined.

One announcement that came out at the IEEE VR conference is that OSVR will be collaborating with 28 leading VR labs at universities around the country. I'd expect OSVR to become a great baseline platform for hardware and software hackers, as well as VR academics, to experiment and innovate on.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.056] Yuval Boger: I'm Yuval Boger, I'm CEO of Sensics. Sensics is a founding partner of OSVR. OSVR stands for Open Source Virtual Reality. It has a hardware component and a software component. The software component is essentially a middleware platform that connects games to peripherals, to algorithm vendors, to hardware. And then there's the open-source hardware, which is a nice, wide-field, high-resolution, head-mounted display that is completely hackable. You can download the schematics, you can make any changes you want, you could make it your own. I think both are equally exciting. They work together nicely, but they can be also used independently. On the software side, many years ago, when you bought a new printer, you had to upgrade your word processor, because the printer driver was part of the word processor distribution. And today, of course, it doesn't happen. You buy a new printer. You connect it into your computer. Computer says, oh, here's a new printer. I'll get the driver. And I'm all set, because the operating system has a print services layer. The same thing happens with VR. You don't want to write a game that works only on a Sony, or only on a Virtuix Omni, or only on a Sixense, because many peripherals come to market, many HMDs come to market, and you don't want to be chasing the driver game all day. That's where OSVR comes in. We define a generic interface for different kinds of devices. So you might say there are 10 different eye-tracking companies, and they all have different interfaces, but fundamentally, they just provide gaze direction and blink detection and something else. So there's a generic OSVR interface. So as a game developer, you would say, oh, I just need to know where you're looking at, and OSVR does the heavy lifting to identify, to configure, and talk with each of the individual vendors. Turns out that everyone loves the concept because the game developers say, oh, I now have access to a lot of hardware. I mean, OSVR today, with the recent announcement of the new partners, we've got 50 partners since January, runs on about 20 different head-mounted displays. You can write a game, here's the configuration file for the head-mounted display, you drop it into, say, the OSVR Unity plug-in, and you're all set. You don't have to write 20 versions of the game to do that. So the game developers love it, the hardware vendors love it, because once they write a plug-in for OSVR, they all of a sudden have access to all this content that uses OSVR. So they don't have to go and chase the studios, oh, please, please, please, create a version for me, it just works. So that's been great. And on the hardware side, we have some nice innovations. We have really nice optics. We have onboard signal processing. So for instance, we can carry the video signal wirelessly into the headset. Or I could take my phone, connect the cable to the headset, and see the contents of my phone screen in the headset. A lot of things that perhaps are not available in many other products. And we use it to trigger additional innovation, to say, here's what you can do. Here's what we did, let's see what you can do given the open source nature of this.
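
To make the abstraction concrete, here's a minimal sketch of what a client application built on OSVR's ClientKit C API roughly looks like. I'm writing this from memory, so treat the exact header paths, function names, and signatures as approximate rather than authoritative; the point is the semantic path idea Boger describes, where the game asks for "/me/head" and never talks to a specific vendor's SDK.

```cpp
// Approximate sketch of OSVR ClientKit usage -- names recalled from memory,
// not verified against current headers. The application requests a semantic
// resource and receives pose reports from whichever tracker plugin is active.
#include <osvr/ClientKit/ContextC.h>
#include <osvr/ClientKit/InterfaceC.h>
#include <osvr/ClientKit/InterfaceCallbackC.h>
#include <iostream>

// Called whenever the middleware has a new head pose, regardless of which
// tracker (or which vendor plugin) actually produced it.
void onHeadPose(void* /*userdata*/, const OSVR_TimeValue* /*timestamp*/,
                const OSVR_PoseReport* report) {
    std::cout << "head x = " << report->pose.translation.data[0] << "\n";
}

int main() {
    OSVR_ClientContext ctx = osvrClientInit("com.example.osvrdemo", 0);
    OSVR_ClientInterface head = nullptr;
    osvrClientGetInterface(ctx, "/me/head", &head);  // semantic path, not a device name
    osvrRegisterPoseCallback(head, &onHeadPose, nullptr);
    for (int i = 0; i < 100; ++i)
        osvrClientUpdate(ctx);                       // pump reports in the app's main loop
    osvrClientShutdown(ctx);
}
```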

[00:03:15.687] Kent Bye: Yeah, and I know with doing a virtual reality project in, say, Unity, you would drop the, for Oculus Rift, the Oculus SDK, you drop in that specific camera, and then you have to output a version of that. It has an Oculus Direct to Rift. With this, something with OSVR, would you be able to, instead of using the Oculus Rift SDK, use the OSVR SDK, and then still have something that's compatible with the Rift?

[00:03:39.011] Yuval Boger: Precisely. So Oculus Rift is one of the HMDs that's supported by OSVR. As a matter of fact, we just published a white paper just a few days ago that shows in, I think, three or four easy steps how you can take an Oculus Rift game and make it into an OSVR game in Unity, we have a plugin, you replace the plugin with our plugin and do a couple more steps and you're done. So, I think that it's advantageous for a game developer to say, how do I get the largest variety of hardware that can run on it, as opposed to limiting yourself to just one device, because if you limit yourself to one device in a year's time, that device might be outdated, And now what? You're going to have to keep chasing the hardware.

[00:04:21.809] Kent Bye: Right. And I know here at GDC, Valve just announced the Lighthouse, which is, you know, sort of this open source platform to be able to do laser scanning and positional tracking. And, you know, that's what they're using for their own positional trackers. Is that something that you've been also working with Valve to integrate some of their latest hardware technology to be compatible with OSVR?

[00:04:41.424] Yuval Boger: We haven't been working with Valve much to date, but we absolutely see how the Valve products integrate into OSVR. If you have a sensor that gives you positional tracking, whether it's optical tracking, whether it's the Valve tracking, whether it's a Sixense STEM, what have you, from the application perspective, it doesn't really matter how you get the tracking information, as long as you have it. So we're looking forward to getting Valve on board, to either us creating or the community creating or Valve creating a plug-in for OSVR that brings their hardware into the family.

[00:05:19.078] Kent Bye: I see. In the absence of something like OSVR, I would imagine what you would have to do is create up to 20 different versions of a binary executable that then would be very specific to that platform. And so I guess the ideal would be that you're able to just create one file and it works on everything. Is that sort of where you're going?

[00:05:37.834] Yuval Boger: That's the goal. I mean, OSVR not only supports a wide variety of hardware, but it also runs on many operating systems. OSVR runs on Windows. It runs on Linux. It runs on Android. I'm sure we'll have other ports soon. It connects to Unreal. It connects to Unity. We just had a community member do a MonoGame integration, just in terms of yet another engine that could connect to OSVR. And we feel that getting a standard interface for these peripherals is great. Imagine what would happen if Ford and GM didn't agree on the type of fuel that they need to drive the cars. It would be really hard to find a gas station that serves your car. And now, okay, it's the same gas, more or less. It's the same type of pipe. You can still have different cars and have the same gas station. And I think OSVR serves that purpose of accelerating growth by bringing together different hardware, different games. By the way, it's not just hardware and games. It's also algorithms. So we can go now to researchers or anyone who's got unique technology and say you have a unique algorithm, maybe you've got a SLAM algorithm, maybe you've got an eye tracking algorithm, maybe you've got a gesture engine, maybe you've got voice recognition or face recognition. Plug your specific piece of expertise into OSVR and show the world what you can do, and all of a sudden you'll see an amazing growth in the breadth of features and the selection of hardware that's available within OSVR.

[00:07:03.063] Kent Bye: And so in virtual reality, I know that latency is king, you know, and anything that is delaying, you know, at the level of milliseconds is going to perhaps have a significant impact on something that's either 75, 90, or 120 hertz. And so since OSVR in some ways is an abstraction layer, are there any latency implications for using something like OSVR?

[00:07:24.875] Yuval Boger: In general, virtual reality, we think about it as two parts, sensing and rendering. Sensing is about getting all the inputs, whether it's voice or position or eye tracking and so on, and then the rendering pipeline. In the initial months of OSVR, at least the public months, we've been focusing on the sensing side, and we've gone through a lot of measurements and a lot of work to make sure that the performance is very high and the abstraction layer doesn't slow you down. Fortunately, what's happening on the rendering side is that you see AMD with LiquidVR, Nvidia has their stuff, Intel has things in the works, so we see a lot of the hardware vendors working to improve the latency on the hardware gear, and that's great. I mean, we look forward to working with them, to the benefit of everyone.

[00:08:13.592] Kent Bye: So does that mean that there's zero impact, or zero latency? I'm just trying to get a sense of what the trade-off might be if you use OSVR.

[00:08:17.454] Yuval Boger: I mean, it's negligible latency. Every piece of code, every instruction adds some non-zero latency, that's always the case, but I think the benefit is there. I mean, for instance, we have a wireless video link that can carry HD 1080 signals wirelessly, including to the OSVR Hacker Development Kit. And the wireless link adds one millisecond of latency. And so if you want to use wireless, you get the benefits of wireless, untethered video, and you pay the one millisecond latency price. If you don't want to pay that one millisecond, you can use a cable. Same with OSVR. I mean, there's very little latency. But obviously, with every piece of software, it adds something.

[00:08:59.025] Kent Bye: OK. Yeah, that makes sense. Now, because there are so many new input controllers, and you have things like full skeletal tracking of all the fingers and stuff, in some ways I would imagine that it may be difficult, if you have, say, 30 or 40 sensors on your body, to come up with some sort of standardized template that would kind of be applied to all of them. I guess what I'm getting at is that we're at the phase where virtual reality is so early that would something like OSVR be a standard that would be limiting innovation in terms of the types of input that a new crazy device might want to be using?

[00:09:39.636] Yuval Boger: Well, obviously we see OSVR as encouraging innovation, and the reason it encourages innovation is that if you have any kind of unique technology, hardware, software, optics, sensing, gaming, and so on, you don't have to build the entire ecosystem to bring it to market. So previously, if you had an eye tracker, you either needed to get someone to integrate it for you, or you needed to build your own HMD around it. Not anymore. I mean, you can take the designs and hack them and put your eye tracker there. Same with software. I think it encourages innovation. The way OSVR software works is that we focus on the functionality of the device. So for instance, going back to the PC world, you might have a device that's a printer and a scanner. It's an integrated all-in-one device. And that, to the PC, it might have a printer interface and a scanner interface. It just happens to be in the same physical device. So we certainly see devices that have multiple functions in them. I mean, even an HMD has a display interface, has a tracker interface, maybe an audio interface, and so on. We're not creating the standards on our own. What we've been doing is we've been working with our partners, whether it's SoftKinetic, Leap Motion, Sixense, Nod, and others, to say, here's how we view the world in terms of the interfaces, the type of data that your device generates. Please give us your feedback. We're posting that online so everyone can comment on it. And then once we agree that this makes sense, now we're going to implement our side, you're going to implement your side, and now you've got an OSVR plug-in. So I think that we're accelerating innovation through making it easier for devices to come quicker into the game and into the market.
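
Here's a rough sketch of that printer-and-scanner idea in code. The names are invented for illustration and are not OSVR's actual plugin API: one physical device registers several functional interfaces under semantic paths, and applications only address the function they need, not the physical box.

```cpp
// Hypothetical sketch (not real OSVR code): one physical device exposes
// multiple logical interfaces, like an all-in-one printer/scanner or an HMD
// that has a display, a tracker, and audio in the same enclosure.
#include <functional>
#include <iostream>
#include <map>
#include <string>

class DeviceRegistry {
    std::map<std::string, std::function<std::string()>> interfaces_;
public:
    void expose(const std::string& path, std::function<std::string()> read) {
        interfaces_[path] = std::move(read);
    }
    std::string read(const std::string& path) const { return interfaces_.at(path)(); }
};

int main() {
    DeviceRegistry registry;
    // A single HMD registers three logical interfaces under semantic paths.
    registry.expose("/hmd/display", [] { return std::string("1920x1080 @ 60Hz"); });
    registry.expose("/hmd/tracker", [] { return std::string("orientation quaternion"); });
    registry.expose("/hmd/audio",   [] { return std::string("stereo out"); });

    // An application only cares about the functional path, not the physical device.
    std::cout << registry.read("/hmd/tracker") << "\n";
}
```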

[00:11:20.575] Kent Bye: And I think you mentioned 30 HMDs. I didn't even realize there were that many at this point. Maybe you could talk about some of the big ones that are out there, and maybe some of the other ones that most people probably haven't heard of yet.

[00:11:31.299] Yuval Boger: So I think by now we support 10 or 12 HMD manufacturers. So for instance, we announced that Vuzix has joined OSVR. So actually, even here on the show floor, you can see a Radial-G racing game running on the OSVR hardware. At the Vuzix booth you'll see the same game running on their hardware, and it was so great that, you know, they really didn't need to do much to get it running. They just had to put the configuration file in the right place, and now Unity, or OSVR over Unity, took it from there. We have VR Union, which is a nice wide-field HMD that's being shown here, also a new supporter of OSVR, and several others. And of course, some of these vendors, like Sensics, for instance, where I come from, we have multiple HMDs. So they're all supported to make life easier for our customers.

[00:12:26.764] Kent Bye: I see. Yeah, I think that makes sense. And Sensics has been in virtual reality for a long, long time. Maybe people coming from consumer VR may not be aware of your company and what type of stuff you do. So maybe you could tell us a little bit more about Sensics.

[00:12:39.934] Yuval Boger: So Sensics started over a decade ago. It was an outgrowth of a project at Johns Hopkins University. A major Japanese car company came to Johns Hopkins, and Hopkins did a project for them to create a super wide, super high resolution head-mounted display for car design, for being able to visualize the interior of a car. That prototype, which was created over 10 years ago, had a 150 degree field of view and 8 million pixels per eye. So even by today's standards, that is a very, very impressive piece of hardware. And since then, Sensics has been selling for many years primarily into the professional market, whether it's military training or academic research or commercial entities that wanted to do high-end VR. We still do that, but now we've taken our expertise and our technology and worked with Razer to create OSVR, to bring some of our expertise into the consumer market. And we're also doing things with other vendors on, again, bringing our technology to medical fields, to entertainment fields, or to other things that are not immediately gaming.

[00:13:46.654] Kent Bye: And so, why Razer? What does Razer do? And talk a bit about the partnership of how you two are working together.

[00:13:52.397] Yuval Boger: I mean, we've always admired Razer in terms of their products and the brand. And of course, the understanding of gamers. Razer is a global leader in products for gamers. And so when you want to create a gaming HMD, that's really a great place to start. They've been interested in VR for a while and we've been interested in gamers for a while. So when we got together about 18-20 months ago and decided to work together, it was like a marriage made in heaven. We brought our VR expertise, they brought some ideas and of course they brought their design capabilities, their production capabilities, brand and some of the other assets that they have and together we created OSVR.

[00:14:32.052] Kent Bye: And one of the things that I've noticed about OSVR is that it has a different approach to the lenses. Maybe you could talk about how are the lenses in the OSVR different than maybe some of the other HMDs that are out there.

[00:14:44.115] Yuval Boger: There are many different types of optical designs, and of course everyone or every project prefers a different design, just like there's no single car that's good for everyone. Optics is one of our specialties. We always look at optics design as a study in trade-offs. You can want wide field of view or very low distortion or very low cost, light weight, maybe you have certain materials that you like, maybe you want the lens to be very comfortable, maybe you want to be able to wear glasses with it, and so on and so on. So there's a whole list of requirements. We, together with Razer, came up with a list of requirements that we felt was satisfactory to VR. We wanted to create a high-quality experience in terms of very low distortion, so you don't have to work hard in the GPU to fix it. We wanted an image that's clear all the way to the edges, so there's no blurriness and little or no color breakup. And so what we had to do to achieve that is to use more optical elements. When you look at other products, they typically use a single asphere lens in each eye, and we use, in this case, we use two. And using two lenses allows us to have better control over distortion and over the color breakup. I put a pretty lengthy blog post on my blog, VRGuide.net, that talks about these particular optics and shows how they're built, what the performance looks like, and talks about the trade-offs. So if anyone is interested in a little bit of behind-the-scenes on the optics design, they're welcome to go and read that post.

[00:16:13.098] Kent Bye: Yeah, and you had mentioned that, you know, because it's an open-source project, OSVR, that, you know, you have community collaboration on both the code and potentially the hardware side. And what have been some of the early wins that you've already seen in terms of that collaboration, in terms of the software and possibly even the hardware?

[00:16:31.025] Yuval Boger: We're opening up the software to the public in a couple of days. So until now, we've been in closed beta. And even in closed beta, we are so happy to see the response and the willingness and desire of people to contribute. Some people have reviewed the code and said, well, you know, we would like you to do it a little bit differently. Some have created adapters for a particular game engine. Some have looked at that and started working on creating device drivers. Some are looking at that and creating what we call analysis plugins, things that take information from sensors and make them higher quality or better for the game consumption. There are a lot of universities that are showing interest in contributing there. We've also launched, together with Razer, an academic program where we make it easy, sometimes free, for universities to get an OSVR headset to start experimenting with it and see what they can do. And honestly, we can't wait until a couple more days where we open it to the public and see what happens there in terms of innovation, porting to other operating systems, additional game engines, algorithms, and so forth.
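
Here's a tiny illustration of the "analysis plugin" idea Boger mentions, again with made-up names rather than real OSVR code: a stage that sits between a raw sensor and the game and hands back cleaner data, in this case a trivial exponential smoothing filter over a one-dimensional tracker reading.

```cpp
// Illustrative only: the kind of analysis stage that could sit between a raw
// sensor and the game, improving the data before the game consumes it.
#include <iostream>
#include <vector>

class SmoothingFilter {
    double alpha_;        // smoothing factor, 0..1 (higher = trust new samples more)
    double state_ = 0.0;
    bool primed_ = false;
public:
    explicit SmoothingFilter(double alpha) : alpha_(alpha) {}
    double process(double raw) {
        state_ = primed_ ? alpha_ * raw + (1.0 - alpha_) * state_ : raw;
        primed_ = true;
        return state_;
    }
};

int main() {
    std::vector<double> rawSamples = {0.0, 0.9, 1.1, 0.95, 1.05};  // noisy sensor data
    SmoothingFilter filter(0.5);
    for (double s : rawSamples)
        std::cout << "raw " << s << " -> smoothed " << filter.process(s) << "\n";
}
```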

[00:17:39.534] Kent Bye: And in terms of the hardware being open source and hackable, what kind of implications do you see that having? I mean, why would somebody want to go in and change anything?

[00:17:49.999] Yuval Boger: Well, they might want to change something if they have special needs. So for instance, even at the show, we're showing a couple of different what we call face plates, you know, the plate going outside. It's designed to be hackable, so it's a removable plate, and now you can start adding modules to it. Maybe you want to add infrared LEDs, maybe you want to add cameras, maybe you want to add a particular sensor. Maybe you say, well, I like your strap, but I want a strap that has a different color or holds the head in a different way. Maybe you want a higher resolution screen or a different type of screen. And so by making the system modular and allowing you to change just what you want, I think we make it easier and faster for you to get to market. We have people who say, oh, I'd like to implement some unique function in your FPGA. It's great that you've got this processing power on board. Let's do this and that on it. And now they have a platform that they don't have to reverse engineer. They can sort of forward engineer it, take what we have and take it from there.

[00:18:53.295] Kent Bye: And because there are so many different VR input controllers that are out there as independent companies, and they have their own SDK, would using something like OSVR mean that you wouldn't have to bother with their SDK? You just use OSVR, and you've had the majority of them just sort of do their integrations through your plugin?

[00:19:11.709] Yuval Boger: Precisely. So for every type of device, you can work directly through the OSVR API, and that's it. If a new device comes on board that is not supported, either the vendor adds the device or some customer says, oh, I can see the API. I can see how OSVR works. I have an example plug-in from OSVR to show how it's done. So it's very easy to create a plug-in. When a hardware vendor comes out with a new version or new capability, they can just upgrade and distribute the OSVR plug-in. So the game could benefit from the extra performance and sometimes the extra features without having to recompile. So we've tried to make it, I mean, we understand that VR is fast moving, there are a lot of moving parts, and that it's just not right to freeze everything in time and say, this is what you do, and let's not change anything until I'm done, as opposed to giving you the flexibility to mix and match and basically plug and play with all these devices.
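
A hypothetical sketch of that plugin model, not OSVR's real PluginKit API: vendors register a factory with the middleware, the game only ever sees the generic interface, and a plugin can be swapped or upgraded without the game recompiling.

```cpp
// Hypothetical plugin-registry sketch with invented names. The vendor ships a
// new plugin build and the game binary never changes.
#include <functional>
#include <iostream>
#include <map>
#include <memory>
#include <string>

struct Controller {
    virtual ~Controller() = default;
    virtual std::string poll() = 0;   // simplified: return a state description
};

// Central registry the middleware owns; plugins add themselves here at load time.
std::map<std::string, std::function<std::unique_ptr<Controller>()>>& pluginRegistry() {
    static std::map<std::string, std::function<std::unique_ptr<Controller>()>> r;
    return r;
}

// A vendor's plugin. Shipping a newer version of this module is enough;
// games using the generic Controller interface pick it up automatically.
struct AcmeWandV2 : Controller {
    std::string poll() override { return "AcmeWand v2: position + orientation"; }
};
namespace {
const bool registered = [] {
    pluginRegistry()["acme_wand"] = [] { return std::make_unique<AcmeWandV2>(); };
    return true;
}();
}

int main() {
    // The game asks for a controller by name and uses only the generic interface.
    auto controller = pluginRegistry().at("acme_wand")();
    std::cout << controller->poll() << "\n";
}
```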

[00:20:10.332] Kent Bye: I see. And so in the web design world, there's something like progressive enhancement, where if you don't have JavaScript enabled, then it still works. And so would that be something similar, where you start up the game, and if it detects that you have a STEM controller or a Virtuix Omni, it would enable that, but otherwise you'd sort of fall back to a game controller then?

[00:20:30.322] Yuval Boger: Exactly. A game could identify what devices you have, and then either the game could decide, or of course the user could decide. I mean, just like I might have two printers connected to my computer, and today I decide to print on the color printer, and tomorrow I want to print on the laser printer. Imagine that you have a Razer Hydra, and now you bought a Sixense STEM, and you say, OK, great, I want to move to this controller. Or maybe now you're using the Valve controller that gives you positional tracking, but you still want to run the same game. So as long as you have an OSVR plug-in for the Valve motion controller, then you're good.
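
And a rough sketch of that fallback behavior, with invented device names: the game asks what's connected, picks the best available controller, and falls back to a plain gamepad when nothing fancier is present.

```cpp
// Illustrative fallback logic only; the device names are made up.
#include <iostream>
#include <optional>
#include <string>
#include <vector>

std::optional<std::string> pickController(const std::vector<std::string>& available) {
    // Preference order: positional-tracked wands first, then a plain gamepad.
    for (const char* preferred : {"tracked_wand", "motion_controller", "gamepad"})
        for (const std::string& device : available)
            if (device == preferred) return device;
    return std::nullopt;  // no usable input device found
}

int main() {
    std::vector<std::string> connected = {"gamepad"};   // e.g. no STEM or Hydra today
    if (auto choice = pickController(connected))
        std::cout << "Using input device: " << *choice << "\n";
    else
        std::cout << "No supported input device found.\n";
}
```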

[00:21:04.389] Kent Bye: What are some of the controllers that are already integrated in OSVR then?

[00:21:08.172] Yuval Boger: We have a very large number of motion trackers, head trackers, position trackers, and so on. In this show, we're showing Nod in terms of the ring. We're showing positional sound, directional sound, in this case from VisiSonics. So the list of devices is really growing every day, both because vendors come to us and say, oh, we'd like to write, help us out, and either we just show them example code or we can write it for them, or because people come and say, oh, I've got this unique EEG cap and I want to integrate it into my stuff. And we say, well, you know, that's not super high on our priorities right now, but here's how you would go about writing it yourself. And if you do, please consider contributing back to the community so that other people could improve it, or other people that have the same need could use it. I mean, I think the philosophy behind OSVR is received exceptionally well. People like openness, people like to be able to choose. I mean, imagine if you're going to buy another HMD and that HMD has some kind of, let's say, eye tracking. Well, why did they have this particular eye tracker inside? Is it because it's the best one? Is it because that's what they could get, a financial decision? Is it something that they just decided to develop in-house? With OSVR, you can choose your own. I mean, whatever eye tracker you want, if it has an OSVR driver, great. If it doesn't, then let's create one. And you can make your own choice, just like you can run Android on a big phone or a small phone, a powerful tablet, game console, and so on. We let people choose the configuration of their system.

[00:22:52.120] Kent Bye: So with Gear VR and mobile VR, I know that performance is even more important. And so does OSVR have integrations for Gear VR to be able to use your plug-in rather than the one from Oculus?

[00:23:07.768] Yuval Boger: OSVR runs today on Android, and that's something that we're going to keep improving very much over the next few months. So if you think about the Android model, I mean, you mentioned the Samsung Galaxy or Samsung Note. Well, that's an Android-based phone. So LG or Samsung or HTC or whoever took Android, said, OK, we like these features. Now we're going to make some vendor-specific optimizations or add some apps or add some unique features that we have. They can do the same with OSVR. They can take OSVR running on Android. We'll be able to take a Unity plug-in for OSVR running on Android and then optimize it based on their specific hardware. Maybe some people have, you know, kernel access and can change the underlying code. Some people don't want to do it for a more casual experience. So I don't think that OSVR is necessarily a shrink-wrapped solution for running VR on the phone, but I think it's a great starting point towards optimizing the VR experience on each particular device.

[00:24:09.215] Kent Bye: So would you recommend people do a specific Gear VR that's really to the metal, as fast as it can be, rather than using OSVR then?

[00:24:19.163] Yuval Boger: Again, if you look at Gear VR, I think that a lot of the optimization that has been done is around the rendering side. How do you minimize latency? But then there's a whole bunch of sensing implications. What controllers can you connect to Gear VR? Which of the phone sensors can you use? Could you connect a Bluetooth device? And not just could you connect it, but could you connect it in a way that you could swap the device tomorrow with something better? So we could certainly see a situation where OSVR coexists within Gear VR. You can take Gear VR and say, I'll use OSVR as the sensing layer to give me the unified access to all the different kinds of peripherals. And if Samsung did a good enough job on the rendering and latency side, then that's wonderful. So they don't need to do any more work. They can just focus on the sensing side.

[00:25:10.081] Kent Bye: And what are some of your thoughts here being at GDC and seeing all the virtual reality news that's been coming out here and all the different VR displays that are here on the expo floor?

[00:25:20.799] Yuval Boger: Well, we've been doing VR for a decade. So we've seen the ups and downs and ups and downs. Someone told me that people like us have a lot of arrows in our back from all the battles we fought in VR. So coming here and seeing VR becoming mainstream and seeing all these new products is wonderful. It's exciting. We're glad to bring our experience. We're glad to be a small part or hopefully a bigger part of it. But it's a wonderful time to be in VR.

[00:25:47.858] Kent Bye: And finally, what do you see as the ultimate potential for VR and what it might be able to enable?

[00:25:54.124] Yuval Boger: A lot of, well, obviously at GDC a lot of the focus is on gaming, but we've seen applications for VR, and some of our customers use it on the professional side, whether it's training, rehabilitation, low vision. I mean, if you have a teenage daughter, wouldn't you want her to be in a driving simulator in VR before she actually gets on the road? For 10 years now, every day, someone else comes in and says, oh, I've got this great idea on what to do with VR, and, you know, now they finally have affordable hardware, so they can start doing it. So while there's a nice focus on gaming at the moment, I think there are so many different ways that VR is going to be useful that, to us, it speaks again to the importance of having an underlying unifying layer that can bring all the hardware in and make it easier to focus on the experience that you're trying to create, as opposed to worrying about the nuts and bolts of what's underneath.

[00:26:51.605] Kent Bye: Great, well thank you so much.

Yuval Boger: My pleasure, thanks for having me.
