Valve’s Joe Ludwig talks about the latest updates on the Khronos Group’s VR standardization process, which is now being called “OpenXR.” Ludwig says that OpenXR is still primarily focused on creating an open, royalty-free standard for virtual reality, but that the group wanted to plan for the future and eventually accommodate augmented reality as well. In my Voices of VR interview with Ludwig, he talks about the OpenXR standardization process from Valve’s perspective and how they want to see VR become as open a platform as the PC.
LISTEN TO THE VOICES OF VR PODCAST
The OpenXR working group has just completed its exploratory process, and there are still numerous open debates; the Khronos Group is making this announcement of a name and logo at GDC in order to encourage more VR headset and peripheral companies to get involved in the standardization process. Ludwig can’t speak on behalf of any OpenXR decisions yet, but he was able to provide more insight into Valve’s motivations in the process, which are to develop a standard that will define what they see as a minimal baseline for a quality VR experience, and to make VR an open platform. OpenXR will also span the full spectrum from 3DoF mobile to 6DoF room-scale, and so there are many active discussions within the working group about what all will be included in the 1.0 specification.
VR is a new computing platform, and the OpenXR standard aims to help keep both VR and AR open platforms. The Khronos Group’s OpenXR initiative aims to lower the barriers to innovation for virtual reality so that eventually a VR peripheral company only has to write a single driver to work with all of the various VR headsets. But in order to know what APIs should be available for developers, the standardization process requires participation from as many VR companies as possible. Part of the announcement at GDC is to say that the working group has finished its preliminary exploration, and that they’re ready for more companies to get involved.
In my previous interview with Khronos Group President Neil Trevett, he said that this standardization process typically takes about 18 months or so. Given that it was first announced in December 2016, I’d expect that we might see a 1.0 specification for OpenXR sometime in the first half of 2018. It also depends upon how motivated all of the participants are; there seems to be a critical mass of major players in the industry to help make this happen, so it could happen sooner.
As to whether this OpenXR will mean that any VR headset will work with any VR software, that’s one of the theoretical technical goals, but there are many constraints to making it happen. Ludwig said that while this could technically be made possible with OpenXR, there will still be a layer of business decisions around platform exclusives. When I talked with Nate Mitchell of Oculus, he said that even if Oculus implements OpenXR, they still want to make sure that it would be a quality experience. Ludwig said that there will be other constraints around having the proper input controls, button configurations, and minimal set of hardware available for some experiences to work properly. It’s also still too early to know what the final OpenXR spec will look like, so companies can’t make any specific commitments about cross-compatibility yet. I’ll have more details on Oculus’ perspective on OpenXR early next week in a Voices of VR interview with Nate Mitchell.
Overall, I think that OpenXR is probably one of the most significant collaborations across the entire VR industry. The Khronos Group says that the OpenXR “cross-platform VR standard eliminates industry fragmentation by enabling applications to be written once to run on any VR system, and to access VR devices integrated into those VR systems to be used by applications.” If VR and AR are to become the next computing platform, then OpenXR is a key initiative to help make that happen.
Donate to the Voices of VR Podcast Patreon
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. So on today, February 27th, 2017, the Khronos Group is announcing the name of their virtual reality standardization process. And that name is OpenXR. So OpenXR is essentially trying to standardize all the different APIs across all the different major virtual reality companies. This was first announced back on December 6th, which was a call for participation in this standardization process. And at that point, they already had a lot of the major key players within the VR community involved in this process. So they definitely have momentum to make this happen. So at this point, the working group has been working for a couple of months now, and I had a chance to catch up with Valve's Joe Ludwig, who has been leading up the effort on Valve's side. And so at this point, the standardization process has been working in a private exploratory group to hash out some of the initial preliminary questions. And at GDC this week, they're announcing kind of an update, which is essentially that they have a name for the effort, which is OpenXR, and they're wanting to have more different virtual reality companies get involved in the process. So on today's episode, I talk to Joe Ludwig about this standardization process from Valve's perspective. And while he's not able to speak on behalf of the working group and talk about some of the decisions that have been made there, he's able to speak to Valve's perspective of what they really want to have happen with this standardization process, which is to be able to take the virtual reality platform and create a number of different open standards such that it is an open platform, much like the personal computer. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Voices of VR Patreon campaign.
The Voices of VR podcast started as a passion project, but now it's my livelihood. And so, if you're enjoying the content on the Voices of VR podcast, then consider it a service to you and the wider community and send me a tip. Just a couple of dollars a month makes a huge difference, especially if everybody contributes. So, donate today at patreon.com slash Voices of VR. So this interview with Joe happened on February 23rd. And what had happened was that there were a number of journalists that had gone to Valve to do kind of like a group interview with Gabe Newell and the rest of the VR team. And they mentioned the VR standards, but they didn't necessarily go into depth as to any of the updates. And so I reached out to Joe to see if he wanted to talk about what has been happening. And it just so happened that they were wanting to make some announcements and have this open call for participation to get more companies involved in this standardization process. So with that, let's go ahead and dive right in.
[00:03:06.034] Joe Ludwig: Hi, my name is Joe Ludwig. I've been on the VR team at Valve for five and a bit years, and I've worked on a bunch of things. Most recently, I've been involved in our initiative to share the Lighthouse tracking technology outside of Valve with more partners, and also with the VR standards initiative with Khronos.
[00:03:24.661] Kent Bye: Yeah, I had a chance to catch up with Neil Trevett of the Khronos Group three times now: back at GDC 2015, at SIGGRAPH last year, as well as when they first announced this initiative for open standards. So maybe you could catch us up a little bit as to what's been happening with these open standards.
[00:03:43.092] Joe Ludwig: Well, a lot's been happening. So we started working with Khronos back in October, I guess, to spin up the standards effort. And it's just proceeding apace. It takes a while to get through these things because you have to incorporate input from a lot of different places. Basically most of the VR industry on the platform side is involved in this effort, which is great. It's good to get all those voices involved. And we're just cranking on the standards. The big thing that is happening this week at GDC is it finally has a name and a logo. So the standard is going to be called OpenXR. And we'll be talking to more people about it this week and hopefully getting more people involved if they aren't already.
[00:04:24.185] Kent Bye: Yeah, so the X, I've been seeing more of that, because we have virtual reality and then we have augmented reality. And so we have this whole mixed reality spectrum, whether you're seeing the real world and bringing in virtual objects, or whether you're completely immersed in a virtual world and sometimes bringing real-world objects into the virtual world. There's this blend and continuum. Is that kind of the idea, that this OpenXR is trying to not only create the open standard for VR but also eventually expand out into augmented reality as well?
[00:04:58.430] Joe Ludwig: So what we're working on right now is very much a standard for the VR industry, which we at Valve think is very important because of the impact that open standards have had on the PC; they've really been the foundation of our whole business. The ability to have this open ecosystem where anyone can participate and innovation can happen in a lot of different quarters all at the same time. So, you know, that's why the open standard part of this is so important. But the Khronos initiative is really focused on VR right now. The working group felt like it was likely that this would move beyond VR at some point down the road, and so we didn't want to lock ourselves into a name that 10 years from now might seem a little dated because it's just VR. But right now, for the standard that we're working on for this first draft, we're really focused on VR specifically.
[00:05:43.478] Kent Bye: Yeah, so, you know, the long trajectory of virtual reality is that it's going to start to hack every one of the senses. And it's really starting right now with the visuals as well as the auditory. But I can see how, in the long trajectory of it, it's going to start to bring in all the other senses, starting with touch and haptics. But one of the things that Neil Trevett said is that in looking at the differences between the two SDKs that are out there right now, there weren't enough differences between them, at least with what's already being implemented with the visual and maybe the auditory, to really justify this fracturing into different SDKs. And so maybe you could comment on what you see as the impact of trying to standardize into one SDK, if that's going to enable software developers to write it once and then have it work on every platform.
[00:06:39.658] Joe Ludwig: So one of the first reactions that we get from people or we got from people when we first started bringing up this notion of standardizing VR is that it's early and that there isn't enough overlap to build it into a standard. But when you dig into it a little bit, there's a ton of functionality that's very much common between the different VR SDKs. And so we at Valve felt like it was time to standardize those things. One of the benefits of having an extensible standard like the one that we're working on with OpenXR is that that standard can form a foundation, and on top of that, you can build extensions that add specific additional features for things like, as you say, different senses, or for new peripherals, or other sorts of devices, or other sorts of application-facing features. Those can all be added as extensions on top of the common base. Standardizing that common base enables innovation, not just in the software or in the devices, but also in the APIs and the runtimes. So that's one of the things we're hoping to accomplish with this standard.
[00:07:38.372] Kent Bye: Yeah, I guess the follow-up there is that if you decide to stick with the SteamVR implementation of this open standard, and if it's going to kind of be built into your existing platform, does that mean you could do a detection of the HMD and have it completely work with the Oculus Rift? I know there are still some subtle differences sometimes when it comes to input controls and things like that, but I'm just curious to hear if that's kind of the goal from Valve's side. If you implement this, it would make it platform agnostic so that people could start to use the Oculus Rift, or vice versa, if Oculus is doing the same thing on their side, such that if something is coming out on Oculus Home, they'll be able to use it on a Vive and it should work as well.
[00:08:24.998] Joe Ludwig: So there are a few different components in there. One is exclusives to stores, and that's sort of a business-level thing. That's not really a technical question. But below that, we have the question of whether a device can be supported by a runtime that was built by a company that didn't build the device. And I think what we'll see in the long run is that these things start to standardize, hopefully with the device side of the OpenXR standard. In the short run, though, there are a lot of devices out there in the world already, and exactly what different device vendors are going to do is kind of up to them. From our side, we think it's important for the devices that we participate in building to be as open as possible at the driver level, at the device level, and also at the application level on top of the API. The third thing that's sort of wrapped up in what you're asking about is whether or not all experiences would work with all hardware. And from the input side, there are some questions there. If the input requires a certain number of buttons, or has other requirements on whether it has a joystick or a trackpad or whatever, then those input requirements, or even the shape of the controller, might cause the games to want to change their experience to customize it for certain input schemes. So, you know, as we're figuring out what the VR equivalent of the mouse and keyboard is, or the VR equivalent of the gamepad, there will be some experimentation among the application developers to try to build those controls into their experiences, and they'll end up adapting to different input devices. So that's something we'll have to work out with the application developers across the industry as we settle on what those long-term form factors are for input devices.
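The input-abstraction problem Ludwig describes above can be sketched as a small model: applications declare abstract actions, and a per-device binding table maps each action onto whatever physical control a given controller actually has. All names here (the actions, devices, and controls) are hypothetical illustrations of the idea, not anything from the eventual specification.

```python
# Toy sketch: abstract actions decoupled from physical controls.
# The same application code can query an action and get the control that a
# particular device binds it to, so the app never hardcodes a button layout.

ACTIONS = ["teleport", "grab", "menu"]

# Per-device binding tables (illustrative controllers, not real profiles).
BINDINGS = {
    "wand_with_trackpad": {
        "teleport": "trackpad_click",
        "grab": "grip",
        "menu": "menu_button",
    },
    "gamepad": {
        "teleport": "a_button",
        "grab": "right_trigger",
        "menu": "start",
    },
}

def resolve(action, device):
    """Return the physical control bound to an abstract action on a device."""
    try:
        return BINDINGS[device][action]
    except KeyError:
        raise ValueError(f"no binding for {action!r} on {device!r}")

# The same call works regardless of which controller is plugged in.
print(resolve("teleport", "wand_with_trackpad"))
print(resolve("teleport", "gamepad"))
```

The design choice this models is the one Ludwig hints at: the standard defines the action layer, vendors supply the bindings, and applications adapt to new input devices without per-device code.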
[00:10:05.482] Kent Bye: You mentioned that this is a modular architecture. And so, you know, you have something like eye tracking, for example, where we don't have something that's already built into everybody's SDKs. And so is this something that is trying to have a flexible enough architecture to be able to add in all sorts of different peripherals like eye tracking?
[00:10:24.160] Joe Ludwig: The goal with the API, much like other Khronos APIs such as Vulkan or OpenGL, is that it's extensible, either through an extension that's ratified by Khronos or through an extension that a vendor releases on their own. You can pretty much add any functionality you want to the system. And we're trying to make that as extensible as possible so that a lot of additional features (eye tracking would be a good example) can be added without much headache on the part of the application developers or the device vendor or the runtime vendor.
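The extension mechanism Ludwig compares to Vulkan and OpenGL can be modeled in a few lines: a runtime advertises a set of optional extensions on top of a common core, an application requests the ones it needs, and instance creation fails if a required extension is missing. This is a toy sketch of the pattern with made-up runtime and extension names; the real API had not been published at the time of this interview.

```python
# Toy model of a Khronos-style extension mechanism: core features need no
# negotiation, while optional capabilities are advertised by the runtime and
# explicitly enabled by the application.

class Runtime:
    """A hypothetical VR runtime advertising a core API plus extensions."""

    def __init__(self, name, supported_extensions):
        self.name = name
        self.supported_extensions = set(supported_extensions)

    def create_instance(self, requested_extensions):
        """Fail fast if the app requires an extension this runtime lacks."""
        missing = set(requested_extensions) - self.supported_extensions
        if missing:
            raise RuntimeError(f"{self.name} lacks extensions: {sorted(missing)}")
        return {"runtime": self.name, "enabled": sorted(requested_extensions)}

# A runtime shipping one ratified-style and one vendor-style extension.
runtime = Runtime("ExampleRT", ["EXT_eye_tracking", "VENDOR_foo_haptics"])

# The application enables only what it needs; everything else stays core.
instance = runtime.create_instance(["EXT_eye_tracking"])
print(instance["enabled"])
```

The point of the pattern is the one Ludwig makes: eye tracking or a new peripheral can ship as an extension today and, if it proves broadly useful, be folded into a later core revision without breaking existing applications.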
[00:10:56.688] Kent Bye: Now, when you look at the impact of this OpenXR initiative, what do you see as some of the first early wins and the overall impact on the VR ecosystem?
[00:11:07.559] Joe Ludwig: Well, I mean, we really see this as a part of our bigger attempt to try to keep VR as open as possible. And so OpenXR and the standard on the software side couples in with what we're doing with Lighthouse or SteamVR tracking, as we call it officially, where we're sharing it as widely as possible. We just dropped the class requirement for that system so that you can just download the materials online and start working on your device. We also think it mixes in with what we're doing with support for Linux and other platforms. Now, you know, we've got a developer release out there so you can use the Vive on your Linux machine. And OpenXR is sort of the software component of that. We want applications to be able to be written once so that they can support any headset, and we want devices to be able to have their drivers written once so that they can support any runtime or any piece of software. And we think that that abstraction allows more innovation on any side of any of those interfaces. So if someone wants to do something in an application and they want it to work for the maximum number of users, then they should support an open standard for the interface to the hardware, and they'll maximize their potential audience. The same thing is true for a device manufacturer. So a device manufacturer writes a driver, they write to a standard device interface, and it will work on the maximum possible number of applications, maximum number of possible stores and platforms. And we think that that's similar to the way that the PC ecosystem grew up around open standards like the bus, or eventually things like the keyboard connections, mouse connections, and then USB later, and VGA, and HDMI, and all these other standards that allowed the hardware in the PC ecosystem to be standardized and yet still people could form entire companies around innovating in this way or that way. 
So we think that OpenXR and the work we're doing with Khronos and the work we're doing in these other areas is just pushing that forward into VR.
[00:13:01.090] Kent Bye: So within the SDK, there's a number of different things, both from Oculus and Valve's side, when it comes to like trying to predict movements and trying to accommodate for when you may be dropping frames, whether it's the asynchronous time warp on the Oculus side or the asynchronous reprojection on the Valve side. Can you talk about that process of taking some of these kind of algorithmic fixes on the SDK side and how that's kind of fitting into this standardization process?
[00:13:30.182] Joe Ludwig: So one of the benefits of having so many different companies involved in the standardization process is that they all bring different perspectives and different requirements. So we hear different things from somebody who's working on mobile, like say Google with Daydream, than we hear from somebody who's working on a PC, you know, like we are at Valve, or at least the Rift side of Oculus is. So we're trying to develop a standard that encompasses all those points of view and make sure that in the implementations, any of those vendors can put out what they think is the best option for users, and they can include all those features. While there aren't a huge number of direct implications of something like asynchronous time warp or asynchronous reprojection, as we like to call it, the API is designed in such a way that those features can be implemented underneath it. And if a vendor decides to implement them, then those will be available to users who use that runtime, whereas another runtime may do things a little differently. So the goal isn't so much to support any specific feature, at least at that level. It's more to unlock the ability for the runtime vendor to actually implement those features on their side and provide them to users.
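The idea behind asynchronous timewarp and asynchronous reprojection mentioned above reduces to a small piece of math: when a new frame isn't ready in time, the compositor re-shows the last rendered frame, warped by the rotation the head has made since that frame was rendered. The sketch below computes that corrective rotation from two orientation quaternions; it is illustrative math only, not either vendor's actual implementation.

```python
# Quaternions are (w, x, y, z) unit rotations. The corrective warp is the
# rotation that carries the head orientation the stale frame was rendered at
# onto the freshest head orientation sampled just before display.

def q_conj(q):
    """Conjugate of a unit quaternion, i.e. its inverse rotation."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def reprojection_delta(rendered_pose, latest_pose):
    """Rotation carrying the rendered head orientation to the latest one."""
    return q_mul(latest_pose, q_conj(rendered_pose))

identity = (1.0, 0.0, 0.0, 0.0)
# If the head hasn't moved since the frame was rendered, the warp is identity.
print(reprojection_delta(identity, identity))
```

The runtime applies this delta as a reprojection of the existing image, which is why a dropped frame shows up as a correctly rotated (if slightly stale) view rather than a judder.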
[00:15:26.725] Joe Ludwig: We're certainly trying to cover that range. The hope with the standards effort is that those devices and all those platforms will be able to use the new standard when it's available. I think Valve is actually much more focused on the high end of that. We think that six-DOF positional tracking is a critical component of a VR experience, specifically for comfort of the user, and, for hand controllers, also for fidelity of input and the ability to do different things and see your hands and use your hands as part of your VR experience. So while we're focused on that high end, there are other people involved in the standardization process that are aiming at a different price point, aiming at a different sort of just-put-it-on, try-it-out level than we are. And so their voices are involved in this effort. And so I would expect that the standard, when it comes out, probably will support everything from three-DOF, no controller, all the way up to a high-end experience like the Vive.
[00:16:24.067] Kent Bye: And I'm wondering if you could comment on something specific, such as the chaperone system versus the guardian system. Is this a matter of breaking it down to the APIs and saying, okay, this is what any space management system to protect you from the boundaries should be able to have, and then defining those APIs? You kind of already have these two implementations of that, but I'm just curious to hear more about that process: once you try to break it down to that API, does that change how you've implemented anything in the chaperone system, for example? And do you have this kind of dialogue between these two implementations, trying to extract the common features and then spread out the learnings from each of those to both specific implementations?
[00:17:10.131] Joe Ludwig: So I expect that that will happen going forward. So far, we haven't really settled on the specifics of much of the API, and a lot of these things are open debates in the working group. But I would expect, going forward, as the standard gets more firm and more complete, and obviously after it gets ratified and becomes public and published, then I would expect that to start to feed back into how we think about certain things. And we may start to think about the way we would implement something somewhat in terms of the API, because obviously that does happen. You design an API that you think provides as much value as possible to the application developer on the other side of the API, while not being an onerous burden on the implementation that the API is providing access to. So there may be things where, as the standard is ratified, we'll end up doing things differently than we would have, so that they're more friendly to that standard API than they would have been otherwise if we'd still been rolling our own API.
[00:18:11.705] Kent Bye: So it sounds like right now you have a number of different major virtual reality players that are involved in these discussions, which, as far as I can tell, have been happening mostly behind closed doors within your own email lists or forums or whatever you have. And so I'm curious, where are you at in the overall process of this? It feels like this GDC is a bit of a mile marker in the sense that you're making some announcements, saying there's been progress made. But where are we at now? And where do you see it once you have kind of a 1.0 version of this open standard?
[00:18:45.134] Joe Ludwig: So we're still some ways away from the 1.0 version of the standard. The way that the Khronos process works is that you start out as an exploratory group, and that group writes a document that sort of describes the standard it's trying to create, and then transforms into a working group. And that happened for the VR effort about a month and a half ago, so it's still pretty early in the process. The reason that a lot of that activity happens behind closed doors is because these are people working, in many cases, at public companies, and they're working on teams that might have different goals in the market. So it's important for them to be able to speak freely from a technical and engineering point of view so that they can come up with the right answer without having to run everything past their marketing department and everything past their lawyers, because everything that they say in this forum is going to show up on the front page of some news site. So by keeping it private, anybody who wants to is able to participate, but they agree to non-disclosure to participate. And that allows them to talk amongst themselves much more freely than they would be able to if they were totally in public. And then when they're actually finished with their effort and have hashed out a lot of the details, then they step out and release the spec. The two big things that are happening this week at GDC with OpenXR are, first, the announcement of the name and the logo, and that this really is an effort that we're really pushing. And the second is, we really want more people to be involved.
So if there are any companies out there that are working on VR, either as device manufacturers or game engine developers, or they're working on tracking systems, or runtimes, or they have particular features they're looking to get integrated into other people's hardware, or whatever, if they're involved in the VR space and they want to participate in how the standard ends up, what ends up being in the standard and how it is formed, then this is your opportunity to get in. It's still early, and there's plenty of time to help ensure that the standard meets the needs you need it to meet, whatever your company is, whatever it does.
[00:20:49.108] Kent Bye: Well, the interesting thing that I see happening is that there's kind of a parallel tracks here between what's happening within the hardware manufacturers and what's happening with, say, the web, because I know that there's WebVR, which is a standardization process that's trying to essentially have the browser talk to the headset. And in the process of talking to a lot of these web developers that are doing this, they would have the insight, for example, that they don't need to necessarily integrate all of the controls into the VR standard, because there's already like a gamepad spec that defines how you do input controls into an experience with the web. And so you start to break down the standardization into these different component parts. And so given that insight, I'm curious if you have that same type of thing with this standard. Is it mostly just looking at the headset or are you also including the input controls as a part of this kind of unified standard?
[00:21:45.408] Joe Ludwig: So you mentioned a few things in there that are active points of debate. And so I definitely can't speak for the working group or what Khronos is going to end up with. From our point of view at Valve, we think that what should end up in this standard is a minimum set of features that allow a developer to implement an experience that would support multiple sets of hardware. So we think that's access to drawing to the display, head tracking data, input data (including things like the pose of the controller, the position and orientation of the controller when an input event happened, for events like throwing), and haptic events on the output side. So there's sort of a minimum set of things that we think are important to include in the standard, and we're pushing for that. And whether or not something ends up in, or whether it's optional as an extension, or whether it's a core part of the interface, those are all things that we're trying to sort out in the working group, and that's the consensus-driven process. But every company that goes into this has their own position on what they think is important. And for us, it's really, you know, we want application developers to be able to target a bunch of different hardware, and we want hardware developers to be able to target a bunch of different applications, without having to code specifically to anyone's private API.
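As a small illustration of why Ludwig wants the controller's pose delivered together with an input event's timestamp, consider a throw: the release velocity can be estimated with a finite difference over the last timestamped controller positions. This is a hypothetical sketch of why the timestamps matter, not any runtime's actual method.

```python
# Toy sketch: recovering a throw's release velocity from timestamped poses.
# Without the timestamp attached to the release event, the app couldn't tell
# how fast the controller was moving at the moment the button came up.

def release_velocity(samples):
    """samples: list of (timestamp_seconds, (x, y, z)) controller positions,
    oldest first. Returns the velocity vector over the last two samples."""
    (t0, p0), (t1, p1) = samples[-2], samples[-1]
    dt = t1 - t0
    return tuple((b - a) / dt for a, b in zip(p0, p1))

# Illustrative pose history at ~90 Hz leading up to the release event.
history = [
    (0.000, (0.0, 1.00, 0.00)),
    (0.011, (0.0, 1.00, -0.05)),
    (0.022, (0.0, 1.05, -0.12)),
]
vx, vy, vz = release_velocity(history)
```

A real runtime would fit more samples and include angular velocity, but the principle is the same: input data is only useful for physical interactions if it carries the pose at the event's own timestamp.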
[00:22:58.411] Kent Bye: So for you, what do you hope to see as kind of the best-case scenario as to what would happen with this standardization process? I know that Valve's been doing a lot of things with trying to do royalty-free technology and really push these open standards to be able to get this technology into the hands of people. But for you, I want to just kind of hear how you see that playing out as an inflection point, really leveraging this exponential technology of virtual reality, and how you see that playing out into the larger ecosystem of VR.
[00:23:32.212] Joe Ludwig: So one of the benefits of open standards is that they help to make it easier for people to participate and cooperate together without the need for a vertically integrated walled garden. And so we think that having standards on the device side and on the application side will help to essentially lock VR open so that it's more like the open PC ecosystem and less like certain other platforms that are much more closed down. So that's really the end state. And everything we do tries to support that. So the reason that we have SteamVR tracking out there as a royalty-free license for the tracking system is that we've had probably 100 companies come through our doors over the last five years who wanted to demo some piece of VR technology. And a really common theme among those companies was that they didn't have a tracking system. Tracking systems are very difficult to put together. So by building a tracking system and making it available to people, we can lower the bar to actually build a legitimate VR device that people can actually see some use from. So that's what we did. We built the tracking system, we invested heavily in that, and now we're sharing it with the developer community in a way that allows them to put it in whatever device they want. We don't demand any approval of what devices they build. There are no royalties. The only requirement is that the code ships from Steam, so the users will end up installing Steam. But that's it. So all of those device manufacturers can go ahead and make their device, and those devices are then available to all the users and all the application developers.
And that's the kind of end goal for OpenXR and really everything else that we're doing in VR, is that times 100, where there are a bunch of different technologies that people can use that are standardized technologies, mix them together and make something new in the same way that you can make a new PC by taking all these off-the-shelf parts that all hook together in standard ways and build something that no one's built before.
[00:25:31.032] Kent Bye: Awesome. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:25:39.792] Joe Ludwig: Well, the last time you asked me this question, I think what I said was that shared experiences together in VR were the thing. And I really still believe that. I think that the biggest potential is that VR can be a technology that brings people together. And one of the things you commonly hear about AR versus VR is that AR is open and social and much more friendly. and that VR isolates you and it cuts you off from the world. And I don't really think that's the right way to think about it because when you put the VR headset on, inside of the VR headset is everyone else in the shared experience. When you put the AR headset on, well, it's just the people who are actually in your room. So I think that VR has a potential to be much more social and open people up to each other much more. And so I still think that my answer is basically the same thing, which is VR lets you do things that you can't do in the real world, in real life. And it also lets you do those things with other people. So I think doing things that you can't normally do in the real world with other people is the ultimate potential.
[00:26:41.481] Kent Bye: Awesome. Well, thank you so much for joining me today, Joe.
[00:26:45.345] Joe Ludwig: Thank you.
[00:26:46.638] Kent Bye: So that was Joe Ludwig. He's been working on virtual reality systems at Valve for about five years now, and he's leading up Valve's participation in the OpenXR standardization process with the Khronos Group. So I have a number of different takeaways from this interview. First of all, just overall, I think that this was an interesting interview in the sense of trying to suss out some of the deeper values and motivations for Valve. It's pretty clear that Valve wants virtual reality to be seen as this new computing platform and to be as open as possible, much like the personal computer is open. And so in that process, they want to try to abstract all the fundamental components of what makes up what they see as the minimum viable virtual reality experience, which includes six-degree-of-freedom hand-tracked controllers as well as positional tracking. And so I think that is a little bit of a baseline for a VR experience. And so this is a little bit of a difficult topic to cover, just because a lot of the discussions that are happening within the context of trying to flesh out the spec have been happening behind closed doors. And so this was kind of my first opportunity to ask some very pointed, specific questions of Joe Ludwig. And, you know, each time I did that, he said, you know, I can't really necessarily speak on behalf of what's happening in these open debates within this OpenXR discussion, but from Valve's perspective, this is kind of what we're looking to achieve. Now, one of the things that Joe also said is that they want to create an API that provides as much value as possible to the application developer while also not being an onerous burden on the implementation of the API. So it's a little bit of trying to create this interface such that it is generalized enough to define the basic value that you need, but not so prescriptive as to say this is how you have to implement it.
So I think if you take a step back and look at this initiative, I think it's probably one of the most important initiatives that are happening across the entire VR industry, because first of all, it's the different platforms that are saying, hey, you know what? We don't necessarily want to own every different dimension of the vertical integration of this VR platform. This is a thing that is larger than any one company, and we want to be able to create a system such that you could have all these other minor players come in and still be able to interface their VR headsets and be able to still work with the different levels of software and experiences that are out there. Because it actually is lowering the barriers for innovation, because it's allowing these interfaces to be standardized to the point where you could have these small players come in and come up with some sort of peripheral that is able to seamlessly integrate with one of the headsets. Now, in talking to Joe, he said that there's still a difference between the technical implementation of this open standard and the types of business decisions that a company is making. So, for example, Oculus may still decide to have platform exclusives so that, you know, even though technically Oculus might be able to implement all the different dimensions of this OpenXR standard, that doesn't necessarily mean that they're going to make the business decision to allow the HTC Vive to play any of the experiences that are launched on Oculus Home. Right now, SteamVR actually implements a kind of platform-agnostic approach in their SDK such that it detects whether or not you have an HTC Vive or an Oculus Rift, and it's able to adapt the experience and still work. Right now, that's not true with Oculus Home. 
So at GDC this week, I'm going to be asking some of the representatives from Oculus whether or not they are planning on doing this kind of open approach and eventually making that business decision to make some of their experiences on Oculus Home a little bit more platform-agnostic. You know, virtual reality overall is going to be hacking all the different senses, and there's going to be peripherals that have to do with haptic displays and more and more sophisticated audio integrations. And so I don't know if this specification with OpenXR is going to include the five different senses eventually. I think they'll probably certainly start with the visuals. Defining an API for spatialized sound, that's an open question. And whether or not they're going to include anything with haptics, I think it's too early for them to actually have anything concrete. There isn't really even a haptic display company that's out there that's producing a fully fledged haptic display that would be able to define the API and what would be needed. But I think in the long run, this OpenXR standard sounds like it has extensions such that you could have these peripherals come in and start to create a module that starts to define some of those APIs. And as more and more of these different types of peripherals come in, then maybe you'll be able to standardize it and include it into the main OpenXR standard. So when I first talked to Neil Trevett, he said that this process is going to take about 18 months or so. And it also depends on the critical mass of all the different major companies and if they're able to, you know, come to a consensus. So we're about three or four months into that. So expect probably about this time next year, perhaps we'll start to see the finalized 1.0 spec of OpenXR. 
And at this point at GDC, they're just trying to get more companies to learn more about the initiative that's happening here and to get involved if you have some specific ideas for what type of low-level APIs you want to have available in order to enable your specific feature that you want to build into virtual reality and potentially even augmented reality as well. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends and become a donor. Just a few dollars a month makes a huge difference. So donate today at patreon.com slash Voices of VR. Thanks for listening.