Joe Ludwig is on the VR team at Valve, and was part of the original group with Michael Abrash that initiated the experiments into virtual reality and wearable technologies at Valve around January 2012. Joe says that Valve’s ultimate goal is to eliminate the gatekeepers in VR, and to foster a more open ecosystem like PC gaming. While Valve would love it if VR developers used Steam to distribute their VR apps and experiences, there’s no requirement forcing Vive developers to distribute only via Steam.
Joe talks about the different terms that Valve uses to describe their VR initiatives and products:
- SteamVR is the umbrella term for all of Valve’s VR efforts.
- OpenVR is the API, SDK and runtime that “allows access to VR hardware from multiple vendors without requiring that applications have specific knowledge of the hardware they are targeting.”
- HTC Vive is the virtual reality head-mounted display.
- Lighthouse is the tracking technology that they hope becomes an open standard for tracking.
Joe says there will not be a lot of time between the Vive’s developer edition and the consumer release, so his expectation is that most of the developers not initially selected will not be able to start developing on the Vive until the consumer release comes out in the winter of 2015.
Some of the other topics covered by Joe include:
- SteamVR Plugin for Unity
- Valve’s goal is to eliminate the gatekeepers in VR to make it more open like PC gaming
- Developers will not be forced to distribute their VR apps via Steam
- Best-case scenario for Lighthouse is to open it up, and have other hardware manufacturers start to have it available in public spaces.
- Some challenges for large spaces with Lighthouse
- The Demo Room at #SteamDevDays and how Valve worked over the past year to turn that into the Vive product
- Michael Abrash and a few other people at Valve started working on wearable displays around January 2012
- Place Illusion & Plausibility Illusion within VR. “The VR Giggle”
- Some of the more popular Vive VR demos: Job Simulator, Tilt Brush, Google Earth demo
- Application process for Vive dev kits. There’s not much time between the developer edition and the consumer release, so their expectation is that most other developers will start with the consumer release.
- Everything that Valve builds is iterated on with monitored play tests, and so that’s their hardware QA strategy
- Joe is looking forward to being in places that he can’t physically be, sharing that with other people, and going through rich and adventurous experiences
- Joe hopes that interactions between humans will become more positive with VR and that it’ll change online behavior because it’ll communicate more of our humanity
More information about applying for a Vive dev kit can be found here.
Become a Patron! Support The Voices of VR Podcast on Patreon
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast.
[00:00:11.994] Joe Ludwig: My name is Joe Ludwig. I work at Valve on the VR team. And SteamVR is sort of the umbrella term that we use for all of our VR efforts. OpenVR is specifically the API. So OpenVR is the SDK and the runtime to access various kinds of VR hardware. One of those pieces is the HTC Vive, which is a headset that we're working on with HTC that uses OpenVR to get access to the hardware. And Lighthouse is a tracking system that we've developed. The Vive will use it and we're hoping to expand use of Lighthouse past just the Vive and its controllers and also past VR in general. Now, let's say I'm a VR developer and I've created something in Unity. What is the process to make it compatible to work with the Vive? We have a plug-in in the Asset Store, so one or however many clicks it takes to get an Asset Store plug-in into your app, and then you've got cameras set up in your scene that are rendered on the display. The particulars of exactly what to click and what to drag where, I haven't actually used the Unity plug-in myself, but it's pretty simple. And so here at SVVRCon, what were some of the big messages that you were trying to communicate to this community here? Mostly that OpenVR and eventually Lighthouse are all about eliminating the possibility of gatekeepers. So we don't want VR to become what mobile has become, where there's a small set of players that have a large degree of control over what can go on. And so we want to make that much more open, more like the PC, because the PC ecosystem has seen a tremendous amount of innovation, and we think that's a result of the open nature of it. I see. So there's no sort of like, if you create an experience it has to be on Steam, but people kind of distribute it however they want, I guess is what you're saying. They don't have to be on Steam. We hope that many of them will be, because we want Steam to be a great place for VR experiences. But there's no requirement that they be on Steam.
There's no requirement that they distribute their app at all, or that they charge for it, or whatever. They can do whatever they want with it. Now, there's been some talk about wanting the Lighthouse to be a little bit of a standard of the USB of input tracking. So what do you see would be the best case scenario for what happens with Lighthouse from here on out? I think the best case scenario would be that we open it up as much as we can. We get lots of people building base stations and lots of applications find a use for it. And when you walk into a public space like the Convention Center here, there would just be lighthouse everywhere. That's a long-term goal, obviously. We're a long way from that. but we think that standardizing on a single tracking system will unlock a lot of potential in hardware companies that can do one piece of something interesting in VR but where the tracking is an additional challenge that they're not able to address all at the same time. That's an interesting sort of engineering optimization question in terms of what is the power of the lasers and what type of space it's optimized for because I'd imagine that you're really going for, as you say, 15 by 15 feet, which would indicate a certain sort of power. But if you go to larger spaces, it would need a lot more, I guess, juice in order to even work in that volume. At what point are you really trying to keep it underneath, say, the FCC regulations for how powerful lasers can be without getting special authorization or permission to even be able to manufacture them? The challenge for larger spaces is not actually the power of the lasers. The lasers are pretty well below the limits. The challenge with larger spaces is that the spin rate of the motor implies a certain level of precision in the tracking when you're within 15 feet of the base station. And if you were 30 feet from the base station, that angle that that same amount of time represents would be a much larger physical distance. 
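Joe's point about precision falling off with range can be sketched with some back-of-the-envelope numbers. The rotor speed and timing jitter below are purely illustrative assumptions for the sake of the arithmetic, not Valve's actual Lighthouse specifications; the only claim taken from the interview is that a fixed angular uncertainty subtends a larger physical distance the farther you are from the base station:

```python
import math

def positional_error_m(distance_m, spin_hz, timing_jitter_s):
    """Rough worst-case positional error from laser-sweep timing jitter.

    A rotor spinning at spin_hz sweeps 2*pi*spin_hz radians per second,
    so a timing uncertainty of timing_jitter_s corresponds to an angular
    uncertainty of 2*pi*spin_hz*timing_jitter_s radians. At range
    distance_m, that angle subtends an arc of about distance_m * angle
    (small-angle approximation).
    """
    angular_error_rad = 2 * math.pi * spin_hz * timing_jitter_s
    return distance_m * angular_error_rad

# Illustrative numbers only: a 60 Hz sweep and 1 microsecond of timing
# jitter (neither is an official Lighthouse figure).
near = positional_error_m(distance_m=4.5, spin_hz=60, timing_jitter_s=1e-6)  # ~15 ft
far = positional_error_m(distance_m=9.0, spin_hz=60, timing_jitter_s=1e-6)   # ~30 ft

# The same timing error yields twice the positional error at twice the range.
print(f"error at ~15 ft: {near * 1000:.2f} mm")
print(f"error at ~30 ft: {far * 1000:.2f} mm")
```

Whatever the real numbers are, the linear scaling is the point: doubling the distance from the base station doubles the positional uncertainty for the same sweep-timing resolution, which is why larger volumes are harder even though laser power is not the limit.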
So it's more about trigonometry and the distances involved there. Right now, we're focusing on a 15-foot across sort of squarish area tracked by two base stations. We're expecting that to, you know, get more developed and more flexible going forward. But for this year, that's basically where we're focusing our energy. And last year, at the beginning of the year, there was the Steam Dev Days, which was kind of like the public unveiling of the demo room. And from the interviews that I've done, it seems to be a pretty big turning point in the history of VR in terms of kind of convincing people, like, oh, wow, this is a real thing. Something about having that level of tracking gave people a sense of presence that goes beyond what the Oculus Rift was doing at that point. I've just heard, anecdotally, a lot of people being like, OK, this is a real thing. I've seen what the potential is, I guess. And so from your perspective, what was it like to unveil the demo room for the first time publicly? It was nice to have people see what we'd been seeing for a while. That room had basically come together over the previous year. So we'd seen it six, nine months before that in various forms. And it was nice to get to share that more broadly. And then this past year we've spent basically the whole year figuring out how to take that demo and turn it into a real product. So that's really what the Vive is about. And from your perspective, how did the VR initiative at Valve get started? What is sort of like the origin story for how that even came about? There were a few of us talking about a bunch of things that we could do, Michael Abrash and I in particular and a few other people, and at a certain point we all moved down to the fifth floor and staked out a couple of rooms and just started working on the problem.
We brought in a couple of, you know, a few more people to help out at that point and just started looking at wearable displays and where the tech was and started learning about the whole process. And when did that process start then, do you remember? I would peg the start of it at the beginning of January in 2012, because that was when I moved down there, but it was right around there, give or take a couple months. Yeah, and I think the other thing that I observed just with people using the Vive and talking to people that have been researching presence is they talk about two components, the place illusion and the plausibility illusion: being immersed in another place, but also actually having your body and agency, and that the world is coherent and makes sense. And I think that what a lot of people saw for the first time at GDC, the public unveiling of the HTC Vive, is that this was the first time that a lot of people had been able to see those two things together: being immersed in a place and also having that fully tracked, and to really have this sort of virtual body ownership illusion or just a sense of presence. And so what was it like for you to kind of observe people, the reaction, and if it was kind of like surprising of how much the press kind of went like, Oh, wow, this is like the next level. Well, it's always gratifying to give people VR demos they've never seen before. You frequently, you know, they get the VR giggle, right? At some point in the demo, they just get really excited and they kind of can't believe what they're seeing. And they think about things differently when they get out of the demos. So that's always gratifying. In fact, the first time I heard about the distinction between the place illusion and the plausibility illusion was from you at GDC. And since then, I've read and heard a little bit more about it. And I understand sort of how the Vive would fit into that and how the controllers help with it.
So we've seen bits of that from those first few demos. And we're hoping once the developer edition is out there, we'll see a lot more of it. And I'm curious, from your perspective of seeing all these different demos, it seems like Owlchemy Labs' Job Simulator has been something that has been the most popular implementation, because I think they've created this coherence of being able to interact with anything. And it's not necessarily very high-resolution. It's very low-fidelity, low-poly. But yet, people still seem very compelled by something on the surface, which may seem very boring to do a job simulator. Why would anyone want to do that? So from your perspective, what do you see as that? What were some of the design elements within that experience that really, you think, make it so compelling for people? Well, everybody has a different flavor, and it's been interesting on the team that some people really don't like a demo that somebody else will really like. Job Simulator is very popular, Tilt Brush is very popular. For many people, Google Earth and standing over San Francisco, even though there's no controller input in that one, is a very powerful experience. And really, I think it's just that these different demos touch different things that people are interested in. Like, one of the guys at work who's really into Job Simulator, he likes that he can juggle in there. So that's something that really appeals to him. He likes that it's low fidelity and yet still gives him that sense of presence and gives that sense of agency that he gets from the controllers. So, you know, everybody's a little different, but that one has definitely tickled some excitement in certain people. And what's been one of your kind of favorite experiences that's, I guess, been revealed publicly? There may be things that you've seen that you can't talk about, but things that have at least been out there that you've experienced that you think is really compelling.
I think Tilt Brush is great. I think everybody's going to show their buddies Tilt Brush when they come over to see their VR rig or whatever. Nice. And I guess one question about the strategy that Valve is taking in terms of making these dev kits available is through an application process. At what point is this going to open up before the consumer launch? Is it always going to be sort of an application process of being vetted? Or is there going to be a point before the consumer launch where the dev kits become more publicly available for people who may have not passed this sort of test of meeting all the criteria that you're looking for for this initial round? So right now we're focusing on developers, and I would encourage anybody who's working in the VR space to fill out the application and apply for a dev kit. After the developer edition, there's really not that much time between then and the commercial release, so our expectation is that the majority of people will just get the consumer commercial release. I see. I guess that's one thing that worries me is just that there hasn't been like a public, like Oculus has been through like two rounds of like, you know, it's getting the DK2 up and running was a bit of a nightmare of just like dealing with all the hardware issues. And so what's the QA process that you go through to make sure that, you know, when you actually do the consumer launch, that there's not going to be that same level of like having to be a technological genius to even get it to work. So one of the things that Valve tries to do as much as they can is build something and then iterate on it using feedback from monitored playtests. And we're going through that process with everything related to the Vive. Great. And finally, I guess a couple of questions. What are some of the experiences that you want to have in virtual reality? So there's so many different things I want to try that it's hard to nail them down. 
But really it's all about sort of being places that I can't be. And I think in many cases I'd like to be able to do that with other people. So not just like a chat app, but having multiple people with me going through an experience that is more sort of rich and adventurous or whatever, right? And finally, I guess, what do you see as the ultimate potential for virtual reality and what it might be able to enable? I think that VR is exciting for a lot of reasons. One of the things that excites me most is the possibility that it will make interaction between humans more positive because they'll see each other as people instead of seeing each other as anonymous screen names or whatever. So I'm hoping that we see some changes in online behavior as a result of that sense of humanity that people get when they're talking to somebody in VR.
[00:11:18.865] Kent Bye: OK, great. Well, thank you so much. Thanks. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash Voices of VR.

