At GDC this year, I had a chance to talk with Valve developer Jeremy Selan and tracking engineer Ben Jackson about the evolution of room-scale tracking technologies, as well as some oral history, including some of their favorite memories and types of experiences within VR. They didn’t reveal any specific future plans, but Jeremy did allude to the fact that there’s a lot of latent hardware functionality shipping with the Vive that can be turned on with a software update. We also speculate a bit about the potential of using the front-facing camera to track static objects with fiducial markers, the desire to go beyond room-scale in VR, and using controllers to prototype tracking other body parts until Valve presents a solution that makes it easier and more aesthetically pleasing.
LISTEN TO THE VOICES OF VR PODCAST
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to the Voices of VR Podcast. Today I'm going to be talking to a couple of Valve engineers about some of the stories of how they got into virtual reality at Valve, but also some of the capabilities of the Lighthouse tracking technology and some potential futures of where things might be going with hybrid tracking. Now, just to orient you in time and space for how this interview came about: this happened at GDC this year, and it was at GDC a year ago when the Vive was first revealed to the world. And I remember being at GDC and I was actually not invited to check it out, but I really, really wanted to because I had heard the buzz at GDC that it was absolutely mind-blowing and amazing. And so this year, I actually got a chance to get a press pass to GDC and got invited to check out the latest Valve demos, where they were showing off The Lab demo, as well as ILMxLAB's Trials on Tatooine Star Wars experience. And so after I had a chance to experience both of those, I was just kind of roaming around the Valve area and had a chance to do an interview with The Lab's Jeep Barnett and then Chet Faliszek. And I started chatting with Jeremy, who I'd met previously at the VR Intelligence conference. And we were just kind of telling stories, and I found myself two or three times just saying, oh man, we really need to just record this as a podcast. And so we stopped our conversation, took some steps back to recount some of the conversation that we had already had, and then dove into the future of Lighthouse technology and tracking and just other stories. And before we get started, I just want to also put in a quick pitch for anybody that's listening and has been enjoying the podcast. Please do consider becoming a contributor to my Patreon campaign at patreon.com slash Voices of VR.
It's what is helping ensure that this is continuing to move forward and for me to continue to do this type of coverage. So with that, let's go ahead and dive right in.
[00:02:24.742] Ben Jackson: My name is Ben Jackson. I'm a software engineer at Valve, and I wrote the tracking system for Lighthouse.
[00:02:29.449] Jeremy Selan: My name is Jeremy Selan, I'm also a developer at Valve, and I work on software technology, including a bit of the tracking system and some display and optics.
[00:02:38.953] Kent Bye: Great, so we were just kind of like recounting some of the history of, you know, the evolution of the technology of both the HTC Vive and the Lighthouse technology, and it seems like this thread of like, how does innovation happen? What was the seed that really catalyzed this trajectory into VR?
[00:02:55.322] Jeremy Selan: We've been talking a lot about room scale with the Vive and those types of experiences. And I think a lot of people think that we came up with Lighthouse and Lighthouse could do room scale and that's how we ended up with the SteamVR system in the HTC Vive. But it really was the other way. We originally put together the MPTAM room, that's the Valve room where you have all the markers on the wall. And it was not a super convenient system, but all our developers would love to go back to that room. And it was really just watching them and how much they enjoyed making experiences in that space that we decided we really needed to go forward with Lighthouse. Alan had had the idea for a while, but that was sort of what kicked off the idea that this is something that is really important for the experiences that we want to create. So it was almost that we ran experiments, we saw what people enjoyed, and then we used that to drive the direction of the technology.
[00:03:45.943] Kent Bye: Yeah, it seems like they could develop at their desk, but it sounds like they were going down to this room to try things that they could only try there.
[00:03:53.085] Jeremy Selan: Yeah, now certainly they have the best of both worlds and they can use this also at their desks. But even back in the day, that's what they would prefer to do. And we paid attention and moved forward in that direction.
[00:04:03.889] Kent Bye: So I'm really curious about each of your moments when you decided that you really wanted to work on this at Valve.
[00:04:09.869] Jeremy Selan: Well for me it was actually seeing that demonstration of the Valve room with the markers on the wall. I came down after Steam Dev Days and got the demo and I said this is what I want to do and actually I joined Valve to work on VR because I was so excited about this being the future.
[00:04:25.883] Ben Jackson: Well, I had previously worked on the wireless support for the Steam Controller, and some people from the VR team came over and asked me to make a wireless controller prototype for them. And I saw a bunch of interesting technical challenges in making that work, and that's what got me involved initially. And I don't think I was fully converted to the vision of what VR could become until we were doing initial controller testing with an early, early build of Job Simulator. And this was still when we had very few Lighthouse setups that were full room scale. And that moment of operating equipment in the kitchen and literally forgetting that I was even holding controllers, just doing stuff in an environment, throwing things, spinning around, just reaching behind me to slap the microwave shut. And when I got out of that and took the headset off, I was surprised at where I was in the room. Like, I had forgotten what room I was in and where I was in the room, and I had forgotten I was even holding controllers. And that's when I was like, this is magic.
[00:05:19.897] Kent Bye: What are some of your favorite memories of being in VR?
[00:05:23.053] Jeremy Selan: I think for me the ones that I remember the most are the experiential travel ones. So one of the experiences we're showing off at GDC this year is an Everest experience. And things like that are amazing. I'm surprised that I keep doing them. Like the Everest experience is at this stage a shorter experience, but I've probably done it a dozen times just because it moves you to a different place, like you really feel like you're there. And I wasn't expecting to like that. I mean, certainly if you looked at a textual description of what it would be like to do that, or you saw it on the screen in a video capture, it wouldn't capture the reason you come back to an experience like that. So I think the travel ones, and just the way it can affect your mood. There's one that's a Grand Canyon experience, and it's almost meditative in some way. I even went to a VR meetup a few years ago, and it was just sitting in a DK2 in a bar with headphones. And I totally forgot I was in the bar, right? Just the ability to transport you to different locations is, I think, one of the most exciting things.
[00:06:25.480] Kent Bye: Yeah, the photogrammetry example when you're on the vista that was shown as part of The Lab, that to me was one of the deepest senses of presence that I've felt in a place, because it was just an amazingly beautiful place, and I was like, yeah, I would actually want to just hang out here and be here, because it looks like nature. It didn't look like it was rendered by a computer. It just really transported me to that place. So yeah, I'm really excited about the future and potential of this photogrammetry as well.
[00:06:52.405] Jeremy Selan: And that's just the tip of the iceberg. So a whole bunch of us have taken the cameras at home and tried to do that ourselves, and it's not that hard. I think photogrammetry in the last few years hasn't had that many eyes on it because it's sort of a niche application. I mean, how many people really care about creating 3D models at home? But now with VR, all of a sudden you can bring these objects from the real world into your virtual space, and you can capture physical environments virtually and then revisit them. So I think there's going to be a huge number of people who are excited about doing this and I bet the tech is going to move forward really quickly because of the number of eyes on it.
[00:07:28.086] Ben Jackson: We made a scan of the office. There's a company called Matterport, and they specialize in doing scans of properties for sale. Normally you view those on the web, you teleport from point to point and you have a sort of photosphere experience, but they can also export meshes, and you can view those in VR. And I remember the day I walked by, this is back again in the same era where we only had really one fully tracked space, and there was sort of a line of people all peering in the door, and whoever was in the headset was giggling. And then they would hand off to the next person and they would start giggling. And it was just something about being in the Valve offices virtually while you were also there. And being able to do things like teleport around the office until you're in the room you're in, aligned with the room you're in, and then take off your headset and you're literally there, and go back and forth. Because their technology wasn't meant to present in VR, it looks really good from the points where you took the photos, but that early version we were using didn't look that good if you were actually standing in a place where there hadn't been a camera. So there'd be holes in the geometry, so you could go up like two floors and there'd be a giant hole in the floor of the kitchen and you could see down like three halls. Grown men giggling like schoolgirls because they were in a virtual model of the building they were in. You would not go up to the actual kitchen and giggle like that.
[00:08:42.077] Jeremy Selan: It's surreal. We've used VR all day long in our offices for multiple years now, and I can't tell you the number of times I've walked by and someone's just hysterically laughing. It really has such potential for emotional connection, for things you wouldn't even expect. And maybe we'll get more accustomed to it, and three years from now our bar will be raised, but even at this point we get so excited about simple things.
[00:09:06.410] Kent Bye: Well, I think one of the things that's happening a lot in virtual reality is that it's exceeding our expectations of what we've seen and experienced in our lives. A lot of times I've heard that when people have created social VR experiences, they'll have a bug, and they'll find that people will kind of center around that bug and just really play with it, because it really messes with their mind. And so I'm curious if you guys have experienced that in terms of any bugs, or anything that you were surprised by, something that was really unexpected.
[00:09:33.657] Ben Jackson: Well, I guess one example would be, we created a demo app for Unity to show developers how to use our APIs that include velocity and angular velocity for the controllers, which is useful: if you're throwing an object, we can actually tell the developer, at the moment you let go of the button, here's how your hand was rotating and how it was moving, so that you can impart the proper spin on the object as it comes out of your hand. And given your question, you can probably guess where I'm going with this. In the initial attempts, it turned out that our coordinate systems were not entirely correct. So you'd throw the object, and it would go backwards. You'd throw it with this spin, and it would go that way. And that demo literally consisted of a white plane, and when you pulled the trigger, a white sphere would appear in your hand with a white cube sticking out of the side so you could tell if it was rotating. Once it was working, I must have played with it for 20 minutes. There's no content at all, but you can just make up games. You'd throw the cube, and it would roll on the ground. You'd be like, can I throw this and hit that? Can I throw one and hit one that was still in the air? And then Jeremy wanted to see it, so I think I handed him the headset, and he's playing with it. And now I'm in Unity, and I'm turning off gravity. So he throws it, and it just floats away. And eventually, he's got all these balls that are just kind of floating there, and Unity has made them static objects now, because the physics quit working. Then I turn the gravity back on and it all rains down. And one time, as he's throwing, I changed it from plain white to, what was it, like a stained glass window effect, and as his hand comes past his peripheral vision, he sees it, forgets he's throwing, and he just stops: oh look, it's a pretty sphere.
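The throwing mechanic Ben describes comes down to handing the object the hand's velocity at the moment of release. Here's a minimal Python sketch of that idea, using a simple finite-difference velocity estimate and ballistic integration; the function names are illustrative assumptions, not the actual SteamVR or Unity API (which report these velocities for you):

```python
# Hypothetical sketch of release velocity for a thrown object: estimate the
# hand's velocity from recent tracked pose samples, then integrate a ballistic
# flight after release. Illustrative only; real VR runtimes report velocity
# and angular velocity directly, so you would not finite-difference yourself.

def release_velocity(samples):
    """Finite-difference linear velocity (m/s) from the last two
    (timestamp_s, (x, y, z)) samples of the controller pose."""
    (t0, p0), (t1, p1) = samples[-2], samples[-1]
    dt = t1 - t0
    return tuple((b - a) / dt for a, b in zip(p0, p1))

def simulate_flight(pos, vel, dt, steps, gravity=(0.0, -9.81, 0.0)):
    """Semi-implicit Euler integration of the object after it leaves the hand."""
    pos, vel = list(pos), list(vel)
    for _ in range(steps):
        for i in range(3):
            vel[i] += gravity[i] * dt
            pos[i] += vel[i] * dt
    return tuple(pos)

# Hand moved 2 cm right, 1 cm up, 3 cm forward over the last 10 ms:
samples = [(0.00, (0.0, 1.0, 0.0)), (0.01, (0.02, 1.01, -0.03))]
v = release_velocity(samples)  # roughly (2.0, 1.0, -3.0) m/s
```

The backwards-throw bug Ben mentions is exactly what this kind of code exposes: the velocity has to be expressed in the same world frame and handedness the physics engine uses, or the object flies off in a mirrored direction.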
[00:11:09.148] Jeremy Selan: Yeah, I don't think the bugs were that interesting, but the number of happy accidents we've gotten where I wasn't anticipating something would be interesting, and then we're like, wow. It just happens all the time. I mean, I think one of the killer apps is just playing around in Unity. I think that's why so many devs have taken to it. I mean, at GDC we meet with a lot of indie developers, and the vast number of them are really excited about VR. We sent out dev kits just before GDC to some of them who hadn't previously gotten them. And I guess that was a little bit of a distraction. Many of them in crunch mode all of a sudden just started playing VR for multiple days. And so they were really excited and annoyed. They're like, why couldn't you have sent it to us the week after GDC? And we just get that reaction universally, which is really exciting.
[00:11:56.618] Kent Bye: Yeah, it must have been from the Unity Summit, which just happened. And I was there in the room when Gabe came on and announced it. And I actually got one and set it up. And to me, I think it's really exciting, just because there's something that's qualitatively different about being able to do these room-scale experiences. And there's a lot of things that I'm doing with my body that just feels like I'm getting some exercise. So I'm curious, from your experiences with these games that are out there, what you've found super compelling, things that you keep coming back to.
[00:12:29.586] Jeremy Selan: I keep using Tilt Brush, surprisingly. It's a lot of fun. I keep coming back to it. The other thing that really surprised me is we've had VR at home for a while now, multiple years in some cases. And the way I think about the tracking system for room scale is more like it's Wi-Fi. So if you're in your house and you have a wireless device, you just assume you will have wireless everywhere in your house. It's the same thing with tracking. Sometimes if I want to relax and do something that's seated, I'll do it on the couch. Sometimes I'll do it on the computer. But it's really freeing to not worry about where you have tracking in the room or not. So I do mostly room-scale stuff because I find that most comfortable. But for some of the ones that are a more natural companion to seated, like tabletop games, it's not so much seated or standing, it's more the locomotion that the game chooses to use. So for the ones where the character is seated or there's a small miniature set, it's totally fine to do that. But I really enjoy all of those experiences.
[00:13:22.902] Ben Jackson: So you asked what we were coming back to a lot. For me, there have been a few things that were surprisingly compelling, that made me want to try them over and over again. And the common thread, I think, is that they use some natural skill, a skill that you could use in real life but translated into VR in a very natural way. Ninja Trainer, where, obviously, I don't have a lot of personal experience swinging a sword, but it's a natural sort of hitting, swinging motion, using your body to move around. I mean, it's not a fancy mechanic at all. I imagine there's some fancy math to get the chopping feeling just right, but once you're playing it, you just feel like you're doing an activity you could do anywhere. And recently, Space Pirate Trainer: it's just aiming, and if you use the repetitive shot, it's essentially hold the trigger down. And it's really rewarding to feel like I'm doing something with skill, aiming, and I'm being rewarded for it. I'm not hitting a combination of buttons or solving an indirect control puzzle. I'm really doing it, and I'm doing well.
[00:14:21.712] Jeremy Selan: I mean, I think Chet describes this really well, but it feels really disappointing to go back to, like, MMOs and then hit a button to swing a sword, where you're like, I have a sword in my hand, can't I just swing it? It starts feeling really natural. I think one of the games that will be coming out later is Budget Cuts, and it does this really well: it's a stealth game, and you just act stealthy, and that's how you have that action. If you want to duck, you duck. You immediately know it. When you throw children into these types of experiences, they just immediately know what to do.
[00:14:51.410] Ben Jackson: I'd say Budget Cuts spawned one of my criteria for interesting VR, in that early on I discovered that if an experience could make me forget where I was in the room by the time I took off the headset, that was usually a good sign that I was so engrossed. And prior to Budget Cuts, several games had mechanics where they really wanted you to kneel down, like kneel behind cover or get something, and I was kind of resistant to it. I'm like, no, I'm just going to stay up here on my feet where I feel comfortable. And, you know, Budget Cuts has gotten me to crawl around on the floor of my house. And I have to give them credit, they motivated me like that. That to me was a good sign that it can really connect with somebody. When it makes you want to crawl around on your floor, if you're not naturally somebody who crawls around on your floor, that's pretty good.
[00:15:36.047] Kent Bye: Yeah, it's really interesting. I just did the Star Wars experience, the ILMxLAB Trials on Tatooine experience, and you're standing there with a lightsaber and there are stormtroopers coming up to you and shooting lasers at you, and you could crouch down and hide below. And I noticed I was kind of just standing up, taking it, and I was like, oh man, this feels really intense, because I'm dodging these lasers and trying to use my lightsaber to hit them back to kill them. And then I was like, oh, I could take a break. And I ducked down, and I was like, oh, this feels a lot better. There's that moment of exhilaration, of really getting that tension of feeling like I was actually in danger. And I think that's the thing, feeling like you're actually in danger, that you're somehow exposed if you're standing up, and you just have this natural tendency to want to hide. I definitely had that same feeling in Budget Cuts. So I guess, just to take a step back, you were talking a little bit earlier, Jeremy, about how there are going to be hardware releases, but there could also be a lot of software pushes, since software can iterate a lot faster than the hardware. So maybe you could talk a bit about that.
[00:16:41.670] Jeremy Selan: Yeah, so I think the traditional model at hardware companies is you develop a product, sort of perfect it prior to release, and then put it out there, and it's a little bit more of a static, unchanging target. We're really excited about Valve taking a similar approach to hardware that we've taken with software. So, your Steam client updates all the time, and it's really convenient to push out software updates for lots of games. We've taken that same approach from a hardware perspective as well. The Steam Controller since its release has had a bunch of really meaningful and interesting updates go out, and it's really our hope to take a similar stab with the HTC Vive as well as the general SteamVR ecosystem. We don't want to talk specifics right now because we certainly have a lot of ideas, but internal to the product, just in the way it's designed, a lot of stuff can be updated and changed via software and firmware updates. So there's real new functionality for hardware people will already have in their hands, and we'll be able to push things out and give them new exciting things. It remains to be seen how many of those we'll get to in the near term, but we're super excited about that approach and think it's a really good long-term way to do hardware support in the field.
[00:17:46.375] Kent Bye: So yeah, I have my Lighthouse sensors mounted in the upper corners. So if there's a firmware update to the Lighthouse, would it tell me, and then would I have to dismount it in order to hook it up to USB to update the software?
[00:17:58.745] Jeremy Selan: Well, I haven't confirmed this personally, but it's my understanding that the final product that HTC will be shipping will actually have remote Bluetooth firmware update capabilities in the base station. So you will have an update, and you'll hit the button to update, and you won't actually have to take them down off the walls even. Oh, wow.
[00:18:14.797] Kent Bye: That's awesome. So yeah, what about you in terms of the tracking technology, if there's anything else that you can tell us about what you were working on or anything that you are particularly proud of?
[00:18:26.878] Ben Jackson: I don't know if we have any future stuff that we can talk about right now, but I think for me, the big reward for that was, you know, last GDC, handing people controllers in VR for the first time, and that moment when people would realize, hey, I'm blindfolded, you know, I'm functionally blind, and I'm taking a physical object, and it feels perfectly natural. In fact, so natural, some people didn't even realize they were doing it until later, or even until it was pointed out to them. And every time I give a demo to somebody new, that's always a fun moment.
[00:18:57.147] Jeremy Selan: I mean, I'll speak for Ben here. One of the things that's funny is we've been generally pretty conservative in what we describe Lighthouse as being capable of. So we'll say it's five meters, and that's certainly what we're going for. But then we'll see videos online of people really pushing it to its limits, people setting it up in rooms where they'll walk out the hallway and down the corridor. And the people who developed the stuff were watching those videos online and just cheering for them, like, yeah! It gets us so excited to see people pushing this tech to its limits. And obviously we don't want to promise those limits as the common experience, because in some of those cases it won't work. Certainly what we describe it as is a very conservative estimate of what it's capable of. And broadly speaking, Alan's done a bunch of interviews about the long-term vision of Lighthouse supporting lots more base stations and better positioning and larger spaces and volumes, and that's certainly a direction we're really excited to go.
[00:19:49.455] Kent Bye: Yeah, and right now I think that in order for the controllers to be tracked, they have to send that information back. And I could imagine that it's going to be easier to be able to track your arms, your legs. How easy is it for people who are developers to be able to take the tracking technology and build their own peripherals to integrate with Lighthouse?
[00:20:08.274] Ben Jackson: I think what people are doing right now is they're using the controller as a prototype object. They're attaching it to other things that they're interested in tracking, and they're getting tracking by proxy. In the future we'd like to enable people to do that more natively, or at least have a unit of integration that's a better form factor for that than the controller. Right now people are still doing some pretty interesting things just by taking the controllers that they do have access to and mashing them up with other stuff: attaching them to cameras to do mixed reality AR, attaching them to other controllers because they want different input or different haptic feedback and they want to see how they work. Somebody stuck one to the top of a Gear VR to make a positionally tracked Gear VR. So I hope nobody who's interested in doing that is held back right now, because the controllers are very versatile in that regard. But in terms of aesthetics moving forward, it'd be nice to let them do something that's more integrated, more attractive.
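"Tracking by proxy" as Ben describes it is just rigid-transform composition: measure the attached object's fixed offset in the controller's local frame once, then compose the controller's tracked pose with that offset every frame. A small illustrative Python sketch, assuming plain rotation matrices rather than the actual OpenVR pose types:

```python
# Tracking by proxy: strap a controller to an object, measure the object's
# fixed offset in the controller's local frame once, then map that offset
# through the controller's tracked pose each frame. Purely illustrative
# (hand-rolled rotation matrix, not the OpenVR API).

def apply_pose(rotation, position, local_point):
    """Transform a point from the controller's local frame into world space:
    world = R * local + t."""
    return tuple(
        sum(rotation[row][col] * local_point[col] for col in range(3)) + position[row]
        for row in range(3)
    )

# Controller yawed 90 degrees about the vertical (y) axis, held at (1, 1, 0);
# a camera bolted to it sits 0.2 m along the controller's local z axis.
yaw_90 = [[0, 0, 1],
          [0, 1, 0],
          [-1, 0, 0]]
camera_world = apply_pose(yaw_90, (1.0, 1.0, 0.0), (0.0, 0.0, 0.2))
```

The same composition covers the mixed-reality camera rig and the Gear VR hack: anything rigidly attached to the controller inherits its pose through one fixed transform.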
[00:21:06.023] Jeremy Selan: And people might not be aware of this, but right now you can actually use more than two tracked controllers at a time. So if you go and plug in a bunch more controllers, either in wired mode, or you use additional dongles to support this feature, you could have four tracked objects simultaneously and it will just work. So people can experiment, and have been experimenting, with some really interesting ideas about what's neat to track. I know the pet tracking was a video we all appreciated.
[00:21:30.262] Ben Jackson: Jeremy was talking about all those videos of people doing different tracking experiences, and for me it's a little different. Certainly in the beginning, right after GDC last year, we had done a ton of preparation leading up to GDC, using all of the good equipment in the world, and right after that, developers started showing the stuff they'd been working on, because now it was public. And every video I saw, I'm like, oh, that worked fine. And it's taken an entire year of, every time somebody's like, oh, I put a controller on a dog, and I'm like, oh my God, I've never tried that. Oh, that worked fine. Or they put them seven meters apart, or did all kinds of crazy things. Jeremy talks about how exciting that is; for me, every time it's sort of heart-stopping, and then, oh good, it worked. And I don't know if I'll ever get to the point where I'm just like, yeah, you did something crazy, I totally expect that to work.
[00:22:18.178] Kent Bye: Yeah, and to me it seems like, with the Oculus, they're having more optical tracking cameras, and they're kind of suggesting they both be in the front in order to maximize the amount of space that's tracked. But when I play experiences like Bullet Train and I just want to be able to turn around, I start to lose tracking, like, oh yeah, I have to operate in this specific way that's not like reality, but fits the constraints of the system. But yet, I think that there could be things with the optical tracking that could track fingers or other things in the long run. So it seems like there have been some decisions and trade-offs, with Lighthouse certainly getting that precision and the full 360, but what are you losing by doing lasers instead of optical tracking?
[00:23:02.013] Jeremy Selan: Well, I don't think it's an either-or decision, right? We have a camera on the headset. We're really excited about taking further advantage of that. I mean, even our original tracking system that we talked about two years ago at Steam Dev Days was an optical camera-based tracking system. So we have a great respect for those approaches. I think it's sort of a false choice. We evaluate all technologies continuously. We have a lot of people interested in all of these things simultaneously. So I don't want people to think that this is the only way it must be done and we would never consider anything else. It really comes down to the experience for us. We're trying to look at, what are the interesting VR experiences? What do game developers want to use? What do people enjoy having in their homes? And then we'll do anything we can to make that happen. Lighthouse was the solution to the problem. It wasn't just something we said we were advocating for; we advocate for it because it gives you this, this, and this, which people enjoy and appreciate.
[00:23:53.475] Ben Jackson: And we started with a variety of camera-based tracking systems. I think the trade-off, if you're developing something from scratch, would be cameras are readily available off the shelf, and especially for prototypes. Last year, when we had the Wall of History here at GDC, we showed some people some innocuous-looking cameras that cost like $2,000, $3,000, because they had all these fancy features. They were super high frame rate, they were black and white, high resolution, they had real lenses on them, not little pieces of plastic. The availability of all that stuff off the shelf is certainly compelling if you're developing a system from scratch. To develop a system like Lighthouse, you have to first invent some technologies and you're kind of taking a leap of faith that once you have all those parts that the whole will be a compelling tracking system.
[00:24:34.819] Jeremy Selan: I mean, even Lighthouse tracking itself is not just the optical component, right? It's only made possible because we've blended the IMU, that's the gyro and accelerometer data on the headset, with the external reference frame, the optical data coming in. So we already have a hybrid tracking system, and adding extra inputs, whether it's magnetic or lots of other types of components, those are really exciting to add. And I really think in the long term, this type of VR tracking technology will be a hybrid that blends the best of all possible worlds to give you that simultaneous illusion of the full body tracking and the full environment awareness that we're all going for. I don't think any one approach will win out. It'll simultaneously do everything it can to hit those goals.
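As a toy illustration of the blending Jeremy describes, here is a one-dimensional complementary-style filter in Python: dead-reckon from high-rate IMU data, then nudge the estimate toward occasional absolute optical fixes to cancel drift. The real Lighthouse fusion is full 6-DoF and far more sophisticated; the gain and structure here are assumptions for illustration only:

```python
# Minimal 1-D hybrid tracking sketch: integrate IMU acceleration every step
# (smooth but drifty), and when an optical fix arrives, pull the position
# estimate toward that absolute measurement (the external reference frame).

def fuse(imu_accels, optical_fixes, dt, gain=0.2):
    """imu_accels: one acceleration sample per step (m/s^2);
    optical_fixes: {step_index: absolute position} from the optical system;
    returns the fused position estimate at every step."""
    pos, vel = 0.0, 0.0
    track = []
    for step, accel in enumerate(imu_accels):
        vel += accel * dt          # dead-reckon velocity from the IMU
        pos += vel * dt            # and integrate to position
        if step in optical_fixes:  # blend toward the absolute optical fix
            pos += gain * (optical_fixes[step] - pos)
        track.append(pos)
    return track
```

A real filter would also correct velocity and estimate gyro/accelerometer bias; here only position is nudged, which is why the estimate still creeps between fixes — but the drift from a biased IMU stays bounded instead of growing without limit.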
[00:25:17.516] Ben Jackson: Jeremy talked a little bit about how you start getting used to having tracking everywhere, and I can certainly attest to that. Even after having one of the very early Vive prototypes at work, for a little while, at home I put on an Oculus DK2 and tried to stand up, and basically ripped it off my head and pulled all the cables out. And it just grows from there. The moment that you know you can stand up, you just expect you can stand up. And the moment you can walk around the room, you just get used to that. And I think it's just a matter of time before you're confused as to why you can't just walk out the door and walk down the hall. There are a lot of technical challenges to be solved there, and I think a lot of people are working on them. I'm really looking forward to seeing what we can do in the long term to enable that. But as Jeremy says, it will probably be a hybrid. You'll probably have some sort of assisted tracking system like Lighthouse or something else when you're in an environment that you control, where you've set it up and where you need the precision. And when you go out and walk down the street and you're using some sort of mixed AR/VR type technology, then it's probably going to be natural feature tracking based, because that's all you're going to have, or a hybrid with GPS and all kinds of other things. I think eventually the sort of tracking that is necessary for VR, which we're currently only achieving at the size of a room, is going to be something we'll want to achieve everywhere, and that'll be a really exciting time.
[00:26:35.546] Kent Bye: Well, I've heard that, you know, there was a bit of a drive to get to the Lighthouse system, that originally you had a full room with fiducial markers, and it worked pretty well, but who's going to put fiducial markers all over the room, I guess, was the thought that was driving that. But I could imagine a future where people start to have their own VR rooms, and maybe they can just throw up the fiducial-marker wallpaper or whatnot. And then you already have a front-facing camera within the Vive, and I'm just curious if there are any limitations to using the hardware that's already there to be able to do some inside-out tracking using fiducial markers.
[00:27:10.827] Jeremy Selan: Yeah, no, there's absolutely no limitations in that approach, right? It's just the imagination of the software developer at this point, which is why it's such an exciting space to be in.
[00:27:19.553] Ben Jackson: Well, I mean, one of the trade-offs whenever you build a camera system is that you have some resolution of your sensor, and then you have a field of view of the camera, and there's a trade-off: the larger the field of view, the fewer pixel sites per angular degree. And so one of the challenges with the fiducial rooms was that the floor, for example, couldn't have markers on it, because people would walk on them and ruin them and move them. So if you looked straight down at the floor, the camera would just see the floor and you would lose tracking. Even getting the ceiling to work was inconvenient, because you had to plaster the ceiling, and that limitation came from the fact that you wanted to choose a field of view for the camera that gave you reasonable resolution at reasonable wall distances. So, yeah, you described it as if it isn't too much of an impediment to plaster your wall in those markers, but I'll tell you, it is, right? Having set up both Lighthouse installations in
[00:28:05.772] Jeremy Selan: my home multiple times and set up one of those marker rooms, it was ridiculously complicated to set up those marker rooms. And even instead of markers, say it was attractive posters that you pre-calibrated ahead of time, and you said, yes, here's my movie poster for this, and here's that, and it just worked off of those instead. That's still very inconvenient.
[00:28:23.638] Ben Jackson: and straightens out the crooked poster. And, you know, one of the things that you've calibrated was how crooked that poster was. You know, the nice thing about Lighthouse is it's basically self-calibrating. You can put the bases in any orientation. I mean, you can lay them on their back, on the floor. You can put them on the ceiling, pointing straight down. And basically, we don't care, and we'll just start up and track from them. When you set up a marker room, like Jeremy described, you put 100-plus markers all over everything, and every single one of those, you have to calibrate its location. And if you don't like that calibration, then you have to refine it. And if somebody knocks one of them down, then, you know, either you put it back up and you calibrate again, or you do without it.
[00:29:00.991] Jeremy Selan: Well, though, you talked about using markers for tracking where you're at. I think if you're in a Lighthouse environment with a Vive, you already know where you're at very accurately, so that's not that interesting. But there's another class of things you can do with the camera, which we haven't explored yet, but we all know it's possible, and I think these types of scenarios are one of the reasons we're so excited about having the camera on it: you can put those markers on other objects you want to track and then bridge it through the HMD. So imagine your coffee mug has a sticker on it. It's really important to have IMUs on the controllers, but then for another class of objects, where you just want to know where they're at and they're mostly static, you could actually use the camera to determine where those objects are in the world. So we're excited, and we haven't done this yet, right, but we're excited about doing it, and certainly other people would have access to these full APIs as well. It's not necessarily on us. Anyone in the community would have the ability to develop this sort of technology as well, atop the Vive and the SteamVR system.
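[Editor's note: the "bridge it through the HMD" idea Jeremy sketches is a chain of rigid transforms. Lighthouse gives the headset's pose in room coordinates, and the camera gives the marker's pose relative to the headset, so composing the two places the marker in the room. A hypothetical numpy sketch with 4x4 homogeneous transforms; the poses are invented numbers, and the actual SteamVR APIs and coordinate conventions will differ.]

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Headset pose in room coordinates, known precisely from Lighthouse tracking.
room_from_hmd = make_pose(np.eye(3), [1.0, 1.6, 2.0])

# Marker pose relative to the headset camera, e.g. from fiducial detection.
# (Identity rotations here just to keep the arithmetic easy to follow.)
hmd_from_marker = make_pose(np.eye(3), [0.0, -0.4, -0.5])

# Compose: where is the coffee mug's sticker in room coordinates?
room_from_marker = room_from_hmd @ hmd_from_marker
marker_position_in_room = room_from_marker[:3, 3]   # [1.0, 1.2, 1.5]
```

Because the marker only needs to be seen once while the headset is tracked, a mostly static object can then keep its last known room pose even when it leaves the camera's view, which is what makes this practical without an IMU on the object.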
[00:29:56.854] Kent Bye: So I'm curious what type of experiences you want to have in VR.
[00:30:01.127] Ben Jackson: I think there are a few pieces of content out there, a couple of which we're showing tomorrow, Vanishing Realms, and Budget Cuts, which we mentioned earlier, that start to give you a glimpse of what full-length, I guess you could say, VR content could look like, where you're actually going to a place and exploring it, and it feels bigger. Even once you're no longer there, it feels like a real place that you could go. I actually had an interesting experience a couple weeks ago. We had a power outage at my house, and I was sitting on the couch reading a Kindle book on my phone, and I thought, hey, you know, I could go play VR. And then I had to stop and think, wait, that's wrong. We don't have power. Why do I think that? And it kept happening to me while the power was out. I think it's because, to me, VR is a different place that you go, and there you have power, right? So why wouldn't I just go to VR, where I can shoot at space satellites or hit things with swords? And some of these pieces of content are like a glimpse into the sort of game that you'll play in VR for 150 hours, like Fallout 4 or Borderlands, to name some of the titles I've spent hundreds of hours in.
[00:31:05.434] Jeremy Selan: Yeah, I agree with all of that, right? Those titles he mentions are the types of ones where you can really imagine them being extended to full-length experiences and still maintaining your interest. I also really enjoy the experiential ones, which is surprising to me. Like, I keep going back to things like Tilt Brush. I can't wait for more applications like that. I mean, on the weekends, for fun, I develop Unity apps in VR, right? Because you have access to the hardware, and that used to be rare a year ago; now it's pretty common. But it's just so exciting to interact naturally with things in that sort of three-dimensional space. I would love to see a tower defense game in room scale. There are a lot of 2D games I currently play on Steam, and I would just love for them to be ported to VR as tabletop experiences. It's just such a natural way to interact with your environment.
[00:31:53.567] Ben Jackson: It's amazing how everything small, like tabletop scale, is just so cute. I've seen Dota at tabletop scale in VR, and those little creeps are adorable. It's just ridiculous. This is how you get grown men giggling in VR. It's amazing. You just play with scale, and it's magic.
[00:32:11.616] Kent Bye: And finally, what do you think the ultimate potential of virtual reality is and what it might be able to enable?
[00:32:18.675] Jeremy Selan: So we have another hour now for the rest of this conversation? Oh, I think it's an incredible potential. I mean, I think this is version 1.0 of anything. If you go back and look at the very first cell phones, they were briefcase-sized things that you'd strap onto your shoulder. And look at them now. It's a device you always have with you that does everything electronic you'd ever want. And it's your connection to the internet, to the cell network, to all the communications; it's your connection to your friends, right? To your social world. I picture the same thing for VR, where it quickly melds away. I mean, some of those very futuristic-looking ideas, like it's just a pair of glasses that you wear, those could be a reality. It's certainly a real challenge to consider how those would work now, but that doesn't mean that 15 years from now we won't look back at what we're doing today and laugh. So I think the future is that you always have access to this, and you just use it when it's natural.
[00:33:08.833] Ben Jackson: I'm sure you've heard of the singularity, which, loosely defined, is a point beyond which you really can't predict what will happen. And I think VR might be revolutionary enough that, sitting here, it's hard for us to picture what could really happen, because we're so entrenched in our non-VR thinking. I remember a commercial many years ago, at sort of the dawn of the public web. In this commercial, a young couple saves the day by successfully building a bicycle after downloading the installation instructions off the web. And when I saw that commercial, that was still not a reality. You went to an average website, and if your bicycle manufacturer happened to have one, which they probably didn't, it probably looked like a MySpace page and had no content whatsoever. And now it's a completely reasonable expectation that you would buy a bicycle, and in fact maybe they would just include a URL or a QR code and send you to the web for the instructions rather than including printed ones. And so I think there are some things that sound just as fantastical to us now about VR that might very well be true in a few years. Like, maybe the showroom of the future for cars, for any kind of merchandise, is just going to be VR. If you can see it and it's one-to-one and the fidelity is high enough, why not? And I'm sure a lot of people listening are thinking, no, I would never give up being able to go physically see those items. But that's why it's a singularity. It defies that sort of prediction. So we will see.
[00:34:34.509] Kent Bye: Awesome. Anything else that's left unsaid that you guys would like to say?
[00:34:38.771] Ben Jackson: I will come back to one thing you were saying earlier, because we didn't get around to it, but you were talking about hiding behind boxes in the demo that you did today, and that reminded me that it's always interesting to me to hear other people's experiences in VR, because I feel like VR is one thing, sort of, and we've all done VR, and therefore we should all have a common experience, but that's not really true. VR has enabled us all to go and be in similar environments, but then we all had our own individual experiences. And for some people, like, I've heard the story about a woman who painted a horse in Tilt Brush, and once she did one side, she wanted to see it from the other side, and she looked around and realized it's a really big horse, so she crawled under it. And sort of back to like, I don't crawl under things, so like, that's just not part of my experience. But it's amazing how, even though VR seems like one thing that we've all done, everybody's having these unique experiences and telling people about them, and then I hear these things and then I want to go back and I want to try to do it the way that they did it.
[00:35:31.263] Jeremy Selan: I mean, we have a lot of demo loops available and it still blows my mind that we'll give a demo loop and someone will be like, oh, well my favorite experience was X. And they'll be like, but why did you include these other three? And then you'll give a whole bunch more demo loops and people will love those other ones and be like, why did you include that first one? I mean, there's such a range of what people enjoy that it's... I think everyone will have their own type of killer apps, and I think the appeal of VR is enormous. I've given demos to my parents and people of my grandparents' generation, and they just go nuts about it. Many of them mention they never thought they would live to see something like this. I can't imagine a technology that has such broad appeal once it's done well.
[00:36:11.531] Kent Bye: Awesome. Well, Ben and Jeremy, thank you so much for joining me today. Oh, thank you. It was nice talking to you today. Yeah, thank you very much. And again, that was two Valve engineers, Ben Jackson and Jeremy Selan, who have been working on both Lighthouse and other technologies at Valve. So, just a couple quick takeaways for me. I thought it was really interesting that the tracking technology has the capability, you know, with the camera, to do inside-out fiducial tracking of a room, but it's not necessarily feasible to cover a whole room in markers. However, it's totally possible to start to add markers to different static objects to do mixed reality experiences. And when I was at GDC, I started to see some prototypes of the pass-through camera coming through, and, you know, the point that Jeremy made is that, hey, this hardware has the capability, through software updates, to unlock all sorts of new capabilities. And one of those, on the Vive at least, is that pass-through camera, and I expect to see a lot more functionality be pushed out to that over the next year or so. And yeah, again, for anybody who has been enjoying the series, please do consider becoming a contributor to my Patreon at patreon.com slash Voices of VR.