He talks about the Zombies on the Holodeck experience, and what they’re creating in order to have a more untethered VR experience where the user doesn’t feel constrained or limited by being in VR. Their goal is to create more natural user interactions within VR in order to create a deeper sense of immersion and presence.
Nathan also talks about his response to Ben Kuchera’s article, “Let’s put down the guns in virtual reality, and learn to pick up anything else.” He sees VR as an opportunity to live out our action-filled fantasies of having lightsaber battles or shooting zombies in the face.
He also makes the observation that you have to innovate one step at a time with the VR medium. For example, recreating the experience of going to the beach requires so much haptic feedback to make it feel real that it becomes one of the most challenging problems to solve. Shooting things in video games is a well-established game mechanic that is fairly easy to implement, and that’s why they started there. Plus, zombies aren’t real, and it’s one of his action-filled fantasies that VR allows him to live out with their Zombies on the Holodeck experience.
He talks about his idea of mobile, PC, and IMAX location-based tiers within VR, and how Survios is going down a path of creating optimized hardware and software so that VR isn’t limited by the computer systems that exist today. And finally, he shares his vision of the potential for VR to return us to our fundamental humanity of running around and playing games, exploring exciting virtual worlds, and expressing our full creative potential.
Reddit discussion here.
- 0:00 – Intro to Survios, Hardware, Software and Game development
- 0:33 – Recent $4 million round, new tech and new product to push VR forward
- 1:07 – Zombies on the Holodeck. Feel like you’re in an 8x8ft space. Visceral experience where there’s natural interactions to have a tremendous amount of presence.
- 1:54 – Tracking technology, wearable, server, optical camera, VR HMD, Hydras to give sense of presence and be free within a world
- 2:30 – Going for an untethered experience where you don’t feel any limits or constraints
- 2:53 – Safety issues with untethered VR. Pieces of tech to help out with that
- 3:15 – Project Holodeck at USC doing Kinect research; learned everything he knows about VR HMDs from Palmer Luckey. Wanted to take VR and make games with it rather than just art-based and academic research.
- 4:39 – Ben Kuchera article about being tired of shooting zombies in the face. Don’t innovate on too many aspects at once, so take an easy-to-implement idea and start there.
- 5:59 – Response to violence in VR. Hard to simulate going to the beach because it’s a felt experience. Killing Zombies is an effective experience. VR is about living out our action-filled fantasies.
- 7:37 – Nonny de la Pena’s immersive journalism and untethered VR experiences. Using VR for emotional appeal. Push the VR medium forward to make it easier to make this type of content
- 8:49 – VR tiers: mobile, PC-based, and IMAX location based approach. Based upon computing systems that we have in society today, consumer PC and then super powerful, location-based computers. The best virtual reality experiences will be on nextgen computer systems that are tailor-made for VR — something that Survios is focusing on.
- 10:10 – Don’t want to be limited by computer systems that exist today
- 10:41 – VR will let you feel human again: run around, jump, play sports and play games. Games are fundamental, and VR will get us back to our fundamental humanity of running around playing games with each other, exploring worlds, and manifesting our creative potential.
Theme music: “Fatality” by Tigoolio
[00:00:05.452] Kent Bye: The Voices of VR Podcast.
[00:00:11.933] David Holz: I'm David Holz. I'm the CTO at Leap Motion. We do two-hand, ten-finger motion tracking to sub-millimeter precision. We just came out with a new beta of skeletal tracking. It's a totally new, next-generation version of our existing tracking, and it's really cool. It's available on our website now.
[00:00:29.256] Kent Bye: Yeah, so maybe talk about what changed for people to have a little bit better or higher fidelity in terms of tracking fingers.
[00:00:35.513] David Holz: But it's just a new version. So in the past, it was sort of what you see is what you get. So if you see a finger, we will track it. But if you don't see a finger, we're not going to track it. We'll still tell you that there's a hand there. Whereas now, the software is sort of written in a way where, you know, fingers and hands are sort of one solid entity, and we understand what hands are in a deeper way. And that's really hard to do, and it's even harder to make it, like, run fast, but we managed to get that. And it's getting better and better every day, and I think it opens up a lot of really interesting physical, intuitive types of interactions.
[00:01:07.078] Kent Bye: And so how do you deal with an occlusion problem when it comes with there's one camera and there may be things that are hidden?
[00:01:12.671] David Holz: It's tough, but even if a thumb goes behind the hand, you can still kind of see part of the hand move. That kind of suggests where the thumb is. And so there's a lot of really, really subtle cues that you can get. And the other thing is, if you just completely can't see something, you kind of keep it still while you don't see it. So that helps a lot. So if you put one hand behind another hand, you may not be able to move the fingers anymore, but the hand will still be there, and that'll be OK. Or if you're twiddling your thumbs, it's OK, because the thumb will just not bend for the brief moment where you don't see it. But it manages to be OK.
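The fallback Holz describes here — keeping an occluded joint frozen at its last known pose until the camera sees it again — can be sketched roughly as follows. This is an illustrative heuristic only; the function and data shapes are hypothetical and do not come from the Leap Motion SDK.

```python
# Illustrative sketch of the "keep it still while you don't see it"
# heuristic described in the interview. All names are hypothetical.

def update_joint(tracked_pose, last_known_pose):
    """Return the pose to report for one joint this frame.

    tracked_pose: the pose estimated from the current camera frame,
                  or None if the joint is occluded this frame.
    last_known_pose: the pose reported on the previous frame.
    """
    if tracked_pose is not None:
        return tracked_pose        # visible: trust the fresh estimate
    return last_known_pose         # occluded: freeze at the last pose

# Example: a thumb hidden behind the palm for two frames keeps its
# old position instead of jumping or disappearing.
frames = [(0.0, 1.0), (0.1, 1.1), None, None, (0.3, 1.2)]
reported = []
last = (0.0, 0.0)
for pose in frames:
    last = update_joint(pose, last)
    reported.append(last)
# During the two occluded (None) frames, the reported pose stays
# at (0.1, 1.1).
```

The same idea generalizes per joint: each finger segment degrades gracefully to its last observation, which is why twiddling thumbs "manages to be OK" even through brief occlusions.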
[00:01:40.970] Kent Bye: And can you talk a bit about the user interactions where a gesture-based hand control would be way better than, say, a physical button?
[00:01:48.655] David Holz: If the goal is to make something like reality, if virtual reality is going to become like actual reality, then I think things are going to become a lot more physical and we're going to use our hands. And sometimes I think we use tools to interact with the world and not just our hands. I think in those situations it makes sense. But in that case, I think the controller should be more like an actual tool. And the motion control in the past has been the sort of convex of like a no-button mouse. But as we're going into it now, where you can like pinch things and grab things, and you can slide a thumb along the side of a finger, or you can touch a thumb to the tip of one finger, tip of another finger, tip of another finger, tip of another finger, or the inside of one finger, touch one hand to the other finger, like you're actually starting to get to the point where you can use your hands as physical feedback. And different parts of your hands can actually map to different things, both event-based and sort of like analog-based, like I'm sliding a finger along another finger. And it's not actually obvious. that that's not good enough. That may actually just be good enough. But it's a totally new space for people to start to explore that just has in no way been available in the past. And so the goal of this beta is to get it out there so people can start to really experiment with those things and get a sense of that.
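Holz contrasts two interaction styles: event-based gestures (a pinch fires a discrete action) and analog gestures (a thumb sliding along a finger drives a continuous value). A minimal sketch of that distinction, with entirely hypothetical names and thresholds — not the Leap SDK's API:

```python
# Hypothetical sketch of event-based vs. analog hand input.
# Threshold and function names are illustrative assumptions.

PINCH_THRESHOLD = 0.015  # metres between thumb tip and index tip

def detect_pinch(thumb_to_index_distance):
    """Event-based: True when the fingertips touch (a 'click')."""
    return thumb_to_index_distance < PINCH_THRESHOLD

def slider_value(thumb_position_along_finger, finger_length):
    """Analog: maps where the thumb sits along the index finger
    to a continuous 0.0-1.0 value, like a physical slider."""
    t = thumb_position_along_finger / finger_length
    return min(max(t, 0.0), 1.0)  # clamp to the valid range

# A pinch at 8 mm separation registers as an event; a thumb halfway
# along an 80 mm finger reads as 0.5 on the virtual slider.
print(detect_pinch(0.008))      # True
print(slider_value(0.04, 0.08)) # 0.5
```

The appeal of the analog case is exactly what Holz notes: sliding a thumb along a finger gives real physical feedback from your own hand, with no controller in the loop.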
[00:02:44.367] Kent Bye: I see. And so are people able to actually do like sign language to be able to detect that?
[00:02:48.802] David Holz: Yeah, MotionSavvy is a startup that's doing sign language interpretation with the Leap. So there are people doing that kind of stuff. But I do think that for interacting with things, it's going to be less about a language and more about physical, intuitive types of interaction. And I think there'll be like some other sort of more grammatical things. I don't know if it'll be like sign language for everyone. I think it'll be a sort of simpler things. And then over time, you'll have sort of new grammars evolving. And things will just sort of come out. It'll kind of evolve gradually, I think, for the average person.
[00:03:19.838] Kent Bye: And so one of the arguments against using camera-based motion sensing is that it works maybe 90% or 95% of the time, and then the 10% or 5% that it doesn't can be really frustrating. So maybe can you speak to that a little bit?
[00:03:31.563] David Holz: Yeah, I mean, it's just going to have to get there where it works 100% of the time. And you know, that's our goal. It's definitely getting better all the time. It's actually difficult to quantify. Like, if something happens once every minute, and it's doing 1,000 things a second, then that means it failed one out of 60,000 times. But in spite of that, it's still not okay. It'll get there. I mean, I don't think anybody in the space expects that people will just put up with something that doesn't work all the time. We just kind of help people understand that we're on a journey, and that things are evolving, and it's not like a rock, it's like a plant, and it grows, and it's sort of this evolutionary process.
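To make Holz's arithmetic concrete: a glitch once per minute at 1,000 tracking updates per second is one failure in 60,000 updates — a success rate above 99.998% that users nonetheless perceive as broken.

```python
# Working through the failure-rate arithmetic from the interview:
# one visible glitch per minute, at 1,000 tracking updates per second.
updates_per_second = 1_000
seconds_per_minute = 60
updates_per_minute = updates_per_second * seconds_per_minute  # 60,000
failure_rate = 1 / updates_per_minute

print(f"1 failure in {updates_per_minute} updates "
      f"= {failure_rate:.6%} failure rate")
# → 1 failure in 60000 updates = 0.001667% failure rate
```

This is why per-update accuracy numbers understate the problem: users judge the system by the rate of *noticeable* failures per session, not per frame.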
[00:04:04.125] Kent Bye: Yeah, and just one thought is the gesture-based control does seem very compelling in terms of virtual reality, so I don't know, being here over the course of the day, if you had any thoughts in terms of the leap motion and virtual reality in particular?
[00:04:16.453] David Holz: Well, so right now, the main thing that we show off is sort of leap on a desk. And that's pretty cool. There are a lot of people doing stuff with that. Then we're sort of transitioning more to the sort of leap on a head-mounted display. And that's interesting in that it's wherever you see, the tracking is also there too. Because the Leap has a field of view which is actually larger than the Oculus field of view, so actually it starts tracking the hands before you're even seeing it. So there's, I think, some really interesting possibilities there. And you can actually start to do that now with the new beta. If you go to the control panel and, like, say optimize for front-facing or something like that, it kind of switches into, like, a little bit of a head-mounted display mode. But then over time, we're going to release more things than that. You know, there are actually cameras in there. so we can start to pass through imagery, so you can start to do some augmented reality stuff. And there's a lot of really interesting stuff that we're going to start to push soon. Not yet, but coming soon.
[00:05:03.204] Kent Bye: Great. And finally, where do you see the Leap Motion going? What gets you the most excited about working on this project?
[00:05:10.707] David Holz: A lot of it for me personally comes down to sort of the limits in people and technology. And I like to think that, you know, we're no longer limited by the size or the cost or the speed of computing. We're limited by how we interact with it. And so we get these sort of weird things, like I carry around with me in my pocket a computer that can do more calculations in a second than I can in my life. But I don't use it for anything other than a notepad or a newspaper or, like, a 90s-era phone. And it's not because I can't do more or it can't do more. It's that there's something between me and the technology around me that's preventing me from utilizing it. And so a lot of the things that I really value are actually using technology for more, not just better and not just replacing things. And I do think at some point that's the place where the real value is, because there's only so much left to replace. At some point we actually have to get better and be better, and I think we can be better with technology than without it. So it's sort of, there's like that larger human quest, that's what we're about.
[00:06:01.649] Kent Bye: Awesome, well thank you so much.
[00:06:03.614] David Holz: Thanks.