#35: David Holz on Leap Motion’s camera-based VR input, new skeletal model for finger & hand tracking, new 3DUI interactions & hand grammar, and augmented reality plans

David Holz is the CTO of Leap Motion and he talks about how they’re able to track two hands and ten fingers to sub-millimeter precision.

The new Leap Motion beta SDK version has a full skeletal model that now treats fingers & hands as one entity. He says that it's hard to do and still have it run fast, but they've managed to implement it. This should open up a lot of physical and intuitive approaches to VR input.
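
To make the skeletal model a bit more concrete, here's a minimal sketch of polling it from the SDK's Python bindings. It assumes the class and property names of the later public v2 releases (Controller, frame(), hands, fingers, bone(), confidence), which may differ slightly from the beta discussed in this episode:

    import time
    import Leap  # Leap Motion v2 SDK Python bindings (assumed on the path)

    def print_skeleton(controller):
        frame = controller.frame()  # most recent tracking frame
        for hand in frame.hands:
            side = "Left" if hand.is_left else "Right"
            p = hand.palm_position
            print("%s hand, palm at (%.1f, %.1f, %.1f) mm, confidence %.2f"
                  % (side, p.x, p.y, p.z, hand.confidence))
            for finger in hand.fingers:
                # Each finger is modeled as four bones; the fingertip is the
                # end joint of the distal bone.
                tip = finger.bone(Leap.Bone.TYPE_DISTAL).next_joint
                print("  finger %d tip at (%.1f, %.1f, %.1f)"
                      % (finger.id, tip.x, tip.y, tip.z))

    if __name__ == "__main__":
        controller = Leap.Controller()
        time.sleep(1)  # crude wait for the device; a real app would use a Leap.Listener
        print_skeleton(controller)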

He also talks about some of the challenges of occlusion, as well as the journey and evolution toward 100% tracking accuracy.
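
That "if you can't see it, keep it still" fallback happens inside the tracking engine, but an application can layer the same idea on top of it. Building on the polling sketch above, here's a minimal sketch that assumes the per-hand confidence value exposed by the v2 SDK; the threshold is purely illustrative:

    # Hold the last well-tracked palm position when confidence drops, e.g. when
    # fingers are occluded by the palm. CONFIDENCE_FLOOR is a made-up threshold.
    CONFIDENCE_FLOOR = 0.3
    last_good_pose = {}

    def stable_palm_position(hand):
        if hand.confidence >= CONFIDENCE_FLOOR:
            last_good_pose[hand.id] = hand.palm_position
        # Fall back to the last confident pose rather than a jittery estimate
        return last_good_pose.get(hand.id, hand.palm_position)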

David says that if VR is going to be like reality, then we're going to need to be able to work with our hands. If we're using tools, then the VR input needs to mimic that tool. And while there are companies like MotionSavvy who are using the Leap to interpret sign language, he sees the future of the Leap as a VR input device as being more physical and intuitive, with a new grammar that will evolve over time.

He speculates on some of the new 3DUI interactions and grammar that may start to develop where you’re just using your fingers and hands. But overall, it’s an open sandbox to experiment with what works and what doesn’t.

He talks about how most of the current demonstrations show the Leap on the desktop tracking the hands, but that they're also moving towards having the Leap mounted on a virtual reality head-mounted display. They're going to start doing more augmented reality integrations with the camera imagery from the Leap, which isn't used as much yet. There's an option in the new beta control panel where the Leap can be optimized for these types of front-facing interactions.
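
For context, switching the tracker between desktop and head-mounted use is a policy flag in the SDK rather than a separate device. A minimal sketch, assuming the policy and image APIs of the later public v2 releases (POLICY_OPTIMIZE_HMD, POLICY_IMAGES); the beta control panel option mentioned above toggles the same front-facing optimization:

    import time
    import Leap

    controller = Leap.Controller()
    # Tell the tracker the device is mounted on an HMD, facing away from the user
    controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)
    # Request the raw infrared camera images for augmented-reality passthrough
    controller.set_policy(Leap.Controller.POLICY_IMAGES)

    time.sleep(1)  # crude wait for the device to connect and apply the policies
    frame = controller.frame()
    for i, image in enumerate(frame.images):  # left and right stereo cameras
        print("camera %d: %dx%d pixels" % (i, image.width, image.height))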

Finally, David says that we're going to start to hit a plateau and diminishing returns in how much technology improvements alone can provide, and that at some point humans will have to get better through new ways of interacting with technology. Leap Motion is ultimately aiming to enable these new types of interactions.

Reddit discussion here.

TOPICS

  • 0:00 – Intro. Leap Motion two-hand, ten-finger tracking to sub-millimeter precision
  • 0:29 – New Leap Motion beta SDK version has a full skeletal model. Fingers & hands are one entity. Hard to do, and hard to run fast. Opens up a lot of physical and intuitive approaches to input.
  • 1:06 – How to deal with occlusion issues? If the tracker can't see it, then it keeps it still.
  • 1:40 – User interactions where gestures would be better than a button. Things will be more physical and we’re going to use our hands. If using tools, the controller should be like a tool. Can use hands as a part of feedback. New types of user interactions with the hands and fingers only. Goal of this beta is to experiment and see what’s possible.
  • 2:44 – Sign language – MotionSavvy is doing sign language interpretation with the Leap. But the new UI will be less like a language, and more about physical and intuitive interaction. Some grammar that will evolve gradually.
  • 3:20 – Camera-based motion tracking accuracy isn’t 100% and can be frustrating. It will get there eventually, and it’s a journey and it’s evolving.
  • 4:04 – Gesture-based control in VR. Leap Motion in VR. Leap on a desk is what they show off. Transitioning to Leap on a VR HMD. What you see is being tracked. Interesting AR possibilities. Beta control panel can be optimized for front-facing. Going to release more stuff like imagery.
  • 5:05 – Where is Leap Motion going? No longer limited by the speed and cost of computing, but by how we interact with it. Using technology for more is what he values. There's only so much that can be replaced by technology; at some point we have to get better with technology. That's what Leap Motion is about.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.954] Nathan Burba: This is Nathan Burba, CEO of Survios. We are doing hardware development, software development, and game development. We really think virtual reality is the next big platform for people, and we're really excited about full motion virtual reality. People actually being able to bring their entire body into something and really feel like they're outside running around, but they're really inside in their living room.

[00:00:33.601] Kent Bye: I see. Yeah. So you guys just recently announced within the last couple of days that you got $4 million of funding. Maybe you can talk about, like, what are you guys going to do with all that money? And now that you have funding, what are you cooking up?

[00:00:45.185] Nathan Burba: So, really the funding is all about developing new technology, working toward a new product, and we're not announcing anything until 2015, but there's a lot of just really cool technical things we're doing, both, like I said, in software and in hardware, and really just trying to help push virtual reality forward, just trying to follow in the footsteps of great companies like Oculus and Sixense, really just working on very difficult problems.

[00:01:07.740] Kent Bye: Yeah, and I've heard of some of the demos that you have in terms of, like, I imagine you're in this room and you're looking around and shooting zombies. Maybe you could sort of describe what that demo is.

[00:01:17.628] Nathan Burba: Yeah, so with Zombies on the Holodeck, you actually feel like you're in a roughly 8 by 8 foot space and you can walk around just like you would in real life. You can pick up different weapons. There's actually an axe on your back that allows you to fight off zombies as they're actually walking toward you. And it's a really visceral experience. It really feels like you're actually there and you're able to intuitively move through the world in a way that is completely natural. There's no abstractions. To quote Palmer Luckey, that's the most important thing: to make it seem like you don't have to learn anything really to be there. You just need to act like a human being would. And that's really what creates a tremendous amount of presence.

[00:01:54.735] Kent Bye: I see. And some of the photos I've seen, you have a HMD on and then you have other different tracking devices. What else do you have going on in terms of what you have attached to your body in that?

[00:02:05.341] Nathan Burba: So the system that we built for Zombies on the Holodeck was a number of different pieces of tracking technology, there's some magnetic stuff going on, some optical stuff, there's a wearable computer, there's a server, there's a stationary optical camera. So it's a lot of things that have been designed to give someone the feeling of being somewhere else and being kind of free in that world. And so really right now we're continuing on that and developing new technology to help us push that out to end consumers.

[00:02:31.517] Kent Bye: I see. So it sounds like it's much more geared towards like a full untethered experience. Is that accurate to say?

[00:02:37.444] Nathan Burba: Yeah, you could say that. And it's basically, you know, when you're tethered, it actually makes you feel, you know, like you're somewhat limited by the system. So the idea is that you don't want someone to remember any constraints about the real world. And that's when they really feel like they're in virtual reality. And that's when you really can get lost in virtual reality.

[00:02:52.298] Kent Bye: And so how do you deal with a safety component there? If you're in the Rift, do you not notice how close to a wall you are? And if you're freely roaming around, how do you prevent people from running into a wall?

[00:03:03.405] Nathan Burba: We have a number of different pieces of technology we're working on to make that happen. Some things that actually come up inside the virtual world that tell you where you're going, and a number of things I actually can't talk about.

[00:03:13.490] Kent Bye: Cool, so it sounds like you were a part of this really magical time at USC with Project Holodeck. Maybe you could just sort of describe to me what was the scene and what was happening there.

[00:03:23.679] Nathan Burba: Yeah, I mean it's simple to describe it. I was in the MFA program at USC and I was working, got a job at a lab there doing some Kinect research. One of the researchers I was working with was a guy named Palmer Luckey. Every time he entered the room, he was the most brilliant guy in the room. Palmer really taught me everything I know about head-mounted displays. I could see from some of the work he did there, and I know he did a lot of work on his own, that he was building what would eventually become the future of virtual reality. And he was, I didn't know at the time, he was essentially the world's foremost expert on head-mounted displays. And he owned like 50 head-mounted displays or something like that. Yeah, 50 different types. So really it was from there, meeting Palmer, meeting James Iliff, who is now the Chief Creative at Survios. Really, it was kind of those formative years where I discovered that you could have motion tracking technology and a really great feedback system, like the Oculus Rift or the Socket head-mounted display that we used before that. You combine those two and there's the possibility of creating something that's wholly unique. And really at the end of the day, we just wanted to make that, we wanted to take virtual reality and make games with it. That was really the most important thing was, wow, we have this amazing medium and all we've seen is just this kind of programmer art-based, you know, research applications. What if I could kill zombies, right? What would that feel like? And it turned out it felt awesome.

[00:04:39.170] Kent Bye: Well, yeah, there's also the Ben Kuchera article, I'm sure you've seen it, where he's like, I'm really tired of playing games where you're shooting zombies in the face. And so I think there is a little bit of a backlash in terms of, you know, in VR, is shooting zombies in the face really the first thing that you decide to do?

[00:04:54.683] Nathan Burba: Well, you don't have to just shoot zombies in the face. You can shoot them in the leg, the arm, the chest. You can really shoot zombies anywhere. That's, you know, the full range of activities in VR, you know, is... Now, I think at the end of the day, I mean, there's a few reasons... There's a reason why games and guns have always been linked: it's because guns are easy to program. They're easy to make them have really, really good feedback. That's why some of the earliest games have been Wolfenstein, they've been Doom. It's easier to produce than other things and it's more effective than other things. And then zombies are basically, part of that is James and his kind of creative vision for this kind of beautiful, awesome, like 1950s zombie world. And that kind of harkens back to some of our roots in Hollywood. But also zombies are just, they're kind of an easy note to play that's once again very, very effective. So we wanted to, you know, get to something that was very effective quickly, as opposed to trying to, you know, you don't want to innovate doing virtual reality as well as innovate too much in all the other areas. You know, it's something that we wanted to see. It's also our lead developer, Alex Silkin, just really wanted to produce that type of game as well. So it really is, you know, what we would like to see, what was easy to produce, that was what led us to producing Zombies, as well as some of the other titles that we made.

[00:05:59.376] Kent Bye: Yeah, I think that totally makes sense, especially when you don't want to be innovating on too many levels at the same time. I'm curious if you did happen to read that article from Ben Kuchera, and he's making the point: if you could go anywhere in the world and do anything, would going around with a gun killing people be the first thing you decided to do?

[00:06:18.375] Nathan Burba: Well, yeah, and it's once again, I haven't read the article, but it's easy to implement that. There's a classic example of something that everyone wants to do, which is go to the beach. That's actually very, very hard to implement correctly in virtual reality, because you can have the audio and video of going to the beach, but really the things that make the beach matter, the warmth of the sand, digging your toes into it, building a sand castle. Going in the water, all these things are primarily based on very complex haptic systems. So you can't go to the beach in virtual reality. You really can't. You can kind of go to the beach. The thing with zombies is that you can create a system that lets you get hurt and simulates dying. There's systems for decapitating a zombie. It's all very, very juicy and very reactive. So that's why that's the first thing to produce, because it's that effective. Is that where you want to go in real life? Well, no, but zombies aren't real. So that's actually where I want to go in a fantasy. That is a fantasy of mine. So if he's tired of it, I mean, that's fine for him. But that, or being in a John Woo movie, or all of those action-oriented things, having a lightsaber battle with someone, I mean, those are the first things that come to mind, not because we want to go do those things in real life, but because those are our fantasies. Those are kind of our action-filled fantasies that make us feel like we're an action hero in a movie. So that's kind of why our system was developed that way, and that's why a lot of VR games tend to wear that initially. A lot of first-person shooters as well, and a lot of video games in general. That's why Call of Duty is so popular.

[00:07:37.272] Kent Bye: I see. Yeah, totally makes sense. Yeah, and I'm curious also, Nonny de la Peña is someone who is also sort of involved in that sphere, I guess, of that lab. Maybe talk about, you know, your interactions with her and some of the stuff that she was doing.

[00:07:49.888] Nathan Burba: Yeah, I've known Nonny for a while. She's, you know, one of the world's foremost experts in immersive journalism. She created Hunger in LA, which is a really cool immersive journalism piece, and she's just one of the people that started using virtual reality and really complex virtual reality systems, like the PhaseSpace system and some of the head-mounted displays built at the MxR Lab, using those systems for something other than research, using them for, you know, something that had some really emotional appeal. And so she was able to find some really good use cases for it. And once again, it's just another reason why companies like Oculus and Sixense and Survios, why we want to push the medium forward to make it easier for people to actually build that content. Because, you know, it took her hours to set up and tear down that system to give people that experience. That should be something that the set up and tear down should take five seconds. You know, it should be like turning on a cell phone. So I think she's just, you know, she found another avenue that wasn't shooting zombies in the face that was very emotionally effective.

[00:08:49.035] Kent Bye: Yeah, and I think one point in talking to her that she made was that, you know, she foresees virtual reality kind of coming into this tiered system with one tier being mobile, the next tier being PC based. And then the final tier, she said, is sort of like a whole VR room of just roaming around and having more of the IMAX theater. Maybe you go to a location externally yourself to just have a more fully immersive. I'm just curious in terms of how you see that tiered system playing out.

[00:09:15.445] Nathan Burba: Well, those tiers are primarily based on the computing systems that exist in society, where you have your mobile system that you carry around, and then you have your desktop, you know, call it a tethered system, I guess, that requires a lot more power, but is a lot more powerful. Then you have your super powerful system, which is an IMAX theater that allows you to do something in the same space as other people, and it's much more powerful than your computer, if you ever look at the servers that they have running at a movie theater. And that tiered system, it's an interesting idea. That's based primarily on the idea of virtual reality plugging into existing computing systems. I personally think that the best virtual reality will be designed and the computing system will be designed and tailored for the virtual reality experience. So it's not about, you know, we're not focused, at least ourselves, on plugging into the computing system that people already have. We're focused on what will the future be and trying to make something that doesn't work based on existing paradigms.

[00:10:09.628] Kent Bye: So you're moving way beyond, say, cloud computing? Or what do you mean by that?

[00:10:14.252] Nathan Burba: Computing systems today, even though they're so ubiquitous, it's not... It's hard to describe, but we basically want to make sure that we're not limited by that. That we're going to create an experience for a user that's really, really special. And that, you know, we like to take a little bit more of a whole cloth approach, I guess. We like to start from scratch and really ask ourselves, not what's the best way to get this out to people, but what's the best way to make the best experience. I see.

[00:10:41.121] Kent Bye: And finally, what do you see as the ultimate potential for virtual reality?

[00:10:46.036] Nathan Burba: I think virtual reality will let you not only get up and run around outside, but you're actually still inside. It'll actually let you not only go back in time, but it'll let you feel human again. It'll let you feel kind of what people have been feeling throughout time. It'll let you run around and jump and hunt. and play sports and really gaming and doing things that involve playing games with other humans. That's what human beings have been doing for thousands of years. Humans have been playing games longer than they've been using language. And games are very, very fundamental. And games with your body are very, very fundamental. And virtual reality will let us get back to that fundamental humanity that oftentimes in our society with our cars and our sitting down and drinking coffee and you know we don't carve out a portion of our lives to get back to our fundamental humanity which is running around playing games with each other. And so this really allows us to do that and then to explore completely different worlds and essentially it allows us to do that plus explore the depths of the human mind and human creativity. So it's really going to be a combination of those two things that makes virtual reality more impactful than literally anything else you can do.

[00:11:52.316] Kent Bye: Awesome. Well, thanks so much.

[00:11:53.918] Nathan Burba: Yeah, thank you.
