David Holz is the CTO of Leap Motion, and he talks about how they're able to track two hands and ten fingers with sub-millimeter precision.

The new Leap Motion beta SDK has a full skeletal model that now treats fingers and hands as one entity. He says that it's hard to do this and still have it run fast, but they've managed to implement it. This should open up a lot of physical and intuitive approaches to VR input.
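
For a concrete sense of what the skeletal model exposes, here's a minimal sketch using the V2 beta's official Python bindings: each tracked hand reports a palm and five fingers, and each finger reports four named bones with joint positions in millimeters. This polls a single frame for brevity; a real application would subscribe to frames via a `Leap.Listener`.

```python
import Leap

# The four bones of each finger, ordered from palm to fingertip.
BONE_TYPES = [
    (Leap.Bone.TYPE_METACARPAL, "metacarpal"),
    (Leap.Bone.TYPE_PROXIMAL, "proximal"),
    (Leap.Bone.TYPE_INTERMEDIATE, "intermediate"),
    (Leap.Bone.TYPE_DISTAL, "distal"),
]

def print_skeleton(controller):
    """Dump the skeletal model for every tracked hand in the latest frame."""
    frame = controller.frame()
    for hand in frame.hands:
        side = "left" if hand.is_left else "right"
        print("%s hand, palm at %s" % (side, hand.palm_position))
        for finger in hand.fingers:
            for bone_type, name in BONE_TYPES:
                bone = finger.bone(bone_type)
                # Joint positions are in millimeters, in the device's frame.
                print("  %-12s %s -> %s" % (name, bone.prev_joint, bone.next_joint))

controller = Leap.Controller()
if controller.is_connected:  # poll once; real apps use a Leap.Listener
    print_skeleton(controller)
```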

He also talks a bit about the challenges of occlusion, as well as the journey and evolution toward 100% tracking accuracy.
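
Leap Motion's internal occlusion handling isn't spelled out in the interview, but the heuristic David describes, "if you can't see it, keep it still," can be sketched at the application level using the per-hand `confidence` value the V2 API exposes. The cache and threshold below are hypothetical illustrations, not SDK behavior:

```python
import Leap

CONFIDENCE_THRESHOLD = 0.3  # hypothetical cutoff, tuned per application

# Last confidently observed fingertip positions, keyed by persistent finger id.
last_known_tips = {}

def stabilized_fingertips(frame):
    """Return fingertip positions, holding poorly seen fingers still."""
    tips = {}
    for hand in frame.hands:
        for finger in hand.fingers:
            if hand.confidence >= CONFIDENCE_THRESHOLD:
                # Tracking looks reliable: use the fresh position and cache it.
                last_known_tips[finger.id] = finger.tip_position
                tips[finger.id] = finger.tip_position
            elif finger.id in last_known_tips:
                # Occluded or poorly tracked: keep the finger "still" at its
                # last confidently observed position.
                tips[finger.id] = last_known_tips[finger.id]
    return tips
```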

David says that if VR is going to be like reality, then we're going to need to be able to work with our hands. If we're using tools, then the VR input needs to mimic that tool. And while there are companies like MotionSavvy that are using the Leap to interpret sign language, he sees that the future of the Leap as a VR input device will be more physical and intuitive, and that a new grammar will evolve over time.

He speculates on some of the new 3DUI interactions and grammar that may start to develop where you’re just using your fingers and hands. But overall, it’s an open sandbox to experiment with what works and what doesn’t.

He talks about how most of the current demonstrations show the Leap on the desktop tracking the body, but they're also moving toward having the Leap mounted on a virtual reality head-mounted display. They're going to start doing more augmented reality integrations with the cameras that are included in the Leap but haven't been used as much. There's an option in the new beta control panel where the Leap can be optimized for these kinds of front-facing interactions.
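
At the time of this beta, the front-facing mode lived only in that control panel checkbox; later V2 releases (roughly SDK 2.1 onward) also exposed it programmatically as a policy flag, alongside access to the raw infrared camera images. A sketch, assuming those later Python bindings:

```python
import Leap

controller = Leap.Controller()

# Tell the tracking service the device is mounted on an HMD facing outward,
# so tracking is tuned for hands seen from the head's point of view.
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)

# Request access to the raw infrared camera images for AR-style pass-through.
controller.set_policy(Leap.Controller.POLICY_IMAGES)

if controller.is_connected:
    frame = controller.frame()
    for image in frame.images:
        print("camera image: %dx%d" % (image.width, image.height))
```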

Finally, David says that we're going to start to hit a plateau of diminishing returns in what technology improvements alone can provide, and that at some point humans will have to get better through new ways of interacting with technology. Leap Motion is ultimately aiming to enable these new types of interactions.

Reddit discussion here.

TOPICS

  • 0:00 – Intro. Leap Motion two-hand, ten-finger tracking to sub-millimeter precision
• 0:29 – New Leap Motion beta SDK has a full skeletal model. Fingers & hands are one entity. Hard to do, and hard to run fast. Opens up a lot of physical and intuitive approaches.
• 1:06 – How to deal with occlusion issues? If it can't see a finger, it keeps it still.
  • 1:40 – User interactions where gestures would be better than a button. Things will be more physical and we’re going to use our hands. If using tools, the controller should be like a tool. Can use hands as a part of feedback. New types of user interactions with the hands and fingers only. Goal of this beta is to experiment and see what’s possible.
• 2:44 – Sign language – MotionSavvy is doing sign language interpretation with the Leap. But the new UI will be less like a language, and more about physical and intuitive interaction. A grammar will evolve gradually.
  • 3:20 – Camera-based motion tracking accuracy isn’t 100% and can be frustrating. It will get there eventually, and it’s a journey and it’s evolving.
• 4:04 – Gesture-based control in VR. Leap Motion in VR. Leap on a desk is what they show off. Transitioning to Leap on a VR HMD. What you see is being tracked. Interesting AR possibilities. Beta control panel can be optimized for front-facing. Going to release more capabilities, like camera imagery.
• 5:05 – Where is Leap Motion going? No longer limited by the speed and cost of computing, but by how we interact with it. He values using technology to do more. There's only so much that technology can replace; at some point we have to get better with technology. That's what Leap Motion is about.

Theme music: “Fatality” by Tigoolio
