#189: What’s new with the Tactical Haptics’ Reactive Grip Controller?

Tactical Haptics is working on bringing realistic haptic touch feedback to virtual reality applications. William Provancher was at SVVRCon 2015, and I had a chance to catch up with him again since our first interview at SVVRCon 2014. He had a lot of updates on their technology, as well as on the wide range of funding sources they've found to get their venture going.

William comes out of the academic world, where he did fundamental research into human perception in order to understand how shear cues could communicate direction information. Tactical Haptics is one of the more innovative companies within the consumer VR community in finding research grants to continue their haptics research. He talks about a couple of grants, including one from NASA to provide more haptic feedback when performing physical interactions with a telerobot, as well as a grant from the National Science Foundation to research what a minimum viable haptic product for video games might be.

We discuss the implications of the uncanny valley, the tradeoffs involved, and what happens when there’s a mismatch between the mental model in your mind and the expectations of a higher-fidelity VR experience that includes hand tracking. William talks about some of their experiments with finger tracking in combination with providing some haptic feedback.

William also talks about some of the latest hand-tracking technologies they're using, including a hybrid between inertial and optical tracking, as well as some of the early discussions about OSVR integration and the potential for Lighthouse integration. He says that the sweet spot for the Reactive Grip controller is when it's representing physical interactions where you're holding onto a tool. Finally, he talks about some of their future plans, including continuing to search for grant and investment opportunities, getting prototypes into the hands of developers and partners, and making the controllers fully wireless.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.034] William Provancher: I'm Will Provancher, founder of Tactical Haptics. Our focus as a company is to bring more realistic touch feedback to VR. Right now we're at StartX and are having a focus shifting from a research phase more in towards business and looking for partners and fundraising.

[00:00:26.958] Kent Bye: I see. And so I think we last talked about a year ago at the last SVVRCon, and so what's new? What's happened in the last year for Tactical Haptics?

[00:00:34.083] William Provancher: So we've had two SBIR grants there, government innovation grants. One has been through NASA. The focus with that was applying our technology to give a greater sense of physical interaction to a telerobot. So you can imagine saving an astronaut from having to go outside into space. Actually, it's quite hazardous going into space, and so if you can increase their dexterity by letting them feel what the robots feel, you can actually do pretty sophisticated things in space without actually having to get into space. The second thing is we have a grant from NSF looking at how to identify the minimum viable product for video games. More or less it's taking technology out of what we're doing, or getting at the core of what our technology could be, for use cases in video games that could be far simpler. So if you're into hacking and shooting and all this kind of stuff, do you need something as sophisticated as what we currently have? And the answer is kind of no. Our initial experiments have already shown you can have a much simpler device but still be way more compelling than vibration.

[00:01:28.213] Kent Bye: And so how did you go about getting a grant from the National Science Foundation in order to fund development that may be applied to the games industry? That seems like a far leap for some of the stuff that they may traditionally be funding. So what's the connection and what are the other applications that they may be interested in? Or is it really gaming?

[00:01:49.986] William Provancher: It came out of fundamental research that I was doing at the university. Initially, they funded research towards understanding human perception to use these same shear cues to communicate direction information. We were going to use it for hooking up to a GPS navigation system or something like that. Originally, the big market we thought would be to put it in a car, put it on the steering wheel. And at the time, I think there was some financial crisis stuff going on, so the automotive industry had other things to worry about. And so we kind of moved on into other things that people were responding to better, which was to create richer experiences in VR and video games.

[00:02:20.969] Kent Bye: Oh, that's an interesting kind of pivot. Doing this kind of core foundational technology, but yet it seems like the biggest market sort of emerged that it was VR. So I'm curious about this question of the uncanny valley, of something that's low fidelity versus high fidelity, and whether or not some of the things that are low fidelity in a low fidelity experience, and then once you try to go to that high fidelity experience, unless you're able to actually achieve every dimension of that high fidelity, whether or not it sort of falls into the uncanny valley. And so as tactical haptics is dealing with a very difficult subject of kind of generalized haptics with the hands, how have you been addressing that question and kind of thinking about that?

[00:03:00.067] William Provancher: So, actually I think this was a subject that came up at the input session yesterday at SVVR. Not directly, but indirectly. Because somebody asked if the panelists knew about what we're doing at Tactical Haptics, and I think every one of the panelists (we know them and they've tried our stuff) said, when it's good, it's really good. But there are other times where there are these mismatches, where I think you start diverging from the exact thing that the mental model in your mind holds for that exact representation. And whether that's an uncanny valley thing or whether it's just this approximation not being good enough anymore, it's hard to say. The other way that we're dealing with it is that right now we also know that you diverge from what your expectation in real life is through hand tracking. And that's actually one of the big things that we developed as part of our NASA grant: bringing in the ability to track people's fingers in space, because that's a really powerful thing to create the sense of presence. And so I would say, in answer, we're painting around the edges of this uncanny valley issue, not by trying to make everything perfectly high fidelity. It's the art of making it good enough and filling in the most important edges by having both reasonably good haptic feedback and having the expectation of good hand tracking at the same time.

[00:04:11.543] Kent Bye: Yeah, and I've sort of had my own experience of the Tactical Haptics. There was a big difference, actually, between the experiences that were more photorealistic, high-fidelity, sort of more like in the real world, versus more of the stylized throwing-around of cubes. And there was something about, I think, expectations: once we go into that higher fidelity, then our mind sort of expects it to feel exactly like it does in real life. But with something that's more stylized, in my own experience at least, I felt like just throwing the blocks around was very immersive, because my expectations for this low-poly world were more of a blank slate, let's say, and it gave me a little bit more of that immersion with the Tactical Haptics. So I'm not sure if you found that as well, in terms of the level of the visual input being connected to how good of an experience someone may have with the Tactical Haptics device?

[00:05:01.296] William Provancher: I'd say that there's a correlation with having your 3D art be good enough, because we started off with these very stylized, very minimalist worlds, all gray, just focusing on the interaction, and immediately people would respond with: your stuff's interesting, but you just need to put more effort into the artwork. So just the act of putting some color on the blocks, which is free, and putting a stylized background on there, is the transformation that takes place, and it's the minimum expectation of something. But really our focus is the interaction, and so all we want to do is provide a certain bar of excellence on the visual side such that you're willing to focus purely on the haptic side. But then again, we also find that everybody's a little bit different, right? Some people are like, wow, your graphics for the sword are awesome, and now I believe that's real. Or the machine gun, you have a great rendering of that. Or fishing, you had a great rendering of fishing. I can see it actually animate the reel and all this kind of stuff. And that's what made it real, because then they stopped thinking about all the stuff that's wrong. So there's this fine line. And for sure, if you have so much graphics going on that it kills your rendering, that's the worst of all. But the minimum ante is you have to have some nice scenes, and then they'll focus on our interactions.

[00:06:09.752] Kent Bye: And so you mentioned finger tracking. What are you doing to be able to actually track fingers? Is that part of the Tactical Haptics' reactive grip, or is that some other different way that you're doing the finger tracking?

[00:06:21.782] William Provancher: So we have a number of combinations. Some are integrated in, some are external. And the reason we have both is just so you can have a proof, or I should say a truth model, to compare against the other, but you could use things like Leap Motion in combination with what we're doing. Right now, all of the initial devices don't have integrated finger tracking, but the idea is that we know this is going to be one of the big draws, and so the product roadmap is to have integration down the road. And it's really the messaging to people that are choosing solutions to say, well, I don't have to choose between good haptics and seeing my hands in VR. You actually have a solution that integrates down the path. Whereas there's no such roadmap for markerless depth camera-based stuff. You'll see your hands there. You'll have to at least wear a glove, and maybe you'll get a vibration at each of your fingertips at some point, but that's not a meaningful physical interaction, right?

[00:07:09.175] Kent Bye: And I know in the past that you've potentially had tracking with the STEM modules, and what are you doing now for your solution to do positional tracking of these controllers? Is it still using the STEM, or are you looking into other things like Lighthouse or other potential solutions that are out there?

[00:07:23.762] William Provancher: Yeah, so we're actually still using the Hydra because we haven't had access to STEM yet, but that's an obvious integration, and we have the bracketry all waiting for our STEM module when we get it. The wireless controller that we're using here today just has all onboard inertial tracking. We'll combine that back in with some visual tracking. In fact, we were talking to Yuval from Sensics about probably using their open-source hybrid inertial-optical tracking system in combination with our stuff. We've also talked to some folks at Valve about using Lighthouse. We're really excited about that because it's a really awesome fidelity of tracking. And ultimately, we want to be a haptic technology company that integrates with all of the existing ecosystems. And so we're essentially trying to show that we can be tracker agnostic. We already have an integration with the Polhemus, a higher-end tracking system that we showed at GDC. We have our own stuff here today. STEM and Hydra are easy, and so we don't really care, as long as it meets the minimum bar of tracking requirements, and even the home-baked inertial-optical stuff that we're doing is sufficient.

[00:08:19.436] Kent Bye: Yeah, and when I was at IEEE VR, one of the takeaways that I got from that conference is that a lot of the haptic solutions are going towards being very specific to a very specific use case, and so there'll be a tool that is designed to do one very small thing, but it does it very high-fidelity, very well, and it's very convincing. So I guess there's a challenge of creating a generalized haptic solution without having specific use cases. So how would you articulate what specific problems you think the Reactive Grip is trying to address?

[00:08:50.922] William Provancher: Well, clearly we do the best job of representing physical interaction when you're holding on to, we'll say, a tool. We've talked to people like Oliver Kreylos, and, you know, if there's some interface that I'm seeing in the game or in the VR environment that looks like essentially a handle, his mind will leap to accepting it immediately, especially if the haptics feel right. So, for sure, the best interactions for us are going to be where you're holding a tool. That's a pretty broad range of things. We have a gravity gun right now, swords, flails. We have a different form factor of a device that we've shown to Intuitive Surgical and some other medical companies. It's more of a precision grip. And again, if your hand is in the same pose as what you see, either the tool on the surgical robot or in the VR environment, that's probably the biggest thing towards gaining acceptance. And it's this willingness to either accept that my hand can be in this pose and grabbing onto the thing in VR, or that I'm holding a tool that's close enough to what the human interface is. But the short answer would have been, it's where they're willing to accept this tool interface. We're not going to be the best for reaching out and exploring, palpating someone's abdomen to figure out whether you have a gallstone or something like this. We're just not able to represent that kind of information. But if you have a tool representing force interactions, we do a pretty good job in that case.

[00:10:03.387] Kent Bye: Awesome. Here at SVVRCon, where this interview is taking place, we're a few weeks out from E3, which is potentially where Oculus is going to be giving more information about their input controls. But up to this point, a lot of people are wondering where things are really going to go, so it's still a little bit in limbo. So, as we're waiting for that news, I imagine that would help you develop your strategy going forward, based upon what Oculus, one of the other big players, is going to be doing. Aside from Sony, which right now has Move, and they may come out with something else, I think the big open question right now is what type of input controls Oculus is going to go with, and depending on that, how does that impact the future of Tactical Haptics' strategy?

[00:10:48.667] William Provancher: I actually don't know that it really strongly impacts us. The reason is there's enough people doing motion controls right now that there's some groundswell, some belief that motion controllers will be the path. We know we can also have integrations for things like traditional Xbox and DualShock controllers. That's what we do next. If Oculus says we're going to do a gamepad, we just stay on target. We work with maybe people at Valve and HTC and Sony. In the meantime, we get our initial product out there, gain acceptance, and migrate in whatever direction makes sense from there.

[00:11:15.868] Kent Bye: I see. And so what are some of the next big things for you as we're kind of waiting in this window before some of the big consumer HMDs are released here in the fall and in the first quarter of next year?

[00:11:26.630] William Provancher: Well, so we've had a very limited closed beta. We've been starting to sell and get controllers out to potential partners, some developers, and we're looking to expand that. So over the summer, we'll be doing some fundraising with the goal of expanding our beta and doing some market testing. The other thing is we've gotten a lot of attention or a lot of interest on essentially making stuff wireless. We know that that's the way that the product needs to go and just putting some engineering hours behind making that more reliable so we can get out to partners and developers. That's probably the next big critical thing for us to be focusing on.

[00:11:59.626] Kent Bye: Great. And finally, what do you see as the ultimate potential for virtual reality and what it might be able to enable?

[00:12:06.130] William Provancher: Jeez, that's kind of an interesting question, though. I don't know. It's going to change surgical training. It's going to change stroke rehabilitation and military training. I think it's one of these untapped things right now. And it's all the same things it's been doing right now, just better and getting out to more people.

[00:12:21.782] Kent Bye: OK, great. Thanks a lot. Thanks for having me. And thank you for listening! If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voicesofvr.
