#12: William Provancher on Tactical Haptics Reactive Grip, Unity SDK, Haptics as “The next big problem in VR,” Kickstarter lessons, and conditional VC interest

William Provancher is the founder of Tactical Haptics, which produces the Reactive Grip controller. The Reactive Grip controller simulates force feedback by addressing the two components of touch: the kinesthetic component (muscle forces) and the tactile component (surface friction forces).

William explains the genesis of his project and some technical details of how it works. He also talks about why he's on a quest to bring his academic research to a wider audience, despite the fact that a number of investors are telling him that he's too early. He admits that he's more of an academic than a businessman, but he's been able to live in both worlds by choosing the right framing to create devices for different types of perceptual research.

He also talks about how, at the IEEE VR conference, Henry Fuchs identified haptics as the next big problem to solve in VR. William got some recognition for his work by winning the best demo award at that conference.

He also talks about the upcoming SDK for the Reactive Grip, which will let you translate physics engine data from Unity into haptic feedback, and which will also include a number of canned interactions and any combination of mass, spring, and damper.

William is very candid about what did and did not work with his unsuccessful Kickstarter campaign, but is very determined to stay the course and help bring this technology to the world.

As far as the future of VR, William wonders how long the collegial and collaborative nature of the VR community will remain once the consumer market starts to take shape.

Reddit discussion here.

TOPICS

  • 0:00 – Intro & Tactical Haptics Reactive Grip controller, which mimics the friction or shear forces that you'd feel when you're grasping an object
  • 0:33 – How are you creating this haptic force reaction? Sliding plates represent frictional forces in a way that mimics force feedback.
  • 1:15 – Force feedback is any external force applied to you, which has two components: the kinesthetic component, forces received through the muscles, and the tactile component, felt by the skin of the hand. You provide half of the kinesthetic experience, and the tactile forces are mimicked.
  • 2:08 – In VR there's body tracking, the head-tracked visual component, and haptics as key components. Matching your expectations in VR is what makes it more real.
  • 3:03 – How did you get started with haptics? What's driving you now? Doing research for 15 years. It worked so well that it became a quest to bring it into the world, and it's simple enough that it could just work. Early user testing on what people want from haptic feedback in immersive games: adding to the experience of the game.
  • 4:45 – What are some of the interactions that you support with your SDK? Two modes of interaction: direct calculation of physical forces coming out of your game engine, starting with Unity and eventually UE4, and scripts that simulate a gunshot type of motion.
  • 6:24 – What are the canned interactions that you'll be providing? A set of demos that portray the sense of contact and resistance after contact, the sense of inertia, kickback torque, and elastic, multi-dimensional deformable objects. In addition, there are all sorts of combinations of mass, spring, and damper.
  • 7:55 – What are you using for positional tracking? Using the Razer Hydra at the moment. Others know how to do tracking, so they will use solutions from others. Their goal is to provide feedback that's more compelling and sophisticated than rumble, but cheaper than force feedback.
  • 8:23 – Are there any safety concerns with haptic feedback injuring people? Not sure, but usually people who complain about that were gripping the controller too tightly.
  • 10:10 – What lessons did you learn from your unsuccessful Kickstarter?
  • 12:26 – Will you be doing another crowdfunding effort or planning any other fundraising efforts?
  • 13:28 – The Razer Hydra seemed to come a few years too early, before the demand from VR was there. Is there enough of a viable consumer market to support this type of haptic feedback? Conditional venture capital interest and doubts about viability, and then changing perspectives post-Facebook acquisition: we'll be interested if others are interested.
  • 15:07 – Winning the demo competition at IEEE VR with the Reactive Grip controller. Henry Fuchs' keynote framing the Facebook acquisition as the best thing to ever happen to VR. It's a hard problem, and their solution isn't perfect, but it's really good. Haptics is the next big challenge for VR, and the academic research community sees the value.
  • 17:27 – How have you dealt with the culture clash between an academic mindset and the more enterprising start-up mindset? Make devices and study perception with them. The next things that need to be done aren't always framed in an academic way, but they can be. It's all a matter of framing, and he's been able to find the intersection between framing what you want to do and what the research needs are. It needs to pass the KISS principle to be viable in the consumer market.
  • 19:26 – What do you see in the future of VR? Wondering how much of the collegial, collaborative vibe will remain once market forces start to divide people going after a limited pool of resources.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.975] William Provancher: I'm Will Provancher, founder of Tactical Haptics. I'm actually on leave from the University of Utah. I'm a professor of mechanical engineering there. Tactical Haptics, what is it? So Tactical Haptics is the startup. Our goal is to commercialize Reactive Grip. Reactive Grip technology is a technology that mimics the friction or shear forces that you would feel when you're grasping onto an object.

[00:00:34.297] Kent Bye: I see. And so how is that actually happening? What are some of the insides of the guts in order to create this force reaction?

[00:00:42.103] William Provancher: So what we're doing is we have some actuators in the handle of the device. You're hanging on to it. It's like a motion controller. You can think about hanging on to a Wiimote. It's just on the surface of the Wiimote, there would be sliding plates. And those plates are moving along the length of the handle. And by coordinating the motion of these, you're able to represent these frictional forces that you would normally feel as if I'm holding on to that sword, that gun, etc. And through that, we're able to mimic the sense of force feedback because you are matching those friction forces to your actions.
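The coordinated plate motion he describes can be sketched in a few lines. This is a hypothetical illustration, not Tactical Haptics' actual firmware: the function name, gain, and travel limits are all assumptions. The idea is simply that a tangential force along the handle maps to small, opposed, clamped plate offsets, so the resulting skin stretch mimics the shear of a real resisting force:

```python
def plate_offsets(tangential_force, gain=0.001, max_travel=0.004):
    """Map a tangential force (N) along the handle axis to a pair of
    sliding-plate offsets (m). Plates on opposite faces of the grip
    move in opposite directions, so the skin stretch across the palm
    and fingers mimics the shear of a resisting force. Travel is
    clamped because the plates have a small physical range of motion."""
    offset = max(-max_travel, min(max_travel, gain * tangential_force))
    return offset, -offset
```

A stronger virtual force simply saturates at the plates' travel limit, which is consistent with his later comment that the plates deliberately have a limited range of motion.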

[00:01:16.068] Kent Bye: Can you just kind of describe what is force feedback? Is that like real-life reactions or what is force feedback?

[00:01:22.482] William Provancher: Right, so force feedback really means external forces applied to you. But more generally, just let me take a step back. When you have a touch interaction in the world, what it's composed of is a kinesthetic component. Those are the forces and motions supplied by and received by your muscles. And there's a tactile component to it. It's coming in through the skin. And so what we're doing is, rather than giving you that resistance that you would feel through your muscles, you are supplying the motions, which is one half of the kinesthetic experience, and we're mimicking those friction forces you would feel for the tactile component. And so you're kind of getting one and a half of those two elements that are associated with touch. And when you marry it up to the sounds, the sights, everything else that you're expecting from the real interaction in the virtual environment, it just works.

[00:02:08.632] Kent Bye: I see, yeah, and so in virtual reality it seems like there's tracking your body, there's the visuals, but then this other component of the haptic feedback seems to be a key component in terms of creating this sense of presence. Maybe you could speak to that a little bit.

[00:02:25.108] William Provancher: Right, well it's come up on a couple of the panels here. It's kind of interesting. Some people have been talking about reaching out into virtual worlds, and sometimes, by accident, there's an object in the same place that the virtual object is, and people just get freaked out immediately because there's something really there. What we're doing is we're tracking your motions, so we know where you are. These motions are hooked into a game engine, so there's a physics engine that's calculating what those forces should be, and we're just matching those friction forces up in that scenario to match what your expectation is. And it's really meeting your expectation that's the magic of this. When you reach out and there's something there, it just makes it real.

[00:03:04.485] Kent Bye: I see. And so how did you get started into this? I mean, what is really driving you to create this device to have this experience?

[00:03:12.929] William Provancher: So I've been doing research on haptics for, geez, almost 15 years now. And we go to these annual conferences where me and my academic colleagues, we go and present great stuff to each other. And well, when we stumbled upon what we're doing right now, it just worked so well. It just became this quest to bring this into the world. And it's just simple enough that it could work. But really how we got there was we started off by doing some things that were more perceptual based. What we were doing was stretching the skin of the person's index finger or thumb, and the direction of that stretch you could encode into direction cues. You could tell people to turn right or left or whatever. And people can perceive this, and they can actually move in the direction you indicate. But then we actually started asking people in a game environment, well, if we told you the bad guy was over there by giving you cues through your skin, would you do anything with that? And the answer was, I'll just say, dot, dot, dot, no. People kind of said, well, that draws my attention away from the game. But on the other hand, we started using that to represent forces or the interactions in a game, your motions. The example I always give is we had a driving game. And you're driving around the curves, and you're feeling or portraying the centrifugal forces through your thumb tips in the middle of the thumbsticks, like the sway left and right that you would feel with your body as you go around these curves. And people are like, the bad guy over there is nice, maybe, if it helps me perform better, but it's distracting. But this other thing you're doing, where you're actually adding to the experience of the game, give us more of that. And there's nothing more engaging than I have a motion controller in my hand, I gesture out, and something's there.

[00:04:46.260] Kent Bye: And so maybe you could talk a bit about some of the actual interactions that you have, say, available through your SDK in terms of how do you describe what you can do with these controllers?

[00:04:56.815] William Provancher: So our SDK is kind of in the formation stages, so I can describe what we'll initially support. There's two modes of interaction that we're going to support. One is the direct calculations that will come out of your physics engine. So whether you're using, well, we're currently just supporting Unity. We're trying to add support for UDK, and we'll also have some basic C++ support for the people that are doing their own physics simulations from the ground up. But when you calculate the forces in those simulations, give us a vector, basically a 3 by 1 vector in space to represent what those forces are, or a 6 by 1 force-torque vector, and we just turn that into what our device does to portray those. So that's just the direct playthrough of forces that you're calculating. And every game engine has the ability to calculate physics forces, so it's just pointing that computed information at our device. So that's pretty straightforward and easy. On the other side, it's scripting things like when we do a gunshot. We don't really have any physics associated with simulating a gunshot. What we're doing is we're just calculating a really quick backwards torque. And so we have a script that says move those sliding plates in a particular fashion and then move them back. And that just happens in time. And there's all kinds of examples of things that people might want to do with this kind of scripting to play an event where we've stored this in advance. I hit go, it plays it in your hand, and it feels good with the scenario that you're doing. We have a couple of different types of gunshots right now. It can build from there.
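The scripted mode he describes can be illustrated with a tiny sketch. Everything here is hypothetical (the function name, sample counts, and travel values are assumptions, not the SDK's API): a gunshot is just a pre-stored sequence of plate offsets, a fast snap backward followed by a slower return, played back over time with no physics simulation behind it:

```python
def gunshot_kickback(peak=0.004, snap_steps=2, return_steps=8):
    """Pre-stored plate-offset samples (m) for a scripted gunshot:
    a fast snap to full backward travel, then a slower linear
    return to center. Played back at a fixed tick rate, this gives
    a quick kick without any physics calculation at runtime."""
    snap = [peak * (i + 1) / snap_steps for i in range(snap_steps)]
    ret = [peak * (1 - (i + 1) / return_steps) for i in range(return_steps)]
    return snap + ret
```

Different weapons would just store different envelopes; the direct-playthrough mode, by contrast, would compute each sample from the game engine's physics forces every frame.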

[00:06:24.420] Kent Bye: I see. Maybe you could sort of run through some of the metaphoric applications in terms of like stretching springs and gunshots and maybe kind of list through all the things that you've seen that you've been able to do with the reactive grip controller.

[00:06:37.426] William Provancher: So right now our demos, it's really meant to demonstrate one of each type of interaction. And so we knew we'd want to be able to portray the sense of contact and resistance after you've made contact. So we have a sword demo that does that. We wanted to portray a sense of inertia, so we have a flail demo that's got a ball and chain on the end of it, so when you swing it around, you can feel the shifting mass as that moves around. We knew that people would want to do shooters, so we had a really simple example of just the little kickback torque that you would feel in your hand. The elasticity that you were just talking about, we knew that we could do this on a couple axes. You can just do a single axis of motion and move the sliding plates in a way that represents those shear and friction forces, but we've kind of gone one step beyond that. Now we have this kind of elastic, multi-dimensional deformable object. It's just like you have a piece of rubber in between your two hands, and as you pull them apart, you can feel the resistance on the axis where you're pulling in tension. You can twist, you can bend, and more or less, in that particular case, you can represent the elasticity of just about anything loaded on any axis. Everything else is just kind of an example. Once you have the ability to represent a mass, a spring, and a damper in these basic fundamental demos, everything, as they say in academia, is just left as an exercise to the student, right?

[00:07:54.930] Kent Bye: Awesome. And so what kind of positional tracking are you doing or are you kind of relying on other external devices to do the fully 3D positional tracking?

[00:08:04.933] William Provancher: Right, so we've been using the Razer Hydra thus far for our tracking. It's kind of a means to an end. It's actually probably the weakest part of our demos right now because of the high degree of warping that you get: if you move your hands straight apart from each other, on the screen it may actually look like your hands are kind of bending around the axis of the tracker base. But our goal is not to reinvent the wheel with tracking. There's people at Sony that know how to do tracking, there's people at Oculus that know how to do tracking, there's people at Sixense that know how to do tracking, and there's probably many others that will come about. Our goal is to supply haptic feedback that is way more engaging than vibration, not as limited and as complex and as expensive as force feedback, and use other people's tracking systems and integrate the two together so that the end consumer gets the big win of great haptic feedback with great tracking.

[00:08:53.713] Kent Bye: Are there any concerns in terms of the safety of actually injuring people who are using something with this haptic feedback? Is it going to give feedback strong enough to put their hands in danger at all?

[00:09:06.677] William Provancher: That's a good question. I would imagine that if we overdrive it, which we've kind of been careful about by having a relatively limited range of motion of those sliding plates in the handle. There's been a few people that, it's usually as a result of gripping the handles really tightly. So it's more a function of the grip force that people have supplied than what we're actually doing, I believe. The analogy I give is that if you've ever been driving in the snow and you've done the white knuckle driving, you get to your destination and your hands are actually really tired. What we find is that the people who grip anything tightly or push down hard, like in the old days when we were doing the navigation cues to a single fingertip, if they push hard on anything for a long period of time, they're uncomfortable. It's a valid question. The answer is I don't know, but most of the people that complain about any kind of tingly or long-term after effects, it's just because they're really squeezing down on our controllers. It's the type of thing that if you squeeze down on anything for the 10-minute demo that people do, their hand's going to be tired. So I'm not sure how to quite answer it, but it seems like, so far, people adjust over time, and it's not going to be a problem.

[00:10:09.076] Kent Bye: I see. And maybe you could talk a bit about your Kickstarter that you attempted, and what happened with that, and what your plans are going forward.

[00:10:17.862] William Provancher: It's a fair question. So I can say that we've learned a lot of lessons about what not to do, and in fact I'm going to be on a panel tomorrow at the SVVR conference. Some of the things will just come out naturally that I won't have to share, and some of them I'll share. We learned a couple things. One is that you don't compete with the release of two consoles, the Xbox One and PlayStation 4, coming out in the same month period. My hubris said that we could just get stuff out there. I just wanted to prime the pump. Once you try this technology, most people are like, I get it now. So I just wanted to have that start much sooner than later. The other thing that we made a pretty rookie mistake on is we thought, and I did a lot of surveys on this, we surveyed developers and said, look, we have a haptic feedback technology. There's tracking technologies out there. What about if we sold it with or without that tracking integrated? It would affect the cost of the system. What do you think about that? And they said, well, we don't really want you to put the extra cost in there. We already know how to call a couple extra function calls, and we have this hardware already. There was the Sixense Kickstarter that was the month before. A lot of them already had that hardware, so they didn't need it. That killed the upward potential of our Kickstarter, because everybody else that was just hearing about our stuff for the first time had never heard of Sixense. They were like, well, we'll wait for the consumer device because we missed the boat on the tracking. So that was a big problem. The other big rookie mistake we made is that we were shooting for a batch of about a thousand controllers, and that's why we were shooting to raise about 175,000. That was about what my estimated cost was, and we didn't have a lot to fall back on if I guessed wrong.
And so we had a target that was probably bigger than what the 400 or so backers that we had were going to be able to support. But in addition to that, really the problem that we had is that we held back some of our value. Some of the value that we had came out a week and a half into our campaign, which is that, well, geez, this $165 or this $319 set of one or two controllers, it may look expensive, but one, you're going to be the first one to get it. And two, by the way, hey, you can reconfigure this into being many other peripherals. You can have it be a machine gun, a flight yoke, a joystick, etc. And that's the big thing that I will say to people that are thinking about doing a Kickstarter: if you've got some good stuff, don't hold it back. Because you get kind of one big look from everybody, a lot of attention in the first couple days, and if your good stuff is waiting to come out two weeks later, it's kind of too late.

[00:12:46.773] Kent Bye: I see. And so are you planning another Kickstarter or searching for funding through other means?

[00:12:52.336] William Provancher: So we're considering doing another crowdfunding campaign. It may or may not be on Kickstarter. Right now, what we're trying to do in the short term is identify some strategic partners and get copies of controllers out to them. We will consider doing more crowdfunding as we go, but I also want to build a base, getting some angel investment. One of the things that you find is the pricing that people have to put out on Kickstarter or any of the crowdfunding campaigns doesn't have a lot of extra, we'll say margin in it, not enough to be able to pay salaries. And so to do this in combination with investment is really what makes the most sense.

[00:13:28.527] Kent Bye: Well, I think if you look at the Razer Hydra as an example, as sort of a device that was ahead of its time in terms of, you know, virtual reality wasn't even there yet. I'm just wondering how the whole timing of this haptic feedback is coming into play there in terms of not really being widely adopted by a mass consumer base, whether or not people see that potential there.

[00:13:51.370] William Provancher: Yeah, so people were trying our stuff back in December and January, shortly after our Kickstarter was over. And they said, it's great stuff, but you're early. And this was before Facebook invested in and bought Oculus. And so people were like, you're five years early. There's no market, and you're not part of the initial wave of technology that has to come in. The head-mounted displays and the tracking have to be perfected, and a million need to be sold, and then you have a viable market. Okay, well, what do I do in the meantime? Well, I don't give up that easily. So we just kept at it. And we've been working on making our stuff more manufacturable and all that, and then along comes Facebook and invests in Oculus. And now those same people who back in January were saying you're maybe five years early, they're like, huh, your stuff is really good. You fit this niche: well, I can't have anything where I walk around and gesture naturally and get anything better than vibrating or buzzing in my hand. This might actually be a viable market. Stay in touch in case somebody else is really interested in you, because we might be interested too. So it's this kind of interesting time where you can tell that the tide has turned and everyone's starting to look with a little bit more interest.

[00:15:07.565] Kent Bye: I did see that you also went to the IEEE VR and kind of entered the reactive grip into that competition. Maybe you could talk about that scene there and what happened with that.

[00:15:17.155] William Provancher: Yeah, so the funny thing is that I'm much more centered on the academic side. And so going to IEEE VR, I understand that audience a little bit better. They're researchers, they're very serious about what they're doing. They identify certain challenges within the VR space, and they go after it, and that's their focus. And everybody's doing something slightly different. And there it was acknowledged, it's actually really interesting, a keynote from Henry Fuchs. He gets up in there, he gives this really good talk. Actually, he even modified his talk the day before to talk about what does this Facebook buyout of Oculus mean. And more or less what he said is, it's not a good thing or a great thing. It's the best thing that's ever happened to VR, is what he says. Then it's almost like there was a shill in the audience that planted a question. Steven Feiner comes out, he's a guy that's pretty well known in the VR space as well from Columbia. He says, well, but now I can see into the VR space, I can gesture around and I reach out and I go to grasp for something and it's not there because there's nothing resisting my hand motion. I was almost like jumping out of my seat saying, you should come and try our demo. So, but the great thing was, is we went there, people recognized that this is a really hard problem and that our solution's not perfect, but it's pretty darn good. And that's one of the reasons why we ended up getting a best demo award, because those folks know it's a hard problem and they're like, great. And it's really a tempting market to go after as well, because the high-end VR market, if those researchers, if those folks are really interested in your stuff, it means that technically you're performing at a level that they recognize as something that can't be done right now. 
The challenge then is, well, do you focus on that market in the short term, where you could move the decimal place once from what the consumer VR price rates are of a couple hundred dollars. And it's really tempting, but it's not a huge market right now either. But the great thing is you've got some recognition from people who are really critical about this and know that it's a tough problem. And in fact, Henry Fuchs identified it as the next big challenge for VR. And these are people who do this stuff for a living in this research. So it was a really great experience to turn and bring our stuff from going after a gamer crowd at GDC to going towards a much more research-oriented audience, with people that are like almost the hall of fame of VR, and get some great recognition from them.

[00:17:27.604] Kent Bye: Yeah, and I've heard reports back from a few other people who were in that keynote, and they said that one of the slides that he had was talking about how in the academic community, if there's no guarantee that there's going to lead towards a dissertation or a certain research, that it kind of prevents this degree of innovation. And so coming from academia, it seems like you've been able to break out of that a little bit, or at least maybe you've had to take a leave of absence in order to do that sort of more entrepreneurial startup. Maybe you could just sort of speak about that culture clash between academia and the startup entrepreneurial approach.

[00:17:58.572] William Provancher: You know, it's really interesting. And a little bit, I don't want to say it's a cop-out, but it is a little bit of a cop-out. It's really a matter of where your heart is. What I've done since I've been at the university is I make one device after another and we study perception with it. And that's how I can be an academic, and I write papers and I get tenure, and that's all great stuff. And he's right that the next things that need to be done aren't always, we'll say, PhD-worthy or noble science contributions. But framed in the right way, they can be. And I've always tried to frame the stuff we're doing research-wise. In fact, I have a couple of PhD students that are still doing things with these new devices that we've developed because it helps us understand new things about perception. So we've done perception of inertia. We've done perception of impact. And it's all a matter of framing. And in fact, some people have said, you should give your devices to some people in some neuroscience labs. So I agree that in some cases it's not something that you can necessarily get a PhD from. But a lot of it's kind of in framing what you want to do. And I think most academics, like I don't have the training to be a successful businessman. I just have the desire to get the technology that we've developed out into the real world. And the reason I have the belief that I can do it is because it's just simple enough and effective enough to possibly work. Anything, you know, like if it doesn't pass the KISS principle, you know, the keep it simple and you can fill in the second S of that one, if it doesn't pass the KISS test, it's not going to make it into the real world commercially. And this stuff is just one step more complex than rumble, but I think pretty effective compared to rumble.

[00:19:27.322] Kent Bye: Great. And maybe finally, if you have any sort of other final thoughts in terms of where you see the overall VR space here being at the first ever Silicon Valley virtual reality conference here.

[00:19:39.381] William Provancher: I think it's awesome to see kind of the nexus, the gathering of so many people interested in VR and interested in pushing VR into the commercial space. I think it's awesome also that they're somewhat collegial at this point, and that people are, well, you're still the underdog. Everybody's rooting for VR because they want the market to emerge, right? And I'm really happy to see that. I'm kind of curious how it's going to be a year or two years from now as the market actually starts to emerge. There's these marketing reports that say that consumer VR is going to be a $2 billion industry next year. And if that comes to fruition, is everyone still going to be willing to meet and talk and all this kind of stuff? I hope so, because that's how I learn and that's how I integrate. And that's how the whole entire industry is going to go forward. It's the rising tide raises all boats theorem, right? And you'd like to keep that going, but at some points you've got to wonder. So anyways, in a nutshell, great gathering of people. Great to see people developing apps. Great to see people covering all the difficult hardware and the integration, and great to see people wanting to cross-pollinate and talk, and the great interaction that's happening here.

[00:20:45.730] Kent Bye: Well, thank you so much.

[00:20:47.181] William Provancher: Thanks for having me.
