#38: Peter Sassaman on Team Gauntl33t’s Project Lance VR Haptic feedback glove & their open hardware development approach

Peter Sassaman talks about Team Gauntl33t’s Project Lance VR haptic feedback glove that he brought to SVVRCon. This is an open source and open hardware approach for creating a haptic feedback glove that includes 3D printed materials, an Arduino board, Hitec HS-322HD servos, and leather and elastic materials for the glove.

They’re integrating the Arduino board into VR through the Uniduino plug-in for Unity, which works with the free version of Unity. They’re currently doing positional tracking with a Razer Hydra and a modified version of the Sixense SDK, but they’re planning on expanding support to PrioVR and STEM, and potentially Leap Motion and perhaps even the DK2 camera if that’s possible.

They’ve started to make some of the source files available on their GitHub page for all of the 3D printed materials, and plan on sharing more of their source code there over time.

Peter talks about being inspired by sci-fi novels like Ready Player One and Snow Crash. Because he was interested in getting involved with VR on the hardware side, he decided to start trying to tackle the problem of haptic feedback since the omni-directional treadmills were already being worked on with the Virtuix Omni.

He talks about his design process and various decisions along the way, and a lot of their future plans moving forward. Tactical Haptics founder William Provancher told me that at the IEEE VR conference, haptics was discussed as one of the biggest open problems in VR at the moment.

So if you’d like to get more involved in developing haptic feedback devices, then be sure to reach out to them via their website and check out what they’ve posted in the GitHub repo for Team Gauntl33t’s Project Lance.

Reddit discussion here.

TOPICS

  • 0:00 – Intro to Gauntl33t Project Lance VR Haptic feedback glove
  • 0:26 – Components that were used? An Arduino board along with Hitec HS-322HD servos
  • 1:15 – Positional tracking with Razer Hydra. Expand to PrioVR or STEM motion tracking in the future.
  • 1:36 – Haptic feedback on the front of the fingers by pulling back with the servos
  • 2:22 – VR demo of a coffee shop where you can pick up a squishy bag or a hard cup. Want this to be integrated into VR adventure games
  • 2:58 – How to distinguish between hard and soft objects? Servos turn on completely for hard objects and pulse on and off for soft objects
  • 3:25 – What kind of code are you running on the Arduino? A modified version of Firmata. Using the Uniduino plug-in for Unity, which runs with the free version of Unity.
  • 4:27 – What are your future plans for it? Will be uploading all of the STL files to GitHub, along with the code that they actually wrote. They modified the Sixense SDK, but they’re planning on making as much of it available as an open project. They may have an Indiegogo campaign that provides some of the 3D printed components and servos, but they’ll need a couple more iterations before doing a crowdfunding campaign for a full product.
  • 5:44 – Why use leather as the material? Using elastic, with leather and rivets to hold the pieces together.
  • 6:33 – Why elastic? To fit many different sizes of hands.
  • 7:03 – What material do you use to connect to the fingertips to deal with different surfaces? Only simulating the size and hardness of the object. Potentially use buzzers in the future. Aiming to keep the cost down, but people can modify and expand. Aimed at hackers and makers to collaborate and innovate on haptics.
  • 8:20 – Using camera-based tracking of hands with a Leap Motion? Want as many different trackers as possible. Potentially even with DK2.
  • 9:04 – What inspired you to get into VR development? Ready Player One, Snow Crash, and other VR content. It’s now possible, and he wanted to work on the hardware side. Treadmills were already being worked on.
  • 10:01 – Tactical Haptics Reactive Grip™ and whether you’ll have to choose between a haptic glove or haptic objects. Turn off the gloves when you’re holding a prop item. Potentially all integrated into a single glove in the future.
  • 10:55 – It looks fairly fragile. How durable is it? Don’t be afraid of breaking it
  • 11:23 – What’s in the huge box on your arm? Servos are in there, and they’re pretty big. Need metal gear servos
  • 11:50 – How does a servo work? Takes positive and ground power inputs, plus a pulse-width signal input. There’s a potentiometer to determine how much it’s turned, a motor to control the position of the servo, and gears that set how hard the motor can turn the servo horn.
  • 12:55 – What is the servo actually controlling? It pulls strings that pull back on the fingers
  • 13:16 – Translating input from Unity? There’s a lot of control for how hard an object can be to get different levels of hardness. There’s 400 points across 180 degrees, which can provide a lot of fidelity.
  • 14:16 – What kind of reactions have you gotten? Lots of great feedback, and some suggested changes to make it better.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.955] Peter Sassaman: I'm Peter Sassaman, and I brought a haptic feedback glove that is restrictive on your fingers. So when you're actually reaching out in the games, it'll pull back your fingers so that it feels like you're actually holding objects within video games.

[00:00:26.422] Kent Bye: I see. And so how did you actually build this in terms of what kind of components did you need in order to integrate this within virtual reality?

[00:00:34.385] Peter Sassaman: Okay, so for the main processor within it, we used an Arduino Uno. For the actual reading of how your fingers are moving, we used servos, more specifically the potentiometers within servos, to read the movements of the finger, and then also use the servos for feedback. So it required just a little bit of modification on each of the servos to be able to actually get output and not purely input like their original RC intended purposes. In addition, by going to the Dallas Makerspace, I had access to 3D printers, so I was able to print the parts to hold everything together, plus a little bit of leatherworking as well to sandwich the plastic together with elastic pieces for the glove.
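
For the hardware tinkerers out there, here's a minimal sketch of what that servo-as-sensor idea could look like on an Arduino, assuming the servo's internal potentiometer wiper has been tapped and wired out to an analog pin. The pin numbers, ranges, and grip angle are placeholders for illustration, not Gauntl33t's actual code.

```cpp
// Minimal sketch (placeholder pins/values): a hobby servo modified so its
// internal potentiometer wiper is wired out to an analog pin. When the servo
// is released, the pot reports where the finger has pulled it; when attached,
// the same servo provides the restrictive feedback.
#include <Servo.h>

const int POT_PIN = A0;      // assumed: wire tapped onto the servo's internal pot wiper
const int SERVO_PIN = 9;     // assumed: servo signal wire
const int GRIP_ANGLE = 120;  // assumed angle that pulls the finger back

Servo fingerServo;

// Sensing: with the servo released, the finger drags the horn and the
// internal potentiometer reports how far it has moved.
int readFingerAngle() {
  int raw = analogRead(POT_PIN);       // 0..1023 across the pot's electrical travel
  return map(raw, 0, 1023, 0, 180);    // rough mapping; a real build needs calibration
}

// Feedback: engage the servo to hold the finger back, as when gripping a
// virtual object; release it again to let the finger move freely.
void setFeedback(bool gripping) {
  if (gripping) {
    if (!fingerServo.attached()) fingerServo.attach(SERVO_PIN);
    fingerServo.write(GRIP_ANGLE);
  } else if (fingerServo.attached()) {
    fingerServo.detach();              // no holding torque when released
  }
}

void setup() {
  Serial.begin(115200);
}

void loop() {
  Serial.println(readFingerAngle());   // stream finger position to the host (e.g. Unity)
  setFeedback(false);                  // demo: stay in free-movement mode
  delay(20);                           // ~50 Hz loop
}
```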

[00:01:21.206] Kent Bye: I see. And so does this glove have positional tracking, or is it just mainly a haptic feedback system?

[00:01:21.791] Peter Sassaman: Currently, it doesn't have our own version of positional tracking. But instead, we're using the Razer Hydra for the positional tracking on a 3D printed clip on top. In the future, we'd like to use the PrioVR or the STEM motion tracking system.

[00:01:36.975] Kent Bye: I see. And so talk about the physics of what's actually happening. I mean, when I am holding onto this microphone here, I can feel the feedback on the front of my fingers. But you're doing something on the back of the fingers? Or maybe just sort of describe what's actually happening.

[00:01:51.112] Peter Sassaman: Okay, okay. It has tails on each of the 3D printed plastic fingertips that when they're leveraged backwards actually create a tactile force on the front of the fingers inside of the finger cups.

[00:02:03.580] Kent Bye: I see, and so even though the arms are in the back of the fingers, the actual force is being applied to the front of the fingers, is that right?

[00:02:10.781] Peter Sassaman: Exactly, exactly. And actually most of the force is just on the forward section of each of the fingers, but the brain kind of fills in a lot of the information. The results may vary from person to person.

[00:02:22.471] Kent Bye: And so what kind of demos within virtual reality then have you been able to put together to connect that visual gap then?

[00:02:29.466] Peter Sassaman: Okay, with the visual gaps so far, we've put together a small coffee shop where you're able to reach out and grab a coffee bag that's kind of squishable in your hand, as well as a hard coffee cup that you're able to pick up. In the future, I'd like to see maybe a sword fighting game or some fantasy genre where you're exploring around and able to pick up items. If you're familiar with like the Myst genre, or Myst or Riven games back in the day, I'd like to see games like that where you're kind of exploring around and able to manipulate a lot of objects in puzzle games.

[00:02:58.933] Kent Bye: And so what are you doing on the hardware side to distinguish between whether or not something is soft or hard?

[00:03:05.399] Peter Sassaman: Okay, so when it's a hard object, the servos actually turn completely on and pull back your fingers, so you're applying about the same force that the servos are to get a hard object. For a soft object, it actually pulses on and off, so that whenever it's off, your fingers kind of overcome it a little bit, so you're actually able to get a squishable feeling.
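
To illustrate the pulsing technique Peter describes, here's a small Arduino-style sketch, with assumed pin numbers, angles, and timing, showing a servo held fully on for a hard object versus rapidly engaged and released for a soft one. This is an illustration rather than the project's firmware.

```cpp
// Hypothetical illustration of "hard" vs "soft" haptic feedback with one servo.
// Hard: the servo stays engaged at the grip angle. Soft: the servo is pulsed on
// and off so the wearer's fingers can partially overcome it between pulses.
#include <Servo.h>

const int SERVO_PIN = 9;     // assumed signal pin
const int GRIP_ANGLE = 120;  // assumed angle that pulls the finger back

Servo fingerServo;

void applyHardObject() {
  if (!fingerServo.attached()) fingerServo.attach(SERVO_PIN);
  fingerServo.write(GRIP_ANGLE);   // hold position continuously: feels rigid
}

void applySoftObject(unsigned long onMs, unsigned long offMs) {
  // Pulse: briefly drive toward the grip angle, then release the servo entirely.
  fingerServo.attach(SERVO_PIN);
  fingerServo.write(GRIP_ANGLE);
  delay(onMs);                     // resist for a moment
  fingerServo.detach();            // no holding torque: the finger can squish inward
  delay(offMs);
}

void setup() {
  // nothing to do until a grab event arrives from the host
}

void loop() {
  // Example: simulate squeezing a squishy object (longer off-time = softer feel).
  applySoftObject(20, 60);
}
```

The longer the servo stays released between pulses, the more the wearer's fingers can push through, which is what reads as squishiness.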

[00:03:25.396] Kent Bye: And so, since it's an Arduino, what kind of code are you writing in order to kind of run this mechanical aspect of this glove?

[00:03:34.324] Peter Sassaman: On the Arduino itself, it's running a slightly modified version of Firmata, where whenever you're writing a certain degree to the servo, it actually turns off the connection to the servo instead of pulling back on your fingers. And whenever it writes anything within the acceptable range, then it actually turns on the servo. On the Unity side, we're using the Uniduino plug-in. It costs a little bit of money to get a hold of that plug-in, but it actually runs with the free version, so if somebody wasn't developing with the pro version of Unity, they'd still be able to have a haptic glove within a non-virtual-reality experience as well. That, together with a little bit of code that we wrote to call functions within that, is pretty much all that we really needed, along with, for our demo, using the Sixense SDK and the Oculus SDK.
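
The Firmata tweak he mentions, where one written value releases the servo and values inside a working range engage it, might look something like this simplified stand-in on the Arduino side. The sentinel value, range limits, and pin are assumptions; their modified Firmata will differ.

```cpp
// Simplified stand-in for the described Firmata tweak: the host writes an
// "angle" to the servo channel; a sentinel value (here 0) releases the servo
// entirely, while values inside a working range engage it and set the angle.
#include <Servo.h>

const int SERVO_PIN = 9;    // assumed servo signal pin
const int ANGLE_OFF = 0;    // assumed sentinel meaning "turn the servo off"
const int ANGLE_MIN = 30;   // assumed lower bound of the usable range
const int ANGLE_MAX = 170;  // assumed upper bound of the usable range

Servo fingerServo;

void handleServoWrite(int value) {
  if (value == ANGLE_OFF) {
    // Host asked for "off": detach so there is no holding torque on the finger.
    if (fingerServo.attached()) fingerServo.detach();
  } else if (value >= ANGLE_MIN && value <= ANGLE_MAX) {
    // Host asked for a real angle: make sure the servo is driven and move it.
    if (!fingerServo.attached()) fingerServo.attach(SERVO_PIN);
    fingerServo.write(value);
  }
  // Values outside both cases are ignored.
}

void setup() {
  Serial.begin(57600);      // 57600 is Firmata's usual default baud rate
}

void loop() {
  if (Serial.available() > 0) {
    // Toy protocol: one byte per command, interpreted as a servo angle.
    handleServoWrite(Serial.read());
  }
}
```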

[00:04:27.413] Kent Bye: And I'm curious about what your plans are with this in the future. Do you plan to kind of make this like an open maker, you know, do-it-yourself kit or a Kickstarter or what are your future plans with this?

[00:04:37.580] Peter Sassaman: We definitely want to keep it open. On our website in the near future, we're going to be uploading all of the STL files we have currently. Some of them are going to require some modification to actually be the way they are currently, since some of them were somewhat modified manually, but we're going to clean them up and then also upload those along with any of the code that we actually wrote. We'll be looking at the Oculus plug-in and the Sixense plug-in to see if it'd be okay to upload those, because in the Sixense plug-in we made a slight modification. We'd like to deliver as much of the code behind our demo as we can, but the Uniduino plug-in is a paid plug-in, so we'd probably have to leave that out of the repository and people would have to add that back in themselves. For people who don't have access to 3D printers and wouldn't be able to print it themselves, some people have suggested that we have an Indiegogo campaign in the near future where we'd be able to send out some servos and an Arduino in a nice box with the 3D printed pieces so they could construct it themselves. But as far as a completed product and funding that, we feel we'll need to have a few more iterations before it's ready for primetime.

[00:05:44.291] Kent Bye: I see. And so you said you were using a leather glove. I'm just curious if you tried other materials or if that kind of plays into this whole haptic feedback.

[00:05:52.134] Peter Sassaman: Actually, the glove itself is elastic. We wanted to kind of have it be stretchable for different finger sizes. But to actually combine together the plastic pieces and the elastic pieces, we're using small pieces of leather along with copper rivets to connect them together.

[00:06:07.795] Kent Bye: What do you mean by that in terms of what parts needed to be connected together?

[00:06:11.298] Peter Sassaman: Okay, it's a metal connecting piece that's oftentimes used for holding two pieces together. However, elastic is very easily torn, whereas leather is not. So kind of like sandwiching the elastic material between the plastic and the leather, then we are actually able to use that connector without tearing the elastic.

[00:06:32.722] Kent Bye: And so why elastic? Does it give more of a skin-like feel?

[00:06:37.124] Peter Sassaman: More so for just fitting different sized hands, as well as we kind of needed a fabric-like material to keep the plastic pieces on the actual fingers of the person wearing the glove. We would like to see if in the future maybe we could have one where you would just need the plastic pieces, a fully 3D printed solution. However, we don't want the fingertips to fall off while the people are experiencing their simulations. It tears the immersion.

[00:07:03.575] Kent Bye: I see. And so, you know, because our fingertips are extremely sensitive in terms of being able to feel all sorts of different variations of surfaces, I'm curious about what materials you use for connecting directly to the fingertips.

[00:07:17.711] Peter Sassaman: Well, on your fingertips themselves is a glove-like material within the plastic. So right now, what we're simulating is more so the size and the hardness of the object. In the future, we'll be looking into putting little buzzers in the fingertips so you can actually kind of feel vibration. But we also want to keep the cost down so that anybody can really build it themselves. But, of course, people can expand upon it. They can add buzzers. They can add different materials that will heat up or cool down the hand, or experiment around with things other than servos like we are. Motorized linear potentiometers are really nice for haptics. They are a little bit noisier; you might have to replace the motors with stepper motors. Other people want to use pumps, like inflating balloons on their fingers to push them open. There's a lot of possibilities, but that's what our group's really for: anybody who's a hacker or maker who wants to come onto our website or our Facebook or Google (it's gauntlet with two threes instead of E's) and just toss out their ideas, talk with other people who want to get involved with haptics, and get advice back. Awesome.

[00:08:19.723] Kent Bye: Have you considered using something like the Leap Motion that is using more camera-based sensing to be able to track the positions of the hands?

[00:08:27.067] Peter Sassaman: It may work as is. We'd have to do some experimentation. We want as many possible motion trackers as possible. You can use the Hydra right now, but we'll be looking over the summer at how many trackers we can actually get it to work with. Perhaps even, if the DK2 for the Rift allows you to track with infrared LEDs, then we'd definitely put on infrared LEDs to kind of get your position with a camera. And if that doesn't work, get our own camera and see if we can do some tracking with a camera instead. There's a possibility in the future we might be able to put in our own solution, like an accelerometer and gyro, as well. We'll just see what goes and try it out.

[00:09:05.005] Kent Bye: And so I'm curious what inspired you to kind of get involved in this virtual reality development space.

[00:09:10.321] Peter Sassaman: Okay, so before the Oculus Rift Kickstarter, there was a lot of kind of blown-out-of-proportion Hollywood views of VR, and so it seemed kind of gimmicky at the time. But after seeing the Rift Kickstarter and reading books like Ready Player One and Snow Crash and kind of deluging myself with a lot of VR content, I realized it actually is possible. And so since I'm kind of more from the side of the person who wants to work on the hardware, I was like, I want to work on either a haptic glove or a treadmill or something that wasn't really being worked on. Treadmills were kind of being worked on at the time, I found soon after, like the Virtuix Omni. And so I was like, I like robotics, so I could start working on a haptic glove. So that's what kind of started me last summer into working on that, and it developed into what we have now.

[00:10:01.652] Kent Bye: And so have you tried the reactive grip here? And I'm just curious if what you're working on is sort of compatible with that or if you really have to choose one or the other.

[00:10:09.890] Peter Sassaman: I don't think in the future you'll have to choose one or the other. We have another person in our group who's actually working on a haptic gun, and what he'd like to see is that you'd handle it in the software, so that whenever you go and pick up a prop-like object, like the Tactical Haptics controller or a haptic gun, then the gloves would actually turn off, and you stop feeling so much the size and the hardness of objects, and instead you'd be able to get the kickback of your prop item, like the gun. Or, if you're holding the Tactical Haptics controller and you're playing the fishing game, then you'll be able to feel the torquing forces in your hand. I mean, there may be possibilities people will be able to integrate that all into a glove in the future, but right now, to keep the cost down, you may want to also have the prop items that you'll pick up that can actually handle those sorts of forces a lot better.

[00:10:55.145] Kent Bye: I see, and so, you know, taking a look at the glove, it does look like if you dropped it, it may just kind of shatter and break apart. I'm just curious about the durability of this mechanical solution that you've put together.

[00:11:05.613] Peter Sassaman: Yeah, right now, you're probably right, because it has gotten a lot of wear and tear from Maker Faire, but I think we've dropped it a few times and it's still ticking. But like I tell everybody when they're trying it out, don't be afraid to break it, because if it breaks, that tells me that I need to make it better, so...

[00:11:22.561] Kent Bye: I see. I see. So right now you have this huge kind of box on your hand. What all is in that box right now?

[00:11:30.843] Peter Sassaman: Uh, the majority of that room is taken up by the servos. They are actually larger servos. We'd like to eventually move to smaller servos, but in order to still use less expensive servos, we'd have to use small ones with plastic gears, and those would wear out very quickly at a smaller size. So we need metal gear servos, and those cost a lot more.

[00:11:50.577] Kent Bye: And so maybe describe to me a bit about what is involved in a servo and how it's powered.

[00:11:56.322] Peter Sassaman: Okay, so it's powered kind of like you'd power a lot of devices, with the positive and the ground, and then it actually has another wire which goes into it which takes the signal, pulse width modulation, where the length of the pulse it's sending determines what angle it's trying to move to. So at first I was a little bit skeptical because I didn't realize that I'd actually be able to, with an Arduino, control it such that you could actually turn a servo fully off and simulate things like a squishable object, but it's actually quite doable. As far as what's inside, beyond that there's a potentiometer, or sometimes encoders, inside of servos which help you determine how much it's turned. There's a motor for actually controlling the position of the servo, and usually plastic or metal gears to help adjust how hard the motor can turn the servo horn, which is what you typically connect whatever you're trying to move on to.
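
If you want to see the pulse-width relationship in code, here's a short sketch that converts a target angle into a pulse width and sends it with the Arduino Servo library. The pin is a placeholder, and the 544–2400 microsecond endpoints are the library's defaults rather than anything specific to this glove.

```cpp
// Illustration of the pulse-width-to-angle relationship described above:
// a hobby servo is commanded by a repeated pulse whose width encodes the
// target angle. The Arduino Servo library defaults to roughly 544 µs at 0°
// and 2400 µs at 180°; the pin choice here is an assumption.
#include <Servo.h>

const int SERVO_PIN = 9;

Servo demoServo;

void setup() {
  demoServo.attach(SERVO_PIN);
}

void loop() {
  // Sweep the servo by computing the pulse width explicitly.
  for (int angle = 0; angle <= 180; angle += 10) {
    int pulseUs = map(angle, 0, 180, 544, 2400);  // angle -> pulse width in microseconds
    demoServo.writeMicroseconds(pulseUs);         // same effect as demoServo.write(angle)
    delay(200);                                   // give the motor time to reach the angle
  }
}
```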

[00:12:55.096] Kent Bye: I see, and so it does seem like some sort of motor-like device that's controlling, you know, what is it actually controlling?

[00:13:00.958] Peter Sassaman: In our case, it's pulling strings that manipulate your fingers backwards; it pulls back on your fingers. As well, when you move your fingers while it's turned off, that pulls on the servo, so it's able to measure how far you've moved your fingers forward.

[00:13:15.660] Kent Bye: And so taking input from Unity, are you then translating that through some sort of filter? I'm just curious about that impulse control when it comes to sending a signal and how well it's able to respond to that signal.

[00:13:29.568] Peter Sassaman: Okay, so there is a good deal of control over how hard the object can be because you're able to send a very brief pulse to turn on the servo and then adjust how much time the servo is turned off to get different hardnesses. As far as it reading the input, there's a range of about 200 points along the traversal of it spinning from 0 to 180 degrees, so that's about half a circle. So it's got 400 points along that to which it can read how far your fingers have moved. So I feel that's a decent deal of accuracy. Some people may require more if they're like really fine detail, like holding their hand right up to their face and seeing how much their virtual hand is moving. But I feel for the average user, it'll be fine.
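
To put those resolution numbers in context, here's a sketch of the kind of mapping involved: the tapped servo potentiometer is sampled with the Arduino's 10-bit ADC (0–1023) and converted to degrees, and the usable step count depends on how much of that raw range the pot actually sweeps across 180 degrees. The pin and the calibration endpoints below are assumptions, chosen to land near the figure Peter mentions.

```cpp
// Editor's sketch of finger-position readout resolution. The Arduino ADC is
// 10-bit (0..1023); how many distinct steps map onto the 0..180 degree travel
// depends on the calibrated raw range the servo pot actually covers.
const int POT_PIN = A0;         // assumed pin for the tapped servo potentiometer
const int RAW_AT_0_DEG = 300;   // assumed calibration reading at 0 degrees
const int RAW_AT_180_DEG = 700; // assumed calibration reading at 180 degrees

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(POT_PIN);

  // Clamp and convert the raw reading to an angle in degrees.
  raw = constrain(raw, RAW_AT_0_DEG, RAW_AT_180_DEG);
  int angle = map(raw, RAW_AT_0_DEG, RAW_AT_180_DEG, 0, 180);

  // Distinct ADC steps available across the full travel:
  int steps = RAW_AT_180_DEG - RAW_AT_0_DEG;  // 400 steps over 180 degrees with these values

  Serial.print("angle=");
  Serial.print(angle);
  Serial.print(" (");
  Serial.print(steps);
  Serial.println(" steps of resolution)");

  delay(50);
}
```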

[00:14:16.448] Kent Bye: Great. And finally, being here at this conference, what kind of reactions have you gotten and where you see this going forward?

[00:14:23.140] Peter Sassaman: Well, at this as well as at Maker Faire, we've gotten a lot of advice, like the big box on the arm will in the future probably be moved to the upper arm, the back, or actually around the arm. We're going to be seeing several different configurations to actually alleviate having so much weight on the arm. We're also going to be looking at several smart materials, such as muscle wire, as alternatives to the servos that will either provide an active or maybe a passive feedback, and see if it's convincing enough, as well as experimenting with other 3D printed materials that may be more comfortable or more adjustable than the current ABS plastic we're using.

[00:14:59.488] Kent Bye: Okay, great. Well, thank you very much.

[00:15:01.311] Peter Sassaman: Awesome. Thank you.
