The HaptX Glove that was shown at Sundance was one of the most convincing haptics experiences that I’ve had in VR. While it was still primitive, I was able to grab a virtual object in VR, and for the first time have enough haptic feedback to convince my brain that I was actually grabbing something. Their glove uses a combination of exoskeletal force feedback with their patented microfluidic technology, and they’ve significantly reduced the size of the external box driving the experience since the demo that I saw at GDC (back when they were named AxonVR), thanks to a number of technological upgrades and ditching the temperature feedback.
I had a chance to talk with CEO & co-founder Jake Rubin and Chief Revenue Officer Joe Michaels at Sundance, where we talked about why enterprise & military training customers are really excited about this technology, some of the potential haptics-inspired interactive storytelling possibilities, how they’re refining the distribution of haptic resolution and fidelity that will provide the optimal experience, and their collaboration with SynTouch’s texture-data models in striving to create a haptic display technology that can simulate a wide range of textures.
LISTEN TO THE VOICES OF VR PODCAST
HaptX was using a Vive tracker puck for arm orientation, but they had to develop customized magnetic tracking to get the level of precision required to simulate touch, and one side effect is that their technology could start to be used as an input device. Some of HaptX’s microfluidic technologies, combined with a new air valve that is 1000x more precise, could also start to create unique haptics technologies with really interesting applications for sensory replacement or sensory substitution, or could start to assist data visualizations in a similar way that sound conveys data through a process called sonification.
Overall, HaptX is making rapid progress and huge leaps with their haptics technologies, and they’ve crossed a threshold for becoming useful enough for a number of different enterprise and military training applications. Rubin isn’t convinced that VR haptics will ever be able to fully trick the brain in a way that’s totally indistinguishable from reality, but they’re getting to the point where it’s good enough to start to be used creatively in training and narrative experiences. Perhaps soon we’ll be seeing some of HaptX’s technology in location-based entertainment applications created by storytellers who got to experience their technology at Sundance this year, and I’m really looking forward to seeing how their texture haptic display evolves over the next year.
This is a listener-supported podcast. Please consider making a donation to the Voices of VR Podcast Patreon.
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. So at Sundance this year, there were a number of different virtual reality projects, an augmented reality project, and a couple of artificial intelligence projects. And there was one company that was showing off their latest haptic technology. Now, I had a chance to try out their technology back at GDC in March. It was a giant box where you put your hand into it and you had a number of different stimulations. They had heat as well as this special fluid technology that could press down on your hand to give you this haptic feedback. So AxonVR has renamed to HaptX, and I've got a previous interview back in episode 545 where I talk about my experience with their original haptic technology. They were also at Sundance, but this year they had their miniaturized version where you can actually grab onto objects and have the force feedback within your hands, but also within the glove they had these haptic technologies to be able to simulate pressure on your hand. And so you could actually grab objects within virtual reality for the first time, and it actually felt like you were grabbing things. And this is the first time that I've actually felt like I've been able to grab a virtual object. And so I had a chance to talk to the co-founder and CEO, Jake Rubin, as well as the chief revenue officer, Joe Michaels, about some of the training applications that haptics is going to be used for, but also some of the potential storytelling opportunities that arise from having the ability to get your hands into the experience. So we're covering all the latest innovations in haptic technology on today's episode of the Voices of VR podcast. So this interview with Jake and Joe happened on Saturday, January 20th, 2018 at the Sundance Film Festival in Park City, Utah. So with that, let's go ahead and dive right in.
[00:01:56.492] Jake Rubin: Yeah, so I'm Jake Rubin. I'm the founder and CEO of HaptX. I've had the pleasure to be on your show once before. We are here at Sundance, and we're showing two demonstrations. We're showing the sort of big box tech demo of our haptic feedback technology that you tried before, and we're also showing a newer prototype of our HaptX glove.
[00:02:13.218] Joe Michaels: And I'm Joe Michaels, the Chief Revenue Officer of HaptX, and I'm excited because this is the first time that we're showing publicly our technology to people at the Sundance Film Festival, so it's been great to get reactions from not only press and industry people, but just everyday fans of art.
[00:02:30.087] Kent Bye: Yeah, so I remember trying this out back in March at GDC, and, you know, it was kind of fusing all the different sensory dimensions of haptics, including heat. And I sort of imagined that the heat was something that would require a lot of space to do. And then here you're showing a new glove that doesn't have the heat, but it has new things, in terms of having it mobilized and on your hand. And maybe you could just kind of walk through a little bit of the technology underneath that is driving this haptic sensory experience within this haptic glove that you have here.
[00:03:00.714] Jake Rubin: Sure, so the technology is all based on microfluidics. We have a smart textile we've developed, which is a lightweight fabric-like material that has embedded air channels and small pneumatic actuators. And we can use that to create basically a variety of different shapes and motions and patterns on your skin. We can use that to recreate a sense of the size or shape or texture of objects. Also on our glove we have a force feedback exoskeleton based on that same microfluidic technology that can apply up to five pounds of force per finger. And we have very precise motion tracking that's based on our own custom magnetic tracking solution. So putting all these things together, we really believe it's the most realistic haptic glove out there right now. And we're very much aiming at a premium market. Enterprise, high-end location-based entertainment, military, medical. That's really going to be our first set of customers. But as far as what's changed from the box you tried, obviously it's gotten a lot smaller, and part of that was taking out the thermal technology. For our first product, we just found that it was a nice-to-have for customers, but it wasn't something they needed to have, and it was going to be, as you mentioned, hard to miniaturize. We are working on miniaturizing it, but it's a little bit lower priority for us. So the box has become a lot smaller, about 26 times smaller. It's now the size of an Xbox. We've taken our skin material out of that box and put it into a flexible glove. And of course, we've added the motion tracking and all the software that's needed to integrate with Unreal Engine and Unity and give you a freehand experience with a simulated hand. So it's come a long way in about nine months.
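As an aside for technically minded readers: Jake mentions that an air valve controls the flow into and out of the glove's actuators, and the valve's speed and precision come up later in the conversation as the main quality bottleneck. HaptX hasn't published any of its control code, but servoing a pneumatic actuator with a feedback loop is a standard approach, so here's a minimal, entirely hypothetical Python sketch of the idea. Every name and constant is invented for illustration.

```python
# Hypothetical sketch of closed-loop pressure control for one pneumatic
# tactile "pixel". HaptX has not published their control code; this only
# illustrates servoing a valve toward a target pressure. All names and
# constants are invented for illustration.

class PressureController:
    def __init__(self, kp=0.2, ki=0.05, dt=0.001):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, target_kpa, measured_kpa):
        """Return a valve command in [-1, 1]: positive inflates, negative vents."""
        error = target_kpa - measured_kpa
        self.integral += error * self.dt
        command = self.kp * error + self.ki * self.integral
        return max(-1.0, min(1.0, command))

def simulate(valve_rate_kpa_per_s, seconds=0.2):
    """Toy plant: pressure moves toward the command at a rate limited by
    valve speed -- a slow off-the-shelf valve vs. a faster, more precise one."""
    ctrl = PressureController()
    pressure = 0.0
    for _ in range(int(seconds / ctrl.dt)):
        cmd = ctrl.update(target_kpa=20.0, measured_kpa=pressure)
        pressure += cmd * valve_rate_kpa_per_s * ctrl.dt
    return pressure

print(simulate(valve_rate_kpa_per_s=50))    # slow valve: undershoots the target
print(simulate(valve_rate_kpa_per_s=5000))  # fast valve: settles near 20 kPa
```

The toy plant model is the point here: a slow valve can't move enough air in the time available, so the pixel undershoots its commanded pressure, while a faster valve settles at the target, which is why the new valve architecture discussed later matters so much.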
[00:04:24.438] Joe Michaels: Yeah, and when we take it out and show it to people, we get a really good reaction. I was really inspired by the Houston Fire Department episode that you did a few months ago, because you heard the passion in the voice of the guy who was talking about bringing VR into training at his local fire department. And that's the kind of reaction we get from people who've been working on trying to bring VR into their work lives every day. And sometimes these people have been working on it for even decades. And when they try something that really works, and it makes them think they're going to be able to kind of change the way they do training or simulation or design or manufacturing, it gets pretty exciting.
[00:04:58.472] Kent Bye: Yeah, I see that there's a couple of different vectors and trajectories for this enterprise training. Some of it is from the bottom up. Like, you know, Patrick as a firefighter, he's not a VR technologist, he's just collaborating with other technologists, and he has the problems of, like, these are the problems I need to solve in terms of being trained and being able to train other people. And then there's people who are coming from the VR side who are, like, you know, trying to see what the technology could do within a specific domain. So I kind of see that there's these collaborations where the VR people have the expertise in VR, and then there's the different problems that are needing to be solved out there. And so, yeah, I could see how enterprise training in VR is on this kind of exponential curve that is doubling every year, but it's kind of slowly doubling. Some people from the outside may not see that there is this growth that is going to at some point hit this inflection point. But I think this is a good example of a technology where, just from March of last year to where you're at now, it's almost like a doubling, or times four, or whatever number you want to assign to it. It feels like this exponential growth of kind of moving towards becoming much more miniaturized and applicable. And so I'm curious to hear from your perspective some of the specific use cases where people are able to use this technology in enterprise training or other things.
[00:06:17.248] Joe Michaels: Yeah, so we're most excited about training and simulation, and training takes on a bunch of different flavors and types. So training can be everything from training the pilots and drivers of military vehicles to people who maintain the vehicles themselves, because there's a massive workforce of people out there maintaining super expensive, difficult equipment. And they all need training. You can't give them a big dusty phone book manual anymore. You don't even want to give them, you know, a YouTube video. You want to give them as much experience as you can. But a VR experience without your hands, without being able to dexterously operate your hands and interact with the parts, is not that useful. So we get grizzled veterans, you know, who've been trying different solutions and haptics for sometimes decades, and they get a little teary-eyed when they try it and go, oh my god, you might actually have broken through here, and we might be able to use this. The military, in particular, has been crazy excited about this. They have these giant training needs. They don't just train a small workforce. They've got thousands of soldiers and, you know, people maintaining billion-dollar equipment, and so they really want the very best training solution, so that's exciting. But beyond just sort of military and workforce training, the healthcare and medical training is super cool. Everything from simple, you know, train someone how to take a pulse or how to palpate a chest or inject a needle in the right spot. I mean, there are only so many cadavers and so many mannequins that you can afford and scale, but if anyone who wears a headset and wears a pair of gloves can cycle through, you know, as many different types of scenarios as you want to train for, that's pretty exciting. And so training and simulation is super cool. And then we've heard about design and manufacturing. One of the biggest car makers came to us and said, you know, we've been reading about these gloves and about haptics. Let us try it. And when they did try it, they kind of nodded their heads and said, I think you may help us change the way we design new car models. Because at almost every stage of the process, it's inefficient and it's expensive. Just clay models and milling different kinds of models cost tens of thousands, sometimes hundreds of thousands of dollars. And if you make a mistake, you've got to start all over again. So they said, if we could do that virtually, using our hands, sort of molding a CAD model virtually, that's really exciting to them. And then in the review process, because that's a big thing, they say just getting everyone to see and feel the design and agree and test it and say, when I reach out from the driver's seat, you know, is the glove box a safe distance away? And how is that different when you're a 6'5 man versus a 5'2 woman? So just bringing your hands into the virtual design process is really exciting. You know, and finally, we think robotics is super, super cool, because until now, you know, you haven't been able to operate robotics remotely using realistic touch. But when you can feel what the robot arm feels, whether it's lifting something or, you know, moving something, or something very fine, like defusing a bomb potentially or going into a hazardous location. I mean, there's almost no end to what you can do when your hands feel things realistically.
And I didn't even mention all the cool entertainment opportunities that we're here to talk about, but we just get jazzed about the industrial enterprise side of this.
[00:09:29.009] Kent Bye: Yeah, one thing that you mentioned that I had never thought about before: I'm in a body that's six foot tall, and I have a certain length to which I can reach out, but if you're trying to design ergonomics for people, you can start to simulate the size of people. If you have a physical model, you're limited by how big you are, but if you have a virtual model, you can start to fake a lot of that stuff based upon translating. If I'm seven foot five, I can, in a virtual world, have the experience of what it's like to be in a car and reach out, or I can be someone who's four foot five. So you could start to do those translations, is I guess what I'm hearing you say. People who are creating these ergonomic spatial designs for actual cars or any sort of vehicle have to take that into consideration, but we've been limited in what we've been able to simulate in terms of that ergonomic haptics when it comes to our body sizes.
[00:10:21.293] Jake Rubin: It might sound like sort of a niche application, but we talk to executives at major automakers and they say, yeah, I have to have my boss come in and approve a decision on every part of this vehicle, on where the seat should be and how far it should go forward and back, on where the buttons should be on the dashboard. And that executive can only try a very limited set of physical models that have to be shipped to them, and they can only experience it from their viewpoint. And so there's the ability to, you know, not only change the environment and be able to do much more rapid iteration without having to build incredibly expensive physical models and ship them, but the ability to put the executive in a completely different body and scale the world around them and let them experience it from a different perspective. And that's something we see a lot of here, obviously different use cases, but it's amazing. One of the coolest things about the stories that I've seen at Sundance, the VR experiences, is they give you a different perspective and a different, you know, physical avatar, and, you know, it really changes the way you see the world.
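To make the ergonomic-review idea concrete, here's a toy sketch of scaling one reviewer's reach to stand in for drivers of different statures. The proportional-scaling model and every number are my own simplifications for illustration, not anything HaptX or the automakers described.

```python
# A toy illustration (not HaptX code) of the ergonomic-review idea Jake
# describes: re-scale the virtual body so one reviewer can experience a
# cockpit as drivers of different statures would. Numbers are invented.

def scaled_reach(reviewer_height_m, target_height_m, reviewer_reach_m):
    """Approximate the target driver's reach by scaling the reviewer's
    measured reach proportionally to stature (a crude anthropometric model)."""
    return reviewer_reach_m * (target_height_m / reviewer_height_m)

glovebox_distance_m = 0.80  # distance from the driver's seat, hypothetical

for stature, label in [(1.96, "6'5\" driver"), (1.57, "5'2\" driver")]:
    reach = scaled_reach(reviewer_height_m=1.83, target_height_m=stature,
                         reviewer_reach_m=0.85)
    ok = "can reach" if reach >= glovebox_distance_m else "cannot reach"
    print(f"{label}: estimated reach {reach:.2f} m -> {ok} the glove box")
```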
[00:11:12.000] Kent Bye: Yeah, in terms of the actual experience of the HaptX Glove technology, I had a chance to see your previous iteration, and so there were some things that I was experiencing with this glove that I had experienced before. For me it was seeing how it was miniaturized, but also, since it was miniaturized, a little higher fidelity, higher resolution. When there's animals walking on my hand, you know, I think that's probably one of the most compelling things: having either a spider or a fox on my hand and feeling it walk around a little bit, and feeling that feedback while seeing it visually, it sort of helps really sell that experience. But I would also say that this is really the first time that I've been able to actually reach out and grab any type of object in virtual reality and have it feel like I was actually grabbing something. And not only that, but just kind of waving my hand through things and, you know, having the rainfall on my hands. All these different experiences that I haven't had before. Just as an anecdote, at one time I just wasn't thinking about it, and I was almost trying to back slap something with my hand, and I was like, oh wait, I can't. Learning that there are limitations, like, you know, I can't fully push around, but having things focused on the palm of your hand and your fingertips just to be able to simulate the minimum viable things you need to feel like you're actually grabbing stuff. I think, to me, that's a huge kind of leap in terms of anything else that I've seen out there.
[00:12:33.264] Jake Rubin: Well, thank you very much, and I do think minimum viable is a great way to put it. This is a prototype that we're showing here today. We're very proud of it. We think the experience is great, and it's unmatched by anything else out there, but there's still a lot of rough edges. You saw how much progress we've made in the last several months. We're working right now in our lab on our first-generation product, which is going to start going out to our first handful of customers later this year. And that's going to be as big of a leap from this generation to that generation as it was from the big box to what you tried today, and that's going to smooth out a lot of those rough edges to the point where we really get around some of those limitations. Like right now, you have one hand. It's a simple thing, but just having two hands, being able to pass objects back and forth, handle objects with two hands, not have to think about, oh, I'm limited to one hand. We'll be making it lighter and smaller, more comfortable, having multiple sizes, and we'll be actually making a big improvement in the tactile quality. So right now the biggest thing that limits the quality in this version is actually not any component of ours, it's a valve we're using to control the flow of air into and out of the glove, and that's an off-the-shelf part that we use just for expediency in building the prototype, and it's pretty slow, it's not very controllable. We're moving in this generation to a brand new type of valve technology made by one of our partners. We're the first company in the world to have access to this technology, and it is going to be literally a thousand times more sensitive in terms of how precisely we can set the pressure and displacement of the actuators, and it will be about 20 times faster or so. So giving us higher frequencies, finer detail, it's going to be a revolution in quality.
[00:13:57.882] Joe Michaels: You know, what strikes me is that really as aware as we are of how far we have to go, our big challenge right now is to get past the good enough line. It's to get to the point where people can wear a glove and put on a headset and have the right kind of content and experience and feel present enough and feel like it's all realistic enough that they can accomplish a task or be moved by a story. And, you know, everything past that is incredible gravy and someday we'll look back and laugh. But, I mean, that's where we're trying to get to right now is that, you know, kind of, we've done it. We've nailed the good enough line and now we improve from there.
[00:14:34.176] Kent Bye: Yeah, and I'm curious to hear some of your thoughts in terms of where you see this going in terms of entertainment, storytelling, location-based entertainment. I would say that there's a spectrum that I see from authored story to generative story. And at the far extreme of generative story, you have kind of like this open sandbox. And what I saw today in your demo was a little bit more of a sandbox. Like I was able to kind of play around, but there wasn't necessarily like a story that was unfolding. It was more of like a tech demo. And so when you start to add layers of story, you start to say, well, what can you do now that you're able to actually touch and interact with things? And how has that actually changed the trajectory of a story that's unfolding? And I'm just curious, as people are seeing it here, people are kind of more story-minded, what kind of things have started to spark in their minds for what type of stuff that they want to start to play around with?
[00:15:20.608] Joe Michaels: Yeah, I'm reluctant to share too much because some of the storytellers have sworn us to secrecy. But I think what you saw when you had a little creature crawl on your hand, and if it lies down on your hand and you feel, some people report feeling a heartbeat. And I think we have a very light little sensation of that in there. Just those little touches can really create a connection between you and the character that you're interacting with, way more than if you're just sitting back and watching. So we've asked some of our creative storyteller partners to think about how they would put this to use, and they've already come up with a couple amazing ideas, like being able to reach out and calm an overexcited creature, and using your hand and feeling that creature respond to your touch, and the emotional connection you get when that happens, it feels like it's going to be amazingly magical. You know, and just being able to get to know something, someone tactically, tactilely, thank you, Jake, you know, using touch, I think, you know, the way you can sort of develop through the sort of narrative process and get to know them and feel how they change, that's all gonna be so, so sweet.
[00:16:26.214] Jake Rubin: I would say, you know, just sort of as a high-level commentary, we're very excited to see what true creatives, true artists can do with this. You know, that demo you tried is really a tech demo, whipped up in a couple weeks by our in-house content team, who are really more focused on the SDK than they are on the content itself. You know, seeing some of the other projects here and just the amazing artwork and the amazing experiences, as I try each of these experiences here, I'm thinking, man, what if I could touch that? You know, what if I could feel it? And I think there's this overwhelming basic human desire to reach out and touch things. And, you know, I think that can enhance almost anything, and it can open up all kinds of new opportunities for how you tell stories, how you interact with things, and make it easier on artists as well. You know, we were talking to the creative team for one of these projects, and they were complaining about how hard it is to get the hand animations right so it looks at all realistic with a Vive controller. And they're like, yeah, you know, if we could just use your glove, we wouldn't have to worry about that. You move your hand and it moves the way your hand moves. It's better and it's a lot less work.
[00:17:18.103] Joe Michaels: Yeah, I remember one customer saying, it's amazing to us how few people over a certain age are gamers and they're not used to controllers. And so our dream is to be able to let our customers just use their hands naturally. So if they can put on a glove one day and just do that, that's going to be a sea change for us. So that's what we're trying to enable.
[00:17:35.962] Kent Bye: It reminds me of a chorded keyboard, where you're kind of doing different motions with your hands, sort of in combinations, almost like playing a piano or an instrument, but the chorded keyboard allows you to have a more finite way of making input. It's actually kind of like a stenographer's chorded keyboard, that type of keyboard where you're typing in combinations. Just being able to do that with your hand in terms of input controls could have lots of uses. I imagine that we're going to be developing new input languages in terms of the different combinations of your hand movements, in order to translate that either into actions that are happening within a VR experience, or maybe, you know, actual typing and stuff like that. So I think about using this as an input controller. I guess you're able to track your hands to that same degree, maybe even better than something like outside-in optical tracking, you know, something like the Leap Motion or some of these other cameras where, if you kind of flip your hand around, sometimes it can lose tracking. Just being able to know how your fingers are extended could open up all sorts of new input controls.
[00:18:33.802] Jake Rubin: That ended up actually being one of our hardest challenges with the glove. You know, we think of ourselves as a haptics company more than a motion tracking company. We were hoping to license technology or use something off the shelf. And there's a lot of motion tracking technologies out there, many of which work very well for their use cases. But when you try and apply something like a Leap Motion or, you know, an IMU or bend sensor based glove or other optical tracking solutions to haptics, it just doesn't quite work. For the optical solutions, you've got, you know, occlusion issues. For bend sensors, IMUs, you've got noise issues. The level of precision you need for the kind of fine tactile interactions that we're showing is way higher than just hand animation. You need to be able to tell the difference between a light touch and a hard touch, which is a millimeter or so of displacement. If you can't detect that, you might as well not bother with the haptics. So we built our own custom solution based on magnetic motion tracking. In combination with our custom software, it gives a third of a millimeter average accuracy per finger, six degrees of freedom, so we can track every single motion of the finger, not just, you know, open and close. It suffers from no occlusion issues, and the noise is so low that it's actually significantly lower than the optical tracking on the Vive, which is kind of the industry standard right now. So, we're at the point now with our magnetic tracking where it is better than human perception.
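To unpack the architecture Jake is describing: a coarse 6-DoF tracker (the Vive puck mentioned in the intro) supplies the hand's pose in the world, and the magnetic system supplies sub-millimeter fingertip positions relative to the hand; composing the two gives world-space fingertips. Here's a rough illustration with made-up values; this is my sketch, not HaptX's actual pipeline.

```python
# A minimal sketch (not HaptX's pipeline) of fusing a coarse wrist tracker
# pose with fine per-finger offsets from a magnetic sensor: the puck gives
# the hand's world pose, the magnetic system gives fingertip positions
# relative to the hand. All values below are made up.

import numpy as np

def fingertip_world_positions(wrist_pos, wrist_rot, finger_offsets_hand):
    """wrist_pos: (3,) world position; wrist_rot: (3, 3) rotation matrix;
    finger_offsets_hand: (N, 3) fingertip offsets in the hand's frame."""
    return wrist_pos + finger_offsets_hand @ wrist_rot.T

wrist_pos = np.array([0.10, 1.20, -0.30])        # from the tracker puck
wrist_rot = np.eye(3)                            # identity = neutral orientation
offsets = np.array([[0.02, 0.00, 0.09],          # index tip, from magnetic tracking
                    [0.00, 0.00, 0.10]])         # middle tip

print(fingertip_world_positions(wrist_pos, wrist_rot, offsets))
```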
[00:19:41.105] Kent Bye: Wow, yeah, I could tell, like, just being able to grab stuff, it was very, you know, precise in that way. And I remember last time we chatted, we talked about, like, the two-point test and the different millimeters for different parts of your hands, and there's a certain resolution that you had before in the last iteration and then on this iteration. So how do you quantify that in terms of, like, haptic fidelity and resolution of where it was before and where it's at now?
[00:20:03.852] Jake Rubin: So the quick and simple answer is we doubled it, approximately, in key areas across the hand. In particular, across the fingertips, it is twice what you tried on the last version. The longer answer is we've been surprised as we've gotten a chance to test this glove and look at how people actually interact with it, because, you know, no one's ever built something like this before, so we had to kind of guess at resolution when we were building it. And now that we've built it, we can collect all kinds of data on how people actually use it, how well they can actually perceive different contacts, and do things like that two-point test in VR. And we've actually hit the point of diminishing returns sooner than we thought on resolution increases. For the next version of our glove, we're actually not focused on increasing resolution. We're focused on increasing the quality of each point. We're actually even reducing resolution in a few areas where we found we had more points than you could distinguish. So it was essentially throwing away pixels, so to speak. And so the big challenge for us with this next version is what I was talking about, this new valve architecture that's going to accomplish it. Right now we have pretty coarse control over exactly what the height of each pixel is, and that's something that your brain is very, very sensitive to. And so in our next version, as I mentioned, we'll have a thousand times more precise control over exactly what the height of each point is. We're also focused on the contact between our skin and your skin, between the textile and your skin, and making sure that it is as smooth as possible so you don't feel individual bumps, because it turns out that our skin is more sensitive to the cross-sectional profile of that contact than it is to the actual resolution. So even if it's more points than you can feel, you can still get a sense of bumpiness because the surface itself has a bumpy profile. So those are the kind of things that we're focused on in this generation, more improving the quality of each pixel than adding more pixels.
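For readers curious what "do things like that two-point test in VR" might look like in practice, here's a sketch of a classic 1-up/1-down staircase procedure that converges on the smallest separation a user reports as two distinct points. The procedure and numbers are illustrative psychophysics boilerplate, not HaptX's actual protocol.

```python
# A sketch of how a two-point discrimination test might be run in VR with
# addressable tactile pixels: a simple 1-up/1-down staircase that narrows
# in on the smallest separation the user reports as "two points".
# Illustrative only; not HaptX's protocol.

def run_staircase(felt_two_points, start_mm=10.0, step_mm=1.0, trials=20):
    """felt_two_points(separation_mm) -> bool, the user's response each trial."""
    separation = start_mm
    reversals = []
    last_direction = None
    for _ in range(trials):
        # Still felt as two points: move the points closer; else move apart.
        direction = -1 if felt_two_points(separation) else +1
        if last_direction is not None and direction != last_direction:
            reversals.append(separation)
        separation = max(0.5, separation + direction * step_mm)
        last_direction = direction
    # Threshold estimate: mean separation at the reversal points.
    return sum(reversals) / len(reversals) if reversals else separation

# Simulated observer whose true fingertip threshold is about 3 mm:
print(run_staircase(lambda mm: mm > 3.0))
```

Run against a simulated observer with a 3 mm threshold, the staircase settles around 3.5 mm; collected per region of the hand, that's exactly the kind of data that would tell you where tactile pixels are worth adding or removing.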
[00:21:39.453] Kent Bye: Oh wow, so that's really fascinating that you're kind of in some ways blazing the frontier of mapping out the full sort of spectrum of that two-point test in the hand, that we haven't had sort of a feedback loop to be able to do the virtual experiences and kind of see the phenomenological direct experiences of people to see what actually kind of feels good, but you're able to do that loop now and kind of refine and sort of have these different trade-offs and really optimize in different ways.
[00:22:02.852] Jake Rubin: The basic research is extensive, so you can go out there and find papers on every aspect of the biology and neurology and psychology of touch. What hasn't been done, though, is doing all of these tests in VR, you know, with a particular haptic technology, because no one's had this level of tactile fidelity before. And, you know, we've been surprised by some of the results when we do it. It doesn't necessarily track one-to-one with sort of the naive literature-based approach. There's no substitute for direct testing to see how people actually use this, what their behaviors are like, what parts of the hand surface are used most often, how much resolution you need in a particular area, and how that translates to their sensations and their experience.
[00:22:38.374] Kent Bye: Yeah, and I imagine that the fingertips are sort of like the final frontier of being able to really simulate any texture that's out there. And, you know, with your solution, as you're grabbing something, it's very much like you're holding something. If you're pinching your fingers together, you feel like you're holding something. That's a lot different than the coarse grain of being able to run your fingertips over lots of different textures. So I'm just curious to hear if that's something that is actually useful in terms of virtual reality training use cases for your demographic, or if this is a nice-to-have, pie-in-the-sky, final frontier of haptics in VR, being able to simulate that degree of touch with the fingertips.
[00:23:20.157] Jake Rubin: It is important, and it is something we'll be able to show you later this year sometime. We're working on it right now in the lab. Without getting into too much detail on texture perception: basically, that's not an issue of spatial resolution, it's an issue of temporal resolution. So when you get to textures that fine, surface textures like a wood grain or a fabric, they're actually picked up by a different receptor in your skin called the Pacinian corpuscle. And those receptors have very poor spatial resolution, but they're extremely sensitive temporally. They can pick up vibrations up to one kilohertz, which is well into the audible range. So they can essentially feel sounds. And as you run your finger across the surface, your fingerprint interacts with all the ridges and microscopic surfaces in that texture, and that creates vibrations these receptors pick up, and your brain turns that into a texture. And so that's an issue of speed rather than an issue of spatial density. And the new valve architecture that we're working on, in combination with some novel software, allows us to reproduce those kinds of fine textures.
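Jake's point that fine texture is temporal rather than spatial is easy to quantify: dragging a finger across a periodic grain produces a vibration at roughly speed divided by grain spacing, and that vibration is the signal a fast enough valve would need to render. A quick back-of-the-envelope sketch, with invented numbers:

```python
# Back-of-the-envelope check of the "texture is temporal" idea: the
# vibration frequency from stroking a periodic surface grain. The actuator
# side is hypothetical; only the arithmetic is being illustrated.

import math

def texture_vibration_hz(finger_speed_m_s, grain_spacing_m):
    """Frequency of ridge crossings for a periodic surface grain."""
    return finger_speed_m_s / grain_spacing_m

# A 0.5 mm fabric weave stroked at 10 cm/s:
f = texture_vibration_hz(0.10, 0.0005)
print(f"{f:.0f} Hz")  # 200 Hz -- well within the Pacinian corpuscles' range

# One cycle of the drive waveform, sampled at 2 kHz (above the Nyquist
# rate for 200 Hz); a slow valve simply couldn't follow this signal.
sample_rate = 2000
wave = [math.sin(2 * math.pi * f * n / sample_rate)
        for n in range(sample_rate // int(f))]
```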
[00:24:13.944] Kent Bye: Wow. I can't wait to see that and to feel that. To feel it, I should say.
[00:24:18.948] Jake Rubin: Put in a quick plug for one of our partners. I don't know if you've heard of the company called Syntouch. But they've done an episode. OK. So you know them well. They do some amazing stuff on pretty much exactly the inverse of what we're doing. So they have essentially a sensing skin, we have an actuating or output skin. And by using the maps essentially they make of textures and replaying those with our technology, we can give you not just a generic texture, a generic wood grain, but actually this particular piece of wood, this particular type of wood, or this particular fabric. So that's something that's going to take a little while for us to get quite to that level of precision, but certainly useful in automotive, aerospace, product design. What does this fabric feel like? What does this leather feel like? And again, it'll take a little while, but we're well on the way to doing that.
[00:24:59.067] Joe Michaels: One of the funny things about working with the military is you face these moments of sheer terror as a startup where they bring in the SME, the subject matter expert, to sort of test your technology and see, you know, is this good enough? Would I recommend this to my military colleagues? And so we've faced a couple of these moments where a SME came and tried our tech and, you know, they've always given us the pass so far, but that's when we'll be really nervous the first time we try this texture based technology and see what they think of it.
[00:25:27.715] Kent Bye: Yeah, I mean, I had a chance to talk to someone from SynTouch last year at CES 2017, and to see how they're really mapping out a taxonomy and have a whole system for kind of automating a way to detect and profile all the different textures out there in the world. And yeah, it's really interesting to hear that you're partnering with them, because they're kind of coming up with, I guess, the database of the patterns that you need to recreate in order to actually test their theory as to their modeling of these different touches. And then, you know, if you're able to reproduce them and say, oh yeah, I can tell the difference of the fine-grained touch, at some point you can do the Pepsi challenge to see if you're able to tell the difference. I think we're probably a ways away from being able to imperceptibly tell the difference between them, but at least to be able to discern within the context of VR between different things. The final frontier maybe being able to touch the real thing and then in VR see how close you're able to get it, but, you know, at least within the context of VR, to be able to get it to that level using a lot of the data that they've been collecting.
[00:26:27.233] Jake Rubin: And there's definitely a point of diminishing returns, so we don't expect, certainly in the next year, maybe eventually, but we don't expect in the next year to get to the point where you can feel a real fabric, feel a virtual fabric at the same time, and say, oh, they feel exactly identical. We do expect to get to the point in the next year, maybe year and a half, where you can feel a real-world fabric, go into VR, feel that same fabric again, and have the differences not be so significant, such that you'd say, that could be the same fabric. There's a point of diminishing returns where, when you're dealing with those micro-scale textures, it's just not really important to anyone. Maybe some very niche use cases, but for most people, having a good enough replica of that fabric texture that it's believable as the same fabric is all you'll ever need. Anything beyond that is just a waste of technology.
[00:27:08.512] Kent Bye: Yeah, I've been referring to virtual reality as more of a symbolic or archetypal reality, so I kind of see that you're getting the essence of a pattern down, being able to capture the essence of it without getting down into its, you know, concrete form. But I guess in looking at what I'm seeing here, I've seen a huge difference, a 26 times reduction of the box, but at the same time, it's still a pretty big box that I wouldn't want to be carrying around on my body at all. So I imagine that, at least at this phase, you'd be sitting down, maybe standing up at some point. But I'm just curious to hear your ideas or thoughts on the miniaturization, whether you feel like, once you're ready for a final release of your product, this will be fully miniaturized to be able to walk around in a room-scale environment with haptics, or if you expect that, at least for the near future, this is going to be kind of a sit-down type of experience.
[00:27:55.877] Jake Rubin: Yeah, so our developer kit is aimed at seated, standing, and potentially in-place locomotion applications. So we can use this with an omnidirectional treadmill, for example. We don't have an integration yet, but in principle, it should be pretty simple to integrate with an omnidirectional treadmill. For more advanced locomotion, we're looking at a couple possibilities. We're hoping we can find in-place locomotion that is good enough to satisfy some of our military, industrial customers. There are some interesting omnidirectional treadmill technologies out there, like the active type that actually move with you. It's not just sliding your feet along a dish. We're hopeful that we'll get to a point where those technologies are good enough that we don't necessarily have to do room scale. However, of course, a lot of our customers are desiring room scale, and we are beginning to work on a backpack version of the system. One of the nice things about this new valve architecture we're building is it's smaller, it's lighter, and it takes a lot less power. So it is definitely feasible to build into a backpack. The ETA on when exactly we do that is going to depend on how much demand there is from customers. But it's certainly something that's technically feasible.
[00:28:52.242] Kent Bye: So what do you each want to experience in VR?
[00:28:56.323] Joe Michaels: You know, I think about Charlie Fink, who, as many people know, always says that the killer app is other people. And I wonder how that applies to touch. And I think, you know, about the ability to capture the touch side of a volumetric image. Just to get less abstract, you know, I really enjoyed holding my grandmother's hand when I was a kid. And I wonder if someday there's gonna be a way to capture that feeling, and then to be able to go back and hold her hand again. And when you combine that with a volumetric image, you know, to sort of be in that moment could be very, very special. You know, every time I go up to a volumetric image, like an 8i or one of these other holograms, I just have this urge to reach out and kind of interact in a way that's physical, and I'm dreaming of the day we can do that.
[00:29:44.927] Jake Rubin: I'll give a slightly more abstract answer. I'm very excited for the eventual convergence of all of the different technologies in the VR, AR space that I see people working on. I think there's this shared vision of, you know, kind of always-on, cloud-based environments that you can access with many different types of devices, and, you know, that are just sort of living, breathing worlds. And being able to get in one of those environments, interact with other people, have realistic touch sensation, and be immersed enough that you can kind of forget, you know, you're in a virtual world, and just, you know, be and live and do things and have fun. That's always been my ultimate vision. Not necessarily, as you were saying earlier, such high immersion that you can't distinguish between a simultaneous real and virtual experience, but certainly high enough immersion that your brain just buys it, and it's real, and you can just live it. That's what I want.
[00:30:32.761] Kent Bye: Great. And finally, what do you think is kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:30:42.076] Jake Rubin: The ultimate potential, I think, is almost hard to answer because it's so broad. I think if virtual reality, augmented reality, extended reality, mixed reality, whatever you want to call the industry, if it lives up to its potential, it will literally impact every facet of our lives. There won't be anything untouched. It'll impact the way you play, it'll impact the way you work, it'll impact the way you live. So I really just want to see it not fall short of its potential, and really have the impact it can on every facet of our lives.
[00:31:09.656] Joe Michaels: Yeah, I dream of the day when I won't think about it. It'll be so integrated into my lifestyle, or probably my son Nolan or my daughter Lucy's lifestyle, or maybe their kids', that it's not a, oh, we have to slip on a device and we have to boot up the system; it's just part of your daily life. And when you're ready to shift into some mode that involves, you know, a layer of data or experience, it happens very naturally, and they'll laugh. My grandkids will laugh that I ever thought about switching something on and waiting for it to get ready. That's the potential for me. It's part of the way I sense and enjoy new experiences without having to do anything.
[00:31:49.548] Kent Bye: Is there anything else that's left unsaid?
[00:31:52.968] Joe Michaels: I have one thing. As I'm here at Sundance and I meet with some of our mentors and advisors, it really occurs to me how important it is for people in the industry to take startups like us and help them and guide them, give advice, give introductions, give money, whatever it is. If you've made it this far into the episode and you are someone in the industry who is impressed by a startup, it means a ton when you help in whatever way you can. We're depending on that every week, every day. So we appreciate our friends and our mentors and advisors, and keep doing that.
[00:32:32.762] Jake Rubin: And on that same note, you know, I would give a shout out to Sundance as a whole, the staff and everyone here. It's my first time here and it's just a wonderful group of people. I was telling you before we started this interview about all of the interesting people I've met and all of the interesting things I've learned. I think it's a higher density of passionate people doing cool things than I've ever seen before.
[00:32:50.552] Kent Bye: Awesome. Well, Jake and Joe, thank you so much for joining me today. Thank you very much.
[00:32:54.515] Joe Michaels: It's a pleasure. Thanks, Kent.
[00:32:56.476] Kent Bye: So that was Jake Rubin, the co-founder and CEO of HaptX, as well as Joe Michaels, the chief revenue officer. So I have a number of different takeaways from this interview. First of all, the HaptX Glove that was there at Sundance was the first time that I was able to grab a virtual object in virtual reality and have it kind of feel like I was actually grabbing something. Everything else that I've seen so far hasn't been anywhere near as close or as good as what I've seen with the HaptX glove. And if you look at the progress that they made from the prototype that I saw back in March at GDC, which was this giant box, this box was about 26 times smaller. And there's a number of different big innovations that are happening, like this new valve technology that they're going to be coming out with, which is a thousand times more sensitive, to be able to give you a much more granular sense of grabbing things. I think it's on the way to being something that's really super useful, especially for training applications. Now, for entertainment applications, I think it's going to mostly be relegated to location-based entertainment. And I think there's going to need to be a bit of really creative innovation around what types of experiences are going to be that much better when you actually have your hands in the experience. Having a relationship with a virtual character where you're able to actually physically touch it and interact with it, I think that's probably going to be one of the first ways to start to interact haptically with a virtual character. But I don't expect that to be the first or most compelling application. I think that in the world of training and enterprise training and the needs of the military, there's just so many applications where having your hands within the experience is going to make a training application so much more compelling. So there's the principle of embodied cognition, which is the fact that you don't just think with your brain, you think with your entire body and the environment. And when you can start to have more and more senses added into the experience, then the training experience is going to get that much more visceral. A really great example that I learned of for the first time in this interview was the use case of trying to design physical objects and then trying to run an ergonomic study based upon a wide range of people of different sizes. If you're of average size, then you can get a sense by building a physical model and seeing how the ergonomics are going to be. But if you're at any of the extremes of extremely tall or extremely short, you have to kind of design things to be within a certain range. And if you don't have the physical characteristics of that, then it's going to be really difficult for you to empathize with people of these different body sizes. So I can start to see how this haptic technology would allow you to start to simulate some of the ergonomics of these different body types within virtual reality. And so I think that in some ways there's going to be a lot of applications where you don't need your hands, where you're not going to be doing any fine-grained manipulations. But there's also a lot of applications where you actually need to have your hands within the experience.
And I think that's where a lot of the military and other enterprise training applications are getting really excited about where this technological roadmap is going for haptics. And I definitely see that enterprise training and military training are going to be the most compelling use cases for their technology. That's going to help kind of bootstrap this technology, so that eventually we're going to start to see more storytelling and entertainment applications with it. But I think the price would have to come down a lot in order for it to be at the scale for it to be viable. We might start to see some of this technology within location-based entertainment within the next couple of years or so. So Jake had talked about the two-point test in my previous interview back in episode number 545, so I recommend you go check that out. But the essential point is that they measure the level of haptic fidelity by putting two points down on your skin, and if they're close enough, then your body thinks that that's actually one point rather than two points. So there's a minimum distance at which your skin can start to detect the distance between those two points, and your skin has all sorts of different variations. Your back, for example, is not very sensitive, whereas your hand is very sensitive. And so one of the things that Jake was saying is that for the first time they're able to take a lot of this research that's been done about the resolution fidelity within your hands, and for the first time they're able to actually build a haptic technology that can start to actually test it. What's the actual direct experience, the phenomenological experience, of having these different resolutions? And they've been taking away resolution in some areas and adding it in other areas where they find that it's super useful. So I think it's super interesting to hear how they're starting to refine some of these different models for the different resolutions they need across the whole hand to make it feel like you're actually grabbing and holding things. The other thing is that their collaboration with SynTouch is super fascinating. I did an interview with them back in episode 496, and I highly recommend you go check it out if you're interested in this type of stuff. Because what SynTouch is essentially doing is quantifying touch on the 15 different dimensions of their model. And they talk about that model back in episode 496. But they have an automated way to basically describe the model for what different textures feel like. So on that end, they're producing that data. And on the other end, on the device display side, this is where HaptX is coming in. They're trying to build the technology to actually simulate all these different textures. And it sounds like they are making some pretty good progress in being able to actually simulate the textures, which, if you think about it, is what I consider to be one of the final frontiers of haptics: to actually feel like you're touching the different textures of different objects. What Jake said was, don't expect that you're going to be able to, you know, have a virtual reality experience and feel a texture, and then be able to come out of VR and not know the difference.
I think you're always going to kind of know that this is some sort of symbolic representation of these textures. The experience I had with HaptX at Sundance was that my brain kind of knew that I wasn't really grabbing objects, but it was good enough that I started to actually forget that this was being mediated through the technology. And I started to kind of wave my hands around, and I tried to backslap something, and then I realized, oh, there's no stimulation on the back of my hand. And so it sort of broke that presence in a certain way, in that I was able to feel things in the palm of my hand, but not on the back of my hand. And so that's just an example of getting super immersed within the experience and just kind of forgetting that these experiences that I'm having within VR were being mediated through this technology. And I think that's what they want to get to, just getting to the point where you're actually forgetting that the technology is doing anything, and you just get so immersed within the experience. So I think that's the goal of where they want to take all of this: creating such a convincing haptic experience that you can go into a virtual reality experience, kind of turn off the part of your brain that knows this is not completely real, and then just surrender to being able to get lost within the experience. And I think that being able to touch things, like in mixed reality experiences and location-based entertainment and The Void and some of the experiences that were at Sundance this year, like Hero, which had a lot of mixed reality elements, just takes the level of immersion and presence to the next level. The other thing that I found really interesting is the future of where some of these haptic devices are going to go. At first we're starting to simulate physical reality, but I think at some point, to me, what's really interesting is to think about how a haptic glove like this could start to be an input device. So Jake said that, you know, when they started this haptics company, they didn't expect to get into motion tracking of the hands, but they had trouble finding a solution that was going to deliver the level of fidelity that they needed to really simulate what it feels like to touch something. So this goes way beyond what some of the more optical tracked controls of your hands can do, and even more fidelity than what you get from the HTC Vive. He's saying that they needed to have, like, a one-third of a millimeter level of precision. And, you know, there's all sorts of issues with occlusion with Leap Motion, and so they had to do this sensor fusion of doing this magnetic tracking within the fingers, in addition to having an HTC Vive puck on the back of the hand. So they're doing this combination of getting really precise tracking of where your hand is, so that from that point you can do the sensor fusion to figure out the precise location of your fingers. So, given that, you can start to do both motion capture, but also potentially do things like move your fingers around on what's called a chorded keyboard. And these chorded keyboards would be like a keyboard that you hold in one hand, and then, kind of like playing guitar chords where you push different fingers at the same time, you use that to be able to type letters.
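To make the chorded-input idea concrete, here's a hypothetical sketch of decoding per-finger flexion from a glove into characters. The chord table is invented and isn't any real chording standard.

```python
# A hypothetical sketch of chorded input: treat which fingers are flexed
# past a threshold as a binary chord and map chords to symbols. The chord
# table below is invented, not a real chording standard.

FLEX_THRESHOLD = 0.6  # normalized flexion, 0 = open, 1 = fully curled

CHORDS = {
    (1, 0, 0, 0): "a",   # index only
    (1, 1, 0, 0): "e",   # index + middle
    (0, 1, 1, 0): "t",   # middle + ring
    (1, 1, 1, 1): " ",   # all four fingers
}

def decode_chord(flexion):
    """flexion: per-finger values (index, middle, ring, pinky) from the glove."""
    chord = tuple(int(f > FLEX_THRESHOLD) for f in flexion)
    return CHORDS.get(chord)  # None if the chord isn't bound

print(decode_chord([0.9, 0.8, 0.1, 0.0]))  # -> "e"
```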
And so using this type of haptics controller, there could be a way where you start to use it as an input control, where you do little combinations of finger movements in order to actually have things happen within the VR experience. And I think the next level after that, the thing that I'm really interested to see where this is going to go in the future, especially with some of their fluid haptic technologies: back in episode 527, I had a chance to talk to David Eagleman about his Neosensory vest. And so with Neosensory, they're actually putting a vest on your torso, and they have this concept of sensory replacement or sensory substitution. Sensory substitution would be, you know, if you were to translate audio, break it down into frequencies, and then, with these 32 different buzzers on your vest, use different combinations of the different frequencies to basically simulate the signal that would be going into your ear. And so if you're deaf, you're able to actually train yourself to hear through your torso, if you get the correlation between translating the sound, putting it through your torso, and then that signal getting into your brain. And if you see that visual feedback, you're able to connect the dots and essentially substitute senses that you have lost, or create entirely new senses. And I think that is where this gets really interesting, where you start to use some of this technology to do some sort of translation of data. So let's say you have 360 degrees worth of data, and then maybe you put some of this haptic technology all around your waist. You could start to stimulate those different parts around your body, and then you could start to have new ways of ingesting spatial information and data that go beyond what your eyes can see. They can already do this with what they call sonification, which is to translate data into sound. And I think there's going to be a similar type of translation, taking massive amounts of data and putting it in through your skin, through your body. I think this is probably a little further out, but theoretically, it's possible. Using some of the technology that they've built and developed, you could start to do all sorts of interesting things with, you know, training your body to speak new languages and to actually cultivate and develop new senses. So that's where I think this technological roadmap is going. Right now they're just trying to simulate what it feels like to grab objects, and it's going to go into training applications, then eventually into entertainment. But I think there's all sorts of other opportunities for using your body to ingest massive amounts of information and data, and for you to eventually start to figure that out.
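Since that description of the Eagleman vest is already an algorithm in prose (split audio into frequency bands and map each band's energy onto 32 buzzers), here's a rough Python rendering of the scheme. The band split, frame size, and scaling are my own illustrative choices, not Neosensory's actual signal chain.

```python
# A rough sketch of the sensory-substitution scheme described above: split
# an audio frame into frequency bands and map each band's energy to one
# vibrating actuator. 32 bands, like the vest mentioned; everything else
# (frame size, linear band split, scaling) is invented for illustration.

import numpy as np

def audio_frame_to_actuators(frame, n_actuators=32):
    """frame: 1-D array of audio samples. Returns per-actuator intensities in [0, 1]."""
    spectrum = np.abs(np.fft.rfft(frame))
    bands = np.array_split(spectrum, n_actuators)   # crude linear band split
    energy = np.array([band.mean() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy

# 50 ms of a 440 Hz tone at 16 kHz: the energy lands in one low band,
# so only one actuator on the vest would buzz strongly.
t = np.arange(int(0.05 * 16000)) / 16000
intensities = audio_frame_to_actuators(np.sin(2 * np.pi * 440 * t))
print(intensities.round(2))
```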
So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. At the end of this podcast, I really enjoyed what Joe had said: you know, if you know of a startup and you are willing to offer them any type of help or advice or feedback, that's something that is really super appreciated. And here at the Voices of VR podcast, I am kind of one of those startups, where I'm out there just trying to bring you the latest and greatest information about the virtual reality community. So if you enjoy that, then please do spread the word, tell your friends, and, you know, consider becoming a member of the Patreon. I do rely upon your donations in order to continue to travel to these conferences and to bring you this coverage. So you can become a member today at patreon.com/voicesofvr. Thanks for listening.