#495: Tricking the Brain is the Only Way to Solve the Infinite Degree of Freedom Haptics Problem

Deep in the basement of the Sands Expo Hall at CES was an area of emerging technologies called Eureka Park, which had a number of VR start-ups hoping to connect with suppliers, manufacturers, investors, or media in order to launch a product or idea. There was an early-stage haptic start-up called Go Touch VR showing off a haptic ring that simulated the type of pressure your finger might feel when pressing a button. I'd say that their demo was still firmly within the uncanny valley of awkwardness, but CEO Eric Vezzoli has a Ph.D. in haptics and was able to articulate an ambitious vision and technical roadmap towards a low-cost and low-fidelity haptics solution.

Vezzoli quoted haptics guru Vincent Hayward as claiming that haptics is an infinite degree of freedom problem that can never be 100% solved, and that the best one can hope for is to trick the brain. Go Touch VR is aiming to provide a minimum viable way to trick the brain, starting with simulating user interactions like button presses.

I had a chance to catch up with Vezzoli at CES, where we talked about the future challenges of haptics in VR, including the 400-800 Hz frequency response of fingers, the mechanical limits of nanometer-accuracy skin displacement, the ergonomic limitations of haptic suits, and the possibility of fusing touch and vibrational feedback with force-feedback haptic exoskeletons.

LISTEN TO THE VOICES OF VR PODCAST

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to The Voices of VR Podcast. So I was recently at the Consumer Electronics Show and there was just a sea of different companies there. And in the basement of the Sands Expo Hall, there was an area that was really interesting called Eureka Park. They had all sorts of different VR startups there. And in that park, they had this company called GoTouchVR, which was showing this haptic device that was trying to simulate at a very low fidelity what it would feel like to essentially press a button. It's like this wearable ring that would indent on your finger whenever it was tracking that you were colliding with some object within the VR experience. Now, haptics is something that is a very, very difficult problem. And I would not say that I've ever seen a haptic device that I tried out where I said, oh my god, this is it. They've solved it. It's a hard problem, and it's something that is going to take a long, long time to get to the point where we have the Ready Player One-esque haptic suits with exoskeletons and being able to convincingly trick our minds into actually touching objects. And in this interview with the CEO of GoTouchVR, Eric Vezzoli, he cites a guru of haptics named Vincent Hayward, who says that haptics is essentially an infinite degree of freedom problem that is not solvable, and that the only way that you could start to solve it is to trick the mind. And so that's what GoTouchVR is trying to do. They're trying to trick the mind, starting with their wearable haptic ring. So I'll be covering haptics and the future of haptics and their technical roadmap of where they're at now and where they plan to go in the future on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Silicon Valley Virtual Reality Conference and Expo. SVVR is the can't-miss virtual reality event of the year. It brings together the full diversity of the virtual reality ecosystem. And I often tell people if they can only go to one VR conference, then be sure to make it SVVR. You'll just have a ton of networking opportunities and a huge expo floor that shows a wide range of all the different VR industries. SVVR 2017 is happening March 29th to 31st. So go to VRExpo.com to sign up today. So this interview with Eric happened at the Consumer Electronics Show, happening in Las Vegas from January 5th to 8th, 2017. So with that, let's go ahead and dive right in.

[00:02:39.459] Eric Vezzoli: So hi, I'm Eric Vezzoli, and I have a PhD in haptics. And we are implementing a new kind of haptic sensation into virtual reality by changing a little bit the architecture of the device that we are going to produce. This is a system that is not based on vibration and is not based on force, but it actually generates a real contact between your finger and the device itself. And this is the basis of how the user touches objects in reality. And that gives the illusion to the user of touching the virtual object.

[00:03:06.740] Kent Bye: Great. So what are some of the standard ways of doing haptics, and what is it that you're doing specifically that's different?

[00:03:12.062] Eric Vezzoli: So the standard way of doing haptics is like in the telephone now: the system vibrates. That's the standard way to do haptics. So you vibrate the whole system, you pick up these vibrations, and you perceive a signal. But vibration is a signal that is not natural. In fact, in reality, the only vibrations that exist are earthquakes, and nothing else. So we are not used to this kind of vibration. They are just man-made. To make it easier, we regenerate the real contact between the finger and the objects. And so that's the natural sensation between the user and the virtual objects. So you can recognize that it's something real.

[00:03:45.253] Kent Bye: So when I was at the IEEE VR conference, one of the things that I took away from there is that a lot of the haptics that are done right now use rigid bodies with force feedback, with Haption or these $20,000 to $30,000 systems, but a lot of the techniques have been very use-case specific. So to come up with a general haptic device that feels convincing, I think, is a huge uncanny valley to be crossed. I wouldn't say that you've been able to cross the uncanny valley. It still kind of feels contrived and fake and virtual and a little weird. So I guess, how is your approach trying to make this ambitious leap over the uncanny valley and to really try to simulate touch?

[00:04:27.033] Eric Vezzoli: The uncanny valley, I don't really believe that someone can overcome it fully. What this device is designed to do is to arrive at least at 70-80% of this uncanny valley, and to give to the user as much as is reasonable at a consumer price, because our system might be priced under $100. That's the point. If you want to have surgery training or whatever other training, and you can spend $20,000, $30,000, $50,000 for the full robotic arm that blocks what you want to do, that's okay, but that's not the general user. What we had in mind is a user where we say: let's get to the best point we can arrive at with as little hardware as we can. The aim is to get better with what we have. Maybe we'll arrive at 90% of what the user can feel, but to arrive at 100%, that's extremely complicated, because you need a lot of hardware around it. We decided to keep it simple and make it work nicely. And that's our technological roadmap for the next two years.

[00:05:23.619] Kent Bye: And so in talking to Rob Lindeman, one of the things he told me was that the body has different frequency responses. Like, there's parts of your back where you can't necessarily determine whether one or two fingers are pressing on you. Your hands seem to be the absolute most sensitive part of your entire body. And so how would you characterize the frequency response of the hands? Is there a number you can put to it?

[00:05:46.515] Eric Vezzoli: So the frequency response of the hands goes from 400 to 800 hertz. There have been a lot of published papers around that. And it's divided into two different approaches. One is the touch approach that we are developing. And the other one is the vibrational approach that we haven't yet implemented in our device, but we're thinking of implementing it. What we are going to do is a stimulation fusion between the touch approach that, as you tried, is low frequency, because it's around 10 hertz, and that's what it is in reality. And then there is a vibrational approach on top of it, which is between 10 or 20 Hz up to 400-500 Hz. So with this stimulation fusion, we would be able to regenerate this contact and then, on top of this contact, generate the subtleties that you recognize, which are the difference between the virtual and the real experience. Again, to say we can arrive near reality: it's really, really difficult to get extremely, extremely good. But that's our approach in the long run: to keep in our device this human perception, which is extremely large, and we want to cover it as much as we can with this really simple approach.
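To make that stimulation fusion concrete, here is a minimal Python sketch of summing a slow contact channel (~10 Hz) with a band-limited vibration channel (~20-500 Hz) into one actuator drive signal. The sample rate, filter choices, and scaling are illustrative assumptions, not Go Touch VR's actual design.

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 2000  # actuator sample rate in Hz (assumed; must exceed 2 x 500 Hz)
t = np.arange(0, 1.0, 1.0 / fs)

# Contact channel: a smoothed step for "finger presses a virtual button".
contact = (t > 0.2).astype(float)                    # raw contact event
sos_lp = butter(2, 10, btype="low", fs=fs, output="sos")
contact_lp = sosfilt(sos_lp, contact)                # ~10 Hz displacement envelope

# Vibration channel: texture noise band-passed to 20-500 Hz,
# gated so it only plays while the finger is in contact.
rng = np.random.default_rng(0)
noise = rng.standard_normal(len(t))
sos_bp = butter(2, [20, 500], btype="band", fs=fs, output="sos")
texture = sosfilt(sos_bp, noise) * contact_lp * 0.2  # scaled texture detail

drive = contact_lp + texture  # fused signal sent to the skin actuator
```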

[00:06:48.993] Kent Bye: So when I was using the Haption device, holding a rigid body, because of how it's tracked, it's able to correlate between the virtual world and the real world to sub-millimeter accuracy. Using something like a Leap Motion tracker, you don't necessarily have that same level of precision. And I'm just wondering if you're planning on doing a dynamic force profile, because if I'm moving at a certain velocity, I'm going to feel different levels of pressure, and you have to do all sorts of correlation between my proprioception of where my hand is, which is where I think it should be in the virtual world, and also a velocity vector that's moving through there. You have to have very high precision tracking, which I think the Lighthouse tracking could be able to do. But you're talking about your hands and your fingers, so you actually need the dexterity of your fingers as you're doing it as well. And so there's a lot of hard problems there. But how do you correlate the position of your hands into the virtual space? Because that seems to be a key part of tricking the mind that what you're feeling is actually real.

[00:07:49.399] Eric Vezzoli: So we addressed this problem in our technical roadmap, and we came up with a solution that we think would solve the problem, and especially would also give a zero-latency, or really, really low-latency, system. Of course, the optical tracking will be part of the solution, but what is extremely important is to also have embedded tracking, because what you need is zero-latency haptic feedback. So you need an IMU on the finger, and especially on our device it sits on the tip of the fingertip, which is the point of contact. So by having a global tracking system and a local tracking system, and doing data fusion between the two, we will be able to implement what you were describing: this correlation between the displacement of the hand in the virtual world and a correct haptic sensation. Within the limits of our perception, because our perception is pretty forgiving, you can have up to 20 milliseconds of mismatch between the two signals and everything is okay. So you don't need extreme precision in the optical tracking, but you need extremely precise inertial tracking within the device. Because you can trick the geometry, but you cannot trick the frequency.
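As a rough illustration of that global/local fusion, here is a minimal complementary-filter sketch in Python: high-rate IMU integration for low latency, periodically pulled back toward the drift-free optical fix. The 1-D state, rates, and gain are my assumptions for clarity, not the company's actual algorithm.

```python
class FingertipFusion:
    def __init__(self, blend=0.02):
        self.pos = 0.0      # 1-D position for clarity; real use would be 3-D
        self.vel = 0.0
        self.blend = blend  # how strongly optical samples pull the estimate

    def imu_step(self, accel, dt):
        """High-rate path (e.g. 1 kHz): dead-reckon from IMU acceleration."""
        self.vel += accel * dt
        self.pos += self.vel * dt
        return self.pos

    def optical_step(self, measured_pos):
        """Low-rate path (e.g. 60-90 Hz): blend toward the optical fix,
        which bounds IMU drift without adding latency spikes."""
        self.pos += self.blend * (measured_pos - self.pos)
        return self.pos
```

The design point Vezzoli makes maps directly onto the two methods: the IMU path runs every sample so contact timing ("the frequency") is never late, while the optical path only has to keep the slow positional drift ("the geometry") within the roughly 20 ms perceptual tolerance.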

[00:08:58.843] Kent Bye: And so do you have a variety of different pressures that you're giving within your device? Because my perception was that it was kind of like a uniform pressure. No matter how fast or slow I was pushing, it all seemed to be the same sensation on my finger, which to me was a bit of a fidelity mismatch that broke the plausibility illusion and presence in some way.

[00:09:18.023] Eric Vezzoli: So yes, that's a really, really, really good point. What we are showcasing here is an extremely simple demo, because we need to showcase the different approach. So, contact: nobody did it, okay, let's introduce the contact. To introduce material properties, that is, different displacements of the skin as a function of the interaction, what we need is correct tracking, correct data fusion between the two, and a setting which is not CES, which is a really general-public situation. We need proper demos where we can showcase and introduce the different kinds of solutions, and of course not be based on the Leap Motion. In this situation it's the easiest, but in the future we will implement all the situations that you introduced. What we were building here is the building block; it's the fundamental stuff. It is: okay, let's not work with gloves, let's not work anymore with force feedback, let's work with a completely new object, which is a wearable ring that generates the contact. Okay, let's start from scratch. How is the technology? Is it interesting? Yes, so let's make it better. And that's what we're going to do in the next two years.
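One standard way to get the velocity-dependent pressure Kent is asking about is the textbook spring-damper contact model from haptic rendering, sketched below in Python. The stiffness and damping constants are illustrative assumptions; nothing here describes Go Touch VR's actual controller.

```python
# Spring-damper contact model: commanded force grows with how deep the
# virtual fingertip penetrates the surface and with how fast it is moving,
# so a fast press feels different from a slow one.
def contact_force(penetration_m, penetration_vel_ms,
                  stiffness=500.0, damping=5.0):
    """Return simulated contact force in newtons (zero when not touching)."""
    if penetration_m <= 0.0:
        return 0.0
    return stiffness * penetration_m + damping * max(penetration_vel_ms, 0.0)

# Example: a 2 mm penetration arriving at 5 cm/s vs. resting statically.
fast = contact_force(0.002, 0.05)  # 1.0 N spring + 0.25 N damping = 1.25 N
slow = contact_force(0.002, 0.0)   # 1.0 N spring term only
```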

[00:10:17.551] Kent Bye: And so just in the process of using it, I noticed as it was pushing on my finger, it was sliding off my finger. I had to keep pushing it back on. Do you have any plans for other mechanisms to hold it in place? Because it's pushing against my finger, and so it's kind of pushing itself off of my finger.

[00:10:32.833] Eric Vezzoli: So, this pushing back is mostly due to the fact that we didn't adjust it to your ergonomics. Your finger is different from anybody else's finger. So, with your personal device, there will be a personal adjustment to avoid this kind of pushback. Of course, the attachment system now is just Velcro with an elastic; we are working on a semi-rigid system with really high friction that will keep the device from slipping back. Again, this is on our technological roadmap. We are a startup right now; we are looking for funding to implement all of this. We are here to showcase the idea, and then we need someone who is interested in the idea to implement all of the remarks that you gave us. We thought about them, because we come from haptics, so these are standard remarks. We thought about that, we designed the fundamental system to address them, and then we have our plans to actually put them into production and into a really well-working prototype.

[00:11:27.363] Kent Bye: And so some of the people I've talked to say that if you're able to simulate the haptics and the feelings in the hands, you can actually take care of a lot of the sensation of touch. So do you have plans to expand to your full hand, like all five fingers, or other parts of your hands? Can you go beyond just one finger and coordinate amongst multiple fingers to simulate that sense of actually touching something? And then, how do you deal with the fact that if you have the sensation of grabbing a ball, but there's no ball there, you don't have that force feedback to be able to actually convince your brain?

[00:11:59.028] Eric Vezzoli: That's an extremely pertinent question. Force feedback for the human is divided into two different categories. One category is between zero and one newton, and the other one is between one newton and infinite newtons. Between zero and one newton, what the brain expects is not the force but a skin displacement. So it's a displacement on the tip of the fingertip, which is exactly what we are doing. That's why we designed the system this way: we want to displace the skin of the fingertip and give the force feedback up to one newton. After one newton, of course, you need an exoskeleton. But the objects that you grab and move every day are within the range of one to five newtons. So with five fingers, you have one newton for each finger, so you are almost still in the range of the illusion that you generate with skin displacement. That's why we designed this system. If you want to address the full hand, our plan is to use five or three different devices, which are totally independent of one another. Why are we interested in that? Because with three different tracked devices, what we can also do is gesture recognition. Not just grabbing, but also human-machine interaction. Like the Apple trackpad: you have three fingers, they are tracked separately, they expand, and so the windows just go in their places. Why can't we do it in VR? Why can't we expand and explode systems just with free hands? But we need really precise tracking on board, and for doing that, we need a single unit which is able to do it. Otherwise, we have to design, as we were saying before, a single machine for a single use. What we want to do is make the mouse of virtual reality, because the mouse does tracking and clicking, so: track and haptics. What we will do here is track and haptics, and it is scalable and easy to wear. That's our approach: simple is better. Start from there, then get complicated. Not start from complicated and then get simple.
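To illustrate the two regimes Vezzoli describes, here is a small Python sketch that renders up to 1 N per finger as fingertip skin displacement and flags anything above that as force only a grounded exoskeleton could supply. The linear mapping and the 2 mm actuator travel are illustrative assumptions, not published specs.

```python
# Below ~1 N per finger, render simulated force as skin displacement;
# anything beyond that is "leftover" force a grounded device would apply.
def render_finger(force_n, max_skin_force=1.0, max_travel_mm=2.0):
    """Return (skin_displacement_mm, unrendered_force_n) for one fingertip."""
    rendered = min(max(force_n, 0.0), max_skin_force)
    leftover = max(force_n - max_skin_force, 0.0)  # would need an exoskeleton
    return (rendered / max_skin_force) * max_travel_mm, leftover

# Grasping a ~4 N everyday object with five fingers stays inside the illusion:
grip = [render_finger(4.0 / 5.0) for _ in range(5)]  # ~0.8 N each, no leftover
```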

[00:13:42.380] Kent Bye: And I'm curious to hear a little bit more about the different use cases you see for your approach to the technology. Because, you know, in the long run in haptics, maybe 5 to 20 years, I hope that we'll be able to have haptics so that when we touch something, it feels like it's replicating the material properties, the levels of friction, whether it's fuzzy, and all of that, which is probably some of the hardest engineering problems that I can imagine within virtual reality. Even though touch may be just a small percentage (I imagine that vision and audio may account for anywhere from 80 to 90% of the experience), it actually gives a lot more, and it can actually flip you over into really giving this deep sense of presence. The passive haptic feedback experiences that I've had, being able to have objects in mixed reality settings or The Void, really give that deep, deep level of immersion and sense of presence. But yet it feels like there's a lot of almost intractable problems when you think about replicating material properties, in order to really get a virtual reality that's indistinguishable from real reality. Do you feel that that's on the roadmap and achievable in the next 5, 10, 20 years? And I'm also curious to hear some of the stepping stones to get there.

[00:14:50.182] Eric Vezzoli: So that's an extremely good question. And I will quote a guru of haptics, probably the best haptician in the world, who is a professor at UPMC in Paris, called Vincent Hayward. He says the haptic problem is an infinite degree of freedom problem, which cannot be solved. The only way to solve it for a specific application is to trick the brain. What you can do is get to 80-90%; you cannot get to 100%. Because to get to 100% you need temperature and texture, and it has been shown that a human can distinguish an edge as small as 10 atomic sheets. So what you would have to reproduce is one nanometer of displacement on a fingertip, which is totally different for everyone, and it's not a regular surface, because it has a fingerprint. So, on arriving at the full reproduction, I dare to say that he has a point. To arrive at a convincing reproduction with a multisensorial approach, which is virtual reality visuals and sound, then yes, we can use illusion and trick the brain. What we are doing is an illusion. In the next 5-10 years, what I see is an extremely good fusion between the touch and vibrational approaches to get kind of a full range of perception of the finger, and based on that, probably the integration with exoskeletons. However, force feedback is not cheap, because mechanics is expensive; it's always been expensive, and it will always be expensive, because motors, you have to build them. So to get to great performance, like for surgery, you need expensive systems. For the consumer, we can get, as we said before, to something which is totally reasonable. Not really the full reproduction of reality, because that's probably outside the possibility of mechanics nowadays.

[00:16:41.535] Kent Bye: Yeah, I think that, talking to Rob Lindeman, the thing he said is that in order to really replicate that level of haptics, you would have to have a haptic suit to almost completely isolate yourself from the feeling of the environment in the real world. And then you would perhaps be able to start to do that level of fidelity to trick your entire body. We're talking about just the hands, but really the whole body is a haptic system. So do you foresee the eventual move towards this Ready Player One-esque haptic suit sometime in the distant future?

[00:17:12.032] Eric Vezzoli: So, yeah, still a good point. It would maybe be possible if we were all the same size. Because if we were all the same size, you make one device for everybody, and it fits you perfectly; all the mechanoreceptors are in the same position, so you can stimulate them individually. But the problem is that we are all different. So my haptic suit is not your haptic suit, because you're maybe five centimeters taller than me, and so your haptic suit doesn't fit. So the immersion is not total. That's the big point about haptics. Because for vision, you can adjust the position of the lenses, and then it's done; we all, more or less, see the same stuff. Okay, there's someone who's color-blind, but that's not the problem. Hearing is the same; the position of the earbuds is the same. But the mechanics of the body is extremely, extremely variable. You have people who are four feet tall and people who are seven feet tall. So you need to tailor the system on them to make them stay totally immersed in virtual reality. It might be possible for an experiment for one person with a lot of effort, but I don't really see it for a mass-market system.

[00:18:18.293] Kent Bye: And what do you want to experience in virtual reality then?

[00:18:21.965] Eric Vezzoli: For now, what we are interested in is the translation of interaction strategies that you have in the real world into the virtual world. So it's: okay, I learn something in real life and I can use it in virtual, but also the way back; I learn something in virtual and I can use it in real life. That would help us to implement systems which can teach people. Because teaching is something we do for the first 25 years of our life, which is enormous, and there is a lot around it. If we can just make this bidirectional ability a little bit more immersive, we will actually help the user a lot to get through this kind of process faster. This is what we want now. We are not yet at the full reproduction. We want to say: okay, can we at least interact with the system in the same way? Why? Because we are designed to interact with reality; we are not designed to interact with virtual reality. So for us, it's best to try to tailor virtual reality to reality, to use all the strategies that our brain has, especially moving and touching. We are focusing on touching because most of the mechanoreceptors of the body are on the tip of the fingertip, so we are focusing there. But of course, as you said before, the body is enormous, and the problem is extremely hard.

[00:19:35.967] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:19:44.587] Eric Vezzoli: I like to say that the big thing is mixed reality, not just virtual reality. Virtual reality is extremely good for gaming, extremely good for learning, extremely good for these kinds of applications, but it has a big problem, which is that you put a lot of strain on your eyes, and a lot of strain on your cognitive system, because you always feel a mismatch between what you see and what you actually experience with the body. In mixed reality, you stay focused on what you have, your real world, so you're totally comfortable with that. And then you can add information, you can add things on top of that. Which is probably more interesting, because there is a lot less isolation in mixed reality than in virtual reality, because you can see the others. In virtual reality, you can see their avatar, which is actually not the same, because you cannot really recognize facial expressions well. People are working on that, but it's complicated; it is not a problem you can solve tomorrow. The ultimate goal of it, I think it can be applied in so many fields that it is too ambitious to define the ultimate goal. And you can also go in a dystopian direction, because if you think: okay, I just fall asleep and enter the Matrix, that's total virtual reality. Some people may choose it. Okay, but that's a bit strange, actually, to think about. And that might be a final approach. But is it something that we should aim for? It's not me who should judge it. So I don't really want to give you a single answer to the final aim of virtual reality.

[00:21:18.953] Kent Bye: Awesome. Well, thank you so much.

[00:21:20.575] Eric Vezzoli: Thanks for the interview. It was really pleasant. And bye bye.

[00:21:24.959] Kent Bye: So that was Eric Vezzoli. He's the CEO of GoTouchVR, which is a haptic startup company. So I have a number of different takeaways about this interview. First of all, just generally, demoing haptic devices within virtual reality at this point is just really disappointing, because it's not ever convincing my brain to the point where I just think, oh my gosh, this feels amazing. It's still really firmly in that realm of the uncanny valley, and I don't think that GoTouchVR has necessarily escaped that. But I think this interview is important just because somebody is going to figure out some solution that's going to trick the mind enough that it's going to be satisfying for consumers to use in different applications. And whether that's going to be gaming applications or it's going to be specifically used in training exercises, that's what I'm not sure about at this point. At the HTC booth at CES, there were the Noitom gloves, which essentially used the Vive tracker puck that was attached to your arm. It was probably the most convincing VR glove that I've seen up to this point. And with Noitom, they have the fingertips cut off, so you're able to actually touch your fingertips. So when you're grabbing something, you're getting that sense of haptic touch, but you're actually touching your own fingers. The thing that GoTouchVR is trying to do is to kind of simulate that feeling that you're touching and grabbing something. So if you're grabbing a ball, it's giving that stimulus to your fingertips, and then they're hoping that it's going to essentially trick the brain enough so that it gives a little bit more natural user interaction. Now, the thing that Eric is saying is that there seem to be two different levels of touch within VR. One is just simply touching things. That's going to be on the scale of zero to one newtons. And that's the realm that they're really trying to address: if you're just pushing a button, they're just trying to simulate the skin displacement that is happening when you're actually touching things. When you're actually grabbing things and picking them up, that's where you need to start to have a more sophisticated exoskeleton that's actually on your hand, able to mechanically, with motors, give you resistance so that your fingers are stopped from collapsing and forming a fist, and it gives you that feeling that you're actually holding something. So Eric seems to think that at some point there's going to be a combination of some sort of fingertip stimulation along with some sort of exoskeleton system. They're trying to do the lowest fidelity, the low end of just stimulating the fingertips. It felt like a very early prototype, and they're at the stage where they're taking pre-orders for their dev kits, but they're a startup, and I don't know what plan they have to come to market and actually bring these low-fidelity haptics devices out to market. So previously I've talked to Luv Kohli, who talked about this idea of redirected touch, which is basically exploiting the concept that our visual perception dominates.
One of the examples that I've seen at IEEE VR is that they took this cylindrical can that was straight and made its virtual representation curved, so that you get the visual feedback of your hand moving in a curved trajectory, yet you're feeling something that's straight, and your mind is tricked into thinking that it's actually curved. And so that's what Eric is saying: they're trying to create some sort of haptic device that's operating in this multi-sensory environment, where you also have the immersive visuals of virtual reality as well as the sound. And when you have all those together, that haptic touch can trick your mind pretty easily. I've also seen redirected walking demos where you're walking in a quarter circle, but you're getting the visual feedback that you're walking straight, and you have your hand on these walls, and it's actually a curved surface, but you can't actually tell that it's curved because you're getting the visual feedback that it's straight, and so your vision kind of overrides what your other senses are receiving. So just this idea that eventually they're going to be able to do some sort of combination of skin displacement as well as perhaps some sort of vibration: they may be able to stimulate the frequency response of the fingers enough to kind of trick you into feeling that you're actually touching things. But in terms of who's going to be able to create the haptic device that is super convincing and able to replicate all the different dimensions of touch, you'd probably have to end up using the GoTouchVR in combination with some sort of other exoskeleton solution, and it's unclear to me whether or not that's on their technical roadmap. But what was immediately clear is that they do have this multi-year trajectory of where the technology is now and where it's going to go in the future. But it sounds like, overall, given the state of the technology and where things are at, it's going to be, I think, firmly in the realm of the uncanny valley, at least for another year or two and potentially beyond that. But in the long horizon of 5, 10, 20 years, I think for sure haptics are going to be one of those things that we actually really want within a virtual reality experience, because you can only go so far with just providing visual and auditory stimulus. I think actually feeling things in VR is going to be that thing that actually flips us over into having a virtual reality that is indistinguishable from real reality. But without that touch, I think it's always going to just feel a little empty and hollow. And right now, it may actually be the realm of VR arcades where we start to see some of the most cutting-edge haptic devices. I know The Void, for example, is still one of my favorite VR experiences I've ever had, being able to walk around forever within a virtual reality experience in this kind of infinite way, where I'm reaching out and touching objects. It just tricked my mind in a way that I had never been tricked before, and it cultivates this extremely deep sense of presence. Also, in trying out this motion platform experience at CES, it was called Vault, from Viv Studios from South Korea, and it essentially felt like I was flying in the Speed Racer through this desert and then this three-dimensional city that had no gravity, and I was just turning up, down, left, right. And the content matched with the motion platform just gave me this deep, deep, deep sense of immersion.
And so being able to stimulate the haptics in our body is going to be the thing that tips us over into having some of the most intense and visceral experiences. And I think a lot of those are not going to be available for the home; they're going to be in the VR arcades, because they're just going to be too big and too expensive, and we're going to be moving back into this golden era of arcades that are providing more immersive experiences than are possible at home. Once the gaming consoles started to surpass the graphical fidelity of arcades, the golden era of arcades really died out. But I think with the limitations of space and real estate, in combination with stimulating all of your senses with the different haptic devices, the most cutting-edge stuff is going to be in the VR arcades over the next two to five years. And we may not see a lot of VR arcades taking off here in the United States just yet, but I get the sense that there's a lot of movement with what's happening with VR arcades in China. So I just wanted to contextualize that a little bit. It's going to be a while before we have our amazing exoskeleton haptic suits and are able to really trick our minds into believing that we're completely immersed within a virtual reality. But it's on the technological roadmap; we're going to get there at some point. And I think it's important to take a look at what companies like GoTouchVR, as well as AscendTouch, which I'll be covering later this week, are doing within the haptic space, because it's part of the overall technological roadmap. Whether or not they're going to be the ultimate solutions, I think, is an open question, but I think they're asking the right questions. So that's all that I have for today. I just wanted to thank you for joining me on the Voices of VR podcast. And if you'd like to support the podcast, then please do tell your friends, spread the word, and become a Patreon donor. Just a few dollars a month makes a huge difference. So donate today at Patreon.com slash Voices of VR. Thanks for listening.
