#351: Redirected Touch: Using Perceptual Hacks to Create Convincing Haptics

Redirected walking is a technique within VR that tricks a user into walking in circles while giving them visual feedback that they're walking in a straight line. We tend to trust our visual input over our other senses, and redirected touch uses that same principle of visual dominance to trick our minds into thinking we're touching different objects while only a single passive haptic object is actually there. It can also fool us into thinking that flat surfaces feel like curved surfaces.

Luv Kohli is one of the pioneers of redirected touch, and he wrote his Ph.D. thesis on the topic at the University of North Carolina at Chapel Hill in 2013. I had a chance to catch up with Luv at the IEEE VR conference to learn more about the extent to which we can warp VR spaces without our minds consciously perceiving it, beyond having it temporarily feel weird.


Here’s a video example of redirected touch where the user is touching a square object, but the user perceives that they’re touching a curved object because the visual field is so dominant.

There’s a number of interesting neuroscience of perception articles that I’ve come across over the last week that are related to this concept of hacking the limitations of our perception within VR.

Reading through these articles brings up some fundamental questions about the nature of reality. The takeaway for me is that there are low-level aspects of reality that our brains may be perceiving, but our perceptual system has perhaps evolved over time to only really process and understand the signals that are essential for survival. We create high-level metaphors to comprehend what's happening in the world around us, and that in a sense creates a very compelling illusion about the nature of reality.

Cognitive scientists and neuroscientists have been coming to these conclusions over the past several years, and virtual reality is starting to bring more attention and awareness to the extent of our perceptual limitations. We're building entirely virtual environments that can fool our minds into believing we're being transported into another world, thanks to the magic of presence and what Mel Slater refers to as the place illusion and the plausibility illusion that virtual reality provides.

Overall, VR is bringing some of these deeper philosophical questions about the nature of reality into sharper focus. The truth may be out there, but perhaps our conscious minds don't really need to be bothered by the specifics of these details in order to survive. As virtual reality content creators, though, knowing the limitations of our perceptual capabilities allows us to use techniques like redirected walking and redirected touch to create realities that provide a deeper sense of embodied presence.

Become a Patron! Support the Voices of VR Podcast on Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR podcast. Today I talk to Luv Kohli, who is one of the leading researchers in an area called redirected touch. Now, redirected touch is similar to redirected walking, which I've covered on the podcast back in episode 200, and when I talked to The Void in episodes 284 and 299. So essentially the concept of redirected walking is that our minds can be hacked by VR in the sense that the visual field is the most dominant, and so all the other senses can kind of be overridden by the visual input that you're given. So for example, you could start to walk in a circle and believe that in VR you're walking in a straight line. That's what redirected walking is, essentially. So redirected touch is that you kind of reach out, and instead of grabbing an object, you have one passive haptic feedback object and you're able to pretend like you're touching different objects even though you're touching the same object. So just to help set the context a little bit as to why this is important: haptics within a VR experience can add so much of a sense of presence to your overall experience. Whenever you're connecting what you're seeing and what you're touching within the real world and the virtual reality world, then there's something very deep and primal in your brain that just is completely tricked into believing that what you're experiencing is real. Even though your rational mind may know better, there's a part of you that is really deeply present in the overall experience. We'll be exploring that in today's podcast. This podcast is brought to you by the Virtual World Society. You know, I first did an interview with Tom back in episode 245, and I remember distinctly being at the Seattle VR Conference and asking Tom, you know, what do you think the ultimate potential of VR is? His answer was surprising, and it really stuck out and stuck with me. What he said was that he sees that VR has the ultimate potential to actually get us more connected to our real reality and to appreciate it even more. And I've actually kind of had that experience. This is somebody who's been in VR for over 50 years now, and so I think it's interesting to see that the ultimate potential in his eyes of VR is to get us more connected to what's happening in reality. So that's kind of what he wants to do with the Virtual World Society: to create more experiences that connect the virtual world and exploration of that with things that are actually happening in the real world. So check out virtualworldsociety.org for more information on that. And with that, I did this interview with Luv Kohli at the IEEE VR conference, and it was recommended to me by Jason Jerald to talk to Luv about this idea of redirected touch. I had never heard about it before, and so I just thought I'd talk to the man who helped invent the concept and get more information on it. So with that, let's go ahead and dive right in.

[00:03:11.744] Luv Kohli: My name is Luv Kohli. I graduated from the University of North Carolina at Chapel Hill a few years ago, and I'm currently working for a small medical device and medical software company called InnerOptic Technology. We do needle guidance for surgery and help doctors get needles into tumors accurately to help them treat patients more effectively.

[00:03:29.048] Kent Bye: Great. And so there is this idea of redirected walking, where you're walking and then you kind of change the visual input to your perceptual system to be able to trick you into thinking that you're walking straight. And you also kind of did a variation of that with touch. Maybe you could tell me a bit about that.

[00:03:43.932] Luv Kohli: Yeah, I did. So in a lot of the things that we do with virtual reality, we try to find out what we can get away with and how we can give people the perception that they're in the environment that we want them to experience. And one of the big problems with virtual environments generally is that you can't really give people a sense of touch with objects that don't really exist. So when you get into an environment, you reach out to try to touch something and your hand just ends up going right through it because it's not really there. One of the ways to solve that is what we call passive haptic feedback, where we have a physical object that is registered to the virtual object in your space. It's very effective and compelling because there's a real object that you're actually touching. But the problem is that if you want to change your virtual object to a different shape, then you have to go and change your physical object to match it. So imagine if I had a flat physical table, but I really wanted it to be virtually sloped. The idea behind redirected touching is trying to enable people to feel that they are touching a different shape than they're actually touching in real life, and that's done through visual dominance. Essentially, if someone is moving their hand across a flat physical surface, in the virtual environment in a head-mounted display we would show their hand moving across a slope or across a curved surface, and because your vision tends to dominate your other senses quite strongly, they tend to believe what they see rather than what they feel.

[00:04:59.923] Kent Bye: Yeah, so last year at IEEE VR there was a demo where there was this cylindrical can, and you had this augmented reality window into it where it was tracking your hand. And so as you were touching it, it would show the surface as kind of a parabolic curve. So even though your finger was going along a surface that was straight, your visual input was showing that it was curved, and then you kind of felt like it was actually a curved object. So it seems pretty magical that you're able to do that. So why does that happen?

[00:05:32.268] Luv Kohli: Well, so actually, that was a really cool demo. This is something that's been studied for decades in the psychology realm. People have been putting prism glasses on their eyes for quite some time just to see what would happen and how they would adapt to the different changes in their vision. If you wear glasses, you'll often notice when you get a new prescription that the curvature of the scene that you see is a little bit different, and over time you adapt to it. So people have a tendency to adapt to these things quite well over time as they get used to them. And as you continue to explore the environment through these sort of distorted visual cues, your brain tends to adapt and figure out what it's really supposed to be, based upon the other knowledge that you have. There's a tendency to put more weight on the sensory stimuli that you find more compelling and more reliable. And since vision is used so heavily in our perception, it tends to be the most reliable signal, so when you have a discrepancy between touch and vision, the reliability tends to fall onto the vision, in most cases. It's not always the case; there are some times where haptics wins out as well. But vision tends to dominate.

[00:06:38.119] Kent Bye: Yeah, so there was a poster here that won the 3DUI poster competition that was all about calibration and recalibration when you're touching. And the author of that, Ellie, was telling me about another study where they flipped the world upside down, and then over a couple of days the person basically recalibrated to be able to see the world right side up. So there's this really interesting recalibration process that seems to be happening here. So maybe you can, first of all, kind of talk about what's going on there.

[00:07:09.920] Luv Kohli: So, I know there was someone a while ago who wrote a book on this idea, where he actually wore glasses that flipped his entire vision upside down. I don't remember the details, but it took him a few days to adapt to that, and over time it started to feel normal to him. Then when he took the glasses off, everything felt flipped again, and this is sort of a negative aftereffect. I know decades ago this was also studied by the psychologist Gibson, who found a negative aftereffect in touch when straight lines are made to look like curves: when you take the glasses off, you get the opposite effect in the other direction. And so it really comes down to all this adaptation. You start to get used to it, you have a sense of what the world is supposed to be like, and you adapt to it and react in a way that's consistent.

[00:07:51.410] Kent Bye: So what are the limits of this redirected touch then? I mean, is there a process of recalibration where the effect wears off over time?

[00:07:59.658] Luv Kohli: I haven't noticed that. So in the studies that I did for my dissertation work, I was able to have people experience shapes that were off by, you know, a few inches. I had a square sitting in front of them and rotated it, and I could rotate it about, you know, 18 degrees, and they wouldn't notice the difference after they spent a couple of minutes exploring it and adapting to it. The initial reaction when someone encounters one of these discrepant environments, where the real and virtual don't match, is that it's a very strange feeling, and they can't really interact with it effectively in a way that lets them do some task with any good performance. But after a couple of minutes exploring it, they get used to that feeling and they can use it, and then when they come back out of the environment, the inverse happens: it starts to feel strange again, and then they readapt to the real world. I don't know the specifics of what's going on. There's still a lot of research work that needs to go beyond that, but certainly people can adapt quickly and then readapt to the real world quickly.

[00:08:56.511] Kent Bye: Yeah, so Jason Jerald was telling me that you were able to do this thing where someone would reach in and touch a single object, but visually they were seeing, like, maybe two or three objects, and then reach back and then reach again and touch a different object, but yet it was the same object through redirected touch. So maybe you could talk a bit about, like, how are you able to use a single object but trick someone into thinking it's two or three objects?

[00:09:18.535] Luv Kohli: So the way the technique works was basically by creating a warp field. We were warping the virtual space such that as you move your hands through the real world, the mapping to the virtual world movement is not one-to-one. So, for example, a straight motion through the real world would result in a curved motion in the virtual world. And as you're moving forward to touch a single object and you pull your hand back, the mapping between the real world and virtual world is then changed. Then as you move forward again, it's mapped such that your virtual motion might be on a diagonal, so it looks like you are touching the left object instead of the one that's directly in front of you, even though your hand is moving directly straight. Some of this was done more recently by other colleagues, I think at Microsoft, and they took it further: they were able to stack blocks on each other by touching different objects, but it was always the same physical object. Earlier I had worked on something related to that as well. I wanted to make a whack-a-mole-type game where you could touch a bunch of virtual objects but just use a single physical object.
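
As a concrete illustration, here is a minimal sketch of one way such a warp could be set up, assuming a simple blend in which the virtual hand is gradually offset toward the currently selected virtual target as the real hand approaches the single physical prop. The geometry, names, and blending function are hypothetical; this is not the actual implementation from the dissertation or the Microsoft work.

```python
import numpy as np

PHYSICAL_PROP = np.array([0.0, 0.0, 0.5])        # the one real object (metres)
VIRTUAL_TARGETS = [np.array([-0.2, 0.0, 0.5]),   # the virtual objects the user
                   np.array([ 0.0, 0.0, 0.5]),   # believes they are touching
                   np.array([ 0.2, 0.0, 0.5])]
REACH_START = np.array([0.0, -0.3, 0.0])         # where each reach begins

def warp_hand(real_hand: np.ndarray, target_index: int) -> np.ndarray:
    """Return the position at which to render the virtual hand.

    The offset between the physical prop and the chosen virtual target is
    blended in with reach progress, so the virtual hand arrives at the
    virtual target exactly when the real hand arrives at the real prop."""
    target = VIRTUAL_TARGETS[target_index]
    total = np.linalg.norm(PHYSICAL_PROP - REACH_START)
    remaining = np.linalg.norm(PHYSICAL_PROP - real_hand)
    progress = np.clip(1.0 - remaining / total, 0.0, 1.0)
    return real_hand + progress * (target - PHYSICAL_PROP)

# Each time the hand is pulled back, a different target_index can be chosen
# and the mapping re-applied, so one prop stands in for every virtual object.
```

The point of blending by reach progress is simply that the injected offset grows smoothly over the course of the reach rather than appearing all at once.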

[00:10:21.800] Kent Bye: So what are some of the practical applications of this redirected touch then?

[00:10:25.783] Luv Kohli: So the original application behind this was aircraft cockpit procedures training. The idea being that when you're training for different aircraft, a lot of times you want to have something that's reconfigurable. Creating a full flight simulator for a lot of different aircraft takes a lot of space, a lot of expertise, and a lot of money, and we were looking into how we could enable designers of these simulations to have a simple physical setup with a head-mounted display and map a different cockpit onto that setup, many different cockpits on the same setup. Especially for cockpit procedures training you don't necessarily need precise reach for ergonomics purposes, and ergonomics is really a separate question, but knowing where the different buttons are and being able to interact with them would be a useful thing for cockpit procedures training.

[00:11:12.097] Kent Bye: As I'm trying to think of a metaphor for what you're doing here, it's almost like you're splitting and ripping and stretching and squeezing space-time in a lot of ways. You have these different distortion fields that are being warped, and I'm just wondering how many of those you can do and still have someone adapt and calibrate to it. I could see one pretty simply and easily, but once you have a whole field of these different places where you're starting to tweak the direction of someone's touch, is someone able to adapt to that, or is there a limit to how many times you could do that in a scene?

[00:11:44.782] Luv Kohli: That's a really good question. I don't know the answer to that. One of the things I was interested in looking at is how distorted could the field be and could we distort the field in a way that we could map several different objects simultaneously. And what I found in some of my limited exploration of this idea was that I could distort the field for multiple objects, but I couldn't make it such that there are these huge changes in inflection in the warp field. Because as you're moving around, it's very strange if your hand doesn't move in a consistent way. So if it's going forward and then suddenly it starts going the other direction, even though you feel like you're moving your hand straight, it can be a little bit disconcerting. But as long as it's somewhat of a smooth curve through the space, you can get away with a fair amount of things.
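
One conventional way to keep such a warp smooth (an illustrative choice, not necessarily what was used in these studies) is to ease the blend weight with a C1-continuous function like smoothstep, so the rendered hand never changes direction abruptly:

```python
def smoothstep(t: float) -> float:
    """C1-continuous ease: zero slope at both ends, monotonic in between."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

# Using smoothstep(progress) instead of the raw reach progress in a blend like
# the earlier sketch ramps the offset in and out gently, avoiding the sudden
# inflections in hand motion that the interview describes as disconcerting.
```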

[00:12:25.646] Kent Bye: Now, in redirected walking, they usually measure the minimum distance that you need to be able to have this effect of redirected walking without really necessarily noticing. And I'm curious if there's a way that you are measuring the limits of the range of how much you can distort one of these fields.

[00:12:43.758] Luv Kohli: So I haven't measured that exhaustively. The original study that I worked on, I was looking at that to try to find out approximately how much people would be able to experience with this. I think a lot of it depends on how much time they spend adapting to it. Initially, you can get away with a couple of inches pretty easily. People won't notice. But for the larger distortions, you definitely need time to adapt to it because the large discrepancies are a little bit strange feeling at first. But people do tend to adapt over time. There's a lot of research that needs to go into figuring out these different parameters because there's a lot of different kinds of distortions that you can make. It's not just a translational distortion, but you can have rotations, you can have scales in all different directions when you're talking in 3D, and so different kinds of distortions may work better than other ones.
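
Those families of distortion (translation, rotation, and scaling in 3D) can all be written as a single homogeneous transform applied to the tracked hand before rendering. The sketch below composes one with illustrative parameter values; it is just a way to make the vocabulary concrete, not code from the studies.

```python
import numpy as np

def warp_matrix(translation, angle_deg, scale):
    """Compose a uniform scale, a rotation about the vertical (y) axis, and a
    translation into one 4x4 transform for the tracked hand position."""
    a = np.radians(angle_deg)
    rot = np.array([[ np.cos(a), 0.0, np.sin(a), 0.0],
                    [ 0.0,       1.0, 0.0,       0.0],
                    [-np.sin(a), 0.0, np.cos(a), 0.0],
                    [ 0.0,       0.0, 0.0,       1.0]])
    scl = np.diag([scale, scale, scale, 1.0])
    trn = np.eye(4)
    trn[:3, 3] = translation
    return trn @ rot @ scl

def apply_warp(matrix, point):
    """Apply the homogeneous warp to a 3D point."""
    p = np.append(np.asarray(point, dtype=float), 1.0)
    return (matrix @ p)[:3]

# e.g. a 5 cm sideways shift plus an 18-degree rotation (the magnitude
# mentioned above) with no scaling:
M = warp_matrix([0.05, 0.0, 0.0], 18.0, 1.0)
print(apply_warp(M, [0.0, 0.0, 0.5]))
```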

[00:13:31.070] Kent Bye: And I know last year at IEEE VR there was some discussion about the cognitive load implications of redirected walking. In other words, like when you start to trick someone's mind into thinking that they're walking straight but they're actually walking in a curve, that there's additional recalibration, cognitive processing that has to happen. And I'm curious if there's a similar analog process that would be happening here with redirected touch.

[00:13:53.045] Luv Kohli: Yeah, it's a really good question, and it's an important one, because if you want to have a system in which redirected touching is used for something that requires task performance, you don't want that performance to degrade. And if there's an extra cognitive load on someone, they may not be able to focus on the task they're actually doing. So the last study I did for my dissertation was to evaluate task performance in that setting. And what I learned is that after adaptation, people are able to do motor tasks, Fitts' law style motor tasks, no worse than they could in a one-to-one mapped environment. Now, that was a very specific task that involved pushing sequences of buttons, geared towards cockpit procedures training. But the hope is that after adaptation, people are able to use a distorted environment just as well as they can a one-to-one environment.
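
For background, motor tasks like this are conventionally scored against Fitts' law. In its common Shannon formulation (standard HCI material, not something specific to this dissertation), the predicted movement time MT to a target at distance D with width W is

$$ MT = a + b \,\log_2\!\left(\frac{D}{W} + 1\right) $$

where a and b are empirically fitted constants. Comparable movement times at a given index of difficulty in the warped and one-to-one conditions is one way to read "no worse than a one-to-one mapped environment" here.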

[00:14:38.115] Kent Bye: And are there any implications of going into a virtual world that, you know, they're building up these muscle memories to be able to switch these knobs, but yet when they get in the actual cockpit, they're not going to be having that same proprioceptive muscle memory of where exactly to turn the knob. Do you find that the visual field still dominates, or is there any impact of doing a training in a virtual space that's not exactly the same as a real space?

[00:15:03.179] Luv Kohli: It's a really good question too, and it's also a real concern, because after doing the study, when people had adapted to that distorted space, they would come out and try to touch the one-to-one space again and it would feel strange, and then they would re-adapt very quickly. So the question is, if you're doing this for an ergonomics-type situation, I think it would not necessarily be appropriate, because you don't want people to believe that they can reach something when in fact in the real world they really can't. But for things where you want to remember where things are spatially, I think having the haptics there lets you interact with the environment more effectively but also lets you remember where things are, and when you come back to the real world, you'll probably very quickly readjust to the actual distances.

[00:15:44.940] Kent Bye: So are you applying some of this research to your current position? Not at the moment.

[00:15:49.858] Luv Kohli: We have some perceptual tricks that we have to do behind the scenes. I'm not currently doing anything with touch in the medical device space, but it's something that we've considered in some scenarios. Some of the stuff I can't talk about right now, but yeah.

[00:16:01.743] Kent Bye: So what are the other big questions that you see are kind of left to be researched still with redirected touch?

[00:16:08.686] Luv Kohli: Well, so we're always looking at ways to explore large environments, and so redirected walking is a big way to do that. And it's being studied in the research community and the gaming community a lot. But haptics is still, it's still a really hard problem. And I think enabling people to explore a large environment and also be able to touch the objects in a large environment would be really important. So finding a way to combine redirected walking with redirected touching I think would be really interesting. Now, it's a complicated problem because if you're using redirected walking and you rotate the space, the mapping between your physical objects and virtual objects no longer exists. And so you have to find a way to remap those things. So finding a way to seamlessly integrate those two together I think would be really interesting.
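
To make that remapping problem concrete: redirected walking gradually rotates the virtual world relative to the physical room, so a fixed physical prop no longer lines up with its virtual counterpart. Below is a rough sketch of that bookkeeping, under the simplifying assumption that the injected redirection is a pure rotation about the user's current position (illustrative only).

```python
import numpy as np

def rw_rotation(angle_deg: float) -> np.ndarray:
    """Accumulated redirected-walking rotation about the vertical axis
    (a top-down 2D view is enough to show the idea)."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

def virtual_prop_position(physical_prop_xz, user_xz, accumulated_deg):
    """Where the prop's virtual counterpart ends up after redirected walking
    has rotated the virtual world by `accumulated_deg` about the user; the
    touch warp field would have to close this gap as the hand reaches out."""
    R = rw_rotation(accumulated_deg)
    return user_xz + R @ (physical_prop_xz - user_xz)

# Example: after 15 degrees of injected rotation, a prop half a metre in
# front of the user maps to a noticeably different virtual location.
user = np.array([0.0, 0.0])
prop = np.array([0.0, 0.5])
print(virtual_prop_position(prop, user, 15.0))
```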

[00:16:50.110] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:16:57.670] Luv Kohli: That's a really good question and a tough one. Certainly gaming and entertainment is a huge one. What I'm actually most excited about is how virtual reality can be used for productivity and for the medical space. One of the things is that, at least in the company I'm working for, we use tracking technology to help guide surgeons to put needles into tumors accurately so that they can treat patients better, and my hope is that as the technology progresses over time, we can use virtual reality for a lot of those kinds of applications effectively.

[00:17:29.392] Kent Bye: Great, well thank you so much. Thank you. So that was Luv Kohli, one of the pioneers of the technique of redirected touch. And so I've had an experience at The Void where I was putting my hand on a curved surface and walking forward, and it looked like a straight wall in the VR experience. I had no idea that it was actually a curved wall, and so it's pretty amazing that we can have this disconnect between what we're actually sensing and what we're seeing. And there have been all sorts of really interesting articles coming up over the last couple of weeks that I've been reading about this whole concept of what is reality and how much of it is actually constructed by our perception and our subjectivity. It really starts to blur the lines between objectivity and subjectivity when you start to dig into how much of the world is essentially being hidden from us because we're just working with metaphors that allow us to continue to survive. Just to point out some of the articles I was checking out: there's an interview with Donald Hoffman, who's a professor of cognitive science, talking about this idea that much of our reality is actually constructed through our perceptions. There's a couple of researchers, Michael Herzog and Frank Scharnowski, who have been looking into how our consciousness may actually come in 400-millisecond chunks, which essentially means that 2.5 frames per second is a digestible moment when our conscious mind is processing all this unconscious information and trying to sync it up and make it match. And I'm sure there are different levels of latency that you need for each of those. Obviously, VR is pegged at 20 milliseconds in terms of the visual field; otherwise, we start to notice it. But overall, in real reality, they've kind of put it at this 400-millisecond processing speed for each moment of conscious thought that's integrating all these unconscious signals that are coming in. And then finally, Beau Cronin pointed me towards this article by David Eagleman, who is talking about brain time, which is essentially covering the same thing of how our reality is being constructed within our minds, and trying to explore different dimensions of time contraction, time dilation, and time perception, more from a non-VR perspective, but general cognitive research into the way that the mind processes time. So there have been all these different topics and articles coming out recently discussing this concept of our reality not being exactly what we see and perceive; the actual reality is something different. A great quote that Beau Cronin shared from that interview with Donald Hoffman was that you can kind of imagine our perception of reality as being like the screen or window of an operating system: we don't actually know what's happening within the innards and the guts of the computer in order for all these bytes to be delivered and put together. We can't understand that, but we can understand the metaphor of a file folder on a screen, and we can move it around and open it. We have these high-level metaphors to describe reality, and we don't necessarily need to know about all the low-level things that are actually happening at the quantum level, because if we did, it wouldn't actually be useful for preventing us from getting killed by the tiger.
So anyway, there are a lot of really interesting thoughts and ideas about perception and the nature of reality that virtual reality is starting to point out. And the more we dig into the way that the mind is processing our perceptual input, perhaps there isn't so much of a difference between reality and virtual reality from a neuroscience perspective, in terms of what's being stimulated in our minds. Perhaps we're discovering all sorts of limitations of our perception through this endeavor of looking into VR, and redirected touch is one of those kinds of perceptual hacks that we're able to do. The long-term implications of all this are kind of unknown: you know, what does it do to your body if you do all these hacks of redirected walking and redirected touch over and over and over again? I don't know. I think there is a short-term impact in terms of cognitive load, but I don't know if there's any long-term impact. So that's just something to think about. So with that, thanks for sticking around to geek out a little bit about perceptual illusions and the nature of reality here on the Voices of VR. I hope you've been enjoying this series, and if you have, please do consider becoming a contributor to the Patreon at patreon.com slash Voices of VR.
