#485: Mary Whitton on Fundamental Research on VR Locomotion and Presence

Dr. Mary Whitton has been working with interactive computer graphics since 1978 and virtual reality since 1994 in collaboration with Dr. Fred Brooks at the University of North Carolina at Chapel Hill. I had a chance to talk with Mary back in 2015 at IEEE VR in France about her fundamental research into VR locomotion, haptics, and cultivating a sense of presence.

There have been a lot of military grants over the years to research the impact of haptics, spatialized audio, latency, and field of view on cultivating presence within VR. She's also investigated a number of different issues around walking-in-place locomotion techniques, passive haptics & redirected touch, as well as the nuances of Mel Slater's presence theory with the place illusion and plausibility illusion.

LISTEN TO THE VOICES OF VR PODCAST

In 2015, the IEEE Visualization and Graphics Technical Committee (VGTC) awarded a technical achievement award to Oculus’ Brendan Iribe, Michael Antonov, and Palmer Luckey. At the end of this interview, Mary Whitton made a call out to the role that Mark Bolas and USC ICT may have played within the creation of the Oculus Rift.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR podcast. So on today's episode, I feature an interview that I did with Mary Whitton back in 2015 at the IEEE VR conference. And Mary has been working in interactive graphics since 1978 and virtual reality since 1994, and has been working at the University of North Carolina at Chapel Hill on a lot of different fundamental questions about presence and locomotion. So we'll be covering a lot of the biggest insights from her research into locomotion and presence on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Voices of VR Patreon campaign. The Voices of VR podcast started as a passion project, but now it's my livelihood. And so if you're enjoying the content on the Voices of VR podcast, then please consider it a service to you and the wider community and send me a tip. Just a couple of dollars a month makes a huge difference, especially if everybody contributes. So donate today at patreon.com slash Voices of VR. So this interview with Mary Whitton happened at the IEEE VR conference that ran from March 23rd to 27th, all the way back in 2015 in Arles, France. And this was right after the Game Developers Conference, and the Vive had just been announced for the first time. And it had also just been announced at that point that Oculus had won an IEEE Technical Achievement Award for their work with the Oculus Rift. So we'll be talking a little bit about that at the end of this interview as well. So with that, let's go ahead and dive right in.

[00:01:54.128] Mary Whitton: My name is Mary Whitton. I'm from the University of North Carolina at Chapel Hill. I've been involved in interactive graphics since 1978 and probably formally what you would call VR since about 1994. That's pretty early days, and Carolina was one of the very early places that did a lot of virtual reality, doing systems work on head tracking, haptic displays, big rendering engines, precursors to the game engines. So we had capabilities that other people didn't have early, which made it possible for us to make contributions early.

[00:02:30.241] Kent Bye: I see. And so have you looked at the issues such as latency or, you know, what are the type of things that you looked at in terms of what creates a really good virtual reality experience?

[00:02:39.755] Mary Whitton: I have been collaborating with Dr. Fred Brooks since the late 1990s, and we did a series of studies and research projects on what makes people feel like virtual reality is real. So one of the things we did was, does it matter if you have haptic things to feel, such as with your feet? Can we make you feel scared? That was the original Carolina version of the pit that had first been shown by Mel Slater. Does latency matter? Which it certainly does. Does the resolution of the screen matter? Does the field of view matter? So we systematically worked through a series of the technical and hardware aspects and the presentation of the sensory information to see what values were needed for you to have a good sense of presence in an environment.

[00:03:26.445] Kent Bye: And so what do you say the threshold is in terms of latency, in terms of creating the optimal VR experience?

[00:03:33.328] Mary Whitton: At the time we did our latency study, we compared 50 milliseconds and 90 milliseconds and found significantly higher levels of presence at 50. Then we had a student who really looked at, if you were trained, how much can you perceive? And you can perceive down to 4 milliseconds. So if you can get below 40, you're doing pretty well. And most people are pretty comfortable at those levels.

[00:03:55.197] Kent Bye: I see. How is the resolution and the field of view all tied together in terms of creating a sense of immersion?

[00:04:03.375] Mary Whitton: Those things are, of course, application dependent. I think one of the more interesting things people are studying now is the effect of the peripheral vision. If you've got a narrow field of view, what is it that you're actually getting from the periphery? Does it need to be good cues? Does it just need to be something out there? We haven't studied that so much recently, so I don't have a really good opinion on that. The other thing I've actually studied is locomotion, if you want to ask me about that.

[00:04:27.381] Kent Bye: Yeah, and so I guess, you know, one of the things about the field of view is that there's theories in terms of motion sickness and when you're moving and, you know, what is the connection between the field of view and locomotion then?

[00:04:38.455] Mary Whitton: Probably with locomotion, if you've got a self-avatar, it really helps to be able to see your feet. The vertical field of view is an under-appreciated aspect of field of view, I think, because your natural field of view is tilted down so that you do see your feet, or at least sort of see the ground so you don't trip. So you have an advantage if you have that. That's something we've not ever studied together. OK.

[00:05:05.833] Kent Bye: So yeah, in terms of locomotion in VR, it seems like one of the biggest open problems, and maybe you could give me an overview of what the biggest sort of open problems with locomotion are.

[00:05:15.619] Mary Whitton: I think they all come down to how you are going to move around in a way that gives you the proper visual effect. If you look at what your head does when you're moving, your head bobs up and down, your head bobs side to side, your head bobs front to back. And visually you don't see that, but in fact, if you're tracking the head motion and rendering based on your head position, then the scene will move all around. That's not good. What you want to do is to have a technique that will interpret whatever signals you're giving the system, that you want to move forward, you want to move sideways, and give you the proper optical flow for that. That's really hard because if you're walking in place, you've got proprioception in your feet, but you're not going forward, and you've got to map, say, your knee motion to a forward motion. If we really walk around, you can't walk around in a space any bigger than your tracked space. So there are all sorts of techniques for that, whether it's redirected walking, which rotates you very gently to keep you in the space, but there are not really many other things that let you really walk. And it's been shown that with real walking you have better performance and just a better experience.
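The redirected walking idea Whitton mentions, gently rotating the virtual scene so that walking "straight" keeps the user inside the tracked space, can be sketched in a few lines. Everything below is an illustrative assumption rather than UNC's actual algorithm: the function names, the steer-toward-room-center rule, and the 1.5 degrees-per-second gain cap are all made up for the sketch.

```python
import math

# Maximum extra scene rotation per second, kept small so the user
# does not consciously perceive it. Illustrative value, not a
# published perceptual threshold.
MAX_GAIN_DEG_PER_SEC = 1.5

def redirect(user_pos, user_heading_deg, room_center, dt):
    """Return the extra scene rotation (degrees) to apply this frame
    so the user's real path curves back toward the room center."""
    # Bearing from the user to the center of the tracked space.
    to_center = math.degrees(math.atan2(room_center[1] - user_pos[1],
                                        room_center[0] - user_pos[0]))
    # Signed angular error in [-180, 180) between heading and that bearing.
    error = (to_center - user_heading_deg + 180.0) % 360.0 - 180.0
    # Proportional steering, capped below the (assumed) detection rate.
    rate = max(-MAX_GAIN_DEG_PER_SEC, min(MAX_GAIN_DEG_PER_SEC, error * 0.1))
    return rate * dt
```

Each frame the renderer would add this small rotation to the world transform; over many steps the user walks a gentle arc in the real room while perceiving a straight virtual path.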

[00:06:23.944] Kent Bye: Well, it seems like the Oculus Rift, they have best practice guides in terms of designing virtual reality experiences. And I think a lot of immersive 3D games that in a 2D screen, they would implement a head bob. But then when they would try to implement that in VR, it would actually induce motion sickness. And so maybe just talk about that. Whenever you're sort of moving around the camera more than what's actually happening in the body, it seems like trying to simulate that walking motion. But when you try to do it, it actually increases nausea.

[00:06:54.187] Mary Whitton: It does increase nausea, because when we're running, we don't necessarily perceive the world jumping all around. That's probably done with a gamepad, so you're probably moving with a joystick, which means that your acceleration and your deceleration are probably not natural. The fact that you can strafe, well, nobody sidesteps. What is the mapping between a hard right with a joystick? What do you do with the optical flow? If you're going to turn a corner, then it's going to be smooth. It's not going to be a step function to move to one side or the other. So the mapping between the inputs and something that's natural in terms of the optical flow of your eyes is a very hard problem. The easy solutions are not very natural.

[00:07:40.875] Kent Bye: And so in terms of researching locomotion in VR, what are some of the studies or things that you were actually looking at?

[00:07:46.665] Mary Whitton: We have looked at ways to, when you're walking in place, how can I make it so that I have a smooth start, so that there's not a lot of lag between lifting my leg the first time and starting to move through the scene. Stopping is very hard because there is latency in the system; once you get going, you don't notice it, but when I stop, I want to stop. I want the visuals to stop at the same time. And the smoother the front end is, the longer it's going to take to stop. So the balancing of those two things in your algorithms is one of the very hard things. And also just simply getting a signal into the system that's reliable enough to recognize your steps and to recognize the speed of your steps, and then mapping from the speed of your steps to the length of your steps, which would then be your speed through the environment. So there's this mapping, again, of the inputs to an appropriate motion of the camera that is really a very tricky bit.
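The smooth-start versus fast-stop tension she describes shows up in even the simplest filter. Below is a toy sketch, assuming step detection has already happened upstream; the class name, the fixed step-length mapping, and the smoothing constant are all illustrative assumptions, not the UNC walking-in-place system.

```python
class WIPSpeed:
    """Toy walking-in-place speed estimator: maps detected step cadence
    to a smoothed virtual forward speed."""

    def __init__(self, alpha=0.3, step_length=0.7):
        self.alpha = alpha              # smoothing factor: lower = smoother, laggier
        self.step_length = step_length  # assumed metres per step
        self.speed = 0.0                # current virtual forward speed (m/s)

    def on_frame(self, steps_per_sec):
        """Update speed from this frame's step cadence.
        steps_per_sec == 0 means no step signal (user may be stopping)."""
        target = steps_per_sec * self.step_length
        # Exponential smoothing suppresses jitter in the step signal at
        # start-up, but the same filter delays the drop to zero on stop:
        # exactly the trade-off described in the interview.
        self.speed += self.alpha * (target - self.speed)
        return self.speed
```

With a small `alpha` the virtual motion ramps up gently, but after the last step the speed decays over several frames instead of stopping dead, so tuning that one constant trades start smoothness against stopping lag.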

[00:08:48.569] Kent Bye: And would you say that when you're locomoting through a scene, if you have a consistent velocity then it's okay, but if you're accelerating and decelerating, then that is when your body would be expecting to feel more within the vestibular reflexes?

[00:09:05.487] Mary Whitton: It's interesting because if you look at normal walking, let's just discount the bob, there is a time when you're ramping up to a steady speed and then ramping down, and there are very well-known characteristics of the velocity curves for those things. Once you get going, you could imagine varying the speed, as long as you didn't do it any faster than you can do it in real life. I think that's the issue: if you're trying to be realistic, you want to keep the parameters of the motion of the camera within the bounds of the parameters of the motion of the human body. So you've got the physical aspects, the physics of the body, to consider as well as the movement through the scene.

[00:09:46.906] Kent Bye: And do you see that that's kind of bounded by whether or not you're in a virtual reality car and you're sort of accelerating in a car much quicker than you would versus when you're just walking?

[00:09:58.600] Mary Whitton: VR for vehicles is much easier because you're modeling a physical system, and you know the characteristics of the velocity and the motion of the physical system. So the car will accelerate so fast, you can accelerate so fast. We don't know as much about humans. I mean, the people who study body motion know a great deal about what we can do. And we violate that in VR, and I think particularly in games, but it sort of doesn't matter because the users get used to it, you adapt to it, and then it just becomes, okay, this is an environment where I can go fast, I can change direction, go sideways. That's part of the real problem, is that we do adapt to non-natural things, and then you just do it.

[00:10:41.601] Kent Bye: And I guess one of the things that they've found in some of the recent implementations of virtual reality is that if you're in a cockpit and have a sort of a steady frame of reference, then that makes it a little bit easier to kind of navigate or locomote through environments. Is that something that the academic community has looked at in terms of whether or not doing a first person perspective with no cockpit or vehicle or versus, you know, having that steady frame of reference as you're moving and locomoting through a scene?

[00:11:08.357] Mary Whitton: We've actually focused not on vehicles. So I know that that gives you a better frame of reference. We've done a lot of training work, and we were funded by the military for some of it. And it's all about situational awareness. So field of view, so you can catch things in your peripheral vision. Certainly being able to turn your head and look in other directions, which means that your view direction has to be independent of your velocity vector. And then in the military, their weapon is pointing in yet another direction, so they need their hands free to establish the position of their weapon independently of where their eyes are looking. So the number of factors that have to be independent to get the full experience is large, and that awareness is important.

[00:11:55.966] Kent Bye: I see, yeah. And so, since you've been doing virtual reality for a long time, it seems like the funding coming from the military, maybe you could talk a little bit about, you know, some of the questions or applications or use cases that they had in terms of where your research was going and feeding into.

[00:12:11.131] Mary Whitton: We were their locomotion arm. Well, they were first interested in these factors that made up the fidelity of the experience. So, did the audio need to be full 3D, or would stereo audio be good enough? Field of view, having some haptics in there. So they were interested in fidelity, and then sort of the second chunk of that was working on locomotion, because they knew that you can't easily do multiple people, squad-level work, in a single space when they're flinging rifles around. So where do you put a team of people? So multi-person VR, total immersion, everybody tracked, their weapons tracked, their heads tracked, how do you do that? So currently they do it with independent trackers and head mounts. And they put them in a big room and line them up three by three and do their training.

[00:13:01.778] Kent Bye: I see. And what kind of scenarios are they doing in sort of like these collaborative multi-user environments then?

[00:13:07.762] Mary Whitton: Just squad training, taking a building, moving across an open space, and hiding between things.

[00:13:14.527] Kent Bye: I see. And in terms of haptics, how were haptics kind of playing into the virtual reality experience in terms of creating a sense of immersion and importance of that?

[00:13:22.828] Mary Whitton: Very early on, we did an interesting thing with so-called passive haptics, where we took styrofoam blocks and made a model of one of the real spaces that we had. It happened to be a kitchen. And the idea was that you could walk around in this virtual space wearing a head mount with very good tracking, and reach your hand out and touch a counter. I mean, it's just remarkable the first time you reach out and your hand stops. So that was just spectacular, realizing how important that was. I mean, much earlier than that, haptics had been used in doing molecular modeling and docking for drug investigation. So that's a very important part. We went on, and with our experiment where we put people in the stressful environment of looking down into a pit, we actually made a plywood ledge that was up about an inch and a half off the floor, and they could walk out to the end and feel the end of that ledge with their toes. And that led to a much higher level of presence, as indicated by an increase in heart rate, than without the ledge. So it's things like that that will make it seem more real. Now, having real objects in the environment, if you've got to model it, you might as well model it and have people walk around in the real thing. So there's a real question about the utility of that. But that led then to work on redirected touch, where we said, well, your eyes will dominate your other senses. Can I put a flat board out there and draw it so it's curved, and move my finger into it, and when I touch it, I'm touching the curve, and I can move my finger back and forth visually across the curve, but in fact, across this flat thing, will it feel curved to me? And up to a certain level, it will. So we take advantage of visual dominance over the haptic sense to allow a prop to be not just one thing, but to be multiple things. Is it a flat dashboard cockpit? Is it a curved cockpit? So you could imagine that that would be useful in terms of design work.

[00:15:20.013] Kent Bye: I see. Yeah, it sounds like you can be a little sloppy, in that the passive haptics don't have to match one-to-one exactly. And so, did you create sort of universal passive haptic objects that would be able to be interpreted as a wide variety of different objects within VR?

[00:15:35.841] Mary Whitton: Well, that was the goal, and we had one Ph.D. in it, and he graduated and is off doing something else, and no one has taken up that work that I know of.

[00:15:44.504] Kent Bye: And so, yeah, maybe tell me a bit more about what he was able to find in terms of, you know, what did he create, and then what's the range of different objects you could do?

[00:15:53.816] Mary Whitton: He was actually only working with curves, with a flat board, and he was looking at the angle of it pivoted directly in front of you, and how flat did it feel, and he was looking at how crooked he could make the board look and still be flat, etc. And I don't remember the angles that he was able to achieve. His name was Luv Kohli.

[00:16:15.436] Kent Bye: And so being involved in virtual reality for a long time, I'm curious, you know, from your own personal experience of what were some of the most immersive virtual reality experiences that you've ever experienced?

[00:16:26.584] Mary Whitton: Wow. Still, when you're wearing shutter glasses and something comes flying out of the screen at you, I always dodge. And that's wonderful. It's kind of really exciting. I am one of those people who nauseates easily, so I don't stay a long time in most virtual realities. I was actually in one Sunday a week ago that was in a room with projectors all around the walls. And if I'm not driving, I have to leave. So I'm not a good person to ask about them. Sorry. I'm trying to, you know, that's interesting though. Sometimes it comes down to the beauty of the modeling. And we have a model that was of an art exhibit of Oriental art at a museum at Chapel Hill. And just the artifacts were beautiful, and it was modeled with a laser scanner. So it's extremely intricate. And experiencing that art museum virtually was really quite a phenomenal experience. I saw it real and then I saw it virtual. It was really neat that you could have a wonderful, pleasing art experience just completely virtually.

[00:17:34.424] Kent Bye: Have you looked at the concept of presence in VR? How do you describe it?

[00:17:40.600] Mary Whitton: We subscribe to more of the Slater view of presence, which is the level of immersion, and we define the level of immersion as the quality and naturalness of the sensory inputs that are computed and displayed to the individual, whether that be sound or haptics or visuals. Other people put the engagement element in there, and we do not. As a system developer, it's important that I have a word that means all of these hardware and algorithmic things. And for us, that's immersion and that's presence. Sort of accidentally, we got into this with the pit environment. When we first did it, we were really evaluating locomotion techniques. But it was very, very effective. And we said, oh, what was it about this system that made it effective? And that led to that entire series of studies that we did on elements of the system and what were the values that we needed in refresh, et cetera, to make a compelling experience. So presence is something that's in your head, so you can't measure it directly. And one of the first things we did was say, okay, can we correlate people's subjective reports of presence to the objective measure of heart rate or other physiological measures? So we did that and got a very good correlation with the change between heart rate when they entered a stressful environment and heart rate when they were not in a stressful environment. So we used heart rate as the objective measure of presence. You can also measure that while somebody's doing it, rather than having them report it to you after they finish. And we used that as our standard for presence through that whole series of experiments. In fact, people are still looking at it. Several years ago, Mel Slater posited that we have place illusion and plausibility illusion. And place illusion is what is typically known as presence. But again, it's, do I feel like I'm in that place? Which means, does it seem like that place? Does it sound like that place? Does it look like that place?
And then plausibility is, am I interacting with it? And if I've got shadows, are the shadows moving with my hand? That's plausible. Is the scenario plausible? Are things floating in air that are supposed to be real? Well, that's not plausible. So how do we understand those two things? So we've been looking at presence and place illusion for a long time. And there's the theory on Pi and Psi, which is what Mel calls place illusion and plausibility illusion. We're trying to understand those things, and I actually have a student who's working in that area now.

[00:20:11.158] Kent Bye: I see, yeah. And it seems like with the plausibility illusion, input controls would be, you know, having some way to actually engage and impact and have agency with the environment. And so is that something also that you're looking at, in terms of that coherence of the plausibility illusion?

[00:20:25.323] Mary Whitton: We have not in the work we've done, but the ability for you to interact with the environment and move through it is actually part of the place illusion. I am here, and if I'm here, then I'll be able to move around in it; it's about the sensorimotor contingencies. And then plausibility is, again, it's the coherence, congruence. Is it right? So the interaction is more, I have the ability to do this naturally, and can I do that in the environment? It's a very confusing pair of concepts.

[00:20:55.383] Kent Bye: Yeah, and it seems like once the coherence breaks down, it's almost like I've heard people describe it in terms of like a house of cards, like once you break that presence, then it's hard to get it back. And so have you looked at the ephemeral nature of it once you have a break in it and then the impact of the presence afterwards?

[00:21:12.528] Mary Whitton: It's interesting because we think about, there's been a lot of talk about breaks in presence. So if you have a glitch in your tracking and the world freezes, and if you have a glitch in plausibility, and how fast can you recover from one or the other? Which one's going to bother you more? I think the place illusion ones, the ones that are related to the system, that sort of we've got this inherent ability to say, oh, it's just a system. But if it's not plausible, then you start distrusting everything. And a break in plausibility, you could imagine, could take much longer to recover from. It's an open area. Nobody's kind of worked in that.

[00:21:46.979] Kent Bye: Yeah, and in terms of the physiological things that you can measure, are there other things like galvanic skin response? And what are some of the other things that you can actually physically measure to get a sense of someone's presence?

[00:21:57.715] Mary Whitton: Galvanic skin response, sweat, measures reasonably quickly and reasonably reliably. Skin temperature changes, but it changes very slowly, so it is less useful unless you're going to have somebody with a long exposure. We've never done any work with the brain wave caps. fMRI takes machinery, and you can't walk around, and we're all about walking around at Carolina. So you need something that you can wear that is not obtrusive and doesn't compromise your ability to interact, your ability to walk around. Part of the problem is getting the measurement sensors.

[00:22:34.051] Kent Bye: Yeah, and in terms of motion sickness in VR, it seems like this has been something that has been an issue for a long time, and I'm curious if you have any insights in terms of what causes motion sickness within a virtual reality experience, and then perhaps things to kind of mitigate that.

[00:22:50.676] Mary Whitton: Full 6DOF head tracking will mitigate it. Sensory conflict is certainly the most often cited problem. And if I don't have full stereo, if I don't have head tracking, I can rotate my head if I'm just tracking rotation. But if I'm not tracked in position, then I get bad parallax cues, because the world behind you changes when I do this. And if it doesn't, then that's very, very disturbing. So I think full head tracking probably is the most important thing. Frame rate, too: the difference between a constant 15-frames-a-second update, not just refresh but new content update, and 30 was remarkable. And now, for simple scenes, we just turn vertical sync off and let it free run. And you don't see the tears at very high rates, you know, 150, 180.

[00:23:42.849] Kent Bye: Oh wow, yeah. So I think that, you know, some of the first consumer systems, the Oculus Rift and the HTC Vive, are targeting 90, and then 120. Do you see a big shift above 120 frames per second in terms of the sense of, you know, immersion?

[00:24:02.705] Mary Whitton: You know, I haven't looked at them so much. It's how it interacts with the possible frame update rate of the display device. And once you have display devices that can update at 120, then you want to match the rendering update rate and the frame update rate so that you don't have the possibility of having two different frames displayed at the top and the bottom of the screen, for instance. So keeping them matched is better. And I think when you get to 120, most people fuse everything pretty well. I see. Flicker fusion between frames varies with the intensity and the brightness and other content things, and to some extent you get to a point where almost everybody will be able to fuse it. Cool.

[00:24:46.644] Kent Bye: And so within the last two, three years, there's been a big resurgence of virtual reality in the consumer VR space. And, you know, there's been VR happening for a long, long time, since 1968 and the Sword of Damocles. And I'm curious, from your perspective, how do you see the larger field of virtual reality and sort of the changes that have happened over the last couple of years?

[00:25:07.600] Mary Whitton: Well, it's wonderful to see. Interactive graphics and virtual reality have absolutely been the beneficiaries of the lowering in prices that has come about through gaming. And when you consider the price of the things we had to use in the early 90s, and both the increased quality and speed of stuff now, it's just, hands down, it's wonderful. Will it match the hype? It didn't in the 90s. I suspect that there are problems that will arise now. There certainly are things that are not solved. And I hope that it will get better. But we have a lot of cool toys to work on non-entertainment applications. And I really have focused on non-entertainment applications.

[00:25:52.645] Kent Bye: Yeah. And finally, what do you see as the ultimate potential for what virtual reality can enable?

[00:26:01.085] Mary Whitton: Dare I say, I'm old enough that I really like spending my life in reality instead of someplace else that someone created. There are remarkable things that can be done, I think, helping people who don't deal with reality well. You know, there's rehab applications, there's science applications, there's training applications. It's all in how people use it.

[00:26:23.030] Kent Bye: Okay, great. And is there anything else that's left unsaid that you'd like to say?

[00:26:29.690] Mary Whitton: There are some enormously creative people who have been working in this field a very long time, and one of those is Mark Bolas, who is now at the Institute for Creative Technologies at USC. Palmer Luckey worked in his lab and actually did some of the initial work that ended up appearing in the Oculus when he was working for Mark there. OK.

[00:26:49.363] Kent Bye: Great. Well, thank you.

[00:26:50.304] Mary Whitton: Yeah. You're welcome.

[00:26:52.045] Kent Bye: So that was Mary Whitton. She's a professor at the University of North Carolina at Chapel Hill. So I have a number of different takeaways about this interview. First of all, at the very end of this interview, there was some information that I just wanted to contextualize a little bit more. So this interview was done in France back in 2015. And at that point, the Oculus Rift DK1 and DK2 had been released, and the IEEE had voted to give Oculus a Technical Achievement Award, and there was a little bit of controversy about that. There was a bit of a hush within the conference, and to this date, I still don't quite understand why. There were just some sensitivities of people not wanting to fully talk about some of the implications of that. My take is that there was potentially an erasing of the role of the academic community within the mainstream popular press. If you look at articles like Wired magazine's history of the Oculus Rift, you kind of get this superhero take of the story of Palmer Luckey working with John Carmack through internet forums, and they come up with a huge innovation in technology. And that's a story that sounds great for the press and the media. But I think there's always a lot more nuance to these stories, just at a baseline that it's a group collaborative effort with a lot of people involved. Not only the direct players, but just the repository of information, research, and hardware that was even available that Palmer Luckey was able to collect and modify in different ways. If you take a look at the early prototypes that Palmer was working on, he was taking existing virtual reality headsets and modifying them and changing them and swapping out different screens, really trying to expand the field of view. But at some point he got a job at the USC ICT based upon some of those early prototypes that he was working with. And he was working in collaboration with a lot of people there at USC ICT.
And so when there's a Wired magazine article that just says that Palmer Luckey was working out of his garage and collaborating with John Carmack, it gives this impression that there's an erasing of the role of Mark Bolas and the role of the larger academic community. So that's my best guess as to why there may have been some sensitivities around this issue. And it could be just a matter of the fact that it's a more complicated story than the existing methods of journalism are really able to handle. Or there could be additional, conflicting information as to what actually happened in the origins of Oculus that is still yet to be told. There are at least a couple of companies that had claimed that Palmer had signed NDAs and was working in collaboration with them on the Oculus Rift. And Alan Yates of Valve even came out and said, hey, you know, Valve gave Oculus a lot of technology and set up a whole Valve demo room with fiducial markers within Oculus's offices. So I expect that there are just a lot of untold stories as to what actually happened during this time period. One thing is clear: Palmer did actually work at USC ICT and collaborated with the people there in some fashion, working on and getting feedback on virtual reality technologies. And at some point he got in front of Mark Zuckerberg to give him a demo. And Facebook's purchase of Oculus is undeniably a huge turning point for the entire industry. It was a financial signal that said, hey, this is actually happening this time. And I think that was really the catalyst for so much that actually happened in the VR community. So at this point, I think what is clear, for me at least, is that I have more open questions than answers about some of the origin stories of the Oculus Rift.
And, you know, there's a clear public record of Palmer's activities on forums, but there's also a lot of stuff that happened behind the scenes, and I don't think we've really heard the full story from all of those different perspectives. But at this point, I think what Mary was perhaps alluding to is just to give attribution and credit to some of the other academic players who may have played some significant part in the overall history and research of virtual reality, and perhaps even more directly with the Oculus Rift. But just some other takeaways that I had about the content of what Mary was talking about: there's a lot of fundamental research into what it takes to actually cultivate this sense of presence within a virtual reality experience. And some of the questions that they have been asking over the years are: do haptics matter? Does spatialized audio matter? What's the impact of feeling scared within a VR experience? And do latency and field of view matter? So by the time that Valve and Oculus were really getting into virtual reality, a lot of these fundamental research questions had been addressed by the University of North Carolina at Chapel Hill and the whole academic VR community. Now, one of the things about feeling scared in VR that I just wanted to talk about is that when I talked to Mel Slater, he really wanted to have these objective, quantifiable measures for being able to detect presence. And so because of that, the conditions that you have in order to really have that level of presence that you can measure objectively with your body end up being these situations where you're trying to invoke these feelings of fear and of being under a threat. And so with this pit demo, you're kind of walking on this plank over a pit. And the more that your body is reacting to it actually feeling like you're there, the more that they can say that you're present in that experience.
And I think that has a certain level of utility for being able to draw these different correlations between the body reactions that you're objectively experiencing and your subjective experience of presence. I think where that falls down is that, you know, I saw a number of different students following this thread of research from Mel Slater, where they were trying to create these sorts of artificial threats, like a fire within the experience. And, you know, there's a certain amount of the uncanny valley as to whether or not, oh, is this threat going to actually make me feel afraid? And if it's not plausible, then you're not going to have that sense of having your body actually react to it. But I think that there's a whole realm of experiences that I've had where I felt highly present while I wasn't afraid at all. And so I think it's a little bit of a red herring in some ways to completely rely upon creating these artificial feelings of fear or feeling scared in order to actually objectively measure if I'm present or not. Some of the deepest levels of presence that I've felt have been in Audioshield, where I feel like I'm just having fun and really engaged and active within the experience. The way that I think about that is the four different levels of presence: embodied presence, where my body feels like it's in the experience and I have this haptic feedback from the controllers; active presence, where I feel like I'm exerting my will within the experience and I'm actually moving my body in an active way; or social slash mental presence, where I'm able to either stimulate my mind through a puzzle, or to have a certain level of cognitive load that makes me feel like the whole scene is plausible, or maybe there's a social interaction that's happening that makes me feel completely present.
And then finally, the emotional presence, whether that's coming from the music and having an emotional connection to the music, or maybe there are characters that I'm really connecting to on some sort of emotional level. But to me, having those four different levels of presence, when I think about the experiences that I've had in VR where I've been able to achieve the highest states of presence, it's not always been about creating this sense of fear and being afraid. So I just wanted to point that out, that having these objective measures, I don't think, is necessarily the end goal for being able to really detect presence. The other thing that I just wanted to say is that the role of haptics is actually huge. When I think about experiences where I've had the most presence, things like The Void or Real Virtuality or the Leviathan Project, these are all projects that did have these levels of passive haptic feedback, where they had walls that you could physically touch. And Mary had a great point: well, if you're going to have to create these levels of haptic feedback and these styrofoam kitchens, then what is the real utility of creating a virtual reality experience? But when you look at The Void, you have this pattern of this 30 by 30 foot maze that has a big enough radius that you can start to invoke some of these redirected walking techniques, where you can just keep walking and walking and walking and reach out and touch the wall. But because they have this template, you're able to then put on top of that a whole range of different experiences. And Mary did say that, in locomotion, the best way to really feel completely present is to actually physically walk around, because when you're using a joystick to do this abstraction, there are all these different natural body movements that are hard to mimic, but at the same time there are these sensory disconnects because your body's not actually feeling that.
She was talking about the movement of the knees: when you walk, you don't just lift up your knees, you're actually lunging your full body forward, and in your brain you have this proprioceptive expectation for what your body feels like when you're walking. And when you're walking in place, you get that to some level, but it doesn't completely feel like you're actually walking around. And so I think approaches like The Void start to really connect all those dots and allow you to have that haptic feedback and to actually freely roam around. And that's why, still today, that's been one of the most amazing virtual reality experiences that I've had, because I did just feel like I was able to keep walking and walking and walking. And when you're able to actually have that full body presence, untethered without wires, and being able to freely roam around in a huge space, the level of presence just skyrockets through the roof. Now that sort of begs the question as to whether or not we're actually going to be able to do that in our homes. Perhaps not. Maybe we just need that level of space in these digital out-of-home entertainment places to be able to really get that level of presence. And so we're going to be a little bit constrained with these small rooms where we're able to do room scale. And that helps with being able to move your body around, but it's not going to get to that level of actually being able to freely roam in these big wide open spaces. So in this interview, Mary was talking about some of her collaborators at UNC, as well as some of her students. And I have done interviews with three of the people that she mentioned. First of all, episode number 130 was her student Richard Skarbez, who was working on extending Mel Slater's research on the place illusion and the plausibility illusion. That's an amazing, excellent interview. You should absolutely go listen to episode 130.
It's in my top 10 list of all-time favorite interviews that I've done, just because I think he really breaks down the two basic components of presence in a really comprehensive way. And I also did an interview with Luv Kohli, who did a lot of that research into redirected touch, back in episode 351. So you can go back and listen to some of the research where he was able to figure out how to use that phenomenon of visual dominance in order to kind of trick your mind into touching things that were more complicated than they actually were. And then finally, back in episode 359, I had a chance to talk to Fred Brooks, who saw one of the original speeches from Ivan Sutherland about the ultimate display back in 1965. Episode 359 is a great episode to listen to for a little bit more of that history and how it really inspired Fred to go back and do all this work in virtual reality at the University of North Carolina at Chapel Hill. So that's all that I have for today. I just wanted to thank you for joining me on the Voices of VR podcast. And if you enjoy the podcast, then please do tell your friends, spread the word, and become a donor to my Patreon at patreon.com slash Voices of VR.