When people dream about what they want to do in VR, it inevitably involves actually moving around within a virtual environment. But VR locomotion triggers simulator sickness in a lot of people, and solving it is one of the biggest open problems in virtual reality. NextGen Interactions’ Jason Jerald wrote a comprehensive summary of much of the pertinent academic research about VR in The VR Book, and in Chapter 12 he summarizes the five major theories of what may cause simulator sickness.
I had a chance to catch up with Jason after he taught a class about Human-Centered Design for VR at the IEEE VR conference, and he explained each of the five theories of what may cause simulator sickness: the Sensory Conflict Theory, the Evolutionary Theory, the Postural Instability Theory, the Rest Frame Hypothesis, and the Eye Movement Theory.
Jason also talks about ways to mitigate simulator sickness including implementing different viewpoint control patterns, using a cockpit, providing a leading indicator of movements, walking in place, vibrating the floor, reducing texture complexity, and limiting optical flow in the periphery. He also discusses the tradeoffs for varying the Representational Fidelity ranging from stylized to photorealistic, and the Interaction Fidelity ranging from abstracted to literal and natural gestures. While there are ways to mitigate some of these causes of simulator sickness for VR locomotion, some of them remain open problems yet to be solved by the wider VR community.
LISTEN TO THE VOICES OF VR PODCAST
Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to The Voices of VR Podcast. So one of the biggest open problems within virtual reality is locomotion. There's a number of different things that can cause you motion sickness whenever your vision is getting signals that you're moving through a world, but yet your whole body isn't. And so I'm going to be diving deep into the five major theories of simulator sickness, thanks to this in-depth conversation with Jason Jerald of NextGen Interactions. Jason also wrote The VR Book, which is really the most comprehensive summary of all the different virtual reality research that's out there. It's a great textbook and summary of a lot of what academia has to contribute to virtual reality design. So we'll be diving deep into human-centered design for VR, with a big focus on motion sickness and simulator sickness within VR and what you can do about it, on today's episode of the Voices of VR podcast. But first, a quick word from two of our sponsors. Today's episode is brought to you by The Virtual Reality Company. VRC is at the intersection of technology and entertainment, creating interactive storytelling experiences. The thing that's unique about VRC is that they have strategic partnerships with companies like D-BOX, which makes a haptic chair that takes immersion and presence to the next level. So they're making these digital out-of-home experiences for movie studios and original content. For more information, check out thevrcompany.com. Today's episode is also brought to you by The VR Society, which is a new organization made up of major Hollywood studios. The intention is to do consumer research, content production seminars, as well as give away awards to VR professionals. They're going to be hosting a big conference in the fall in Los Angeles to share ideas, experiences, and challenges with other VR professionals. To get more information, check out thevrsociety.com. 
So this interview with Jason Jerald happened at the IEEE VR conference, which took place in Greenville, South Carolina, from March 19th to 23rd. So with that, let's go ahead and dive right in.
[00:02:19.699] Jason Jerald: So my name is Jason Jerald. I'm the co-founder and principal consultant at NextGen Interactions, and I'm in a very fortunate place to be able to focus exclusively on virtual reality, typically more of the interactive side versus passive immersive film and such. That's really been my passion since I can remember, about 20 years or so now.
[00:02:39.672] Kent Bye: Great. So here at the IEEE VR, you're doing a workshop on human-centered VR design. So what does that mean?
[00:02:46.355] Jason Jerald: Absolutely. So a lot of the research in the past has come from the technical field, which is my background; I'm an engineer. But what I realized a few years ago is that what's really needed by the community is an understanding of the human side of virtual reality. Virtual reality is not just about the technology. You know, 50% of virtual reality is up here in the brain and in the hands, how we interact with these virtual worlds. That's what makes it really, really compelling. Otherwise, without the human in the loop, there is no virtual reality.
[00:03:13.218] Kent Bye: And did you talk about simulator sickness at all in your talk?
[00:03:17.222] Jason Jerald: Yeah, absolutely. So there's really kind of five accepted theories of motion sickness. The primary one being, of course, the visual-vestibular conflict: that our inner ear doesn't match what we're seeing visually. But there's other theories as well. There's the evolutionary or poison theory, which basically says we ingested some bad berries, so now our body is trying to eject that by vomiting, or sweating it out, or encouraging us to lay down. That, you know, helped us to survive, right? To live another day, or in VR, to experience VR another day, I guess, in modern terms. And then there's other theories, such as the rest frame hypothesis, something we're researching ourselves at NextGen Interactions, which is saying, OK, the sensory conflict theory is true, but you don't necessarily need the whole visual field to match the vestibular system. Instead, if the parts of the visual field that the person considers to be stable are not in conflict with the vestibular system, then you're OK. Or if you feel like the world's coming towards you instead of you going through the world, that's saying, well, the world's maybe this object that I'm manipulating. And so I'm stable, but the world's moving around me. And if you have that mental model, that expectation, you're less likely to get sick. Another example is putting a cockpit in the scene, right? Like EVE: Valkyrie. That's a really great example of reducing motion sickness, because that cockpit is stable and the outside world is sort of moving around you, or, depending how you think about it, you're flying through that world. But people are less likely to get sick because a large part of the visual field is stable relative to the inner ear.

Kent Bye: Was there one more, or was that all five of them?
Yeah, so there's the eye movement theory, which says that one of the reasons why we get sick is that the eyes aren't moving like they do in the real world as we move our head around. So there's the vestibular ocular reflex: even in the dark, your eyes are stabilized in orientation in space as you rotate your head back and forth. In VR, if you have latency or it's miscalibrated, then the eye movements get a little bit confused, because what they're seeing is not necessarily stabilized in space. Then you also have the optokinetic reflex, which is the visual element: when you can visually see the room, your eyes are stabilized in space. And so if that optokinetic reflex is out of line with the vestibular ocular reflex, then your eyes get a little confused. They're moving in the wrong way. And certain parts of our vision, especially in the periphery, go through a different visual pathway. They go more straight through the brain stem, versus the more modern pathway that goes towards our occipital lobe, which is more of a logical sort of pathway. In that more primitive pathway, there's actually some connections that go directly to the stomach that can induce nausea and vomiting and such. Another theory is the postural instability theory. It's not arguing with the sensory conflict theory, but, well, it's kind of generally accepted that as you get sick, you're less stable in space. You're more likely to fall over, right? You'll start wobbling. The postural instability theory goes beyond that, and it claims that you actually start becoming unstable before you get sick. And so it's not a result of sickness, but it's actually a cause of sickness. And so one implication of that, for example, is you see a lot of standing experiences, and you're more likely to fall over if you're standing versus seated. And so that's one concern to consider. If you have a lot of motion in the world and such, you probably want to have the user seated.
[00:06:34.474] Kent Bye: Wow, so I am fairly sensitive to motion sickness and I think I have the sensory disconnect that usually triggers me. I'm curious, in the research that's been done in each of these different theories, are there ways to kind of test people who have simulator sickness sensitivities to see if there's any one of those that are triggering them?
[00:06:52.819] Jason Jerald: So kind of in the virtual reality research community, there's something called the simulator sickness questionnaire. And it's sort of this, you know, question, are you feeling bad in certain ways? And that works, but the data can be quite noisy and it's subjective data. And so there's other more objective measures such as the postural instability theory. If you have someone do a postural stability test, similar to what the police do, if you've been drinking, right? If you aren't able to put one foot in front of the other, touch your nose, whatever, you know, there's different forms of that test. If you can't do that well, then that's a sign that, you know, you're not feeling well. So that's one example of testing that. There's also more physiological measures that may or may not relate to that. And that's an area of research that there's not a lot of data there yet, but I think there's certainly some interesting things to explore there, like sensing the body in different ways. How does that correlate to the sickness questionnaires, for example?
[00:07:44.249] Kent Bye: Yeah, the one with having the eye stay in one spot. So when you turn your head, your eyes are basically locked in. And then if you're in a VR world, if there's any sort of disconnect there, then that is one of the theories for simulator sickness. And I imagine that at some point, that could be a technological issue that could get better over time, whereas some of these other ones seem to be something that's sort of ingrained within whatever genetics we have or whatever we've been conditioned over time. And so I'm curious if you have any thoughts about this idea of VR legs, where some people say that the more you do it, the more you just kind of get used to it.
[00:08:19.963] Jason Jerald: Yeah, absolutely. So as far as the eye movements, we've reduced latency. Some of the better systems we have out there today, the latency is low, and they're really well calibrated. So as you move the head around, there's not much conflict, and thus you get less sickness. The eyes, you know, work as they should. But you can imagine if there's visual flow in the field, the optical flow of, you know, the scene moving towards you or moving to your right, then with the optokinetic reflex your eyes might follow that visual flow, where the vestibular ocular reflex would not follow that flow. And so your eyes might move when they're not really expected to move. And so there could be a conflict in that sense. So the point here is there's many factors, not just motion sickness, but other forms of sickness. There's so many factors there. Some of them, I think I list 50 or so in my book, you know, and a lot of them we don't know about, right? There's tons out there that may or may not have an effect as far as getting sick in different ways. There's also physical issues, as many people are aware of, where they, you know, swing a bat and don't realize there's a physical wall there and slam their hand into the wall, or they trip over a wire. All of these things we need to be very careful of, and communicate the dangers in an ethical way, but at the same time doing it in a way that doesn't, you know, scare off users and such. But we do need to do our best to get the word out there of these are things we need to be careful of, and to design the systems well.
[00:09:40.639] Kent Bye: Yeah, as you're speaking, I'm kind of realizing that I think I do get triggered by the visual flow that happens when I'm walking in locomotion. If I'm walking on a floor and I look down and I see the floor moving with a lot of textures, there's something about that that is very nauseating for me. But if I'm in a cockpit, or if I'm flying up high in the air with not a lot of things going right directly under me, then that seems to be okay. So do you think that there's a connection there between that specific motion sickness theory, with that optical flow and the vestibular ocular reflex, with motion as you're walking around? The cockpit seems to kind of cut that out a lot and seems to be a little bit more comfortable. Is there a connection there?
[00:10:22.118] Jason Jerald: Yeah, absolutely. So sort of all of these theories fit with that cockpit frame, because there's the optical flow, there's the vestibular ocular conflict, there's the rest frame. And so those are kind of all consistent with each other. In fact, there's one model I use based off of the work of Sharif Razzaque, who created redirected walking. We went to school together, and I kind of take his perceptual model and apply it to sickness in a way that really brings all of these theories together. So, you know, a lot of people argue, whether it's virtual reality or other fields, that one theory is the correct theory. And the real answer in most cases is that it's some combination of the theories. It's not necessarily that one is correct or one's not correct, but each one of them has some truth to it. And so we bring those together and figure out, okay, how do these different perceptual cues relate, and how do they cause you to get sick? Like, for example, one's mental model of what they expect to occur in the real world. That's part of it as well.
[00:11:17.049] Kent Bye: I don't know if you've seen the game Drift, the Gear VR experience where you're basically a bullet flying through the world, and there's not a lot of textures. It's all like white walls, and because you're a bullet, things are kind of frozen in time. You're flying down this hallway, and you have all these objects in the near field, and there's not a lot of textures; it's just kind of white. And for me, that actually was comfortable. It was surprising to me, with little things like that, where I'm still moving through and I'm able to focus on the near field. I don't know if it's the near field or what it was about that experience, but there seemed to be some combination of things there, where normally moving through a hallway like that would get me sick, but in this case, it didn't.
[00:12:03.715] Jason Jerald: So there's at least two things in there that may be going on. One is if there's not a lot of texture in the scene, then there's less optical flow, so there's less conflict that way. But also, the vestibular system does not sense linear accelerations, at least not that much. It does a little bit, but it's really a secondary effect. And so if you're moving forward with a constant linear velocity, then that's comfortable for most people. And so if that's the case, then there's going to be little conflict. And so really, if you're designing an interaction and navigation through the world, then you want to, for example, immediately jump to a constant linear velocity instead of slowly accelerating to that constant velocity.
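Jason's advice here, jumping straight to a constant linear velocity rather than easing into it, can be sketched in code. This is a minimal illustration of the idea only, not anything from the interview; the function name and parameters are hypothetical.

```python
def comfort_move(position, direction, speed, dt, moving):
    """Comfort-oriented locomotion step: when movement is requested, apply
    the full target speed immediately (a step change in velocity) rather
    than ramping up. Sustained motion then has zero linear acceleration,
    which the vestibular system largely cannot sense anyway."""
    if not moving:
        return position  # no drift while the input is released
    # Constant linear velocity: position changes each frame, velocity does not.
    return tuple(p + d * speed * dt for p, d in zip(position, direction))
```

While the input is held, the displacement per frame is identical, so after the initial onset the user never sees a visually implied acceleration.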
[00:12:41.910] Kent Bye: Yeah, and at GDC there was also the VirZOOM room, where you're basically on an exercise bike and you're riding. They had a number of experiences where you're riding, your body is moving, and some of that seemed like it was a little bit more comfortable, to kind of have that reaction of my body when I'm moving. But there were other things, like they were starting to turn to the right on a bank, and so they were changing the horizon line. And to me, whenever they mess with the horizon line without my body moving, it creates that sensory disconnect. But I'm curious if there's some connection there, if actually physically moving your body helps you take in more locomotion.
[00:13:22.862] Jason Jerald: Yeah, so for the horizon, I don't know of any formal research showing that, but we definitely noticed in some of our designs, sort of anecdotal evidence, that you want to keep the horizon level in most cases. As far as moving the feet, there's not any solid research I've been able to find on that, but again, kind of anecdotal evidence. I just tried the Virtuix Omni the other day at the Game Developers Conference, and I didn't really feel that sickness, right? Even though I was moving pretty fast through the world, and not necessarily with constant linear velocity. And one theory for why that might occur (actually, DreamWorks has recently submitted, they tell me, a patent for this) is that if they vibrate the floor beneath you, then basically the whole body vibrates. Your vestibular system gets a little bit confused because of that vibration, and then it basically gives up, and then there's not as much sensory conflict between what you're seeing and what you're feeling. Maybe in a similar way, moving the feet, or being on a bike with that foot motion, confuses the vestibular system a little bit. Now, that's pretty speculative, but it's certainly possible. There's also more than the inner ear: how the body feels tugged, or how the body moves, may very well affect how we perceive motion as well, even if you're actually in one place but parts of your body, such as the legs, are moving, as in VirZOOM or the Virtuix Omni.
[00:14:42.972] Kent Bye: Yeah, I'm curious if there's any other best practices when it comes to things that you should or should not do when considering motion sickness.
[00:14:50.572] Jason Jerald: Yeah, so I basically looked at about 100 or so interaction techniques, and then I organized those into common themes, interaction patterns. And one group of patterns is what I call viewpoint control patterns. I don't call them navigation patterns, because that group includes things like scaling yourself up or down, right, which isn't really navigation. But those viewpoint control patterns are the ones you have to be extremely careful with when you're thinking about motion sickness. Other sorts of issues, like if there's some latency in my hand, are maybe a little annoying, not great to work with, but they're not really going to get you sick. So if you are moving in the world with one-to-one mapping, like in the HTC Vive, where most of those experiences are one-to-one mapping, you're not going to have motion sickness. But when you are virtually traveling through the world or controlling the viewpoint, be extremely careful with the types of motions you have. Minimize those accelerations. You can do things like provide a leading indicator, which is a clue that the camera is going to move in a certain way. You can provide that cue even in a passive experience like immersive film. This comes out of Tom Furness's HIT Lab at the University of Washington: it turns out a person's expectations really contribute to whether they get sick or not. So if you know a motion is coming and you expect it, you're less likely to get sick. It's similar to driving in the real world: drivers are less likely to get sick than passengers. And so those types of things are great ideas to experiment with, and we definitely don't know all the answers. 
You really have to iterate, iterate, iterate, experiment over and over, and get real users, your target personas and such, into the environment, not just your team, because your team, like you mentioned earlier, has likely grown resistant, especially to the specific things you have been implementing.
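The leading-indicator idea above, cueing the user shortly before a scripted camera move so their expectations line up with the motion, could be sketched as a simple timeline. The names and timing values here are hypothetical, just to show the pattern.

```python
def camera_timeline(cue_lead=0.5, move_start=2.0, move_end=4.0):
    """Return a function mapping time (seconds) to scene state: a visual
    cue (e.g. an arrow or highlight) appears `cue_lead` seconds before a
    scripted camera move begins, giving the viewer the anticipatory
    advantage a driver has over a passenger."""
    def state_at(t):
        cue_on = (move_start - cue_lead) <= t < move_start
        moving = move_start <= t < move_end
        return {"show_cue": cue_on, "camera_moving": moving}
    return state_at

state = camera_timeline()
# At t=1.6s the cue is visible but the camera is still;
# at t=2.5s the camera is moving and the cue is gone.
```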
[00:16:32.038] Kent Bye: Yeah, and Oculus at the GDC came up with about 30 launch titles. And they have these comfort ratings. So it's either comfortable, moderate, or intense. And when I do the intense, I usually get a little motion sick. So it's pretty accurate for me. But it may not be accurate for everybody. And so I'm just wondering if you foresee a future where we're going to be able to really dial in, like, this is intense because of these reasons. And people may be able to kind of self-identify and then decide whether or not they're going to do an experience.
[00:17:01.006] Jason Jerald: Yeah, absolutely. So as I mentioned before, I listed about 50 or so factors that contribute to different adverse health effects. And some of the biggest factors are the individual, right? I divide them into system factors, design factors, and then the individual human factors. And the individual can be the biggest variance. It's not just that one person is more prone to motion sickness than another, but they might be prone in different ways. For example, I don't have any problem whatsoever with strafing. I have problems with other things, but not strafing, yet some users don't want to do strafing at all. And so it's really a challenge, if you're doing some form of viewpoint control, to make it super comfortable for everyone. So some things we're experimenting with: first-person shooters are probably on the high end of that likeliness to get sick. And so we're experimenting at NextGen Interactions, and eventually we'll be having an Oculus and PlayStation title that is a first-person shooter, adding these sort of rest frames around you to take that down to more of the moderate level. But I don't think it's going to take you all the way down to the super comfortable level. So in that case, you don't necessarily need to design for every single user. If you have to have a first-person shooter, you just have to expect this isn't going to work for everyone. That's not my target audience; I'm looking for the hardcore gamers. But you can improve upon that comfort, even though you may not be able to solve it completely. 
Kent Bye: So remind me again, what is a rest frame?

Jason Jerald: A rest frame is basically a stable cue that's in the real-world reference frame. So think of being able to reach out and physically touch something in front of you. The real-world reference frame works great for location-based entertainment where you have physical props, because then you can reach out and touch those physical props, and that matches the real world. And so, yeah, there's different companies that are doing this really well. We're building something ourselves. We're going to have walk-around experiences, multiple users, full-body tracking, but also tracking some objects in the world so you can physically feel things. So a rest frame is basically a visual prop that's stable relative to your inner ear, like the cockpit in EVE: Valkyrie, which is a really good example, done really well.
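In scene-graph terms, one way to read the cockpit example is that rest-frame geometry is attached to the user's tracking space, so it never inherits the vehicle's travel, while the rest of the world streams past. A minimal sketch under that assumption, with hypothetical names and simple tuple positions:

```python
def world_positions(vehicle_offset, world_objects, cockpit_objects):
    """Compose render positions: world geometry is shifted opposite the
    vehicle's travel (it streams past the user), while rest-frame/cockpit
    geometry ignores that offset and stays locked to the tracking space.
    The result is a large stable region of the visual field that agrees
    with the stationary inner ear."""
    moved = [tuple(p - o for p, o in zip(pos, vehicle_offset))
             for pos in world_objects]   # world moves past the user
    stable = list(cockpit_objects)       # cockpit never moves
    return moved, stable
```

The design choice is simply about parenting: anything meant to serve as a rest frame is excluded from the moving vehicle's transform.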
[00:19:05.904] Kent Bye: And so in this workshop, I imagine that a lot of the things that you were presenting were kind of a summary of a lot of the information that's in your book, The VR Book. And so I'm curious if there's anything else that was presented here that you want to mention.
[00:19:18.740] Jason Jerald: Yeah, you actually asked some great questions. I think you covered a lot of those. And that's part of the challenge. There's just so much to talk about when it comes to virtual reality. So I kind of narrowed it down more to the interaction side, mostly working with the hands. But I really started very broad and theoretical, and then as the workshop went forward, we got more and more specific. There's very few absolute truths when it comes to VR. Of course, there are some, such as you need a high frame rate, and you want to be careful with visual accelerations and such. But the most important point I was trying to make is that there are no absolute guidelines when it comes to virtual reality design. It's sort of a space, and you're trying to figure out where you fit in that space. What are the goals of the project? Who are the ideal users that you're trying to target? You know, is it an industrial design application? Is it an entertainment application? What is it? Those will very much affect what you're trying to create and how you design the interface and such.
[00:20:15.027] Kent Bye: So what are some of the special considerations then?
[00:20:17.769] Jason Jerald: So some things are, for example: are you trying to represent reality, right? A lot of people are trying to reach this ideal of what I call representational fidelity, which is: are you trying to portray an experience that is on earth, or could be a place on earth? But you don't necessarily want that, right? You might want sort of an abstract virtual world of bubbles in the environment that you're reaching out to pop, right? That doesn't fit the real world. So higher representational fidelity is not necessarily better. It just kind of depends on what you're trying to do. Then there's interaction fidelity, which is: does the virtual task that you're performing match the equivalent real-world task? If you're doing a training application, for example, that interaction fidelity is very important. But in some cases you want efficiency, right? Sometimes just pushing a button is very efficient, and the user doesn't need to feel presence to accomplish some objectives. So, say you're going to a library to check out a book. A lot of times you don't want to go to the library and browse through the card catalog and walk to the book, right? You just want to maybe use a voice command to do some sort of verbal search in order to find that book. That's not realistic, so it has low interaction fidelity. Low interaction fidelity and high interaction fidelity are both valid; it just really depends on your goal. And then in between low and high interaction fidelity are things like magic techniques, right? Maybe you're kind of doing the motions that you would do in the world, but you're going beyond what you can do in the world in some ways, you know, having superhuman strength, or throwing objects at a distance, or selecting objects at a distance, for example.
[00:21:53.233] Kent Bye: Yeah, and I imagine that part of the representational fidelity is that the higher that you go in the photorealism and representational fidelity, the higher the expectations the user might have in terms of all the other dimensions of their interaction. So maybe you could talk a bit about some of those trade-offs in terms of when you might want to dial down the representational fidelity and when you might want to increase it.
[00:22:18.785] Jason Jerald: Yeah, so for a lot of people a couple of years ago, it was all about trying to reach this realism. And in my opinion, photorealism isn't the most important thing to go for, unless it really fits the goals of the project. But photorealism, I don't think that's a very good word, right? Because photos are kind of boring compared to what we can do. Photos are normally small, they're static, they don't move. Compared to what we can do with virtual reality, I just don't think it describes what we're shooting for in a lot of situations. And immersive film is, you know, not as high resolution as we'd like it to be, but representational fidelity isn't so much about the technology. It's about the design and the art of the world, the aesthetic aspect. And so I was trying to look for a term that wasn't describing the immersive technical aspects, but more the concepts of the design than the specific technology.
[00:23:07.672] Kent Bye: And so what are some of the implications of that then, in terms of how that impacts someone's decision when they're doing their design?
[00:23:13.896] Jason Jerald: So for example, sometimes cartoon worlds can feel very immersive. You don't need to feel like, I'm in this real world. So we're working on an educational game for kids, right? And kids love being in cartoons, like the cartoon zombies that we have in the world, right? And they can really feel that, you know, sense of a little bit of fear of maybe those zombies coming at you, and you've got to take some action to protect yourself, but not at a level that's going to have that jump scare that's going to scare them in too bad of a way. We also talked about bimanual interaction, right? Because the hand-tracked controllers are just amazing. I mean, I'm so excited to see that. In the research community, we call those wands, right, but in the virtual reality research community it was typically just one hand. And so I'm so happy to see that the gaming industry jumped immediately to two hands. I've been working with companies like Sixense for quite a while, and even before that we had something called SpaceGrips, which were essentially like the handheld controllers we have now; the difference was they were $10,000 instead of $100 for the Razer Hydra. So bringing both hands into the environment isn't just twice as good as one hand, right? The non-dominant hand and the dominant hand, the differences in those are substantial in how you design your environment. So the non-dominant hand, the left hand for most people, for example: if you're doing a handheld virtual panel, like, say, in Tilt Brush, then that kind of acts as a reference frame. So if you're controlling, say, a slider on that panel held in the non-dominant hand, it's sort of a reference frame for the dominant hand to work precisely in. So for example, peeling a potato is the example I like to give. 
If you're trying to peel a potato that's on a table, whether it's locked into place on the table or freely rolling around, it's pretty hard to do with one hand. But as soon as you put that potato in the non-dominant hand, which acts as a reference frame, suddenly it becomes intuitive and really easy. So I'm really big on these different reference frames, like, for example, the torso reference frame. Now with consumer VR, we're typically not tracking the torso, but in some cases, if the user is seated, you can estimate approximately what the forward direction is. And in that case, it's great to put the tools around you, like a utility belt, right? Because then you can use that sense of proprioception, of where your body parts are located. So I can close my eyes and touch my nose, that sort of thing. But you also know if something's behind your back. So Sixense has a great demo I just experienced at GDC. You know, you reach behind your back, pull out the arrow, and you don't even see the arrow when you're pulling it out, but you know it's always behind you. And so once your body gets accustomed to those things, you don't even have to look at something to interact with it. You just reach out and expect it to be in that area.
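The utility-belt idea, tools anchored to an estimated torso frame so proprioception can find them without looking, can be sketched by placing slots at fixed yaw offsets around an assumed forward direction. The function name, parameters, and coordinate convention (y up, z forward) are all hypothetical.

```python
import math

def belt_slot_position(torso_pos, forward_yaw_deg, slot_angle_deg,
                       radius=0.35, height=-0.45):
    """Place a utility-belt slot at a fixed angle around the user's waist,
    relative to an estimated torso-forward yaw (e.g. for a seated user).
    Because the slot is body-relative, the user can reach it by
    proprioception without looking at it."""
    yaw = math.radians(forward_yaw_deg + slot_angle_deg)
    x = torso_pos[0] + radius * math.sin(yaw)
    z = torso_pos[2] + radius * math.cos(yaw)
    y = torso_pos[1] + height  # waist height below the torso origin
    return (x, y, z)
```

If the estimated forward yaw updates with the user, the slots rotate with the body, so the reach gesture stays the same regardless of where the user is facing.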
[00:25:53.332] Kent Bye: So what do you want to experience in VR then?
[00:25:56.020] Jason Jerald: I want to experience it all. I don't want to experience just one type of virtual reality. What makes it so exciting, right, is that we're creating worlds and universes that don't even obey the laws of physics. And the real world is super exciting. I'm really big on understanding the real world. I love the real world. But there's so much that we don't understand about the real world. And if we can understand that better, and we study the real world, that allows us to create these sorts of virtual worlds. And to try to think that we know everything about VR, that's never going to happen. We're never going to know all the answers. We don't know all the answers about the real world, so how the heck do we expect to know all the answers for virtual reality? Because we create all these very diverse worlds, and that's what makes it so exciting, because we're never going to know all the answers. Some people say, well, now is such an exciting time for virtual reality because there are so many unknowns. But there's always going to be those unknowns, and thus it's always going to be a fascinating field to work in.
[00:26:51.247] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality, and what it might be able to enable?
[00:26:58.277] Jason Jerald: There's so many ways to go with that question. Anything from real-world training and professional applications, say on an assembly line, to retail, knowing the most important places to put products on a shelf. Once we have eye tracking, we can build heat maps, so analytics to understand reality better, right? Earlier, I mentioned we should study reality, but virtual reality can also help us study the real world. It's an enormous tool for psychologists, because you can very tightly control conditions that are more difficult to control in the real world. So yeah, experimental psychology, I think, is going to be a very big thing.
[00:27:33.763] Kent Bye: Great. Anything else that's left unsaid that you'd like to say?
[00:27:38.077] Jason Jerald: For those of you that are listening, get out to the conferences: get out to SVVR, get out to GDC, get out to IEEE VR, whatever it is. You'll have these fascinating conversations with other people, hear their ideas, and build upon what you may already be creating.
[00:27:54.783] Kent Bye: Great. Well, thank you so much. Thank you. So that was Jason Jerald. He is the founder of NextGen Interactions, as well as the author of The VR Book. So there's a lot of different takeaways from this interview. First of all, just laying out these five different theories of simulator sickness within VR, I think, is super helpful, because you can start to see that some of these can be addressed by VR design, and some of them are perhaps just the way that our minds are wired. And I think it's important to note that different people are triggered by one of these causes or by some combination of them, so everybody falls somewhere on a spectrum of susceptibility to simulator sickness. There are some established ways of measuring this with questionnaires, but there are also a number of different objective measures. And I think that as the virtual reality community moves forward, this is going to be probably one of the biggest open problems to try to find different solutions to. So I just want to go over the five different theories quickly, to recap them. First is the sensory conflict theory, which essentially says that whenever there's a disconnect between what you're seeing and what your inner ear is experiencing, it's going to cause some type of motion sickness. Typically, I think it's switched in real life versus what happens in VR. In real life, you might be in the bottom of a boat: you can't see the ocean, but you're really moving around. So you're not getting the visual input of your movement, but you're really feeling it inside, and that can be a big trigger for motion sickness. In VR, it's the opposite: you're getting all the visual signals that you're moving around, but your inner ear isn't actually feeling any of it.
And so you start to get this feeling of motion sickness because of that. The next one, the evolutionary theory, basically says that whenever our body starts to notice disconnects between our different sensorimotor systems, that's essentially a sign that we've been poisoned in some way. So if we eat a poisoned berry, it starts to create this disconnect within our body, which then causes a lot of sweating and nausea and feeling motion sick. Any subtle differences between what our mind is expecting and what we're actually experiencing could be a part of that as well. Another theory for motion sickness is the rest frame hypothesis, which essentially says that when you're in a room, you know which things are stabilized. So if you start getting things like judder within VR and you start dropping frames, then you have this rest frame instability where the whole world starts to move around, and that can trigger motion sickness pretty universally. That's why they say you should absolutely never drop below 90 frames per second; it's more important not to drop frames, because dropped frames are basically an instant simulator sickness trigger. The eye movement theory relates to what our eyes do when we fix our gaze upon an object. We can look at something across the room, move our head all around, and our eyes will automatically stay fixated on that point. That's called the vestibulo-ocular reflex. So if VR isn't precisely keeping up with that, then we have a conflict between the vestibulo-ocular reflex and the optokinetic reflex, and that can cause a lot of motion sickness. And it was really interesting that, on that point, Jason pointed out that the periphery actually plays a big part as well.
And so you can see it in some of the things that Eagle Flight was doing with cutting off the peripheral vision. It turns out that there are a couple of visual pathways into our brain, and peripheral vision probably has a lot more to do with motion sickness than a lot of people realize. That's why, a lot of times, being inside of a cockpit or having low-detail textures on the ground can help reduce the optical flow and the disconnects that might be related to this eye movement theory. And my understanding of the postural instability theory is that it's a bit of a feedback loop. When we stand up, we make all these subtle micro-movements in order to maintain our balance, and if there's any disconnect with what we're actually seeing, that can throw off our calibration and create a positive feedback loop that can lead to nausea. So those are the five different theories, and I think the biggest one is probably the visual and vestibular disconnect, where what you're feeling is disconnected from what you're seeing. But like Jason said, there are different triggers for different people, and they may have some combination of these five things. So there's really a spectrum of different things we can do within VR to minimize these triggers. But I think it's also important to look at motion sickness as, in some sense, a disability for virtual reality. Some people don't have any problems going into VR, and they don't have any sense of motion sickness at all, and so they want to design everything to have this crazy motion. But if you're a virtual reality developer and you design those kinds of experiences, then you're going to end up making a lot of people sick, and it's going to reflect badly on your experience.
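The peripheral-vision technique mentioned above, dynamically narrowing the field of view to cut optical flow during smooth locomotion, can be sketched as a small per-frame calculation. This is an illustrative sketch, not anything from the interview or The VR Book: the function name, thresholds, and the idea of driving the vignette from linear and angular speed are all assumptions of this example.

```python
def vignette_strength(linear_speed, angular_speed_deg,
                      linear_full=3.0, angular_full=90.0, max_strength=0.8):
    """Narrow the field of view as virtual motion increases.

    linear_speed: virtual translation speed in m/s
    angular_speed_deg: virtual rotation speed in deg/s
    Returns a vignette intensity from 0 (stationary, full FOV) up to
    max_strength as either speed approaches its 'full' threshold.
    Darkening the periphery this way reduces the optical flow that
    the eye movement theory implicates in motion sickness.
    """
    # Take whichever motion component is closer to its threshold.
    flow = max(linear_speed / linear_full, angular_speed_deg / angular_full)
    return max_strength * min(flow, 1.0)

# Stationary: no vignette. Fast smooth turn: near-maximum vignette.
standing_still = vignette_strength(0.0, 0.0)
fast_turn = vignette_strength(0.0, 90.0)
```

In practice, engines that use this technique also smooth the strength over time so the vignette fades in and out rather than popping, but the core idea is just mapping virtual motion to peripheral occlusion.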
And so at this point, I do think the market may eventually fracture into experiences for people who don't have any motion sickness issues and experiences that are geared to be comfortable for everyone. But there's this tradeoff: the more you teleport around, the less sense of embodied presence you have within a shared space. So there are these tensions between creating that sense of presence, of space and time passing as you move through a space without teleporting, and the vection that comes with smooth movement, which seems to be a little more tolerable if you're in a cockpit. But there are all sorts of challenges to doing that in a way that's comfortable for most or all people. So I would just call on developers who don't experience any motion sickness to find people who are sensitive, have them test out your experience, and get more feedback on the different techniques we can use to create more comfortable experiences. Or you could alternatively go the path of designing the most extreme experiences that people who don't suffer from motion sickness can enjoy. Perhaps we'll have a future where we know both the triggers and our own sensitivities, and we'll be able to match up what we know we'll be able to comfortably experience, but I think we're pretty far off from that. And I really appreciate the role academia has played in laying out a lot of this research. I don't think it's very well publicized within the larger VR development community, especially considering that doing this interview with Jason and really doing a deep dive was the first time I had heard of a lot of these specific hypotheses.
It's not really discussed all that much within the consumer VR community, so I'm just happy to cross-pollinate some of this research. I hope that it will inspire people to dig more into what's been researched, and I think a lot more research is needed, especially with all of this new technology and these new experiences. I think it's a little short-sighted to discount all of this motion sickness research just because it was done on old technology or old experiences. So I'd encourage people to do their own tests and find ways of measuring the impacts of simulator sickness, both with objective measures of people's physical responses and with the simulator sickness questionnaires that are out there that you can give to people. So that's all I have from this interview right now. I'm in San Francisco after coming directly from the International Joint Conference on Artificial Intelligence, where I did 60 interviews about artificial intelligence. So I've basically seeded the first couple of months of daily Voices of AI podcasts. I just have to find some time to actually launch the site, line up some sponsors, and get that off the ground. But I'm excited to expand out into both VR and AI, and I think there's already a lot of overlap and really interesting lessons that each of these cutting-edge technologies can learn from the other. So I'm really looking forward to getting that started and out there. I'll be at Casual Connect, moderating a panel on VR game design and doing lots of interviews with casual VR developers to see what's happening with the free-to-play model within VR. I'll be at SIGGRAPH the week after that, and then the week after that, I'll be at VRLA. So I've got something like 15 conferences that I'm going to between now and the end of November. I should put up a list of where I'm going to be, but if you're at one of these, then track me down.
I'll be roaming around doing lots of interviews about the future. So with that, thanks for listening. And if you'd like to support the Voices of VR podcast, then please do consider becoming a patron at patreon.com slash Voices of VR.