Azad Balabanian is the co-host of the ResearchVR podcast, where he discusses the latest cognitive science research that applies to virtual reality with fellow cognitive scientists Petr Legkov and Krzysztof Izdebski. The premise of their work in VR is that the more we understand about how humans work, the better VR experiences we'll be able to create.
The ResearchVR podcast has covered topics including time perception in VR, VR and memory, and presence in VR. I had a chance to catch up with Azad at SVVR, where we discussed what cognitive science can teach VR user experience design, the connection between memory and perception, privacy in VR, biohacking for sensory augmentation, neuroplasticity, and how VR can be applied to doing cognitive science research.
LISTEN TO THE VOICES OF VR PODCAST Donate to the Voices of VR Podcast Patreon Music: Fatality & Summer Trip [00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. On today's episode, I have Azad Balabanian, who is the host of the ResearchVR Podcast. Azad is a recent cognitive science graduate, and so he's been taking all of the insights that he has learned about the human brain and perception and starting to apply them to virtual reality user experience design. So Azad is really at this cross-section between cognitive science and virtual reality, seeing how these two fields are informing each other in different ways. And so that's what we'll be talking about on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by Unity. Unity is the lingua franca of immersive technologies. You can write it once in Unity and ensure that you have the best performance on all the different augmented and virtual reality platforms. Over 90% of the virtual reality applications that have been released so far use Unity. And I just wanted to send a personal thank you to Unity for helping to enable this virtual reality revolution. And to learn more, be sure to check out Unity3D.com. And so this interview with Azad happened at the Silicon Valley Virtual Reality Conference at the end of April at the San Jose Convention Center. And so with that, let's go ahead and dive right in.
[00:01:31.875] Azad Balabanian: Hi, my name is Azad Balabanian, and I'm the host of the ResearchVR podcast. We like to talk about all the research aspects of virtual reality, try to break down really complicated things like proprioception and vestibular cues, and talk about the research papers that nobody really likes to read but that they should be reading, or at least understanding what the results are and integrating them into their experiences.
[00:01:59.463] Kent Bye: Great. So maybe you could talk about your background and the other people that are co-hosting the ResearchVR podcast with you, and what their backgrounds are.
[00:02:06.846] Azad Balabanian: Absolutely. So there's three of us: Krzysztof Izdebski and Petr Legkov, who are actually two cognitive scientists as well, from Germany. They go to the University of Osnabrück, and we all do VR research. I'm actually a recent grad from UCSC, and I was a cognitive scientist there as well. And virtual reality was something that we were all interested in because we were learning so much about brains and how all these systems worked. Suddenly this technology came around, and we realized it's probably the most human of the technologies in how you really have to understand the human first to make something for it. And so we're like, this is perfect. Everybody's trying to learn more about how your brain works. And we're all such nerds about it that we like to talk about it. But the big difference is that you have to really take these concepts and make them simple and digestible for people to understand. And I think that's what we do best.
[00:02:58.505] Kent Bye: Yeah, and I think that VR is such a huge area in that there is that really interesting cross-section of psychology, neuroscience, cognitive science, behavioral science, and physiology, as well as our visual system and perception. And then all the technological elements are bringing that all together. That's what I think is so interesting about it: it really spans all of the human experience in a lot of ways. And so, for me, I see that virtual reality is this capability to very finely control the visual input to our system, and it kind of tricks our mind at a subconscious level. So from a cognitive science perspective, how would you describe what's happening when we're receiving this input from a virtual reality screen?
[00:03:46.769] Azad Balabanian: So the reason why VR works as well as it does, even without headphones, is that a lot of brain area is dedicated to your visual system. The whole occipital lobe in the back, that's a lot of real estate that your brain dedicates just to vision. And that's just kind of the nature of our species. But that doesn't mean vision is the only thing you need to account for. Then come audio cues, then haptic feedback, and all of those. Those just add on to the experiences. But the first thing you need to do is really take care of the visual system, and that's what we've done really well so far in VR. The next steps are coming. Audio is getting there, I think, with these OSSIC headphones that I tried out today. Really interesting. There's some really interesting research happening on the haptic side of things with ultrasonics, you know, using sound waves in really interesting ways of feeling and touching things. There's actually a new device that just came out of Japan called UnlimitedHand that's essentially something you put around your arm, kind of like a Myo band, if you know what that is. It's supposed to stimulate the muscles in your arm and pull back your fingers when you're supposed to be touching something. I haven't tried it myself, but you can see where this research is going, and it came out of a research lab, actually, and now it's become a Kickstarter and they just launched. So I think the research end is starting to really pick up. It's been around actually since the 90s, even before then. I'm sure you know the Stanford VR lab; they're huge in that field, and the labs at USC and Virginia Tech. Mine isn't as prestigious as theirs, but we stay strong and we do some interesting immersion research.
[00:05:17.342] Kent Bye: Yeah, well, one of the things that has been coming up on the podcast is this concept of time perception and time dilation. Gerd Bruder is somebody within the academic virtual reality community, and I had a chance to do an interview with Gerd about time perception. But there are all different areas of research that have already been happening for even longer in terms of time perception, time dilation, and time contraction. And so you've done some episodes on this. So what are some quick takeaways for making sense of what's actually happening?
[00:05:45.777] Azad Balabanian: Time perception, I think, correlates with your sense of immersion and presence. And the way you kind of work with that is through how many things you're involved with and doing, in terms of cognitive load or cognitive effort. Essentially, what are you doing in VR, and how much of your attention is that taking? That really starts to play with your time perception. And there are even simple things like music, where having major-key music playing versus minor-key or atonal music actually dilates your time perception even more. I'm really interested in it because you can start to do interesting things like therapy. There's a lot of interesting research on how chemotherapy patients sitting and getting their treatment actually felt like they were in there for a shorter time. And that ties in with pain therapy and VR being great for that. And then there's the opposite end, where I've been seeing things being talked about from The Void, where people think that they're in there for hours, but they're only in there for, let's say, one hour. And I know Carl from here; that's one of the experiences that he's talked about. I haven't had that crazy time dilation experience just yet. Maybe I have, but I just haven't checked the time and seen that something has happened. But there's so much to it. And we think we're just at the brink of seeing how we can integrate these research principles into our experiences. So I want to lay down the groundwork for all developers to really understand how the hell your brain works and start thinking in that sense.
[00:07:10.691] Kent Bye: Well, there have been a lot of different articles that I've been tweeting out recently about this whole idea that our reality is a construction of our perceptions, and that what we think we're seeing in the world isn't actually an accurate representation of reality; it's just some sort of model that our brain builds to create these high-level metaphors and eliminate all the different noise that's out there. And so what do you think of that idea, in terms of how a lot of our different inputs from sight and sound and touch are synthesized and integrated at perhaps some sort of quantized percept level of 400 milliseconds? I saw one article recently say that we're doing this sensor fusion at that level, even though there's all this subconscious processing that's happening, but our bodies have evolved to tune that out so that we can just get the high-level stuff in order to survive.
[00:08:11.577] Azad Balabanian: So that 400 milliseconds, I think you're referring to that article about time slices and how your present time is essentially split up into spans of about that length, and all the stimuli and all the things that you feel, see, touch, whatever, are combined into this one present time. There's actually a little bit of, I think, not controversy, but different data coming from different papers; I've seen 80 milliseconds going all the way up to 400. So you're right: everything that you're experiencing, your reality, is completely a construct of your head, meaning the colors that you see aren't really those colors. Your brain is really good at trying to find patterns in things, and the way it finds those patterns is based on all your experiences and your memory of doing the same exact thing. So from the moment you're born, you start looking, and you start seeing these edges, and then you start putting those edges together to form a table, so you know what a table looks like. This is called top-down processing, where you really factor your previous experiences into what you have now. And that's what we're trying to get to with computer vision now: not just looking at something and trying to figure out what it is, but using all your memory of looking at that object before. That goes into every sense, into your visual and auditory systems. And that's why you have these hallucinations, where even if you're not getting real visual stimuli, when you close your eyes and you're not seeing anything, your brain starts to make these patterns, because it's always looking for those patterns to see and to understand what's in front of it. And that's where VR comes in. It's like, OK, the brain is always looking for these patterns and it's seen these things before; how can we start exploring that and changing it? So you had that really interesting interview about a surreal VR experience. What was his name?
[00:09:48.782] Kent Bye: Khabibo, yeah.
[00:09:50.304] Azad Balabanian: I'm really interested in things like that because right now in VR, we're still so stuck on the design end of like reconstructing what reality feels like around us. But like, forget that. Let's just go off. Let's go completely make something new and see how our perceptual systems can handle that. That's the interesting part.
[00:10:07.679] Kent Bye: Yeah, I had been talking to different people, and someone told me, hey, you've got to check out this one reverse sensory deprivation experience: it overstimulates your senses so much that it has a similar effect of getting you into kind of the Zen state that you get into in a sensory deprivation tank. And so, you know, there's an interesting question of, whether it's 80 milliseconds or 400 milliseconds, whatever that time-slice chunk is, I think virtual reality provides us the opportunity to completely trick our perceptual system by changing so quickly that it just doesn't know how to process or keep up with it. And you think about something like Mad Max, a film that I think is at the extreme of visual storytelling, with a cut every 2.5 seconds, even decreasing the frame rate of how they shot it so that it appears faster when they play it back at the regular frame rate. You're getting such an onslaught of visual information, but our brains are still able to somehow construct the narrative and story out of that. And so, to me, it took many years of film to get to Mad Max, and in VR, I'm just wondering what happens if you cut so quickly that our senses can't even keep up with it.
[00:11:17.245] Azad Balabanian: I think right now we're at the stage of developing the language for 360 video. I'm not really that convinced by what I'm seeing right now, but that's because we're not used to this new medium yet. And if we showed Mad Max to the people that first saw film in the early 1900s, I don't think their brains would have really had a way to piece it together. So again, it comes back down to experiences and memories, and having so much experience with it will help you get much better at processing it. I don't know what 360 video is going to look like in 10 years. I think it might be even more interactive than we think, because right now it's just so static and you feel like a ghost. And what does that mean? Why do you want to feel like a ghost? You don't. You have real presence, and we need to acknowledge that. And that's where some filmmakers are starting to push past those bounds. I'm really excited to see what VR is going to look like in five years, and maybe it's going to be more involving than AR, because I think with AR storytelling, it's almost like we're going back to sitting around and just seeing things happen in front of us, kind of like a theater play. And that's what we had been doing for thousands of years. Then, for the past hundred years, we went to a flat plane, and all the things we're interacting with are just on a flat plane. And suddenly those bounds are taken away, and we're trying to figure out how the hell we're going to tell this story. I think we're probably going to go back to the old theater way of doing things, of having the stage and things coming into and out of the stage. And not having control of the frame, I think, gives you, as an audience member, as a viewer, so much more to be able to do. So, let's see where we're going to go.
[00:12:50.180] Kent Bye: Yeah, and I do think that as virtual reality continues to evolve and grow, the language of VR will grow as people learn how to watch VR, because, like you said, there may be experiences that people are creating now that the audience isn't ready to really consume in that way. And so, from the cognitive science perspective, you've done a number of different podcasts looking at virtual reality through a cognitive science lens. I'm curious if there are any big themes or topics that you've been looking at, or big open questions. As you continue to do this podcast, what have been some of the biggest topics that you've been exploring?
[00:13:24.408] Azad Balabanian: So we all like to talk about presence and immersion, and I think those are really cool and fun things, because I've had those very rare present moments where I'm like, I'm here, I'm there. And one of them was Everest, and that's due to the auditory effects that they were using, and it was really interesting. I want to make a note of this: by the end of it, you're kind of dying, I think, on Everest, and it was a GDC experience, and they're pulsing your heartbeat in your ear. That's really clever design thinking. And so I'm really interested not only in that, but in the complete opposite end of immersion and presence. Sometimes, would you want a VR experience that's not immersive or present, because of things like PTSD, because of triggers, because of really scary experiences? And so I think there are going to be genres of content out there where some are good for present and immersive experiences, but others where you still want to see it, you still want to experience it, but you want to be a step removed. I think that can apply really well to journalism. I mean, obviously empathy is one really big thing you want to hit with journalism, but sometimes it's just too heavy, and that will just drive your viewers away. So that's the question I put out on my podcast: what are the experiences that you actually don't want to be present in, but you want to be there? You want to see it in VR and immersive tech, but you just wouldn't want to be present and immersed.
[00:14:41.956] Kent Bye: Yeah, just to follow up on that: talking to Eric Darnell, who's a storyteller and filmmaker who co-founded Baobab Studios, the way he said it was that there's kind of a trade-off between agency and interactivity on one side and empathy on the other. And so here at the Silicon Valley Virtual Reality Conference, I did a talk where I tried to synthesize the 400 interviews that I've done so far, creating a framework and a map of the landscape. As part of that, I see that there's, on one axis, the self, and on the other axis, the other. The self is more about identity and presence and agency, and the other is more about empathy and storytelling. Because, you know, the point that Eric Darnell makes is that as soon as you start to figure out that you have some sort of control or agency, then the story becomes more about what you can do, what the limits of your impact are, the extent that you can exert your will into the experience and have that agency and that sense of presence. But on the other extreme is empathy, where you're actually just trying to put yourself into the shoes of somebody else and not necessarily be present in your own experience; it's more about their experience. And so I do think that in terms of storytelling in VR and these narrative pieces, presence isn't necessarily even the goal. Maybe we'll see a blend where there's a bit of interactivity, but it's going to be a challenge moving forward. At this point, I see that there is this kind of tension of choosing one or the other: are you going for presence, or are you going for storytelling and empathy?
[00:16:06.382] Azad Balabanian: Just going off on a tangent here, I want to make this a clear thing that we're talking about now. I think this entire immersive tech can go in a really bad way. As in, right now we're dealing with privacy issues, right? You can even claim that privacy is dead. And we're having a lot of things tracked: where you're looking, where your head is, where your hands are, what you're doing. These are great things; I think you can do a lot of great things with this data, but you can really go down the deep end of it as well. I don't know where we're going to be in 10 years. I'm kind of worried. I'm an optimist, but I'm also a realist. I don't know; maybe privacy will be dead with these things. But I think even with getting eye trackers in there, this is going to be great for user interfaces. You're going to look at things. It's almost like a brain-computer interface, where your sight starts controlling things, combined with your hands and how you choose options. But I can really see this going down the deep end, so I want to bring this back to you. Where do you see VR, or where do you want VR to be, in 10 years? And how are we going to constrain evil from coming out of it? What do you think, Kent? You've done this before.
[00:17:11.937] Kent Bye: Well, I think that right now in artificial intelligence there's this explosion of AI, and I think right now with AI and VR it's the best of times and it's the worst of times, because, you know, there's a lot of exploitation that can happen with evil AI, or with VR that gathers all sorts of biometric data. It has the capability to track your eyes, track your heart rate, track your eye movements, track your attention, track all sorts of different biometric data that maybe you don't want these private corporations having and being able to sell to advertisers. So to your question of where this is all going: I think that openness, transparency, and innovation are all important in trying to figure out the trade-offs between how immersive an experience we want versus how much control we're giving up of ourselves, our identity, our lives to these companies. Who knows where that information can go? So, what was your question?
[00:18:12.005] Azad Balabanian: What do you want things to be in 10 years, and how do you think people are going to approach using the data in not a bad and evil way? Do you think that's an inevitable trade-off for us to even have the good content that's out there? You give up your privacy and everything else for having your friends on Facebook and all the interactivity that you do have. So it is a trade-off, but is that the same question that's going to be asked for VR?
[00:18:37.807] Kent Bye: Yeah, so for me to look at the future, I look at the past and see where we're coming from. I kind of look at the 1960s as a time of a revolutionary new breakthrough: the seeds were planted in the 60s, they're just starting to sprout now, and in another 60 years they're going to come into full bloom in some ways. If you imagine being in the 60s and teleporting into 2016 and seeing the advent of augmented and virtual reality, from the labs of Ivan Sutherland to where we are now, try to imagine going another 60 years into the future and seeing where we're going to be then. I think that's sort of what we're trying to do. And by looking at those cycles of history, I think that's really hard to pin down. But looking at the short term, we have something like pinpointing the Game Boy to the BlackBerry to the iPhone to VR, and then what's next after that. If that's the story, starting with mobile gaming, then going to the BlackBerry, then to the iPhone, and then to VR, what we're talking about essentially is these alternative imaginal worlds that we're able to dip into at any moment and have a sense of still being connected to our reality, yet we're just checking out and going into a completely other reality. What is the implication of that?
And I think that there's a lot of potential for our imagination, for being able to explore our creativity, and for starting to connect to each other, to empathize, and to bring the humanity back into technology, because there's been a dehumanization happening with the flattening of social media and these 2D screens. VR represents, on the large scale, this re-humanization of technology, to a point where we are no longer being horrible internet trolls to each other but actually treating each other kindly and with empathy as human beings. Being more connected to each other and to the planet and to the cosmos at large, I think, is the ultimate goal for me of where the technology could help us. But the train going down the wrong track is the opposite direction: becoming more disconnected, more escapist, and more lost, with no real clear direction as to where we want to go and where we want to take it. Because we live on a physical planet that has a lot of real problems, and I think that VR represents the potential to really connect people through the power of stories, the power of engagement, the power of entertainment and fun and play. So that's sort of how I would answer that.
[00:21:09.652] Azad Balabanian: Oh, that's a great answer. And I think you can even make an argument about this: every time you talk about VR to someone that isn't very familiar with it, they bring up the problem of addiction, right? They're like, OK, well, this is way too good, way too real; you're going to be in there all day long. And then you can make the argument that, OK, well, we have technology today that the majority of the bell curve uses, yes, every day. They might be somewhat addicted to it; I think we're all addicted to our phones. But we still do our day-to-day things. We're still connected to each other, in a slightly different way than we were over 100 years ago. But the way things are going, yes, there will always be people that will be completely locked up in their own rooms doing whatever the hell they're doing with the tech. But they're still, in a way, social. As a human, you're such an animal that you need to connect with your pack, and so even those people are connected to someone somehow, you know, and they are part of a community, and they're very aware of that. And so I don't know if I can even make the argument that what they're doing is wrong and they shouldn't do that, and that we shouldn't cater to those needs to make it more of a connected network. I think this immersive tech is probably going to be the most social network, and it's because the first time I actually felt present was in Altspace. I stepped in front of someone, we waved to each other, and we had this really interesting interaction. I stepped into her personal bubble, and she stepped back. And I was like, wow, that's a huge impact that I just had on her. Those things really excite me, because suddenly all of these things about your social self are translating into technology.
As for where we can go with that, I think let's tread lightly and explore different things. But VR harassment can be an issue, I think, so let's design around that. Smart design, backed by good cognitive thinking, I think will yield great products in the end that won't destroy all humanity.
[00:22:58.618] Kent Bye: Yeah, just focusing in a little bit more on presence, because I think that's a huge topic in the VR community, and there are researchers like Mel Slater who have their theories about it. But from your perspective of cognitive psychology, how do you think about presence, and what are the cognitive science components of what is happening during a moment of presence?
[00:23:20.737] Azad Balabanian: Wow. What is happening during presence? I don't know if I can really say. I know a couple of different things that can make you more prone to being present. One of the big ones, actually, is social interaction: having someone else in there that you can interact with, not just something that you're looking at. That's a big one. Another thing is having your avatar match you perceptually, as in, your hands are where your hands would be, your legs are kind of where your legs would be, and not having that disconnect, I think, really does start to help you become more present. Good technology, you know, low latency. Honestly, even field of view doesn't really need to be there, because people experience presence even with the 100 degrees of field of view that we have now. But if we start to increase that, it will help us a lot more. It doesn't even need to look realistic. I've felt great presence in Windlands, which is this super low-poly game where you're swinging around like Spider-Man. But one thing that really also adds to presence is understanding sense of scale. I was standing next to a huge tree, and I was looking up at it, and I could almost feel like I was touching it. Which leads me to the next thing, which is haptics. Even though we don't have the real, full haptics that we want right now, the kind that stop your hand from going through a wall, actually having your fingers almost touch, this kind of pinching gesture that Leap Motion is using, that HoloLens is using, is smart. There's a difference between your fingers not touching each other and touching each other, and when you touch, that's a real click that you're doing.
So when you are touching something, your brain is associating that touch with whatever object or button you're pressing. Those things really start to add on top of each other into a more present experience. I don't know; I'm not the expert on presence and what exactly you need to feel present. It's not a really simple equation that you can plug your numbers and variables into. So let's come back to this question in about three years and see what's really making the grand majority of people feel present and what's not, and we can reevaluate it from a cognitive perspective.
[00:25:22.160] Kent Bye: Well, from the course of the number of episodes that you've done and looking at the cognitive science research and other research into VR, what are some of the other big lessons that you would point people to within the virtual reality community with some of the things that you've discussed?
[00:25:34.709] Azad Balabanian: So, user experience design has been of huge interest to me because I've been doing a lot of HCI research, human-computer interaction. User experience has always been a thing; it even used to be called human factors, you know, back in the 60s. And it's become a buzzword now, but suddenly this UX thinking of how you actually craft this experience is of paramount importance in this new realm, because you're literally crafting an experience, a more tangible experience than before. And so what have I been seeing from ResearchVR? I think people are starting to realize that this is not something that you can just throw things at. You can't just put someone into a crazy roller coaster and expect that to work. We call those the DK1 days, when we had things like Cyberspace, which was like a barf simulator. Those are always fun to play around with. But go back to learning about yourself, about your brain, about your experiences of how things work in real life, and then use those principles: of socialness, of even color and how that affects your mood and your perception. Color is actually huge, because I think evolutionarily we've evolved so that blue light always keeps you up, right? That's when the sun comes up; that's a blue sky. And then when it's warmer, it's kind of orange and red; that's when the sun is setting, and that's when you're supposed to sleep. And that actually has effects on your cognition and your ability to do things. I just saw this paper coming out of Korea where they're looking at the color of light and how it affects someone doing a math task. In the end, the more orange and red light relaxed people, while the brighter yellow and blue light was kind of exciting them more. And so these things, I think, can really translate into VR design. That's what my entire goal is.
I want to coin a term, cognitive designer, where essentially you're designing what the brain is going to be doing in that experience, and why and how that's going to work. I have so much yet to learn; I wouldn't even call myself halfway to knowing what I want to know. But ResearchVR has really been pushing me to sit down, read all the papers that I want to read, and put them in a way that people can listen to and understand, so it's not just a bunch of jargon that they're reading in a paper. Because I think research has this problem of trying to appeal to academics, of showing off. Academics love to show off to each other: oh yes, Professor, I'm doing this research. But in the end, it's: how do you apply this to something, right? That's a big problem in research. And VR is really pushing researchers to say, let's start making applications, because we can actually put this into a product, we can ship it, and we can become millionaires. That's everybody's dream in this startup scene. So I want to see where this is going. I love reading more papers. So if people want to send me research papers about the brain and how they think it's going to apply to immersive tech, please do. Find me on Twitter, find me on Facebook, wherever, and let's talk more about brains and how we can make VR, AR, and all the immersive tech a lot better just based on how you work.
[00:28:47.493] Kent Bye: And it seems like there's a feedback loop between cognitive science and VR in the sense that cognitive science has a lot to teach us about VR design, but VR may actually start to open up new research paths for cognitive science, I would imagine. So how do you see that playing out in terms of the different research questions that VR may be particularly well-suited to help answer?
[00:29:08.066] Azad Balabanian: So you're right, actually, that VR is making research itself easier, I think, because you don't have to build real physical rooms to run these experiments. If you want to do a walk-the-plank experiment, you can just do that in VR. You can put a plank on a flat piece of ground and it'll really feel like you're walking on top of a skyscraper. I've seen another application with thought experiments, like the trolley problem. Instead of just thinking about these thought experiments, you can actually create them, put someone in them, make them face the actual decision you're trying to get them to think about, and then measure that as real data. It's essentially helping you prototype and build these experiments faster and easier, and you can get more data points that way. I think it's really interesting that it's going to help research in the end, being able to replicate old research that you otherwise couldn't do. Like, there's research done underwater where they found that if you do your learning and your actual exam-taking in the same environment, it helps you do better. So instead of actually doing that underwater, which was a really expensive experiment, you can just do that in VR. Obviously you're losing a little bit, but it's halfway there. So yeah, VR is not just for games. It's suddenly a technology that makes creating experiences and environments easy and cheap to do. So I'm really interested in how that's all going to come together, and I think we're already seeing the first steps of that happening in research institutions.
[00:30:33.821] Kent Bye: What do you think some of the ethical implications of doing something like the trolley problem might be? Because, in essence, it's a moral dilemma: do you save the one 10-year-old girl's life, or do you make the choice where four people die but they're in their 60s? That's a situation that a university would never approve through their ethics board, or whatever process psychological research has to go through. But in VR, you can do all sorts of things that have no real consequence, yet simulate some really emotionally intense situations. I also think about recreating psychological experiments, like the Milgram experiment on obedience, and having people torture someone, except it's a virtual person rather than a real person. So in navigating this, where do you think that line is, in terms of how the ethics are going to unfold?
[00:31:23.938] Azad Balabanian: Laws and legal reasoning are always behind technology, right? Look at what our legal system is like right now with the internet. I think ethically we should apply the same principles we always have. To run an experiment in any of the psychological or cognitive sciences, you have to go through something called the IRB, the Institutional Review Board. You have to give them your entire experimental process, and they have to look at it and say, okay, this isn't going to cause any real psychological harm. You can put people in a somewhat discomforting situation, but that's something they need to be informed about and consent to. So those things are always going to be there. However, I think there's still a little bit of leeway, because it's such a new technology that we don't even know what we can do with it. There is a certain potential for it going haywire, and even motion sickness can leave a lasting effect on someone for the rest of the day. So in terms of the regulation catching up, I'm really interested in seeing how the IRB is going to try to handle VR research. I think that's a very, very important subject, and I believe there's even a talk about ethics today that I'm really interested in going to. So yeah, let's be cognizant and, like I said, tread lightly in your VR experiences. Think about what you're building before you start deploying it to the masses.
[00:32:45.664] Kent Bye: And so as a cognitive scientist, what are some of the career paths into virtual reality then?
[00:32:51.648] Azad Balabanian: So my main goal is getting into design, doing UX design, and I think that's, again, very relevant. I mean, being a researcher, you kind of have to study. I didn't know a year and a half ago that I'd have to learn how to work with a game engine, but here I am knowing Unity. So that's another route: if you go really far down the coding path, you can become a software engineer doing these things. But Fareed Zakaria said this brilliantly in his great book, In Defense of a Liberal Education: your college education and your degree should help you in your fifth and sixth job, not your first job. That is, it's going to help you start to think in the right way and give you the tools to actually do that. Whether cognitive science, or whatever you're studying, is going to apply to your first job out of college, I don't know if that's the correct way to look at it. That said, cognitive science is probably my favorite thing that I've ever learned about, ever in life, because it's such an encapsulation of everything: of neuroscience, of psychology, of computer science, of anthropology, and of just yourself. And that will always be relevant, I think, even 100 years down the line, where we're kind of worried about jobs going away. All these job titles that we're using now, who knows what that's going to look like in 50 years? Are we even going to be programming in 50 years? Honestly, I don't really think so. I think AI and machine learning are getting to the point where creativity is not exclusively a human thing anymore. There are papers, there are real machine learning algorithms that can produce Vivaldi pieces just by looking at how Vivaldi wrote his pieces and the patterns that he goes through. Because creativity is not a magical thing.
Creativity is just taking previous experiences in one field, taking a principle from them, and applying it in a new way in a different field. That's one way of defining creativity, I think. And these principles are always going to remain true about your human body, about how your brain works, how your visual and auditory systems work. So that leads us to: how can we develop new senses, and how is that going to go? I'm really interested in that, in essentially augmenting your senses. Because your brain is such a plastic organ that with any input you give it, if it has a certain order, if it's not just noise it's receiving, the brain is going to start to make sense of it. And that's going to lead us to a whole new world of having new senses and understanding the world even better, or just understanding other people better, getting a real intrinsic sense of how someone else is feeling. I think that's within the realm of what we could do in maybe 10 years at most. Your brain is just that good of an organ. It's so easy to give it something new, to the point that there are people implanting magnets underneath the skin of their fingertips. And after a month or two of just having that, they can feel the electromagnetic fields around electronic devices. That's a crazy thing; you would never have had that before. And then they were talking about how maybe you could program that so you always have an intrinsic sense of where north is, or take it a step further and always have a sense of where your home is, right? If you're tuning into electromagnetic fields, I think that's how birds work, and that's how they know how to migrate. So this idea of sense augmentation, we're very much at the edge of it, and I think VR is going to start to build on top of that.
At least, I don't really know how that's going to look or how it would work, but I know Jeremy Bailenson at Stanford is doing something interesting with a third arm coming out of your chest and being able to control it with enough practice. So that's step one. Let's see where that's going to go.
[00:36:33.464] Kent Bye: We're really getting into the transhumanist future here, where you're literally embedding technology into your body, and that's the part where I get kind of freaked out and skeptical. I think there's going to be a decision point that humanity comes to: are you going to embed technology within your body or not? And there's going to be kind of a species split between people who decide to stay organic, clean of all foreign embedded technologies, and those who want to explore all the new extended human capabilities that are possible. Because that's happening now. When you look into the future, you watch The Matrix and you see this direct neural implant, and I sort of sat back and said, okay, we're about 200 years away from that. And yet, going to a conference at USC, there was a professor talking about some DoD research that hadn't been fully published yet, but he was presenting it there. It was essentially being able to implant cameras onto the outside of the eyes for people who are blind. They can see up to four pixels, moving up toward 16. Going from no sight to four pixels of resolution, you start to get the sense that being able to tap directly into the neural pathways and put digital signals into our brains is perhaps closer than we think. And so if we go from the birth of virtual reality in the 60s and then look 120 years out from that point, maybe we will be at the point where we're completely able to do neural implants and control all of our haptics and all of our perceptions on all levels.
[00:38:08.759] Azad Balabanian: You're right. There might be this shift, and I think there will be this shift, between the rich and the very poor, where some have access to such insane technologies that make them even better. And I think this even ties in with gene editing, where there are going to be countries, China or somewhere, that go all out with it before any kind of ethical reasoning, which America is going to really wrestle with, because we're very involved with ethics, while the business end of it is pushing things forward. And I think there will probably be a split between really augmented people and Luddites, you know, people that don't like technology. There probably will be that, and they might have their own places to live, their own little dystopia, or utopia, that they're living in. Honestly, that's a real scenario that I'm imagining, but I want to be on the other end of things: I'm not really afraid of augmenting my senses. I'm not really afraid of being a transhumanist. I'm not actually a transhumanist, I don't have any augmentation myself, but even wearing glasses, I think, is a way of augmenting your senses. I'm slightly colorblind, and so I'm going to be getting these EnChroma glasses that are going to correct that. That's a way of augmenting those senses. So I think technology always ends up becoming cheaper and cheaper, the same way smartphones did. Right now you can buy a $30 Android smartphone, and that's completely changing the developing world, because you're suddenly connecting all these people that live in villages to the world, and they have access to it. Whether they use that technology for good, or just for browsing and memes and whatnot, that's kind of up to the user. Hopefully there's good guidance so that they'll use it in a good way.
But I'm an optimist in that sense. I think technology always gets cheaper and cheaper, because there's a business incentive to drive mass adoption. And yes, the Luddites won't want to use it, and they'll have good reasons not to, you know, to stay pure to what humans have been like for thousands of years. But even those humans have changed from what humans were like a million years ago. Those humans were up in the trees; they weren't even really humans yet, they didn't have real tools. Humans are always changing. There's not really a reason to stop change, and there's no force that can really do that. So let's see where this is going to go. Let's look back at this podcast in 20 years and see what our predictions were like.
[00:40:23.322] Kent Bye: Well, so what do you think is kind of the ultimate potential of virtual reality, then, and what it might be able to enable?
[00:40:28.563] Azad Balabanian: The step of something new. The step of, ah, the camera has just been invented. Suddenly, if you want to describe something that's 100 million miles away, instead of having to draw it with a stick in the sand, you can show them a picture. That was a big step. Now, instead of talking about an experience, I think you can make someone experience that thing. That's a really abstract way of thinking about how this is going to apply to everyone's life, but there are going to be a lot of great applications. I think one of the things that's going to be relevant in 10 years is the social aspect of this immersive tech. And it's going to be the small, snackable things, like the little games we play on our phones; those always end up being the huge heavy hitters. What that's going to look like for VR, I don't know. But you're going to be in there for a little bit, take it off for now, and then eventually, when it's all ingrained, maybe it's glasses that you're wearing. I don't know if people are going to want to put contacts in, but somehow you'll have it beamed into your eye and have that in your life. It's going to bridge the gap between digital pixels and real-life atoms into creating digital atoms, where you're not really going to be able to tell the difference between the two. And you're not really going to care whether it's a real thing or not, as long as it's providing some kind of value to you.
[00:41:50.882] Kent Bye: Is there anything else that's left unsaid that you'd like to say?
[00:41:55.126] Azad Balabanian: Yeah. I hope people are interested in learning about themselves and about their brain. That's been a huge theme of my own life and my own curiosity. And if people are interested in doing that, seek out those resources. Go online. Go down the Wikipedia rabbit hole and read about your own brain. That will help you 10, 20 years down the line, whatever job you're doing. Knowing how humans work, I think, is a really useful skill to acquire. Awesome.
[00:42:23.365] Kent Bye: Thanks so much, Azad. Thanks for your time. Thank you, Kent. It's been awesome. And so that was Azad Balabanian. He's the host of the ResearchVR podcast, and he recently took a job at UploadVR as the program and community manager. So there are a number of different takeaways that I had from this interview. First of all, there's the concept of embedding magnets within your fingers and allowing your brain to slowly interpret the signals that you're getting from those magnets. Like, oh my God, what the hell? It's crazy that you can put magnets in your fingers and start to detect electromagnetic fields. The larger point is that our brain is plastic enough to start to interpret those signals, which, if you think about it, is pretty wild to see where this augmented cyborg future is leading us. Think about haptics: if you were to embed these different sensor technologies into your body, then you could start to trigger all sorts of different haptic signals within the core of your body. Another thing that came up within this interview was this concept of: if you were to go back in time and sit people down who had never seen a film before and show them Mad Max: Fury Road, would they even be able to follow it and process it? I don't know if people would really find it comprehensible. Maybe the storytelling is so strong and universal that it would survive this time-travel experiment. Or perhaps what Azad is implying here is that a lot of our perception is based upon our previous memories. If that's true, it would mean we needed this long history of watching films and TV, of coming to understand the natural language of film, to be able to watch something as highly edited and sophisticated as Mad Max: Fury Road in terms of visual storytelling.
What that implies is that perhaps within virtual reality, it's going to take some time for us to really learn how to engage and interact with these dynamic and interactive VR experiences, and that the more we watch these experiences, the more that's going to play a part in evolving the language of storytelling within VR. In other words, if you went into the future, brought back the most sophisticated VR storytelling experience from 20 years from now, and put it into people's hands today, would they be able to enjoy it to the same degree as somebody 20 years from now? My expectation is possibly not: even with the most advanced levels of storytelling, there's a certain amount of education we have to go through in order to really know how to experience and watch this new medium. So some experiences may be ahead of their time right now, trying to do things so innovative that people aren't yet trained enough to know how to fully navigate them. It's a bit of a speculative question, but it's an interesting thought that a lot of our perception is based upon our memories of what we've already experienced, and the implication is that this is a dependency for our ability to fully enjoy and experience the new storytelling modalities that are going to emerge within virtual reality. The other point I wanted to mention in listening back to this interview is that Azad raised some really specific concerns about privacy and the future of privacy, and listening to my responses, I didn't fully address those privacy aspects. After this interview, I did a number of other interviews, including with Ebbe Altberg, that really started to inform some of the deeper concerns he was raising about privacy.
I also did an interview with Sheila Dean in my episode from the White House's workshop on artificial intelligence. Sheila is a privacy advocate, and there is a bit of a battle happening around privacy: whether we still own all of the data that we're putting into these systems, when we don't have much access or transparency into it. I think virtual reality is going to be at the forefront of a lot of those privacy battles. There was something that happened back on May 31st that I wanted to bring up here, which is that the Fourth Circuit Court of Appeals upheld what is known as the third-party doctrine, a legal doctrine that holds that consumers who knowingly and willingly surrender information to third parties therefore, quote, "have no reasonable expectation of privacy in that information," regardless of how much information there is or how revealing it is. In other words, from the perspective of the U.S. government, any information that you're deliberately handing over to third parties, say Facebook, Google, or anybody else with a terms of service under which they're gathering information about whatever you're looking at or doing in VR, is information for which you're essentially waiving your rights to any reasonable expectation of privacy.
Again, I'm not a lawyer, and there are probably other specific privacy terms of service that people are signing, but the bottom line is that the information you're handing over to these third parties may not have as many privacy protections as we would like. And that has a lot of really big, scary implications for VR, especially as we start to integrate more and more biometric data into these systems. If you add artificial intelligence that can extrapolate all sorts of deeper meaning from that raw data, then you're essentially talking about your emotional states, your attention, and all sorts of other really sensitive information that we may not want to be giving over to these companies. So I just wanted to reemphasize and reiterate the concerns that Azad was bringing up here, because they really got me focused on asking other people about this, especially Ebbe Altberg of Linden Lab, as well as some questions that came up at this White House artificial intelligence workshop that I went to. So those are all the big points that I wanted to call out here. And if you do enjoy the podcast, then please consider telling your friends, spreading the word, as well as becoming a contributor to my Patreon at patreon.com slash Voices of VR.