#1550: “EchoVision” Answers the Question ‘What is it like to see like a bat?’ with Mixed Reality, AI, & Haptics

ECHOVISION is the latest experience from multidisciplinary artist Jiabao Li, and it has three major parts. The first is a mobile-phone-based mixed reality experience that offers a metaphoric translation of echolocation: it uses LiDAR to detect your immediate physical surroundings and then reveals them with a voice-activated rippling shader. The second is a video that poetically visualizes different bat calls that AI has classified into different contextual domains, and there's a really awesome haptic couch experience that goes along with it. The last part is a field trip to the Congress Avenue Bridge to watch hundreds of thousands of bats emerge at night to feed. I caught up with Li to hear more about all of her interdisciplinary and interspecies collaborations on this piece.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different immersive stories from South by Southwest 2025, I'm also starting to dive into some of the pieces that were really focusing on different embodied interactions. And so in today's episode, we're gonna be talking about EchoVision by Jiabao Li. So this piece had three different parts. The first main part is that it's a phone-based mobile AR experience. The phone is set within the context of this bat mask that is 3D printed, and it's using the LiDAR scanner to scan the world around you. You make some sort of noise, a loud noise, and it detects that noise and then sends out a pulse that is represented by a shader that is trying to simulate different aspects of echolocation. So it's trying to give you a visual translation of echolocation. And they have a whole balloon cave that you have to navigate through: at the back of their booth, they have this really dark cave that you go into, and you have to try to navigate amongst all these different pillars of balloons in order to get to the other side of the cave. They also created a whole video experience that takes bat calls and categorizes them using an AI algorithm developed by researchers to find the different types of calls, whether it's for mating or food or all these different contextual domains. And then from there, they designed different scenes and created a whole immersive, interactive video installation that shows the bat calls and some audio-reactive visualizations. But also you're sitting on this couch that has haptic vibration that is also immersing you into this whole music-video type of experience. So that's the second part. The third part is that they actually had a whole field trip to go out to the Congress Avenue Bridge in Austin, Texas, where there are like 700,000 or a million or so bats that come out around 6:30 Central Standard Time, or 7:30 Central Daylight Time. I had the chance to go see all these bats flying out. And they also partnered with the Austin Bat Refuge to release rehabilitated bats. And yeah, it was just kind of an immersive adventure out in Austin to bear witness to these hundreds of thousands of bats flying out to go eat up to their own body weight in insects through the night and come back. So I talked to Jiabao about all these different collaborations. She's a real interdisciplinary artist and collaborator, pulling in all these different subject matter experts to be able to produce a project like this. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Jiabao happened on Sunday, March 9th, 2025 at South by Southwest in Austin, Texas. So with that, let's go ahead and dive right in.

[00:03:06.943] Jiabao Li: Hi, I'm Jiabao Li. I'm an artist, designer, and creative technologist. I'm also an assistant professor at UT Austin. Here at South by Southwest, I'm showing a work called EchoVision in the XR Experience.

[00:03:18.893] Kent Bye: Maybe you could give a bit more context as to your background and your journey into the space.

[00:03:22.659] Jiabao Li: Yeah, so before I got into the content creation side, I was at Apple and worked on the Apple Vision Pro for four years. Then I wanted more artistic freedom, so I joined academia and became a professor. I have a lab called the Ecocentric Future Lab. I work in all kinds of mediums: XR, installation, performance, video. The medium doesn't matter; it's whatever speaks to the content.

[00:03:48.939] Kent Bye: I know we've had a chance to have a couple of conversations about some of your previous projects, but maybe you could just give a bit of context for your journey into working with XR specifically.

[00:03:57.987] Jiabao Li: I think back to my time at Harvard, I was working on how technology mediates the way we perceive reality, and I created three perceptual machines that push that into extreme scenarios to talk about the impact on our perception through social media, targeted ads, and Google search. After that, I made a VR film called Once a Glacier. It's about the story between a girl and a piece of glacier ice: as the glacier melts and retreats, this piece of ice is being threatened. And then at IDFA, our last conversation was on Squeaker, the mouse coach, where a mouse becomes my running coach. I made a smart running wheel, and whenever the mouse runs, it prompts me to run. And I get to post on social media if I run more than the mouse.

[00:04:49.642] Kent Bye: Yeah, in our last conversation, we were talking around how part of your interest is to look beyond just what is happening with connections between humans, but humans and non-human intelligence, so like animals in this specific case. And so maybe you could just elaborate a little bit on your interest of this intersection of human intelligence, but also intelligence that goes beyond just humans.

[00:05:08.908] Jiabao Li: Yeah, so, okay, since we're talking about extended reality, we live in this multitude of realities: the reality mediated by technology, by our biases, by the medium we read, or by the different umwelten across species. Like, I see the world very differently than how a bat sees the world, because we have these different sensory umwelten. And among the different species I've been working with so far, there's the Hawaiian bobtail squid, octopus, cricket, Asian elephant in China, and now the bats and the mouse. The idea is that by co-creating or creating with these species, we get to understand their intelligence better. We get to cultivate curiosity and shift perspectives, and that can help us take more action. And we had a panel here called Animal Influencers as a Way Out of Climate Fatigue. When we talk about climate change, we kind of get tired of it and it becomes very doom and gloom. But maybe through the lens of animals, through individual species, we can get closer to it and find it more interesting, and that can lead people into the topic.

[00:06:20.185] Kent Bye: And what was the catalyst for you to start to get into this specific area of research of the intersection of this human intelligence and intelligence beyond just humans? What was the turning point or catalyst that really got you started down this path?

[00:06:32.417] Jiabao Li: I think it was the octopus. Octopuses have distributed intelligence. At the time, I worked with the Kewalo Marine Laboratory in Hawaii. They study the Hawaiian bobtail squid and its bioluminescent bacteria, Vibrio fischeri, which let the squid mimic moonlight so it casts no shadow and predators won't see it. In that marine lab, I also got to spend a lot of time with two octopuses, and they taught me so much, like in My Octopus Teacher. I was just fascinated by the whole animal world. And once I started to work with squid and octopus, then came the bats, the elephants, the cricket, and it just keeps going.

[00:07:12.285] Kent Bye: Well, there's actually quite a large urban population of bats here in Austin, Texas. And so maybe take me back to moving into Texas and teaching here at UT. And what was your first introduction to the bat ecosystem?

[00:07:26.920] Jiabao Li: So I think UT Austin, the school who hired me as a professor here, is very smart. The first place they took me on my campus visit was the bats. And I was like, oh, this is so weird and so interesting. I got so attracted to it, and so I chose to join the school. And now, a few years from then, this idea about bats just kept running in my head, and I wanted to do something together with the bats. So that's how it all started. Yeah, so every night there are 1.5 million bats. They are Mexican free-tailed bats flying out of the Congress Avenue Bridge. It's a big tourism attraction. When they first arrived, the people and the city kind of viewed them as pests, and there were a lot of people who wanted them out of the city. And there's the conservationist and scientist Merlin Tuttle, also my collaborator on this project. He used many ways to change people's perception and in the end kind of saved the bats from extermination. And now they've really become an icon of the city.

[00:08:31.590] Kent Bye: Nice. And so it is a pretty unique phenomenon. I actually went down and watched it for the first time yesterday as a part of your, I guess, expanded experiential design of this experience that you have, EchoVision, where there's haptic chairs and a couch, and there's AI videos, and there's a whole computer vision aspect with echolocation. We can get into all that, but maybe you could take me to what happened last night in terms of your collaboration with taking folks out. I got to see the bats for the first time, flying out from underneath the bridge. There's a certain time: yesterday it was like 6:30 p.m. Central Standard Time, and now it's going to be 7:30 p.m. Central Daylight Time, since we just had that switch last night. But there was basically this one little stream of bats that just looked like a fire hose of bats shooting out of the bridge, and they were all flying around. It was really quite a spectacular spectacle to watch. But what is the context for this? Is this something that happens a lot with bats? Is there anything specific about that bridge? I haven't seen this in very many other cities. So what's unique about this ecosystem here in Austin that has attracted this large of an urban bat population?

[00:09:36.312] Jiabao Li: Yeah. So they are the largest urban bat colony in the US. It's the special design of the bridge, which was an accident by the engineers: they made slits, and the width between the panels is perfect for bats to crawl in and make a home. And it was only in the 1980s, I think, that the bats chose it as a home. So, okay, what happened last night is we walked from the Fairmont, from where South by Southwest happens, 10 minutes to the bat bridge. And then we partnered with the Austin Bat Refuge to have a bat release and also a lot of education about bats, so people got to see these bats really up close. They brought rehabilitated bats, and there were two male bats that got released to the 700,000 females. Most of the bats under the bridge are a maternity colony. These are female bats: they fly to Mexico in the winter and fly back around this time. So they go to Mexico to party, they get pregnant, and they fly back here to nurture their young. Around the summertime, they will have their babies, their pups, with them. Yeah, and so we had the bat education. We also showed various live bat flight paths, so we can see, okay, when they fly out of the bridge, where are they going around Austin? They actually fly quite a bit; they even go to San Antonio and then take the journey back. And they go out to eat mosquitoes, moths, all the pests. I like to quote Merlin here: if you hate mosquitoes, you better love bats. So they take a whole journey, they get really full, and they come back to sleep during the day. And the stream you saw last night, that was because it was a bit cold, so that was considered small. On hot days, bats fly out all over; it's everywhere. It's a whole stream, and it lasts for like an hour, a whole hour. And they fly out during the sunset because that's the time the predators won't see them, so they can just fly out easily.

[00:11:45.245] Kent Bye: Okay, so the predators can't see them as well. They're really quite small. Someone from the Austin Bat Refuge was holding one of the bats. The body seemed like it was the size of maybe one of his fingers, but he was showing different grips to hold the bat, and then there's a special grip to feed the bat, so they had some moths or worms to actually feed the bat, so it was really quite cool just to see a bat up close like that, but...

[00:12:07.518] Jiabao Li: When they released the first bat, he landed on my leg. And that's the second time that a released bat has landed on me. I was so happy about it. Yeah.

[00:12:19.768] Kent Bye: What did it do after it landed on you?

[00:12:21.850] Jiabao Li: Oh, he crawled up, and then they grabbed the bat again and released him a second time, and he flew to the bridge. I think he just got disoriented and grabbed onto the first thing that he saw.

[00:12:33.672] Kent Bye: You do have some bat ears on, so I don't know if you're attracting them in that way. OK, so you were also showing some of the experiences that you've created here for the installation. So maybe you could take me back to the initial iterations of the EchoVision that you're showing here, because you have a number of different components. There's the phone-based computer vision visualization with the whole 3D-printed mask of a bat face. There's the haptic couch and the AI video. Where did you start with this experience?

[00:13:00.086] Jiabao Li: Yeah, so the essence of it is that people can echolocate like a bat. Bats echolocate: they scream and they hear the environment bouncing back. Similar to bats, people can use EchoVision to scream and see the environment bouncing back, just like the bats. So we have all these different shapes of bat masks. There are four of them, in different colors, based on different species of bats. And when people scream, we use the LiDAR sensor to sense the 3D environment in real time. So you can basically take the bat mask anywhere; it's not tethered to anything. And then you just scream, make a lot of noise. Actually, today we just got a noise warning at South By, because people are behind the bat mask and they don't feel shy. And we built a bat cave so they can explore the cave with the echolocation. It's pretty dark there, so people just go all out. And there's a scream booth at South By; I feel they don't need that. They just need this bat mask.
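
For the technically curious, the scream trigger she describes can be as simple as a running loudness gate on the microphone. Here is a minimal sketch of that idea; the threshold value, windowing, and function names are my own assumptions, not the project's actual code.

```python
import numpy as np

def scream_detected(mic_buffer, threshold_db=-20.0):
    """Return True when the mic level is loud enough to fire a ripple.

    mic_buffer:   a short window of mono samples in the range [-1, 1]
    threshold_db: trigger level in dBFS (an assumed value; the real
                  app's gate and windowing may differ)
    """
    rms = np.sqrt(np.mean(mic_buffer ** 2) + 1e-12)  # +eps avoids log(0)
    level_db = 20 * np.log10(rms)                    # convert to dBFS
    return level_db > threshold_db
```

Each time the gate fires, the app would timestamp the scream and begin expanding a visual wavefront from the wearer's position.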

[00:14:02.000] Kent Bye: So yeah, behind your booth, as part of your booth, you've created this little bat cave that has balloons. And as part of making a little noise, I was making a little whistle noise. When I would make that whistle noise, I would see the waves of the reflections come back, almost like you were painting the contour, like edge detection of the different boundaries, with a sound field going out in time. It was hitting all the different structures that it was detecting and then sending back light. So for bats, they'd be hearing back the echo, but this was, I guess, more of a visual translation of that umwelt. So yeah, maybe you could just talk a bit about that process of creating this effect with the computer vision and the phone to be able to recreate this type of echo vision.

[00:14:47.256] Jiabao Li: So it's really hard for us to really understand how echolocation works, because we can't take in that much information and use our brains to reconstruct 3D just by hearing it. So we use the visual part to represent that. The microphone takes in the sound, the LiDAR sensor reconstructs the 3D, and we have a shader that goes through it to create these waves and particle point clouds. And we use a kind of red-shift and blue-shift color for the waves, because I read in Ed Yong's book An Immense World (the best book of the year, I recommend it to everybody) that when bats echolocate, it's not just, as we usually think, black-and-white waves. It's much more complicated than that. It has color, it has texture. They can sense that this material is wood, that material is carpet, or something is metal, and they can sense all the intricate structure in it. They can also vary the frequency of the sound. If they're flying in a kind of open space, they can make longer intervals to explore the open space. And then when they target a mosquito or a moth, they can make much more rapid calls to really lock onto it and go for it. So they can really change their resolution, which is amazing, right? So we tried to recreate some of that visually, in a way that gets a bit closer to simulating that whole experience.
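
To make the wave-over-point-cloud idea concrete, here is a minimal CPU-side sketch of how an expanding "echo" ripple could color LiDAR points, including a guessed red/blue-shift treatment. This is an illustration under my own assumptions (wave speed, band width, color mapping), not the project's actual shader, which would run on the GPU.

```python
import numpy as np

def ripple_colors(points, mic_pos, t, speed=3.0, band=0.25):
    """Color a LiDAR point cloud as an expanding 'echo' ripple.

    points:  (N, 3) array of world-space points from the depth scan
    mic_pos: (3,) position where the scream originated
    t:       seconds since the scream was detected
    speed:   wavefront speed in meters/second (an aesthetic choice,
             far slower than real sound, so the ripple stays visible)
    band:    thickness of the glowing wavefront in meters
    Returns an (N, 4) RGBA array: points near the wavefront light up.
    """
    dist = np.linalg.norm(points - mic_pos, axis=1)   # range to each point
    radius = speed * t                                 # current wavefront radius
    # Gaussian falloff around the wavefront: ~1 on the front, ~0 elsewhere
    glow = np.exp(-((dist - radius) / band) ** 2)
    # Guessed red/blue-shift look: blue just ahead of the front,
    # red just behind it, near-white on the crest itself.
    ahead = np.clip((dist - radius) / band, -1, 1)
    rgba = np.stack([
        glow * (0.5 - 0.5 * ahead),   # red builds behind the front
        glow * 0.3,                   # a little green for brightness
        glow * (0.5 + 0.5 * ahead),   # blue builds ahead of the front
        glow,                         # alpha follows the glow
    ], axis=1)
    return rgba
```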

[00:16:17.284] Kent Bye: Yeah, so I guess in sound terminology, it'd be different frequencies, different hertz, that they're able to use to send out these noises and get them back. Do you know what kind of frequency range we're talking about?

[00:16:29.234] Jiabao Li: Yes, there's a frequency. I don't have the numbers off the top of my head, but it's beyond our hearing range. And they are also able to tune out other bats. Under the bat bridge, that's like millions of bats; if they could all hear each other, they'd block each other's ears, and they'd all be deaf. So they can tune their frequency, like a radio, to not hear others' echolocation and just hear their own.

[00:16:55.396] Kent Bye: Oh, wow. So they have to basically find the unique signature that they know what they're putting out. OK.

[00:17:00.798] Jiabao Li: They don't interfere with each other. Yeah. And also, we are using the HoloKit and the LiDAR sensor. The HoloKit is a set of lenses that reflect what the iPhone displays. And the LiDAR sensor itself, the technology, is also used in a way that's very similar to echolocation: it shoots out IR patterns and they bounce back from the 3D environment. That's also how echolocation works. So the technology speaks with the biology.
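
Both sonar and LiDAR come down to the same round-trip arithmetic: range equals wave speed times round-trip time, divided by two. A tiny sketch with illustrative numbers (the delay values here are made up for the example):

```python
SPEED_OF_SOUND = 343.0        # m/s in air
SPEED_OF_LIGHT = 299_792_458  # m/s

def echo_distance(round_trip_seconds, wave_speed):
    # The pulse travels out and back, so the target range is half
    # the total path length.
    return wave_speed * round_trip_seconds / 2

# A bat hearing its echo 6 ms after the call: the moth is ~1 m away.
print(echo_distance(0.006, SPEED_OF_SOUND))   # ~1.03 m
# A LiDAR pulse returning after ~20 ns: the wall is ~3 m away.
print(echo_distance(20e-9, SPEED_OF_LIGHT))   # ~3.0 m
```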

[00:17:29.366] Kent Bye: You mentioned a book, and you said it was the book of the year. What was the book that you're referencing, and what did you get out of that book?

[00:17:33.809] Jiabao Li: So it's An Immense World by Ed Yong. He won the Pulitzer Prize for writing a lot of work around COVID before writing the book. And each chapter tells the story of the different sensory world of different animals. Like, one chapter is all about magnetic fields. One chapter is about sound and echolocation. Another is about the visual field, like how the mantis shrimp can see so many more colors than us. It really opens you up; it blows your mind how differently animals see the world.

[00:18:07.665] Kent Bye: It was really fun to watch the bats as they were flying out, because it's almost like this fire hose that is just shooting these bats out of the bridge. And then there was a tree right there, and I didn't see any of them hit the tree. Just like when you watch swarms of animals, they never really seem to run into each other. So similarly, they're shooting out and not running into anything; obviously, they've been able to tune it. But at the speed at which they're flying and coming out, for me, one of the more striking things was to see an indication of how they're navigating around. They're just kind of a marvel to watch. And it was cool to also have the demo that you're showing here at South by Southwest on the ground. This is a type of project that doesn't need to be tethered to an installation; you can basically take it anywhere and start to use it. So I'd love to hear any feedback you heard or reactions you got by doing more of a public exhibition of some of your projects.

[00:19:00.154] Jiabao Li: Yeah, so to the speed part, they are the fastest mammal among all mammals. Their speed is faster than a car on the highway, so really fast. And regarding bringing it places, we call them pop-up, on-site, in-situ events. We can just bring it to anything. And because it's so intuitive, people just grab the handle. And because we have this little hand, which is also the hook of the bat, because it's a hand, people know where to grab. Even kids, or people just jogging nearby, are like, oh, what the heck, that is so cool, I want to try. And then they just grab it and they scream, and they immediately get it. It's not like a VR headset that you have to spend a whole minute putting on your head, so it's really easy for people to share the experience, to pass it along. So even when there are thousands of people, hundreds of people under the bridge, it's easy for everybody to get a sense of how to echolocate and to try it out. And I really enjoy showing it at events like this, where the people coming are not just badge holders. People are really just interested in bats and want to learn more about bats, and this is a way they can try that. And there are so many kids, and I really love seeing kids using their cute little voices to experience this. It's such a good education tool.

[00:20:21.863] Kent Bye: The thing that was also really striking about your project was just how many different types of collaborators you have, not only in creating these projects, but also at the exhibit showing it. So different bat experts, people that are involved and working with the Austin Bat Refuge. And so maybe talk about some of the other people that were here at the booth providing bat expertise.

[00:20:40.306] Jiabao Li: Yeah. So Teresa, who worked with Merlin Tuttle before, now works with the Austin Bat Refuge, and she's the bat expert here. You can get 100% of the bat knowledge of your life just from talking to her, and she can talk bats to you forever. She's amazing. And if you go down to the bridge, then you can see the Austin Bat Refuge. Lee is always happy; he's spent so much of his life and energy on it. There's a couple that runs the refuge just in their backyard, and they convinced their neighbors not to sue them or report them to the police. The whole Austin Bat Refuge runs in their backyard, and they built the whole flight cage themselves. Lots of volunteers come help them. I also was a volunteer there; before that, I got the rabies shots, three shots, so I can work with the bats. And I also work with Mike Ryan, who is a professor at UT. He actually studies frogs, and I approached him first because bats eavesdrop on frogs to eat them. During the frogs' mating call, it's like: you want to mate, but you also endanger your life, because you could be eaten by the bats. And because he studied frogs, he worked with Merlin Tuttle, who studied bats. So that's how I got connected with all of them. It's really a whole community of people who love bats in Austin that came together. Also, Nate, Nate the Great, that's what we call him. He's from the Balloon Collective, and he built the bat cave. We are longtime collaborators and really good friends. When I showed Once a Glacier at South By two years before, he built a glacier cave, an ice cave, with the balloons. Yeah.

[00:22:24.721] Kent Bye: Yeah, and so there's also a whole other component to this project, which is a video that's using different AI techniques, doing some categorization of the bat calls, and then poetically interpreting them in different ways. But the thing for me that was really striking was the haptic couch, which was turning some of these bat calls into more of a rhythmic musical experience that was also very visceral because of the haptic feedback involved. And so can you elaborate on how you took this idea of the bat calls, categorized them, and then created different types of moods and vibes and depictions of those calls that try to replicate the themes of what those calls meant for the bats?

[00:23:09.085] Jiabao Li: So that part I made in collaboration with Matt Marcoco. He specializes in sound, sound art. After I started to volunteer at the Austin Bat Refuge, I heard so many of their different sounds. It's so interesting: when they do different things, the sounds are so different. So I started looking into the papers, and I found out that researchers at Tel Aviv University in Israel used AI to categorize the bat calls into mating, fighting, kissing, food-related, or mom bat talking to baby bat. They even found out that, similar to a human mom talking to a human baby, where we're like, "Oh, baby, you're so cute," the bat mom has a similar tune, but at a lower frequency. So I reached out to them and got a lot of data from them, super nice researchers. And we were like, okay, what does a baby bat lullaby sound like? Or what could a bat love song be? So me and Matt started to imagine what this could be, and we made like a bat karaoke, but instead of humans singing, it's the bats singing. And we created this visual audio experience. The visuals are their habitats, like the cave, the rocks, the cliff, the trees, and each scene is a unique experience that pairs with the song. For example, one scene has a lot of fruits, like bananas and apples; those are the fruit-eating bats, and the sounds of that bat karaoke song are all made from their food-related calls. They could be arguing about food, like "you got my food," "this should be my food," or "this food is so delicious." We know it's food-related, but we don't actually know what they are really talking about. So it's also thinking around how, with AI, there are various initiatives, like the Earth Species Project, trying to use AI to decipher what the animals are talking about. But so far, we're only able to know the metadata, the bigger category of what they're talking about. What they're really saying inside the category, we don't know yet. It still remains elusive. And the whole thing is inspired by Thomas Nagel's philosophical inquiry, "What Is It Like to Be a Bat?" Can we really embody another species? All of this is trying to get a bit closer to that.
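
The Tel Aviv work she references trained machine learning models on spectral features of recorded calls to predict behavioral context. As a rough illustration of that general pattern (not the researchers' actual pipeline; the feature choices, label names like "food" or "mating", and function names are my own assumptions), a context classifier could look like this:

```python
import numpy as np
from scipy import signal
from sklearn.ensemble import RandomForestClassifier

def call_features(audio, sr):
    """Summarize one bat call clip as a small spectral feature vector."""
    freqs, times, spec = signal.spectrogram(audio, fs=sr, nperseg=512)
    power = spec.mean(axis=1)                       # average spectrum
    centroid = (freqs * power).sum() / power.sum()  # spectral centroid
    return np.array([centroid, power.max(), spec.std(), len(audio) / sr])

def train_context_classifier(clips, labels):
    """clips: list of (waveform, sample_rate); labels: context strings
    like 'food', 'mating', 'mother-pup' (hypothetical label set)."""
    X = np.array([call_features(audio, sr) for audio, sr in clips])
    return RandomForestClassifier(n_estimators=200).fit(X, labels)
```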

[00:25:23.680] Kent Bye: Have you seen Marshmallow Laser Feast's In the Eyes of the Animal? I'd love to hear some of your comments on that, because I saw it back at Sundance in 2016, and it's a similar type of piece: taking these different animals and how they perceive the world, trying to create a visual representation of their umwelt, as it were, and using different point cloud representations and visual aesthetics to mimic aspects of either things that may be beyond the frequencies we can see, or things that they're paying attention to that we may not, because their perception has evolved to serve their own survival. So I'd love to hear some of your comments on the Marshmallow Laser Feast piece In the Eyes of the Animal.

[00:26:01.969] Jiabao Li: Yeah, I've only seen the documentation of that project, and I've been using that to teach my students whenever I talk about XR storytelling and how we cultivate empathy and change our perspectives on other species. I love that piece. I haven't seen it inside of a headset, yeah.

[00:26:19.415] Kent Bye: Well, we'll definitely have to connect you up with Marshmallow Laser Feast. I mean, I guess that probably speaks to a larger challenge of distributing a lot of these projects that have been on the festival circuit. But yeah, to me, it's like a classic exploration of this as a topic, and I feel like your project is in that lineage of finding ways of using different visual shaders to represent the perception of other entities. The thing that I really loved about the music video component was that there were a lot of audio-reactive components. I do a lot of editing with sound, so I'm very familiar with the FFT frequency spectrogram that is able to show a range of different frequencies, and there were some really nice ways that you took the linearity of how I'm usually seeing that and put it into a little circle, or just different ways that you were taking the bat calls and doing some sort of visual representation of them, but with this kind of artistic twist that was unexpected compared to the utilitarian way most audio engineers would be looking at these frequency spectrums. So I'd love to hear a little bit about your process of trying to find creative and unique ways of doing this audio-reactive visualization of the bat calls.

[00:27:23.584] Jiabao Li: Yeah, so the credit for that really goes a lot to Matt Marcoco, my collaborator. He has worked with so many animal sounds. He created a lot of exhibit experiences for the Natural History Museum in New York, and he really is the expert at making animal sound come to life, both visually and multi-experientially. We choose different visual effects according to the different emotions the calls have. Like the love cave has a really loving particle-system effect: they twirl, they spin, and some of them have a longer spin because they twist the sound a bit more. Whereas fighting is like, ah, one really strong big shout of the fight. So we use a visual language that speaks to the feeling of it. And we actually also have a live performance version. It's like a DJ set, but instead of pushing buttons that play musical instruments, we have the bat calls. We performed at the Contemporary Austin art museum. It was after the bridge; we went up on the rooftop and pushed the buttons and played the bat sounds. And then people could also come in and play themselves, so they all became bat DJs.
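
The "spectrogram bent into a circle" that Kent describes is a common audio-reactive technique: take one FFT frame, map each frequency bin to an angle, and let its magnitude push the ring outward. A minimal sketch of the idea (the parameter names and values are my own; the installation's real-time version runs in Unreal Engine and TouchDesigner, not Python):

```python
import numpy as np

def circular_spectrum(frame, base_radius=1.0, gain=0.5):
    """Wrap one FFT frame around a circle for an audio-reactive ring.

    frame: mono audio samples for one animation frame
    Returns (x, y) arrays: each FFT bin becomes an angle, and its
    magnitude pushes that point outward from the base circle.
    """
    windowed = frame * np.hanning(len(frame))   # reduce spectral leakage
    mags = np.abs(np.fft.rfft(windowed))
    mags = mags / (mags.max() + 1e-9)           # normalize to 0..1
    angles = np.linspace(0, 2 * np.pi, len(mags), endpoint=False)
    radius = base_radius + gain * mags          # loud bins bulge outward
    return radius * np.cos(angles), radius * np.sin(angles)
```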

[00:28:43.443] Kent Bye: Okay, so that happened last night afterwards?

[00:28:45.484] Jiabao Li: It was before. We performed this in collaboration with Fusebox Festival and the Contemporary Austin art museum, sometime last year. And we're also performing in the UK and in Vancouver, at the Vancouver International Film Festival.

[00:28:58.328] Kent Bye: Okay. And you have these really beautiful audio-reactive particle-effect visualizations of the bat calls, but those are overlaid onto a larger spatial context: these different moods and vibes and spatial contexts that you're creating that seem to have some other, perhaps generative AI, elements. Or maybe there aren't. Maybe just talk about the world-building of those scenes that you're making.

[00:29:22.079] Jiabao Li: Yeah, so the scenes are built in Unreal Engine, based on their habitats. The particle system is all in real time in Unreal Engine. And when we perform in real time, that's TouchDesigner. We also performed at Ars Electronica's Deep Space. They have the vertical wall and the horizontal floor, and we really make the whole space immersive. People sit inside, and they can trigger things together; they can all play it together there.

[00:29:50.766] Kent Bye: So is there any generative AI at all, or not?

[00:29:53.287] Jiabao Li: There's no generative AI; the only AI part is what's used to categorize the bat calls. It would be interesting to create a new bat language based on all this bat social call data, though. That could be a new direction for the project.

[00:30:11.977] Kent Bye: Yeah, maybe eventually you'll be able to communicate directly with the bats. Well, the other big component is the haptic chair. So you're on this couch, and there are two sweet spots. I don't know if people sitting in the middle are feeling quite the same intensity. It could seat three people, but when I was sitting there, there were just two headphones, so I assumed it was meant for me to be sitting right there in the middle. But there was a series of different types of haptic experiences that were tied to the audio-reactive visualizations being shown. And so maybe you could talk through the process of trying to design the visceral haptic experience of this piece.

[00:30:46.233] Jiabao Li: Yeah, so we want a multimodal experience. The haptics are coupled together with the audio: you hear the audio, and the lower bass frequencies are translated into the vibration. Sometimes the vibration is so strong that it starts to feel like a massage chair, or people call it the sex couch. And so you can feel the cadence of the bats and the beats.

[00:31:10.717] Kent Bye: Yeah, it's really visceral. And I always appreciate multimodal experiences because, I don't know, I just feel like it goes back to the days of the SubPac. I feel like haptics in some ways is fairly overlooked in terms of how to really create that next level of immersion. It's something that's subtle on the surface, but I think really powerful. So yeah, I don't know if you have heard any other feedback from folks. What kind of other experiences are coming up as they're sitting on this couch?

[00:31:36.707] Jiabao Li: Yes, there were people that just kept sitting there for one, two, three rounds of the loop and really enjoyed the vibration. When we showed this in Sheffield, there were people who enjoyed the haptics so much that they turned the dial all the way up; they even created a ring on the sofa. When you talk about the multimodal experience, I think, honestly, the experience under the bat bridge is the best multimodal experience. You go towards the bridge, and you smell this strong guano, which is the bats' poop. It's so, so strong. I don't know if you smelled it.

[00:32:11.535] Kent Bye: I didn't smell it. I didn't notice; I mean, I may have, I just didn't clock it at all. Maybe I wasn't paying attention to it as much.

[00:32:19.458] Jiabao Li: Yeah, as soon as you approach, it's just so strong you can't ignore it. And then you hear the bats' sounds, their social calls, their echolocation calls, if you have really good hearing. If your ears didn't get blasted by loud music, which probably happened to most people during South by Southwest, you can still hear a little bit of the echolocation sound at the very high-frequency end. So there's the sound, there's the smell, and then with EchoVision, you have the sound and vision, and then there are the real bats.

[00:32:50.398] Kent Bye: How did you create the haptic couch? What kind of haptic devices are you using? Did you have to get the couch and install it yourself, or is this something that is sold, these types of haptic couches? Maybe just give a bit more context for how this haptic couch came about.

[00:33:04.329] Jiabao Li: Yeah, so the couch is from my living room. That's one perk of South By when you're based in Austin: I brought my whole living room here. And the haptics are basically subwoofers. We have an amplifier to amplify the music, and then we branch it out, filter out the high-frequency part, and leave the low-frequency part, and the subwoofers make it vibrate together with the music.
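
That "filter out the highs, keep the lows" chain is a classic crossover into a tactile transducer. A minimal sketch of the idea follows; the cutoff, filter order, and sample rate here are assumptions for illustration, not the installation's actual settings.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def haptic_channel(audio, sr, cutoff_hz=80.0, order=4):
    """Derive a subwoofer/haptic feed from the mix: keep only the lows.

    A 4th-order Butterworth low-pass around 80 Hz is a common choice
    for tactile transducers (an assumed crossover, not the project's).
    """
    sos = butter(order, cutoff_hz, btype="lowpass", fs=sr, output="sos")
    return sosfilt(sos, audio)

# Example: a 48 kHz mix with a 50 Hz rumble and a 2 kHz tone.
# After filtering, the 2 kHz content is attenuated away, and what
# remains drives the amplifier and the subwoofers in the couch.
sr = 48_000
t = np.linspace(0, 1, sr, endpoint=False)
mix = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)
lows = haptic_channel(mix, sr)
```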

[00:33:30.365] Kent Bye: So it sounds very similar to how the SubPac worked, which was looking at those lower frequencies and translating them into haptic vibrations. So how did you design each of the different haptic experiences? Did you develop a pattern language? Because there seemed to be a little bit of variance throughout, and maybe there are consistent patterns through each of them. But how do you start to differentiate between the different sections, the different frequencies or techniques or rhythmic patterns? What are the degrees of freedom that you have to work with to create a haptic experience like that?

[00:34:02.758] Jiabao Li: It's really based on the emotion of the scene. The love scene, or the one with a lot of fruits, has a lot of beats, and you go with it; people start to dance with the beats on the sofa as well. So it's really depending on the emotion we want to create, and we design how much we vibrate. And for scenes that are more like the one in the trees, which are more subtle and lingering, we tune down the haptics.

[00:34:30.341] Kent Bye: So what's next for this project? Do you have anything else that you're going to continue to develop it, or other projects that you're starting to dive into?

[00:34:37.858] Jiabao Li: We've been talking about other animals who also use echolocation, like the beluga whale and the orca, so maybe expanding it to other animals. Yeah, and other projects: I just finished a solo exhibition in Tokyo last month, where I showed 10 projects I created from the Arctic Circle artist residency. We were sailing on a sailboat for two and a half weeks, without internet, in Svalbard. That's an archipelago north of Norway, near Greenland. So I created works around polar bear relocation, or futile efforts: I was trying to freeze the glacier melt and give it back to the glacier, or trying to empty the ocean with a coffee cup. We played soccer, but the ball was glacier ice. Yeah, so like ten projects out of that. And what's next after that? I have another solo exhibition coming in May in Austin, and I'm still thinking about the whole narrative for that, yeah.

[00:35:41.735] Kent Bye: Well, yeah, you're really quite prolific. We've had an opportunity to cross paths a number of times. Yeah, you were going to say?

[00:35:48.683] Jiabao Li: Yes. What I'm working on now: I'm on sabbatical this semester, and I'm a visiting professor at Stanford University. I work with the Manu Prakash Lab on plankton: how these tiny single cells can impact climate change, and how to really let people know that something at such a small scale can have such a big impact on one of the most impactful things in the world right now. Yeah.

[00:36:20.290] Kent Bye: Yeah, I was going to say that you're so prolific in all these different projects, and it's always a pleasure to catch up and hear your latest project, hear how you made it, but also these other projects. And we're here at South by Southwest, and you had mentioned that you had an opportunity to go listen to a talk about artificial intelligence. What were some of the things that you were interested in hearing and learning about there?

[00:36:39.266] Jiabao Li: Yeah, so in general, this year I'm interested in the intersection of AI and XR. So today there was this talk by John Maeda, kind of like a report of what to look out for and what interesting work has been done in this space. It was really good. He spent one hour, but the content could really fill five hours. Very hardcore, dry content, really good. I think he will post it online somewhere later. And in general, in the generative AI and XR space, I'm curious to see what some more interesting ways to use it are. Especially with AR and AI, we can be context-aware: no matter where you sit, whatever is around you, you know the core context exactly, and you can generate the UI or the content based on all of that. And I worked on Apple Vision Pro at a time when generative AI was not really a thing yet, so everything was designed around the pre-generative-AI era. I'm curious what can be made during and after this era.

[00:37:52.766] Kent Bye: For me, when I look at AI, it's such an interesting, weird spectrum, because there's a lot of very clear utilities that are coming out of large language models, but there's also this whole range of hallucinations and blind spots and the lack of comprehensive data. I guess there's a lot of promises and potential, but also a lot of perils embedded in there. So when you're looking at this intersection of XR and AI, are you looking at it beyond just the paradigm of large language models? Or is that mostly where the center of attention is, for seeing how people can push the extent of what may or may not be useful in applying that specific deep learning architecture to different contexts? I'd love to hear some of your thoughts or reflections as you start to survey the intersections there.

[00:38:37.852] Jiabao Li: I think beyond that, there's also a really interesting space around uses in biology and the medical space. Like, what if the whole evolution of biology is an LLM? Because we can take the DNA strings, the letters; basically everything can become letters, and those are text. And if those are an LLM, the whole of evolution becomes something you can train, and all the things that prompted our evolution to change, like selection, those are like prompts. What new discoveries can we find in biology? Like, when this event happened on Earth, what biology changed? Those are prompts.

[00:39:27.161] Kent Bye: Interesting. Yeah, it reminds me of a project I saw at IDFA DocLab this past year by Sister Sylvester. It was called Drinking Brecht. And part of what she was arguing against was this tendency to translate DNA into language and have this kind of deterministic view of the connection between biology and how it plays out, because there are all these other epigenetic and contextual relational dynamics that go beyond just those letters. In that conversation we go into much more detail about how she makes the argument in that piece: that this tendency toward thinking in a eugenics way, like DNA is the code of life and therefore we can seize control over it, leads to other problematic outcomes. So anyway, we discussed a lot within the context of that conversation. But yeah, since you're at this intersection of biology, XR, and AI, I don't know if you have any other reflections or thoughts on that.

[00:40:19.671] Jiabao Li: Yeah, I guess, well, I'm not really a biologist, but I feel so much of our biology is based on the Darwinian thought of natural selection, and there's also various evidence and ways to prove there are alternatives to that. But if you use AI to train based on all of that as evidence, then we start to corner ourselves, with all the ethics and bias issues we usually talk about when we talk about generative AI. So we have that problem here too.

[00:40:52.606] Kent Bye: Well, yeah, I guess as we start to wrap up, I'd love to hear what you think the ultimate potential of all these immersive technologies and immersive art and immersive storytelling might be and what it might be able to enable.

[00:41:04.245] Jiabao Li: Oh, you have this question every time. I wonder what's changed over the years. Yeah, I think, again, the most obvious one is to be able to embody another being. We've seen a lot of that on the medical side or the mental health side, and now a lot of my work is on embodying the experience of another species, hopefully because that can help us know them better, and then we can make some difference, we can take some action. In terms of bats, a lot of people think they are filthy, that they carry disease, that they're disgusting and suck blood. But only a few species of bats, the vampire bats, drink blood, and they don't really go for humans; they have no interest in human blood. They drink cow or horse blood, and it's kind of like a mosquito bite; you don't really feel it. So it's really not that scary. But because of that fear, we see a lot of people setting fires at caves, or really wanting to get the bat out when they see a bat in their house. A lot of destructive behavior. Hopefully, through this XR experience, people can get to learn about the bats a bit more and appreciate that they're really amazing creatures. They help with our crops. They eat the mosquitoes. They are great pollinators; we wouldn't have agave without the bats. So that's another part of the experience: when we showed it at the Contemporary, we also served agave. Yeah, so that's what I hope XR can bring.

[00:42:37.378] Kent Bye: Great. Is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:42:42.542] Jiabao Li: Well, keep creating, keep finding corner cases, and keep experimenting in ways the technology was not designed for. I think that's the unique ability of the artist.

[00:42:58.683] Kent Bye: Awesome. Well, Jiabao, it's always a pleasure to get a chance to sit down and hear about the latest multimodal, interdisciplinary art projects that you're making in the immersive space.

[00:43:08.269] Jiabao Li: I want to give a shout-out to another collaborator on the project, Amber Hu, who is the inventor of HoloKit. HoloKit is a great headset for education or general usage because it's only $100-something. You can buy it on Amazon, so it's really good for schools or museums to use. And you can do multiplayer with it, so it's a pretty good tool.

[00:43:35.270] Kent Bye: I think I've seen it at AWE. It's kind of a way to have a head-mounted display that turns your phone into an AR device, right?

[00:43:40.896] Jiabao Li: Yeah. And they won the Innovation Award at South By last year.

[00:43:44.540] Kent Bye: OK, yeah. So yeah, like I was saying, lots of collaborations, lots of interdisciplinary collaborations, multimodal experiences. And yeah, it's always a pleasure to be able to get a chance to sit down and hear a bit more. So thanks again for joining me here on the podcast.

[00:43:55.832] Jiabao Li: Thank you. Always a pleasure.

[00:43:57.844] Kent Bye: Thanks for listening to this episode of the Voices of VR podcast. There is a lot happening in the world today, and the one place that I find solace is in stories, whether that's a great movie, a documentary, or immersive storytelling. And I love going to these different conferences and festivals, seeing all the different work, talking to all the different artists, and sharing that with the community, because I think there's just so much to be learned from listening to someone's process and hearing what they want to tell a story about. Even if you don't have a chance to see a piece, you still have the opportunity to hear about a project you might have missed and to learn about it. So this is a part of my own creative process of capturing these stories and sharing them with a larger community. And if you find that valuable and want to sustain this oral history project that I've been doing for the last decade, then please do consider supporting me on Patreon at patreon.com slash voicesofvr. Every amount does indeed help sustain the work that I'm doing here, even if it's just $5 a month. That goes a long way toward allowing me to continue to make these trips, to ensure that I can see as much of the work as I can, to talk to as many of the artists as I can, and to share that with the larger community. So you can support the podcast at patreon.com slash voicesofvr. Thanks for listening.
