#1552: Fly to You Explores Stories of Separated Families in Division of Korea Using Point Clouds & 360 Video

FLY TO YOU tells the story of Kang Songjeol, who was separated from her childhood friends in the aftermath of the 1950 split between North and South Korea. Oral history interviews were captured in 360 video, and her childhood memories were recreated with point clouds derived from Leica BLK360 LiDAR scans combined with Azure Kinect volumetric captures. The experience manages to virtually fly over the restricted demilitarized zone into North Korea, which is something that would be illegal for them to do physically, but it serves as a symbolic gesture towards reunification. There are still thousands of families hoping for an opportunity to be reunited, and there's a beautiful spatial memorial that brings home that point at the end of this piece. I had a chance to unpack it all with co-directors Sngmoo Lee and Youngyun Song.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the structures and forms of immersive storytelling in the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different immersive stories from South by Southwest 2025, and also starting to dive into one experience that starts to really use the point cloud aesthetic to tell a story in a unique and different way. So the piece is called Fly to You, and it's produced out of South Korea. And so it's at this demilitarized zone edge where South Korea is meeting with North Korea, and it tells a story of a woman who was separated from what ends up being like a really close friend, but she treats her like a sister because they grew up in the same house and they were just really close. So there's a time when there is a separation between South Korea and North Korea around 1950 or so, and some families got separated. And as time has gone on since 1950, a lot of those people who were separated from their friends or family died. They were never able to be really reunited with them. And there's still a lot of them that are hopeful that they will be able to have an opportunity to reunite with them. So this kind of tells the story of one woman in that situation. But at the end, they end with this really powerful memorial of all the different people that have died after being separated from their friends or family. And then there's kind of a fade to black where you get this spatial representation of how many people are dying. So this is basically a pressing issue that, unless it's addressed, all these people will go to their graves without ever having been reunited with their friends or family.
So this story is using 360 video to tell the main part of the story, but it's also using a Leica BLK360 LiDAR scanner and also Azure Kinect depth cameras to be able to capture the people. So they're recreating a lot of the memories, the childhood memories of this woman, in this point cloud representation, and then blending that together with 360 video. And they also do this really interesting shot where they kind of fly over the DMZ line, something that would be totally illegal for them to do if they were physically crossing it. But because they do the spatial capture, they're able to kind of zoom you over in this point cloud representation, as this kind of symbolic reunification, which is a theme that's covered within the context of the film, and something where they're able to use the spatial affordances of the medium in an interesting way. So we'll be covering all that and more on today's episode of the Voices of VR podcast. So this interview with Sngmoo and Youngyun happened on Tuesday, March 11th, 2025 at South by Southwest in Austin, Texas. So with that, let's go ahead and dive right in.

[00:02:57.785] Sngmoo Lee: Hi, my name is Sngmoo Lee, creator of Fly to You. And I've been doing VR showcases, like immersive theater using volumetrics, for the last seven years or so, and I try to explore new ways of storytelling. I am from a traditional filmmaking background. So for me, the most important thing is why this medium is different and what specific story fits this medium. And I've been exploring that for the last 10 years or so.

[00:03:26.075] Youngyun Song: Hi, my name is Youngyun Song. I'm the director of Fly to You. I mainly work in the documentary field, and in VR work as well. With Fly to You, I wanted to make a separated family story using volumetric technology. Yes.

[00:03:46.468] Sngmoo Lee: Youngyun speaks pretty good English, but if it's not enough, then maybe he's going to say it in Korean and I'm going to translate it for him.

[00:03:52.237] Kent Bye: Okay, sounds good. And so yeah, maybe each of you could give a bit more context as to your background and your journey into working with virtual reality.

[00:04:01.621] Sngmoo Lee: Yes, it's a long story actually. I'm from the legacy media. I was a filmmaker, and I'm still a filmmaker and a writer. When I first tried the first version of the Oculus, I saw the future of storytelling. So my first try was 360 video, because I came from a filmmaking background, so it was natural. And actually Youngyun was first AD on that project, called Eyes in the Red Wind. And we tried to embrace the spatial side of the storytelling. And after that, I went to Sundance in 2018 with Eyes in the Red Wind and realized that what's lacking in that film is the interaction and more sense of the space. So I was really into these immersive performances. So I made a project called Scarecrow, and I evolved that into a VRChat version where we can meet in the metaverse. The original Scarecrow was an LBE version. And I'm making an AI version now, not of Scarecrow, but immersive theater with an AI actor. It's called Diamond Dust, which hasn't been widely released yet, where we get to talk to an AI agent who replaced the actors in the immersive theater. So that's one side of my main work in VR. The other side is using volumetrics, because volumetric capture is the only way you can capture a real person in the VR world. So I've been using the technology, and Youngyun and I made a project called Rain Fruits in 2020, which was invited to many festivals. And what we are interested in is how to use the volumetric not just to replicate the physical reality of the real person, because it's not accurate as yet. Why do we have to see the transformation of the human body into this volumetric form? And we play with it. That's how we played with it in Rain Fruits five years ago. And now we are playing with it again, experimenting again with recreating people's memory in Fly to You.

[00:06:08.737] Youngyun Song: I started my VR career with Eyes in the Red Wind with him, and it was so difficult that I said I was never going to do VR again. But as I've been working on VR, I've been thinking a lot about storytelling that can only be done in VR, especially the part where technology is used, and the part where technology and story are intertwined. I've been thinking a lot about how I can use those things with my movie background.

[00:06:46.153] Sngmoo Lee: And he got fascinated by the possibility of combining this technology and storytelling together. And there are areas where traditional filmmaking and documentary cannot go. So he's still enchanted by this medium and he's still exploring it.

[00:07:02.145] Youngyun Song: So, in 2020, I got to work with director Lee Seung-moo on Rain Fruits, and I got to know more about the possibilities. In particular, while working on this project, I thought that we would be able to convey the wishes and longing of grandmother Kang Song-jeol, whom we interviewed and filmed, not only in the storytelling, but also in the technology itself. I think we did our best to create those things.

[00:07:31.405] Sngmoo Lee: Yes, so he made Rain Fruits with me as a co-director, and he saw the possibility of the storytelling. So he went forward, and in this project, Fly to You, he thought he could use the technology to convey the idea of our main subject, which is Kang Song-jeol, the old lady, her memories and her longing to go back, by using this new technology and VR.

[00:07:56.311] Kent Bye: Great. And maybe you could give a bit more context to how did this project, Fly to You, begin?

[00:08:02.615] Sngmoo Lee: For me, VR is all about the space, which the 2D screen lacks in the tradition of filmmaking. I did a couple of projects in between, but when we decided to do another documentary-style volumetric mixed with 360 project, what came to my mind is the divided nation of our country, which is probably the only one nation with the same language having two different countries, two different nations, in one small country. And when you go to the DMZ, which is the demilitarized zone between North and South, the sense of the space is amazing. I mean, I've lived there for almost 60 years now, and even when I visit there again, we kind of think of North Korea and South Korea in a kind of superficial way, and it's very foreign to us. But when you stand there, it's right across the river. And you can just see it visually. And you don't really see that as a different country. It's just a continuum of our own country. And that kind of sense of the space struck me. So I kind of thought that if I do the next project using this spatial sense of the 360 video, I want to do something about the division of the country and the DMZ. And the initial idea was not like this.
It was like we see somebody, and at the very end of the storytelling, we get to see him at the edge of the South Korean border, and then we kind of pull back in VR, and we kind of realize that, oh, this is the reality of the space of the nation. And that was meant to strike people. But by researching it, and I'm from the storytelling background, the narrative background, I always make fakes to move people, you know, that's my side of the storytelling, and Youngyun is from the documentary side. And I believe that with this kind of subject matter, if I were in full control, I would try to minimize it, but there is always an urge for us to make it bigger than life and, you know, make it more fascinating than it is. And this kind of subject matter is about one person's tragedy and the nation's tragedy. I needed a more sincere and humble eye to approach those people, and I think Youngyun is the perfect person to convey that. So I kind of left it to him, to meet people and find the real story and be true to themselves, and let us know what should be told and who has the most vivid memory. And from there on, he was doing many interviews with people from separated families about their memories of their hometown. That's my side of the story.

[00:10:43.422] Kent Bye: Yeah, and I'd love to hear where this project began for you.

[00:11:10.151] Sngmoo Lee: Yeah, he's into documentary. Documentary means many things, but one thing for him is that it's about things disappearing, about the people who are not going to be there for long. And kind of archiving their memories and their story feels more important to him than trying to find new subject matters.

[00:11:30.303] Youngyun Song: So, while working on VR, their stories, their looks... In fact, all of the separated family members are elderly, so they will all disappear one day and no one will listen to their voices. So, I thought it was important to record them, their appearance, first.

[00:11:49.457] Sngmoo Lee: Yes, the first thing that came into his mind was just preserving their existence, the divided families, and they are old and will eventually perish in 10 years or so. So he thinks the most important thing is just to capture their story and capture their reality with 360 video.

[00:12:09.683] Youngyun Song: There are some problems that need to be looked at realistically. For more than 70 years, there have been only three actual meetings. It was a very political event, but in fact, as they get older and older, I thought it was very important to leave data about them for later. That's how I started. Since it is difficult to cover all the stories of these people, I thought it would be better to expand the personal history of one person and talk about the social issues.

[00:12:52.338] Sngmoo Lee: Yeah, so the separated families, there are like hundreds or almost thousands of them. And there were only three times in Korean history that some of them could actually meet their counterpart. So most of them are never going to see their families again. And for him, it's important to archive them, not for the sake of the archive, but so that they can meet the other parties, you know, their sons, grandsons, or whoever, in the future, if it's possible. And VR is that kind of medium, which is different from live video. So that was the first thing that came into his mind. And then he met people, and he realized that we don't have the capacity to archive everybody's story.

[00:13:39.353] Kent Bye: I had a clarification question around the story, because you're talking around separated families, and in the English translation, the relationship of the two people that were separated was described as sister-cousin. But I don't know if it was a sister or a cousin, or, you know, if they had the same mother, same father. Like, maybe you could just describe the relationship of the two people that were separated between North Korea and South Korea.

[00:14:05.808] Sngmoo Lee: That's a Korean thing. Family-wise, it's a sister-cousin, not from the same father and mother. But in Korean culture, at that time, they were almost like the same family. So they lived together and they called each other sister rather than cousin. So to her, it's a little sister rather than a sister-cousin. But genetically, they are not sister-sister. They are sister-cousin.

[00:14:28.853] Kent Bye: Okay. I was confused. Yeah, because...

[00:14:31.881] Sngmoo Lee: We were thinking whether we should put sister-cousin or sister, but she kept saying, my sister, my sister. And then it's really about the two separated sisters rather than cousins. So we just kept it that way, with sister as the main translation.

[00:14:45.319] Youngyun Song: I think it would be good to talk about that too. Sae-byeol's mother passed away, and Song-jeol's mother raised her. So she's more of a cousin, but a little more... Yeah, Sae-byeol, who's in North Korea, her mother died.

[00:15:01.545] Sngmoo Lee: So Song-jeol's mom brought them together in their house. So they grew up in the same home. So it's not just, you know, cousin-cousin, but it's almost like sister.

[00:15:11.303] Kent Bye: Okay, yeah, I just wanted to clarify, because I was confused as to why she didn't come along, but it's because she was from a different family that she was left behind?

[00:15:18.809] Sngmoo Lee: Yeah, yeah, I mean, they thought they could come back, you know, the next day or after, so the most immediate, most closely related family came first, and then she was going to come in the second batch or something, but she couldn't make it. So, yeah, that's why. Okay.

[00:15:37.185] Kent Bye: Yeah, and there's a really powerful scene at the very end where you have photos of people. They have relatives, family, people that they're close to that were separated, and then you fade to black on those people, representing how many of them are dying. You really see the urgency of telling the story right now, because you're trying to archive some of these stories and get them reconnected. So I just wanted to also point to that. But I'd love to hear any reflections on that, because it feels like a really beautiful memorial, but also speaking to the urgency of this issue, that the longer that this goes on, the more likely that they may never be able to be reunited again.

[00:16:15.438] Sngmoo Lee: Yeah, exactly. That's what we exactly meant. And that's my kind of storytelling in VR. Again, it can only be told in that way in VR. Like, actually, you see tons of people's faces around you. So it becomes a dominant existence, a spatial sense, where you can feel that, oh, they are there, not just being told, but they're there with you, and they're just disappearing from your eyes. So that's exactly what you said: you've got to hurry, they're not going to be there forever. But it tells it in a spatial sense rather than a logical sense. So for me, that kind of scene is the most important part of the VR storytelling, and the most effective way to tell the story in VR, rather than argue and protest and tell the people that they're going to die, that they're not going to be there in five years. You just sense it, and then something's got to be done now. So what you got is totally what we really intended.

[00:17:14.656] Youngyun Song: To comment on the last scene, when I work with director Lee Seung-moo, I feel like I'm filling in the parts that I don't have. So before I made that scene, I was wondering if this was possible, but after I made it, it was very intuitive and people were only in that space. I think it was possible only in VR.

[00:17:59.366] Sngmoo Lee: Yeah, that kind of scene is where Youngyun thinks it's nice to work with me, because that's kind of lacking in him in that sense. When he first thought, is it going to work or not, then he saw it, and then he kind of sees that it really hits you in the gut rather than in your brain. And that kind of thing is the power of the VR medium. And he thinks, you know, that's my side of the storytelling, rather than the documentary side of the filmmaking. And then, what was the last one?

[00:18:30.625] Youngyun Song: Anyway, for those parts, rather than laying out a few things with text, it's powerful that you can just go into one space and feel those things at once.

[00:18:40.748] Sngmoo Lee: The space says it all. You don't need words, you don't need explanation. You just feel the space and the absence of somebody, and it tells you in your gut rather than your brain. And that's the power of VR storytelling.

[00:18:55.666] Kent Bye: Yeah, I thought it was really quite powerful as well, because you hear the story of one person, and then at the end, you could show a number, but by showing the spatial representation of that, it just really hit me, with a real emotional impact, so I definitely feel that. I wanted to ask around, there seems to be a number of different volumetric capture techniques for capturing people. It seems like I saw in the credits you're using 8i, and then potentially compositing those into these other scenes, where it felt like the environments were maybe LiDAR scanned or captured in other ways, and then you're creating these point cloud representations. So I'd love to hear a bit around the different volumetric capture techniques that you were using to capture both the spaces as well as the people.

[00:19:35.404] Sngmoo Lee: Yes, we used 8i, and we used tons of footage, and we had some access to the 8i by the government fund. So we did make a full capture of the LED, but we didn't use it that much. Most of the volumetric capture was done with the Azure Kinect, and we can tell a little bit more about the details of the technology side.

[00:19:59.524] Youngyun Song: So, we used the BLK LiDAR scanner, and we mainly used NeRF technology in this project. Since we couldn't scan a wide area with the volumetric capture, we used the NeRF technology to change 2D images into a 3D point cloud. In particular, we scanned once more with NeRF in the DMZ space, and we also scanned with the BLK.

[00:20:28.993] Sngmoo Lee: Yes, so we used the Azure Kinect and 8i, but we also used the BLK360, which is a LiDAR scanner. And we used a lot of NeRF technology, where we can transform 2D into 3D spaces. So it's a combination of all those things.

[00:20:46.145] Kent Bye: Yeah, in the very first scene I noticed in the water, there's some really beautiful reflections. How did you get the reflections of the trees? Was that using some sort of Gaussian splats or NeRF technologies?

[00:20:57.034] Youngyun Song: NeRF technology. Yes, we are using NeRF technology. When I first made this, NeRF technology was just starting to come out. So I think I did a lot of tests on that technology. The reason why we had no choice but to use NeRF technology was because of the experience of passing through the DMZ. It's a very special experience for Koreans and others. Since it's divided, it's close to being an island country. Koreans don't have many opportunities to get to the border. That's why, when we use the NeRF technology, we can go beyond the fence. Of course, it's very small, but I thought I could give users experiences that could go beyond the fence, so I tried to actively use the technology.

[00:21:44.910] Sngmoo Lee: Yes, so NeRF came in during the production. We didn't intend it to be that way in the beginning. And then NeRF was introduced, and we saw the possibility of that medium, what it can mean to Korean people, because we never can cross the borderline. And with NeRF, it can, in a way, because, you know, with the camera we recreate the 3D spaces, and technically speaking, you can cross over, even in a limited capacity. And it was quite symbolic to us. So we explored the possibility, so that we fly with the flower petals and we briefly cross the border, which we could go to jail for in Korea in reality. But, you know, NeRF could enable us to make that happen. And also we could fly over the border just for a brief moment with these technologies. This is one good thing about this new medium, where we get to see new technologies like NeRF. We're not just using it for the technology's sake. We try hard to embrace it and make it a very powerful storytelling medium on top of the technologies. And in this case, NeRF was that for us.

[00:22:53.573] Kent Bye: Awesome. Yeah, I think it's a really powerful moment to have that crossing over. So yeah, that scene of going over the DMZ line, the actual resolution starts to break down. So I could see that, OK, obviously, they're probably not even allowed to put a drone over this line because it clearly is fragmenting. But it was at least a symbolic way of feeling like you're able to reunite in a way. So I thought that was a really elegant way of using the technology.

[00:23:20.143] Sngmoo Lee: Yeah, that's exactly what we meant. I mean, the result was, like you said, not 100% happy, because we wish that we could just fly over a little bit more with better resolution. And probably we could do it with better technology these days. It was almost one and a half years ago, two years ago. But again, like you said, it's all about the symbolic, and just crossing over one inch of the border would do the job. So we're just happy with it.

[00:23:45.585] Youngyun Song: I have something to say about this part. In fact, the part where the resolution breaks down in the DMZ part is because we held the camera with a stick. In fact, drones and things like that are not allowed. So every time I see it, even if I scan it again, the result is the same. If I had used a drone, wouldn't the resolution be better? I think there are some regrets.

[00:24:08.971] Sngmoo Lee: Yeah, technically, yeah. If we used the drone, it would have much better quality, but we are not allowed to. So we had to use the stick and put it up as high as legally possible. So he still feels a little bit sorry; if only we could really, you know, do it from above.

[00:24:25.612] Kent Bye: Yeah, so in terms of formats, there's a mix. In the very first scene, there's like 360 video, but it's also flattened to be 180, so you get more of a wide-angle view. And then you have other volumetric techniques that you're using in this story. But maybe talk around this use of 360, a high-resolution video, as contrasted to more of the lower-resolution point cloud representations. And how are you thinking around the use of those different types of formats?

[00:24:54.858] Sngmoo Lee: Yeah, honestly, I really don't like the quality of 360 video. And I really like the poetic quality of the point cloud. So aesthetically, I'm not very happy about this mixture of the 360 reality transforming into this point cloud poeticness. But we've got to see her in person, you know. In the other project, Rain Fruits, we barely see the 360 video of the narrator at the very end. But in this case, people really have to see that she is a real person, not a ghost, and she's there at full scale, with full resolution. So we had to do it that way. And the transition was not as smooth as I intended. I always liked that kind of dissolve quality from 360 to the point cloud world, but it didn't quite work. But that's the best we could do.

[00:25:52.718] Youngyun Song: Youngyun, do you have anything to add? I think I was a bit stubborn from doing documentary. I couldn't think of any other equipment that I could use to show the actual grandmother's house, the actual grandmother's space, and the fact that she actually went to the DMZ. So it's very important that the person actually went to the actual place and was there. And as proof, as a documentary document, I used them because I thought they could be reused. I'm a bit disappointed. If I have a chance to make this again, I think I'll think more about it. What do you think? Is the quality bad or what?

[00:26:32.280] Sngmoo Lee: The quality is bad, but I think it would be better if I thought about the way I shot it.

[00:26:36.564] Youngyun Song: Is it because it's not pretty?

[00:26:43.712] Sngmoo Lee: Yeah, I mean, he kind of insisted that we have to use the real footage of her, even though he knows that it's not going to be aesthetically perfect. And that's the part he is not 100% happy about, the way he shot the subject and the way the live 360 video is combined with the other parts of the VR. But people really have to know that she is there. She was in front of the barrier, and she lives in that small apartment. And from the documentary background, he thinks it was very important to portray that, with or without the aesthetic value.

[00:27:23.752] Kent Bye: Yeah, and in terms of telling the story of your subjects, you are capturing the oral history, so we hear her telling the story, we hear the audio of her story. But then there's also a number of different places that were captured, and then you are having recreations with these children acting. So I'm imagining you're using capture technology and then compositing those human actors within the context of the physical space, since it's difficult to capture both the space and the people at the same time. So maybe just talk about that process of choosing different places that are featuring aspects of her story that she's telling.

[00:28:00.140] Sngmoo Lee: One important part of the project is to show her her childhood and make the memory revive in front of her eyes and our eyes. So in that sense, we tried to recreate the memory, and point cloud is the perfect medium for that, because it's always abstract and ambiguous, and it becomes something else suddenly. So I think that served the purpose. And again, the technological side, are you asking about both?

[00:28:32.616] Kent Bye: Just if you had to composite or combine, because I'm assuming that you didn't capture them at the same time. But yeah, just the process of if you had to individually capture each of the humans and then put them all together in the scene.

[00:28:43.489] Sngmoo Lee: Yeah, that's the idea behind it. And how we did it is like...

[00:28:49.318] Youngyun Song: The background can be scanned with the BLK360, but the moving parts cannot be scanned, so we filmed the characters in the studio. When I was working on Rainfruits five years ago, I learned that the feet of the characters and the contact points were not erased well. So I used the know-how of those things in this project. I was able to do things like peeling off a black cloth or doing it in a white background.

[00:29:25.888] Sngmoo Lee: Yes, so the background is LiDAR scanned with the BLK360, but the BLK360 cannot capture movement. So we combined them together, and from doing Rain Fruits five years ago, we know the limitations of the Azure Kinect, where especially when somebody touched the ground, his feet disappeared and that kind of thing. So he learned from that, so he used a black mat so that it isolates the people from the ground and that kind of thing. So even with a low-end technology like the Azure Kinect, we achieved something beautiful in a way. People thought it was 8i, but we didn't use 8i on that side. 8i is for Song-jeol when she's in the wheelchair, but all those girls playing, that was like a $300 machine, which is the Azure Kinect.

[00:30:15.307] Kent Bye: OK, so you were able to make do with the Azure Kinect, especially because you do have some camera movement. But are you using multiple Kinects or just one? One. OK, yeah, so just one. So you're using just one Kinect. So yeah, you're just cheating the angle, because on the other side they'd be empty, right?

[00:30:32.005] Youngyun Song: Yes. And when we captured with the Azure Kinect, we had the know-how of lighting, so we were able to shoot that clearly. We talked about how we were going to transform this in the second half of the shoot, so we started by designing the steps beforehand.

[00:30:56.331] Sngmoo Lee: Yeah, again, learning from Rain Fruits, which was our first volumetric project, we knew the limitations and the possibilities. So in this case, we knew what the final product would be. So we could design, in a way, the lighting and all that kind of stuff. We kind of see the end product in our mind, and we kind of make it happen during the production. So it's not like, you know, shoot and fix later. It was pre-designed for the final look, and we did it in an efficient way thanks to our lessons from Rain Fruits.

[00:31:31.036] Kent Bye: In this piece, with the LiDAR scanner, LiDAR doesn't necessarily detect color, as far as I know. Did you have to add color to everything and colorize the LiDAR? Yes.

[00:31:41.460] Youngyun Song: We added additional lighting to the LiDAR data in Unity. And then, about the background, the LiDAR scan part, that's how we did it. The LiDAR scanner itself, the BLK360 itself, has RGB data, so we used the color values we already had.

[00:32:02.688] Sngmoo Lee: The outcome itself comes out in black and white; it doesn't come out in color, right?

[00:32:08.211] Youngyun Song: The outcome comes out as the first RGB data.

[00:32:10.772] Sngmoo Lee: Our scanner, the BLK360. The BLK360 has RGB data in it, so we could use that. And also, we are using the Unity lighting system, adding lighting with Unity so that it flourishes a little bit.

[00:32:25.039] Kent Bye: And I also noticed that in some of the scenes, there is a little bit of dynamic motion. Were you adding shaders to be able to animate the point clouds?

[00:32:33.468] Youngyun Song: Yes, we are using VFX Graph and also the KGO Effect as well. Our developer is in the UK. Actually, we never met; we just met on Zoom. He is very specialized in point clouds. So yeah.

[00:32:54.607] Sngmoo Lee: Yeah, Charlie helped a lot. We never met him in person, actually. We did Zoom conversations. He's such a brilliant VFX artist and engine artist. So his wisdom and his expertise in point clouds and the game engine helped a lot to realize this one.

[00:33:13.163] Youngyun Song: It was 2021 when we started this project. There was a pandemic all over the world. We searched for a lot of people to work on this project. I personally searched for people on YouTube. I met Charlie there. I asked him if he wanted to work with me, so we worked online.

[00:33:37.084] Sngmoo Lee: Yeah, during the pandemic, we were looking for our technical director, and Charlie was the one she found on YouTube, and she liked his style. So we approached him, and he liked the project, so he joined the team. That was three years ago, and it's been a very good collaboration.

[00:33:54.486] Kent Bye: Well, this experience, Fly to You, is being shown here at SXSW under the XR Experience Spotlight. So I'm just curious if you've had a chance to show it to Korean audiences and how this story resonates with them.

[00:34:07.573] Sngmoo Lee: Not yet. Not yet. We showed it at Raindance, and we showed a little demo at SIGGRAPH. This is the first big festival. We are still in the process of fine-tuning it, but we're probably done by now, so we are planning to release it in Korea. And for Songjeol, like I said, we really wanted her to see her hometown and her memories, but she couldn't stand the first minute of VR. She said, you know, I know the story and it's beautiful. She saw it in 2D, but she never really experienced the full 14 minutes in VR. And she never will.

[00:34:49.953] Youngyun Song: I wanted to show it to Grandmother Kang Song-jeol first, but she couldn't watch it because of her blood pressure; it was very hard for her to see it. I hope there will be more opportunities for Koreans to see it. It's a problem that we need to solve in the future, and a problem we need to stay interested in, so I hope that many people in Korea will have the opportunity to see it in VR.

[00:35:26.178] Sngmoo Lee: Yes, the lady is very old, and she has a problem with high blood pressure; she lost sight in one of her eyes, so she couldn't really enjoy it. But she appreciates what we've done, and she appreciates that her story is being told to a worldwide audience. And this is not mainly about her story; this is about the nation's story. So of course our primary target is Korean audiences. We have some issues with VR as a distribution format, like everywhere else, but we are looking to distribute it as widely as possible, so that as many people as possible can see it, learn our nation's history, and realize that these people are still living there, still longing to meet their families. So we really want this story to resonate worldwide, but also with the Korean people.

[00:36:16.798] Kent Bye: Awesome. And finally, I'd love to hear what each of you think the ultimate potential of virtual reality and immersive storytelling might be and what it might be able to enable.

[00:36:26.725] Youngyun Song: VR reminds me of a lot of difficult moments. Whenever we work on something, we always use the most advanced technology, but in the future it becomes the technology of the past. I'm curious how VR will affect the pieces we've made as time goes on. While working in VR, I feel that more and more people are entering the VR market, and I think it has more and more potential.

[00:37:18.331] Sngmoo Lee: Yeah, it has limitations, but she thinks it's growing and many people are coming in, so there will be a brighter and bigger picture for VR storytelling. And for me, as we all know, it's kind of frustrating. When I first started, they said it's technology for ten years later, and now we are at the ten years later. And now they say it's ten years later again, so we really don't know when it's going to happen. But I think it should happen, because people are not meant to look at a tiny LED screen. We live in an age where one person can handle a tremendous amount of data, and doing that by looking at the tiny screen of an iPhone is not natural. So it will go away eventually, and then we will gain back the space that we lost to the frame of the Renaissance painting. Ever since then, we have been framed: through cinema, through TV, and now through the mobile screen. It's against human nature, and it's against nature, so we will be freed from the frame and return to architecture and sculpture. That's more natural. With the help of AI, it's definitely going to happen sooner or later, and I truly believe that's the ultimate medium of storytelling.

[00:38:42.648] Kent Bye: Is there anything else that's left unsaid that you'd like to share with the broader immersive community? Any final thoughts?

[00:38:50.091] Sngmoo Lee: Hang in there, and we'll make something great together. Yes. Same words.

[00:38:57.774] Kent Bye: Awesome. Well, I really appreciated this piece and its point cloud aesthetics. And flying over that line is something that I also noticed while I was watching it, thinking, oh, wow, they probably can't actually cross this line. And the way that it ends: I think in a normal 2D film we could watch it, but that moment when you're completely surrounded and immersed, getting this spatial metaphor for how big the story is beyond just this one individual story, really landed for me. So, just really powerful, and I very much appreciate both of you joining me here on the podcast to help break it all down. So thank you so much. Thank you very much. Thank you, Kent. Thank you. Thank you very much.

Thanks for listening to this episode of the Voices of VR podcast. There is a lot happening in the world today, and the one place I find solace is in stories, whether that's a great movie or documentary or immersive storytelling. I love going to these different conferences and festivals, seeing all the different work, talking to all the different artists, and sharing that with the community, because I think there's just so much to be learned from listening to someone's process and hearing what story they want to tell. Even if you don't have a chance to see a piece yourself, you still have the opportunity to hear about a project you might have missed and to learn about it. So this is part of my own creative process of capturing these stories and sharing them with a larger community. If you find that valuable and want to sustain this oral history project that I've been doing for the last decade, then please do consider supporting me on Patreon at patreon.com slash voicesofvr. Every amount does indeed help sustain the work that I'm doing here, even if it's just $5 a month. That goes a long way toward allowing me to continue making these trips, to ensure that I can see as much of the work as I can, talk to as many of the artists as I can, and share that with the larger community. So you can support the podcast at patreon.com slash voicesofvr. Thanks for listening.
