#86: Danfung Dennis on developing the language of cinematic VR with the 360-degree Zero Point experience

Danfung Dennis talks about the process of making the first commercially available, 360-degree, fully immersive VR cinematic experience called Zero Point, which was released today on Steam. I see Zero Point as a historic VR experience that will likely catalyze a lot of other ideas and experiences within the realm of cinematic VR, and so it’s definitely worth checking out.

Danfung is the CEO of Condition One and started looking at 360-degree video back in 2010, when he was creating immersive experiences for the iPad and iPhone. He sees Zero Point as a camera test and experiment that shows the evolution of 360-degree video capture technology, starting from monoscopic 180-degree video and maturing to the point of 360-degree stereoscopic video using RED cameras shooting at 5K and 60 frames per second.

Danfung hopes that Zero Point will be a catalyst for other experiments with 360-degree cinematic experiences that help to develop and cultivate the new syntax, language and grammar for creating immersive VR experiences. Zero Point uses the constructs of a traditional documentary film, yet I was personally left craving something focused more on putting the physical locations at the center of the story rather than the other way around. Danfung seemed to agree with that sentiment, saying that moving forward what is going to matter more is using the strengths of the VR medium to create a more visceral and emotionally engaging experience where the location, visuals, and audio are all synchronized.

In terms of the language of VR, Danfung had many insights, including that cuts between drastically different scenes are a lot more comfortable than moving around within the same scene. And while fading to black seems to be a safe way to navigate between scenes and minimize the disorientation of teleporting instantly to a new place, he also sees that it could be overused, and he's interested in experimenting with longer shots and fewer cuts. He'd also like to iterate more quickly by creating shorter experiences that are still long enough to create a sense of presence, allowing a wider range of focused experiments to learn more about the language of cinematic VR, which is still in its very early days.

There's also an interesting experiment in blending computer-generated experiences with live-action footage, and Danfung wants to explore this further. He believes that once the resolution gets high enough, 360-degree video has the potential to provide a sense of presence more quickly than a computer-generated environment.

Finally, Danfung sees tremendous potential for VR to transcend the limitations of mass media and to provide a more advanced language for sharing stories that take people into another world, another mind, and another conscious experience. There is a lot of potential to explore more abstract concepts and to use the VR medium for pro-social causes to help make the world a better place.

Zero Point is now available on Steam here.

TOPICS

  • 0:00 – Introduction
  • 0:35 – Process of innovating and creating 360-degree video. Wanted to create powerful immersive experiences, tried an early prototype of the DK1, and knew that VR was going to be his medium. Started with astronomy photography, using 180-degree fisheye lenses to capture the night sky with a DSLR. Saw limitations to 180-degree lenses, started adding more cameras, and now has an array of RED cameras shooting at 5K and 60 fps.
  • 3:20 – Mix of different video styles and techniques. It's an evolution of different capture systems and playback. Started with 180 degrees and mono and kept innovating the technology to the point of a full 360-degree sphere in stereoscopic 3D. There is quite a mix throughout the VR experience. Get it out and get feedback; it's a signpost of what's to come. It's a camera test and an experiment.
  • 5:00 – It's easy to break presence. Seams and black spaces are immediately noticed, pulling people out of the experience and breaking the sense of presence. A large part of the process was trying to iron out the glitches. You have to have a very seamless image because people will notice the errors. The resolution needs to be higher than the DK2.
  • 6:54 – Mixing the form of documentary with being transported to different locations within VR. What is VR film strongest for? Drawing on the techniques we knew before, but those film techniques don't really apply to VR. Zero Point takes a more traditional documentary approach, but moving forward the core experience is going to matter more. How emotionally visceral an experience can be is what's going to be most compelling. The scenes and environments will be key. How to best convey emotion with the VR medium is an open question. It's a new medium and a new visual language that's being developed.
  • 9:00 – Zero Point as a “stub article” that will catalyze ideas about the strengths of the VR film medium. Designing around the location first. VR will be visuals-first, then audio; design around that. This is what high-resolution, high-framerate VR looks like. Hopes that Zero Point is a catalyst to start putting out experiments to see what works and what doesn't work with the medium. What would happen if you put audio first?
  • 11:14 – Length of Zero Point as a VR experience, and optimizing around that. What would the optimal length be? It's still to be determined. It has to be comfortable; the DK2 is a lot more comfortable than the DK1. You need enough time to really get that sense of presence, at least a couple of minutes. 15 minutes does feel like a long time to be in VR watching a video. Hopes to iterate more quickly on shorter experiences.
  • 12:58 – The introductory sequence is a CGI experience, but most of the film is 360-degree video. Interested in blending CG with live action; they're very different. With video, the brain can be more quickly convinced of being in another place IF the resolution is high enough. CG is more acceptable at the resolution of the DK2, while video resolution is roughly at the level of a DK1 at the moment. Need another iteration of panels at a higher resolution.
  • 14:58 – Cuts, fading to black, and the language of VR: what works and what doesn't. It's a new syntax and new grammar that needs to be developed. Fading to black works, but isn't satisfying. Long, unbroken, uncut shots are the starting point: not too slow, still dynamic. It's easier to cut between scenes that are very different than to cut within a similar scene. Longer shots and fewer cuts is where he'd like to go. But it took years for the language of cinema to develop; we're still in the research phase of what works in cinematic VR experiences.
  • 16:50 – Movement within VR. Had some shots with motion, but also some stationary shots. Trying to push movement and moving through space. Need instant acceleration to make it comfortable: in film you ease from stationary into movement, but in VR you need to go straight to speed. Head bob is too much when moving and causes motion sickness. Movement needs to be stabilized, such as by a three-gimbal drone, and then you get more 3D information. But VR needs new rigs.
  • 19:00 – Experiments with audio. Different techniques used throughout the film. The simplest approach is putting binaural microphones within your ear canals; Danfung had a binaural mic in his ears recording some scenes, but there's no head tracking.
  • 20:32 – Different locations and different experiments with the scenes. The driving force is: what is an effective VR experience? Direct engagement? Having things happen around you? Being a character with an avatar and some interaction? There will be a wide range of different experiences. People, animals, and complex environments are compelling. When someone directly addresses the camera, you start to feel like you're there in real life. When people react as if they're there and respond to social cues, that's the sense of presence.
  • 22:34 – The power of VR is that it can put you into another conscious experience, another mind, and another world. It's a new form of communication and a new form of emotion. Apply this to situations that are abstract, like global warming, and potentially use it for pro-social goals. We can go beyond the type of content we're used to that is mainly controlled by mass media. It could provide a more advanced language for sharing stories in a new way. VR has tremendous potential, and it's still such early days.

Zero Point Official Trailer from Condition One on Vimeo.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.956] Danfung Dennis: I'm Danfung Dennis. I'm the CEO and founder of Condition One. And we're a virtual reality studio creating 3D, 360-degree video for the Oculus Rift and other VR headsets. We're going to be releasing Zero Point, the first VR film on Steam. And yeah, really excited to get it out there and kind of our first foray into VR video.

[00:00:34.480] Kent Bye: And so maybe you could talk a little bit about, you know, diving into doing 360 video because I just was looking through your Twitter feed and it looked like you had like three red cameras configured in some way to be able to capture all the footage and then stitch it together. So it sounds like you were a pretty early adopter and creating your own gear to be able to do this. So maybe you could just talk a little bit about what your process was to actually create 360 degree video.

[00:01:02.725] Danfung Dennis: Yeah, I started working on this in 2010, and at that point we were developing on the iPad and the iPhone, creating this interactive window in which you could move the device and the video would interact, but really the vision was to create powerful, immersive experiences, and all of that development was a stepping stone to VR. And so when you got to try the DK1, Palmer sent us an early kit and just knew immediately that this was going to be the platform, the medium that I wanted to create stories in. It was that feeling of being somewhere else that you could actually put someone into the shoes of someone else and let them experience that, that I've been trying to create. And then, you know, we're looking at planetariums and domes and how that was created. And we kind of started with astronomy photography using these 180 degree fisheye lenses that would capture the entire night sky, 180 degrees from horizon to horizon. And so these planetariums, which had a lot of reflections, so dark experiences looked much better, and so the night sky. And so we sort of started with a DSLR and a fisheye lens. I'd been working as a filmmaker and I had helped push forward the DSLR video movement and creating high quality, which sounds archaic now, but it was 1080p video at 30 frames per second. That was a big step for a camera that could cost $3,000. So it started with these large sensor DSLR cameras, fisheye lenses, doing 180 degree mono videos. And as we got better and seeing that there was real limitations to 180, started adding more cameras, more fisheye lenses, higher resolution, higher frame rates, and getting up to these red epics that were shooting 5K, 60 frames per second. We had an array of them shooting at 360 degrees to try to capture video that was optimized for VR and VR first. We didn't want to try to convert anything that, you know, you could convert a 3D movie and it looks decent and it's a great 3D experience. 
But VR video is just an entirely new medium and a new language. And so we're trying to explore that.

[00:03:16.055] Kent Bye: Now, having a chance to get a beta preview of Zero Point, There seemed to be a mix of different visual styles in terms of like sometimes it was a 360 degree video, sometimes it was just maybe a 90 degree view with an inner view that was stereoscopic, sometimes it seemed to be non-stereoscopic and sometimes it was stereoscopic. Maybe you could talk about the mix of different 360 degree video techniques that you're using within Zero Point.

[00:03:44.900] Danfung Dennis: Sure, so Zero Point is an evolution of our different capture systems and our workflows and pipelines and playback and we actually started shooting 180 and mono and as our camera systems improved our shots kept getting better and so as we were shooting the technology was increasing so quickly that the shots kept getting better and better and better. to the point where we could do a full 360 degree sphere in stereoscopic 3D which takes the longest to make but we ended up replacing a lot of the early shots that we made with some of the earlier cameras with some of our latest tech and so in the film we still leave compelling shots that were done with earlier camera rigs But you will see some of that are just 180, some are mono, some are, you know, you still see some seams and many have resolution limitations, but it's getting better. And so I think, you know, we want to release this to the VR community, to enthusiasts, to developers, people that have a DK2, instead of waiting until the consumer version. You know, we wanted to get this out to the community and get feedback. And, you know, we know that this is just sort of a signpost of what is to come. This is an experiment, a camera test, and there is so much more that we can do.

[00:04:57.696] Kent Bye: Yeah, I do really see it as sort of a first stake in the ground of like, okay, here's the first take. And, you know, I, I wonder from your own experience of showing it to people and getting feedback, what were some of the feedback that you got in the process of making this that really kind of fed into what the final zero point film became?

[00:05:17.895] Danfung Dennis: So we found that it's very easy to break the sense of presence. If you have seams that are obvious, if you have any black spaces in the video, people notice them immediately. And it makes them pull out from focusing on what they're seeing to, oh, here's an error. Here's a glitch in the matrix. And I focus on that. And it breaks that feeling that you are somewhere else. And so it was Trying to iron out a lot of the technical issues that come with working in really high resolution, high frame rate, multiple cameras, and just trying to be like, this doesn't look good, let's try to do this, and adding more cameras here, and trying to shoot down, and shooting up. whether to do those in mono or 360. And so there's a lot of considerations, and it just becomes more challenging the more perfect you want the image to look. But really, you have to actually have a pretty seamless image for people to notice. People can notice the errors quite quickly. And we know that this isn't where it needs to be. It's, I think, a good first step, and we're excited to get it out there and show what we've been working on for quite some time. But, you know, we know the resolution needs to be higher. The DK2, that screen, it's a huge leap over DK1, but we're really eager to see some of the 1440p panels come out. And, you know, we shot at a much higher resolution 6K Master, so when these panels and these screens get higher res, we'll be able to accommodate them and have higher resolution footage. So, yeah, we know that this is sort of just, like you said, a stake in the ground.

[00:06:54.443] Kent Bye: Yeah. And so after going through this experience and doing this, uh, as I was watching it, it felt like, okay, this is kind of like a standard documentary, but yet I'm being transported to all these different locations and you're kind of like mixing those two. It was almost like this omniscient audio feed of these people talking about this, but yet I'm in this location. And I did feel like I was transported to a lot of different locations and I thought, Wow, I did sort of feel like I went to these places. But I'm curious about your own take of what you think this medium is the strongest for and what type of things you might change in the future projects.

[00:07:33.345] Danfung Dennis: Great question. This has been such a learning experience. You know, we're drawing on what we know that has come before. And as a filmmaker, as a journalist, we just draw from our storytelling abilities from what we've honed on previous mediums. And it's quite clear that those techniques don't really Apply in VR where we're starting from the ground up and so even zero point it sort of taking a more traditional documentary. Film approach where as we know that moving forward that it's going to be the core experience that that matters more. than even necessarily a narrative. It's how viscerally emotional can an experience be that I think will make the best VR and the best presence. It won't necessarily be here's a script and this script is actually what is the cohesive whole that ties it all together. It's going to be the emotions, the scenes, these environments that I think will give a sense of being somewhere, but also give a sense of emotion. I guess it's how do you convey emotion effectively in this new medium that, you know, when people figure that out, that's where VR gets exciting, gets interesting, and we can move beyond some of the constraints of the previous mediums that have come before. It's going to be a combination of a whole lot of different types of storytelling that go into VR, but ultimately VR is a new medium and a new visual language.

[00:08:59.515] Kent Bye: Yeah, and I think that there's this concept in Wikipedia where you write a stub article and they found that more people would take a stub article and edit it and improve it than if there was nothing there to begin with. And so in some ways I see Zero Point as this stub article of like the first take of what the medium of a VR film could be. And it's going to, I think, catalyze a lot of people's ideas of like, oh, no, no, no, it should be this way. And for me, my takeaway from actually experiencing it was almost focusing on the location first and then designing everything around the location rather than sort of having this narrative audio feed coming in and sort of putting the visual to it. So what are your thoughts on that feedback?

[00:09:43.927] Danfung Dennis: Completely agree. I mean, this is going to be visuals first and audio. And I think they're actually probably equally important, and design the experience around that. And so, yeah, and I think it is really important to set the bar. And here is what 3D 360 video looks like now. It's going to change immensely over the next months and years. But once you can visualize it, once you can say, OK, that's what live action, high frame rate, high resolution video looks like in VR. And this becomes clear what you need to do next. It happens in all sorts of different types of fields, you know, whether it's like you said, editing a Wikipedia page, or extreme athletes. The four-minute mile was this barrier that could not be broken by a long-distance runner until it was smashed by Roger Bannister. And then a whole series of sub-four-minute miles were run. And so it wasn't actually the physical limitation of, can a four-minute mile be run by a human being? It's, can you visualize someone actually doing that? Once someone does it, then people say, OK, it's possible. It can be done. And I'm going for it as well. And I think that's what I hope Zero Point is a catalyst. It is a catalyst for people to start putting out their videos, their experiments, and saying, OK, this works well. This doesn't work so well. What if it's an audio experience? And everything is built around that. And put audio first over everything else. And I think that can be highly effective. And so there's so much that is yet to be figured out.

[00:11:13.842] Kent Bye: Yeah. And in terms of the length, I'm curious if you thought about what people's tolerance would be from going into a virtual reality experience because, you know, a lot of people within the virtual reality community have kind of like their VR sea legs built up, but yet if somebody's watching this for the first time, you don't want to necessarily overwhelm them with putting them into VR for the first time and having them sit there for 90 minutes. So maybe you could talk a bit about the decisions that you were making in terms of what the optimal length would be for a VR cinematic experience.

[00:11:43.506] Danfung Dennis: Yeah, that's still to be determined as well. And I think there's going to be a couple of different ways to look at it. And I think, you know, first is, you know, it's got to be comfortable. And with the DK1, we certainly had issues. DK2, far more comfortable and really not a whole lot of issues with motion sickness. But there does need to be, I think, a certain amount of time inside VR where you believe in it for presence to start to be able to exhibit itself. A couple of, you know, just if you're in for a minute or so, you're not completely in yet. And I think there is this time that you have to kind of be in maybe a couple of minutes of very good VR and the feedback is very good. All the latency is low and it's just everything is working quite right. And you get that magic and your brain's like, I'm there. So I think there is, you don't want it too short, but right now. You know, 15 minutes does feel like quite a long time to be in VR in a video. So we probably wouldn't go longer than that for now. And, you know, I think we can also just iterate more quickly on shorter videos. Uh, you know, what, what can you do in one shot? What kind of story, what type of experience can you give in simply one shot? And then let's figure out how to string these together where we're still at such early days on this.

[00:12:57.986] Kent Bye: Yeah. And I don't want to give too many spoilers of the film, but you know, the introductory sequence is more. of a computer-generated sequence, and it comes back to that at the end, and then when you're actually seeing the film, most of it is, throughout the meat of the experience, is 360-degree video, but there was a very distinct difference, I thought, in terms of that sort of CG experience versus the live-action film, and so when we're talking about these cinematic experiences, I'm curious, like, what your take is in terms of blending the two.

[00:13:30.457] Danfung Dennis: Yeah, I'm really interested in blending CG and live action. It's hard though. It's hard to get it right. And I think, you know, we were trying to sort of experiment with that. What does it feel like to transition from one to the other? And they are very different. When you're in video, I think the brain at a subconscious level is convinced more quickly that it is VR, but only if the resolution has crossed a certain threshold. And I don't think the resolution has crossed that threshold on the DK2. And I've had a chance to try some of the newer headsets at Oculus Connect, and that has, I believe, has crossed some threshold. where VR video looks really good. And I think the resolution doesn't hold up as well for video as in CG. And I think there's a lot of reasons for that. But CG is, I think, a little bit more acceptable right now on the DK2. Whereas video, it's sort of video is at its sort of DK1 stage right now. It's you're like, okay, you know, I wish there was more resolution and I can see where this is going, but it's not quite there right now. And I think that's sort of where we sort of see it. It's just we need one more iteration of panels. Let's get to the 1440p and then live action video is going to be super convincing and look great. And then the resolutions can continue on and get higher and higher. and it's going to just become more convincing. But we're still at that point where I think CG looks better now, but video potentially could be an extremely convincing experience when those resolutions get higher.

[00:14:59.005] Kent Bye: Yeah, and one thing I also noticed was that there were some cuts and then there was a lot of fading to black. In terms of the language of cutting between different scenes within a 360 degree VR cinema, What were some of your lessons learned in terms of what you thought worked and what completely did not work?

[00:15:18.501] Danfung Dennis: Yeah, it's really hard. I mean, this is a new language, new syntax, new grammar. We're going to have to invent all of that. And we found the fade to black to fade up was an obvious safe way to go, but not very satisfying. It's sort of, okay, we can do that, but it kind of breaks it. And if you do it a lot, I think it's just not what VR needs. And I think a long shot, kind of unbroken, uncut, are sort of the ideal starting point. What can we do in these long, long scenes where it doesn't feel too slow, things are still happening, still dynamic? But cutting, you're not moving around inside a single environment. I think it's easier to cut between two very different scenes than it is to cut within one scene where it's somewhat similar. It's more jarring to suddenly be transported 10 feet than it is actually to be put into a whole new space. So I think we would like to try to do longer shots, fewer overall cuts. And when those cuts happen, they're quite obvious, and you're moving to a new environment, and you kind of understand that. we don't even know. I mean it took cinema years to figure out editing and that you can do a close-up to a wide and this means that and we have to figure out all of that that storytelling that's going to have to come with VR and really it's in this research phase of what's effective, what's not. We know most of the traditional rules of traditional filmmaking don't apply and we've got to discover what is effective, what is VR good for.

[00:16:47.540] Kent Bye: Yeah, and after I watched the zero point experience, I went back and looked at the trailer and noticed that there was a number of shots that weren't in there, and some of them were moving around quite a bit more. And I'm curious if you found that having more stationary shots were more comfortable. At the same time, you did have some shots that were kind of having a motion. but maybe at a consistent motion and not accelerating too quickly to kind of give people motion sick. So what were your experiences in terms of playing with moving the camera around versus just keeping it stationary?

[00:17:18.645] Danfung Dennis: Another great question. This is something that we were really trying to push was movement and being able to move the camera through space. But it's really hard in VR. I mean, you need instant acceleration to make it comfortable. And so it's really counterintuitive when you try to film a cinematic shot In film, you want to gradually ramp up the velocity and slowly sort of just ramp up to speed and then slowly ramp down. In VR, you just want to go to instant acceleration and be at speed. It's more comfortable. There's less time where there's a mismatch between your inner ears and your body. So we had some scenes from E3 that we shot last year. and it was on the trade floor, hundreds of people going by, and we had our three red epics on a shoulder-mounted rig, and our cameraman was walking slowly and steadily just through the trade floor with people flowing around, and it was such a cool shot. But just the little amount of camera bob from the cameraman walking was enough to cause motion sickness. And so we just decided to drop a lot of those walking scenes. It's going to have to be smoother. It's going to have to be steadier. You're going to need a stabilized three gimbal drone carrying the rig to really get something that's ultra smooth and comfortable. So the tripod shots, the fixed shots are certainly the safest way to go, but not the most satisfying. I want to move the camera. I want to be able to move it through space. You get, I think, a much higher sense of immersion There's much more 3D information passing by, and you're pulled into the experience more. But it's hard to do, and we're going to require new types of rigs to be able to do that effectively.

[00:19:01.253] Kent Bye: Cool. You also mentioned here that you were doing 360 degree audio, spatial audio. What kind of audio rig were you using to be able to record this experience?

[00:19:12.108] Danfung Dennis: Yeah, so we've been experimenting with audio a lot as well. And it's something that, again, it's not consistent through the film. We were trying different techniques, different types of capture devices. But what is sort of the simplest way is just putting binaural microphones in your ear canals. It sounds strange, but you can use the shape of your ear and the shape of your head and shoulders as a way to capture spatial audio. And there's one channel for your left ear and one channel for your right. And so some of the scenes I had this binaural mic in my ears recording the scene. It certainly adds 3D spatial audio, but there isn't the head tracking where it's much more limited to a left-right type head tracking pan fade. It's hard to have audio sources differentiate between sources that are in front of you and behind you unless you have sort of a visual cue that ties to the source of audio. And it's really optimized for my ears. So if you've got a different shape ears, you're not going to get that ideally spatialized sound. And so, you know, I think there's going to have to be a different solution from that type of capture. I'm really interested in, you know, ambisonic, HRTFs, binaural. It's sort of really hasn't sort of shaken out as to you know what is effective audio in VR and how do you capture it in a live environment.

[00:20:32.253] Kent Bye: Yeah and throughout the experience you do have different locations and I noticed that you were trying to doing different experiments like you know, standing in around a group of people or having someone come up to you and directly address you and say something to you. And then, you know, what were some of the things that you were trying to go for in terms of the types of experiences that you're giving people in these different locations?

[00:20:54.323] Danfung Dennis: Yeah, I think the driving force behind the experiments was what is an effective VR experience? We just don't really know yet. Is it someone engaging the camera directly? Is it more being an observer and letting things just happen? around you and I think you will find that there's uses for both. In some cases you simply want to be a fly on the wall and things are happening around you and in other cases you want to be a character, you want to have an avatar, you want to be involved in what the other characters are doing and maybe have some control in that. So I think there's a space in between that and it's going to be sort of a wide area of different types of experiences that push both and So it was for us just, you know, what works well for video in particular in VR. And it's people, it's animals, it's complex environments. And so what do those look like? And, you know, having, you know, a person looking at camera is actually a novel experience in VR. It sounds so simple, but Having someone directly address camera, look at you, they feel like they're in front of you in 3D, it says you feel like, oh, they're talking to me. And you start having these reactions as if you were there in real life. And I think that's when we'll know that our VR is good and is effective. when people start reacting as if they were actually there. The basic example is putting up their hands in front of them if something is coming towards them. But you'll have sort of social cues that if someone's staring at you, you're going to feel like, oh, why is that person staring at me? And look over at them. And so all of those things we're trying to explore. And it's just wide open space, new frontiers, really exciting time to be exploring VR.

[00:22:33.375] Kent Bye: Awesome. And finally, what do you see as the ultimate potential for virtual reality and what it can provide?

[00:22:40.658] Danfung Dennis: Well, I come from a background of journalism. I worked in Iraq and Afghanistan for many years. And I think the power of VR is that it can put you into another conscious experience, that one that you couldn't have otherwise. You can be in another location, another mind, another world. We're going to be able to understand things about ourselves, about others, in a way that we've just never been able to do before. I think this is a new form of communication. We might be able to create new forms of emotion and be able to communicate those in very subtle yet powerful ways. So I think, you know, we can apply this to situations where they're just abstractions. They're like war. They're like climate change. Can you use this technology for pro-social goals? Can you use it for more than just entertainment and advertising? And do something that truly breaks new ground and is new ideas and new thinking. So I think it's an opportunity. I think it's an opportunity where we just, we don't have to show the same type of content that we're used to that is mainly controlled by mass media. What if there's a new voice that comes out, a new, younger, more advanced type of language that we can share stories and share emotions in a new way. And so I think it's just, the potential is just tremendous. We don't even know what we can achieve with this yet. It's just such early days, and I'm just really excited to be a part of it.

[00:24:01.601] Kent Bye: Awesome. Well, Danfung, thank you so much for joining me today. And I guess if people are interested in checking out Zero Point, it is available on Steam. And is there anything else that's left unsaid that you'd like to say?

[00:24:12.408] Danfung Dennis: Thanks so much for having me, Kent. I really enjoy your show, and I hope people enjoy Zero Point.

[00:24:17.872] Kent Bye: Awesome. Thank you. Thanks.
