I interviewed Tipatat Chennavasin, General Partner and co-founder of the Venture Reality Fund, at Meta Connect 2024 about why he has gone all-in on Gaussian Splatting by investing in Gracia.ai and their viewer that's on the Quest store. We also talk about some of the news from Meta Connect, the excitement around the Quest 3S, the merits of AAA games like Batman: Arkham Shadow for bringing more people into VR, and some other experiences and demos that caught his attention at Meta Connect 2024. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my coverage from Meta Connect 2024, today's episode is with Tipatat Chennavasin, who's a general partner and co-founder of the Venture Reality Fund. So Tipatat was super excited around Gracia, which is an application that he's invested in that is showing these Gaussian splats. And so being able to take a number of different photos and then rendering out these objects, which were essentially in this black void. It's a little bit different than what I saw with the Hyperscape demo, which was much more like complete full scenes, which required like cloud rendering on the backend. These were a little bit simpler, where you could start to render out these discrete individual objects. Tipatat is fond of calling these Gaussian splats sort of like the JPEG of immersive content, and they could have a lot of potential for how these Gaussian splats eventually lead into training large language models and from there feeding into new forms of generative AI that are able to get much higher levels of photorealism. And so we also talk about some of the general news that was coming out of Meta Connect and some of the other highlights of different experiences that were really catching his attention. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Tipatat happened on Thursday, September 26th, 2024 at Meta Connect in Menlo Park, California. So with that, let's go ahead and dive right in.
[00:01:39.052] Tipatat Chennavasin: Hi, I'm Tipatat Chennavasin. I'm general partner and co-founder of the Venture Reality Fund. We are a VC firm that specializes in investing in early stage VR, AR, AI startups.
[00:01:49.605] Kent Bye: Great. Maybe give a bit more context as to your background and your journey into the space.
[00:01:52.766] Tipatat Chennavasin: So I actually started as a developer. I actually started in the industry as a 3D artist, worked my way through different jobs, became a prototyper, a designer, creative director, technical art director. I eventually had my own mobile game startup. But then I backed the Oculus on Kickstarter, was blown away thinking, finally, consumer VR is going to be in my lifetime. It's going to happen. I started making some really simple demos, accidentally cured myself of my real-life fear of heights with a demo I created with my friends Eric and John, and that really got me all in in the XR space. I ended up joining a fund, becoming one of the first investors focused only on the XR space, then was able to start my own fund, the Venture Reality Fund, with my partner Marco. And so now we've done nearly 60 investments in the past 10 years of investing in this space.
[00:02:39.059] Kent Bye: Great. Well, we're here at Meta Connect 2024. And you just showed me a demo of Gaussian Splats. And you have a lot of passion. You said the thing you're most excited about right now in XR is Gaussian Splats. So why Gaussian Splats? And why are you so excited about them?
[00:02:52.284] Tipatat Chennavasin: Yeah, so I just showed you a demo from the technology of one of our portfolio companies called Gracia. They're really working on a Gaussian splat viewer for VR and AR. They already have a demo in the Meta Store. You can download Gracia VR, and they'll show some amazing still splats that I just showed you. Some of my food scans are included in there. What really amazes me, impresses me about Gaussian splats, especially with my background as a 3D artist, is that one of the biggest problems for VR is creating content, and especially real, high-quality 3D content. And so, especially if you want photorealism, a lot of times right now people are doing things like 360 or 180 video and 180 stereo video. And it's compelling, but it's still 3DoF at the end of the day. And to get full presence all the time, like, you know, what really makes VR stand out is that when it hits your lizard brain and you realize you're not staring at an image, but you actually believe the object is in front of you, that's when VR is at its most powerful, or AR is at its most powerful. And so we've been missing this critical link. This is the missing link. This is the holy grail of VR content, where now I like to say it's the JPEG or PNG of VR, where now you can have something look photorealistic and you can have complete six degrees of freedom looking around it. And that's so important. And it's very inexpensive, relatively inexpensive, to capture and produce. And now with Gracia, very easy to play back in an XR headset.
[00:04:19.298] Kent Bye: Yeah, and the first day that I was here, back on Tuesday during the demo day for Press and Creators, they had demos of Hyperscape, which they released as an app, but you have to have cloud rendering of the scenes. And so I had a chance to see it, and I really appreciated the level of detail for the scans that they had. In order to do the processing, they were forced to do the cloud rendering, mostly because when there's different elements of translucence with multiple layers, then it basically becomes too heavy for an individual GPU to be able to do that. So that was an entire scene that was very complex. And what you just showed me was more object-based, like one object that you're able to see in front of you in this kind of black void, rather than being fully immersed within the whole environmental context. But what you're showing me is able to be rendered out on the headset, whereas most of the stuff that I saw before was doing cloud rendering. So maybe you could just describe a little bit of the limits: how would you be able to run as complex of a large scene rendered on the headset? Or is it really isolated to these small, discrete objects right now?
[00:05:23.268] Tipatat Chennavasin: The Hyperscape demo is pretty awesome. I definitely recommend anyone in the US that has a really good internet connection to try it. I think what you'll see, though, is, again, at its core, what a Gaussian splat is is it is a point cloud of radiance fields. And so, of course, the complexity of the scene increases the complexity of the render and the data size and all of that kind of stuff. And so it's because those are big scenes. You travel all the way around through them. And you can actually walk, you don't have to just teleport, through multiple rooms that they've captured in each one of those. So it's a huge data set. I think it would be tough to kind of render it out locally. But also, too, the problem is, like, at those huge resolutions it looks pretty good, but then you want to see something up close, you know, you can start seeing some resolution loss. What's really interesting about taking this more object-oriented approach is, of course, it's more compact, but you can also get much more, like, pixel density. And so things look sharper, and it really has that, like, for me, hitting that lizard brain of making it feel like it's actually in front of you. And it's the object, and you get closer to it, and if it's at scale, it doesn't lose resolution. That's a huge, huge win in terms of convincing you that you're no longer staring at a screen, but you're actually staring at the object. But it does scale up. I showed you some examples of food. I love food photography, and so I take a lot of food scans. But they also have stuff that's more like human, people scale, and all that kind of stuff, too. And so you can see different examples. And it scales out pretty well, depending on what you want to capture. But of course, the more complex the scene, the less they can do. But what's really cool, though, is if this is the JPEG, they're also working on the MPEG of VR. And so they have a 4D Gaussian splat demo that I've seen, and it's very impressive. And what's interesting too, there's been volumetric video in the past, and there's been photogrammetry, but there's always been kind of drawbacks. They've never had photorealistic visual fidelity. And that's just the nature of meshing, the nature of having to play it back in a 3D engine. And so here, because it's actually playing back the photo data, again, it's like an AI that trains on the data set to give you the new novel views. So it's completely smooth when you look at it. It takes 50 images and then gives you infinite angles of resolution, essentially. And that's really the power of what makes, like, radiance fields, NeRFs, Gaussian splats, all of that family really, really unique. But then what's really impressive with Gaussian splats, it's really efficient to play back on more, like, standard devices.
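To make the rendering-cost point concrete: a Gaussian splat frame is built by depth-sorting the splats that cover each pixel and alpha-compositing them front to back, so the work per frame grows with how many splats are in view, which is why a single object on a black background is so much cheaper than a multi-room capture. The sketch below is a minimal, hypothetical illustration of that compositing step in Python; it is not Gracia's or Hyperscape's actual renderer, and the function name and toy values are made up for illustration.

```python
import numpy as np

def composite_splats(colors, alphas):
    """Front-to-back alpha compositing of depth-sorted splat samples
    along one camera ray -- the core accumulation step in Gaussian
    splat rasterization (illustrative sketch, not a real renderer).

    colors: (N, 3) view-dependent RGB of each splat hit, nearest first
    alphas: (N,)   per-splat opacity after the 2D Gaussian falloff
    """
    pixel = np.zeros(3)
    transmittance = 1.0  # fraction of light still reaching the camera
    for color, alpha in zip(colors, alphas):
        pixel += transmittance * alpha * color
        transmittance *= (1.0 - alpha)
        if transmittance < 1e-4:  # stop once the pixel is effectively opaque
            break
    return pixel

# Toy example: a mostly opaque red splat in front of a blue one.
print(composite_splats(np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]),
                       np.array([0.6, 0.9])))
```

Because every visible splat feeds into that sum, a cropped object carries far fewer splats per frame than a whole apartment scan, which is part of why it can stay sharp and still render locally on a standalone headset.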
[00:07:40.731] Kent Bye: And how large of a file does it take? Because I imagine that you have to do some processing. So it sounds like it might be being processed offline on, like, a high-end GPU, but then put into some sort of format that you're able to play back in a much more compressed format. So how big are the files that you're showing me, like one of them?
[00:07:58.649] Tipatat Chennavasin: Yeah, so the data set I have, I go to a restaurant. I take pictures, anywhere from 50 to 100 photos. It takes me about four minutes to capture. And then that ends up being about 150, 200 megs of data. And then I process that in the cloud. And then I get back this file format that's roughly about 30 megs. Well, the full scene is actually about 200 megs, but then once I remove the background and make it just object focused, then it becomes about 20 to 50 megs.
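As a rough sanity check on those file sizes, here is a hypothetical back-of-envelope sketch. It assumes the common uncompressed 3D Gaussian splatting .ply layout of roughly 62 float32 attributes per splat (position, scale, rotation, opacity, and spherical harmonic color coefficients); Gracia's actual format and any compression it applies aren't specified here, so the constant is an assumption, not their spec.

```python
# Hypothetical estimate: how many Gaussians fit in a splat file of a given
# size, assuming ~62 float32 attributes per splat (a common uncompressed
# .ply layout), i.e. about 248 bytes each. Not Gracia's actual format.
BYTES_PER_SPLAT = 62 * 4

def splats_in_file(file_size_mb: float) -> int:
    return int(file_size_mb * 1024 * 1024 / BYTES_PER_SPLAT)

print(splats_in_file(200))  # ~200 MB full restaurant scene -> roughly 845,000 splats
print(splats_in_file(30))   # ~30 MB cropped object         -> roughly 126,000 splats
```

Whatever the exact format, the takeaway matches what Tipatat describes: cropping a capture down to the object of interest cuts the splat count, and therefore the file size, by roughly an order of magnitude.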
[00:08:28.536] Kent Bye: And then you were talking about in the second splat that you're showing me, there was like tomatoes and some translucence. And so how do you describe some of the unique affordances of how the Gaussian splats are different than what you see with the photogrammetry with texture, like more static meshes?
[00:08:43.687] Tipatat Chennavasin: So this is what's really, really impressive about the Gaussian splat. I mentioned it was like a point cloud of radiance fields. And a radiance field, instead of being just one single color value like a voxel, the radiance field is giving you essentially a sprite that understands what angle you're looking at it and gives you the right image set for that angle. And so what ends up happening is it can capture all of the view-dependent lighting. It can capture translucency, transparency, reflections, specular lighting, any kind of lighting, really. And so that's why it just captures all of the photographic data and displays it back at the right angle. And so you can't do things, at least without a lot of AI processing, like relighting, but you naturally get everything that's proper about the scene.
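The "sprite that knows what angle you're looking from" can be made a bit more concrete. In the original 3D Gaussian splatting formulation, each splat stores its color as spherical harmonic coefficients that get evaluated against the viewing direction, which is how view-dependent effects like specular highlights and reflections come back essentially for free. The sketch below evaluates only the first two SH bands and uses made-up coefficients; it illustrates the idea rather than Gracia's implementation.

```python
import numpy as np

# Real spherical harmonic basis constants for bands 0 and 1,
# as used in the reference 3D Gaussian splatting code.
SH_C0 = 0.28209479177387814
SH_C1 = 0.4886025119029199

def sh_color(coeffs, view_dir):
    """View-dependent RGB for one splat from degree-1 SH coefficients.

    coeffs:   (4, 3) array -- one RGB coefficient per SH basis function
    view_dir: (3,) unit vector from the camera toward the splat center
    """
    x, y, z = view_dir
    basis = np.array([SH_C0, -SH_C1 * y, SH_C1 * z, -SH_C1 * x])
    rgb = basis @ coeffs                  # weighted sum of coefficients
    return np.clip(rgb + 0.5, 0.0, 1.0)  # 0.5 offset as in the reference code

# Made-up coefficients: a gray splat that picks up a red tint from +x views.
coeffs = np.array([[0.5, 0.5, 0.5],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [-0.4, 0.0, 0.0]])
print(sh_color(coeffs, np.array([1.0, 0.0, 0.0])))   # viewed along +x: reddish
print(sh_color(coeffs, np.array([-1.0, 0.0, 0.0])))  # viewed along -x: neutral
```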
[00:09:28.706] Kent Bye: So what are the use cases that you see in terms of where this goes from here?
[00:09:31.708] Tipatat Chennavasin: I like to tell people this just, again, opens up more possibilities for content that have never really been possible before in XR in this compelling of a way. And so things like fashion, things like beauty, even things like food, tourism, all these other use cases where, again, the synthetic 3D rendered scene in Unity or Unreal could look pretty good, but you still know you're looking at game assets, game data. You know you're not really looking at a photo of the real thing. And so now, all of these other use cases are possible. If we think back about the way that computer graphics started, it first started out with vectors. And it took a long time until we got bitmap graphics and JPEGs. And it's like, once we got JPEGs, oh my god, it opened up so many other industries. And I think that's why we're at that point now with VR.
[00:10:20.066] Kent Bye: Nice. And so there's basically these objects in these black voids. And it makes me think about how, well, this could be like a mixed reality. So you could use the camera of the rest of the scene. I don't know if that would be too heavy for it to process the mixed reality components. Or just in talking to the developers of Starship Home, they were really emphasizing that photoreal objects don't really work that well with mixed reality a lot of time. And so they went with, at least for their game and their context, more of a cartoony art style that makes it more distinct for what objects are physical versus which ones are virtual. So yeah, I'm just wondering if something like this has been experimented with having some sort of mixed reality use case where you set a spot on your table and you can click through different objects but still have the rest of the scene that's just the world around you rather than being in a big black void.
[00:11:06.348] Tipatat Chennavasin: Yeah, it's especially set up for those AR use cases. And I've seen some tests of it, and it looks phenomenal. I think the interesting thing about it, too, is there can be a mismatch of the lighting. But you can rotate the object to approximate what the lighting in your area is, and then it locks it in in a way that is very compelling. And I think the test for AR used to be, there's three objects on this table. Which is the virtual one? And with the Gaussian splat, you would not be able to tell, nine times out of ten, unless you're a hardcore photographer that understands the way lighting looks and the way lighting behaves. But for the layperson, I think, yeah, they probably wouldn't be able to guess.
[00:11:46.598] Kent Bye: Great. So yeah, as we start to wrap up, I'd love to hear some of your thoughts on, like, Meta Connect and what are the big takeaways that you have from the different announcements, or just some of the buzz that you're hearing right now in the XR industry?
[00:11:55.904] Tipatat Chennavasin: Well, I want to get to that. But real quick, I just want to plug the guys. I mean, they're doing amazing work. I want everyone to see this. Tell me what you think of my food scans. So go to the Meta store and download Gracia, G-R-A-C-I-A-V-R, and check it out for yourself. Because I think it's hard to describe. It's hard to even show a 2D image of what they're doing. You have to experience it in the headset, but that's how you know it's the best XR. So in terms of what I thought about Meta Connect, I mean, honestly, this was amazing. I think they nailed the keynote. I think they showed strength in all of their current activities. I think the Meta Quest 3S is hitting that magical price point. And having a spectacular, amazing game like that, one that is not just a great IP and looks good, but actually plays great, I think that's going to be huge for the industry, especially right now where there's no new consoles in sight. So what's going to be the most interesting thing for Christmas? It's not going to be a PlayStation 5 Pro. I think playing Batman on the Quest 3, or more likely also playing Gorilla Tag on the Quest 3, will be way more compelling for most people out there. But then... Also, really big on just the Meta Ray-Ban collaboration. It's really working. I wear a pair. I like them a lot. But what's really interesting is to show all the other use cases of it, and that they're thinking beyond just taking photos and doing really some interesting things. I think this is like the Trojan horse of consumer AI. I think a lot of times people are like, is there going to be a consumer AI device? And of course, Jonathan Ive and Sam Altman are working on some crazy thing with OpenAI to create what they think could be the iPhone of AI. But honestly, it's going to be a glasses form factor. You want the AI to see what you're looking at. And I think Meta has an amazing head start. Form factor, it looks good. And the functionality, like the translation, the memory things of keeping track of where you parked or even maybe where you lost your keys and helping you find your keys, that's huge, right? But most important, of course, is their vision of the near future and showing Orion. And I was really hoping that they would show something. But what they showed was way more impressive than what I thought was currently possible. And so the fact that it's a functioning prototype and not just some vaporware thing. Unfortunately, I'm not cool enough to have tried it. I'm not either, yeah. But we have lots of mutual friends that have tried it that are very harsh critics and would tell us if it wasn't. And they were like, no, it was. And so I think they really threw down the gauntlet to Apple to be like, hey, you know, like, yeah, we have not been just working on the Quest. And telling everyone, right? Everyone always makes fun of, oh, they spent $10 billion for what? And it's like, well, first off, they've created a whole new app store ecosystem. Developers are making hundreds of millions of dollars. But they have also created something as super compelling as Orion.
[00:14:38.925] Kent Bye: Yeah, you mentioned Gorilla Tag. There was lots of leaks of the Quest 3S, and Walmart had a store listing. And there's someone on Twitter who said, because they were promoting Batman, what they should be promoting is Gorilla Tag, because that seems like it's a real system seller. So it does feel like, I played the Batman, but I still feel like there's a certain amount of these AAA games that are interesting and compelling, but I don't know if it'll be one that actually cracks through. There's some nice embodied gameplay with the punching mechanics and stuff, but still, like, I don't know, there's something around, like, what's happening with Gorilla Tag with Kerestell and the rest of Another Axiom that it feels like they're on to, like, a completely new paradigm of some of the different interaction, that this type of, like, hand-based locomotion seems to be really taking off. And it makes me wonder, like, if there's something around, like, folks at Meta who are trying to curate and decide what is or is not going to be on the store, or with the critics of XR. And then there's the users who are finding these kind of word-of-mouth, grassroots-grown communities, like I Am Cat and Gorilla Tag, or just stuff that feels like maybe not on the radar of the larger discussions within the XR industry. It makes me wonder how much of a gap there is between what's happening in the grassroots of actual users versus some of these big marketing pushes, when there could be stuff that's actually a system seller that's not being a part of these campaigns. So I'd love to hear some of your thoughts as someone who is watching the industry very closely. That is a fantastic question.
[00:16:04.385] Tipatat Chennavasin: I feel like we could fill a whole hour just on that. But what I will say is this. Honestly, I've never been a huge fan of big IP. And I've always been like, OK, really great VR-first gameplay. That makes the biggest difference, right? But fundamentally, I also think, too, there's this wish fulfillment factor in VR. And are you telling me that when you glide down with the cape, and then you look at your shadow, and you see the bat shadow, that you don't feel like that inner kid in you doesn't just sing and say, hell yeah?
[00:16:31.696] Kent Bye: Yeah, the bat shadow was probably one of the coolest things of the whole demo, seeing my shadow and knowing who I was. Yeah. I mean, so. But the whole, like, it's still stick-based locomotion. It's still, like, not comfortable.
[00:16:42.662] Tipatat Chennavasin: But yeah. But again, just because it's stick-based locomotion doesn't mean it's not selling. Like, Bonelab and Blade and Sorcery are killing it, are still doing great. Like, I don't think it's either-or necessarily, right? Where you're like, OK, you know what? Like, what they need to get is all types of players. And I would say, like, Gorilla Tag, definitely, they should be promoting more. And they have been in certain ways. But also, Gorilla Tag kind of sells itself, because kids tell other kids, and they get them. And that's really important. But there's also a segment of traditional gamers that are looking for something more. And so it's like, they need to expand across all categories. And of course, I want them to do everything. But limited resources, you can't do it all. I will say, like, I appreciate, I feel like this is one of the first times I felt like one of the big AAA IP kind of things actually plays really well in VR and feels VR, and it feels embodied. You know, they have kind of Beat Saber-y, like, rhythm combat that feels fun, but it still, like, honors, like, the Arkham kind of combat system. So I don't know, like, is it the perfect game? No. But is it an excellent, fun game that uses the IP in a very, like, true way and lets you live out your Batman fantasies? Heck yeah. And so, yeah, I'm all in for it because it felt great. To me, I feel like a great VR game has gestural combat or gestural embodied gameplay. And I feel like they deliver on that. Wish fulfillment, I feel like this is the best kind of superhero simulator I've played. Like throwing a batarang, using a grappling hook, all felt just great. Gliding felt great. And then the combat was just, it was fun. Was it, like, the most physics-y? No. But was it great embodied gameplay? Yes. Like, also, too, when you did the ground and pound, wasn't that pretty satisfying? I see you smiling.
[00:18:25.488] Kent Bye: Yeah. I enjoyed it. It's just it was a little, like, I guess I don't play a lot of games, and so it was fun. Maybe you'll play through it. I don't know. I enjoyed the world building and other things, but I feel like I played through Riven. I don't know if you've played through Riven, but it's a little bit less hand-holdy, and there was a little bit more of, like, the open-world exploration aspect where you can get lost. And I feel like this was very prescriptive of like, OK, pull this lever to proceed, and now go here and do this. And it was like, so it's like.
[00:18:52.903] Tipatat Chennavasin: We played a demo. So we played a very scripted demo. But I think the bigger thing, too, is like, would I think this is a great introductory VR game for a lot of new adopters? Yes. And I think that's also the thing, too, where it's like, would I recommend Gorilla Tag for most people's first VR experience? No. So if you're asking me, yeah, as I think more about this, I'm like, I love what they're doing. I think it's amazing. But when I tell people, hey, put on a headset, and here's the first thing you should try, I would feel very comfortable saying Batman. But of course, I'd still probably recommend Beat Saber. But it's important to have lots of different options for lots of different types. And also things that are comfortable, that do all of these things. And I feel like it was a very rich experience that was not nausea-inducing, you know, with high levels of comfort. But still, you got to do amazing things and feel pretty good. So I think they nailed a lot of things that, for me, I totally understand why they should push this as a first game. But anyways. Yeah.
[00:19:47.843] Kent Bye: Yeah. Well, I guess we'll see. History has shown that the AAA translations have rarely kind of like- Because they were garbage.
[00:19:54.668] Tipatat Chennavasin: They all played terrible. But what's really great, too, like, I mean, they acquired the studio Camouflaj. They did a great job with the Iron Man game. So this is not their first VR game, right? And not even their first good VR game, right? And so that means a lot. Right. Yeah.
[00:20:07.798] Kent Bye: Yeah. It's all to be determined. I think the price point, more than anything else, is going to drive a lot of the headsets. And it's nice to have something that's equivalent to Quest. For me personally, there's different things with like IPD and comfort and the lenses and no headphone jack. And so there's little things like that that, for me, the Quest 3 is still going to be the preferred device. But for people that are looking for budget-friendly devices as an entry point, I think it's going to be a real catalyst for bringing more and more mixed reality content out into the ecosystem. So that'll be exciting.
[00:20:34.764] Tipatat Chennavasin: That's actually the bigger question. It's interesting that their flagship game is not a mixed reality game, right? If this is the device that will bring mixed reality to the forefront. But I will say, too, like, Spatial Ops. I don't know if you got to try it.
[00:20:46.492] Kent Bye: No, no, but I heard about the Resolution Games event that happened last night.
[00:20:49.313] Tipatat Chennavasin: Yeah, and one of the cool things about it, they were showing off kind of like multi-room interaction and just, again, your whole, not just one room, your whole house or apartment becomes the battlefield, and it's just nuts. Like, they really knocked it out of the park.
[00:21:03.052] Kent Bye: Yeah, I saw lots of people playing it ahead of time. And I just saw you walking with Tommy Palm of Resolution Games. So yeah, it sounds like that's one of those co-op games where you essentially transform your house into a laser tag type of game. But then, yeah, it looks like a lot of fun. So did you get a chance to play it?
[00:21:15.881] Tipatat Chennavasin: So I haven't played the full experience. I played bits and pieces as they've been developing. And I'm not an investor. I'm just a friend. But talking to them, seeing the demos, seeing other people's reactions, unfortunately, the line was too long for me. And I feel awful if I try to cut. So I'm like, uh. Yeah, the reactions have been phenomenal, and I can't wait to try it.
[00:21:35.036] Kent Bye: Yeah, they had a happy hour yesterday for people to come try it out. And also, Wave XR had a whole event where they were showing their latest demo. So yeah, I guess as we start to wrap up, I'd love to hear what you think the ultimate potential of XR and spatial computing might be and what it might be able to enable.
[00:21:50.908] Tipatat Chennavasin: So I've often thought the ultimate killer app is really the Holodeck, a multiplayer Holodeck. That's really the dream for us Star Trek nerds. And I used to think, man, it's just so expensive to build. And it's like, OK, if you give a team like Rockstar a billion-dollar budget, 10 years, and 1,000, probably 2,000 employees, they can build a city and a very impressive 10-hour or 40-hour narrative story. But now with the rise of LLMs, we realize, oh my gosh, we're going to do this much faster. Or there's going to be much quicker ways to do this. There's kind of a clear line of sight to how this can be accomplished in a much more efficient way. And what's really interesting, too, is one of the biggest challenges is text-to-3D and 3D data generation from generative AI. But I'd argue it's because the data sets are not good enough. Garbage in, garbage out. And this is why Gaussian splats are so important. We've never had a way to create high-fidelity, high-quality 3D data sets as efficiently and inexpensively as Gaussian splats and radiance fields in general. And so now we can train LLMs based off these huge data sets. And now it can generate 3D data that looks as good as what I showed you and what you can see in the Gracia app. And oh my gosh, yeah. Like, that combined with the LLM characters of, like, Inworld, and yeah, we have all the components ready to go to make a Holodeck that wouldn't cost a billion dollars.
[00:23:14.030] Kent Bye: Nice. And is there anything else that's left unsaid? Any final thoughts you have for the rest of the immersive community?
[00:23:20.407] Tipatat Chennavasin: Yeah, I've been doing this for 10 years. You know, you and I, we've had many of these discussions over the years. And it's never been more amazing, more exciting. I feel like everything that we've dreamed about is, like, now more, like... either in our hands or on the cusp. I still can't believe the quality of devices that we get access to and the quality of experiences that we get to play. And the fact, too, that now we're seeing so much innovation and creativity. And we're seeing things that are not obvious. I think Gorilla Tag, no one had that as a thought of, this is what's going to be the big thing in XR. And to know that all of that is happening just makes this so exciting. And I'm so glad that I'm a part of this and I get to support amazing founders. And, you know, if you're working on stuff and you need support, reach out to me or my partner, Marco. I'm tipatat at thevrfund.com, or you can find me on X. Yeah, thanks again.
[00:24:14.086] Kent Bye: Yeah, thanks so much, Tipatat. It's always great to hear your latest thoughts. And yeah, you had such enthusiasm for Gaussian splats. And I could see not only that the experience is compelling, but also this larger implication of using it to train AI and this kind of generative AI, and how to create these photorealistic scenes with data sets that we're now capable of not only capturing all these different Gaussian splats, but then using them to generate even more imaginal worlds of our dreams and things that don't even exist in physical reality, but are a mashing and melding of things together. I think that's, like, a sweet spot of taking these different archetypal forms and, like, smashing them together and creating stuff that just doesn't exist in reality. So there's these kind of dreamlike visions that look real but just don't exist in physical reality, but we can live into our dreams of VR and XR and AR and all the things. All the Rs. So yeah, thanks again for joining me to help share a little bit more about your thoughts here on Meta Connect, but also where you see it all going here in the future.
[00:25:12.376] Tipatat Chennavasin: Thank you for having me. And thank you again for doing what you do, man. And yeah, what I like to do, especially at Meta Connect, is we're just seeing so many friends that are now decade-long friendships. And it's incredible. Backing that Kickstarter changed my life. And I know XR is going to change a lot more people's lives for the better. And I'm excited to try to help.
[00:25:33.168] Kent Bye: Thanks again for listening to the Voices of VR podcast. And I would like to invite you to join me on my Patreon. I've been doing the Voices of VR for over 10 years, and it's always been a little bit more of, like, a weird art project. I think of myself as, like, a knowledge artist. So I'm much more of an artist than a business person. But at the end of the day, I need to make this more of a sustainable venture. Just $5 or $10 a month would make a really big difference. I'm trying to reach $2,000 a month or $3,000 a month right now. I'm at $1,000 a month, which means that's my primary income. And I just need to get it to a sustainable level just to even continue this oral history art project that I've been doing for the last decade. And if you find value in it, then please do consider joining me on the Patreon at patreon.com slash voicesofvr. Thanks for listening.