#1674: 3rd Place Spectacles Lensathon Team: Fireside Tales Collaborative Storytelling with GenAI

At Snap’s Lensfest developer conference, I interviewed the 3rd place team in the Snap Spectacles Lensathon, whose project was Fireside Tales: Stijn Spanhove, Pavlo Tkachenko, and Yegor Ryabtsov. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my coverage of the Snap Lensfest 2025, today's episode is with the third place team of the Lensathon. This was the 25-hour hackathon that Snap brought together, a total of 20 different teams of four people each. There was a games track and also a Spectacles track. The Spectacles track was given the remit to develop an application using some of the new Snap Cloud features, including Supabase, which provides a database backend, so you could create tables that you could read and write information from, as well as real-time live elements for multiplayer experiences, and edge functions in some fashion, being able to call out to different AI applications or other ways of dynamically pulling in information. The third place prize went to Team Marshmallow, who were doing an application called Fireside Tales. Essentially, you would get into a social AR application with two, three, or four other people. There'd be a campfire that everyone would see, and then there'd be a passing of the baton where you would squeeze a microphone and say the beginning of a story. You would share a bit of your story, and that would be transformed into a text-to-image prompt that would distill your story into an image that would come out of the fire. Everybody else watching would hear that prompt summarized and see this image come out, and at the end it would bring together all the different images as they went through these stories. So there's a lot of ways that they were using the database backend to write that information, create detections for these events, and send out messages to all the different AR devices within that social AR experience. And yeah, so this was the third place prize of the Lensathon. I was a judge for all 10. And so in the episode for the winner and the decisionator in episode 1676, I'm going to be going into a little bit more detail, some other thoughts on some of the other types of applications that I saw. But yeah, Snap brought me down to Los Angeles for the week to attend the Lensathon and catch up on a lot of different lenses that were released over the past year, over 109 lenses that were out. I saw about three quarters of them and was able to play through them and get a sense of what was out there, so I could calibrate for all the different applications that were developed in the course of 25 hours and get a sense of what was possible. What was really impressive in seeing all the hackathon applications was just how sophisticated some of these different experiences were, being able to pull in all this different information and data dynamically. So yeah, Snap Cloud and Supabase are enabling all these different tools and backends to create more utility-based applications, where you're able to store information and have persistence of data and stuff that you come back to. Because a lot of the stuff that I'd seen launched so far was kind of these very snack-sized experiments and prototypes. There were some of them that I found myself coming back to, but most of them were just kind of an idea where I could quickly get a sense of what they were trying to do. But yeah, I think this type of backend is going to enable not only innovations in what kinds of more sophisticated applications are even possible, but also starting to call in more artificial intelligence types of applications.
And the other two experiences, I think, start to have a little bit more of the AI and computer vision techniques that are very unique to augmented reality. So, we're covering all that and more on today's episode of the Voices of VR podcast. So, this interview with Stijn, Pavlo, and Yegor happened on Thursday, October 16th, 2025 at the Snap developer conference of Lensfest at Snap headquarters in Santa Monica, California. So, with that, let's go ahead and dive right in.

[00:04:00.412] Stijn Spanhove: Hi, I'm Stijn Spanhove. I'm an XR developer and I'm really passionate to combine like AI and XR together and see what you can build with it.

[00:04:09.518] Pavlo Tkachenko: Hello, my name is Pavlo Tkachenko. I'm actually an XR artist. I would call myself currently working in a player studio as a lead technical artist. And previously I was working as a sailor. So I changed my profession in a bit. It was a very tough hard pivot for me.

[00:04:27.409] Yegor Ryabtsov: My name is Yegor Ryabtsov. I'm an AR developer. And I mainly work with social AR and web AR. And yeah, I've been doing this for quite a long time. And right now, we're experimenting with Spectacles a lot. Kind of the reason why I'm here, yeah.

[00:04:42.495] Kent Bye: Awesome. Maybe each of you could give a bit more context as to your background and your journey into working with XR.

[00:04:49.304] Stijn Spanhove: So my background is mainly in backend and cloud development but I really like the idea with XR that if you combine it with like backend data and a lot of data you can like augment the world and I think that's super interesting to see what you can build with all that data in the world and it's visually so that's also super cool to do.

[00:05:10.529] Pavlo Tkachenko: Yeah, so I actually had a hobby while I was working as a 3D artist, and at some point it became my profession as a freelancer, 3D artist, and I found XR Space as a nice place to use 3D skills. So in the end, it ended up with them. Here, where we are, at Lensfest, we are doing projects in spectacles and pushing the future to the boundaries.

[00:05:34.099] Yegor Ryabtsov: Yeah, I think for me, I kind of started out as a graphic designer, because I always like to be creative and also with digital stuff. But eventually I got interested in just building the things I'm designing. So, for example, you do a web design, you want to build that web page. And then those skills kind of started to merge together. I started to experiment with 3D, and then it all kind of evolved into this generalist, tech creative coder kind of thing and I just like really have like this broad field of interest and it's like I think in XR it really allows you to just utilize all of it pretty much.

[00:06:11.030] Kent Bye: And I'd love to get a bit more context as to your journey into working with the Spectacles, if you've been able to develop existing applications. And we're here at the LensFest 2025, and you were working as a team on it. But just curious to hear a little bit more around your journey into starting to work with Snap and Spectacles.

[00:06:28.993] Stijn Spanhove: So I first got to try them at AWE, and I was hooked. I really liked the device; it's small, it's easy to set up and get started. And then I entered the developer relations and got my Spectacles, and I just started to test out things and see what you can build with it, what the limits are, and what you can do. And I really like how they are building a lot of tools for us to use with the Spectacles and to see what we can build, and I have the feeling that we're both exploring, us developers and Snap or Spectacles, what is possible, and that's what I really like.

[00:07:02.267] Pavlo Tkachenko: Yeah, for me everything was kind of simple after I started my social AR journey. I worked on different social AR platforms, and in the end I ended up working for a player company. There we had Spectacles as a device, and I fell in love with it. It's pure magic in my opinion, because personally I'm more of a visual kind of personality who likes to create something visually pleasing, and that's how I just ordered my own Spectacles, to do it in my free time, as a side job as well. So yes, that's it. It's an amazing device.

[00:07:38.324] Yegor Ryabtsov: Yeah, so for me, by the time I got involved with Spectacles, I already had this established AR career, but in mobile. But I always was fascinated with the wearable side of AR. And yeah, at some point, Snapchat reached out and they invited me to an event similar to this one, where they gave me a pair of Spectacles, and that was also the first time I tried them. Also kind of the same thing, just a feeling of magic, and just how actually good it is. Because I think the public in general don't really even realize how good it is right now, because this is not available to them. But when you try it on, you kind of get this idea that, hey, this could actually be a really, really good thing that will transform how humans interact and just operate. So yeah, since then I've had that device with me, and it would be such a waste to not try and build something with it. So I've been really experimenting a lot with Spectacles. It's just so convenient. You just create something, send it to the device, it's there in one minute, less even. You just touch those glowing things that do not exist, but actually they kind of exist.

[00:08:50.019] Pavlo Tkachenko: So yeah, which can offer this kind of fast building, especially for creative people who are not, like, hard developers. It's literally plug-and-play in terms of the creative process. You just connect, do some things, push it to the device, and it's super fast, super easy and simple to get into the creative process.

[00:09:10.481] Kent Bye: Nice. Well, so we just finished the Lensathon. It was a 25-hour blitz of a hackathon where you basically had the remit to use Supabase and the new features that were being announced and launched here. And so I'm just curious to hear a bit around the deliberative process as you were getting together and forming your team and trying to figure out what the actual experience was going to be. So yeah, maybe give a bit of a short elevator pitch for what the Marshmallow project was and how you wanted to use all the different capabilities of Supabase in the context of an AR lens within Spectacles glasses.

[00:09:47.406] Stijn Spanhove: Yeah, I can tell a lot about Supabase because I did a lot of it, but maybe you guys can tell them about the project. Supabase, as a backend engineer, I really loved it. So I've myself set up a lot of backends, and it's a lot of work to create all the functions and the logic and things. But now with the hackathon, Supabase was like a gift, because I could quickly set up new tables, add triggers to the tables, create edge functions, call AI models in the cloud. And I think what we did with Supabase in 24 hours, if I needed to do it on my own from scratch, I would need maybe a week or so to build it. So that was super nice for this hackathon, to quickly iterate and start building things and just see that it works. And that's what I like. If you're doing things and it works, that's super cool as a developer. So yeah, I really loved it.

[00:10:36.038] Kent Bye: Just to clarify, have you used Supabase before, or is this your first time using it? First time, yeah. So you were able to get up and running that quickly?

[00:10:42.704] Stijn Spanhove: Yeah, because they had a really good sample. I just followed the sample in Lens Studio, the code. I was just seeing how it worked. And it worked super well. And the concepts were new to me, so I quickly got what you needed to do. And it just worked. So kudos to the developers of Snap. Very, very well done.

[00:11:01.124] Yegor Ryabtsov: Yeah, well, so about the experience. In a nutshell, it's a multiplayer storytelling experience where you're with your friends. You don't necessarily have to be in one single space; it can be done remotely. And you're all telling one shared story. One person starts and they say something, and the other person picks it up. And it's almost similar to an improv or acting-class kind of exercise, where people try to connect and get something going. And visually, we tried to make it look like this: you're at a campfire, it's this nice, cozy vibe. And it's like throwing ideas into the campfire, quite literally. We had this visual representation of a microphone, but a bit of a ball of energy that you're speaking to, and you throw it in the fire. And then the AI on the backend was generating the illustrations for your story. And when you get to the very end of your story, it gives you this final show of what the story looks like on a bigger surface, with captions and all those illustrations. And yeah, we tried to go really hard on the vibe, on this cozy, chill feeling, all about the human connection, which I think is really what AR should be about. And yeah, that was kind of our project.

[00:12:20.043] Pavlo Tkachenko: Yeah, so talking about the vibe, it was a question of where you tell stories the most. So it's these childhood memories, when it's a cozy place, you're speaking with a friend somewhere nearby in the neighborhood, next to the campfire, or at summer camp, for example. And that always brings up the idea of what you have at the campfire: marshmallows. So let's create avatars out of marshmallows, because it's just cute and cozy, and we're next to a campfire. And when you have the question of doing something in 24 hours, you choose the things that are much faster to build, and marshmallows are the simplest thing that we could build in this perspective. So it was a very good run in terms of how we were going to optimize our workflow. Yeah, it's fun, it's fun.

[00:13:06.917] Kent Bye: And when I was doing the demo, it also ended up being a little bit of collaborative storytelling, where someone would sort of pick up from what someone else said. So you're able to go around and do rounds of what happens next in the story. And when you would say something, then you would get a visual depiction of that. And then you would see other people also in this experience, because it is a connected experience, but you designed it to be either co-located or remote, so that people could be around this virtual campfire, sort of be around with their friends, be able to tell their story, and then have a log of that. And with Supabase, I'm just curious if part of what that was able to enable was a recording of that, or why did you need Supabase in order to do some of those different types of features?

[00:13:51.266] Stijn Spanhove: I think the interesting thing we needed was real-time databases. So if you send a message, it got stored into a table, and then we enabled it to be real-time. That means that it was sending out events to all the Spectacles. Then all the Spectacles knew that there was a message, and they would read it out loud. And so we could sync it to everybody pretty easily and didn't need to do the heavy lifting of syncing it to every device; Supabase did that for us. So that was super interesting. The other thing that was interesting with Supabase is that I added a trigger to the database, and that triggered an edge function. And so I could easily call other models in the cloud to generate the image, for example. So that was also a good thing.
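The flow Stijn describes, inserting a row and letting Supabase's realtime layer fan the event out to every headset, can be sketched roughly like this. This is an illustrative sketch, not the team's actual code: the table name, column names, and channel name are assumptions, and `supabase` stands for an initialized Supabase JS client.

```typescript
// Shape of a row in a hypothetical "story_messages" table.
interface StoryMessage {
  id: number;
  author: string;
  text: string;
  image_url: string | null;
  created_at: string; // ISO timestamp, so string comparison preserves order
}

// Pure handler: keeps each client's local story ordered by timestamp
// and returns the line that headset should read aloud.
function handleInsert(
  story: StoryMessage[],
  row: StoryMessage
): { story: StoryMessage[]; speak: string } {
  const next = [...story, row].sort((a, b) =>
    a.created_at.localeCompare(b.created_at)
  );
  return { story: next, speak: `${row.author} adds: ${row.text}` };
}

// Wiring it to Supabase realtime (sketch only; not executed here).
function subscribe(supabase: any, onSpeak: (line: string) => void) {
  let story: StoryMessage[] = [];
  supabase
    .channel("fireside")
    .on(
      "postgres_changes",
      { event: "INSERT", schema: "public", table: "story_messages" },
      (payload: { new: StoryMessage }) => {
        const result = handleInsert(story, payload.new);
        story = result.story;
        onSpeak(result.speak); // each headset reads the new story beat aloud
      }
    )
    .subscribe();
}
```

The pure `handleInsert` keeps each device's local copy of the story consistent regardless of arrival order; the network fan-out itself is the heavy lifting that, as Stijn notes, Supabase handles for you.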

[00:14:31.688] Pavlo Tkachenko: Also, you can mention that it stores the story. So that means you can play the story again. You can play the story on other devices, potentially. Yeah, those are directions to scale it, so that it can be shareable, so that it can be replayable and fun.

[00:14:48.014] Kent Bye: Nice. And so I guess as you think about this project, and given your experience with Supabase and what is possible after this hackathon, I'm just curious to hear some of your thoughts on what type of things you would want to experiment or play with in the future, around what this now enables that you couldn't do before.

[00:15:05.225] Stijn Spanhove: I think the nice thing is that you can now store something in the cloud very easily and you can come back to it. So you can imagine a lot of applications where you, I don't know, create something, you store it and you can see it again or maybe other people can see it. And I think that's super interesting. Yeah.

[00:15:23.024] Yegor Ryabtsov: No, yeah, totally. I was super excited about the news that this was coming, because I had so many ideas for lenses, but they were the lenses that you would need to come back to, or at least something would need to be stored. And so now I'm pretty excited, because finally this is unlocked. And yeah, obviously it's cool to create something fun to just engage with once and then forget about it, but now, yeah, you can pretty much build an app; you can come up with whatever backend you want. And I didn't get to touch it that much during our hack together, but from what I've heard, it looks like it's really solid. Especially for me, I'm not that much into backend development. I can understand the general ideas and I can understand how it works. But for me to set up a backend from scratch, that's very challenging. I wouldn't do that, especially if I want to develop the whole application on my own. But with this, it's much more user-friendly. So yeah, pretty cool. Pretty cool thing.

[00:16:22.523] Pavlo Tkachenko: Yeah, from what I hear from the guys, I can just imagine and play with my imagination. But the first things which pop up in my mind are that it's very handy for commercial things, because you can store a lot of products in one place, you can pick them up, you can play with them. Also, any kind of sandboxes, playgrounds where people can create content, share content, save content, pick it up again on the glasses. So it's an amazing tool for optimization of projects, where you just keep everything outside of the Spectacles and use only the things which are needed for a user. For commercial production, it's a super handy tool. We are definitely going to look at it more deeply in our production.

[00:17:01.817] Kent Bye: Yeah, it seems like the 25-megabyte size limit for lenses is pretty constrained in terms of what you can do. But having some of these edge functions to pull down content dynamically feels like it's going to potentially open up more possibility for more in-depth types of experiences, where you could have the logic be within that 25 megabytes, but all the assets and data be pulled in more dynamically, in a way that allows you to more quickly get into the experience. So it feels like it's potentially opening the door to a little bit more sophisticated or complicated experiences, whereas the existing platforms seem really quite constrained. After having a chance to go through about three quarters of all the different lenses that have launched so far over the past couple of days, it seemed like they were very short, bite-sized experiences, but now there's the capability to have much more in-depth experiences as well. So I don't know if you have any thoughts on that.

[00:17:51.014] Stijn Spanhove: Yeah, I agree. I think it's like the web. If you go to a website, it loads the code first, which is very small, and then it's loading the assets asynchronously. And that's basically what's also happening now on the Specs with Supabase. So you open the application, it's very small, it's very fast, and then it's asynchronously loading all the assets, and you can optimize it like that. So I think that's an interesting thing.

[00:18:13.495] Kent Bye: So yeah, really pulling on the fast loading of the website. So pulling in a lot of those same things that are bringing that same type of design architecture to XR. Yeah, indeed. Yes.

[00:18:24.386] Yegor Ryabtsov: I mean, like, it just feels like a point where it's kind of getting really serious. Like, it's no longer this thing where you just create this, like, little experience and, again, like, people try it, done with it. Yeah, this enables, like, proper apps. You can scale it, you can build a business on top of it. Obviously, like, this has been done for the public launch of Specs next year. So, yeah, really great to see this kind of development. Yeah.

[00:18:51.673] Stijn Spanhove: Maybe one thing I want to add, and what we also discussed, is that it opens up cross-platform apps also. So you could build something on the Specs, you have the Supabase instance as your backend, and then maybe you could open your mobile phone, go to a website, and see the story that we told in the Marshmallow app. I think it's interesting that you can build cross-device; that makes it much more dynamic and cool.

[00:19:14.488] Yegor Ryabtsov: And that is called Metaverse. Like the proper actual Metaverse, not the kind of thing that people were talking about all these years. Actual Metaverse. So that's great.

[00:19:24.875] Pavlo Tkachenko: And it makes Spectacles more shareable for sure, because right now it feels kind of limited in terms of what you can share. Right now you can only share preview videos, that's it, and only share the device itself.

[00:19:38.217] Kent Bye: Yeah, you'd mentioned that you're really excited around this intersection of AI and AR. And so I'm just curious to hear a little bit more around the pipeline that you had to create to call these edge functions and basically do image generation, how you were able to wire up those different requests, and what the edge functions of Supabase were able to enable for you.

[00:19:57.583] Stijn Spanhove: So what I basically did in the edge function was: the message came in, and then it was calling a cloud model on Replicate. Replicate is a service that makes it easier for developers to call AI models; they've built an API around AI models. So basically what the edge function was doing was getting the message, calling Replicate, Replicate instantiated the AI model and gave back the result, then the result was stored in a bucket, then it was saved in the database, and then with the real-time database it was distributed to all the Spectacles. So it's quite easy, and that's what I like, that you can iterate very fast and build something super cool with all these products, tying it together, and have a really cool experience.
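A minimal sketch of the edge-function pipeline Stijn outlines (message in, prompt built, Replicate called) might look like the following. The model version, prompt wording, and endpoint handling here are assumptions for illustration, not the team's actual code; Replicate's REST API is asynchronous, so real code would poll the returned prediction until it succeeds, then copy the output image into a storage bucket and database row as described above.

```typescript
// Hypothetical Supabase Edge Function helper: invoked per new story beat,
// it distills the speech into a text-to-image prompt and kicks off an
// image generation job on Replicate.

// Distill a spoken story beat into a compact text-to-image prompt.
function buildPrompt(storyBeat: string): string {
  const trimmed = storyBeat.trim().replace(/\s+/g, " ").slice(0, 200);
  return `Storybook illustration, cozy campfire palette: ${trimmed}`;
}

async function generateIllustration(
  beat: string,
  replicateToken: string
): Promise<string> {
  // Start a prediction on Replicate's REST API (simplified; real code
  // would poll the returned prediction until status === "succeeded").
  const res = await fetch("https://api.replicate.com/v1/predictions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${replicateToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      version: "image-model-version-id", // placeholder, not a real model id
      input: { prompt: buildPrompt(beat) },
    }),
  });
  const prediction = await res.json();
  return prediction.urls.get; // URL to poll for the finished image
}
```

One nice property of this shape is that, because the database trigger invokes the function server-side, none of the Spectacles clients ever needs to hold the Replicate API token.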

[00:20:41.365] Kent Bye: Yeah, as a judge looking at all the different projects that the 10 different groups did, I was impressed with how sophisticated some of the different things people were able to do in just 24 hours were. And so it feels like we're moving beyond what you used to be able to do with game engines and into all these other types of more web-app-based functionalities, all kind of seamlessly wired together to create more complicated experiences with a little bit more depth and sophistication, above and beyond what we've seen around gaming. So I was surprised to see there weren't as many games; it was more around these kind of short utility functions, like, here, let's try to see how we can start to wire things up. So even within the course of 24 hours, just to see the wide breadth of different types of projects that were able to be created, but also this glimpse into the future of where the sophistication of all these things coming together might be headed. So I'd love to hear any reflections on that.

[00:21:38.819] Yegor Ryabtsov: Well, I mean, I think the reason people built a lot of those utility tools is, maybe, for me, I feel it's the messaging of Snap. I think at some point I got this idea that they want to see a lot of utility tools. But at least for us, well, we formed this team because we kind of agreed that we vibe together; there's something here that we can work off. And I guess we're all kind of into these more artistic, experiential things, and we thought, let's just do that. We're really good at it, we like it, let's just build that. And again, you can still utilize all those advanced capabilities around Supabase and stuff like that. It doesn't necessarily have to be utility-based. It could be a game, just like this artistic experience, but you can still make it as complicated as you want, as involved as you want. So yeah, that's kind of it for me.

[00:22:39.705] Kent Bye: And so you also were a winner of one of the awards this year. Maybe you could give a bit more context for Trajectory.

[00:22:46.289] Yegor Ryabtsov: Yeah, so Trajectory is a game for Spectacles. I kind of, at some point, decided that I needed to finally build something for Spectacles after I got them. And I thought, well, I'll just build something real quickly, something fun, over a few weeks, and I'll be done. Well, it took a couple of months in the end, because what I decided to do is a game about throwing things. And maybe you throw at a target and that's it, but why are you throwing things? What's the point? I don't just want to be throwing things. I thought, well, maybe there's a reason. Maybe an evil AI has stolen all the objects, because it ran out of training data and it wants to consume reality to train itself on. And I decided that, well, the player will be a part of this rebel collective that is trying to liberate objects from the clutches of AI. And that's how the game was born. And it kind of evolved into this eight-level, proper story arc, lots of different gameplay mechanics, and they change. I included a lot of dialogue. 
So you have this sidekick of sorts that communicates with you through this intercom, and in the end there's this grand finale, and obviously there's always a twist. I mean, I'm not gonna spoil it, but I can. Well, basically, you've been helping the AI all this time. You've actually been sending the objects, the fake objects, back into the real world, and now you've made the situation much, much worse. But it was a really great experience just creating this game, because for the longest time I'd stopped creating personal projects, because I've had so much work, and at some point you kind of stop being as creative as you used to be in the first place. But here it really helped me to just feel like I'm a creator, like I'm creating something really meaningful. And to me personally, it was very transformative to create this. Yeah, it was hard, but I've learned a few things about myself, and it was really fun. And obviously, to get the recognition is just amazing, yeah.

[00:24:47.457] Kent Bye: Awesome. Well, congratulations, and also congratulations for the whole team for the win. And so I guess as we start to wrap up, I'd love to hear if you have any thoughts around where you think the ultimate potential of all these immersive technologies coming together with XR, AR glasses, AI all being fused together, and what that might be able to enable.

[00:25:08.154] Stijn Spanhove: Let me think. I would love to wear them all day. And that's what I envision that will happen soon. I don't know. Let's see in 2026. But I would love to wear a pair and just walk around and do things. And maybe there's an AI that helps you in the world and make it spatial and do things. So that's what I'm hoping for. And I hope it will be with the spectacles because I really like the device. And yeah, fingers crossed. Yeah.

[00:25:37.320] Pavlo Tkachenko: Yeah, I mean, personally I'm looking in the direction of digital wellness, where an AR device can be a place where you can not only work but also rest. Because I can expect that a lot of notifications will come from your watch, from your phone, just on a device which you are wearing every day, and you will be messed with a lot of notifications. And yeah, it's kind of making you tired. So you should have space for rest, and this pure magic can help you to rest, to find your places.

[00:26:07.761] Yegor Ryabtsov: I actually really like what Stijn said, because I personally do not want to be wearing those things all the time. But I kind of imagine the future will be like your phone: you can use it all the time, you can just take it out and put it back in your pocket. So with glasses, you can do something, turn them off, put them in your pocket, or something like that. So I think a lot of people are afraid of this Black Mirror kind of dystopian future, of us wearing these cameras and changing reality and no longer understanding what's true or not, what's real or augmented or imaginary. But I think as we figure this medium out and start playing with it on a consumer scale, we'll finally understand that it's actually not as black and white. There are many ways to use this technology, and I'm really excited about how actual people out there will adopt it and how people will change it, because it will be communities around the world finding new ways to utilize this technology, and we don't know what that will be, and that is exciting. We're just three developers at the very early stages of it. We don't know what it's gonna be, but yeah, looking forward to it.

[00:27:25.296] Pavlo Tkachenko: I mean, we are looking very positive, we are very enthusiastic because we like to develop and we are enjoying the process itself. But you never know how people will take it, how people will use it. It's really interesting how it will evolve over time.

[00:27:39.099] Yegor Ryabtsov: I think there's a lot of responsibility on Snap, on all these companies creating these technologies, and on us shaping what that will be. And it's not a trivial matter. These technologies are shaping the world, shaping how people interact. Phones have transformed how we exist on a daily basis. So this is even next level. It's literally changing your reality, the way you see, visually, and it's in 3D around you, not on a flat screen. So yeah, it's exciting, but also I think there's a bit of a burden of responsibility that we're not quite yet realizing that we have, but I think we ultimately do have it.

Kent Bye: And is there anything else that's left unsaid that you'd like to say to the broader immersive community? Any final thoughts you'd like to share?

[00:28:26.507] Stijn Spanhove: Yeah, I would think, if you're a developer, get the Spectacles and start creating with it, because then you will see the true magic that you can do with it. And I think we as developers should explore what the limits are, and then we can see as a society what we can do about the limits, and maybe the privacy and things. But let us developers first try things out and see what we can come up with and develop with it. I think that's what I would say to the XR community: start experimenting with it and see what you can do with it. And that's super fun.

[00:28:58.878] Pavlo Tkachenko: Yeah, I would join Snap's comments and I would say, let's do magic together. Build with Snap, build with Spectacles. That's it.

[00:29:09.286] Yegor Ryabtsov: Such a powerful message, yeah. I mean, yeah, just start building things. It does make a difference between just seeing someone demo it on a screen, like those captures, and actually having the device on your face, being in the moment, and touching those things that do not exist, just this light in front of you. And also, you do not realize what you would like to build with it until you try and play with it. And again, it's a very different medium. So whatever you have with VR or mobile, the rules are not the same. You will find yourself arriving at very different results with this kind of medium, because it's different.

[00:29:53.729] Kent Bye: Awesome. Well, thanks so much for joining me here on the podcast to break down a little bit more about your design process for this Marshmallow campfire experience. And congratulations again for winning the third place prize here at the hackathon. And yeah, looking forward to seeing where each of you take this platform here in the future. And just thanks again for joining me here on the podcast to help break it all down. Thanks a lot.

[00:30:12.434] Yegor Ryabtsov: Thank you. Thank you very much.

[00:30:14.482] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
