I interviewed Sam Jones, Kathryn Hicks, Lauren Cason, Emma Sofija, and Paige Piskin at the Snap Lens Fest about the Snap Spectacles as well as their award-winning Hackathon project called Emergensee. I also share some of my concluding thoughts on the Snap Spectacles based upon my 8 hours of coverage in the past 15 episodes. See more context in the rough transcript below.
Here’s my 15-episode series on the Snap Spectacles announcement and a deep dive into the Snap AR Ecosystem:
- #1453: Kickoff of Snap Spectacles Coverage with First Impressions with XR Analyst Anshel Sag
- #1454: Niantic Launches Virtual Pet AR Game “Peridot Beyond” on Snap’s Spectacles
- #1455: Wabisabi Game Studio Leverages AR Connected Lenses on Snap Spectacles for Outdoor Capture the Flag Game
- #1456: Turning Your Phone into a Spectacles AR controller with DB Creations’ “Tiny Motor Arcade”
- #1457: Snap’s Lens Studio, Spectacles, & “Polygon Studio” Open Source Low-Poly Drawing App
- #1458: Building a Geo-Located, Dungeon-Raiding, AR Game for Spectacles with Aidan Wolf
- #1459: Snap Co-Founders Share Vision for AR & Spectacles at Lens Fest Q&A Panel
- #1460: Snap AR Platform Lead Sophia Dominguez on Spectacles and AR Ecosystem
- #1461: AR Lens Genres, Spectacles Impressions, & SelfReflect VTubing App Using Snap’s Camera Kit with Brielle Garcia
- #1462: XR Industry Reality Checks with Cix Liv and Turning Your Phone into an AI-Driven Exercise Routine
- #1463: Arguing for Utility-Driven AR Minimalism with Lucas Rizzotto Reflecting upon the XR Market
- #1464: Snap’s Head of Spectacles Software Engineering Shares Technical Break Down of Design Tradeoffs
- #1465: Team of Women AR Artists Explore Dreams in Spectacles Hackathon
- #1466: Freelance XR Dev Veronica Flint Shares Thoughts on Spectacles and Ray-Ban Meta Smart Glasses
- #1467: AR Medical Training App Wins Spectacles Hackathon + My Concluding Thoughts on Snap Spectacles
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So this is the final episode of my series looking at the different announcements around Snap Spectacles, as well as the AR ecosystem around Snap Spectacles. So today's episode is with the winner of the Lensathon hackathon that happened during Lens Fest. And so they basically had around 10-plus hours or so of access to the Snap Spectacles to develop some type of application. So their team name was Emergensee. So they basically created a medical training app to teach people how to administer certain types of medications to save people's lives. So these are the different types of shots that people need if they run into some sort of condition, where either they themselves or other people need to help administer these shots. And so because the Snap Spectacles is a little bit more of a consumer-facing device, at least that's what it aspires to be, you can then think about what are the different types of training applications that would be great for folks to have in order to potentially be in a situation where they could help save someone's life. Before we dive into this episode, I wanted to give a shout-out to some of the other finalists within the Lensathon and some of the different types of projects that they were doing, as well as share some of my final thoughts as we wrap up this whole series of diving deep into all these different perspectives on what's happening with Snap as well as Spectacles. So I think the Lensathon was good to see what are some of the interesting applications that are going to be coming out of this as a platform.
But some of the other experiences: so there was Hablar, which was aspiring to eventually get camera access and use that raw camera access to do computer vision, to detect objects and then translate those objects into Spanish. It would basically be a language learning tool to identify objects and translate their names into Spanish or any number of other languages. Raw camera access wasn't given yet, and so they basically just had to use a 3D model as a way of showing the type of interface they would eventually have. Another one of the finalists that didn't make it to the top three was Capsule Team. And so this was something that I've seen within Neos VR, where you do this kind of finger gesture in order to take a screenshot. And so they were doing that as a gesture to frame a certain part of the screen and then take a photo. Because they didn't have access to the right cameras, or the raw cameras, it was kind of offset and it wasn't a perfect execution, just because they would need a lot more time to really do the types of warping needed to translate what the camera is seeing versus what you're seeing from your perspective. But also, on getting raw camera access: they were able to actually get raw camera access, but in order to do that, they had to break some other stuff, so they weren't using that other stuff. And so they were able to show what's possible with raw camera access, which was a theme that came up again and again in a number of different conversations that I had throughout the course of the series. And then the third place prize went to a team called Lettuce Get Physical: Run Carrot Run, which essentially had this NPC-type character, a carrot, that's taunting you as you're chasing after it, trying to encourage you to run. And it's basically got this low-level amount of scene detection and is able to kind of climb up on different objects.
But you're trying to essentially catch the carrot, and it's an experience that's trying to encourage you to be physical and to exercise. Then the second place prize went to MoodQ, which I think was actually using some raw camera detection, I'm not quite sure. They were somehow using the raw camera feed to do this kind of ML detection of someone's emotion, and then from there giving some different prompts for people. So it's aimed towards folks who are on the autism spectrum, to give some cues on reading other people's emotions, but also giving some prompts for how to have different conversations with them. And then the top app was called Emergensee, spelled with S-E-E. They were training people to use different things like the AED, the EpiPen, or Narcan. So it's essentially a medical training application giving users the ability to practice some of these critical skills of learning how to use these different devices before the moment of crisis, and ensuring that they act whenever it counts. Before we start to dive into this interview, I just want to share some of my final thoughts. So a lot of people are just looking at the top-line specs of the Snap Spectacles, but I think the Snap Spectacles are actually a really quite compelling device. Especially if you're an AR developer, this is, I think, like a DK1 moment for AR. This is an affordable, high-functioning device for you to start to rapidly iterate and prototype different experiences. You're going to have to use Lens Studio, which I think is a bit of a downside from the perspective of the larger XR industry: you never want to completely put all your eggs into one basket and get completely tied to a proprietary stack. I personally would love to see other modes of getting onto their headsets, either through something like OpenXR or WebXR. They do have browsers, and Sophia Dominguez did actually tell me that they're working on WebXR implementations.
And I think a lot of the architecture for Lens Studio is supposed to be that you can put a lens onto their mobile applications on iOS or Android, but also potentially have it on websites, and so kind of go between mobile or phone and some other location-based experiences. Sometimes when you scan a QR code or something, if you don't have Snapchat installed, it would just go to a mobile version of that same experience on the web, so that you wouldn't have the additional friction of trying to download additional things. And so they're trying to have this cross-platform thing, and so I imagine that, with that, they already have plans of at some point having a WebXR implementation, so that you could feasibly create a WebXR experience and then be able to have it on all the different XR platforms, including what's happening with the Snap Spectacles. Now, in terms of future viability in the market, I still think they're going to have to have some sort of stopgap before they start to launch into the consumer space. Magic Leap tried that and they failed. Of course, Magic Leap didn't have the same type of ecosystem and revenue that Snap already has from its existing social network. So maybe they'll be able to continue to sustain the development of this product until it's ready to be launched out to the public. I don't know, I'm a bit skeptical that they're going to be able to completely bypass any sort of interim B2B2C or enterprise use cases to really make this make sense economically and financially. It seems like they want to continue to be working with these consumers, these end users, and so that's why Sophia said B2B2C, because at the end of the chain, it's still with the consumers. But at least at this point, Snap has exhibited that they're definitely willing to commit all the different time and resources to develop this hardware, despite whatever odds may be against them actually being able to do it.
They did have a number of different layoffs recently, so it's a little bit of a question, as they move forward, whether they're able to continue this pace of innovation and still work on their operating system and still work on all these different moving parts. But at least they have a functioning dev kit that's out there and that works. It works great. It's got great responsiveness with the hand tracking, it's got kind of natural, intuitive interfaces, and it's got a shared reality experience that I think is really quite compelling, with these Connected Lenses and all these other infrastructure things, that I think is different than anything else we've seen before. There's the most passionate developer ecosystem with Snap, with the most interesting creative artists and technologists that are experimenting with it. I think hopefully you've been able to see that over the course of this whole series, and I'll have links in the show notes to all the previous episodes that you can go check out. And Snap is willing to subsidize and support a number of different creators to be able to continue to experiment and develop for the platform, up until eventually they get some sort of viable marketplace available for developers, so that they don't have to rely upon getting money directly from Snap in order to subsidize any type of development. So it feels like it's definitely been willed into existence by Snap, just because it defies all odds for what we've normally seen in the rest of the industry. But it's great to see these types of players come up and push the edge of what's even possible. I have to give a shout-out to Tilt Five and Jeri Ellsworth. That's a lot more indie-driven AR, which is a lot more affordable, but very much focused on tabletop gaming: so you sit down with other people, tethered to a computer. What they're doing here at Snap is completely and wholly unique. This is a self-contained AR headset.
It's on par, along different dimensions, with the Magic Leap 1 and Magic Leap 2. It's not quite as wide a field of view as the Magic Leap 2, but the pixels per degree is certainly up there in terms of resolution, and it's able to have hand tracking and all the different core machine learning models that they have, both the computer vision for the body tracking and the SLAM and everything else. It just works. It's a headset that works. I haven't been able to, you know, put it through all of its different paces or use it over time, but from the demos I saw, it was super impressive, and I think it's a minimum viable product for a dev kit for other AR developers to really push the edge of what's even possible. The promise that you could start to put your own machine learning algorithms on the headset is a huge benefit, and I expect to see a lot of innovation with all these things coming together, especially when you have Connected Lenses, shared reality, and all the other tools and infrastructure that Snap has, with many more geo-located features that I'm sure are coming soon, which is what Aidan Wolf was kind of alluding to with some of the stuff that his team is going to be really leveraging. So yeah, I think Snap has done it. They've produced something that's really quite compelling. I was pleasantly surprised to see how viable it is. And I think there are some larger questions around the overall ecosystem that Snap is cultivating economically, like how to really make it make sense economically. I heard one developer mention: hey, these lenses are essentially like apps, and if they're selling apps, then that's basically like having an app store, which iOS doesn't really allow. And so are they going to find a way to kind of get around that, or have some sort of third-party site, like what we have with Meta, to be able to buy these different experiences?
So that's basically about not having to deal with the developer tax that comes from the existing duopoly of Apple with iOS and Google with Android. So, yeah, I think these are the types of different questions that they're going to have to start to investigate: whether it's a completely self-contained ecosystem, and whether they have ways for people to buy different things and then kind of escape the existing developer tax that we have in the ecosystems. And they are committed to not having a developer tax. But there's also no revenue being exchanged, so at this point that's not really necessarily something that is going to tell people one way or another. I think they said there's no developer tax at launch, so I don't know if that means that they're eventually going to have one. There are not a lot of other models other than to have something like that. So I imagine, in order for them to create this as a viable business, they're going to have to figure something out that isn't quite there yet. And I think those different types of open questions create a little bit more uncertainty than certainty for a lot of folks that are questioning whether or not they should invest the money, like $1,200, to be able to get even one of these developer devices. But if you're a creative technologist and an XR developer and you've got a closet full of unused XR devices and you don't mind having yet another thing, you know, it has the risk of becoming that, but at the same time, it could actually be the catalyst for some really exciting new possibilities. There's a dialectic between what's possible, the possibilities and the hopes and aspirations, versus the actuality and the constraints and limitations of this as a platform. But from my take, there's a lot of excitement and passion, and enough developers that are excited to push forward what's even possible.
And yeah, just from even seeing what was there at the Lensathon, there's a lot of excitement for continuing to push forward with this platform, especially if you start to get more things like camera access, getting more of the custom machine learning models implemented, and more and more Connected Lenses and shared experiences. So that's sort of my hot take after doing this whole series. And yeah, now let's dive into this conversation with the Snap Lens Fest Lensathon winners of Emergensee. So this interview with the developers of Emergensee, with Sam, Kathryn, Lauren, Emma, and Paige, happened on Thursday, September 19th, 2024. So with that, let's go ahead and dive right in.
[00:12:09.196] Sam Jones: I'm Sam Jones, I'm an XR technologist and co-founder of Refract Studio, and I do location-based AR, primarily.
[00:12:17.623] Kathryn Hicks: I'm Kathryn Hicks, I'm founder and art director of Creature Studio, and I'm also co-founder, chief of games and technology for Baron Von Opperbean, or Bevo for short.
[00:12:26.231] Lauren Cason: Kathryn is so cool. I'm Lauren Cason, and I am the co-founder and creative director of Refract Studio.
[00:12:34.676] Emma Sofija: I am Emma Sofija and I'm a creative technologist. I do mostly engaging and entertaining AR lenses, but I also work with generative AI.
[00:12:42.301] Paige Piskin: Hi, I'm Paige Piskin and I'm a digital artist with a focus on character design and immersive experiences.
[00:12:49.185] Kent Bye: Great. Maybe you could each give a bit more context as to your background, all the different disciplines you're fusing into your practice and your journey into the space.
[00:12:56.812] Sam Jones: Yeah. Originally, before getting started in 3D, I just worked in B2B, like mobile apps and web apps. And then I learned you could make beautiful things on computers, especially in augmented reality. And that's been the story since 2018 or so.
[00:13:11.657] Kathryn Hicks: I started in virtual reality, fell into it at grad school at Savannah College of Art and Design in 2015, and got into the Launchpad program, started working, mostly med tech and instructional stuff, and then St. Jude, and now Ventures.
[00:13:25.888] Lauren Cason: So I got started by making video games, which have a very similar technology stack to XR. Worked on games like Where the Water Tastes Like Wine. Then I was a prototyper at Apple, was a creative director at Meow Wolf for a while, and then met Sam at Meow Wolf. And we've been making stuff since.
[00:13:42.999] Kent Bye: Nice. Meow Wolf. Nice call out there.
[00:13:45.433] Emma Sofija: Yeah, I was extremely fortunate to have a student assistant job for an extremely cool woman in tech who has been awarded one of the most influential women in tech in Europe. So she kind of got me inspired and that's what developed into where I am now.
[00:13:59.198] Paige Piskin: I've been working in the digital marketing space for about 15 years and I decided to just pivot to augmented reality as it brought more creativity back into my life. I've been a full-time creator now for the past five years and I've been having the time of my life mostly focusing on trying to create viral trends and working with awesome brands.
[00:14:17.352] Kent Bye: Right, and I'd love to hear how each of you started to first come across Snap and the AR ecosystem, and if you were involved in the previous versions of the Spectacles, and yeah, just kind of a little bit more of your journey and history with Snap and the larger ecosystem of AR.
[00:14:32.549] Sam Jones: Yeah, I originally started exploring AR on the web, and the things you could create at the time were just very limiting. And I discovered Lens Studio in about 2019, and it was head and shoulders, and still is quite far, above the rest in terms of technology stack and what you can achieve technically compared to other social AR platforms. And so it's just kind of stuck with me ever since then, and I've gotten very proficient in it, and it's helped me grow in other tools. I was able to try the previous generation of Spectacles in 2021 and make some projects for Snap there, and kind of apply some things Lauren and I learned while working at Meow Wolf to immersive location-based things and games. And yeah, then we got invited to work on this latest pair of Spectacles, and it's been a huge improvement and actually very fun to work on, which is kind of rare to say for experimental hardware.
[00:15:19.036] Kathryn Hicks: I fell into augmented reality in 2018 working on mostly medical and training devices. So lots of bones and feet and different things. And then I got into Snapchat through the storytelling residency in 2020 and got opportunities to do things like the House of the Dragon filter that went around the world, and a bunch of different opportunities. So yeah.
[00:15:39.324] Lauren Cason: Snapchat reached out to me in 2019 about their first waveguide headset and I really wanted to make work outdoors and they were like you can do that with this and I was like sure you can and then you could actually do that and that was really cool and I feel like they've been supporting that vision like pretty much since and letting us make really cool location-based outdoor work.
[00:16:04.491] Emma Sofija: Yeah, honestly, I was just really depressed during COVID, like laying in my bed doing nothing, and started playing around in Meta Spark. Within one month, I already had like millions of views, and I was like, wow, this is combining all the things I love to do: being creative, being strategic, and, yeah, just super fun. And then later I got started with Snapchat. So yeah, that's kind of just how I got into it.
[00:16:26.755] Paige Piskin: Yeah, I started in 2019. I was just trying to find a way to reinvent myself and get into something more creative than I was currently doing. And AR allowed me to really express myself by creating the type of characters I've always drawn when I was a child and bring that back into my life, brought so much more passion into my world. And I started creating more and more experiences on Snapchat. Some of my lenses went viral. It connected me to some major brands and it became a full-time job and it's completely transformed my life.
[00:16:53.976] Kent Bye: Nice. Well, we've been here at the Snap Partner Summit, and then Lens Fest is just happening. There was the Lensathon, this whole hackathon where you were a team that actually built the winning experience. So congratulations on winning with Emergensee, this medical training application in the context of AR. But before we dive into your experience, I'd love to hear some of your first impressions of the Spectacles after having a chance to play around with it and see some of the other demos, maybe during the Partner Summit, and what you were attracted to in the affordances of what this headset is going to be able to potentially enable.
[00:17:28.428] Sam Jones: Yeah, I think this headset is impressive on its own. The performance you can get, the kind of tracking you can do, the kinds of experiences you can create, and also just the visual fidelity, how nice it looks on the headset, especially compared to competitors with maybe similar innards, I guess I would say. But what really sets Snap apart for me and what keeps me interested is that Snap has this great hardware, these great people working here, but they also have a tremendous back end that I think people are kind of sleeping on, to enable these experiences to scale, to be created by all sorts of people, but also to be seen all over the world. And how they're expanding their technology with Camera Kit, I think, is going to enable kind of a whole new wave of self-expression. And they're best poised, I think, out of all these companies to do that.
[00:18:15.033] Kathryn Hicks: I really like the vivid color and it kind of took elements of like Magic Leap 2 where you had the dimming feature and elements of like different AR headsets. But I think I also like its software capabilities as well. But I think the just the optics and the hand tracking is pretty nice. But yeah.
[00:18:30.805] Lauren Cason: My favorite thing about Spectacles hasn't changed since 2019, and that is that it is really easy to get from your computer to device. A lot of the work that we like to make is work that's outside. It's maybe in remote locations. We live in a largely rural state that doesn't always have super great cell service. I can take my computer and a hotspot and usually build to device on location, which is super important for location-based experiences. You have to be there to iterate.
[00:19:01.004] Emma Sofija: Honestly, I have never really spent that much time in wearables in general, because I have a vision disability, so I've never really enjoyed any of them or had them really fully work. I'm really excited to try more with Spectacles now, also because it's AR. But yeah, I don't have much to say yet. In general, I'm very passionate about innovation in healthcare, and I loved our idea, and I would love to see how we can maybe take something like that further. So, yeah.
[00:19:26.567] Paige Piskin: I'm really excited about these spectacles. It's really the first time I've gotten to experience mixed reality in this high fidelity way that is accessible for me, that's something that I could build on. I've never had that ability before where the barrier to entry was so low that I can actually use things I've already created and send them to spectacles and see them with my own eyes. That for me is just a mind-blowing thing that's going to be possible at this point. Before I was only creating, imagining what would it be like one day if I looked through these with glasses and now we could do that so I'm very excited about that.
[00:19:56.547] Kent Bye: Yeah, well, I'd love to just have a few quick words around your project, and then we can start to wrap up. The medical training application, I feel like in XR generally, that's a bit of a killer app: to be able to have an actual embodied experience and step through different things. And so you're also utilizing different aspects of the hand tracking. There's a lot of training in VR, but to be able to have a device that you can put on quickly and jump in quickly, there's also value in having low friction to entry and having something that's more of a mixed reality experience. But I'd love to hear some of the thoughts behind this project, the medical training context, and what you were able to pull off in 12 hours or so.
[00:20:34.693] Sam Jones: Gosh, yeah, I think like Lauren said in our deck, this is for a lot of the medical training apps you see in XR or VR. It's enterprise focused or it's specifically for medical doctors. And Spectacles is different and this opportunity is different because this is a consumer facing device and a consumer experience. And so it has to be fast. It has to be even more intuitive than something made for a doctor who already has this built in language and knowledge around these experiences. and Spectacles gave us the chance to kind of test our assumption whether this would be a valuable thing for regular humans. With how good this came together, it makes me feel like it is and maybe one day your AR glasses will help you in a really tragic or difficult situation.
[00:21:20.322] Kathryn Hicks: I guess what I really like about this application is that it's for everyday people, and everyday people should know these things. Having it built on the Spectacles in a very accessible and very intuitive way is excellent, and I see a lot of potential with this type of training for everyday consumer training. The Magic Leap and other headsets are very much geared towards professionals and professional training on medical devices, but having consumer-focused applications is really important and very much needed. And the magic of spatial computing.
[00:21:52.219] Lauren Cason: So, a secret, something that most people who know me through this space probably don't know about me, is that I'm a nursing student at my local community college. And I came to that largely through thinking that XR has an enormous application for allied health: for paramedics and EMTs and nurses and CNAs. We've all been really focused on the surgeons, but there are so many other people who deliver health care. And Sam and I got talking about something like an AED simulator on the plane, and that there are so many things like this that are just so simple, that are really just about doing it over and over again and training, that XR has enormous potential for.
[00:22:32.003] Emma Sofija: Yeah, a little bit along the same lines. Back in 2018, I did half of my master's in innovation healthcare because something I'm very passionate about, like how you can actually like help people and then also use tech and innovation. But unfortunately, at that time, I didn't really know how I could actually have a career within that I would also enjoy. So I'm actually just incredibly happy just to now be at a point where, you know, I can combine the things I really, really love and also create something that's like innovative within healthcare.
[00:23:02.836] Paige Piskin: I felt so grateful to be part of this project. Once I heard the concept that they were working on, I was like, oh my gosh, this is just incredible. I've never heard of anything like this. This is groundbreaking. For me, as a person who has people in my life who've been near death and were saved by devices like Narcan, it hit home, because I personally don't understand how to use many of these devices. And so if just learning that can save a life, you only have a few moments to make that change. So knowing this information can save millions of lives every year. Some of the leading causes of death in this country specifically can be addressed by having the knowledge that's provided in this app. So I get chills just thinking about it, but I was so grateful to be a part of this project.
[00:23:41.462] Emma Sofija: Same. Wow. I'm very grateful to be a part of this team and project. Everyone is amazing here.
[00:23:48.587] Kathryn Hicks: Yeah, really great. Just an awesome team, great people. I'm happy to be working with this team, and I also wanted to work with Lauren. So to be able to work with each other was great. And I feel like we're doing something for everyday people, and it's very much needed. Don't forget about the everyday person that needs to know essential medical information.
[00:24:11.292] Emma Sofija: Yeah, there were several people who came and said that the intro question really got them reflecting. Like several people stopped me now. So I think, you know, this is really something that's like needed. And this is just, yeah, it will be life changing. So, yeah.
[00:24:25.488] Lauren Cason: Can I tell a quick story about our team formation? When we were all forming our teams, I went up to Rag and I was like, we need designers. And he was like, oh, they're all teamed up, out of luck. And we were like, oh, OK. And then Joe Darko came in later, and he was like, would you be open to adding some designers to your team? And we were like, really? And yeah, I think that's been the best part of this project, right? Hackathons are about meeting new people, and we really got to do that here.
[00:24:52.468] Emma Sofija: It was great. I've been so inspired by everyone on this team. So yeah, it's been such a great experience. I mean, of course, it's always fun to win, but what I will take away is just the experience itself. And, you know, these people really have inspired me and it's something I will just, you know, go home and feel really good about it.
[00:25:08.832] Kent Bye: Yeah, well, I know it was a very short time, not a lot of time to really pull off anything beyond a minimum viable product. But it sounds like you were really able to pull together a coherent experience and overcome all the various bugs that everyone was dealing with as well. So congratulations on navigating that journey. But yeah, as we start to wrap up, I'd love to hear from each of you what you think the ultimate potential of spatial computing might be and what it might be able to enable.
[00:25:32.111] Sam Jones: Oh, gosh. Well, I think our little hackathon project is a glimpse into the future. One day, your AR glasses could help you save somebody's life. And that's not an exaggeration anymore. That's real. So I think the paradigm of how humans interact with computers is about to change so much, in ways that we're really just beginning to explore. And the hardware is just getting there now, really, where we can make these experiences that we've been envisioning for so long. It's a super exciting time, and I know it's kind of a tumultuous time for XR and AR with the hype cycles and things, but events like this show this technology is not going away. It's only going to grow and become more ubiquitous in our lives, and actually help us be better humans in the world, understand each other's perspectives, and perhaps keep us healthier.
[00:26:21.977] Kathryn Hicks: I really think XR will make technology far more accessible. Even with the emergence of generative AI, I see that huge integration. And again, I started in VR, so I don't hate VR. I like all of them equally. So I just really see that fully integrated, truly multilayered spatial computing device that everyone can use, whether it's to save a life, whether it's to learn another language, whether it's to go travel in another area they couldn't go to before. It's really a superpower technology.
[00:26:49.400] Lauren Cason: Yeah, you know, I think what Sam was saying about, like, hype cycles was interesting. That's a thing I've been thinking about recently with, like, I've been doing this for a long time, and it used to be that we were seen as sort of, like, hardware nerds, and then it was, like, metaverse, it's going to save the world, and now we're kind of maybe getting back to hardware nerds, and it's not cool. But, you know, like, I don't know that I think that, like, XR is going to save the world, but I do think it might, like, make its little mark, and, like, maybe we can teach people how to use Narcan, and that's cool.
[00:27:18.015] Emma Sofija: Yeah, I just see major potential for disruption in general within this area, especially because we have all the tools that we need within these mobile apps and things like Spectacles, so you don't need to go out and have separate glasses or separate apps and tools. You can already build it into something that has, I don't know how many millions of users Snapchat has every day. So yeah, I just think there's a huge market gap, and I would love to see what other tools and things could be implemented like our idea. Just take it further. Yeah.
[00:27:50.096] Paige Piskin: Yeah, and I feel every single thing they said, and I'll just add to everything they said with this: I feel that this technology brings creativity into our lives that didn't really exist maybe at any point unless we were children. You know, when you're a child, your imagination runs wild. You can see things that aren't there. You can dream big. And that gets limited as you get older because you get hit with reality and what that is. But our reality is changing now. And we have a new reality where our imagination can run wild, and we can see the unseen and feel something that we didn't feel before. And so I feel like this brings a new level of creativity into everybody's life. And it brings a lot of joy and a lot of magic. Something that you might have had to go to a theme park to experience before, you could now have in your own living room, or out in the street, or on your run. So very exciting stuff.
[00:28:35.622] Kent Bye: Great. Do you have any final thoughts or anything that's left unsaid that you'd like to share with the broader immersive community? Any final thoughts?
[00:28:44.833] Sam Jones: I'm tired, boss. This is a good hackathon.
[00:28:46.574] Kathryn Hicks: It was great to work with the dream team. It was awesome. And great to even just learn more about the Spectacles, learning new things, and seeing how I can integrate this into large-scale location-based experiences. So yes.
[00:29:01.019] Lauren Cason: Looking at how to use an AED would be a good idea for all of us. The end.
[00:29:05.771] Emma Sofija: Yeah, honestly, I don't even know. It's just been a good experience. And I will go home with so much renewed inspiration and energy to take on even more projects. And I always think it's exciting when you do a project that has the potential to develop further. That this is just 5% of the idea that you could go out and develop it further. And also, I think there's a lot of potential for this team to build more cool things. So I think that's always great.
[00:29:32.219] Paige Piskin: This team was amazing, and it just shows that this is such an inclusive and kind community. Everybody is so welcoming. They welcomed us right in to work with them on this project and shared something that was dear to their heart with us. Everybody values each other's unique skill sets. It's amazing to be part of this Lens community, and it's very unique compared to every other community out there in the tech space. I've never seen anything like it. And I believe in this project, and I'm looking forward to creating more projects like this that change lives. It's totally different from the stuff I usually do, turning people into aliens, creatures, and characters, and doing makeup and beauty filters. This is where I feel like my heart is, and I appreciate being invited to be a part of this experience.
[00:30:10.280] Emma Sofija: I agree. It was great.
[00:30:13.669] Kent Bye: Awesome. Well, I know it's been a long three days. Lots have happened. And congratulations again on being able to pull it all together. I know it's not an easy thing to do, but it does sound like you had quite a dream team come together to be able to have all the different knowledge and insights and skills to be able to pull it all together. So yeah, congratulations again for the victory. And thanks again for joining me here today on the podcast to help break it all down. So thank you.
[00:30:35.454] Kathryn Hicks: Thank you. Happy to be here. Thank you. Love the podcast. It's great. So it's a pleasure to talk to you more. Yeah.
[00:30:40.151] Emma Sofija: Yeah, thank you so much. Yeah, thank you so much. Thank you so much. Thank you.
[00:30:45.049] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. That's a part of my larger series doing a deep dive into both the announcements around Snap Spectacles, as well as the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also the ecosystem of developers that they've been able to cultivate. And so for me, I find the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit different approach than what anybody else is doing. But it also takes a lot of time and energy to go to these places, do these interviews, and put it together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month will go a long way toward helping me sustain this type of coverage. And if you could give more, $10 or $20 or $50 a month, that has also been a huge help for allowing me to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.