#1459: Snap Co-Founders Share Vision for AR & Spectacles at Lens Fest Q&A Panel

Snap AR Platform Lead Sophia Dominguez interviewed Snap co-founders Evan Spiegel and Bobby Murphy at the Snap Lens Fest about AR and the Snap Spectacles, and there was an audience Q&A at the end as well. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different announcements around Snap Spectacles, today's episode is a panel discussion that was moderated by Sophia Dominguez, who's the director of AR Platforms at Snap. She was moderating a discussion with the co-founders of Snap: Evan Spiegel, who's the co-founder and CEO, and Bobby Murphy, the other co-founder and chief technology officer at Snap. They had a panel discussion called "Building Together with Evan and Bobby," where Sophia got a chance to sit down with them and ask different questions around why AR, and why the Snap Spectacles, which gives a little bit more of the deeper intention for what they're imagining and why they're diving so deep into these AR glasses. This panel discussion was part of Lens Fest, and there was a Q&A session at the end where the developers who were invited to Lens Fest had an opportunity to ask some questions of the co-founders and get a little bit more detail on their vision for what's happening with their platforms. And, you know, just one other little quibble that I have is that there was a bit of juxtaposing AR versus VR when they were trying to talk about their own product. And I understand they're trying to differentiate it, but I don't think it's necessary to propagate some of these false myths around what VR is. They were conflating environmental presence and being co-located with having social presence with other people. They're saying that VR is isolating, but yet VRChat just had over 109,400 people at their peak concurrency on Saturday night. And I asked Tupper, to what degree are those in VR headsets?
It's not 100% of people in VR, so it's not over 100,000 people in VR, but he says that he can't share precise numbers. It's a strong majority in VR versus flat, well over half, so well over 50,000 people in VR at the same time having social VR experiences. Now, that's obviously not the entire ecosystem. Of course, there's going to be standalone experiences, just like there's going to be standalone experiences within AR. Of all the different experiences that I saw at the Snap event, most of them were standalone. They were single-user experiences, and they weren't having other people involved at all. There was the Es Devlin experience, and there's also Wabi Sabi Games, but they weren't showing it. They weren't demoing it. And from what I heard, some of the different stuff that they were shipping with Lens Studio had lots of bugs during their whole hackathon. There were lots of people who were wanting to push forward all this stuff, but who spent half their time just trying to work through some of these bugs, because it didn't go through a proper QA process. It was basically rushed out for this event, and something broke down in their QA process that would have prevented some of these things from happening. I totally understand with these deadlines and whatnot, but if you're going to be throwing shade at VR, then you have to also realize that these are not easy things. And also, it's more of a spectrum. You know, I think my quibble around it is that there's more connecting VR and AR than there is dividing them. And there's no use in throwing shade at VR in order to promote your own thing when your own thing is still nascent and not able to fully live into the promise of what you envision, at least at this point. And also, a lot of the different chipsets that are even being used on these devices are coming from Qualcomm. And Qualcomm's been involved with XR for the past decade.
And so a lot of the stuff that's even enabling what their headset's able to do is coming because there is a thriving XR industry spanning both AR and VR. Not only that, a lot of the people that I was talking to have backgrounds in developing and learning about experiential design from working in VR, because VR actually has a viable market right now for folks to start to get involved with. So I don't know, it just irks me when I hear a company start to poke at other aspects of the industry when in actuality there's more in common than different, and also promote false stories like, oh, VR is just totally isolating, when there are totally social VR experiences, and you don't have to be physically co-located with other people to have meaningful social interactions. I think their own Snap platform should be an indication of that. So anyway, that's my personal rant. It really irked me when I was hearing the ways that they're marketing it; it just seemed unnecessary. Just focus on what your thing can do, and actually prove it by showing demos, not these marketing hype videos that are more aspirational. Anyway, that's sort of my rant. And I did want to be able to air this episode just to hear directly from both Evan and Bobby, because I do think that they have quite a vision for why they're diving in, investing, and believing so deeply in augmented reality and this standalone AR HMD form factor, and why they're so committed to it. So that's what we're covering on today's episode of the Voices of VR podcast. This conversation between Sophia, Evan, and Bobby happened on Wednesday, September 18th, 2024. So with that, let's go ahead and dive right in.

[00:04:59.798] Sophia Dominguez: All right. Thank you, everyone, for joining us and for the virtual viewers. Just a little bit of context. So there'll be about 10 to 15 minutes of questions that I will ask Evan and Bobby, and then we're going to kick it over to you all. So start thinking about the questions that you will ask Evan and Bobby. All right, so let's get this show on the road. So this is a question for both of you. Snap was founded 13 years ago. And from the beginning, you've invested in exploring how AR can enhance our relationships with each other and the real world. What first excited you about this technology? And why do you continue to believe in it?

[00:05:36.155] Bobby Murphy: Well, first, I'm really happy to be here. And thank you to all of you. I mean, this journey is so fun. And it's incredibly inspiring to be able to work with such amazing developers and creators. And I know it definitely motivates our team every day to see what you all create and what we build together. It's been great getting here, and we have a long way to go. Let's see. What inspired us? Well, I guess I think early on, we realized that AR was such a fantastic fit for Snapchat. The fact that this original design choice of Snapchat opening to the camera, and this concept of seeing the world through the Snapchat camera, meant that being able to bring computing and creative experiences into that just made a lot of sense. And I remember, this was probably a little over a decade ago, as we started getting excited about this stuff, we'd been seeing some really, really interesting technology in the space of vision and graphics in particular. And I remember touring a research lab and just seeing a lot of this fantastic creative technology sitting on the shelf. And I was thinking, oh my god, if we could just bring this in, it would fit so well and add so much value to the frequency of camera use that we see in Snapchat. So I think very, very early on, we got very, very excited about that. Of course, we realized pretty early that bringing it into a new hardware device was just a whole other level. And then I'll say, too, that on the creative side, what's been endlessly inspiring to me about AR is that, unlike any other kind of creative format, the experience that you get as a user, as someone engaging with AR, is totally unique each and every time. Because you're seeing the world through your own perspective, and then you're getting a creator or developer's imagination layered on top of that. And so the composition of those two things produces something unique each and every time.
And so having worked with our lens designers and creators and our technical teams, that kind of uncertainty is really, really exciting to develop into. And I think it produces something that feels very personal for end users.

[00:07:42.761] Evan Spiegel: As Bobby said, thank you all so much for being with us. Your creativity is so inspiring and motivating to our team, and we can't wait to hear from all of you and your questions. So I think one of the things that really excited us in the early days was the way that AR lowered the barrier to self-expression. One of the big questions we got in the early days of Snapchat was from people who hadn't used the product before, and they were like, what, I'm just supposed to take a selfie? Why? But as soon as we started working on AR lenses, it gave so many people who were maybe uncomfortable taking a selfie and sending it to a friend a reason to create, a reason to express themselves. And that explosion of creativity that was really fueled by augmented reality was incredibly exciting and inspiring to us. And Bobby was so focused on bringing those tools to the creator and developer community. We had been building a bunch of the early lenses ourselves at Snap, and Bobby really saw this opportunity to empower the creativity of so many people, you know, yourselves included, all around the world. And I think ever since then, we've just really been chasing your incredible innovation and creativity to build the best possible tools that we can to make you all as successful as possible. And obviously the platform has grown so much since then, so thank you all so much for inspiring us and challenging us to build better tools for all of you.

[00:09:03.834] Sophia Dominguez: Great. Well, both of you touched a little bit on this, but Lens Studio, which you all are very familiar with, is an AR development tool to build lenses for both mobile and Spectacles. Why do you believe having only one tool to power different platforms is the right approach?

[00:09:19.744] Bobby Murphy: Yeah, well, first, I mean, we're extremely happy with the way that Lens Studio has been built and has progressed over these many years, and extremely confident and optimistic about its future, and very, very committed to it. So I know the last few months there's been some uncertainty around other AR authoring tools, but we're extremely committed to Lens Studio. And hopefully, you know, what we shared yesterday with this new version of Spectacles made it very clear that, for us, this is the way that we see development happening with AR now and into the future. I'll start by maybe discussing why lenses as an application framework for both mobile and Spectacles. I think there are many different potential definitions for lenses, but the way that I think about them is that they allow us to see and engage with really engaging, compelling, beautiful, AR-first, AR-native experiences that are super optimized and very, very performant. So if you think back to even just running lenses on Snapchat, our team has spent years investing in getting ML models and rendering technology to run extremely efficiently on a very, very wide range of mobile devices, oftentimes in very constrained computing environments. And even the design of the carousel itself, the fact that users are swiping through lenses that are loading instantaneously, we've had to put a ton of effort into getting the performance to run that way versus, say, tapping into lenses and then exiting and tapping into other lenses. So I think a lot of that kind of performance knowledge and expertise and optimization actually fit extremely well with bringing AR experiences onto Spectacles, where we have even greater compute and battery constraints and we need to optimize even more towards the specific characteristics of that device. So lenses as an application format for us across mobile and Spectacles have made a ton of sense.
And maybe to give some examples of the benefit that we've seen from building a single platform for both: a lot of what we're seeing with Spectacles has been incubated in Snapchat and in mobile AR. So things like connected lenses, that is an infrastructure service that we built first for lenses on Snapchat and that is now powering the shared AR experiences on Spectacles. Things like SnapML were built first for mobile, and that framework is powering a ton of really interesting use cases on Spectacles. A lot of Snapchat services, things like connecting with friends or Bitmoji, are built for Snapchat first and then brought over to Spectacles. And what's actually been really exciting is that we've seen contribution in the other direction happen, too, where the whole Lens Studio 5.0 rewrite was really motivated by us wanting to build a better foundation for larger-team, extensible development for Spectacles. But that work has given us the ability to much more flexibly create tools and add capabilities for lens creators, both for Spectacles and for Snapchat and Camera Kit. So we continue to see a huge advantage to being in this position where we can build a tool, have control over our ability to optimize at the lowest possible levels, and bring the most advanced vision and graphics technology to both mobile and Spectacles. And so having the single tool has really helped us.

[00:12:36.107] Sophia Dominguez: Yeah, that's awesome. Thanks for sharing your vision with everyone. I'm sure that's been a question that's very top of mind for many of the folks here. So now this is a question for both of you. What lenses are you seeing that really excite you? And as a follow-up question, what sort of lenses would you like the community to build more of?

[00:12:53.064] Evan Spiegel: I'm really excited by a lot of the generative AI lenses that I've seen and the way that people are experimenting with SnapML. I think that is really, really inspiring. And as Bobby shared a couple of the new tools coming to Lens Studio, as I look towards the future, I think things like Easy Lens are going to make it really fun to not only prototype lenses and get your idea out there faster, but will also lower the barrier to more and more people using Lens Studio to develop really compelling and immersive lenses. I also think, just for me personally, given some of the lenses I've had a chance to play with on Spectacles, the idea of sharing computing together in the real world is just so fun, and there's nothing else like it. So, you know, the other day we were playing Capture the Flag with Wabi Sabi Games' lens, and I was out of breath running around trying to capture the flag. But to be able to share that experience together is just so much fun. And so I'm really excited to see all of these shared experiences that people are going to build with Spectacles and on Snapchat as well.

[00:13:55.666] Bobby Murphy: I'm a huge fan of games. So any lens games, I love. I've been enjoying Master Chef and Fruit Bouncer. We called out some of them at the keynote yesterday. So those have been super, super fun. It's been great to see what creators have done with Bitmoji. I think that's a real exciting capability, and as mentioned yesterday, we're continuing to invest in that. That's been fun to see. On Spectacles, and maybe just generally, I love the experiences that allow people to get creative in AR. So, you know, the Imagine experience that Evan shared on stage, and some of the tools to create flowers in the world, or kind of draw in 3D space, and share that. I mean, that's something that my kids at home have had a ton of fun with. So I love experiences, especially on Spectacles, that really allow people to get creative with their space and make something in a lens. So those have been some of my favorites.

[00:14:44.392] Sophia Dominguez: All right, a Spectacles-focused question. So Snap has been working on Spectacles for 10 years, and yesterday you announced the launch of the new Spectacles. Can we get a round of applause, everyone? And not only that, everyone here gets Spectacles, so also a round of applause for that. So congratulations. Tell me, why Spectacles? Why SnapOS? Can you just share more about your vision for both of them?

[00:15:24.151] Evan Spiegel: Yeah, in the early days when we started working on Spectacles, I think the first version shipped in 2016 with a camera, it was really just about getting the camera out of your pocket and allowing people to share their perspective hands-free. And we saw that that was really compelling, but barely scratched the surface of what the Snapchat camera was capable of. We found that the AR lenses, the experiences within Snapchat, were really constrained by this very small screen that was in two dimensions, and that made it very hard to share those experiences together with friends. And so that inspired us to push much faster into getting an AR display up and running on Spectacles, which happened, I think that was in 2021. And now, really, the release of SnapOS allows people to build unbelievably complex, immersive, and of course shared lenses and to deploy them very quickly to Spectacles. And as we look at the long-term trajectory here, it just feels like we're getting very close. We believe that by the end of the decade there will be millions of people using augmented reality glasses, and hopefully the vast majority of those will be Spectacles. But it just feels like we're really getting close. And I think as you've all had an opportunity to try Spectacles and to try SnapOS, we're just so excited to build this future together with all of you. So I'm sure we'll get more into this with the Q&A session. But it's just so exciting. I remember, and now we're dating ourselves, but Bobby and I were really there for the smartphone revolution and the creation of the App Store and all the really exciting apps that have now, I think, really helped to change the world, Snapchat being one of them. But I think that this transformation will be even more compelling because it's something that actually can be shared together in the real world, in a way that screens just can't accommodate.
So we're now so close to fulfilling this vision and largely thanks to you and all of your support and help. So thank you.

[00:17:31.374] Sophia Dominguez: All right, last question before we open it up. So this is a question to both of you. How do you foresee the ways creators and developers can make money and find success building on Snap AR's evolving platform?

[00:17:43.849] Bobby Murphy: Yeah, so I think we all know this is a very nascent field. And so I think our first objective, creatively, technically, but also from a monetization standpoint, is to be flexible and experiment and be adaptive to what we're seeing from creators, from Snapchatters, and from future Spectacles users. Some things are working, like we see a lot of opportunity for brands and partners to continue their investment into AR. So that's happening through sponsored AR. With Camera Kit, we're seeing growing interest and looking forward to growing that. So I think those will create interesting opportunities for people to create really fun and high-value work. And then our Creator Rewards program is doing really well. I think we're very happy with how that's evolving, and we'll continue to iterate on that. But I mentioned yesterday that we're growing that. So quite optimistic about it. And then with Spectacles and other kind of specific development paths or opportunities, we're always looking for ways that we can ourselves fund and invest in really interesting creative work. So there's a lot of things that we're working through, and I think even more things that we'll play with and experiment with and probably share and talk more about in the coming months and over the next year.

[00:18:58.027] Evan Spiegel: I think it's fun to describe a product that hundreds of millions of people engage with every day as nascent, so I think that gives you a sense for the scale of the ambition here at Snap, and certainly the potential we see for augmented reality. I think so far on Snapchat, folks have had quite a lot of success helping brands and businesses build augmented reality experiences to engage their customers. That's certainly where we've seen a lot of momentum over the years, and a continued significant level of interest, just because the results are there when brands and businesses interact with consumers in this way on Snapchat. As we look at the future of Spectacles, I think Bobby and the team are really trying to be responsive to the different business models that developers and creators want to pursue, whether that's subscriptions or in-lens purchase or advertising. And so working together with all of you over the coming years to really build those tools, in a way that, as we said yesterday, makes us the most developer-friendly platform in the world, is, I think, the way that we can really make this ecosystem successful over time. So we'll really be counting on your feedback and the business models that you want to experiment with and try as we work towards, you know, the widespread adoption and rollout of Spectacles.

[00:20:09.193] Sophia Dominguez: Great. All right. Question time. So who would like to go first? I think we'll have a mic set up here. So if anyone has a question, just feel free to walk up to the mic and... Take it from there. Wow, a lot of questions. Love it.

[00:20:26.706] Question #1: Hello. Thank you so much for investing in the creator community. You guys have the strongest, and we're all buddies, so that's not an accident. We love creating. My question is just on the future plans for Spectacles. We have, of course, Lenovo and Magic Leap semi-successfully targeting enterprise with medical, even military contracts. How do you plan to differentiate Spectacles? Is it a consumer-first product? And then how do we plan to, I guess, monetize that?

[00:21:01.960] Evan Spiegel: Yeah, our vision is definitely that it's a consumer-first product, and that's why we've really focused on the things that we think can differentiate Spectacles. So, of course, see-through, to keep folks grounded in the real world, and shared together with friends, because those are the experiences that we've found to be extremely compelling. And then, you know, these natural interactions in SnapOS make Spectacles super easy to use. So my favorite thing when people try the new Spectacles is that it takes them seconds to get up and running, and then all of a sudden they're having a blast and using all these different lenses. So I think that ease of use is really important to widespread adoption. I do think the history of technology would show that if you can build a really successful consumer product, very quickly the enterprise use cases become evident, and people want to push harder into the enterprise use cases and categories. But I think holding us to a consumer standard, to build a product that hundreds of millions, billions of people around the world can use, will eventually create more enterprise opportunities. And that's why I think it is going to be really exciting to work together with all of you around the consumer business models, whether that's advertising or subscription or in-lens purchase, all of which I think will be ways that folks can build businesses with Spectacles. Thank you.

[00:22:07.562] Question #2: Hello. First of all, congratulations. I've been in this for like eight years, so it's really cool to see how far it has come. And you guys don't have the extra computing puck. Good job. OK, now this question's going to be a little bit more difficult, but I think it's something that's important to ask. So you guys talked a lot about developers, developers, developers, how you want to be a super friendly platform, how you need us. It makes sense. But what do you do when you can do anything? At the same time, while it's really impressive, you guys are showing off all the AI-generated lenses that create project files, which is really cool but will financially undermine some of the work that's done in this room. For a lot of people here, a lot of the income comes from advertising lenses and stuff, and now a lot of that's going to go away. So how do you balance inspiring developer loyalty while openly working on these tools that will make the lives of some people here difficult?

[00:23:02.802] Bobby Murphy: So great question, and yeah, I spoke to a few people after the keynote yesterday who asked similar things. So first of all, I do not think, we do not think, that Easy Lens will take any work away from creators. In fact, even if we take the example of sponsored advertisers who are looking to advertise in AR, right now it's a huge barrier that the minimum amount of investment an advertiser has to make to do anything in sponsored AR is several weeks of work. We think we can significantly grow the total size of the market by generally making AR creation more accessible. What we also see is that in places where our creators and our internal team actually spend a huge amount of time investing in very high-quality, deep experiences, there's a lot of opportunity for lens experiences to be even deeper, more robust, more detailed, much more complex and interactive. And one of the obvious things that gets in the way of that is just the time that it takes from start to finish to build some of these things. So the thinking with Easy Lens and these other tools is that we can allow creators to build way bigger and more interesting experiences in the same amount of time that you would otherwise spend on what you currently do. And we can lower the barrier, allowing a much wider audience of people who will then continue to fuel and grow the AR industry as a whole. So I think there's a lot of opportunity to do both things with these tools. Cool. Thank you. Appreciate the question.

[00:24:39.929] Question #3: Hello, thank you so much for all these initiatives. Super exciting. I had a question for all three of you. I want to ask, what is one book or one movie that inspires your vision of what AR can look like in the future? I would love for each one of you to share at least one book or a movie, if you feel comfortable.

[00:24:59.691] Sophia Dominguez: I can go first.

[00:25:00.828] Question #3: Please.

[00:25:01.548] Sophia Dominguez: OK. So when I was 13, I read a book called Feed, actually, by M.T. Anderson. It talked about a future world where we all had a chip on our bodies and everything was AR and VR, but they didn't call it that because it was so normal. And I'm really aging myself. I had a T-Mobile Sidekick at the time. I'm sure many of you don't even know what that is. And I remember looking at it and being like, this makes no sense. I'm confined to this little rectangle. I don't get it. So I always knew that I wanted to work in AR after reading that book. So it happened to be very early. And yeah, that was the first book that ever inspired me. Thank you so much.

[00:25:37.438] Evan Spiegel: This is going to be so embarrassing, because I'm forgetting the title of the book. But I'm going to describe it, and then someone's going to know. And then you can shout out the title, and then we'll figure it out. But there's a fabulous book where a young girl finds basically a computer-programmed book that teaches her lots of things. Do you already know it? Thank you. That was really inspiring to me because some of the things I'm most excited about in augmented reality are really oriented around learning. And I think what I'm seeing evolve with generative AI right now is making that vision possible on a timeline that is just way faster than I ever expected. So that to me is really, really exciting. And I think that's something that we pretty uniquely have an opportunity to go after, just because we have such a deep relationship with our community and our customers, and we can combine our understanding of who they are and what they're looking for with the power of generative AI and Spectacles as the delivery mechanism for these experiences. That, to me, is super exciting.

[00:26:32.162] Question #3: Amazing. Thank you.

[00:26:33.450] Bobby Murphy: So Creativity, Inc. is one of my favorite books. It talks about the early Pixar story. It's less specific to AR, but I think the combination of pushing graphics and ML technology forward while trying to be inventive and creative is really fascinating, because I think there are a lot of very creative people in the world who use the tools that are available to make things. And then there are a lot of technologists in the world who are trying to expand those tools. And I think there are very few companies that sit at the true intersection of those two things and are trying to do both at the same time.

[00:27:06.566] Question #3: Amazing. Thank you. So that was called Feed, The Diamond Age, and Creativity, Inc.? Yeah. Sweet. Thank you.

[00:27:13.828] Question #4: Thank you. Good morning and a huge congrats on Spectacles. I've been working in AR for 19 years, so been following Snap very closely and so always really grateful for the innovation and, you know, pushing the space forward. So my question is about external peripherals and thinking about the body and if there are any plans now or in the future to perhaps connect to Apple Watch for heart rate, perhaps BCI, NextMind, or, you know, anything else that may personalize the experience, personalize the story, and, yeah, drive that forward with biometrics.

[00:27:50.345] Evan Spiegel: So first of all, thank you for your long-term commitment to AR. It's so fun when I talk with our team because I'm like, oh, man, we've been working on this a long time. And then they're like, well, actually, I've been working on this 25 years when the headsets were mounted to the ceiling. And I'm like, right, sorry. So it really does take that sort of long-term commitment to create this kind of technology, and we really appreciate that. In terms of peripherals, that's on the roadmap. That's coming. I think one of the reasons why we wanted to get the fifth generation of Spectacles out is just so we could start delivering the OS updates really consistently and in response to all the feedback from our community. So that's coming. I can't wait to see what folks will build with that, but I think there's great opportunity there. And of course, we do have some long-term investments in BCI that I cannot talk about publicly, but I think will be fun over the long term as well.

[00:28:36.388] Question #4: All right. Thank you so much. Congrats.

[00:28:39.134] Question #5: Hey, my name is Bilal. Also a decade in the industry. Not quite roof mounted, but let's say DK2 onward. So y'all are one of the few companies that are creating products and experiences anchored in the real world, connecting with people and places around you. And of course, you're experimenting with LLMs, with some of the stuff you showcased. You have your own AI chatbot. The question I have is around how you think about embodied AI characters. You've obviously built these tremendous devices that have context into the real world around you. How do you think about, can we build AI characters that are additive to the real world, or are we building the frickin' matrix?

[00:29:21.695] Bobby Murphy: I guess, I mean, I don't know if you have some immediate thoughts. I do have a point of view.

[00:29:26.722] Question #5: Yeah, sure.

[00:29:31.010] Evan Spiegel: You know, some of the things I've been most excited about so far, you know, the team, and I think there is a demo of this. I don't know if we made it available yesterday, but I think we should make it available during Lens Fest, of, you know, actually taking folks who are real human beings in the real world and bringing them into AR with splatting and some of the more advanced techniques we've built around video. That, to me, is much more compelling than the AI-generated characters, mostly because, as we looked at the chat space, I got kind of weirded out by the volume of, like, romantic partner AI chat stuff going on.

[00:30:03.650] Question #5: And consistently, in the rankings, I think, like, Character AI was ahead of HBO and Netflix. And I was like, what?

[00:30:10.851] Evan Spiegel: Yeah, I just got kind of weirded out by that. We were like, that's not really where we want to play for now. So I'd say a lot of our focus is much more like, how do we bring in the world's best? We have a great lens example of one of the world's top boxers teaching people how to box in AR with glasses. Those are the sorts of things that we're more excited about and more focused on, and I think that will help make a lot of that more accessible to people. If you could have the world's best coach in whatever you can think of in your living room, that's exciting to me, more so than the AI characters.

[00:30:46.918] Bobby Murphy: I'll add, just maybe more generally, what's interesting to me about AI from an artistic, creative perspective is its ability to amplify original creative thinking. And so I love the idea that we can use generative tools to take original characters and kind of bring them into new scenes, or take original art styles and amplify them and extend them to any user. Like the Jonathan Yeo example we had on stage yesterday. So I think, generally, the capacity for AI, and this kind of goes to the Easy Lens question too, I think for anyone who's doing creative work, the ability to use AI to then, like, expand that and amplify it to more places is quite interesting, and we'll continue kind of exploring that space.

[00:31:31.493] Question #5: Thank you very much. If you've got volumetric splats in the lens, I think we'd all love to see it. Cheers.

[00:31:37.568] Sophia Dominguez: Okay, we only have time for one more question.

[00:31:39.929] Evan Spiegel: Should we try to do like rapid fire and get through three or something? One sentence answers.

[00:31:44.211] Question #6: All right. So I've got a question more on the Specs development side. Will we be able to kind of escape the JavaScript sandbox and make bare-metal apps, like directly at the C++ level? Say I want to, like, stream video from the Spectacles to OBS and then to Twitch, or make, like, a video decoder or something like that? Is that anything that's in the works?

[00:32:09.298] Bobby Murphy: Well, so to add to what Evan said earlier, the whole point of our developer program is to get feedback and questions like this. So I would say that's a fantastic one, and let's talk to the team about it, see what you're trying to do. Thank you.

[00:32:21.650] Evan Spiegel: I don't know if I'm ruining the announcement, but developer mode gives folks raw camera access, and then we're figuring out how to evolve that in a way that can be privacy safe, but still let you build the experiences that you'd like for the broader community.

[00:32:33.942] Question #6: Yeah, right. I'm thinking kind of like the permission system you have on the Apple ecosystem, something like that. Just pitching ideas at this point. Thank you. Yeah. Thanks. Thank you.

[00:32:42.424] Question #7: All right. So do you see a future where Spectacles and SnapOS can be the only devices that I leave my house with? I can leave all my other devices at home? And if so, what steps do you think need to be taken to make that possible?

[00:32:56.086] Sophia Dominguez: It's supposed to be rapid fire.

[00:32:59.687] Evan Spiegel: You can talk really fast. We certainly think that'll be a possibility over the longer term, although I think in the near term, the ability for Spectacles to interoperate with your phone, with your laptop, with your desktop is something that's exciting to us, because the reality is we use all these different technical products for different reasons and for different use cases. And so I think interoperability is more important than sort of one device to rule them all. But I certainly think a lot of the basic functionality, if you kind of go through the top things folks are doing with their phones: call your friends, you know, basic web browsing and email. Those are the sorts of things that I think we'll have to check the box on before you can feel like you can just run out the door. Awesome. Thank you.

[00:33:41.050] Sophia Dominguez: Okay, last one.

[00:33:43.131] Question #8: Hello, I'm Sasha Savigallian, and my question is, while other companies are focusing on putting humanity in a virtual reality, Snap is focusing on keeping us in reality, but with the virtual world blending in. So my question is, why is Snap's strategy better for the future of humanity?

[00:34:00.166] Evan Spiegel: Oh, I love that. Fundamentally, you know, and I think this is supported by the research, people's relationships with one another are the number one predictor of their mental health and well-being. And relationships are at the heart of everything we do with Snapchat. It's at the core of the Snapchat experience, and it needs to be at the core of the future of computing. And I think with a lot of computers, I really felt this growing up. In order to get all the awesomeness out of computing, I had to go to the computer lab. And it took me away from recess with my friends and hanging out. And I really dream of this world where I look out the window and our four boys are running around outside wearing glasses, playing together. And I talk to my wife, and I'm like, we can't get them off the damn computer. If computers have the potential to bring people together and to empower creativity, then I think that future is gonna be much better for all of us.

[00:34:50.675] Question #8: Totally agree. Thank you.

[00:34:52.158] Sophia Dominguez: All right, thank you so much. Thank you both.

[00:34:54.001] Evan Spiegel: Thank you.

[00:35:00.987] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. That's a part of my larger series of doing a deep dive into both the announcements around Snap Spectacles, as well as the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also the ecosystem of developers that they've been able to cultivate. And so for me, I find the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and from listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit different approach than what anybody else is doing. But it also takes a lot of time and energy to go to these places and to do these interviews and put it together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month will go a long way toward helping me sustain this type of coverage. And if you could give more, $10 or $20 or $50 a month, that has also been a huge help for allowing me to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
