#1668: Snap Co-Founders Community Q&A about Specs 2026 Launch Plan

The Snap co-founders, CEO Evan Spiegel and CTO Bobby Murphy, typically hold a community-driven Q&A after their LensFest keynote where they field over a dozen questions from LensFest attendees. I'm including this in my coverage again this year as it's a really great set of questions about their consumer release of Specs AR glasses next year, some of their thinking about the role of AI at Snap, and reflections on their 10 years of working with AR lenses, going back to the rainbow-vomiting face filter released in 2015.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, I'm going to be featuring a Q&A that the co-founders of Snap held with the developer community at their LensFest, which I think is an amazing tradition. You don't see this a lot from other companies: during their main keynote, they invite the audience to ask whatever questions they have. And I find that some of their questions are questions that I would have myself, and I think it's a good representation of the types of things that are coming from the community and the things they want to know from the co-founders of Snap. So at the Snap LensFest, which they brought me down to LA for, I was covering the Lensathon, which I'll have more coverage of later on in the series, but this Q&A was referenced through other conversations that I had. As the co-founders, CEO Evan Spiegel and CTO Bobby Murphy have a very strong vision for augmented reality, and they're saying that they want to do a consumer launch of augmented reality glasses. At the same time, Magic Leap did the exact same thing, where they wanted to skip straight to the consumer launch, but the headsets were so expensive that at some point they had to pivot over to more enterprise and LBE types of contexts. And so I'm curious if that will be the same situation here, or if it's a different moment in time where people are really ready for this type of full-blown augmented reality glasses. So there's a lot of questions around that with the strategy and monetization. Location-based entertainment is also something that has had a lot of momentum: Cluu is something that was featured at Augmented World Expo, and they're deploying the Snap Spectacles developer kit essentially out to the consumer market. Snap saw that there was a lot of momentum there and developed a whole set of fleet management tools for Cluu as well as other LBE types of contexts, and that was some of the new features that I was shown. Those are some of the things that I'm going to be thinking about in terms of: is this actually a consumer device, or is this going to be more of an early adopter, dev kit, prototype set of experiences? Especially because Snap is developing their whole software stack, both the operating system as well as Lens Studio, which is a lot more pared down from what we've seen traditionally from the existing game engines that are out there. But it's also a proprietary stack, and they're still developing it, so there are things that they have to continually add and build up from scratch. Meta tried to do that at some point, but they abandoned that vision of creating their own operating system and ended up going with Android. So anyway, it's unclear what path and direction they're going to actually take as they move towards their consumer release sometime next year. If I were to guess, it's probably going to be sometime in the third or fourth quarter next year, after Labor Day, just because that's usually when new consumer launches are happening with XR devices.
We just had a couple of new XR launches this week, with the Samsung Galaxy XR deploying Android XR as well as the refresh of the Apple Vision Pro with the M5 chip. So this typically is when a lot of the new XR devices are launched. We'll see what happens with their consumer launch next year, and whether it's going to be more focused towards enterprise use cases or more consumer applications as they try to develop their whole ecosystem over the next year or so. So covering all that and more on today's episode of the Voices of VR podcast. This Q&A with the Snap co-founders, Evan Spiegel and Bobby Murphy, was moderated by Resh Sidhu and happened on Thursday, October 16th, 2025, after Bobby's keynote at LensFest at their headquarters in Santa Monica, California. So with that, let's go ahead and dive right in.

[00:03:50.004] Resh Sidhu: All right. What an incredible keynote, Bobby. I don't know about you all, but he had me at dancing golden retriever puppies. All right. Ten years of lenses, and we've barely scratched the surface. It feels like we're just getting started. You know, this is a really rare opportunity to have both Evan and Bobby on stage. I've been speaking to a few of you over the past few days, and there are a lot of questions, so we are going to get there really fast. Evan, I want to come to you first. 14 years ago, Snap changed how people use the camera. And when you look back at everything that's been built since, by this incredible developer community and by the Snap team, what stands out as something transformative, a moment for you personally?

[00:04:43.663] Evan Spiegel: First of all, good morning, everybody. It's so great to be here. Thanks for all of your incredible support. And it's so fun to see all of the amazing things that you all are building. I can't wait for the hackathon awards coming up here. So thank you. I think for me, the journey building lenses has been incredible, really starting as a tool to help enhance self-expression, to lower the barrier to creativity, where anyone felt like they could use a lens and express themselves and have fun with their friends. I think one of the biggest turning points more recently has really been around machine learning and artificial intelligence. Going back to about 2020, when we introduced SnapML, I think Bobby would say, or he did say earlier this morning, that it wasn't a huge hit at the time. But it represented the beginning of bringing all these machine learning tools, especially the ability to run lenses on device with these real-time transformations, and that has unlocked so much creativity since then and played an incredibly important role as we think about the future of Specs. So I think for me, that major turning point has really been bringing all of these ML- and AI-driven tools into Lens Studio and seeing the incredible creativity that's resulted.

[00:05:53.297] Resh Sidhu: I love that. There are some moments that you don't feel at the time, and then you realize that this is going to change everything. And then later down the line, these guys do it all the time. They're like, we told you that was going to be great. We love that. I love that. OK, so we want to hear from you. I am going to encourage you to come to the two mics. There is one stage left and one stage right. Now is the time to start queuing up. If you have a question for Evan and Bobby, they are here. Please start queuing up. But before I do that, Bobby, I'm going to come to you for a question. So, you know, Evan talked about a moment that was quite transformative. For you, you've been deep in building that vision for developers. You know, we've launched a ton this year. We just talked about Lens Studio AI, all of the new Specs features, which is phenomenal, and Snap Cloud, which we know you're all loving. What's your vision for the future of Lens Studio and also lenses themselves?

[00:06:48.683] Bobby Murphy: Yeah, well, first, I have to give huge credit to our unbelievable team. We have so many smart, super talented, creative people who love waking up every day thinking about how they can build better tools and capabilities and opportunities for developers. And anything that I'm sharing here is on behalf of this remarkable team that we have. If you look at what we're doing, or what we've been trying to do over the last several years, and as we think ahead to the next few: lenses are really, to me, a phenomenal format for creative expression. It's an amazing tool for a developer or creator to take an idea and turn it into something real that a Snapchatter, or a Specs wearer, or someone engaging with a Camera Kit mirror can experience in their own way. And that's a really, really powerful medium for artistic expression. So when I think about what we're trying to do with lenses and Lens Studio, including Lens Studio Mobile and Web now, it's really to increase the creative space that you all have to operate in, to make sure that we're providing better opportunities to stretch your imagination and translate that imagination into really awesome experiences, that we're lowering the barrier to entry to doing that, while also enhancing what our advanced, deeply committed, and invested developers can do at the top end, inventing new hardware with Spectacles to take it to a whole other level. So all of this, to me, is really about empowering our developers to stretch their imagination and transmit amazing ideas into the hands of Snapchatters and Specs wearers. And we're at a very, very exciting moment in time, especially with all the advances in AI, as Evan touched on. So a lot of bright things ahead, I think.

[00:08:23.914] Resh Sidhu: Yeah, I love that. I love that idea of just creating space and the tools to allow people to really express themselves. All right, let's go to your questions. We're going to go stage left first. If you tell us your name, who the question is for, and then the floor is yours.

[00:08:40.017] Questioner #1: Hi, my name is Raymond Motha, and my question is for the chief technology officer. This one is going to be around the future of Spectacles. Right now, I think you're using the Qualcomm chipsets to develop. Do you envision ever going off those chipsets into custom development to build lighter-weight Spectacles, or are you always going to be tied into kind of the OEM manufacturers?

[00:09:04.029] Bobby Murphy: Well, actually, I'll defer to Evan because I'm not sure what we're allowed to share here, but what if we...

[00:09:12.293] Resh Sidhu: Evan, do you want to take that question?

[00:09:14.274] Evan Spiegel: I can't ruin all of our surprises. That's certainly something that we would consider in the future.

[00:09:18.397] Questioner #1: Do you mind if I ask another question then? By all means, yeah. Right now, it seems like Snapchat is more consumer-focused, where you're focusing on B2C. Do you think that potentially Specs will unlock more B2B, where you're unlocking productivity gains for businesses? And how might you plan on monetizing those specific apps that are created for B2B experiences?

[00:09:42.802] Evan Spiegel: I think it's such a great question. That's actually where we're seeing a lot of demand today, even just for the developer version of specs. We've gotten some requests for very large volumes of units and folks really wanting to build B2B experiences or enterprise experiences with specs. I think largely because there's an absence of other compelling headsets or glasses on the market that have specs capabilities. As we think about next year, specs for us is really about setting that consumer bar. We want anyone to be able to pick up the glasses, use them, get a ton of utility, and have a lot of fun using the product. But I think there's no question that they're going to be adapted for enterprise use cases, certainly just based on the inbound we've received already with the existing generation of specs.

[00:10:22.829] Questioner #1: Thank you.

[00:10:23.830] Resh Sidhu: Thank you. Great questions. All right. Next up.

[00:10:28.200] Questioner #2: I have a feeling this question is for Evan as well. Really happy to see the monetization options coming online. But in that same vein, my question is content portability. So do you ever see a path where we as developers build something in Lens Studio targeting not only your flagship device, but taking it to other devices as well?

[00:10:48.061] Evan Spiegel: This is probably actually a question for Bobby, and Bobby loves paying developers all the time. He's like, how do we get more money to developers? I think as we look at extending what folks are building in Lens Studio, that's why we've thought about Camera Kit, being able to deploy onto Specs and onto Snapchat. And so we are interested in those sorts of cross-platform applications. I think the challenge has been, as we've looked at the ability to deploy lens experiences on other platforms, sometimes those platforms just don't support a lot of the advanced capabilities that lenses offer, and that's where there can be some challenges. But I don't know if you have some further thoughts.

[00:11:21.175] Bobby Murphy: Yeah, no, I think you covered it. With Camera Kit, as we mentioned in the keynote, we're reducing the need for branding and opening up flexibility, and we're continuing to invest in that as a really powerful avenue for people to distribute lenses off of Snapchat or off of Spectacles. And as new distribution outlets come online and are of interest to developers, I think we'll explore Camera Kit on any number of surfaces. So that's really exciting. And then similarly, I think the WebXR support on Spectacles is a great way to go the other way, for developers who are building on the web to bring those experiences into Spectacles without necessarily having to go and build a lens.

[00:11:57.339] Questioner #2: I'll just be very specific and say I would love to take my lens app to Vision Pro or Quest, even though I know you're restricted by what those hardware vendors give you access to. So...

[00:12:08.317] Bobby Murphy: I'll leave it. It's top of mind for us. Thank you.
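For developers curious about the WebXR path Bobby mentions, here's a minimal sketch of the standard WebXR entry point a web experience would use. This is just the cross-platform browser API, not anything Snap-specific; whether the Spectacles browser supports the immersive-ar mode and the optional features listed here is an assumption to verify against Snap's developer documentation.

```typescript
// Minimal WebXR bootstrap: request an immersive AR session and start a
// render loop. Casts to `any` keep this compiling without @types/webxr.
async function enterAR(canvas: HTMLCanvasElement): Promise<void> {
  const xr = (navigator as any).xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.warn("immersive-ar is not supported on this device");
    return;
  }
  // hand-tracking and hit-test are standard WebXR feature names, but
  // per-device support (including on Spectacles) is an assumption here.
  const session = await xr.requestSession("immersive-ar", {
    optionalFeatures: ["hand-tracking", "hit-test"],
  });
  // Bind a WebGL context so the device compositor can present our frames.
  const gl = canvas.getContext("webgl2", { xrCompatible: true });
  const XRWebGLLayerCtor = (globalThis as any).XRWebGLLayer;
  await session.updateRenderState({
    baseLayer: new XRWebGLLayerCtor(session, gl),
  });
  const refSpace = await session.requestReferenceSpace("local");
  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // Draw the scene once per view in pose.views (one per eye on
      // binocular glasses like Specs).
    }
    session.requestAnimationFrame(onFrame);
  });
}
```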

[00:12:12.633] Resh Sidhu: Amazing. And we'll go to you.

[00:12:14.824] Questioner #3: Hey, my name is Kevin Bonzio. We are working on a Camera Kit app and have built a ton of lenses with Lens Studio. And I absolutely love the ML tools you guys have, whether it's creating a face effect or just an AI agent. I sometimes thought it would be so cool to expose some of these functions that are so great to the users directly, right? Like, why should I now use some diffusion model to create a face mask when you guys do it so well? With the video model that you announced, you're kind of doing that now. So I was wondering if that is something that you would think about in the future.

[00:12:44.600] Bobby Murphy: It is, actually. And yeah, these newest generative AI models are incredibly accessible. That's partly why Lens Studio AI, and extending that interface into a more convenient app in Lens Studio Mobile and web, is, I think, pretty interesting: we can unlock creativity for a much broader base. We've also seen in lenses, in Snapchat, some real great success in open-prompt-style experiences, in which end users have much more capacity to create in the moment. All of this is definitely within purview for us and in the space of things that we're thinking about. The whole promise of AI, I think, is exciting for this very reason: you can unlock creativity for a much wider audience while still raising the ceiling for advanced developers.

[00:13:27.284] Questioner #3: Thank you.

[00:13:28.886] Resh Sidhu: Love it.

[00:13:30.629] Questioner #4: Hi, my name's Alfred. So Blocks is a new product that you're launching, and I know you have the Asset Library too. So how are they going to be integrated? Is the current Asset Library going to be converted into Blocks? Are they going to be two separate tracks, or is it going to be a mixed bag? How's that going to look when you ship the product?

[00:13:50.950] Bobby Murphy: I think for the time being, it'll be a mix. We're not converting one into the other. I think the Asset Library is kind of a catch-all for all kinds of tools and assets and components that people may want to use. But broadly speaking, the opportunity with Blocks is pretty enormous, and in particular the usability of them. For people who are engaging, or really interested and excited about creating with the support of Lens Studio AI, Blocks offer a very powerful interface to engage with. We're still, I would say, pretty early. And actually, there's been a handful of developers who we've worked with in the last couple of weeks who've already done some outstanding things with Blocks. So we're very, very excited about this new foundation, and we think there's a lot of upside as we evolve this.

[00:14:31.571] Questioner #4: Cool. Thank you.

[00:14:32.412] Resh Sidhu: Thank you, Alfred. We probably have time for one more question.

[00:14:35.675] Evan Spiegel: Oh, come on. We'll just... Can we? Let's do a few more.

[00:14:43.122] Resh Sidhu: You're the boss. Yay! I love it.

[00:14:43.971] Questioner #5: Hello, Zihao here, traveling from New York City. I have a question specifically about applying Spectacles to more engaging visitor experiences, because I'm from New York, and there are a lot of visitors traveling from all over the world to New York. We've seen a lot of references and cases, both in Asia and Europe, where they're basically using AR glasses to enrich visitor experiences, whether in museum settings or outdoors. But surprisingly, nobody does that in New York City. So we just started to launch our Chinatown Arcade in Chinatown. And we're wondering what you see as the monetization opportunities or the basic development opportunities for these kinds of projects in the future. Thank you.

[00:15:26.175] Evan Spiegel: I think it's such a great question, because we're already seeing so much excitement and momentum there. We call them location-based experiences. And a lot of what we've tried to offer are more advanced fleet management tools, for example, to be able to manage a fleet of Specs and to be able to manage the experiences people are having. And folks are already building really compelling businesses in this space. I think Cluu is a great example with what they're doing. So I'm excited to hear that that's something that you're working on. If we can better support you or share some of the tools that we built to make those location-based experiences easier to manage and easier to offer, we'd be happy to do that. We see a ton of opportunity.

[00:15:57.532] Bobby Murphy: And on the topic of monetization, that's exactly the intention with Commerce Kit. We want to work with developers who are interested in figuring out great ways to monetize lens experiences on Spectacles in a very flexible way. So yeah, that's kind of the intention of that program, to learn from all of you on what's appealing and relevant.

[00:16:13.756] Questioner #5: Thank you so much.

[00:16:16.385] Questioner #6: Nice to meet you. My name is Neha Bindjapuri, and now that Spectacles will be going public in 2026, my question is, what do you think will be the biggest bottleneck for widespread adoption of Spectacles?

[00:16:29.321] Evan Spiegel: I think the biggest bottleneck is that it's a fundamentally new type of computing experience, which means that people need to try it to understand how it works. And I think right now, if you look at a lot of the products that are in the market, they're very simple, right? The sort of camera glasses that we released 10 years ago, or these very sort of monocular, very small display glasses, which I think, unfortunately, are shifting the market perception in an unhelpful way, right? Because people are trying products that fundamentally are not very useful or valuable. And so I think the challenge is going to have to be re-educating folks on what's actually possible with AR glasses and why having a computer and a pair of glasses can be so transformative. So I think we've got a lot of work to do to help empower people to try the product really easily and to see what it's all about.

[00:17:13.044] Questioner #6: Awesome. Thank you.

[00:17:13.704] Questioner #7: A very good morning. I'm Crazy Grunal from India. All the way from India.

[00:17:19.409] Evan Spiegel: Yes. Wow. Welcome. Amazing. Thank you. Amazing.

[00:17:20.049] Questioner #7: So let me just say this first: Snap has one of the best, and I would say not just one of the best, it's the best AR tech out there, at least that's how I feel. But currently, I feel it has been bottlenecked by distribution and PR. Even though we have the greatest and latest tech out there, not many people know about it, right? So does Snap have a broader strategy for how to tackle that?

[00:17:51.457] Bobby Murphy: We're working on it. Yeah, that's what this event is for, what all of our developer outreach is for. Definitely looking forward to feedback. I'd love to talk to you afterwards and hear where you feel like we could do better. We definitely want to get the word out. We know we have some great, fantastic technology and great tools and some really exciting developer opportunities. So yeah, that is our mission over the next couple of years is to make sure the world knows what we're doing and the opportunities that exist here.

[00:18:17.056] Questioner #7: Thank you so much. That would be crazy. Yeah.

[00:18:19.915] Questioner #8: How's it going, Evan and Bobby? Yusuf Omar here from Australia. A lot of us in the room are building for a future that both the public and sometimes investors can't quite see. How are you guys thinking about that mainstream adoption date? Because all of us don't want to be too early in terms of investment, but we also definitely don't want to be too late. 2030 was the number that's been floating around for years. Is that when we can expect to see billions of people around the world wearing these kinds of devices?

[00:18:48.461] Evan Spiegel: I think it's unlikely that you'll see billions of people by 2030, but we do think widespread consumer adoption is certainly possible by 2030. I mean, we said that many years ago, just looking at the technological roadmap and what's possible and sort of our investment milestones. So certainly 2030, I think folks will be able to build meaningful businesses with specs, and that's something we're really excited about and working towards.

[00:19:09.675] Resh Sidhu: Great question.

[00:19:12.203] Questioner #9: Hi. Well, first of all, thank you so much for this amazing space. I'm coming in from Mexico. I work with AAA brands to integrate creative technology, mainly for experiential marketing. And I feel like a lot of them are having resistance in seeing the potential and the future of AR glasses like Spectacles. So what is Snapchat's strategic plan with the rollout to achieve widespread adoption for daily use cases, not only for creators but for users as well?

[00:19:44.235] Evan Spiegel: I think one of the reasons why we've focused so much on seeding the development ecosystem so early is that consumers are going to have so many different amazing experiences to choose from when they buy Specs. I think that's critically important for technology adoption. And then we know that these larger companies tend to follow where their customers are. And so I think as we see more customer adoption, using the incredible experiences that smaller creators and developers are building today, then eventually over time, larger developers will move over as well. But I think seeding this incredible developer community we have today with Specs, building tools, and being really responsive to what they're telling us means that by the time we release Specs for the broader public, there's going to be such an incredible number of experiences for our community to enjoy. And I think that'll help propel consumer adoption, which will then compel or encourage larger developers to join in as well.

[00:20:35.131] Questioner #10: Hi, guys. My name is Sophie. I'm the founder of Makiar Studio. And I wanted to ask Evan about the general direction of how you see social media evolving, with AI stepping into the social media spaces and taking a big chunk of how the content is produced. I think we're going to see shifts in the way people are interacting with social media. So how do you see our way of being online a few years forward?

[00:21:03.240] Evan Spiegel: I think it's such a great question. AI is a really compelling creative tool. Where we're seeing it being used most often is actually with friends, back and forth, you know, with inside jokes and this sort of thing. As I look at entertainment content, really the whole concept of social media in many ways has changed dramatically and nearly collapsed, right? Instead, today we have messaging and entertainment. And I think that's really how people are engaging with content and with their friends on their phones. So in the messaging realm, we're seeing AI used all the time as a creative tool, for funny snaps or for stickers, these sorts of things. On the entertainment side, oddly, I think we're seeing quite a bit of pushback against AI-driven content. And if you look broadly at sentiment, what we're hearing from our community is they want more real content, and not even real content necessarily from celebrities or influencers, but real content from folks who live next door or something like that. So I think there's this really interesting moment where we're seeing so much uptake for AI as a creative tool with friends, but a real pushback in the entertainment space and a real desire for real content from real people across the world. That's what we're seeing today. I don't know how it'll evolve. I love the creative potential of AI, and I think we're going to just continue to offer these tools to our community and see what happens. But in the messaging space, massive usage of AI as a creative tool for connection, and on entertainment, more of a focus on authenticity.

[00:22:22.363] Resh Sidhu: Thank you. Amazing. Great question. We do have just one more question. I know, I know. And I will make this promise. For those who do have questions, come see me in the courtyard. I promise I will get the questions to Evan and Bobby, and we will respond back to your questions. I just have a question.

[00:22:39.040] Questioner #11 (through a sign language interpreter): Just a thank you. Just a quick thank you. I just want to say thank you so much for providing interpreters. It means so much to me. Not just a little, but a lot. I have gone to other events, and I have fought and argued for access. And I come here, and I feel at home with the way you treat me. It's amazing that you've brought in sign language interpreters, and it just means so much. Oh, amazing. And can I add one more thing? Please, I'm selfish. Can I say one more thing, if that's OK? I'm so impressed with you accepting continued feedback on the AR, on Spectacles, especially around the increase in battery life and power. I'm wondering, is that something that's going to change as well? Will we see more of an ability to plug your glasses in, or get more time on that?

[00:23:39.984] Evan Spiegel: Yeah, that's definitely a big focus for us. So we're working on it.

[00:23:44.620] Questioner #11 (through a sign language interpreter): Awesome. One last thing, a third thing. I want to become a developer, so where is Snapchat for Dummies?

[00:23:50.984] Bobby Murphy: Yeah, if you want to get into development, definitely go check out Lens Studio. There are a lot of great tools now. It's easier than ever to get started. If you want to work on Spectacles, there's a fantastic simulator in Lens Studio, too, even if you don't have a device. So there's a lot of great opportunities, and it should be pretty easy to build your first interactive 3D experience. And we love feedback.

[00:24:12.449] Questioner #11 (through a sign language interpreter): I can't wait to bring more deaf people next year. Thank you. That's it. Amazing.
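For anyone taking Bobby up on that invitation, here's a minimal sketch of a first interactive script, written in the TypeScript component style that recent versions of Lens Studio use. The decorator and API names here follow my understanding of Lens Studio's scripting conventions, so treat them as assumptions and verify against the current Lens Studio documentation.

```typescript
// A hypothetical first Lens Studio script: continuously spin the object
// this component is attached to. API names follow Lens Studio's TypeScript
// component conventions as I understand them; verify against the docs.
@component
export class Spinner extends BaseScriptComponent {
  // Rotation speed in degrees per second, editable in the Inspector.
  @input
  speedDegPerSec: number = 45.0;

  onAwake() {
    // Bind a callback that runs once per rendered frame.
    this.createEvent("UpdateEvent").bind(() => this.onUpdate());
  }

  private onUpdate() {
    const transform = this.getSceneObject().getTransform();
    // Build an incremental rotation around the up axis for this frame.
    const radians = (this.speedDegPerSec * Math.PI) / 180.0;
    const delta = quat.angleAxis(radians * getDeltaTime(), vec3.up());
    transform.setLocalRotation(delta.multiply(transform.getLocalRotation()));
  }
}
```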

[00:24:20.651] Resh Sidhu: I want to take this moment to say thank you to Evan and Bobby, both of you, and Evan for giving us more time for these questions. We love that. Also, I want to take a moment to say thank you to all of you for those that have been here in person with us, the Hackathon members that have been here for 48 hours nonstop. You guys are a vibe. I know the awards are coming up next, and we are very excited. You are the reason why this platform exists. We want you to keep collaborating, to keep experimenting, keep pushing, because when you push us, we push what's possible within Snap. And most of all, keep building with Snap. Thank you.

[00:25:02.906] SPEAKER_11: Thank you. Thank you.

[00:25:09.755] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
