I did an interview with Joe Darko, Global Head of Developer Relations at Snap, at Snap's Lensfest developer conference. See more context in the rough transcript below.
You can also check out all 11 episodes in this Snap Lensfest series here:
- #1667: Kickoff of Snap Lensfest 2025 Coverage & SnapOS 2.0 Announcements
- #1668: Snap Co-Founders Community Q&A about Specs 2026 Launch Plan
- #1669: Snap’s Resh Sidhu on the Future of AR Commerce & Developer-Centered Innovation
- #1670: Snapchat’s Embodied Gaming Innovations with AR Developer Relations Head
- #1671: Reflecting on Snap’s AR Platform & Developer Tools Past and Future with Terek Judi
- #1672: Niantic Spatial’s Project Jade Demo Shows Latest Location-Aware, AI Tour Guide Innovations
- #1673: Snap Lensfest Announcement Reflections from AR Gaming Studio DB Creations
- #1674: 3rd Place Spectacles Lensathon Team: Fireside Tales Collaborative Storytelling with GenAI
- #1675: 2nd Place Spectacles Lensathon Team: CartDB Barcode-Scanning Nutrition App
- #1676: 1st Place Spectacles Lensathon Team: Decisionator Object-Detection AI Decision-Maker
- #1677: Snap’s AR Developer Relations Plan for 2026 Specs Consumer Launch with Joe Darko
Here are some concluding deep thoughts that I also posted on LinkedIn.
Reflections on Snap Lensfest XR & AI Trends Covered in Latest Voices of VR Podcast Series
Snap brought me down to LA to cover their Lensfest developer conference, where they made a lot of AR developer platform announcements, held a hackathon featuring those new capabilities, and are gearing up for the 2026 consumer launch of Specs, their fully 6-DoF, hand-tracking-enabled AR glasses. It's been a full year since their Spectacles dev kit was announced and made available to developers, and I feel like Snap is on the bleeding edge of where the overall XR industry may be headed.
These latest 11 Voices of VR podcast episodes, spanning nearly 7 hours, dig into deeper trends that go beyond the headline announcements from Snap Lensfest. I recorded five interviews with various Snap employees, and I had a chance to catch up with some of the leading AR developers in the space, including a demo of Niantic Spatial's latest VPS-guided tour experience on Spectacles featuring an AI virtual being. I also served as a preliminary hackathon judge, where I got hands-on time with all of the AR experiences exploring what's possible with the latest Snap Cloud announcements, and I'm featuring interviews with the top three Lensathon teams from the Spectacles track.
Snap’s Latest AR Developer Platform Announcements
Snap is gearing up for a 2026 launch of Specs, by which point the Spectacles dev kit will have been available for nearly two full years. So this Lensfest marks a halfway point towards a consumer release, and the product team has been busy rapidly iterating on their bespoke AR app production pipeline. Dedicated AR glasses are very resource-constrained, and so Snap has been continuing to evolve their Lens Studio developer tool and optimizing their SnapOS platform for Spectacles. Snap didn't share any news on the target specifications for Specs, but they have shipped eight significant releases of their development tools over the past year, with some of the biggest announcements being the primary focus at Lensfest.
Snap is launching Snap Cloud, built on a deployment of Supabase, the open-source, PostgreSQL-based backend platform. This will allow developers to dynamically load assets, call edge functions, and more easily set up database backends. This should help Spectacles AR lenses go beyond byte-sized entertainment and rapidly prototyped experiments into more fully-featured applications that also leverage cutting-edge AI models and computer vision. Spectacles developers have been limited by a 25MB lens size cap, but the Snap Cloud announcement makes it so that larger assets can be dynamically loaded. I expect to see more sophisticated experiences, more AI-driven applications using various cloud services, and lenses with data persistence that give users a reason to come back to them.
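To make that more concrete, here's a minimal sketch of what this underlying Supabase pattern looks like from a TypeScript client. To be clear, this illustrates Supabase's standard client API rather than Snap's actual Lens Studio bindings, and the project URL, table name, and function name are all hypothetical:

```typescript
import { createClient } from "@supabase/supabase-js";

// Hypothetical project URL and anon key, for illustration only.
const supabase = createClient(
  "https://your-project.supabase.co",
  "YOUR_ANON_KEY"
);

// Data persistence: save a user's state so a lens can welcome them back.
async function saveProgress(userId: string, level: number) {
  const { error } = await supabase
    .from("lens_progress") // hypothetical table
    .upsert({ user_id: userId, level });
  if (error) console.error("Failed to save progress:", error.message);
}

// Edge function call: offload heavier work (e.g. an AI request) to the server.
async function describeScene(imageUrl: string): Promise<string> {
  const { data, error } = await supabase.functions.invoke("describe-scene", {
    body: { imageUrl }, // hypothetical function name and payload
  });
  if (error) throw error;
  return data.description;
}
```

This database-plus-edge-functions combination is what could enable both the persistence and the dynamic loading described above without bloating the lens package itself.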
Are AR Glasses as a Front-End to AI a Viable Path?
There are certainly a lot of experimental things happening with the various AI services cropping up, and Snap is very much embracing the exploratory potential for AR devs to see what's possible by making it easier to integrate with these services. While many are excited about the possibilities of mashups between AI and AR, there are also a ton of open questions that have yet to be answered about what types of business models will prove to be sustainable.
We could very well be in an AI bubble where many of these emerging AI services prove to be economically unsustainable, as the costs to run them may continue to outpace the revenue they generate. But Snap seems content to go all-in on enabling AR developers to see what's possible, while also trying to mitigate the risks through experimental flags and requests for consent from end users. See my conversation with Terek Judi for more context on how Snap is striking this balance between innovation and AI trust & safety.
Snap doesn't have their own preferred LLM or AI service, and so the Spectacles and Specs may be among the only AR devices that allow developers to more freely explore all of the various AI options that are out there. But at the same time, the developers of these apps may also be on the hook to foot the bills for whatever AI-driven services they create. The business models for all of these AI-driven AR experiences have yet to be fully fleshed out, and the flywheel of innovation is at the point of pure experimentation to see what types of compelling AI-driven experiences may be enabled by the convenience of a face computer.
An oft-repeated adage in a number of my conversations is that AR will likely serve as the experiential UI and front-end to an AI back-end. Snap is therefore very much interested in empowering developers to experiment with these new AI capabilities. The prompt for the Spectacles Lensathon participants was to leverage the new Snap Cloud features from Supabase, whether by calling edge functions out to various AI services, implementing database-driven apps, or facilitating some sort of live multiplayer and social interaction on Spectacles.
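For those AI-service mashups, the server side of that edge-function pattern might look something like the following sketch of a Supabase Edge Function (these run on the Deno runtime). The OpenAI endpoint, model name, and response shape here are my own illustrative assumptions, not anything Snap or the Lensathon teams prescribed; the point is that the API key stays server-side, so the lens itself never embeds the credential:

```typescript
// Hypothetical Supabase Edge Function that proxies a prompt to an LLM API.
Deno.serve(async (req: Request) => {
  const { prompt } = await req.json();

  // Forward the prompt to a chat-completion endpoint; the key lives in
  // an environment secret rather than in the lens on the glasses.
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${Deno.env.get("OPENAI_API_KEY")}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model name
      messages: [{ role: "user", content: prompt }],
    }),
  });

  const data = await res.json();
  return new Response(
    JSON.stringify({ reply: data.choices[0].message.content }),
    { headers: { "Content-Type": "application/json" } }
  );
});
```

Keeping the call server-side also makes the billing question raised above very tangible: whoever deploys this function is the one paying for every token the lens consumes.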
Serving as a preliminary judge for the Lensathon gave me a chance to experience what the ten Spectacles-track teams were able to pull off in a quick 25-hour hackathon. I share more about some of the trends that emerged in the introduction to my interview with the Lensathon winner, as well as in my interviews with the 2nd place and 3rd place teams. Yes, technically many of these AR apps could also be phone-based apps, but the convenience of hands-free, gesture-based triggers with a head-mounted camera on your face may lower the friction enough to make new AR applications much more viable than a phone-based equivalent.
Snap as a Dark Horse in the AR Glasses Race
Overall, I see Snap as a bit of a dark horse in the race towards fully functional AR glasses, and the big differentiating factor may be what types of experiences developers will be able to make for the Snap Specs launching next year (likely after Labor Day, in either Q3 or Q4).
This dark horse status is mainly because Snap is going up against some of the biggest companies in the world. Snap's 14th anniversary on September 8, 2025 was marked by an email that Evan Spiegel sent out to all of his employees. In the letter, Spiegel says, "The cutoff for inclusion in the Fortune 500 was $7.4 billion in revenue in 2025, and with analyst estimates suggesting Snap could reach nearly $6 billion in revenue in 2025, we're not far from achieving Fortune 500 status." Snap is competing in the XR space with companies near the top of the Fortune 500 list: by revenue, Apple is #4, Alphabet (Google) is #7, and Meta is #22; by profit, Alphabet is #1, Apple is #2, and Meta is #6.
It takes a lot of money to do a proper consumer launch of XR hardware, and reporter Alex Heath published a report last week on his new "Access" Substack that Snap CEO Evan Spiegel will be in Saudi Arabia this week speaking at the Future Investment Initiative with the intent to raise a $1 billion round for the Specs release. Heath reports that "sources say Snap plans to turn its Specs hardware unit into an independent subsidiary that can continue raising capital from investors. The idea under discussion is to structure it similarly to Waymo, which operates independently within Alphabet, rather than fully spin off Specs into a new company outside of Snap." Heath's report answers some of my own logistical questions, and could provide some additional puzzle pieces for how Snap could continue to punch above its weight in releasing consumer AR glasses in competition with some of the largest companies in the world.
Snap Betting on Developer Relations as a Differentiating Factor
Given that Snap is an underdog in the race towards AR glasses, they have had to differentiate themselves in some fashion, and Snap is betting on their developer relations strategy as the key differentiating factor. Meta has de-emphasized collaborating with third-party developers for their AI glasses and Meta Ray-Ban Display glasses: their smart glasses were on the market for a couple of years before Meta finally announced a pathway for developers to have their own apps interface with them.
In contrast, Snap has been taking a much more developer-centric approach to their AR glasses strategy, with the Spectacles dev kit being made available on a subscription basis. Despite the odds, the Spectacles dev kit feels on par with what Microsoft was able to accomplish with the HoloLens devices, or Magic Leap with theirs, within a much smaller form factor. It's certainly the most fully-featured AR glasses dev kit currently available. And through this, Snap has been listening to developer feedback, rapidly iterating, and responding to developer needs.
Snap has been rapidly iterating on their AR developer production pipeline with eight major releases over the past year since the Spectacles announcement. But Lens Studio and SnapOS are still proprietary workflows that currently only work for their own product. CEO Evan Spiegel expressed interest in expanding beyond their own hardware when directly asked in the Q&A session after the Lensfest keynote, and I hope opening up more open, standards-based options is seriously on the table.
Questions around cross-compatibility are currently being deflected to the WebXR implementation in the WebKit-based browser that they announced at Lensfest. But the browser vendors across all of the XR devices have yet to reach full parity, which has led XR developer Matt Hargett to look at alternative open-standards XR production pipelines. He's been using React Native and NativeScript as viable stopgap alternatives to the relative instability of WebXR implementations across the various browsers available on XR devices. See my previous conversations with Hargett in episodes #1311 and #1660 for more on that. It'd be awesome if these open-standards pipeline options could be on the table, but as Snap is developing their own custom tooling, their priority is obviously to optimize for their own devices first and foremost.
AR Glasses Driving a Move Away from Game Engine Dominance in XR
Snap aims to be the most developer-friendly platform, but they're also asking developers to commit to a proprietary development stack that doesn't really leverage Unity, Unreal Engine, or OpenXR in the way that previous companies starting with VR, MR, and XR have been able to do. But starting AR-glasses-first has meant dissolving all of these previous XR production pipeline paradigms and building up their own paradigm from scratch. This is something that the other major players may end up having to do at some point, and so Snap's approach may be a sign of larger things to come.
Rather than being based on the game design logic encoded into these game engines, the future of the XR industry may look much more like web-based applications that pull in data and information from a variety of cloud-based APIs and AI services. The fact that both Meta and Google have embraced the moniker of "AI glasses" as the bottom end of their XR spectrum of devices may also be a sign of things to come.
From a Decade of Facial Filters to Head-Mounted AR Glasses
Snap has been doing AR development for 10 years now, starting with the AR facial filters launched in 2015 that focused on identity expression and connecting with friends in a social media context. Snap had an amazing interactive exhibition at Lensfest where you could trace the evolution of these AR filters over the past decade. Snap hasn't wavered from their vision of using AI-driven AR lenses to enable their users to connect in new ways.
Major competitors like Meta have cloned many of these AR facial filter innovations, and then Meta inexplicably killed off Spark, their major AR platform, without any viable replacement. Snap has been clearly dedicated to AR for over a decade in a way that others have not been. But they've also built their systems around front-facing selfie cameras, and now they're shifting the POV out towards the wider world. That's an entirely new set of features, problems, and demographics, but ones that Snap's leadership seems determined to meet with an opinionated vision of what the future of spatial computing should look like.
Will Snap Specs Really Be for Consumers or for Enterprise Apps?
It's still an open question for me who Snap's AR glasses will be for. Will the Snap Specs primarily be used within an enterprise, B2B, or location-based entertainment context? Snap had a huge exhibition at Augmented World Expo in June, where they were featuring the LBE momentum from providers like Enklu. Since then, Snap has developed some slick fleet management tools in response to some of these enterprise requests.
The Spectacles are already the best AR glasses dev kit on the market, and potentially analogous to how the Oculus DK1 kicked off the modern VR industry. Will the 2026 consumer release of Snap Specs also be treated as a best-in-class AR glasses dev kit for XR innovators and early adopters? Or will they truly be a consumer-ready product that hits all of the major requirements for regular consumers? It's clearly a product that will attract technology enthusiasts, and it's more fully featured than the Meta Ray-Ban Display glasses, which are more of a monocular, 2D heads-up display with some novel neural inputs that transform your hand into a TV remote. Snap is currently looking for the killer launch apps that will convince people on the fence to take the leap of faith and buy their cutting-edge AR tech.
Magic Leap unsuccessfully tried to claim that their initial Magic Leap One release was ready for consumers, but they didn't have a proper dev kit period or a robust set of experiences to justify the purchase. Apple's Vision Pro similarly skipped a dev kit period and went straight to market for consumers. I suspect that a lot of its usage has been from developers building for the XR future they imagine rather than normal consumers using it for media consumption or as a screen-replacement productivity device. But it's honestly hard to tell, as Apple doesn't share comprehensive data on retention or popular use cases, and their recent refresh with an M5 chip suggests there may be larger strategic reasons to keep moving their ecosystem towards their own AR glasses aspirations.
Conclusion: Lensfest Conversations Point to Larger Industry Trends
I don't know how this will all play out, but Snap is certainly committed to listening to developers, responding to their needs, and going all-in on true 6-DoF AR glasses with hand tracking. Snap still feels like an underdog to me, but they've continued to defy everyone's expectations.
Either way, I feel like the conversations I had at Snap's Lensfest provide some interesting insights into these deeper XR industry trends, and there are so many open questions that I had a chance to stress-test with Snap employees and their developer ecosystem.
There are a lot of idealistic hopes, dreams, and aspirations that Snap may just be able to pull off with enough support from the wider XR developer ecosystem. Or they may be facing the harsh realities of what it takes to do a viable consumer XR launch, and the Snap Specs may be relegated to the enterprise, B2B, or LBE markets, where companies may be more willing than consumers to pay for the novel utility that the Snap Specs provide.
So tune into this series and make your own assessment of Snap's odds of success. Snap's first-mover status in this space may be an early indicator of where the entire industry ends up heading, and it certainly feels like we're at the beginning of a new cycle, with lots of folks getting back into the game after the official launch of Android XR and Samsung's Galaxy XR last week. It could also be an opportunity for XR developers to jump in and make a difference in a rapidly emerging market, and one where developer concerns are listened to more responsively than by other major players.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So this is the final episode of my coverage from the Snap Developer Conference of LensFest. Today's episode is with Joe Darko, who's the Global Head of Developer Relations at Snap. So I wanted to talk to Joe just because he's helping to build all these different developer relationships. And a lot of XR device launches really live or die on what kind of software ecosystem is going to be made available once they are headed towards this consumer launch next year. So I see Snap is in this position of having this very strong vision of what they see the future of computing is going to be with these head-mounted augmented reality glasses, and they're not really compromising on that. They're having binocular displays. The field of view isn't huge. It's probably like 40 degrees. The vertical field of view is a lot higher. And as they get closer to the consumer launch, they're going to be expanding out on all the different specifications and making it lighter, better thermals, better battery, wider field of view. All the things that they're getting feedback on, they're improving upon. They didn't give any additional details for what their target specs will be. They may be still in the process of developing all that, but probably sometime next year, if I were to guess sometime in the third quarter or fourth quarter, probably sometime after Labor Day, that's usually when all these XR devices end up getting launched. We just had Samsung's Galaxy XR launch this week, as well as the second iteration of the Apple Vision Pro with the M5 chip. The Meta Ray-Ban Display glasses launched at the end of September. So basically, this is the time of the year when all the new XR devices are being launched as you gear up towards holiday season. And so that's what I expect as well for Snap as they gear up towards their consumer launch of the Snap Specs. And they're doing their own software stack, they're building their own operating system. And so they're hand-crafting a custom bespoke format to be able to develop their own lenses. So the big question that I have as we start to move forward towards next year is what kind of experiences are they going to be able to provide? And also, what is the focus of this product going to be? Is it an enterprise, B2B type of device where people are going to be buying it for very specific enterprise or B2B use cases? Or is this something that is going to be mostly for location-based entertainment? Or is it going to truly be a consumer device? And I say that with some hesitation just because Magic Leap tried to do a similar approach, where they thought what they were building was a consumer device, and there really wasn't a consumer market that was ready to pay that much money for an experimental piece of hardware, something that didn't really have many compelling applications for them to use. So, as we go over the next year, then I'm just really curious to see what different types of industry verticals are they going to really be focusing on? Is it truly a consumer device? Are they going to be continuing to develop these entertainment-based applications?
And what other types of utility applications are going to be possible with this new Supabase integration to be able to develop more sophisticated applications? So they're at the point where they're going to be kind of moving towards all these different visions. So Joe Darko has his own vision for how he starts to develop this type of ecosystem, starting with the innovators and early adopters and eventually, you know, wanting to cross that chasm into the mainstream, which is what they're headed for next year. So I'll be very curious to see how they continue to do that at, you know, say, Augmented World Expo and being in conversation with the developer community. If there's anything that Snap has going for it, it's the fact that they have close conversations with what the developers want and what they need. And they've really been responding to that. So clearly a lot of open questions as to how this is all going to play out. And I think it's worth hearing what Joe Darko has to say in terms of some of his vision for how to answer some of those questions and how to continue to cultivate and develop that developer ecosystem. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Joe happened on Thursday, October 16th, 2025 at the Snap Developer Conference of LensFest happening at the Snap headquarters in Santa Monica, California. So with that, let's go ahead and dive right in.
[00:04:17.094] Joe Darko: Hi, I'm Joe Darko, and in the world of XR, well, I'm the Global Head of Developer Relations at Snap, and in the world of XR, I'm here to build relationships with developers, and from a Snap perspective, get them to really build in our ecosystem, get them to build these experiences now for Snapchat, through Camera Kit, for mobile and web, and also for the future of AR glasses through Spectacles. So our job is how do we find the best developers across the world? Earlier I stated that talent is equally distributed, but opportunity is not. How do we find these developers across the world who are talented? How do we bring them into our ecosystem? Give them the opportunities, whether it's tools, a platform to really build the experiences that they want to build. And how do we help them monetize and build a business and build a career?
[00:05:00.720] Kent Bye: Great, and maybe you could give a bit more context as to your background and your journey into this space.
[00:05:05.126] Joe Darko: Oh yeah, so I've been in tech for a very, very long time. I mean, ever since I was a kid, I've always wanted to, one, initially I wanted to be a scientist, because I wanted to find a cure for AIDS. I'm always looking for solutions, and that kind of pulled me along the path of really finding solutions from a technical standpoint. How do you build, let's say, applications or solutions for people to really leverage? Went to college, studied information science technology at Penn State University. After that, worked with various tech companies and worked with developers from all different backgrounds. First, enterprise devs, and eventually started working with developers in the XR industry. And I've been doing this for over 14 plus years now. And with XR devs, I've been working with them for over, let's say, seven plus years.
[00:05:51.724] Kent Bye: And what really drew you to XR as an industry?
[00:05:55.268] Joe Darko: So I was building on the ecosystem from, not from an XR standpoint back in the day, but what really pushed me to XR is I've always wanted to be on the cutting edge of technology. So when I saw the potential of XR being the next paradigm of computing, being the next platform where ideas, where solutions are going to be created, I quickly jumped on it. Because I did miss the boat of, let's say, you know, as things moved from mainframe to personal computing to mobile, I missed the boat, right? So I wanted to be on the next thing, right? I wanted to really be on the next thing. And the next thing, pretty much, I believed at that time, was going to be XR.
[00:06:36.096] Kent Bye: Great. And how did you end up at Snap? And what drew you to Snap to be able to work in XR industry at Snap?
[00:06:41.368] Joe Darko: So, well, I'm not gonna mention any names, but I did work for various companies, working in the XR space as well, growing the ecosystem for them at that time. And when Snap approached me, so I was paying attention to Snap and what they were doing then. And they had a robust platform, they had a growing ecosystem. So when they approached me, it was kind of like, why not, right? You're a very advanced platform. You want us to build an ecosystem, bring the developers in to really build on the platform. I have a relationship with these developers, established these relationships. Yes, I can bring them over and we can build something great. And to me, one thing at that moment was I saw what Snap was trying to do with AR glasses. They're building these standalone AR glasses. And that was like 2021. And I was like, whoa. I've never seen anything like this before. If anyone has a chance to really crack this, it's probably going to be Snap. So why not join them? And why not build an ecosystem for that?
[00:07:37.476] Kent Bye: Nice. Last year, I had a chance to attend the Snap Partner Summit, as well as the Lensathon, the hackathon that you had. And I had a chance to talk to Sofia Dominguez, and she's no longer with Snap. And so when I was asking folks around, okay, who's kind of the equivalent of what Sofia was doing? And what people said was that, well, there's no one really that's replaced her in what she was doing, but they said probably the closest is Joe and what Joe's doing. So maybe if you agree with that, or give me a sense of what's happening with Snap and Spectacles and the role that you're playing in helping to cultivate not only the developer ecosystem, but also where the platform and product are going. And I guess I'm trying to get a sense of now who's doing what.
[00:08:15.478] Joe Darko: Yeah, so Sofia, I work with Sofia a lot. I enjoy working with her, working on her team. And honestly, pretty much I'm doing, I would not say the same thing Sofia was doing. I'm doing it a little bit different from an ecosystem standpoint. Sofia was driving not just a developer ecosystem, but also AR partnerships, right? Right now that work has been, in a way, split up between me and a couple of other leaders doing that work, but my focus is pretty much developer partnerships. So when you talk about developers, the people really building the applications, building the solutions, building the software for our community, building, let's say, experiences on Snapchat, right? That is what I'm focused on. And so when you look at it this way, and I shared this earlier with a couple of developers, my vision for next year is how do we scale our efforts? So we've built and established a relationship. We've built the programs. We've built the Snap Lens Network. How do we scale? Because for every platform to grow, you need to find ways to really scale. So my job is how do we build the, let's say, the machine? How do we build partnerships, develop a partnership with XR Bootcamp, AWE, GDC? And how do we work with them to scale and reach more people and reach more developers? I believe in one thing. We have great technology. Some will say the best. You heard it today. We have great technology, great platform. We have the robust ecosystem. Now, how do we get more people to be aware of what we're doing? I feel like that is what I'm going to focus on from now and beyond. And you're going to see this, Kent. A lot of devs do not know about our technology and platform, right? And that is how do we drive that awareness? How do we drive the retention? Because I've seen this. The moment they come into our ecosystem, they get stuck. They fall in love with our community. You see the energy, feel the energy. You see the support, right? And that's one thing I'm trying to do different from what Sofia did in terms of AR partnership.
[00:10:09.165] Kent Bye: she led the team successfully did a great job in that front i was doing this similar work underneath her when she was here and we do miss her but it's like okay how do we take this to the next level in terms of scalability yeah and one of the ways that i start to think around the diffusion of this technology is in this model coming from simon wardley who has this kind of way of mapping out these different ecosystems and so with this worldly mapping he's trying to look at different phases of technology diffusion from like an academic idea, custom bespoke enterprise market into mass consumer market and then eventually mass ubiquity. But with this worldly mapping, he's trying to look at ways that technology diffuse and where they add on that scale. And it seems like that in order to really get something that's brand new, you have to in some ways go through this custom bespoke enterprise market where there's either brand partnerships, location based entertainment or other ways where There's like enterprises or B2B type of things that are able to make it so that the consumer market can be at the lowest enough price point of economies of scale and everything else like that. So I feel like some of the major players in the industry, like with Meta, have in some ways tried to skip over that enterprise market and go direct to consumer. Whereas where some of the main activity is coming from more the enterprise or LBE space that they haven't been as interested in supporting. And they've had programs that have that support and drop it and it's been a little bit inconsistent. really having their eye on the prize of trying to push it out to consumer and subsidize the hardware and everything. So as you look at what has happened here in the Snap ecosystem, just curious to hear some of your thoughts on how to kind of organically grow this ecosystem and provide business models that are sustainable for these XR devs. As we go into these new paradigms of computing, there are potentially new paradigms for how the business models are going to play out with how it's going to make sense financially for folks to spend the time and energy to really invest themselves in the ecosystem. So curious to hear how some of those with new commerce kit and other things that you're trying to do to create a vision for where you see that ecosystem going from a business model standpoint.
[00:12:15.737] Joe Darko: Yeah. So for me, when I look at it from, I have a philosophy, and I forgot who came up with this philosophy, but that's something I've kind of like leaned on for a while in building ecosystems. So when you talk about technology, or the diffusion, right? The diffusion of tech in general, we start with the early adopters. Everything we've done has been how do we get these early adopters in-house? How do we get them to really build with us, right? And I feel like this stands for our leadership. Our leadership has a stomach for feedback. It's like, how do we get, early adopters are going to give the feedback. They're going to help you find the use cases. They're also going to help you with like some of the negative feedback you're going to hear in the ecosystem. You're going to hear it directly from them. And you have to be bold enough to really get something that is not maybe completely ready for consumers, yes, to those early adopters. We work through them to kind of like gain the insights, because, from a business standpoint, they work with a lot of these businesses that sometimes we are not working with. Some of these use cases that enterprise folks want, they work with them directly. They know. So working with them, we get that first and foremost. Well, everyone will say there's a risk there, right? If you turn off these early adopters, you're probably gonna mess up the ecosystem forever. Luckily for Snap, we've been able to build a strong relationship with them, where we are open to their feedback, good or bad, and they've stuck with us. Now, after early adopters, you go to innovators. After you get all the feedback, how do you innovate with the broader community? It's also going to be a fraction of folks who are going to be able to jump on board. But now your ecosystem is growing. So we've done that as well. And now when I look at our ecosystem, I believe we've crossed that chasm of early adopters and innovators to the point that we're getting more people who are interested in it, finding those use cases, right? And we still continue to find those use cases for XR. Because people will say, well, there are already use cases for XR. I disagree with that. I feel like from a consumer standpoint, enterprise standpoint, we're still unearthing those use cases. You saw the hackathon. Some ideas that we've not even seen before. Like, oh my god, this would be great, right? If we can fund these developers to really build with us. So we are at that point where now we're growing our ecosystem beyond the early adopters and innovators, to the point that we're going to get more developers moving into next year, more developers who are willing to really build for consumers, who are ready to now move from a consumer standpoint to enterprise. What are solutions for businesses right now? And I think that's where we're at right now. And when I look at that kind of like, let's say, bell curve, it kind of rises from early adopters to innovators, it gets bigger, then eventually, once you get over the chasm, now you're going to get to, let's say, all sorts of developers and consumers who are going to come with so many different ideas. So I think we're at that point. And that's why I think the timing of Specs 2026 is perfect next year, because we're going to find more devs who are going to be able to build experiences for consumers. And I hope that makes sense.
[00:15:10.765] Kent Bye: Yeah, for sure. And I think that crossing the chasm is something that XR still hasn't necessarily seen at scale, to see what is the thing that's really going to take off. I mean, to some extent, Meta has been saying that the Ray-Ban Meta AI glasses that they have seem to have a certain amount of momentum in the market, growing year over year. So it seems like this kind of AI glasses idea is at least apart from what has been happening in more of a niche of virtual reality, with the Quest and this virtual reality ecosystem that's been primarily focused on gaming. But I guess the form factor is something where having all-day wearables is sort of like the end goal, and I think there's always this trade-off of trying to put as much capability as possible, but doing this balance between weight, thermals, and weight distribution. We didn't hear much around how those different trade-offs are being negotiated for the consumer launch next year, but just curious to hear a bit around whatever you can share around what kind of feedback you've been hearing from the developers to see, okay, here are the things we need to improve on in terms of battery life, thermals, how long is this going to last, and just the form factor of the thing.
[00:16:18.792] Joe Darko: Yeah, so I have to say this first. At the end of the day, we are fighting against physics or working with physics, one or the other, when we talk about feedback from developers. And by the way, the feedback has been amazing. Like I said, you see this firsthand. We're a feedback culture. We love feedback, good or bad feedback. We don't shy away from that. When you look at the XR industry and the glasses industry, and I'll say this, I've worked with a lot of these XR companies, and everyone is trying to build towards this world of XR, AR glasses. Everyone is trying to get there. People are attacking it from a different standpoint, right? We at Snap, on our end, straight up, we're like, we're going straight for the jugular. We're going straight to build these glasses. That's why we believe we're the leader in this space, right? Other companies are seeing it from a different perspective. Hey, we're going to come out with smart glasses, right? And then embed AI into them, displays into them, eventually build towards that world, right? Other companies are coming at it from a VR standpoint. How do we cultivate a VR ecosystem? How do we get these devs who can build these experiences, rich experiences, and eventually, as the form factor gets better, as the tech gets better, we migrate them to XR. But for us, it's not easy, but we went straight for it. We're going to build these glasses and we're going to really get developers on board. Now, in terms of the device, the feedback we've heard is, hey, can you make it slimmer? Can you make it lighter? Can you make the field of view bigger? Can you increase the battery life? And we ask them, from a developer standpoint, well, why do you want the field of view to be bigger? Well, for these experiences, right? And that's where we get deeper into the feedback they're giving us. It's not because, from a consumer standpoint, anyone can tell you that. But from a dev standpoint, what are you trying to build? Why does the field of view matter, right? And so we get that deep feedback from them and from their perspective. Hey, maybe for this connected experience, we want this field of view for this X and Y reason. And honestly speaking, we've taken that feedback in, and what I can share is every piece of feedback, we are working on it. We are trying to make it lighter. We're trying to make it faster, right? But we are still fighting against physics. How do we ensure that we can cool the battery, right, to make sure it doesn't overheat? How do we ensure that if we increase the field of view, we can also decrease the weight and make it feel lighter, right? So I can't share a lot about what we're doing internally, but just know that every piece of feedback we heard that you've probably also heard, we are working on it aggressively.
[00:18:52.079] Kent Bye: Yeah, back on September 8th of 2025, Evan Spiegel put out an email that was sent to all the employees marking 14 years of Snap, and that was referenced in the talk that you gave, in terms of showing that Snap is both trying to incorporate more and more games within the context of what's happening in Snapchat, but also that the Spectacles is really a vision that Snap has been working towards for the past decade of working on AR lenses, and that there's this vision that the next paradigm of computing is on this trajectory, a kind of logical progression from an AR lens filter of people vomiting rainbows 10 years ago, to now head-mounted AR glasses, to having your own operating system and your own game engine, the Lens Studio, that's very pared down. And so you've got this whole vision that's going towards these glasses, and in that letter Evan starts to share some of his vision for why this is a part of the future of technology that he has and that the whole company of Snap is working towards. So just wondering if you could share a little bit about, kind of summarizing that vision: why Spectacles? Why is it important to move in this trajectory of kind of the next wave of computing?
[00:20:00.563] Joe Darko: Yeah, so when you look at it, so let me touch base on the games piece as well, right? Because you heard this at LensFest. In Evan's letter, he calls it out, right? When you look at how people really engage, specifically when it comes to games on Snapchat, people want to connect, right? Beyond just the core product value of messaging. People want to play games with each other, right? They want to play games with their friends and family. They want to really be at different places and still be able to stay in touch. I've played some of these turn-based games, and they're fun. I've played with some Snap employees, and we spent like a week to complete a game, right? And any time I came back on the app, that's what I wanted to complete. And we've seen it really grow. Our, let's say, developer community has grown. The developers who can build games, anyone who wants to build games and engage with Snapchatters, that has grown. Now, on the Spectacles piece, when Evan talks about the next paradigm of computing and the next platform, I think Evan truly, truly believes in it, and we truly believe in it as well at Snap. Because when you look at where the world is going, we moved from mainframe computing to personal computing to mobile phones, right? And through all of these, people still have PCs, people still have their tablets, they still have their laptops; they've coexisted with their cell phones. Now, I see a world, we believe in a world, where glasses are also gonna be part of this arsenal of devices that we have. So there are certain things that will live on your desktop or laptop. There are certain things that will live on your tablet, which will be better served on your tablet. Certain things are served better on your phones now than your tablet. So when you walk away, you don't need a tablet, you don't need a laptop. You can still work on your phone, and they're way better, they're better served on them. Same thing in the future. There are certain things that are going to be better served on your glasses. Some may coexist. I can answer an email on my phone and still answer an email on my laptop or my desktop. It's fine, right? But when I'm on the go, I want to answer emails on my phone, not on my tablet. It's a little bit bigger. There are certain experiences that will primarily live, better served, on your glasses. And that's the world we're getting into right now. And also with the advancement of AI, the advancement of computer vision, understanding the world around you, AI being able to really engage with you and the world in front of you and people around you, certain things are really only going to be served on your glasses. For instance, when I think of these examples, when I go to conferences and talk to people, they ask me this question: well, what are some of the use cases? Think about going to a museum, for instance, right? People now go to a museum, they're not paying attention to the tour guides, they're using their phones, trying to really... be in the moment and also remember the moment, right? But with AR glasses, I can be in the moment. I may not need a tour guide per se. I can be in the moment and still really feel present at the same time, in the sense that now the AR glasses can really understand what is happening around me, can pick up a statue or any historic artifact and give me the history.
Whilst I'm there, hands-free, engaged, fill in the moment and be part of history as well so that is one experience of like okay when it comes to like in terms of museum it's going to live better on ar glasses now some experiences cell phones may still play a role in that we're not denying that if you ask me personally i sort of maybe what anyone at snap will say i will say they will coexist because the bed was cell phones were going to replace what tablets or laptops Never did, right? And we still enjoy both of them at different times or maybe together. So same thing, AR glasses is going to feel the same, right? There are things that we believe cell phones will do better, tablets will do better, and we're going to let them handle it. But in use cases, and you were part of the hackathon judging panel, we talk about these things. Like some of these things are not going to live well on AR glasses alone. Maybe they're better served with other devices. But some of these experiences should be served with AR glasses. And those are some of the use cases we're chasing after. So to me, I see it as people do believe sometimes it's going to replace other devices we have. Personally, and once again, being careful with my words, Personally, I don't think it's fully going to replace. It's going to replace certain things that we do with our world. And it's even going to be better served in those kind of like experiences, right? And that's what we're trying to find with our developers. Help us find those experiences, those use cases that will be better served with AR glasses. Help us build them. We want to build them with you. We want to build them in the open. And we want to help you build a business around it.
[00:24:33.342] Kent Bye: Yeah, one of the things I was really struck by, reading through Evan's letter that he sent out on September 8th, is that he was commenting around how Snap is not even in the Fortune 500. There's a certain amount of revenue that you have to hit as a target. So they're kind of at the edge of the Fortune 500, not even in the Fortune 500. And so Snap is competing with these different companies like Meta, Google, Apple, some of the biggest companies in the world that are in this space. What I find fascinating is that Snap is kind of the dark horse in this race, of having a strong vision, investing in it, having a clear idea for how this is an evolution of what Snap has already been doing for over a decade, but that there's been kind of a dedication to produce hardware that feels very much on par with and beyond what HoloLens 1 and 2 were able to do. And so, you know, again, billions of dollars from Microsoft on that, where they didn't have a clear vision for how they were going to actually bring that to market. And they had an early lead, but then they took a step back. So I guess the larger question is around how it seems that Snap is punching above its weight in terms of innovating, pushing for the technology, but also like the real value of cultivating an ecosystem of developers that are pushing the edge of the technology for what's possible. Whereas not all the other companies are doing the same level of integration with their developer community and having events like this and really engaging the community. And so it feels very community-driven and very developer-driven in a way that the other companies don't have. And so in that way, there's certainly an edge, but there's still questions in the back of my mind in terms of all the economies of scale of what it means to kind of launch these products at a large scale and all the revenue that's required to really properly launch some of these new technologies at scale. So just curious to hear some of your thoughts on Snap being kind of like the dark horse, from some perspective of the outside analyst, but from someone who's inside the industry, I see how far ahead Snap is, but still there's like these larger challenges of what it takes to kind of really bring a product like this to mass consumer scale.
[00:26:36.863] Joe Darko: Man, I love this question, Kent. And honestly, you're speaking to the heart of what Evan believes in, what I believe in, what we believe in at Snap, and personally, why I'm here at Snap. I believe in developers. When you look back at history and how, as a global society, how we have developed, developers have played a role in everything that we do and where we are today, right? And I've seen this firsthand with multiple companies, and how sometimes they had, in the past, pretty much pushed their developer community aside and lost. We see the value of developers. We see, and to your point, maybe we are the dark horse, right? And I love that position to be in, right? Because sometimes, you know, being the underdog, you're able to focus on what you have to focus on, and you're able to deliver in silence and in peace, and eventually you win. And I love that position because I think the bread and butter of what we're trying to do focuses on developers. We have the tech. Like, there's no debate about that. Well, people may debate that. But honestly, we have the tech, we have the platform, we have the tools, we have the commitment to invest in this, right, and stay in this race. One thing that we have that I believe other companies who are also in this game don't have is the developer ecosystem. And that is something that we've done for over the past, let's say, decade. And now we've ramped up investment in that area. And to me, that is our secret sauce. If I'm to tell you one thing that will help us win, it's developers. Anyone who has won in a platform race, a platform game in the past, once again, not to mention their names, they won because of the developer ecosystem. And I think that is why we do what we do. That is why we do LensFest. That's why we bring in our developers to be part of the hackathon. That is why we're humble with feedback. That is why whenever we close the feedback loop, we report back to the developers. Hey, you gave us feedback, and we did it and made it happen. That is why developers say we are a kind team. We embody that value. And that is why we're not going to quit, no matter what. Right? Because I think when you look around you, you have developers from all around the world. They're the ones who are going to build solutions for their audience to drive that demand for consumers. And that is just the trick. The trick is to find developers who can build solutions for their culture, for their people, for their audience. If they're able to do so, you're now going to open a whole new opportunity to kind of like build. You talk about the economy of scale. Once you tip that side of the spectrum, there's going to be that demand. And when that demand comes, you're not going to have any issues with consumers getting what they want, having the applications they need, right, to do what they do in their lives. So let's say if you're a developer, Kent, you're based somewhere around the world. You're able to build a solution that resonates with your audience, right? Your audience is going to come back and say, how do I get this device? How do I use this application on this device to solve a problem in my life? So to me, that is the real secret sauce. And we're not going to give up on that. We're going to keep pushing. Whether we're the dark horse or not, we see it as the key to our success. And we believe in that. And also, one thing I will say is, it's not just finding the developers. It's making them successful.
If developers are going to stick to your platform, they need to make money. They need to find ways to really be successful on the platform. They need to be heard and respected. They need to feel a part of the community. And our developer community, I can tell you, is growing. Next year, if you come to LensFest, it's going to be way bigger than this. And that's because we have a plan on how to scale it with the ecosystem enablers and developer ecosystem partners that we're trying to bring on board and engage with.
[00:30:26.533] Kent Bye: Yeah, and I think that the big open question is some of these new business models. Traditionally it's been, like, sell an app, just like the mobile phone with iPhone and Android. Sell an application, and that's where a lot of the revenue stream comes in. And then for the platform holder, there's like a 30% cut that comes in, through which both Apple and Google have been able to kind of really continue to develop that platform. So now with the Commerce Kit, it seems to be more like in-app purchases rather than buying it outright. So just curious to hear if this is a move away from just app-based economic models and more towards service-based models, or things where you have a value-add where you get a taste of something but you want to expand into some sort of subscription service.
[00:31:07.210] Joe Darko: But for that, I feel like this is something that we need to dive deeper on. There are a lot of things I can't share now because it's still very, very early based on what we announced and what we discussed today. So that's something I would like for us to really sit down and dive deeper into, but I can't share more at this moment, at this time.
[00:31:22.517] Kent Bye: Okay. Yeah. That's fair enough. Yeah. Yeah. I think that's one of the big open questions as to- Exactly.
[00:31:26.899] Joe Darko: Yeah. Because right now, we've made that decision to not really talk about it right now. The next path, right, in 2026, we'll be able to dive deeper into that. Developers tell us about monetization, and we want to show them that, hey, there's a path that we're working on towards monetization. So this is kind of like teasing them as to what we're releasing for them, what is coming up, and the plans we have for them. So we're going to dive deeper into that kind of like business model. It's a great question to ask, but we're not ready to share more on it right now.
[00:33:16.385] Kent Bye: Yeah, and one other kind of follow-on, as I look at the position that both Snap and Meta are in, where they're creating these XR devices that don't necessarily have phone-based access or a connection to what's happening with the Android applications and iOS applications that people are already using within the context of their mobile phones, tablets, PCs, and beyond. So in that sense, there's a whole ecosystem of applications that people are using day-to-day where companies like Meta and companies like Snap aren't necessarily leveraging those existing behaviors or applications. And so just curious to hear a bit of how do you overcome this lack of breadth of the mobile phone ecosystem applications? I saw there was phone pairing, or things where maybe you're starting to mirror some of the stuff to maybe emulate your phone in the context of the device. It seems like, in the long run, we want to have more of a seamless integration for all the existing applications that we're already using. And so, just curious to hear a bit of the strategy for going beyond just the ecosystem of what Snap's building on their own pipeline with Lens Studio, and then how the long-term vision for including more and more of these different platforms starts to get fused together, like the future of the Spectacles.
[00:33:16.385] Joe Darko: So I think you heard this earlier in the room. Someone asked Evan this question. It is something we're thinking about strategically from a strategic standpoint. It is something that is top of mind. And currently, we're still in discussion with that and trying to find the best solution for that. So once again, Evan did not answer. Bobby did not answer. I don't want to answer it right now. But it's something we're considering thinking about. Yes.
[00:33:39.025] Kent Bye: I did want to ask around AI. I've got a lot of different takes from different people. An adage that's been repeated a number of times is that AR is sort of the front end to AI on the back end, where you have kind of more of an experiential way of connecting to these different software and services. As I come here to LensFest and see some of the different hackathon projects with the Supabase integration and persistent data, it seems like another emphasis has been to tap into these edge compute functions for calling different AI applications. And so just curious to hear some of your thoughts on how AI is feeding into this larger vision for where AR is at right now, and where you see it going here in the future with these two kind of sibling technologies that are working together hand in hand.
[00:34:21.345] Joe Darko: So I feel like, you know, with the advancement of AI, right, over the past couple of years, pretty much consumer AI, there's been a lot of focus on AI. And I think you heard Evan and Bobby kind of like mention this earlier today. We were kind of like pushing this years ago and driving and investing in this for some time now. But a lot of people kind of sometimes just see AI as just a tool. We see it as part of the solution to what we're trying to build, to enable people to really experience the world around them, to make sense of things around them. So for instance, you see some of the projects people build, right? One of the hackathon projects, it's based with AI helping them understand the world, able to choose and decide what to put on, what not to put on, right? And that's just an example of how AI helps you solve a problem. And a lot of people are trying to build companies on the back end of AI, but not really solving a legit problem. So that's how we see it. With AR and AI, to your point earlier, they're brothers and sisters that coexist to make sense of this world, right? If AR is helping you overlay digital objects onto this world, how does AI help you interact and understand, and give you more information, feed you, let's say, realistic information, get to know you better, understand you better, and help you interact with the world around you? So to me, that's how we see AI, and that's how we want to infuse AI into augmented reality: make sense of the world, help you understand the world, get to know you better, get to know your surroundings, and help you solve problems day to day with utility experiences. And we don't want to build, like, okay, there's just an AI assistant app on AR glasses. Okay, what is the end goal of that? We want to solve real-world problems for people, right? So we want to bake AI into solutions and these applications to make sense of everything around us. Because, like I said, we've invested in this. We've spent, what, three billion over the past decade, right? We've done a lot of research. We've been, like, advanced in terms of this technology. And now, with the advance of AI, we're not going to be caught flat-footed. We're going to find a way to kind of like bake it in where it makes sense, right? And not just build something that feels gimmicky, but something that actually solves a real problem. So that's how we see it. And to us, we really, really see it as a perfect relationship, a symbiotic relationship between augmented reality and artificial intelligence, not two separate things.
[00:36:46.972] Kent Bye: I've been having a number of different conversations with different folks, and another emphasis that I've heard again and again, which is not always the focus of the other companies in the space, is privacy: having, with Snap Cloud, all this processing happen within the context of your own kind of Supabase instance, and having a bit of a staged approach with experimental flags and things that are not necessarily pushed out to the wider community, allowing experimentation, but also trying to figure out the story around privacy and how to have the guardrails in place to really navigate the future of all these face computers with all their different sensors. I'm just curious to hear any comments on how you see privacy playing into this larger story of allowing developers to experiment while also, as a company, having different guardrails or thresholds around what you have strong opinions is or is not OK.
[00:37:38.601] Joe Darko: Yeah, so this is a discussion topic that we have internally consistently. We are a privacy-focused company, and we do want to maintain that even as we push the boundaries of technology, even as our developers help us push the boundaries of technology. So one thing is that we do allow them to experiment to a certain degree while not violating privacy at any point in time. And also, we find ways and means to insert guardrails as we build, because privacy is not an afterthought. It's not something that we think about after. Whatever we build or try to build, we know first that we want to get it to developers quickly. And if we're going to get it to developers quickly, how do we ensure that we do not have any privacy breaches or privacy situations, right? Because we don't want to break the trust between Snap and our consumers. We also don't want to break the trust between Snap and our developers. So that's something that we are very, very critical about from the onset. But to your point, to really advance the tech, we need to allow innovation, right? But we're never going to compromise on privacy. No matter what we do or how we try to innovate with our community, that's something that we're never going to compromise on. And I know that's not the full-breadth answer you expect, because, one, I'm not a privacy expert at Snap, but I know from a technical standpoint, from a developer standpoint, that's not something we want to compromise on no matter what. And developers really do always ask us these questions and try to push the boundaries. Hey, is this possible? Can we do this? Can we do that? Right? And there's always something that we bring back to our legal team and our privacy teams, and we have these discussions: hey, what can our developers do? How can we enable that kind of innovative mindset? And also, how do we ensure we don't break that trust with our consumers and with them as well? Because sometimes, I would say, whenever developers see a new feature, a new toy, they want to build with it. They're excited. They are eager. But we want to make sure we put in the guardrails, right? Because sometimes they don't exactly know the impact of their actions to that point, and they will be honest with you: oh wait, we thought we could do this for this reason. And then we have to explain it to them. And also, our relationship with them is very, very delicate, and it's very transparent, very open, and very rich.
[00:39:58.348] Kent Bye: So a year ago, I was invited out to check out the Snap Partner Summit and see the Spectacles demos for the first time, and I came to Lensfest. And now we're a year later. You've had, I think, eight different releases of the operating system for Spectacles. You've had lots of innovations with the 2.0 version of the operating system. You've had all these additions to Lens Studio. You've been able to accomplish quite a lot of progress on building up an entire new pipeline and stack for building applications for Spectacles over the last year. And now we're anywhere between six months and a year out from the consumer launch, maybe coming back around a similar time frame next year for that launch. So as you reflect upon all the progress you've made over the past year and all the progress you hope to make in the next year, I'm just curious to hear some of your reflections on the journey so far and where you hope to get to by the consumer launch next year.
[00:40:56.215] Joe Darko: So in reflecting on that, one, I've got to say the team has really worked tirelessly and endlessly to push some of these releases out as quickly as possible. Our goal was to get them out there as quickly as possible, get developers to play with them, get the feedback that we need, iterate, and find better ways to enable our developers to really build these experiences better. So that has been our approach: get it to devs quickly, get the feedback, and figure out how we help them make this better the next go-around. So looking back, I will say that one thing, as I said earlier, has been how we collect feedback. That has been at the heart of all these releases. That has been the goal of all these releases. And we've had hackathons around some of these releases as well. We've engaged student devs, right? The feedback we get from student devs has been very different from professional devs, maybe because of the difference in learning curve between a student and a professional dev, right? So to me, it's been such a blessing to have a front-row seat to all the releases that we've shipped to our developers. And now we're at 2.0, right? Now, how do we take all those learnings and feedback and march towards 2026? And on the timing in 2026, we're still figuring that out. You'll be the first to know once we align on the timeline and share that externally. But to be honest with you, it's going to be an aggressive push from now until whenever we release Specs in 2026. The feedback collection is not going to stop. We're going to ramp up developer engagements in so many places around the world. We're going to ramp up hackathons as well. Hackathons have been a way to really stress-test everything that we've released, because you've seen it: developers build firsthand, whether it's in alpha or in beta, they stress-test it to its full potential, and we're able to get the feedback that we need. So that's what we're going to do. From a dev relations standpoint, supporting what our product and engineering teams are building, we're going to really push the boundaries of what is possible, engaging more devs. Not just devs in our ecosystem right now, but also devs in different ecosystems who have not thought of us yet but may have considered us, right? Devs who are building on, let's say, Unity, who are building on Unreal, who are going to attend GDC, who are going to attend AWE. How do we get them to really think of us, build with us, and give us the feedback that we need as we march towards 2026?
[00:43:23.022] Kent Bye: Nice. And finally, what do you think the ultimate potential of XR, these head-worn augmented reality glasses and their integration with AI, might be, and what might it be able to enable?
[00:43:35.847] Joe Darko: Oh my God, that is a good question. Fully loaded. So for me, as a personal philosophy, I see a world where, like I mentioned earlier, there are still going to be tablets, there are still going to be laptops, there are still going to be cell phones, but our glasses are going to play a prominent role in our day-to-day lives. For instance, a personal story: I love gardening. If you don't follow me, Kent, follow me on Snapchat. Let's connect on Snapchat. I post a lot about my garden. And usually, when I'm gardening outside, I grab my phone, I take a picture, and I send it to ChatGPT: hey, this is dying. Is it really dying? Can you help me make sense of this? What should I do? In a perfect world, glasses are going to replace that with AI, as you mentioned. I can quickly look at this and say, hey, AR glasses, can you make sense of this for me right now? How should I consider rebuilding my garden to make sure my plants don't die? And I can look to the side of my garden, and it could probably just give me a 3D version of my garden and show me how to rebuild it, in terms of the grading of the land and the soil, to help with irrigation. It can probably do that and beyond, right? For instance, I also build a lot at home. I'm very handy. I spend a lot of time at Home Depot. In the future, I could pretty much look through the AR glasses with AI infused into them and say, hey, I have these two-by-fours. I'm trying to build a table for my kids. What is it going to look like? Where do I even start? Right now, what do people do? They go on YouTube, even if they have all the tools that they need, and they ask YouTube, and they spend countless hours watching and pausing, watching and pausing. In the world of augmented reality, it could probably give me a 3D design of what I should build with literally the two-by-fours and plywood I have and, let's say, the tools I have, the hammer, the nails: okay, that's how you go through the process, that's how you build. And I can build it as I go without stopping. So there, I gave you two examples. Also, as a public speaker, speaking on stage at conferences: people are going to wear AR glasses and use them as teleprompters, and people are not going to know. They're going to feel confident on stage, they're going to feel bold on stage, and they're going to be able to focus on the audience instead of looking at monitors. So these are some of the examples, and I believe it's going to take over the world.
[00:46:01.246] Kent Bye: Nice. Anything else left unsaid? Any final thoughts you have for the rest of the XR community?
[00:46:07.228] Joe Darko: I will say this: if you're listening to this, and if you are building XR experiences, I want you to think of Snap first, and I want you to build with us. And I will say, personally, stay with us. We have more coming for you, and let us know. Please share your feedback with me. Follow me on Snapchat, on LinkedIn, and on all the other social media platforms. I love feedback. Share your toughest feedback with me, and I'll make sure it gets to the right people at Snap.
[00:46:33.450] Kent Bye: Awesome. Well, Joe, thanks so much for joining here to help share a little bit about where you've come so far and where you might be going in the future. It's always really exciting to see how enthusiastic the developer community here at Snap is, and I'm really impressed with how far you've come with the Snap Spectacles. I'm really excited to see where you take this for the consumer launch next year. So thanks again for joining me here on the podcast.

Joe Darko: Thank you so much, Kent. See you next time.

Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. If you enjoyed the podcast, please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com/voicesofvr. Thanks for listening.