#1667: Kickoff of Snap Lensfest 2025 Coverage & SnapOS 2.0 Announcements

This interview with Spectacles Community Manager Jesse McCulloch kicks off my coverage of Snap's developer conference, Lensfest. Snap is gearing up for a consumer release of their Snap Specs AR glasses sometime next year, and they've been busy frequently updating their underlying operating system and platform tools like Lens Studio. No new details about the Snap Specs themselves have been shared yet, but I did cover the biggest announcements at Lensfest throughout this series and in this interview with McCulloch.

I also had a chance to interview five different Snap employees exploring different aspects of their AR strategy, and I interviewed some AR developers from the Snap ecosystem. Snap also brought me down to cover the 25-hour Lensathon, where I had a chance to be a judge for the 10 different Spectacles-based hackathon projects, and so I'll be featuring the top 3 finalists in the series. I also interviewed the AR game developers from DB Creations, as well as Niantic Spatial about their latest AI-assistant, guided-tour demo.

Here is a list of the 11 episodes and nearly 7 hours of coverage from Snap’s Lensfest:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, I'm going to be kicking off my coverage of the Snap developer conference called LensFest, where they were announcing a lot of new updates to their operating system and to Lens Studio. Sometime next year in 2026, they're launching the Snap Specs, which is the consumer version of the Spectacles that were announced on September 17th, 2024 at the Snap Partner Summit. They brought me out last year to cover that, as well as their Lensathon, their hackathon, where they have lots of different developers coming in. And they brought me out again this year to cover their developer conference. There wasn't any big new news around what's happening with their consumer launch next year, but they have been doing a lot of updates to their operating system and to Lens Studio, which is like their custom game engine that's very specifically designed for augmented reality lenses. There was also an opportunity to talk to some of the different developers that are within their ecosystem. They also had me be a judge for the first round of the hackathon. Over 25 hours, Snap brought together a total of 20 different teams of four people each; there was a games track and also a Spectacles track. The Spectacles track was given the remit to develop an augmented reality lens using some of the brand new features of Snap Cloud, which is a Supabase-hosted backend where you can have Postgres databases to build more sophisticated applications. The size limit for a Snap lens is around 25 megabytes, which is not a lot to work with. However, they're expanding it out so that you can more dynamically pull down assets and call edge functions in augmented reality applications, and also to see what happens if you start to throw a cloud-managed database at these applications to enable more sophisticated experiences with persistence. These 10 different teams were given an opportunity to develop whatever they thought would be interesting over the course of 25 hours. So I had a chance to see all 10 of the different Spectacles applications and did interviews with the top three teams, and I'll have some more comments on what types of applications were being developed. But I wanted to start off my coverage with Jesse McCulloch, who's the community manager, and we go through all the biggest announcements of what they've been doing for the last year. Most of what they've been doing is eight different releases of their operating system, SnapOS. They just launched SnapOS 2.0, and they're working towards 3.0 ahead of their consumer launch next year. And we go through the Commerce Kit announcements, all the different Lens Studio updates, all the different Supabase integrations, and all the biggest announcements over the course of this conversation. I also had a chance to go through and play about three quarters of their backlog of 109 lenses. So I talk about some of my impressions of that, and also some larger user experience things that I was noticing, and unpack some of the different thoughts that I have around what Snap's doing. Snap is kind of a dark horse in this race of augmented reality and XR.
We have some of the biggest players out there, starting with virtual reality headsets. We have Meta with their whole ecosystem, coming from VR and also from what they're now calling AI glasses and smart glasses. Google just came into the mix this week with the Samsung Galaxy XR deploying Android XR. We have an upgrade from Apple with the Apple Vision Pro with the M5 chip, which also just launched this week. And there's potentially Valve at some point maybe getting back into the game with what has long been rumored as the Deckard or the Steam Frame. That's yet to be announced or fully revealed, but it could be another virtual reality head-mounted display coming out at some point. But for Snap, they're really going all in with AR glasses. At LensFest, they were celebrating 10 years of lenses. They've been doing facial filters and augmented reality lenses for the past 10 years now, so they had a whole exhibition and retrospective of the 10 years of augmented reality development that they've been doing. And they have a whole social network audience that they've been cultivating, using AR for self-expression and to allow people to connect to each other. The Snap lenses are also creating other types of connected experiences and social experiences, and they have a strong vision for what they see as the future of computing. So I'll be covering some of the Q&A that the co-founders of Snap had, as well as, in total, five different interviews that I had with different Snap employees over the course of this 11-episode series, kicking off with this conversation with the community manager at Snap, who goes through all the biggest announcements that were being made at LensFest. So we'll be covering all that and more on today's episode of the Voices of VR podcast. This interview with Jesse happened on Wednesday, October 15th, 2025 at the Snap developer conference LensFest at their headquarters in Santa Monica, California. So with that, let's go ahead and dive right in.

[00:04:58.217] Jesse McCulloch: My name's Jesse McCulloch. I am the Spectacles Community Manager at Snap. I've been in XR for almost 10 years now, starting off as a developer with the original HoloLens 1, eventually getting to join the team at Microsoft through the launch of HoloLens 2 and some of the mixed reality stuff, and then moving to Magic Leap, doing community management there, and then finally landing here at Snap.

[00:05:18.626] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into the space.

[00:05:24.301] Jesse McCulloch: Yeah, I kind of fell into this by accident. I've been in the Microsoft development space for a long time and was watching as they unveiled the first HoloLens, and I just thought it looked super magical. And so I applied for a dev kit thinking I would never ever get one, because I had no background in doing anything, you know, real-time 3D, had no Unity experience, anything like that. And somehow I managed to get my hands on one of the first devices outside of Microsoft as a developer. And so I started teaching myself how to do Unity and build in 3D. And I'd been part of some coding-newbie groups and some communities that were really uplifting in how developers were helping each other out, and there was none of that in the XR space, at least around Microsoft. So I started building that, and grew a community called Holo Developers, which was kind of one of the first Slack communities for HoloLens, because when you're one of the first 20 people to have one, you can't go to Stack Overflow and be like, how do you do X? How do you do Y? Because nobody else knows either. So it was really just all about developers coming together and helping each other out and building relationships. And so that was kind of my foray into it. And then as I started doing the community building, you know, that really was a nice fit for me. I enjoyed doing it and was good at it. And Microsoft kind of recognized that as I was doing it and brought me into the fold.

[00:06:44.720] Kent Bye: So you've been at the forefront of augmented reality from Microsoft to Magic Leap and now here at Snap. What was it around AR technologies that really drew you in to dedicate the last decade of your career to it?

[00:06:56.208] Jesse McCulloch: Yeah, when I first got the HoloLens 1, I had this really magical experience. I was working in an office building, and one evening I was playing around with an experience that I downloaded where there was this AR model helicopter, and you'd hook up a Bluetooth keyboard to your HoloLens, and you could fly this helicopter around. And I remember having this moment of flying my AR helicopter around the cubicles and being concerned about running it into a wall so it wouldn't break, and realizing that my brain had flipped a switch and thought that this object was real, even though I logically knew that it was not. And so that was kind of the moment that I was like, oh, this is something, this is gonna have legs. And then thinking through to what that future state could be just pulled me deeper and deeper into it.

[00:07:42.577] Kent Bye: Yeah, and since you've had experience with both HoloLens 1 and 2 and Magic Leap, and now with the Snap Spectacles, the latest dev kit, what I was really amazed by was how much parity there was between the Snap Spectacles from 2024 and HoloLens 1 and 2 and even Magic Leap. It was amazingly comparable in terms of the quality of the experience and the hand tracking. I'd love to hear some of your comments on seeing what Microsoft was able to do with all of their billions in research into HoloLens 1 and 2, and how Snap, a scrappy company that's not even in the Fortune 500, was able to create something that would be on par with what Microsoft was able to do with HoloLens 1 and 2.

[00:08:23.139] Jesse McCulloch: Yeah. I mean, I've been really lucky to have experience with kind of all these devices over the years. Each of them has been kind of magical in its own way. HoloLens 1 was one of the first things that came out that people, to some degree, could get their hands on and experience. But it wasn't something you were ever going to wear out in the world. I remember I'd work in a Starbucks with it and get all kinds of weird looks. And people didn't realize that I could see them. So I'd see them pointing and whatnot and wave at them, and then they'd get all embarrassed because they thought I couldn't see them. But in order for this to eventually come out to the mass market, it's got to get smaller and lighter. And with Snap, I was super excited when I first got to try them when I started here, because it essentially had the capabilities of HoloLens 2, with the hand tracking and the field of view and the brightness of the displays that we had in a much bigger device. So even though it's not leaps and bounds bigger in field of view or anything like that, the fact that it's such a more compact thing really just points to the fact that over time we've gotten better at making these things smaller, and it really excites me for the future of how this continues until the point where we get to consumer glasses that people will be able to use every day.

[00:09:35.154] Kent Bye: Yeah, and so we're about a year in from when the Snap Spectacles were first announced. The developers have had a chance to build different experiences with it over the past year. Over the last day, I've had a chance to sit down with the Spectacles and go through probably around 80 to 90 of the 100 different lenses that are available. And so I'd love to hear some of your reflections on what has been able to be accomplished over the past year.

[00:09:59.761] Jesse McCulloch: Yeah, I mean, we've released, I think, eight releases now over the last year, which is like an insane speed to be updating the OS and the APIs and everything. And so with every one of these new releases, you know, we open up more capabilities that developers or some of our location-based entertainment partners have been asking for. So it's been really exciting to see that stuff come along. Right now it's super heavily games focused, because I think that's where people equate XR to be, kind of this gaming space. And a lot of that comes from things like Meta becoming a very games-focused thing, despite them trying to break into enterprise and stuff. And so I think developers that are in the XR space kind of have an affinity towards building games because that's what they know. So I think it's one of our jobs, as we move towards the future of consumer, to kind of start guiding them to what those other use cases are that they can build for. And we do that by providing new APIs, new features. So we've been opening up things like being able to talk to LLMs through our managed service. So we've got access to DeepSeek and Gemini and OpenAI for developers to go and start working in that world and taking feeds from the camera, sending them out to an AI, getting responses back, and starting to bring AI into our real world, which I think is going to be a really powerful future use case. And then beyond that, the co-located experiences, I think, is one of my favorite things about Spectacles: we make it really easy for two people who want to play a game together to jump into those experiences.
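
To make the camera-frame-to-LLM loop McCulloch describes a bit more concrete, here is a minimal sketch in TypeScript using the standard OpenAI-style chat completions REST API. Snap's managed LLM service and the Spectacles camera API are not shown; `captureFrameAsBase64()`, the model name, and the key handling are placeholder assumptions for illustration, not Snap's actual interfaces.

```typescript
// Hypothetical helper that grabs the current camera frame as base64 JPEG.
// On Spectacles this would come from the platform's camera API instead.
declare function captureFrameAsBase64(): Promise<string>;

// Send one frame plus a prompt to an OpenAI-style vision endpoint and
// return the model's text response (e.g. "label the electronics you see").
async function describeScene(apiKey: string, prompt: string): Promise<string> {
  const imageB64 = await captureFrameAsBase64();
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // any vision-capable model; Snap's gateway may differ
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: prompt },
            {
              type: "image_url",
              image_url: { url: `data:image/jpeg;base64,${imageB64}` },
            },
          ],
        },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // the model's description of the frame
}
```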

[00:11:25.760] Kent Bye: two people who want to play a game together to jump into those experiences yeah when is that augmented world expo I was able to see includes sitecraft which was a collaborative game where you're playing by yourself but you're also competing so that other people can knock you out so it's like a infinite runner but you have to compete against the other people but also you can cooperate so it's like a interesting mix of cooperation competitiveness but talking to includes ray colomar he was talking about how they're doing a lot of location-based entertainment which to me feels like lbe is going to be a place where you're able to have these headsets because you know not everyone's going to have them but also give them an experience that's compelling for them to be able to jump in and get an experience So it seems like LBE is a place where we're starting to see businesses that are already using it deployed out live before even the first official consumer launch. And so I'd love to hear some of your reflections on some of those needs of LBE, but also some of the reflections on this kind of emerging market to, in some ways, before it gets out to the hands of consumers, have some of these arcade-like experiences where people can pay a lot less money and get a taste of it and have an experience of this technology.

[00:12:36.963] Jesse McCulloch: Yeah, I mean, NCLU is kind of a really great partner for us. They are going out and working with bowling alleys and like Dave & Buster's and stuff like that to bring their experiences out to people. And part of the reason they've been a really great partner is that they're really pushing the bounds of what the devices can do. And so we have calls with them at least weekly, sometimes more often than that. And we're at this point where we don't have to try and scale that yet, so we can have that really deep working relationship with a partner and really hone in on what those needs are. So, you know, whether it's things like remote assets, because they're pulling down a lot of bigger assets. So having that capability, you know, here at LensFest, we announced Snap Cloud. So bringing the ability for them to host those assets on a cloud system that's built in and integrated into Lens Studio is going to be a big boon for them. And then even rolling out things like our fleet management tool. That allows them to manage a whole bunch of devices through their operators, because they're not physically at each location. They go out and they contract with these operators to run these locations. So giving those operators a way to look at all the devices, manage them, see where they are on power consumption, whether they're too warm to be put back into circulation, making sure they're online, and then being able to launch that experience, all from a nice form factor like an iPad or an iPhone. So they've been a really great partner in just helping us get that feedback and really starting to build a product so that future location-based entertainment partners will be able to come in and use those tools as well.

[00:14:04.676] Kent Bye: Yeah. And having an opportunity to binge-watch the backlog of these lenses, I'm doing a use case that most people aren't going to be doing, sitting in the headset for hours and hours with it tethered. But there are moments where it does get to the point where it has to pause because it gets too hot and it has to sleep. And so you mentioned the heat measurement. Just curious to hear a little bit about the heat management, in terms of how sometimes, if you play for too long, it starts to overheat and you have to turn it off and let it cool down.

[00:14:34.926] Jesse McCulloch: Yeah, I mean, any time you're working with hardware, there's always trade-offs in what you're doing with it. And as we get into these smaller form factors, that just becomes more pronounced. One of the things with Snap is we've got the full stack. We manufacture the hardware, we build the OS, we build the tools that you use to build. So we do a lot of micro-optimizations in there to have the device function as well as it possibly can. But the reality is, with displays and creating graphics and GPUs and chips, they get hot at a certain point. And so while we have technology like vapor chambers and whatnot to kind of help dissipate that heat, it's still something we're trying to get the balance of exactly right so that people can have an experience and move through it without cooking their head. So, you know, making sure that we've got those safeguards in place, that we do these thermal shutdowns before it gets too hot to where it's physically dangerous or anything like that, is an important part of that. And, you know, we're always continuing to try and innovate and make sure that we can provide a good experience within the constraints of the device.

[00:15:38.962] Kent Bye: The other thing that I noticed was that a lot of the experiences are fairly bite-sized, in the sense that you could play through the entirety of an experience within the course of five or ten minutes. Some of them are longer, in terms of the infinite runner type of experiences or puzzle games that go on and on. But for the most part, it's somewhere between five or ten minutes, sometimes more, sometimes less. It seems like there is a limit of around 25 megabytes or so that people have to fit their experiences within. Just curious to hear about that size limit and the new things that you're announcing here at Snap LensFest, where, with cloud access, you're able to dynamically pull down more assets so you could potentially have more in-depth or involved experiences.

[00:16:21.605] Jesse McCulloch: Yeah. I mean, again, it goes kind of to that balance. With Snap Cloud, that is one of the features that we've hit on with a lot of developers, wanting to have bigger assets that just don't fit into these packages. So giving them that capability was super important to us. As far as the bite-sized experiences, right now, with it being a dev kit, there's no consumer version. There's not really a consumer AR glasses product that has the full capabilities that Spectacles has out there in the market. So I think developers have to make this choice of where they invest their time and their money. And so I think they're interested in playing with the headset, getting to know the features, and kind of waiting for that monetization opportunity to come out, which comes with a consumer launch like we've announced for next year. We're announcing Commerce Kit here at LensFest as well. So that's going to give them the opportunity to do kind of some in-app purchases and stuff. So I think as those monetization opportunities come about, we'll start to see more developers build longer experiences, versus right now, where they're just kind of getting familiar with it for when that time comes, when it's monetarily worth them spending more time to build out fuller experiences.

[00:17:35.081] Kent Bye: Yeah. And we're a day before the big keynote where all the big announcements are going to be formally made. I always prefer to hear about these things in conversations live, so I didn't have early access to the embargoed information about what's going to be announced. But I get a sense that maybe some of these different monetization strategies are just for broader Snapchat, where lens creators can create lenses, and if they go viral, then there may be some sort of revenue-sharing program with some algorithm that determines how that payout is going to happen. For the Spectacles, it seems like there may be something more explicit, where you're buying and purchasing, whether everything's free-to-play or there are ways of buying the lenses. The lenses are the equivalent of what we would normally think of as apps, but they're basically these self-contained experiences that, in the ecosystem of Snap, are referred to as lenses. So just curious to hear a little bit about how you foresee this moving forward, whether you imagine having more of a default of free-to-play with people adding monetization opportunities on top of that, or whether there are going to be lenses that you have to buy in more of an app store model. So just curious to hear a little bit about what types of things you're looking at as you move towards the consumer launch.

[00:18:47.352] Jesse McCulloch: Yeah, definitely not ready to talk quite that far forward yet. But right now, like I said, we've got this Commerce Kit thing coming, where people will be able to do in-app purchases. So I think that kind of hints a little bit towards what may be coming in the future. And then the other way that developers can monetize right now is through the monthly community challenges that we're doing. Developers can go build out kind of a proof of concept, and we've got $33,000 in prizes that we're giving away each month, across new lenses and people who update their existing lenses. So we're trying to build that continuous wheel of not just dropping something in there and then letting it fade to nothingness, but having people thoughtfully update their lenses. And then we know some of our APIs are what we call experimental, so you can't actually publish them into Lens Explorer, because they take camera frames and talk to the internet, and we want to be really mindful of the privacy aspects of that. So if they're open source, that's kind of another category. And so the idea is we've got people who are building out these proofs of concept, and we're paying attention to what those are and which ones are kind of rising to the top. And then we have the opportunity to go fund those if we want to see them go a little bit further. And then eventually we're building out this community of known developers that are doing quality work, so that when we start getting brand partners who are interested in having experiences but don't have the capabilities of building them out themselves, we can make those connections. And so that's where we see kind of these first monetization opportunities for our developers while we're still in this dev program, and then Commerce Kit kind of hints at what happens when we start getting a consumer version.

[00:20:25.549] Kent Bye: Yeah, and so just curious to hear a little bit more about the SnapOS 2.0 that launched a month ago or so. It seems like you're launching things like a browser and WebXR experiences. I was able to pull up some WebXR experiences; some work, some don't. There are WebAR and WebVR subsets, where it's either VR or AR within the context of WebXR. But yeah, just curious to hear about SnapOS 2.0 and some of the other things that you're also launching within the context of that.

[00:20:54.315] Jesse McCulloch: Yeah, so we kind of previewed SnapOS 2.0 to press last month. It actually just launched yesterday with some of the features that we're working on for the Lensathon and for LensFest. As you mentioned, the browser: it's a new bespoke browser that we've built kind of from the ground up to give some of those WebXR experiences. We're still working through that; it's very much in beta. As you mentioned, you found some that did and didn't work. That's to be expected, as different capabilities either are or are not available through our browser. Other things we've brought to it: Spotlight. So with Spectacles, you can take recordings using a hardware button, and it'll take a 30-second clip. Previously, you had to download those recordings to your phone, and then you could share them out through whatever social media. We've got a gallery now, so you can actually review those on the device itself and then share them out to social media. There's Snapchat integration, and I believe there are some others. And then Spotlight brings kind of the Snapchat Spotlight feed experience into Spectacles, and you can swipe through the videos, like, and comment. And so it's kind of us taking a half step towards that consumer experience of what we expect to see on consumer glasses next year and just getting it out there so we can get feedback on it from our developers and improve on it. That way, when we're ready to launch next year, we've already got a head start.
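
Since the new browser's WebXR support comes up here, below is a minimal sketch in TypeScript of what a WebXR page has to do to start an AR session. This is the standard WebXR API rather than anything Snap-specific; whether features like hand tracking are exposed depends on the browser, which is why the capability checks matter.

```typescript
// Minimal WebXR bootstrap: feature-detect support, request an immersive-ar
// session, and start a render loop. Rendering itself is left as a stub.
async function startAR(canvas: HTMLCanvasElement): Promise<void> {
  if (!navigator.xr) {
    console.warn("WebXR is not available in this browser");
    return;
  }
  const supported = await navigator.xr.isSessionSupported("immersive-ar");
  if (!supported) {
    console.warn("immersive-ar sessions are not supported here");
    return;
  }
  const session = await navigator.xr.requestSession("immersive-ar", {
    requiredFeatures: ["local"],         // basic local reference space
    optionalFeatures: ["hand-tracking"], // may or may not be exposed
  });

  // Hook the session up to a WebGL layer so it has something to present.
  const gl = canvas.getContext("webgl2", { xrCompatible: true })!;
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace("local");
  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    // ...draw one pass per view in pose?.views here...
    session.requestAnimationFrame(onFrame);
  });
}
```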

[00:22:12.697] Kent Bye: Yeah, and here over the last 24 hours or so, there's been a Lensathon where you've got a number of different developers. There's a total of 20 different teams, because there's a games track and a Spectacles track, but because this is Voices of VR, I'm mostly interested in the Spectacles development. So, 10 different teams of four each. The remit for these developers was to use some of the new APIs with Supabase, this open source Postgres database capability. So maybe just give a bit more context for what types of new experiences you expect to be possible now that you have this in-the-cloud Supabase Postgres database available, to start to do more sophisticated data structures or data flows, or to pull in information from the cloud or send it up to do AI processing. Just curious to hear a bit about what is new with Supabase and this Postgres database.

[00:23:04.522] Jesse McCulloch: Yeah. So, you know, we launched this yesterday. It's Snap Cloud powered by Supabase, which, as you mentioned, is kind of an open source Postgres cloud backend. We partnered up with them, so we actually have a customized version of it that's deeply integrated into Lens Studio. It allows our developers to go create that backend with a click of a button so they can really quickly get started. And it's definitely in its alpha, so I want to make sure and call that out: these developers are playing with kind of the earliest builds of it. And I'm actually really curious myself to see what they do with it. We got the opportunity to go to Supabase Select, which is their developer conference in San Francisco, a couple weeks ago. And we brought Spectacles. We didn't give any hints, so we had people just trying to implement it with regular Supabase. And I saw an experience where they were taking frames from the camera, sending them out for processing, looking at food items, and using an AI to help decide whether this is safe for a diabetic to consume. And so I think use cases like that are kind of interesting, thinking about how this will help people in the future of having this always-connected AI cloud. I get really excited thinking about the possibilities of what that looks like, especially as AI comes along. But those are people who are deep into the Supabase ecosystem and really know how that works. I'm really curious to see what our developers, who are probably not as deep into that ecosystem, can do.
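
For readers who haven't touched Supabase, here is a minimal sketch of what the client side of that Postgres backend looks like, using the plain supabase-js library. The project URL, key, and `scores` table are placeholders, and how Snap Cloud wires this into Lens Studio may differ from this vanilla usage.

```typescript
import { createClient } from "@supabase/supabase-js";

// Placeholder project URL and anon key; Snap Cloud provisions its own.
const supabase = createClient(
  "https://your-project.supabase.co",
  "public-anon-key"
);

// Persist a score row for the current player.
async function saveScore(player: string, score: number): Promise<void> {
  const { error } = await supabase.from("scores").insert({ player, score });
  if (error) console.error("insert failed:", error.message);
}

// Read back the top ten scores for a simple persistent leaderboard.
async function topScores() {
  const { data, error } = await supabase
    .from("scores")
    .select("player, score")
    .order("score", { ascending: false })
    .limit(10);
  if (error) throw error;
  return data;
}
```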

[00:24:35.609] Kent Bye: Yeah, and back in the day when I was doing a lot of Drupal open source development, there was the LAMP stack, and usually it was MySQL and eventually MariaDB. So where did Supabase come from? Maybe just give a bit more context, because Postgres was like, oh, it's an alternative to MySQL. But just curious to hear a bit about this Postgres track and the whole open source development of Supabase, which has seemingly created a lot of other cloud-based services to scale up, like a content delivery network across a number of different countries, to make it more robust than, say, a normal MySQL database on a single server. So just curious to hear a bit more context for Supabase and why you decided to partner with them.

[00:25:14.446] Jesse McCulloch: Yeah, I mean, I wasn't super deep into the selection of where we were going with it, but I know that some of the things we were looking at were how quickly they've been able to scale up and scale worldwide. They have a super, super strong focus on having a good developer experience, which is super important to us as well. And then, as we look forward, there are the privacy aspects of wearing a device that has cameras and microphones out in the world, and how do we think about that? Going through a service like Supabase allows us to make sure that we're keeping that privacy focus and that we've got the ability to make sure we're not leaking data out. Having developers go through a service that we manage gives us a little bit more of that control, making sure that it's not just the wild, wild west of Supabase. Like, oh, now I'm sending frames and doing facial recognition and popping up data about somebody that we pulled off of LinkedIn. That gets into really bizarre, weird territory really fast. And we want to make sure that we're pulling the reins in on that so that we're not moving past where we feel acceptable use should be.

[00:26:22.623] Kent Bye: OK, so it sounds like Snap Cloud is maybe a deployment of a Supabase instance. Because it's an open source project, but it's not like people are using it directly in the Snap Spectacles context. It sounds like they're going through Snap Cloud, which is sort of managed by Snap, like that open source distribution is going through them, but it's using all the normal APIs from Supabase. Is that correct?

[00:26:43.106] Jesse McCulloch: Yeah, and we're working with, like, Supabase is hosting it for us currently, and then eventually we plan to bring that on-prem. But, you know, really working super close with the Supabase team to help us make sure that we're doing it in the right way.

[00:26:55.739] Kent Bye: Okay. The other thing that I noticed: there were a lot of announcements around artificial intelligence, which seems to be a really hot topic right now across the tech industry. Some of the features that were shown to the developers were large language model AI assistants within the context of Lens Studio, so you could ask it some questions. And it sounds like you've done a little bit of tuning of the APIs with examples. Usually you have to have a wide and robust enough set of data for large language models to be of much use, but it sounds like Snap has maybe cultivated their own training data set and shown different examples to get a little bit of a head start, so that either with the Supabase integrations or other APIs, there are ways that you could start to do AI assistance types of things within the context of Lens Studio. Maybe you could just elaborate a little bit on that development and cultivation of that.

[00:27:51.055] Jesse McCulloch: Yeah, I mean, AI tools, I think it's hard to say at this point that they haven't affected everybody. And so, you know, internally we've been working with some of the various AI large language models. And then coming at it from the context of, like, something like Lens Studio has a much smaller user base than, say, Unity or an Unreal Engine, just by virtue of what it's been used for in the past. And so, you know, it doesn't have the documentation and everything that's been consumed by all these AIs, where you can go out and ask a generalized Gemini or Claude Code or whatnot and have a deep understanding. Plus, the idea of having an AI assistant help you build out a website or something, that's a very flat thing. So when you start working in 3D space, having an AI code helper understand that 3D space and, you know, placing content in 3D and stuff is just really difficult. And so we've been looking at, like, how do we build out MCP tools that'll allow a regular LLM to be able to come in and understand Lens Studio a little bit better, what APIs are there, what's supported, as well as building out internal chat assistants within Lens Studio that are kind of trained within the Snap ecosystem. So we're kind of playing with all of it. And, you know, a lot of that starts off with: internally we want to use it. And then how does this roll into something that we can bring out to our external developers as well? So I think it's still, you know, early days on this stuff, and a lot of trial and error. But we see the future of AI being a big part of a developer's workflow, and so we're trying to move towards building that into our tools so that that's easy for them to do.
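
As a rough illustration of what an "MCP tool" in this spirit could look like, here's a hedged sketch using the public TypeScript MCP SDK. The server name, the `lookup_api` tool, and the stubbed response are hypothetical; this is not Snap's actual tooling, just the general shape of exposing a documentation lookup to an LLM over MCP.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// A toy MCP server that a coding assistant could query for Lens Studio docs.
const server = new McpServer({ name: "lens-studio-docs", version: "0.1.0" });

// Expose one tool the LLM can call to look up an API by name. The lookup
// itself is stubbed; a real server would search actual documentation.
server.tool(
  "lookup_api",
  { apiName: z.string().describe("e.g. a class or module name") },
  async ({ apiName }) => ({
    content: [
      {
        type: "text",
        text: `Documentation stub for ${apiName} (wire this to real docs).`,
      },
    ],
  })
);

// Serve over stdio so an MCP-capable client can attach to it.
await server.connect(new StdioServerTransport());
```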

[00:29:32.315] Kent Bye: Yeah, I know there's certainly still a lot of limitations and constraints with LLMs in terms of hallucinations and whatnot. So at least you have the ability to have error detection, or if things aren't working, you get errors that are sent back. But yeah, I'm curious to hear from some of the different developers how much they were using some of these AI tools. It seems like there was also this kind of modularization of some of the different aspects of what might be available in different types of toolkits that can start to get pulled in. And so there was a mention of a block system. I don't know if that was also for Snap Spectacles or if that was just broadly for all lens developers. But maybe you could just talk a bit about some of the new features of these modular blocks that are being made available.

[00:30:15.939] Jesse McCulloch: Yeah, so I think the block system is more on the game side, but on the Spectacles side, last year we launched with our Spectacles Interaction Toolkit, which is this broad toolkit for how to interact with 3D content using hand controls or far-field hand controls, plus some UI elements. It was kind of a big package of things that developers would find useful. As we've moved forward, we've come out with other toolkits. So we've got our interaction toolkit, and now we've got a connected lenses toolkit for building those multiplayer, co-located experiences, pulling out some of the stuff from the interaction toolkit that's more specific to that. And then yesterday we launched our UI toolkit, pulling some of those UI components out of the interaction toolkit and also building out some prefabs and some examples of what we believe good UI is for these kinds of experiences. So we can help our developers go build out something that, I don't want to say a standard, because we don't want every application to look exactly the same, but it gives them some guidelines and some tools to build out a functional UI for their application. So I imagine in the future, as we suss out different modular pieces that either sit on top of each other or sit beside each other, you'll start to see more and more of these kinds of asset modules come out that are just there to help developers have a faster, easier building experience.
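
For context on what a Lens Studio script looks like next to these toolkits, here is a small TypeScript component sketch, assuming the @component / BaseScriptComponent pattern Lens Studio uses for custom components. It only spins a scene object every frame; the toolkit-specific classes (interactables, UI widgets) are deliberately left out, since their exact names vary by toolkit version, so treat the API surface here as an assumption rather than a reference.

```typescript
// A minimal Lens Studio-style component sketch (assumed API surface: the
// @component decorator, BaseScriptComponent, UpdateEvent, quat, and vec3).
// It rotates its own SceneObject; a real project would trigger this from an
// Interaction Toolkit event rather than running unconditionally.
@component
export class Spinner extends BaseScriptComponent {
  // Exposed in the Inspector: rotation speed in degrees per second.
  @input
  speedDegreesPerSecond: number = 45;

  onAwake() {
    this.createEvent("UpdateEvent").bind(() => {
      const transform = this.getSceneObject().getTransform();
      const radians =
        ((this.speedDegreesPerSecond * Math.PI) / 180) * getDeltaTime();
      // Apply an incremental rotation around the local up axis each frame.
      transform.setLocalRotation(
        transform
          .getLocalRotation()
          .multiply(quat.angleAxis(radians, vec3.up()))
      );
    });
  }
}
```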

[00:31:42.065] Kent Bye: Yeah, it seemed like you had a lot of different example projects for people to start to dive into and see some of these latest APIs that are just being launched, especially in the context of a hackathon, where you could give the developers a leg up in terms of seeing what the default, canonical implementation of some of these different features looks like. So maybe just talk a bit about some of the different new example projects that you were trying to get ready for this Lensathon here.

[00:32:06.158] Jesse McCulloch: Yeah, I mean, we knew for Lensathon, Supabase was going to be a big one. So we went out and built out an example project for Supabase that showed some of the different functionalities, like real time, edge functions, database storage, and being able to fetch and retrieve that stuff. I think overall, we've built out a really big library. I think there's 30 plus open source examples that we have. And now I think part of what we need to go do is those are very good at showing how to use a specific API. But I think there's bits and pieces that can be pulled out of there to bring some of that modularity back down to developers so that instead of trying to be like, okay, now I've got these four example projects that all together kind of do what I want to do, but I don't know how to actually go put them together to do that. Like pulling those pieces out and being able to show like, oh, you grab this piece here and this piece here, and maybe we publish those as assets in our asset browser so that they can pull those functionalities in. They actually are meant to work together and people can start building out these experiences without doing so much of the heavy lifting themselves.
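
The realtime, edge function, and storage pieces mentioned here map onto plain supabase-js calls like the sketch below. The table, channel, and function names ("rooms", "process-frame") are illustrative placeholders rather than anything from Snap's example project.

```typescript
import { createClient } from "@supabase/supabase-js";

// Placeholder project URL and key, as in the earlier sketch.
const supabase = createClient("https://your-project.supabase.co", "anon-key");

// Realtime: react whenever another player inserts a row into `rooms`,
// e.g. to show a newly created multiplayer room in a lobby UI.
supabase
  .channel("rooms-changes")
  .on(
    "postgres_changes",
    { event: "INSERT", schema: "public", table: "rooms" },
    (payload) => {
      console.log("new room created:", payload.new);
    }
  )
  .subscribe();

// Edge function: offload heavier server-side work from the lens.
async function processOnServer(payload: Record<string, unknown>) {
  const { data, error } = await supabase.functions.invoke("process-frame", {
    body: payload,
  });
  if (error) throw error;
  return data;
}
```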

[00:33:11.488] Kent Bye: As I think about the XR industry over the last 11 years that I've been covering it now, it'll be 12 years next May, it seems like Unity and Unreal Engine have been real mainstays of all the different XR experiences. The majority of the experiences I've seen over the past decade-plus have been on either Unity or Unreal Engine. And so now that we move towards the smaller form factor, you have Lens Studio, which is sort of like your own internal game engine that is able to use TypeScript and pull all these things together, but is potentially a little bit more optimized to run on these headsets. WebXR is also another option for folks to potentially use sets of open standards to develop through the web browser. You know, talking to folks like Matt Hargett from Rebecca Specialties, they've found that even within the context of the browsers, it's not necessarily reliable enough across all the different vendors to deploy and ship enterprise-quality software that meets their standards, because when there are various changes, things break. And so they're moving off into more React Native or native script ways of compiling things down to deploy out to these different devices. But for Snap, there's Lens Studio. Meta just announced that they're creating their Horizon game engine, and it's unclear to what degree they're going to take a similar path of creating their own custom-rolled game engine to be able to have apps on both the Meta Ray-Ban Display and the Ray-Ban Meta AI glasses. So it seems like, as things get smaller in form factor, there's a little bit of a move away from these game engines that have been really driving the industry. Just curious to hear some of your reflections on that, having been at both Microsoft and Magic Leap and now here at Snap, and the trade-offs and the cost-benefit analysis of rolling your own versus leveraging the existing development ecosystems that are out there with existing apps, while having to really tune the software to fit the different new hardware that's coming out.

[00:35:09.286] Jesse McCulloch: Yeah, I mean, Snap's in an interesting position, especially with Lens Studio. When I look at a Unity, originally it was developed as kind of this alternative to Unreal Engine and some of the other game engines for people to go build mobile games and PC games. And as XR came into being, they added on tools for doing that. And eventually it becomes kind of this thing that does everything, but it's not custom built to do any one thing. Whereas with Lens Studio, Snap's been in various forms of AR for 10-plus years. And when they built Lens Studio initially to do Snapchat lenses, it was kind of an AR-first thing. And so I think that's one of the advantages that Lens Studio has: it's been custom built for AR development from the get-go, and it kind of reflects that. Now, if you're a Unity developer who's coming to Lens Studio and you open it up, you'd be like, oh, this feels familiar. It looks like Unity. The scripting language is different and some of the APIs are different, but it's not going to be completely new territory. WebXR is an interesting one because, like you said, every browser implements it a little bit differently. And then different vendors on the headset side are implementing different features at different points in time. So some of them may support hand tracking, some of them may not. And so even though it's meant to be a standard, I don't think it's been standardized across enough platforms yet to, like you said, have that consistent experience no matter what headset you're on. But it is something that we know our community is interested in building as kind of this stopgap cross-platform thing. You know, what the future looks like for Lens Studio, I can't go down that road yet. But personally, I'm a big fan of open standards. OpenXR, I think, is something that's interesting, and this is just my personal belief, but it's one thing to say, oh, can I build an experience for Spectacles using Unity and OpenXR? That's one thing to consider. But the other thing is, in the future, what if you could build in Lens Studio and deploy out to other OpenXR headsets? Maybe that's an even more interesting one to think about. Are developers using Unity because it's the best tool for doing AR development, or are they using it because it's the one that can reach the most devices? And if we've got a tool that maybe could do that development in a friendlier way for developers, and then they could still reach all those other devices, would we see an influx of developers? And so that's something that I would love to see happen. I'm not on that side of the business, so I don't have any input there. But, you know, personally, that's interesting to me.

[00:37:51.045] Kent Bye: Yeah, I know that just talking to Matt Hargett, there is this move towards React Native as well as UnityScript just to be able to pull in more of these different libraries from the JavaScript perspective, but also more broadly, what is open source pathways for generating content? There's the Godot engine and Blender. There's all sorts of open source tool sets. So just finding ways of coming up with OpenXR standards or using things like WebXR. So yeah, that's something that I'd be very keen to see as things move forward, if there would be open source pathways above and beyond what's happening with Lens Studio. But for right now, it seems like that Lens Studio is the primary way of getting things onto the headset. Quick follow on around the browser, you did say that it was homegrown. Usually, most folks are starting either from Chromium or from Apple-based WebKit. So just curious if this is a Chromium-based browser.

[00:38:44.916] Jesse McCulloch: I actually don't know. Okay.

[00:38:47.897] Kent Bye: I would imagine it's probably Chromium, since that seems to be what most folks are doing. I did want to reference a letter that Evan Spiegel sent out to all employees on September 8th. There was a whole press release where this letter was announced to the world. It was a bit of a recap of where things are at, but there was a little section where they mentioned the game developer tracks, because there's the Spectacles track and there's also a gaming track. And so there seems to be more and more of an emphasis on looking at gaming in the context of Snap generally, above and beyond what's happening in the context of the Spectacles. Within XR devices and VR, gaming has been a huge part of really pushing forward what's possible. And in fact, there are a couple of Resolution Games titles on the Snap Spectacles that I really quite enjoyed, including the cab one, where you're using your hand to direct a car around, but also using your hand to kind of move around in 3D space in a way that felt really nice and fluid. So gaming seems to be a place where there's quite a lot of innovation. In the little quote from Evan Spiegel, he said, "We're developing new conversation starters that span new status updates, flashbacks, topics, and games so that Snapchatters have more reasons to reach out and more fun ways to connect." Half of the Lensathon is focusing on the game development track, so folks that are more on the Snapchat side, creating games in the context of these lenses. But curious to hear some of your reflections on this pivot towards more and more games and interactive aspects, both in what's happening on the Snapchat platform, but also looking at Snap Spectacles and seeing how gaming seems to be a core place where people are experimenting with new ways of using the affordances of these devices, to see what's even possible and prototype what types of fun experiences might be possible.

[00:40:38.420] Jesse McCulloch: Yeah, I mean, I sit really, really centered on the Spectacles side, so I'm aware of what's happening on the Snapchat game side, but I'm not super deep on it. But, you know, Evan's big thing across kind of all of our products is about connecting people. That's something he's really big on, and he centers a lot of his messaging within the teams and externally on making the stuff more human-connected across all of our products. So, you know, games is another aspect of that. And during this challenge, the games track is all about turn-based games. So you're playing with somebody else, and seeing that as a reason to reach out and be like, hey, do you want to play a game? Let's have this connection. Even though we're doing it through technology, it's still about that connection. On the Spectacles side, again, as I mentioned earlier, I think developers come into this naturally with a tendency towards games, which is fine. I think it's super interesting, and you shouted out Resolution. They're an incredible game studio. They've been doing stuff on the VR side for a long time, and it's great seeing what they've been able to do and how they problem-solve some of the unique things that happen in AR. You know, one of the constraints on AR glasses is the field of view. And so they've got a snake game, which is kind of your classic, go back to the Nokia phone days of playing Snake, where you're the snake and you're trying to consume apples and not run into yourself. When they first started building it out, they'd control the snake, and it would immediately go outside their field of view, and they'd lose it and couldn't find it. And they're like, how do we solve that problem? They wrote an incredible blog post on our Reddit community about some of these challenges. But one of the things that they did to solve for that is the snake actually stays centered in your field of view, and they move the volume of the world around it. And so you still get this feeling of the snake moving through the world, but you never have that problem of losing it outside of your field of view, which I thought was just this incredible compromise of how you work within the constraints of these devices. And so seeing that kind of experimentation happening as they've been building, and then bringing that feedback and sharing it out to our developers, is a really wholesome thing for a community to have. And so I think as we bring more and more developers on, game mechanics are going to be a thing that we're going to have to try and learn in this AR space, where you're using hands and voice to control things more than controllers. So I think all of that experience that game developers bring is just going to help build out a greater platform for us.
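
The "keep the snake centered, move the world instead" trick described here boils down to inverting the motion, which the short TypeScript sketch below illustrates. The vector type and the idea of a single level-root object holding the apples and walls are assumptions for illustration, not Resolution Games' actual code.

```typescript
interface Vec3 {
  x: number;
  y: number;
  z: number;
}

// Instead of translating the snake by +velocity * dt, translate the level
// root (apples, walls, everything else) by -velocity * dt. The relative
// motion is identical, but the snake never leaves the narrow field of view.
function updateWorld(
  levelRootPosition: Vec3, // parent object holding all world content
  snakeVelocity: Vec3,     // direction the player steers this frame
  dt: number               // frame delta time in seconds
): Vec3 {
  return {
    x: levelRootPosition.x - snakeVelocity.x * dt,
    y: levelRootPosition.y - snakeVelocity.y * dt,
    z: levelRootPosition.z - snakeVelocity.z * dt,
  };
}
```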

[00:43:06.629] Kent Bye: Yeah, I did notice that in some of the different experiences and games where you're asked to do a certain action, sometimes if you weren't doing that action within the field of view, it wouldn't register. Things like Synth Riders, where you're punching things and you have to keep your hands within the field of view, or a kind of pixel-building game where you're building a heart, so you're grabbing these pixels and moving them across the room. I found that if I was moving my hand outside of the field of view, sometimes I would drop the block. So at this point there's a narrow field of view. But looking at something like the Meta Ray-Ban Display glasses, they have just a monocular display, which they're really optimizing by putting it on one side of your face. And what I noticed in doing that was that it creates this kind of binocular disparity where you have these kind of Z-fighting types of effects. If anybody's a Unity developer, you'll notice that when you shut one eye, it'll be one color, and you shut the other eye and it's another color. And so there are ways that you can see that binocular disparity play out in what feels like a bug. But for Meta, that's a feature, because they were trying to reduce the amount of processing power that's required for binocular displays. So doing just a monocular display, with kind of HUD glasses with no head tracking, my experience of that was that it creates, at least for me doing the demos (I didn't buy the glasses to do extended use), more eye strain, or something that's just not very comfortable to have in one eye. So with the Snap Spectacles, you have a binocular display and a thinner field of view, but you're not having those other eye strain issues. So yeah, just curious to hear some of your thoughts or reflections on some of these different trade-offs between monocular and binocular displays.

[00:44:48.226] Jesse McCulloch: Yeah, like you mentioned, when you put something just in one eye, it does cause strain. Your brain will adjust to it eventually, but that doesn't lower the amount of strain that's happening, even if your brain accommodates for it. That's not a road we've ever decided we wanted to go down. We really believe in this binocular display with a field of view and hand tracking and stuff. I think they're very different use cases, even though the form factor leads people to try and equate them. We just see them as two totally different products. As far as the trade-offs, it's weight, size, heat, and power that you're trading against. And so as these projectors get smaller, the size and the weight go down a little bit, but you've still got this trade-off of, like, you can have a really big field of view, but then the GPU in the temple will get really hot and you don't want to burn anybody, or your battery life goes down. So you've got an 80-degree field of view, but you get five minutes of battery life. That's not very useful. It's always a matter of figuring out what those trade-offs are, and as the technology comes along and batteries get smaller but deliver more power, where do those needles move along the way? So that's something that we're always kind of fighting with and trying to find the right balance.

[00:46:09.369] Kent Bye: Yeah, when I think about the user input ways of controlling these augmented reality experiences: with the Meta Ray-Ban Display glasses, you have the Neural Band, which is essentially turning your hand into like a TV remote, where you're able to navigate 2D menus, move left, right, up, down, and use the thumb to index finger for accept and the thumb to middle finger to go backwards. That works great for 2D interfaces, but for 3D interfaces, I think the Apple Vision Pro, with the eye-tracked look and pinch, is the optimal, the most streamlined, and that's where the real magic starts to happen. With the Snap Spectacles, I feel like it's somewhere in between those two, between the Apple Vision Pro with the eye tracking and the look and pinch, and the Meta Ray-Ban Display glasses with the neural input, where you're able to have high fidelity for clicking. For the 3D interfaces, what I noticed is that you can either reach out and click the buttons if you can reach them within the near field, or the far field turns your hands into these little mouse cursors, and those mouse cursors are flying through space, and you sometimes have to move your hand forwards and backwards to get onto the right plane. And so it felt like a good trade-off, using hand tracking to do all of that, but there were still a lot of times where I was trying to click on something and it was at the wrong depth, and I had to move my hand forwards and backwards in order to get to the right location. So I'm not sure if this is something that is more of a computer vision or UI issue, or just the standards for how this is going to operate in the future, but that was one of the things I noticed as I was playing through a lot of the games: this frustration sometimes of not being at the correct depth when I wanted to click something.

[00:47:59.022] Jesse McCulloch: Yeah, I think it's a combination of all of the above, right? Like, you know, computer vision, being able to track the hands. And, you know, hands are pretty amazing in all the different positions we can put them in, so we have to be able to train our models to hit all of those. And then part of it's UI. And we've done a lot of experimenting with, like, do you snap a cursor to something that you think the user is going for, and at what point do you do that, and does that become frustrating as it snaps to the wrong thing? We're still playing through a lot of that stuff and doing a lot of experimentation and user studies and pushing out updates and getting feedback on, oh, this feels like it got better or this feels like it got worse. There's still a lot of that going on, and I think there's still no defined UI or interaction paradigm for any of this. We've got things that we're playing with. Eye tracking is super interesting. The Apple Vision Pro has it. The HoloLens 2 had it. That can get you in trouble in different ways. A lot of times, when you're looking at something and you go to action on it, your eyes move away to where you're predicting the next thing to be before you actually make that pinch. So it's like, do you pinch on where your eye's at, or where it was? And so there's a balance to all of this, and I think we're still trying to figure out what that best use is.

[00:49:15.083] Kent Bye: Are there any other announcements or new things that are coming out here at the Lens Fest that are worth mentioning in the context of Snap Spectacles?

[00:49:22.409] Jesse McCulloch: Yeah, I think Commerce Kit for the monetization thing, the Snap Cloud, and then kind of the Snap OS 2.0 is kind of the big stuff that's happening on the Spectacle side.

[00:49:31.996] Kent Bye: OK. So we talked about each of those. And as you think around all the different lenses that have come out, all the different experiences that you've seen, some that may be released or not released, what are some of the different experiences that really stick out for you?

[00:49:45.051] Jesse McCulloch: Yeah, so we've got one called Spatial Tips that uses Gemini. It takes a camera frame, and you can look around the room and say, hey, label all the electronics in the room, and it'll do a pretty good job of it. It doesn't know what Spectacles are, so it doesn't label those, but if we're in this room here where we've got a TV and a little tablet, it can do that. It can also do things like, I've taken it to an espresso machine and said, hey, I need to make espresso, can you give me the steps to do that? And it'll give me step one, step two, step three, and label those all in space correctly. Or you can ask it to do post-it notes. So thinking about the future of what it means to have AI in our world with access to the sensors on these devices, and what it can start doing, that's super interesting to me. And then my favorite game is the snake game from Resolution. It's just very satisfying to move around that space. I'm really excited to see what devs are building. And then as we think about what it means to release a consumer version in the next year, figuring out the key experiences we want and starting to lead developers to build those is something I'm very excited about.
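
As a rough illustration of the pipeline described here, grabbing a camera frame, asking a multimodal model for labels, and pinning the answers in space, the following hedged TypeScript sketch shows the shape of such a flow. The CameraFrame, VisionModel, and WorldAnchor interfaces are hypothetical stand-ins, not the actual Spectacles or Gemini APIs.

```typescript
// Sketch of a "label the things around me" flow: capture a frame, send it to
// a multimodal model with a prompt, and anchor each returned label in space.
// Every interface below is a hypothetical placeholder for illustration only.

interface CameraFrame {
  imageBase64: string;
}

interface LabeledObject {
  name: string; // e.g. "TV", "espresso machine"
  u: number;    // normalized image x in [0, 1]
  v: number;    // normalized image y in [0, 1]
}

interface VisionModel {
  // Assumed to return objects with normalized image coordinates.
  describe(frame: CameraFrame, prompt: string): Promise<LabeledObject[]>;
}

interface WorldAnchor {
  // Assumed to project an image coordinate onto world geometry (e.g. via the
  // device's scene understanding) and place a floating text label there.
  placeLabel(u: number, v: number, text: string): void;
}

async function labelRoom(
  captureFrame: () => CameraFrame,
  model: VisionModel,
  anchors: WorldAnchor,
  prompt = "Label all the electronics you can see in this room."
): Promise<void> {
  const frame = captureFrame();
  const objects = await model.describe(frame, prompt);
  for (const obj of objects) {
    anchors.placeLabel(obj.u, obj.v, obj.name);
  }
}
```

The step-by-step espresso example would just change the prompt and have the model return an ordered list, with each step anchored near the relevant part of the machine.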

[00:50:55.022] Kent Bye: Yeah, I enjoyed playing through the latest Industrial Light & Magic experience, a Star Wars experience that seemed a lot more robust than some of the others in terms of the length of the story, using a lot of 2D pictures as holograms with some interactive parts. That was a lot of fun to play through, and it's new. There's also a new Avatar: The Last Airbender experience from Paramount coming out that was quick and bite-sized, with some fun mechanics where you're shooting things out of your hands. And the experience I found myself coming back to was one I think is called 3Dot, where you have a grid of spheres in different colors in front of you, and you just have to pinch and connect the dots. As it goes on, it gets more and more complicated, and it was taking up to seven or eight minutes to complete some of the levels, up to level 14 or 15; it just seemed to keep going and going. It was a very simple game, but I found myself getting into these flow states of pinching on a sphere in the near field and moving my hand around, and the hand tracking handled it really quite well. It was a clear mechanic, just a simple puzzle game that I used as a palate cleanser, jumping into a new session and spending anywhere from six to eight minutes playing through one of these games. It felt like there were some traditional games, and I saw a lot of exercise and fitness types of stuff. I'm not sure if that's going to be a focus in terms of people out and about trying to gamify their exercise routines; fitness has obviously been a huge part of wearables, from Fitbit to XR in general. I don't know if there are other genres you see, in terms of informational stuff like translation, or anything that's tapping into AI. When you start to think about the main genres, what types of experiences seem to work well on the device?

[00:52:49.542] Jesse McCulloch: Yeah, the translation lens is actually one that we've spent quite a bit of time on. It actually has two different modes. You can use it in a single-user mode where you're wearing the glasses, maybe you're in a foreign country, and it's just listening to the person you're looking at and translating into your language. So that's one use case for it. There's also a connected lens use case: you get two people wearing Spectacles, and now it's focusing the microphones on you and translating for the person in that connected session. That's more like I'm sitting down with somebody and trying to have a conversation with them, with both of us able to see that translation in as real time as any of these tools can be. So I think that's super interesting. But going back to 3Dot, the one you were playing, I love those really simple experiences that have just enough of a twist to feel a little bit novel. There's one that I don't think is in Lens Explorer anymore called Square Peg Round Hole, and it was just a storytelling mechanism. It had a really simple scene, and as you turned yourself in 360, there was this progression of scenes telling the story, so as you spin in a full circle, it completes the story. But if you think of that as a storytelling mechanism, there's no reason that as you come back around the 360, it can't keep going and hide the original scenes. Over time, maybe you get dizzy because you've spun so much trying to follow the story. But I just thought that was a really unique thing I hadn't seen before, one that uses the space around you without it being gamified or anything like that.
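
To make the two translation modes concrete, here is a small hedged sketch of how a lens might branch between them: solo mode captions whoever you are looking at in your own language, while connected mode captures your own speech and sends the translation to the other person in the session. The Translator, TranslationSession, and CaptionDisplay interfaces are hypothetical placeholders, not Snap's actual Connected Lenses API.

```typescript
// Two-mode translation sketch. "solo": translate the speaker in front of you
// for yourself. "connected": translate your own speech for the other person
// in a shared session. All interfaces are illustrative placeholders.

type Mode = "solo" | "connected";

interface Translator {
  translate(text: string, targetLang: string): Promise<string>;
}

interface TranslationSession {
  // Assumed broadcast to the other participants in a connected lens session.
  send(caption: string): void;
}

interface CaptionDisplay {
  show(caption: string): void;
}

async function onSpeechRecognized(
  mode: Mode,
  recognizedText: string,
  myLang: string,
  peerLang: string,
  translator: Translator,
  display: CaptionDisplay,
  session?: TranslationSession
): Promise<void> {
  if (mode === "solo") {
    // The mic is aimed at the person you're looking at; caption it for you.
    display.show(await translator.translate(recognizedText, myLang));
  } else if (session) {
    // The mic is focused on you; ship the translation to the peer's display.
    session.send(await translator.translate(recognizedText, peerLang));
  }
}
```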

[00:54:22.403] Kent Bye: Yeah, one of the simple ones that I really enjoyed was a shader one that was just projecting shaders onto the wall. It was really quite immersive to see those, though it also tended to make the device overheat fairly quickly, because it was using a lot of energy to render all those shaders. But that was one where it felt like it was painting the world, and you could really see the field of view in that one. There were some other ones that were more fully immersive, like going into an art gallery of paintings and entering into a world. Some of those might be a little better suited for fully immersive VR, where you're being transported; it seems like there's a sweet spot for AR where it's putting objects out in front of you, maybe at a little bit of distance, so that they fit within the field of view and don't break that sense of presence. So yeah, just finding the different sweet spots for where augmented reality is really working. But it was fun just to see all the different stuff that's available. And we'll both be on the jury later this afternoon to see all the different experiences where people are using the latest features, just to see what the next iteration is for where all this is going.

[00:55:28.550] Jesse McCulloch: Yeah, the one where the world gets painted, that's by DGNS, a developer who's actually here at Lens Fest, so I'm interested to see what their team comes up with. I think part of that is thinking of the aspiration of what these devices could be in the future, where you have a bigger field of view and can transform your world in real time. I love to see people build within the constraints and not highlight the field of view, but I also understand the desire to have a bigger field of view and those capabilities that currently are more in tune with what would be a VR experience but are put on the glasses for right now. And I don't know if you had a chance to go down to the arcade area downstairs and play the racing game, but that's one of those ones where, in theory, it shouldn't work very well in AR because of that field of view. But as I played it, you're looking where the road would be and having that field of view kind of follow you, and I was surprised at how well that translated into AR, even with the limited field of view.

[00:56:25.270] Kent Bye: Yeah, it may actually be a benefit, because having a fuller field of view can start to produce vection effects that give you more motion sickness. I didn't feel any motion sickness, even though it was a very portaled field of view as it overlays this racetrack. But having the haptic interaction of actually turning a wheel and having it project out into the world, I think that tight coupling between your actions and your agency gives even more of a sense of immersion into that world, even though there aren't a lot of other environmental presence cues, because it's still a transparent overlay onto the existing world. I did find my brain able to surrender into that plausibility in a surprising way; my haptic experiences of driving a car were so strong that that type of experience actually worked quite well, in a way that was really surprising.

[00:57:18.239] Jesse McCulloch: Yeah, I had that same experience. And even using features like darkening the lens, so you get just a slight bit more immersion than you do without it, I think helped with that as well.

[00:57:28.587] Kent Bye: Yeah, because there is a brightness and darkness control. So what's actually happening there to make it darker?

[00:57:34.246] Jesse McCulloch: Yeah, so we've got this capability of having the lens tint, using electrochromic features in the lens, and we've got a light sensor on the device. It's really meant so that if you go outside and the environment becomes bright, the lens darkens so you don't get that washout you see with a lot of devices when you use them outside, because we do want them to be usable outside, especially as we think towards consumer. This is not something people are going to only use inside. We also give developers control over that tint, and you as a user can tint it at any time. So this is one of those occasions where it just made sense for the experience to use that darkening feature, to kind of downplay that disparity of the field of view.
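
As a sketch of the auto-tint behavior described here, darkening with ambient light while letting a developer or user override win, the following minimal TypeScript example shows one plausible shape. The LightSensor and TintControl interfaces are hypothetical placeholders rather than the real SnapOS API.

```typescript
// Sketch: drive lens tint from ambient light unless an override is set.
// Interfaces are hypothetical placeholders, not the actual SnapOS API.

interface LightSensor {
  lux(): number; // current ambient illuminance
}

interface TintControl {
  setTint(level: number): void; // 0 = clear, 1 = fully darkened
}

function updateTint(
  sensor: LightSensor,
  tint: TintControl,
  override?: number // set by the developer or the user; wins when present
): void {
  if (override !== undefined) {
    tint.setTint(Math.min(1, Math.max(0, override)));
    return;
  }
  // Map indoor (~100 lux) to clear and bright daylight (~10,000 lux) to dark,
  // on a log scale so the transition feels gradual.
  const lux = Math.max(1, sensor.lux());
  const level = Math.min(1, Math.max(0, (Math.log10(lux) - 2) / 2));
  tint.setTint(level);
}
```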

[00:58:18.686] Kent Bye: Yeah, the glasses do fit over my existing glasses so I could wear them that way, but they sit kind of low. You did give me a prescription insert; it was negative five, and I have negative 5.5. I found it was easy for me to see the text, and it was clear, but when I was just walking around looking at stuff in the far field, it was still a little bit blurry. I wouldn't necessarily want to walk around the wide world with these on, just because the prescription doesn't match my own lenses. I know the Meta Ray-Ban Display glasses had inserts from about negative four to positive four, which is a little too much of a gap for my negative 5.5. But I'm curious to hear if negative five is the limit, or if you expect to expand that as you move closer to launch.

[00:59:03.632] Jesse McCulloch: Yeah. So when we're out doing hackathons and things like that, we have these kits that go from about positive one and a half to negative five, in roughly half-diopter steps or whatever it is. But we have a partner that actually creates those lenses, so you can send them your actual prescription and they build a lens that matches it. We just don't carry everything in between. And for some people, one eye is one prescription and the other eye is another, and in our demo kits we don't accommodate that. But our partner that makes these lenses actually can, so you get a custom prescription lens that's made for you.

[00:59:38.857] Kent Bye: OK. Yeah, that's great to hear. One of the other things I noticed after wearing these for many hours was that they did seem to get a little heavy; it was kind of like they were smashing down on my ears. Again, these are more of a dev kit, and I assume as you move closer to launch you'll probably be doing some refactoring. But I'm curious to hear about some of those trade-offs, like the weight and how heavy it can be weighing down on your face. When I was doing short demos before, I never really noticed it, but wearing them for hours and hours on end, you start to feel the effects of the weight pressing down on your ears.

[01:00:13.720] Jesse McCulloch: Yeah, I mean, without talking about what's coming in the future, the weight is something we know has got to come down as we move into consumer and beyond. So something we're always thinking about is what those components are and what those trade-offs are. And as you mentioned, generally a lot of the experiences are shorter, so it's something you may not notice until you're wearing them all day. Initially, I think the glasses will probably be something that isn't an all-day wearable anyway; you put them on for the specific things you want to do, then you take them back off and charge them or whatnot. So I think we're moving in the right direction on that. We're just not quite there yet.

[01:00:53.134] Kent Bye: And I happened to have a portable battery with me. The battery life was fairly short when I didn't have it tethered, but I was able to just tether it and use it all day. So at least if you have your own battery, you can plug it in and charge it. I know with the Meta Ray-Ban Display glasses you have to charge them in the case, so you'd have to take them off to charge. But at least in this iteration, where there's the ability to plug in directly, it could potentially become more of an all-day wearable because you can charge it as you go.

[01:01:22.179] Jesse McCulloch: Yeah, definitely. And that was one of the things we wanted to give developers, knowing that when you're developing, a lot of times you're wearing them for longer periods of time. So having USB-C fast charge means you can either charge them up really quickly or tether and have that battery pack tucked in your back pocket. We never wanted to make that a required part of the device, like some of the other devices where you have a puck or a battery that you carry around, but we do give developers that option if they want it.

[01:01:49.534] Kent Bye: There were a number of applications that were actually using your phone as a controller, or a new yoga app where you put the phone on your body as a body tracker. So there do seem to be ways of using the phone as an extended input device, and also for connectivity. I mean, the glasses connect directly to Wi-Fi, but to set them up you use the app. And if you're out and about on 5G, I'm assuming you'd be using the data plan that's on your phone rather than having a separate phone plan for this type of device.

[01:02:21.563] Jesse McCulloch: Yeah. I mean, right now the phone is definitely a companion app. Developers have the opportunity to use it as a controller, and you can also do things like spectator mode, where you get kind of a third-person view; most of the lenses work with that. And as part of Snap OS 2.0, we've actually got a Mobile Kit that lets you, if you're an iPhone developer or an Android developer and you want a more closely tethered experience, use those APIs to build something into your own phone app that can talk to the Spectacles. That's something we've just released as part of Snap OS 2.0. So we do definitely see it as a companion, and potentially as your mobile Wi-Fi connection or anything like that going forward.
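
For a sense of what a phone-as-companion integration might look like in practice, here is a hedged sketch of a phone app streaming controller input to a lens over a bidirectional channel. The SpectaclesChannel interface and the message format are hypothetical; Snap's actual Mobile Kit APIs may look quite different.

```typescript
// Sketch: a phone app acting as a controller for a lens running on glasses.
// The channel abstraction and message format are hypothetical placeholders,
// not the real Mobile Kit API.

interface ControllerMessage {
  type: "tilt" | "tap";
  x?: number; // normalized tilt, -1..1
  y?: number;
}

interface SpectaclesChannel {
  send(message: ControllerMessage): void;
  onMessage(handler: (message: ControllerMessage) => void): void;
}

// Phone side: forward device tilt and taps to the lens.
function runPhoneController(channel: SpectaclesChannel): void {
  // In a real app this would come from the phone's motion and touch APIs.
  const tilt = { x: 0.2, y: -0.1 };
  channel.send({ type: "tilt", ...tilt });
  channel.send({ type: "tap" });
}

// Lens side: react to controller input.
function runLensReceiver(channel: SpectaclesChannel): void {
  channel.onMessage((msg) => {
    if (msg.type === "tilt") {
      // e.g. steer a vehicle or move a cursor by (msg.x, msg.y)
      console.log(`tilt input: ${msg.x}, ${msg.y}`);
    } else if (msg.type === "tap") {
      // e.g. fire, select, or confirm
      console.log("tap input");
    }
  });
}
```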

[01:03:04.110] Kent Bye: And I was able to record some different sessions. The recordings actually have a much wider field of view than what I was seeing, but they capture a sense of what I was experiencing. There's also a new gallery app that I think just launched recently, where you can start to look back at the different recordings you've made.

[01:03:20.410] Jesse McCulloch: Yeah, so the new gallery lets you review them on device and then pull them off, and then, like you said, you've got the option to pull them down onto your phone. And they do do compositing, so you do get a wider field of view than you're actually experiencing. That's really good for showing off what the experience is to a degree, with the caveat that the experience on the device is a little bit different because of that field of view.

[01:03:48.277] Kent Bye: Nice. And as we start to wrap up, I'd love to hear what you think the ultimate potential of XR and augmented reality, and wearable devices like the Spectacles, might be, and what that might be able to enable.

[01:04:00.661] Jesse McCulloch: Yeah. I mean, as I look into the future of wearables and glasses and AI and everything that seems to be coming to fruition or heading in that direction, I would love to see a day where your glasses are always context-aware of what's going on around you. With a phone, one of the things that irritates me is that I have to think: what do I want to do, what app do I use to do that, now I go open that app. I would love to get to a world where there's enough context about the world around you that it can almost anticipate what you might want to do and pop up the experiences or tools that are useful in that context. So, say, throughout my week I've been putting together a grocery list, and now I go to the grocery store and it knows by GPS that I'm at the grocery store and says, oh, here's your grocery list. And as I grab things off the shelf, it's aware of what they are and checks them off the list. This whole connected world of information. We've probably got a long way to actually get there, and it's not as easy as I make it sound, but I'd love to see that world where everything is more contextually aware. The flip side of that is what it means for privacy: both my own privacy, what I'm giving to all these services and whether I trust them, and also the privacy of the people around me. So I think there are still a lot of conversations to be had around that, and I know that's something you've focused on for a long, long time.

[01:05:36.963] Kent Bye: Yeah, and just one follow-on question, looking at what Snap's doing versus what Meta's been doing with their ecosystem and what Apple's been doing. At Meta Connect, their developer conference bringing all their developers together, Meta was launching the Meta Ray-Ban Display glasses, but it wasn't until the end of the developer keynote on the second day that they said you could potentially start to develop and deploy things for the AI glasses that don't even have displays yet. So they weren't really leading with developer kits or the third-party developer ecosystem; it was almost an afterthought at the end of the developer keynote, and a "coming soon" at that, and we'll see if developers ever actually get access to the display glasses that were launching a month or so after that keynote. In contrast, Snap has had this really robust lens developer community. You've had these Lensathons, really engaging your developers with the cutting edge of new capabilities, and you're continuing to rapidly iterate. The Spectacles have been out for a year now, and you're continuing to develop them in parallel and in collaboration with the broader third-party developer ecosystem. As the community manager, I'd love to hear some of your reflections on the emphasis on third-party developers here at Snap and why that's so important.

[01:06:53.925] Jesse McCulloch: Yeah, I mean, I think an ecosystem lives or dies by what experiences are available. So bringing developers on early to resolve the pain points they're feeling as they develop for a platform is super important, and something I'm a big fan of. The more transparent we can be with our developer audience, to the degree that any business can be, the more it helps us gain trust with developers. And as they trust you, they bring their feedback to you in a more real way, and then we can act on that and build out this ecosystem for developers so they can go build out experiences for consumers. I think that's a really big part of it. We just don't have the bandwidth to build all of this stuff ourselves or to understand all the different things a developer might want to do; there are so many more developers out there than there are employees at Snap working on this stuff. So it's also a scale move, right? How do we scale this to more and more people over time? I've had the incredible opportunity of traveling worldwide over the last year to bring developers into the fold, give them experiences on Spectacles, give some of them Spectacles, and talk individually to the people who are subscribing, actually paying money to be part of this program, which is incredible: there are enough people out there who believe in this enough that they're paying a monthly fee to have access to these devices. So I've gotten to do hackathons and events around the world. I got to travel to Brussels for AWE Europe, we've done hackathons in London, and I've talked to user groups in Amsterdam and all over Europe. And there's just this huge desire from people in markets that we aren't in yet who would love to have more access to these things. It really fills me with a lot of hope that there's so much hunger for this technology out there and so many people who want to build for it, and it's just a matter of how fast we can scale out to all those different audiences.

[01:08:53.157] Kent Bye: Nice. And one other follow-on, in terms of the social dynamics. Is it a feature called iConnect, where you can automatically get into other sessions? I think the social dynamics that are possible with these connected lenses and shared experiences are really powerful. I know last year when they launched the Snap Spectacles, there was the Es Devlin kind of ritual experience that was amazing, allowing people to have a shared augmented reality experience. So I think that's a really powerful feature. A lot of the demos I was doing were more single-player, but maybe you could elaborate on this mechanism: if people do happen to be in an LBE context, or otherwise happen to have the lenses together, how can they quickly get into a shared experience?

[01:09:35.935] Jesse McCulloch: Yeah, so there are a few different ways. When Connected Lenses first started, one person would start the session and have to look around the room and kind of map it. Then somebody else would join, and they'd have to go stand roughly where that first person was and look around the room until the two devices had a shared understanding of the world, and then you'd be joined into a session. Over time, that was a bit onerous for some of the LBE experiences. So NCLU, for example, is using this where they go in and scan a location ahead of time, pull that scan into their actual application, and then we allow them to hard-code what that session ID is. The two devices automatically know they're in the same session together, and then they just have to localize on the mesh that's already been pre-scanned. And then the iConnect thing that we just launched is: I put on a set of glasses, you put on a set of glasses, we join the experience, we look at each other, and it's able to determine where we are in space relative to each other. That's how we do the localization and get that shared coordinate system. So we're just iterating on how we make this a more seamless experience over time, for our users and for our developers.
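
To sketch the LBE-style flow described here, a pre-scanned space plus a hard-coded session ID so every headset lands in the same shared coordinate frame, here is a hedged TypeScript example. The SharedSession and LocationMesh interfaces, and the VENUE_SESSION_ID constant, are hypothetical placeholders, not Snap's Connected Lenses API.

```typescript
// Sketch of joining a shared AR session against a pre-scanned location.
// 1. Every device joins the same hard-coded session ID.
// 2. Each device localizes against the pre-scanned mesh to get its pose
//    in the shared coordinate system.
// All interfaces are hypothetical placeholders for illustration.

type Pose = {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
};

interface LocationMesh {
  // Assumed to match live camera observations against the stored scan and
  // return this device's pose in the scan's coordinate frame, or null if
  // localization hasn't converged yet.
  localize(): Pose | null;
}

interface SharedSession {
  join(sessionId: string): Promise<void>;
  setLocalPose(pose: Pose): void;
}

const VENUE_SESSION_ID = "lbe-venue-room-a"; // hard-coded per installation (hypothetical)

async function joinVenueExperience(
  session: SharedSession,
  mesh: LocationMesh
): Promise<void> {
  await session.join(VENUE_SESSION_ID);

  // Poll until we localize against the pre-scanned mesh; in a real lens this
  // would likely be event-driven rather than a timer.
  const timer = setInterval(() => {
    const pose = mesh.localize();
    if (pose) {
      session.setLocalPose(pose);
      clearInterval(timer);
    }
  }, 500);
}
```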

[01:10:47.349] Kent Bye: Nice. Yeah. Looking forward to trying that out as well. Awesome. Well, is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[01:10:54.814] Jesse McCulloch: Just to everybody out there who is building in this industry at all, it's so early days, even though we've talked about being in this for 10, 11, 12 years, me being in it for almost 10. There's still so much room for more people to join this ecosystem and this industry in general. I just want to personally welcome anybody who wants to come play in this area. And if you're interested in Snap Spectacles and our community, feel free to reach out to me on LinkedIn, Twitter or X or whatever we're calling it today. I'm available and I love talking to people and bringing them in. So super happy to have that opportunity.

[01:11:33.380] Kent Bye: Awesome. Well, Jesse, thanks so much for joining me here on the podcast to give a bit of a breakdown of all the new features coming to the Snap Spectacles as you head towards the consumer launch next year in 2026. It's great to be able to see all the different lenses that have been developed over the past year, and I'm very curious to see what the developers here doing the Lensathon are going to build; we'll be judging in about an hour or so, seeing all the different experiences and catching up with the rest of the community. The level of access to a device like this is also great: you can pay a monthly fee rather than paying thousands or even hundreds of dollars up front, so it's something you subscribe to and get access to over time. I'm curious to see how this robust developer ecosystem continues to evolve and grow as you move towards the consumer launch. We've already started to see some of the cutting-edge AR experiences folks are doing, and I'm excited to see even more here at LensFest and to see where it all goes in the future. So thanks again for joining me here on the podcast to help break it all down.

[01:12:33.506] Jesse McCulloch: Thanks for having me.

[01:12:34.892] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
