#1671: Reflecting on Snap’s AR Platform & Developer Tools Past and Future with Terek Judi

At Snap’s developer conference, Lens Fest, I did an interview with Terek Judi, who is working on the Spectacles product at Snap, focusing on SnapOS, the platform, and developer tools. See more context in the rough transcript below, and if you’d like to check out the two interviews with Matt Hargett that I reference in the intro, then be sure to check out episode #1311 and episode #1660.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my coverage of the Snap developer conference of Lens Fest, today's episode is with Terek Judi, who is on the product team at Snap for Spectacles, working on SnapOS, the platform, as well as developer tools. So one of the things I was really impressed with with Snap was that they're creating their own operating system and creating their own sort of game engine with Lens Studio to be able to develop these augmented reality applications for a device that's very resource constrained. So because of that, they've had to develop their own pipeline. It's a proprietary pipeline at this point, but they did open up the possibility of developing WebXR applications with their own WebKit-enabled web browser, which we talk a little bit about here. And there's a session that goes into a little bit more detail. But in my two conversations with Matt Hargett, he talks about how deploying open standards XR applications to WebXR is sort of a frustrating experience, just because the browser vendors are so different that you can't actually have true cross-platform compatibility. But if you start to use something like either React Native or NativeScript, these are emerging ways of being able to develop cross-platform applications and to be able to use open standards and to deploy them across different headsets. So in my conversation with Matt Hargett in episode 1311 from 2023 for Meta Connect, we start to map out the problems with relying upon the web browser at this point for deploying enterprise-grade XR applications. And then in episode 1660 that I just did at Meta Connect 2025, we do a bit of an update just to hear how he's pivoted away from React Native and is moving more towards NativeScript as a way to create these self-contained applications that could be cross-platform. So that's something that I would love to see: if Snap is interested in Lens Studio not being just a proprietary solution, but having a way to export to open formats to be able to deploy on these different devices. That said, they're developing everything from scratch, and they're doing everything they can just to develop something that's going to work for their own solution. So it's going to be, I think, low priority for them to create something that's going to also work on all these other headsets that have all these other applications. They've also started to integrate all these other tightly integrated things, like the Snap Cloud, that may not be easily decoupled from the types of proprietary things that they're doing, things that would only work within the context of their headset and their device. So anyway, what I was really impressed with, seeing what Snap's been able to do over the last year, is that they've pushed out eight different updates to their operating system while working on the Lens Studio and all these different API updates. And so they're trying to create a really developer-centric platform. The one caveat, I'd say, is that they're still building things from scratch, so it's not as fully featured as it will be once they get closer to their full release for the consumer launch.
But as they go along, they've been able to build and flesh out what their vision is for all the things that they want for their platform. It's also an opportunity to take a blank slate from what the XR industry has seen for the last decade plus with Unity and Unreal Engine and to create something that's of the core essence of what types of applications and experiences they want to have on augmented reality. They're launching Snap Cloud, which is like a whole Supabase backend to do more sophisticated applications that have persistence and are also able to call edge functions. And so being able to more seamlessly integrate artificial intelligence types of applications, that was a big focus of some of the different hackathon experiences that were developed over the 25 hours in the couple of days leading up to Lens Fest with the Lensathon. And I was a judge on that, and I'll have some more coverage of the top three teams, as well as some of my other thoughts on the different types of applications that are starting to be developed within the context of the Snap ecosystem. I'd say right now, Snap is at this really interesting point where they're a bit of a dark horse in the AR industry. They're not even in the Fortune 500, and some of the biggest companies in the world are in this space building XR devices. With Meta, they have the VR end, but they're also doing mixed reality, and they're moving more into these AI glasses that don't have any sort of displays, and then they just launched their display glasses. And so they're kind of moving from the bottom up. And then other companies are looking at it, you know, like Apple with Apple Vision Pro and then Google with Android XR, which is just launching on the Galaxy XR this week with Samsung as their first OEM partner. But they're coming more from the top down with these fully featured virtual reality devices. Snap is sort of right in the middle, and they're just saying, we're going straight for the AR glasses and we're not trying to go through any sort of intermediary steps. And they're trying to create a whole stack and platform tailored for this very specific use case. Whether it's going to be a successful consumer launch, or if it's going to be mostly used for enterprise applications or more B2B applications, that's still yet to be determined, but they're aiming for a consumer launch. And so they have a wide range of different aspirations for everything that they need to get ready for next year's launch. And it's been super impressive how far they've come over the past year. And so I can only imagine all the other aspirations that they have to bring everything that they need on the software side to create fully enabled augmented reality experiences that are going to be made for the Snap Specs, which are going to be launching sometime next year. If I were to guess, probably sometime in the third or fourth quarter of next year, probably sometime after Labor Day is what I would imagine, which is when most of the new XR devices have traditionally been launched leading up to the holiday season. So in a lot of ways, Snap has been a feedback-driven culture. They've been kind of running their AR glasses program more like a startup, quickly iterating and really listening to the developer and customer feedback and trying to adapt to some of the different needs that they're hearing from their developer ecosystem.
But also, at the same time, having a clear vision for where they want to take the technology, and so having a lot of strong opinions for what types of software and services they want to provide. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Terek happened on Thursday, October 16th, 2025 at the Snap developer conference of Lens Fest, happening at the Snap headquarters in Santa Monica, California. So with that, let's go ahead and dive right in.

[00:06:29.073] Terek Judi: Hi, my name is Terek. I focus on Spectacles, mostly looking after SnapOS, our platform and developer tools, as well as the experiences that we're trying to bring to the broader audience for Spectacles. Thanks.

[00:06:42.022] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into the space.

[00:06:45.725] Terek Judi: Yeah, that's cool. I started 2018, if I remember correctly, or 17, sorry, 2017. It's kind of like the years are blurring now, with a Canadian startup called North, where we worked on a pair of smart glasses that did all sorts of things, like calendar, notifications, Alexa. That was all the rage back then for assistants and stuff like that, and used a ring to control the experience. You had turn-by-turn navigation, things like that. Spent several years working on that there, then transitioned to another company afterwards that was also looking at AR glasses, and about five years ago, I transitioned to Snap, prior to the launch of the last AR Spectacles, the first pair of AR display Spectacles that we launched, and been here ever since then. So it's been a while. I'd say it's like almost coming up to 10 years, so...

[00:07:35.683] Kent Bye: What is it about augmented reality that really drew you to that technology to really dedicate the past seven, eight years of your career to that?

[00:07:44.672] Terek Judi: That's a really good question. So I'm expecting a child shortly. So I used to tell this as a hypothetical, but now it's not anymore. I had a mentor maybe more than 10 years ago. We had a discussion, and one of the things that he mentioned was that he wants to work on things that he can kind of point to and tell his kid, like, this is what I spent or dedicated my time on. His biggest struggle at the time was that he worked on a bunch of stuff where maybe it wasn't obvious for a kid to say, like, oh, you know, this is cool or whatever. At the end of the day, technology evolves, things transition, you know. No matter how amazing something is, eventually something else comes and takes over. So for me, that source of motivation was just being able to point at something and tell my kid that that's what I dedicated my time to. And sort of related to that, I think that when you think about a screen, the phone screen, if you just look at it sort of spatially, it's kind of a very small screen, and yet its impact on society is profound. It literally impacts everything we do. And I think I was driven by this idea that if you were to take that and actually expand it to the world around us and give it almost another dimension, I mean, literally giving it another dimension here, the possibilities would be kind of endless. And I think we might not know exactly what it looks like, but I think we can all appreciate that it's going to be quite profound. And I think that now we have this AI thing that happened. AR glasses were always about augmenting the world, and another part about understanding the world. Finally, we actually have the technological means to understand the world. And related to that, I think there's this idea that people have a fear that AI will kind of leave them behind. This is kind of my personal belief a little bit, obviously, that if you think about a future where this AI isn't centralized and controlled by a central entity of some sort, but rather distributed and a part of you, and augments your experience of the world, it's almost like everyone gets an upgrade. That has probably more recently become a bit of a firmer belief here, that this would be an equalizer that gives everyone an even, level playing field as these AI advances take shape, because they're able to augment your senses, you're able to see more, able to hear more. Being in a glasses form factor is going to give you that ability to interact with the world. So that's kind of what I guess gets me going right now a little bit. It's just, I think that this equalizer would be very exciting, because I think there's a lot of uncertainty that the world feels about what AI is going to do to how we live. But if we think about it as just upgrading us all and augmenting what we can do in a way that still preserves our own unique attributes and ways, I think that that currently excites me a lot.

[00:10:30.461] Kent Bye: What was it about coming here to Snap? Because, you know, there's the 10-year anniversary here at Lens Fest, showing 10 years of augmented reality lenses that have been mostly on the Snapchat platform. But as they start to move from this Snapchat platform onto the Snap Spectacles with these head-mounted displays of wearable technology with AR, something that you love and have been involved with for a number of years with North, and seeing this vision of what could be possible when you start to add the next paradigm of computing on your face, these face computers, or being able to sense the world and detect it. What was it around what you saw in Snap before they had even announced publicly their long-term vision for where they wanted to take it? Why did you come to Snap, and what did you see here to see this as the best place to do your work and continue the work that you've already been doing?

[00:11:20.404] Terek Judi: That's an excellent question. So, you know, I think at probably what would have been the midway point of my time in this industry, I kind of internalized a couple of things, I'd say, for what it would take. Because you're constantly thinking about, like, how are you going to build a product that fits into people's lives? Two things I've internalized prior to joining Snap that were kind of very important, or I thought were very important. Number one is the social aspect of it. And then the second is what I'd call the platform foundations for this transition. When I considered joining Snap, they just had the capture glasses. There was no mention of any display-based product or anything like that. So what caught my attention is actually Lens Studio. Having spent so much of my time trying to think about what would enable a true platform to take shape on this next thing, just looking at Lens Studio, I think what I immediately recognized was the foundation. It was an AR-native development environment that allowed people to build things in sort of what I would maybe call the native language of AR, which is 3D. That was basically the thing. I'm like, you know what, there's glasses and there's this thing, and I'm like, okay, what's the probability that the team is working on, or could be working on, a pair of glasses? I think I kind of sensed the vision without actually having seen it, and then literally reached out to say, hey, this is where I think I can contribute or continue what has been the same sort of trajectory, to come to Snap and actually see that. And I think looking back five years later now, Lens Studio has evolved so much. Lenses, I can probably say now, feel like the format for spatial applications. We've kind of evolved much further beyond the original intent of when we had face lenses, but because the foundations were so strong. Which I think is the challenge with a lot of mobile-based platforms trying to jump into the medium: the foundation is this 2D, so you kind of bring 2D-ness directly into your experiences, which I think is a challenge, because how do you not disadvantage yourself by bringing the things that maybe we're missing in the last platform into this new paradigm? So I think that that's pretty exciting. And now, you know, seeing it all come to life with the new Snap Cloud product that we just announced, which is kind of a testament to the complexity and the utility and the value that these lenses are starting to provide, that they need this persistence layer to bring in user value day after day and maintain this continuity of the experience. It feels full circle with the SnapOS 2.0 release. And I wish I could talk more about SnapOS 3.0. There's going to be even more that delivers on this mission, and I think it'll be quite obvious that lenses are the premier spatial format for creating spatial applications.

[00:14:01.843] Kent Bye: Yeah, I was able to attend the Lens Fest and Snap Partner Summit last year when the Spectacles were announced. And what I was really impressed with was just to see the robustness of the developer community. And over the past year, I've also just been really impressed with the degree to which Snap has been listening and engaging and being in relationship with those developers, and to have, as Jesse told me, around eight different releases of the operating system. So a pretty aggressive cadence of updates and releases to continually improve what's possible for the Spectacles platform, the Lens Studio, and SnapOS 2.0. You've been able to achieve a lot over the last year. Just curious, as you reflect on all the different things, just curious how you start to tell the story of what you've been able to accomplish, but also this kind of rapid iteration of consistent delivery of improvements to the software.

[00:14:51.055] Terek Judi: Yeah, I think this is, for me personally, this was an extremely important thing that, you know, I've kind of internalized from just seeing how our co-founders, Bobby and Evan, talk about the importance of the developer ecosystem and all that. So when we launched Spectacles, this was before we had, you know, kind of started building out the Spectacles community role and all that kind of stuff. And we were just launching, and I do remember just making sure during evenings and weekends to review every single application, thousands of them, and getting to understand who are the people that are applying, sometimes even reaching out, engaging, and understanding what their needs are. We also made a decision to start from scratch with a subreddit, the Spectacles subreddit. We've had other channels before, but we felt like we wanted this notion of a community as open. Communities are open, they have open doors. So we felt like we wanted to make a home for our community that felt open, like you don't need permission, and up to your comfort level, how anonymous or how open you want to be. So with the Reddit, I think what was exciting is just that it felt open. You just go and you're able to see and you're able to engage. And it kind of really took off with the team, and almost everyone on the team is constantly either sharing something from the Reddit or answering questions and all that. So I think there's been a bunch of things that we took from the beginning that we wanted to be very, very mindful of. You know, some might call it the things that don't scale, which is, for example, getting to know who every single applicant is and what their motivation is and what are their challenges and why they're coming in the first place, and continuing to create a space where you can kind of sense their problems, the things that they're having challenges with, what's blocking them. So I felt that that obviously took quite a bit to build as a solid foundation, but it's palpable. You can feel it now in the community. And on the other flip side, we knew there was a lot of things we couldn't get done when we launched SnapOS 1.0. And what I personally think is important is that you're able to show that you're listening to your community. And, you know, so many times somebody would go ask for something on the Reddit, and then two weeks later it's released, and they're like, oh, great. It's not just about what you deliver, but it's about showing that commitment time and time again, that you're basically listening, and then you're providing regular updates that address that and evolve the platform over time. So this is all to say that when we started this, this is something that we've been very keen on: establishing a genuine community of sort of what I would call AR developers. For some of them, it's the first time they're experiencing Lens Studio and Spectacles. And showing your dedication and your commitment, and continuously trying to make sure that you're evolving in a way to get to what I just described, which is, you know, that true spatial. And I think if you look at Spectacles, I wouldn't be exaggerating, I would say it's the most robust platform for developing specifically for AR glasses with see-through. I think that that hopefully will continue, and our commitment is to continue that into next year.

[00:17:59.676] Kent Bye: Yeah, I have been covering the industry for the past 11 and a half years. What I've noticed is that everything has been primarily oriented and orbiting around these game engines of Unity and Unreal Engine, where you have kind of a game development paradigm. You have this cross-platform capability, which is amazing, but yet you have the need to load all the assets, and you have these gigabytes of files. And so when you start to think around miniaturizing and optimizing all these different applications to the point where you could fit it onto something like glasses, you have very aggressive size constraints of, I think, around 25 megabytes or so for a lens, which is very minimal for most developers that are developing Unity apps or are used to having gigabytes worth of space to be able to do stuff. And so what I was really struck by is, with the Supabase integration now, being able to dynamically pull in different applications, but also just moving beyond the game engine logic of Unreal Engine and Unity and moving into more web application-based logic, where you're pulling in different software and services, where you're able to, over the course of 24 hours, build some utility that is pulling in from these other coding and design disciplines that go above and beyond the game development community. So just curious to hear some of your reflections on the future trajectory of where you see this continued fusion of pulling in from other programming traditions beyond just game development and more of the web-based communities, and how you see Lens Studio as kind of the center point for starting to pull all these things together.

[00:19:33.937] Terek Judi: It's a very, very excellent question. I think, for example, when a lot of people look at existing mobile platforms, they'd say, for example, if you're coming from some other existing mobile platform, there's going to be this inherent advantage that you can take whatever it is that's on that platform, whether it's a gaming or mobile platform, and take that advantage of an existing group of people that have built an existing application and just kind of put it in this new form factor. Our approach has always been to understand what is needed to realize a true, lightweight, ultra-powerful, wearable computing platform with see-through lenses. That's basically the vision that we've been working towards. And the interesting thing that happens is, as I mentioned earlier, our DNA, or the DNA of lenses, started from filters: super lightweight, sort of self-expression-oriented experiences. And the trajectory was to get to something that resembles what I would call spatial applications. So our trajectory has been more like thoughtfully asking the question, what is the next incremental thing that we need to add to lenses in order to make them the true platform for spatial experiences, versus coming from these heavier things that you just described? Because taking things away is really, really hard. It's the proactive nature of, oh, how do I take an existing paradigm, whether it's for mobile or for games, for example, and kind of bring it back to this thing that is probably somewhere in the middle. The size question is obviously something that's very intentional, because we want to get to this notion of: you experience AR experiences right away. You can begin to think about just what we talked about, this contextual idea and all that kind of stuff. Being able to load the particular application in the situation super fast is kind of an important thing. And I think it's going to become more important as we get better at understanding the world, better at being able to know what's going on around you and serve that. So that is, in my mind, why what others might see as a disadvantage is actually quite a strong advantage: we're able to evolve to meet the needs, versus trying to figure out a way to convince people that they don't need multi-gigabyte experiences to actually deliver the value that users want. So hopefully that answers your question. I think in our minds we have a very clear framework for thinking about it. It's just additive versus subtractive, which I think all the existing mobile platforms are gonna have, you know, a bit of a challenge with, in my mind.

[00:21:51.909] Kent Bye: Yeah, well, it's curious to see that on the hardware side you have the OEMs and all the stuff that a lot of the other industry players are also doing. But you also have the operating system layer, and then what I guess would be the equivalent of the game engine: rather than Unity or Unreal Engine, you have Lens Studio, which is essentially becoming this kind of proto game engine that, slowly as time goes on, is adding more and more game development type features that you would typically get in something like Unity or Unreal Engine. And so you have both the operating system level of SnapOS 2.0, but also the Lens Studio. Just curious to hear how, over the past year, you've continued to iterate and develop these software platforms that seem to be feeding into the overall ecosystem of Snap, because with Lens Studio, you can deploy not only to Spectacles, but you can also deploy to Snapchat. So it seems like the XR technologies are kind of infusing and directing where some of the future design directions are going. But I'm just curious to hear some of your reflections on what you've been able to do over the last year on this vision of building out that operating system level, as well as the Lens Studio to be able to do the application layers.

[00:22:59.918] Terek Judi: Yeah, I think it's important. I think sometimes people think that things kind of coincidentally click, which I can say is not the case. I think Evan has had a specific vision about the world that he would like to see. This is not something that was born out of nothing; it's been way before my time, like 10 years in the making. So when you have such a strong vision of the world that you want to create, it makes it much easier to see how the different pieces fit. You know, sometimes you might have to take certain detours, but I think the general trajectory becomes, in retrospect, somewhat obvious. But I think that's clearly the whole point of having a vision, that the pieces come together and you're able to put them together. So I think that that's probably the way that we think about a lot of these things: where are we going, and how do these pieces come in? So for example, you know, one of the biggest things that we announced in addition to SnapOS 2.0 is the new Snap Cloud. There's a lot of intentionality in thinking about what do we want to achieve with something like Snap Cloud. This is just the beginning of the Snap Cloud journey. There's just a ton more work to do, and we have a lot of things I wish I could share, but I can't. But what I could say is that our vision is to figure out how to add that additional piece in there that will allow for AR to become a daily utility, something that you come back to. And then there's another thing that we're also trying to think about, which is, with AI becoming such an undeniable force in society, in my mind, I think there is an idea that we will see that AI's true power will become so undeniable when it comes to its ability to augment the world in front of you, versus giving you a paragraph to read, whatever it is that happens today. So things like how you persist data, what data you give access to whom, and how do you protect the user's privacy is going to become a very important topic. And that's one part that I'm quite excited about, because it complements that critical part that we already have, which is, okay, we feel pretty good that lenses are undeniably the spatial application platform, or spatial application format, one of the leading ones, at least, that we could rely on. Now, what other pieces are still needed so that you can create that ecosystem? I think maybe in 2017, the AR cloud was all the rage. People were like, oh, there's going to be this thing, and there's going to be a lot of things everywhere, and you're gonna be able to pull them in and experience this full immersive experience of the world. Kind of like, I think, the Pokemon Go type of time, people kind of got a glimpse. To me personally, that was one of the very few moments when you're like, oh my God, you've got global-scale behavior change. I think anyone who saw that, obviously it was short-lived and maybe settled over time, but there's something about large masses of people changing behavior that precedes every transition in technology. Whether it was an aberration or foreshadowing of what's to come, to me that was that moment. I just felt millions of people around the world changing their behavior, going outside, hanging around parks.
I remember walking by this church in my hometown, and I've never seen anyone outside of that church, and I was like, what's going on? And they were all hanging out, obviously after hours. And if you go during, you know, normal hours, there'll be plenty of people in church. But it was just a bunch of local people just kind of trying to catch the next Pokemon. I think that in so many ways, at least for me, foreshadows the future of what will be possible when these pieces of the technology come together. So a lot of what we've been doing in the last year is just thinking about, how do they all come together? What are the missing pieces? And I think next year there's even more stuff that will come with the consumer launch that will begin to complete the puzzle, so to speak, of why this is a fully featured AR ecosystem. And as we could see with other competitors, it's really hard to launch a product with an ecosystem that actually feels somewhat complete. A lot of people might just opt to create a set of experiences. To build one experience is actually great. To build the platform that builds that experience plus 10 more readily, I'd argue is like 10 times harder.

[00:27:04.517] Kent Bye: Yeah, I wanted to ask around the browser that's launching because there's WebXR capabilities that are coming. And I've heard a couple people say this was built from scratch. Normally, when I hear that, I was like, oh, it's usually either WebKit or Chromium that they're kind of building upon. So just curious if you're using either WebKit or Chromium or if you're literally building a browser from the ground up.

[00:27:24.525] Terek Judi: It is definitely, I'd say, as you've seen with our example with the cloud, basically, sometimes some things you do need to invent from scratch. Some things are just, like, browsing the internet is browsing the internet. Our emphasis is always on what are the things that we care about that we can bring into this platform. So in that context, things like, you know, thermals, and how you actually make sure that you get reasonable runtime, how do you support all the different types of experiences that are meant to be. So I think in Duresh's talk that we just had, we actually shared that we're using the WebKit engine as the foundation for it. But there's a ton around it that we're building as well.

[00:27:59.621] Kent Bye: Yeah, and I was bearing witness to the hackathon. I was a judge, got to see a bunch of the experiences. One of the things for Spectacles developers was to start to use the new capabilities made available with Supabase. And so maybe you could just describe some of these new capabilities of the backend, with the database, Postgres, and some of the other different things that are now made available for developers, and why that was such a key part of the announcement. Maybe just give me a bit of a sense as to what is Supabase, and why is it so exciting for developers to now have capabilities for easily connecting to this backend to have more persistent experiences?

[00:28:37.717] Terek Judi: Yeah, yeah, no, that's an excellent question. I think, as we just discussed earlier, right, we launched Spectacles, the last version of Spectacles, in 2024 with the intent of giving developers the ability to start building the foundations for spatial applications. As the complexity and utility of those applications evolved, naturally, when I as a user come back to a lens experience, for the longest time, I just kind of experienced it. It's the same experience every single time. But the moment it becomes a utility or something that you rely on frequently, then you develop the need to essentially have a certain level of persistence of the state, persistence of the user data. You start to come to rely on retaining artifacts, for example. So what Supabase does is actually just make that possible. We're still at a stage where a lot of spatial applications are built by smaller teams that are kind of excited about the future. They don't have a dedicated back-end person, right? But we wanted to give them a foundation. I mean, Supabase's tagline is build in a weekend, scale to millions. We are being very mindful about helping a lot of these developers build the foundations for those future applications in a way that will be scalable, without them having to pay the tax of scale. You know, you can basically now open up Lens Studio, download a plugin, and then from the asset library, download the SDK. And in literally under two minutes, you have the ability to store different types of files in blob storage. So that's basically like S3, for example. You're able to upload images, whatever type of file you want to store, you're able to do that. And it's all isolated, isolated per tenant. So again, those are the foundations that we're trying to build. And I think that's a key thing about everything we do: we try to build, as much as we can, the right foundations. So having this multi-tenancy is quite critical, because you don't want your data mixing with different people's. For the database itself, you're able to spin up a relational database with Postgres where you can store user data. The other thing that comes with that is that we also handled, behind the scenes, the authentication piece. So when you actually open up this plugin and you're logging in with your Snapchat identity, an organization is immediately created for you in the background. You can have your projects, and you don't have to actually remember a new signup or anything like that. When your users on Spectacles use Supabase as well, their identity is also created in a privacy-preserving way, but you basically have authentication. So if you are Kent and you're using a lens that I built, I don't have to actually worry about creating a new user. Your identity on the glasses is kind of seamlessly used to create this thing, but it's still your backend as a developer. Also with Postgres, there are Supabase's real-time capabilities, which are quite interesting. I'm going to digress a little bit, but one of the features that we launched over the past year, it's one of those things where we're like, OK, we knew this was going to be exciting, but just seeing how much it enabled, those WebSockets. I mean, it kind of makes sense, right? You're basically creating an arbitrary channel between two endpoints to exchange data in real time. The opportunities are endless.
So we noticed a lot of people were spinning up, for example, WebSocket servers so they could actually connect Spectacles to some other device. With Supabase now, you can actually do that out of the box through the real-time capabilities. So you've seen, for example, a teleprompter lens. You have communication between the Chrome plugin and the lens, and that's being facilitated through a real-time connection with Supabase. So we kind of, again, took something we thought was very powerful, but where you would otherwise have to go spin up a server or set up a WebSocket server and all that. Why? You just get that out of the box. So that's another aspect of this. Those are the three or four exciting capabilities. And we're collaborating; I think the Supabase team themselves, they pride themselves on the developer experience that they've created. I've done web development in a past life, and I think they just make it so easy and so seamless, but they don't compromise on how scalable it is as well, and that you're going to be able to build a foundation, a strong foundation. So we're really excited about what that brings for our developers, because it's essentially driven by the same goal: how do we allow developers to have a seamless developer experience with Lens Studio, as well as a seamless developer experience on the cloud? And they all come together so seamlessly. So that's kind of why this particular announcement is quite exciting.
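To make the capabilities Terek describes above a little more concrete, here is a minimal sketch in TypeScript using the public supabase-js client. It is illustrative only: the bucket, table, and channel names are hypothetical, and the Lens Studio plugin and Spectacles SDK he mentions wrap authentication and project setup in their own way rather than exposing it exactly like this.

```typescript
// Minimal sketch of the capabilities described above, using the public
// supabase-js client. Names like "lens-assets", "scores", and "teleprompter"
// are hypothetical placeholders, not part of the Spectacles SDK.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  "https://your-project.supabase.co", // placeholder project URL
  "public-anon-key"                   // placeholder anon key
);

async function demo() {
  // 1. Blob storage: upload a file to an S3-like bucket, isolated per tenant.
  const image = new Blob(["..."], { type: "image/png" });
  await supabase.storage.from("lens-assets").upload("captures/scene.png", image);

  // 2. Postgres: persist and read back user state in a relational table.
  await supabase.from("scores").insert({ player: "kent", points: 42 });
  const { data: scores } = await supabase.from("scores").select("*").limit(10);
  console.log(scores);

  // 3. Realtime: an arbitrary channel between two endpoints, replacing a
  //    hand-rolled WebSocket server (e.g. Chrome plugin to lens teleprompter).
  const channel = supabase.channel("teleprompter");
  channel
    .on("broadcast", { event: "next-line" }, (msg) => console.log(msg.payload))
    .subscribe((status) => {
      if (status === "SUBSCRIBED") {
        channel.send({
          type: "broadcast",
          event: "next-line",
          payload: { text: "Hello from the plugin" },
        });
      }
    });
}

demo();
```

The realtime channel at the end is the piece that, in the teleprompter example above, would take the place of spinning up your own WebSocket server.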

[00:32:35.415] Kent Bye: So one of the other big trends that I've seen, and saw in quite a lot of the different projects at the hackathon, was the integration with these third-party AI services. And AI seems to be in this place where, from the development perspective, there's a lot of developers who are using lots of different AI tools for coding assistance. There's even like AI integrations with the Lens Studio to help have your own custom LLM as an assistant to help figure out how to use the SDK. But there's this kind of double-edged sword with AI where there's all these new capabilities, but also all these potential risks when it comes to where AI is going to go here in the future. Just curious to hear some of your reflections on what are the thresholds or boundaries of things that you don't want to go into, or what are the values that you're centered around to make sure that you're keeping centered in the humanity, authenticity, but also being in right relationship to the world around us in terms of not violating copyright. There's a lot of ways that there's a colonizing effect of AI where it's moving fast and breaking things and going in a direction that can be quite scary and destabilizing, and certainly a lot of new potentials, but also a lot of risks. And so just curious how you start to think about that from a product perspective, and integrating all these different services to enable the developers to use the capabilities, but also what kind of guardrails are there in order to say, these are the boundaries and we're not going to go down in this direction.

[00:33:58.317] Terek Judi: You really saved the best for last here. We're in a bit of what I consider an interesting place with this, because what we focused on with Spectacles is to build a platform at the end of the day, right? We're building a platform that allows developers to build experiences on our product. I think with AI, at least one of the things that feels a little bit obvious is that AI is an important part of building, in my mind, application logic. I think it's even sort of a key part of how you build your applications now, that you can rely on a reasoning engine of some sort that is powered by natural language, right? I think that's more of a building block. So as far as, obviously, there's risks with copyrights and all that, those are, I would say... we've partnered with a lot of foundational model providers in this case, and a lot of what we try to do is make sure we work with people that have thought about these things. They're being mindful of that, because at the end of the day, we're facilitating that particular aspect of the experience. Where we pay a lot of attention is, as a platform, we provide a lot of signals, right? Like your location, your camera, your microphone. So those are important things, right? When we launched last year, we knew that the camera was going to be very important, and the microphone, because they're kind of like multimodal inputs to an LLM, essentially. So what we started with is that we introduced this thing called experimental developer mode, which meant you needed to enable experimental APIs in your Lens Studio project. You also needed to enable it on the glasses as well. So when you're a developer, our goal is very simple: let's unblock experimentation. Let's see what people build, in a thoughtful and responsible way. And our subreddit is full of those examples. You might've noticed that there's a watermark if you ever capture; at the bottom, it says experimental API. We wanted to be very mindful of learning from our community, learning from our developers, learning from our users, where the boundaries are in an area that we're beginning to understand. So we saw a lot of creativity, as we had expected. I think we unblocked a lot of developers. And then we reached a point where we were like, OK, these are really awesome. People want to try them. But we put on that limitation that you could not publish this, because, obviously, you're accessing sensitive data and calling the internet, which is the key part. If you did not call the internet, you don't need to do that. If you're doing everything on device, that was fine. And what we started doing is obviously onboarding what we call platform integrations that are vetted, with very particular rules about data retention and processing, making sure that the data is disposed of, all that kind of stuff. And that was basically the second thing that we did. We onboarded an integration with OpenAI and an integration with Gemini as the leading providers. We had our own as well, where we self-hosted DeepSeek, to try to also see what we can do with the open source side of things. But as we saw what more people have done, it just still felt like there's more that people wanted to do with that plus the internet as well. So this release, we introduced something we're calling transparent permissioning. As the name says, you're being transparent about permissioning.
Our goal is to try to minimize user friction in general. But in this case, if you use experimental permissions, as a user, you'll be required to give permission to the lens every time, to say that this data might leave the device and all that kind of stuff. And if you give it permission, then the developer will have access to the data. And if you're using the microphone or camera, the LED itself will also flash to give bystanders sort of an indication that, hey, this data is being transmitted off device; it may be processed or stored. So these are some of the things where we're excited to see how people use it and learn some more. I don't think this problem is solved. We still have a lot of learning to do, and adapting, and understanding how things work. But I think the key takeaway here is that we're learning and adapting and being very mindful of where these boundaries are, rather than saying, oh, whatever, we're playing fast and loose with that. I think we're being very mindful in general. And it's intentional. I think Evan's vision has always been, since the original Snapchat: people trust Snapchat because it put a high bar on protecting user privacy. So, again, it's in the DNA. It's in the vision. So that's honestly, I think, the theme of everything we've talked about today. It's been a vision that has been laid out for over 10 years, very consistent. Everything ladders up to it. So in retrospect, everything feels like it adds up, because there's such a strong vision of where things need to go.

[00:38:14.824] Kent Bye: Great. And finally, what do you think the ultimate potential of augmented reality, these head-mounted glasses with AI integrations, where all these things might be going here in the future and what they might be able to enable?

[00:38:27.150] Terek Judi: Yeah, I'll try to say it succinctly. Somebody said this on Twitter, I don't know who, actually, anymore. I probably should figure it out so I can give them credit. They said that AR feels like the interface for AI. And add to that what I just said earlier: I think AR glasses have the potential to democratize this AI access for everybody and put everybody on an even, level playing field where their uniqueness is celebrated. Those two are, I think, at least for me personally, where everything is going to come together, and why people are going to need AR glasses in their life.

[00:39:01.521] Kent Bye: Is there anything else that's left unsaid you'd like to say to the broader immersive community?

[00:39:05.009] Terek Judi: No, I'd say like, you know, check out our subreddit, join. We all hang out there. I think this is the easiest way for us to engage, you know, and share your thoughts. And if you haven't yet started building for Spectacles, you know, please apply, spectacles.com. And as was mentioned earlier today, everything that we build on this generation is meant to transfer over to the next one. So, you know, that would be the call out. Just go try it, apply, experiment with this new medium, see where the boundaries of your creativity are.

[00:39:34.969] Kent Bye: I heard Jesse McCullough, one of the community managers who was working with you, mention that he was posting the release notes to the Reddit. And so I went through and read through them. I was thinking to myself, actually, this is a little bit more comprehensive than the press release version that I saw. You see a lot of the detail and all the updates. And so I'd recommend folks go check out all the release notes there on the Reddit. And it sounds like you've got a pretty consistent clip of updating the whole platform, and that it's on a subscription basis so people can get access to the device leading up to the consumer launch sometime next year. So yeah, I'm really excited to see where Snap takes us here in the future, especially because it feels like, of all the different XR companies, Snap seems to be the one that has the developers at the center of the process and is really engaged. And I've seen over the past year how there have been developers that have come up and given feedback, and things being launched and responded to very quickly, just even like the LBE and the fleet management from AWE to now. It's a pretty quick turnaround in terms of listening to the community and rapidly iterating. And so with the Supabase and all the different things and innovation I saw from the hackathon, I'm excited to see where developers continue to take all these new features and add more complexity and more robustness to persistence and other utility applications and integrations with AI and these other web services. It seems like there's a lot of possibilities for where this could go. And I feel like the foundations are slowly being built out here at the Snap platform to kind of build out this future of spatial computing. So I'm really excited to see where you take it all here in the future. And thanks so much for joining me here to help break it all down.

[00:41:05.624] Terek Judi: Thank you so much. I really appreciate you taking the time. And yeah, like excited to see where we go next year with the consumer launch.

[00:41:12.145] Kent Bye: Thanks again for listening to this episode of the Voices of VR Podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
