#1460: Snap AR Platform Lead Sophia Dominguez on Spectacles and AR Ecosystem

I interviewed Snap’s AR Platform Lead Sophia Dominguez at the Snap Lens Fest about the Snap Spectacles. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing on my series of looking at different announcements around Snap Spectacles, today's episode is with Sophia Dominguez, who is the Director of AR Platform at Snap. So I first met Sophia when she was doing this really great newsletter called All Things VR and had a chance to do an interview with her all the way back at Oculus Connect 2 back in 2015. And so then she continued to do her own startup and then eventually moved on to working at Snap, heading up their AR platform there. So yeah, I had a chance to talk to Sophia about the Snap Spectacles and some questions that I had around them. You know, she mentions that you're able to hook it up to a battery pack if you want to, that you're not completely limited by the 45 minutes, and that most of the stuff that they have in their enterprise is mostly like B2B2C, so it's like business to business, but those businesses are directly interfacing with the consumers, and that's kind of like the sweet spot that they have found. And so I think my challenge to Snapchat is to really think about how to work with other types of businesses that are able to really scale out and have more of a B2B component to how they're going to be growing out this ecosystem for the Snap Spectacles, whether that's like location-based entertainment or other types of museums or just some institution that's willing to pay a certain amount of money to have access to these dev kits and to kind of like experiment with different social dynamics with shared reality experiences.
So, yeah, get a lot more context for how the Snap Spectacles came about and just some more information and context about the Snap ecosystem and how they're starting with personal self-expression as being like a real catalyst for AR lenses within the context of Snap. And then turning those lenses out into the world to do more world-facing types of AR lenses that are for more of the first-person perspective, which is what you get from the Snap Spectacles. So that's where we're coming from on today's episode of the Voices of VR Podcast. So this interview with Sophia happened on Wednesday, September 18th, 2024. So with that, let's go ahead and dive right in.

[00:02:19.033] Sophia Dominguez: So my name is Sophia Dominguez. I'm the director of AR platform at Snap. I've been at Snap for four and a half years, but I've been in AR for 11, starting when I traveled around the world with Google Glass and documented how people saw the first AR headset. So my role at Snap is I oversee the developers and partners that are building lenses on Snapchat with Camera Kit, which is our external SDK that is embedded into Snap and other companies' apps and websites, and also out in the real world. So you see things on the Jumbotron, mirrors, et cetera. There are partners and developers that are building for the Spectacles. And last but definitely not least, Bitmoji.

[00:02:54.288] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into this space.

[00:02:59.963] Sophia Dominguez: Yeah, so when I was 13, I read a book called Feed by M.T. Anderson that talked about a future world where we all had a chip in our bodies. Everything was AR and VR, but we didn't call it that. And I just remember I had a T-Mobile Sidekick at the time, and looking at the T-Mobile Sidekick, I was like, yeah, this rectangle doesn't make sense. Even though the book itself was a little dystopian, et cetera, et cetera, I read it thinking, like, it doesn't have to be this way. And so that book has always stuck with me. When I went to college, I went to NYU and studied how technology transforms human behaviors. And part of that was me naturally just looking at the space, being super interested in what was happening in AR and VR. This is 2013. When I graduated, Google Glass had just come out. And so I had to get my hands on one. I was part of the second wave of Google Glass Explorers. And for what it's worth, when it came to Google Glass, I was never like, oh, this is the device. This is it. But I was like, how cool is it? How lucky am I that I get to live through this next technological wave? And then from there, which is how you and I met, I started one of the first AR/VR newsletters. So I was helping people understand the industry. All my friends were like, what are you talking about? Like, what is this thing and this future? And I was like, it's going to happen. It's going to happen in our lifetimes. Albeit I thought we'd be a little further along, but at the end of the day, it's been really exciting. Then I started my last company before joining Snap, which is called Surf. We were building the first search engine for augmented reality, which is then how I became really familiar with Lens Studio. Some of the most viral AR experiences, you know, what Snap calls lenses, you know, went viral on the platform. We worked with huge artists like Nicki Minaj and a few others. And then this was like 2019.
My co-founder and I both were like, you know, it's really, really hard to make money as an AR company, as an AR team. And I felt like the best place for me to go was to join a platform and kind of help pave the way for more people to make money building AR. And so ultimately, we sold that technology to a company based in the UK and I joined Snap. I joined Snap as the first AR partnerships person. And over time, my role has evolved. But ultimately, the vision has been the same. It's like, how do we make AR ubiquitous? How do we bring it everywhere? But also, how do we make sure that developers are making money, that they're able to make their whole livelihood on building these tools and this platform? So it's kind of my very quick history lesson.

[00:05:19.116] Kent Bye: Nice. Yeah, that catches us up to today. And then we just had the Snap Partner Summit yesterday with the announcement of the Spectacles glasses. And then this morning here at the LensFest, you had a conversation with the co-founders, Evan and Bobby. And, you know, they were talking about how for the past decade they've been working on augmented reality. And the company, Snapchat, now Snap, has existed for around 13 years. So for the majority of the time that Snap has existed, augmented reality has been on the radar at least. And so it's a camera company, I often hear them say, and now it's sort of moving more into like what happens when we take the camera off our phones and put it onto our bodies. And so when you were deciding to come work at Snap, maybe you could talk about your own inspiration for what Snap was doing and the mission for what you saw they were trying to do.

[00:06:06.547] Sophia Dominguez: Yeah, that's a great question. So I've been in the industry a long time, and mind you, I've had almost every headset, so like DK1, DK2, setting up the Vive with lighthouses. And actually, my company, Surf, we were first building a search engine for VR. And what we realized was, at the time, there wasn't that much content. Now there's a lot more. Compared to AR, where there were actually so many lenses, so many experiences. And this is also when other platforms started to make their own tools. And what people don't often think about, and Bobby touched on this a little bit in the Q&A session today, or Evan did, it's like 300 million people on Snapchat use augmented reality lenses every single day on average. And it's only growing. And so there's just something, at least for me, especially at the time, in seeing just all the numbers, you know, again, when we made this transition from being a VR search engine to an augmented reality search engine. This is the ubiquity of content, but then also the familiarity of, like, a normal consumer, and the fact that a lot of them don't even think about it as technology. So there's just something inherently interesting to me about that. And another thing about augmented reality that I really do appreciate: it's really that vision of how do you interweave digital objects into the real world? How can you be in a physical world, see somebody that's in front of you, have them see your eyes, interact in the same space together? Virtual reality, to me at least, takes you elsewhere. And when I was looking at the market and seeing the numbers of building on all these different platforms, we would ship something on Snap and it would get tens of millions of views in a few days. And I was like, if there's one company that's going to win in augmented reality, it's going to be Snap.
And that's the place that I want to help carve out these pathways for developers to make money.

[00:07:54.494] Kent Bye: And it's really been in the wider XR industry with Augmented World Expo, which has been around for 15, 16 years now. And a lot of it has been very enterprise-focused. And so for me, in terms of an experiential perspective, I've always been way more interested in VR just because it felt like the AR apps that were out there were much more like solving a problem, something for enterprise use cases, something that was not necessarily getting me nearly as excited around the experiential potential for what is possible with the medium. But it feels like Snap is taking kind of like a consumer approach, having already cultivated an ecosystem of a social media platform that's camera-first, that already has some of the most widely used augmented reality use cases at more of a consumer scale, but also a lot of art and creativity and people that are doing fun stuff with identity, but also with modulating and augmenting their reality in different ways. And so, yeah, I'd love to hear you reflect on this kind of niche that Snap has with what they're able to do with both identity expression with the filters, but also in terms of other ways that you see people using augmented reality in a way that goes beyond the enterprise use cases that we've seen so far.

[00:09:02.458] Sophia Dominguez: Yeah, so when it comes to, you know, consumer sorts of use cases, first and foremost, augmented reality has been a hit with self-expression. So it lowers the barrier to entry for somebody to feel more comfortable sharing, you know, themselves or the world. And so while it started there, it's definitely evolved to be so much more, and there's a lot of other use cases, which we'll get into. But I think self-expression at the end of the day is just really core to a consumer level of augmented reality. At the end of the day, like, we're human beings, we want to connect with others, and we want to do so in a way that's as frictionless as possible. To your earlier point about opening to the camera, you know, before, when you just opened to the camera, sometimes people may feel shy, like they don't want to exactly look the way that they do or share that. So in the early days, we added, like, filters, which are just, like, 2D images, et cetera. And then over time, it transformed into more, like, 3D experiences, leading up to where we are today, where now if you don't even want to show your face, you don't have to. There's a plethora of options. You can actually have your own Bitmoji be your head. You can talk back and forth to your friends with that. But when you think about this next generation of computing, augmented reality, at the end of the day, it's just technology. If you think about it, like the internet, right? Like the whole term augmented reality, et cetera, goes away over time. People are just using technology in the same way that they're interacting with physical objects.
Ideally in the real world. But I think the platform that does that in the more, like, intuitive, human way does so in a way that consumers feel is very approachable, because at the end of the day they don't care if it's augmented reality glasses. They just want glasses, and they want a really solid experience, and then they also want the right content or the right lenses that are providing value to their daily lives. So I think, you know, often in the industry we talk so much about hardware and how important it is to get the field of view bigger or the weight down, et cetera, et cetera. But I think if something is bringing a lot of value to people's lives, people are willing to go through hurdles in order to experience that. But because AR at its core, its first mainstream use case, was self-expression, it's communication. It's talking to other people. There's just a really natural through line to that to make it very human. And those are a lot of the values that we're taking into Spectacles.

[00:11:25.990] Kent Bye: OK, and so there's an analyst that I'm a fan of. His name's Simon Wardley, where he talks about there's these four different phases that technology goes through as it's developing and propagating out into the public. There's the academic idea, just to prove something's possible. Then there's custom bespoke enterprise applications that are really handcrafted and customized for specific use cases. And then there's the consumer market, where things can go out into the mass audiences. And then eventually, it may get to the stage of mass ubiquity. And it feels like most of XR has kind of skipped over the enterprise in some ways, like Meta kind of subsidizing things to really go to the consumer market. Snap, in some ways, has gone straight to the consumer market with all these AR lens filters. But there's also been more, I'd say, enterprise-y types of use cases with working with stadiums, with museums, with location-based entertainment. Maybe you could give a little bit more context for how those more enterprise use cases, or non-consumer use cases that are more B2B types of applications, fit into the AR platform that you have.

[00:12:27.577] Sophia Dominguez: it's interesting that you're referring to snapchat cam which is you know our technology which is built off of camera kit that enables stadiums or venues to integrate our technology into let's say like a large screen or a jumbotron the way that we think about it is maybe less of an enterprise model but more of a b2b to see like something that snap is you know excellent at it's just like their bread and butter is understanding what consumers want and we have the understanding of virality and like what makes people laugh, what makes people smile, what are people sending back and forth to each other. And so even if, you know, we have these solutions, which ultimately are going business to business, it's always with the mindset of this will help increase ubiquity to your other point. How do we make lenses everywhere but do so in a way that's not about the technology it's just about it being really fun or really joyful so you know we talked about our partnership with the Louvre yesterday that was another one yes we're partnering with the Louvre or we're actively encouraging a lot of our developers and agencies to go to museums and hear from them, hey, what are the things that you all want to bring to life with augmented reality? But always with the mindset of just making sure that it's done so in a way that's very approachable to the end consumer, because they're the ones who are going to walk away from this and be like, wow, versus thinking about it from, you know, there's a lot of, to your point about people who are just selling B2B, right? And it's about How do people, I'm making up this use case, but how do people in hospitals use this for their day-to-day lives? I would say there's companies who are optimized to do that. It's not to say that we would never do that, but again, we would think about it if we were to partner with, let's say, a hospital, it would be like maybe a children's museum or something of that sort. 
If we partner with a hospital, it's always like, how do we get this in the hands of the people that you can bring joy to versus this more enterprise use case? Never say never, but that's really just our bread and butter is thinking about consumers. And yeah.

[00:14:25.155] Kent Bye: I think one of the areas that HTC Vive has really gotten into, and Meta not as much, is the location-based entertainment context, where you have people come together to have a shared social experience. The Es Devlin experience feels like this kind of community ritual that I could see people potentially using the Snap Spectacles for in order to do that type of LBE experience. There may be some things with the onboarding and getting IPDs and measuring if you have corrective lenses. There's friction that happens, which Magic Leap has also faced in trying to do LBE contexts. But I'm wondering if that's something that's on your radar at all, in terms of if you see that there could be these kinds of location-based experiences to bring people together in a physical context that allows them to have these unique social dynamics.

[00:15:06.525] Sophia Dominguez: Yeah. Location-based and connected sorts of experiences, that's a huge priority for us. And it's also, like, baked into the OS. So over the last few months, at least, we've been working really, really hard to make sure that if someone enters into a connected experience, it's really fast, so it can happen in seconds. And evolving from that, you know, with the Es Devlin experience, we were taking people through who'd never used Spectacles before. This is their first time probably using see-through AR glasses. Not everyone, but a lot of folks. I think on the IPD, like, right now, when you get Spectacles, it's tied to your account, and so you've already set up your IPD. I think for these sorts of activations that we may do with museums or, like, other, you know, venues, et cetera, all of that stuff is just really about optimizing for use cases. So in the last version of Spectacles, which I forget if you tried, but... Someone accidentally showed me.

[00:15:58.589] Kent Bye: They weren't supposed to, but I did see them.

[00:16:00.030] Sophia Dominguez: That's fine. Well, anyway, so we shipped without hand tracking. And pretty quickly, all anyone wanted to do was put on the glasses and reach out and touch things. And so the team worked to optimize hand tracking for a device that was never built with hand tracking in mind. So I think this is something where our approach is very iterative. It's taking feedback, it's taking learnings from what our developers want to do, and then figuring out technical solutions to get around that. So yes, IPD is very important. Like, right now we're actively setting up the device to said person. But if there's a use case from connected lenses that we really start to pick up, then we'll look at, like, a software solution to be able to evolve that.

[00:16:36.212] Kent Bye: And I think the other dynamic that we've seen in the VR industry is that a lot of the companies have invested in different projects, but there may be strings attached for what that investment money means in terms of developing. And there's a marketplace to sell experiences. But with augmented reality, it's sort of like a completely new green field of there hasn't been an existing market. And so I talked to about a half dozen developers yesterday. And I think every single one of them had received no strings attached grants and money from Snap to build an experiment and push the technology forward, which I think is something I haven't seen as much done within the broader XR industry, but it's sorely needed in terms of like the frontiers of innovation are those independent developers. And so it's really encouraging to see how Snap has been really cultivating this ecosystem. So I'd love to hear you reflect on this process of seeding the industry, but also working very closely with the developers to be able to help make this a viable platform.

[00:17:29.895] Sophia Dominguez: Yeah. So on that, like, our mission is to be the most developer-friendly platform in the world. And a lot of that comes from the fact that a lot of team members at Snap were once individual or, like, small-team sorts of creators and developers. So we understand the hardship that people go through, and, you know, Snap's core values are kind, smart, and creative. And so not only do we operate in that sort of way internally, but also externally. So, you know, when we get feedback from our creators and developers, they always say, like, we love working with you guys. Everyone is so nice. Everyone's so thoughtful. If you can't get back to us on something, like, you know, you respond, which apparently they don't get from other platforms, which is kind of crazy for me to hear about. So I think it's really just, like, the ethos of Snap. And then when we're thinking about Spectacles, part of the reason to ship it and start with developers is to build alongside them. What are the things that they want to see? How do we make it easier for them? And we're actively encouraging them: get this in the hands of your end customers, test with them. Is there something that's too complicated? What do they like, not like? So that we can take all of that and iterate along with you rather than in a vacuum. And our approach to monetization is similar, right? Again, we know how hard it is. We know that these developers, at the end of the day, they're people. They need to earn a livelihood. And if we want to work with the best people in the world who share our vision for the future of augmented reality, we need to create pathways for them to monetize. Now, of course, you know, when it comes to scaled monetization, like the things we've been doing with Lens Creator Rewards on the Snapchat side, obviously we're not there yet, and we understand that. But it's about, okay, how do we create the right inroads to be able to make that happen?
And in the meantime, it's just making sure that when we support them, yes, it is financial and, you know, meaningful financial support, but also it's about feedback and making sure that they're connected to the right team members, sending them opportunities, and just being very thoughtful about how they can continue to do this. And that's so important for the future of Spectacles, because at the end of the day the experiences are differentiated. To your earlier point, you know, we're like, hey, it's not just, you know, you're sitting at home or you have to be indoors. You could take this on the go. What are the experiences that you want to see come to life? How do you bring value not just to your daily life but also for the end customer that you really are building for? And hopefully that starts, you know, with you, but the goal is not for you to just build something just for yourself. I think the goal would be also for you to build something that would eventually have an audience.

[00:19:56.673] Kent Bye: Okay. And this morning when you were talking to Evan and Bobby, they were talking about the 13 years of the company and that they've been working on AR for the last 10 years. The fifth generation of the Snap Spectacles was just announced yesterday. So that's on average once every couple of years that there's been a new prototype. And so I'd love it if you could maybe kind of go through the evolution of each of the different Spectacles. You don't have to give the dates or years, just kind of what was new or different, and then what led us to today with the fifth generation of the Snap Spectacles.

[00:20:27.299] Sophia Dominguez: Yeah, I will try my best, but I also highly recommend watching Daniel Wagner's talk that we live-streamed at Lens Fest, where he kind of goes into the more technical components of what went into each generation. But the first version of Spectacles, which a lot of you all should be familiar with, with the vending machines that dropped in various places, the bright yellow ones. So those were launched in 2016, and those were just simple camera glasses. So you could capture photo, video, you know, hands-free. They came in a variety of different colors. And I think the lesson there was that ultimately just putting your camera on your face was not enough. And so then, from there, a few years later, we shipped a version with two cameras, so then you could actually, like, capture stereo content. And then the third generation, again, came in various different sizes and shapes and colors. But that was the first time that you could actually, after you recorded video, apply lenses on Snapchat. And so it was post-processing, it was not real time, but then people started to get really, really creative with it. At the end of the day, even back in 2016 when we launched, the vision was always for these glasses to be augmented reality glasses. And so the last version that we shipped was in 2021. That was our first AR display. We did not sell that device. It's very much a developer kit. And we got a lot of feedback. One, people were like, whoa, this is so cool. It's wearable. It's standalone. But the field of view was really small. I think battery life was about 20 minutes or less. And developers were like, this is awesome, but this is not enough. And on top of that, the way that we were treating the lenses, it was very much not as complex or spatial. It didn't leverage all of the connected lens infrastructure. You didn't have the ability to import SnapML or machine learning models
to make the experiences more unique or complex, or, like, really understand the world around you. As I mentioned before, we didn't have hand tracking; we added hand tracking. So when we were thinking about this generation, which we just announced yesterday, the fifth generation, it's really about how do we take all of those learnings and get it as far as we can into something that is still wearable, is something that developers can be really excited about building for, and then ultimately is paving the way for consumers. And part of that is on the software side, which was about rewriting Lens Studio. So as Bobby highlighted earlier today in the Q&A, we rewrote all of Lens Studio from the ground up to be able to encompass lenses that are built on mobile and also built on Spectacles. And the reason that matters is because at the end of the day, people are investing their time. They're investing their time into learning Lens Studio, into building their careers, et cetera, et cetera. And what's so nice, and you saw this with Max's draw-flowers lens, that was actually built for Snapchat. And so he made this wire tool, procedurally generating different flowers, just using your hands. And he not only shipped the lens on Snapchat, but also shipped the wire tool, which he's selling on Gumroad. And we're working on enabling more monetization through our asset library and things of that sort. And it's now one of the experiences that launches on Spectacles. And so that sort of ease of use, and it's not always so out of the box, but at the end of the day, it's like you're importing assets, you're spending a lot of time, you're designing something, and very easily, through things like the Spectacles Interaction Kit, you can really easily make it a spatial sort of experience. And I'm sure that there's a lot more that we can do to make that better and improve it over time. But we really do see it as one ecosystem rather than, hey, like, you just spent all this time learning this tool.
It's on mobile, and yeah, it's not going to work for our glasses, so bye. Again, it's like we want to be the most developer-friendly platform in the world. We want to bring the developers along with us and, again, pave pathways for them to make money. If they're building on mobile, great. If they're building on Spectacles, great.

[00:24:16.471] Kent Bye: I think just in hearing different feedback from the larger XR industry, there's a bit of, like, 45 minutes, is that enough? There's questions around the different types of experiences. So I think listening to Daniel's presentation this morning, I came away with a much better appreciation of all the different trade-offs that had to happen to not have an external puck and to have everything standalone in the glasses. There are, just talking to a number of developers, ways to potentially put it into sleep mode, and maybe you are using your phone as a way of getting notifications or updates or other status information, and then to then context switch into the experience. And so I'm just wondering what you feel is the real sweet spot, where you're able to go outside, maybe do emergent social dynamics, or what makes the Spectacles unique to be able to have an immersive experience that you can't have on any other device that's out there?

[00:25:07.847] Sophia Dominguez: Yeah. So I'll talk about maybe the 45 minutes bit, and then we'll talk about the differentiated experiences. I don't know if you've been on a 45-minute Zoom call or Google Meet. It's, like, actually pretty long. And even thinking about workouts, I'm like a 15-minute-workout kind of person. Like, anything longer, I'm like, I'm so bored. Like, I need to go and take a nap or something like that. So even though I think we're still in the intentional-wear phase with 45 minutes, developers are, you know, people who are using this, if they want to plug into a battery pack, they can do that. If they want to plug into their computer, they can do that. We're not stopping people from doing that. It's just really about the fact that it is standalone. And so if you want to use it for more than 45 minutes because you're on a walk with your Peridot or something like that, you can do that. But we just assume people are on the go. They don't want to have to think about those things. And so it's really designing for that. When it comes to differentiated experiences, again, it's indoor and outdoor. And so I think the more that developers can bridge this gap between indoor and outdoor, that's where we'll see a lot of success, and especially the outdoor piece. Because at the end of the day, there are no other AR glasses on the market that enable you to do that. And then from the OS level, thinking about it from: how do you make this shared? How do you make interactions just feel really human or natural? There can be single-player mode. There can be multiplayer mode. And the more that folks can just lean into the fact that they can do things with this device that virtual reality or mixed reality headsets don't offer them, or if they do offer them, it's like you still have to wear a big VR headset on your face to go outside.
And so the fact that you can go outdoors, you can see somebody who's not wearing Spectacles, you can have a conversation with them. Or if somebody is wearing Spectacles, you can have a whole magical experience. What does that look and feel like? And the last bit is we have something called spectator mode, which we talked about yesterday. So that ships out of the box, you know, through the Spectacles app. So anytime that you are wearing Spectacles and you have your phone on you, which I'm assuming most people always keep their phone on them at all times, you can just take it out of your pocket and show them exactly, like, what it is that you're seeing. There's other things coming with the phone. So there can be, like, custom controllers, which you can use for a range of capabilities. I feel like we're really not even scratching the surface of what's possible. And then again, it goes into this idea of how do we lean into everyday real-world objects, because you're looking at the real world, you're not looking at a video screen of the real world. So any sort of text on your phone is super readable. You want to bring in elements of the phone. So there's a lens called Layout that allows you to actually bring in your photo album, see different things from your phone, and a number of other capabilities. So I think it's about, like, leaning into those elements and, at the same time, leaning into it being connected or on the go. And yeah, anything that's not easy for them to build, we want to hear that feedback. We don't want people to shy away from: hey, I have this really great use case, but I can't build it. Those are the sorts of learnings that we would like to have. And again, it goes back to the framing of: how do we become the most developer-friendly platform in the world? Well, we listen, but it's not just that we listen and it falls on deaf ears.
We're active, we're leaning in, we want to understand how to make augmented reality integrated into everybody's lives, and we know that we can't do it alone. The best platforms in the world over the history of time have really understood that, at the end of the day, a company can only employ so many people, but there are infinite amounts of creativity and infinite numbers of technical folks who can bring ideas to life. And so we want to empower them as much as possible.

[00:28:47.270] Kent Bye: Okay. So one other technical question before we start to wrap up. There's a new operating system, SnapOS, that is managing this dual-processing architecture. And there's also Lens Studio. In talking to developers, I heard that there's no other pipeline to get stuff onto the Spectacles other than going through Lens Studio. So there's this JavaScript/TypeScript language that you're using with Lens Studio, and everything's optimized around that. What other information can you share about the operating system, with the hand tracking, the social dimensions, or the Linux dimensions? And then is there any future intention to have things like OpenXR, so that existing standards and plugins could be put into this platform to extend it? So yeah, I'd love to hear a little bit more detail on the operating system stuff.

[00:29:31.365] Sophia Dominguez: I guess going back to feedback and actively listening to developers and the things that they want and need. We've already heard from developers that we've been working with for quite a few months, or even just over a year, that what they really want is WebXR enabled on Spectacles. And so, without saying too much, that is coming. And again, we're very use-case driven. So it's really about, based off of what developers want to build, if there are use cases that make a lot of sense, or they're trying to bring their own experiences that maybe exist on mobile or on other platforms, how do we make it very easy for them to do that? So on your question on OpenXR: WebXR continues to grow, and we will be providing support for that. For more insight into the OS, think about it more like a series of different capabilities. So going back to the shared or social aspects of the OS, right there out of the box you have Connected Lenses and those sorts of capabilities. And a few months ago even, it was very, very hard to build with. Our internal team members who were building with it, and some external developers, were like, oh, this is so difficult. And so the team has been working really hard to make not just using it, but also integrating it, easier. Hands- and voice-based UI comes out of the box with the Spectacles Interaction Kit. So I think we're always thinking about, okay, how do we take the different building blocks of what we're doing and package them in ways that make it really easy for developers to build with? And again, if there's something that doesn't work for them, then it's about hearing their feedback: what are the things that would make it easier for you to build your experiences? When it comes to Lens Studio, you know, it's an end-to-end AR platform.
And so something that we pride ourselves on is the fact that you have a lens you're building in Lens Studio and, in seconds, you can have it running on Spectacles. There's no long compile time to wait through, so you can iterate super, super quickly. Same thing on mobile as well. We've built Lens Studio from the ground up for lenses; we've built it as an AR development framework. And we feel very passionately that if it's really important for us to build the next generation of computing, it's about making it as easy as possible for developers, learning from them, and making the tool better and easier to use so that they can build whatever it is that they want.

[00:32:00.779] Kent Bye: And finally, what do you think the ultimate potential of spatial computing might be and what it might be able to enable?

[00:32:07.100] Sophia Dominguez: I mean, all-day wear, right? We've been limited to screens for so long, my whole life, your whole life, but the screen is very limiting. So something that we say is: imagine a world where you can look up and not down. You can be around people looking up, looking at their eyes rather than looking down at a rectangle. And, you know, we've done a lot to innovate on screens and made them so much more fun, but the world is really awesome. There's so much to explore, so much to see. And we so often lose ourselves in our little rectangles, whether it's our phones or our computers. So the more that we can connect with others in the real world, and the more ubiquitous that becomes, that's what success looks like.

[00:32:50.622] Kent Bye: Any other final thoughts? Anything else left unsaid that you'd like to say to the broader immersive community, or where people can go to get more information?

[00:32:55.984] Sophia Dominguez: Well, I'm just so glad that you're here. I know this is your first Snap Partner Summit and LensFest, and I've seen you floating around with developers in the same way I've seen you floating around for at least the last 11 years of my career. So I'm just really happy that you continue to do this. I think you bring a lot of really important stories to light for most people in the industry and beyond. And for anyone who's interested in getting Spectacles, visit Spectacles.com to join our developer program. We want to hear your feedback, we want to hear from you, we want to build the best platform.

[00:33:27.654] Kent Bye: Awesome. Well, there are certainly a lot of expectations from what we've seen before, and I feel like what Snap's doing is completely different and new, taking a different approach. There are some things I don't quite understand, but when I talk to the artists and the makers and independent developers, I hear their excitement and passion for what's possible, and I'll follow them to see what they're going to do with the platform, with all these new affordances and the different trade-offs that you've had to make. So I'm just excited to see where the developers take this in the future, and excited to see what happens in the Lensathon and the hackathon and everything else as people start to get their hands on the platform and to see what's possible. So yeah, very curious to track how this plays a part in the continuing evolution of where this is all going. So thanks again so much, Sophia, for joining me today to help break it all down.

[00:34:07.845] Sophia Dominguez: Yeah, thanks so much. Yeah, camera access, machine learning models, we're so excited. I think there are going to be a lot of really great use cases. So thank you so much.

[00:34:16.687] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast, part of my larger series doing a deep dive into both the announcements around Snap Spectacles as well as the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also with the ecosystem of developers that they've been able to cultivate. For me, the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and from listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit different approach than what anybody else is doing. But it also takes a lot of time and energy to go to these places, do these interviews, and put them together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month goes a long way toward helping me sustain this type of coverage. And if you can give more, $10 or $20 or $50 a month, that has also been a huge help in allowing me to continue to bring this coverage. You can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
