#1453: Kickoff of Snap Spectacles Coverage with First Impressions with XR Analyst Anshel Sag

I interviewed Anshel Sag at the Snap Partner Summit about the Snap Spectacles. See more context in the rough transcript below.

Here are the links to each of my 15 episodes on the Snap Spectacles announcement and a deep dive into the Snap AR Ecosystem:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast, the podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So we're going to be diving into a 15-part series looking at the Snap Spectacles, their new standalone AR headset that was announced back on September 17th, 2024. Snap flew me down to LA to cover both the Snap Partner Summit as well as the Snap LensFest. I had a chance to talk to 26 different people, mostly indie AR devs, across 14 different interviews, with around seven hours of coverage. And then there's another conversation that Sophia Dominguez had with the co-founders of Snap, Evan Spiegel and Bobby Murphy, that will also be airing in this series. So Anshel is an XR analyst working for Moor Insights & Strategy. He's one of my favorite analysts just because he goes to all these different events and conferences, and he's meeting with all these chip manufacturers, just doing independent research into what's happening in the industry. He has a lot of really great insights as to what's happening in the overall ecosystem of XR, so he's got a lot of great context. And I think we both were impressed with the motion-to-photon latency being 13 milliseconds, which I think is very responsive. That's what you want to see: not too much jitter as you're moving around with the AR headset. It's also very reactive with the hand tracking. Snap is probably one of the leaders when it comes to the different machine learning models that they've had with their larger ecosystem of AR developers. They've got a huge social network and probably the most daily active users of augmented reality lenses, with over 300 million people on average per day. And so they have found the sweet spot where people want to express themselves, and when they have AR filters, it dips them into new modes of self-expression. And so there's this entire ecosystem of AR lens developers, and it's got the most advanced tools, the biggest ecosystem, and they're dipping their toes into the hardware. This is actually the fifth generation now, and so they've been iterating for the last 10 years. And this is still a dev kit. It's $99 a month with a one-year minimum, so it's going to cost at least $1,200 to get ahold of one. And you're basically renting it. I believe you have to send it back once you're done paying the monthly fee, but they're just trying to find innovative ways of getting these dev kits into developers' hands to catalyze all sorts of new development. I kind of see it as the DK1 moment of AR, just like the Oculus Rift DK1 was kind of an affordable dev kit for developers that helped seed the entire immersive industry when it launched back in 2013. This, to me, feels very much like a close analog because it's a cheap $99 a month. Yes, it's going to be around $1,200 total, but it's the most capable and affordable standalone AR device that's out there. And Snap has all the tools to be able to start to rapidly iterate and start designing what's possible. There were a lot of people talking about the 45-minute limit on the battery. You can plug it into a battery pack, but at that point it's going to be more of a concern of whether the thermals are going to be comfortable on your face when wearing it for more than 45 minutes.
And so the bigger issue is whether or not they're going to be able to cultivate a software and developer ecosystem, and what their long-term plan is for going from this dev kit, which is very expensive, into something with a form factor that's small enough, capable enough, and affordable enough for consumers to buy. I suspect that eventually they're going to have to move into more enterprise use cases, even if it's B2B2C, meaning location-based entertainment with museums, or in some way that you can have some sort of context for people to come together to have these different types of experiences, which is very much what they were showing with the Es Devlin experience, which is like 12 people in the context of a room. It's kind of a community ritual that they have, but also, at the same time, a tech demo. So it was kind of serving multiple purposes for you to understand what the technology can even do. But it was a shared reality experience where you can see 12 other people in AR all interacting with the same digital objects, which was quite powerful. And it's something that I haven't seen before. The Snap Spectacles are also very much tuned to be used outdoors, so all their computer vision and machine learning is geared so that you can actually use them outside. So that's something that's also distinctly different. So there are going to be completely new use cases that are going to be available with the Snap Spectacles. And I spent a lot of time talking to independent AR developers who are in the Snap ecosystem, who have had early access to the Spectacles or were participating in the LensFest and were able to play around with the hardware. So we get lots of different perspectives, and we're going to be doing a deep, deep dive. And we're going to be starting off with Anshel to give kind of the high-level context for how he sees the Snap Spectacles fitting into the overall XR ecosystem. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Anshel happened on Tuesday, September 17th, 2024. So with that, let's go ahead and dive right in.

[00:04:59.807] Anshel Sag: My name is Anshel Sag. I'm a principal analyst at Moor Insights & Strategy, and I've been covering the XR space in some capacity since 2013. Back then, I was a journalist covering mostly VR, but I also covered some AR and some of the initial stuff. And since then, I've been covering both technologies, as well as MR and other 3D technologies that accompany them. And yeah, I've been covering Snap for a long time, and I have multiple generations of the Spectacles. However, I never got to try the first-generation Spectacles because it was kind of like a closely guarded secret. And I feel like there might have been a good reason for that. But it's nice to actually get a demo, try out what they have today, and think about where it might go in the future. Great.

[00:05:44.650] Kent Bye: Maybe you could give a bit more context as to your background and your journey into this space.

[00:05:48.956] Anshel Sag: Yeah, so I started on the hardware side. I started as a teenager at an electronics prototype design firm, started as an intern, and ended up working there long term. Then I started working as a PC hardware guy for EVGA. They used to be NVIDIA's best board partner; now they've kind of faded into irrelevance. And then I became a journalist and had my own publication for five years. And then I transitioned from journalism to being an analyst for the last decade now. And I think a lot of my experience, while I haven't really been actively developing, is that I have been trying all the headsets I can, you know, trying as many experiences as I can, and kind of understanding what the technical capabilities are, what the limitations are, and maybe why things are the way they are. I mean, you know, I'm looking at waveguides, I'm looking at chipsets, battery technology, display tech, so very much hardware focused. But, you know, I'm always paying attention to all the latest advancements and trying to understand how those technological improvements can enable better devices and better experiences.

[00:06:54.864] Kent Bye: So yeah, we're here at the Snap Partner Summit, where just an hour ago or so it was announced that the fifth generation of the Snap Spectacles has a battery life of around 45 minutes. They've developed their own waveguide, and it seems like they've got a couple of different Qualcomm Snapdragon chips that they're using. So yeah, you just got your first hands-on experience, and we came out of a demo just moments ago. I'd love to hear some of your early thoughts.

[00:07:17.901] Anshel Sag: Yeah, I think the one thing that I was surprised by was how confident they were in their demo. You know, not very many companies are willing to do a live demo on stage in front of the audience. I thought that was very ballsy, but I also think it's kind of a representation of how confident they are in the stability of the platform. This definitely feels like it's probably something they've been working on for a very long time. I thought it was interesting, if you paid attention, that there was very clearly a choice to prioritize vertical field of view as opposed to horizontal field of view. And I think a lot of that has to do with the fact that it seems like a lot of the Snapchat features on your phone, the Snapchat filters, and all the Snapchat things that exist on your phone are going to be very portable to the glasses. And because of that, you have to have that 16:9 vertical as opposed to the 16:9 landscape. So I think you're seeing them prioritize that vertical field of view, and I think we kind of noticed that in the experience that we had today, but also with the demo on stage. You know, they're using the WaveOptics waveguides that they acquired a few years ago. And, you know, they were talking about the pixels per degree. Yeah, 37 pixels per degree is what they said, yeah. And they were saying, I think, 46 degrees field of view, which admittedly is okay. I would say it's not bad, but it's also not great. I've definitely seen better, but I think there are compromises with wider field of view, even though I do think wider field of view ultimately does help with immersion. I think the fidelity was there. I think the colors were good. The image quality was good. Frame rate seemed decent. I think there was a little bit of juddering here and there with the comets, the faster-moving objects. That might be a frame rate issue, but I really did like how interactive it was and how we were all able to interact with one another and have that shared experience. I think they get, and I think most people in the industry get, that shared AR experiences are really valuable and drive more social engagement, which means more fulfilling experiences.

[00:09:18.877] Kent Bye: Yeah, we just did a little bit of a group ritual where there were around a dozen people or so sitting around in a circle, and everybody was seeing the same thing. And so it was kind of like we were seeing these different objects coming about, and we were encouraged to reach out and see how we could interact. I was impressed with how interactive it was in terms of the hand tracking, being able to move objects and to grab them and throw them around. I did see that, at least for my version, it wasn't exactly centered. It was a little offset, and some of the bugs that were crawling around weren't quite on the ground. And so I don't know what they're doing in terms of trying to match up their spatial augmentation with the world around you. But they said that they've developed their own operating system, which I found quite interesting. I don't know if that means that they don't need anything like Unity; they can just build everything from scratch, custom designed for their own headset. So, yeah, beyond some of these different details, what were some of the other things that you were noticing in the demo?

[00:10:12.977] Anshel Sag: I think, to your point, I really enjoyed being able to see other people interact with the bugs or the fish and actually see it in real time. I think that's very difficult, but also really compelling. To your point, it seems like maybe some of the SLAM was a little bit off or could have been a little bit tighter. I think that we're just scratching the surface. And I think with the OS, it's an interesting thing, because they're very much trying to control the total experience. And I think that they just want the glasses to be like a companion, or like an elevated AR experience compared to what you have on the phone, which I think was kind of always the way that they were going, because if you look at how they've been iterating Spectacles, it felt like that already. So this feels like a natural progression of what they were trying to achieve with the previous generations, but now you have the displays. I would love to see more applications with hand tracking and to see how well that works. And with voice too, I think that demo was pretty good. People had a little bit of a hard time activating the moon to do their voice-to-text, but very clearly the voice-to-text was working, and it was pretty accurate, I think. I think it would be cool if they had maybe done a real-time translation demo. I think that's very low-hanging fruit for a lot of AR stuff, but it's very easy for people to understand and extremely easy to implement. And I think a lot of people would find that a useful application.

[00:11:37.728] Kent Bye: Yeah, when I think about the overall industry, and since you're an analyst, I'd love to get some of your take, because we have a little bit of a top-down and a bottom-up. The top-down being starting with virtual reality, then having mixed reality pass-through with what the Quest line is doing, but also the Apple Vision Pro. And then from the bottom-up, you have more of the smart glasses that have the form factor of existing glasses, with cameras but maybe some AI features, like the Meta Ray-Ban smart glasses, which I know you've been a big fan of and which have also seen some market traction. Snap seems to be in the middle, kind of going for more of the pure AR approach, which we haven't seen as many companies really go all in on. We've had Magic Leap that's come and gone. We've had the HoloLens that's sort of come and gone. But I don't know how many other big viable waveguide-type solutions for AR are out there. This seems like the best bet from a consumer space, but I don't know if it's going to find a market fit with a 45-minute battery, and whether it's going to get into the hands of developers and maybe create these shared experiences. But you have to have other people in your physical environment who also have these headsets in order to have some of these different shared experiences. So I'd love to hear some of your thoughts on this kind of bottom-up versus top-down.

[00:12:47.830] Anshel Sag: Yeah, I think it is very much something that I've seen happening. I think ultimately it comes down to: what is the origin of your approach to AR, or VR for that matter? When you look at it from the VR side, I just feel like companies that started on the VR side kind of moved towards MR and then eventually want to move into a full AR experience, while the ones who just had smart glasses, like Snapchat, kind of moved from the lighter perspective and are building something a little bit more heavyweight and more capable, but maybe still focusing on the comfort and weight aspect of things. I would say that these glasses did feel pretty snug on my face. I didn't feel like they were wiggling. I also didn't feel like they were too heavy, but I did notice a little bit of the weight on my ears. So there's definitely some weight reduction that could be done there. But in general, I think for consumers, the best approach is moving from smart glasses into AR glasses, whether it's one display or two displays. I think it's kind of a progression that allows the semiconductors, the displays, and the waveguides to mature, so that you're not trying to accomplish the impossible immediately. And it's kind of an iterative approach, which is kind of what Snap was doing. They were adding more camera capabilities, more voice capabilities. Then they added displays, and then they kind of hit a brick wall with that first generation of AR Spectacles. And it seems like they're kind of moving back into their cadence, improving the form factor, resolution, and capabilities. But coming in from the high end, like from, let's say, a Vision Pro, there's so much more miniaturization you have to do. I just feel like, to me, the Vision Pro, if we're talking about Apple, is their dev platform for something that's much lighter, kind of like these are a dev platform for something that's much lighter. They're just coming in from different angles, but I think they're trying to achieve the same thing. It's just that Apple's approach, I think, is probably going to end up costing about the same, because these glasses are $99 a month. So you're going to be paying at least $1,200 a year for these, and they didn't say when the next generation is coming out. So you might actually end up paying for these for two or three years, which could end up costing the same price as a Vision Pro. So we don't know what the cost of these is, but we know they're $99 a month. So I think this is very much a dev platform, just like the Vision Pro. They're just coming at it from a different angle, but I think the ultimate goal is very similar.

[00:15:16.055] Kent Bye: Yeah, I know recently we've heard news that Meta has shuttered their Spark AR platform. And among the developers who are here at the Snap Partner Summit, you have a lot of lens developers. And I know from talking to different lens developers that there are the Meta platforms with Instagram, and then Snapchat, and then TikTok, and that each of them has different audiences. And some of the feedback I heard from creators was that there's a whole private dynamic within Snapchat, where if you create a lens, you may not see how it's actually being used because it might be happening in private DMs. So there was some stuff today where it was like trying to make Snap simpler, have more public-facing types of stuff, and so maybe have more of a public-facing kind of influencer culture, but also content creators. But it seems, from when I've talked to developers, that Snap has always had the most cutting edge when it comes to the developer platform tools. But now one of the big major players in the industry has just kind of completely dropped their Spark AR platform. So I'm just curious to hear what you think of this ecosystem of AR lens development and where Snap is filling in some of the void, to maybe create more of a duopoly between what's happening with Snap and TikTok.

[00:16:21.738] Anshel Sag: I mean, the way I look at it is that Snap is kind of the leader in this space. You know, they are the ones who, for a lot of people, define what people understand to be AR, right? Like, those lenses are very much AR, and they are some people's only experience with AR. And when it comes to creators, I feel like they are kind of the primary platform for most creators when it comes to these kinds of AR filters. But to your point, I do think that this is kind of the thing where they gobble up the creators who were on Meta's platform and using Spark. But I would also say that I feel like Spark wasn't necessarily at the forefront. So this is just kind of more people falling in line with what Snapchat was already doing. And I do think TikTok is a huge competitor to Snap in this space, but I still feel like they're behind Snap, and Snap is the leader, and they're just going to absorb all of those creators, I think, pretty easily.

[00:17:21.161] Kent Bye: Yeah, and in terms of where you expect this to fit within the context of the ecosystem and the market: I mean, you're doing analysis, so you're digging into some more empirical numbers and getting a better sense of the state of XR than me, as more of an oral historian who's doing these types of one-on-one conversations. But I'd love to hear some of your reflections on where you see them now and how they fit into the overall ecosystem of XR.

[00:17:44.898] Anshel Sag: Yeah, I think that we're in a place where we haven't really made that much progress. You know, there are things that have improved. Waveguide technology has improved. Display technology has improved. Battery technology, processing technology. But it feels slower than I think a lot of people would like it to be. And the truth is, if you want to look at the AR market, the largest volume is XREAL, right? XREAL's glasses are really moving the most volume because they're affordable, they're easy to use, they work with your existing applications and your existing devices, and they're iterating a new version of the glasses or the compute every six months or less than a year. So, you know, I've been using the XREAL Air 2 Pro with their new Beam Pro. And it feels like a beginning of AR in the sense that you're able to have some AR applications. Admittedly, almost nothing with 3D. It's mostly 2D experiences. And I think that's the problem: we're still in the infancy. We're still trying to do 2D applications in a 3D space. And that's not going to be what drives the industry forward. It does get people interested in using the platforms and getting that install base, which is ultimately how you get developers to build 3D applications. Because ultimately, when you look at AR, 2D AR in a 3D space is not going to move anything. And I think that's kind of representative of where we're at today. If you look at the industry, it's kind of just trying to get going, but every time we get a launch, let's say the Vision Pro, you know, we get this initial spike in sales and interest, and then it kind of just flattens out. And it's because there are just not enough developers building applications that are compelling in 3D. And I think once we get over that hump, that will be where the industry really starts to take off. The problem is that developers need tools, and the platforms need to be able to support developers in a way that makes them confident that they can build for these 3D experiences into the future and that their investment won't be wasted on, say, building something for a Magic Leap or a HoloLens.

[00:19:58.908] Kent Bye: Yeah, one of the things that really stuck out to me in the keynote this morning was the CEO of Snap, Evan Spiegel, saying that there's going to be no developer tax. And I don't quite know what that means. Well, I know what it means in the sense that usually, when you have some of these different VR applications that are put on a store, if one is sold, then there's usually a 30% developer tax. So to say that there's no developer tax, I'm just curious about some of your thoughts on what that means for trying to create a creator economy that actually sustains the creators.

[00:20:27.409] Anshel Sag: Yeah, I think that's a good thing for developers. I think when you look at what they're trying to accomplish, they want to build a developer base, and they want to have developers building apps for their platform. And the best way to do that is not to take any money and let them keep all the money. But I don't know if that's sustainable long term. And I also think that it's just a way for them to build a bigger base of developers, because right now I don't think they really have that much in terms of AR developers. Obviously, tons of lens creators, but I think that's a very different type of developer than someone building a true AR app for AR glasses. And I think they have to have compelling tools and a compelling pricing structure, but also I think they have to have an install base and maybe a creator fund, because I know they do that for other things. So it'll be interesting to see what they do there. But there hasn't been any talk about money. But ultimately, money drives a lot of things in our industry, and that's part of the reason why I think they did the whole free thing. But I also think they maybe have to put a little money in front of people ahead of time, too.

[00:21:32.867] Kent Bye: What do you make of the two Qualcomm Snapdragon chips, if you have any information or any guesses about what they might be using?

[00:21:40.754] Anshel Sag: So I've been trying to gather what chipset it is. From what I can tell, I don't know if it's actually the current-generation stuff. So it might not even be an AR2 or an AR1 chipset, but it's clearly that disaggregated architecture that they've been talking about, which is how you keep latency down, which is, I think, how they're able to achieve... I don't remember what the number was. 13 milliseconds motion-to-photon latency. Which is close to the 10-millisecond target. I think that's good enough. But I think they were only able to achieve that because they had that split architecture. Hopefully, the next generation will have an AR1 or AR2 chipset, and that will help with battery life and performance.

[00:22:21.525] Kent Bye: I'm somebody who has been much more interested in VR than AR for a number of different reasons. But I think one of the reasons is that I get a little bit of social anxiety, especially when it comes to bystander privacy and infringing on other people's privacy. So with smart glasses, it's something where I haven't necessarily been an early adopter of the Meta Ray-Ban smart glasses. But I know that you wrote up a really compelling review, and you've been a huge fan. And so maybe you could just give a bit of a pitch: what is it about the Meta Ray-Ban smart glasses that you enjoy so much?

[00:22:52.413] Anshel Sag: I would say, you know, one, they are in a form factor that doesn't change the way I use them. You know, they are glasses first and foremost, but also, you know, they fit into my life. When I wear them every day, when I drive, I can hear text messages in my ears. I can respond to text messages without taking my phone out. When I'm walking the dog, I can listen to a podcast or music and not lose spatial awareness. I can respond to my wife's texts while I'm walking the dog, keeping my phone in my pocket. I can listen to music. I can take pictures while both my hands are occupied. I can take video. The stabilization is really good, so the quality is there. They've improved the image quality, so sometimes when you look at a thumbnail, you're not sure if it's from a phone or the glasses. And yeah, I think they've done a really good job of slipping into an existing form factor and having a good experience, both in terms of weight, comfort, sound quality, and image quality, and not being too expensive. And also, they fit into this glasses case that looks like a normal glasses case, so they didn't go with a big, chunky design. And I just think that they nailed it on so many different aspects. Admittedly, I wish I didn't have to have another app to run them, but I can see why having it separate from their other apps maybe helps them on the regulatory front, and also maybe potentially adds more interconnected apps down the road if they want to do that. But yeah, I just really like using them because they're comfortable and they're easy to use. You know, I just pop them on like I would normal glasses. And, you know, when we walked into the experience, my mental reaction was actually to use the Meta hot word to take a picture of the inside of the room, but I was wearing the wrong glasses for that. So I thought that was kind of funny. It's already augmented my behavior, where if my hands are occupied or I just want to take a picture of something from my perspective, I'm already saying, "Hey Meta, take a picture," which I thought was kind of funny. I caught myself there.

[00:24:57.289] Kent Bye: Do you use any of the AI features, like conversational AI or asking questions with ChatGPT, or any other image-based AI features?

[00:25:07.719] Anshel Sag: I have used them here and there. I would say it's not as good when I query it for things like the weather or sports. It's like 50-50 in terms of accuracy there, so I don't love it that much. But I would say the best use of the AI is just voice-to-text. Like, when I send text messages, they're flawless. The actual transcription is really good. And because the transcription is good, I trust using it, and then I just keep using it for more voice stuff because it's accurate.

[00:25:37.423] Kent Bye: So next week is Meta Connect. Do you have any predictions or anything that you're expecting for next week?

[00:25:43.877] Anshel Sag: I'm going to say that I can't know for sure what they're going to announce. I have a good idea, and I think we're going to see some interesting stuff on the AR side, I hope. But, you know, it's hard to know, when it comes to the competitive landscape, how much they want to disclose. And, yeah, I think we're probably going to see some Quest stuff as well, potentially that 3S that everybody's been rumoring about. And, yeah, I think we're going to see a lot more teasers. I'm not sure. The AR side seems very unclear, but they've already teased quite a bit, so I think we're going to get more teasers.

[00:26:20.535] Kent Bye: I'm curious to hear some of your thoughts on what's happening with Apple, because they've been a new entrant into the space. And I think one of the big things with XR in general is just retention and keeping people in the headset. And my question is whether or not the headsets that are out there are being used at a level that Apple is happy with, or if there's any sense of how successful it's been relative to their larger vision. My sense is that they've kind of treated it like any other new platform launch, like a new iPhone, without recognizing the unique qualities of emerging media and technology and the need to be a little more boots-on-the-ground and engaging with the community, at least from my own interactions with them. But I'd love to hear some of your thoughts on where they're at with their ecosystem, coming into XR now.

[00:26:59.827] Anshel Sag: I think they're still very early. I think they understand that a new platform is going to be pretty slow to grow. I think what I have appreciated from them is that they've definitely invested in content, specifically video content that's immersive. You know, that's a very large investment to make. But also, I think what they're doing is actually sparking the industry to start making those kinds of content again, to satisfy Apple but also other content platforms. And I think, you know, I just saw that there was a big hackathon this weekend, and it looked like there were a lot of participants and that it was a good experience. I don't think that was officially done by Apple, but it seems like there is a rich ecosystem of developers who want to develop for it. The problem is it's not that big yet. And also, you know, the install base was never meant to be big. I've seen people say it's going to be small and that it's going to be big; there have been overshoots and undershoots. The reality is, I really think it sold about what they expected it would. I mean, they're not out of the realm of understanding that a $3,500 headset's not going to move that many units, even if it is Apple. And I think that with time, you know, there will be another platform. There has to be. And I've always believed that the Vision Pro is a prosumer development device. So, you know, it's like, if you have the money, fine, you can have it. But really, this is for developers to figure out how to build for visionOS and to prepare their applications for the AR future. But to be fair, it is a VR headset, because my definition of a VR headset is that if you turn it off, you can no longer see outside. And that's one of my biggest problems with the Vision Pro: if it dies and you put it on, you're not going to see anything. And, you know, I wish I used my Vision Pro more, but I'm barely touching it once a week. And I wish there was more sticking power for me to stay and keep using it. But truthfully, I don't really need to use it often, because I don't really feel there's an experience that's drawing me in. Also, I have, you know, an eight-month-old at home, so I can't really split my time too much. But even when she's asleep... you know, I grabbed the Vision Pro last night; before that was maybe a week or two ago. So I just think that there need to be more applications that draw people in, and we're just not there yet.

[00:29:20.247] Kent Bye: Yeah, I guess as we start to wrap up and have some concluding thoughts: you know, one of the things that I think is another big takeaway here is the way that their messaging is very similar to Niantic's, really emphasizing the kind of in-real-life physical reality and then creating this contrast, almost explicitly poking fun at people who are in VR, playing to this stereotype that anybody in VR is disconnected and dissociated and not connected to the world around them, and so having this real emphasis on co-located, physically based AR experiences. So there are things around that messaging that I feel are creating this kind of false bifurcation that doesn't need to be as antagonistic against VR. Like, it's all a spectrum in my mind. Yeah. Apple's taken a similar approach with spatial computing, avoiding all the XR language altogether. Niantic's also on that sort of "in physical reality" kick. But I'd love to hear any of your final thoughts on that, or anything else that you've seen here.

[00:30:14.576] Anshel Sag: I mean, I agree with you wholly that it is a spectrum. There is no need to poke holes. I almost feel like that's them helping themselves justify their approach. The reality is, I believe AR has to be world-scale. It needs to be outside in the real world, and these glasses are designed to be outdoors. So, you know, they want to try to differentiate themselves, and I think that's really what they're trying to achieve with that messaging. But I think ultimately, AR has to be a shared experience. It has to be outdoors. Because if you do it indoors and it's just an individual experience, it's not that far off from being in VR. So I think AR should be more engaging and personal and location-aware. And I think the location awareness is still lost on a lot of people. But if you look at Niantic, they're very much about the location awareness. And I think we're going to see Snap talk more about location awareness as well, because that's where a lot of the monetization can come from. That's also where you can connect with people and have common shared experiences, like we had in that room.

[00:31:18.418] Kent Bye: Great. And finally, what do you think the ultimate potential of all these immersive computing, spatial computing devices might be and what they might be able to enable?

[00:31:28.103] Anshel Sag: I think ultimately what they'll do is help us be more connected to each other. You know, a lot of us right now have our phones, and we're connecting with each other through our phones, but we're not really looking at each other. And I think that AR can actually enable us to put our phones down and interact with each other face to face, and to experience games and apps in a shared common space, seeing one another, as opposed to staring down at our phones and sharing a digital medium that way. So I think it's almost a net positive in the sense that so many of us are already absorbed in our phones. And I feel like if this takes us away from our phones and moves us towards interacting with one another more at a personal level, that's actually a net positive.

[00:32:13.646] Kent Bye: Yeah, and certainly having some community rituals that they're showing here as demos. But yeah, any other final thoughts or anything else left unsaid that you'd like to say to the broader immersive community?

[00:32:24.232] Anshel Sag: I think that, you know, I've been tracking our space for quite some time, and it definitely feels like AR kind of had a slow spot these last few years. But I do feel like with the Vision Pro, with the Spectacles, and other things that might be coming down the pipe, we might actually be starting to see the industry pick up a little bit. With a lot of things that I had seen in the last few years, everyone was targeting 2024, and the whole ecosystem was getting ready to launch in 2024. And now it feels like maybe that's going to be 2025. So the delays are still happening, unfortunately. So nobody really knows when things are going to be mainstream. It would be foolish to lock that down, because those dates have constantly shifted. But I would say that if you're not in the space, continue to follow it. And if you are in the space, be patient. Don't overcommit, and try to focus on making things that people want to experience again and again, because ultimately I feel like if it's just low-hanging fruit, anyone can do that.

[00:33:31.127] Kent Bye: Sage advice: be patient. And lots of great insights on the industry, reflecting on where we're at and where we might be going. So Anshel, thanks again for joining me today to help break down what we just saw here with the Snap Spectacles demo, but also where the state of the XR industry is at and where it's going here in the future. So thank you. Thank you very much. Thanks again for listening to this episode of the Voices of VR podcast. It's a part of my larger series doing a deep dive into both the announcements around the Snap Spectacles as well as the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also the ecosystem of developers that they've been able to cultivate. And so for me, the most valuable information comes from the independent artists and creators and developers who are at the front lines of pushing the edges of what this technology can do, and from listening to what their dreams and aspirations are for where this technology is going to go in the future. So I feel like that's a little bit of a different approach than what anybody else is doing. But it also takes a lot of time and energy to go to these places, to do these interviews, and to put it all together in this type of production. So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month will go a long way toward helping me sustain this type of coverage. And if you could give more, $10 or $20 or $50 a month, that has also been a huge help in allowing me to continue to bring this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.