I interviewed Anshel Sag, Principal Analyst at Moor Insights and Strategy, at Meta Connect 2024. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So I'm continuing my coverage of MetaConnect 2024. Today's episode is with Anshel Sag, who is a principal analyst at Moor Insights and Strategy. I actually had a chance to catch up with him the week before at the Snap Partner Summit, where he had just tried out the Snap Spectacles with me within a collective AR experience by Es Devlin called The Council. And then he had a chance to try out the Meta Orion glasses. So we get a little bit of his take on the Orion glasses, as well as his broader reflections on what's happening in the XR industry, and some of his reflections on all the different things that are happening in the context of Meta. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Anshel happened on Wednesday, September 25th, 2024. So with that, let's go ahead and dive right in.
[00:01:06.532] Anshel Sag: I'm Anshel Sag, principal analyst at Moor Insights and Strategy, and I cover the XR space broadly. I would say I'm a little bit more focused on the hardware side, but I also cover a lot of the software. And mostly I cover the software because it drives the hardware. Nobody buys hardware just to buy hardware. So I'm always looking at the experiences and understanding where the market's at in terms of maturity, where the technical capabilities are, and just making sure that people have realistic expectations for where the industry is today and where it might go. You know, not overhype things, but also maybe not be too doom and gloom. So I like to keep things in the middle.
[00:01:50.566] Kent Bye: Awesome. Well, we just had a chance to catch up and chat last week around all the news from Snap Spectacles. Now we're here again one week later at the MetaConnect, where you also had a chance to see the hottest ticket in town in terms of the Orion AR demo. But before we dive into that, I'd love to have you give a bit more context as to your background and your journey into this space.
[00:02:09.574] Anshel Sag: Yeah, so I started working in this space as a journalist, actually, when I had my own publication. My first introduction to AR was Qualcomm's Vuforia, if you remember that. That was, I think, around 2013. And then I kind of followed smartphone AR for the most part. There really wasn't much headset AR at that point. And the expectation was that it would eventually turn into glasses. But the realization was that it was very far down the road, you know, maybe 10 years down the road, but clearly we're more than 10 years down the road now. And I just think that when you look at my trajectory in this industry, it's been a lot of, you know, making sure that people understand that market sizing is overblown. I did a lot of that in 2015, 2016 with VR and AR, because I think a lot of people were kind of overvaluing the whole XR industry, saying 100 billion by 2020, if you remember that. I really find myself to be like the sanity check in a lot of ways, but I also like to get excited and evangelize about the industry for people who don't understand it. And I find myself to be a resource for the press to understand the business perspective and the market trends, but also to work with the companies to understand, you know, how the press might interpret things, how the public might think about it, and to help them navigate the landscape and maybe make the right decisions, using a third party as a, you know, independent observer and kind of like a person to bounce ideas off of.
[00:03:38.931] Kent Bye: So let's maybe start with what are some of your impressions from the keynote, all the announcements, and what are you taking away from today's announcements?
[00:03:46.570] Anshel Sag: I think it really comes down to three areas. It's Quest and mixed reality and virtual reality. It's AI and everything that Meta does in AI. And then obviously AR, and how AR is like a conduit for AI. I think on the Quest side, they're clearly showing they're building AAA-quality titles. They've got more content coming. They're leaning into fitness. I don't know if you noticed, there were a lot of fitness apps that were either announced or updated. And I think that when you look at what they're doing on AI, they're trying to democratize AI by making it more open, more accessible to more devices. You know, that 1 billion, 2 billion parameter model is going to pretty much run on anything. And they already pre-qualified it with ARM, Qualcomm, and MediaTek. So a huge chunk of the smartphone industry is already covered there. And I think when you look at what they're doing with AR, while it is maybe not consumer-ready yet, there are already a lot of really good ideas that they have in terms of prioritizing field of view and making sure that it's comfortable to wear as a standalone pair of glasses. Obviously, it still needs that external compute, but I think when you look at what's possible, that compute might not always be in your pocket. That compute might be on your desk, or that compute might be your phone if you want it to be, or it could be at the 5G edge, which T-Mobile and NVIDIA were talking about last week. So there's a lot going on with Meta, but I think they have a lot of right ideas. And I think them opening Horizon OS and making it more actually open is a net positive for the industry, because it makes Google have to try harder, but it also makes their platform much more accessible to more developers and OEMs like Asus and others who are part of their program.
[00:05:34.110] Kent Bye: Yeah, I had a chance to try out the Gaussian splat demo of the Hyperscape yesterday. And it was really quite impressive, because you're essentially going into these Gaussian splats where they're rendering it in real time, but it's all being done in cloud rendering on their servers, and it's being streamed to your device. But it looked really good, and it felt like Air Link type of technology of being able to send video remotely in this kind of cloud rendering of VR. It's still static scenes, and there's no real interactivity other than you navigating around, but they were able to do enough prediction as to where you're going to be looking and what different types of frames they need to render that it was super solid. And it feels like that type of remote compute is being fed into what's happening now with their AR glasses, the same type of distributed compute, edge compute, where you're able to render out everything and offload all the big processing so you can start to have a form factor that doesn't look super big and bulky.
[00:06:32.662] Anshel Sag: Yeah, and I think if you look at Gaussian splats, those are such a computationally expensive thing to do that you just wouldn't want to run that on glasses. Rather, you'd want to render the final product. And obviously, Meta has a lot of high-performance GPU compute, among the most in the world. So I think that they definitely have the opportunity to make this work. And while I haven't tried that Gaussian splat demo, I've heard that it's like a slightly higher quality version of Varjo's Teleport, which is also Gaussian splat and very similar. And I think it's good that we have more than one company doing this. Hopefully we see even more companies going after this. I guess Snap was kind of doing splats, but I think they were doing it in partnership with Niantic, with Scaniverse, I think. But nevertheless, I think it's really good because it's more content for the industry. And I think it also helps to create more interesting and immersive environments. And potentially, you know, on the Snap side, they were using splats to actually make things that you could then put on in AR, which I thought was kind of interesting and follows their whole ethos of making AR fun.
[00:07:43.473] Kent Bye: Yeah, I'd love to hear some of your thoughts in terms of the pricing changes with the Quest 3S. Someone mentioned that this is much more like an Apple model now, like where we have 128 at $299, and then you have 256 at $399. And if you want the 512, then you have to upgrade up to a Quest 3, which has everything else. the pancake lenses rather than the Fresnel lenses. And so it seems like that now they're adopting much more of an Apple model now where you can have a range of different options. But I'd love to hear some of your thoughts of what your sense is in terms of they had the Quest 2. It was super popular. They launched the Quest 3. They probably looked at the numbers and were like, this is not really growing as much as we had with the price point previously. So it seems like there was some motivation there internally to return to that type of pricing structure with some of those trade-offs of quality and comfort that you had with the previous version.
[00:08:35.560] Anshel Sag: Yeah, I think it's definitely more Apple-like in approach. But I would also say that I think it's more that they've realized that there's a naturally bigger market at $299. And, you know, I don't think they want to split the market between Quest 2 and Quest 3, because the Quest 2 is just not as capable a platform. So I think they wanted to make sure that the Quest 3S and Quest 3 both had mixed reality and both had the same processor. That way the developers' experience is better, and they have a bigger install base. And the people who are paying $299 are getting a better experience than they would have with a Quest 2. And I think using the storage as a way to slide people up that scale is simple, but it also kind of addresses the fact that people who are going for the minimum storage, minimum spec are going to be more price sensitive than people who want more storage and maybe want the higher spec. So it works in terms of how the consumer perceives the product and what their propensity is to spend more for a better experience. And I just think that developers want continuity, and for them, storage is not really that big of a deal. It's really about what the processor and the resolution are and how they build to that. And keeping that more simple and kind of unified across Quest 3 and Quest 3S, I think, is a really good move, because it's going to make developers' lives easier, which in turn makes it more likely for there to be better and more content on the platform.
[00:10:01.462] Kent Bye: Yeah, I had a chance to demo the Quest 3S yesterday. And to me, it felt functionally equivalent. Like, the biggest difference for me was that the IPD adjustment was more click-based, rather than the finely adjustable one in the Quest 3. But also, they demoed it with an accessory that had a fabric that you could see through. That's not the one it's shipping with, which they did last year as well. So I kind of prefer to see what it looks like out of the box; show me that. But they are adding these other upgrades that give a little bit better demo experience. So it's a little bit difficult for me to know what the final form factor is going to be. But all the other things feel pretty much functionally equivalent to the Quest 3, which I think is a good thing.
[00:10:40.307] Anshel Sag: Yeah, and I think with time, we'll see that this was the right move. It'll just be a question of how do they address the next generation with a Quest 4? How do they keep that continuity going? But I think we still have some time to figure that out.
[00:10:56.636] Kent Bye: So moving on to the Orion demo, this is like the hottest ticket in town here at MetaConnect. So you had a chance to see it ahead of the conference. So love to hear some of your feedback or experiences of what you experienced with the Orion AR demo.
[00:11:09.096] Anshel Sag: It was all the demos that they've talked about. So I got to play Pong. I did the space shooter thing. I did video conferencing, you know, chat, Messenger. I also tried using the Instagram app. For me, I thought the user interactions with someone who was remote were really good. Some of the 2D avatars were really impressive. I also enjoyed just playing Pong with someone, and the latency wasn't too bad. I will say the single-player experience had better latency. I think for me, the most immersive was when I was playing that space shooter game, because it combined the neural wristband with eye tracking, with spatial audio, and the wide field of view. So it just kind of all came together in a really comprehensive, engaging experience that I've never had in any AR headset to this point. Having that field of view also enabled multiple applications and actually multitasking, which in AR really isn't a thing, because whatever you're looking at in most AR headsets is what you're doing. And if you turn your head, it's no longer in your field of view. So I think the fundamental thing is the field of view really changes how you use AR. I think the one thing I didn't experience was outdoor use. And for me, I would love to see how it performs outdoors. I think maybe it needs to be even brighter for that, but indoors it was great. And yeah, I just think they're doing a really great job of combining that neural wristband and that neural interface with voice, with eyes, and with good sound to make it a really comprehensive platform. And that's why I think it's the best AR I've tried so far. But it's clear that, you know, the resolution isn't quite there yet. They're already working on improving that. They've already got some prototypes with double the resolution. So I think they knew that that was the next natural step for the improvement of it. And I think we're only a few iterations away from it potentially being a consumer product.
[00:13:07.612] Kent Bye: Yeah, I heard that was around like 13 pixels per degree or something around that. It was like pretty low.
[00:13:11.980] Anshel Sag: Yeah, and I think they're already in the 20s, I believe, if not 30. So I think they've clearly addressed that issue, because with that wider field of view, you obviously sacrifice resolution in a lot of ways. And they've clearly found a display, I assume, with a much higher resolution, which would help solve that problem. But we've been dealing with this FOV-resolution problem for so long that it's nice to see that at least FOV, and potentially resolution, could be solved. It's just a question of cost and scale and yield. I think I heard from the Alex Heath article that they definitely had some yield issues, which drove up the cost of the headset. Well, the glasses; they would hate for me to say headset.
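[Editor's note: the FOV-resolution tradeoff discussed above can be sketched with a quick back-of-the-envelope calculation. Angular resolution in pixels per degree (PPD) is roughly the display's horizontal pixel count divided by the horizontal field of view, so at a fixed panel resolution, widening the FOV lowers PPD. The pixel counts below are hypothetical, chosen purely to illustrate the relationship, not Orion's actual specs.]

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Approximate angular resolution, ignoring lens distortion and
    non-uniform pixel distribution across the eyebox."""
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical numbers: the same 1,000-pixel-wide display spread over
# a ~70-degree FOV (Orion-class) versus a 30-degree FOV (narrow AR).
wide = pixels_per_degree(1000, 70)    # roughly 14 PPD, in the range quoted above
narrow = pixels_per_degree(1000, 30)  # roughly 33 PPD at the same pixel count
print(round(wide, 1), round(narrow, 1))
```

This is why doubling resolution, as the prototypes mentioned above reportedly do, is the natural lever once the FOV is fixed: PPD scales linearly with pixel count.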
[00:13:56.741] Kent Bye: So yeah, talking to Norm Chan, he said the price per pair of glasses was around $10,000. I'm not sure if that's including the wristband at all. I think one thing I'd just comment on is that Meta tried this whole thing of branching off from the normal Quest line with the Quest Pro. And I think they found that there wasn't necessarily a good product-market fit for that, where there were a lot of more experimental things, like eye tracking and other things that they were playing with, that would hopefully have this enterprise market fit where they'd be able to have these higher-end headsets that would allow them to prototype and experiment with all these different technologies. But I'm not even sure if we'll ever see another Quest Pro, because the first one didn't do all that great. And the other thing is that now, with this headset being so expensive, it definitely seems like a pie-in-the-sky prototype. It feels like, to me at least, they would need to have at least a number of different interim steps to prove it out and make it affordable, because having eye tracking, having the neural wristband, having everything, it seems like that would still be at a price range that would be way out of most people's reach.
[00:15:05.362] Anshel Sag: Yeah, I think price is obviously going to be a challenge. You know, Meta is not running a business where they're losing a ton of money on every consumer product anymore. But I do believe that there will be a higher cost to begin with regardless. And I think that's probably okay because they probably don't want to produce too many of those either. And I think Snap is a good way to look at how that might happen. with a developer-only release that's limited to a monthly payment that's also a minimum of a year.
[00:15:36.157] Kent Bye: It is surprising to me that they're not losing money on the hardware anymore, because I know they've wanted to get it at an affordable price by essentially subsidizing it. So are you looking at public numbers from what they report now from Reality Labs in their quarterly reports for their public stock? Or what's the basis on which they're now not losing money on all the hardware they produce?
[00:15:55.626] Anshel Sag: Well, I think when you look at what they're doing, they're probably still losing money on hardware, but not to the degree they were before, when they first launched the original Quest 2 at $299. I think they're probably losing less than they were before if you look at their margins. But the problem is that they roll all of their R&D operations into that. So it's in the billions of dollars. So they have a billion dollars of sales, but they have $5 billion in losses. So we don't actually know what their margins are, and I don't think they're going to really tell us. But I can tell that when they launched the Quest 3 at $499, they could have easily done the Quest 3 at $299. But that would have also probably caused them to lose significantly more money. And I think they're just trying to price the products right so that they're not totally blowing themselves out.
[00:16:45.382] Kent Bye: We had a chance to talk a little bit about the Meta Ray-Bans at the Spectacles announcement. But now they haven't announced any new hardware stuff; they've announced, I guess, more AI integrations that are coming. So I'd love to hear some of your thoughts on what you're excited about, or what you think is going to be new and different in terms of the Ray-Ban Meta smart glasses.
[00:17:02.354] Anshel Sag: I really think they're going to use it to push their AI and improve its capabilities. But I did get really excited about the accessibility aspect of it, how you can call somebody and they can see what you're seeing from the glasses and tell you what you're looking at. I feel like that's a baby step towards the AI doing that. I think some people are already doing that in some ways. You know, I actually watched someone on YouTube talk about how she's blind and is already using the Meta Ray-Bans to get around every day. And she uses them as a dash cam, too, so she actually can document all the crazy things that happen to her while she's walking down the street. But I think for me, accessibility is a big one. And when they were talking about how they're going to continue to do translation and stuff like that, I think translation is going to be a big one. I was waiting for that to come home to roost, because it has really good microphones and it already has Meta AI. So it felt like it was one of those things where, of course they should have done that. But I think translation will be a big one, too. In my experience, the voice-to-text is really good. So if it can capture good quality voice-to-text, then it should be able to translate pretty easily. So I'm excited to try that out and see how that works. But that live demo on stage, I think, was pretty good. You know, it got a little bit messed up there. But overall, I think they're moving in the right direction. And I think it's really helping them set the stage for AR glasses, so that all the, you know, natural interactions have already been worked through. And now it's just a more powerful platform with displays.
[00:18:40.743] Kent Bye: In terms of looking at the AI landscape and all their announcements, Chris Cox got up during the developer keynote and shared the wide variety of different things they've contributed open source-wise over the years. And so my take, at least, is that Meta and all these companies can open source certain aspects of the AI algorithms because they still have the data. And the data are the most important part for actually making it useful. Meta is in a unique space where the form factor for really great wearable AI is actually the glasses form factor with AR. These other upstart AI wearable devices that have been trying to integrate stuff were on these self-contained devices, which doesn't seem to make as much sense as just having everything on the glasses. So Meta seems to be in a unique position of being at this sweet spot of having not only all of the XR, AR, and VR lines of stuff that they're doing, but also actually having an output for where they're going to put the AI, and actually having users want to buy their hardware and products and services because they have a clear use and utility for it. But I'd love to hear your thoughts on all that.
[00:19:43.174] Anshel Sag: Yeah, so I think them open sourcing their AI has been a way for them to, one, make sure that the hardware in the industry is optimized for their AI so that it runs faster and cheaper, but also to push the industry in the direction they want to go. And it seems like they've done that successfully. So with them offering these 1 billion, 2 billion parameter models that run on pretty much anything but are still very accurate, they're offering something to the industry that it can latch on to, which then in a way locks the industry into Meta's ecosystem for AI, even though it's open source. But it kind of makes them want to continue to develop on Meta's AI platform and to continue to use the different models that they have, whether it's for language or for vision, you know, they're developing all kinds of different models, or anything else. I just think that they're moving in the right direction, and their open approach seems to be working well.
[00:20:39.726] Kent Bye: So what are the types of stuff that you're going to be looking at in the market over the next six months to kind of see, like we had all these announcements, what are you going to be tracking in order to like see how it's actually taking root?
[00:20:51.092] Anshel Sag: For me, it's going to be actual experiences that are using these platforms and what developers are able to actually accomplish with them. Obviously, with Meta, they're not actually making this available to developers, so it'll be interesting to see how they publicly iterate on things and what updates they'll provide. I think on the Ray-Bans, it's just going to be a continuous rollout, and I think that's going to be the most exciting, just because it feels like it's something that they can continue to improve upon through software, and people will be happier with the experience. And then on the VR side, just more content, more content, more content.
[00:21:26.594] Kent Bye: Great. And finally, what do you think the ultimate potential of spatial computing might be and what it might be able to enable?
[00:21:33.962] Anshel Sag: For me, it's really about connecting people together and making sure that people are able to connect with one another regardless of where they are, and to be able to experience things together in a more social way than just over 2D video.
[00:21:52.657] Kent Bye: Anything else that's left unsaid or any final thoughts you'd like to share with the broader immersive community?
[00:21:57.541] Anshel Sag: I hope we're done with AR glasses for the next few weeks.
[00:22:01.758] Kent Bye: Yeah, I don't have anything on my calendar at least, so yeah.
[00:22:04.681] Anshel Sag: Yeah, me either, but you never know.
[00:22:08.041] Kent Bye: Awesome. Well, Anshel, it's great to be able to catch up with you last week and this week to get this continuing evolution. It feels like there's a little bit of a zeitgeist moment right now, honestly, with what's happening with these AR glasses. We've got two big companies that have put their flag in the sand and have two completely different visions. At least we have, like, dev kits for what's happening with Spectacles, and now with Meta, it sounds like they're going to be making it available to maybe some developers who may come in and start to develop and prototype. But, yeah, it feels like we're kind of entering a new phase, and all these things, like you said, are coming together, with the VR, AR, and AI kind of all mixed together in these products. So yeah, I very much appreciate all your different insights and analysis again. And yeah, looking forward to seeing you at the next festival, whatever that ends up being. So yeah, thanks again for joining me to help break it all down.
[00:22:50.306] Anshel Sag: Thanks a lot. And I'll probably see you at CES.
[00:22:52.844] Kent Bye: Thanks again for listening to the Voices of VR podcast, and I would like to invite you to join me on my Patreon. I've been doing the Voices of VR for over 10 years, and it's always been a little bit more of like a weird art project. I think of myself as like a knowledge artist, so I'm much more of an artist than a business person. But at the end of the day, I need to make this more of a sustainable venture. Just $5 or $10 a month would make a really big difference. I'm trying to reach $2,000 a month or $3,000 a month right now. I'm at $1,000 a month, which means that's my primary income. And I just need to get it to a sustainable level just to even continue this oral history art project that I've been doing for the last decade. And if you find value in it, then please do consider joining me on the Patreon at patreon.com slash voices of VR. Thanks for listening.