#1462: XR Industry Reality Checks with Cix Liv and Turning Your Phone into an AI-Driven Exercise Routine

I interviewed Cix Liv at the Snap Lens Fest about the Snap Spectacles, his reality-check reflections on the XR industry, and his latest fitness startup, the Vibe Fitness Game. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different announcements around Snap Spectacles, as well as the Snap ecosystem, today's episode is with Cix Liv, who is an XR developer who's been involved with a number of different VR and mixed reality startups over the last eight years. And he just launched, on September 19th, a brand new phone-based, AI-driven application, sort of like Supernatural, but using your phone to do AI motion tracking for a boxing application that's called the Vibe Fitness Game. So Cix is someone who actually ran into a lot of situations where he was basically trying to develop the same types of applications that Meta was interested in developing as first-party apps, ended up being blocked from the store in different ways, and basically went through a lot of different claims of anti-competitive behaviors with Meta. He has also been moving more into augmented reality and is interested in the future of augmented reality fitness and gaming, but also just has a lot of deep insight into what's happening in the XR industry, with some reality-check type of comments about what it takes to really cultivate some of these different XR ecosystems, based upon what he's seen in XR for the past decade, but also comparing what's happening with Apple versus Qualcomm in terms of the silicon that is being developed. So the chips that are within the context of the Snap Spectacles are some sort of unannounced Snapdragon Qualcomm chip, likely the next generation that just hasn't been announced yet. I was told by Daniel Wagner that no one else is using the chip. So it's a brand new chip.
And then also Apple, which is doing the Apple Vision Pro, and whose latest A18 has really amazing performance that's kind of equivalent to an M1. And so Cix is saying, look, it's really a battle at the silicon level between Apple and Qualcomm. I actually think there's a lot more to it in terms of the software ecosystem and developer ecosystem. I don't think it just comes down to the silicon; I think there are a lot of other factors in actually cultivating an ecosystem, just because there are lots of different AR and VR headsets out there, and there's a big difference in the quality of experiences that are created on each of those different platforms. So I think it actually is everything tied together. If there's anything that Snap really has going for it, it's that they've got one of the best developer ecosystems out there. I think it's just more a matter of how to turn that into a market and get everyone paid, which is one of the concerns that Cix also brings up in this conversation. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Cix happened on Wednesday, September 18th, 2024. So with that, let's go ahead and dive right in.

[00:02:50.683] Cix Liv: So my name is Cix. I've been in XR for almost 10 years now, since the DK1, DK2 days. I've built several different apps in AR and VR. My general thesis has always been that headsets are kind of the headphones of the eyes, and so a lot of the use cases that were big with headphones will translate. I like to imagine that if you were to go back in time, when the big bulbous headphones first came out, and say that everybody in the gym is going to be wearing headphones in the next few decades, people would probably roll their eyes. And I think that's probably the part of the industry that interests me the most, which is embodied movement: how do we hijack our visual cortex to create these experiences where we have embodied movement? So I've created a few apps and experiences around that: a fitness tracker, an augmented reality sports app, and I'm also probably most known for an app called LIV, which allows you to see people inside of VR in kind of like real-time VFX. And the best content for that, the stuff that interested people, was where people were essentially dancing, dancing while playing VR.

[00:04:04.013] Kent Bye: Great. Maybe you could give a bit more context as to your background and all the disciplines that you're fusing together and your journey into the space.

[00:04:11.647] Cix Liv: So I originally got into it because I moved up to Silicon Valley, to San Francisco, and I kind of had a dream to start a tech startup someday. But at the time that I moved up to San Francisco, everybody was focused on mobile apps. It was kind of the Uber and Airbnb hype days of how do we commodify various different things. And it didn't really vibe with me. At the time, I wasn't a mobile app developer. I was more an IT guy. And VR interested me because VR was still very hacky. So you had to set up lighthouses, which were basically these infrared things that spun to measure the location of the headset. And there was a lot of hacky stuff still happening in the VR space. So they needed IT people to kind of set this up and figure it out. We were still running on desktops. Things were very hacked together back then. My existing experience was valuable there, where I wasn't as valuable in the mobile app space. And also, I remember when I first used some of the early VR headsets, I was really intrigued. But the real moment for me, and the reason why I've been doing this for almost 10 years now, was when I used the dev kit of the Vive and played Space Pirate Trainer. I remember this moment: a drone came up and shot a laser at me, and my mind, as a gamer for the prior decade or so, was thinking, how do I respond to this? I was thinking of pressing buttons. And then I realized my body was the controller. And it was like this massive light bulb in my mind that this is huge. This is connecting your body and gaming in a way that's never been seen before in human history. And it really fascinated me, because I believe that the lack of physicality is one of the biggest epidemics in not just the United States, but the large majority of the Western world.
And to see and feel that games can be so engrossing, that your physical body can essentially exercise, which is what sports used to be before kids stopped playing sports, really fascinated me as a gamer.

[00:06:31.638] Kent Bye: OK, well, I know I've been in the industry for over a decade now. And we both sort of started through the VR lens. And at some point, you got a lot more interested into the AR applications and starting to develop with an AR sports game that you had mentioned. But maybe talk about that turn that you went from VR to suddenly getting more and more interested into the potentialities for what was happening in the AR space.

[00:06:53.700] Cix Liv: Yeah, absolutely. So I really liked VR and the physical movement elements in the games that you played that were essentially you running around a room. But the big issue, and the funniest example I can use to describe this, is that there's a subreddit called VRtoER, so virtual reality to emergency room. And the reality is, not ironically, that a lot of people who play really physical VR games end up hurting themselves or hurting other people, or there's all these gaffe videos of people punching their TVs or accidentally kicking their dogs or punching someone in the face. And for me, part of the social apprehension people had about really moving their bodies around in VR was the fear that they would break something or hurt themselves or hurt other people. And because of that, people were kind of more physically reserved. And what interested me about AR, just because of what interested me in VR, was, you know, how do we do the physicality-type things. So I got more into AR under the notion of two different things. One was that we can do things you couldn't do while wearing a VR headset. Like, you can't run outside with a VR headset on; you'd run into something. But also, the idea of boxing in your living room: you feel much safer doing that if you can see your environment. So I started prototyping basically in that format, because now that we're in augmented reality, people could run around, and people could do a full punch without fear that they're going to break their hand or something like that. So, on a high level, what I was really interested in is, how does that format change how people can interact?

[00:08:45.413] Kent Bye: OK. And so we're here at the Snap Lens Fest. And then yesterday was a Snap Partner Summit. We've had a chance to try out the Snap Spectacles. And so maybe you could give some of your early thoughts of what you think of this platform that's coming from Snap.

[00:08:58.562] Cix Liv: I think the simplest way I can describe it, and that might be kind of hard for people who haven't been in the AR/VR space, but considering your demographic who listen to this, it would probably make the most sense, is that it's kind of like what Magic Leap should have been, I feel like, or at least their V1. It's a much better form factor. The original Spectacles dev kit would overheat very easily. This is basically a much more polished version of that. The FOV is much larger, but it's still very clear that it's not a large amount of your field of view. But, you know, I've moved from a severe optimist to someone a little bit more skeptical. And I think realistically, the cost of these glasses, supposedly, is $2,400, and I'm expected to pay $100 a month over two years. So that's kind of there.

[00:09:50.559] Kent Bye: I assume that they've said it was a minimum of one year, but I don't know if that's 1,200. But yeah, it's anywhere from 1,200 to 2,400. It's a subscription model where you're a developer and you're paying $100 a month, essentially. Yeah.

[00:10:01.430] Cix Liv: Yeah. Yeah, I think it's really difficult to... I don't understand that business model, because if you're building as a developer and you're expected to pay for essentially a developer kit, and there's no existing distribution, I think the only reason that business model makes sense is if you're a business selling to another business. And one of the things that it allows, probably the only thing that's really interesting to me about the glasses, is the fact that you can have your own computer vision models and also run LLMs on it. And I hate to be the AI guy, because it's getting so annoying to talk about all the time, but being able to run your own models actually solves a big problem. Going back in time to when I created the AR sports thing, we had it outside, but it only worked during sunrise and sunset, because at night it was too dark, and the Quest computer vision models for hand tracking, or even the controllers themselves, were only trained on low light, so essentially like in a living room. And the only time that outside reflects those types of visual conditions is during low-light periods when the sun is rising or setting. So that was obviously a huge problem, and we couldn't bring our own computer vision models, so we had to work within this very tight accommodation. Another thing that's really interesting with a custom computer vision model: say you're running a restaurant and you want to be able to identify ingredients and very quickly train people. There's a lot you can do if you can run your own computer vision models, because then you can identify objects and things in a scene. That would be really interesting to businesses. Another thing it allows you to do is run ChatGPT or these other AI models, so you have contextual understanding of what you're looking at.
This is also something that Meta is working on. But I think the big difference, the "why Snap versus Meta" of this, is that Meta will only have their own model. They will only allow Llama, and not your own custom models. So that, for me, is the most interesting thing about the glasses. I think realistically, the aesthetic, the price, and the battery life, which is only 45 minutes, are obviously three massive issues that need to be solved. And hardware is really, really, really hard. It sometimes takes five to 10 years to evolve something into a form factor. It may take them another half a decade or more to get to a form factor that the average person wants to wear.

[00:12:46.294] Kent Bye: And one of the things I know you've been pushing for, for a long, long time, is camera access. I know that at launch, camera access isn't available for the Snap Spectacles. However, they did say that in developer mode it is enabled or will be enabled. So it does sound like they will have more and more opportunities for the Snap Spectacles to have access to the camera, to do more interesting things with augmented reality technologies. Ironically, in some ways, Meta has been using the argument of privacy to not give access to the camera. But I'd love to hear some of your thoughts on why camera access is such an important thing for AR.

[00:13:23.021] Cix Liv: Yeah, I think without camera access, we can't call it augmented reality. It's as simple as that. I think when any type of person romanticizes the idea of what augmented reality is, a massive component of that, especially if you look historically at science fiction, is contextual awareness. Knowing what you're looking at, knowing what you're experiencing, and being able to communicate with that is, I would say, the most important part of augmented reality. And if it can't see what you're seeing, it's just ridiculous to think that it's going to be valuable. The term mixed reality, augmented reality, whatever we want to call it: if it doesn't have contextual awareness of what you're looking at, it's almost completely useless. The only thing it really provides is that you're not going to run into something. That's essentially the only value mixed reality or augmented reality has, a safety component, if it doesn't know what you're looking at or how you're interacting in space. And I think another big part of augmented reality and mixed reality that people miss, especially if you're playing a game, is that no one wants to see their living room. I don't want to see my living room. It's like watching a movie where the movie is partially in your living room; it just removes you from the immersion aspect of it. And so I think the only function of mixed reality for the Quest in its current state is essentially safety, or maybe you're playing with your friends in a living room, which I think is an extreme edge case that no one actually does. But yeah, I think in the majority of cases, people do not want to see their living room. And until we have contextual awareness of our environment, mixed reality and augmented reality are just useless.

[00:15:12.844] Kent Bye: Well, with the Snap Spectacles, I guess one of the value propositions that's unique and different from any other platform is that it's optimized more for outdoor experiences and for emergent social dynamics and experiences. To me, that still feels a little bit more speculative, in the sense that you have to live in a city where there are other people to go off and do these kinds of social experiments with the technology. It'll probably have to hit a critical mass before you start to see some really interesting experiments, maybe some location-based entertainment where you're specifically going to a place that already has the hardware, and you just go there with your friends and get to experience it. But I'd love to hear some of your thoughts and reflections on the value proposition for what is unique with this platform. There are a lot of trade-offs they made to optimize it for outdoor use and for these kinds of more connected experiences. And so, yeah, I'd just love to hear your early thoughts on where it's at now and where it might go.

[00:16:06.218] Cix Liv: I'm going to go back to what we were talking about before we started this conversation, which is that there's basically a minimalist augmented reality, a maximalist augmented reality, and then a fully-jacked-into-the-metaverse reality, I guess you could say. And I think Snap is somewhere in the middle. On one extreme you have the Ray-Bans: they don't have any visuals, but you can take pictures and contextually understand your environment. On the other extreme, we have the Vision Pro and the Quest, where you're fully jacked in; it's completely replacing your vision with pixels, basically. And these current glasses are somewhere in between. Just to be honest, and not trying to upset Snap here, I don't believe in the mid-level. I don't think it makes any sense. I think that if I'm talking to you and you're seeing a bunch of visuals in front of your eyes, the human brain cannot look at two things at the same time. And even if you hypothesize all the different social dynamics, if you're staring at something else instead of looking at me, I will not like you. It's as simple as that. I would think you're annoying, and I would think you have a very low emotional intelligence if you think it's appropriate to stare at a Pokemon while we're talking, right? And I think people have to be honest about the fact that, when it comes to how your eyes and your visual cortex work, you can only look at one thing at a given time. Compare that to, for example, music: you could go to a concert and hear music and still talk to your friends. It wouldn't be great, you'd have to yell over the music, right? But at least you can hear two different forms of sound at a given time, whereas you can't look at two different things at the same time.
You know, those are kind of the physical reasons I think it won't work. But as for the social reasons: I just had a thing happen to me yesterday. Of course, I'm at a convention where most people are nerds. And someone was listening to a message through their Ray-Bans or something while they were talking to me. And they just completely stopped talking to me. And I was like, can they not fucking hear? I swear to God, I thought they were deaf for a second, like they had lost their hearing. And they were like, oh, I just got a message from work. And it's just like... Even then, I'm like, what are they doing? I'm in the middle of a conversation. I think they started getting a message that was autoplaying into their ear, like, you got a text message from your boss. I didn't ask them, but they said, oh, I'm sorry, I'm sorry, you know, I just got a message. And I was sitting there like, that's kind of fucking annoying, right? So yeah, I think that if we remove the technology and stop talking about what we think the future of the space will be, we have to return to what will be socially appropriate. And I think we can see the evolution of phones in a similar way: when people first got smartphones, they had these custom ringtones, and everybody was super annoyed by hearing ringtones all the time. So then people stopped doing ringtones. Now they have notification sounds going off all the time. But if you're hanging out with someone and they have notification sounds going off all the time, you're still annoyed by them, right? And I think we're going to have to figure out what is going to be socially acceptable. But to summarize my feelings on that: I think you'll either be completely jacked into the metaverse through something like the Vision Pro or Quest experience, where your entire vision is being replaced.
Essentially, you're in your living room, but you're actually mentally somewhere else. Or there's the minimalist perspective, where you're walking around and it's giving you, say, directions on where to go, or translating things in real time. I think that's a really cool use case. Say you're in another country and you don't understand what the person talking to you is saying: real-time translation, extremely valuable. But I don't think it's going to be a lot of visual things, because you can't look at two things at the same time. I can't walk around a street and play Pokemon Go; I'd get hit by a car, right? So yeah, that's kind of my perspective there.

[00:20:10.708] Kent Bye: Yeah, it reminds me of when I talked to Robert Scoble back at Oculus Connect, where he was saying that he was kind of the poster boy for the Glasshole effect with Google Glass. And of the backlash that came from that, one of the things he said was that what he thought was actually violating the social contract was the eye contact, and the ways that these devices are disrupting being able to have that eye contact. So from my experience yesterday of seeing all the different demos, there was one piece by Es Devlin called The Council. Did you have a chance to see that one?

[00:20:41.038] Cix Liv: Yeah, I do. The one with the planets in the middle of the sphere? OK.

[00:20:44.941] Kent Bye: Yeah, so I feel like that is more of a location-based context where everybody is all wearing glasses. And so it's more of a shared environment where everybody is in it. I think it can make sense. And when there is this asymmetry of someone's in the augmented experience and someone's not, then you create this boundary and barrier that I think does have some social, what's the norms around that from a social perspective. So I do think that there is a location-based entertainment use case where you can actually bring people together. It's more of like setting the invitation for people to all be on the same level. But yeah, I don't know if you have any thoughts on that.

[00:21:20.757] Cix Liv: Yeah, but I think you could experience something like that in the maximalist perspective. Have you been to the Sphere in Vegas?

[00:21:27.964] Kent Bye: No, I'm about to go to Cosm tonight, though, so I'll see a mini version of that.

[00:21:31.467] Cix Liv: OK, so in that, you're kind of having a somewhat shared experience with another person, right? But it's more virtual reality, I guess, because you're fully in, and the vast majority of your peripheral vision is on that screen. Essentially, they're trying to emulate a VR headset without a headset, right? And I think if you're in a location-based setting, I don't think seeing someone's eyes is as important. If you're all having a shared experience, I think that's less important. But I understand what you mean. And there's this kind of irony behind using waveguides, because in order to make them work outside, you need to heavily polarize the glasses. And if you heavily polarize the glass, you can't see the person's eyes anyway. So you're basically already obscuring the person's eyes, because they're essentially wearing sunglasses. But one of the things where I will agree with you: there was an interesting thing we discovered at LIV, where we were creating content in which someone was in a VR headset. And there's a big issue with that, because when people consume video content, say on Twitch or on YouTube, seeing someone's face is a huge part of how you connect with a content creator. So if you don't see the person's face, it's very difficult to create an emotional connection. So oftentimes the content has a very high churn rate, because viewers are like, okay, I'm not connecting with this creator. I don't empathize with them. I don't see their face. I don't see the nuances that make them human. So because of that, I think the only content that was interesting to watch in VR, the mixed reality content where you see people in VR, was when they were using their body essentially as the communication.
So in the absence of seeing someone's face, you're connecting with their body movements, which was the only real way you could create a human connection to that person. So...

[00:23:31.072] Kent Bye: Well, yesterday I had a chance to talk to about half a dozen developers who've had early access to the Snap Spectacles to prototype, develop, and create a lot of the launch experiences. And every single one of them was funded by Snap to do that, in more of a no-strings-attached, grant type of way, just to facilitate and catalyze the ecosystem. And I know that's a bit of a stark contrast to what we've seen with Meta and the XR ecosystem that they've been cultivating, and we've had previous conversations around that. But to me, there's a part of what Snap's doing that's trying to put the developers first, whereas there's this famous memo that Mark Zuckerberg wrote where he was trying to lay out his XR strategy, and they were deliberating as to whether or not to buy Unity. And he said, OK, here's our strategy. Number one, we're going to build our own first-party apps. Number two, we're going to support an ecosystem and have systems and services, essentially profiting from the 30% tax. And then number three, we're going to get some money from the hardware, but we're going to flip what Apple does, which is to profit from the hardware, and not have that as the top priority. And so, yeah, I'd love to hear any thoughts or reflections on the developer ecosystem here at Snap, in contrast to what else you've seen happening within the XR ecosystem, with Meta in particular.

[00:24:42.645] Cix Liv: Yeah, so I think the reality is that both companies are still trying to figure it out. And I think we'd be disingenuous to ourselves if we were to say that VR and AR is a highly successful industry, that we don't have any learning to do, and that Meta and Snap have it all figured out. I mean, I was just joking with you before this interview that the number one VR game is a game where kids run on their hands pretending to be a monkey. That's the highest-grossing and most-used app in VR right now. So, Gorilla Tag, and I Am Cat, both of them? Yeah, well, that locomotion in general has been open-sourced, and so it's becoming kind of the baseline locomotion for a lot of games. But, you know, kind of like running on your hands instead of teleporting or whatever. And I think the reality is that both companies are still trying to figure it out, and at Snap, these grants are in effect trying to teach them what works and what doesn't. And I don't know if Snap is going to necessarily be a good Samaritan in an app store ecosystem, because they haven't had one; they haven't had an app store. The closest thing they have is lenses, for which, from what I understand, they do give out a non-trivial amount of their own revenue to creators. But I'm hard-pressed to assume that they're going to be 100% great to third-party developers when they haven't even had that yet. Personally, I think Meta has operated their app store atrociously badly from the very beginning. I've dealt with them personally on it. I think the reality is that it's very clear Meta doesn't really care about third-party developers.
Just recently, a few weeks ago, they even changed the app name on your mobile device, which was originally called the Oculus app, then became the Meta Quest app, and is now literally called Horizon, their first-party social layer. That's a clear indication that they have no real intention of having a robust app store ecosystem, when the app you use to set up the headset is literally named after their first-party social app.

[00:27:00.834] Kent Bye: Yeah, it seems like that memo that Mark Zuckerberg wrote that I referenced of having their first party app seems to be that they've wanted to sort of own and control the metaverse and that the types of support that they've been able to have, they have funded quite a lot of content. And they have kickstarting the ecosystem in a lot of ways. It's not like they haven't invested in the ecosystem. They certainly have. But I think the thing we're seeing with even when you search for apps, I saw a screenshot from Mark Schramm saying that when you were searching for Beat Saber, there would be these clones that were on Horizon Worlds above even the app there. So it feels like, again and again, they keep prioritizing their own first party approaches and still kind of want to own the metaverse. So I don't know. It just feels a little sad when they've been the major player, and they've certainly put out the best VR hardware that's consistently been technologically way better than anything else that's out there in the market. But it just feels like they've prioritized their own properties, the first party apps, rather than really actually cultivating an ecosystem. So to me, it gives me great sadness just to imagine where the XR industry would be if they would have actually put the developers and ecosystem first.

[00:28:05.103] Cix Liv: Yeah, I think the second we moved away from PC VR into this closed-platform approach, the writing was kind of on the wall. I think what Meta did is they basically said, okay, we're going to have an app store, we're going to learn about what works, and then we're going to just first-party all of it. We'll buy some of the things, and we'll kill or copy the others. But our goal here is to control the entire stack, from hardware to platform to OS. Now they're going to have Horizon OS. In fact, I would say they're least interested in the hardware. If you look at what they're doing recently with Horizon OS, they're basically going to provide the OS and then have the hardware makers create the headsets. I think, honestly, Meta has never wanted to do hardware. They did hardware because they had to, in order to control that distribution. And I think Meta's strategy with the quote-unquote metaverse has always been a response to the fact that they're kind of under the thumb of Apple. Apple passed those... I forget exactly what they're called. They had to do with background tracking on your device, and you had to allow or not allow it. But I know that Meta and Facebook feel like they're under the thumb of Google and Apple, and their whole entry into the hardware space and the XR space has been: we don't want to be under someone's thumb for the next generation of devices.

[00:29:34.076] Kent Bye: Nice. Well, I know next week is going to be MetaConnect, and we're expecting to hear some new announcements. And what are you expecting to hear out of next week?

[00:29:41.818] Cix Liv: So they're going to announce the 3S, basically the cheaper version of the Quest 3. It's ugly as sin. I don't know what they're thinking. I mean, even something as simple as just making it black, because it basically has the iPhone triple lenses on both sides. It looks like you're wearing a bug on your face. I don't know who's doing the industrial design at Meta or what they're thinking, because I think they're greatly underestimating the importance of something looking good, especially if they're trying to be the Apple of XR. I'm very confused by their aesthetic choices. But yeah, I think it's going to sell depending on the price point. I think it'll sell relatively decently. I don't think it's going to become a massively adopted consumer device unless there's some secret we're not aware of. But if it's much cheaper, it should increase market adoption. Unless I'm missing something and the 3S has some capability that doesn't exist on the current 3, I think it's mostly just a price drop. From my understanding, it's basically a cheap version of the 3, and I think that'll help the industry a little bit, but not a whole lot. At this point, Meta's probably more interested in their Ray-Bans in terms of mass consumer appeal. Honestly, that probably has more draw at this point, plus the fact that they can run their own models on those glasses, and there's not really going to be an expectation that third-party developers have access or whatever.
So my theory is that they'll put more emphasis on their glasses than they have in prior years. They see that as a kind of closed-ecosystem device that, by the numbers, is probably doing better in terms of daily retention. The demographic is also probably closer to the demographic they want, which is young adults or older adults, whereas the Quest demographic has skewed very young recently.

[00:31:46.511] Kent Bye: As we start to wrap up, I'd love to hear some of your thoughts on Apple entering into the ecosystem and how they start to play into everything. I know you mentioned retention; that's certainly something we don't have any data on, but just anecdotally, I question what their metrics are for judging whether or not it's successful. I'm happy that it's there, and I'm sure it's going to be feeding into all of their design at the operating-system level across their different platforms. But I'd love to hear your thoughts on how you contextualize what's happening with Apple and what they're doing with XR.

[00:32:18.621] Cix Liv: Apple has one thing where they're way ahead of everybody, and I think most people are greatly underestimating this advantage: they have the best chips in the world. The A18 is faster than an i9 processor, and it's on par with an M1 chip. People don't understand how unbelievably valuable it is that they're so far ahead in mobile chips while everybody else is using Qualcomm. So if you want to name the real battle, the real battle of XR is Qualcomm versus Apple chips. Everybody thinks it's Snap or Meta or whatever. Until those chips get to a level where they can power high fidelity at low power consumption, none of this other stuff matters. So here's why it's huge. To say it again, the iPhone 16 Pro has an A18 chip in it that's as fast as an i9 processor or an M1. That's a big deal because the generational leaps have been relatively consistent; by the way, it's been a 40% GPU increase per generation for the last two, which is massive. Assuming we stay on parity, or even optimistically, the next chip in the iPhone 17 will be as powerful as an M2. Now, why is that a big deal? What powers the Vision Pro? Yes, the M2. So if Apple is really trying to push their strengths, they could pull off some type of performance parity with the Vision Pro and power it by the iPhone 17, or maybe they'll have an Ultra version or whatever. The biggest issue is going to be thermals, because I've worked on this type of stuff before. They have active cooling in the Vision Pro, so the question is whether or not it overheats in your pocket. But if Apple can nail a good enough chip, they could in theory power a Vision Air or Vision, essentially something close to the Vision Pro, in a much smaller form factor. It won't have to be nearly as big. This is why the chips are so important.
First of all, if the chips are more efficient, you can make a smaller battery. If the chips are more efficient, you can power the headset from the phone, which means the battery's not on your head anymore. If the chips are more efficient, you can make the device smaller. Now, if you have all of that coming from your pocket, and they can make that headset $1,000, powered by the iPhone 17, that will be a massive, massive inflection point. Because I think a lot of the people that are not using the Vision Pro, they're not using it because it's almost $4,000, and they're not using it because it's extremely heavy. If they solve those two things, they're in a massively advantageous position.

[00:35:07.981] Kent Bye: Yeah, I know that even the Snap Spectacles are using a dual architecture of two unannounced, unspecified Qualcomm chips. I suspect it might be a next iteration of something that hasn't yet been announced, because typically they will say what it is, but they've been very secretive about it. But the marketing that Snap was using was really kind of shitting on VR, creating this contrast where the exalted reality is the real reality, and you can really only have true social and intimate connections with other people if you're co-located in the same place, which I feel is a false bifurcation. They wouldn't have those Qualcomm chips without the last decade of virtual reality and all the move to mobile that's been happening. The Snap Spectacles literally wouldn't have their processing power without all the R&D that Qualcomm has been doing within the context of virtual reality. So I feel like it's a spectrum, where there's so much more in common in terms of the tech stack, the design, the architectures, and everything else. And that chipset is a key part of what's even enabling what you're able to do on the Snap Spectacles. And with Apple, I think you're right about where their long-term roadmap might be. They certainly have an advantage there in terms of processing power.

[00:36:20.563] Cix Liv: Yeah, I think people are greatly underestimating the importance of chips in this battle of XR. And honestly, I think the best chip will win. And I think that a lot of people are thinking of Meta and Snapchat and Pico and all these other companies. But the real battle is Qualcomm versus Apple.

[00:36:42.338] Kent Bye: Great. And finally, what do you think the ultimate potential of spatial computing might be and what it might be able to enable?

[00:36:51.068] Cix Liv: I mean, there's so many different things that will happen, but for me the thing that has excited me the most, and will excite me the most, is bringing physical movement back into what entertains us. I'm greatly concerned with this epidemic of people sitting on their couches or chairs for 10 hours a day, and I feel like we've lost our humanity. You know, when we think of dystopian movies where people are sitting in a chair and completely gone mentally, we're already there. In fact, I would say optimistically we're already at peak dystopia: people sitting all the time, people having severe health issues because of that. My hope for VR and AR is that it brings us back to our humanity, instead of being removed from how we interact with the technology through layers of abstraction, you know, like the QWERTY keyboard, the mouse, even touchscreens to some degree. It's almost like we're turning back to, you know, "return to monkey." That tag may be onto something there, right? It's like we're almost returning back to that primitive type of interacting with the world around us. And I think it's important, because part of our soul died in the evolution of the technology, and if we do this right, AR and VR could return us back to a more human state.

Kent Bye: Right. Do you have anything else that's left unsaid, any final thoughts that you'd like to share with the broader immersive community?

Cix Liv: Yeah, so I'm going to be launching something tomorrow. So September 19th, I'm going to be launching... we're essentially building a rhythm game that works with the computer vision on a phone. And it's kind of this larger idea of building fitness gaming. It's a little bit adjacent to the VR and AR space. I needed a little break from headsets for a bit, you know, thanks to working with Meta and the pain that's caused me.
But yeah, we're really excited about this idea of creating embodied fitness, and we're going to release it on iOS. So it's a boxing game that just uses a single camera on your phone to track your body movement. I guess the closest VR equivalent would be Supernatural without a headset. So yeah, I'm announcing that tomorrow and I'm excited. I will eventually come back into the VR space, and my hope is to be able to create my own headset someday, because I feel like a lot of the user experience issues that I have with devices can only be solved if I really have core-level access to the device.

[00:39:31.774] Kent Bye: Well, Cix, I always appreciate hearing your thoughts and reflections. You think very deeply about the industry. You've always been ahead of the curve in some ways, and that's kind of bitten you in the ass with Meta. Being on the same track as Meta has not done you any favors, and a lot of that friction has caused you to take a step back from XR. But I really appreciate all the deep insights and reflections you have on the industry, and the bit of a reality check, as it were, on where we're at, where we might be able to go, and what's really important to focus on as we move forward. So thanks again for taking the time to share a little bit more of your thoughts. So, thank you.

Cix Liv: All right. Thank you so much, Kent.

Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast, part of my larger series doing a deep dive into both the announcements around Snap Spectacles and the AR ecosystem at Snap. What I do here at the Voices of VR podcast is fairly unique. I really like to lean into oral history, to capture the stories of people who are on the front lines, but also to have my own experiences and to try to give a holistic picture of what's happening, not only with the company, but also with the ecosystem of developers that they've been able to cultivate. For me, the most valuable information comes from the independent artists, creators, and developers who are at the front lines of pushing the edges of what this technology can do, and from listening to their dreams and aspirations for where this technology is going in the future. So I feel like that's a little bit different from what anybody else is doing. But it also takes a lot of time and energy to go to these places, do these interviews, and put it all together in this type of production.
So if you find value in that, then please do consider becoming a member of the Patreon. Just $5 a month will go a long way toward helping me sustain this type of coverage. And if you can give more, $10 or $20 or $50 a month, that has also been a huge help in allowing me to continue this coverage. You can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
