#1468: Kickoff of Meta Connect 2024 Coverage with Orion AR Glasses Hands-on Impressions with Tested’s Norm Chan & CNET’s Scott Stein

This is the kickoff of my Meta Connect 2024 coverage, with more than 12 hours of coverage across 20+ interviews on site. In my first episode, I interviewed Scott Stein, Editor at CNET covering AR, VR, wearables, gaming, and the future of tech, as well as Norm Chan, Executive Editor of the YouTube channel Tested, at Meta Connect 2024, talking about the Orion AR glasses demo. See more context in the rough transcript below.

Here are the links to all of my episodes from Meta Connect 2024:

Here’s Chan’s Tested coverage of Orion AR Glasses

Here’s Stein’s CNET coverage of the Orion AR Glasses

Here’s the full video of the Meta Connect 2024 keynote:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So I'm going to be diving into my coverage from Meta Connect 2024. I did 19 oral history interviews with over 12 hours of coverage, trying to cover some of the big news that was announced at the keynote, but also take a little bit of a sampling from the zeitgeist of the XR community to see what kind of emerging topics were happening there. So I had a chance to catch up with Norm Chan and Scott Stein. They had a chance to actually try out one of the hottest-ticket demos that I unfortunately did not get to see. It was the Orion AR glasses demo, which is basically an R&D prototype form factor of AR glasses that is less than 100 grams and able to do eye tracking and hand tracking, at like 13 pixels per degree. So not high resolution, but at the same time a 70-degree field of view. All the compute is happening on an external puck. And they've got the Control Labs EMG neural input, which I think is, for me, probably one of the more interesting things that they're showing off, because that seems like it's a little closer to actually being launched. They said it was probably like two to three years, or maybe even more, before the AR glasses demo was going to even be ready for a consumer product. There was an interview that Boz did with Ben Thompson of Stratechery, essentially saying that part of the reason why they were showing this demo was to take off some of the pressure from investors, but essentially that they've been dumping billions of dollars into XR through Reality Labs and they wanted to show the North Star for where they're going with the AR glasses form factor. And basically it's one of the most sophisticated consumer technology devices that's been developed. So they want to reassure the broader public, and especially their investors, that they're on a track of technological innovation, even though it's going to be a number of years before it's actually released. So I unfortunately did not have a chance to try out their Orion demo glasses, but I did have a chance to talk to four different people over the course of my coverage, including Norm Chan of Tested, as well as Scott Stein at CNET, who both had hands-on experiences with the Orion glasses. So I had a chance to talk to them about some of the big news coming out of Meta Connect, including the Quest 3S, which is amazingly the same chipset as the Quest 3 with the same optical stack and display resolution as the Quest 2, and some of the different AI and Ray-Ban Meta smart glasses news. No big announcements on that front, but just more AI features coming to that platform. So I had a chance to talk to a number of different power users of the Ray-Ban Meta smart glasses as well throughout the course of my coverage, and also other topics like WebXR and Flat2VR gaming, some of the different changes on the store. You know, Gaussian splats is a topic that came up a number of times, as well as what's happening with fitness and some of the new mixed reality games that are being released on the platform. There were also a number of different content creators I had a chance to catch up with, as well as the editor of UploadVR, Ian Hamilton, where we do a bit of a deep dive into some reflections on the current state of the XR industry.
I'll be ending my coverage with a deep dive with the chief creative officer at Another Axiom and the original solo creator of Gorilla Tag, Kerestell Smith, digging into the design philosophy of Gorilla Tag. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Norm and Scott happened on Wednesday, September 25th, 2024. So with that, let's go ahead and dive right in.

[00:03:34.111] Scott Stein: So I'm Scott Stein, I'm editor at CNET, and I cover AR, VR, wearables, gaming, and kind of the future of where this stuff's going.

[00:03:42.893] Norm Chan: And I'm Norm Chan, I'm the executive editor of the YouTube channel Tested, been covering the space for over 10 years now, it's a personal passion of mine, and yeah.

[00:03:54.096] Kent Bye: Maybe you could each give a bit more context as to your background and your journey into the space.

[00:03:58.792] Scott Stein: Sure. So I started at CNET where I was looking at laptops like 15 years ago. And then my background before that was theater and creative writing. That kind of came back into play when I started moving into, well, first wearable tech and phones, and then VR and AR. And a lot of stuff now is looping back into that theatrical element, the storytelling, even the theatrics of the experience, you know, what's happening cognitively and what's going on in trying to communicate to people. So yeah, I just kind of keep following that path as it goes, also to neural input, which we'll talk about too.

[00:04:31.403] Norm Chan: And I started in games journalism back in 2000. So I did games journalism, did tech journalism in print magazines, and then moved to digital publications and a YouTube channel, now covering mostly what would be considered the maker space, so how people are using technologies to create things, to have a point of view. So a little bit of what that means for storytelling, but also personally I love emerging technologies and diving into how things work, how things go from the research lab to becoming consumer products.

[00:05:02.301] Kent Bye: Great. Well, so we just came out of the big keynote that happens each year at Meta Connect, where all the announcements are made. So yesterday, I had a chance to try out some of the demos of stuff that was announced, with the Quest 3S, the AI, and the Ray-Ban Meta AI integrations. But there was one demo that I didn't get a chance to see, which is the Meta Orion AR glasses demo, which each of you got a chance to see. And I'd be curious to dive deep into that. But before we do that, I'm just curious to hear some of your overall impressions of the event, the announcements, and the other things that you think are really sticky in terms of what was just announced.

[00:05:36.553] Scott Stein: Yeah, so trying to digest what was going on here, getting a chance to see Orion, it's like Meta's creating a new goalpost to get to. And I feel that it makes all the things they're doing now seem a little more like incremental steps to getting to that goalpost. And so in the here and now, Quest 3S, it's incremental. It's affordable, but it's kind of a continuation of the same. Ray-Ban Meta: no new Ray-Bans, and a lot of AI as the glue. And Orion, to me, when I demoed it, looked like you could see all those pieces coming together. Not all of them were there, but I feel like that was the message: see what we're trying to get to, what we're aiming for, where all the pieces fit. But I'm really curious to see where those pieces emerge before the glasses, you know, before the finished glasses. You know, will we see more here or there in mixed reality and wearables and all that stuff?

[00:06:25.104] Norm Chan: Timing is super interesting, because last year we had the highly anticipated Quest 3, new hardware for the VR space and mixed reality space, really putting their flag down on the things that they believe are important for those users. And I think it's not going to be an annual hardware cadence. So the 3S is for broad mass appeal. I think they found surprising success in the hardware with the Ray-Ban Metas, and so they're doubling down on their commitment to that, but they probably couldn't have iterated on a new version of that in a year. I don't think they want to. They're not Apple; it's not a new phone every year. And I think the proof is in the backend updates and the software updates they've put into both the Quest platform and Ray-Ban Meta over the past 12 months to show that there's a lot still to grow there. So they had to have something new in terms of hardware. And we saw teases of what they announced as Orion in Instagram posts and in previous interviews that Boz and Zuckerberg have done. So interesting timing that this is now the right time for them to show it off. It plants that goalpost a little bit further out. It's not a product, even though one interesting thing is that it seems like they considered it being a product in its development. But for something that costs them around $10,000 to make per pair, there's no way it could be something that would make sense, nor does the experience, I think, make sense right now for what people would want.

[00:07:40.205] Kent Bye: Yeah, I haven't had a chance to try out Orion, and I'm very excited to dive into all of that. But before we do that, I want to also just mention I did have a chance to see the Quest 3S, which felt pretty much like any other Quest 3 demo, which I think is saying a lot in terms of them being able to have some sort of parity. There's no depth sensor on the Quest 3S, and it has a 96-degree horizontal field of view and a 90-degree vertical field of view. It was essentially just like a cheaper version with the Fresnel lenses instead of pancake lenses. Also, one annoying thing sometimes with Meta is that they will have you demo it with an additional accessory rather than demoing it with what it ships with. They did that last year. So I didn't get a chance to see what comes out of the box. But at least from my experience, it seems functionally equivalent. Meta probably saw from their sales how well the Quest 2 did at that price point, and then the Quest 3 increases the price. Now they essentially have kind of an Apple model where you can get the 128-gigabyte version for $299, and then for $399 you can get the 256-gigabyte version. And then in order to get the next version up, the 512 gigabytes, you basically have to go up to the Quest 3. So it's kind of like an Apple model where, when you're buying it, they're doing all these different trade-offs. So at least they kind of have that mode now where people can decide how much space they want and how much they want to spend. But other than that, I'd love to hear any other thoughts you have on what's happening in the Quest ecosystem.

[00:09:01.415] Norm Chan: I do have some more thoughts on the decisions, or what decisions they made. I think they made probably the right trade-offs, and time will tell; we'll have to spend more time with the 3S. We all remember what Quest 2 was like, and going from Quest 2 to 3 was such a revelation in terms of comfort. And it's a question of, is comfort the thing that's preventing users from returning to this? The price point will get it in the house for the holidays, but what's going to keep it on the coffee table and not in a cabinet and not in a drawer? And I think Meta feels strongly that they have the content and there's enough of a content ecosystem. They certainly have all the games for young users and new users to be interested in VR. So is it price, or could they have chosen a different price point or chosen different trade-offs, and maybe gone with pancake lenses because of the huge benefits of longer wearability?

[00:09:50.885] Scott Stein: Yeah, it kind of feels now, like you said, it feels like they're putting those pancake lenses as like a pro uptick, and they're trying to make you go, oh, it's very tempting to upgrade with the more storage and that thing. It's a little weird price-bump-wise, where it's $300, which is affordable, but it's still a little more than the Quest 2 had been price-dropped to recently. And so I wonder if people are going to be like, oh, I was about to buy that, now this price goes up a bit, even though it's still low. But I think it's impressive how Meta, for this and the glasses, are putting things in a very kind of impulse-purchase zone. Like, still expensive, it's not a total impulse purchase, but it's definitely in the, I might treat myself to it with some money, versus an investment price like a phone or a laptop. And they're keeping it in that territory, so the product line stays there. It's very different from competitors. And so, yeah, the 3S, the one thing that stood out for me in the demos that I thought about afterwards is that they really didn't do anything with mixed reality in the demos. And I just kind of thought it's interesting, because that does seem like a unique advantage. But it leads to a question, even coming out of Connect now. They kept talking about things like augments, like in the weeds, these pop-up mixed reality elements and things that would work in your environment, and it doesn't feel like the state of what they're doing with that has really happened yet, and I wonder if it will.

[00:11:10.860] Norm Chan: Even one year. Even one year after the Quest 3 launched with those features.

[00:11:14.142] Scott Stein: Yeah. The pitch now was very Vision Pro-like, not surprising. But demos always kind of comment on each other, I feel like, over time. But they were looking at the Dolby Atmos, and they were talking about Prime Video on it, and dimming the environment, which was cool, but really flat-screen-type stuff, and then games. And so I was like, well, the capabilities could be a lot more, especially fitness. I was thinking, that's such a killer app for me over the past year and a half. I feel like people I know get the headsets for that, but that wasn't really in the conversation today. It was just kind of like, use the Quest as you use the Quest, I think.

[00:11:45.014] Kent Bye: Yeah, they showed the press a 45-minute demo loop, but there's probably two or three times more content that we didn't get to see. And they had a whole section on some of the exercise content. One other point that I want to make about the Quest 3S before we move on is that the adjustable IPD goes back to the three-position click that the Quest 2 had, rather than the more precise slider, which I'm sure adds additional cost. For me, I think that was a huge comfort thing, in that my eyes get double-vision-y when it's not exactly at my IPD. My eyes adjust, but it also feels like, what is this doing to my long-term vision when you have stuff like that?

[00:12:17.158] Norm Chan: That's exactly what I was talking about in terms of the trade-offs. They chose the cheaper component, something that they knew tens of millions of Quest 2 users are okay living with. But comfort matters, and it's not just the weight and the clarity of the pancake lenses. It is comfort. So you're talking about the bare minimum. We're still at that point with VR headsets where, for the bare minimum of something that I feel like people can wear for longer than the two hours of the battery life, you need the premium model.

[00:12:43.444] Kent Bye: Yeah, I mean, I would recommend people get the Quest 3 over the Quest 3S. But if price is an issue and if you don't have one at all, then it is, I think, worth considering. Well, let's dive into the experience of Orion. Tell me everything in terms of what the experience was. I know, Scott, you had a chance to file your report this morning. And then, Norm, I know you're still kind of working on yours. It's probably going to come within the next couple of days. By the time this conversation airs, it'll already be out there. But yeah, just kind of walk me through what it was like to see it.

[00:13:10.615] Scott Stein: Yeah, I mean, I saw it later in the day, and I didn't know that I had to bring contact lenses. I should have known better. So this is the trend now. You've got to have prescription inserts or find your own way with contact lenses. Some things like Vision Pro, they actually had a glasses scanning system already in place where you'd be able to get a fit, even for my prescription. They didn't have that in place for this. There were no prescription lens inserts, period, for Orion. They gave me a pair of... Mark Zuckerberg's contact lenses in the demo, which apparently were his. So that was very nice and accommodating. I didn't meet Mark, but I did get to wear his lenses this time. And they were very close to mine. So that's how I got to demo it. Vision, not perfect, but good enough for the demo. And I got a briefing with the team on all the stuff that they were talking about with the vision, all the things they've been doing over the past five years. And then a big group gathered to guide me through the demo. Very contained, guardrail-y, monitoring how I'm doing on one thing to see if things are working. A couple of hiccups at times, but fascinating, because, you know, I've seen glasses in various forms kind of like that, but tethered or in different shapes or sizes. This was very light, comfortable, still bulky, but in the photos, you know, it leans more towards regular glasses than something weird. Yeah, and then the neural input wristband was the most interesting thing to me, because they've been talking about EMG for years, the Control Labs acquisition, and I never got to try it. And that's now very familiar, because I just think this stuff is creeping in. Gesture stuff is creeping in. Vision Pro does gestures. Quest does gestures. Apple Watch has some tap gestures. So it felt like that with more, rather than something new, but I'm really excited to see what else it can do.

[00:14:59.005] Norm Chan: Yeah, oh my gosh, so much to piggyback on. I had the exact same experience. Got a briefing where they talked about what the goals of this device were. And as I alluded to earlier, this is something that they planned originally to see if they could make a consumer product, and eventually realized that it made more sense as what they call a time machine, which is: let's put as much money as possible into something that's impractical commercially, but try to achieve a goal of some kind, a design goal, a functional goal, so that their team can learn from it. It seems the goal here wasn't the brightest display or the widest FOV. It was a combination of it being the widest-FOV optical pass-through augmented reality glasses in a form factor that would be socially acceptable. So, you know, unlike Magic Leap 2, which also has a 70-degree diagonal field of view, those are big HoloLens-type goggles, because they have to be that big because that's how their waveguides work. And so the technical achievement that they're very proud of is, instead of using glass or plastic, they're using silicon carbide, which is, as they explained it, a material that's grown like a sapphire crystal would be grown. And Apple, for example, famously could not make sapphire crystals strong enough with good yields for their Apple Watches. So something like 90% of the cost of this is these silicon carbide lenses, which they have very low yields on, and which they then etch physically, as opposed to doing an imprint, for the actual guiding of the waves, the light that comes from the light engine, the micro LED projectors on either side of them. But what you get then is something that's still thicker than your Ray-Ban Metas, both thicker in terms of the profile of the rims as well as thicker in terms of the actual depth of them, but it's getting closer. And something that was reinforced to me is that not only is this something that they're happy with, that they would have considered to be a consumer product, but they have the roadmap in place. They know the next steps to make this 50% thinner and 50% smaller in rim thickness. And they explained those to me, and that's a whole other conversation, but they feel really confident that within a couple of years, this is the bare minimum.

[00:17:03.931] Kent Bye: So yeah, with the Snap Spectacles, I have negative 5.5 vision, and they only went up to negative 5 even for those. So they had an insert, but it was still a little off. But it seems like these, at least as they're being designed, are these the types of things where you would always have to wear contact lenses? Or can they have corrective lenses up to your prescription and still be able to do all the stuff that they're doing?

[00:17:25.660] Scott Stein: I didn't get to ask that specifically. I would imagine that's in the plans. It's got to be. They know that with the Ray-Bans and everything else. And whether they could do it even now, that's probably another point to why it's not a consumer product yet that they didn't talk about. But I'm curious how they're accommodating people. It sounded like the team is putting in contacts. And so, you know, I'm not an optics specialist in that regard, but how do those smaller tracking cameras work to deal with the bend of prescription lenses? I mean, I remember when I tried different eye tracking products in the past with my glasses in there, it always had problems with me. Vision Pro even needs some recalibration with my lenses from time to time. It tends to kind of drift a bit over time. So, yeah, I don't know. It's a good question.

[00:18:08.576] Norm Chan: I do have an answer for that, because I asked them specifically about it, because I don't wear contact lenses and thankfully my prescription isn't that bad. It's like negative one and a half in one eye. But they will support corrective lensing, and it won't be an attachment. They wanted to keep that profile minimal. They're really proud of the fact that the distance between the pupil and the nearest layer of the optical stack is less than 15 millimeters, less than 1.5 centimeters, which, if you think about the distance between your eyes and your glasses if you wear glasses, is pretty close. One comparison they had was the XREAL glasses: while they look straight-on like sunglasses, the birdbath optics require that distance. Here they don't want that distance. And so the optical stack that they've designed is many, many layers. There's a polarizing layer in the front, there's a combining layer, there's the silicon carbide waveguide layer. But the layer that's closest to your eyes is something that they can custom order. It'll be a piece of plastic, and it will be something that you can order with your lens correction, as you would order Ray-Ban Metas. But they didn't say the range.

[00:19:07.594] Kent Bye: Yeah, having just come back from the Snap Partner Summit and the Lens Fest, the aesthetics and the form factor of what Snap had to do in order to get their 46-degree field of view, super bulky, super awkward, not something that I think most people would feel acceptable wearing. And I feel like most of Meta's industrial design and aesthetics is not prioritized, like in VR headsets, you know, they're not as concerned. You know, Apple tried to create something that looked really cool, but then sometimes they prioritize how it looks versus how it actually functions and actually has weight distribution that works and doesn't need additional straps. But at least for this collaboration that Meta has with EssilorLuxottica with Ray-Bans, they've kind of stumbled upon something that Mark said today was surprisingly successful. And I think there's something around the importance of these types of wearables looking good, and prioritizing that now within their design process.

[00:20:00.095] Norm Chan: No one wants their wearable tanked by someone wearing it in the shower like Google Glass was. All these devices are remembering what happened with Google Glass, which, if you look at that tech, functionally did a lot of the things that these Ray-Ban Metas do today. And maybe if it had launched differently, if it didn't have to look as futuristic, it would have been more of a success. So optics, to run with the pun, are very important.

[00:20:23.611] Scott Stein: Conversationally, I remember we were discussing a little bit in the beginning of the briefing, their partners in Italy hold them to a high standard with the design. It sounds like meeting that challenge is part of their interesting game that they're playing now too. It means that they can only include so much in the glasses as they evolve, but it sounds like that's the big goal is to meet that standard and make probably the glasses manufacturer side of things happy too with the design that it's not drifting too far off track and it's keeping them with an idea of like form and what fits comfortably for people. So yeah, I kind of think that's exactly it. We'll just see things emerge when they're able to fit that form better.

[00:21:07.697] Norm Chan: Yeah. Something I got to see, which I don't think they showed in the keynote... I had a conversation with Boz, it'll be in my video, we chatted for about 20 minutes about Orion. But he had something like the clear Ray-Ban Metas that they announced today as a limited edition: he had a clear Orion that you're not allowed to get close-up photos of, because there's proprietary IP in there. But it illustrated how densely packed this hardware is. I mean, the team joked about the famous apocryphal Steve Jobs story of him dropping the phone in the fish tank to see if there were air bubbles. They're saying there's not much space in here. Every square millimeter of space is filled with battery or electronics or cameras configured in a way to make the local processing, the local SLAM, work. And something I don't think we've mentioned yet is that this also has a compute device, but it's a wireless compute device, which is also another choice and innovation here. So it's like an external puck? It is. It's an external puck that's required to use with it. So unlike Apple Vision Pro, where the external puck is a battery, here the external puck is for compute, but it's not tethered. It's a wireless compute puck. I would say it's smaller, certainly in weight and dimensionally, than the Apple battery, but something that they say could not just fit in your pocket, but you could have in your purse or your backpack, as long as you're within 10 to 12 feet of it. It actually offloads the app logic and the app compute, while all your tracking, eye tracking, world tracking, and reprojection, all the stuff that makes the glasses display and display comfortably, actually runs locally on the glasses themselves. I thought that was super, super interesting.
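An editorial aside: to summarize the compute split Norm describes, here is a small, purely illustrative sketch of which workloads were said to run on the glasses versus on the wireless puck. The structure and names are hypothetical and are not drawn from any published Meta architecture documentation; they only restate what is described in this conversation.

```python
from dataclasses import dataclass, field

@dataclass
class ComputeNode:
    """A device in the described Orion setup and the workloads said to run on it."""
    name: str
    workloads: list[str] = field(default_factory=list)

# As described: latency-critical display work stays on the glasses,
# while app logic is offloaded to a wireless puck within roughly 10 to 12 feet.
glasses = ComputeNode("Orion glasses", [
    "eye tracking",
    "hand tracking",
    "world tracking (SLAM)",
    "reprojection / display comfort",
])
puck = ComputeNode("wireless compute puck", [
    "app logic",
    "app compute",
])

if __name__ == "__main__":
    for node in (glasses, puck):
        print(f"{node.name}: {', '.join(node.workloads)}")
```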

[00:22:44.087] Scott Stein: Yeah, that compute device fascinated me because of the wireless. And it's kind of sort of phone-sized, or oblong. It feels not quite as big as you'd think. It weirdly has its own tracking cameras. I have a story about that.

[00:22:58.072] Norm Chan: So those aren't... They're not used now. No, they're a byproduct of an earlier prototype phase.

[00:23:03.775] Scott Stein: I thought it was going to be a controller.

[00:23:04.815] Norm Chan: Where, yes, they considered it. And this is where I don't think we've spoken about how this device and this team share research and learnings and technology with the Quest group. One version of Orion had the processing puck also have the outward-facing cameras that the Touch Pro controllers have, so you could use it as a 6DoF controller. It also had cameras on the side that were looking up at a 45-degree angle. And the one experiment they had was letting you place it on a table and have it look at you to get body tracking, so that you would send your body-tracked codec avatar in a video call, for example. But they're not going that route. But again, these are the paths. They should. I mean, maybe ML. I think at some point they should.

[00:23:47.866] Scott Stein: I feel... that's just my hot take. But like, I know what you mean. Because it was interesting, because they said, okay, we're not going to do that. But then I keep thinking about accessibility or other things. Like the Snap demo with the Spectacles was big on, you know, the phone could be a controller, the phone could be this if you needed it to. And I feel like it reminds me of Vision Pro a little bit, where it's like, you know, well, our only inputs are this. But it's nice to have another option. I don't know. But it was just my... I was very curious about that, but I was also interested in how we're seeing a lot of companies now try to bridge or broker what happens with the future of phone-like devices. You know, it's like Spectacles is trying to figure it out, but through a phone app. XREAL built that phone-like Beam Pro because movement on phones and OSs doesn't interface well with glasses. And I feel like this is another almost kind of thing like that, where it's a compute wireless thing because they're not going to have that function with a phone. But somewhere down the road, our phones are going to evolve to develop that good wireless protocol or be open to these things. By the time these glasses emerge in a few years, I would imagine the thing you're interfacing with should be...

[00:24:51.405] Norm Chan: A phone? From Meta's perspective, they were very clear. This device, the billions of dollars they've spent on this, the whole Reality Labs venture, is to surpass the phone, is to bypass the phone. Like, they are so frustrated that Ray-Ban Metas don't get the priority access that Apple first-party devices get for pairing or for image transfer. If people don't know about the Ray-Ban Metas, the way you offload your images to your phone, you have to connect to a Wi-Fi network generated by the glasses and then transfer. It is the opposite of seamless. If Apple made the Ray-Ban Metas, they would just be in iCloud and be on your phone. And that's all regulation and policy, and that's a whole separate conversation, but this is one of the frustrations of Apple holding all the cards with the hardware.

[00:25:38.168] Scott Stein: They're also second-class citizens with the AI stuff. So it's like, you know, there's so many things that hang on requests that, like, you know, when is Apple ever going to allow the open AI partnership, you know, whatever, with Apple Intelligence? But, yeah, exactly. Like, how does a thing like this have an understanding of what your phone's doing? It's working around that.

[00:25:54.052] Norm Chan: They don't want to. They want this, and that allows them to then price it as a laptop replacement, as a phone replacement. The directive is to make this a consumer product that someone would buy instead of buying a phone, which is a tall, tall order. Absolutely.

[00:26:07.767] Scott Stein: I feel like that's the thing is that I kind of interpret it as like absolutely they're not pursuing the phone connection, but it's this like chicken and egg. I don't know. Everyone's got a phone, you know, at some point.

[00:26:20.163] Kent Bye: I think they've been dealing with not being in good graces with either Android and Google or iOS and Apple. And so that's the whole reason why they got into spatial computing. But my take is that there has never been a device that has completely supplanted all the other devices, and that we will have our phones, but they don't do all the things that they want to do because they don't have the access.

[00:26:38.821] Scott Stein: I mean, that could happen at some point, but I just feel like in the meantime, especially in the next five years, you're going to have glasses that start to work with phones. And some of them, depending on the partnerships, will work well. And so I just think this device, as it develops, will also be going up against that. And so I think there's that thing of, hopefully Google opens up in this regard, but something needs to give there, because I just think, unless it's fully self-contained, and even then, you're still going to have ecosystem incompatibilities, you know, as AI systems start to... You know, that's a whole set of open, weird questions, like how do all the AI systems, and the future beyond maybe apps, play with each other too?

[00:27:19.927] Norm Chan: Outside of the whole platform question, because they are making their own OS for this, they call it AR OS... I think we got off track with input, right? The idea of what is the best modality for a human-computer interface for AR glasses. And if you look at Apple, the flag they've put in the sand is eye tracking plus hand tracking. You know, they found a really good sweet spot with that being really intuitive and versatile, and obviously they put a lot of effort into production and making it work as you expect. But over this past half a year or more of using Apple Vision Pro on a regular basis, I've found the limits of eye tracking plus hand tracking. And one of those is fatigue, not just precision and latency, but the fatigue of moving your arm and your wrist versus what you really want, which is micro gestures, right? A tap is a micro gesture; a pinch and a move of your arm, maybe not. And that's where I think they've really doubled down on this Control Labs investment, this company they purchased, for the EMG wristbands. So to describe the wristband: it's a small wristband. It's no longer like Black Widow's wristband that she wears in the Avengers. It looks like a watch strap or a Fitbit strap with a magnetic clasp that does need to fit tightly on your wrist, on your dominant hand. No reason you couldn't have two of them at some point, they said, but it goes on your dominant hand and needs to be as tight as a tight rubber band. It has eight contact points, which non-invasively read the electrical signals going to your muscles without any training. I believe, Scott, you and I did no training for this, no training of the system, no calibration. The basic gestures that we got were a thumb-to-index-finger tap as a positive input; a menu gesture, which was the hand upside down and tapping the middle finger, so they know the orientation of the hand without being able to see it; swiping up and down using your thumb against your index finger; and then double-tapping your thumb against your inner screen for launching their AI assistant, which is basic. What was your experience with that, Scott?
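An editorial aside: here is a minimal, purely illustrative sketch of the gesture vocabulary Norm describes, mapped to the interface actions mentioned in the conversation. The enum names, action strings, and dispatch function are hypothetical; there is no public Orion SDK, and none of this represents Meta's actual API.

```python
from enum import Enum, auto

class EMGGesture(Enum):
    """EMG wristband gestures as described in the conversation (hypothetical names)."""
    THUMB_INDEX_TAP = auto()    # positive input / select
    MIDDLE_FINGER_TAP = auto()  # menu gesture; works with the hand upside down
    THUMB_SWIPE_UP = auto()     # thumb swipes along the index finger
    THUMB_SWIPE_DOWN = auto()
    THUMB_DOUBLE_TAP = auto()   # launches the AI assistant

# Hypothetical mapping from a recognized gesture to a UI action.
GESTURE_ACTIONS = {
    EMGGesture.THUMB_INDEX_TAP: "select",
    EMGGesture.MIDDLE_FINGER_TAP: "open_menu",
    EMGGesture.THUMB_SWIPE_UP: "scroll_up",
    EMGGesture.THUMB_SWIPE_DOWN: "scroll_down",
    EMGGesture.THUMB_DOUBLE_TAP: "launch_assistant",
}

def on_gesture(gesture: EMGGesture) -> str:
    """Dispatch a recognized EMG gesture; unknown gestures are ignored."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

if __name__ == "__main__":
    print(on_gesture(EMGGesture.THUMB_DOUBLE_TAP))  # -> launch_assistant
```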

[00:29:11.669] Scott Stein: Generally good, but sometimes it confused a few of my taps. Like, I found that trying to open a menu or something, and I don't know if it was my eye tracking with the tap or whether it was the tap recognition itself. But something was a little odd when I tried to close menus at times, between the open-hand one and the other. And again, maybe it's my own learning of the system, human error with that. But a lot of it worked really well. And I really love the idea behind this, because I was curious how many more gestures they can introduce. I liked the haptic feedback that was on it, which was subtle.

[00:29:42.181] Norm Chan: Very subtle.

[00:29:42.862] Scott Stein: Yeah, very subtle vibration, but enough of a feedback. And I don't know, I just feel like I've been waiting for this for a while in my mind, where watches and other things would develop as an interface for headsets and glasses. And it feels like that is a step ahead in exploring that, putting it out there, at least, first. And saying, you know, who else is working on this? But they're doing it and saying, hey, it's here, you can try it. They mentioned EMG possibly showing up in other products and makes me wonder about that watch rumor or other things. Or could it be a wristband thing for the Quest or, you know, for the glasses, certainly, because you could start doing gestures with Ray-Bans. Although I don't know how many people would put on a wristband with Ray-Bans. I just feel like Ray-Bans are like get up and go. I feel like the Quest seems like saying you'd... So I don't know, just a whole question. But yeah, it just really fascinated me and I wanted more.

[00:30:34.140] Norm Chan: One thing I'll say is, you know, hand tracking worked fine because, again, they're using the Quest hand tracking. They're using eye tracking that they probably inherited from the Quest Pro. So they could have done just what Apple did with hand tracking plus eye tracking. They could have done a tracked controller; they experimented with a tracked controller with their compute puck. The fact that they stuck with EMG... And there are wristbands now where, with a basic gyro, with an Apple Watch, with a Google watch, you can machine-learn and get a positive input with just an IMU. Like, EMG isn't necessary for a tap. You can turn on accessibility; it is a function right now. So why invest in EMG when you have to make sure it fits in a certain way? To them, it feels like the ceiling of input is much, much higher. It gets you the immediate benefit of not needing to be in the field of view, and getting the micro gesture in your pocket. You can literally be walking down the street, scrolling through your social media feed, without needing to do that classic hand-waving in front of you. That's the input analog to the glasses looking normal: input looking normal. It needs to be discreet, needs to be something that isn't also just you with a finger on a phone, swiping. I think they believe in that. And I think they believe that the ceiling is high enough that they'll eventually be able to register handwriting gestures, typing on a keyboard that's not there, and all sorts of things where not only do you learn to accommodate the system, but the system's going to accommodate you.

[00:32:00.089] Kent Bye: Yeah, just coming from the Snap Spectacles and seeing that they're launching their dev kit in kind of a novel way, at $99 a month for at least a year. So it's at least $1,200, maybe more depending on how long you have it. You're essentially renting it. You're not owning it. And so today on stage, Mark Zuckerberg said that this will be like a dev kit that's going to be sent out to specific developers to start to prototype and experiment and make stuff. But if it costs $10,000 per dev kit, then it's probably going to be pretty limited in terms of who's going to have access to it. I see what's happening with Snap Spectacles as sort of this DK1 moment of AR, where they're actually seeding what's a much more affordable and democratized approach. You still have to apply and get accepted into their developer program, but you could get, for $99 a month, access to what I think is probably one of the more cutting-edge AR dev kits that are out there, and start prototyping and building stuff on it. It's only a 45-minute battery, which you can tether a battery pack to. But I'm just curious, did they talk about battery life for even this dev kit, like how long it is and what they're able to get even with this form factor?

[00:33:00.663] Scott Stein: They said about two to three hours, which sounds pretty good for something like that. I mean, we didn't get to test that, but roughly all day or a good chunk of the day for the compute puck and for the neural wristband. So a lot better in theory than the Snap Spectacles, which are 45 minutes of battery life. But I think what's interesting to me too is that, at least the way they were pitching them, it also sounds like the companies are going after slightly different things. I think Snap, the advantage of that being in the here and now is that it's something that people can actually start playing with, and they're very focused on outdoors, very focused on collaboration, like activations with groups of people. And Niantic being a partner was interesting to me, because they were talking about glasses for years. You think, how many companies are really doing that now with a thing you can use outdoors? And I feel like it's going to be kind of a live test kit for people to maybe explore some of that stuff. They're trying to work with lenses that exist. A lot of it is here and now. Meta's, at least the way it was demoed, or the pitch, they didn't discuss outdoors at this point. They didn't discuss multiple people, although we did have one gameplay session between two people. But it's just really like, it exists. You know, a lot of it was focused on things like achieving field of view, neural input, the fact that it functions wirelessly like that. But I think Snap was looking a little more at specific environmental situations. I'm sure Meta will go there too. And I wonder, with the cost of that kit, they mentioned internal use. I would imagine it's not going out to very many. It might stay just like right here, and then people get to maybe try it. But I don't know how big a footprint it's going to have at all at that price and with the sensitivities to kind of using it. I'm just guessing.

[00:34:40.600] Norm Chan: And I think that they're not building a dev ecosystem. I think with the launch of the iPhone, the general understanding is that the App Store was a killer app. The iPhone had exponential growth because everyone was able to make apps and you could get whatever you want. But it also launched as an internet communicator, an iPod with video, and a breakthrough communication device, right? And so they understand that they need to get those fundamentals right if they want this to be something that could maybe not replace the need to buy a phone, but certainly replace the need to have your phone out with you at all times. And I'll say that, just based on the demos that I saw, it's still so early. The demos were very, very curated. It didn't seem like a very robust application; it seemed almost like scripted video. They opened a Messenger app, opened a web browser, but it was slower than the web browsing experience you get on a VR headset today. Everything was a little bit laggy. Visually, I don't think we've talked about the visuals. They were bright, but they were low resolution. This is, I think, 13 pixels per degree. That's not good at all.

[00:35:40.124] Kent Bye: Snap has like 37 pixels per degree.

[00:35:42.386] Norm Chan: Yep. And so it's low pixels per degree, but they're bright. You know, it's hundreds of thousands of nits, they'll say, on the micro LED projectors. But that gets pared down to 300 to 400 nits after the waveguides. So the images are there. The world tracking is good. There's no persistence beyond the action that you're doing now. So it's so, so, so early. It proved that they can make the form factor, which I think, the form factor, form factor, form factor, was the goal here, the hard technical challenge. And now it's about making a product.

[00:36:12.469] Scott Stein: Yeah, they were very quick to also, like, add to the end of the demo a quick look at an even higher-resolution one, which was interesting, that they bothered to do that, because I think they were that cognizant of it. There was a brief moment, which I'm sure we all got, where you got to look at a 26-pixel-per-degree one to watch a little bit of Jurassic Park, just to watch a movie, which looked fine, but again, we've seen higher-res demos of other things. And I kind of hadn't been thinking about that too, too much during the demos, because a lot of the icons were pretty sparse, I would say, and simple in some of those elements. And you're also doing some transparency. But yeah, they were just like, okay, we're working on that, or already working on that. And it sounded like, again, that was part of it: that extra-wide field of view means more pixels, more power to push. So, you know, as you were saying, Snap's is a smaller area, a smaller FOV.

[00:37:03.780] Norm Chan: So they kind of created this extra challenge with the canvas space too, that they're trying to deal with, probably, or just the physics of it. So I don't know, it's interesting. If I was to make a bet, when this achieves a consumer version, it will be smaller than what you and I saw, Scott. It will be thinner. It will be even more socially acceptable, because they feel like that's the barrier. It will be smoother in terms of its interface. I don't think it's going to end up shipping with silicon carbide; I don't think that can scale in cost and yields in manufacturing by the time they want to get this out. So they are looking at other options. And I think on the resolution, they're going to go for that higher resolution, but maybe lose on brightness. So they will have to face some very realistic trade-offs.

[00:37:46.032] Kent Bye: There was a really fascinating conversation that Jason McDowall from The AR Show had with Sophia Dominguez. And in the wrap-up, he said that he wants to propose a scale for pixels per degree, where 60 pixels per degree would be retina scale, and that would be a 5 out of 5. And then you divide by 12, so 48 would be 4 out of 5, 36 would be 3 out of 5, 24 would be 2 out of 5, and 12 would be essentially 1 out of 5. So with Orion we're talking on the order of a 1 out of 5, whereas the Spectacles 4, which came out back in 2021 for developers, was essentially like a 2 out of 5, and now, with the new Snap Spectacles, we're talking about a 3 out of 5. So that's just to give some relative scale, and we're kind of going towards that retina scale. So anyway, that was just a really fascinating number I wanted to share, just to give some context to that. But as we start to wrap up, I have two more questions. One is more of an ecosystem reflection. There was a lot of focus on AI in the keynote. I was surprised to see how much they were going all in with creating NPC bots within Meta Horizon Worlds. But also, just generally with what Meta is doing with their ecosystem, they were really emphasizing that they were moving into more of an open ecosystem. All of the apps that used to be on App Lab are now generally available, and they were emphasizing that they now have 10 times more apps. And then my comment would be that there's still this kind of Meta preferencing of their own Meta Horizon Worlds above and beyond what's happening in the third-party ecosystem. And talking to SixLive at the Snap Lens Fest, he was saying, you know, it's likely that you're not going to be able to run your own machine learning models on these devices, that it will always be Llama and Llama only, and you won't be able to run your own custom models. We'll see. There seemed to be a lot of emphasis on open-source types of things. But to what degree will you have control over the ML models that you're running on these devices? Whereas on Snap, you can have your own custom ML models. So, yeah, there seems to be, yes, a lot more openness, but how much true openness is there going to be for really fostering a third-party ecosystem, versus how much of it is Meta deciding that they're just going to control everything and have all of their first-party apps? And so, yes, there is a lot of commentary about how things are much more open, but what I'm seeing on the ground is a lot of concerns from third-party developers who feel like their apps are kind of getting lost in the mix of this flooding in, and the changing of the names to the Meta Horizon app, and having the Meta Horizon Worlds be equal in search, or at least show up in the app as being prioritized over the third-party apps. So Meta has traditionally used those third-party developers to build up the platforms and ecosystems, but then at some point, they pull the rug out and then focus on their own apps. So that's my concern: as you have more and more of these consolidated platforms, to what degree are there going to continue to be these open platforms? So anyway, I don't know if you have any thoughts about that.
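An editorial aside to make the arithmetic of that proposed scale concrete: the rating is just pixels per degree divided by 12, with 60 ppd (retina scale) scoring 5 out of 5. This is a minimal sketch of that mapping using only the numbers cited in this conversation; the figures are reported values from the episode, not official specs, and the function name is hypothetical.

```python
def ppd_rating(pixels_per_degree: float) -> float:
    """Proposed scale from the conversation: 60 ppd is retina scale (5/5), so divide by 12."""
    return pixels_per_degree / 12.0

if __name__ == "__main__":
    # Numbers as cited in this episode (reported figures, not official specs).
    for name, ppd in [
        ("Retina-scale target", 60),
        ("Snap Spectacles (2024)", 37),
        ("Meta Orion", 13),
    ]:
        print(f"{name}: {ppd} ppd -> {ppd_rating(ppd):.1f} out of 5")
```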

[00:40:25.518] Norm Chan: I think they will continue to do it until it does not help them get new users. And they're very lucky that Llama is well regarded. Their first-party apps, you know, whether it's Threads, Instagram, WhatsApp, have a wide user base. But it's so frustrating, I totally feel it, especially for Ray-Ban Meta, that you cannot stream or post to anything outside of, right now, their apps. There seems to be a little more openness, I think, with things like Audible, Spotify, Apple Music, places where they're not in that business. So it's a fight. I'm sure it's a big fight internally as well.

[00:40:56.018] Scott Stein: Yeah, I think there seem to be two different models in play at the same time. And I'm kind of looking at the evolution of what's going on, like, you know, between the traditional legacy of Oculus and the Quest as it keeps changing, and then the glasses and the AR vision. And they mentioned camera API access that's finally coming next year. So people will be able to, as a big part of AR, recognize the environment, and AI and multimodal. And the same question of where and how to serve that: everything that the Orion model is looking towards feels like a more contained, unified ecosystem. And even, I was talking about people and accounts and how you share the Quest with people. I treat it like a game console. I kind of let my kids use my own account. You know, that's terrible to admit, but that's what I have done, because they're older teens and I get lazy with that stuff. But I think a lot of people have been like that, because it's not really a personal account, and I don't do voice chat. But as it becomes more personal, and you're doing AI things, and you're doing all sorts of stuff that's you, it's your experience. And then at the same time, how much of that starts getting locked down into what Meta defines? I just think it's a very different shift. And so I'm looking at the glasses and going, who gets left behind? Does mixed reality change things for developers? And where are we headed for what Meta wants? They didn't really answer that here. They kind of kept both pieces in play. Like, thank you, Quest developers. I think that hasn't shifted, but it's a question.

[00:42:28.748] Kent Bye: Great. And finally, what do you each think is the ultimate potential of spatial computing and what it might be able to enable?

[00:42:35.737] Norm Chan: I mean, after demoing Orion this week, a lot more confidence that they can be something that I will not just wear comfortably in the home, which I do with Apple Vision Pro, but something that I would be okay wearing at a cafe in the real world. I mean, that feels much more realistic today and much less of a nebulous 10 years in the future than it did a month ago.

[00:42:56.297] Scott Stein: Yeah, it made me think about how much of the stuff in that demo, you know, talking about people being astonished, but I guess, you know, we see a lot of things too. What I mean to say is a lot of it felt familiar, but that was interesting, because it felt like pieces that... I felt that way with Vision Pro too, where you see pieces that are out there, but they're starting to synthesize. And I felt like this was a moment of synthesis of some things, and you go, okay, we're already kind of drifting there, we're already making this happen. So I'm just interested in the larger map of things, the network of things, the ways in which this will work for everyone at the same time. The companies have been exploring that, but that's still the missing piece. Like, a lot of the demos of Orion, and a lot of what Vision Pro constitutes now, are very personal...

[00:43:36.548] Norm Chan: Singular things. But we're going to be in a world with lots of these things, so it makes me think of how that works on a mass scale, in the way that phones have transformed through mass use.

Kent Bye: Anything else left unsaid, any final thoughts to share with the immersive community?

Norm Chan: They'll be in my video, which I still have yet to publish at the time of this recording, but this conversation has been really great to help me consolidate some of my thoughts and explore them further. So thank you, Kent, for that.

[00:44:02.608] Scott Stein: Yeah, just hi. And I hope I run into some of you here. And you're always a source of ideas and inspiration and learning. So yeah.

[00:44:13.126] Kent Bye: Awesome. Well, Norm and Scott, thanks so much for joining me today to give me all the inside scoop of what it was like to do the Orion demo. I was unfortunately not a part of the cool kids this time around to get a chance to see it. Hopefully, I'll get the chance to see it at some point in the future. But yeah, I just really appreciate hearing all the technical breakdown and details, and I'll be looking both at your coverage that you already published, Scott, and at your video that you'll be publishing here within the next day or so, Norm, to get all the other details and all the other conversations. Yeah, it feels like it's kind of putting a flag in the ground for augmented reality, with both the Snap announcements that just happened last week and now this happening now. So it just feels like a bit of an inflection point for some new exciting trajectories for where things could be going in the future. So thanks again for joining me today to help break it all down. So thank you.

[00:44:55.421] Norm Chan: Thank you.

[00:44:55.681] Scott Stein: Yeah, thank you.

[00:44:57.002] Kent Bye: Thanks again for listening to the Voices of VR podcast. And I would like to invite you to join me on my Patreon. I've been doing the Voices of VR for over 10 years, and it's always been a little bit more of like a weird art project. I think of myself as like a knowledge artist, so I'm much more of an artist than a business person. But at the end of the day, I need to make this more of a sustainable venture. Just $5 or $10 a month would make a really big difference. I'm trying to reach $2,000 a month or $3,000 a month right now. I'm at $1,000 a month, which means that's my primary income. And I just need to get it to a sustainable level just to even continue this oral history art project that I've been doing for the last decade. And if you find value in it, then please do consider joining me on the Patreon at patreon.com slash voices of VR. Thanks for listening.
