I did an interview with Scott Stein at Meta Connect 2025 reflecting on the biggest news: the Meta Ray-Ban Display glasses and the associated Neural Band. Stein has been covering the XR industry since 2012, and he always has some deep thoughts on the broader implications of the latest news. You can check out his hands-on first impressions in his CNET video here, and his interview with Meta CTO Andrew Bosworth here.
Stein and I had a chance to catch up after the day 1 Meta Connect keynote announcements, and also to speculate on the future of the ecosystem in and around the Meta Ray-Ban Display glasses. The Meta Wearables Device Access Toolkit wasn't announced until the following day at the end of the developer keynote at Meta Connect, but there still hasn't been any announcement about if or when third-party developers will get access to develop apps for the glasses, as Meta seems content to focus on its own first-party apps at the launch of the device today.
See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Podcast: Play in new window | Download
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, I feature Scott Stein of CNET. He's a writer who's been looking at VR and AR since 2012, focusing on wearable tech, phones, gaming, and the intertwining of that landscape. And he's got a new newsletter called The Intertwixed, which has more reflections on his personal explorations of technology, but also looks at this intertwining and melding together of all these different design disciplines in the context of storytelling, but also entertainment for XR. So Scott is somebody who also had early access to some of the major demos, and he's had a number of interviews with folks at Meta. He's somebody who, whenever I talk to him at these Meta Connects, is still in the process of digesting and processing and working through the implications of all these new technologies. And by this point, I had also had a chance to do my own demo of the Meta Ray-Ban Display glasses and the Neural Band. So I share some of my thoughts, although honestly, our conversation gets cut off because we started it late and we were getting kicked out of the venue. It ends rather abruptly, but we were starting to dive into some of the larger implications of all these technologies. So I'll say a few more words about my experience of the Meta Ray-Ban Display glasses. I didn't have a chance to do the press demo that people with early access went through; that was basically a super optimized experience where I believe everyone running it was a Meta employee. What they had at the Meta Connect conference was staffed by a lot of people who I don't think were Meta employees; I think they were hired in and taught how to do things. And so there was a bit of a lack of information around what was actually happening, because I was fed information that wasn't necessarily correct. Someone told me, this is a demo where you cannot wear your glasses, and so you're going to be asked to take your glasses off, is that okay? And I said okay, but at that point, I didn't even know for sure if there were inserts or anything to accommodate my prescription. So I went up with my glasses, and they didn't ask for my prescription. And then when I told the docent who was taking me through everything that it was really blurry, they didn't say, oh, well, you should have these inserts. So as Meta rolls out all of these demos, it's probably going to be a similar type of situation, where the level of knowledge of the docents is going to vary significantly as people go get these demos. And according to what I've seen online, Best Buy says that the prescription accommodations are going to be whatever happens to be in stock, or not. With the Apple Vision Pro, you could give your prescription and order lenses, and you'd know for sure you'd have the right thing. But you're only able to buy some of these Meta Ray-Ban Display glasses if you go and get a demo, and if they happen to have the right lenses for your prescription.
That is, if you're between negative four and positive four; if not, then I guess you're out of luck, or it's unclear how that's actually going to work. But for me, it gave me this feeling that accessibility is not the first thing on their mind. They have this trade-off: to support higher levels of prescription, they actually need more evolution in the technology, waveguides that can deal with more curved lenses, versus the negative four to positive four range. So they're launching now with these more constrained accessibility options for vision, hoping to expand that out in the future. But my own experience was that it was not at the forefront of their concerns. The result was that I had a very blurry experience, because I didn't have my glasses, and I would love to have seen what it looked like at negative four to see if I could have even seen it. So if you know that you're outside of that range, then you'll have to go see a demo for yourself and see if it's going to work for you. It is an early technology, so it feels like things are still getting worked out. In my own demo, when they put on the Meta Neural Band, they said, okay, you should feel a buzz. I didn't feel a buzz. So they swapped it out with another band, and then I did feel it. But then they had the configuration oriented incorrectly, set up for the left hand rather than the right hand. So whenever I was swiping, expecting to go right, it wouldn't work, because it was expecting a swipe to the left, since it was oriented to the left hand. It's good that that's an option, but it took them a while to figure out that it was misconfigured. And then during the demo, when I was trying to turn the volume up or down, at some point it turned the volume all the way down, and I wasn't able to turn it back up for a little bit. So it was a slightly glitchy experience, something that didn't feel like it was ready for prime time. It felt a little bit rushed. I imagine that, especially in these first early days, you may have some of those types of experiences. I did not get access to a demo unit to do further, more extensive testing, so your mileage may vary, and I do know that over time Meta has been rapidly iterating and sending out updates. But there has been this general experience for me working with Meta where sometimes I'll put on the Quest 3 headset and things won't properly come up, or I won't see the menu properly, or I have to restart, or it's super delayed. There just seem to be things that should be part of a solid operating system that are still being developed or worked out or changing, just not a consistent experience overall. Which I think is in contrast to my experience with the Apple Vision Pro, where things seem a lot more solid, built from the strong ethos of a company that's been building operating systems from scratch for many, many years.
Meta doesn't have that, so you end up having a lot of these weird workarounds: in order to connect your phone to the glasses, you have to use Wi-Fi, and it ends up being this sort of weird configuration sometimes, where Meta doesn't have control of the operating systems of any of the phones, so they have to have these kinds of alternative paths that just end up creating more friction or wonkiness in the overall experience. And that's sort of what Scott is talking about in this conversation, along with these broader questions about the ecosystem. They're starting with just first-party apps; they're not launching with a full repository of iOS or Android apps, so they're really trying to focus down on what they see as the most essential things. But we're also reflecting on what this larger ecosystem means for Meta as they're moving forward. One of the things that wasn't answered at the end of the first day was whether developers would even be able to develop for any of these different devices. That was answered the next day, near the end of the developer keynote, when they announced the Wearables Device Access Toolkit, which would allow people to essentially get access to the camera, the speakers, and the microphone, so that if you're a mobile phone app developer and you want access to some of the features of these AI glasses, you can do that through this Wearables Device Access Toolkit, which is going to be in preview. The launch date wasn't clear; it was sort of a soft launch, as it were, saying it's coming soon, sign up to get more information, but it wasn't being launched at the developer conference. And there wasn't any news as to whether this Wearables Device Access Toolkit was going to include the heads-up display of the Meta Ray-Ban Display glasses. I'm sure that it probably will eventually, but it wasn't top of mind for them to announce that it will be coming at some point. At this point, it's only first-party apps, because I think Meta sees that these glasses are really AI glasses, and that it's really an opportunity for consumers out there to start to interface with their AI technologies. So most of the things that they're focusing on are these first-party applications, trying to look at the ways that you use your phone most often, and figuring out if they can create applications or experiences that would allow you to interface with your phone without having to pull it out and look at it. So we're covering all that and more on today's episode of the Voices of VR podcast. This interview with Scott happened on Wednesday, September 17th, 2025 at the Meta Connect conference at Meta's headquarters in Menlo Park, California. So with that, let's go ahead and dive right in.
[00:08:06.474] Scott Stein: Sure. Well, I'm Scott Stein. I write for CNET. I look at VR and AR, and I've been doing that for a long time, probably since 2012 in that territory. But I've also looked at wearable tech for a long time. I've looked at phones and gaming, and I like to look at the intertwining of the landscape. I'm actually working on a newsletter right now called The Intertwixed, which is kind of about that a little bit. My background before that was creative, playwriting, and other things I dabbled in. So I like to think about those things in what I write, but also in what I'm experiencing. Yeah.
[00:08:39.531] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into the space.
[00:08:43.854] Scott Stein: Sure. So, yeah, my background, like I was saying, I used to write about tech experiences way, way back. Maybe I even talked about this one time, but in the 90s I wrote about chat rooms and online role-playing games and dreamed those kind of science fiction-y thoughts. I moved into tech through mutual friends who knew I liked writing about those things and said, well, maybe you want to work in tech because you like those kinds of future things. And it kind of led me down this road where, bit by bit, I became a reviewer and tech journalist. But the thing that's grabbed me the most in the past bunch of years, and maybe through all of what I'm doing, is the experiential kind of journey: what it's doing to me, what it's making me think, and the uncertainties about it. I think when a product becomes finalized and mainstream, there are also lots of uncertainties, but it becomes a little bit of a different process. So I like when things have not quite emerged yet, which is what's happening now in AR and VR. And then also I'm increasingly interested in neural tech. Any AI in there as well? Weirdly, not, although I am. It's funny, because you'd think AI is right there. I'm writing plays about AI now as I dabble in my own creative writing, plays exploring that territory. From a journalistic standpoint I find it hard because, I don't know, there's something amorphous about it. I find a sort of futility in a lot of the things I do with AI. I never feel like I'm doing it the right way or that I'm really there. And I know there's a lot of polarization and perspective on what even AI is about right now and how real it is. So I'm more interested in AI as it interacts with sensor systems on the body and in things like AR. That deeply interests me, you know, all the multimodal stuff and the way it works in glasses. And so that's where I'm going to connect with AI. AI as just a prompt box? Not really. Nice.
[00:10:38.192] Kent Bye: Great. Well, you're the first journalist that I've had a chance to talk to after the keynote. I had a chance to talk to Anshel Sag, an XR analyst, as well as Norm Chan, who had an early look at the Meta Ray-Ban Display glasses that are going to be coming out. So I saw the keynote, and I had a chance to do the demo after the keynote, so I've now tried that out. I've done, I think, all the demos that are here, seen all that stuff. So I'm just curious to hear your top-level first thoughts on the new tech, all the announcements, and where you're at, as a sort of vibe check on your impressions of a lot of the announcements being talked about here.
[00:11:12.318] Scott Stein: Yeah, so it's a great question. And I kind of feel there's a bit of a crossroads. The crossroads has been here for a while, but where Meta's at right now, it was so glasses focused, and also a little bit of VR. And it's kind of been that way for a while, but last year there was the Quest 3S. That was a big price-accessibility thing, and the Quest 3 was just a year old. They're not going to have a Quest headset every year; they probably shouldn't. But it did feel like the focus was so much on AI and glasses, at a time when everybody else seems to be working on glasses and AI as well. It's a territory where Meta already has a head start and a big boost in interest from the Ray-Bans, so it makes sense to pursue that. At the same time, there's a lot of competition coming in, and Meta, I think, needs to keep moving fast and solve for some things that they haven't solved for. So I look at the space with a lot of fascination, but also a lot of curiosity about where it's all going to really end up and how they're going to get there. You know, all the stuff in the past with Michael Abrash, a lot of visions installed in the present, where I've heard a lot of talk about VR. We're in the unloading zone, it's Meta Connect, there's a lot of stuff going on, there was a Diplo concert. But VR is the computer, the glasses are the phone. And I've heard that for a long time. The thing I was talking about with a lot of people is that, coming out of all this and having done all the demos, I do think people think, okay, well, it's a computer and a phone. How do the computer and the phone live together in the world? They're separate, but they do have to interact. There are a lot of commonalities, but Meta hasn't developed those commonalities between their two big existing products. Right now, the Ray-Bans and the Quest headsets do almost nothing interchangeably with each other. And for the immediate future, neither do the Ray-Ban Display glasses, with their beginning level of apps and neural interface. While I thought it was fascinating, and I tried Orion last year too, what you're looking at is a more here-and-now, foot-in-the-door beginning of that tech. I think the neural wristband is fascinating; I've been really interested in that tech, and it does seem to work. But I think not having the full display of the Orion and not having eye tracking means that you're leaning a lot more on heavy navigation and a smaller viewing area. And even when you look at the live stage demos, which sometimes failed, and who knows why, I also felt there was some level of awkwardness in the interaction. You have a lot of interface navigation going on. When I did it, it was very swipey and tappy and scrolly. You had to know, like, middle button, first button, you know, middle finger, first finger, thumb swipe on your hand, do this. I was forgetting a lot of them. Hold the fingers and then turn the dial. They throw a lot of stuff at you, and I think that's a language that's going to take a lot of time to learn. So I think it's a very rough starting road. And also I wonder about friction: how long are you going to wear the band for? If it's only doing things as a band, it's not a watch. But then, also, how many apps are they going to get on that thing? To me, everything is about phone conduits, and we've seen this from Qualcomm and others for years.
The vision, which I agree with, is that the phone is a processing conduit to run the glasses. It seems like a lot of people believe that, yet here we are in 2025, and very few people are actually making that happen. Android XR seems like that's part of the plan. Some of the XREAL glasses development being worked on for next year seems to be a little bit about that. It's unclear what's going on with Samsung on that front. But Meta, you know, all these updated glasses still connect with phones the same way. Like, I just started playing around with the second-gen Ray-Bans: still a Wi-Fi connection, which is awkward, local Wi-Fi plus Bluetooth. I don't see that there's going to be any difference with the Ray-Ban Displays either, although I haven't seen how they work with a phone. And that's concerning, because I think that when you expect something to really work all the time in an ambient, everyday way, and then it suddenly doesn't work, that's a real disconnect. VR is like a come-in, come-out, have-a-session thing. And a lot of people talk about how the beauty of this time period with VR is that you are not deeply connected with everything else. Maybe at some point you are. So it works that way, but an AI cognitive assistant, a neural-linked, all-day wearable, can't not be connected to all the things that you do. And I haven't seen proof that this Meta Ray-Ban Display can connect to all the things that I want to do. I think there are going to be a lot of ways it won't. And then what happens? Am I staring at my watch? Am I looking at my phone? Question mark, you know. But also, I didn't even get into the fact that they don't work with my glasses prescription, which is...
[00:16:02.376] Kent Bye: There's another accessibility dimension beyond the neural band, which is that you're at a minus eight, a heavy prescription. Yeah, I'm minus five point five, and they didn't even ask me my prescription. Did you try any sort of corrective lenses in the end? They didn't even offer that as an option for me. Well, no. So I saw it in the evening here. Okay. So I just saw it, and they just said, oh, I'm sorry, but this is a no-glasses experience. But apparently they did have some corrective lenses. So for you, if you didn't have a way to see it without glasses, you didn't have an option. Yeah, they just said, this is a no-glasses experience, you have to take off your glasses. But they didn't ask me, so I don't know if there was a breakdown in the way that they're doing it, but my whole experience was that they weren't even concerned that I had glasses and that I needed corrective lenses, because when I saw it, it was so blurry I couldn't see anything; I had to squint. And so I was like, is this how they're treating accessibility, that they're not even asking me what my corrective lens prescription is? And if they do have corrective lenses, I would love to see if I could actually see the display with them, because there are the corrective lenses for me to see the world and not be blind, but then there's also this other display dimension, and I don't even know if the corrective lenses that they might have would allow me to see it. So there's this whole experience of non-accessibility that I had, at least with how they're running the demo here tonight during the after party.
[00:17:28.344] Scott Stein: Yeah, it gives me a lot of concerns, too, about what's going to happen with the retail experiences, because they're doing it at a variety of places, including Best Buy. And I don't mean to single out Best Buy, but I think the Apple Store Vision Pro experiences are very heavily curated. I actually have not done the Apple Store retail experience; I really need to do that. All this time, I've been doing Vision Pro stuff at home. But yeah, that's a big concern, because I think about the Vision Pro, which is another thing you have to get specialized lenses for, and if you don't have those, then what do you do? But if you have inserts in a drawer that could line up with that, that could help. I don't know if Best Buy's going to have that for these glasses. I have my doubts. And the way they did it in my demo is what I actually can talk about. They knew my prescription from the past, because last year when I tested Orion, they didn't tell me that I needed to bring contacts, and so we actually grabbed Mark Zuckerberg's contact lenses in a pinch, which was an amazing, weird anecdote. Luckily, they worked, but the answer is they didn't have a solution in mind when they started the demo. This time, they did have inserts in the demos, and they scanned my glasses, and they had an insert that matched mine. The thing was, they're thick inserts; in the photos, you can see that it makes the glasses look chunkier. But I was trying to explain that that was because of these specialized inserts. They only support the minus four to plus four prescription range. I'm a minus eight, minus 8.5. But I said, they're working for me with the insert; are you going to make any sort of insert at least? No. We hope to have a solution in the future. I mean, I do too. And so I found that concerning, because the thing I wrote a week or two before was about smart glasses needing to get better, and will they get better fast enough? And I started with this feeling about battery life concerns and prescriptions. It's a weird anecdote, but embarrassingly, when I reviewed the Meta Ray-Bans, I got set up with a prescription pair that was at minus eight. I didn't realize at the time that that wasn't a standard prescription option, and it was only when I tried to do some stuff myself afterwards, because I was on a time frame, that I realized, oh, it's minus six. How do you get a minus eight? There are third-party options, but they're not officially supported, I guess. But it's awkward. And then they sort of say, you can't really order it. I thought, okay, well, hopefully they'll do better than that. And now you come to the Meta Ray-Ban Displays, and they've done worse. So it's true, because then you also have the storyline of accessibility, which is kind of coming through in terms of the possibilities of the neural band. I got a chance to talk to a Paralympic athlete via Meta, who was great and gave a lot of impassioned thoughts about how he uses the Orion and the Ray-Ban Display glasses, kayaking and other things, wearing the band on his wrist, and how there's a lot of opportunity for that. I admire that, and I do hope that Meta continues to think about accessibility. I don't know about accessibility to the extent that I probably should, but I know my own issues in terms of my vision. And they haven't made that basic element work. So it's a big question mark.
And it's only one of a number of question marks I have about where the glasses are going, because if you buy an $800 pair of these glasses, which is not that expensive considering it includes the wristband, it's still a lot of money for someone who wants a pair of glasses. And when is that going to be outdated by a newer pair? Because we're in very early days, where things can iterate quickly. So it seems like an experimental product; I think it's very clearly telegraphing itself as an experimental product. So then what's the messaging here? What is really going to happen? When are more apps going to come?
[00:20:56.318] Kent Bye: Yeah, because there was no announcement of any third-party developer program, no SDK, nothing that would indicate that they're even thinking about bringing in external third-party developers to create experiences for this latest line of smart glasses.
[00:21:10.598] Scott Stein: Yeah, and I'm going to talk to some of Meta's executives. I'm talking with Boz tomorrow, Andrew Bosworth, and I'm talking with Vishal Shah, and I'm hopefully going to work on some stories for next week about that. But I want to ask some of those questions, too, because I saw developers here who are asking those same questions. And obviously this is a developer conference, so, what do I do with the big new product that you announced, exactly? Nothing? And there was no developer kit for the audio-based glasses, which there were a lot of reports they would do. And I think there would be interesting applications you could build for those, amazing applications. And it feels, years in, that it's time to do that. I think it really behooves them to do it, because they're going up against Apple with its App Store and Google with its app store, phone ecosystems that can connect to peripherals, and Meta's fighting uphill against that. You know, there's the whole, well, this replaces your phone. It doesn't replace your phone. In fact, all of Meta's things pair with phones. So they need the phone; none of these are truly standalone, for the glasses. And you need apps to work. And right now, with the Vanguard Oakley glasses, they have Garmin connectivity for health and fitness, which is an interesting foot in the door. But I kept wondering, what about other fitness services? What about other heart rate trackers? I've already thought that about the Quest headset, where I love Supernatural for fitness on the Quest, and I've been waiting for universal heart rate tracker support on the Quest, where you go, why can't I just pair it? It can show me my stats. Even that Garmin thing would be great in a Quest if you're doing workouts. And I don't see them connecting the dots. So it's a little bit of a question in VR, but it's very much a big question in the glasses, how they do that. And I don't think they can sit pretty going, we don't need this stuff. Maybe they announce it next year. But with Android XR coming and all these other things, I don't know what type of road ahead they have. I wouldn't feel like, oh, they've got the space cornered. I mean, EssilorLuxottica is a huge brand, but Apple's a big design studio as well, and Google's working with Warby Parker and Gentle Monster. So yeah, I feel like it's an area not to get complacent in. Of course, they just created a neural band and display glasses like nobody else is making, so I am damn impressed with that; I don't mean to sound like I'm not. But it opens a whole bunch of doors that I don't have answers to. It's like the amazing demo raises even more questions, like a J.J. Abrams move, you know, like a Lost episode or something.
[00:23:42.158] Kent Bye: Yeah, with this whole rhetoric around these glasses replacing the smartphone, there's never been a piece of technology that's completely displaced previous technologies; the older technology still exists, and the medium still exists. And I think that's part of the rhetoric, because Meta doesn't have an operating system, they don't have an app ecosystem. So they're kind of working against the fact that Google has their whole ecosystem of Android apps, and Apple has the whole ecosystem of iOS apps, and without that, they're left with whatever first-party apps they can make. Because it is a new product, they can maybe get away with having some fitness apps, very context specific. When I did the Oakley Vanguard experience, I don't know if you had a similar experience, I was like, I wonder if I can put this on with my existing glasses. And of course you can't. And I said, are there any options here for people who have prescriptions? Do you have any corrective lenses? And they're like, no, this is for athletes, with the implication of, well, I guess athletes all have perfect vision, because there are no inserts that you could put in there. I don't know, I just felt like, oh, there's an assumption that in order to use this, you have to use contacts. But my experience of it was that you have these exercise integrations. And maybe it's worth mentioning, you kind of alluded to the live demos that they showed during the keynote: a lot of them failed, and Boz came out and said, well, this is my team, I have to figure out what happened. And my experience generally with the operating system and the software is that it's very glitchy. It's different every time I put on my Quest. Sometimes there'll be these long delays before things come up, and just random things that don't work. The quality control over their operating system, and on the whole the user experience of Meta products, has been wildly inconsistent over the years, and also very buggy. Simple things, like pairing Bluetooth to something, which should be simple and a standard operating system thing, end up being a couple of extra steps and non-intuitive. So I feel like they've been able to make it by having first-mover status and all that, but a lot of the value has come from having a vibrant, robust ecosystem of third-party developers adding to it. It feels like they've made this strange pivot that almost feels like, well, we're just going to go it alone, and we're not going to need anybody else to do what we're doing. Or at least they haven't opened up the doors or given a vision for where this whole product line is going to go in the future, inviting people to co-create it with them, rather than, we're just going to provide you the one AI service with Meta AI, and the only thing you get is our first-party apps. And Zuckerberg said this is on a trajectory to be one of the most successful product launches in history, but at the same time, it feels like without that ecosystem, how much is this really going to take root? Those are the concerns that I see.
[00:26:33.959] Scott Stein: Yeah, I would imagine that other AI companies would be very excited to work on the Ray-Bans, whereas there are platform wars on phones. I feel like the Ray-Bans have a really good footprint, and who wouldn't want to be on that too? It does feel like Meta is willfully deciding not to pursue partnerships there. And that closes things off. I know people who are using the glasses for accessibility purposes, to aid with their vision, and I've had questions asked of me like, how do I access my emails, or how do I access my files on the regular Ray-Bans? And it's like, well, you can't really. There's no real easy way to do it. And that's really frustrating, because that should be something that works. And something else you were saying, too, about the way this all seems to be lined up, anecdotally: a lot of demos here have been failing, and I don't mean to knock a tech event, but I just noticed something interesting. Stage demos were failing, but also a lot of my individual demos, VR demos and other things, had a lot of bad luck, things failing, things not setting up, and, actually, sorry, we can't get this to work right now, come back later. At one point I was doing a Horizon demo, and they were like, sorry, I need to talk you through it because the casting's not working right now. And that is exactly your point, which is that a lot of this stuff is actually pretty janky. And people who use the headset know that. It's forgivable, because when you get things running, and it's mainly a game console for people, it's just kind of a fun thing you're doing. But the moment you're wearing glasses that are supposed to be an assistive everyday tool, then you're in mission-critical land. It becomes a whole different level of performance. To me, the glasses largely succeed because they do a lot of individual tasks that work well and don't have interruptions. Bluetooth audio is a standard system, so they work as headphones with the really good mics that they have and everything else, but that's a more straightforward technical achievement, and they work really well as headphones. The camera, yes, they've worked on camera technology that's solid. It's annoying to sync them, but they work as a portable camera. And the AI is a little bit of gravy in the mix. It's like a bonus, and it can range from being wonky to being actually really cool. But I think it's not the first reason people get them, although increasingly it's a reason. And now, as far as I see, the Ray-Ban Displays are really going full forward into it assisting you in your life. And that puts it right at the forefront: what are you seeing in the display? What are you doing with them? What else am I doing? Because honestly, if you're only doing a few everyday things with occasional display use, I don't think it would make any sense to spend up for the $800 glasses. But if you really want to power-use all that stuff, you would. And then it's like, well, what is that for? If I'm doing a live video chat via whatever, Messenger or something, at what point are you going to feel frustrated and just pull out your phone and do a FaceTime or whatever, a Google Meet? I think it goes back to, I was making this analogy before, the early days of reviewing smartwatches. The first time I reviewed the Apple Watch, I called it complicated, in a negative way.
Because I thought, you're always battling with how long am I going to last until I pull out the phone in my pocket. If an app is taking forever to load, which apps on the first Apple Watch did, what am I doing here? Apple Watches did get better and better, but even now, there's still a lot of stuff you pull out your phone for, of course. A lot of the app ecosystem on the Apple Watch decayed, I think. So I have the same questions about the glasses. It makes more sense that the glasses are extensions of the apps on your phone, that they become, kind of like those XREAL and Viture and other glasses, a wearable monitor extension and more, but an extension versus its own ecosystem. Meta's building up that ecosystem from scratch, but I don't think that actually makes sense at all, because we're never leaving our phones behind. So it's up to them to bridge that. And it's entirely up to them. And the race is on now: can they bridge it before Apple and Google do, and succeed at that? That's a big, big part of the road they need to climb. And also, I would really think, bridge it with your VR, because you have so many app developers out there who were so passionate about making experiences, and right now they have no road to get to the glasses. And I don't know how that momentum lasts, or how it lasts financially. It's a rough world out there. Already, people are trying to make things work and then they give up, or they go to AI, or they do other things, and what's Meta doing in that equation? So it's very existential, is what I'm feeling.
[00:31:39.838] Kent Bye: The only breadcrumbs that may have been dropped by Mark Zuckerberg during the keynote were in the announcement of the Meta Horizon Studio and the Meta Horizon Engine, where he was saying, yeah, eventually you're going to be able to use this new production pipeline that is getting away from Unity runtimes, very similar to what Snap's doing with Lens Studio: something with a much smaller and leaner runtime that's able to run on Snap Spectacles. He said something along the lines of, this is how other people are going to be able to get experiences onto these glasses. But that felt very speculative, sort of a promise, but with no clear timeline, no roadmap, no sense of what that even looks like. Because what they're launching with is essentially a music app, and then there's a maps app, and there's the camera, which is probably the biggest, and there's a chat app that ties into WhatsApp. As I was going through the demos, there were basically six to eight apps there. But like you said, it's not a full ecosystem of content, and it's not integrating with the existing ecosystem of apps that you may already be using. Maybe WhatsApp will be an instance where you're texting with somebody and you could have a WhatsApp chat, but even having a WhatsApp call, something that seems like it should be pretty straightforward, you know, that was the one thing that failed, where we sat there watching Mark Zuckerberg trying to answer a WhatsApp video call from Boz for what felt like five minutes. Yeah, yeah, exactly. It was like...
[00:33:05.569] Scott Stein: Where are you? And they have those services, but we have a lot of other ways to connect. So yeah, that's a big, big question mark with these things. On a technical level, these glasses might be doing things that no other glasses are doing. But the monocular display is not a great solution. I found it kind of half there, which it is. And I don't know, again, for an accessibility purpose, how that's going to feel, especially with a smaller display. And Snap's glasses, well, first of all, they're developer glasses. I just looked at them again a week ago and got a preview of the 2.0 OS. There are a lot of rough edges there. But credit to them, like you're saying, that they are starting with the creators and developers who are making experiences. You'll go to a conference like AWE and see places like Niantic Spatial, Resolution Games, and others building for that; people are making stuff for it because it's the only way to do stuff, to begin to dream like that. They're like the main game in town for that. Meta seems like it has an opportunity to do that now, to be like, oh, are you curious? Start playing around with this. And there are plenty of people who would find that worthwhile, especially on a headset that actually doesn't cost that much, relatively. Vision Pro is $3,500. This is $800. This is in a realm of affordability. And there's no path to it. And it does feel like an insult to the community. They may be trying to make sure the system is stable enough to develop things, or to develop a pipeline, or who knows what, but it's a hard feeling at the end of a developer conference. That's never a great feeling. When Apple released the Vision Pro, it seemed like a big pop surprise to a lot of developers, without a lot of support at the beginning. I'm speaking outside my own experience, but from what I gathered, people were kind of having to play catch-up, and then it's like, well, how many people are going to own this thing? But there was a way to develop apps for it. They weren't like, we're going to spend a year not having apps; we're just going to have Apple experiences, you're going to see that, and we'll catch up in the future. So I think it would also support the idea of these being a true wearable computer, the Displays, which I think they are in a lot of ways. I mean, they're not a full standalone computer, but I think it would help to have apps to show that off. So maybe it's because they're not standalone. There is probably a challenge with that, because everything has to run through a phone, and then where are you putting those apps? Are you putting them in the Meta AI app? They may not even have the capability of physically doing that right now, and that just puts things in a strange space, and that feels to me like where the glasses space is right now. It's at its best when it's simple, and then you add complexity. The neural band is definitely a sign of future interfaces, and I think it's a real paradigm for future interfaces. But I also think it's going to have to work truly with everything else that you do. And there's no indication that the neural band is going to work with your computer or your phone or your ambient devices in your home. And that seems like the holy grail. And you're also wearing a watch on the other wrist, probably, or a fitness tracker. And so what is it right now?
You know, it's a necessary adjunct to the glasses, but it's not the full answer to everything they want it to be. I mean, it's only day one, but when does the Orion vision arrive, and what comes after? Because even Orion's not the full vision. Everyone's like, oh, I hope you get to Orion. But you get to Orion, and what is that? It still needs to become something much more ambiently interlinked with everything else that we do. It can't just be a standalone mini Vision Pro pair of glasses. I'm still waiting for the Vision Pro to connect with the phone and the iPad, and it still feels way too walled off. So I'm having frustrations like that. I'm not trying to pile on Meta for this; I'm having frustrations like this all across the board in the landscape. It's just that today didn't answer those questions.
[00:37:03.894] Kent Bye: Yeah, when I look at Meta's history with the preservation of cultural heritage and the history of VR, I think it's a terrible legacy, starting with Oculus Share, which they just took away and deleted without any sort of archiving of those experiences from the early days. When it came to Gear VR and Oculus Go, again, they just seemingly nuked those platforms without any pathway for preserving those experiences. And they've done that with some of these social games, where they'll just shut them down, which John Carmack has critiqued them for, saying, no, when you're building these communities and these experiences, you know this is very meaningful for people. And so the history of the way that they shut things down gives me pause, or fear, that someday the Quest platform will face the same kind of fate that Oculus Share, Gear VR, and Oculus Go have faced: being a transactional means to an end. Like, if the final end is these AR glasses, then were all these third-party developers just helping to get to this point where Meta could go off on their own and build their own first-party apps? That's kind of what it feels like. My hope is that it's going to continue to be around. There are still a lot of VR demos here, but I get the sense that if there's another shiny object, like AI smart glasses, maybe their attention goes elsewhere, without a commitment or indication that there's going to be a third-party ecosystem. Maybe we'll hear more of that during the developer keynote, which is a little bit more technical; this one seemed to be more consumer-facing. But I'm very curious to hear where they have a vision for this ecosystem as we move forward.
[00:38:35.771] Scott Stein: Yeah, and there are a lot of hopes I have for the Quest platform that I still want to see come true. It's two years since the Quest 3 came out, so it's not that long. But there are so many capabilities on the headset. I've always thought the mixed reality is fascinating. The whole promise of things like Augments hasn't come yet. Meta's codec avatars still aren't here, but Apple's doing the increasingly realistic Personas. And I thought Hyperscape Capture, which is that 3D Gaussian splat room scanning that's now available in a beta, or should be available very soon, was wonderful. I saw that David Heaney at UploadVR also felt that way, and I thought it was a really smooth experience, although I didn't get to see the results of my capture, so that's still TBD. But the capture process that they showed, and then some of the final results, really impressed me on a low-cost headset. And it opens up all these possibilities. People go, can I scan my space? Can I export it? I don't know. So it's going to be for Horizon, and hopefully it's something you add to and build and create from. It gave me a lot of hope. But there are areas like fitness. I use my headset for fitness mainly now; it's one of the main sticky things. I was talking to someone else about this today. I guess maybe sometimes it's sticky and sometimes it's not. But to me, where's the commitment there? I've only been told that it's a stable part of the platform. That's what I was told last year when I talked to some of the team. I think it should be a lot more than that. And I would imagine Google and Apple are going to get deeper into fitness. They showed elements of fitness on the glasses, but there's no connecting through-line. And also generative AI: we saw agentic AI ideas for building Horizon apps, but that's on a PC. And I asked them when I met with them discussing that demo, which I got to see yesterday, what about in headset? To me, the fun, the holodeck thing, is to build those experiences on the fly. I feel it's probably their vision, too, in the end, as a dream. But it's not happening now. And there's also not a lot of magical generative AI happening, even though Android XR is apparently going to be doing some generative AI. Oh, hey, are you wrapping up? We're all done for the night, unfortunately. We're getting kicked out, so we're closing it down. They literally asked us to leave. It's understandable. Everybody's got to go home. It's been a long day here for everyone, I'm sure the staff, too. But yeah, I guess what I was saying is, the agentic stuff, it should be like the holodeck, and they have no way of getting there yet. When I talked to Andrew Bosworth, maybe about a year ago, about AI showing up on the headset, he made a comment that it's harder to do scene understanding, deeper AI understanding of VR. Not to put words in his mouth, but it might be that Android XR with Gemini, and I still have to see how that works on Project Moohan, it does seem like it's understanding more about the windows that are open versus any sort of deeper 3D spaces. But at least there was a lot of live AI going on in Moohan when I demoed it. And Meta is not doing that yet on the Quest. So I'm waiting for that. I'm waiting for a lot of those things, and stability. Maybe that's a lot to ask for in a low-cost headset, because they're not doing a pro headset.
But then, we also still have never heard about the partnerships they promised with OEMs. We thought there would be an ASUS ROG VR headset, something to provide a counterpoint to Samsung, with Project Moohan being imminent. And if Apple does a Vision Pro 2, or even just the Vision Pro, there are things out there that are more enterprise- or tech-explorer-ready. Yet you see James Cameron talking about the Quest, which I thought was really interesting, because it points out that you're dealing with subsets of access. For all that I'm complaining about the Quest, it is by far the biggest headset, so it's the way to reach eyes. And Disney Plus is coming to the Quest, because how many people are wearing a Vision Pro? No one's wearing that, and it's not priced for everyone to wear. So it makes sense: if you want people to enjoy a spatial video experience, go to where the eyes are. Meta at least is exploring that, but it feels more like a counter-response to Apple. I think there's so much more they've already started that I'd like them to finish. And it feels like the dangling carrot of AI and the sparkliness of the Ray-Ban success is pushing that cart forward a lot further than it might have gone with the natural co-evolution of these devices. Maybe that's not the case, but it now feels like they're pushing all the chips to the glasses side of the table. And then you're hearing about design collaborations and things like that. I mean, design is great, but the function is the lurking thing. And again, going back to the basic Ray-Bans, some people said, well, come to me when you can shoot in horizontal. I saw that from Peter Sciretta, who is a great theme parks writer and writes about film stuff. That's a great point. And syncing better, so I don't have to break my Wi-Fi connection when I'm shooting on the Ray-Bans, like when I was covering the keynote. It's very hard to pull the photos off for quick reactions. Your phone is going to be so much faster for that. You're really hobbling your process trying to shoot photos on your glasses and pull them off and share them, unless you're going right to Instagram. So it's stuff like that. I still feel like they need to make that process even smoother, even better, and they didn't do all of that, although the battery life looks promising. I guess, to kind of wrap it up, I'm really excited to look at the new glasses. I'm already wearing the Gen 2, so I'm excited about the updates to that, because I found the Ray-Bans to be one of the most surprisingly useful devices of the past few years. And I love what the Quest can offer, and I want to see more of that. And I'm really interested in the Ray-Ban Displays for a review; I just feel like I need to know more. Luckily, they're coming out in a few weeks, so the review is TBD. We'll have to catch up more then and talk about it. This feels like the one shoe dropping. I'm very hesitant to prejudge it, but of course I have. These are questions, and then hopefully we get answers. I bet this platform evolves so much over the next year that it'll feel very different a year from now. So it's going to be a gradual platform for them to explore how all these things work. Even the review won't be the review; it'll be stage one. Look at how they evolved the Quest and other things. I'm sure they're going to evolve this. Meta is very good at evolving products, but they're not good at consistently fixing all the problems that pop up.
But I'm always amazed at the features they add. So I expect features galore will be added to the Ray-Ban Displays. But which ones? You know, it's like, give us a hint. There's a lot to take in, but I do need to know, because how capable will this run of glasses be?
[00:45:38.883] Kent Bye: Awesome. Well, we're kind of shutting everything down and going to be jumping onto these shuttles here momentarily, but I really appreciate you taking the time to share some of your thoughts here at Meta Connect. So thanks again.
[00:45:48.191] Scott Stein: Of course. It's always great to talk to you, Kent. I'm surprised my voice has survived. More to come tomorrow with the developer keynote, and I'm wiped out from writing some stories, but it's been a blast. Awesome. Thanks so much. Yeah, thank you.
[00:45:59.824] Kent Bye: Thanks again for listening to this episode of the Voices of VR podcast. And if you enjoyed the podcast, please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.