#1219: Apple Vision Pro Hands-on Debriefing with CNET Editor Scott Stein

I speak to CNET editor at large Scott Stein about his hands-on impressions of the Apple Vision Pro that he had on Tuesday, June 6, 2023. Check out his full article: “Apple Vision Pro Hands-On: Far Better Than I Was Ready For.” Also see my Twitter thread of live coverage of #WWDC23.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR podcast, the podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash Voices of VR. So on today's episode, I have Scott Stein, who's the editor at large at CNET, and he's seen quite a lot of VR demos. I think Ben, of all the different XR journalists, has probably seen the most demos of all the things that are out there in the XR world, but I feel like Scott is right there in the top three, because he's gone to a lot of things that a lot of folks haven't been able to go to, like Reality Labs at Meta. He saw the early days of Magic Leap. He's basically been out and about covering the tech industry for a long time now and seen lots and lots of demos, including the latest Apple Vision Pro, which is what we're really focusing on here in this episode. But he was able to draw from all of his other associative links from other platforms he's seen. And this interview is a little bit different than the other two interviews, because both Ben and Ian were still on the road, talking to me on their cell phones, but Scott had actually come home and was at his full rig, and so we had time to dive in for a full hour and 45 minutes or so. And this episode ended up being a little bit more unedited than I usually do, because I just have to get this out before I hop on a plane in about an hour or so to go off to Tribeca. So you'll get a little bit of a raw discussion as we start to explore both our impressions and reflections on Apple Vision Pro. Well, not my hands-on impressions, just my impressions based upon what I've heard from Ben, Ian, and Scott as they share their hands-on impressions. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Scott, like the ones with Ben and Ian, happened on Tuesday, June 6th, 2023. So with that, let's go ahead and dive right in.

[00:01:56.087] Scott Stein: My name is Scott Stein. I'm an editor at large at CNET, and I cover a lot of different VR and AR tech. I also look at computing, kind of the future of computing, wearable tech, gaming, and VR, AR, though, has been a really big focus for me for a number of years now.

[00:02:13.256] Kent Bye: Can you give a bit more context as to your background and your journey into covering VR?

[00:02:18.138] Scott Stein: Yeah. So for me, I actually got an MFA in theater way back. So my background was all creative. I mean, I was creative writing and theater and playwriting. And I moved into tech probably around 2003, 2004, moving back to New York. But it's funny, because the plays I used to write were actually about tech. Way back I was writing about chat rooms, that was my thesis play, and then I was writing about MMORPGs and virtual worlds and AI. So this stuff was on my mind even then in the late 90s. And it's what drew me eventually to what was going on right now. I mean, right now, meaning I feel like 2012 onwards, you know, where Oculus, Google Glass, and all these things that I had been thinking about for a while were starting to happen. I had already gotten a job working in tech. So that was not the reason I started working in tech, but it was this synchronicity. And so I tend to think about it in terms of theater and human experience. And I feel like that's the stuff that's the most dynamic. And I'm interested in the idea of the theater of how these things are demoed even, you know, or the idea of what we're becoming with these things on us. So for me, that kind of holistic question, that cognitive question, that's what keeps me going writing in tech at the moment.

[00:03:37.293] Kent Bye: Awesome. That's really cool to hear. I wasn't aware that you were coming from that theater background, but it makes total sense, especially with how much immersive storytelling is happening in the space. And that's a beat that I really love to cover. But I want to take us back to May 23rd, 2023. This is when both Road to VR and Upload VR reported that they had been invited to WWDC to come cover Apple's keynote. Now, these are the independent VR journalists, you know, both Road to VR and Upload VR, also Norman Chan from Tested. These are all folks that have never really been to these big events like Apple's. And you've also been covering the XR space and have tried out pretty much all the different headsets that are out there, covering it really closely, for CNET, which is a lot bigger of a publication than, say, these indie publications of Road to VR, Upload VR, or even Tested. But I wanted to ask, had you ever covered big Apple events like this before? Or was this also your first time covering an Apple event?

[00:04:43.828] Scott Stein: No, so I have been to a lot of events before, and that's why this was really cool, because it felt like the great convergence. You know, I had reviewed iPhones back in 2012. That was my first Apple event. And I kind of moved at CNET through a lot of different landscapes of tech. I was a laptop reviewer at first, then I was really interested in moving into tablets and phones, and that's what I did. And then I was looking at mobile tech, but I was looking at wearable tech, because, you know, I was more interested in the idea of the extensions on ourselves then. I was looking at iPads and AR and VR. And so, you know, seeing this now brought a lot of different worlds together. And it was really cool seeing people that I've seen at Meta and things like the HoloLens or whatever, and then seeing a lot of people that I've seen at Apple events. And I kind of steeled myself for that too, because, you know, I'm at a mainstream publication, but I look at it like, I like going into the smaller zones when I can and thinking about what the breadcrumbs are that you can see are happening. I'm not seeing stuff in the detail or the continuity that specialist publications are, but I like to sometimes when I can. And then when it gets really big like this, kind of like when Meta, you know, announced its move to the metaverse, you can sense a lot of different people coming in that hadn't been there before. That's also the cycle of stuff becoming mainstream. So yeah, I just thought it was very interesting, because I'm used to the kind of feel of Apple events, and I'm also used to the feel of VR events. And this felt like both.

[00:06:14.683] Kent Bye: Okay. Yeah. Well, I know that certainly a lot of folks in the XR industry have been long anticipating that Apple was going to make this announcement. I think a lot of us kind of knew that behind the scenes they were working on something, not only from the people who had been working there, but their acquisitions and their patents. All signs pointed towards this being where we were eventually going to get to, with them working on this. Of course, there's been the intrepid reporting from Mark Gurman from Bloomberg, who's been reporting on a lot of this stuff over many, many years, giving an insight as to maybe some of the behind the scenes. But all indications for me, at least when I was at Augmented World Expo, just from talking to the folks there, were that it was actually going to happen. So I don't know if you were walking into this event being pretty sure that it was going to be announced, but I'd love to hear what it was like for you to actually be on site, watching the keynote, getting to this moment of, oh, and one more thing at the very end, after they've been dragging it out, with all of the people in the XR industry waiting for this to happen, in that moment when it finally comes. So I'd love to hear what it was like for you to be there, and if you were expecting it to happen.

[00:07:18.923] Scott Stein: Yeah, when I saw who got invites and who was going, and also with all the reporting that was leading up to this... although I kind of felt like I'd been following this for so many years. I mean, way back in 2018 at CNET, Shara Tibken had written that there was a headset, and I think that was one of the first stories on it. And so, you know, I felt like I'd been crying wolf, you know, every time I was following this industry and looking at it so incrementally. I mean, I was at WWDC when they had ARKit. I looked at the first ARKit demos. And they brought the Vive out then. And I looked at each increment of ARKit over the years, and when they had multiplayer AR, where they were playing like a ball game with iPads at one WWDC. So I kept feeling like, oh, it's getting closer, it's getting closer. But here I felt like I was trying to be a little cautious in saying it's going to happen, because I didn't want to get burned. But I was like, okay, it's go time. It's going to happen. It's going to happen. I expected it would be at the end. I expected it would have to be a big core dump of stuff. You know, I feel like some people were wondering how long the keynote would be. And when I was looking at the time and how much was being eaten up by each announcement, I was thinking, you know, other companies like Meta, and even Sony with PlayStation, and Microsoft, they've extended these announcements over many years and many demos, and sometimes they'll let the cat out of the bag, and Apple hadn't acknowledged a thing. So I thought, oh my God, we're going to go from the very acknowledgement of it through all this stuff. Like, okay, this is going to be a chunk of time. I was bracing myself for that, but it felt fun. You know, I felt at least lucky that I had experienced a lot of AR and VR stuff and a lot of Apple products. So I thought, okay, well, I'm as prepared as I can be to kind of think about where we are with this. But I was also trying to not be preloaded in my head with all these preconceptions. I guess the thing when you move forward in tech is to try to not feel like you know all the answers. And I was really trying to unlearn in my head and just listen to what the pitch was.

[00:09:22.802] Kent Bye: Yeah, well, you're really at this perfect intersection of not only these many years at a mainstream publication like CNET, covering Apple and all this evolution, but, you know, even going to Meta's Reality Labs and getting to test out some of the prototypes that literally only a handful of outsiders have ever laid eyes on. So you've had the full range of different experiences to really be set up to be there at this moment, to bear witness to this new headset. And I have to say that of all the different folks that had seen it, I think you were one of the first journalists that I knew of, at least, to be able to see it and to report on it, to file their report and their video. Because Ben Lang from Road to VR saw it later in the evening, and then Ian Hamilton from Upload didn't even see it until the following day. So some of the dedicated VR journalists didn't get in there in the first wave, but you seemed to get in there fairly early. So I'd love to hear what it was like for you after the keynote, and then what it was like getting chaperoned into this space that is, you know, offsite, and being able to actually get an opportunity to have hands-on. I'd love to hear what the aftermath was like, and then getting the hands-on.

[00:10:33.128] Scott Stein: Absolutely. I mean, we had, you know, a time that we knew we were going to be doing it. And, you know, unfortunately, they were a little staggered. And, you know, I didn't know much more than that. And they had a building that, you know, I think the drones that people had reported on showed. You're not going to be able to hide where that building is. It was on the other side of the ring. And it reminded me of, they had built a building as well when the Apple Watch came out. I think it was the spring 2015 event, because they were still building Apple Park. So it was at a university, I think. And they had this amazing full cube thing. And it was kind of that feeling, where it was like a purpose-built pavilion. And then, you know, everything from there, we were not allowed to take photos and video, which is pretty common in the VR space. I was, you know, telling my team, be ready for that. I mean, I've been lucky to do a lot of things, like going to Microsoft for HoloLens 2. There, we did have photo and video. But Magic Leap back in Plantation, when I had the very first moment looking at it, sometimes those are very contained. HTC Vive in Barcelona, when I looked at that at Mobile World Congress, no photo or video allowed in there. I did dictate my experience. I'm kind of used to that. Even Project Starline with Google, which I was lucky to check out last year. Again, no photo and video in there. So I kind of knew the playbook with this. And I thought, it's probably going to play out like I've seen with other ones like that. And it really was. I think the one interesting thing is that they got us set up for lenses, which, you know, Apple was saying is not final, it's still in the works, or, you know, that's the information that we got as to how those lenses from Zeiss are going to be outfitted. And how you get outfitted for the foam, the light shield, I forget the name of what they're calling it, but it's the faceplate. And spatial audio, which is like what you do with your phone already, kind of that extra level they added to spatial audio optimization. Those were things they did ahead of putting on the headset. And it was pretty quick. I was pretty surprised that they had lenses that worked for me, because I've done this, I wrote about this earlier in the year. I've done this game before. And I wrote about how I'm very concerned about what's gonna happen to my glasses. I think in this XR industry, we're getting, we wanna get to, not we, but companies wanna get to everyday glasses. And in the meantime, you're seeing things like Vive XR Elite, other headsets that are getting closer and closer and starting to say, well, you know, you're not going to use your glasses. And are the prescriptions good enough? And I've been told in the past, okay, yes, we have your prescription. And then you go, and lo and behold, they go, we don't quite have yours. I'm a minus 8.25, and they go, we have minus 8. But then it's going to be slightly fuzzy. And I sometimes bring contacts. I said on Twitter I wasn't going to bring contacts, and then one of my colleagues was like, bring contacts, don't be stupid. And I brought one old pair of contacts, but I was not going to put them in. And I felt that they'd have an answer. But anyway, the lenses looked great when I put them in. So whatever they did, they were prepared for the full range.
But I'm not happy that this headset doesn't fit over my glasses. Quest Pro accommodates that, although it's a little clunky. And so we then got led into a room. This room was a pretty sedentary room. That was one thing that was interesting to me, because I've seen a lot of demo rooms and a lot of spaces, and I'm used now to VR and AR being pretty active. And so I think this was definitely intentionally designed to be a little more sit down, you know, sit down, try things out, look at them, a little bit of walking around at the end. And it kind of indicated to me that that's the use case here. That's the design. In fact, between that and the battery pack, I don't know if they're ready to have this first-generation one be a super active headset. And maybe that's something that comes down the road. But yeah, I think the total amount of time I was there was about an hour 15, but that was not all in headset. There was waiting and, you know, going in and coming out, but I'd say it was a solid half hour. I mean, writing about it, I tried to cover as much as I could remember, and I think it was everything, but it was actually a fair amount of demos. It wasn't brief. And, well, I mean, we can get into how I felt about the whole thing. Yeah.

[00:15:24.276] Kent Bye: Yeah. I mean, maybe to kind of reflect on the article that you wrote. I don't know if you wrote the headline, but the headline is: "Apple Vision Pro Hands-On: This Is the Headset I'd Use to Watch 3D Avatar. I experienced incredible fidelity, surprising video quality, and a really smooth interface. Apple's first mixed reality headset nails those, but lots of questions remain." So I guess that's the headline. So yeah, maybe take it from there in terms of, you know, what you're left with. You know, you focus on and begin with the 3D video, but as you're walking away now and you think about it, what are the things that really stick with you?

[00:16:03.945] Scott Stein: Yeah, so I had a lot of thoughts going into this. And after I distilled it, writing preview pieces, and also knowing what people were reporting, and then going into it, I had these three main thoughts that seemed to be surfacing, which were: one, are they going to emphasize nailing the display? Two, are they going to nail the interface, because they're trying to go hands and eye tracking, from all reports? And three, are they going to figure out cross-device compatibility in a way that evolves the landscape, in a way that I think Meta and a lot of other headsets struggle with? They're trying to find their way, but they don't have the device portfolio that Apple has. I thought that was the thing, that they, even more than anything, could lean into their wheelhouse. So the reason I did that headline, and it's funny because I wondered if it would even be seen as dismissive, but it really kicked back to something I wrote last December that I didn't realize would be a direct preview of what I was going to be seeing. But when I saw Avatar: The Way of Water in IMAX in 3D, and I've seen 3D films, I've seen a lot of 3D things, I've seen VR, it shockingly gave me a bit of a VR feeling. And I remember there was one moment in the movie where I just turned to the side, as if I was going to sort of see the rest of the space. And I thought there was a fluidity. And I talked to other people about this, too, who cover VR. Like, there was a real fluidity, especially in IMAX, to that film that was so, so amazing. And then I thought to myself, I've had these thoughts about headsets for your eyes for years, and kind of a dream that it'd be really fun to have something like that, going all the way back before VR. Or at the same time, the Avegant Glyph, which was like a lesser-known deep cut. They had a headset that was literally like a headphone, you put it on and look through, and there was a retinal projection. And it was crisp, but it was just early days. It was for watching movies. I thought about that again every time Oculus or others had cinemas, but then I would go back to watching my TV, because it didn't look quite as good. Varjo XR-3 was the one, a couple of years ago, where I was like, wow, this is like retina level. And I really was like, this is amazing, but it connects to a PC, and it's $5,000, and the setup is not designed to be entertainment based. So the number one thing that really wowed me that I hadn't seen before was that visual fidelity. And then, lo and behold, the demo they show me is a clip of Avatar: The Way of Water. And I was like, oh, they're actually going to show me this and say it's good enough. And it was the part where I got chills. I felt like I was in the theater. It made me feel a little bit emotional. I felt that way too when I was looking at Varjo XR-3 watching a video, because I thought, oh my God, there's something about the visceral element of the resolution, but the brightness here, the vividness, and just putting it on. And I kind of dream of personal cinema a little bit. I don't have a good setup at home, but I thought, okay, this is really amazing. And that was the first thing that popped to mind. There are a number of other things that this headset did. And I would say the second thing that wowed me was also the quality of the pass-through.
I remember putting it on, again, having seen Varjo XR-3. I mean, obviously, Quest Pro is at a much lower resolution for that, and AR headsets are not doing pass-through, but everyone wants to kind of get to that zone. But when I put it on, I just went, oh, wow, okay. Like, good job. I just felt that. I looked around the room, and it was crisp. And when I looked at my watch, perfectly fine. I mean, not my regular vision, but I will say that by the end of the demo, and I also wrote this down, when I put on my glasses again, it wasn't that big a drop-off. You know, I felt like it was as close as I've ever seen to feeling like... I remember at the Quest Pro demo, when I did it at Meta's research labs, I think somebody said, welcome back. And I was like, well, that's not the right word anymore, because we've been here all along, you know, like, I see you the whole time. But it really did feel in that room that I was just continuing with them. And now, a day or two later, when I think about it, I'm thinking that I really didn't pay attention to the pass-through video. And I think that's the biggest success, is that so much of what they were demoing was mixed reality with pass-through video. As opposed to, I think, Quest Pro, where even though my demos were pass-through video, a lot of Quest Pro is still VR, with options to do pass-through. And Varjo, the same thing. It's VR, but it can also do pass-through. But everything was leaning on pass-through, and I forgot about it. The whole interface with the grid of apps was happening while pass-through video was happening, but I wasn't thinking about it, like, oh, here's the pass-through video background. So that was also amazing. And the interface, I wrote about too. I have a high threshold for where I want that to be. And I think it does some really good things. I'm really curious about how that will work, how you can opt out of always having the eye tracking to focus, because you're looking and then using very tiny gestures. Which I thought about once, like, you know, people were talking about ambient computing, and I was thinking that would make a lot of sense, to combine these, and, lo and behold, they're doing that. PlayStation VR 2 doesn't really do that much, although in a couple of games they do. And Quest Pro really backs off eye tracking on the interface, I think for battery accommodations. But it's interesting, because then I also thought, wow, okay, so Apple is able to run eye tracking all the time on this headset, despite it being standalone. Maybe that's the battery pack thing, but that's something that Mark Zuckerberg was talking about, and I believe it, that there are big battery life considerations. So I don't know what Apple's doing power-wise here, but it's such a big part of the interface. And then, because of that, the gestures were so small that I was able... I called it being lazy. I thought it was great that you could do lazy gestures. I rested my hand on my leg sitting down and just did these little finger mini swipes and pinches. And that's how I feel like you'd use it. You wouldn't want to lift your hands in the air and do these things. So I was kind of seeing how lazy you could be.
If I put my hands below my knee, like all the way down, it lost tracking. But it looks like the camera tracking is kind of similar to Quest Pro, or in the same vein, in that it can scan downwards and has a pretty wide range, and it also does jaw tracking. So I thought that was really cool. But what I really want, even more, is for the touch interface to dovetail with a variety of physical products, to make it feel like you're really just instantly swapping. I imagine they're going to do it over time, but they talked about the trackpad and keyboard compatibility. None of my demos had that. So it was all eye tracking and hand tracking. And I guess you can also use your MacBook, similar to Quest Pro, and extend the monitors. But I was thinking about, I thought Apple Watch, iPhone, and iPad were obvious fits. For the moment, Apple is not doing any touchscreen interfaces with it. But I know Qualcomm's doing that with their phones and AR headsets. I look forward to when that happens, because I think that's the device people have with them a lot, and it could be tactile. But we're also not even at the launch date of the headset. I don't even know what they're going to have coming for Gen 1 by 2024. But those are my thoughts on the interface. Sorry to talk so much, but those are the three things that really hit me the most.
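
For developers, the "lazy gestures" Scott describes line up with how Apple pitched input at WWDC23: looking at a control and pinching is delivered to apps as an ordinary tap, so gaze targeting needs no special handling. Here's a minimal sketch, assuming the SwiftUI conventions of the newly announced visionOS SDK; PinchCounterView is a hypothetical name, not anything from Apple's demos:

```swift
import SwiftUI

// Minimal sketch: on visionOS, looking at a control and pinching
// thumb and index finger is surfaced to SwiftUI as a standard tap,
// so no gaze-specific code is needed here.
struct PinchCounterView: View {
    @State private var pinchCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Text("Pinches: \(pinchCount)")
            Button("Pinch me") {
                // Fires when the user looks at the button and pinches.
                pinchCount += 1
            }
            .hoverEffect() // subtle highlight while the eyes rest on it
        }
        .padding()
    }
}
```

The point of the sketch is that the selection model Scott calls "lazy" is, in code, just a button tap; the system resolves where the eyes are looking.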

[00:24:18.067] Kent Bye: Yeah, that's really helpful. Yeah, just at Augmented World Expo, Qualcomm announced Dual Render Fusion, which allows this dual rendering of a spatial object, but also to be able to render it on your phone. And when you look at your phone, you're able to have two different cameras in the same scene in Unity. That's the best way to think about it, because you have different ways of interfacing with it: more of an omniscient third-person perspective in the spatial view, and more of a first-person view, moving things around on your phone. And yeah, I'm sure that's going to be in the pipeline. It just makes sense. I think the only thing is, I don't think they would want to necessarily tether some of the core functionality to something where people may not even have an iPhone, or they may not have a powerful enough iPhone, or maybe they have an Android. So, you know, they want to make it so it works without having all that additional stuff. But I wanted to ask about some of the stuff that you mentioned, which I haven't had a chance to talk to either Ben or Ian about, which is this crown, where you turn this dial and you dial in to what degree you're seeing mixed reality versus VR, and the different interfaces where, when there are other people in the physical space, if they come into your area, they kind of materialize in some way. I'm wondering if you could maybe elaborate. You mentioned that there was a similar interface on the Apple Watch that had this dial. So maybe you could start with this crown that allows you to dial in to what degree you're seeing, I guess, the Milgram mixed reality spectrum, where on the one end you have physical reality, and then you sort of overlay different dimensions of the augmentation, and eventually you're fully immersed into VR. So the fact that there's a dial there, I think, is really interesting, for people to kind of dial it in, but also to account for, if people are in the space, how do they come in and out of that virtual space?

[00:26:10.510] Scott Stein: Yeah, I thought it was going to be something like a fading in and out, but I think it's more like a curtain. There weren't that many moments where we did it. There were some of the volumetric environments where we did it, and I'm trying to remember the other moments where we made that happen. But it looked like what it was doing was that you'd have a window, kind of a mixed reality window going forward, with some periphery of the rest of the room. And then when you adjust the dial, that window just expands and wraps around. Same thing for panoramas. And then it gets to the point where, if you go all the way, then you're fully in VR and you don't see the room. And so it was interesting. I'm wondering, with that, how often I'll use it and how apps will implement it, you know, whether it will always make sense, or whether it's something that they might finesse a little bit more. It's different than what you do on a Quest, where you're double-tapping to just turn pass-through on to see what's going on in the room. I actually like the double tap on those headsets, but that kind of turns the features off and just goes into, you know, let me see what's going on in the room. Apple's is like, it feels almost like raising and lowering the blinds in a room. It is theatrical as well. There are a lot of theatrical things, I feel like, that are going on with this headset. And so it was interesting, but I was going, hmm, you know, some people don't use the digital crown that much, so are they pushing this in a little bit to make the design statement? But I think the ability to blend parts of the environment, using some of their ARKit tools that they've been developing, the ability to kind of recognize objects and separate them, that I thought was really interesting. And I feel like there's a lot of ways that can keep going. I mean, first of all, my hands, they were a little rough around the edges, but similar to what ARKit does, you know, you could just hold up your hands and see them in a virtual environment, and you'll see your real hands right there. And it was pretty well implemented. And then, when people were in the room, I had one person on the left, one person on the right, and when I turned to them, they would sort of ghost into view. You know, it was weird. I sort of saw them through the virtual, and then there was an outline around them. Like they cut out a little outline showing the real world, and then they were sort of softly coming in. It looked like they were fading in a bit, but it just kind of activated on the presence of them. I thought that was really cool. But I'm wondering how adjustable that is, because I think it's great to have that, but I think there are a lot of times you wouldn't want to have that. And so, you know, if you have five people in a living room and you want to have your own experience, or at a coffee shop, is it going to mean that people are going to fade into view when you're working on something? Or, you know, if you're on a plane, as they showed, for a plane flight? Or is it more about eye contact, or is it something where you can turn off the range? I mean, the Apple Watch is very deep in settings. I'm sure this might be too. I have no insight into that.
So I think it's super interesting. It also reminded me of, there was also some dimming of the real world in mixed reality, where they would bring it down. And Magic Leap 2, the demo that I got last year, and I haven't used it since then, but one of the interesting features on that headset is that, even though it has transparent lenses, it can do like adjustable sunglasses. It dims the outside world and, it looks like, enhances the opacity of the virtual object. And Apple has that too, for kind of a focus effect. And I think that's great. I kind of think that's a path for visual spatial computing: just like with audio, where you can have the ability to tune out things and have noise cancellation, it's some element of visual field focus. And I think there's a whole interesting range of stuff that they could do with that. I think it's just the beginning, but that feels very new to me. Yeah, that's what I was kind of seeing there.
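
As an aside, the curtain-like behavior Scott describes maps to what the visionOS SDK previewed at WWDC23 calls immersion styles on an ImmersiveSpace: a progressive style hands the Digital Crown control over how far the virtual scene wraps around the pass-through view. A minimal sketch, assuming that announced API; the app, the space id, and the placeholder view are hypothetical names:

```swift
import SwiftUI

// Minimal sketch, assuming the visionOS SDK's ImmersiveSpace API:
// a .progressive immersion style lets the Digital Crown expand or
// shrink how much of the pass-through view the virtual scene covers.
@main
struct ImmersionDemoApp: App {
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "dinoSpace") {
            DinoSceneView()
        }
        // The crown sweeps between these: a mixed-reality window
        // at one end, fully immersive VR at the other.
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}

struct DinoSceneView: View {
    var body: some View {
        Text("Imagine a Carnotaurus here") // stand-in for 3D content
    }
}
```

In other words, the "raising and lowering the blinds" feel is something apps opt into by declaring which immersion styles they support, rather than something they render themselves.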

[00:30:43.551] Kent Bye: Well, having the pass-through mode means that you can start to do pixel-perfect stabilization in some ways, because you can overlay things with a lot more control through the pass-through. At least from what I've seen in some of the different demos, physical reality is always going to have, like, immediate, no-latency movement, and then the technology, just from the speed of light and everything else, is going to have a slight jitter when you're overlaying things on top of reality. But with pass-through video, you do have the capability to get it to be a lot more convincing. So I've certainly seen with the Meta Quest Pro some different moments, like, there's a piece called Eggscape that played at both Venice and South by Southwest, and there are just these magical moments where I was completely immersed in believing that these little egg creatures that were flying through the air were real. I just completely bought into it, suspension of disbelief, and was totally immersed in the idea that these were actually real objects that I was manipulating in this space. So I feel like the mixed reality pass-through is a good way in, as much as Apple has been, frankly, kind of shitting on VR for as long as they have. They've never actually mentioned the word VR. They sort of went to spatial computing, which is fine. I love spatial computing as a term. It'll arguably help popularize it, and I've sort of adopted it, because I do think it does explain this shift from 2D to 3D. But there are certain elements of the lessons from VR, where I think of things like VRChat, where I've gone in and seen some of these different theater worlds where they have these shaders that have light reflections in the world. And they showed a video of this, where they're able to kind of have this flickering effect. And I'm not sure, if you're watching like a 2D movie screen in the theatrical view, if they were able to actually start to bring some of those lighting effects and project them out into the room. I don't know if you saw anything like that in any of the demos. They showed it in the video, at least. And I thought that would be amazing. It's these subtle things with light that I think your periphery or your unconscious sees, where you just believe that it's there, because of our understanding of how light works and how it reflects off of things. And if you are projecting that into this mixed reality projection in this space, I think there are going to be a lot of ways, especially if you have a high enough resolution, that it's really going to be super transportive. But anyway, I'd love to hear what their demo was like, to kind of play around with what kind of effects they were able to do with the pass-through video.

[00:33:04.642] Scott Stein: Yeah, there wasn't a whole lot of that in terms of effects seeming to affect objects in the world, although they may have been happening really subtly and I wasn't noticing. I kind of feel like there might have been some shadowing things. There was definitely stuff, when some of the things were floating in the air, where at least the objects were being realistically lit and shadowed, to kind of feel like they were normally in the environment. But I know I've seen ARKit apps that have played around with elements like that. And Apple's played around with some AR features like that in their video app that's not iMovie, which I keep forgetting the name of. I'm totally flaking out right now. Clips. Why did that take me so long? But Clips actually introduced some AR features that were a little more environmental. Yeah, I think it's totally capable of it. The demo, for me, felt like it was easing us into the idea of an experience. And in some ways, it almost hearkened back to earlier VR demos of the past. In fact, it even ended with dinosaurs coming into the room, which I think other VR journalists are saying is like, we're back at the dinosaur demo. It's almost like the Oculus Crescent Bay days, where you're going to see a spectacle moment. And a lot of VR now, especially the Quest Pro demos, were about, like, you're going to pick up a paintbrush and do things, or you're going to be a DJ and do things. I think that Apple also iterates consistently, and so I feel like, even though it's not here now, the ramping up to actively doing stuff will start to happen. Because it's interesting, right now it's much more like, let us get used to the interface, which makes sense, because, again, this is the first time they've had the headset. So I think they're just like, here's the interface, here's what you're doing, here are the controls, here's opening up a few apps. I was kind of ready to jump to the next thing. It's kind of like the Jurassic Park ride in that movie, you know, with Mr. DNA, where you're like, I want to go to the egg room, I want to do the virtual DJ stuff, I want to put things in the real world too. There wasn't a lot of that. I didn't see meshing. I didn't see the room setup. I didn't see anything run behind a chair, or, you know, I wasn't pinning things to walls. Although the dinosaur encounter, Encounter Dinosaurs, which is the most immersive app demo that I tried, at the end, I think was the coolest VR dinosaur demo. So even though it's not being called VR, it was the one where you're like, okay, this is a superior one, where a window opened up in the wall and the Carnotaurus, I think it was, came through. And they had a moment with a butterfly, which reminded me of the HoloLens demo where there was a butterfly once. The butterfly almost completely landed on my finger, and it was cool. I was like, okay, that's a really good
example of the fidelity of my finger and the butterfly, the two meeting together. Like, when I think about what the Quest Pro did well, like you said, I think this felt like a super-duper Quest Pro. I feel like what the Quest Pro did well was demonstrate the fluidity of the mixed reality experience: even if the color pass-through is not that great, it actually means that you accept the VR quality, or it doesn't feel so out of joint with being projected in the real world. And it creates a kind of fluidity that, in some ways, HoloLens 2, I think, struggles with more. And I felt like it was kind of working like that. Apple had those two ends meet even better. So I felt like better pass-through and better display just created this real meet-in-the-middle moment that had a wow factor. The other thing I didn't mention was that there was a FaceTime demo. But what was interesting about that, talking about the trip to Meta, is that it was a chance to see what they call a Persona. Not an avatar, well, it is an avatar, but it's called a Persona. Just like it's not a VR headset, it's a spatial computer. There are a lot of naming conventions going on. But the Persona was there, scanned. I mean, it seems like a pretty integral feature, where you're using the headset to scan your face, and then that can be used for FaceTime chats. I don't think they talked much about where else it can show up, but it also apparently appears on the curved OLED on the front. At first, watching that presentation, which is the wildest feature, I'm like, I still can't believe that exists in the headset, I thought that was somehow your actual eyes. And then it seems that it's the scan of the Persona showing those eyes virtually. But somebody appeared in a window, and this reminded me of what Facebook's doing, what Meta is doing, with Codec Avatars. These super-realistic... I got to see a very realistic Codec Avatar when I talked to them at Meta, and then I saw one that was created with a phone scan, which was not as good, but it was okay. The head was a little stiff. It felt a little like Disneyland, you know, the Haunted Mansion, where the head was moving and you're like, it feels like it's glued on a little bit. Apple's was better than that. And what's interesting is that it's, you know, essentially more like the phone scan, but on a headset, and it looked good. It actually looked kind of eerily realistic. I was curious, the woman who was speaking to me was smiling a lot, and I kind of get into this when thinking about the face tracking, you know, is she able to express other emotions? Like, what if I made her upset? Because I found, when I was playing around with this on Quest Pro with a few people, where I was trying to do emotions and express my face and grimace and do other things, it has limits. So I'm curious where the limits are with this, and also, if it was my mom who had done a scan, if it was someone I knew, where would the uncanniness be? Because I'd never met the woman who appeared as the Persona. But that was really cool. I mean, that definitely stuck with me. It almost had a fuzzy, kind of a fuzzy-focus feel. It's like they almost blurred out the periphery as she appeared. And she also appeared in a picture-in-picture box, kind of a small window, not right in front of me like I was talking to a person. Surprising, like, I kind of thought it would be like a person-to-person chat.
But it really was still taking the computer metaphor for this demo, of a pop-up window. And then I dragged that window across the room, and then it opened up. It was more productivity focused, like, hey, we're having a chat, let's open up Freeform, which is their collaborative workspace app. And I didn't get to work on it. I watched her open up some stuff. There wasn't a lot of getting your hands into the apps too much and making stuff. It was a little more of a guided view of how the apps are done. You know, it's like, open this window, then close this window, then take a look at this photo type of a thing.

[00:40:36.670] Kent Bye: Yeah. Going back to a couple of things you mentioned: the portal into another realm where you see the dinosaur is a great callback to the 2014 Crescent Bay demo at Oculus Connect 1, because, yeah, they did have a dinosaur demo, along with a lot of other demos that were very quickly shown back to back. But going back to that spectacle, Ian Hamilton from Upload VR said that that was the first time that they allowed him to get up and walk around. I don't know, did you walk around at that point?

[00:41:04.540] Scott Stein: Exactly. That was the first time I got up and walked around. They definitely encourage it. They said, you should get up. Oh, make sure to grab the battery pack, you know. And so that was the part, and probably that's the sensitivity there as well: you've got this big, well, we'll talk about that, this big battery pack. Yeah, I got up and walked around. It was cool, because the dinosaur reacted to you in the room. Like, I reached out my hand, and it snapped at my hand. There was a sense of a kind of dynamic presence. And yeah, I've done a lot of these demos, but I still walked around that area and went, oh, wow, this is cool. And I think it was interesting to see so many reactions from people in the VR landscape, reading reactions afterwards, where they seemed to have that same feel, where it still had the ability to wow. And that's pretty impressive, because I think a lot of people did come in skeptical to this. And we've seen a lot of things, you know, what was this going to do? And that's why that fidelity is the main thing I lean on, the wow factor, because I really thought, having seen Varjo XR-3, that I would be a little more ready for it. And I haven't seen the Bigscreen Beyond, which is micro-OLED, and I really need to see that. But I kind of felt like I was going to be ready for it, and I still was like, okay, this is really cool.

[00:42:33.597] Kent Bye: Yeah, for folks that had tried the Bigscreen Beyond, I know that Norm Chan from Tested was saying that the weight distribution was particularly striking to him, because the Bigscreen Beyond is so lightweight. They're really optimizing comfort over and above everything else. And so you certainly have probably one of the highest resolutions, especially when it comes to micro-OLED, with the Apple Vision Pro, but the weight distribution is something that I heard come up from a couple of people. I don't know if you felt that it was maybe a little heavy on your face, even though there wasn't a battery on it, but still, all the different metal and glass and everything was maybe just as heavy, or maybe heavier, than some of the other competition.

[00:43:21.468] Scott Stein: Yeah, it didn't feel lightweight to me. It didn't strike me as awful or cumbersome, but it definitely didn't strike me as lightweight. It seemed very much in keeping with other VR headsets, but with a little bit of fit challenge. Not terrible, but I was getting, you know, as with any headset, used to the fit. And stuff like Vive XR Elite actually had a tough time fitting on me, even with the glasses spacer. Quest Pro is not bad. Actually, I really like the fit, but it's a very tight fit over my glasses. So even though I think it's great that it works with my glasses, I kind of feel like I'm pulling the car into a tight parking spot. And I like that the HoloLens 2 easily lifts up. So that's to say, I think a lot of headsets have issues. With Vision Pro, I found that with the face piece, I had a little bit of light leak at the bottom, and the rest of it was pretty good. But then I think when I re-tightened it, it got better. Because there were two tightening mechanisms: one is a dial in the back, to tighten that tension in the back piece, and then a strap on top, which are pretty standard ways of doing it. But kind of finessing that, I think it was riding a little high on my face. But yeah, it seemed like a headset that wanted to be nice and snug on your face. Then again, you're not wearing glasses with it, so in that regard, it's probably something that you wouldn't want to be too loose, because the weight of it might drop down a bit. So yeah, it's not like putting on a pair of glasses. It's definitely like putting on a headset. And that's even without the battery. So that's the other thing. I would say that the feel of the headset, to me, and maybe that was a decision, too, why the battery is not on there, is that it may be weight, because the headset felt like the size of other headsets, even without the battery.

[00:45:33.991] Kent Bye: Yeah. Yeah, I wanted to go back to two of your three main points, which were both the user interface as well as the display. So let's focus on the user interface first, because I feel like a lot of the concern from XR developers, at least those who are thinking about potentially porting over their apps, is that not having a controller is going to mean a completely new paradigm for how some folks are going to have to navigate or locomote through these different immersive experiences. And so I'd love to hear what your takeaway is from this combination of eye tracking and these small micro gestures that you can be lazy with, as you say, but then also the speaking. You didn't mention too much around, you know, different conversational interfaces, but it sounds like there are those three main modalities: your eyes as the cursor, your fingers as the mouse clicker, and your voice as maybe the high-bandwidth text input. But yeah, I'd love to hear what your takeaways are in terms of whether or not they were able to really nail all those. You said you had a high bar, and I'm curious if they were able to meet your minimum requirements for what you would want out of an interface like this from the Apple Vision Pro.

[00:46:43.003] Scott Stein: I think it's a really good start. And I don't know, hand-tracking-wise, that anything is better. But what's interesting is that there was a piece I was going to write before this event, and I'm probably still going to write it after the event. It's been in my head that I was thinking about this move to hand tracking, because, probably not accidentally, there were a couple of companies coming out of the woodwork showing some evolutions in hand tracking recently, and they were on the Meta side of things. I saw, which I guess was at GDC as well, Owlchemy Labs was doing a hand tracking demo that I played with at home, where they were talking about, you know, using different pinch things they'd learned. And it's really a concept of how you can do this in a way that feels good and uses fingers essentially as your own haptics for feedback, by pinching. And I guess that's what Apple's doing too, through pinching. I don't really think of that as haptics, but it's a way of getting feedback.

[00:47:44.107] Kent Bye: You could call it passive haptic feedback.

[00:47:46.109] Scott Stein: Passive haptic feedback, yeah, exactly. And then there was also a first-person shooter game that launched on Quest recently that I wrote about, which I thought was interesting, because it was doing a pretty active action game and saying, okay, we're not going to use controllers. And Andrew Bosworth, who I've talked with at Meta before about this, he's often indicated that the goal of the Quest headsets was to eventually get to a point where the controllers are optional or not needed, that the hand tracking would be good enough, but they've been saying it's not quite there. And so they keep getting better with it. It's always been there, and you can do a lot with it, but it defaults to, the controllers are nice to have. So it's interesting here, because I think the eye tracking does a lot. But I also wonder, from an accessibility or just everyday use standpoint, will they get tiring? Do you want to get to a point where you want to have another way of doing things? Is that where voice or other things come in? And if you don't use that, how can you even do that, and is it going to feel much more awkward? It seems to me that I would just want some sort of a controller. You know, I almost thought that they would have a little Magic Leap type remote, or an Apple remote, something that would just be a little, you know, especially since it's so entertainment based, maybe you just want to be lazy and grab a remote. And that doesn't sound like VR or spatial computing, but it's kind of the easiest, laziest tool, and it wouldn't be such a bad thing on a lot of interfaces that are flat. It wouldn't be for everything. But I don't know, I was just thinking about that, because you're right, in the VR world, everything is controller based. And so I think it's a big change, especially for games, for how they're going to do this. On the AR side, companies are all starting with saying, oh, we can't have controllers. And that's been the talk from Meta with the neural input research they've been actively pursuing. Even talking way back with Alex Kipman when I visited Microsoft, talking about that, like, they made the move to never have a controller, and they wanted to make these moves to eventually add haptics. He said that, but they didn't do that. And, you know, I think Snap with its glasses too. So it seems like Apple is kind of leaping to that spot. And I guess that's my long answer to saying, I think it actually works pretty well, but I didn't get to use typing. I didn't use the software keyboard, and I didn't get to bring up an actual keyboard. And I don't know how those dovetail. You know, sometimes when I'm working on a Quest Pro with a keyboard on a MacBook, the hand tracking gets a little, you know, sometimes you can lift up and then activate something without realizing it. I'm curious how they hand that off, how they deactivate hand tracking when the keyboard is being used, and how smooth that feels. I also think, and this is going forward a step, leaping over, but again, talking with Meta, talking with Michael Abrash, thinking about this world that we're heading towards, there's a whole weirder world of interactive tools.
Things that are in your world, things that are not in your world, things that are electronic objects that can interface, or even passive ones, like a piece of wood. And I think we're inventing new tools. I really am so eager to leap forward and think about that. Even though there's the hand and eye tracking, I kind of think about, well, what can I do with this world? What are the things I can use to interact with this magical mixed reality spatial universe? And I think this really feels like step one. And I'm kind of like, there's so much more. I don't really want to scroll like an iPhone forever. I want to be doing something that's a little more advanced than that.
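
For those wondering how apps would read the raw hand data behind these lazy pinches, the visionOS SDK previewed at WWDC23 includes an ARKit HandTrackingProvider that streams hand anchors with full joint skeletons. Below is a minimal, hypothetical pinch detector along those lines; the function name and the 2 cm threshold are assumptions, not Apple's values:

```swift
import ARKit
import simd

// Minimal sketch, assuming the visionOS ARKit hand tracking API:
// detect a pinch by measuring the distance between the thumb tip
// and the index finger tip on each tracked hand.
func watchForPinches() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        let hand = update.anchor
        guard hand.isTracked,
              let skeleton = hand.handSkeleton else { continue }

        let thumb = skeleton.joint(.thumbTip)
        let index = skeleton.joint(.indexFingerTip)

        // Joint transforms are relative to the hand anchor;
        // column 3 of the matrix holds the joint's position.
        let thumbPos = thumb.anchorFromJointTransform.columns.3
        let indexPos = index.anchorFromJointTransform.columns.3
        let distance = simd_distance(
            SIMD3(thumbPos.x, thumbPos.y, thumbPos.z),
            SIMD3(indexPos.x, indexPos.y, indexPos.z))

        if distance < 0.02 { // ~2 cm, an assumed threshold
            print("Pinch on \(hand.chirality) hand")
        }
    }
}
```

Note that system-level selection (look and pinch) happens without any of this; raw skeleton access like the above would matter for the custom tools Scott is imagining here.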

[00:52:16.267] Kent Bye: Well, as you were talking, it reminded me of what Meta has from CTRL-labs with their EMG, the wrist-based interface where they're wanting to detect all these subtle movements of the fingers. And it seems like, in some ways, by just putting 12 cameras and five sensors onto the Apple Vision Pro, they've maybe been able to get past that critical threshold of capturing some of these really nuanced finger gestures, by just tracking the hands and having really great depth-sensor-based hand tracking. I'm sure that the EMG interface, when it comes around, is going to be amazing, but you're talking about at least a minimum of three generations of a watch. Meta at this point hasn't even launched their watch, and they're talking about how they would have to do it with the third-gen version of that. Apple already has their watch, and there are different ways they could potentially start to integrate different aspects of this watch-based interface. Maybe they'll be able to, whatever their patent portfolio is, do some type of wrist-based interactions based upon some additional inputs from the watch. But it seems to me that without any of that, with just your hands alone and this really robust array of camera sensors and depth sensors, they're able to maybe get to that critical threshold for this lazy gesture interaction, which gets us into this whole new paradigm.

[00:53:41.615] Scott Stein: Yeah. It's very exciting. I agree. It's funny, the more you follow a field, sometimes I'm like, well, why isn't this here? And why is this here? I get to that point with product releases too. And then, you know, it's all like a spectrum. And I think the thing with Apple is, they tend to have this commitment to iterating incrementally, as long as that happens. But yeah, there's a really interesting start here to build on. And I think this would probably end up doing a lot of the stuff that we would be thinking would be there whenever that other neural input territory arrives. Not everything, but a lot of the things. It kind of reminds me of a question I had asked Mark Zuckerberg. I think I may have talked about this with you, or when I was talking on Everything Immersive. But it was my one question to him at that Meta Reality Labs Research visit, because he had talked about how the neural input would be something that you'd have to train and improve on, and it would adapt to you. But he was suggesting that it was like a whole new paradigm, that it could take on all these things, but it would be a new thing. And that's so strange to me, because I thought, well, that sounds like a massive step, and I usually think of things as being very gradual. So I was asking, you know, by the time we get to neural inputs, are we going to have evolved other things, like hand tracking and eye tracking and other algorithms, that will kind of create this gradual on-ramp, so that the neural input will seem natural next to that, but better? And he seemed to not know what I was talking about, or not have an answer for it. Or maybe I worded it really badly. But to what you said, I think this is exactly that. I think this could allow us to build a relationship with these types of inputs that will keep getting better and better. And then, a number of years from now, if neural input technology becomes a thing that maybe Meta and a lot of other companies use, that will finesse it further. Then you have all this interaction that you're doing, and then they could focus on, let's get it to the next level, as opposed to, here's a whole new thing that nobody knows about. But it's not an easy thing to get into new interfaces. And I thought it was interesting that nobody seemed to have a problem with the hand tracking here. And I thought that was actually a really successful aspect of this, even though the demos were very contained. But I think it did work really well.

[00:56:25.767] Kent Bye: Yeah, just this idea of these non-invasive neural interfaces, the wrist-based EMG — electromyography. Talking to Palmer Luckey, one of the things he mentioned was that with a typewriter, you do have to train yourself for tens to hundreds of hours, to build the muscle memory in your fingers to be able to type. But once you do, it's sort of like a superpower to communicate that quickly. And I figure there's going to be something similar here, where it may just be about building the core muscle memory. I mean, if you go back to the Mother of All Demos in 1968, there was a chorded keyset that Engelbart was using — basically like a stenographer's keyboard, where you press different combinations of keys. So this kind of non-linear way of thinking about combinations of your ten digits, rather than all these separate buttons we're pressing, may be the beginning of that. We're starting with this gesture-based hand tracking, but if you want to handle occluded hands — or maybe you're walking down the street with your hands in your pockets, but you want to be metaphorically walking through the park and typing your PhD thesis, which I think one of those early wearable AR developers, who was wearing these headsets back in the nineties, was literally doing with these chorded keyboard interfaces — then I imagine we might eventually get there, when we're moving out and about and wanting a bit more freedom. But maybe this camera-based approach is good enough for now. There seem to be trade-offs in the battery it takes to run this array of 12 cameras and five sensors, and the thermals and everything, so it may be more efficient to have other, more haptic devices where you use your hands in this chorded keyboard way. Again, we're talking about the next five to ten years. These interfaces are starting with the basics: the equivalent of clicking a mouse button, and using your eyes to navigate a 3D space. Once you get beyond that and want higher-fidelity input, maybe the conversational interface fills that gap, where you just speak rather than type. But for anybody who does computer coding, I know it's a lot more cumbersome to speak computer code than to actually write it. So I don't know — it depends on the use case, I suppose.
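
To make the chording idea concrete, here's a toy sketch in Swift. This is purely illustrative — the chord values and letter mapping are invented for this example, not Engelbart's actual keyset encoding. The idea is just that each finger is one bit, and a combination of fingers pressed together decodes to a character:

```swift
// Toy chorded-keyboard decoder: each finger is one bit, and a
// combination of fingers pressed together maps to one character.
// The mapping below is made up for illustration only.
let chordMap: [UInt8: Character] = [
    0b00001: "a",
    0b00010: "s",
    0b00100: "e",
    0b00011: "t", // two fingers down together = a different letter
]

func decode(fingersDown: UInt8) -> Character? {
    chordMap[fingersDown]
}

print(decode(fingersDown: 0b00011) ?? "?") // prints "t"
```

With five fingers you get 31 non-empty chords per hand, which is why a trained stenographer-style typist can outrun a row of discrete buttons.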

[00:59:06.331] Scott Stein: I'm really interested, too, in speaking — it always seems very deliberate. I'm just riffing here, but in a similar way to the super lazy finger gesture thing that I liked, I wonder about subvocal input, or murmuring. I feel like there's a point where I'd want my own little shorthand. I dream of a system where my vocal language could be really succinct and subtle, and wouldn't be, "Hey, blank, open blank." I'd just want to be, "yeah, yeah, no, no, this, do now." That's almost the way I talk to myself when I'm doing work. And I see this world of very refined micro-gestures and expressions. It's almost like the skeuomorphism question. I remember seeing thoughts about hand tracking in VR, where everything was very one-to-one: you're reaching out to this thing and pulling the lever or pushing the button. And Apple was doing something interesting in that you weren't so much pushing the button — the eye tracking was offering such a shortcut. Even though there's eye tracking in a ton of things, no other company is leaning on it in that way, with that fidelity, so that you can reduce everything to a tiny gesture. And I thought that was the coolest thing, because we like to credit all the companies that have already been doing hand tracking in interfaces. But as far as I remember them — HoloLens 2 and others — they're much more reach-out-and-grab type things, or move your hand up to pinch the thing. Here, you didn't have to move your hand up. You'd just kind of look, it would highlight a little bit, you'd tap your fingers gently, and that was all you needed. And that felt a lot more wonderfully, lazily subtle. That's the part where it almost feels like a brain activity — it felt more like a neural thing than a physical gesture.

[01:01:16.552] Kent Bye: Yeah. The closest thing I've seen that was really exploring the eyes as a user interface input was a company called Eyefluence that I saw at TechCrunch Disrupt back in 2016. They actually got bought by Google.

[01:01:31.967] Scott Stein: I met with them. Exactly. Yeah. Yeah.

[01:01:36.062] Kent Bye: But they were only using the eyes, so you had to do this kind of eye gesture to do things. It was a little bit unorthodox — you had to learn it, and your eyes end up moving around a lot. It wasn't necessarily fatiguing in a short demo, but I can imagine if you were doing it a lot, it might be. At least with this system on the Apple Vision Pro, your eyes are already moving around naturally, and it extrapolates your intent from what you're looking at, combined with the gesture of your fingers. I feel like that multimodal connection was probably more elegant than what I saw with Eyefluence back in 2016.

[01:02:11.449] Scott Stein: Yeah, I remember some of those early eye tracking demos. They felt like they required a little more intent, a little more work. What I thought was interesting is that the language they're using for the icons lighting up is actually, again, like other Apple products — like when they did the trackpad stuff on the iPad. It's very similar: when you move the trackpad cursor around on the iPad, icons get a little bit larger; they subtly grow. It's not so much that there's a cursor — the cursor disappears, and these things just get a little larger. When you move over a button, the button expands. And what was interesting is that that's how the eye tracking manifested here: there was no cursor, and each part of the interface would just be a bit bolder when you were looking at it, or a menu would open up. Horizon Call of the Mountain actually uses a lot of stuff like this when you turn on eye tracking on PSVR 2, but that's not a system-wide PSVR 2 eye tracking interface. It makes it feel even more subtle — you just look. Sometimes I freaked out a little when I was really staring at a thing, especially in the corners, where I felt like I had to look up a bit. But a lot of where they placed the windowed apps was more centered, and it would just highlight, I'd tap my fingers, and that was it. When you saw it expand, you knew you were fine. So I'm curious how that plays out.
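
For developers, this look-then-pinch loop is roughly what visionOS surfaces through SwiftUI: apps never receive raw gaze coordinates (for privacy), they just mark views as interactive, and the system draws the gaze highlight and delivers an indirect pinch as an ordinary tap. A minimal sketch, with the view and labels invented for illustration:

```swift
import SwiftUI

// Minimal sketch of the gaze-highlight-plus-pinch interaction.
// The app never sees where the user is looking; the system
// highlights whatever hoverable view the gaze rests on, and a
// gentle pinch arrives as a normal tap on that view.
struct LazyGestureDemo: View {
    @State private var openCount = 0

    var body: some View {
        VStack(spacing: 20) {
            Button("Open") {
                // Fires on an indirect pinch while the user is
                // looking at the button -- no reaching required.
                openCount += 1
            }
            .hoverEffect(.highlight) // system-drawn emphasis on gaze

            Text("Opened \(openCount) times")
        }
        .padding()
    }
}
```

The design choice Scott describes — feedback as subtle expansion rather than a visible cursor — falls out of this model: the highlight is rendered by the system, outside the app's reach.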

[01:04:01.997] Kent Bye: Yeah, I've definitely done eye tracking demos that have a reticle showing you the dot, and it's so annoying to see the dot go around everywhere.

[01:04:09.681] Scott Stein: Yeah, and sometimes they'll be almost too subtle and not have any feedback. I don't know if I'm thinking more about foveated rendering here, but there are times where games would say, oh, we're using eye tracking, and I didn't even know how it was manifesting — am I fooling myself? Here, at least, it was subtle, but you definitely knew there was feedback; you got some sense of what was happening. But it was very underplayed in the way it showed up in the interface.

[01:04:46.246] Kent Bye: Yeah, it's hard to watch your own eye tracking, because during saccades you basically black out — you can't see your own eyes move; your brain doesn't put it together. You'd need either a recording, or to watch other people in the experience, to see how their eyes are being tracked. But I did want to ask a couple more questions and then wrap up, because you mentioned three things: we've talked about the interface, and you also mentioned the display and the integration with other systems. So I wanted to briefly cover those and get your take. Karl Guttag of the KGOnTech blog — I did an interview with him at AWE — was saying that he thought the minimum threshold for really good screen-replacement tech would be at least 40 pixels per degree; retinal resolution is around 60 pixels per degree. So by rough estimates, it's maybe around that threshold that he thought it passed that critical bar. He was also worried about vergence-accommodation conflict — eye strain from looking at a screen that close up for too long. I know you only had about a half hour of trying it out, but do you have any initial thoughts on the viability of the Apple Vision Pro for this screen-replacement use case, and whether people could comfortably do extended work or reading within this headset?
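
For a rough sense of where that 40 ppd bar sits: Apple didn't publish per-eye resolution or field of view, so the numbers below are illustrative assumptions — a per-eye horizontal pixel count loosely derived from the "23 million pixels" figure across both displays, and a guessed FOV — using a crude linear average that ignores lens distortion:

```swift
// Back-of-the-envelope pixels-per-degree estimate.
// Both inputs are assumptions, not published Apple specs.
let horizontalPixelsPerEye = 3_660.0 // assumed from ~23M pixels across two eyes
let assumedHorizontalFOV = 100.0     // degrees, a guess

// Crude linear average; real ppd varies across the lens.
let ppd = horizontalPixelsPerEye / assumedHorizontalFOV
print("~\(Int(ppd)) pixels per degree") // ~36: near Guttag's 40 ppd bar, well below ~60 ppd "retinal"
```

Under these assumptions the headset lands in the neighborhood of the 40 ppd screen-replacement threshold, which is consistent with text being readable but not the standout of the demo.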

[01:06:09.882] Scott Stein: Yeah, it's a good question. There was only one real reading demonstration: we opened up Safari, I think, and read an article. The text was not that tiny, not super large — standard feature-article text — and it looked nice and crisp. But a lot of what we looked at were photos, Apple's immersive videos, the 3D captured videos, and movie clips, and the comfort for the eyes is different there; it's not the same thing. So I don't know — it's a very good question. The text looked fine. I'd say it didn't wow me as much as the movie footage, probably because of the brightness and the vividness of the color. And Apple was not forthcoming with those really nitty-gritty specs. They mentioned the millions of pixels, but I don't even know the field of view. The field of view looked really good — and that's a terrible reference point, I know — but it never bothered me, and it seemed as good as or better than a lot of standard VR headsets. Good enough that I was not looking through a porthole; it felt like I was getting a good view of the room. But yeah, that's a really interesting question. Also, the design of this first-gen headset, with that two-hour battery life on the battery pack, means that unless you're plugged into an outlet — and I'd imagine the recommendation would be to take breaks — it's an interesting question how long you'd want to work in this. I find my comfort level for VR has expanded up to an hour-plus in a session, but then I probably need to get out. I've definitely done things like VR theater where I've gone an hour and a half and thought, wow, where'd the time go? I think it's going to be one of the best, but it's really to be determined how comfortable it feels over time. And I really want to see working on it — writing on it, using a trackpad, using some of their physical tools. What we got to see was a very browsing-oriented experience.

[01:08:41.399] Kent Bye: Yeah, the last thing you said coming into the demo was that you were really interested in the integrations. They showed footage of walking up to a Mac laptop and it turning into a 4K monitor that floats above it — you can move it around and do work on that laptop with this external monitor. I know you didn't get to see any demos of that, but I'd love to hear any reflections on the ecosystem aspect of Apple: all their devices, platforms, and frameworks coming together and interacting with each other, and their whole ecosystem of existing applications. What do you see as the next steps for this existing library of mostly 2D apps as they port over — at least getting a 2D representation in spatial computing, and then maybe sprinkling in other spatial components?

[01:09:31.265] Scott Stein: Yeah, I mean, I've been looking at Apple's product lines and reviewing them, waiting for them to evolve in certain ways, with this yearly incremental step thing where I'll get driven crazy waiting for something to arrive — particularly the iPad and its evolution into what I'd like to see, which is a complete Mac replacement or fusion. Sometimes what you hope for never gets there; even with the Apple Watch and a watch face store, there are things I expect will happen that never seem to. So there's stuff I'd like to see happen with the lineup that may not be in Apple's plans. What I think is interesting is that the Mac was presented as the monitor-extension device, but the headset itself is running iOS-based apps. It reminds me a little of the Chromebook-Android thing — it's not that, but the Mac environment and its relation to Apple's App Store ecosystem has always been a little weird, and the way those two have been dovetailing, which I've been watching for years, feels like it's converging on the headset. Going into this event, I had this eleventh-hour thought that I wrote into the preview piece: the perfect companion for the headset seemed to be the iPad Pro. I really thought they were going to set that up, because it runs the same apps, and the iPad Pro was already the test bed for a lot of their AR features — it was the first to get the LiDAR sensor. Speaking of AWE — I didn't go — there's that company Sightful, with Spacetop, which is a similar proposition: a keyboard with an IMU so it can anchor the base and display the screen. I thought that was ahead of its time, because you need the motion sensors to help it anchor, and the iPad has all of that. Versus the MacBook, the iPad has all the motion sensors, and it's got depth-sensing cameras on the Pro — shouldn't those eventually scan yourself, or do what they did with the TV, where you could have interesting second-screen interactions? And the touchscreen and pencil, I thought, were a more interesting way of tactilely interacting with a mixed reality interface, where you could much more easily control things. But that didn't come to pass. The MacBook makes a lot of sense and is already set up in VR headsets now as a way to do that. But I was really surprised that the iPhone and iPad are not interactive parts of this at the moment. Similarly with the Apple Watch — I think a lot of people were wondering about that, since it seemed to make a lot of sense as an interface. I'm sure down the road that's Apple's plan, but the wrist seems like a great way to get the haptics that are already there. They updated the Apple Watch last year with an improved accelerometer and gyro, which I thought was the hint. They said it was mainly for car crash detection, but I was going, huh — isn't all this fidelity going to help with things like mixed reality and motion sensing? Maybe that's still in the works. They already have gesture accessibility controls.
They've already figured out pinching, and certain gestures are already on the watch in a more experimental, accessibility-oriented way. So I thought they were going to roll that in and go with it, but it's early days. I don't even know how incremental they're going to be with this. In 2024, when this is coming out, maybe they'll make other surprise announcements, or maybe it'll be 2025. Meta updates their hardware all the time — will Apple be really aggressive with updates to the platform? Will they have a faster hardware rollout? So I'm really curious about those products. Because, as you were saying about how their ecosystem works, I think the answer right now is an incomplete. They have all the pieces, and they've spent a lot of time setting them up. They're probably the best positioned, with a really integrated set of often closed-off products that people get frustrated about, but that are also fantastically made and feel like they play right into this. And I was just going, what's going on? The answer feels like: why would you have a touchscreen plus mixed reality? But I can see the touchscreen perfectly clearly in their pass-through camera — that's the magic here, it's a yes-and. I can't get into the mind of Apple, but maybe the idea is that it's not good enough yet to work on a screen through the camera pass-through. Or why not create a virtual version of the phone that you hold up? So I do want it all there, and we're getting step one, but I think there's so much more. Whereas the thing they really nailed was the display. That's the part where I'm not sure what more I'd want, and that's why I walked away the happiest — I go, well, there's one thing where I think they got it. That's it. But the interface and the device compatibility have room to grow.

[01:15:46.645] Kent Bye: Yeah, and thinking about a minimum viable product — it's already a $3,500 device. They have patents for rings with haptics, and they could add a watch in there, but then the baseline price for basic functionality goes up even more. So it's probably good to start with just the hands and this $3,500 headset. They did show during the keynote how the watch can detect your golf swings or tennis swings and do more motion tracking, but that's still probably only 3DOF; for 6DOF you really need computer vision tracking. Whatever their strategy is, it seems pretty stationary: more mixed reality, and productivity apps where you're reading, doing work, and actually making things. That's an area where Meta, being so focused on gaming, hasn't really optimized for the productivity use case. And it is the Apple Vision Pro — a pro device — so maybe think about it in terms of working in a professional context rather than a consumer context, even though there are amazing videos and 3D videos and all that. At the end of the day, what's going to be the killer app that makes people want to get this? Maybe all this additional screen real estate — if the spatial component can make you more productive, there's going to be a subsection of people whose professions will use that. Certainly not everyone, but maybe people who need a hundred tabs open, where it would just be easier to put those tabs in a room you can walk around. I've done an interview with Christopher North, who talks about that type of embodied cognition for sensemaking and intelligence analysis within the intelligence community, so that kind of sensemaking loop has already been used. And Rob Lindeman — I did an interview with him back at IEEE VR in 2015 — was researching non-fatiguing interfaces, where you'd be sitting down with a tablet, using your fingers to kind of walk on the tablet. So maybe there will be additional user interface paradigms where you have an abstracted way of moving your fingers on a touchscreen, but you have agency in a spatialized context. That's the beginning of what Qualcomm is doing with Dual Render Fusion, with this integration between cell phones and spatial experiences. And incidentally, I did have a chance to try Sightful at AWE, and I'd say the resolution isn't high enough. The field of view is also annoyingly limited — it's not 120 degrees, certainly not 90; it's probably more like 60 to 80 degrees, I don't know exactly. Basically, I was getting a lot of windowing effects — imagine trying to work on a monitor where you look around and can't see the full monitor. It kind of defeats the purpose of having an extended screen, in my opinion. And again, it depends on how close the text is and what work you're doing.
I felt like this kind of translucent text with something in the background wasn't necessarily compelling for my use case. But I can imagine that for some people and some types of work — if they want privacy, for instance — it may be super compelling to have these AR glasses and this Sightful laptop to type on. It actually picked up a Best of Show award at AWE, so certainly a lot of people are excited about it. But for my use case and what I'm doing, it wasn't compelling.

[01:19:37.725] Scott Stein: It's interesting for super-mobile use, yeah. But you brought up multitasking, and that's something we didn't get to. They opened a few flat apps, but one of the most interesting things I didn't see is the ability for this headset to run multiple 3D mixed reality apps side by side, which is something HoloLens and Magic Leap can do — though, like you said, with limited fields of view, and limited app stores without a lot of stuff to tap into. Thinking about that for a larger-field-of-view, pass-through mixed reality headset is a whole new thing. I'm really interested in how that works — how do you set up different things in your space, and how does that actually play out?

[01:20:21.760] Kent Bye: Hmm. Yeah, well, as we start to wrap up, I'd love to hear some of the takeaways you're coming away with, especially regarding this moment in the evolution of the XR industry. Being at AWE, it really felt like a liminal space where we're about to enter a new epoch. Javier actually told me he thought it was the end of the beginning: we've been in this beginning phase, and we're entering maybe the next phase, the middle phase, the next chapter of this evolution of XR. I'd love to hear your reflections on that, since you've been tracking this industry since the beginning and have had your eyes on probably more of the different headsets than most. So yeah — your reflections on this moment in the evolution of the XR industry, with the announcement of the Apple Vision Pro.

[01:21:18.212] Scott Stein: Well, I had been looking forward for a while to the companies driving the hardware and software that people actually use getting into the space — in my mind, mainly Apple and Google. Microsoft too, but Microsoft has been in this space and feels like it's in kind of a pause mode now: they had a lot of layoffs, and HoloLens 2 is still around, but they're partnering with Qualcomm on chips. A lot of things are flowing through Qualcomm now, toward glasses and phone compatibility. And Google, Samsung, and Qualcomm are working on this mixed reality partnership that I imagine will move in lockstep with Apple's headset — similar to how Android Wear and the Apple Watch were staggered around the same year; Android came ahead of Apple there. So I think this does feel like a big moment. And it's interesting — I've seen a lot of comments from people saying it validates what they've been working on. The industry tends to move in these ups and downs, across the whole tech industry, and whenever things get hyped really big and then come down, I've heard so many "VR is here," "VR is dead" moments. There have been a lot of those, and I've tried not to write "VR is dead" stories or "VR is finally here" stories — I get frustrated every time they emerge. So here, I think it's more that things are restabilizing; there was a big drop-off of attention all of a sudden with AI. But what I notice in the landscape is that they're interconnected. It's funny, people talk about AI versus this or that versus that, but what I've been noticing is a lot of new technologies all excitingly intertwining and heading toward some new emergent thing. Sometimes it can feel brain-bendingly tremendous — I'd talk about it with people and feel like I was almost losing my mind — but it's intriguing, because AI drives AR. I saw a tweet about AR being the interface for AI. And once you have vision systems that can really be with you all the time, not only will they get better, but — Alex Kipman talked about this once, and Meta has, Michael Abrash has — those can then drive other things, like robotics and cars, and fuel a lot of different things at once. So what's interesting to me is there's the headset, yes, but also ambient computing and robotics — where all of this seems to be going, with a lot of different things intertwining. It feels like we're heading toward the beginning of that, and it's a really long road. It's funny, because people say, oh, I don't want to wear a headset — and Apple is a product company, so they do want you to wear and buy the product — but I wonder how much of this platform, in the long run, is about spatial computing as something that exists beyond a device. Which is, I guess, the code for what the metaverse conversation was.
But I think there are a lot of these thoughts that have been out there for years — Magic Leap and others, digital twins and that world map of connected things — and once that starts unfurling, it becomes almost mind-bogglingly fascinating, and gets into the Internet of Things. So I think there's a lot of that. But I'm always hesitant to get too excited, because sometimes these things don't work out, and then there are over-hyping waves. Still, I think this is going to drive a lot of stuff, and it definitely seems to validate Meta. It's interesting — suddenly the $500 VR headset seems like an awfully good proposition against the $3,500 one. It feels like they're all kind of meeting in the middle, with Apple maybe riding the wave down, slowly getting more affordable. Yeah, I think there's something here. But I didn't look at this as a whole new thing. Some people came out of it going, it's a whole new thing, it's really here. I look at it as having made strides toward the convergence that has been happening. And it also shows that, no, it's not dying out — this is an ongoing continuum. So I think that's pretty fascinating.

[01:26:41.173] Kent Bye: So yeah — as you said that, you were putting your hands together. Kind of like this integration point.

[01:26:45.634] Scott Stein: Sorry, yeah, I was doing it again. But the integration point — I think "metaverse" was kind of code for the new internet, which got co-opted by crypto. The internet has been flat, the internet has had websites — I mean, phones redefined it, but the internet is sort of an ever-present thing around us, the thing that floats around. And I think this is more about the full expression of that. It's a little scary, but it gets to the point where there's not really going offline — though when we have phones in our pockets all the time, we're not really offline anyhow; you can still disconnect things. Apple's doing a headset that's in your home, but as Qualcomm and others go toward something that's with you all the time, that's the augmented part of augmented reality. And I think it'll lead to more partnerships — companies and locations that may have said, we're not going to get on board with this. Companies like Apple getting in may drive a lot more of that. Where you may not have seen it before — at a store, at a theme park, at a parking lot, wherever — suddenly you'll have integrations all over the place down the road. Way down the road.

[01:28:21.517] Kent Bye: Yeah, it definitely feels like a lot more folks will have the confidence to get into the industry. It's not just Mark and the metaverse, with folks feeling like he's chasing a dystopian dream; this is a real shift from 2D to 3D, into this spatial computing epoch. And this spatial computing paradigm shift, I do think, is something genuinely new — genuinely new in the sense that, you're right, a lot of this has been monotonic growth across all these things coming together. But at least from all the accounts I'm hearing, it's the integration of all these component parts, these puzzle pieces, to the point where it just works, and it's at a minimum threshold of an immersive experience that Apple feels it can stand behind — with its privacy-by-design architecture and all their human-computer interaction work. And there are certainly a lot of 2D interfaces. Starting by really optimizing the interfaces for folks who already have 2D apps — so they don't feel completely left behind, but can slowly transition from 2D to 3D — feels like a really gentle transition point for a whole ecosystem of developers, literally millions of them, with all these different applications. So the ecosystem aspect is what I'm really excited to watch. Even if only a fraction — 1% or 5% — genuinely do something new in terms of spatial computing, that's going to be a lot more than what feels like the highly curated selection of whatever Meta has decided is valuable. I feel like there's a lot that's been happening that they have not focused on — Meta has been focused on gaming almost in a monism type of way, as if gaming is the only thing that exists — and a lot of stuff has been relegated to App Lab, or not even getting onto App Lab, such as some healthcare applications, for example.

[01:30:21.851] Scott Stein: Right, yeah. Productivity stuff is very hard for visibility there. Fitness they've pushed into, but there are a lot of other things. I mean, the Quest Pro tried to make a move into work, but that brings up this interesting trap: Meta has a very specific app store that has been defined by gaming. So they were saying the Quest Pro is for work, but how many people have a Quest Pro? Whereas Apple — how many people are going to have a Vision Pro? But on the flip side, Apple is starting with their iOS app library, so they have a big head start with stuff that is recognizably productivity-focused. Whereas Meta is having to import those as 2D apps that may not be integrated with your accounts easily. That's the other thing: just because you have whatever app from your phone on a Quest, it's really about — are you logged in easily? Is it crossing over? Apple has that easy login; you're on the same iCloud account. And that's why I want the Googles and Apples and others to do it, because we're already like that now: your Netflix account is signed in over here and here and here, and you're using Google's cloud to do this. The headsets just need to have that dovetailing — I'm doing my hands again — but they need that stuff, which the Quest has really had a hard time with, because they don't have another ecosystem out there. It's Facebook, and really, it's this VR headset.

[01:31:56.475] Kent Bye: Yeah, even in some of the areas where they could have brought in these different applications — they have HTML web apps, but Meta has taken the approach of making everything go through the curated store system, rather than just letting a website act as a native application. So I feel like this competition between Apple and Meta is going to really light a fire under Meta, to maybe change the way they're thinking about this and expand the ecosystem toward more parity. For both of them, it's going to be really good competition from both sides. So I guess the final question I ask everybody: what do you think the ultimate potential of spatial computing might be, and what might it be able to enable?

[01:32:38.493] Scott Stein: I think the ultimate potential of spatial computing would be... well, one of the most exciting things I thought about when I was talking down at WWDC was applications for people who are older. There's the thought of virtual tourism or travel, but there's one interesting thing about it being a way to trigger memories and experiences you've already had — to not have to say goodbye to things. Apple's getting into that now with memories, which I don't think is perfectly executed, because I think you need to be able to capture the memories better than by wearing the headset; but viewing them is an interesting proposition. The other part is training. Simulation training has been a thing in this space for a while, but it's so interesting to think about how this stuff can help you get better at things, or experience things before you experience them. Fitness has been that thing — Beat Saber and others have literally become fitness workouts for people. That was the one part I was disappointed with Apple: there were no announcements on fitness, although there was a meditation app. But there will be, absolutely. Maybe they're waiting for the more active, mobile, interactive part of Vision Pro to come. And that gets into the real benefits of spatial computing: the movement, the spatial — not that everyone has to be super active, but it can become something holistic that's not just about vision. You build really interesting muscle memory, pre-experiencing things — whether it's music, or acting. I did VR acting a year ago with some people, and it really felt like we were there, working on ideas of performance. There's so much possibility in all this. People talk about it replacing the real world, but I think it's more about pre-experiencing. It's like a dress rehearsal: you stage something in dress rehearsals and then you have the performance, or you look at a map of somewhere before you go. It won't replace the experience, but it could dovetail with it. I mean, Apple has GarageBand — you could play real instruments that you conjure in mixed reality; they dabble in those creative tools anyhow. You could learn to be a filmmaker; you could set up shots. So that's the immediate, most exciting thing — the in-your-home part. But in the real world, to me, the ultimate potential is to transform and intertwine spaces in interesting, dynamic ways — not to replace the physical, but to help the physical be more vibrant. Maybe to have awareness of things, to know that something's going on there, that a place has a certain meaning or presence for some people. I think about those William Gibson books — people touring a spot and seeing something, living landmarks and locations. People have tried that already, but those are the big things for me: pre-experiencing things, and training.

[01:36:25.412] Kent Bye: Yeah, the name Apple Vision Pro I find very striking, because the reported code name was xrOS — an extended reality OS — which is probably a little more robust, encompassing all the dimensions of extended reality: virtual reality, augmented reality. To focus it just on sight feels too limited, because you also have hearing, taste, touch, smell, haptics. In some ways this first iteration really is the full extent of focusing on vision. But the problem I have with that is there aren't as many aspects of embodiment — moving your body around, moving around a space. So I'm hoping it's not a systemic bias against things like virtual reality; it's not just the virtual reality aspect, it's the aspect of being embodied in experiences of spatial computing. We'll see how it continues to evolve. At the start it is very vision-based, but as a name, I'm not satisfied with it, because I don't think it's robust enough to encompass everything it will need to account for in five, ten, twenty years. As an XR person who's really focused on spatial computing, focusing on just one of the senses feels like a bit of a weird choice — but that's reflective of where they're at with the whole thing.

[01:37:48.913] Scott Stein: I agree. And it made me wonder: will they have other accessories based on other senses? Is this part of a rollout? Even the pitch of it — my takeaway, which led to my headline, was that this is an audio-visual fidelity experience that also has these other things. So much of what they seem to be looking at right now is about that, and it definitely achieves it. But there are two high-ticket-price things I feel like they could have pushed on here. One was definitely the monitor thing, but the other is Peloton-style fitness. I thought that was an obvious mainstream area where people would go, oh, I go to a gym — and that's what Meta is locked onto. And I think Apple's design and focus on health research could really make strides in a lot of new ways. But it's not here yet. So that's a here-and-now embodied thing I wonder about too. I mean, it is a wonderful screen, it's got a lot of cameras, it's got an outer screen — there's a lot of vision going on.

[01:39:02.991] Kent Bye: And the eye tracking to be able to interface with.

[01:39:04.612] Scott Stein: And eye tracking, yeah. And even the names — EyeSight and the others. Maybe that was their way of explaining it to people, because it seems like they're leaning away from VR. I can't think it's just a branding thing; it's maybe that VR, as a term, would make people nervous or weirded out. There's something here that seems to say: oh yeah, enter your cinema, look at the screen, it's fine. There's a pushback on calling it virtual reality, even though technically it is. This is a virtual reality headset with really fantastic pass-through — and that's not a bad thing, it's just how we've defined what these things are — which also happens to be a spatial computer. But I don't know; that's how I see it. Yeah.

[01:40:05.562] Kent Bye: I know Niantic has John Hanke, who talks about the quote-unquote real-world metaverse. I feel like in some ways Apple has bought into this hierarchy that says physical reality is better than any virtual experience, and that the virtual is escapist. Those bifurcations of what is physical and what is real go back to Descartes, and they're what David Chalmers argues against — virtual realities are actual, genuine realities. Yet we have this perspective from Apple that carries a bias against any virtual framing of anything, so they've deliberately avoided any talk about it, even though, like you said, they've literally created a VR headset that they refuse to acknowledge — similar to the way Microsoft would call theirs mixed reality headsets rather than VR headsets. Maybe this is the next iteration of Apple trying to find a branding they can really get behind, without all the dystopic visions of people escaping into other realities. They want to find all the ways to orient people back: EyeSight so you can see other people, mixed reality as the basis, not locomoting through virtual worlds and getting motion sick. So maybe it's a design to get around all the negative associations people have had with VR, and to focus on something that avoids those biases against the medium.

[01:41:27.395] Scott Stein: My biggest fear, with the big tech companies closing in on this now, is that their individual philosophies get too locked down. I love the weird artist presence in new technologies — it's wonderful and vibrant. And I know the early adopters of these things, the avant-garde, get upset when things get co-opted, and rightly so. So part of my intrigue here is the "come on in, the water's fine" thing — it's okay, you can get weird. We've been here; people have been doing stuff with embodiment and all these things, and it's great. It's the Ready Player One argument: don't get too corporate. Disney was on stage, or in the video, and I think the degree to which these can be experimental playgrounds is really important. Otherwise, app stores fill up with crap. Don't let that happen. That's the big fear in the space, for me.

[01:42:40.543] Kent Bye: Great. Is there anything left unsaid that you'd like to say to the broader immersive community?

[01:42:44.805] Scott Stein: Well, one thing I'd like to say is that I really love the immersive community's ability to pre-model things. Meaning — I wrote earlier this year about Meow Wolf, and I became very enamored with them. And I felt this way before with Third Rail Projects, seeing different theater things. That's how I met a lot of the people at Everything Immersive — Noah Nelson, Kathryn Yu. A lot of people were wonderfully early guides for me in these spaces. But it's fantastic to be reminded that great things are made that don't need tech. This goes back to the theater thing: dreaming can happen without the tech. Immersion is an experience that can happen without the tech. Tech wants to be there too, and there are great dovetailings that can happen, but dreaming of these worlds and spaces should be an ongoing process. As far as the immersive community goes, I think they're always at the vanguard, and dreaming forward ahead of the tech is really important, because it also inspires the technology companies. Someone goes, oh, I saw Sleep No More, or I did this, and it was really cool, and now I want to do this. The art starts first, and the tech follows, and then the art goes into the tech — it's a balance. So I guess that's my message: I hope new things go into the tech, but I also hope a lot of stuff keeps advancing ahead of the tech, because it creates new arguments — the Tribecas and the others. We've seen these things for years; that needs to keep happening. If it stops happening, then we're in tech's domain, and I don't want that to be the case. That's the playwright in me.

[01:44:42.901] Kent Bye: Beautiful. Well, that's a great way to end, especially because tomorrow morning I'm flying to New York City to cover the Tribeca Immersive festival, with 13 different immersive experiences from June 8th to 16th. I'll be in New York City covering all that. And also, a shout-out to Noah Nelson and Kathryn Yu of No Proscenium and Everything Immersive: they just had their Next Stage immersive festival, which overlapped with AWE and came right before this announcement from Apple on Monday. They were gathering all the theater and XR folks in Los Angeles, California to explore this intersection between arts and technology, which for me is a big focus. I agree totally, a hundred percent, that it's those artists, those immersive storytellers and creators, who are really pushing the medium forward, along with all the independent game developers and other indies out there. It's the indies who are the real innovators — they don't have anything to lose by getting into these mediums and pushing them forward. So I'm definitely looking forward to seeing what the indies do with this new platform, and what the artists and creatives are able to do as well. And I just wanted to thank you, Scott, for taking the time after this long trip out to Apple to see all these things, and then to sit down for a couple of hours to unpack it all. I really appreciate the time and all of your deep insights and wisdom, not only from the XR side, but from your long history of covering all things technology and Apple, and seeing these two worlds come together. It's really nice to maybe have a complete Scott now that's able to — I really appreciate it.

[01:46:10.648] Scott Stein: No, this has been wonderful. And jet lag works in my favor, at least, so that's good — it doesn't feel that late. And it kills me that I was not out in LA for their event; I was following everything remotely that Noah and Kathryn and others were doing. I think that stuff is very vibrant, very exciting. And go team with all that — that's really cool stuff that's emerging.

[01:46:37.185] Kent Bye: So that was Scott Stein. He's an editor at large at CNET, and we had a chance to talk about his hands-on impressions of the Apple Vision Pro, which premiered on June 5th, 2023 at WWDC in Cupertino, California. So I have a number of takeaways from this interview. First of all, it was just really great to sit down with Scott and talk about all these different platforms, his impressions, and how they compare to other systems — an opportunity to really unpack all the different dimensions of this experience and what it means for the larger XR industry, based on all the different things we've seen. We both had a chance to see Eyefluence in 2016: Scott wrote about it in February, and I wrote about it in September after seeing it at TechCrunch Disrupt. That was the one system where you were using your eyes as a primary mode of user interface, and combining that with the pinch — that multimodal aspect. Like Ben Lang said, you're using your eyes as the mouse cursor, your fingers coming together as the mouse button click, and then the voice for higher-fidelity text entry. So yeah, it was really fascinating to hear all of Scott's reflections, and how, for him at least, this connects to all these other platforms — along with a lot of really deep context from the Apple ecosystem, and the stuff he wants to see as well. I really appreciated how all these worlds are coming together, especially as we start to integrate the different aspects of immersive storytelling and theater. I was pleasantly surprised to hear more about his own background in theater, because so much of the future of immersive storytelling is bringing in these immersive theater and theatrical components. So this is a really good transition as we start up this little mini pop-up series covering the different stuff happening at WWDC. Like I said, I didn't get an invite to this particular event, and for anybody who is at Apple: please do pass along some of these interviews to the comms folks there. I would love to get onto the list for future events, especially the launch event coming up sometime next year — I'm sure there will be lots of different demos, and I'd love to come check it out and add my own expertise. My particular focus has been more on the experiential side, on the future of immersive storytelling and spatial computing. I've seen lots of demos over the years covering Sundance, South by Southwest, Tribeca, Venice Immersive, and IDFA DocLab, trying to see as much as I can from these different places. And I have a lot of insight into the multimodal fusion of experiential design, immersive storytelling, and the subtle nuances of the future of spatial computing. So I'd love to get on the list, see the future releases, and be there to do face-to-face conversations, because it's a lot better for me to have these conversations face-to-face than to do them remotely.
Although I tried to make the most of what I was able to do over the past day, getting all these different hands-on impressions from both Ben and Scott on Tuesday, June 6, 2023 — nearly four hours of conversations — as well as putting out a previous conversation I did with Raven Zachary and Sarah Hill on their first impressions of the announcements from Monday, June 5th, 2023. Everybody I've talked to is super excited about this announcement from Apple. It really is a sea change; we're entering a new chapter of the XR industry, and I'm excited to see how it continues to evolve from here. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com. Thanks for listening.
