I speak with Road to VR co-founder and Executive Editor Ben Lang about his hands-on impressions of the Apple Vision Pro from Monday, June 5, 2023. Check out his full article: “Hands-on: Apple Vision Pro isn’t for Gaming, But it Does Everything Else Better.” Also see my Twitter thread with live coverage of #WWDC23.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast, the podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash Voices of VR. So over the course of the next three episodes, we're going to be talking to some veteran XR journalists who, between them, have over 30 years of covering this space as it's evolved over the past 10 years. So in the first episode, I'm going to have Ben Lang, who's the co-founder and executive editor of Road to VR, and then Ian Hamilton, who's an editor at UploadVR, and then Scott Stein, an editor-at-large at CNET. So each of them had a chance to be on site at Apple's WWDC, where they had the announcement for the Apple Vision Pro that happened on Monday, June 5th, 2023. I unfortunately was not invited. And so the next best thing for me is to talk to all the folks that I know and love in the XR community to get their firsthand testimonials, their hands-on impressions, because these are folks who have seen a lot of stuff within the XR industry, and I'd love to get their take. Of course, I always prefer to have my own take, but in the absence of that, I'd love to hear what other folks in the industry are seeing with this experience. So anyone at Apple, please do pass my name along. Hopefully I can get onto the list of all the cool kids next time around to be able to see all these next demos. I'm going to start off with Ben Lang. He's the co-founder and executive editor of Road to VR. And I have to say, before we dive into each of these, each of them have different perspectives and things that they focus on. So it'll be interesting just to get a little bit of a cross section. And of course, there's going to be lots of other people with different perspectives who are not connected to the XR industry at all. But there's a Twitter thread that I did where I tried to aggregate all the different hands-on videos that I could come up with for at least the first day of the conference. And so there's some other perspectives in there as well that you can dig into, to see the different video coverage and the different reports each of these folks have done. So let's go ahead and dive in with Ben Lang, and we'll hear from Ian and then Scott. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Ben happened on Tuesday, June 6th, 2023. So with that, let's go ahead and dive right in.
[00:02:15.245] Ben Lang: Hey, my name is Ben Lang and I am the co-founder and executive editor of roadtovr.com. I started it in 2011. So I've been covering this whole VR/XR space for quite some time, and I'm very happy to be back and chatting with you, Kent, on what has been a very interesting week.
[00:02:37.971] Kent Bye: Yeah. Yeah. Well, let's set a bit of a context, because you were on site there, having just seen the Apple Vision Pro. You were also at Augmented World Expo, and then the Quest 3 was announced during that. So we had quite a lot of news over the last week, but maybe we could just take a step back and have you give folks a little bit more context as to your background and your journey into covering the space.
[00:02:59.746] Ben Lang: Absolutely. Yeah. So I guess front to back. So yeah, I'm just after WWDC. I'm less than 24 hours after having my head inside of Apple's first XR headset, which is really, really something, you know, this is the start of a new chapter. And a journey that began for me, as I said, back in 2011, when I was doing tech journalism, writing for some other people as sort of a youngster, actually starting in high school. And then after I sort of learned that trade, learned how to do good reporting and started to understand technology, that was kind of just a curiosity and interest for me. I was interested in doing just something on my own, trying, you know, reporting on something that was kind of my baby rather than what somebody else was, you know, asking me to write for them. So I looked around at lots of different potential topics, things that interested me. Virtual reality was one of, you know, a handful of things that I actually just jotted down on a list. These are interesting. I could maybe write and learn about this thing and kind of have some fun doing it. At the time, Oculus was not even founded, which is kind of amazing, because now it's been so long since then that Oculus is essentially not even around anymore, which is kind of funny. Back then, Oculus was this company that sprung up, was bought by Facebook, and sort of sparked, really, you know, really sparked what ultimately got us to Apple announcing a headset, right? To a big, one more thing moment, a new headset on stage, which is really kind of amazing. So anyway, that is just to say, prior to Oculus, there was really not anything very much happening in VR. Nobody knew it was about to, well, not about to, but nobody knew that over the next 10 years it was going to grow into this industry, especially not me. I was just really curious about kind of virtual reality conceptually, and I wanted to explore what it was then, what it could do, where it might go. So that's how I got into it, and it was about, you know, a year after I had started Road to VR that Oculus was founded, they had their really successful Kickstarter, got bought by Facebook, yada yada. Things happened and it actually turned into my career, which was absolutely not something I thought was going to happen when I started Road to VR. So that brings us to now.
[00:05:31.510] Kent Bye: Yeah, I wanted to go back to May 23rd, 2023, when you wrote an article that said Apple invites XR media outlets to WWDC keynote for the first time. So you've obviously been covering all this space, even before the Oculus Kickstarter, all the way back to 2011. And so you've been on the beat covering all the major news. And so walk me through how this came about for you actually getting an invite to go to Apple to see the keynote and actually try out their Apple Vision Pro. Love to hear a little bit more about that backstory.
[00:06:00.947] Ben Lang: Sure. So the reason why that headline was meaningful is because through all these years, there were lots of clues that Apple was working on XR stuff. I mean, you know, way back in the day, many years ago, even like kind of the early Oculus days, it was clear that Apple was interested in this technology. They were hiring people, you know, they were filing patents. The hints were there that they had started on this project long ago. However, over the years, we'd reach out to them here and there to ask them if they would comment on anything, give us any indication officially that they were doing anything at all. And basically it was crickets from Apple all the way until 2023, when they invited us to WWDC, which is, you know, their Worldwide Developers Conference. This was the first Apple event that we'd ever been invited to. And of course we're an XR-specific media outlet, so prior to this event, they had no reason to invite us. And in fact, my best guess over the years was that when they were ready to finally make an announcement, they would not invite us or other XR media, because they wouldn't want to tip their hand. That's actually not how it went down. I was very surprised that they did decide to invite sort of the XR-specific media for this particular event, because it was essentially the nail in the coffin that they're definitely making a relevant announcement at this event. So that was very exciting. As far as I know, the four sort of XR media people that were invited were myself for Road to VR, Ian Hamilton for Upload VR, and Norman Chan from Tested. Tested is not purely XR, but everybody knows, they know what they're talking about. They've been in it, following it, you know, just as closely as the rest of us. So those are the folks, I think, that they reached out to, who, you know, Apple identified as the people who are sort of core to that existing VR/XR space. And so we want to make sure that they get to look at this and get their feedback on it.
[00:08:04.797] Kent Bye: Yeah, there's also CNET's Scott Stein, who also has been covering XR pretty closely and was also invited and had some hands-on, did an article and video as well.
[00:08:12.723] Ben Lang: Oh, yeah, yeah. That's right. That's not to say there aren't other people that have covered XR and do great stuff there, definitely. However, I think, you know, a CNET or a Washington Post or an Engadget, these are publications that would have always come to Apple events, you know what I mean? Those that I mentioned were, I believe, the ones that were first-timers, and it was specifically because they wanted us to see their XR headset. Like, Apple didn't even bother having us talk to people about their laptops, which was nice because they knew we were definitely just there for the goods.
[00:08:44.189] Kent Bye: Right. Well, I'm glad that you had a chance to actually try it out. I know that Ian Hamilton just posted an article this morning saying he hasn't even actually had a chance to try it out yet. He's doing his demo this morning. So it would have been nice to have all the XR folks in there prioritized. However, that's not how it went down, but you did actually get a chance to try it out. So I'd love to hear some of your first impressions of this. I know you wrote a whole article today called Hands-On Apple Vision Pro isn't for gaming, but it does everything else better. So I'd love to hear some of your first thoughts of what it was actually like to be in this experience of the Apple Vision Pro.
[00:09:19.174] Ben Lang: Yeah, so people know that I tend to do really technical, heavy analyses typically when we do reviews. This piece, I really just wanted to contextualize what the headset actually was and also, you know, what it was trying to be. When I say it's not for gaming, it's because the headset out of the box does not have any VR motion controllers, which, if you know anything about the space, VR motion controllers are essentially the main input for all VR and most sort of major AR today. To not have them means relying on hand tracking, which means you don't have sticks, you don't have buttons, you don't have as precise or reliable or wide-ranging input otherwise. So there's so many games, so many apps, basically all the top games, all the most used games need controllers. They're built for controllers. And if you took the controllers away, the games would have a very hard time being what they are or being any good at what they are with just hand tracking alone. So Apple didn't want to do controllers, at least not now. You know, you can do a Bluetooth gamepad, but that's just a whole different, whole different thing. You know what I mean? So that is to say, if you look at the 20 most popular VR games today, pretty much all 20 of those, it's going to be a real struggle if the developers want to try to figure out how to port those to work well without controllers. In many cases, the developers will probably say it's too hard, we're going to compromise the product too much, and they just won't even try. So at the outset, the Apple Vision Pro is not doing what most other VR headsets do well today, which is basically just play games. It's trying to do all the other stuff that those headsets don't do well. It's trying to do them well. So it's like, not gaming, but everything else. Let's make everything else actually good. Which is kind of funny, because let's say you look at a Quest 2 or a Quest Pro, it can do almost every single thing that Apple showed with their headset, but it just kind of sucks. It's just clunky to browse the web. It's clunky to get an image from your headset to your phone or share a video from your headset to a friend. Everything but gaming on the headset, for the most part, like anything that's not inside a dedicated application, you know, going through the menus, trying to watch videos, it's just really clunky. And a lot of people just don't do it because it's not good for it. Apple, again, they're trying, with this headset, to sort of go back to basics and say, like, we're not even gonna try tackling gaming, because that's kind of figured out. And it's kind of complicated. People got that all figured out with controllers and the buttons and the crazy schemes. We just want to make the very foundation of using this technology just kind of easy. And that starts with, like, how do you control the menus? How do you launch an application? How do you scroll? You know what I mean? Just the basics, which are just really not great on other headsets. So they devised this eye tracking and pinch scheme, sort of like your eyes are the cursor and the pinching is like a mouse click almost. And they just worked to make that really good. That sounds so easy, right? You need to pick what you want to click on and you need to be able to click on it. They just worked on refining that system and getting the right hardware in there and iterating on it and refining it until, you know, the Apple thing, right? It just works. It feels really good
to be in there and you say, I want to click this button, or I want to drag this window, and you just look and you just do, basically. You don't really have to think about it much because the systems are working so well to understand your simple input. I think one of the almost, you might call, stupid-simple ideas that they came up with, one of these things that anybody really could have done, but people just kind of haven't, is they have these cameras on the bottom of the headset facing essentially straight down into your lap. And primarily, the reason for this is just so that when you're sitting on a couch or in a chair, your hands can sit in your lap, and you can do the, you know, the pinching or whatever gestures you need very minimally. You don't need to do this exaggerated, hold your hands up in front of your face, make a, you know, special, like, try to make sure your hand silhouette is clear to the headset and do some pinching or some weird grabbing motions. It's just like, you don't really have to think about it. You can be relaxed, have your hands down. And I was really impressed at the subtle, you know, movements of my pinching that the headset would just be like, oh, yep, he pinched, I saw it, you know. It wasn't something that I had to try again and again. And in fact, it was so fast to get used to just kind of navigating menus and moving windows and scrolling things with this system that in the, like, two or three times during a 30-minute demo where the headset did not catch my click for one reason or another, I was like, wait, why didn't it do anything? Because it's such an invisible system between your eyes and your pinching. It starts to feel like using almost a mouse on a computer, right? You don't really think that much about, I'm going to move my mouse from this icon to that icon, and then I'm going to click. It's just like, we're used to it, right? You get that built-in muscle memory. So I built that up so quickly that when it did miss those inputs once or twice, I was already like, what's going on here? It was working so obviously, so clearly, that to my brain this just always works, so that when it didn't, I was almost confused. That is to say, it's not 100%, but it was good enough that within just a few minutes my brain reliably understood: this just works when I do it. And that's not just about detecting the inputs well enough, it's also about responsiveness. So when you move your eyes, it understands where you're looking real fast. And when you do that pinch, from detecting it to actually sending the input to the screen and giving you feedback, very, very quick. It's not this, like, half second, you know, you pinch and then it goes, you know. It just kind of feels like you're touching it almost. And it works really well.
[00:15:31.802] Kent Bye: Yeah, a couple of points reflecting on what you said. First of all, Apple never said the words virtual reality. They did say augmented reality, but they would always otherwise say spatial computing, which originated with Magic Leap, which I thought was an interesting choice. But you know, the other thing that I think of is just user friction, and some of the lessons of going from the Gear VR into Oculus Go and then the Quest is that the friction of people putting their phone into the headset, versus just putting on a headset and it's going within less than a minute, that's a huge difference in user friction. And it sounds like some of these different user interface changes address those friction points, so that it just works and it becomes seamless. And the other thing that it reminds me of is that there was a startup called Eyefluence that I had a chance to see at TechCrunch Disrupt back in, like, 2016. It was bought by Google. So Google had this system where you could start to use your interface primarily with your eyes. So it's called the Apple Vision Pro, and I think because you're using your eyes to engage with it so much, it seems appropriate. Even though, when I think of XR, you know, the codename for this was xrOS, the extended reality operating system. But now that it's visionOS, it's both the eyes, and I'm imagining that they're going to be expanding out into all these other senses as well. One of the things I wanted to ask you about was the sound and the spatial sound, because they were talking about things like audio ray tracing. Did they have any spatial audio demos that you were able to play around with?
[00:16:58.981] Ben Lang: They did not have, like, you know, what you might think of as a classic spatial audio demo, like, hey, the thing's floating around me in a circle and you hear it. The only thing that they actually pointed out, and I should preface this, the demos they gave, which were about 30 minutes long, were super, super handholding, super scripted. So there was not a whole lot of time to think about or look at stuff that they didn't want you to look at. You know, you do what you can, but pretty much they were saying, click here, click there, let's do this, let's launch that app. They had a map, if you will, or an itinerary for the whole thing. So I didn't get to really sit down and, you know, decide to put on a song and see how that sounded and appreciate the music. The only real thing that they pointed to with the spatial audio was when they did a sort of FaceTime call between me and another Apple Vision Pro user through that, like, avatar system that they showed. They just showed that, you know, if I were to move the window to my left, the window that was containing the avatar who was talking to me, I'd hear it off to my left. If I put it above me, I'd hear it above me. So it was pretty par for the course. They didn't show anything, like, magical there. However, the speakers, while I didn't get to really sit and figure out how good they are, they did not detract from the experience in any way that I noticed while I was doing everything else. So they've at least got that minimum bar. Whereas, like, something like Quest 2, for instance, I would actively be like, oh, the audio is just not quite cutting it, you know, it's kind of taking away from it for me. But for the Apple Vision Pro, it was not something that I noticed right out of the gate. So that's good. And then now the question is, how good is it actually versus how deficient is it?
[00:18:41.958] Kent Bye: Yeah. And as I was reading through your writeup, as well as reading some other folks, one comment that came up again and again was that people were surprised about how heavy it was. I mean, it's a heavy build. It has a lot of metal. They have the battery that's in your pocket, which is trying to offload some of that weight. Norman Chan from Tested was really comparing it to his experiences with the Bigscreen Beyond, which is really emphasizing this super lightweight, high fidelity. So I'd love to hear some of your thoughts on just the weight and the weight distribution. I know you were talking about how they could use, like, a head strap that goes over the top of the head rather than just around the back. That's something that I know SadlyItsBradley has talked a lot about, in terms of, like, having more relief of the weight distribution by having something over the top of your head. They did say that they're going to have some options for peripherals, for different types of head straps. But I don't know if there's even a way to clip into the top to have something from front to back to relieve some of that weight. But I'd love to hear some of your initial thoughts on just the ergonomics and the comfort of the Apple Vision Pro.
[00:19:40.086] Ben Lang: Sure. So there was a whole lot of stuff on this headset that impressed me, a whole lot of stuff that was obviously best in class in many cases, you know, from tracking to interface to displays. Ergonomics was not something that I felt they had made any meaningful innovations on. The reality is it's got essentially a lot of the same components that other headsets have in them. It's got thick lenses, it's got displays, it's got chips in there. And frankly, it doesn't even have the battery on board, as you mentioned, which makes a little bit of room for, you know, the aluminum and the glass, whereas other headsets are essentially all plastic or fundamentally plastic. So this is something I would have loved to see them figure out, and kind of help the rest of the industry figure out some better way to do it. But we're just not there yet. The components are going to need to get smaller, or we're going to need to have some kind of optical breakthrough, in order to really move the needle on weight. I think it's just going to be a painful crawl, as it has been, you know, since the DK1 until now, where we see headsets are smaller, but they still don't feel like I'm just going to put on some glasses for a second. And that has big implications for the vision that Apple has for this product. So clearly, they just wanted to essentially make the basics really good, like moving around the interface and opening a window and doing this normal flat content consumption stuff, whether that's the web or photos or videos. Clearly, they want this to be something that you do simple tasks in, at least at the start, right? I want to watch a YouTube video, or I want to go to a website, or I want to share a picture with a friend. I want to do that all in a headset, rather than this big, grand, immersive, I'm-going-to-be-in-here-for-five-hours thing. But if you're doing these kind of little tasks in the headset, like I want to watch a YouTube video on a bigger screen instead of on my phone, the headset is not there yet where you're going to bust out your headset just to do something in it for five minutes. Whereas if it was the size of a pair of glasses, you probably would. It should just be that easy. But we're still at the point where the vision that they're setting for this sort of flat screen content stuff that it does really well, and the ergonomic reality of where we are, those aren't aligned yet. So I think, both with the price of the headset and the size of the headset, the goal going forward is going to be, how do we deliver the experience that we're showing in this quote-unquote pro product? How do we deliver that at a better price and a smaller form factor? That essentially seems to be the challenge that they've set up for themselves. They said, we're going to put everything in here that we need in order to deliver the experience that we think is the minimum viable experience. And we don't even care what it's going to cost. We just need to hit the bar, our vision for how it should work. And then once we prove that that's possible, we're going to shrink it and make it less expensive. So, you know, this first-gen device at $3,500 is just not for most of us. It's not for almost anybody. There's a very tiny pocket of people for whom this device does enough to justify that cost. But again, it's this flag that they've planted in the ground of saying, this is what we want this thing to do. We want it to be at least this good.
And now how do we make it cheaper?
[00:22:59.384] Kent Bye: I think Valve is similar in the sense that they tried to do something similar with VR, and they had a system that was all-in around a thousand dollars, but there's still a lot of people up to this day that use the Valve Index as their go-to PC VR headset, just because they were able to nail so many of the core things that people want, even if other things have come out since then. And so I think Apple's taking the same approach, like you said, and they're starting a whole new product in their ecosystem. And I think it's worth pointing out that it is an entire ecosystem that they have. You mention in your piece that probably 75% of all the things they showed you were these 2D apps that were floating in these spatial environments. But if you have all of their iOS and iPad and other applications that have been developed within their ecosystem, then that's all available from the start, on what has an M2 chip that is the same as one of their laptops. I mean, that's a pretty hefty computing device that is essentially a standalone computer on your face. And that is a good place to bootstrap it. And I'm imagining that they'll have more developers that actually start to explore what's possible with the spatial medium. But that was, to me, what I thought was striking, just watching the entire keynote to see how they have watchOS and they have iOS and macOS and iPadOS, and you think about all those applications and widgets all being integrated from the start. That, to me, seems to be where Apple has a lead, when it comes to the robustness of their existing ecosystem. But yeah, when you were actually demoing it, you weren't really necessarily exploring the full dimensions of interactivity or the spatial dimensions of what they have to offer.
[00:24:31.927] Ben Lang: Yeah, they really only offered a glimpse of that fully immersive, fully interactive sort of other side of the XR spectrum from what we know in the current VR space. And I think that they do want to get there. I think they do want to expand that. I wouldn't be surprised if a few years down the road, we get either an official optional controller for more serious VR gaming, or they open the door to make it easy for someone to provide a third-party motion controller. But again, that's just not their focus for now; they wanted to nail the basics. But yeah, so much of the value of this headset, and so much of it is kind of the stuff that apparently only Apple can do, because they keep doing it, is leveraging their ecosystem to add value to their products in a way that has nothing to do with the hardware. It has everything to do with just how they integrate them together. So you have a company that is the most valuable company in the world currently, that has millions of products out there that people use every single day, that is in my pocket right now that I'm talking to you on. I'm already using my iPhone all the time to do stuff that's important to me. If their headset is an extension and addition to that, like being able to, say, start a FaceTime call on my phone and then jump into the headset to have a more immersive chat with my friend or share a video, or put a panoramic from my iPhone onto my headset so I can see it as if it's stretched out right in front of me, which was a very cool demo, by the way. And then not only that, but be able to send that through iMessage, which I already use, to people in my network, so that they can see that just as easily by receiving it on their end. And if they have a headset, just seeing iMessage right in their headset and clicking on the thing and pulling it up, right? It's all these interconnecting pieces about, how do you let people do stuff they actually want to do? And Apple is really smart about all this, and they're trying to connect it all. If you think about that from, let's say, Facebook, or now Meta's standpoint, they don't have this surface area of devices or services that people use to such an extent, right? They have Facebook, they have Instagram, people go there to do, like, a couple things: share pictures, share links, and talk to people. If you look at their headsets, their Quest 2 or their Quest Pro, there's almost no integration between their top products, you know, Instagram, Facebook, and their headsets. They're almost not leveraging that stuff at all. So not only do they have sort of less to leverage there in terms of ecosystem, but they also have not leaned into or not cared much about making that happen. They've experimented with it here and there, but there's just not a whole lot in their headsets you can do that leverages your existing network and existing capabilities of their other products in any way that people really use. That could be a matter of just the fact that they can't make the software easy enough or haven't made the software easy enough. Or people just don't find as much value in being able to get a picture from Facebook, for instance, let's say, to their headset, versus a picture from their iPhone that they just took, at top quality, to their headset. But yeah, to your point, the ecosystem adds a ton of value. And that's, like, where Apple is starting.
And then I think they're expecting, you know, you've got these flat applications that developers will be able to run. And soon enough, those applications that are kind of just on floating windows will have stuff that kind of starts to pop out of the screen, right? And then your next step is those applications becoming even more and more immersive as developers figure out what's a good way to do that, what's a way that actually makes the application better. I think it's like, again, it's back to the basics. And then how do we build up from this strong foundation that they're trying to set? I think they see this as a long journey.
[00:28:22.935] Kent Bye: Yeah, I wanted to ask about the displays, because, you know, the specs that we know, they say it's around 4K per eye, 23 megapixels of resolution. And you say in your writeup that the retinal resolution of the eye is around 60 pixels per degree. And I had a conversation with Karl Guttag at Augmented World Expo, who's an augmented reality expert. And he was skeptical around the feasibility of a passthrough headset handling enough pixels per degree. He said that, by his calculations, based on the specs that were reported, the Apple Vision Pro's resolution may be approximately 40 pixels per degree. And he said that's just at the right threshold to be high enough resolution to do productive work. However, there are other aspects, like the vergence-accommodation conflict, and other ways that being in the headset might cause eye strain for folks if they're in there for too long. You only had, like, a half-hour demo, so I'm not expecting you to give a full accounting of your experience in terms of long-term use, but just from your short time using it, do you expect that the screen-replacement idea for this device is a feasible pathway for what people might be using the Apple Vision Pro for?
[00:29:38.342] Ben Lang: I think it is at a feasible point in terms of the visual quality. The thing I think is still going to create the biggest challenges is the ergonomics, as I talked about before. You know, again, if I want to do 30 minutes of work, is it worth putting on the headset? People are going to get these and they're going to say, oh my God, it's so cool. I love these screens. You know, I'm going to go to the coffee shop. I'm going to have these screens floating in front of me. They'll do that for a little while, and then they'll realize it's just a lot more bulky and a little more uncomfortable than just getting your laptop out and opening it on the table, right? So there's this sort of sweet spot that we're not at yet, where you need the visual fidelity so that you don't hate, you know, reading pixelated text, which I'd say this headset appears to be certainly the closest yet, and I think sufficiently there for many, many use cases, especially if, like, I had to type and do some photo editing. It's there, right? Maybe if you're a 4K video editor, it's not going to cut it for you or something like that. But for most, like, text-based things and video content consumption, probably, definitely there. But there's this other aspect to that, which is, is it comfortable enough that you actually want to do it, that it's not detracting too much from the alternative, which is just a real display in front of you? For so many headsets that have tried this to date, they're failing on one or both of those axes, essentially. Quest Pro, for instance, it's like they're not quite there on the visual fidelity nor quite the comfort, honestly. So I think Apple Vision Pro, it's got the visuals, I think, to be good enough for a ton of different kinds of traditional screen work. But the question is really going to be, how many people are going to be comfortable enough in it long term? And is the battery life going to be the killer? The two hours is essentially the max if you're really using the device. And that could be kind of a problem in its own right. You want to go to a coffee shop, you want to be portable, you get two hours of work done on your headset, and then you have to get your laptop out anyway. You know what I mean? It's like, why don't you just start on the laptop? And if you got one of these headsets at $3,500, you probably got one of the more recent MacBooks as well, which is going to get you, like, 10 to 12 hours of battery life. So, you know, it starts to become this accumulation of compromises. And so we're not there yet. There are people who will probably do it. But this is not at the point where I think I'd be suggesting to everyday people, like, hey, this is going to replace your laptop or this is going to replace your iPad. I think it's still kind of an addition, with some novel new stuff.
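To give a rough sense of where the pixels-per-degree figures mentioned above come from: Apple's stated spec is about 23 million pixels across both displays, or roughly 11.5 million per eye, which, if the panels are close to square, works out to something on the order of 3,400 to 3,700 pixels across. Assuming a horizontal field of view somewhere around 90 to 100 degrees (a figure Apple has not published), that gives roughly 3,500 / 95 ≈ 37 pixels per degree, in the same ballpark as Guttag's ~40 pixels-per-degree estimate and well short of the ~60 pixels per degree often cited as retinal resolution.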
[00:32:13.532] Kent Bye: Well, you can tether it with a USB cord, so you can power it and work on it for as long as you want, as long as you have access to a power source. If you're going to a coffee shop, at least when I would go, I would still plug in, but with these laptops you don't necessarily need to plug in. But I did want to ask you about the spatial memory capture that they had, because I think being able to take a photo, a depth photo, with spatial audio, and then look at that later, we haven't seen an all-in-one device that does that. And when I was watching the keynote, I was like, wow, because a lot of people already use their phones for capturing videos and memories. So to have this device have everything integrated, where you can actually take the photos and watch them. And Apple's existing phone cameras are already really high quality, so imagine having that level of fidelity from the cameras, on top of the depth-sensor cameras, on top of the spatial audio capture with, like, these six different microphones that are there. So I'd love to hear some of your thoughts on actually experiencing some of those spatial capture memories, because I feel like that could be one of the things that actually makes it take off, just the ability to be able to capture and watch some of these spatial memories.
[00:33:23.826] Ben Lang: Yeah. So you can think of it like a really advanced version of what we had, you know, years ago in VR, when 360 cameras were pretty easy to get. And you could take a 360 photo or 360 video, and you could relive that in your headset. And it was pretty darn cool. This is like better fidelity, and the spatial makes it more immersive, but kind of the big deal is you can do it all from one device now, right? Before, you needed a 360 camera. You needed to hook that camera up, probably to your computer, to get the file. Then you needed to get the file to your headset. Then you needed some app to play it on that fit the format correctly. And then if you want to, like, share that with somebody else, you're probably back on the computer sending them the file, right? Just this mess, absolute mess of user experience there. The key differentiator here is Apple moving to actually having the headset itself capture stuff. And then, of course, they've got the whole playback. I bet you'll be able to share these things with others and they can see it on their headset. It's all built right in. So it takes this idea that was neat, but was too clunky, so people stopped using it, and makes it, you know, much more feasible. And this is really kind of the first serious headset out there on the market that is capturing, right? Like, every other headset has used all of its sensors essentially to do tracking, to look at the environment and figure out where the headset is. This is kind of the first one that is using all the sensors and leveraging them to actually record the environment in a meaningful way, which is really cool. So whether or not these sort of volumetric captures take off, or, let me back up there. When I did try them, I thought they were very cool. I saw a short clip of essentially some children in front of a birthday cake, and they blew it out and they were happy and jumping around. It was like 20 seconds. And in those 20 seconds, I felt like I was standing right in front of them, you know, five feet in front of them, watching. Like, I could see the happiness on their faces. I could hear them giggling. It felt like a really intimate moment for this little family. And it was as if I had just been there before and was just remembering it. So it's really kind of an impressive experience. It's a high enough fidelity that you're getting those emotions back from that memory. It's captured in enough fidelity that you're not kind of distracted so much by how it's happening. You're just like, wow, I see these people, they feel like they're right here. And that's a best-case capture scenario, whatever. But if that's what they're showing is possible, I think a lot of people are going to want it. So this whole idea of capturing these memories, it's a great demo. It's powerful. I think people are going to say, like, wow, that's amazing. The question is, are they going to be able to sort of close the gap? They're getting so much closer, right? Being able to capture from the headset itself, so you don't need an additional special camera. The software to view it is right there built in, and it's just going to live in your existing photos library. I wouldn't be surprised if you could even pull this up on your phone at some point. You know, it's just kind of right there, living alongside the existing media that is in your life and your phone that you capture all the time.
But can they continue to make it easy enough that this is a format people are going to want to capture regularly, and not just as a pure novelty? If anybody can do it, you know, Apple's the one that would figure out how to make all that easy enough, all in one package. So we'll see. But the concept, I think they communicated very clearly and powerfully. And if people like it and respond to it in the way that I think they will, I bet Apple will say, this is worth pursuing. Let's keep making it better and easier.
[00:36:57.085] Kent Bye: Awesome. Well, I know you have to run off and catch a flight back home, but I'd love to hear any kind of wrap-up thoughts, reflections on where we're at in the industry and what this news from Apple means as we move forward, because it does feel like we're entering a new phase of the evolution of XR, as you've been tracking it so closely. So yeah, I'd love to hear where you're at on where you think things might be going in the future, now that we have literally the biggest company in the world entering into this ecosystem.
[00:37:26.404] Ben Lang: Sure. Yeah, I think I'm on board with what you said, which is sort of a new phase, new chapter. It's a bit of a reboot and it's going to inject some fresh ideas. It's going to mix things up. And what I'm excited about is not just that Apple's got a cool headset, but that these ideas are going to proliferate and they're going to get distilled into other products as well. And, you know, VR, XR, this whole category, I think, will accelerate now that Apple's ideas are here for others to see and judge and maybe share or maybe improve. So it's kind of funny, you know, when I did their 30-minute demo, essentially everything that they showed me, every demo they'd show me, I have seen in concept at some point in the past. And I've seen it from, you know, somebody trying to prove to me that VR is going to be amazing for video calling, or VR is going to be amazing for memory capture, or VR is going to be amazing for hanging out with your friends or photo sharing. Like, I've seen all these things shown separately and been impressed by them and believed in the vision that, yes, it's possible, this could be awesome. And then here's Apple in 2023, you know, so many years later, essentially showing all the same demos, but done really well, and sort of reselling me on the vision for these things: that not only is the concept good, but there's the potential to make it easy enough that people actually want to do it, that people are actually going to put on the headset. It's not this, you do it one time and then it's kind of too clunky so you never do it again, kind of thing. So I was very impressed, essentially, by the entire demo. I mean, just really kind of magical technology. If I had not been following this space closely, my mind would be blown, right? If I had either never seen VR, really, and just heard about it, or even if maybe I'd seen, like, a Samsung Gear VR or a first-gen Oculus Rift. If that was my marker for where I thought VR was, and then I saw this thing, I'd be like, holy crap, did I just jump forward in time? And I mean, essentially you did, right? It's been many years since those things, but really it's a great sort of pin on the map, the roadmap, if you will, to show how far things have come when you put it all together. What I'm really excited about, kind of in a personal way, the way that I thought about this recently, is that over the years, you know, I'd see a really cool new headset, I'd get to review it, and when I'd go see family members and such, people I might see, like, once a year, I'd usually have a new headset to show them. And I'd know what I was going to show them. I'd know it'd be this cool new thing, like a brand new game or brand new feature or brand new capability, that was gonna be really neat, that they would be like, wow, okay, things are really moving. Since the release of the first Quest, I showed people that, and they're like, wow, cool, it's all standalone, it's all in one, it does Beat Saber, right? Since then, there has not been an obvious thing for me to show people. There's the Quest 2, which is just a slightly better version of what I would have showed them, Beat Saber on Quest 1. There's Quest Pro, which is, again, a little better, a little more compact, but it's, you know, mixed reality stuff. There was no compelling anything, really, to show like a normal person, essentially.
There was nothing that I thought was worth the time to pull them aside and say, hey, I have something cool to show you, put this on, you're going to check it out. Like, the technology is getting better, but the experience almost kind of wasn't, or what you could do with it was not moving very far. You know, it wasn't even a brand new VR game that was necessarily, like, yeah, there were great new games that came out, don't get me wrong, but there was no VR game that was like, this is totally different than what's been done before, or that you could really experience in a 15- to 20-minute demo. So after seeing this headset and everything that it can do, I'm very excited to get to show people this as the next showcase, to your normal person, like, look at what this stuff can do. There's really impressive stuff that I think is going to blow people's minds, stuff that I will bug people about, pull them aside and say, you've got to see this. You know, rather than, I don't know, there's really nothing too exciting that's worth the time out of whatever else we're doing. But when I get my hands on this headset, I'm really looking forward to getting back to that kind of a fun thing that I've gotten to do over the years, which is show people the coolest new stuff happening in this space. And it's going to feel like a leap to a lot of them.
[00:42:01.816] Kent Bye: Awesome. Yeah. And you're probably going to have to buy a lot of ZEISS prescription inserts for anyone who has glasses.
[00:42:06.920] Ben Lang: Absolutely.
[00:42:08.821] Kent Bye: just to make sure that I can actually see it. Well, thanks so much for taking the time to join me today to share some of your hands-on impressions of the Apple Vision Pro. I tried the best I could to finagle an invite, was not able to swing one. So hopefully the next time around, I'll be able to see it at some point over the next year. If there's other developers that get ahold of them, I'm still curious to get my head into the headset and try it out myself. It does sound quite magical with the integration of eye tracking and the hand gestures and everything else, and just the whole ecosystem. I'm excited to see this as a turning point and also just to help capture this moment, your early thoughts of where we're at now, where we might be going in the future. So thanks again for taking the time to join me today.
[00:42:46.975] Ben Lang: Thank you too, Kent. And, you know, just as I was saying before, it feels like there's a meaningful reason to show people new stuff now, and, you know, that's what's happening here too. I have new things to talk about. This is exciting. I'm always really glad to get to do this, check in with you, where we really kind of feel out where things are at. It's always great knowing, seeing where things are now and seeing where we're going. And, you know, I look forward to the next leap that gives us a good reason to chat again. So thank you.
[00:43:13.098] Kent Bye: So that was Ben Lang. He's the co-founder and executive editor of Road to VR. And he had a chance to check out the Apple Vision Pro on Monday, June 5th, 2023. And he was in transit, giving me a live report as he was going off to the airport. And yeah, I just really appreciate him taking the time to be able to give a little bit more context for his firsthand impressions and hands-on experiences with the Apple Vision Pro. So yeah, it sounds like, by all accounts from all three of these journalists, that this is on a whole other level with the integration of the eye tracking and the gestures, and yeah, they've just really dialed in all these things together. I think each of the folks that I spoke to from the XR industry have some various concerns in terms of what's going to happen to the existing XR ecosystem. Is Apple going to somehow find a way to bring those folks in, or are they really going to set strict boundaries around comfort and the lack of locomotion? There are a lot of ways in which Apple's vision may actually constrain the existing XR industry, in that they're not necessarily ever mentioning virtual reality explicitly. It's all about spatial computing. And for them, maybe spatial computing is all about mixed reality. It's about being embedded into a specific space and not about locomoting through these virtual worlds. Even though their headset can certainly do that, they're not necessarily coming up with any of the control mechanisms to be able to explore the full potential of extended reality or virtual reality. It's more about their mixed reality set within the context of a room. So I'll be very interested in seeing how that continues to develop, obviously. And we'll be looking to their human interface design guidelines that are going to be coming out for visionOS later this month. Hopefully that will give us a little bit more context for what Apple's vision is for this platform as we start to move forward. So I'm going to wrap this up and move on to the other podcasts. I'm going to be hopping onto the plane to Tribeca here in just a couple hours, so I do want to get these podcasts out before I am off to Tribeca for a week. And then there's the XR Access accessibility conference that's also happening on June 15th and 16th. So if you're either at Tribeca or the XR Access Symposium, give me a shout out. I'd love to connect there and chat more. But let's hear from Ian and Scott. And yeah, again, thanks for joining me here on this episode of the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue bringing this coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.