The Apple Vision Pro launched on February 2nd, 2024, and I feel like it represents an inflection point in the XR industry. The screen resolution is high enough for it to serve as a viable screen replacement; eye tracking combined with pinch gestures is a paradigm shift in human-computer interaction, reaching new levels of speed and intuitiveness for getting into productivity flow states; and the software ecosystem of 600 visionOS apps alongside over one million compatible 2D iOS and iPadOS apps makes a really compelling argument that this is a proper spatial computing device, not just the most advanced virtual reality / mixed reality headset launched so far. That said, the ergonomics and weight distribution of both default straps are inexcusably bad in my own experience, and there's a growing consensus that they're suboptimal for most people, with only a minority finding one of the straps completely adequate. I expect DIY fixes and third-party strap solutions to start to close this comfort gap. At $3,500, you'd expect something that feels better than most VR headsets on the market, but this is not the case.
I had a chance to catch up with Road to VR co-founder and executive editor Ben Lang, who had a number of demos in 2023 and has had early review access to the Apple Vision Pro for more than a week now (his full review should be landing either this week or next). We actually talked to each other using our Personas in Apple Vision Pro while recording external audio, since I still need to figure out how to record within the headset. We share a lot of our first impressions of the Apple Vision Pro, give some early technical feedback, and elaborate on many of the comparisons to Meta's Quest 3.
Lang elaborates on how usability and ecosystem integration are differences that go beyond any quantitative specifications, ultimately creating a qualitatively different holistic experience that transcends anything the Quest ecosystem has been able to achieve. Lang makes the pithy insight that ergonomics is now the biggest bottleneck for use and adoption of this device: at launch there are already enough things to see and do from a productivity perspective that the bottleneck for XR is no longer a matter of software or retention, but rather that the weight distribution is simply uncomfortable for extended use. The $3,500 baseline cost is of course a barrier for the general public, but as with many other Pro devices, most people will be writing off the Apple Vision Pro as a business expense if they can functionally use it within the context of their business (many XR developers are using it as a developer kit).
But what has been most surprising to me is how much utility this device has out of the box, with the caveat that you really do need a Bluetooth keyboard, and potentially a Magic Trackpad or Mac laptop, to truly use it to its full capacity as a spatial computing device that replaces or augments your existing devices. The holistic integration of the Apple ecosystem is one of the biggest value propositions for people who are already fully committed, or even partially invested, in the Apple ecosystem. I'm only halfway committed: I have an iPhone and iPad, but no Magic Trackpad, MacBook Pro, or AirPods (needed for headphones, since the Apple Vision Pro has no headphone jack), and I use Google Drive instead of iCloud. Even my Logitech K830 Bluetooth keyboard had some connectivity glitches that made me wonder if I should just get Apple's keyboard. I won't be able to reach the full potential of the Apple Vision Pro until I commit more fully to Apple's ecosystem. I made a couple of LinkedIn posts with more first impressions here and here.
There are a lot of annoying operating system and software bugs in this first-gen launch, but I expect all of this to get smoothed out over time. I expect third-party strap developers to provide a stopgap for some of the most glaring ergonomic and comfort issues. It'll always be expensive, and I don't expect Apple to get any more open as we move forward. Apple has no intention of implementing something like OpenXR to enable existing software APIs, or of supporting the external peripherals we'd expect a spatial computing device to support. So despite Apple's emphasis that they're creating a spatial computer, they're actually creating a super locked-down device that is more similar to a console than an open computer, which is really troublesome for what the future of computing might look like. I'm really glad that Apple is advancing HCI and user experience and showing what the potential of XR might be, but we also need Meta competing on what fully embodied spatial computing with haptics and controllers can provide, along with whatever Google, Samsung, and Qualcomm are cooking up together that leverages the insights of the last decade of the XR industry with OpenXR.
One saving grace for Apple is that they have Safari as a bridge to the spirit of the open web, and I hope they see the engagement numbers with it, since right now the open web is the killer app of the Apple Vision Pro. I'll be diving more into their WebXR integrations, which are currently hidden behind some special flags, but WebXR will likely host some of the most immersive and cutting-edge experiments for what's possible on the Apple Vision Pro when it comes to pushing the bounds of immersive and embodied experiences. Be sure to enable WebXR via the WebXR Device API and WebXR Hand Input Module flags (and if you're a web developer, enable the AVP Safari web inspector), and check out the nominees for the Polys WebXR Awards. I've heard from some developers that fixes are being made for issues that didn't show up in the simulator, and I'm really looking forward to diving into The Escape Artist once they iron out some of the bugs this week or next.
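For web developers wondering whether those Safari flags took effect, a minimal feature-detection sketch can help. This is my own illustration (the function names are mine, not from Apple's or the W3C's docs): until the WebXR Device API flag is enabled, Safari exposes no `navigator.xr` object, and hand tracking from the WebXR Hand Input Module only shows up per input source once a session is running.

```javascript
// Returns a Promise resolving to true if immersive VR sessions are
// supported by the given navigator-like object. With the WebXR Device
// API flag off, `navigator.xr` is undefined and we resolve false.
async function checkImmersiveVRSupport(nav) {
  if (!nav.xr) return false; // WebXR Device API not exposed
  return nav.xr.isSessionSupported('immersive-vr');
}

// Hand tracking (WebXR Hand Input Module) is detected per input source
// inside a running session: sources with a non-null `hand` property
// expose tracked hand joints.
function hasHandInput(inputSources) {
  return Array.from(inputSources).some((source) => source.hand != null);
}
```

In a page you'd call `checkImmersiveVRSupport(navigator)` before offering an "Enter VR" button, and check `hasHandInput(session.inputSources)` after the session starts.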
I expect to be diving into a lot more Apple Vision Pro coverage this coming week and month, and if you enjoy this coverage, then please do consider becoming a Patreon supporter at https://patreon.com/voicesofvr.
[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash Voices of VR. So in today's episode, I'm going to be sharing some first impressions that I've had of the Apple Vision Pro, but also in conversation with Ben Lang, who's the executive editor of Road to VR. He's had a number of different Apple Vision Pro demos since WWDC, when Apple first announced it. He's also had early access to the Apple Vision Pro, so he's had a lot of extended time to use it in a lot of different ways. So he's sharing some of his first impressions and some early technical feedback. He's still in the process of writing his full review for Road to VR, and he always does a really comprehensive job of covering all the different VR-specific things that you'd want to know. For me, my first impression is that it's absolutely stunning in terms of the technology. The display is probably the thing that I noticed the most. It has no screen door effect. It's the best display that I've seen. It's a high enough resolution that it can actually become a screen replacement, and I suspect one of the biggest use cases is going to be productivity and monitor replacement. And I think that the eye gaze with the pinch gestures is a whole new level of user interface, intuitive in a way that becomes unconscious, where people start to unconsciously want this everywhere else. So I do think it's a complete paradigm shift in human-computer interaction, in a way that is going to be way more powerful than people are expecting as we move forward. I do think we're at an inflection point. But it's a $3,500 machine, and the ergonomics are worse than most VR headsets that are out there on the market.
Worse for me than the Quest with its default strap, and most people automatically upgrade the Quest to a more rigid strap, either directly from Meta or from the third-party market. The good news is that there are third-party developers in the process of trying to come up with fixes for what, to me, seems to be an inexcusable ergonomics faux pas from Apple, where they have seemingly preferred aesthetics over actual comfort for using the thing for long periods of time. And we might actually see some first-party solutions as well. Apple thankfully decided to ship a dual strap solution, but even then I think it needs a little more front-to-back support, because so much of the weight falls onto the face that it's actually giving me a bit of a rash for using it for extended periods of time. I actually have to take a day off or so just to let my face recover and try to figure out some of the DIY solutions that are out there. But one of the things that Ben says in this conversation is that it's actually good that people want to use this even more than they can physically tolerate, because the problem up to this point has been more of a software and experience bottleneck rather than an ergonomics bottleneck. Having an ergonomics bottleneck is actually really great, because we already have ergonomic solutions: things like rigid straps, and potentially even counterweighting in different ways, although I don't know if we want to put all that weight onto the head, because the battery is quite heavy as well. But it could actually start to offset some of the weight that's in the front. So I imagine there are going to be a lot of third-party solutions to start to fill some of those gaps. And there are a lot of early first-iteration bugs that come both with the software and with operating systems, so I expect to see those ironed out as well.
Other than that, the technical achievements of the Apple Vision Pro are just amazing. I will say the audio is not great. People say it's amazing, but I don't think they've really been trying out a lot of what's already out there with, say, the Quest 2 and Quest 3. I'd say the Valve Index audio is the best that I've heard. So I hope to see some more solutions that use a little bit more of that approach, maybe even with the third-party straps, because you can unplug it. We'll see how much Apple opens up to vendors with alternative audio strap solutions and whatnot. But I wanted to have a conversation with Ben because Ben is somebody who has been in the industry since 2011, before even the Oculus Rift Kickstarter in August of 2012. So he has more than 12, nearly 13, years of experience covering the space. We ended up talking a lot comparing this to the Meta Quest and what Meta's been able to do with their ecosystem versus what Apple's been able to do, with over a million iOS and iPadOS apps launching alongside 600 visionOS apps. Meta has a much smaller selection of apps available, and they're mostly games, so they've been treating it more like a games console rather than a general compute device that has a plurality of different options for the different types of applications that people want. And we actually recorded this within the Apple Vision Pro. I haven't figured out all the recording software, so I'm actually recording the audio externally and putting it all together. But maybe as we move forward, I'll start to record some of these conversations and interviews directly within the Apple Vision Pro, if I can get over how uncanny my persona looks. So anyway, that's what we're covering on today's episode of the Voices of VR podcast. This interview with Ben happened on Monday, February 5th, 2024. So with that, let's go ahead and dive right in.
[00:05:32.704] Ben Lang: Hey, I am Ben Lang and glad to be back. This is, oh man, I don't know how many times on Voices of VR, but we managed to catch up every few years. And it's always a fun time to check in on where the whole industry is at. Yeah, I'm Ben Lang. I'm the founder of Road to VR. I've been covering this space since 2011. And well, this is the virtual me, right? We're looking at each other's personas right now. So this is a first for us as far as recording this goes.
[00:05:58.544] Kent Bye: Great. And yeah, maybe you could give a bit more context as to your background and your journey into covering the space.
[00:06:03.605] Ben Lang: Yeah, so in 2011, when I started Road to VR, there wasn't something that we would call an industry. There wasn't even something that we would call a community for VR. There was pretty much no consumer headset on the market. It wasn't even something that was kind of on the way. I ended up actually starting Road to VR purely out of curiosity about virtual reality, just as a concept. You know, you see it in sci-fi. It's a fascinating idea, the idea that you might be able to replicate enough of reality to convince somebody that they're somewhere else, right? So this is a really interesting concept, and I was very curious about it. I had been doing tech journalism prior to that, writing for some other people, and I said, hey, why don't I just start a little thing on my own? I'll write about where virtual reality is today. Like, what happened to it, right? It was around for a while in the 80s and 90s and kind of died out. So I wanted to know what happened to it, where it might go tomorrow, and also way into the future. So I started this little blog, Road to VR, and it was about a year later that the first Rift Kickstarter kicked off, pun intended, and that was the thing that got Oculus going. And of course, then they were bought by Facebook. Blah, blah, blah. Lots of things happened. Lots of companies started launching headsets. And yeah, so I started kind of before there was really a VR industry, or an XR industry as they tend to call it now. And now here we are, 10 years after that Facebook acquisition of Oculus, in an Apple headset, which is insane to think about, right? When that moment happened, if you told somebody, hey, Apple's going to get into this space too, they'd say, you're insane.
[00:07:40.911] Kent Bye: Yeah, well, there's certainly been a lot of rumors for a long time that Apple was working on something. And when they finally announced it last year, they gave some demos to some journalists. You were one of the lucky few to get an early demo. And then they've had at least three or four other times that they've made the Apple Vision Pro available to press. So maybe you could recap the different moments where you got to try out the Apple Vision Pro. And if you had early access, I was reading something on Road to VR where it sounded like you actually had an opportunity to have early access to the Apple Vision Pro. So I'd love to hear a little bit more context as to your first introduction to the headset and then what you've been able to do with it now that everyone's had it for the past weekend.
[00:08:22.795] Ben Lang: Yeah. So it was WWDC 2023 that Apple invited us to as Road to VR. And it was pretty clear that something was coming, because they had never invited us to any prior WWDC; they had nothing to talk to us about, right? We're Road to VR. We're only covering, you know, VR and AR stuff. So when they sent us an invite, it was clear they were ready to talk about something. But we didn't know until we went there. There were rumors that it was a headset. There was actually a decent amount of correct information out there, and some wrong information. But we went there. They did their whole WWDC Apple keynote, talked about a bunch of other stuff. And then at the end, they pulled out the old "one more thing" and showed Vision Pro. And I remember two moments. When they first showed the EyeSight display on the front, with the eyes lighting up, everybody in the audience was like, whoa, wow. They didn't really know: is that just a concept? Is that really on the headset? Is this just a marketing idea, like, oh look, you're inside of it? So that really wowed people. And then at the very end of the presentation, I remember so vividly this huge crowd of people at the WWDC keynote. When Apple said the price at the very end, $3,500, the reaction of the crowd was like, whoa, okay, hold on now, slow down. It was amazing to hear everybody react like, oh my God, that's a lot. That's a lot of money. But hey, here we are in these headsets. So yeah, I was there for the announcement. I was among a handful of press who got to demo it during that conference. And remarkably, it was pretty darn polished and comparable to what we're using right now, which is actually kind of amazing, because that was like nine months prior.
They had stuff down, you know. So after that, eight or nine months later, with the headset coming up, we got to do, I want to say, two demo sessions with the headset prior to launch, within two or three weeks of launch. And then we did get our hands on a headset before launch and got to play around with it, poke around with it, and post some of our first impressions before the headset went out. So that was pretty cool, you know, after being in the space forever and eventually getting to try this headset, to be one of the first people to try the finished product.
[00:10:48.048] Kent Bye: Yeah, I had a chance to have it for the last three days. And for me, one of the most striking things is, first of all, the display and the resolution, but also just how much it is a productivity device. It's a computing device. There's been a lot of talk about how Apple was not willing to say things like virtual reality or augmented reality or mixed reality. They're really emphasizing this phrase of spatial computing, but I do actually think that this is much more like a self-contained compute device than it is like a game console like the Quest. And so I've just been really amazed at how much of a flow state of productivity I've been able to achieve, even at this very early phase. And this is something that you actually called out in one of your first titles. I forget what the title exactly was, but it was something along the lines of: it's not for gaming, but it does everything else amazingly. So maybe you can just recap some of your first impressions and how that's really held true.
[00:11:46.246] Ben Lang: Yeah. So that was a headline that I had written just after my first hands-on back at WWDC. And my takeaway really, you know, the headline was "Vision Pro isn't for gaming, but it's better at everything else." And I looked back at that headline just the other day, right here at launch time, and I was like, that's still exactly how I would describe it. You know what I mean? People who have been in this industry are used to a very game-centric, game-focused VR landscape where these devices are being primarily used for gaming. Gaming is the best and most entertaining use case. They're kind of built all around gaming. Everything else is kind of secondary. And in many cases, other things that you try to do in the headset are just insanely clunky. Even if they technically can do it, it's usually a pain in the ass. It's kind of a self-fulfilling prophecy, right? It's like, well, people love gaming in VR. Well, yeah, because trying to do anything else in VR is a terrible experience. So that continues to be the impression here. I had the same experience that you did, I think: there's real productivity in here, right? If you just kind of ignore the comfort factor and that it's bulky, it really does stuff, and it does stuff well enough that you can justify using it and wanting to use it. It's capable of doing things in a way that isn't just novel. You're just like, okay, I'm getting work done here right now. You and I talking like this, being able to see each other, I really feel like I have eye contact with you, even though we're these personas, these early versions of personas. I think it's pretty good. I feel kind of present with you here. I don't know about you.
[00:13:19.722] Kent Bye: Well, I feel like there is a difference between seeing the personas in 2D and 3D. There's a part of me that just wants to have more of a stylized avatar. I don't want to have my actual representation. I understand that there's a connection between my identity and having the Optic ID and ensuring that it's actually me and you're not getting some sort of spoof. So I understand that, but I feel like they could still use the Optic ID and allow me to have more of a stylized representation of myself. Because when I see my persona, I feel like it's got this resting bitch face that doesn't feel like a reflection of me. So I've had trouble capturing what feels comfortable as my own identity, and there's this uncanniness. But I will say that the representation of being able to have it in 3D is a lot better than what I've seen in 2D. So I agree with that.
[00:14:08.513] Ben Lang: Yeah. Well, and I think this is a prime example of, you know, there are just so many people out there saying, well, it doesn't do anything that other headsets can't do. And in many ways, that's technically true, but it's the "technically" that's the problem. You and I could have jumped into a Quest headset and through some means been, quote, face-to-face like this and recorded a conversation like this, and we've never done it, right? Why have we never done it? Because it is a pain through and through: using the interface, configuring your avatar, figuring out which application we're going to use, how do we invite each other, do I have to tell you on another device to be in the headset at a certain time so that when I send you the invite you're in there and you don't miss it? It's just this absolute jumble of little annoyances that make a real big annoyance and stop people from really doing that stuff. Whereas in here, I was working on my MacBook through this as a virtual display. You sent me your phone number to invite you here. I copied it on my MacBook and pasted it straight into FaceTime on the headset. Getting your phone number from one to the other any other way would have been annoying: bring up a virtual keyboard, tap in six or seven inputs. That's six or seven inputs fewer that I had to do in order to get that information from A to B. And it really just felt full circle. It's like copy, paste, there it is. Not that getting a couple of digits of a number from one computer to the next is anything special, but it made this simple and seamless and obvious. Like, yeah, okay, easy. You gave me your number on that device. I don't have to get into my headset and find out if you're online or not. I'm just using the same tools that I use in my daily life in this headset.

And it makes the connection from inside the headset to outside the headset actually feel like there's nothing there, in many ways. Like we're talking about the EyeSight display: oh, you can see their eyes. It feels like you don't have something obscuring your vision. In an abstract way too, or in a conceptual way, being able to use all the same tools and all the same paradigms of productivity that I'm used to in, quote, the outside world in this headset makes it feel like I'm still just sitting in my room, right? I still have access to the same information and the same capabilities when I have the headset on versus when I have it off. And compared to other VR headsets, that's a completely different thing. When you put the headset on, you're like, I can't see my phone. It's buzzing. Who's calling me? I want to check my email real quick. Let me take off my headset, or go through a crappy browser and log in, because I haven't logged in before. Now how do I get my password into here? Anyway, you get it. It's a difference of friction, not technical capabilities. And yet it makes all the difference in the world.
[00:17:02.140] Kent Bye: Yeah, a number of things come up. The first thing is that I feel like I'm one foot in, one foot out of the Apple ecosystem. I don't have a Mac laptop yet. I don't have a Magic Trackpad. I do have a Bluetooth keyboard, and I'd say the Bluetooth keyboard is absolutely essential for using this as a standalone compute device. But if you have a Mac laptop, you can use that, and you get the knock-on effect of using all of these other systems that are very tightly integrated across all these different platforms. Another thing: I'm on Google Drive as my primary cloud service, and I don't have iCloud, and I get all these messages saying, oh, your photos aren't backed up from your iPhone into iCloud. Well, I have them backed up to Google Drive. But when I get onto the Apple Vision Pro, I feel like, okay, this is an instance where I may actually need to change my primary cloud service, because it's going to be so much better to have everything tightly integrated. So there's this experience of getting this as a standalone unit, but it's so much more powerful when you are fully bought into the Apple ecosystem and can start to leverage all of those things together. You only get that full experience if you fully commit to that ecosystem.
[00:18:19.683] Ben Lang: I'm actually in a similar boat as you. My main PC is a Windows PC. I use Gmail. I also use Google Photos. I use Google Calendar. But the nice part is, just like on my iPhone, where I don't use Apple Calendar, I use Google Calendar, and there's a nice, performant Google Calendar app that I use, it's the same in here, right? Okay, yeah, it'd be nice to have my photos in the official Photos app, but I can go get the Google Photos app, right? And I kind of have it just like that. That's what I think makes it feel a lot more like a computing device than kind of a gaming machine.
[00:18:54.845] Kent Bye: The caveat is that Google is not shipping a lot of their native apps. They don't have YouTube. They don't have Google Docs or Google Drive. They're probably not going to have Google Photos, because they're going to be creating their own tightly integrated system.
[00:19:05.233] Ben Lang: Sorry, keep going. While you're talking, I'm going to check the App Store right next to you. We'll figure that out. Sorry, keep going.
[00:19:14.498] Kent Bye: Oh, yeah. So I feel like, yes, there are all these things that would be great to have native apps because Google has opted, as far as I can tell up to this point, to not even make their existing iOS or iPad apps available on Apple Vision Pro. They have to explicitly opt out. And I believe they actually did do that for all of their apps. I couldn't find them.
[00:19:35.553] Ben Lang: Yeah. Okay. You are right about Google. And yeah, it is a bummer. I'm not seeing Google Photos on here. As you mentioned, they've gone in and specifically said, we, for whatever reason, do not want our apps on Vision Pro yet, or at all, so they're not here. But for any developer who is not specifically objecting to the platform for some reason, their app is just in here as a compatible app, which is really nice.
[00:19:58.970] Kent Bye: The caveat is that there is Safari, so anything that is available over the web you can use, but a lot of the time Safari's user interface is not necessarily optimized for Apple Vision Pro, so you end up misclicking, or there are no hover effects, and I found a lot of friction. Even with the iOS apps that don't have native integration, I found that I often have to bring them close to me so I can actually touch the buttons, because there are no mouse-over hover effects. So it's certainly very early days with the operating system, and I imagine a lot of these things are going to be fixed. However, I do think there are some companies, like Google and Spotify and Netflix, that because of the 30% tax are going to avoid bringing their apps onto the Apple Vision Pro. So that's another part of the ecosystem wars that I've run into for my own productivity use cases.
[00:20:50.143] Ben Lang: Yeah, and you're right. One of the big bummers for me is no YouTube on here. That sounds like, ah, whatever, just one app. But I watch a lot of YouTube. Being able to have all my subscriptions and my videos right here floating, and being able to maybe SharePlay it with people to watch YouTube videos together, is something I've always wanted to be able to do easily in VR. And yes, I can do it through the browser, but they snagged that one and said, nope, not for you. So presumably they're going to hold on to those things for maybe their own headset, which we know is coming at some point.
[00:21:21.639] Kent Bye: There is a native Juno app that I would recommend you check out. It's five bucks.
[00:21:25.762] Ben Lang: Yeah, I need to try that. But that's the thing. This is actual multitasking. I don't have to shut down everything that I would normally do in my life, and it makes a huge difference. But it is really funny, because it's good at everything else, right? But the gaming piece: so far, the games available on here feel kind of like board games compared to what you get on other VR headsets, which are just insane, fully immersive, like, gamer games. This kind of feels like the early days of the iPhone, where the games you got were like Angry Birds, versus your Call of Duty on PC or console. So I'm curious to see if they ever get there, if Apple ever actually wants to embrace that segment, which is what currently every other headset is going after. That'll be really interesting to see.
[00:22:17.378] Kent Bye: Yeah, the Juno app is using the YouTube API. It was made by the developer of the Apollo app for Reddit, so he had already done a lot of the code, and when he heard that YouTube wasn't going to be launching, he quickly put together this app, and it does a really great job of creating a native interface. I have found that sometimes I have to force quit apps. With the Juno app, for example, if I don't see the play buttons come up when I click on it, I then have to force quit it and restart it. So I feel like there's some bugginess overall that I'm experiencing, but I expect that to be ironed out over time. One of the things that you mentioned that I wanted to call back to was the friction that it has taken to connect with people in VR. I know that you've written a series of articles where you're like, okay, I'm going to try to get together with my friends and I'm just going to document everything that goes wrong, and it's this really long, complicated thing, whereas we were able to do it with low friction. There's this tightly integrated aspect of the Apple ecosystem with Messages: when you were calling me, it was also happening on my phone that's sitting here, so I could choose to use my phone or use it on this device. There's this seamless integration of all these things together. But I wanted to ask you about this statement that Meta made where they said they want to be the Android of VR, which I thought was very interesting, because they're actually probably more like Samsung: they're not building the operating system like Google has, they're doing a skin on top of Android. So I'm very curious to hear what you think about Meta positioning themselves as the Android of VR, and how far they would take that metaphor, because they haven't necessarily thought about expanding to OEMs with their own software.
They seem to want to have their own ecosystem and control over that.
[00:24:02.093] Ben Lang: Yeah. Yeah, I think you're getting at the point, which is that I think that metaphor was fairly thin. You're right. They're not making the operating system, and they are not letting other people build headsets, even for the virtual environment they have built. So I think they were primarily just talking about their openness, right? At least from an application standpoint, they want to allow whoever wants to run whatever on their headset. Of course, they have some restrictions, but they'll allow you to run a web app without their permission on there, and all kinds of games built in different engines, where Apple has a lot more restrictions and technical requirements, sort of built-in guardrails that developers can't get around that they might be able to on other headsets. For instance, Steam Link, which lets people play PC VR games on a Quest headset, which means if you just do it that way, Meta is basically not benefiting at all, right? And yet they let that on the store. That was a good move that showed some openness, and Apple probably would not allow that to happen: either not allow it, or just simply not expose the technical piping to make it even possible in the first place. So, to that extent, I get what they're saying about the Android metaphor. But yeah, I think you're right. If they want to be the Android and be that open, not a walled garden, they need a lot more on the foundation, right? You can't even run Android apps on it yet, even though it's based on Android. And the best they've done is try to get some web apps into their store looking like they are full apps, but they're really just loading the web version, and so far that doesn't seem to be gaining a whole lot of traction. So yeah, I think they have a lot of steps if they really want to be that.
And I mean, I think the weird part about that, and why you and other people, myself included, were like, hmm, that is kind of a weird statement, right? That they want to be the Android of XR, because we know that Google and Samsung are also working on their own stuff. Presumably, they will be the Android of XR, because they are Android. So, I don't know, maybe it was a Freudian slip and a little secret hint that Meta is working with Android and Google to try to compete against the behemoth that Apple is and may become in the headset space, or they've just found themselves in an awkward place in the market. You know, they may end up in a position where they need to beat both Google and Apple in the XR space, right? That would be the last place that they really want to be, because that's exactly why they set out to dive so deep and so fast into this XR space in the first place. We know this from Zuckerberg's own words. He said, we're doing this because we do not want to be under the thumb of Google and Apple as we are on smartphones, right? So now we are looking at a future where the exact same thing might happen if Meta doesn't play its cards very carefully.
[00:27:00.105] Kent Bye: Yeah. Yeah. A number of things come to mind when we're talking about all this. First of all, I think Meta has been a lot more open when it comes to supporting things like OpenXR, and Apple has had a beef with the Khronos Group. It came up in the WebGPU meeting notes, where there was talk about pending lawsuits or a beef between Apple and the Khronos Group. The gist of it is that it sounds like Apple's never going to want to do anything with any Khronos standards, including OpenXR, which means that you can't have all of these existing peripherals that have been created within the XR ecosystem integrate into something like Apple. So Apple is very much a closed ecosystem. They're not allowing that openness. I've got this Logitech K830 keyboard. It's got a built-in trackpad, but there's no mouse support yet, and you have to buy the Magic Trackpad. So maybe they'll include mice at some point, maybe not. But, you know, as I think about the overall ecosystem and comparing it to Apple's, it's like you're either bought in or you're not. And it'll be interesting to see if there will be some sort of collaboration between Meta and these other companies. I know the CTO of Meta, Andrew Bosworth, a.k.a. Boz, has said that he asked Google to have Android apps on Meta's ecosystem, and they basically said no. I don't know if that's because they want to do it themselves or what, but I feel like there's an obvious comparison between what you can get for a $500 Quest and what you can get with a $3,500 Apple Vision Pro. And there are a lot of people saying, oh, hey, you can do all of these individual things on the Quest. But part of my takeaway is that some of the key differentiating spec factors are the display, which is super high-res, where I can't see any screen door effect.
Also the operating system, the software ecosystem that you have, and the M2 processor and R1 chip, which seem to have a level of performance, especially when you look at how much power is being consumed, that is well beyond what you can do on the Quest. And so how do you respond to the argument when people say, oh, well, you can do just as much stuff on the Quest 3 as you can on the Apple Vision Pro?
[00:29:16.259] Ben Lang: Yeah, I've had a similar struggle articulating, you know, to people who say, but I can do all those things on this other headset, why it's different, and why that difference matters. I would say usability is the most important part of any product. You can have something that does the most helpful thing in the world, but if it's really difficult, really confusing to use, very few people are going to get that benefit from it. And it's going to be difficult to market, blah, blah, blah. Usability, I think in a lot of organizations... well, stepping back a second, usability is extremely difficult to turn into a number, right? We can compare the resolution, we can compare the tracking latency between headsets. But you can't exactly quantify usability into a single number and say, well, this one's got twice the usability, right? There are too many variables. And yet, even though we can't quantify it, it's still incredibly important to how well a product works, how often people are going to use it, and what value it truly provides for them. And so, yeah, I've been having a real hard time having that conversation with people who say it does technically all the same stuff. It's like, yeah, you're technically correct. It does almost all the same stuff. In fact, there's some stuff those headsets can do that this headset can't. And yet, I think when we look at the long term, you're going to see that the retention of a headset like Vision Pro is going to be off the charts compared to the others. There's so much more you can do in it, more easily, right? Even if it's the same stuff. Anyway, I was getting to the point that, within a lot of organizations, I feel like usability is just not valued as much as some of the hardcore engineering stuff.
And I'm speaking purely from the outside, looking at what companies like Meta and Google have done over the years. But it feels like usability is almost an afterthought: we'll build it, and then you just polish it up at the end. I think that Apple does a pretty darn good job within their organization of having a culture of usability. And for them, I think that manifests as: what would you actually want as the user? I think they start there and then work backwards. And that comes through. It really comes through in this product when you compare it to what else is on the market. It's their first, their very first product, their very first operating system of this sort. And it's doing a lot of things right that any of the other companies out there could have done, purely from a software standpoint. There are some elements of this that rely on the amount of power the headset has, but so much of what people are liking about Vision Pro is how the software works, right? And that could have been replicated in other headsets, had they cared to do it. And up till this point, they haven't really. So it's like, why? Why did it take until Apple? It almost frustrates me that it took Apple putting out a headset to make something that's obviously going to force the other companies to up their game. Why is Apple the only company that can do that? Why couldn't Facebook have done that when they bought Oculus in 2014? Why couldn't they have done that five years ago? Is it just because they didn't really care that much about usability? Did they not have the vision for what you would actually want to do, and how you'd actually want to do it, in a headset? Or maybe they did, and it just got lost in however their organization works. I really don't know the answers to these questions, but it is frustrating. Apple should not be the only company that can come out the door with a product that's super polished and super functional.
It just doesn't make sense.
[00:32:52.354] Kent Bye: I think it gets down to the sci-fi visions that each of the companies have seen as a driver of adoption. With Meta, there's a Ready Player One vision: everything's going to be a game. Then you have the first iterations with the Go and the Gear VR, where they had probably around 1,500 or 2,000 apps. And then the Quest came out in 2019. When I last counted, there were on the order of 614 apps that have launched officially on the Quest platform in the last 4.7 years. There may be another 2,000 or 3,000 apps within the context of App Lab that are not officially launched; I don't know if anyone has a really good estimate. And there are around 2,100 or so Rift apps that have launched. So I'd say there are probably, across the Meta platforms, maybe 8,000 to 9,000 apps that have launched in the last decade, and 3,000 of those are not even available anymore with Gear VR and Oculus Go. And they've taken a very aggressive curatorial vision that things have to be at a certain bar of quality. But what that has meant is that all these other non-gaming applications have been relegated to App Lab, or maybe not even allowed on App Lab. I know there are a lot of medical apps where I've talked to people who weren't even able to get on App Lab. So there are certain decisions that I think Meta has made where I feel like they have wanted to own the top-performing application in each of those different sectors, which creates this conflict of interest: they want to be a first-party developer, and that's where they want to make their profit, but they also want to enable the ecosystem. And when it comes down to whether they prioritize their own first-party apps versus the ecosystem, time and time again they've invested in their own first-party apps as the golden ticket to success when it comes to XR.
Whereas Apple, right out of the gate, has launched 600 native visionOS apps. The vast majority of them were probably developed on the simulator, where people didn't even have access to the hardware before launch. But they've got this operating system that allows people to not even have to use something like Unity or Unreal Engine, where you can build upon the existing frameworks of what they've built with their other operating systems. And they have Unity; time will tell how Unity and Apple Vision Pro work together. But even setting aside the question of a native game engine, you have things like WebXR and React Native, so there are going to be multiple pipelines for producing apps. And Apple, out of the gate, has already chosen to make over a million iOS and iPadOS apps available. So you have this huge ecosystem of software, whereas even today with the Meta Quest, you only have, like I said, maybe at best 9,000 that have launched in the last decade, and maybe 5,000 to 6,000 that are available. And of those, only about 2,600 are readily findable on their official stores.
[00:35:44.786] Ben Lang: Yeah, I think Apple just does a good job of leveraging its ecosystem. And it seems to me like they probably approached the development of this headset saying, we need an input system that will allow people to use this massive library of apps that we already have, right? If we don't do that, we're making a dumb choice, because we have a million applications, and they're designed for touch. But how do we make them work in here? And I think that's probably what led them to this look-and-pinch input system that we're using now, which works phenomenally well. It's not 100%, but I would say for what it is, version 1.0, it's really impressive. And for me, I'm just really surprised how easy it is to use an iPad app that you download on Vision Pro. It has no idea it is on a headset, right? It thinks it's running on an iPad. It's 100% identical to what runs on an iPad. And yet, with the look-and-pinch system, it's really super easy to use the vast majority of these apps. And yeah, I think that was a really good idea. I think they said, this is something that users would really want and benefit from, so let's make that the baseline, and then we'll build on top of that. Looking back, I think that'll be clearly regarded as a very good decision. And why not, right? Apple, probably more than anybody else, is good at leveraging their ecosystem. And that's good because it provides real value to people, really improves the experience for people, to whatever extent you are inside that ecosystem. And this is just one example. And yeah, I think I've said this before, long before we even had rumors of an Apple headset: people should never underestimate the developer army that Apple has amassed. A lot of organizations, as I understand it, sort of see iOS as the flagship version of their app, and then they'll get it on Android too, even if they launch at the same time.
I feel like internally a lot of organizations look at it that way. And that's partly for historical reasons, but also Apple has tools that people like to use, and Apple does a good job of communicating how they think these apps should work, essentially sharing a paradigm of ideas that developers can build on more easily. So you've got all these iOS developers, tons of them, probably millions, maybe more than there are apps, and they've been working with these tools and in this paradigm for nearly two decades. That's two decades of developer talent on these particular platforms, and you can't buy that, right? Because you can't go back in time and have people working on and refining all this stuff. So yeah, I've said for a long time, we should never underestimate the talent that has accrued and the number of people who know how to build this stuff. Because as soon as Apple introduces something new, all those people are going to say, oh, this is interesting, and I can build for it with all the same tools that I'm already familiar with, essentially, right? That's something that Meta certainly does not have access to, and they've been kind of figuring it out as they go, I guess. And yeah, that makes a huge difference. That's kind of part of their ecosystem story, right? Apple's advantage is these things. And now you've got people who can make a version of their existing iPad app a little bit better in no time, instead of having to learn something new. And that's an advantage that really ends up mattering to the product and the end experience, right?
[00:39:17.130] Kent Bye: I feel like Meta, with the Quest, is taking an approach where it's starting 3D-first and seeing what kinds of experiences you can have at the frontiers of embodiment, agency, and tracked controllers. It's really the height of embodied presence and the amount of agency you can have. And I feel like Apple is starting from these 2D frames and 2D paradigms and bringing all these ecosystems into that context. I feel like they're going to slowly introduce more and more embodiment, maybe even eventually tracked controllers, who knows. But the biggest complaint that I hear so far is: oh, it's just a bunch of 2D panels in an immersive environment, why is that so compelling? And I feel like, I'm sitting here in my office right now. You're in the background. I have Mount Hood and this beautiful Mirror Lake in front of me. Your face is right in front of me. I have it dialed down with the crown, so I'm only seeing it in half of the room near me. I'm sitting there and I can see my body. I can see my keyboard. I can look down. So I feel like there's an integration in the experience I've been able to have with the Apple Vision Pro. There's a lot of stuff that I do with editing podcasts where I will need to buy a Mac laptop to be able to use this as a proper screen replacement. But I feel like for a number of people, even with the apps that are available, or with having access to Safari... I feel like in some ways Safari is the killer app, because it gives you access to all these other things that haven't launched yet, and all this other stuff with WebXR that may be coming.
but I feel like there's a productivity and screen-replacement use case that they're starting with, where they're taking this existing ecosystem of things that have been developed for iPadOS and iOS and starting to bring it in, and then, as time goes on, you get more people actually using it. Or watching a 3D movie: I watched Avatar, and it was incredible to see what it looks like. I actually had to use my soundbar because the audio was so much better than the built-in audio. But I was able to have a TV screen that was probably three times larger than my existing screen, and it was in stereoscopic 3D. It felt like an experience I wouldn't be able to have if I just had a monitor. So I feel like there are some use cases that are killer apps for some people. I can't watch it with my wife, so that's a bit of a bummer; there are going to be situations where it doesn't actually make a lot of sense for people. So maybe traveling on the go, or if you want a mobile workstation. I think the monitor replacement lets you be mobile, move around, have better posture, and have a little bit more freedom with where you're doing your normal work tasks. And then the other thing you mentioned that I wanted to point out: it's not only developers, but a whole range of content creators, people who cover technology and everything that Apple does. I've just been really surprised to see how good a public reaction the Apple Vision Pro has had from people that are using it, and they're genuine reactions. Most of the reactions that I've heard are: it probably doesn't make sense for any of you to buy this, but if you have a specific use case, or if you're just interested in tech, or you're using it as a developer kit, you can use it. But I think it's actually launched with enough functionality that it could land with people who are not in the XR industry and not developers.
They have the experience of using the very intuitive pinch with the eye tracking, and that is enough of a productivity flow state, with computing in this new paradigm of spatial computing, that I think this is like the iPhone moment of XR. It's the moment that it starts to go into the mainstream, after being relegated to this kind of niche subculture that each of us has been a part of for over a decade now. So yeah, I'd love to hear any of your thoughts reflecting on this moment, on what really feels to me like an inflection point, where things do seem to be going well. It's still very early, still very rough, still a lot of bugs, but there's just so much potential in what has been launched with this hardware.
[00:43:22.283] Ben Lang: Yeah. Yeah, I think you pointed out a really insightful thing, which is that we are kind of looking at approaches from two different ends. So you have Facebook, and kind of the earlier XR industry, which has been really focused on full immersion out the gate: how do we completely immerse you, and what do we do with that? And the answer to that, in a big way, has been games so far. And then Apple's coming at it from: let's just start with being able to do stuff that we already do on flat screens, and then we'll tack on more stuff as we go, sort of enhance it as we go. And I think the advantage of doing it Apple's way is that it's going to be easier for them to find real user value as they add those things. Because starting with the full-immersion, turn-everything-into-a-3D-app approach of some of the prior headsets honestly means you start thinking, oh, what do we do? We have a huge blank canvas. How do we completely reinvent every wheel to do Twitter in VR? What's that going to look like? We'll have the tweets float at you, right? It just throws out almost everything that we already know people value, like being able to use Twitter on a flat screen. Apple, on the other hand, can say, okay, people already like to use Twitter. Let's just have that as the baseline. Let's just make that available, right? We already know that's valuable. And then, as people use it in real time and extract value from it, I think it becomes easier to say: the flat thing is nice, but someone just shared a GIF of a 3D model they made. Let's go back and update the visionOS version of the Twitter app so that you can pull that model out of the screen and look at it in 3D in front of you, right?
That becomes an obvious thing to do, just because I used it, I touched it, and it was really there. And those are the kinds of things where, when you start with that blank canvas of total immersion and brand new use cases, it's much harder to say, oh, that's obviously going to make this app better for people who are already using it. Instead, you're trying to create a new app for users that may or may not be out there, imaginary users who maybe are going to like to do something like this. So hopefully, ultimately, they end up meeting in the middle, right? And I think they will. I think there will be a time where headsets generally have the fidelity and capabilities of what you and I are doing right now, but also have the total-immersion capability and the hardcore gaming type stuff that a lot of people want to do. So honestly, as much as Apple Vision Pro, and Apple generally, might seem like this scary competitive thing to Meta, I think it is the best thing that could happen to them. Not only do they get to say, oh, we see all this stuff that works really well, let's figure out how to do that, which is great, but it is going to force them to do the one thing that they have really struggled to do in the XR landscape prior to this, which is just focus. Like: we need to come down on real, useful use cases, and we need to make them great. We can't just say, here's everything, throw stuff at the wall, see what sticks, and then say, oh, it's a great gaming platform, we figured that out, because developers built fun stuff, they overcame all the challenges to build really entertaining experiences, great, okay, I guess we have a gaming headset, right? They need to focus on what it is for and where the value is, and having a competitor in the marketplace like Apple that can lead the way in some of those dimensions is going to really help them up their game faster.
It's going to force them to do that. And yeah, I think they've struggled. They've meandered all over the place over the years, figuring out what they want their headsets to do. I mean, you called out Oculus Go earlier, where they said, oh, people love watching movies on this device. And it's like, well, yeah, because you built it just to do that, and that's the only thing you really can do on it in any meaningful way. It's kind of a self-fulfilling prophecy. It feels to me like, in many cases, the use cases fall out of what they built, instead of starting with a use case and building a thing to deliver it. And again, I think Apple approaches that a little bit differently, and it results in some good things, but also some of the unfortunate Apple things, like boxing out a lot of those real, genuine entertainment applications that people are enjoying on other headsets.
[00:47:51.005] Kent Bye: Yeah, and Meta has, I think, in some ways skipped over the enterprise market, where they're trying to subsidize the cost of these headsets to make them cheap enough for consumers to buy. I say skipped over because they didn't necessarily start with it; they started with consumer. They had Oculus for Business for a bit, and then they shut it down, and now they're just starting to get it back up. But I feel like there's been a big gap there in how something like the Meta Quest could be used in an enterprise context. And yet we have Apple right out of the gate saying: this is a pro device. This is not meant for general consumers; this is meant for businesses, professionals, or developers. In some ways it's a very expensive developer kit, but in other ways it's for people who have a lot of budget and can write it off because it serves some sort of business case they have. And I feel like Meta has skipped over that, and by skipping over that, they haven't found those viable use cases; they've kind of surrendered a lot of that market to Vive and Pico. But as we move forward, it may be a focusing moment for Meta to start to dive into that a little bit more. But I did want to ask about some of the resistance that I hear from people talking about the Apple Vision Pro as, say, a productivity device or a screen replacement, and it's around the comfort. I found that neither of the straps is very comfortable. There are a lot of DIY solutions floating around out there, like wearing a baseball cap, or buying an extra Velcro strap to get more support on the head. I'm sure there are going to be third-party strap developers that are going to try to fix a lot of the ergonomic design.
To me it's a great shame that it's a $3,500 headset and yet it feels worse than the Quest on the face, and that I want something more like the Valve Index or some of these more professional head straps. So I feel like the head strap and the comfort issue is one of the things that's holding it back. I'm not the best gauge of long-term use; I've accommodated to VR, and my longest session is probably around 14 hours in a VR headset. The very first day I got it, I binge-experienced it for 11 to 12 hours, and I did feel the impacts of the weight distribution. So I'd love to hear some of your thoughts on the comfort issues and where you see that going in the future.
[00:50:10.929] Ben Lang: Yeah, I'm actually a little bummed that Apple didn't move the needle more on ergonomics. That was one thing where I was hoping they might demonstrate some new approach, but they didn't, so that's interesting. From a hardware and form-factor standpoint, they're pretty much in the same ballpark as everyone else in terms of the footprint of the device, the kind of power it needs to consume, and the heat it needs to dissipate. I mean, they even had to put the battery external to the device. And yeah, hopefully this is one of those places where a company like Meta could have the superior thing, right? They could come out the gate and say, hey, we've got the better, more comfortable one. And then you've got Apple taking that idea and making it better. This competition is really needed, and it's going to be nice to finally see. And yeah, I can't wait for third-party straps for this headset. I'm with you. I think the challenge is that these are devices you have to wear, and they have to be highly customizable in order to be comfortable for everybody. And even though Apple put two straps in the box (and thank God they have the one with the top strap, because this device would be worse for me without it), there are still different preferences and different-shaped heads, where one thing works better for one person than another. So yeah, I really am looking forward to third-party straps coming out for the device, and hopefully finding one that makes the headset even more comfortable. I'm pretty surprised they didn't opt to just balance the battery on the back of the headset. That seems sensible, but I think they probably made that more as an aesthetic choice: we just don't want it to look like this pack on the back. And that's Apple being Apple, right?
I think the obvious thing, though, is that we will want to get rid of the pack eventually, and things need to shrink. And one of the things that I really like about what Vision Pro is doing and showing right now is that if the software is good enough, people will want to use it more and more. So if people are complaining, I've been using this headset for so long that it has now become uncomfortable on my head, you have a pretty good problem on your hands, right? People want to be in there. They are in there. There's something they're doing with the headset that is keeping them in there long enough to reach the point of saying, I want to wear it longer, comfortably. They have created software experiences that show that ergonomics are now potentially the bottleneck to why this device isn't used even more than it is. And that's a really interesting place to be, because prior to this, for a lot of the other headsets, the bottleneck has mostly been software. It's like, yeah, they're no more comfortable; you also get uncomfortable in those headsets. But there are tons of people who just end up not using them because whatever they want to do in there is not available or not easy enough. And now it's: people want to stay in here longer, we just need to make it more comfortable. We don't need to fundamentally alter some of the other choices that we made. Obviously, the software stuff will get better too, and there are lots of places to improve. But the point being, it is a good problem to have. Third parties are going to help accelerate that too. And I think it'd be funny if we end up seeing a $500 Apple deluxe head strap, the way some of the other companies have their kind of deluxe head straps out there. It's funny, though: they kind of made a similar mistake as the Quest 2 and Quest 3. They ship with soft head straps, which I find incredibly uncomfortable and kind of always have.
I always end up wanting a rigid head strap, to the point that when I'm talking to people who are going to buy a Quest 2 or Quest 3 for the first time, I'm like, right out of the gate: just do yourself a favor and get a third-party head strap, because those ones are just not comfortable. So hopefully that's figured out in due time. And just as the devices get smaller anyway, that'll kind of happen automatically, which is nice. And one day maybe we'll get down to the old glasses form factor.
[00:53:43.797] Kent Bye: Yeah, yeah. And, you know, there might be other things like eye strain and the vergence-accommodation conflict, where eventually we'll have varifocal lenses to be able to address the mechanics of having a screen in your face and how that can cause different dimensions of eye strain or eye fatigue, especially for long-term use. But I think you're right in the sense that we're now at the point where the software ecosystem within the Apple Vision Pro is still very nascent, but a lot of the core functionality is there: let's say, if you use any browser-based applications, or just the ability to have this kind of infinite screen, these multiple 4K monitors, with the foveated rendering that helps create this seamless high-resolution experience. I feel like all those things tied together create this impulse that keeps people coming back into the headset, despite the comfort issues. I'm going to try a number of different DIY solutions in my own way, like a Velcro strap, or trying to arrange it in different ways that just feel a bit more comfortable. But there were some moments that I found, like with Encounter Dinosaurs, that just really gave me chills. The level of fidelity was like, oh my God, my body feels like I'm here, in a way that evokes the plausibility illusion that Mel Slater talks about, where you know that you're not actually there, but it feels plausible, and the place illusion, where you've got this sense of environmental presence. There are moments in Encounter Dinosaurs, as well as in the 180 video on Apple TV with the woman walking across the tightrope, where the level of fidelity, how many pixels they're pushing across, also gave me chills. And so there were these awe-inspiring wow moments that I had with the Apple Vision Pro.
And I'm wondering if you yourself had any of those moments that just really took your breath away, or a moment that felt like, wow, this is really taking everything that I've seen so far up to another level.
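The vergence-accommodation conflict mentioned above can be quantified in diopters: the eyes converge on a virtual object's apparent distance, while the lens must focus at the headset's fixed optical focal plane. Here's a minimal sketch of that mismatch; the 1.3 m focal-plane figure is a hypothetical illustration, not a published Vision Pro spec:

```python
def vac_mismatch_diopters(virtual_dist_m: float, focal_plane_m: float) -> float:
    """Vergence-accommodation mismatch in diopters (1/meters).

    The vergence cue follows the virtual object's distance, while
    accommodation is locked to the headset's fixed optical focal plane.
    Varifocal lenses would drive this mismatch toward zero.
    """
    return abs(1.0 / virtual_dist_m - 1.0 / focal_plane_m)

# Hypothetical fixed focal plane of 1.3 m (illustrative, not a measured spec):
FOCAL_PLANE_M = 1.3

for d in (0.3, 0.5, 1.3, 10.0):
    print(f"object at {d:>4} m -> mismatch {vac_mismatch_diopters(d, FOCAL_PLANE_M):.2f} D")
```

Eye-strain studies often cite a "zone of comfort" of roughly half a diopter, which is why near-field UI is the hardest case for fixed-focus optics and a key motivation for varifocal designs.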
[00:55:38.026] Ben Lang: Yeah, so I actually just demoed Encounter Dinosaurs yesterday to someone who had never done any VR before, and they started screaming, asked to be done with the experience, and took the headset off before it finished. And I feel a little bad for terrorizing them. They knew what they were getting into, but even though they knew it wasn't real, having a life-sized dinosaur coming out of your wall with its sharp teeth coming right up to your face, when the visuals and the sound and the motion are all good enough, the instinctual part of your brain is going to say danger, right? And that's exactly what this person felt. And it is sort of a testament to what this technology is capable of. I want to be fair: this isn't the first time I've ever seen that kind of visceral reaction in VR. I've seen it with other headsets. A long time ago, there was a really early Oculus Rift demo, I think back before they had Oculus Touch, where a T-Rex would come walking at you down a hallway. I've seen people freak out like that before, too. Something about dinosaurs, I guess, is inherently a little bit scary to some. But yeah, this is why I wanted to pursue and was so interested in this technology in the first place. What is the limit of this ability to simulate our reality? Right now, we have devices that are, for the most part, tricking our eyes and ears, but we have a lot of other senses that are untapped so far. Adding those in starts to make things even more and more realistic as we go. For Apple Vision Pro specifically, you asked if I had any wow moments. Definitely that dinosaur one. You know, it's not the most insane thing or most interactive thing I've seen as a VR immersive experience. But it's just like the pixels on the display are... there's enough. It's super sharp. The textures are really crisp. It's one of the most photo-real, real-time VR applications I've probably ever seen.
One of the best looking in terms of just, like, end to end, right? The models are good, the animations are good, the sounds are good, the textures, what the display is capable of, the brightness of the display, and all of that coming together into something that's like, yeah, this thing's popping out of my wall right here. It's right in front of me. And it felt like a new level of that compared to what I've seen before. Very interestingly, it's the fact that you're not in a rainforest with a dinosaur coming after you, right? You're in your room, and the wall has opened, and the dinosaur is coming out. Something about connecting this thing that is tricking you, visually and audibly, into seeming like it's there, with a space that you're familiar with, can almost make it feel more realistic, because it's this fake thing in a space that you know to be very real. And so I'd be really curious to see how that ends up getting more and more integrated, the kind of fake and real things interacting, you know, the mixed reality kinds of interactions that are still really nascent on all headsets, but can be incredibly powerful.
[00:58:47.531] Kent Bye: Well, I wanted to mention a couple of things technically in terms of the Apple Vision Pro. I noticed a lot of motion blur when I'm looking through the cameras, and I saw Azad Balabanian talk about how these are high-persistence screens that have more brightness, but they're not low persistence, which means you get a little bit more motion blur. I'm curious, as you are, you know, still in the process of writing up your review, if there are any big technical gotchas in the trade-offs that they've made. Obviously we're not going to be moving around a lot, and so maybe you can handle more motion blur. But as you're looking at the headset, what are some things that are striking that you're noticing, given all the different headsets that you've seen and reviewed?
[00:59:25.979] Ben Lang: Yeah. Yeah, I think the persistence is kind of the immediately obvious one. It's not that it's significantly worse, I don't even think, than other headsets that are out there. But to some extent, the resolution is so high and everything else is so sharp that things like that become more apparent. And I think that's why you've seen a lot of people say, like, oh, it's kind of blurry in here. Because when you're still and that persistence blur isn't happening, you have this expectation of how sharp it should be. And then when you start moving, that goes down a little bit. I don't think it's terrible. Technically speaking, I think a little bit of what we're seeing is persistence, which relates to the display capabilities themselves. But I think more of what we're seeing, in terms of the blur people are talking about, is the exposure time on the cameras. So if you're in a really bright environment, the cameras can have a very fast shutter speed, which means they're only looking at the environment for a certain fraction of time. As the light gets lower, they want to be able to collect the same amount of light as they could before, and so they have to leave the shutter open longer, which means that as you move... any time you've taken a long-exposure photo and moved your camera or shaken it a little bit, things get blurry. That's exactly the same kind of thing that's happening here. So I actually think that improvements in the external cameras will very likely improve that quite a bit, and I think the display itself is probably not as much of the issue. So an easy way to test that will be to see how much motion blur there is in a fully immersive environment, you know, where you see no part of your real world, versus out in the real world, when you're doing the passthrough view, because then you're relying on the cameras to render rather than a sort of software camera, if you will.
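Lang's exposure-time explanation can be made concrete: passthrough smear is roughly head angular velocity times shutter time, mapped onto display pixels. The shutter speeds and the ~34 pixels-per-degree figure below are illustrative assumptions, not measured Vision Pro values:

```python
def passthrough_blur_px(head_velocity_deg_s: float,
                        exposure_s: float,
                        pixels_per_degree: float) -> float:
    """Approximate motion-blur smear, in display pixels, for passthrough video.

    The scene sweeps across the camera sensor for the whole exposure, so the
    smear is the angular travel during the shutter interval mapped to pixels.
    """
    return head_velocity_deg_s * exposure_s * pixels_per_degree

PPD = 34.0  # rough pixels-per-degree; an assumed figure for illustration

# A bright room allows a fast shutter; in a dim room the shutter stays open longer.
bright = passthrough_blur_px(60.0, 1 / 500, PPD)  # fast shutter: small smear
dim = passthrough_blur_px(60.0, 1 / 60, PPD)      # slow shutter: large smear
print(f"bright room: {bright:.1f} px of smear, dim room: {dim:.1f} px")
```

Fully immersive content is rendered rather than photographed, so it has no shutter term at all; comparing blur in immersive versus passthrough views isolates the camera's contribution, which is exactly the test Lang proposes.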
Otherwise, technical gotchas: I think the 3D-model hand tracking doesn't feel excellent. I think Facebook still might have the edge in terms of getting a 3D model of your hand to interact in fully immersive and interactive things. However, the pinch detection, which I think is probably not relying on estimating a full 3D model of the hand, that is very good. So much so that, I mean, it's still not 100%, but it works often enough, and it's fast enough in conjunction with the eye tracking, that I actually started using Vision Pro trying it with a trackpad and, for the most part, prefer to just go with the pinch, which is kind of crazy to me. That's partly based on how the operating system actually works, because with a mouse, you just have to move from one side of your screen to the other to get somewhere. But in Vision Pro, I might have to move all the way, basically looking from one side of the room to the other side of the room. My eyes do that much faster than trying to drag a mouse all the way across that space, right? So it actually ends up being kind of faster to navigate with this particular interface paradigm with your eyes. And I actually see that maybe being something that people end up wanting on their computer: like, if they have a really big monitor, being able to just look and start typing, and because you're looking at the right window, it knows where you want to type. People have been experimenting with that before, but this is one of those products that might sort of prove out why that's so useful and make people want it more. And so, yeah, aside from that, I think ergonomics, I would say, are the weakest part of Vision Pro for me so far. And its inability to play some of the coolest, most fun, you know, awesome content that currently exists in VR. You just can't do it. Like, it's a big bummer that you can put on a different headset and have some really incredible entertainment experiences that you can't have in this headset.
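The gaze-versus-mouse speed argument can be sketched with two rough models: Fitts' law for mouse travel, and a linear saccade-duration fit (about 21 ms plus 2.2 ms per degree, a common figure from the oculomotor literature). All constants here, including the pinch-confirmation delay, are hypothetical values chosen only to illustrate the scaling:

```python
import math

def mouse_time_s(distance_px: float, target_px: float,
                 a: float = 0.1, b: float = 0.15) -> float:
    """Fitts' law estimate of mouse pointing time: T = a + b * log2(D/W + 1).

    a and b are device- and user-dependent constants; these are illustrative guesses.
    """
    return a + b * math.log2(distance_px / target_px + 1)

def gaze_time_s(distance_deg: float,
                saccade_base_s: float = 0.021,
                saccade_slope_s_per_deg: float = 0.0022,
                pinch_confirm_s: float = 0.15) -> float:
    """Saccade duration (~21 ms + 2.2 ms/deg) plus a hypothetical
    pinch-confirmation delay for the select gesture."""
    return saccade_base_s + saccade_slope_s_per_deg * distance_deg + pinch_confirm_s

# A window on the far side of a room-scale workspace:
print(f"mouse across 6000 px to a 100 px target: {mouse_time_s(6000, 100):.2f} s")
print(f"gaze across 90 degrees, then pinch:      {gaze_time_s(90):.2f} s")
```

The key point is the scaling: mouse time grows with the logarithm of pixel distance, while a saccade's duration barely grows with angle, so the gap widens as the workspace expands from a monitor to a whole room.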
And yet on this headset, they also have some other things you can't do elsewhere, like an actual, you know, solid library of easy-to-get-to, high-quality 3D movies that you can just pull up and watch. So again, it kind of just comes back to... I hope this all ends up meeting in the middle. People are going to borrow from each other and get the best stuff. And, you know, five years from now, devices will feel like they accelerated faster than they did over the last 10 years up to this point.
[01:03:09.832] Kent Bye: Awesome. Yeah. And I always like to ask people at the end of all my interviews what they think the ultimate potential of VR is. And as we're at this kind of inflection point, with what may be the iPhone moment of VR, I'm curious what you think the ultimate potential of virtual reality, or spatial computing more broadly, might be, and what it might be able to enable.
[01:03:32.357] Ben Lang: Yeah, I think, you know, I've answered this question in the past, and off the top of my head, it's been, basically, you know, a year or two between our recordings, right? And you've asked the question before. I'd be curious to go back and compare what my answer was then versus now, but I want to say that I have leaned into, for a long time, the idea that the ultimate potential of this technology is to sort of close the space between people, like we're doing right now. It is now easy to get a solid representation of actual Kent in 3D in front of me and have a conversation, kind of uninterrupted by technical constraints for the most part. I imagine, you know, we could pull additional people in here, friends in here. And the easier it is to do, the more we're going to want to do it. And I kind of feel like we're in a state of technology and culture right now where the internet is this weird, abstract interface between people and ideas, and, you know, you've got your trolling and your shouting and your politics, and a lot of division kind of bubbles up on the internet because people aren't directly treating each other as people. Whereas if people do this more often with people they know, or maybe even people they don't know, and feel like they're actually standing in front of them, looking them in the eyes and talking to them, I think a lot of that goes away, right? If I'm just someone who fundamentally disagrees with some idea of yours, you know, I'm probably not going to sit here and stare you in the face and scream at you in the same way that somebody might type in all caps and say, you know, you're so stupid and blah, blah, blah, because you're a person, right? You're not just an anonymous somebody on the internet somewhere through my screen. You're, you know, you're here.
So, yeah, the ultimate potential, I think, is to help people connect with people in an authentic way that kind of gets us back a little bit to that humanness that is, in our global world, hard to come by, for reasons of distance and expense. So yeah, I look forward to doing this more often with people, being able to easily jump in and just talk to them.
[01:05:32.594] Kent Bye: Awesome. Well, based upon my memory of the previous answers, there's a consistency of connecting to other people that I think you've been emphasizing for the 13-plus years now that you've been covering this space, since 2011. And yeah, you've really been at the forefront of covering this space, and I always really appreciate checking in with you at these moments of inflection points or launches, to get your take on things and reflect on where we're at now and where we may be going in the future. And I very much look forward to diving deep into what is always a very detailed, long, and comprehensive review of the Apple Vision Pro once you get your write-up published, probably within the next week or so. So thanks again for taking the time to help break it all down.
[01:06:13.276] Ben Lang: Absolutely. Glad to do it and look forward to the next time.
[01:06:16.443] Kent Bye: So that was Ben Lang. He's the executive editor of Road to VR, and we were talking about some of our first impressions of the Apple Vision Pro, which released on February 2nd, 2024. So yeah, I feel like there's so much more to explore and unpack with the Apple Vision Pro, and I'll be diving into a lot more conversations with both developers and what's happening with the Apple ecosystem, because I actually do think that this is an inflection point for XR. It is a $3,500 machine. It's priced for people in more of a business context. I imagine a lot of the people that are buying the Apple Vision Pro are writing it off for tax purposes, which I think is a big difference between the people who are getting it and the general public. I'm sure there are some people from the general public that are going to buy it, but most people are using it as a development kit to be able to develop whatever is possible with the Apple Vision Pro. It is starting from more of a 2D mindset, whereas the Quest is starting from a 3D mindset. But I feel like, because they're operating from this existing repository of everything that they've done with the iPhone and iPad, they actually have a very comprehensive approach. Some of the accessibility options on the Apple Vision Pro are way better than anything else that's out there on the market, just because accessibility is baked into the core of the operating system, and they're in control of the operating system. And you can create a lot of these apps without even having to use a game engine like Unity or Unreal Engine, which is very bloated and costs over $2,000 per seat for a Pro license, even if you just want to do PolySpatial development. You know, so I don't expect to see the majority of apps using Unity, just because there are so many existing iOS and iPadOS developers that are just going to use the native tools, but it's not going to feel very immersive or have much agency.
It's going to fall into that 2D paradigm for right now. But like I said in this conversation, I feel like Apple is coming from the 2D and moving into 3D, whereas the Quest is coming from a VR-native perspective, with embodiment and agency and tracked controllers, all the things that we associate with what makes a really, truly immersive experience. I feel like the Apple Vision Pro has got some glimmers of those truly immersive experiences with Encounter Dinosaurs, but it's really not meant for being very active and moving around a lot like you do in a Quest. If you want to move around and play a lot of games, get a Quest. But if you want to actually replace your monitor and explore the future of what spatial computing might offer, then I do think that the Apple Vision Pro is the first viable VR headset that delivers on this idea of spatial computing. We've seen a lot of complaints about Apple memory-holing virtual reality, augmented reality, XR, and mixed reality, like they're preventing developers from putting any of those terms in their titles or using them in the descriptions of their applications. And I've certainly been critical of that, because, I mean, this is the Voices of VR podcast, and I'm covering the Apple Vision Pro because it is primarily a VR headset. It has mixed reality, but it is also a spatial computer. It is able to achieve a level of compute, with the M2 chip and the R1 chip offloading the sensor processing, that is actually super impressive for a compute device. It's absolutely essential to have a Bluetooth keyboard to actually use it in that capacity. And I'm likely going to have to get a Magic Trackpad, and probably even a MacBook laptop, to be able to really push it to its limits as a screen replacement and as a productivity tool.
So yeah, I'm seeing lots of excitement right now with what's happening, just from the reactions. I think it's just surprising to see how many non-XR-industry tech influencers and reviewers are looking at it, because it's the first major new platform that Apple has launched since the Apple Watch, like a decade ago. And so there's a lot of excitement in the broader technology-reviewer sphere, where this is a really novel, interesting new paradigm for computing that is actually really interesting for a lot of people to cover. So you're going to see a lot of people trying to push it to the edge of what's even possible. So there's this moment of exploration and novelty and complete surprises about what is going to be the biggest thing. And Juno is one of the apps that is filling the gap left by YouTube deciding not to launch an app (at least, they now say it's on their roadmap). Just to have a native YouTube app is like one of the number one apps; it's the must-buy. A lot of people watch a lot of YouTube videos, and just to have the native integrations is super magical. So you're getting a little bit of a glimpse of taking an app that we know and love and starting to give it a native treatment. I feel like the Juno app is one of the first applications to do this kind of translation of what our experience of YouTube is, and to start to use it as a background context for doing all sorts of other different types of computing tasks. And you can also have a full screen within Safari, and have multiple videos and everything going on at the same time. I have found that, a lot of times, you can only have one audio stream playing, like on your phone, and I've run into that a few times, where you can only have one video playing at the same time, at least with audio. You can have the visuals of things playing, but you can only have one audio track playing right now, so there's not a lot of the muxing together of audio streams that you would get on a computer.
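The audio "muxing" mentioned here is, at its core, per-sample summation: a desktop audio server mixes every app's stream by adding samples and clamping the result, while a one-stream-at-a-time policy simply plays a single source. A toy illustration of that mixing step, with plain Python floats standing in for audio samples:

```python
def mix_streams(streams: list[list[float]]) -> list[float]:
    """Sum per-sample across streams, clamping to [-1.0, 1.0].

    This is the essence of what a desktop audio mixer does continuously;
    a one-stream-at-a-time policy skips the sum and plays a single source.
    """
    n = max(len(s) for s in streams)
    mixed = []
    for i in range(n):
        # Streams may have different lengths; shorter ones contribute silence.
        total = sum(s[i] for s in streams if i < len(s))
        mixed.append(max(-1.0, min(1.0, total)))
    return mixed

# Two "videos" playing at once: both remain audible in the mix.
a = [0.5, 0.5, 0.0, 0.0]
b = [0.0, 0.25, 0.25, 0.9]
print(mix_streams([a, b]))  # [0.5, 0.75, 0.25, 0.9]
```

Real mixers also resample, apply per-stream gain, and soft-limit rather than hard-clip, but the core idea is the same: multiple simultaneous sources are summed into one output buffer.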
So I'm sure over time all of this is going to get ironed out, and I'm super excited to be diving in and covering it a lot more. The Apple Vision Pro was not cheap, and so I am very much encouraging, if you enjoy this type of coverage that I'm doing, then please do consider becoming a member of the Patreon, just to help offset some of this cost and to help enable me to continue to dive into both the developers and some of the applications of the Apple Vision Pro. Because I really do think this is a turning point for the larger XR industry, even if it personally doesn't make sense for you to get one yet. Although, if you are a listener of the Voices of VR podcast, it's likely that you are already within the XR industry, and it just may make sense professionally to get one, even if, financially or personally, I understand, it may not be within the scope for many people. There is an Apple Card option where you can spread the payments over, like, 12 months. You do have to have a credit limit of at least $3,500 if you want to use that payment plan, but it's zero interest and you can spread it out over 12 months, which makes it a lot more affordable. So anyway, I'll be diving into a lot more coverage of the Apple Vision Pro here within the next month, and yeah, stay tuned. So, that's all I have for today, and I just want to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.