I speak with UploadVR Editor Ian Hamilton about his hands-on impression of the Apple Vision Pro that he had on Tuesday, June 6, 2023. Check out his full article: “Apple Vision Pro Hands-On: Way Ahead of Meta In Critical Ways.” Also see my Twitter thread live coverage of #WWDC23.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Podcast: Play in new window | Download
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR podcast, the podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash Voices of VR. So continuing on my coverage of WWDC and Apple's Apple Vision Pro announcement that happened on Monday, June 5th, 2023. Today's episode is with the editor of UploadVR since 2015, Ian Hamilton. The previous episode was Ben Lang from Road to VR, and the next episode is going to be Scott Stein from CNET. So I unfortunately was not invited to go to WWDC to be there amongst all the other developers and journalists and to get some hands-on time with the Apple Vision Pro. The next best thing for me is to talk to all the folks I know and love from the XR industry, including Ian Hamilton. He's been the editor at UploadVR since 2015. We crossed paths at all the different XR conferences over the years, and I wanted to get his take on not only his experiences, but also the larger reflections on what this means for the wider XR industry. He has a lot of deep reflections about both the story of being able to go and cover this at Apple, but also the significance of really validating the folks in the XR industry who have been toiling away, believing in something that a lot of people didn't think was going to be a thing. And now that Apple has entered the chat, metaphorically, I think it's legitimizing a lot of the work that folks have been doing in the XR industry. It's super exciting on a lot of different fronts, and I think Ian really reflects a lot of those sentiments and gives some hands-on impressions and takeaways for what the Apple demo got right with the Apple Vision Pro, and the type of stuff where they could still take some lessons from years of research in virtual reality on avatars and what they're calling the digital personas. So I'll let Ian give some of his impressions, and then I'll unpack it just a little bit at the end. And if you missed the previous episode, it's with Ben Lang. And the one before that, I had a chance to talk to a couple of XR developers, Raven Zachary and Sarah Hill, about their first impressions of what's happening with Apple Vision Pro. And then in the next episode, I'll be diving in with Scott Stein. So that's what we're covering on today's episode of the Voices of VR podcast. This interview with Ian happened on Tuesday, June 6th, 2023, when Ian was still in transit, just after filing his UploadVR report of his hands-on impressions of the Apple Vision Pro. So with that, let's go ahead and dive right in.
[00:02:35.179] Ian Hamilton: My name is Ian Hamilton. I've been reporting on VR for more than a decade. I've been Upload VR's editor since about 2015. And yeah, that's more or less my focus in the space.
[00:02:47.643] Kent Bye: Great. And yeah, maybe you could give a bit more context as to your background and your journey into covering VR.
[00:02:53.605] Ian Hamilton: Yeah, so I started covering VR a little bit before the Oculus Kickstarter. I did a couple of stories about local VR companies in the Orange County area doing VR things. Disney being one of them; they were doing immersive planning for their theaters before Oculus. And another one was, I think, EON Reality, which was a local Orange County company. But as soon as I tried the first Oculus duct tape demo with a company local to me, I kept coming back to my editors saying I would like to do more stories about VR, and they were sort of rolling their eyes at me a little bit for a while there. And when Oculus was acquired by Facebook, I took that as a sign that it was time for me to try to make a full gig of this. I went freelance, and it took me about a year before I started writing for Upload. And from there, we've been working to try to be the best, most credible, reliable source of information about VR and AR.
[00:03:50.312] Kent Bye: Great. So take me back to May 23rd, 2023, when both you and Ben Lang of Road to VR wrote an article saying that you had been invited to Apple to come cover WWDC. Take me behind the scenes a little bit as to how that came about.
[00:04:08.139] Ian Hamilton: Um, I had messaged every person I knew that had gone and gotten sucked up by the Apple mothership prior to that. So I was in all their LinkedIn messages, their DM chats, trying to say, hey, if you can get me on the list, please do so, I would love to be there when Apple announces its thing, knowing that none of them could reply to me or even give me a hint of anything. I don't know how I got on the list. But yeah, I was in my mom's house in Bentonville, Arkansas, on, like, my son's last day of school. He was just getting ready for his summer vacation, and I'm walking around preparing for my own personal travel that I've got coming up. And I looked down at my phone and the email is there saying you've been invited to Apple's event. My son was born the day I accepted the job to work at Upload. And he saw me jump through the ceiling, almost, in a way that he's never seen me jump. Like I said, he's almost eight years old, and he saw a moment of joy in me that he has never seen in his life, and he flat out said, I've never seen you act like that. But it was meaningful, and I explained it to him that I've been more or less working towards this moment his entire life. It feels weird to frame it that way, but it is essentially true. And yeah, the steps there were kind of wild, because it was like, okay, we've got the invite. All right, let me go through and fill out the info, so I RSVP and I don't lose my spot in line. Okay, let's go talk to my team and get a pre-prepped story going, because it's instantly obvious that Apple inviting a VR publication seems newsworthy, that that's happening. And, you know, in the email there's no obvious "you can't share this" language. So I figured we were free to publish it, but I still emailed Apple just in case, and I assigned a story to get written about it just in case. My reporter, Henry Stockdale, asked the same question I did: are we free to report this? And then I saw the Twitter replies as soon as our article was out there, and I was like, imagine getting disinvited to the Apple event because you reported that you were going. Which, yeah, it was in the back of my mind, this fear of that happening. But the article got finished, right? It got drafted. And then the email came in from Apple saying, yep, you're good to go, there's no issues there. And I had confirmed by looking on Twitter that other people had gotten their invites and were putting it out there publicly anyway. So it was fair game from multiple directions, and then we hit publish. And, you know, it's a weird feeling, because we worded it very carefully in this article to say, make of this what you will, right? The writing is on the wall to us, right? I don't know if it's going to be on the wall to everyone else. We put it out there, and sure enough, people are like, that's the confirmation right there. And I won't lie, right? I went to the keynote. I sat there through watchOS updates, iOS updates, iPadOS updates, widget updates, multiple timers (so that's wonderful), worried that it was the biggest head fake in the world, right? And the ridiculous thing about the keynote, I have to say, is I flew out all this way to come to Apple Park headquarters, and Tim Cook comes out on stage physically at Apple Park and says, this is a big day for Apple, we're going to have some of our biggest announcements ever.
And then he goes and sits down and plays a video for the next hour and a half. So there was no value whatsoever in being in that audience just to see the premiere of this pre-produced Apple video. But yeah, it was a panicked feeling to get through this whole thing, thinking that there's one more thing coming, knowing in your heart somewhere that one more thing is coming, but still just, you know, preparing myself mentally for the anguish of the last 10 years happening again.
[00:08:23.750] Kent Bye: Well, I was at Augmented World Expo just a few days ahead of that, and there were enough confirmations from different people that seemed to indicate that this was indeed happening. So I had a good sense it was happening, but even I was watching through the keynote, reporting on all the different ecosystem announcements outside of the XR headset, not knowing if it was actually going to happen. So when it did happen, it did feel like a moment of relief in a lot of ways, because a lot of folks have been waiting for this to happen. Just a quick logistical note: did all of that happen in one day? Like, you got the email and you published the article, or was that over the course of a couple of days? I'm just curious.
[00:08:59.449] Ian Hamilton: It was over the course of a couple hours. I mean, I got the email and we had the article up within probably an hour and a half to two hours. I mean, I don't want to, you know, give away something to our competitors in how quickly we work. But yeah, we work fast, and we have to in order to get to things quickly. And yeah, we moved fast on that.
[00:09:17.902] Kent Bye: You certainly have one of the biggest teams in the industry, you know, with some of the highest output of covering the space. And yeah, I was just curious, because I had made some inquiries and was sort of waiting to see whether or not I was going to hear anything back. And I ended up not hearing anything back, and so I'd already made my plans to go to AWE and come back by the time you had heard anyway. So anyway, I was not able to be there and to have the hands-on. So maybe let's skip to over the last couple of days. You had a chance to watch a prerecorded keynote, which, like I said, I probably had a better experience of, because I'm in the comfort of my home, able to see it in high resolution on my screen and cover it pretty robustly, while you had to see it in a big crowd of people. But the difference between my experience of it and your experience is that you actually got scheduled a demo to get some hands-on with this Apple Vision Pro. So walk me through the process of that.
[00:10:09.609] Ian Hamilton: Well, all right. So let me think of this. I don't think any of this is under embargo. I mean, I wasn't told to keep any of this secret, but the way the schedule worked was, you know, it just said a briefing, a briefing time, on my schedule. So initially, you know, there was even a possibility that I would go to a meeting and they would explain things to me about the headset and I wouldn't actually put the headset on my head. So I was being very cagey up until the moment that I knew that this thing was going on my head, of actually saying to the public, to my readers, yeah, we're going into a demo right now. So yeah, the first day here was just to witness that keynote. And let me just say, despite it being just introduced by Tim Cook and then watching a video, it was a lot to take in, the grandiosity of what the Apple machine is, by actually getting to see Apple Park, this ridiculous structure that's literally surrounded by nature inside and out. You can't see the rest of the city; they've got these little hills built in with all the greenery. And then just the number of Apple employees walking around and directing you where you need to go. I've been to a lot of these Meta events and Microsoft events, even Google events. Nothing quite matched the feeling of being there in this place that feels a little bit like an Apple store, but way, way, way larger. It's this enormous structure that's mind-boggling to walk around. So there was something I gained from just being able to ingest the size of the Apple machine firsthand. So then my demo is the second day, and I get my appointment time in the morning, go back to Apple Park and get in a little cart with like five other journalists. And now we're in a cart running around this giant structure and they're driving us around, and I could hear the Jurassic Park, welcome to Jurassic Park island music playing in my head, because it felt exactly like that. Like we're going on a safari and there should be dinosaurs walking through these bushes at us. And we're being shuttled around by people that have very big things that they're excited to show. So the cart pulls around to this building in the back of the thing. And I've tried to describe this; I put a video out there. I wore the Ray-Ban Stories sunglasses from Meta and recorded minutes of me walking around the space, because I didn't want to do the selfie stick thing. I think that's a little bit, I don't know, I don't like doing the selfie stick stuff. I'd rather just get my perspective out there and show what that looks like. So I'm recording minutes of footage as we go on this drive. And I walk up to this building and I know that once I'm in the building, there's no recording allowed. They reiterated it as soon as I got inside the building: no audio recording, no photography, no video is allowed, at least for us and our publication. It looked like Good Morning America got some, and that's a Disney-owned company, and Disney was on stage. Yeah, okay. So it was a weird feeling to be walking up to this building that's picturesque to a degree I can't really describe. There's a field that's hundreds of yards wide and it's just pure grass. And then there's this space station structure, the spaceship structure, that you can see over there in the distance.
And then you look over to the right and there's this big open building with no doors on it. And then there's people with tablets waiting to greet you with your name. It feels like you're at the gates of heaven and they're ready to invite you in to see if your name is on the list. And they looked on the list, and my name was on the list. And so I get to go in and try the Apple Vision Pro. I was about to call it the Quest Pro. It's weird, after this many years of writing about Meta and Facebook, to have another product name that sits in the same category, that bears the same weight, and it's wild. So I waited for about 20 minutes for my demo time. Well, first thing they do is there's a room right at the front where they asked me if I have prescription glasses. I said, yeah, I do have prescription glasses. I don't typically wear them, but I do have them. They said, can we see them? All right, so I hand them my prescription glasses, and they put the prescription glasses on a little machine and apparently must have deduced what my prescription is by putting them on the machine and then putting that into their system. Then I walked out of that room and did a face scan and an ear scan, and then I waited for about 20 minutes for my demo time. I go into the demo. It's a very small room. There's two Apple employees in there. The headset is sitting on a table right in front of me. It's got the battery pack curled up right next to it. And I'm excited. They want to explain to me the buttons on it and how it works. I'm diving in as quickly as I can. They had a very specific path they wanted to walk me through, from app to app to app, of what they want to do. And I'm, you know, pinching my fingers to go into every app I possibly can as quickly as I can, and they're like, no, pinch out of there and get back to the home screen. They had AirPlay streaming the whole time onto the iPad in his hands, so he was watching what I was seeing inside the headset. I just published my hands-on with this experience right before I got on the call with you, Kent, right before we started this recording. And I have to reiterate this moment that I got across on my ride home. Very first few seconds in that headset, I'm looking at my hands and they felt like my hands. I'm looking at my hands through reconstructed computer vision, and I can only convey it with words, like it's always the case with VR in general. But to have my hands there in front of me, to feel that they're actually depth correct, they're bright, they feel like my hands, and to know how badly Quest Pro missed the mark on that, with its low-resolution cameras and what feels like a cell phone video, this felt right. This felt incredibly right, just from the very first second in the headset. And then you get to all the other things, right, that they did: the user interface, all the different types of content that they're envisioning for that first phase. But I just have to convey this sense that they invested in the right places. $3,500 versus $1,000 versus $500. I've been in three Uber rides and the drivers all brought up that you're at the Apple conference, and we had a conversation, and they all said, how much is it? They all asked on their own, how much is it? $3,500. And it's like, oh, that's not for me, right? It's universal, right? It's the universal reaction. And the way I think about that is it's worth a couple thousand dollars to have better vision, at least for some people, right? And that's quite literally what it is.
It's better vision while you're wearing the headset. I'm going to be frustrated every time I put Quest Pro on my head going forward and look through its passthrough compared to what I saw out of the Vision Pro. It's that stark a difference.
[00:17:44.003] Kent Bye: Just to clarify around looking at your hands, because I've seen a number of different pass-through mixed reality experiences. Sometimes you just see your hands through the pass-through video; sometimes you have like an overlay of a digital avatar over your hands. And so was this just a picture of your hands with depth-correct information, or did they have any avatar overlay over the top of your hands?
[00:18:05.484] Ian Hamilton: So no, just my hands. And there were parts where I overlaid virtual content into the scene, and then my hands were getting cut out against the virtual background. And when that happened, there were scenes where I could notice a little outline just around the edges of my fingertips where, you know, it wasn't getting to the exact pixel of my finger. Like there was an obvious outline in some instances if I was moving fast enough. It was a very fast-moving demo; I went through a lot of things very quickly. So yes, it was my hands the whole time. Sometimes it was just my hands with the physical room, sometimes it was my hands in front of virtual content.
[00:18:50.703] Kent Bye: Okay. Yeah. I had a chance to talk to Ben Lang of Road to VR, and he was quite impressed with the integration between the eye tracking and the hand gestures. He felt like it was very intuitive. He said there were just a couple of times out of the whole half-hour demo where he clicked and it didn't detect, and he was struck because it was working so well; he was quite surprised with how seamless it was. So I'd love to hear some of your thoughts, because this is a system that doesn't have any tracked controllers. The only method of controlling it is a combination of eye tracking to be like the cursor, your fingers to be the clicks or to scroll, and some voice interactions to be able to enter in text. So I'd love to hear some of your thoughts on the input scheme, and how that felt for you to have this combination of eye tracking with using your hands to click and scroll.
[00:19:36.853] Ian Hamilton: Yeah, I'm a firm believer after this demo that this is the mouse click of VR, that they did get it right. They nailed the combination, and headsets that don't have this combo will feel antiquated afterwards. Quest Pro has this combo, technically. A couple of AR headsets out there have this combo of eye tracking and hand tracking. I think it's a really glaring, obvious thing that's likely to change in Meta's ecosystem, attaching eye targeting to your pinch gesture. But then again, the Quest Pro doesn't have a depth sensor on it, so its recognition of this might not be at the same level as the Vision Pro. But still, it was fantastic. I noticed myself dropping away the Meta-based affordances that I had learned. So initially, I was holding my hands very clearly in front of my face to do this pinch gesture. And that's because I've used Quest 1, Quest 2, and Quest Pro, and very often the only way to get it to recognize that pinch gesture is by making sure your hand is appearing directly in front of the headset cameras. Over the course of 20 minutes in this demo, I'm leaning back in my chair and I'm putting my arm on the armrest, just like Apple shows in their video, and I'm pinching right here next to my head, and it's picking it up every time. And I'm just looking around this interface and starting to do things, right? I'm looking for... There were a couple of times where I had to get used to, like, okay, I've got to look at the bottom bar and then I've got to look to the corner and then I can pinch to resize things. But there was less fumbling in my first time with this headset than every time in every other headset. That's what "just works" means. Apple has always been praised for its just-works mentality. This did just work. And I can imagine a few people struggling with this idea that their eyes are targeting these objects. But by and large, it was just such a delight to get into this headset and start rampaging through the user interface within seconds; literally 20 seconds before it started feeling pretty natural.
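As a rough illustration of the gaze-and-pinch model Ian describes, here is a minimal SwiftUI sketch. This is a hypothetical example of my own, not code from Apple's demo, and it assumes the publicly documented visionOS behavior: the system resolves which element the user is looking at when they pinch and delivers it to the app as an ordinary tap, without ever exposing raw eye-tracking data. The view and variable names are illustrative.

```swift
import SwiftUI

// Minimal sketch (illustrative, not demo code): the app attaches a normal
// action to a Button and never touches gaze data. On visionOS, a look-and-pinch
// (or a direct poke) triggers that action, and .hoverEffect() asks the system
// to render a gaze highlight privately, outside the app's process.
struct PinchCounterView: View {
    @State private var pinches = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Pinched \(pinches) times")

            Button("Pinch me") {
                pinches += 1   // fires when the user looks at the button and pinches
            }
            .hoverEffect()     // system-rendered highlight while gaze rests here
        }
        .padding(40)
    }
}
```

This is also one concrete reading of Apple's "where you look is private to you" framing that comes up later in the conversation: the hover highlight is drawn by the system, and the app only learns about the selection at the moment of the pinch.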
[00:22:14.611] Kent Bye: Yeah. And talking to Ben Lang of Road to VR, he was saying that the demo was fairly scripted and guided, where you were kind of stepped through each of the different apps and directed very deliberately, with not a lot of room to explore around. And so it sounds like a pretty set script that everybody was seeing. So after you walked out of the demo, I'm curious, what was the one thing that really stuck with you as your big takeaway? I mean, your headline for your hands-on is "Apple Vision Pro Hands-On: Way Ahead of Meta In Critical Ways." And so how would you start to break that down in terms of the stuff that you're taking away?
[00:22:47.544] Ian Hamilton: Yeah. All right. So I outlined in that writeup that there's a piece of this overall demo that I would call the weakest. So the weakest part of this demo, the worst part of this demo, was their avatars, or what they call personas. They call them personas because they're trying to get across the idea that these are linked to you individually and not transferable, right? You can't let someone else wear your persona. Hypothetically, they're going to be scanned by you and owned by you. Meta is fully aware of this with their own technology; they were trying to approach this in their own way with Codec Avatars. I was unsettled, I won't say deeply unsettled, but I was unsettled by the uncanny valley in conversing with another person who said they were wearing a Vision Pro while I was having a FaceTime call with them, and there was just something off about their appearance, right? Think of Leia in Rogue One: she looked great as a reconstructed person, but still just a slight little bit off, right? And it's always been the case with avatars in various forms across all forms of media. You know, having something that you can lean around, something that reacts to the wide range of expressions that a person can have, it's a startlingly hard problem to solve. And Apple hadn't solved it as far as I could see in that demo. Now, I tried a different demo last year with Meta, where I went and tried their Codec Avatars running from a PC, and it crossed the uncanny valley. I loved looking at that avatar. It felt like I was conversing with a person who was thousands of miles away from me, and they were standing right in front of me, and as I'm leaning around, their eyes are moving to match where I'm moving. It felt fantastic, and it felt better than what Apple showed today. The only problem is that Meta demo was done with a person who was scanned by 220 high-resolution cameras, and they had to stand still for hours at a time to get scanned in every possible position, and then it had to process for days or weeks, something like that. That's the avatar I'm comparing against what Apple just did on a completely standalone device, where, you know, this is a scan that happens with the headset, and then it's an avatar presented by the headset and driven by the headset sensors. But you've got to get outside of it, you've got to get past that uncanny valley. That's the reason... Like, I don't know if I would prefer them to the cartoony Horizon avatars that have been made fun of mercilessly for the last year and a half. I don't know which version I would prefer, but I didn't feel comfortable after just a minute of interacting with this avatar. It didn't feel human. It was a weird space. So I have to say that maybe it improves by launch, maybe everything I saw improves by launch, but it still was in a bad place for me personally. Others might not be bothered by it as much as I was. And then just everything else. You have been at this, along with Ben, along with Norm at Tested, for a decade plus, right? We've been at this, believing along with Mark Zuckerberg that glasses are going to be future personal computers. And I can forgive anyone's skepticism of Mark Zuckerberg's strategy; he's certainly gone in a lot of weird directions that didn't pan out. But it's much harder to call us crazy when the largest technology company on the planet is doing the same thing that Meta does. So there's just an incredible amount of validation for this being a reality, right?
Like, for this bet we all made years ago. You know, I've had people ask me, why focus on VR? Why just do this? You're so knowledgeable about other areas of technology, why not just help people in other things? And I can't convey it enough: we are going to sit together in the future, even though we're not in the same room. It's going to happen. And, you know, Apple just legitimized that and showed the world that this is actually going to happen. And it's a very weird thing where, you know, it's $3,500. How many Quests did Apple just sell because it's priced that high, right? It's almost like, oh, wow, I can get a great experience for only $500, maybe I will get the Quest 3, right? It's a weird thing where they've in effect legitimized each other's efforts.
[00:27:58.057] Kent Bye: Yeah. Although there are some distinct differences when it comes to privacy. And that's the thing that I'm still worried about: how much of that lower price is subsidized by a business model of surveillance capitalism, whereas Apple seems to be on the right side of that conversation. However, it will cost people a whole lot at this point, like seven times as much as a single Quest 3 will cost when it comes out. But, you know, just to reflect a little bit on those avatars and the uncanny valley, this is something that's been explored within the academic literature for a long time, where some level of stylization does help to lower that uncanny valley. Because once you get a little bit too close to somebody's image and likeness, but you don't have all those micro expressions or the eye gaze, you know, it just feels creepy. And it feels like they're well within that uncanny valley with what they're showing here. So either going with that more stylized Memoji approach, or, I don't think they can necessarily go up to the Codec avatar, but there have been plenty of different folks within the VR industry that have learned these lessons and gone towards that stylized approach. But, you know, the fact is they almost refuse to even say the words virtual reality. They'll say augmented reality and they'll say spatial computing, but they're almost rejecting so many different aspects of the heritage of what virtual reality even is.
[00:29:17.761] Ian Hamilton: So that is a lasting thing from the demo. There was no artificial locomotion whatsoever in the demo, right? I sat on a couch, and at the end of the demo, when a portal opened on the wall and a dinosaur walked into the room, I said, can I get up and walk around now? Because you've been walking me through this and telling me that I have to do this app, this app, this app. I want to get up and walk around and experience that. And they're like, yeah, okay, you can. And they back up, and I get as close to the walls as I can. And I walk up to the wall and I'm moving really fast to try to test the tracking and everything. And, like, that wall is really there; they're reminding me that the wall is really there. And I'm like, I'm not going to be the first person to run through a wall, a physical wall, in the Vision Pro headset. I'm not going to be that meme. But yeah, so they tossed out 10 years of design progress in teleportation, in vignetting, in developers trying to make really uncomfortable experiences as comfortable as possible. They tossed all of that out the window in this demo, right? There was no locomotion whatsoever. All of the content was presented in the room, with the exception of a dinosaur being shown in a realm beyond, and then like a CG environment that felt like it had actual positional movement for a theatrical experience. So there were some CG environments that I could move around inside if I wanted to. There was a space that existed beyond the wall, the dinosaur realm. But what do good games look like on their system? And when will Apple, it's not an if to me, but I guess it could be an if, but it feels to me more like a when, when will Apple embrace accessories beyond just the keyboard and mouse being picked up and brought into the experience? And will they ever support, I guess it'll be less robust, but will they ever support this other idea of vast virtual worlds that you navigate using artificial locomotion? I don't know. It's one of these things where, like, Apple didn't have pencil support, right? Famously, Steve Jobs said, if you see a stylus, somebody screwed up. And then five or seven years later, they have a stylus. There are other examples I can't think of offhand of Apple just not doing something until they think they've got the right solution. And it's heartbreaking to think of games like Walkabout Minigolf, Puzzling Places. Puzzling Places might work. But Walkabout Minigolf, where you're going around 18 holes and they've got a very comfortable teleport built in there. If they're just against the idea of navigating vast virtual worlds, that's a lot of content that's going to be exclusive to a competitor's headset. And that feels like a mess if that's what happens.
[00:32:18.480] Kent Bye: Well, Apple's always been really strict with their human interface guidelines, and they have not released their guidelines for visionOS. They're supposed to come out later at the end of June. So that will be their opportunity to really set up the bounds for what is and is not possible. I will say that, based upon my conversation with Brandon Jones talking about WebGPU and the history of how that evolved, he did mention that there was a conflict between Apple and the Khronos Group. The Khronos Group runs OpenXR, and OpenXR is the open standard where you have all these interoperable standards for things like peripherals. I actually traced it down to the WebGPU meeting minutes from December 9th of 2019, where there was a representative from Apple who basically said Apple's not comfortable working under the Khronos IP framework because of a dispute between Apple Legal and Khronos, which is private, that they can't talk about the substance of this dispute, and that they can't make any statements for Apple to agree to any of the Khronos Group's IP framework. So essentially, there's this beef between the Khronos Group and Apple, which means that Apple is essentially never going to adopt OpenXR. Right now, Unity had an agreement to be able to do whatever it took to integrate with visionOS and all their frameworks. But it's still an open question as to whether or not Unreal Engine will make an appearance. There is an existing legal battle between Epic Games and Apple, and Epic has really committed to OpenXR as the way that they're going to be dealing with spatial computing in the future. So you've got a little bit of these intractable things that are happening that essentially point to Apple being a pretty closed ecosystem, where they have their existing frameworks and systems. Unity is maybe a backdoor to bypass some of that, but in essence, they're going to be building a lot of this ecosystem from the ground up. And based upon most of the demos, Ben Lang said about 75% of all the demos he saw were interacting with these 2D planes in these spatial environments. So I can imagine that Apple is going to be starting with bringing in all these apps. But the thing that Raven Zachary told me, who's been an iOS developer and has worked with HoloLens, he said that Apple's much more interested in bringing their existing developer ecosystem into this spatial computing than trying to reach out to the existing developers of the XR industry and have them port their games over. So it'll be very interesting to see what those interface design guidelines are and what the boundaries are for what Apple is or is not going to accept. Meta's already been really strict about what they accept on their official store; there's App Lab and SideQuest, but hopefully we won't have a similar situation where Apple is similarly really picky about what can and cannot exist on their platform. There's always WebXR as a backdoor, but I think these are the types of things that, over the next couple of months, as we start to learn more about it, hopefully we won't be replicating exactly what we have with Meta, which is a highly curated system that is essentially eliminating so many of the different possible applications that are out there. I think a lot of developers are hoping that it's a little bit more open. So anyway, I just thought I'd share some of those thoughts. I don't know if you have any reflections on that.
[00:35:32.822] Ian Hamilton: Yeah, that's interesting. I hadn't heard about some of that. I think I saw on Unreal Engine 5 there was a Vision Pro page somewhere listed there. We need to see if there's a story there, but it seems like there might be some level of integration being worked on. But, like, I saw Rec Room listed on Twitter, and that's a wonderful example. So Rec Room was shown in a screenshot on stage. That's an app where, very heavily, you're moving around large virtual spaces. Meta took such grief for their curated approach, and yet the curated approach paid out handsomely to the devs that got through the filter. If you got on the Quest store, the curated approach worked. You made money. If you didn't get through to the store, you were dead. But then even on App Lab, things like Gorilla Tag found markets and found lots of people to play. It's really hard to imagine these flat apps making really compelling AR and VR versions overnight, when these devs have built really cool, vast virtual worlds that are compelling. Like, the Vision Pro looks like the most awesome laid-back VR experience they're going to have, but what's it like when you get up off your couch and want to move around? How does that experience work? I didn't see Beat Saber, right? And Beat Saber is a kick-ass experience on any platform that it's on, right? It works within those guidelines, right? The boxes come into your room. So why can't we have Beat Saber on there? Is it Meta stopping it, or is it Apple's guidelines? And that's going to be something we're probably going to come back to over time.
[00:37:28.932] Kent Bye: Yeah, I think as it starts to open up, definitely when it starts to launch, it'll be a bit of a test to see if some of the different experiences that are under the publishing arm of Meta start to come over to this platform. I hope that it does, but I guess we'll see. But I did want to ask a couple of other questions around the specific demos, because I know that there is both the spatial memory capture, as well as like showing kind of a 180 video format, but also like watching 3D movies. Love to hear some of your other impressions of things that stuck out for you when it comes to the demo that you saw.
[00:37:59.639] Ian Hamilton: So, I liked watching Avatar 2 in 3D. Yeah, it looked great. I would watch it in 3D in that headset from start to finish. I didn't go to see the 3D version in theaters because I know that it affects brightness and I don't like the projection system very much. I will enjoy watching it in SharePlay in a headset. The immersive memories, they don't have positional tracking. I couldn't, you know, lean around the content. The 180 immersive video was NextVR; that had to be NextVR's technology, now being called immersive video. Again, you can't lean around it, but they're doing best-in-class clarity on what you're seeing. So you can go courtside and have this great live content. It's weird to come full circle back to the Oculus Go days. I think you and I had some battles over the value of Oculus Go content, 360 content. Yeah, it's okay, it's good, but it's not... I'll be taking the spatial videos and photos with the headset when I have it. I will be trying to record things that way. But what I really wanted was the live call, right? I want the live call with real avatars. That's the dream, the ultimate promise of virtual reality, the question you ask every single person on your podcast. The ultimate potential of VR is to collapse distance and bring us together, even though we can't be together physically. And the spatial memories get you part of the way there. The 180 video gets you part of the way there. And the avatars, or personas, that they're showing now get you only part of the way there. Everything optically, the UI, the gestures: phenomenal, out of the park from Apple. But it's still clear we are a couple of generations away from getting to this holy grail moment where people maybe consider a VR headset instead of a phone in some circumstances. Like, there are certain circumstances where a person would want a Watch right now instead of a phone. You could get, like, a different phone or a stupid phone, but get an Apple Watch that has cellular and take it with you everywhere. There are certain instances where you would want to do that. We're still a long way away from where you get a VR headset as your spatial computer, as your computer instead of a laptop or instead of... That was the question I kept seeing on Twitter, right? Am I going to get this instead of a laptop? Even with all the benefits that Apple has brought here, it's still hard to imagine it replacing a workhorse device. It's just ever so much closer than Quest Pro and Quest 3 and 2.
[00:40:52.465] Kent Bye: Yeah, I suppose it kind of depends on what kind of work you're doing, and how much the toolset lines up with your day-to-day tasks on whatever computing device you're using. Just a few words on NextVR: I think the live streaming of courtside, different live sports events, that actually could be quite compelling, if it's at a high enough resolution and if they have integrated NextVR. I can definitely see that as being a really compelling use case. And just for 360 video, I want to give a shout-out to all the amazing 360 video creators. The winner at Venice in 2022 was a Taiwanese production called The Man Who Couldn't Leave. And there's another amazing piece called All That Remains. And there'll be another couple of 360 videos premiering at Tribeca. So it's a medium that actually has not died and actually has a thriving existence for pushing forward what's even possible with the storytelling art. It typically doesn't catch the attention of folks like UploadVR or Road to VR, but in terms of the film festival circuit, it's a thriving medium, and I expect to see even more once there's integration between the capture and the playback. I think the fact that you could capture a spatial memory and potentially play it back, I mean, you're not going to be capturing a full 180 video with some of these, but the fact that you can at least capture spatial memories on the same headset. I'd say the thing about memory capture is that it often isn't really valuable until like five or 10 or 15, 20 years later when you look back on it. So when you're taking something in the moment, it's not until things have changed so much in your life and you're looking back on it... I think it's kind of an understated thing that people may be capturing these memories, but they won't perhaps fully value what those features mean until another five or 10 years, once they're looking back at all of them.
[00:42:42.752] Ian Hamilton: So we talked about all these different formats, right? Going back to my clips history, there's an article I wrote called Virtual Reality Is Not Just a Game. That was the headline. And I'm sure that I've seen that same headline on at least two dozen other articles over the years. It's a very, very easy one to write, right? People understand VR is for games; they don't understand what else it's good for. And in that story, I was talking to NASA, and I talked to Dr. Jeff Norris at NASA and had a phone call with him. I remember where I was when I had this phone call, because it opened my mind up a little bit to the possibilities. And he talked about the television being the right medium for broadcasting the moon landing. You can have 30 or 100 people gathered around one tiny flat screen in a room, and they can all get the same live content beamed directly from the camera on the moon. And his comment was that it's no longer the right medium. There is a future where we land a ship on Mars with humans on board that ship, and the camera systems outside the ship scan the surface and beam back the geometry of the surface of Mars to Earth. And then the astronauts step out of the capsule and down the steps, and that portion of it is beamed back with an eight-minute delay, or whatever it is, a six-minute delay, a 30-minute delay, I can't remember how long it is. Eight minutes to the sun; I can't remember how much it is to Mars. So there's this giant delay. But at the other end of that data stream, you could walk around the astronauts in your room. You would have the astronauts stepping down onto the lander, you know, onto the surface of Mars, and you'll feel as if you're standing on Mars with them to witness that moment. And it's all done with the same reality reconstruction algorithms: the ones that are reconstructing my hands in the Vision Pro are going to be doing that on the surface of Mars. And we will have that very real feeling of being able to witness history first person, no matter where it's happening in our solar system. Dr. Jeff Norris went and got hired by Apple. He was one of those people that got sucked into the mothership and went quiet on me.
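For reference, the one-way light delay Ian is reaching for works out roughly as follows; these are standard astronomical figures rather than numbers from the interview:

$$ t = \frac{d}{c}, \qquad t_{\text{Earth-Sun}} \approx \frac{1.496 \times 10^{8}\ \text{km}}{3.0 \times 10^{5}\ \text{km/s}} \approx 500\ \text{s} \approx 8.3\ \text{minutes} $$

The Earth-to-Mars distance swings from roughly 0.4 AU at closest approach to about 2.5 AU near solar conjunction, so the one-way delay on that hypothetical Mars broadcast would range from roughly 3 minutes to over 20 minutes, depending on where the two planets sit in their orbits.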
[00:45:16.828] Kent Bye: Yeah. Yeah. I remember doing an interview with him, I think back in like 2016 at the Unity AR/VR Summit, where they gave away all these free Vive headsets. But yeah, I guess as we start to wrap up, I'd first love to hear some of your closing thoughts on what you think this entry of the Apple Vision Pro means for the overall XR industry.
[00:45:39.774] Ian Hamilton: Yeah. It's validation that you were right all along, no matter if you were in this for a year or 20 or 30 years. It's a big moment where a lot of people that dismissed this are going to wake up and realize, yeah, this is actually happening. It's also a challenge to make the right choices again, right? We all remember the VR winter of 2017 to 2019. Investors turned their noses up at this technology. And Meta and Apple are companies, right? They're not going to be handing out money left and right to anyone that comes and asks them. And they might ask for weird stipulations. They might ask for exclusivity that makes it impossible for you to make money on other storefronts. And you as a developer, you as a creator, have to decide, okay, what platform am I targeting first? How much money do I need to actually deliver the project that I'm planning? There's a long list of very skilled developers who have done the work and know what's good and what isn't. But you've got two very different platforms, at least two, right? You've also got PSVR, you've also got PC VR to consider. And there could be Google and others that pop up again very soon. It's not easy. Just because Apple has validated this market in a very, very big way doesn't mean that you're going to make money by putting an app on their storefront. It doesn't mean that they're going to sell enough of them in their first year to mean anything to people. In all likelihood, Quest is going to have a very, very big year at Christmas, and that should be your path as a dev. Get on the Quest store, do App Lab, maybe get a Quest Pro so you can test the eye tracking with the pinch gesture, and then consider Apple when a little bit more is known about it. But going all in on Apple just because it's the best device out there is overly reductive. It's like you're not understanding all the nuances of just how hard it's going to be to make money with the right kind of product. Raven Zachary, I think, is a good person to look to for how they found a market out of ideas that weren't necessarily consumer-facing. That might be what the Apple device is for a little while. And I also just want to contextualize the price in the history of personal computing, right? The first computers were priced like this Apple headset, and our computers today are a million times better than those computers were then. And yes, they're cheaper too. This is the first of a new class of computer. I will want one. Even as ridiculous as it is, I will want one in a box next to my original Quest and original Rift, because it does mark such a significant moment in technology history.
[00:48:51.610] Kent Bye: Yeah, I wanted to ask a follow-up in terms of the clarity of being able to read text, because I feel like we have something like a total of 60 pixels per degree for our retinal resolution. And Carl Guttag of KGOnTech, I had an interview with him at Augmented World Expo, and he was talking about the minimum number of pixels per degree to be comfortable for reading text; around 40 pixels per degree, he was saying, is the threshold, and he was estimating that this device may be crossing that threshold. So it feels like it may be at a high enough quality to be a viable screen replacement. Carl is a little bit skeptical of that in terms of the vergence-accommodation conflict and eye strain. You were only in the headset for 30 minutes, so I don't imagine that you were able to give a full accounting of that, but I'm just curious what your intuitive gut is in terms of whether or not you feel like this device could be a viable screen replacement, and how comfortable it was to actually read text in the device.
[00:49:47.438] Ian Hamilton: Yeah, I hate that question because it's a good one and it's hard to answer. And I've gotten roasted for looking at these things as personal computers before, right? Like, if you go and get a nice three-monitor setup with a 144Hz refresh rate, or get a multi-thousand-dollar, even a thousand-dollar, super-panoramic Samsung display, it's going to look great on your desk compared to this. The clarity of Quest Pro was a huge step up, like the lens edge-to-edge clarity. I feel good reading text in Quest Pro. I would wager I would feel even better in this headset. But the only moments I had to really read text were when they were showing me multitasking, and they showed a Safari window next to two other windows, and I was, like, cussing out Meta's UI paradigms because it just felt so good to pull these windows around in front of me and resize them really effortlessly. But that's not the same as sitting there and trying to read a couple of chapters of a book. I think it'll be comfortable enough for that, but it's still going to be one of those things: Apple's working on 4K and even higher resolution displays across their line, and it's hard to say whether you should go get a $3,500 VR headset or a $4,000 display system that's purpose-built for whatever your task is, HDR video editing or coding for 12 hours straight. It's hard to... we're going to need to spend significant time with it before I can say anything intelligible.
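To put rough numbers behind the pixels-per-degree figures Kent cites above, the basic arithmetic looks like this; the per-eye resolution and field of view used here are illustrative estimates for a Vision Pro-class headset, not official Apple specifications:

$$ \text{PPD} \approx \frac{\text{horizontal pixels per eye}}{\text{horizontal FOV in degrees}}, \qquad \frac{\sim 3{,}600\ \text{px}}{\sim 90\text{-}100^{\circ}} \approx 36\text{-}40\ \text{PPD} $$

That is how a headset in this class ends up near the roughly 40 PPD reading-comfort threshold Carl Guttag describes, while still sitting well below the roughly 60 PPD often cited as retinal resolution.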
[00:51:29.642] Kent Bye: Well, that's what I figured, but I thought I'd ask, because that's, I think, the biggest question for one of the use cases that they're advocating for. If it is a viable screen replacement, and a $3,500 headset gives you just as good an experience as a $5,000 monitor, then it's actually cheaper to get the headset. But, you know, I imagine that the $5,000 monitor is actually going to be better at being a monitor. But I don't know.
[00:51:57.930] Ian Hamilton: I mean, I bought a MacBook that's, what, $1,200, because I needed to have... I tried to use an iPad, right, for my on-the-go work going to conferences, and it just wasn't quite there. There are too many windows to move around, and too much research I need to do. And you probably do this all the time, where you've got three windows up, and you need to reference what's said in this window against what's said in that window, with the window that has all your notes. And a 12-inch display feels super constrained. A 13-inch display is super constrained. 15-inch, yeah, you get more space, but then you've got this bigger, bulkier device. But then it's amazing to think about this idea that, floating right above your main display, you can have three more giant ones, and you can pull these windows right out at any given moment. It's a cool idea that I have yet to do in practice in any way, shape, or form that's like, yeah, I'm going to be doing this on the regular. I know there's Spatial and Immersed and Bigscreen even claiming serious usage times, and Bigscreen Beyond is a special use case where they're pushing the PPD. I don't see it yet. I don't see the use case where, like, I have this happening all the time with my 12-inch, where it's like, I really wish I had two extra monitors, but I'm portable right now and I don't have them. It doesn't make sense for me to also pull my headset out of my bag and put that on, and now this display goes black and I've got private displays to interact with and do my research like I described, right? I've got notes over here, two bits of text that I need to reference. I don't see bringing around a battery brick, a headset, and a laptop anytime soon.
[00:53:42.806] Kent Bye: Yeah. Yeah. Like I said, it probably depends on what your work actually is, as to whether or not this is going to fit your workflow. So I'm sure for some folks it's going to be perfect, especially if they do something like spatial design. But, uh, yeah, just a final question: what do you think the ultimate potential of spatial computing might be, and what might it be able to enable?
[00:54:06.042] Ian Hamilton: Yeah, I want the holodeck, and we're so close. A room you can go into and have the scenario written to your exact specifications, to relieve you from the stress of any day, make it as challenging or light as you want. That's what I want. I think the last time you asked me this question, I talked about how much control, and the way you can be controlled, using these devices. I thought it was a very interesting pitch in Apple's language of saying, where you look is private to you. And I'm going to want to compare that to the thinking around Meta and your gaze and how that's correlated or analyzed. Yeah, distance collapsing and being able to watch things and experience things together, even when you can't be together. Like, I just overdosed myself on the quality of real-world physical travel, right? I got to see some amazing places, physical places, feel the wind on my arm going down rivers and stuff like that. And I know that VR is not going to get there anytime soon. It may never get there at all. But the idea that you can get to 90% of that experience and be there with three other people who aren't even present with you, that does feel achievable. And that also depends on me.
[00:55:32.839] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the broader immersive community?
[00:55:38.444] Ian Hamilton: Read UploadVR.com and please consider becoming a member. I've always appreciated your model of Patreon support, Kent. I know I've talked to you over the years, thinking about whether I needed to do a Patreon. Well, we went and added a membership to our site that removes ads. We don't have many more benefits at the moment, but you can comment on our articles, and we'll reply to your comments, because we're trying to make that a very positive place to communicate and get information about VR. We will try to get your questions answered. We have a limited amount of time to do all that, so it makes sense to make the comments members-only, so that we can respond to people and really focus our time on getting more information out to those people. As well, if you're paying to be a member, there's a very good chance that your question is relevant to more than just you. So we're trying to answer those comments, and you can get an ad-free experience. The site is super clean. And yeah, we're following your lead a little bit there, Kent. It's a nice model in my head to have direct support of journalists, because, you know, Meta and Apple both, Google, all of them will have immense control, or immense strength to control our lives, to dictate what we see, to filter our reality for us. And we need to be on them, and you can help us do that. We can get more people at this. I want bureaus covering every aspect of virtual reality and augmented reality, and by becoming a member and supporting us directly, maybe we can get there.
[00:57:17.040] Kent Bye: Awesome. Yeah. Just to shout out that member support model: it definitely relieves a lot of strain and pressure, and opens me up to be able to do the coverage that I do without fear of losing advertisers or whatnot. And the sad reality is that it ends up being around 0.5 to 1%, up to maybe 5%, of all the total listeners that are going to convert. But if you are in that small percentage of folks that see the value, yeah, please consider either becoming a member of my Patreon or becoming a member of Upload, because I think it's a model that's going to help free up folks like us that are trying to cover this industry to be able to not hold back and to not...
[00:57:56.794] Ian Hamilton: I don't want to think about clicks, right? Yeah. You don't want to decide who to interview based on how many views you think that podcast is going to get. I don't want to do the same thing on my side. I want to go where my heart tells me there's news, where there's something of relevance to a lot of people, and membership does that for both of us. So yeah.
[00:58:18.236] Kent Bye: Well, awesome. Well, and thanks so much for doing the due diligence to get an invite in the first place and to be on the ground and to bear witness.
[00:58:24.922] Ian Hamilton: I'm sorry. I'm sorry it didn't happen for you, Kent. I, uh, yeah, my heart goes out to you. I'm sorry.
[00:58:31.048] Kent Bye: Yeah, it worked out, because I'm here remote and I don't know if I would have talked to as many journalists as I have. I talked to you, I talked to Ben, I'm going to talk to Scott Stein from CNET. So I'm getting your first-hand reports, but I'm receiving them as a second-hand witness. But I'm grateful for you to be able to be there, to have your first-hand experience of all of what's happening in the industry, and to bear witness to this moment, because I do feel like it is a historic moment that I wanted to capture. So thanks for taking the time to both share your perspectives and to, you know, talk about your experiences and your journey into this whole space.
[00:59:05.320] Ian Hamilton: Thank you. Talk to you soon.
[00:59:07.422] Kent Bye: So that was Ian Hamilton. He's been the editor at UploadVR since 2015, and he had a chance to get some hands-on time with the Apple Vision Pro on Tuesday, June 6th, 2023. That was the day of his demo, and he did his writeup, and then after that, I had a chance to do a whole deep dive unpacking and debriefing his experiences. So, yeah, just some quick follow-ups, and then I'll move on to the last of this series. You know, I just really appreciate Ian's reflections on what this means for the larger XR industry, and this kind of moment of validation for what folks have been working on. And I think the caveat there is that we still don't really fully know what Apple is going to do with their vision for how they're going to be integrating the more virtual reality components of mixed reality, extended reality, augmented reality. I mean, they're really emphasizing this phrase of spatial computing, which was popularized by Magic Leap in how they were thinking about this kind of mixed reality experience. But it even predates that. Avi Bar-Zeev on LinkedIn was saying it goes back all the way to the early 90s, with some of the first uses of the phrase spatial computing. Either way, Apple seems to be extremely allergic and resistant to even acknowledging and referencing that what they're doing is virtual reality, even though it pretty much is. It's virtual reality with mixed reality pass-through, but they're calling it spatial computing. So what kind of locomotion are we going to have? What's going to happen with the existing game systems? And yeah, are there going to be ways to do controllers at some point? All that is yet to be seen. I think obviously there's a lot of concerns around motion sickness and what happens with locomotion in these virtual environments, which certainly gives lots of folks different degrees of motion sickness, and their approach is something that's very safe for the majority of people. But there are going to be people who have accommodated to virtual experiences that will want to have something a bit more. So it'll be very interesting to see what they allow and what they don't allow when they release their human interface guidelines for visionOS later this month. So yeah, just to touch on a few points that Ian was making: the avatar that they have, well, I guess they call it the digital persona, because they're not having anything to do with the language of virtual environments and avatars, but the digital persona is supposed to be a reflection of you, and Ian really thought it was super uncanny and didn't resonate with it at all. He preferred the hyper-realistic aspects of the Codec avatars from Meta, although, you know, there are accounts that I've heard where people still feel like there are certain elements of uncanniness. I think the safe way to go is just a complete stylization, more of the Memoji approach. But, you know, what Apple is trying to say is that they want to make it so that they're preserving different aspects of this mixed reality and spatial computing and, you know, connecting to people's image and likeness. But I feel like it'll be very interesting to see if they actually adapt when they launch this sometime next year, based upon some of the other feedback that they get from other folks. You know, people from within the XR industry have a fine-grained approach for being able to identify what they do like and don't like, and I think this is a thing where it's in this uncanny region, what they're doing with the digital personas.
So yeah, but overall, I think there are a lot of really positive takes on this completely different approach to extended reality, spatial computing, virtual and augmented reality with mixed reality pass-through. So yeah, it'll be interesting to see how things continue to develop. I recommend going back and listening to the conversation I had with Raven Zachary and Sarah Hill, because Raven, I think, really nailed it by saying that Apple is much more concerned about bringing all of their existing developer ecosystem into the first steps of spatial computing through this 2D interface than they are with trying to integrate all the needs of folks that are already fully immersed within the virtual reality and extended reality community. Where they meet in the middle, I think, is going to be something that's really fascinating to see, as we continue to evolve what happens with hand and eye interfaces, as well as speech, to be able to do natural language processing as you're interacting with these virtual spaces, or I guess spatial computing is the preferred term from Apple. So next episode, I'll be diving in with Scott Stein. If you missed the previous episode, I have a conversation with the editor of Road to VR, Ben Lang. And then the episode before that is with a couple of XR developers who were at WWDC, to get some of their first impressions. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue bringing you this coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.