#1386: Chatting about Apple Vision Pro with fxpodcast’s Mike Seymour

The contributing editor and co-founder of fxguide, Mike Seymour, invited me onto his fxpodcast to share some of my thoughts on the Apple Vision Pro, the ecosystem differences between the Quest and Vision Pro, the potential for different killer apps, and where we see it going in the future. This is a rebroadcast of fxpodcast episode #368, but with some additional context about Seymour’s work with digital humans as well as the training application that he’s developing on the Apple Vision Pro.

I still see the Vision Pro primarily as a developer kit, but there are certainly many productivity, screen-replacement, and media-consumption use cases. I’m hoping to do some more coverage of the Vision Pro here soon, including airing an interview with Resolution Games’ Tommy Palm about their development of the Game Room application produced by Apple.

Apple announced yesterday that it will unveil some new products on May 7th, likely including the Apple Pencil 3, given Tim Cook’s X/Twitter post saying “Pencil us in for May 7! ✏️ #AppleEvent,” as well as “revamped versions of the iPad Pro and Air, according to people familiar with the matter,” per Bloomberg’s Mark Gurman. I expect more Apple Vision Pro updates and news at WWDC on June 10-14, and a first look at how the broader XR industry is adopting the Apple Vision Pro at Augmented World Expo (AWE) on June 18-20, which I unfortunately will not be able to attend due to a family medical situation.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com. So in today's episode, I'm going to be airing a conversation that I had with Mike Seymour a couple of weeks ago. He wanted to do an interview with me about the Apple Vision Pro to get some of my thoughts about what's happening with the ecosystem, what are some of the different potential killer apps for the Apple Vision Pro, and also where I see it going here in the future, comparing it to what's happening with Meta and the Quest ecosystem. So Mike is the co-founder and contributing editor of fxguide, which is a website that covers a lot of special effects and what's happening with the special effects industry. He's also got the fxpodcast, where this conversation is also going to be airing. In this version of the podcast, I ask a few more questions just to get a bit more context on Mike and his background, since he's both a university lecturer at the University of Sydney, as well as works on different artificial intelligence and neural rendering projects, and did a whole presentation at Laval Virtual last year around virtual humans. And he's also working on his own Apple Vision Pro app around training, which I had a chance to ask him a bit about as well. At the end of the day, I still think the Apple Vision Pro is going to primarily be a little bit of a developer kit. So I think some of the most interesting applications are going to be either in the enterprise space or training applications. And it's going to take a while for all of the potential for the Apple Vision Pro to come out. I think we're still at the very beginning. Most of the people that I see using it on a pretty regular basis are using it more in a productivity and a screen replacement context. Tim Cook, the Apple CEO, just tweeted yesterday that there is going to be an Apple event. He said pencil it in, so I'm expecting to see the Apple Pencil 3 be announced, and who knows if it's going to have any type of spatial capabilities and integrations with the Apple Vision Pro. And then the developer conference for Apple is coming up here on Monday, June 10th to June 14th. And so I expect to hear some more announcements around either operating system upgrades or at least another minor release with more features and capabilities being unlocked at WWDC. Since that's going to be all the developers coming together, that's usually where they announce these new types of capabilities. So expect to see a lot more information coming out here in June for Apple. And also at the Augmented World Expo that's coming up here on June 18th to 20th, expect to see lots of different applications for what's been happening with Apple Vision Pro. I unfortunately will not be able to make it. I have a family medical situation that's going to be limiting my travel throughout the course of 2024, but I'm looking forward to seeing what kind of new things are going to be announced or talked about there at Augmented World Expo, at least remotely tracking all the different events that are happening there coming up here in June. So that's what we're covering on today's episode of the Voices of VR podcast. So this conversation that I had with Mike Seymour happened on Monday, April 8th, 2024 for me here in Portland, Oregon. And for Mike, it was actually April 9th there in Sydney, Australia. So with that, let's go ahead and dive right in.

[00:03:09.058] Mike Seymour: I'm Mike Seymour. I have a research lab at Sydney University where I basically look at digital humans. That's my kind of core thing. And in that respect, we're interested in having, I guess, a more human interface on computers. And so from a background in media and entertainment, I still do a lot of stuff in media entertainment. I'm a co-founder of an AI company that looks at doing neural rendering and facial conversions. But my primary interest is, hey, how can we interact with this computer in a different way? And I think a digital human or a human presence gives you a different type of interaction with a computer. And we've been really fascinated to explore that. Gosh, we've been doing this now for a while. We found lots of interesting aspects to how people actually engage when you put a face on a computer versus what they think they'll do when they're predicting it. Often when they're predicting it, they're like, yeah, that'd be creepy, or I don't want to do that, or that would be dumb. And then when they engage with it, they find it, well, engaging. As a consequence, it has some real benefits. So it's tech that we enjoy exploring, but for the end use of making it better to kind of interact with technology.

[00:04:18.073] Kent Bye: Nice. And maybe give a bit more context as to your background and your journey of working in this space.

[00:04:23.715] Mike Seymour: Yeah. So I came from the film and TV industry and visual effects, and in visual effects, one of the cutting edge problems was how do you make a digital human? And when we started that process, it was very much what you might call a traditional computer graphics problem, which is model, texture, light, and render. And so we did that, and we did that both just as you'd expect as a video, but then we started getting into interactivity. You and I were together at SIGGRAPH in 2017 when we were presenting the Meet Mike project, which was having a digital human talking to another digital human, notionally in Sydney, while we were both sitting in the LA Convention Center. And in that space, we were seeing a VR representation of me in their space and a visual representation of them in my space. And so what that allowed us to look at is, hey, how would people interact if you had a high fidelity, high quality digital human? And we really mean high quality. At that stage, it was sort of bleeding edge, and we had, I think, nine people running the computers behind the stage. But we were interested because there is a sense sometimes that just more tech is better. And we were wondering, would there be more of a sense of presence and trust if you had a highly realistic human? Or is that just sort of a guess that we always made, that it would be better to have it looking realistic? And as it turned out, people were more trusting and more accepting and had more affinity with a highly realistic digital person, one that looked much more like you and less like a cartoon figure. Cartoon figures are good avatars, and memes and stuff are fun, but if you want to have actual presence and trust, then having a high fidelity face representation worked. What was interesting, though, is we had a little bit of machine learning back then on reading people's faces to have real time interaction, which was running back then at like 60 frames a second in stereo in VR. We then shifted gear a couple of years later to a model that isn't a classic CGI model, but an inferred model. So it was much more machine learning, if you like, AI. I tend to use machine learning rather than AI because there's too much baggage with that word. And so a lot of the work we did then was around neural rendering. So for example, we did the first film where we converted all the actors who were previously looking like they were speaking a foreign language like Polish or German to look like they're speaking in English. It's not actually face replacement because it's the same actor's face. We're not replacing their face, but it made them look like they were speaking lip sync perfectly. And we did a multi-year contract with one of the big streamers, and we've done four feature films and some major series with them. And we also did work on hybrid models. And now we're doing stuff that's touching on 3D Gaussian splats and NeRFs and stuff, which I think will play long-term into AR and VR. It's a really good way of having digital humans. But right now there's a more immediate problem, which is how can we get a handle on the particular characteristics of something like the Apple Vision Pro?

[00:07:19.834] Kent Bye: So yeah, that's a lot of really amazing stuff that you've been working on, especially with the Meet Mike project that you had back at SIGGRAPH, and being at Laval Virtual, you know, seeing a lot of the stuff that you were working on. It was pretty mind-blowing just to see the evolution of all of this AI interfacing with XR technologies. I think the type of language replacement and the types of stuff that you've been working on is really at the cutting edge of seeing how all those technologies have been developing, and how you're tying that together in the special effects realm, but to do something that's very pragmatic, which is doing that type of dialogue replacement. And yeah, I was just really blown away by the presentation that you gave at Laval Virtual, just kind of recapping all of the work that you've been doing.

[00:08:01.810] Mike Seymour: And then- I was going to say, though, your presentation at that same conference was equally inspiring. I was also really interested in your multidimensional analysis of the issues that you were talking about. And I actually wished yours had been like an hour long, because I felt like 15 minutes hardly did you justice. But what I wanted to talk to you about today, given ... I mean, you were kind enough to interview us back when we did the Meet Mike thing back at SIGGRAPH and stuff, and so I've been listening to your podcast for years, is your perspective on what's happening in the VR, AR space in light of the Apple AVP now being out. The Apple Vision Pro seemed to change the conversation at a consumer level, because there seemed to be a sentiment shift from, well, VR is dead and it's not going to take off, to suddenly, oh my God, this is the hottest thing and it's completely changing things. But I guess I'm just curious to start with your take. Do you think that there's been a fundamental shift in the perception of this general class of technology because of Apple's involvement, or do you feel like it's just, I was misreading the market?

[00:09:08.597] Kent Bye: You know, a question like that is really difficult and almost impossible to answer with any specificity, because we don't really have any numbers or any way of putting an objective quantity on the state of the XR industry. What I can say from my perspective is that Apple coming into the mix definitely creates a bit of competitive pressure against Meta, which has been really holding the fort down with XR since they acquired Oculus over 10 years ago now. And so they've been in the game and taking a completely different strategy, which is to try to create something that is affordable and trying to really push at the lever of gaming, starting with immersive embodied interactions first. I did an audit of the Quest store, which has been around for 4.8, 4.9 years now. It's coming up on its fifth year anniversary in May. But around 68 to 70% of all of the apps that they've released are games, which means that most of it is being made in Unreal Engine or Unity. And if you look at a similar kind of sample, I took a look at the first thousand applications from Apple, and it was the opposite: around 68 to 70% were applications. I think with Apple, they're coming in with the 2D frameworks, and they're starting with windows and frames, and then eventually they're going to be adding more and more spatial embodied components. But you can see how, just from their frameworks, it's got a completely different design philosophy. And in some ways, Apple is going to be adding more and more spatial dimensions over time. And I don't know how well Meta is going to be able to add the 2D dimensions, because Apple, with their iPadOS and iOS, have such a streamlined operating system that they're building off of, and they're able to do these vertically integrated systems in a way that Meta really can't match, because Meta is building on Qualcomm chips on top of Android, and then they're having to put their own skin on top of everything, and then they have Unity apps on top of that. And so you have a lot less synchrony across the software stack on Meta's side, but on iOS, you have something that's much more vertically integrated. So I suspect that right now visionOS and the Apple Vision Pro are mostly a developer kit, so a lot of developers are just trying to see what's even possible. So because of that, I expect it to have much more of a long-range impact than anything where we can really say this or that has been shifted. I can say, just from using the Apple Vision Pro, that their user interface and their user experience is so much more streamlined than what I'm getting on Meta's apps. Plus you get over a million different iPad and iOS apps that are able to be integrated as well. So I think that the use case of the Apple Vision Pro is mostly like media entertainment, watching 3D videos, but also like workplace productivity, like a screen replacement. For the people that I see coming back to it again and again, it's the applications where you're able to actually do your work and be more productive, whether it's more screen real estate or you're just trying to use all the affordances of the different applications together. But I know for my own use, I'm not completely bought into the Apple ecosystem. And so I don't have a MacBook Pro or any of the existing Macs. I have a Windows machine, a laptop. And so I feel like the more that you're bought into the Apple ecosystem, the more benefits you're likely going to get out of the Apple Vision Pro.
It doesn't fit into my day-to-day workflow, but I have it just because I want to keep track of what's happening within the ecosystem.

[00:12:46.169] Mike Seymour: Yeah, I guess I am that guy with the MacBook Pro and the Mac Studio and the iPad and the iPhone and everything else. To your point, I think that really does benefit the Apple system, not just from what you pointed out, but for example, in creating content, the iPhone 15 has been great because it actually does really good spatial capture. It does computational photography, obviously, to get the interocular that you would need, but the dual lenses record, and it plays this back in the AVP really, really well. I mean, stunningly well, actually, like more than I thought it would. And that's an easier in for many people than getting an R5 and putting a Canon dual lens on the front of it. And it's certainly a more cost-effective solution that way. I'm really interested in your perception of what you were saying, because I think a lot of people discuss Meta versus Apple in terms of the screen resolution and the specs and the price. And I'm glad that we're not going there because I don't think that's the most interesting conversation. I think the more interesting conversation is, as you've started, what the use cases are, and also the perception of it and its perceptual, environmental and experiential aspects. So if we can delve into those a bit more, you said that you thought that at the moment it was working well for productivity, but I guess for a product like the AVP to work, people have to have a reason to put it on. And I question whether that yet is just having more screen real estate. I feel like there's nothing wrong with my screen right now. It's really good. My computer screen is in front of me. It's easy to switch on. I don't have to do anything. There's no concept of having batteries or worrying about anything. So then having that in a virtual space doesn't in and of itself seem like a really compelling reason to keep going back. It's a great initial thing to play with, but it's like, do you really feel like just expanding the workspace is enough to justify the almost cognitive effort of having to get out the gear and put it on?

[00:14:49.108] Kent Bye: Well, I'd say first of all, the ergonomics for me is something that I had to fix in order to even make it comfortable to wear for long periods of time. And so I had to put Velcro straps over the top of it in order to even wear it for any extended amount of time. But after I got the DIY ergonomics sorted out, I started experimenting a little bit more with editing my podcast in a virtual desktop. So this is a peculiar use case, because in order to edit my audio, I need to have very low latency as the playhead is scrolling by; I want to have a high enough frame rate. And so even just a delay between pushing the space bar and having it continue to play can make it a lot less efficient. And so I started to play with both the screen real estate and virtual desktop affordances, but also with completely immersing myself into the Apple Vision Pro. And then I started playing with similar virtual desktop applications within the Quest, because with the Quest you can tether it, you can have Link, you can have SteamVR, and there are other players like Virtual Desktop to share my Windows desktop in a way that isn't as easy as just doing a screen share, like I could if I just had a Mac. On top of all the money of buying an Apple Vision Pro, there's another cost: as anybody who's tried to go through the process of buying a Mac laptop knows, it's like, okay, do you do this version? Or do you spend a little bit more to amp it up? And then, oh, by that time, you should go up to the next version. And then by the end of it, I was looking at a Mac of like $3,000, when my budget was much lower. So I'm sort of waiting to fully immerse myself into actually having a Mac laptop, because I do think that having a Mac laptop would make so much more sense for that productivity use case. But that all said, I feel like in my experiments of using VR as a way of doing productivity, there were some times when my attention was just wandering, and I just needed to kind of close out the world. I work from home, I work in my living room, and I just wanted to really focus. And I felt like VR was able to get me into that flow state of productivity in a way that I didn't always experience. And so if you're someone who's working from home and you are working on your computer all day, then there may be some use cases where people jump in and just want to get into a flow state, especially if it comes to answering emails or doing social media. Also, you know, I did buy a Magic Keyboard and a Magic Trackpad, and that made a huge difference in order to actually make it feel like a functioning, self-contained entity of a computer. You know, without those and without a Mac laptop, you'd be hard pressed to do anything that's quite productive. So the trackpad and the keyboard are, I think, essential if you're actually going to use it for any type of productivity use cases. So it really depends on what your workflow is. For me, I needed to have very specific performance to use Adobe Audition. What I ended up finding was that in order to run VR on my laptop with Adobe Audition, it was making all the fans run; it was basically pushing it to its limits, and it wasn't at a level of performance that I would get from just using my laptop on its own. And so I could use my VR machine, and that would be a lot more high-powered, but then I'd be in a different location where it's not as comfortable to work in.
So it's something that I think is going to play out depending on whatever an individual's specific use case for using it would be. But I do think it helps if you're doing something like coding or programming, where having a lot of additional real estate ends up allowing you to be that much more productive, with all that information right there. So yeah, I've seen a number of different applications to extend it from the default of one screen into multiple screens, or even some applications that are able to start to break single applications out. In some ways, Safari is a bit of the killer app, because you can start to bring in lots of different web-based functionality. Again, it depends on your workflow. If you're doing Slack or Discord or whatever else, you can either bring in the web application or the iPad application, depending on what you're doing and whether you have to switch context or have a lot of that information available. For me, I haven't gotten to that coding flow state yet, but I do know people who are using it for programming and coding and using it each and every day.

[00:18:59.742] Mike Seymour: One of the things that you touched on earlier was the idea that Apple's kind of coming at it from something more of a 2D point of view. I want to just zero in on that and unpack that a bit, because one of the things that is super impressive when you first put on the AVP is the environments: you've got an environment that's very much an immersive environment, that feels like you're in it rather than looking at wallpaper on the outside of a sphere in the way that a panorama is. And I guess the dinosaur demo app is one of those examples as well. It seems to break the frame and come at you with 3D. And yet, notwithstanding just watching movies that are in 3D, it feels like a lot of these applications are getting the attention of users for the fact that they're AR, so you can see the real world and overlay on top of it, and that, as you say, you've got extra real estate and stuff, but they're not really three-dimensional in the classic sense. Do you feel like that is just where we are at the moment? Was looking at VR and AR as being a very 3D stereo kind of experience perhaps not the right emphasis for some people, and was that kind of overplayed as the defining characteristic of the technology?

[00:20:13.530] Kent Bye: Well, I think if you look at the frameworks that are being developed to build these applications, like SwiftUI as a baseline, it's basically all these 2D frames, and you can add volumes and make it fully immersive. But, you know, it's a lot different than starting to build an application in Unity or Unreal Engine, which has its own logic, which is driving you towards creating a game and something that's highly embodied and interactive. And so I feel like in some ways the Quest is preferencing all of that first, with the hand-tracked controllers and the frameworks that make it easy to use something like Unity or Unreal Engine. For the Apple Vision Pro, if you want to use Unity, you have to buy the Pro license, which is like $2,000-plus per year per seat. And then Unreal Engine still hasn't been fully integrated. I know Alex Coulombe is someone who's been really pushing the edges of trying to make all the settings work so that you can actually create an Unreal Engine app with 5.4, with a bunch of tweaks and settings, in order to actually get it to work. So we actually haven't seen too many fully native Unity-based or Unreal Engine-based applications on the Apple Vision Pro yet. It's mostly the games that have been experimenting with that, and sometimes anything other than that feels like a little bit of overkill because of the use cases that they're going for. If you look at how many apps have been launched for the Meta Quest, they've only had like 640, 650 apps on the Meta Quest that have launched in nearly five years. And within the first two weeks, there were over a thousand applications for Apple Vision Pro. Now that said, most of those you could hardly call VR or spatial apps. They're basically these windowed applications where you could use the eye gaze and the pinch, but it wasn't really spatial and immersive or embodied in any meaningful way. So I'm still kind of making my way through the backlog of all the different apps to see some of those hidden gems that may be out there. But I feel like this is a type of device where it really comes down to: why are you buying this? Is it to keep up with the latest technology? Is it to solve a very specific use case? Screen replacement is probably the biggest one. But if you want to just use it as a media and entertainment device, you know, for me, some of the most compelling experiences that I've had with the Apple Vision Pro were 3D movies and Apple Immersive Video. Like, I watched the Elemental Disney movie in the Apple Vision Pro, and I had to SharePlay it so that it was synced up with my audio system. Then my wife was watching it outside of the headset and I was watching it inside the headset, but it doesn't really make sense for anyone to actually do that. It's great for a solo experience; if you're on a plane or on a trip or on a train, it's a great media watching device, but to actually watch it with other people, it's not really designed for that right now. Even though I bought an Apple TV, it's still not easy just to share it. I'm sure that'll get ironed out over time, but watching 3D movies and Apple Immersive Video, those are some of the more compelling applications for the Apple Vision Pro. At South by Southwest, there were a couple of 360 videos that were being shown, and they decided to show them on the Apple Vision Pro just because they could shoot at a higher resolution and show 8K versions.
And I've heard some people say you could show up to like 12K with Apple Immersive Video. So it's super high resolution, better than any other display that's out there. And I'm excited to see where the media consumption aspect of it goes. I mean, for anybody that owns it, they're probably already in the XR industry, and it's a high likelihood that they're using it as a tax write-off because they're using it in some sort of professional capacity, whether they're creating something for it or they're creating content around it. But I'm actually surprised at how many people maybe casually were going in and getting demos with the Apple Vision Pro and buying it. I think it's going to be within the next week or so that Apple's actually going to release some of their numbers as to how well it's actually done. No one really quite knows, but I think it's probably done better than expected, just because it does have this threshold of quality, with the high resolution and the eye gaze tracking, that creates a compelling enough experience for people who may just be casually interested to see where this may go in the future. So for me, that's a big reason just to keep track of it, but I haven't been using it day to day like some folks, like SadlyItsBradley, who bought a headless M1 Mac, which had a cracked screen, and he's basically using that as his full workstation, and he got rid of his monitor. So I'm not quite at that level, but, you know, depending on my workflows, I hope to continue to play around with it and see how it fits into my own workflow.
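
To make the framework distinction Kent describes above a bit more concrete, here is a minimal sketch of the three scene types a visionOS app can declare in SwiftUI. The view contents, window identifiers, and the "Globe" asset are hypothetical placeholders, not from any app discussed in this episode.

```swift
import SwiftUI
import RealityKit

@main
struct ExampleSpatialApp: App {
    var body: some Scene {
        // The default presentation: a flat, framed 2D window in the shared space,
        // much like an iPad app floating in the room.
        WindowGroup(id: "main") {
            Text("A conventional 2D window")
                .padding()
        }

        // A bounded 3D volume that sits alongside other windows.
        WindowGroup(id: "volume") {
            Model3D(named: "Globe") // hypothetical USDZ asset
        }
        .windowStyle(.volumetric)

        // A fully immersive space that the user explicitly opts into.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Load or build fully spatial content here.
            }
        }
    }
}
```

Elsewhere in the app, a button would call the openImmersiveSpace environment action to move from the shared space into full immersion, which mirrors the progression Kent describes: windows and frames first, with volumes and embodied immersion layered on top.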

[00:24:41.463] Mike Seymour: At an entertainment level, just in terms of how people are viewing stuff, I guess I'm interested to know your point of view. Because it seems to me that there is a bit of a trend to 180 degree immersive experiences. You've got things like the Sphere at the high end, and then you've got... Well, it's not so much at the high end, it's more at the large group of people end. And then at the personal end, you've got the AVP, where you can be experiencing this sense of having the footage large and high resolution. And of course, a second part of that, as I said a moment ago, is that it could or could not be three-dimensional. But there's another dimension. I don't know if you've seen the Gucci film that came out a couple of days ago, but in that you've got an experience that's essentially 2D, in the sense that I'm looking at this sort of 180 degree projected-up image, and then there are three-dimensional elements that appear. So like a dog walks across the bottom of the screen for you, or you can take the products that they're talking about and actually look at them as 3D elements. Did you find that a compelling experience or not?

[00:25:45.022] Kent Bye: Well, I think that the Apple Immersive Video, if you've watched any of the different experiences that are on there, they've got like three or four of them right now. They just, within the last couple of days, released new Major League Soccer highlight clips from the 2023 MLS Cup. So Apple acquired NextVR, which was doing a lot of live streaming at the time, but I feel like they've evolved the cameras into something that's super high quality. If you do have an Apple Vision Pro, I'd highly recommend watching the soccer clips. Seeing something like that live, where you could actually be courtside and watch either basketball or soccer, I think that's actually super compelling, especially if you're into the teams that may be being streamed and you're able to get some perspectives that you wouldn't literally be able to get, like above the soccer goal. So I'm excited to see whether they continue to do that, whether they have any live elements of that or just clips like the ones they just released. In terms of the video, I feel like, you know, a number of years ago, John Carmack was up on stage at Oculus Connect, and he was saying that when they actually looked at the numbers, a lot of people were playing games, but there was a significant number of people who were just watching immersive content. Meta has built out their Oculus TV, or I guess their Meta Quest TV now. So they have a lot of immersive content, but they were taking a completely different approach than Apple. Apple has Apple TV Plus, where they actually have been producing pretty decent content. It feels like in some ways the quality of the shows is matching what HBO used to be; now it's turned into Max. But they've also been getting access to 3D movies, where I saw a list of like, oh, hey, we have access to these 3D movies for the next month or two. And so they're starting to curate both movies and 3D movies, so that you could watch Gravity or Mad Max: Fury Road, stuff like that, where you could just watch it on Apple TV. Now, Meta, they're just treating it as sort of this weird business model where there's no ads, they're kind of taking a freemium model, but you also can't subscribe to it. So there's actually literally no business model to it. So they've been doing it as a way to be an additional selling point for people to adopt it. But I don't know how long that's actually going to last in terms of them continuing to produce and license out that content, because a lot of their producers that have been working on that have shifted over to working on Meta Horizon Worlds. So that could be something that, over time, just slowly disappears and goes away. Whereas Apple, they actually have a business model around it, where they have their own content. But they've also done this interesting thing. If you search around on their app, they'll say, oh, here's this 3D movie, but it's on Disney Plus, and even though you'd have to buy a subscription for Disney Plus, they're still aggregating it within the Apple Vision Pro 3D movie section. They're saying, here's all the 3D movies that you could have access to. You might need to get Paramount Plus or Starz or Max or even Disney, but they're at least aggregating it all, which is a lot more of a comprehensive approach than what Meta has been doing with this kind of immersive video component.

[00:28:39.553] Mike Seymour: I think Meta's done some really great concert stuff, music concert stuff. But if you're looking forward, it seems to me that the absolute killer use case is what you touched on earlier, which is sport. So Apple has enough depth in finances and connections that if you started having live sport that you could witness in the AVP, then that's a sort of... It's something that has sat in cable as a huge thing with ESPN, and you need to be someone of Apple's size to be able to flex enough to get rights away from a behemoth like Disney. Of course, Disney looks big to everybody that isn't Apple, but Apple towers over Disney. And it could well be that it's soccer, or football as the Europeans would say, because that's a non-American international sport that's going to find a huge audience. And then you could build that into the U.S. audience and suddenly have that kind of interest level in the U.S. And of course, there are things like Welcome to Wrexham and other initiatives that have been happening that have been playing into that space. But yeah, a sports experience that would be live would be incredibly compelling to many people, enough to actually justify it. And it's an event you kind of are happy to sit down, organize yourself and put on the headset for, as opposed to just sitting down in front of the TV after a long day at the office and hitting the remote. Would you agree that that would probably be the biggest play that we could see if it was going to be a turning point?

[00:30:08.813] Kent Bye: Well, I think I have a question just to turn back to you. How do you define a killer app?

[00:30:14.387] Mike Seymour: I define it as the reason that justifies people getting the technology when they're not primarily interested in the technology. At the moment, I think people that are buying the AVP are really interested in the AVP. They're not saying, what do I need to be able to have this experience? Is it the Apple AVP? Okay, I'll buy that. So if you have a killer application, it's the application itself that drives the rest of the hardware acquisition and infrastructure, as opposed to, I think it'd be really good to have this new thing. Now, what can I do with it?

[00:30:47.192] Kent Bye: Yeah, yeah, I agree with that definition. I just wanted to ask because I think that there are going to be killer apps, and there's going to be a wide range of them. There's not going to be one; there are going to be many of those different types of applications that are going to flip people over to actually buy an Apple Vision Pro. For some people, it could be screen replacement. For other people, it could be media consumption, because as of right now, when you go through the different demos that they show you, I think that the media consumption aspects probably have the most wow factor, which for a casual consumer may flip them over into wanting to actually get this as a device. And so I think live sports definitely fits into that type of media consumption aspect, and it'll be very interesting to see how they continue to develop that out. And I do think that it will be one of those things that they may start to lean upon to sell devices. I think they've probably been surprised with how many different people have been able to buy it. I've been surprised in terms of like, I just felt like this is a developer device. I mean, this is crazy expensive, like who's going to want to buy this? But, you know, just to see how many people were buying it, just anecdotally; again, I don't have any data or numbers or anything. I think once they report their first quarter results, everyone will get a lot better sense as to where they're actually at.

[00:31:57.656] Mike Seymour: I think you're right about the killer app thing. I think in the film industry, for example, a killer app for me, in a niche, is if you are a director wanting to review a film, it's very hard to review that on a screen. It's convenient to review it on an iPad, but it's completely inaccurate how you perceive the footage when it's on something that's 12 or 18 inches across compared to when you're in a cinema. If you're in a cinema, you actually have to change your gaze direction, change your head direction a lot. You mentioned Mad Max there for a second. One of the things that George Miller was really good about is knowing where the audience is looking on either side of a cut. Now, if you're looking on an iPhone, it really makes no difference, right? You don't really have to change your gaze at all. But if you're looking in a cinema and you're looking over at the left-hand side of the screen, on the other side of the cut, it really matters whether there's suddenly action on the right-hand side of the screen. So, for directors wanting to review a theatrical major motion picture that's going to go into cinemas, especially in this age where IMAX is a good model for generating revenue, they need to experience it on a big screen. But reviewing dailies on a big screen is like a big ask, right? You have to go to somewhere that has a big screen. And even if you go to a reviewing suite, often at a post house or whatever, those reviewing stages aren't that big. Sure, ILM or whatever has big viewing theaters, but you know. So seeing it in a large space is something that requires either the large space itself or the simulation of a large space. So yeah, that's a really small niche in the VFX industry that would be like, okay, if I want to review materials and get that idea of how it looks big, I need this thing. What is this thing? It's the AVP. All right, I'll put it on and I'll use it. But for the general public, the only one that strikes me as being massive, that I can see conceivably changing the entire game as it were, would be sports. Though I'm sure there will be many other small niches that become apparent. But then this is in the context that I would have said to you pre-AVP that the thing that I think is most important is that it's not a VR experience but an AR experience, because I think people were very against a closed-off VR experience in the sense that they felt isolated. So that brings me to the next point, which is, do you feel like we've seen yet, or is there a good example of, the overlay on my reality, which of course in this case is happening through video cameras? Is there an obvious place there that you feel would be either the killer app or compelling content? Because watching a movie isn't that, it's like, you know.

[00:34:35.868] Kent Bye: Yeah, I think, you know, some of those, if you want to call them killer apps, could be enterprise use cases, where there are companies that are buying the Apple Vision Pro because of a specific use case that they have in mind for it, whether it's training or whether it's something else. Because most of the time with the Apple Vision Pro, most people are going to be using it at home. They're not going to be taking it out and about, despite the fact that very early on, when it was first sold, you saw a lot of people going out wearing it for content, almost for shock value. There are people that are wearing it on trains and people that are wearing it traveling and commuting, but I suspect that the vast majority of people are going to be at home. And when you're at home, it becomes a matter of: okay, are you going to be walking around and want to have video overlays on top of things while you're doing something else? I think there likely are some people that might be doing that. But again, I don't know if that's the vast majority of people. If anything, this is a developer kit to start to really develop out some of those different use cases. So just to turn the tables a little bit here, I'm curious if you could share a little bit about how you have been using the Apple Vision Pro.

[00:35:42.275] Mike Seymour: Sure. So we're interested in using it for a bunch of applications that are, let's say, more complex. And so our first step was, can we develop something that's relatively simple that would let us just get our skills up in the development space? So we came up with an educational application. Now it's going to sound a little weird, but it really makes a lot of sense when you think about it: it's how to do ultrasound scanning of sheep. Now, the reason for this is that we have connections from my research lab here at Sydney University into the vet school, which is like one of the top 10 in the world. And they have full permission to scan sheep to see if they're pregnant or not. And in fact, they train people how to do that. The artificial insemination programs, the IVF programs that you hear about with people, it's exactly the same biology and tech for animals. And so nearly all of the people that are trained up start with animals. No animals are harmed or whatever, but nevertheless, they figure it'd be a good idea to train students for an initial amount of work before they got to an animal. So even though it in no way distresses the animal, it's better for the student if they could have an experience of understanding what they're looking at. And if you've ever looked at an ultrasound as a parent, they say, hey, there's your child. And you go, I can't, what are we even looking at? And they're like, no, no, that's the head. And you're like, really? I thought that was like a smudge on the screen. So that's the context. Now, what it means from a developer's point of view is we said, okay, let's come up with three things we wanted to be able to do. So the first is to film and actually edit a spatial video. So we tried filming with the RED camera, with a Canon R5, with the iPhone and with the actual headset itself, and then seeing what we got out of that, seeing how we would edit it and seeing how we'd use various tools of post-production to make that a good experience when then played in the headset. So the first part of our training program or app is, hey, here's a video of someone explaining how to do ultrasound scanning, and you watch that in a spatial video, and it has overlays and graphics, and we tick that box. The second stage was, hey, what the heck does an ultrasound actually do? And so here we have a 3D visualization of a sheep, a ewe, and you actually show them the spatial cut through a sheep. Now, of course, when I say spatial cut, I don't mean a medical cut, I mean an ultrasound cut. But if you then show them what they're actually seeing on a real ultrasound scan, and say, look, if you see this on an ultrasound scan, this is what it is showing you inside a sheep, then it helps people to visualize. That allows us to do things like getting Blender models in and working with 3D models, having actual volumes that you can turn around to look at the sheep from any angle. And it also gives you real pictures of what the ultrasound would look like. So if stage one is someone showing you how to do it, stage two explains what you're looking at, and then in stage three, you get to interactively scan yourself, in which case you hold up your hand, and as you move around a digital sheep, in real time it shows you a real ultrasound scan of what would have been shown at that moment in time. Now, from an educational point of view, that's good for the student, right? It's like, this is what you're going to do. We're going to explain what you're seeing so you can understand it.
Now you have a go. Classic education. From a developer point of view, it gets us to edit and make spatial videos and color grade them and do all that kind of stuff. It lets us make 3D models and bring them in. It lets us do complicated hand-interactive manipulation in three-dimensional spaces in the third stage. And those are the three parts of the skill space that we wanted to develop up. And so that's been a heck of a lot of fun. And obviously it's all done with ethics approval, and it's much easier to get ethics approval to scan sheep than it is pregnant women. But having said that, you can imagine a medical application being the same thing. Wouldn't you like your trainee doctors to have used the simulation first before they actually, at a teaching hospital, started sticking ultrasounds on your pregnant partner? So I think it's a valid use of the technology, but it's also primarily a training use for us to kind of get up to speed and learn. And I don't know if I've ever said this to you before, Kent, but my father literally said to me once, I mean, he was a man I was very close with, but he actually said, Michael, you can read the entire works of Sigmund Freud, but son, sooner or later, you're going to have to take a girl on a date. And I've always taken that attitude: rather than just theorize about these things, it's better to roll up your sleeves and kind of make stuff. And then you just learn things about doing it, things you don't even know when you start. So yeah, we've just had a really good experience in doing that. Apple's been very supportive, but being in Australia, we do have partners in the US, and so we obviously got our AVP in the US and have US accounts and stuff.
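
As a rough illustration of the kind of hand-driven interaction Mike describes in his third stage, here is a hypothetical sketch of consuming visionOS hand-tracking updates through ARKit. It is not taken from his app; the class name and the probe logic are placeholders, and the API details are recalled from memory, so treat it as a sketch rather than a reference.

```swift
import ARKit
import simd

/// Hypothetical helper that follows the user's right hand so its pose can
/// drive a simulated ultrasound probe over a virtual sheep model.
final class ProbeHandTracker {
    private let session = ARKitSession()
    private let handTracking = HandTrackingProvider()

    /// Called with the hand anchor's world-space transform on every update.
    var onProbeMoved: ((simd_float4x4) -> Void)?

    func start() async throws {
        // Requires the hand-tracking capability and the user's permission.
        try await session.run([handTracking])

        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.isTracked, anchor.chirality == .right else { continue }

            // A real app would intersect this pose with the sheep model and
            // look up the matching pre-recorded ultrasound slice to display.
            onProbeMoved?(anchor.originFromAnchorTransform)
        }
    }
}
```

The interesting part, as Mike notes, is that the display side is essentially a lookup: the hand pose selects which real ultrasound image to show, so the training fidelity comes from the captured scans rather than from simulating the physics of ultrasound.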

[00:40:21.853] Kent Bye: Yeah. Yeah. When I was saying earlier that I think a lot of the applications that the Apple Vision Pro will be developed for are for folks that are making stuff, that's the type of thing I meant. I expect a lot of the stuff that's happening behind the scenes, like all the stuff that you had just mentioned there, is the kind of innovation that we don't have much insight into. So I expect that this is a device that is going to be driving lots of different types of applications like that, which we may not actually ever even hear of, because they're just private enterprise applications that are solving a very specific use case. But the technology is enabling this kind of fusion of things that any other VR headset on the market isn't able to achieve.

[00:41:04.040] Mike Seymour: And to your point, it is really interesting, when you're doing it, to start learning about aspects that you knew but hadn't really fully zeroed in on. It's very immersive in the you-can't-really-fail-to-watch-it sense. If you've got students, sometimes you can show them videos, and are they really watching them, or are they staring off into space, or are they on their phones? If the student is in a spatial 3D video showing them something that they know they are going to be doing themselves moments later, you get an incredible level of leaning-in kind of engagement. And of course that makes sense when you say it out loud, but it isn't until you start doing it that it kind of, oh yeah, well, that's true actually. So we're interested not only in what I just described, but from an educational point of view, we're going to do some research and look at retention after, say, two or three months, because we think a lot of people focus on immediacy, but we're interested in how much did that actually really sink in? Are they going to retain knowledge two or three months down the line in a way that they wouldn't have if they'd just been sitting in a lecture or watching a video? I think that's where we're going to see a significant difference in terms of impact for this particular educational application, but that would apply across the board, I think, if you're doing corporate communications or safety videos or any number of other things, where you want to have higher levels of retention. And if that's a way of doing it, great.

[00:42:27.053] Kent Bye: Awesome. Yeah.

[00:42:28.490] Mike Seymour: Yeah. So then if you are at home and you don't need the overlay, what you do have is that problem that you had with your wife, which is that you don't want to be doing things in isolation, right? Like my wife doesn't want me to disappear into a thing and not be communicative, even if I'm sitting in the same room. So that's all a thing about presence, how present I can be with somebody else. And of course, in this case, more tech isn't necessarily more presence. So the problem, it seems to me, with any of these technologies, forget Apple for a second, right, is just getting the netcode under control so that you can have the level of immediacy and interactivity and low latency that gives you presence with somebody else. And even if you look at games like Fortnite, they don't really allow much actual interaction with another player. You're just aware of them, but you certainly don't have the kind of presence that I'd want to have with a partner.

[00:43:23.257] Kent Bye: Yeah, I think having covered the VR industry for nearly a decade now, I feel like this is something that comes up a lot when people critique VR. They say, okay, it's very isolating. And I think there's two responses to that. One is that in social VR, it's actually extremely social. And with the spatial personas that just launched within the last week or so, you have a lot more ability to start to bring these avatars into your spatial context in a way that goes beyond the FaceTime flat windowed version. So you actually have the capability to be even more social with other people that are at a distance. And so it's like breaking down the constraints of distance in that respect. So that's one aspect. The other aspect that I'd say, I mean, certainly you're correct when, you know, even as somebody who is as into VR as I am, doing this kind of experiment where I'm in VR watching the 3D version and my wife's watching it on the 2D plane, it's still awkward. And I've only done that once. And it's not something that we keep going back to, just because the asymmetry of that is weird. But the other aspect is that when you are completely immersed in media, like when you're watching a movie, people often want to not have a lot of distractions. I mean, you can watch your TV and have your phone there and be distracted, but when you're in VR, it's often difficult to be distracted. For me, I've got all notifications turned off, so I'm not getting any things that are coming in. I mean, people could have that on, but every time I have an option to be notified, I'm like, no, I don't want any notifications, thanks. If I wanted to go do something, I'll go do it, because I'm very clearly context switching between these different apps. But when you think about immersion, that's actually what people want: to be completely immersed.

[00:45:02.749] Mike Seymour: Yeah, the trouble is I like going to see a movie with my wife. Like if you said to me, Tuesday night, we're going to go and see a movie, I would be really bummed if my wife went into theater two and I went into theater six, right? I want to be in the same cinema with her, but I've not got my phone out and I'm certainly not chatting to her through the whole movie. But there is a level of being with somebody to have a shared experience that you don't need a huge amount of verbal communication over to still have it as a meaningful kind of engagement.

[00:45:26.947] Kent Bye: Well, so I totally agree with co-located experiences like that. You definitely want to be present for the affordances of what it means to be with somebody. But when you're not able to actually be with somebody, I think that the spatial personas, and I've seen some clips of people playing video games together and watching media together with that spatial persona, so I feel like there are ways in which something that would normally be isolating can actually create a group cohesion out of that. So I feel like there are ways of using some of those integrations with the spatial persona to recreate some of those social dynamics. But the other larger point that I was making is just that, in some ways, one of the fears of XR and VR is that it's going to be escapist and addictive, and that we want to be completely immersed into it. And there is the line of, okay, to what degree are you escaping your responsibilities by not being in right relationship to all your obligations in the world. But I do think, just generally, when we think about trends of media over time, we want to get more and more immersed into whatever we're watching. And I feel like these headsets are enabling us to do that. Although with Apple, they have made the choice to not have the hand-tracked controllers that would give you that extra level of embodiment. I mean, they've done an amazing job of actually allowing you to see your hands in the experiences, but in terms of interaction paradigms, it's a lot more slimmed down in what that possibility space is, because most of what's been developed on the Quest line of these types of VR interactions has been largely dependent upon those buttons to have these complicated interactions. And so to take away all those buttons, now you have to figure out ways to try to recreate that or reproduce that. So I'll be very curious to see what Owlchemy Labs is doing with their Job Simulator and everything that they've been porting over, all of their VR games, over into the Apple Vision Pro. But also I'd point to ALVR, which is a way that people can start to do things like VRChat with the Apple Vision Pro. So there's a whole section of people that just want to have social interactions with other people within the context of VRChat and have super high resolution and also have eye tracking and everything else. And so I'm sure that as time goes on, they'll have more and more of those capabilities be translated into an avatar representation in something like VRChat. But yeah, something like VRChat and ALVR, which is still in the TestFlight beta phase right now, but, you know, allowing you as a user to bring in all the games of SteamVR into the Apple Vision Pro, whether you're using your Lighthouse and your tracked controllers, or you're just using the hand tracking.

[00:48:03.363] Mike Seymour: Do you think that the lack of a controller is going to limit them in the gaming space then? Do you just feel like you just can't ... If you've got a classic Xbox controller, you can be using both thumbs and plowing away really fast, and that's going to just stop that type of game having traction in AVP?

[00:48:20.712] Kent Bye: Well, if you go back to when the Rift first launched, it didn't launch with the Touch controllers either. It had gamepad controllers, and the gamepad controllers were like a stopgap until the Touch controllers came out. And so I don't know if it's in their plan at all. I think overall, Tim Cook has been fairly dismissive of the affordances of VR just generally. You know, Apple is resistant to even call anything about the Apple Vision Pro VR. They're insistent that it's a spatial computer. And so maybe over time, there'll be some third-party solutions, or there'll be something like ALVR, which has started to close that gap by making it so that you could at least use your Index controllers with Lighthouse, but through a PC. So it's basically the same type of wireless streaming solution that you would see with the Quest's Air Link or Virtual Desktop, but through this interface of ALVR to open up the SteamVR library to the Apple Vision Pro. But my bigger point is that there could be niche use cases for people. Valve hasn't released an update to the Index, and so a lot of people are going with the Bigscreen VR headset in order to have a fully immersive, lightweight experience within the context of VRChat. But the Apple Vision Pro is a whole other possibility now that people could maybe go with, if that fits what they're doing in the context of something like VRChat.

[00:49:34.709] Mike Seymour: Yeah. Maybe we could see an innovative use of an iPhone, where I turn it on its side and it becomes a gaming handset, with both thumbs wildly doing stuff while you're in the AVP. So the other thing that's kind of interesting to me in terms of that immersion stuff is being able to have a lot of connectivity. You were talking about some interesting companies there. I don't know if you've heard or seen the stuff that ModRock has been doing, and they've demoed stuff with Adobe, but they're allowing incredibly fast synchronization between users by having more of a central server. The game, as it were, doesn't sit on your computer and mine as separate things, which obviously opens the window to hacking and stuff. It exists on a server, but then it's sending out geo information in sort of a hub and spoke model. And the thing about that is you can then do things like physics sims, because you've got the sub-60th-of-a-second update. I can be interacting with a sim, you can be interacting with a sim, and we're both going to have exactly the same thing happening in real time. But the other thing that that opens up, which is interesting, is it doesn't necessarily have to be that it's to an AVP. So I could have that experience where I've got you and I in AVPs, but six other people have just got it on iPads or on their PCs, because it's agnostic. And therefore, they would be interacting through a 2D screen while we'd happen to be interacting in a 3D world. And that seems to me to be a barrier that needs breaking down: it shouldn't necessarily be that it just works here. It should be that it works here and it works for those people that aren't in it, and you can basically move between those two fairly seamlessly. Otherwise, you're not going to have enough engagement, enough broad use, if you require everybody to buy expensive headsets.

[00:51:13.600] Kent Bye: Yeah. And was there a specific application or a game?

[00:51:17.102] Mike Seymour: This is core technology that they're developing to allow that level of interactivity, so that you have immediate updates, solving the netcode problem of trying to update quickly enough. And as you know, many games companies spend a lot of time and a lot of effort on their netcode to just try and get communications happening quickly enough. If you can break that apart so that you can have virtually instantaneous updates to you and I and anyone else simultaneously, and a hundred users in the same physics sim with very low bandwidth to the outlying spokes from the hub, then it starts to make that possible. I think those technologies are yet to be adopted. The example that somebody pointed out to me is, if you're in Fortnite at the moment and I do something, it seems to me that you're seeing it the same way and instantly. But in fact, there could be up to a second of delay in what you're seeing compared to me. And that's fine in a game like Fortnite, where you're just doing very rudimentary things, but it doesn't allow you a full level of complex interaction. And I guess my point was, we need the interaction, and we need the interaction to be not just between people that have headsets, but between people that have headsets and people that have non-AR solutions as well, and then we'll see a much wider adoption.
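To make the hub-and-spoke idea Mike describes a bit more concrete, here is a minimal, hypothetical sketch of an authoritative hub that steps a physics simulation and fans identical state snapshots out to every connected client, whether that client is a headset or a flat screen. None of the type or function names below come from the company Mike mentions; they are made up purely for illustration.

```swift
// A hypothetical hub-and-spoke sketch: the simulation runs only on the hub,
// and each spoke (headset, iPad, PC) just receives compact state snapshots
// and renders them. These types are illustrative, not from any real SDK.

struct BodyState {
    var id: Int
    var position: SIMD3<Double>
    var velocity: SIMD3<Double>
}

/// Anything that can receive authoritative snapshots from the hub.
protocol Spoke {
    var name: String { get }
    func receive(snapshot: [BodyState], tick: Int)
}

/// A stand-in client that just logs what it would render.
struct LoggingSpoke: Spoke {
    let name: String
    func receive(snapshot: [BodyState], tick: Int) {
        print("\(name) rendering tick \(tick): \(snapshot.count) bodies")
    }
}

/// The authoritative hub: owns the physics state, steps it at a fixed rate,
/// and sends the same snapshot to every spoke, so all clients see one sim.
final class Hub {
    private var bodies: [BodyState]
    private var spokes: [any Spoke] = []
    private var tick = 0

    init(bodies: [BodyState]) { self.bodies = bodies }

    func connect(_ spoke: any Spoke) { spokes.append(spoke) }

    /// One fixed-timestep update (e.g. 60 Hz): integrate, then broadcast.
    func step(dt: Double) {
        for i in bodies.indices {
            bodies[i].velocity.y -= 9.8 * dt            // toy gravity
            bodies[i].position += bodies[i].velocity * dt
        }
        tick += 1
        for spoke in spokes {
            spoke.receive(snapshot: bodies, tick: tick)  // same state to everyone
        }
    }
}

// Usage: one hub, one headset client and one flat-screen client, three ticks.
let hub = Hub(bodies: [BodyState(id: 0, position: .zero, velocity: [1, 0, 0])])
hub.connect(LoggingSpoke(name: "Vision Pro"))
hub.connect(LoggingSpoke(name: "iPad"))
for _ in 0..<3 { hub.step(dt: 1.0 / 60.0) }
```

The key property is that the simulation state lives only on the hub, so every spoke renders the same tick at roughly the same time, which is what makes shared physics interactions across headsets and 2D screens feasible.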

[00:52:32.605] Kent Bye: Yeah, this is, I think, getting into the core technology stack, where most of the different experiences have been built using game engines like Unity and Unreal Engine and all the networking that's there. I feel like once Unity announced that they were going to change their licensing agreement, a lot of people started to look into things like Godot or other open source alternatives. There's also WebXR, which is a whole other technology stack that I think is still in its very early phases. You have to set some flags to enable it on Safari, and the memory limits are a lot lower on Safari compared to Chrome. And so it's still experimental in the sense that it's a moving target, and the different people that I've heard of who are working on something like WebXR are still waiting for it to get stabilized a little bit. And then there's this whole other strand, which is React Native, which is taking the essence of some of those components of the web but starting to use the native APIs. And there's a whole other aspect of using SwiftUI and all the stuff that Apple's building from the ground up. I think most of the stuff that they're doing with their version of social networking has been through the context of their own Personas, and I don't know how much that is going to be opened up to developers. I'm sure that once we hit WWDC, we're going to perhaps see the 2.0 version of visionOS and a lot of different updates. It feels like, you know, just like when the iPhone launched back in 2007, there was a lot of stuff that didn't have native applications until the second iteration. And so once we get into this next phase of announcements, these next couple of WWDCs over the next couple of years, I expect to see a lot of big announcements of new APIs and new capabilities. And so...
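As a rough sketch of the native path Kent describes, here is what a ground-up visionOS app can look like with SwiftUI and RealityKit: an ordinary 2D window plus an immersive space, using only first-party APIs. The app, view, and space identifiers are invented for this example and aren't drawn from any app discussed in the episode.

```swift
import SwiftUI
import RealityKit

// A minimal visionOS sketch: one regular SwiftUI window and one immersive
// space the window can open. Names like SpatialSketchApp and "demoSpace"
// are made up for illustration.

@main
struct SpatialSketchApp: App {
    var body: some Scene {
        // A regular 2D window, managed like any other SwiftUI window.
        WindowGroup {
            LauncherView()
        }

        // An immersive space that places content into the user's room.
        ImmersiveSpace(id: "demoSpace") {
            RealityView { content in
                // Drop a simple sphere about half a meter in front of the user.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
                sphere.position = [0, 1.2, -0.5]
                content.add(sphere)
            }
        }
    }
}

struct LauncherView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter the space") {
            Task { _ = await openImmersiveSpace(id: "demoSpace") }
        }
        .padding()
    }
}
```

Because the whole stack is first-party SwiftUI, the same window-management, input, and accessibility behavior that Kent contrasts with the game-engine route comes along without any extra plumbing.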

[00:54:11.215] Mike Seymour: I mean, you and I are old enough to remember when Apple came out with the iPhone, it didn't have third-party apps. That wasn't in the first release. And now it's inconceivable that you would have an iPhone ecosystem that wasn't built around apps. I guess for me, not only am I interested in a second version of the OS, but also in when there is a second version of the headset. I think the thing that's most... Well, I'd be interested in your opinion: if you were going to do a wishlist of how you'd upgrade the AVP, apart from price issues, what would it be? For me, it would be the cameras. I feel like the display technology is spectacular, but the camera tech still feels like they're sort of HD cameras, and thus not giving me enough fidelity of the environment around me. How would you rate that? Or would you say you'd want hand controllers, or something else like haptics, earlier than that?

[00:55:00.380] Kent Bye: Yeah, I mean, I think there's going to be the hardware stuff and the stuff that can shift in the software. So, you know, there is a bit of stuttering that happens because of the way that they've optimized the tracking: they've made decisions so that when you're still and not moving, it looks great, but when you start to move around, it starts to fall apart a little bit. And that could be stuff that's set on the software side; they could start to tweak that. I think for me, I'm a little bit more wait-and-see, to see where their own priorities are. I feel like, if anything, the biggest thing for me is that their store and their search are absolutely terrible. If you actually want to discover applications, it's very difficult to really search all the apps that have already been released. So discovery and filtering within the visionOS store, I feel like that, for me, is the biggest ask that I have: just give us some updates so we can at least really fully explore what's being created by these developers. Because I feel like they highlight things, but unless you know what you're actually looking for, or you have lists of all the different apps, it can be difficult for discovery. So I'd be very curious to hear from more developers about what it's been like to release their app, and whether they've been able to get traction.

[00:56:11.795] Mike Seymour: The thing that would change on the software side for me is that it's not a good industrial or organizational device, because sharing it amongst people is really quite complicated; it's designed as a personal device. And so those applications we referred to earlier, where you might be using the AR for training or whatever, it's not a good training device, because it tends to be designed for me. And if I have to share it with you, it's a bit of an effort to set all that up, and then you've got to do it within sort of five seconds, it feels like. And you only have limited agency, or you have everything, and it's a bit frightening. So yeah, it doesn't feel like an enterprise version of how you would be running apps.

[00:56:50.448] Kent Bye: No, for sure. And I was surprised, actually, that there were at least two or three different exhibitors at South by Southwest that were showing on the Apple Vision Pro. Despite all the different technical glitches, they felt like it was high enough fidelity that it was worth going through all the hassle of it not really being optimized to show across different people. And of course, you need prescription lenses; you can't wear it with glasses. For me, I guess there are a couple of things that I would say. For one, the window management is something that could use a lot of help. I think even within iOS and on the Mac, the window management is pretty sophisticated, and with the Apple Vision Pro, it's pretty rudimentary. I feel like just the ease of being able to switch between different contexts and move spaces around, and then also just being able to organize your applications. I mean, it's all alphabetical right now, so you can't even organize that. Those are things that are so simple that I imagine they would have to be like a next-level upgrade. So I imagine there are going to be a lot of quality-of-life improvements like that coming. And if you look at what has happened with Meta and the release of the Quest, they released the hardware and they've consistently made it a much better device with each of the different software releases. And that's just the nature of XR devices: as long as you have that hardware there, you can ship the software to continue to expand its capabilities. And I feel like the Quest 1 at launch was a lot different than it was at the end, when the Quest 2 came out, and the same thing with the Quest 2, it has improved vastly, and even the Quest 3 is continually being upgraded and updated. So the device that it is at launch is a lot different over time, and I expect the same thing to happen with the Apple Vision Pro: they've got all the hardware there, and they just need the software to help unlock more of its potential.

[00:58:30.740] Mike Seymour: I mean, they really do have some serious hardware in the headset, and it doesn't feel like they're pushing at the edges of what the hardware is capable of yet, not even close. It feels like there's a lot more, as you say, that could happen without having to even rev the hardware. Though I must admit, there are a couple of things, like the soft cushion against your face being only magnetically attached to the headset, that have caused more people to nearly drop mine than I care to admit. Yeah, I don't want them to have something that is so prone to being dropped when it costs so much.

[00:59:03.335] Kent Bye: Yeah. Other companies, like Magic Leap, made a very similar decision: they designed these things to only be used by one person. But there are things like that where, I mean, just for me, the ergonomics are terrible. Of all the different devices and headsets that I've had, you know, this is one of the worst ergonomically. For some people it's totally fine, but I'd say the majority of people have to find some sort of workaround to make it comfortable. For me, I had to put Velcro over the straps just to give some additional support, because it's so front-heavy and really not comfortable to wear for any significant amount of time. So I think with the next iterations of their hardware, and maybe that's even something they could release as an additional device, like an add-on, because they have the developer strap, which was like $300 just to have a USB connection into the headset, which I think is pretty ridiculous that that's not...

[00:59:52.983] Mike Seymour: I have to say that even as an Apple fanboy, I found the developer strap the most phenomenally expensive cable-slash-attachment in the history of Apple. But then I thought about it, and I thought, how many of those are they going to sell? Right. And I thought maybe they're only going to sell a thousand or a few thousand, and they're just that expensive to make, but I don't know. Yeah, it seemed like you would make those much cheaper to encourage more development.

[01:00:16.209] Kent Bye: Yeah, but the developer strap shows that it is possible to take off some of those arms and attach new ones. And so it is possible for Apple to potentially release something like a pro strap. But yeah, I continue to look and see other stuff. I think I've found a solution that works for me that's pretty cheap: some $8 Velcro straps and a $15 sleep apnea strap that Alvin Wang Graylin had pointed to. Just having those two on there has made a huge difference.

[01:00:43.972] Mike Seymour: Yeah, I will say, on the positive side of things, the eye tracking for me, in terms of its precision, has lived up to expectations. We're having a bit of a go at the things that we don't like, but the eye tracking seems to be really reliable once you get used to it. I have a habit, I don't know if you do this: on my emails, I tend to delete things while looking at the thing beneath them. In other words, I'll be hitting delete for the thing above so that I am one ahead of myself. It's a habit I got into so I don't accidentally delete things that I want, and it doesn't work at all here. You need to be looking at what you're doing. But that being said, if you are used to looking at what you're doing, it's pretty precise. Is there any aspect of it that you are particularly impressed with?

[01:01:25.275] Kent Bye: Well, the resolution and the level of immersion, like not being able to really see the screen door effect, I think is the thing that I've been most impressed with. There are some commentators, like Karl Guttag, who've been doing this kind of really high-end critical analysis of the screens, and, you know, from my own phenomenological experience of it, I don't really experience the screen door effect at all. So just the resolution of it makes a huge difference. But overall, I would say that it's the integration of everything just working together in a way that we haven't quite seen in any other headset out there before: just being able to jump in and be productive, being able to jump between the different apps that I use on my iPhone. And so, yeah, I think it's the seamless integration of it. Apple's design philosophy and their standards have translated over into all this operating-system-level stuff, like the accessibility features within visionOS, which are completely unparalleled for any other headset that's out there, because they've been building upon what they've done with iOS and iPadOS. And so in terms of accessibility, I feel like they're really pushing the edge, especially when you start to consider that if you have something like a Unity application, and most of the applications on Meta are built in Unity, Unity itself doesn't have that level of accessibility built into it. And so you may lose a lot of those accessibility features when you go into the Unity apps on Meta. But because there are so many different applications that are built from the ground up using visionOS and SwiftUI, it has a lot tighter accessibility integrations than anything else that's out there right now.
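As a small illustration of what Kent means by accessibility carrying over from iOS and iPadOS, here is a minimal SwiftUI sketch using the standard accessibility modifiers; because a visionOS app built this way sits on the same accessibility system, assistive technologies like VoiceOver can describe the control without any engine-specific plumbing. The view and strings are made up for the example.

```swift
import SwiftUI

// A minimal sketch: an icon-only playback button given a spoken name and
// hint via the same accessibility modifiers used on iOS and iPadOS.

struct PlaybackControls: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            // Icon-only label: without an accessibility label, VoiceOver
            // would have nothing meaningful to read for this control.
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
        .accessibilityHint("Toggles playback of the current video")
    }
}
```

A Unity-built app would need to wire up its own accessibility layer to expose equivalent information, which is the gap Kent is pointing at.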

[01:03:02.903] Mike Seymour: Well, look, it's been great chatting with you about this. Thank you so much. I really value your opinion and insights on these things, and your podcast has just been the cornerstone of keeping track of what's happening. And I think we haven't even gotten into your elemental theories of VR and the stuff that I find really interesting. So thank you so much for your time. And again, really appreciate it.

[01:03:23.527] Kent Bye: It's been a pleasure being here, so thanks for inviting me on, Mike. So that was a conversation that I had with Mike Seymour for his fxpodcast, where it's airing as episode 368. I have a number of different takeaways from this conversation. First of all, there's what he was saying about network latency. I wanted to call back to a conversation that I did with David Smith about the Croquet browser-based operating system. I suspect that may be similar to what he was talking about with the hub-and-spoke model: having new types of architectures for maintaining synchrony across different experiences in different locations. Given that Mike's in Australia, network latency and lag are something he has to deal with a lot, and that may create splits in what's actually possible for some of the different experiences that we have. It's certainly possible to do some of that today, and there's still a little bit of lag, and I think people are able to make do with that. But having super low-latency architectures is something that Croquet has been experimenting with, so that's back in episode 1088 if you want to go back and take a look at that. And I do still see the Apple Vision Pro at this point as primarily a developer kit. There are certainly going to be a lot of consumer applications, and people are still getting a lot of value out of using it, and most of the people that I see using it on a pretty regular basis are using it more in a productivity and screen-replacement context. Hopefully I'll have a chance to talk a little bit more with other folks out in the community to see what's happening and how people are using the Apple Vision Pro, now that we're about three months in since the headset's been out and released into the wild. I'll have another conversation that'll be airing here soon with Tommy Palm; I had a chance to talk to him about Game Room and what Resolution Games was able to create there. Also, Mike had mentioned in this conversation the Gucci experience, which at the time of this conversation I had not had a chance to see, but I did go back and see it afterwards. I highly recommend checking it out, just to see how different brands are starting to use this kind of multimodal approach of doing a fully produced 2D video but also adding different spatial elements as you're watching the video, to create a whole immersive experience. And for me, the lighting that they use in that experience was really quite powerful: to have the mixed reality passthrough as you're watching the video, and then sometimes switch into the full immersive mode and change the color, and have ways that the lighting can actually shift the mood as they're telling the story of their new creative director there at Gucci. So that was quite an entertaining example of how you could use that for different brand activations. And there are actually some interactive components as well, where you can start to look at some of the different products that Gucci is selling. So definitely worth checking out the Gucci application if you haven't had a chance. And yeah, I hope to air a little bit more Apple Vision Pro coverage here soon. And like I said, on May 7th there's going to be a whole announcement where it sounds like they're probably going to be announcing some new iPads and potentially the Apple Pencil 3.
We'll see what kind of integrations that may have with the Apple Vision Pro, and whether it'll be the first official peripheral that they have for that device. And yeah, like I said, WWDC is also coming up here on June 10th to 14th, so I expect to see a lot more announcements, either a minor release or potentially even a full 2.0 version of visionOS. Although, at this rate, I wouldn't be surprised if it's just a minor release with some new features and updates coming out there as well. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
