#1395: Apple Vision Pro as Screen Replacement: Power User Brad Lynch on Overlays & Multi-App XR

YouTuber Brad Lynch (aka SadlyItsBradley) has completely replaced all of his computer screens with an Apple Vision Pro, even going as far as getting a headless MacBook Pro that does not have a screen at all. He’s been using the Apple Vision Pro for around 8 hours a day since launch, and I wanted to get a sense of how he’s been using it. It turns out that he is mostly streaming his gaming PC via Moonlight and using social VR apps like VRChat via ALVR to hang out with friends in ambient virtual spaces. He’s also using SteamVR overlays like XSOverlay, VRHandsFrame, and OVR Advanced Settings to augment his virtual reality experience. It feels like a very niche use case of a hardcore VR enthusiast, but one that mixes and mashes realities in a way that might be a sign of things to come. Some of the most compelling apps for Lynch are open-source tools that enable experiences driven by his high-end Windows machine.

Lynch has been closely following VR hardware developments over the last number of years, but the Apple Vision Pro has satisfied most of what he wants from a high-end spatial computing device. The resolution is high enough and the quality good enough that he can spend more time exploring different screen-replacement use cases, augmented VR experiences via overlays, and productivity use cases of XR.

Most PCVR enthusiasts are Windows users, and so Lynch’s audience has traditionally focused more on the gaming use cases of VR. As a result, they have not been as interested in the Apple Vision Pro due to its lack of high-fidelity input controls. But for Lynch, the basic locomotion gestures available in ALVR are good enough for him to get around within VRChat without needing to hook up or use any external controllers. Because of the perceived or actual gaps between his ideal spatial computing use cases and his VR gaming audience, Lynch actually scrapped his formal review video and is considering releasing clips or falling back to Q&A livestreams to field the many questions about the trajectory of hardware in the XR industry.

Lynch also has been enjoying the mashing up of spatial contexts in XR, mostly via SteamVR overlays and windows, though he mentions some experiments with bringing in fully spatial objects. It reminds me of the interview that I did with the PlutoVR founders in 2020, when they were experimenting a lot with the idea of multi-app spatial computing paradigms with WebXR and apps like Aardvark by Joe Ludwig.

Apple is slowly building out more and more spatial primitives across all of their operating systems, and these are slowly becoming more and more game-engine-like with the new APIs announced at WWDC, where visionOS 2.0 was revealed as coming later this fall. We talk about some of the quality-of-life features, but also the role of an integrated ecosystem, and what Bigscreen Beyond, Valve, and Meta might do to keep up with how Apple is pushing forward these ideas of multi-app integrations within spatial computing.

At the moment Lynch’s 8 hours of daily usage is likely an extreme outlier, but the ways that he’s blending realities together suggest something deeper that we’ll continue to see moving forward. VR typically involves a complete context shift, while AR tends to bring in modular elements of other contexts to shift your existing context. Lynch is on the bleeding edge of fully immersing himself within these virtual contexts while modulating his experience with SteamVR overlays, in what could best be described as a sort of AR-within-VR use case. The visionOS 2.0 beta release (coming this fall) allows users to overlay their Mac Virtual Display over immersive environments, but SteamVR overlays already enable this in PCVR experiences, and so Lynch’s experiences with overlays in VRChat could definitely be a sign of how Apple might extend this overlay functionality in future versions of visionOS.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So the Apple Vision Pro came out on February 2nd, 2024, and it's been out for a little over four months now. And just last week, Apple held their WWDC, which is their annual developer conference, where they announce a lot of new features that are coming to all of their different platforms. They announced visionOS 2.0, which is coming out later in the fall, with lots of different quality of life improvements to the operating system and also just a lot of APIs. I had a chance to watch a lot of the videos and just get a sense of where Apple is taking this in the future. But I also wanted to catch up with Brad Lynch, who I think is probably one of the biggest power users of the Apple Vision Pro. He's completely replaced all of his monitors and just uses the Apple Vision Pro as he works with both his headless MacBook Pro that he got, but also streaming through SteamVR and going into VRChat and doing a lot of stuff on his PC VR setup with SteamVR overlays. So he's really experimenting with these multi-app VR experiences within the context of PC VR. And so because Brad is a YouTuber and his audience is mostly a lot of other PC VR users using the Windows operating system, not a lot of them have necessarily been that excited about the Apple Vision Pro or wanted to really give up all the VR gaming that a lot of people are really into. Brad's use case is very much focusing in on this spatial computing and what you can do to push that to the limits, and doing more casual VR experiences like in VRChat and social VR and just being in these immersive worlds, not necessarily doing a lot of gaming where you would need a lot of input controls. So the Apple Vision Pro fits his use case very specifically. And so I had been waiting for Brad to release either his review video or to really dive into how he has been using it. And he's been doing a number of different live streams, but those are often dictated by his audience and the questions that they're interested in, and they don't always solely focus on the Apple Vision Pro. So I wanted to have him on the podcast just so that I could understand one of the top power users of the Apple Vision Pro using it as a screen replacement and just to get some of his experiences and where he thinks everything is going here in the future, not only with Apple and their next iteration of visionOS, but also a little bit about Bigscreen Beyond and Valve and Meta and some of the other big industry players and how they're fitting into this concept of spatial computing. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Brad happened on Sunday, June 16th, 2024. So with that, let's go ahead and dive right in.

[00:02:49.403] Brad Lynch: Hi, I'm Brad. I occasionally do YouTube videos on hardware; it's mostly a very hardware-focused channel. And on the up and up, I was well known for a bit of time for doing data mining, trying to figure out what companies were working on, on the software side. And a lot of that would, of course, kind of lead to the hardware side, though a little less of that these days. And on the other side, I also work full time for a company called EoZ, who mainly makes accessories purely to give VRChat users a better experience in VR. So yeah, that's me.

[00:03:19.801] Kent Bye: Okay. And maybe you could give a bit more context as to your background and your journey into VR.

[00:03:26.403] Brad Lynch: Oh, God. Okay. Yeah, I started in VR in 2016, late 2015, really, with the Vive Pre at the time. I had a great experience of being able to go to Valve. Back then, they did a press-only event, which was called the SteamVR Developer Showcase. I got to go to that purely because of pure luck, pretty much. Yeah. That was definitely a life-changing moment, I think, trying out all the wonderful... You know, it was back when they had Job Simulator first being shown off and Space Pirate Trainer. Just great, great stuff back then. And around that time, I was very much into gaming, but at the same time, I was not enjoying how gaming was going, and seeing this new promise of what gaming might become, how more and more immersive it might get, I got really into it and spent the next eight years pretty much following this VR thing as it evolved to now. So yeah.

[00:04:24.775] Kent Bye: Okay, well, I wanted to talk to you today just because I think you might be at the top of the leaderboard of people who have used the Apple Vision Pro the most since it came out on February 2nd. Maybe you could just give a bit more context for how much on average a day do you use the Apple Vision Pro?

[00:04:42.253] Brad Lynch: So I think the best way to imagine it is: a lot. Even though I'm using a screen with a web camera built in right now, it's my wife's iPad, just because I'm afraid to use the headset to do my call, since I'm on the developer beta of 2.0 and things crash a lot. But normally, 100 percent of the time, I don't use a flat display anymore. I just use the Vision Pro for everything. I'll tap into my PC to do everything I need to do on there, and I'll do a lot of the smaller stuff like Discord and whatever all on the Vision Pro. So I'm literally wearing it like eight hours a day, like every day, pretty much. So yeah, I'm insane. I'm aware of this.

[00:05:23.936] Kent Bye: Well, so I saw that you actually got a headless Mac, which is like a MacBook Pro. I guess there was a problem with screens that were cracking. And so because the laptop you have doesn't have a functioning screen, you actually took that off and you just use what's essentially like the keyboard portion of a MacBook Pro and you hook that up to your Apple Vision Pro. And maybe you could just elaborate on this idea of a headless Mac.

[00:05:49.571] Brad Lynch: Yeah, it's exactly like you said. I just bought a very cheap MacBook that was unlocked on eBay, because normally you have to worry about these being iCloud locked a lot of the time. But it was unlocked on eBay for a very cheap price. The backlight was completely broken on it. So either someone buys it and replaces the screen, or, in my case, I knew about this Mac Virtual Display feature, which worked pretty well, so I figured I'd see if it works with a headless MacBook. And it works actually very well. You do have to connect the HDMI port to an external display and set it up with your Apple ID first to actually get the headset on the same Apple ID to see it. But other than that, it works just like you would pretty much want it to. It's always funny because the little AR connect button is always hovering above where it thinks the display would normally be, even though there's not actually a display there. But I like it. I'm not much of a Mac user, ironically, but this is something that I'll bring with me if I'm traveling with my headset, because it's kind of the best of both worlds for me, where I have a very capable computer that can do a lot of video editing, especially, and then I have the whole spatial computing thing with me there.

[00:07:02.861] Kent Bye: So walk me through this process of deciding to get rid of your extra-wide monitor, because you had a monitor and you were using it, but in some ways you were eating your own dog food of being an evangelist and enthusiast for spatial computing. And so you decided to get rid of your monitor and dive head first in. So just talk me through to the point where you decided that, and what were the things that you were considering?

[00:07:28.564] Brad Lynch: Yeah, so I've always thought to myself that I would do this one day. I always kind of had that itching. I was always a big fan of how SteamVR does their overlay system, the idea of being able to have persistent elements between VR and, not really touched on that much, AR, but you could do it. I always wanted something that built that out more, but at the same time, obviously, headsets were not even close to the visual fidelity for me to also want to support that. I was using the Bigscreen Beyond a lot before I was using Vision Pro, which is a very light headset and also uses micro-OLED displays at a pretty high resolution compared to what you saw at the time when it released. It was getting close for me even with that, but it still wasn't really there for me. And then when I got Vision Pro, obviously, I think I wanted it to be a challenge for me for a few days to see if I could live like this. I was like, hey, I can do the YouTuber thing where I make a big fuss about me replacing my monitor with a freaking headset, a 600-gram headset on my face, and see how it turns out. The more I started doing it, the more I realized the hardware hit the threshold for me personally to do it. I freaking love it. This headset and its hardware and software is basically what I wanted from the SteamVR overlays, but built around that in an amazing package and everything. So it just, I don't know, for me, it made sense. I had an ultra-wide monitor, like the 32 by 9, as you said. I love that thing. I still am a huge ultra-wide pusher. If you're really into high productivity, I think it's way better than having multiple screens even, because you just have that full center view and everything. You put things around it, and it feels like if you really want more after that, you kind of have to go to the whole everything-is-digital thing, where all your displays are digital at that point. Yeah, I like it. It's everything I kind of wanted from XR at the current state of time.

[00:09:29.134] Kent Bye: Yeah, for my workflow, I do a lot of stuff within the context of Adobe Audition where I need to have super low latency from when I see the playhead moving across the screen. And a lot of the different types of solutions would either have a certain amount of lag or they wouldn't have the frame rate. I just found that for my workflow, for what I do with editing audio, I couldn't just dive in headfirst to using the headset. The other thing is that I haven't gone into the deep end and bought a MacBook Pro laptop, and I think you really need to have either a MacBook Air or a MacBook Pro in order to really utilize the virtual display. And I feel like that's a key differentiating factor where there are a number of solutions that I've tried out. But I think if I was doing more coding stuff where I needed to have a lot of information in my field of view, so I can have a larger working memory, as it were, the same reason why a lot of computer programmers have multiple monitors, to have lots of context there that just helps them be more productive... But there wasn't that immediate thing in the day-to-day stuff that I do that pushes me to do more stuff in VR. I did find that when I wanted to really have focus, and maybe I was distracted and procrastinating, going into a virtual environment, there have been some cases where that was allowing me to really get into a deeper flow state. But I'm really curious to hear from your perspective, what is it about the immersive nature? Is it having multiple windows? Like, what is the thing that you're getting out of using the Apple Vision Pro?

[00:11:02.946] Brad Lynch: To first clarify something, I don't really use my MacBook as often as my PC. I use Moonlight to stream my PC over. The cool thing about that, and this is again where I like to think about how you can do certain things with this setup that you can't really do with what's on the market for hardware, like hard monitors, is that the aspect ratio of the monitor that I made digitally with virtual drivers is something that just isn't available. It's not extremely wide, but it's the perfect wideness. There are the 21 by 10 aspect ratios you see a lot, which a lot of coders like, but I want something wider than that, and that's literally just not for sale on the market. So I was able to create it digitally and get what I wanted. And I could do that with whatever aspect ratio or size that I want. And that's really enjoyable to me, having that sort of control over my other pieces of hardware. Doing all this makes me feel like personal computing has gotten way more personal, because despite using an Apple device, which is most definitely a very locked-in device in a lot of ways, this whole spatial computing thing just feels like way more control over everything I'm doing with my computer, as long as it lets me, and the other benefits come with it. You're talking about the focus thing. I think it's pretty great. Before I was doing the Vision Pro stuff, I was, again, with the SteamVR overlays, which I'll probably talk about all the time, always in VRChat, hanging out with friends or even being alone in a very beautiful environment and just having my giant monitor or some windows open in that environment. And it was just always nice to be somewhere that isn't really in my space. Before I got Vision Pro, I was living in a somewhat fair-sized house. Now I'm living in a one-bedroom apartment, a very small space. And I think with this use case as well, it's kind of funny, because with VR back in 2016, it was like, you really can only get the full benefits of it if you lived in a big space with a lot of room. Whereas I feel like it's kind of completely switched with this use case, where it's actually the people that don't have much space who are benefiting the most from having a headset on, because maybe they can't even fit a giant monitor in their space or whatever. And I feel a lot of those benefits all creeping in, with the input and the personal feeling, and like almost first steps to feeling like I'm using a BCI in some ways. So yeah.

[00:13:37.358] Kent Bye: Okay, maybe you could elaborate on Moonlight. I know there's a number of different programs that I came across; I know Steam Link was one to share your screen, and Moonlight was another one. And I know that you've been looking at things like ALVR to be able to dive into VRChat, but maybe you could first dive into Moonlight. So what I gather from what you're saying is that rather than connecting to your MacBook Pro, you're actually connecting to your PC VR machine, which could run really high-end VR experiences, but which also likely has a good enough graphics card to drive a larger screen. Or maybe just describe a little bit of, like, what are you doing when you're opening up Moonlight?

[00:14:15.922] Brad Lynch: Yeah, yeah. So Moonlight, for those who don't know, actually kind of piggybacked off of what Nvidia used to be doing. They used to have their own streaming platform that required Nvidia GPUs, but then they kind of abandoned that, and a bunch of open-source developers created something called Sunshine, which basically any GPU, even an APU, can now tap into: a very, very well-built series of streaming applications and services on your PC to stream to. Because it's open source, they usually port it to different platforms, so you can get Moonlight on most Android phones, you can get it on iPhones, you can get it on a lot of stuff. And someone built a port for visionOS. So it works very nicely with visionOS and works with a lot of the stuff there. It works with auto-dimming and all that garbage and stuff. But yeah, it is connecting to my home PC that I would normally use for everything else I've been doing in VR for the past years or so. It's a strong PC, of course, and it's where I do most of my things. And it just feels like a monitor connected to it. Obviously, there's latency because it's wireless; there's no wire. That's one of the two things I'm most bothered about when it comes to using Vision Pro full-time, because I do play all my 2D games streamed this way most of the time. Like Dota 2, for example, I love playing that game. It's a very competitive type of game. So obviously with that game, I will have to get my brain used to the latency, and there will always be the point where the latency is a competitive disadvantage for me. So I always wish that I could have some of these legacy-type 2D games running on device, whether or not it's designed for it. I think there is kind of something there. But yeah, for the most part, it's just a very powerful PC. It basically comes down to how I want to use the PC: either I want to do all 2D flat-screen content, or, like you said, ALVR, another very open-source application that has also been running on TestFlight. And especially a person I'd like to shout out, Shiny Quagsire, who's been developing it and really putting the Vision Pro APIs, every feature, into ALVR so you can really take advantage of a lot of the hardware pieces in Vision Pro to do, mostly for me, VRChat. Again, some of the same stuff I did before, but using all the cool features of the Vision Pro headset, like the full finger tracking. I can make my fingers and my hand and my arms, the occlusion, fully show up in VRChat, and I can disable my virtual avatar's local rendering to where I can pull up a mirror. I can see my real-life fingers and hands in the virtual environment, but in the mirror, I see my virtual self replicating that. Yeah, it's just really powerful hardware connected to my PC, delegated to either a 3D VR mode or a 2D mode for apps at this point. So yeah.

[00:17:16.232] Kent Bye: So I know that a lot of times when I've used the Apple Vision Pro, you almost need to have some sort of Bluetooth keyboard. I ended up getting the Apple Magic Keyboard just because the Bluetooth keyboard I had was not always consistently connecting. And I was just like, you know what, I'm just going to go all in with the Apple ecosystem and get their keyboard. And so do you have a Bluetooth keyboard or do you just use your MacBook Pro as your kind of Bluetooth keyboard?

[00:17:38.978] Brad Lynch: No. This is the keyboard I usually use on my Vision Pro. It's a Logitech G915. The reason why I like this keyboard for what I do is it has two modes. One is a 2.4 gigahertz dedicated connection, so I can plug a little dongle into my PC, and it has a very fast, low-latency connection. Then when I need to type directly in a Vision Pro app, there's a Bluetooth mode. When I click the Bluetooth button up here, it automatically connects to Vision Pro. I can constantly switch between these things as I'd like. Technically, I could also not do that, because Moonlight does actually pass through what I type directly to the Vision Pro on to the PC, but I find the latency is not enjoyable for me that way. So this is my little jank method of doing it most of the time.

[00:18:23.775] Kent Bye: Okay. And I saw that the Vision OS 2 is going to enable mouse support. I don't know if that means only Magic Mouse or if they're going to have other mice with scroll wheels or other things like that. I know that you've downloaded the Vision OS 2 beta, and there's a lot of stuff that might not even be fully launching until the fall, but what do you do for a mouse?

[00:18:44.890] Brad Lynch: So when I'm using my PC, I still use my mouse, just directly connected to the PC. I think that's fine for what I'm doing. If I was not using my PC, for the most part, I actually prefer a trackpad to mouse movement for this system, because I feel like the apps are very much designed for touch input a lot of the time. And, like, obviously you can do this too, but I don't ever find myself missing a mouse unless I'm connected to hardware that was clearly designed for a mouse, for the most part.

Kent Bye: Just for the podcast listeners, when you say "do this," you were squeezing your fingers together and moving them, so it's like a pinch-and-drag motion. So that's what you usually do, the native way of dragging stuff on the Vision Pro. And your trackpad, did you get a Magic Trackpad, or are you using just the macOS trackpad built into your keyboard?

Brad Lynch: So yeah, when I'm talking about the trackpad in this case, it definitely is the headless MacBook full tilt at this point. But again, for the most part, I still just use the direct input, and, like you, I always do have a Bluetooth keyboard, the one I was talking about earlier, connected, because keyboard typing in really any VR headset, SteamVR, Meta, this one, is just miserable in comparison. So yeah, for me, it's usually having a Bluetooth keyboard, and the direct input of the system itself has been pretty okay for me. I don't have issues there.

[00:20:07.722] Kent Bye: Okay. Well, I know that there are some plugins for SteamVR, like XSOverlay or other things, because there's default functionality for overlays. But maybe you could walk through some of the additional things that you have for overlays within SteamVR, or if you just use the defaults.

[00:20:24.542] Brad Lynch: So I'll say a few of them that are the best ones. XSOverlay is definitely, if I want to have an actual display in my view, the one I go to. But for other quality-of-life things, there's one that's not well known that I think is a fantastic overlay called VR Hands Frame. It basically allows you to make an L gesture with both hands kind of adjacent to each other.

[00:20:46.330] Kent Bye: It's like you're framing a photo. It's like you're taking a picture with your fingers.

[00:20:51.568] Brad Lynch: Right. What's so cool about this is, again, this is on top of every app you run in SteamVR because it's an overlay. It has three options. One, it'll take a picture. Two, it will scan QR codes. Three, it will also do a translation of text. This is very useful in VRChat especially. I know I keep bringing that up, but that's mostly what I do in SteamVR. There are a lot of Japanese worlds, and a lot of times they don't translate what's on their signs or whatever. So you just do the gesture, it takes a picture, and then it writes out the actual translation. But it's, again, just great for all of SteamVR. I think everyone should be using it. It's so much better than any built-in screenshotting system right now, because you choose the size of the cropping for screenshots. And it also works when you have passthrough on, if you're using a Valve Index, which most people don't use the passthrough on because it sucks. But if there was a headset that could take advantage of it, you would still be able to screenshot that cropped image of what you're trying to look at. Those two are the main ones. Obviously, OVR Advanced Settings is great. And this is what's frustrating me, because I've been screaming to the clouds about overlays for a couple of years at least now. But the problem is that there's not much documentation on how to build really enticing ones. You mostly see kind of just the same old, hey, here's your flat monitor in here, or a different keyboard. But there have been some really great experiments. I've seen Erectus made a really silly thing where you can spawn a bunch of capybaras walking around your space, and it works for either AR or VR, so you can literally be in a VRChat world and have a bunch of capybaras running around that are tied to your floor. So yeah, really silly stuff. I wish SteamVR overlays were built out more, but I understand that it's not really there yet. Yeah.
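
Since Lynch notes that there is little documentation on building SteamVR overlays, here is a rough, minimal sketch of what an overlay app looks like against Valve's OpenVR API, which is the interface these tools sit on top of. The image path, sizing, and placement values below are illustrative assumptions, and a real overlay like XSOverlay does far more (streaming live desktop textures, handling controller input), but the basic lifecycle is roughly this:

```cpp
// Minimal SteamVR overlay sketch (illustrative only; error handling trimmed).
// Links against the OpenVR SDK (openvr_api) and assumes SteamVR is running.
#include <openvr.h>
#include <cstdio>
#include <thread>
#include <chrono>

int main() {
    vr::EVRInitError initError = vr::VRInitError_None;
    // Start as an overlay app so SteamVR composites us on top of whatever is running.
    vr::VR_Init(&initError, vr::VRApplication_Overlay);
    if (initError != vr::VRInitError_None) {
        std::printf("OpenVR init failed: %d\n", static_cast<int>(initError));
        return 1;
    }

    vr::VROverlayHandle_t overlay = vr::k_ulOverlayHandleInvalid;
    vr::VROverlay()->CreateOverlay("example.panel", "Example Panel", &overlay);

    // Feed the overlay a static image; a real overlay would push a texture every frame.
    vr::VROverlay()->SetOverlayFromFile(overlay, "/tmp/panel.png");  // hypothetical path
    vr::VROverlay()->SetOverlayWidthInMeters(overlay, 0.5f);

    // Pin the panel roughly 1.5 m in front of the standing play-space origin.
    vr::HmdMatrix34_t pose = {{
        {1.0f, 0.0f, 0.0f, 0.0f},
        {0.0f, 1.0f, 0.0f, 1.2f},
        {0.0f, 0.0f, 1.0f, -1.5f},
    }};
    vr::VROverlay()->SetOverlayTransformAbsolute(overlay, vr::TrackingUniverseStanding, &pose);
    vr::VROverlay()->ShowOverlay(overlay);

    // Keep the process alive so SteamVR keeps compositing the overlay on top of any app.
    std::this_thread::sleep_for(std::chrono::seconds(30));

    vr::VR_Shutdown();
    return 0;
}
```

Because the compositor draws overlays independently of whatever immersive app is running, the same panel shows up on top of VRChat, a game, or passthrough, which is what makes the AR-within-VR blending Lynch describes possible.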

[00:22:43.086] Kent Bye: Yeah, the way that I start to think about AR versus VR, in some ways, some of what we're talking about is having AR types of features within VR, because within VR, you're completely switching your context, but with AR, you're able to pull in modular aspects of other contexts and blend them into your reality. And so when you go into these VRChat worlds, you're able to pull in all these other windows, and with these overlays within SteamVR, you're able to overlay within the context of this virtual world. So it feels like you're at the bleeding edge of trying to augment your virtual experience in a way that you're bringing in all these additional aspects of context, whether it's Discord channels, or live streamers do it a lot, or people who are doing films within VR are controlling different stuff. So yeah, go ahead.

[00:23:30.188] Brad Lynch: I like it too, a lot. I'm obsessed with this idea. I always call them overlays; Apple calls it spatial computing; call it whatever you want, shared space. I like it because it really changes what XR is. To give an old-age comparison of technology: personal computers used to be single-application devices, and that was it. If you really wanted to open up something else, you had to close out that application and type into the command prompt what you wanted to open next, and that was it, until they started using graphical user interfaces and creating the whole windows thing and everything. And that's why I like overlays and what is happening right now, because it is actually taking XR through that step, going from a single-application use case to a multi-application one. Maybe some people don't like it because it's kind of making some VR things, and AR or your real-life view, into a desktop background compared to all these other smaller apps. But for me, I think there's a lot of just great use cases that will be built with that as it goes on. So I freaking love it.

[00:24:37.172] Kent Bye: I know that Pluto VR was doing a lot of these types of experimentations where they were pulling in overlays. I'm not sure what they're up to now; they had the cloud streaming for a while and they were doing the overlay stuff. And I had an interview with the co-founders, and one of the things that really struck me was that there's this paradigm shift of going from 2D to 3D. When you think about sharing multiple windows on a desktop, it's essentially these frames that are bound to contexts that are independent of each other. But when you really start to think about spatial computing and having these objects in the same physical space in all three dimensions, then you start to think about things like Spotify, rather than just as a window, more as a radio, an object that's in your virtual space that may be broadcasting audio. In the model of VRChat, there's the local versus global distinction, so you can be in your own virtual space and maybe augmenting stuff that only you are hearing. Or maybe there's a future where there's a shared context where people have these objects that are in the same room. So you go in and you have a TV set that everybody's watching, and that TV set is actually an application that everybody is sharing in some ways. And so I feel like we're still very much in this windowed era of spatial computing where everything is still in the same context of those windows. But at some point in the future, in the long range of things, I can see how you actually have these objects, and you might have a flashlight that's emitting light and casting reflections on other objects that are also in the room. And so it becomes less of a portal window into these other applications and more like fully functioning spatial objects within the context of three dimensions.

[00:26:18.208] Brad Lynch: Yeah, I think that was, you know, when Apple first showed off visionOS at last year's WWDC 2023, the biggest thing that current VR people were very skeptical about. It's like, wow, they're only showing 2D panels and everything on top of the spaces. That seems very boring. And like you said, this is a baby-steps thing. XR as a whole is still so baby steps, early days, really, when you really think about it. And we're still figuring out what people are comfortable with when they put on the headset and what they want to do with it. And once people figure out the right methods of... I mean, in visionOS 1.0, there was no way to summon the home screen unless you pressed the button on your headset, and now they've actually switched to, hey, let's attach stuff to our hands and stuff. And I think there will be more progression as that goes on. I think what's good about it is we don't only have SteamVR showing a very bare use case for this. We have Apple going full on in on this. And we have Meta that's also clearly trying to maybe catch up to that use case as well. And I'm sure they're going to share some of their ideas, because they have genuinely different takes on how this should be done, especially with input and stuff like that. So it'll be very exciting, I think, for current VR people that maybe feel a little confused or get whiplash from what's going on in XR. I think it'll all make sense as time goes on.

[00:27:42.375] Kent Bye: Yeah, one of the things that I found myself being a little bit surprised by, coming from mostly covering VR for the last decade, is that within the Apple Vision Pro, when you go into fully immersive mode, you essentially block out the capability of doing anything else. And I know that sometimes I just want to maybe watch a YouTube video or something in the background, and then when you go into immersive mode, everything basically shuts down, and that's the only thing you see. I saw that within visionOS 2.0, they're going to have the ability to allow your Mac Virtual Display to break through that immersion, so at least you can start to blend things together. And I imagine if they do that, then maybe they'll allow other applications, saying, even though I'm in this virtual world, let me bring in a Juno YouTube app to be able to watch a video while I'm doing this other thing. But right now it feels like you're almost getting penalized for going fully immersive, by excluding the capability of doing what visionOS is really built for, which is to have this multi-app, holistically integrated experience. The fully immersive mode kind of eliminates that.

[00:28:46.004] Brad Lynch: Yeah, I totally agree. That's my biggest problem with visionOS right now, what you just explained. It's very painful. I was talking about Demeo on my stream yesterday, which just kind of released on visionOS. And for me, it felt like the worst port, actually, of the app. And I like Demeo. I have it on SteamVR and I have it on Meta Quest. And when I thought of it coming to visionOS, I had a certain ideal of what it would be like taking advantage of what you expect from visionOS. They have two modes: a shared space mode and an immersive mode. Like you said, when I clicked the immersive mode, I didn't want to do it, because I lose everything else that I get in visionOS. And it's like that with every immersive app right now. With shared space mode, all it is is basically a flat version of Demeo with a 3D frame around it. So it's basically like playing the iPad app they had before with just a fancy frame. So it felt like the worst of both worlds, and if I ever wanted to play Demeo, I would not play it on visionOS. And I'm sure this is just one of those things where, in their case, Unity is still catching up to what visionOS allows you to do. I know they made some big announcements for visionOS 2.0 and Unity. But, yeah, right now it was a very bad showing. With every other immersive app, I get the same feeling, where it's like, I actually don't want this full immersion experience if I lose everything else. Which is why I'm guessing it's also a compute thing, because usually when you are making an immersive app, you are actually pushing the chip inside to its fullest for the most part. You're running all these pixels at the full FOV and everything, and they're probably not comfortable with allowing that right now. Hopefully that changes as the compute gets better with future editions and everything. Going back to the developer feature to allow the Mac Virtual Display on top of environments: that was very interesting, because I keep saying how I don't really use my headless MacBook very often, except when I travel. But once they have the extremely ultra-wide version of Mac Virtual Display, and I can do this on top of immersive apps, that might actually push me to using macOS more because of that capability. If I can already open ALVR, stream VRChat from my PC to get the full VRChat immersive experience, and then get that whole productivity of an ultra-wide monitor for my Mac, that's very enticing for me. And it almost makes me worry for the competitors. They'll have to do something like that, I think, because there are a lot of people like me that might find stuff like that very enticing, and it might drag people over. Apple always does this. They always have these features that want to drag people over to other aspects and operating systems that they own. So, yeah, very powerful.

[00:31:38.063] Kent Bye: Well, in terms of input, I know that right now I still have my Lighthouses with SteamVR set up. However, one of my Index controllers is kind of busted, so I can't actually even use it. I've actually been using my Quest 3 because of the controller issues; I just didn't want to have to buy a really antiquated version of the Index controllers. Maybe if Valve came out with a new headset, that might change my mind, and I know that you've been covering that quite closely. However, maybe your opinion on that has changed, because I've watched your live stream and you seem to indicate that you really aren't using any of the other headsets now. You're just using the Apple Vision Pro. But in terms of input, I know some people are using the Joy-Cons that come with their Switch to be able to do input control for something like ALVR. Maybe you have the Index controllers. But what do you use for input when you're actually connected to ALVR, connected to your PC, to be able to get into VRChat through your Apple Vision Pro?

[00:32:35.034] Brad Lynch: Yeah, so I'm aware this is not the best-case scenario, but because of how I mostly use the headset and mostly only spend time in VRChat, I actually use the full finger tracking, and ALVR comes with a bunch of gestures to simulate controls. Let's say you make a fist with your hand and you move your thumb on top of that index finger on your fist: it knows to simulate a joystick. You would never be able to do this for something like a serious VR game, because it's not that good, but for something where you're mostly just walking around with friends, or even just walking up to a group of people and kind of sitting there and chatting for a while in a beautiful environment, that's fine enough for me. But I did recently, literally two days ago, spend some time with a developer friend giving a lot of feedback on a Quest game they're developing. And it was the first time in a while, in these past four months, that I actually went back to using a handheld controller with the full tracking that it includes. And I really did miss that, actually. I really did miss mostly the tracking fidelity. I didn't really miss some of the other stuff with it, but really the tracking fidelity, how fast movements were and things like that, was really great. Going back to your comments about how I feel now about waiting for a Valve headset or whatever: this is something I've never really said publicly, but I think people kind of figured it out based on how I've been using this device. I would love for Valve to release a new device, but I'm not as obsessed with it these days. I'm not relying on them releasing something, because whatever they release, I was always hoping it'd be pushing the high end in terms of what can be made for VR hardware, and really, no one can get away with doing something like that compared to Apple. And this headset kind of showed me that, really. Whereas for now, I'd be quite happy if, frankly, Valve did anything related to VR. They can frankly just keep building out what they're doing with SteamVR Link and everything around that and just support the headsets that are being built. And I think both them and people like me, who have a headset they're comfortable with, will be more than happy. And then whenever they feel comfortable with adding something that they feel is necessary to hardware from a VR HMD standpoint, I'm sure they'll hop in at some point. But yeah.
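
To make the thumb-on-fist gesture concrete, here is an illustrative sketch, and not ALVR's actual code, of how a thumb offset could be mapped to virtual joystick axes. The struct names, coordinate frame, and threshold values are all assumptions made up for the example:

```cpp
// Illustrative thumb-on-fist "virtual joystick" emulation, in the spirit of the
// ALVR gesture Lynch describes. NOT ALVR's implementation; values are assumed.
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

struct Joystick { float x = 0.0f, y = 0.0f; bool active = false; };

// thumbTip and indexKnuckle are assumed to be in the hand's local frame,
// with +x roughly across the knuckles and +z roughly along the fingers.
Joystick EmulateJoystick(const Vec3& thumbTip, const Vec3& indexKnuckle,
                         bool handIsFist) {
    Joystick out;
    if (!handIsFist) return out;      // gesture only counts while the hand is curled

    // Offset of the thumb tip from its "rest" spot on top of the index knuckle.
    float dx = thumbTip.x - indexKnuckle.x;
    float dz = thumbTip.z - indexKnuckle.z;

    const float deadzone = 0.005f;    // metres; assumed value
    const float range    = 0.025f;    // metres of thumb travel for full deflection

    float mag = std::sqrt(dx * dx + dz * dz);
    if (mag < deadzone) return out;   // thumb resting: no stick input

    // Scale the offset into [-1, 1] stick axes, clamped at full deflection.
    float scale = std::min(1.0f, (mag - deadzone) / (range - deadzone)) / mag;
    out.x = std::clamp(dx * scale, -1.0f, 1.0f);
    out.y = std::clamp(dz * scale, -1.0f, 1.0f);
    out.active = true;
    return out;
}
```

The emulated axes would then be reported to SteamVR as if they came from a controller thumbstick, which is roughly how an app like VRChat ends up seeing smooth locomotion from bare hands.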

[00:34:55.660] Kent Bye: Well, I know that Bigscreen, with the Bigscreen Beyond, has really stepped up in terms of providing something that's new and interesting relative to the PC VR enthusiast. A lot of people that I know who are really hardcore VRChat users have gotten the Bigscreen Beyond. Some of them had to return it one or two times to get the IPD right, and some people have issues of accommodating to it for comfort. But for people that have switched over, I could see that that's a whole other strand of the super-lightweight form factor that deepens the level of immersion. For me, it took a while to really find the right hack that I had to do in order to make the comfort even bearable for the Apple Vision Pro. I ended up with some Velcro in the front, and I know you've got the additional band that you've added to it with some 3D-printed elements to do a better job of the weight balancing. But I feel like that's the biggest thing for me in terms of the comfort, and even when I use it for extended amounts of time, the weight distribution still isn't great. I feel like for some people that may come down to the Bigscreen Beyond form factor if they're a hardcore PC VR user, or something like the Apple Vision Pro that gives you the resolution but also has this completely fleshed-out operating system and ecosystem, which is qualitatively different than what something like Bigscreen Beyond, or even Valve if they came out with something, offers: vertically integrated hardware and software across these different platforms of iPadOS, iOS, and macOS, maybe even tvOS at some point. But I think having those things work together creates a value proposition that's different than any of the others, you know, even Meta, or if Valve gets back into the game, or Bigscreen Beyond. So let me hear some of your thoughts on that.

[00:36:42.816] Brad Lynch: Yeah, whenever I think of Valve getting back into this, I think they would have to make something standalone at this point, something that can measure up to the current market. I don't think that they could release just a tethered-only headset in the current age. But at that point, they have to think about what they can offer the everyday consumer. And I think it's like every other company right now. People who are not excited about Vision Pro for the most part probably aren't in the Apple ecosystem. Discounting price, obviously; price is the first thing. But second, if you're not in the Apple ecosystem, it's really a no-go for you at that point. So some people are literally okay with the idea of buying an expensive headset that can do all these things, but they're waiting for their ecosystem of choice to come out. That's why there's a lot of hope for Google and Samsung and what they're working on. So it's all about ecosystem. And I think if Valve had to come out with something, they would have to really figure out a way to bolster not just their VR games and bring those to standalone. I think they would also want to push kind of what they're doing with Steam Deck, because the Steam Deck really is a quite good device for giving people who are already in the ecosystem the mobility of that entire ginormous 2D library. It also does kind of the same for people that are not currently in the Steam ecosystem: people who are used to a console-ized device have that, and then they can get more people into the Steam ecosystem as well. I think they would have to do something like that for VR, and they already have this wonderful overlay system, even though it's, again, not fleshed out, with no documentation on how to build these things. I think if they even built something around that idea... I mean, you're seeing a little bit of a push from Meta; they have the Xbox cloud gaming service, right? I think if Valve did some of that, but really had full control over the operating system and the rendering, they'd do something. For Steam Deck, they have the Gamescope compositor, which is a very low-level thing they built just to have the best gaming experience on a relatively low-powered piece of hardware compared to the expensive PCs most gamers usually buy to play these games. And they've been building out this Gamescope VR session, which kind of hints at the idea that they are working towards doing a spatial computing overlay system and also focusing on playing 2D games in a very novel way on that kind of hardware. We haven't seen it yet. We don't know how far away that is, but that does seem like something that Valve might see an opportunity in someday. But I think they're also a company that realizes that for them to do something like this, to make people want it, they would have to wait for the hardware to really make it worth replacing a person's main computer, or even the Steam Deck, because now they might cannibalize what they're doing with Steam Deck. They have a really nice screen on the OLED version now. So I think they're letting Apple and some of these companies really push the components industry to bolster it and let that stuff come down in price as time goes on. They're building stuff in secret. That's how they'll bolster their ecosystem as the time comes. And I think eventually we will get something. It's just that those two things have to align with each other.

[00:39:59.181] Kent Bye: Okay. I know that when I watched your live stream, you made the comment that you really aren't playing any other VR games; it's mostly just VRChat. Maybe you could comment on what it was, over time, that shifted your attention away from the gaming, agency, and exploration side. VRChat seems to be satisfying a lot of your needs for hanging out, the social dimensions of VRChat, and also the worlds that they have. And yeah, just what shifted or changed over time, such that now, with the Apple Vision Pro, you have the whole frontier of spatial computing, which is in some ways a little bit more enterprise-oriented in terms of productivity and going into the next phase of what it means to have the future of spatial computing. You seem to be much more interested in that than in what's happening in the gaming space.

[00:40:44.707] Brad Lynch: Yeah, I guess one reason I still like VRChat is that I feel like everything that's made in VRChat very much caters to the enthusiast. The creators who are making worlds or avatars or anything in that platform are not thinking, oh, we're trying to target the most common-denominator gaming enthusiast, right? They're doing things that they think are weird or interesting or that actually bolster the use of XR. And as time went on... I mean, I feel the gaming industry in VR, very small as it is, has to target that common-denominator gamer to make a profit. And I think what makes VR games good is actually not compatible with the traditional gaming mindset, and that's shown in 90% of the games that have launched in the past few years. Every time something gets extremely hyped up, like an amazing VR game, I go in to try it, and I don't find the things in it that really prove to me that it should be a VR game over something that I could just play in a window, even in 3D or something, most of the time. It doesn't take advantage of that. I would say it's just the market itself and the way things are less experimental in the actual game products being made and sold. I get it. They have to make money. I get it. That's what they're trying to do. But I have no interest anymore in that.

[00:42:23.416] Kent Bye: Yeah, it's very interesting to hear you make those comments, especially in light of how VRChat announced this past Wednesday that they're laying off 30% of their staff. They've raised like $95 million in VC funding, and so they've been building everything they've built with that VC money, but they're still at the point where they have to figure out how to translate this kind of avant-garde, independent, experimental culture, which is very gift-economy driven a lot of the time, where people are just creating for the sake of creating and trying to push their own creative expression with their identity, their worlds, the avatars that they're creating, and the different events and music. So it's certainly cultivated an amazing culture, but I feel like it has one foot in this new paradigm of a utopian gift economy, and yet there are the financial realities of all that, of how they actually turn what they have into monetization and a creator economy and everything. I just published a whole conversation diving into it with some VRChat creators in the community, kind of unpacking it. But let me hear some of your thoughts, especially because you're at the bleeding edge of looking at virtual culture from the context of not only the hardware side, but also the gaming side, and also what's happening within VRChat and what's unique there. How are they going to close the gap between the realities of how to actually sustain what they've got going on?

[00:43:46.994] Brad Lynch: Yeah, I always get in trouble for making comments about how I feel about certain booms that the VR industry has had, because I always have a very different view of it, I feel. So there was obviously the Quest 2 boom with COVID, and then the metaverse boom right after. And you go back to that, and people in XR thought this was great. It's so great. We're going to have all this funding during this time, and VR is going to grow exponentially from this point on. And I, back then, felt it was a little problematic, actually, having all this money rushing in for something that wasn't really proven and still isn't proven. It's sad that I have to always say this, but I clearly love XR. I clearly love everything about what is happening in the industry as it goes on. I think these big giant booms, and these CEOs that come out and always say, hey, next year or even five years from now, everyone's going to be doing this, actually do more destruction to the industry than them just being honest and saying, hey, this is a long-haul effort. This is something where we don't know how long it's going to take for everyone to want to wear a headset. I don't know when your grandmother is going to put on a headset like they use their smartphone every day. And even VRChat, they did their last raise during the metaverse boom, and if you look at how much they got compared to their previous raises, it was pretty clear that they probably raised too much, frankly. And there are a lot of companies that probably raised too much. So it was always inevitable that they were probably not going to make everyone that put money into them happy after the metaverse boom. You see a lot of people in XR who are raising for startups saying there's no VC interest in AR or VR at this point, and they always mark it as, this is terrible or whatever. But this is kind of the reality. I feel like there are more failure stories in XR than there are actually success stories among these companies and game-industry efforts happening in VR. So I knew it was coming eventually. I'm sad for the people who lost their jobs, and not just at VRChat but across the entire industry over the past year, because it's not just VRChat. A lot of it. Sony's been laying off people at their VR studios and stuff before this; people were talking about it a month or two ago. It's just that reality is here. The only reason why it seems painful is because reality was distorted a couple of years ago.

[00:46:15.202] Kent Bye: Well, so I know that you've been using the Apple Vision Pro for eight hours a day for the past hundred-plus days since February 2nd. And I've been waiting for some sort of big review video, or some sort of what-does-Brad-think-about-the-Apple-Vision-Pro video. Maybe you could just elaborate on what happened with that. I know you've been doing some live streams, and maybe switching over to more of a livestream format, but those seem to be very much reactive to what your audience is interested in, which is really everything within XR. So I haven't seen one dedicated video solely digging into everything in the Apple Vision Pro, which is part of the reason why I wanted to talk to you today. But I'd love to hear you maybe elaborate on that.

[00:46:55.580] Brad Lynch: Yeah, I know I told you in private that I actually did have a whole video that I felt was kind of a definitive review, and I ended up throwing it away. I might post some clips here and there of what it was someday, but I just didn't like it, and I put a lot of effort into it. Again, I was talking about how I use the headless MacBook when I travel and how powerful that kind of feels. And you're seeing both Meta and Apple show the use case of even just sitting in an airplane and having the virtual screens and a virtual cinema there. There's a lot of magic to that, and it finally makes the standalone headset use case make a lot of sense to me. But when I did the first live streams of me using Vision Pro and my comments about it, which were not reviews in any sort of way, just live streams of me candidly talking about how I feel about it and the things going on, I knew there was a huge disparity between how I felt and the outside view, and a lot of my community just didn't really get it, right? So it's not that I felt like I couldn't release something, because I don't actually care that much about what the XR community thinks or says about it, but I'm totally aware that this is a $3,500 product that is not really a consumer product at all, really, when you get down to things. And I think to really get the usage out of that price, you have to be a weirdo like me who is spending a lot of time in it. And I know there aren't that many people out there who do that. So I keep struggling with this responsibility I've put myself in, where I'm very cynical about any headset I review. I mean, I usually err on the side of negativity. For the Bigscreen Beyond, it was more positive, actually; it was in the middle. I said I liked it a lot, but I always give a lot of negative points for each piece of hardware. I gave PSVR 2 a pretty low review at the time compared to others. I gave Quest 3 a mid review and said a lot of bad things about it. But for Vision Pro, when I talk about it, I have feelings about it that are perfect for everything I want from XR, but it's still so far from me being able to say to people that you should buy it that I don't feel comfortable even releasing a review, like a standard review, for the most part. Because if I say all these good things, that is going to push the everyday, even current everyday VR user, probably to buy it. And I just don't think they should. Even if I disclaimer that, I just, I don't know. It's really weird, because I've been using it, I've been tweeting about it, I've been showing all the use cases, and I still have people coming to me saying, Brad, I bought the Vision Pro because of all your tweets. And that makes me so uncomfortable. That makes me genuinely uncomfortable, because I know where the technology is at. This is the best VR headset on the market by far, but it's still not there yet. And I don't want people spending money on it. So, yeah.

[00:49:54.827] Kent Bye: Well, I think for me, it comes down to the use case and the context where people are really going to find themselves using it. For me, I'm in the XR industry, and if I wasn't in the XR industry covering it, I probably wouldn't have bought it. The retention that I have with it is very much driven by apps that I'm testing out, that are interesting, that are coming across my radar, or that film festivals are sending me as a build. So I'm glad that I have it, but I'm not using it day in and day out like you are. And I feel like in the industry, there's the first-impressions video that people do, and then for people like yourself, I know that you like to spend an extended amount of time with a piece of hardware before you give it a proper review. But then the question is, what is the audience of that review? So it seems like that's a little bit of what you've been dealing with. What I would love to hear from you, and maybe you can answer now or in a video, is this: I feel like there's something about the Apple Vision Pro that's changing your relationship to technology. Because you're so dedicated to it, you're really tinkering and testing it out and finding things like increased focus, deeper flow states, allowing yourself to go into these immersive worlds and be immersed into a beautiful world in VRChat, but at the same time bringing in these other contexts. These are things that are really driven by your workflow, whether you're managing a community and doing Discord messages, writing email, or editing a video while you're immersed in a VRChat world. It could be that you're the type of person who really enjoys being on the bleeding edge of blending these different levels of reality together, and that you're a sign of things to come, where everybody in the world will be doing something like what you're doing right now. Or you are this self-proclaimed weirdo, this is a temporary thing, and the Apple Vision Pro doesn't find its core audience or its core use case that is going to drive people to buy it and adopt it like you've adopted it.

[00:51:56.812] Brad Lynch: Right. I think I should say something that's kind of grim, and I don't mean to make anyone sad if they're fans of me and listening to this, but I think if I hadn't gotten something like the Vision Pro within, let's say, a two- or three-year time span, I probably would have become so cynical about XR that I would have been too exhausted to cover it anymore in general. Because I was very excited about the Quest Pro, actually, very excited. I was like, wow, Meta is going to put in all their research and development and release something that is teetering on the edge of hardware that might get us closer to the point where everyone might want to use it. And when it released... Yeah, I usually do spend a lot of time; I usually spend at least a month before I release a review. Every headset I've done in the past three years, it was always one month later. Quest Pro was more like a month and two weeks later. But anyway, that headset didn't do it for me. It actually made me very sad that this was the best that the biggest company in XR could release at that price point in terms of hardware and software. So that hurt me quite a lot personally, as someone who loves the bleeding edge of VR hardware. And then they released the Quest 3, and now they're targeting the Quest 3S. We haven't gotten anything from Valve in the past five years now. They always wanted to push the high end more, at least try to, with what they could do, but we haven't heard from them for a while. And then, of course, we had the Bigscreen Beyond, which was actually pretty good too. Again, I liked the Bigscreen Beyond; I like what they're doing. But they're also targeting a niche of removing features to get a super light headset, which is great for some things but still doesn't solve any of the problems that I was hoping to see XR solve. And then the Vision Pro came out, and it did a lot of the stuff that I wanted an XR headset to do. It had a lot of the components that I was hoping an XR headset company would fund and put in, and they did it very well, very nicely. It didn't have everything I wanted, but it still marked off so many things, and it has kind of changed my relationship with technology. I never thought that an Apple product would make me interested in XR again, because I was not an Apple user before this at all. It's very weird and makes me feel a little gross in some ways. But yeah, if it hadn't come out with this extremely high-end, "I don't give a crap if everyone buys it" sort of mentality, then I probably just would have dipped out of XR for a bit and waited for something like that to come. So it's very important for me personally.

[00:54:43.433] Kent Bye: Yeah, and there are some dark horses that might be entering in. There was a post you made on X that Palmer Luckey responded to, saying he's going to be announcing a whole new headset at AWE this Tuesday or Wednesday. So that's exciting. Palmer Luckey is obviously the founder of Oculus, which sold to Facebook slash Meta, and he's now at Anduril. When I did an interview with him, he said he was still working on different stuff. So I'd imagine it might be more enterprise-grade, more on a Varjo scale, or maybe something else. I don't suspect it would be too enthusiast-focused, but I don't know, maybe I'll be surprised by whatever it is.

[00:55:18.817] Brad Lynch: I would be pretty disappointed if it wasn't enthusiast-focused, because ever since he got let go, he has always said publicly that he believed Meta should push the enthusiast end. So I think he should put his money where his mouth is in that regard. But I like Palmer. It is funny that he replied to that tweet to announce it, but he likes to dip into my DMs sometimes and talk about grievances with the current industry and stuff. We'll see what he does, but I think it's pretty cool to see him involved in something related to that. He's also a big proponent of something that I am also a very big proponent of in VR hardware, and that's a compute puck. He's always been very positive about that concept, and I am very much too, because I want to throw as much compute as possible at VR if we're going to go toward this mobile standalone use case. So, yeah.

[00:56:12.702] Kent Bye: The session he's going to be presenting at AWE is with Darshan Shankar. And I don't know if they're doing something together with Bigscreen Beyond, or if they just happen to both be talking about the more non-big-tech-company HMD approaches that were grown more organically. Bigscreen Beyond did a great job of working with different enthusiasts and hardcore users like yourself, ThrillSeeker, and other folks who were really at the core of the VR enthusiast influencers and users. So it sounds like they really found a market for that. And I'd be very curious to see that session this coming week and see where it goes. But yeah, I'll be tuning in.

[00:56:51.533] Brad Lynch: Yeah. One thing I would like to bring up is that I remember when Darshan first brought me in to check out the Beyond, and they brought me in a couple of times throughout a whole year of trying different prototypes and giving feedback. Even near the end, when I was trying the device and talking to him about it, I asked him how many he thought he was going to sell. And when he told me, I didn't really believe it would work out that way, even though I was very positive with him about the device and the niche he was targeting; I didn't think it was going to be as successful. But he's proven even me wrong on that. They've done very well for themselves, better than I thought they were going to do. So yeah, I think it would be cool if they did partner up, because even though Bigscreen doesn't do everything, they're targeting what a small company can do. And I think that's, again, like you said, very organic and very good for the industry. So, yeah.

[00:57:45.306] Kent Bye: Yeah. So, TBD at AWE; I'll be tuning in this week to see. Well, I did want to ask you: you mentioned ALVR and Moonlight. Are there any other killer apps or other apps that you find yourself using a lot within the Apple Vision Pro ecosystem?

[00:58:00.271] Brad Lynch: Oh, man, that's about it right now. It's definitely just tapping into my PC for the main stream, and then always having music in the background or Discord off to the side. And I have been doing a lot of video calls with the Persona feature. I actually found that to be pretty nice for that use case, better than I thought it was going to be. But it's very small things. Maybe there's some nostalgia there, because a lot of the apps that are releasing directly for visionOS feel like some of the apps that I tried in 2016. Some of them are very weird and experimental and very small-scale, and I'm fine with that right now. But yeah, it's mostly stuff I would normally do on my PC, but all in a spatial environment with a very relaxing overlay and/or music or a video going in the background while I do my stuff. So yeah.

[00:58:50.836] Kent Bye: Nice. Well, I know that you've downloaded the visionOS 2.0 beta. I have not, mostly because I read online that it's not an insignificant thing to roll back to visionOS 1.2, which is the latest release version. I don't know if you have to do a factory reset or take it into an Apple Store to get it reset, but I'm on the hook to be trying some different apps, and I didn't want to try them on an operating system that's going to crash the app, because I feel like that would not do justice to the people who are sending me stuff to check out. But what have been some of your early takes on visionOS 2.0?

[00:59:28.684] Brad Lynch: It's mostly consumer-facing stuff, a lot of quality-of-life things that are very, very nice. It doesn't feel too much different, but the things that they did improve matter. Really, the two biggest improvements for me: one is the video players. Not just Safari, but a lot of apps can now tap into the whole cinema thing for any environment, with all the reflections going on. That's nice. And alongside that, there's what they keep calling a recliner mode. Before, if you recentered the headset while you were laid back in a recliner, it wouldn't put the giant player up in the sky or respect that rotational value. So that's great. Another big thing they added that's not really talked about is a 3DOF fallback. So if you're using the headset from daylight into the night and there are no lights anymore for the camera tracking, it will now automatically fall back to 3DOF tracking if you're doing something very simple, which is just nice to have handled for you until you turn the lights on, and then it automatically snaps back. All the other stuff that I've been seeing from visionOS and kind of testing is quality of life, but there are a lot of APIs being added that I think are more what VR and XR enthusiasts want; some of those newer APIs will really let developers tap into things. I saw some stuff about being able to do low-level shaders and everything, which will be very big, because there were a lot of limits and problems. I remember I talked about Demeo earlier and their limitations in building a good visionOS app; they did a whole keynote in November last year talking about how problematic the lack of shader support was. But both Unity and some integrated tools are getting it. I think some of the native apps are not just going to look like flat apps anymore. I think we're going to start seeing more interesting stuff as time goes on, thanks to visionOS 2.0.

[01:01:27.466] Kent Bye: And that API is more of a low-level mesh API, so it's like taking meshes and modulating them with code, rather than any shader language that they're exposing yet. But when I was looking at it, I was actually quite curious about some of those API changes. And one of the impressions I got from watching a lot of the videos was this: they have macOS that's been around forever, then iOS and iPadOS, and visionOS is building upon a lot of the same APIs as iOS and iPadOS. I actually scraped their documentation pages because I was quite curious to see, OK, what are the things that are completely unique to visionOS, and to what degree are unique things from visionOS being backported into the other platforms, with more and more of the Compositor Services and RealityKit and other things that were maybe catalyzed by the path toward visionOS. But when I looked at it, I realized that a lot of that stuff they've been slowly adding over time, and there were actually very few new APIs from visionOS being backported, just because they already had so many of those different APIs that they're building upon from iOS and iPadOS. It's more that they're bringing all of those really refined 2D operating system APIs into spatial computing and slowly adding the spatial pieces, rather than visionOS flowing backwards into the other platforms. But when I was watching some of the videos, I thought, they are really building more and more game engine-like components into their core operating system, so that ideally they wouldn't even need something like Unity, because at some point you'll be able to just use their frameworks to create that kind of game engine-like experience. And I was like, wow, at a low level they're really architecting something that is going to have a long-lasting effect on how we build spatial computing apps in the future.
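
To make the kind of cross-platform API comparison described above concrete, here is a minimal TypeScript sketch of the exercise. The framework names and platform lists are hand-entered and only approximate (a real pass would pull availability from each framework's page on developer.apple.com rather than hard-coding it), so treat this as an illustration of the idea, not an accurate inventory.

```typescript
// Illustrative sketch of comparing Apple framework availability across platforms.
// The availability lists below are approximate, hand-entered examples, not scraped data.

type Platform = "iOS" | "iPadOS" | "macOS" | "tvOS" | "visionOS";

const frameworks: Record<string, Platform[]> = {
  SwiftUI: ["iOS", "iPadOS", "macOS", "tvOS", "visionOS"],
  RealityKit: ["iOS", "iPadOS", "macOS", "visionOS"],
  ARKit: ["iOS", "iPadOS", "visionOS"],
  CompositorServices: ["visionOS"], // Metal rendering for immersive spaces
};

// Split the list into frameworks unique to visionOS versus shared with older platforms.
const visionOnly: string[] = [];
const shared: string[] = [];
for (const [name, platforms] of Object.entries(frameworks)) {
  if (platforms.length === 1 && platforms[0] === "visionOS") {
    visionOnly.push(name);
  } else {
    shared.push(name);
  }
}

console.log("Unique to visionOS:", visionOnly);
console.log("Shared with older platforms:", shared);
```

The observation Kent describes, that most of visionOS is shared surface area with the older platforms rather than the other way around, falls out of exactly this kind of tally.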

[01:03:23.331] Brad Lynch: It's really funny that you say that, because I was literally talking to a friend of mine today who's a very low-level Unity-type dev, and that's literally what they said: it's like Apple is creating their own Unity with Reality Composer Pro right now, basically, with all the low-level stuff. So it's funny that you say that, because some of my friends are feeling that way too. But yeah, again, I'm very happy with visionOS 2.0 and what came out of it. My biggest complaint is that they're not giving access to the eye and face tracking data, so I can't do eye and face tracking in VRChat, for example. They're not allowing me to send that data over, which is something you can do on the Quest Pro wirelessly, thanks to Steam Link and some other apps. So that's my one big, brutal thumbs-down right now, that no one can access that stuff. But all the other little stuff has been really nice. It's actually kind of crazy when you consider that visionOS 1.0 technically only came out four months ago. Obviously they've been working on visionOS 2.0 in the back end for a while, longer than four months, but when you compare it against that timescale, it feels like kind of a lot. So, yeah.

[01:04:30.726] Kent Bye: Yeah, usually 2.0 jumps are on kind of an annual cycle. They showed everything last year at WWDC and launched it, but the 1.0 version was already maybe three-quarters of the way through the development cycle. A lot of people are saying that visionOS 2.0 is kind of all the features that were delayed from getting into 1.0, so it does feel like it's still kind of in the middle, like technically a 1.3. But there are some things like the enterprise APIs for getting access to the camera. A lot of people who are developing more augmented reality experiences want that raw camera access to be able to do all sorts of things for augmented reality, and you'll have to get enterprise entitlements for it. And even if you do, you're not going to be able to release that as a consumer app; it's only available inside enterprises, to be released internally. And the thing you're talking about, the eye tracking and the face tracking, comes down to a lot of their privacy architecture. For me, one of the biggest announcements was that WebXR is coming with visionOS 2.0, without any feature flags. But again, they have a number of different ways that they're trying to architect the privacy so that apps aren't tracking your eyes. Developers on the other side only see something when you take an action: when you click, they'll know what you clicked on, but they don't know where you've been looking. So I think there are certain things where the privacy architecture they're designing has been a big part of holding back some of these features that you want for VRChat integrations.
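
As a rough illustration of that click-gated input model: in WebXR on visionOS, input shows up as transient pointer sources that only exist while the user is pinching, so a page never receives a continuous gaze ray. The following is a minimal TypeScript sketch under that assumption; it presumes WebXR DOM typings (e.g. @types/webxr) and is not an official Apple sample.

```typescript
// Minimal WebXR sketch of click-gated input: the page only learns where the user
// was looking at the moment they pinch to select, never as a continuous gaze stream.
// Assumes WebXR type definitions (e.g. @types/webxr) are available.

async function startSession(): Promise<void> {
  const session = await navigator.xr!.requestSession("immersive-vr");
  const refSpace = await session.requestReferenceSpace("local");

  // selectstart fires when the user pinches; the transient input source's
  // target ray points where they were looking at that instant.
  session.addEventListener("selectstart", (event: XRInputSourceEvent) => {
    const pose = event.frame.getPose(event.inputSource.targetRaySpace, refSpace);
    if (pose) {
      const { x, y, z } = pose.transform.position;
      console.log(`Select at target ray origin (${x}, ${y}, ${z})`);
    }
  });

  // Between pinches, inputSources is typically empty, which is the privacy
  // property being discussed: no hover or gaze data is exposed to the page.
  session.requestAnimationFrame(function onFrame(_time, frame) {
    frame.session.requestAnimationFrame(onFrame);
  });
}
```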

[01:06:07.732] Brad Lynch: Yeah. And I think all of this is just great, because it really is pushing competition. Even though I don't feel like Apple cares too much about whether they're selling millions of these things, and they don't act like they do, even in public comments they keep talking about enterprise more than consumer at this point, what they're doing is pushing companies like Meta to really push on the things they haven't been doing. Like the direct camera access, right? Meta has given no one access to that, forever. And most recently, the Meta developers account on Twitter, I think, hinted that they're going to have announcements related to that. So I can imagine a future where Meta, to one-up Apple, is going to ship some sort of framework for consumers and developers to access the cameras for consumer applications faster. And then the competitive scale will keep tipping back and forth. And I think that's pretty cool.

[01:07:06.934] Kent Bye: And it sounds like some of the really big features that I imagine a lot of people are going to want to actually use are not going to be launching until later in the fall anyway, like the Mac Virtual Display enhancement that allows you to have an ultra-wide screen. And there are a bunch of APIs that won't really show their value until more applications are actually using them, like the whole set of multi-screen APIs that were developed; you started to see a little bit of that with the NBA app, but now it's being built in as a core feature. They announced iPhone mirroring for macOS; however, you're not going to be able to mirror your iPhone on visionOS. You're only going to be able to use it as an AirPlay receiver, so you'll be able to see it, but you'll still probably have to have the phone in your hand and interact with it in order to engage, because you'll be able to receive the data coming from your phone but you won't be able to send input back, like you can with the iPhone mirroring that they're shipping on macOS. There are some features like that. Also, there was so much focus on AI and all the different AI integrations that are going to be happening with the future iterations of their operating systems, with macOS, iPadOS, and iOS, but almost literally none of that is coming to visionOS. There was a comment on Reddit saying that they're already driving all of the different chips on the device to their limit, and so we might have to wait for the Apple Vision Pro 2 to start getting more of the AI features that are launching with the broader iOS ecosystem.

[01:08:32.766] Brad Lynch: Yep. And, you know, that's kind of the crazy thing about Apple, and why I'm more comfortable with them being the only ones releasing a product of this grade right now: I'm aware that only they can probably push some of these things. They have the whole silicon side that they can spin up and borrow from their other products. And they have so much pull, especially over the display industry. I'm so excited for them to keep pushing even the display in the Vision Pro, because frankly the display's brightness is still so far from what I want it to be, which is why you get some of that motion blur when you move your head and stuff. And I'm confident Apple is going to actually push that over time with their suppliers, and they're probably the only company that can. So, yeah, it's fun to watch it very slowly evolve. Very happy times for me right now in XR.

[01:09:27.054] Kent Bye: Nice. Yeah, well, I guess as we start to wrap up, I'm very curious to hear what you think the ultimate potential of spatial computing might be and what it might be able to enable.

[01:09:38.201] Brad Lynch: Hmm. I think really... I don't know, people really judge the term "spatial computing" and everything like that. But before it was known as spatial computing to the mass audience, I was obsessed with the SteamVR overlays and persistent applications across AR and VR, and truly making XR into a multi-application computer. And I think that really is all it takes for the ultimate use case of VR. Because the difference between using a computer and an XR headset is that on a computer, you're always, for the most part, going to be limited to screen space and 2D applications, whereas in XR, if we go multi-application, those apps can also be 2D, volumetric, and all that stuff, all at once. And I think that really is the magical thing that the entire XR industry should be pushing for. So, yeah.

[01:10:37.507] Kent Bye: Nice. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[01:10:43.676] Brad Lynch: Um... Thank you for all the support. I just wanted to mention that I have had a lot of support over the years, and it's kind of crazy because, as you hinted at earlier, what is the audience for someone making a super deep review of a headset one month later, and for the things I do? The audience is very small, just like a lot of things in XR. The audience is very small and unsustainable, but the enthusiasts out there have been very kind to me over the years, and I'm very thankful. So again, thank you. I'm a weirdo, and thank you for supporting my weirdness. So, yeah.

[01:11:27.717] Kent Bye: Awesome. Well, Brad, you've got your YouTube channels, SadlyItsBradley and also GladlyItsBradley for a little bit shorter takes, plus your live streams on SadlyItsBradley. And I know you've been posting on your Twitter slash X account and also your Discord. You've been really diving deep into all these things, and as far as I know, you're probably at or near the top of the leaderboard of people who've used the Apple Vision Pro the most. So I really appreciate hearing a little bit more about how you've been using it, mostly as a portal into your PC VR, and to do all these other things. I'm very curious to see where it all goes in the future, and thanks for taking the time to share a little more insight into how you've been using the Apple Vision Pro since it launched in February. So thank you.

[01:12:11.534] Brad Lynch: No, I'm glad I could come on here. I think if people want a review of the Vision Pro, I'll just point them to this podcast and be like, hey, I think I cover it mostly here. So, yeah.

[01:12:26.440] Kent Bye: Awesome. Well, thanks so much. So that was Brad Lynch. He's got a YouTube channel called SadlyItsBradley. So I have a number of different takeaways about this interview. First of all, I was surprised to hear that most of what Brad is doing is streaming over to his PC VR, where he's going into VRChat and using the SteamVR overlays: a lot of XSOverlay; VRHandsFrame, which allows him to make a photo gesture with his hands to take pictures, read QR codes, and do translations; but also OVR Advanced Settings and some other SteamVR overlay applications, like the ones from Rectus, that bring spatial objects into the environment and more fully embody this idea of a 3D spatial application. So he's really pushing on this multi-application use case. Back in October of 2020, I did an interview with the PlutoVR co-founders Forest Gibson and Jared Cheshire, where they were exploring this idea of the SteamVR overlays; they were bringing in WebXR and really prototyping some of the early experiences of this multi-application environment. I think the Apple ecosystem is really well-suited to pull all these things together. When I look at Apple's APIs across all the different platforms, including iOS, iPadOS, macOS, tvOS, and visionOS, a lot of the APIs are shared across the platforms, and there are some that are completely unique to visionOS. But I think over time, even those that are unique are going to percolate throughout the entirety of their platforms. So there is this cross-compatibility where they're slowly integrating more and more of these spatial components, and adding more and more game engine-like components into their operating system, so that at some point you could potentially create a fully interactive immersive experience with just their native operating system. They're shipping TabletopKit in the next batch of visionOS 2.0, which is going to enable all sorts of different types of tabletop games that, again, are more socially driven, allowing you to blend mixed reality components with Personas coming in, so you can have a game night through augmented reality, which is a lot of what Tilt Five has been trying to build with their physical AR setup for people to come over and play games centered around gathering face-to-face. It was also really quite interesting for me to hear this split between what Brad is exploring, with his deep dive into spatial computing and screen replacement, and the more agency-driven VR experiences that really require hand-tracked controllers and input, which he's doing a bit less of. He's doing more of VRChat, hanging out in social spaces, and bringing in his SteamVR overlays. It sounds like for him, the most compelling thing is to be fully immersed in some of these experiences and to bring in other elements of context and monitor sharing, with a very specific monitor size. I think he actually misspoke, and I got a bit of clarification afterward: he was talking about the ultra-wide monitors that are usually 21:9, and he actually wanted something that was more like 16:10, an aspect ratio that he couldn't even get in a physical monitor.
He's able to have a virtual monitor that really suits his own workflow for what he wants to do within the context of these mixed reality environments, mixed reality in the sense that he's immersed into these VRChat worlds while bringing in these other applications through different overlays, including XSOverlay, VRHandsFrame, and OVR Advanced Settings. The ALVR visionOS app is still in TestFlight, and it's building on a lot of the Apple APIs to more directly connect into these different SteamVR experiences. Specifically, VRChat is a big driver for people who have the Apple Vision Pro and want a higher-fidelity experience, potentially even with hand tracking, where you see your body in more of a mixed reality context but see your full virtual avatar when looking into a mirror, just to increase the level of the virtual body ownership illusion and to feel more fully embodied in these social VR experiences. However, the facial tracking and the eye tracking are not likely to be opened up anytime soon for use within third-party applications, just because all of that is really locked down for privacy reasons. So there is this ecosystem element to the Apple Vision Pro, where the more you have something like an iPad or iPhone, and specifically a MacBook Pro to do Mac Virtual Display, or potentially even tvOS, the more of these integrations become compelling. But Brad is mostly coming from PC VR, so he's not necessarily using all these different integrations just yet; he's mostly portaling into his PC VR to do all the different types of computing applications at a high enough resolution and at the quality bar that he's really looking for. So for me, I still don't know if this type of hardcore PC VR enthusiast use case is something that is necessarily going to branch out to the general public. And I think that's a challenge that Brad sees: he is going so deep into a very small niche, a niche of a niche, that in creating content around it, it's been a little bit difficult to know how to explain or cover all the things that would make up a proper review of the Apple Vision Pro, and who the target audience would even be, which may be slightly different from his core audience of mostly PC VR gaming enthusiasts or other people looking at the highest end of whatever is out there. For Brad, this is the best VR headset out there, and it suits the different needs that he has, though there is still some stuff, like the face tracking and eye tracking, that isn't quite there. And for Meta and the Quest Pro, they didn't bring all the things he was really looking for in terms of higher resolution and everything else; you can go check out his review to get a lot more details on that. So I'll be curious to see how Brad continues to cover all this stuff. I would personally like to see more of the specific use cases and ways of working that he really finds compelling. He's been mostly posting that onto his X, formerly Twitter, account, where he's sharing little clips documenting the different scenes and moments where he's using all these different overlays.
And he might be releasing some of the review video that he never published, which has a few more of his clips of using his headless MacBook Pro in conjunction with his Apple Vision Pro while traveling, so more on the mobile ways that he's using it. My understanding is that he's also just enjoying being able to move around his house without always being tethered to where his monitor is, having both his PC mouse and keyboard with him and being able to move around and be in lots of different contexts for how he's interfacing with his PC, mostly through the application of Moonlight, which is another open-source project that gives him remote access to his PC. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com/voicesofvr. Thanks for listening.
