#1481: Agile Lens 2024: Context on Boz’s Apology to VR Devs, Orion AR Glasses Impressions, & Unreal Engine Mediation with Apple/Meta

I interviewed Alex Coulombe, Studio Lead & CEO of Agile Lens Immersive Design, at Meta Connect 2024 about his first impressions of Meta's Orion AR glasses, his role as a mediator between Epic Games and both Apple and Meta, and a bit more context as to why Meta CTO Andrew "Boz" Bosworth opened the developer keynote with an apology to developers. See more context in the rough transcript below. Also be sure to read Coulombe's Orion hands-on impressions in his UploadVR article.

Here’s the Developer Keynote where Boz gave an apology to developers:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different interviews that I did at Meta Connect 2024, and doing this deep dive into my backlog of previously unpublished interviews, today's episode is with Alex Coulombe, who's the studio lead and CEO of Agile Lens Immersive Design. And I wanted to go back into my backlog just to kind of give a little bit more context for all these other projects that he's been working on. Because essentially what happened was Alex was able to bring some of Meta's executives to Austin, Texas, showing them this high-end, photorealistic, larger-than-room-scale experience where you're essentially able to walk around and get a bit of a guided tour of these condos that had not yet been built. So this actually eventually led to Alex getting an invite, coming to Meta Connect, and also having a chance to see the Orion demo. He did a whole write-up for UploadVR, diving much more into his first impressions and everything. I wanted to catch up with Alex just to hear a little bit more around not only some of his first impressions of the Orion AR glasses demo, but more than anything to get a little more context for all the other things that he's working on lately with Unreal Engine, and kind of being what he describes as this marriage counselor between Apple and Epic Games, as well as between Meta and Epic Games, to be able to have better support for Unreal Engine for all these different XR devices, from not only the Apple Vision Pro, but also all the different platforms on the Quest. And so there's also this apology that Boz made to developers at the very beginning of the developer keynote, because of the constantly shifting platform. He basically said that it hasn't been easy to develop for the Quest platform over the years. So I was very curious to hear from a developer's perspective some examples of what that might mean in terms of development: either lack of documentation, or not-great support for being able to work with Unreal Engine from the Meta side, and also just generally features that were not fully baked or not working correctly, and then having to have all these alternative off-ramps to use completely different technologies, where something like Meta's spatial anchors could theoretically work, but they needed to go to something completely different in order to do what their specific enterprise use case was demanding. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Alex happened on Thursday, September 26, 2024, at Meta Connect in Menlo Park, California. So with that, let's go ahead and dive right in.

[00:02:44.195] Alex Coulombe: Hi, Kent. My name's Alex Coulombe. I run an XR creative studio in New York City called Agile Lens Immersive Design. And I've been at this crazy, crazy career for a little over 12 years now.

[00:02:54.021] Kent Bye: Great. Maybe you could give a bit more context as to your background and your journey into the space.

[00:02:58.740] Alex Coulombe: Absolutely. So I came from the world of architecture and theater over at Syracuse University, and I took a very strong interest in utilizing emerging technology and real-time engines from a pretty early stage, all the way back to modding Half-Life and whatnot to get my building designs working in there. I used some early, now-defunct game engines, one called Aspirian, one called, appropriately enough, Quest 3D, back in the 2000s. And then when the Oculus Rift DK1 Kickstarter came out, they were very game-focused. But as an architect who was struggling to convey design intent to people who weren't necessarily construction-document literate, it immediately became apparent to me that if we could put people inside a VR headset and show them these designs we were working on, particularly for theaters, which is what I was specializing in, we could not only show them what it felt like to be sitting in different seats, looking at different design options, but also show them different options of what could be on stage and start to pre-visualize performance. And so that was in 2013, and then it just kind of expanded from there.

[00:04:00.107] Kent Bye: So I'm going to go back into my backlog archive and air this as the third in a trilogy, because we've had three conversations, two of which have not been published yet, but I'm going to put them all out together. So we had a conversation back at Tribeca in 2023, where we were talking a lot about Unreal Engine and all the latest stuff that you were working on at that point. You mentioned some of the stuff that you were working on with the Four Seasons project. And then at FilmGate International, I had a chance to actually see this Four Seasons project, which was a really great location-based experience for the very specific purpose of selling condos that had not been built yet. And then we also talked a lot about the Christmas Carol project that you have as a kind of ongoing R&D project to prototype all the latest technologies that you've been working on. And now, a little under a year later, we're here at Meta Connect 2024, and some of those projects that we talked about before are now a reason why you're here this time. So maybe you could just set the context for what happened for you to have plans to go to Unreal Fest, but not even knowing if you were coming to Meta Connect, to then coming to Meta Connect and being able to actually see some of the hottest-ticket demos and everything. So I'd love to hear how you tell that story.

[00:05:04.491] Alex Coulombe: Yeah, and so just to clarify, I knew I was going to Unreal Fest. The travel plans I had for this section of the year have, for a while, been built around giving some talks at Unreal Fest about augmented reality and one of their templates, as well as bringing Unity developers over to the Unreal Engine ecosystem. And Epic's always been very good to me and my company, and they take care of all of our travel and reimbursements and all that. So I knew I was going to that, and then I knew that we'd be showing off a Star Wars experience that we'd made, based on the Galactic Starcruiser immersive theater experience, at an event called Halcyon Happening. So I already had quite a few travel plans booked for a while now, with coming to Seattle and then an overnight flight to Orlando, and I was prepping myself mentally for how I was going to balance all of our client work and other projects with that. And then very last minute, like after the hotel discount codes had already shut down and everything, I got an invite to Meta Connect, and I really wasn't sure if it was going to be worth it to come. I loved the old days of Oculus Connect. That was some of the first times when I got to meet you in person, Kent, and I loved the energy of the Oculus Connects. Looking from afar, it didn't seem to me like Meta Connect could possibly recapture that energy and essence. I'm not speaking here or anything, so I thought, is it really going to be worth it to change all my travel plans just to come as an attendee? But I'm very glad I came. Do you know why I'm glad I came, Kent? I do, but yeah, I'll let you tell the story, though. Yeah, sure. So I posted kind of my concerns online and was sort of saying, yeah, I mean, what do you all think, you know, public masses, about me coming to Meta Connect? Is it worth it? Who was here last year? What do you think? And a lot of people were like, you know, you're going to get to hang out with all those great XR devs who you miss so much, some of whom, of course, I haven't seen in years, some of whom I've still only met in the metaverse, so to speak. And so that seemed OK. I was kind of leaning toward, ah, maybe I'll make this work. Then, for the first time in our years of following each other, Boz actually responded to my Twitter post and said, we might have cake, I think. And we had talked. I can't go into too much detail, but I had actually gotten to spend about four hours recently, regarding the Four Seasons project, with Boz and Mark Rabkin and some of the other VPs, so to speak, of Meta, and we had some really good discussions about their roadmap and all that. I wasn't invited to Meta Connect at the time, so I just assumed that'd be our last conversation for a while. But we did talk very briefly during our time together about the Orion glasses, which sounded very exciting. And so when Boz responded to my tweet with, maybe we'll have cake, I did DM him. And it was the first time I'd ever DMed him. And I just said, just to be clear, Boz, you're saying there might be cake. Is there any chance cake could be code for Orion glasses demo? Because then I would absolutely come. And he's like, yes, cake is code for Orion glasses demo. And I said, well, absolutely, I will see you there. And then that was it. That was enough to bring me over.

[00:07:51.698] Kent Bye: And you were able to actually get a demo slot. And you actually reached out to me, because you had the ability for people to come and watch the demo. And I already had an on-the-books conversation with Norman Chan and Scott Stein, who had actually seen the demo. And then, despite all my best efforts, I was not able to get on the shortlist to see the demo. But you actually got a chance to see it, and you were inviting me to come see it. But then I had to make a decision: do I go watch Alex do the demo, or do I talk to Norm Chan of Tested and Scott Stein, who have seen it, for 45 minutes about what their experiences were for the podcast? And I opted to just talk to them about it, because I didn't know if they were going to be leaving, and I didn't know if I'd otherwise be able to have that conversation. It seemed like that was my best proxy to actually get a lot of the other details from people that have very specific lenses into all the different experiences, and all the interviews they had done, to be able to get all the information. So that was my proxy. But yeah, maybe just share a little bit about what your experience was with the demo.

[00:08:34.045] Alex Coulombe: Absolutely. And I should mention, no need for us to figure this out live, but I am fortunate to have one other colleague from Agile Lens who is also going to be doing a demo later today, and he also can bring a couple of guests, so that could be another opportunity to check it out. I loved this demo so much. It brought me back to the days of the DK1, and experiencing not only things like the Tuscany demo and some of these early things on Oculus Share that just got me excited about what the capabilities were then, but also getting very excited about that trajectory, where everything was headed, and this path we were going down toward this very, very bright, infinite-possibility future. Because the demo itself was very impressive in its own right, but I also just found myself thinking the entire time about where this could go next and where this might be in five years, ten years, et cetera. And I'm sure my demo was very similar to what you probably discussed with Norman and Scott. And by the way, super glad you talked to them, because they're both brilliant. But I got to sit down with Joshua, who I think is the VP of product or something, and we went through kind of a day in the life of what it might be like if we hop in our time machine and pretend we've got these glasses on all day in the future, and what that feels like. And so we did normal things, like Messenger chats and checking my Instagram, which is funny because I don't actually use Instagram, to some AI conversations. I got to do the cooking demo and kind of say, hey, I've got a bunch of food in front of me, what can I make with this, and then see recipes that I could flick through. I also felt like so much of the experience in general was kind of a culmination of every XR device and experience I've had over the last 12 years or so, but the whole was greater than the sum of its parts. I've used hand tracking and eye tracking, and I've used a Myo armband back in 2015 for gesture input with haptic feedback, but I'd never seen all this stuff come together in one package, especially in a glasses form factor, that made the whole thing just feel so natural and comfortable, and like the kind of thing that could actually someday convert some of the people who I know who are like, I'm never going to wear a bulky headset or big, thick AR glasses on my head. This felt like the device and the trajectory that is going to potentially get those people excited about this space in a way that the Quest product line just never can, based on the direction it's going.

[00:10:52.389] Kent Bye: Yeah, and it sounds like, from your perspective, you're doing Unreal, and you're also trying to get Unreal more compatible with the Apple Vision Pro. But there's also this augmented reality pass-through aspect for things like What If...? for Disney. When I had a chance to talk to Disney, one of the producers, Sharif, was saying that, yes, Disney has their own fork of Unreal Engine to be able to actually do some of the magic of that. And you've also been a big part of trying to get a lot of Unreal Engine to be compatible with the Apple Vision Pro, trying to figure out all the different settings and get your workflow to the point where you can use the Unreal game engine to ship different mixed reality experiences on the Apple Vision Pro. So I'd love to hear where that's at, in terms of getting everything on the Unreal side to be more compatible with shipping on the Apple Vision Pro.

[00:11:43.891] Alex Coulombe: Yeah, so it's funny, I feel like so much of my professional career has become a larger version of so much of what I did as a kid, which was always trying to bring together disparate friend groups and trying to get them to start new friendships, start relationships in some cases, and be like, oh, you absolutely need to talk to so-and-so, Tom and Cindy, et cetera. And so I feel like that's what goes on a lot with me now with Apple, and now Meta, and Epic Games, where in so many cases they're talking past each other. And I do feel like I'm in a unique position, me and everyone in my company, trying to work across all their ecosystems, where we want all these things to work in a way that should be mutually beneficial to us and them and to each other. And so with the Apple Vision Pro, for example, of course, with all the fighting that's been going on between Epic Games and Apple, we didn't expect there to be any direct support. But the very small XR team at Epic is super clever, super brilliant, and just using the publicly available SDK tools from Apple, they found a way to treat the Apple Vision Pro ecosystem as OpenXR, which is funny because, as we both know, Apple is not a member of the Khronos Group. They don't actively support OpenXR. And yet, the way Unreal Engine is set up now for Apple Vision Pro is you can take an OpenXR experience that you've developed for Meta Quest or Varjo or Vive or any of the other members of that consortium, and with very little modification, you can now get it to work with Apple Vision Pro. I had a hand-tracking setup that I was using across all OpenXR devices a while ago, and I was almost able to use that out of the box without changing anything. There were a couple of quaternion rotation things that needed to be adjusted, but the gesture detection I had for pinches and open fist and all that basically came over to Apple Vision Pro with very little modification. The biggest issues were some of the fundamental Xcode build things, where we'd get weird errors. And I haven't been a Mac developer in a long time. The last real iOS project I worked on was back in 2018, and so I did need to reacquaint myself with this ecosystem. So some of the learning curve was just, yeah, what is it to be an Apple developer compared to an Android developer? But then some of it was figuring out, in some cases, loopholes, like, oh wow, I can actually activate pass-through if I delete the sky and have all these kind of weird alpha things going on. And then some of it was very actively trying to communicate with some people at Epic Games and some people at Apple, and then triangulating the information I was getting from both of them. Because, you know, the word on the street from some people I know at Apple has been, yeah, we're not going to directly support Epic Games, but if you ask us very specific questions about what you're trying to solve in Xcode or with one of our features or SDKs, the stuff that goes in the plist file, for example, they could, in a very targeted way, help me with that.
And so from both Apple and Epic Games, with me kind of being this weird sort of marriage counselor between them, I feel like we've been able to make a lot of progress, and I've tried to share a lot of that with the developer community. But it comes from me being this weird messenger that bounces back and forth between them and is like, you know, well, Mallory says that you said that yesterday. It feels very gossipy in a funny way. But now we're at this point where, for visionOS 2.0, passthrough is working in a really clean, consistent way. The frame rate is excellent. There are fewer crashes. I'm having more and more conversations with companies that are really eager to develop apps using Unreal Engine for the Apple Vision Pro, and it just feels like that trajectory is headed to a much better place.
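To make the portability Coulombe describes a bit more concrete, here is a minimal sketch of what device-agnostic pinch detection can look like at the OpenXR API level, using the XR_EXT_hand_tracking extension. This is an illustrative reconstruction, not Agile Lens's actual code: it assumes the session, base space, and extension function pointer have already been set up, and the 1.5 cm pinch threshold is an arbitrary choice.

```cpp
// Minimal sketch: portable pinch detection against XR_EXT_hand_tracking.
// Assumes xrLocateHandJointsEXT was fetched via xrGetInstanceProcAddr and
// the hand tracker was created with xrCreateHandTrackerEXT.
#include <openxr/openxr.h>
#include <cmath>

bool IsPinching(XrHandTrackerEXT handTracker, XrSpace baseSpace, XrTime time,
                PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT)
{
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];

    XrHandJointLocationsEXT locations{XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo{XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace; // e.g. a stage reference space
    locateInfo.time = time;           // predicted display time for this frame

    if (XR_FAILED(xrLocateHandJointsEXT(handTracker, &locateInfo, &locations)) ||
        !locations.isActive) {
        return false; // hand not tracked this frame
    }

    // Treat thumb tip and index tip within ~1.5 cm of each other as a pinch.
    const XrVector3f& t = joints[XR_HAND_JOINT_THUMB_TIP_EXT].pose.position;
    const XrVector3f& i = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose.position;
    const float dx = t.x - i.x, dy = t.y - i.y, dz = t.z - i.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) < 0.015f;
}
```

Because every OpenXR runtime reports the same 26-joint hand skeleton, logic like this is the kind of thing that can move from Quest to Unreal's Vision Pro backend with only minor fixes, such as the quaternion rotation adjustments he mentions.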

[00:14:59.502] Kent Bye: Yeah, so you described what was happening between Epic and Apple. What's happening in a similar sense between Meta and Epic with Unreal Engine?

[00:15:07.236] Alex Coulombe: Yeah, this is another funny one. And our relationship with Meta has felt tenuous for a while. We've, of course, been building ever since the Oculus DK1, and we've had contacts at Meta, but we were never given first access or turned into a managed dev. And to be fair, we only had our first app come out publicly this year, with Body of Mine. Most of the work we've done has been more enterprise, more focused on very specific use cases for companies, and wasn't necessarily meant to be available to the public. But we found that, especially on the Unreal Engine side, if we started to go through Meta's very outdated documentation, we would find all these errors and spend hours and hours realizing, oh, they wrote this thing in Unreal Engine 4.26, and now we're in 5.3 or 5.4. And we would create Google Docs that were basically the corrected version of their documentation, and we'd be thinking to ourselves, we spent so many hours trying to figure out how to make this work on our own, you know, a basic setup for shared spatial anchors, for example, and we didn't want anyone else to have to go through that. We want more Unreal Engine developers in this ecosystem. And so we would pass these documents to our contacts at Meta and say, this is it, this is your updated documentation, can you please post this? And they wouldn't. Or, you know, we'd hear back from them two months later, and it would not be the same person we were talking to before. It was basically like a new-number-who's-this kind of thing. And so we just felt very frustrated, like there wasn't a real open line of communication there. Recently, that's gotten much, much better, and I'm very grateful to Boz and some of the people over there who have opened up these more active channels. And I do want to give very strong credit to Mark Rabkin, who actually has been very good at paying attention to some of my public complaints. If I would post to Twitter, oh, this new Oculus update broke, you know, controller tracking or whatever, he would actually DM me and say, tell me more, let's fix that. And then we'd see a new Oculus update like two days later that actually addressed that. So big credit to Mark for helping us out in that capacity for over a year now. That being said, now that I'm having more conversations with Meta and more conversations with Epic, they both are kind of saying the same things about each other. Epic will be like, boy, I sure do wish that Meta cared more about Unreal Engine, because we're just never up there with feature parity with Unity, and it makes it really hard for us to get our developers excited about XR. And then on the Meta side, there's people saying, boy, we sure do wish that Epic Games and Unreal Engine cared about XR, because we sure would love to get more of those Unreal Engine developers working in our ecosystem. And it's like, you guys can help each other. You can come together to make sure that these features are coming out at the same time. We used to have these very slow, non-productive conversations with Meta, and it feels to me like that might be what's also happening between Epic Games and Meta, where they're just missing that open line of communication that would streamline a lot of the work that needs to be done to make sure that Unreal Engine developers feel very taken care of in the Meta ecosystem.

[00:17:51.598] Kent Bye: Yeah, about a month ago, Boz was doing a Q&A on his Instagram, and because Meta Connect was about a month away, I asked him, what are you excited for at Meta Connect? It was sort of a softball, like, hey, announce something that you haven't announced yet. But I also wanted to hear, OK, what is he excited about, as someone in his position, with the bringing together of the community? What's on his radar for what he's paying attention to? And he said, basically, the product launches: OK, there's going to be VR, Ray-Ban sort of stuff, AI, but also, oh, maybe there'll be some glasses stuff that we'll be updating. But he said at the end, the thing I'm really excited about is, there are some things that we've not been doing a great job with in our developer relationships, and we know that, and we're going to address those issues. And that was kind of the way he phrased it, roughly, like a rough paraphrase. And so at the beginning of the developer keynote, Boz comes out and basically says, hey, I'm sorry, we know that we've not... I forget exactly how he phrased it, but it was sort of a vague, unspecified apology. For anybody who wasn't in that position, you'd kind of be like, OK, what happened? What is he actually talking about? And there were a number of things being addressed throughout the course of the developer keynote, but it was sort of vague and hard for me to follow what specifically they were apologizing for. So from your perspective, maybe you could give a little bit more context for some of these different conversations, and what, from your perspective as a developer, some of those pain points have been, and what you hope to see as we move forward.

[00:19:15.103] Alex Coulombe: Yeah, I appreciated the spirit of Boz's apology yesterday, and I was looking for kind of those specific action items of, here's what we're going to do moving forward. Some of them were sort of addressed, like there was a mention of streamlined documentation that walks you through many of the samples in a much stronger way. And I'm very excited about that. If that's true and the documentation gets a proper overhaul, especially on the Unreal Engine side, that would be fantastic. Over the years, we've run into issues where a lot of their code samples just fundamentally don't work, and we have to modify them. And you just felt like no one at Meta was dogfooding their own software. I think some people would maybe testify to this being true, that a lot of the people working in Reality Labs never wear headsets. They're never actually trying this stuff out. And so when we would say, hey, we just spent 15 hours in a headset trying to get the Presence Platform features to work, and they're fundamentally broken in these nine ways, the responses we might get would be from someone who clearly had never tried it for themselves. And so you want to feel like, when you're having these conversations, that you're speaking to developers, and it was very hard to talk to anyone who wasn't more at that kind of managerial level, where their understanding of their own tech isn't particularly deep. Compare that to Epic Games, where I'm constantly speaking to the people there who are actively writing the code that we're using. And so if we have a problem with something in Unreal Engine, we can often talk to someone who is like, oh yeah, I made a change yesterday, and it did this to the code, and I understand why that broke, and now I'm literally going to push that fix to the public open-source Unreal Engine GitHub, and you can download it in 10 minutes. That level of responsiveness. Some would say maybe we're spoiled by that, because we have this really nice relationship with Epic Games, but I think anything that brings Meta to a more active place like that would be wonderful. I'll give one example. I'm looking at my colleague Marshall, who's sitting across from me, and we had a conversation with Meta about Shared Spatial Anchors, which we've been trying to get to work for a while, especially for our Four Seasons project, where we were lucky to have John Carmack come, and he commented on the fact that the whole OptiTrack system we have there for our local multiplayer, that should be possible with Shared Spatial Anchors, and yet it's not, and we've done everything we can to make it work. And then, when we finally got to have a sit-down with Meta about this, for so many of the things they talked through, they were like, oh, well, you should submit something to the GitHub issues. Marshall had done that like 50 times, with no responses. Then they'd be like, oh, well, you should try X, Y, and Z super-basic things, here's how you make a spatial anchor. It's like, we know how to make a spatial anchor. We've been doing this for a very long time, and we're very aware of the fact that we're pushing on this more than any of the people we're talking to at Meta. And so we often felt like we were just being talked down to, and that we were in many cases doing for free a lot of the work that we felt they should be doing as just basic support, like, here's a functioning SDK that you can use day in and day out.
And so, going back to Boz's apology, there are some just fundamental quality-of-life things that hopefully they're on the path toward doing, so that you don't need to have a team of Unreal Engine experts who can go in, identify where their code and examples are broken, and fix it. I teach so many Unreal Engine classes, especially intro to Unreal Engine, and especially when I'm going to be talking to all these people at Unreal Fest who come more from Unity about how to make their way over to this ecosystem. When I'm talking about XR, I want to tell them that it's going to be really easy. I want to say, you can open up the VR template, download Meta's XR plugin, and build to their device, no problem. And yet there are so many artificial roadblocks, mostly put in place by Meta, that make that much harder than it should be. So yeah, anything that moves us more in that direction, where it's easier for new people to come into this ecosystem, is just going to benefit everyone.
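For readers who haven't touched the feature under discussion: a spatial anchor pins virtual content to a fixed real-world pose, and shared spatial anchors let several headsets in one room agree on that pose, which is what colocated multiplayer like the Four Seasons holodeck needs. As a rough, hypothetical sketch of the primitive involved, this is approximately what anchor creation looks like at the OpenXR level via the XR_FB_spatial_entity extension that Meta's higher-level Unreal and Unity APIs wrap; the sharing, persistence, and cloud sync layered on top are where the pain points described above tend to live.

```cpp
// Hypothetical sketch: requesting a spatial anchor via XR_FB_spatial_entity.
// Assumes the extension is enabled and xrCreateSpatialAnchorFB was fetched
// via xrGetInstanceProcAddr. Creation is asynchronous: the runtime later
// delivers an XR_TYPE_EVENT_DATA_SPATIAL_ANCHOR_CREATE_COMPLETE_FB event
// carrying the new anchor's XrSpace.
#include <openxr/openxr.h>

XrResult RequestSpatialAnchor(XrSession session, XrSpace baseSpace,
                              XrPosef poseInSpace, XrTime time,
                              PFN_xrCreateSpatialAnchorFB xrCreateSpatialAnchorFB,
                              XrAsyncRequestIdFB& outRequestId)
{
    XrSpatialAnchorCreateInfoFB createInfo{XR_TYPE_SPATIAL_ANCHOR_CREATE_INFO_FB};
    createInfo.space = baseSpace;         // space the pose is expressed in
    createInfo.poseInSpace = poseInSpace; // where the anchor should sit
    createInfo.time = time;               // time the pose was sampled at

    // The request id lets you match the eventual completion event; only
    // after that event fires can the anchor be persisted, queried, or shared.
    return xrCreateSpatialAnchorFB(session, &createInfo, &outRequestId);
}
```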

[00:22:55.506] Kent Bye: Yeah, it sounds like they're now listening and acknowledging some of those problems, and hopefully putting in different things to address them. Do you think that part of it is that you've been in the enterprise space, while Meta, for a long time, started with consumer? They're a consumer company at the core. They had Oculus for Business, and they stopped it, and then they just relaunched Meta for Business recently. So I don't know if some of the stuff that you're talking about is more on the enterprise side, where Meta just hasn't got some of those enterprise-y features or dev relations stuff worked out, or if you feel like some of the stuff you're talking about is universally applicable to anybody, whether they're designing games for the store or doing more enterprise apps.

[00:23:36.638] Alex Coulombe: Yeah, it's a great question, and the answer is sort of all of the above. So we, of course, are doing a lot of enterprise apps that want to use, like, ray-traced PC VR, hyper-photorealistic things. And of course, we all know PC VR is not a priority for Meta right now. Fine. They want to focus more on what can happen standalone. I understand that. We had a great time at the Meta Mixed Reality Hackathon back in April in New York. Meta did a fantastic job hosting that. We all felt very well taken care of, except they had to extend the hackathon by an entire day, because they realized that they had released some fundamentally broken features in their co-presence platform, and so much of what we needed to do during the Mixed Reality Hackathon was make mixed reality apps that could work in a co-present experience in the same space. So these are basic things that all developers need, especially if they want to push on the mixed reality side. And yet even they had to be like, oh, we're sorry to the 25 teams here, we're going to change up the whole schedule and give you an extra day, because we probably didn't look at this well enough to realize that it was broken before you needed to use it. So some of the stuff is just fundamental and affects everyone. Some of it is PC VR. And yeah, of course, there are certain things that are very particular enterprise use cases, where we understand they're not going to spend a ton of energy fixing something that only us and maybe three other companies are going to do, and we are perfectly fine figuring that stuff out for ourselves. But going back to it: you just want someone getting started in this for the first time to have an easy path forward toward building their first experience.

[00:25:05.936] Kent Bye: Yeah, I'm going to be talking to the Starship Home folks here in like a half hour. And I was playing it here on-site, in Palo Alto, in the hotel room. And there were some things with the room mesh and the different objects you're placing in it, where when I stopped it or started it, some of those would not come up in the right place. And I was talking to Doug North Cook, and he was saying, look, some of these things are at the platform level, where it's beyond our control, where they're still working out some of these different things. And so their stuff, as a mixed reality app that's really pushing at the edge, is on the bleeding edge of what's even possible. But then there were a couple of times that I hit some game-breaking bugs, and I don't know if that's on their end or on the Meta platform end. In order for these things to actually evolve and develop, you need to have applications like that that are pushing the edge. But it sounds like a lot of the stuff that you're also working on is at that bleeding edge of stuff working, but then breaking. And then I guess it's just a process of trying to have a better feedback loop to figure out if it's something on your end, where you have a bug, or something at the platform level, in order to sort out how to actually make some of this stuff work.

[00:26:06.570] Alex Coulombe: Yeah, I mean, in certain ways it does sometimes feel like a bit of an abusive relationship, not to get too dramatic with it, but it's like, is it you? Is it me? Who's really at fault here? And we do our best to solve as much as we can on our own. And that's when it gets particularly difficult, when we just hit a roadblock where it's like, we can't do anything more with this until Meta fixes something in their cloud servers or whatever. And so we email support, and we hope for the best with getting a response. But sometimes we just find ourselves pivoting a lot. I'd say the thing that we've needed to foster the most, as XR developers using Unreal Engine and Meta's platform, has been just the ability to set up a lot of contingencies and alternate routes, because we'd hit these roadblocks all the time. And that's how we end up creating a very expensive OptiTrack system for local multiplayer for Four Seasons, for example, when really there's no reason why a more stable version of Shared Spatial Anchors shouldn't be able to do all of that out of the box.

[00:26:59.864] Kent Bye: Yeah, so you've got Unreal Fest coming up. What's some of the stuff that you're either looking forward to presenting, or that you feel are kind of hot topics within the context of the Unreal community?

[00:27:08.755] Alex Coulombe: Yeah, I'm getting really excited about some of the advancements in motion capture. For our production of Christmas Carol this year, we've partnered with Sony. With the Sony Mocopi system, which works great in Unity and Unreal and plenty of other apps, VRChat, for example, we've developed kind of this new way of doing full-body tracking. We've used Xsens and Rokoko suits and all these other ways to get our actors' performances in the past. But now the Sony system makes it very easy for us to basically create a sub-$1,000, very robust mocap solution where our actors can wear a Meta Quest Pro headset, which unfortunately was discontinued yesterday. But with the Meta Quest Pro headset, we'll get their eye tracking and their face tracking and their hand tracking, and then we'll combine that with the Mocopi full-body tracking. You can get a Mocopi used for like $300, a Quest Pro used for maybe $400 or $500, and then you have this system that actually feels really good. And we're going to be talking to a lot of people at Unreal Fest and giving some demos about that and that whole workflow. We've also made a lot of really good progress with optimizing things like full-body capture data, so every single finger joint and all the blend shapes in the face can be broadcast in a very reliable way at 60 or 90 frames a second. So some of those elements of progress aren't necessarily specific to the latest version of Unreal Engine. I'd say a lot of the stuff we're excited to show off and talk about has come more from our own internal development. I will say one thing that's been a little bit frustrating about Unreal Engine lately, because I've been saying so many nice things about them, I have to be mean a little bit, is there's so much focus now on Fortnite and UEFN. And I felt that in a big way at Unreal Fest last year in New Orleans, and even a little bit over at Unreal Fest Prague a couple months ago, where they're so focused right now on trying to grow Fortnite and grow UEFN, which is a way for creators to develop for that, that it does feel like some of the fundamental features of Unreal Engine are being left a little bit by the wayside. One example of that: we love MetaHumans. We use MetaHumans a lot as a way to create our digital characters, and they're very heavy. Downloading a MetaHuman is 1.7 gigabytes. And so, working on Body of Mine with Cameron this year, one of the first things we had to address was how do we have a lot of MetaHumans in this project and bring the file size down to something that's going to run well on a Meta Quest 3, a Meta Quest Pro to get the face and eye tracking and all that, and maybe even a Quest 2. And in UEFN, they released incredible, optimized, good-looking MetaHumans. We went from 1.7 gigabytes per MetaHuman to 70 megabytes. Incredible, and you really can't tell the difference that much in most cases. For reasons I can't explain, Epic has not yet made those MetaHumans available for Unreal Engine, and that would have saved us countless dev hours on Body of Mine and for some of the stuff we're preparing now for Christmas Carol. Hopefully there's some announcement coming up at Unreal Fest that's more about bringing some of these UEFN features to Unreal Engine. There's also a really excellent source control system over in UEFN that we'd love to have over in Unreal Engine. There's a new marketplace called Fab, and that one is coming to Unreal Engine, so I'm excited about that.
It's kind of a unified place where Sketchfab, which was acquired by Epic Games, and their marketplace and the Quixel assets, et cetera, all can live in one location. So that's all great. But much like Meta Connect, the thing I'm looking forward to the most at Unreal Fest is just seeing the people, hanging out with the devs, catching up, and seeing what everyone's projects are that they've been working on. And inevitably, these are the events where we meet other devs we want to work with, and other potential clients. And it's really fun to give demos and just see people's faces light up when they're like, oh, I didn't know this was possible yet. So those are always the hallway run-ins that you and I love so much. That's really where the meat of these events is.
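On the full-body capture broadcast mentioned above: getting every finger joint and facial blend shape across a network reliably at 60 to 90 frames per second usually comes down to quantization before any general-purpose compression. Here is a minimal sketch under a hypothetical wire format (not Agile Lens's actual one): each blend shape weight in [0, 1] becomes one byte, and each unit-quaternion component becomes one signed byte, shrinking the per-frame payload to roughly a quarter of raw floats.

```cpp
// Sketch of quantizing a mocap frame for broadcast. Hypothetical layout:
// N joint rotations as 4 signed bytes each, then M blend shape weights
// as one unsigned byte each.
#include <cmath>
#include <cstdint>
#include <vector>

struct Quat { float x, y, z, w; }; // assumed normalized

// Map a unit-quaternion component in [-1, 1] onto a signed byte.
inline int8_t QuantizeUnit(float c)
{
    if (c < -1.f) c = -1.f;
    if (c >  1.f) c =  1.f;
    return static_cast<int8_t>(std::lround(c * 127.f));
}

// Map a blend shape weight in [0, 1] onto an unsigned byte.
inline uint8_t QuantizeWeight(float w)
{
    if (w < 0.f) w = 0.f;
    if (w > 1.f) w = 1.f;
    return static_cast<uint8_t>(std::lround(w * 255.f));
}

// E.g. ~70 joints plus ~50 blend shapes: 330 bytes per frame, about
// 30 KB/s at 90 Hz, versus ~1.3 KB per frame as raw 32-bit floats.
std::vector<uint8_t> PackMocapFrame(const std::vector<Quat>& joints,
                                    const std::vector<float>& blendShapes)
{
    std::vector<uint8_t> packet;
    packet.reserve(joints.size() * 4 + blendShapes.size());
    for (const Quat& q : joints) {
        for (float c : {q.x, q.y, q.z, q.w}) {
            packet.push_back(static_cast<uint8_t>(QuantizeUnit(c)));
        }
    }
    for (float w : blendShapes) {
        packet.push_back(QuantizeWeight(w));
    }
    return packet;
}
```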

[00:30:33.325] Kent Bye: Yeah, and just to follow up on the Four Seasons project, I know we've talked about it a couple of times in our previous conversations, and the whole idea is that the owner of this venture was going to be bringing people into VR as a way of selling these different homes before they were built. So maybe you could just give an update on how VR has worked as a way of helping to serve this high-end real estate use case.

[00:30:54.952] Alex Coulombe: Yeah. So when we had Boz and everyone over at this Four Seasons experience, Boz did ask, how fundamental has this VR experience been for sales? And our client very directly said, it's binary: without this VR experience, this project would not be happening. We had a few sales before the XR experience, the holodeck, as we call it, existed. And it was OK, but it wasn't the kind of thing that was going to sustain the level of investment that needed to go into making this $2 billion-plus real estate project happen. And there was very quickly like a 300, 400, 500% increase in the sales once people were able to put on the headset. And a lot of that comes from the positioning of our client and the very carefully crafted experience that he wanted us to build for him, which does get treated as something of a time machine. We've built a lot of very fantastical experiences in VR that are trying to take full advantage of the medium and do things you can only do in VR, and that's not what our client wanted for this. He wanted something that just had a very high level of photorealism, that felt like you were hopping into a time machine, stepping forward into the future, and walking around what this will look like when it's constructed. And so we've been fortunate to have a lot of really incredible guests: the founder of Four Seasons, you know, and we've had celebrities like Matthew McConaughey and whatnot. Some people have been in VR a lot, some people have never been in VR, and it's been a wonderfully reviewed experience. I think we've done something like 390 tours so far. There's a great crew over there who runs the day-to-day operations, and they update us all the time on what people think and what little improvements we can make. So in terms of selling real estate, very successful. And then in terms of pushing the limits of what you can do with photorealistic VR, we've felt very excited about all the progress that's been made there, and we hope to find more opportunities to share and leverage all that research and development so that others can also do similar things.

[00:32:41.212] Kent Bye: Did you have a chance to try the Hyperscape demo, which was basically the Gaussian splatting showcase of these different scans that they have?

[00:32:47.853] Alex Coulombe: No, I have not. I did try out Varjo's similar experience called Teleport, and I actually made a scan of our VR showroom in Austin to kind of see how that felt. The Teleport one actually does a good job of, I think, creating several Gaussian splats and then letting you kind of blend between them. There's kind of a sense of lerping as you move around the space. So I'm really eager to try the Hyperscape stuff. Have you tried that yet?

[00:33:08.160] Kent Bye: Yeah, I tried it during the demo day. I had a chance to try it. And the thing that's interesting about that is that it's actually being cloud rendered. So it's not being rendered locally; they're doing it on cloud servers. And they're able to do predictive rendering, so predicting where you're going to be looking next. But it's essentially sending a stream down to the headset that feels like a PC VR type of experience, where it's such a high res and such a high fidelity that it's just using the Quest headset as this kind of way of receiving all these images, and being able to do it with their data center that's located in San Jose. And so if it's too far away, then you start to have a little bit more latency and lag. One of the engineers was saying that it's basically limited by the speed of light for how low you can get the latency. But since we're so close to San Jose here, with all of Meta's huge data centers, it was doing it just fine. OK, well, I guess as we start to wrap up, I'd love to hear what you think the ultimate potential of mixed reality and spatial computing might be, and what it might be able to enable.
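A quick aside on the physics Kent mentions: the speed-of-light floor is easy to put rough numbers on. A back-of-the-envelope sketch, with an assumed (hypothetical) 50 km fiber route between Menlo Park and a San Jose data center:

```cpp
// Back-of-the-envelope: light-speed latency floor for cloud rendering.
// The 50 km route length is an assumption for illustration.
#include <cstdio>

int main()
{
    const double route_km = 50.0;             // assumed one-way fiber route
    const double c_fiber_km_per_s = 200000.0; // light in fiber, roughly 2/3 of c
    const double rtt_ms = 2.0 * route_km / c_fiber_km_per_s * 1000.0; // 0.5 ms

    const double frame_ms = 1000.0 / 90.0;    // one frame at 90 Hz, ~11.1 ms

    // A nearby data center costs ~0.5 ms of pure physics out of an ~11 ms
    // frame, leaving the budget for encode, network hops, and decode; a
    // distant region eats the frame before any rendering work happens.
    std::printf("light-speed RTT %.2f ms vs %.1f ms frame budget\n",
                rtt_ms, frame_ms);
    return 0;
}
```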

[00:34:03.533] Alex Coulombe: Yeah, well, coming off the Orion demo, I'm thinking just a lot about what that everyday use case is, especially if it becomes, I want to say, more socially acceptable, and in a non-intrusive way. I want to see that future where you and I can both have something like the Orion glasses on and still be having eye contact and a really meaningful conversation, while getting contextual understanding of the other things around us. So I'm thinking more and more now about this layer of digital fabric on top of the real world that somehow enhances our real personal connections. I would love, for example, for us, the next time we talk, to be able to have all these little connections to our previous conversations, so we feel like we're remembering, in a more helpful way, all the other dots we've connected over the years. Anything that becomes not necessarily a crutch, but a really helpful tool for us to engage with people in a really human, connective way, as excellent as possible. And this form factor of glasses, you know, not a big, bulky headset with pass-through cameras where we really can't see each other's eyes. This, with how wireless it is, and the field of view and everything, is now making me feel like there is a future where this could be something that people just casually wear, and, again, hopefully it becomes a more positive impact on society and all of our relationships. I'm being a little techno-optimist right now, but I hope we've started to head in that direction. But of course, it's going to depend not just on companies like Meta, but on the developers, those of us that are actually going to actively build use cases and apps and experiences that foster that kind of more human use of the tech.

[00:35:33.908] Kent Bye: Yeah, it reminds me of just being at Snap Spectacles and having a chance to see the Es Devlin Council experience, which was kind of like their flagship, paradigmatic example of what they foresee in the future: 12 people in a geodesic dome in a circle, everyone wearing the Snap Spectacles AR glasses, with a shared reality that everybody was able to interact with, and you could see how other people were interacting with the objects. But also, they facilitated this sort of community ritual, where there was this moon that was orbiting around the Earth. People would put their hand on the moon, as if it were a talking stick, and then they would declare their intention for what they want to see for the future of humanity. It would transcribe what they said, turn it into text, and have that text then wrap around in this kind of curve and orbit around the Earth. And so by the end of it, you had 12 people's intentions for what they want to see for the future of humanity, all kind of orbiting around the Earth. So it was this way of using AR to facilitate these kinds of emergent social dynamics and community rituals, and that was really quite powerful. I saw that there's a distinct difference in the types of experiences that you can have with people co-located in the same place, in the shared context that is this platonic realm of ideal forms, or these digital objects that are helping to create a shared experience for everybody, and that is facilitating unique emergent behaviors. So for me, that's the most compelling demo that I've seen so far in AR in trying to bring all those different unique affordances together. It's something that you could do in VR, but you would sort of miss a lot of the body language or miss a lot of the facial expressions. I couldn't actually see people's eyes from that far away, with the waveguides and everything already occluding the eye line. And I had a conversation with Cix Liv, who was also talking around how there are different social dynamics where, if somebody is talking to you, and then all of a sudden they have something else splitting their attention, that's actually not allowing people to be completely present for what's happening. So I'm also, at the same time, hesitant to go all in on the ways that these 3D holograms are going to play into our lives, in a way that is either going to help facilitate connection, or disconnect us in a way that is kind of violating all these social norms that we've built up over many millennia as human beings. So yeah, that's where I'm at on that. But I'd just love to hear any other kind of final thoughts that you have for the immersive community.

[00:37:45.580] Alex Coulombe: Yeah, one thing that I was surprised to feel a little pang of sadness about during the announcements yesterday was seeing how much, where Apple was already going in this direction, now Meta is also going in this direction of lots of multitasking, lots of windows around you, and lots of things that you can engage with at the same time. Especially being someone who loves theater, I was enjoying the parallel between VR in particular and theater, as this thing where you can't have your phone out, and you can't be doing these other things, and you are a captive audience. In the case of a theater, and to a degree a VR headset, you turn off all the lights in the room and you focus on this one thing. And I think there have been a lot of studies, correct me if I'm wrong, that say people can't really multitask; they can focus-switch, but there's always a cost to that focus switching. And, you know, it could be really bad for society. I mean, we already have these issues with short-form videos and TikTok and whatnot that just train us to have a shorter attention span. And as wonderful as it could be to have all these tools at our disposal that give us a lot more information and context and ways to connect with each other, we might lose some of that fundamental human connection that comes from just sitting in a room, not having any technology, and having a regular conversation. So I hope we find a good balance there. But otherwise, you know, I'm really excited for the potential of where this is all going. And, you know, I've spoken a lot during this conversation about some of the frustrations and hopes I have for companies like Meta. But I really do want to emphasize that for those of us out there who are trying to build experiences, the onus is on us as well. We have to take responsibility for using these tools to craft the experiences that we think can improve the world and improve everything from social interactions to, you know, medical training and how we're going to do better surgeries and all that kind of thing. We have to be the people to actually create that. And all Meta can do is hand us the right tools, and Apple and Varjo and Vive and all the other wonderful competitors out there. So looking toward the future, you know, I'm really happy with the competition we're seeing right now. I love this kind of game of one-upmanship, where Apple releases a feature and Meta claims, oh, we were going to come out with this anyway, but then they release that feature. And I thought Meta did a great job of throwing down the gauntlet yesterday with a few things, where it's like, oh, how's Apple going to respond to this? The last time I was this excited about XR in general was 2018, when I had the Lenovo Mirage Solo and Google very kindly sent me a 6DOF dev kit. This was more than six months before the Quest 1 came out, and it was essentially a Quest 1 before the Quest 1: a 6DOF headset with these 6DOF controllers that felt kind of like PlayStation Move controllers. But I saw that future of not just a totally standalone, totally 6DOF ecosystem, but one in which Google and Meta were going to be competing to see who could do it best. And I think we all suffered because Google, like they do with so many things, kind of pulled out and said, never mind. And then Meta, in many ways, was left in a totally clean position to take over that market for consumer VR.
And I think in some places they've really become complacent, which is natural when you don't have a real competitor in the space. And so now, with Apple and Meta, and with Google getting back into the game with Android XR, and LG and Sony and everyone else, this is the kind of competition that can only benefit consumers and developers as well. So I'm really excited to see where the next few years go. Selfishly, I'm particularly excited about use cases in architecture and training and theater, of course. Please, everyone, do check out our version of Christmas Carol this year, which I think we're going to make free. That's at xmascarolvr.com. And if anyone out there is thinking, boy, I'd really love to dive into this world of becoming a developer or a creator, I am happy to say that I have the first and best and only Unreal Engine authorized training center in Manhattan, though most of the classes we're giving are virtual. You can do them over Zoom, so please subscribe to the newsletter at alexcoulombepresents.com if you want to be alerted when we're giving free classes. They are, of course, about Unreal Engine, but my fundamental goal is, I think there are so many brilliant people out there who don't even realize the potential they have. Again, one of my favorite things about going to something like Unreal Fest is all these people from different disciplines coming together to make really incredible content from different angles and sharing their cross-disciplinary approaches, and that benefits everyone. So similarly, I think there are so many people who, if they knew the fundamental ways to get started with XR development, would be creating incredible work across a very wide variety of industries. And I would love to have this kind of legacy of feeling like I helped as many people as possible get into this world, and to get excited about it, and to create things that they didn't even know themselves they could create.

[00:42:13.757] Kent Bye: Awesome. Alex, it's always a pleasure to sit down and talk with you. I'm going to air this as part of a trilogy with our previous conversations, because I feel like there's a story that's unfolding over the last three times that we've had a chance to talk, across all these different things that you're working on. But there's a through line of this passion and enthusiasm for trying to get all these tools, especially Unreal Engine, working well with XR, and all the different things that you're creating, from all the theater and architecture, and Body of Mine now since we last talked, the MetaHumans and all these different affordances: mixed reality, spatial anchors, all the things that you've been working on in each of these R&D projects. So I highly recommend folks go check out A Christmas Carol. I had a chance to check out last year's, and it's always a great example of the cutting edge of this intersection between theater and all these immersive technologies, and how you can start to use the technologies to both tell stories and to sell high-end real estate, and all sorts of things that are made available with R&D. Unreal Engine is definitely one of the engines that has the most polish and look and feel, and so I always appreciate seeing what folks are doing to bring that technology into XR. So thanks again for sitting down to chat and sharing a little bit more about your story and journey into the space.

[00:43:20.811] Alex Coulombe: Thank you so much, Kent. And one thing I want to say very quickly as well: I am very active on social media and all that, so I do feel like I'm often the face of many of these projects. But very briefly, like a bad Oscar speech, I do want to just very quickly shout out everyone at Agile Lens: Marshall, June, Dante, Whit, Kevin, all these people who've made all these experiences happen with me over the past few years; collaborators on projects like Four Seasons, like DBOX and Pureblink; and of course, our client Jonathan Kuhn there, whose vision drove the whole thing. Cameron, of course, for Body of Mine, and Michaela Trenescu-Holland, who is the impact producer for that experience. And then also just everyone who has freely shared information, you know, across social media, across YouTube videos. It really has been a wonderful creator community to be a part of, with the rising tide raising all boats. And yes, Kent, thank you so much for your time. And I thank you always for everything you do for our industry as our unofficial oral historian. And I look forward to the next time we talk. Awesome. Thanks.

[00:44:14.067] Kent Bye: Thanks again for listening to the Voices of VR podcast. And I would like to invite you to join me on my Patreon. I've been doing the Voices of VR for over 10 years, and it's always been a little bit more of a weird art project. I think of myself as a knowledge artist, so I'm much more of an artist than a business person. But at the end of the day, I need to make this more of a sustainable venture. Just $5 or $10 a month would make a really big difference. I'm trying to reach $2,000 or $3,000 a month; right now, I'm at $1,000 a month, which means that's my primary income. And I just need to get it to a sustainable level just to even continue this oral history art project that I've been doing for the last decade. And if you find value in it, then please do consider joining me on the Patreon at patreon.com slash voicesofvr. Thanks for listening.
