#1310: Ethereal Engine as an Open Source, WebXR Pipeline for Apple Vision Pro and Beyond

I interviewed Ethereal Engine co-founder and CEO Liam Broza at Meta Connect 2023. See more context in the rough transcript below.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com. So this is episode number five of 12 of looking at some of my interviews and coverage from Meta Connect 2023. Today's episode is with Liam Broza, who's the co-founder and CEO of Ethereal Engine. This is an open source WebXR engine that's built around prioritizing multiplayer first, high performance first, and cross-modality and full-body-first avatars. So Liam is somebody who has been really at the forefront of pushing the edge of what's possible on open tech stacks like WebXR. He really got into Mozilla Hubs, creating a different fork and more of a cloud-service-hosted social VR offering that actually got a lot of contract work during the pandemic with a lot of crypto companies, and invested a lot of that money back into building what is now called the Ethereal Engine, which is a really robust game-engine-type interface that is really optimized for being on the web first. So Liam is somebody who is really looking at these broader trends within the context of developing for XR. You know, with the runtime fee piece from Unity, there's been a lot of exodus and folks looking for alternatives. And also, whenever you are downloading these XR applications, the downloads are just huge files that could be multiple gigabytes, to the point where you may not even actually have room on your XR device.
So Liam has got a lot of really interesting insights into what's happening with Apple, digging into a little bit more of the production pipeline for what it takes to create an Apple Vision Pro application and all the nuances of what does and does not work within the context of Unity, but also trying to find these alternative pathways, as there's WebXR support that's slated to launch with the Apple Vision Pro within the context of Safari. And yeah, with his Ethereal Engine, trying to find this alternative pathway to be able to get these experiences onto the Apple Vision Pro at launch without having to go through the existing pipeline of, say, learning to use Swift or compiling everything down into a native application. Generally, as I was talking to lots of developers and folks at Meta Connect, one of the hottest buzzes was whether folks had had a chance to try out the Apple Vision Pro or not, and some of their feedback. So there were lots of really interesting off-the-record conversations in terms of what's happening with the Apple Vision Pro. And one of my big takeaways from those conversations is that actually producing content as an XR developer is not necessarily an easy path. And if you go back to my episode #1216 from June 6, I had a conversation with Raven Zachary and Sarah Hill, who were both XR developers who were on site at the Apple Vision Pro announcement. They didn't have hands-on, but one of the comments that Raven made is that, you know, Apple's not really necessarily even interested in what's happening with the XR developers.
They're more interested in trying to bring their existing ecosystem of developers into the XR space. And so I think that's actually likely proving to be pretty correct, because a lot of the different XR developers that I spoke to weren't necessarily happy with the amount of friction there was in just even trying to get things up and running and working, and just to look at the amount of work it would take for a number of different developers to change the architecture of how their existing apps are built in order for them to work. Apple hasn't implemented anything like OpenXR, and so they're being kind of antagonistic in having this really custom, bespoke way of producing software. What that means is that for a lot of different XR applications, it is not an insignificant port to bring them over to the Apple Vision Pro. And that's not even mentioning that there aren't even controllers for the Apple Vision Pro, so even if a lot of these ports were made, then they wouldn't necessarily even work. So I think what that is likely going to catalyze, from my perspective at least, is the desire to create completely new pipelines for generating content for the Apple Vision Pro. So in this conversation, we're talking about the Ethereal Engine, and in the next conversation I have with Matt Target, he's trying to have a whole pipeline to do React Native, which for me is super exciting, because with React Native the idea is to write it once as much as possible and to have these hooks to tap directly into the native APIs. And so there might be a way to kind of write towards the OpenXR and WebXR specifications, and then be able to compile it to different targets, potentially even to the web. And then with React Native, create these native applications that would work on some of these different devices, particularly even to write code that works on both the Apple Vision Pro and the existing Meta lineup and any other XR device that's out there. I think that's sort of the dream.
We'll see how much that gets there. I mean, Unity is supposed to be the thing where you're supposed to be able to do that, but I think with some of the different restrictions of what may or may not work and be compiled down into the core graphics API of Metal, there's still a lot of open questions. So the caveat here is that this is one person's perspective, Liam Broza's, and in conversation with other folks who've had other perspectives on what's happening with the Apple Vision Pro, there may be other alternatives out there. So the idea is not that this is the ultimate truth of what's happening, but just a single perspective and some broader context for what is perhaps motivating this move into more web-based technologies. And the Ethereal Engine is really on the bleeding edge of trying to integrate all these open standards and web technologies, and also working with the Metaverse Standards Forum and the Khronos Group to bring all these things together into more of these open standards. And yeah, just a lot of fighting the good fight of trying to bring about the more open aspects of a technology stack that is perhaps an alternative pipeline for producing XR applications for WebXR and the web. So, that's what we're covering on today's episode of the Voices of VR podcast. So, this interview with Liam happened on Thursday, September 28th, 2023 at Meta Connect at Meta's headquarters in Menlo Park, California. So, with that, let's go ahead and...

[00:06:13.462] Liam Broza: Dive right in. Hi, I'm Liam Broza. I'm the co-founder and CEO of Ethereal Engine. Ethereal Engine is an open-source WebXR engine really built around being multiplayer-first, high-performance-first, and cross-modality, full-body-first. And that really means we work on mobile, desktop, AR, and VR, and we have full-body avatars that you can embody, like VRChat, right on the web. So you use VRM? We do use VRM and Mixamo. Yeah, so we allow you to upload any avatar in VRM or Mixamo. We have Ready Player Me, and we're looking at a few other pre-built avatar makers that we're building in. It's kind of a toolkit, so when you sign up and make a user as a guest, you can essentially make a social profile by connecting Facebook, Twitter, Google, Discord, all these social accounts and your whole contact list, and bring in photos, bring in your avatar, and really build like an identity wallet. That's a long-term project of ours. I think there's really a need to build a metaverse identity wallet that has your avatar in it, your name, your contact list, probably all sorts of basic settings, like you're six foot, right-handed, things like that, that you can take from experience to experience. And that's something that we've actually focused really hard on. And we've been really working on the idea of web wallets, really out of what groups like MetaMask and Trust Wallet have done. So, if you're going from WebXR site to WebXR site to WebXR site, your identity persists, or you could even say, oh, this is my business identity, this is my gaming identity, and your friends list can come with you. It's a really interesting part of the frontier of WebXR, deciding how these sites are going to interact and what's the fluidity from portal to portal to portal. And it's something I'm really interested in, going to the Khronos Group and the Metaverse Standards Forum and the W3C and figuring out. I could kind of go all around on that.
Yeah, just on the sign-up page, like there's a lot going on, so yeah.

[00:08:25.332] Kent Bye: Awesome. Well, maybe you could give a bit more context as to your background and your journey into the space.

[00:08:29.947] Liam Broza: Yeah, I went to school at RIT and Carnegie Mellon, and a lot of the people I went to school with ended up going to Microsoft Research and working on the Kinect and HoloLens with Alex Kipman, and went into Oculus and Magic Leap. So I got to live vicariously through my friends through the very beginning of XR. And around 2016, I got really tight with Magic Leap right before they launched, and a couple people I knew there started to say, oh, this is the future. And I started learning Unity, essentially, to build what we then called the LifeScope Time Machine, which was essentially like a Google Earth VR experience with scrapbooking on top. And I built that for Magic Leap, built it for a few other platforms, built it in A-Frame. That was kind of like a hobby side gig of mine. And when coronavirus hit, I had people who were around the Mozilla Hubs team and around a whole bunch of other metaverse projects come to me and say, hey, there's tons and tons of interest right now to build 3D worlds that pair well with Zoom calls. And so I started doing those projects in 2020, founded an immersive agency around that called Laguna Labs, and we did a lot of stuff. We did field trips, we did college graduations, we did all sorts of showrooms, all sorts of brand activations in 2020. And that was on top of Unity and Mozilla Hubs and A-Frame. And as we were doing that, we really started to talk to the browser makers. And they told us, like, these frameworks like Three.js and PlayCanvas, they're all about a decade old. They weren't really built for WebAssembly or WebXR or WebGPU. They have a lot of legacy cruft on them. And if someone actually turned the page and wrote towards modern standards and modern techniques, you could get near-native speed and near-native graphics. And so that was kind of an endeavor we did internally to drive projects. And we built a really cool toolbox and portfolio.
And when crypto really took off at the end of 2020 and into 2021, we were in a great place, and we ended up working on many crypto projects, PFPs, metaverse worlds, NFT projects in the background. I never really had a lot of confidence in them, but I worked in the background, I built a lot of cool stuff, and we were able to take all the money from those projects and reinvest it into this toolchain. And by 2022, the agencies that we'd worked with, because we were really a tech agency and we'd worked with other agencies that were actually doing the 3D modeling and the rollout and the marketing, they came back to us in 2022 and said, hey, can we license your toolchain? We really like how you guys are doing content pipelines and multiplayer and voice and video. And we went, oh, well, hell, maybe we actually have a platform here. So we spent all of 2022 putting all of our tools in a box. And that has grown tremendously in the last 18, 20 months and become the Ethereal Engine platform. And you can check it out at etherealengine.com. There's a whole gallery of worlds, from augmented worlds to virtual worlds. And they're everything from hangout spots, to stores, to games, to education, to fitness. People are building all sorts of stuff on it. So, that's probably a good start to explain what's going on.

[00:11:48.449] Kent Bye: Well, we were at a house party last night and having lots of extended discussions on all sorts of different topics. And I think a theme that I took away is that, in light of Unity's surprise announcement of retroactively adding all these runtime fees, which they've since been in the process of walking back to certain degrees, but some aspects are still there, it created this larger backlash for folks who've been building all of these immersive experiences and technologies and businesses on Unity, to have them retroactively pull the rug out from underneath them and add all these retroactive charges. It gave a little bit more insecurity for a lot of folks. And I've seen a lot of folks moving over to Godot Engine. And I've also seen this potentially catalyzing more discussion around open standard stacks as an alternative. And whatever Google and Qualcomm and Samsung might be working on, I know that with Brandon Jones being a core contributor to WebGPU, there could be an open tech stack that could be a part of some future potential headsets that may or may not be existing on that front. But also, there's Apple that's in the mix, and they have taken a very antagonistic approach with the Khronos Group, not implementing different things like OpenXR and just trying to do their own thing and roll their own stuff, but in a way that may actually, in the long run, be a huge detriment to the ecosystem of software for the Apple Vision Pro, because they have a tech platform stack that no one really wants to use with Swift, you know, kind of like the hindered aspects of Unity. So there's a lot in there, but I'd love to hear, like, just how you start to see this existing zeitgeist of the moment. Lots of different big players, lots of different shifts in the technology stacks, and a lot of potential futures for what a more open tech stack might be and what you see as your role in that.

[00:13:31.067] Liam Broza: Yeah, I would say that the problem is even worse than that. When I'm talking to clients, they usually work backwards from the financial model. They're essentially trying to figure out how much money they're going to spend up front to make the experience, and then how much money they're going to spend marketing the experience to drive people through all of these gates of getting them to go to an app store, download the app, and actually use it. And they have the gates at the engine level, where the engines are now charging about 10%, sometimes more. And a lot of them are really afraid that, like, Tim Sweeney or John Riccitiello is going to come out next year and mess with their top line and increase it. And then they have to give another 10% to 30% of their top line to the app store. We did a lot of projects in crypto and subscription and micropayments, and a lot of those apps got rejected because they didn't monetize exactly the way Apple wanted or something like that. And they got resistance there. And then if you build on Unity and Unreal, you're talking about multi-gigabyte apps. I've tried to shrink down apps as much as humanly possible, and they end up usually around 5 to 10 gigabytes. And a lot of devices, especially mobile phones, don't have space. They have like an average of 500 megs of space left. Most people's iClouds are maxed out. Most people's hard drives are maxed out. And even getting people to the download button, it'll say not enough space, and they'll go through attrition. It's starting to happen on these cheapo Quest 2s: you install 10 games on a 128-gig hard drive, you're out of space. And now they're having to decide what to uninstall and reinstall. The web doesn't have any of those problems. The other real problem is a lot of people want to build cross-modality. They look at Unity and they go, oh look, it exports to Windows, Mac, Xbox, all these things. But it's not a magic button. You don't just hit render targets.
You end up having to fracture your Unity app into many different versions to hit all these verticals. And to this day, they don't really have a good pipeline for how to share code against different compile targets. And they don't really have a good way to build a character controller. So when I build an interactable object in the world, I can use my virtual joysticks on my phone, in like a Fortnite or Roblox experience, and interact with that. I can interact with that object on keyboard and mouse. I can interact with that object in full-body VR or AR. But I usually have to rewrite all the interaction logic and all the character controller logic for each platform. And it ends up with a whole huge host of problems. So, as we've been building Ethereal Engine, we've been attacking that problem directly. Like, how do you write it once and execute it everywhere? How do you build a set of guidelines and tools so that not just multiplayer, but cross-modality multiplayer is right out of the box? How do you build a CMS? So, I design a big, large world, and there's trees and tables and all sorts of things. Each of those assets is baked in many levels of detail, and when I load up on a phone or a Quest, the Ethereal Engine will just benchmark that device. It'll figure out what its memory, CPU, and GPU capabilities are, and then bring in just the assets at the level of detail that fit inside that device's budget. So it's really just one level that has a low level of quality for the iPhone, medium for Android and the Quest, and a high level for desktop and beyond.
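The benchmark-then-budget idea Liam describes can be sketched in a few lines. This is purely an illustrative sketch, not Ethereal Engine's actual code; the field names (`deviceMemoryGB`, `gpuScore`) and thresholds are assumptions chosen to make the pattern concrete:

```javascript
// Pick a level-of-detail tier from a quick device benchmark, so a scene
// only streams in assets that fit the device's budget.
// `gpuScore` is assumed to be a normalized 0..1 benchmark result.
function pickLodTier(caps) {
  const { deviceMemoryGB, gpuScore } = caps;
  if (deviceMemoryGB >= 8 && gpuScore >= 0.8) return "high";   // desktop-class
  if (deviceMemoryGB >= 4 && gpuScore >= 0.4) return "medium"; // Quest / recent Android
  return "low";                                                // older phones
}

// Given a scene manifest with one asset URL per tier, resolve the list
// of URLs the client should actually fetch for this device.
function resolveAssets(manifest, caps) {
  const tier = pickLodTier(caps);
  return manifest.map((asset) => asset.urls[tier]);
}
```

The key design point, in contrast to the Nanite approach he mentions next, is that only one tier per asset ever crosses the network, rather than every tier sitting on the user's disk.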

[00:16:57.067] Kent Bye: It's like a responsive design for spatial.

[00:16:59.429] Liam Broza: You got it. Yeah, you know, we kind of think of it as like baby Nanite right now. And we're working to do more advanced things like that. But that's one of the techniques we've really come upon: building instancing and level of detail into a content management system that can stream in. Which is kind of the most advanced technique being done right now in Unreal Engine 5, if you've seen Nanite. But they don't actually stream from the cloud. When you go and download the Matrix demo for Unreal Engine 5, it's a hundred-gigabyte download, because you have to download every single asset 18 times for 18 levels of detail. And that sits on your hard drive, not in the cloud, and then is streamed up to the GPU frustum, and that's how Nanite works. It looks like a really great trick to demo, but it actually doesn't make that good of an app, because Cyberpunk Phantom Liberty is 200-some-odd gigs? Like, you got 200 gigs of space to download that game? Looks great, but it's actually hard for people to play these things, and it's really funny to watch gamers now have to decide, like, I can only fit two or three games on my hard drive, you know? And it's usually just the AAA games they want to play the most. So yeah, I know I'm kind of sprawling with my answers and topics here, but there's a lot of really bad systematic problems with being a creator right now.

[00:18:20.691] Kent Bye: Yeah, I can see how there's a lot of systemic issues that you're addressing at many levels. But there's also this aspect I want to dig into a little bit, for whatever you can speak about, especially around what's happening with some of the architectural decisions that Apple has made. When I did the interview with Brandon Jones about WebGPU, he was working on the WebGPU spec, also working on WebXR. And WebXR, in order to work, has to have some sort of thing that is like OpenXR. It doesn't have to be OpenXR. It could be anything. But it just so happens that OpenXR is IP that's owned by the Khronos Group, and WebXR is through the W3C. And Brandon said, well, there's some sort of pending lawsuit that I don't want to get into. But there's some sort of beef that's going on between the Khronos Group and Apple that is creating this situation where Apple's being antagonistic about using some of these open standards that pretty much everyone else in the entire industry is using, but they want to do their own thing. It gives me the sense that they're going down this path of being super closed and secretive and not really being driven by open standards in the community, meaning that anybody that actually wants to build on it has to go through this really antiquated, weird, bespoke workflow to create their app just to get it on there. I'm not convinced that a lot of people are actually going to think that it's worth it, especially when the price point is $3,500, and they're not going to have a software ecosystem. I'd love to hear any of your reflections as to what's happening there. Why is there this beef between Apple and Khronos, and why are they being so antagonistic to open standards?

[00:19:43.323] Liam Broza: I would say, when it comes to WebGPU and Vulkan and the Khronos Group in general, Apple does not want to cede a lot of control. They very much wanted to do their own thing on the OS level, which has led them to build Metal, which is extremely hard to use, in my opinion, relative to the other things out there. And they were one of the founding groups of WebGPU, and I believe they've had some beefs on small technical implementation details that they've now turned into red lines, where they're like, we're not ceding any control, because they don't like when people make decisions that they don't like. It's put them in kind of a funny position. It's pretty much public knowledge that they've been kneecapping Safari for a long time. Probably the biggest problem it has is an insanely low memory limit of only about 280 megs per browser tab, relative to, like, Chrome on Android, where you can allocate gigabytes. And that's what really keeps you from being able to build really interesting AR experiences. Like, everything else about the browser is totally fine. They could totally remove this artificial limit. I've talked to people at 8th Wall about this. It's really kneecapped their ability, because their software could do incredible stuff on Android that they just can't do on iOS. This has come to the forefront in the EU; advocates at the W3C are testifying in front of the GDPR board, antitrust boards, and antitrust advisories in California and elsewhere to really force Apple to modernize the browser and really lift this memory limit, allow for PWAs, progressive web apps, allow for web workers, push notifications, all these things that would make web apps more first-class apps. They're also being pushed this way because of enterprise. Name one successful enterprise SaaS app that isn't a web app. They all are. And when it comes to single sign-on, compliance, deployment, all of that stuff's done on the web.
That's why companies magically take WebXR and web browser support so seriously. It drives the enterprise ecosystem. And so, Apple's being dragged into this kicking and screaming. Apple, just in general, is in a really bad bind with the Vision Pro. They can't manufacture it for well-documented reasons. They only really have a Unity pipeline working. It's not working that well, from what I understand. You really need to have a very constrained set of requirements to get things to compile on Metal. You have to be on Unity 2022 with URP, and 98% of stuff in the Unity Asset Store won't work for these apps. There are also other restrictions, as, you know, you can't use AR Foundation, ARCore, MRTK, which is like the foundation of 95% of these immersive apps. All that has to go out the window. We were doing a survey in LA, because there's a lot of immersive devs out there, and we were saying maybe 5% of Unity apps could be easily ported over. A lot of the ones that my friends are working on that they wanted to port over, they were looking at it like they would have to rebuild it from the studs. Only the assets would really come over. Major sections of the logic would have to be rewritten. And Apple's native tools suck. They're not easy to use, not well-documented, not very feature-rich, and I think that most developers would die a thousand deaths before learning Swift. So, you jump through all these hoops, which most people aren't going to do, and then there's not many people to sell it to. So I think they're going to be completely starved for content.

[00:23:12.661] Kent Bye: And when they do buy it, there's going to be no content because of all the hoops; it doesn't make financial sense for anyone to really build for it.

[00:23:18.368] Liam Broza: Yeah, except for clout, right? And that's the only reason you'd build: for clout at this point. We're in a really interesting spot because there already is WebXR support in the Vision Simulator. We've tested our engine out. It's working great. And we're going out and telling people, hey, you can build for mobile and desktop, where you'll see 95% of the user hours or more, and then extend those experiences to AR and VR and make them future-proof. And you can go buy a Vision Pro and run the experience in the Vision Pro and get all the clout and have launch content on day one. I think we have eight projects in the works right now that are on timeline to be launch content for the Vision Pro, and each of those customers is, like, planning to buy a Vision Pro and shoot video of the experience in it. And yeah, it's just for clout. They expect, like, no one's actually gonna do it, but they'll have a YouTube video of someone doing it. So, yeah. If anybody listening wants to build launch content for the Vision Pro, reach out to Ethereal Engine. WebXR is the way to do it.

[00:24:20.232] Kent Bye: Well, the other thing that they haven't necessarily committed to fully is that, you know, they've said that they're going to be launching WebXR for the Safari version that's going to be shipping with the Apple Vision Pro, but they haven't committed to, say, launching WebXR for all their other ecosystems, with Safari on iOS or Safari on iPadOS or Safari for macOS. I mean, there's a lot of different Safaris, and as far as I can tell, they haven't necessarily even verbally committed that they're gonna be pushing the green button for WebXR on any of these other platforms. And so I feel like there's another aspect of, yet again, the entirety of the WebXR community kind of waiting on Apple to lift a lot of these constraints, at least from the memory limit that you talked about, but there seem to be other things, like: are they even going to launch WebXR? It feels like they have, again, the incentive, from their own business model reasons, to send people to the App Store to take that 30% cut. There's financial incentives for them to artificially make Safari a horrible experience. Hopefully, the EU will be able to step in on the antitrust side. I'd love to hear any of your thoughts, especially from a WebXR perspective.

[00:25:18.892] Liam Broza: Yeah, well, this gets into the wonky part. Yeah, so, at the very beginning, this was no surprise to me that WebXR came to the Vision Pro. If you look at the WebKit core, Safari's core is called WebKit. It's been open source for a long time. The PlayStation browser and Tizen TVs, like Samsung TVs, all use this core. And it's had WebXR code in it for almost three years now. And everybody has been hoping and praying that Safari 17 would have it everywhere, and Apple's been kind of kneecapping it, as you've said. There's kind of a funny thing, which is, the W3C essentially defines what is a web browser. A web browser is essentially defined as all these different APIs, from the media API to the HTML API and all these things. And they have different classifications: like JavaScript, it has to have JavaScript; AV1 video support, it should have AV1 video support, but it doesn't have to. And, like, really being a browser in good standing is kind of having a preponderance of these APIs. If you ever see the website Can I Use, it'll show you what each browser has. And they're all kind of becoming uniform, because everybody's switching over to the Chrome core. Like, Chrome is more or less the canonical one. So WebXR sits in this, like, weird multi-year space with WebVR, where it started off as, like, a provisional spec, and then it went to a recomme- I forget the names of them. It's like provisional, then a recommended spec, then a hardened one. And as years go on and on, it kind of moves towards a required API. And we're, I think, on the timescale like a year or two out from when the W3C's rules say that this actually has to be in the browser. Apple has gotten a lot of flack from Tim Berners-Lee and others because they implement these APIs as slowly as they possibly can without incurring the wrath of the W3C. They've slowly started to roll stuff out, essentially by being publicly shamed.
Some of it's Tim Berners-Lee taking the podium and calling them out. Some of it's the European Union saying, this is a blatant antitrust problem. And it really takes us to collectively shame them and call them out. I've always debated how much should I start this fight because we're all kissing their ass but also angry at them all at the same time. But yeah, I think it's now becoming the time to be a lot more vocal about this because it is extremely easy for them to fix this problem. They just refuse to do it.
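While WebXR availability stays uneven across Safari versions, sites cope with the graceful-degradation pattern implied above: feature-detect the WebXR Device API and fall back to a flat 3D view where immersive sessions aren't supported. A minimal sketch follows; it takes the navigator object as a parameter (rather than using the global) only so the same logic can be exercised outside a browser, and the mode names returned are just illustrative labels:

```javascript
// Decide how to present a 3D scene given a navigator-like object.
// Uses the standard WebXR Device API entry points: `navigator.xr` and
// `XRSystem.isSessionSupported(mode)`, which resolves to a boolean.
async function chooseRenderMode(nav) {
  // Browsers without the WebXR Device API at all (e.g. older Safari).
  if (!nav.xr) return "inline-3d";
  try {
    if (await nav.xr.isSessionSupported("immersive-vr")) return "immersive-vr";
    if (await nav.xr.isSessionSupported("immersive-ar")) return "immersive-ar";
  } catch (err) {
    // isSessionSupported can reject (e.g. blocked by permissions policy);
    // treat that the same as "not supported" and fall through.
  }
  return "inline-3d";
}
```

In a real page you would call `chooseRenderMode(navigator)` and only show an "Enter VR" button when an immersive mode comes back, so the experience still works on every Safari that lacks the API.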

[00:27:50.073] Kent Bye: Well, I had pretty much given up all hope on them, except that they hired Ada Rose Cannon, who's one of the spec editors for WebXR. And, you know, it gave me some sense of hope, but I'm still skeptical that it's actually going to happen, that they're actually going to turn it on for all these different platforms.

[00:28:04.500] Liam Broza: Yeah, it's not just Ada. There's several people from Mozilla Reality and elsewhere. They've even hired some people from the EFF who are all true believers. And I won't be saying any public names, but behind the scenes, there is a big push to turn this around. You know, conceptually, if you think of the metaverse as a federated set of experiences that everyone has the sovereignty to bring across nations, across systems, and connect together, there's really only one thing that's ever been constructed like that, and that's the web. And the web has all these magical abilities. I can go to a domain now and instantly load up interactive software in a safe sandbox and interact with it. No app store, no installs, nothing. With a lot of assurances that this experience is coming from somebody and it's relatively safe and contained on my machine. The web browser is, it's not perfect, but it's a pretty darn good place to start building a metaverse. Especially with things like WebGPU, allowing us to do graphics and machine learning compute at near-native speeds. You install a browser and it just immediately picks up all the CPU and GPU and sensor resources and standardizes and wraps them now. It's really magical, and it's really getting mature. I have to hand it to Samsung, Microsoft, and Google for really bringing this into its own in the last few years. I'm careful with what I say, but there's a lot of energy now to make sure WebXR is first-class supported on a lot of current and future devices. I'll leave it at that. But there's a reason I'm betting the farm on WebXR.

[00:29:48.580] Kent Bye: Yeah, well, you just mentioned Samsung, Google, and Qualcomm. Is there anything that you can say in terms of how some of these open tech stacks may or may not be playing a part of that future?

[00:29:58.390] Liam Broza: Well, I think the last count, I'll say this. Last time I talked to people at Qualcomm, they said 54 headsets are coming out in the next 12 calendar months on the XR2 platform, and they're all going to be off a common Android core, with three or four different Chromium forks in the works right now to run on top of them. So I think it's going to be a very good time. Stay tuned, kids. Yeah, I'm betting that Snapdragon Spaces is gonna have a lot of legs. Wolvic by Igalia is moving over to the Chromium core now, and I'm hoping by the end of the year they'll have something that's pretty darn close to the performance and feature set of the Oculus browser, which is kind of considered the gold standard for WebXR implementation. And yeah, I'm always afraid to speak too much, because I really want access to these new and fun devices without getting myself into trouble, like most people do.

[00:30:58.221] Kent Bye: But yeah. Cool. Well, I'd love to hear any other reflections on this broader revolt, almost like people moving towards these more open systems from Unity to things like Godot, and potentially a WebGPU open web tech stack, and the larger zeitgeist that's happening right now of this movement towards these more open standard platforms.

[00:31:19.197] Liam Broza: Yeah. When it comes down to the people that are moving to Ethereal Engine, they talk to me about saving their top-line margin and regaining control over their ability to directly connect with customers. The magic of the web is not just that you get a customer and you bring them to this WebXR experience, but that the customer can now very easily type in an email address or a phone number and send out a link, or copy and paste a link, and very quickly and easily share. And that type of opportunity for viral growth is huge. A lot of our customers are really driving this stuff from email campaigns and social campaigns where they only have one touch point of a link to get directly into the experience. That's just massive for conversion, like 10x better conversion than anything else. We've been playing a lot with this new concept for creators where we make them a virtual world, or we make an augmented reality experience, that they can then live stream out of. That stream then goes to Twitch or YouTube, and at the bottom of the YouTube listing is a link that can take you right into that experience live, or let you see the experience recorded with full-body avatars playing it back. And we're talking to a lot of interesting influencer agencies in and out of the VR space about building those types of fan connection experiences and passive-to-active engagement loops. And I don't think that's really possible anywhere else but the web.

[00:32:56.049] Kent Bye: And finally, what do you think the ultimate potential of virtual reality, mixed reality, and all these open standard tech platforms might be and what it might be able to enable?

[00:33:07.316] Liam Broza: My fantasy is really augmented reality. Virtual worlds are fun, and I might be going into those a few hours a week, but I really want the AR headsets I can have on all the time. I want to be able to walk down a promenade and have some sort of spatial index where I can look at a restaurant. And there's some sort of tying of GPS or VPS to a domain name to a physical building. And I can look at a restaurant and, you know, it's a Chinese restaurant, and a dragon flies out and shows me the specials. And that type of federated sovereignty, like I have a domain or some sort of digital home, so that when I come up to you, our systems can interact and we can trade and interact on a flat, connected surface. That's a lot like where I started the conversation, talking about wallets, identity, these basic interoperable pieces that just drive our life. And I think it's coming sooner than people think. I'm playing around now with a lot of systems where you can form a party and go from site to site to site as a group in virtual or augmented spaces and go on a kind of federated adventure, and you don't have to be inside VRChat or Horizon Worlds, inside a single ecosystem, to do actual exploration. So we're actively at the W3C and the Khronos Group and the Metaverse Standards Forum figuring out portals and parties and wallets and things like that. And once that's sussed out and there's a nucleus of experiences that show how it's done, I think everyone's going to hop on board really, really quickly. And I'm kind of disappointed to see that Apple and Meta are not really focused on the big problems as much as the immediate problems. They're not really working backwards from, okay, what is life with AR headsets on 10 hours a day? And I do think that that is something we should really strive for. I think that setting the phones down, closing the laptops, going out into the world is the potential of this thing. And it should be pursued at full force.
I want to be able to go learn as I walk around my town. I want to be able to be communicating while experiencing. I want digital telepathy. I want digital telekinesis. So that's what AR means to me and where I want this to go.

[00:35:36.657] Kent Bye: Awesome. Well, I'm going to be heading over to the WebXR meetup here that's happening. And thanks so much for taking the time to give a little bit of background of what you're working on. I think it's going to be very timely, a huge intersection of these open platforms. And I think the time is right for something like this and to see where it goes in the future. So thanks again for taking the time to give a little insight for all these larger dynamics. So thank you.

[00:35:57.515] Liam Broza: Thanks for giving me the mic, Kent. It's always great listening to your podcasts. You have a great curation of the people who are really passionate here. So I'm happy to be a part of that. And I really just have to shout out my team. We're coming up on 40 people now, and I stand on their shoulders to be able to share this. The WebXR community, I think, is such an awesome little niche inside the broader immersive space that's really finally coming into its time. Awesome.

[00:36:27.210] Kent Bye: So thanks again for tuning in to one of my dozen episodes about MetaConnect. There's lots that I've been unpacking throughout the course of the series, and I'm going to invite folks over to patreon.com to join in to support the work that I've been doing here as an independent journalist trying to sustain this work. Realistically, I need to be at around $4,000 a month to be at a level of financial stability, and I'm at around 30% of that goal. So I'd love for folks to join in, and I'm hoping to expand out different offerings and events over the next year, starting with more unpacking of my coverage from Venice Immersive, where I've just posted 34 different interviews from over 30 hours of coverage. And I've already given a talk this week unpacking a little bit more of my ideas about experiential design and immersive storytelling. And yeah, I feel like there's a need for independent journalism, independent research, and just the type of coverage that I'm able to do. If you're able to join in on the Patreon, $5 a month is a great level to help support and sustain it, but if you can afford more, then $10, $20, $50, or even $100 a month are all great levels as well and will help me to continue to bring this coverage not only to you, but also to the broader XR industry. I now have transcripts on all the different interviews on the Voices of VR podcast, and I'm in the process of adding categories as well to the 1,317 interviews that will have been published after this series has concluded. So yeah, join me over on Patreon, and we can start to explore the many different potentialities of virtual and augmented and mixed reality at patreon.com slash Voices of VR. Thanks for listening.
