#966: Overlaying Multiple Layers of Reality with Pluto VR Telepresence + Standards-Driven, Multi-App Ecosystem

Pluto VR is a general-purpose VR telepresence application that hopes to provide the social presence glue for a standards-driven, multi-application ecosystem using SteamVR on the PC. I had some new conceptual breakthroughs about the potential future of spatial computing after getting a demo of how Pluto VR works with other applications like Metachromium or Aardvark. Metachromium can run entire WebXR applications as an overlay on SteamVR applications, and Aardvark is a future-forward framework that allows for the development of augmented reality “gadgets” that run on top of virtual reality experiences.

All of these technologies are utilizing the overlay extension of OpenXR in order to overlay augmented layers of reality on top of VR experiences, and they’re working together in a way that will facilitate emergent behaviors that break out of the normal 2D frames of our existing computing paradigm. When you run an application on mobile or on your computer, there’s usually only one application in focus at any given moment. You can context-switch between apps, or copy and paste, but our computing paradigm has all been happening within the context of these 2D frames and windows.

The promise of spatial computing is that we’ll be able to break out of this 2D frame and create a spatial context that allows apps to directly interact with each other. This will initially happen through lighting, opacity shifts, or occlusion between 3D objects, but eventually there will be more complicated interactions, collisions, and emergent behaviors discovered between these apps.

Moving from 2D to 3D will allow application developers to break out of this metaphorical frame, but it also means that app developers won’t have as much control over the precise environmental and spatial context that their application will be running in. This is a good example of where I think the process-relational thinking of Alfred North Whitehead’s Process Philosophy has a lot to teach XR application developers, who can start thinking in terms of the deeper ecological context within which their spatial computing app is going to exist.

Facebook has not yet made it possible to run multiple VR applications at once on either its PC or Quest platforms; it’s Valve’s SteamVR that is providing the platform on PCs for experimentation and innovation here. It’s admittedly a bit cumbersome to launch and connect each of these disparate applications together, but over time I expect the onboarding and overall user experience to improve as value is discovered in the types of augmentations these overlay layers can provide. Pluto VR has an opportunity to become the Discord of VR by providing a persistent social graph and real-time context for social interactions that transcends any one VR application. It’s an app that you can hop into before diving into a multi-player experience, but it also enables players to stay connected during loading screens and other liminal and interstitial virtual spaces, like the Matrix-style home screen of SteamVR.

Pluto VR has been working with a number of open standards that will be driving innovation in XR as an open platform, including WebRTC, glTF, VRM, OpenXR, WebXR, Web Bundles, XRPackage (XRPK), as well as the Immersive Technology Media Format (ITMF) from the Immersive Digital Experience Alliance (IDEA). They also hosted the W3C workshop on XR accessibility to gather more insights for helping to cultivate accessible standards in XR. The Pluto VR team is taking a really future-looking strategy, hoping to help kickstart a lot of innovation when it comes to creating AR widgets that can be used in VR environments, and perhaps eventually be ported to proper AR applications.

Covering the emerging technologies of augmented and virtual reality since May 2014 has helped me to isolate some of the key phases in the development and evolution of a new medium.

[Image: communication-medium-as-process]

First there’s a new emerging technology platform that enables new affordances; then the artists, creators, makers, & entrepreneurs create apps and experiences that explore the new affordances of the technology; then there’s a distribution channel to get these experimental pieces of content into the hands of audiences; and then audiences are able to experience the work and provide feedback to both the tech platform providers and the content creators.

The OpenXR and WebXR standards are enabling distribution channels for immersive content through apps like Metachromium and Aardvark, and the OpenXR overlay extension allows this more modular AR gadget content to be used within the context of existing VR applications running on SteamVR. Pluto VR then connects creators directly with their audience to share their WebXR apps or Aardvark AR gadgets and get that real-time audience feedback loop. This has the potential to complete the cycle and catalyze a lot of experimentation and innovation around which types of AR apps and widgets prove to be useful within the context of these VR experiences.

Here’s a demo video that shows how a variety of WebXR applications can be launched within a shared, Pluto VR social context:

I’ve had a number of interactions with the Pluto VR team over the past couple of years, and I’ve been super impressed with their vision of where they want to take VR. They also likely have a lot of cash reserves, as they’ve kept a small footprint after raising a $13.9 million funding round announced on April 13th, 2017.

I had a chance to talk with two of Pluto VR’s co-founders, Forest Gibson and Jared Cheshier, on Friday, December 11th, after getting a demo that blew my mind about the future concepts of spatial computing. We cover their journey into VR, and how Tim Sweeney’s The Future of VR & Games talk at Steam Dev Days on October 12, 2016 laid out some of his vision for how the metaverse is going to include a lot of different applications within the same game-engine-like, spatial context. So much of the VR industry, mobile computing, and PC applications have been stuck inside of a 2D, windowed frame and closed context that it was really refreshing to get a small taste of where all of this is going to go. They share some of their early surprises in spatial computing, and they know that there will be many other key insights and innovations discovered with the multi-application technology stack that they’ve been able to set up. This is a very community-driven effort, and they’ll be showing off their technology and connecting with the wider VR community during the Virtual Market 5 (VKet 5) in VRChat starting on December 19th.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. So I had a demo this past Friday that really opened my mind up to a whole range of new possibilities for the future of spatial computing. So Pluto VR is a telepresence application, and they've created a layer of social interaction that you can put on top of other VR applications. And so right now we live in this world where any spatial application that you're running, either in VR or AR, is completely self-contained in this app mindset where it's the only app that's running. So most of the social interactions that we're having in social VR are in the context of a social VR app, say Rec Room or VRChat or AltspaceVR. But Pluto VR is trying to create a layer that's outside of the simulation. So you can think of it as a generalized telepresence application that you could use to run on top of VRChat or on top of these other applications, or to be in the interstitial loading screens within SteamVR, and to have the ability to connect to other people, sort of like Discord or Slack or other ways that you coordinate with people. And then from there, you jump into the VR experience. Well, this is a way to have this persistent spatial telepresence with other people, and to use an ecosystem of other applications that are using web-based technologies, like Aardvark as well as Metachromium. They are each web browsers: a headless Chrome browser in the case of Aardvark, and a normal web browser in the case of Metachromium, able to run WebXR applications. And it's trying to paint this vision of an interoperable spatial computing platform with multiple applications running at the same time, interacting with each other. And the social glue that is going to be there to create it as a distribution platform is Pluto VR. So you can start to rapidly prototype different open web technologies within the context of WebXR, and use OpenXR to inject this into this shared spatial context. And generally, it's this shared telepresence application that Pluto VR is creating. So I had a chance to talk to two of the co-founders of Pluto VR, Jared Cheshier as well as Forest Gibson. And we explore the genesis and the evolution of why they started Pluto VR, but also look into the future for what this means as you start to combine all these different spatial applications into the same shared context, where they're actually interacting with each other. What is the concept of these layers and the concept of a shared spatial context? And what does it mean to think of that more with the metaphor of a game engine, where these different entities and objects start to interact with each other? And what does that mean for the future of computing? So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Jared and Forest happened on Friday, December 11th, 2020. So with that, let's go ahead and dive right in.

[00:02:54.545] Forest Gibson: My name is Forest Gibson. I'm a co-founder of Pluto VR, and I've been working in the world of virtual communications technology for quite a while now, I think just past six years. And I've been doing VR stuff in general a little bit longer than that.

[00:03:10.435] Jared Cheshier: Yeah, and I'm Jared Cheshier, also one of the co-founders of Pluto VR. Forest and I have been working on VR since before Pluto, so it's pretty much the same amount of time that we've been working together.

[00:03:19.707] Kent Bye: Okay, yeah, so maybe you could each give a bit more context as to your background and your journey into VR.

[00:03:27.432] Forest Gibson: Yeah. So, you know, as a kid, I remember getting a chance to experience virtual reality at the Pacific Science Center in Seattle and really being blown away by the future of the technology. And that was in the mid-to-late nineties. And then that went away for a while. And then I had a really interesting experience where I got a mysterious text message from a friend saying, you know, go to this location, just do it. And I did. And I got a really early demo of some of VRstudios and VRcade's technology up in Seattle. And I was kind of blown away by what I experienced, because I had a fully wireless headset experience plus a motion-tracked controller. And this was before the release of the DK1. So a really early, fully positionally tracked experience. And then, you know, they're like, okay, got it. You know, I was so excited. They're like, okay, you got to get out of here. And then I left. And on the way out, the next person who was coming in to demo was Jared. And I was like, I know this guy. And then I went back up with them and watched Jared demo. And from that experience, Jared and I just looked at each other and we're like, hey, we've done startups before, we've worked together before. We actually originally met each other doing video production in middle school as part of Yakima Valley Community College's tech program for a summer course. And we just said, okay, this is it. We've got to get into VR. And so we just dove in and started experimenting on what was possible at the time. That's where Impossible Object and Giant vs. Horde came from, Impossible Object being our previous company that morphed into Pluto later, and Giant vs. Horde being this entertainment experience where one person was in a full-body mocap suit fighting a horde of other people, this asymmetrical game. We discovered presence there. At first, we had this person in a mocap suit, and we recorded some stuff, and they're moving around. Then once we got DK2s, and we could have positionally tracked headsets where multiple people could be in together, we had these white floating party masks. We looked at each other, and suddenly seeing the other person's mask moving like a real person was way more interesting than this full-body mocap. We just looked at each other, whoa, that's a person right there. And that was sort of what sparked, you know, got us on this journey towards founding Pluto with John Vechey and Jonathan Geibel. And that's the brief story of how we got here. My background's varied. I've done a bunch of marketing. I've done a bunch of viral videos. I did Know Your Meme. I've sort of been all over the place. But yeah, that's sort of my history in VR.

[00:05:54.643] Jared Cheshier: Definitely. And it's quite a bit of a shared history that we've got in VR technologies. And Forest and I have known each other for a very long time and have collaborated on a lot of different types of projects. And I think we'll always work on really cool, interesting projects. And so, yeah, I think the areas that are maybe a little bit unique for me and my background, areas where I haven't worked with Forest, are that I've had a really cool opportunity to work at other companies. I worked at Valve Software. I've worked at Microsoft. I've done some work in consulting, doing some development at a local development firm in Seattle called General UI. We did a lot of different augmented reality and virtual reality experiences pre-DK1. And even, you know, at different technology companies Forest and I worked at, we brought in augmented reality technology, using some of the earliest tech, like the Metaio stuff that eventually got acquired by Apple and, I think, pretty much became parts of the ARKit stuff. So we've been working on this stuff because it's super exciting, and it's kind of like the thing that you're promised as a kid in science fiction about what could exist. And I just love all that stuff.

[00:06:53.560] Kent Bye: Yeah, there's a bit of a collision of our backgrounds too, Forest, in that I was a part of the video blogging community and the Yahoo video blogging group back in 2004, 2005, before YouTube had even launched. And so I was definitely watching Rocketboom and Know Your Meme, and I know Kenyatta Cheese and Andrew Baron and, you know, the whole crew there. So it's a bit of Internet history where memes and the role of memes have been very important. And the connection that I see, at least, to what you're now doing with Pluto VR is that you're starting to try to figure out this whole new spatial communications language in VR. It's starting to play with all these different layers of reality. And with those different layers of reality, the way I think of it at least is, it's a telepresence app, but it's really built around having multiple applications run at the same time. And I think we're at the point now where you have an ecosystem where you can even functionally do that. So maybe you could talk about the metaphor you use to describe these many different layers of reality. I mean, one that I use, at least after experiencing it, is that you have all these VR apps, and in some sense you have these augmented reality layers that you're overlaying on top of these VR contexts, with the coordinate systems and whatnot. And so you're opening up Steam, you're running these different Chrome browsers at different layers, a headless Chrome with Aardvark and Metachromium, but you are interfacing with all these apps to be able to actually have all these different layers of reality that you're mashing together. That's at least how I describe it. I don't know how you describe what it is that you're doing.

[00:08:26.645] Forest Gibson: Yeah, so I'll start off on a super high level, and I think Jared can dive into some of the more technical bits. Look at this as a transitional time in computing, right? If you go back to the early days of personal computing, you had DOS, where you could run an application. And there was this pivotal moment that happened where GUIs became possible, right? You had Apple's products, or Mac OS, and then you had Windows, where suddenly you're not just in an application. You were experiencing this whole virtual background, the desktop environment, where you can experience multiple windows of applications at the same time. In that context, it was this concept of two-dimensional windows or frames. That's what allowed this really huge innovation in computing, because you're no longer limited by one application. Then you started getting concepts like copy and paste. Imagine using your computer today without copy and paste. The ability to do that is such a foundational layer to what makes our computing systems reliable and usable today: that concept of these multiple layers at the same time. And I think we're just seeing the same evolution happening again in VR and AR, and sort of spatial computing in general, which is this transition from single world or single app into this spatial GUI, these spatial windows, these volumes of applications that can overlap and overlay on top of each other. And that's really, from a design concept, the thing that's happening: this multi-app world is allowing you to do more than one thing at the same time. And I'll give one example that's very, very simple, but I think very powerful. Imagine you have a pen, and that's an application, right? And then you also have a color picker, so you can choose which color you want. Those two things don't have to be the same application. And you could swap out a different pen or a different color picker, so that any number of developers can make the best pen and any number of developers can make the best color picker. And then just imagine the emergent behaviors that start to evolve from that. I just said two apps, but suddenly, if I bring up a 3D model of a teddy bear, maybe that teddy bear can accept that color picker. So I can change the color of the teddy bear with the color picker. And you start to get all these magical emergent properties when applications can run alongside each other spatially.
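To make the pen-and-color-picker idea concrete, here is a minimal TypeScript sketch of that capability-discovery pattern. Everything in it (ColorProvider, Paintable, the shared registry) is invented for illustration; it is not Pluto VR's actual API.

```typescript
// Hypothetical sketch of two independent apps cooperating through a
// shared spatial context. None of these names come from Pluto VR.
type Vec3 = { x: number; y: number; z: number };

// A capability an app can advertise into the shared context.
interface ColorProvider {
  kind: "color-provider";
  currentColor(): string; // e.g. "#ff0000"
}

// A capability for anything that can be painted (teddy bear, car, ...).
interface Paintable {
  kind: "paintable";
  applyColor(color: string, at: Vec3): void;
}

// The shared context reduced to its essence: a registry of capabilities.
const sharedContext: Array<ColorProvider | Paintable> = [];

// The pen app ships no palette of its own; it discovers whichever
// color-picker app happens to be running alongside it.
function penStroke(at: Vec3): void {
  const palette = sharedContext.find(
    (c): c is ColorProvider => c.kind === "color-provider"
  );
  const color = palette ? palette.currentColor() : "#000000";
  for (const target of sharedContext) {
    if (target.kind === "paintable") target.applyColor(color, at);
  }
}
```

The point of the sketch is that neither app knows the other exists ahead of time; the emergent teddy-bear behavior falls out of a shared registry plus a shared coordinate frame.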

[00:10:48.865] Jared Cheshier: Yeah, and like Forest mentioned, one of the technical capabilities that we've kind of entered into as an ecosystem is that there's actually a means by which you can start to prototype and develop those applications today, which is just a really cool threshold to have met. And the opportunities that exist for multiple developers to create applications that can run alongside each other, they exist today. So there are frameworks like the Aardvark framework, and there's the ability to run multiple WebXR applications simultaneously. And both of those paths exist today. And then with standards like OpenXR and engines like Unreal and Unity, you'll be able to do those things in the future. And the opportunity that exists around using virtual and augmented reality as part of the same system is you can start to develop applications for augmented reality use cases before virtual reality and augmented reality really merge. So you could expect augmented reality applications that work in VR to work really well in real life. But you can make them today, before you actually have those cool AR glasses everyone's excited about for the future.
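For readers who haven't written one: each "app" in this multi-app picture can be as small as a web page using the standard WebXR Device API, which a browser like Metachromium then composites over other running content. A minimal TypeScript entry point looks roughly like this (typings as provided by a package such as @types/webxr; the actual drawing is elided):

```typescript
// Minimal WebXR bootstrap using the standard WebXR Device API.
async function startXRApp(gl: WebGLRenderingContext): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-vr"))) {
    throw new Error("WebXR immersive-vr is not supported here");
  }
  const session = await navigator.xr.requestSession("immersive-vr");
  // Render into a WebGL layer that the browser composites each frame.
  await session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local");

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // ...draw this app's objects once per view in pose.views...
    }
    session.requestAnimationFrame(onFrame);
  });
}
```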

[00:11:55.147] Kent Bye: Yeah, I just had a chance to do some demos with you in Pluto VR. And what my experience was, is I open up Steam. I also have Aardvark running in the background. Metachromium is also running. You have this app that is able to launch these apps. So we're in a shared social space, but we're not really in any one app. We're just in this space where it's like the Pluto VR space, that's kind of like an overlay on top of my own personal solipsistic SteamVR virtual reality space, but you're beaming in there. So you have a shared social space within each of our own solipsistic spaces. So you could be in whatever virtual space you're in, but we have a shared social space together, and you start to bring in different applications that are interacting with each other in different ways, at least from the point of having different layers of opacity that are being altered and changed. I guess a metaphor that I think of is Photoshop, where you have different layers and the ability to overlay them on top of each other. And here there's definitely some dimension of overlay where there was some occlusion. Sometimes, like when we opened up the big building, all of a sudden your avatars disappeared, and I had to shut off the avatars and turn them back on, or put you back on top. But you're able to have a model of a car and then draw with a pen on top of the car and have the pen be occluded by the car, which to me started to blow my mind, because it's like, oh my God, you have two separate models, but they're interacting with each other in this shared virtual space. This concept of a shared context, where usually when you're in Photoshop, you don't just take a Photoshop pen and start drawing in Premiere or start drawing on your website. You know, those contexts are completely separated. So it's like, what does this paradigm mean, to eliminate those contexts and have a shared context amongst all these objects that are interacting with each other? And I don't know, that kind of blew my mind in terms of, wow, okay, what does this mean? I didn't expect this. I've been in spatial computing for so long, and I didn't really extrapolate in my own mind some of these different things. And it sounds like you've been discovering some of these similar types of mind-blowing moments.

[00:13:57.335] Jared Cheshier: Definitely. I mean, we've been kind of imagining what all this could be for a while, and kind of aiming for the scenario in which we reach this threshold as an industry where we can do this kind of stuff. But even when we've imagined that, it's not until you experience it that you get kind of shocked by the capabilities that are there. And with that ability to run the multiple applications as if they're really kind of truly meant to be together, it just unlocks all these things you can't necessarily predict, even if you know it's going to happen. And so the invitation that I kind of want to put out there is: if you can try and experiment with stuff, it's great. But even just getting to the point where you start imagining use cases where you can count on some amount of additional apps running, it just opens up all kinds of possibilities that aren't there in an ecosystem where one application is running and it has to provide everything, and it's in one place and one location.

[00:14:51.201] Forest Gibson: Yeah. I mean, there's a couple of other moments I've had recently. With our kind of shared WebXR launcher, you can start to have shared lighting, which was something that was really crazy: that kind of weird old Western town that we opened up, it has lighting. And so when we brought in the Grand Theft Auto map, it actually relit that map. And so there's this idea of having additional pieces of these 3D ecosystems. And lights in 3D are often hard because they're often invisible, right? Yeah.

[00:15:18.850] Jared Cheshier: And after that app, we're like, okay, well, a flashlight app suddenly makes sense now. And you wouldn't have thought about that before. Who would have thought you'd have a flashlight app you'd be able to use with other applications?

[00:15:30.142] Forest Gibson: Exactly. And I think I'm really excited to explore that, because that was accidental. I was loading up multiple apps and then suddenly the whole world changed. And I looked around and said, what happened? And Jared's like, oh yeah, I think they share the same scene graph. And so there was another light there. Whoa, mind blown. There are so many more possibilities. And I know that with XRPK there's collision meshes. And so there's this idea of having multiple WebXR apps that can actually provide colliders, so that with multiple applications providing them, you know, you could have a ball and you could bounce the ball off another app.

[00:16:02.446] Kent Bye: Yeah. Maybe you can explain like, what is an XRPK?

[00:16:06.095] Jared Cheshier: Yeah, an XRPK is a web bundle with some additional information you can add that makes it friendly to run alongside multiple applications and be loaded up. So it's web bundles, and it allows you to do things like take a WebXR app and bring it over as a single file. You can also bundle up things like GLBs or glTF files, as well as VRM avatars, and it makes it really friendly to share these things across to different places that support them.
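For a sense of what that "additional information" can look like: an XRPK carries a small manifest alongside the bundled files. The field names below follow my reading of the XRPackage project at the time (an xr_type distinguishing a WebXR site from a bare glTF or VRM payload, plus an entry file); treat them as illustrative rather than normative.

```typescript
// Illustrative XRPK manifest, shown as a TypeScript object for clarity.
// Field names are approximate, based on the XRPackage project.
const manifest = {
  name: "boombox-gadget",
  xr_type: "webxr-site@0.0.1", // or e.g. "gltf@0.0.1", "vrm@0.0.1"
  start_url: "index.html",     // entry point inside the web bundle
};
```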

[00:16:33.705] Kent Bye: So a web app, so I'm assuming that's HTML, JavaScript, CSS. Does it also include WebAssembly?

[00:16:39.732] Jared Cheshier: Yeah. Yeah. You can make a WebXR app that has WebAssembly. Any type of application that can run in the browser, you bundle up those files and that's what's available. It's kind of an alternative to hosting a website.

[00:16:50.395] Kent Bye: The way I think of it, at least, is like on mobile, where you go to a website and it has what's referred to as a progressive web app, where you can save it, and then it essentially becomes like a native app once you have the progressive web app. So I kind of think of these XRPKs like these progressive web apps. They're these self-contained apps, but they have spatial dimensions to them. Is that kind of similar? Or how are they stored? Do you consider them to be kind of like apps? I mean, they're called XRPKs, but I've never heard of that before. You know, what's the closest metaphor to it in the 2D realm? Is it close to the progressive web app, or is it closer to, like, a JS bundle of things? What's it closest to when it comes to the 2D metaphor?

[00:17:29.308] Jared Cheshier: Yeah, so it's a web bundle. So you might deploy a web bundle, and it could be a progressive web application, like a normal 2D web application. And you want to be able to bring it with you; you want to have it loaded up in a browser in which it's not necessarily being hosted. And all the underlying technology is definitely just straight-up normal browser technology, what you would normally load up in a 2D browser. So these WebXR apps are just straight-up web standards. They're WebXR applications in which all the files that you need are containerized, you know, in a container that you're able to have with you. And it is based on a web standard, which is those web bundles. It's just that there's some additional information in there that allows them to be friendly, to be loaded and run alongside each other.

[00:18:12.127] Kent Bye: OK, yeah, I hadn't come across web bundles too much in my old days of the web. So the other metaphor that I think of is just that you're having these objects potentially interact. I think of it as sort of like an API, an application programming interface, to be able to exchange information. But in this sense, it's spatialized information. So I don't know if you could metaphorically have these objects have an API that says, if it detects a collision, then it catalyzes this type of behavior. So you have these self-contained objects that are able to interact with the collective environment that you add them into, but they have this universal way that you can pull in these objects and kind of have them interact with each other.

[00:18:54.608] Jared Cheshier: Yeah, so we're still in the early days, in which multiple apps interacting with each other is an area of discovery that we're really interested in. So if you think about the kind of behavior you see in WebXR applications right now, you're running one app at a time. And as you're running multiple applications at a time, we're just starting to discover what it is and how you want to have applications communicate with each other. Sometimes you want to have them communicate with each other, but you don't always want them to communicate with each other. And so if you think about it as these discrete, separate layers, they don't inherently have interaction to start off with. You kind of have to implement what that interaction is. So if you did want to create an app that was a ball that was able to be thrown and collide with collision meshes from another application, that would certainly need to be implemented at some level. And you may want it to be at the level where you're loading apps together, but you may want it to be a discrete connection between the collision mesh app and the ball app for whatever reason. And I think the really interesting way to think about this is to kind of mimic what the APIs look like on the augmented reality devices that are on the horizon. Because the opportunity here is, why not make it so the AR app you make, which works really well alongside other AR apps in VR, also works really well in the real world? And the real world is having this really difficult time of, like, reassembling and reconstructing the world from camera images and sensor data, where you get this mesh of the real world and you have to rebuild everything. With virtual and augmented reality, it's all already there. So I think a really interesting opportunity is to look at that interface as kind of like one interface: a collision mesh for the world, and maybe it's coming from the real world or maybe it's coming from the digital world. But rather than trying to solve these problems in silos, treat it as an open ecosystem with a future progression in which these apps come with us. And I think that's really the cool opportunity that exists that I want people to start thinking about, because you could make that cool ball app today, and then, you know, a few years down the road when you have AR glasses, still have it with you.
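Here is a hedged TypeScript sketch of that "one collision interface, two worlds" idea. All of the names are invented; the point is only that a ball app written against a single WorldCollider interface doesn't care whether the mesh came from a virtual scene graph or from AR sensor reconstruction.

```typescript
// Hypothetical sketch: one collision interface, two sources of truth.
type Vec3 = [number, number, number];
type Triangle = [Vec3, Vec3, Vec3];

interface WorldCollider {
  // The world's collision mesh in a shared coordinate frame.
  collisionMesh(): Triangle[];
}

// In VR the mesh is already known exactly.
class VirtualWorldCollider implements WorldCollider {
  constructor(private triangles: Triangle[]) {}
  collisionMesh(): Triangle[] {
    return this.triangles;
  }
}

// On AR glasses it would come from depth sensing and scene
// reconstruction; stubbed out here.
class RealWorldCollider implements WorldCollider {
  collisionMesh(): Triangle[] {
    return [];
  }
}

// A ball app written against WorldCollider runs unchanged in either world.
function stepBall(
  ball: { position: Vec3; velocity: Vec3 },
  world: WorldCollider,
  dt: number
): void {
  // ...integrate position by velocity * dt, test against
  // world.collisionMesh(), and reflect velocity on contact...
}
```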

[00:20:55.142] Kent Bye: Yeah, just the fact that you can start to do this rapid prototyping in VR, where you know all the data. It's like this idealized realm where you have everything precise. Whereas in AR, you have to do a lot of computer vision, and everything's a lot more sloppy. So you can really lock in stuff and get it dialed in to the point where it feels like these layers sit right on top of each other. So I can definitely see how a lot of the different experiments that you're starting to do here are going to start to work out what spatial computing means when you're able to really have all these things interact with each other in the same shared context in that way, and what types of things people want to do, and then the different types of behaviors that emerge out of that, things that we can't necessarily completely imagine. But I can say that we have a pretty mature system of 2D web apps that has already developed. And then you start to think about, okay, what's the 3D version of each of these? And then on top of that, have them break down the existing firewalls and have them all in the same spatialized context, playing and interacting with each other in the same shared space. It's like, okay, that's a level of complexity that I can't quite fully imagine. But I think as people start to experiment and tinker with that, then they can start to figure some of that out.

[00:22:10.668] Forest Gibson: Yeah. One thing I'll say is, it's hard because there's so much potential. There's infinite potential in VR, and that's what we saw when VR was first taking off, and there's infinite potential here. The real thing to consider and to think about is starting from the foundation, starting simply. One app I'm really excited about experiencing in this sort of multi-app use case is Ruler. I want to be able to measure things, because once I have multi-app, the interaction is: how long is something? It's a really straightforward interaction, and the API is: be in the same coordinate frame so you can measure how long something is. And suddenly, Ruler becomes a very, very important and meaningful multi-app, when on its own, if someone said, I made a VR app that's a ruler, that doesn't make any sense. And then you're like, well, what do you measure? There's nothing in the app besides the ruler. But in this multi-app world, that makes a ton of sense. And you kind of see this in AR, where one of the first apps that Apple released in their ARKit stuff was a ruler. And so start as simply as possible, because the complexity will come later, and you can't just get to the complexity right now.
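Forest's ruler API ("be in the same coordinate frame") translates almost directly into the real WebXR Device API: once apps agree on a reference space, measuring is just a distance between poses. A minimal sketch:

```typescript
// Measure the distance between the user's two controllers, in meters
// (WebXR's native unit), within an agreed-upon reference space.
function measureBetweenControllers(
  frame: XRFrame,
  refSpace: XRReferenceSpace,
  left: XRInputSource,
  right: XRInputSource
): number | null {
  if (!left.gripSpace || !right.gripSpace) return null;
  const a = frame.getPose(left.gripSpace, refSpace);
  const b = frame.getPose(right.gripSpace, refSpace);
  if (!a || !b) return null;
  const dx = a.transform.position.x - b.transform.position.x;
  const dy = a.transform.position.y - b.transform.position.y;
  const dz = a.transform.position.z - b.transform.position.z;
  return Math.hypot(dx, dy, dz);
}
```

Because every app in the shared context resolves poses against the same reference space, the same measurement works against objects that other apps have placed.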

[00:23:17.150] Kent Bye: Yeah, well, maybe we could talk a little bit about this larger ecosystem here, 'cause it's quite an interesting approach where you have Pluto VR, you have Aardvark, you have Metachromium, and they're each doing slightly different things. So maybe you could describe some of the apps that I went through here, because in some ways there's a lot of friction for people who have to have all these things running at the same time. It's not like just running one app; these multi-app environments require all these apps. So maybe you could just break down each of these and how they're playing with each other.

[00:23:44.993] Jared Cheshier: Yeah. So Aardvark is a future-forward framework that was started by Joe Ludwig. And it's a means by which you can create gadgets. So think about items, 3D things that you kind of bring with you or have around you. These gadgets are really friendly. There are a lot of things in Aardvark that make it especially friendly for applications that don't have to update at frame rate. So you can have things like, I made a boombox. It plays music. You can see it from across the room, and it can play music. And it doesn't need to have a lot of different shaders and things going on that are updating at frame rate, necessarily. It's a 3D model that has some functionality, some UI, and you can listen to music. Another app was cards: playing cards that you can bring out. And they're these static objects. Again, the faces of them don't need to update at frame rate. And because Aardvark's built in this declarative way, those applications are really friendly to run alongside each other and also to be super performant over time. And that kind of framework is one that's future-forward, in that it's not itself a standard. It's still built on web standards, and it's really friendly for web developers to create an Aardvark app. But it's not just straight up a standard you can go reference, or get the benefit of all the different libraries that are available. Whereas with Metachromium, which is a spatial web browser that allows you to run WebXR content alongside virtual reality content, you can run WebXR alongside Aardvark apps, and you can run it alongside virtual reality. And then there's Pluto. And Pluto allows you to communicate with people as if you're in person. You can have an in-person-like experience, and then you can also use apps that are made using different frameworks simultaneously alongside all of these things. And one of the things that we see that's really interesting is any one of these can kind of bootstrap the whole use case. You can go into Pluto, you can launch an Aardvark app, you can launch a WebXR app. But also the opportunity is, you know, you can go into Aardvark, find a cool app, and get connected into a Pluto conversation from that direction as well. And the thing that I'm really excited about is, you know, these things are in this current state and you can create these apps now. So you can definitely experiment with this stuff and start to build applications that work in this way. But with OpenXR and some of the different standards that are coming together, and some of the different engines, you know, you'll be able to use Unreal Engine, you'll be able to use Unity over time as these extensions are adopted. And so we're doing work with Unreal Engine to make this possible as OpenXR is coming out. And we're really excited for the ecosystem to form, and you can expect all the different tools to be there with you.
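For flavor, here is roughly what a gadget like that boombox looks like in Aardvark's declarative, React-based model. The component and prop names below are from my recollection of the @aardvarkxr/aardvark-react starter and may not match the current API exactly; treat this as a sketch of the style, not a verified example.

```tsx
// A minimal Aardvark-style gadget: a grabbable 3D model with a small
// interactive panel floating above it. Nothing here re-renders at frame
// rate, which is what keeps gadgets cheap to run side by side.
import * as React from "react";
import {
  AvStandardGrabbable,
  AvTransform,
  AvPanel,
} from "@aardvarkxr/aardvark-react";

export function BoomboxGadget(): JSX.Element {
  return (
    <AvStandardGrabbable modelUri="models/boombox.glb">
      {/* UI panel hovering 15 cm above the model for play/pause. */}
      <AvTransform translateY={0.15}>
        <AvPanel interactive={true} widthInMeters={0.2} />
      </AvTransform>
    </AvStandardGrabbable>
  );
}
```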

[00:26:21.270] Forest Gibson: Yeah, and this is the vision that sort of has been laid out in the past. And I know that we were at Steam Dev Days back in what, 2016? Mm-hmm.

[00:26:30.034] Jared Cheshier: Yeah, so in 2016, there were a bunch of different talks, and one of them that we attended was Tim Sweeney's talk, and it was this talk about the future of VR. And there was this description of a future in which not one company is going to build the metaverse; everybody's going to participate in this way. And it didn't get into the technical details of how this was going to be implemented. And so at the time, it was very unclear how this was going to occur. But kind of where we are now, it's very clear that one way to weave these things together that's really powerful is having multiple applications that run alongside each other. And there are even specific callouts in the talk in which he says there's going to be multiple engines, there's going to be multiple frameworks and open standards and protocols. And at the time it kind of sounded like this thing that was unclear. But standing where we are right now, with the capabilities that you had a chance to experience, I think we've reached this moment that's really interesting, in which there really is an opportunity for an open metaverse, which includes a bunch of different apps produced by many different people.

[00:27:31.354] Kent Bye: Yeah, and maybe you could talk a little bit about some of the other demos that you had. I know one of your engineers, Ryan, was telling me about some of the medical applications. And so maybe you could talk about what's happening with how you're applying Pluto VR in these other contexts.

[00:27:45.887] Forest Gibson: Yeah, so there's a medical startup that's really focused on building telerobotics technology so that they can perform surgeries remotely. And one of the challenges with remote surgeries right now is they're limited by a video screen. It's not the same experience as being in the operating room. And they're leveraging spatial technologies and having these layers. So I'll break down the layers, right? They have Pluto, so people who are in the room, as well as people who are remote, can all be spatially relative to each other and feel like they're together in person. And then they actually are using WebXR to have applications that visualize this data, so they can visualize the inside of a heart. And so I got a chance to experience the inside of an aorta, actually, and they're able to have this catheter going through this model, like they could do in a real surgery. And not only can people be like this, where they're able to see and hear each other spatially as if they're in person, but then they can be in the aorta. They can have the aorta around them; they can actually be shrunk down to a very tiny size. And then they can also have a mini-map, right? Because you're not limited to one reality, you can have multiple instances of the same reality at different scales. And so they have the mini-map version, and then they have the full-size version. And then one person's running an additional app, which is a separate application, right? There's Pluto. There's the sort of visualization app. And then there's the controller app, which is a separate app. And what it does is it enables them to use whatever the HMD's controllers are to control this robot. And so one person's operating, and they're controlling this robot, and everyone's able to see it together. And everyone's able to feel present with each other, even though they're geographically all over.
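The "same reality at different scales" trick maps naturally onto an ordinary scene graph: one source of truth, rendered twice. A small three.js sketch (the aorta model here is just a placeholder group, not anything from the actual medical app):

```typescript
// Two instances of one model at different scales: a full-size version
// you can stand inside, and a cloned mini-map version on a virtual table.
import * as THREE from "three";

const scene = new THREE.Scene();
const aorta = new THREE.Group(); // placeholder; e.g. loaded from glTF

// Full-size instance the surgeons can be "inside" of.
scene.add(aorta);

// Mini-map instance: the same geometry, cloned recursively and shrunk.
const miniMap = aorta.clone(true);
miniMap.scale.setScalar(0.02);
miniMap.position.set(0, 1.0, -0.5);
scene.add(miniMap);
```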

[00:29:26.243] Kent Bye: Yeah, as I hear that, what comes to mind is that a lot of VR apps would have to have their own social networking aspect to them. So VRChat has its own way of having a social presence. And I'm sure that Oculus has different ways that they've implemented SDKs to have a social layer. Then there's something like Normal VR that has, within the context of a Unity app, ways of doing those social interactions, and potentially even other Unreal Engine and other implementations as well. But this seems to be breaking outside of that app context, where you're able to have an agnostic context in which Pluto VR has created its own social context, but it fits in with this larger set of open standards where you're able to potentially pull in a Unity app or an Unreal Engine app. Which I kind of think of as a fractal geometry: if you pulled in a Unity app and you have a social network inside of the Unity app, then you're going into these layers where you kind of want to break outside of the simulation into more of a godlike presence. And so how do you have that social networking outside of the context of that application? It seems like Pluto VR is having that omniscient context and being able to pull in other applications, but not being limited to having only the social interactions within those closed contexts of these other Unreal or Unity apps or WebXR apps.

[00:30:41.485] Jared Cheshier: Definitely. I mean, the way that we think about it is, what you basically described is all the attributes of physical location. And so with our purpose of helping humanity transcend physical location, it's the way that we can manifest our purpose most directly. So being able to do that to the fullest degree you can, in physical life but also in VR. Being able to have that context and that connection, but not being limited by any one particular app or being trapped in one particular app, is a big motivating factor and a lot of what allowed us to discover this stuff. We've thought about this from that outside perspective of being able to fully transcend your physical location, where no matter where you are, you can still be in person with other people and be able to kind of do anything.

[00:31:25.009] Forest Gibson: And there are some really interesting benefits that come from this, right? When you're separating these out into these different contexts, there's the simplest one, which is: you're using a VR application with someone else and you go into the loading screen, right? The loading screen of loneliness. And with Pluto, you're able to remain connected. You can still see and hear the other people as if you're in person, even through these different loading screens. But another, kind of more utilitarian version of this is security layers, right? When it's not one application, you can be an application developer and you can produce and develop your own application that you have total control over. And you can still leverage Pluto's presence to connect with other people. And Pluto has, you know, only the information you want to give it about your application, right? There might be some things that you offer to make it easier to get into a shared context, but you don't have to do that. So if we wanted to look at a 3D model together and we were using this other application, we could do that in a secure way that never touches Pluto's servers, that never goes off our own systems, but you could still use Pluto alongside that. So I think there are a lot of interesting benefits there too, aside from sort of the single-stack model where you have to upload all your assets to one provider.

[00:32:33.397] Kent Bye: Yeah, and I know around the time Aardvark first came out from Joe Ludwig, there was an announcement of a whole open standard for layers that I saw you were involved with as well. Maybe you could talk about this open standard for layers and how it fits into this ecosystem of all these other standards.

[00:32:48.790] Jared Cheshier: Definitely. So with OpenXR, there's an extension that we've been working on for quite a while that's being co-authored by some folks over at Epic and some folks in the OpenXR working group, and we got some help from a company called LunarG, which is a big part of the Vulkan ecosystem. And we've been working on this means by which you can run multiple applications in OpenXR, similar to the way that we're doing it today in OpenVR. And so the opportunity that exists with OpenXR is to have this be much more performant and to have all kinds of cool new features that allow overlay applications to run alongside each other. And it's really the playground for being able to create things where you can do interesting compositing and stuff. And so we're really excited about that extension. You can check it out in the OpenXR working group materials; if you go find the OpenXR portal, you can check out the extension that's there. And there are also some presentations out in the wild describing how to implement this stuff as well. But really, it's the kind of stuff an engine has to implement.

[00:33:47.933] Kent Bye: Yeah, and I know Jin, and I think I stumbled into the very first meeting of M3. And I don't know, were you there at the very first meeting of the Metaverse Makers Mixer group?

[00:33:59.276] Jared Cheshier: I wasn't at the first one, but I'd certainly enjoy attending those. Like, that's one of the most fun communities to go hang out in VR.

[00:34:05.203] Kent Bye: Yeah, I know, because I went to the Decentralized Web Summit in 2018 and I met Jin, and I ran into him and other folks from that community back in 2019 at the Decentralized Web Camp. And I feel like that group, the Metaverse Makers Mixer, or the M3, whatever it stands for now, the third M was silent for a while, but now the Metaverse Makers are like this hacker group that has been getting together. I know Avaer used to do Exokit, now Metachromium, which is more on the Chromium base, to really push forward this potential experimentation there, and to maybe have that feed back into these larger ecosystems at some point. But it seems like the types of experimentation that have been happening here, I know I went to a talk that happened at one point where I know that you were all there, and there was like a transfer of an object between one reality and the next reality. I don't even know exactly what was happening. I was taking lots of shots.

[00:34:59.456] Jared Cheshier: Yeah. So that particular one, I think it was at AWE, was the first, like, spatial copy-paste. So we copy-pasted an object from one application into Hubs. And so everyone saw this kind of magical moment where Avaer was grabbing this object, copying it, and then pasting it into Hubs. And everyone saw this sword appear in the middle of the room, and then he wielded the sword. And it was incredibly epic that this moment occurred. But yeah, that community is some of the folks that are participating in these standards and these capabilities.

[00:35:31.955] Kent Bye: But they were using Pluto in that, right? Was he seeing a Pluto layer, and then he copied it from the Pluto layer into the shared context? Is that what happened?

[00:35:40.219] Forest Gibson: I think we were on Pluto as kind of a back channel comms. So we, you know, in the same way that you might have your own kind of private comms in an event like that, we were having that social layer, but Pluto is the people layer. And so that was just another application that was running alongside Pluto.

[00:35:54.135] Kent Bye: Okay, yeah, I was there for that, and it felt like a historic moment. But I didn't exactly see precisely what happened. And even when I saw it, it was like, okay, what actually happened? But it was the copy-paste moment from one reality to the next, where we have this sort of layering of different realities. And I think a lot of that innovation, like at Virtual Market 4, I know that I was hanging out with the Metaverse Makers, the M3 folks. I know you were there. And that also seemed to be a context in which you had all of these different things happening. And you had people capturing QR codes and then unlocking artwork. And so you didn't know what the artwork was. And so everybody was going around, you know, having this layer of augmented reality and capturing QR codes. And if you collected enough of them, then you had this whole art piece at the end, which I saw a video of afterwards, because I wasn't going around myself. But maybe you could talk about the context of Virtual Market, because coming up here at Virtual Market 5, you're going to have a booth there and be involved in maybe announcing this to the larger community. So maybe you could just talk about what happened at the last Virtual Market 4 and the different types of experimentation that were happening with these multiple layers of augmented reality on top of this whole virtual reality, and then where you see it going here at Virtual Market 5.

[00:37:07.470] Jared Cheshier: Definitely, yeah. So Virtual Market 4 was just a really cool place to go experiment with all the different technology that has been assembled. And it's a bustling, active community of people, and lots of people in VR at the same time discovering and finding things. And so, yeah, we were running this Dark Booth app, running around scanning these QR codes, because there are QR codes everywhere at VKet. And so we had this chance to record these QR codes, which would unlock the Dark Booth, which is like this hidden booth. And it was just a really magical experience to be able to see that. And it was interesting because there were some people who didn't have the Dark Booth app. And so they're like, wait, you see something? This actually means something? And so suddenly there's this ARG-style experience on top of VKet that VKet didn't produce. It was this fan group that loves VKet that made this Dark Booth using this technology. And so it kind of showed the potential of what you can do in VR. And I just imagine this stuff all being really cool in the future, and in real life too. But yeah, it was so much fun. And the community is so much fun to run around with and experiment with this different technology. So this year, it kind of inspired us to participate. And VKet doesn't have a huge Western audience, but it's doubling every year, I think. And I just saw a bunch of really interesting things going on. And so yeah, Pluto's got a booth at VKet 5. And we've got our own original booth, and we've got some things you can check out there. So if you want to go explore, there are some things to discover. And it's a really cool place that I'm excited to go experiment in and discover interesting stuff. There are definitely cool avatar and wearable items, and there's all kinds of progression around virtual representation that we're just excited to go learn about and participate in. And then it's a cool place to go hang out and help support, too, because VKet's just a lot of fun for a lot of people.

[00:38:58.895] Forest Gibson: Yeah, and one note on the Dark Booth project: it's a really powerful moment. There's a lot of talk right now in the ecosystem about the fear of these locked-down, closed platforms, and about how, when you can have an open platform, it helps foster innovation and creativity. That a disconnected community fan group could create an ARG, an alternate reality game, just like you would in person, right? Like you could totally have set up an underground ARG as part of a major conference, and that could happen now in a virtual conference. And the fact that the conference hosts didn't make that, right? That's a really powerful moment where you're breaking out of this concept of top-down ownership, where just because you have the space means you have total control over it. It's like, no, if you're at a conference center, people do all sorts of stuff. You can kind of be creative, and you can get your word out, you know, you can do your art piece, or you can try to promote your brand. And no one's telling you, this is the only way you can do it, with all these restrictions. And so I think that really gave me hope for the potential in the ecosystem.

[00:40:05.296] Kent Bye: Yeah, I've had this experience in VRChat a lot of times, where you have the difference between the local and the global variables, where you have something that only you can see, and options that you can do. So it's like your own solipsistic world, but then you have global variables that everybody can see. And it's not always clear which are the global variables and which are the local variables. So you're kind of having your own reality, but you're not quite sure what other people are seeing, because you don't know what kind of switches you flipped, with night or day or whatever that ends up being. But it can be drastically different between people being in the shared social space but having different experiences. And here with Pluto VR, it's like a whole additional layer: potentially being able to bring in other objects or other entities, other experiences. You may be with your small group of friends going through a VRChat world, but you may be seeing a shared reality with each other that's sort of like these ghost-like apparitions, a shared hallucination. But because it's within your shared context, like this fractal geometric instance within that instance, other people can't see it. But you start to have these layers of reality that are kind of nested into each other. And it's quite interesting to start to think about who can see what, and what is going on, and what the experience of anybody is, because you're already starting to add in all these additional layers on top of something that's already this virtual reality.

[00:41:24.159] Forest Gibson: Yeah, I mean, I think about the progression that I see, especially for a really young generation. A lot of parents and adults right now have a hard time parsing all the realities that their kids are experiencing just on mobile phones, right? How many different communication apps and instant messaging things and photos and videos and TikTok and you name it, all these other things? Those are little tiny slivers of different realities, and that's still hard for some people to parse. And this is going to take it to many orders of magnitude more complexity, right? If we think about the many people who still struggle in dealing with personal computers and desktop computing, there are a lot of layers there when you say, okay, go to your settings and go to this thing, go to this sub-sub-sub-menu. That's still complex for a lot of people. And I think that raises some interesting questions in terms of understanding all these complex layers. And that's sort of why we're talking about starting simple, and why our default experience, and what we're kind of saying, is: start with just one layer. Start with just, here are people, you can see people, great. And much as we showed you, you know, we would often bring in one application at a time, so you could kind of start to build up that understanding, like, oh, this is a different app. And then you recognize that app, and you can close it, and you can kind of start to build that up. Because it's a muscle; you can't just go in and be like, here's 50 layers, and you have no idea what's going on. These things have to be built up. And there's a whole new computing design language that we all have to start speaking. And it's going to take a while to learn that.

[00:42:50.844] Kent Bye: Hmm. Yeah, well, I'm curious to hear about any other open standards you think are going to be pretty key here. I know you're already working with a lot of standards, but there are standards still being developed, whether it's avatars with VRM, or NFTs and cryptocurrency, or WebAssembly, which is a whole new thing that's going to open up all sorts of new stuff. So as you're starting to work with spatial computing, what are some of the other key open standards that either you are directly starting to work with, or that you see being important in the future?

[00:43:21.505] Jared Cheshier: Yeah, we put a lot of time and effort into implementing things utilizing open standards and open protocols, because the opportunity that exists around this open ecosystem is so powerful once you have those. So there's the real-time transmission of things, for which we use WebRTC. There are the rendering methods out there, and the way you can connect software to silicon with OpenXR. There's also WebXR, and there's a lot of progression around glTF, and some of the standards in W3C that are starting to form that will feed into these more immersive web technologies as well. So that's a really interesting area that's progressing quite a bit. There's some amount of opinion out there that WebXR can't do as much as it clearly can, so it's really cool to be able to experience some of this stuff directly, and the standards, I think, in a lot of ways help empower that. The things we're really interested in on the horizon are the things that don't have an example in place right now, like light field formats. Light field formats are really interesting because they allow for some types of reprojection that are just incredible. So we're really excited about and are contributing to standards across all of those. We're members of W3C, we're members of Khronos — and we're a small company, a startup, but we're participating in these areas and helping remind folks to think about multiple applications across all of these different areas. And then we're really excited about new standards forming around this stuff, because I think there's a way in which all of this can be open, and it needs to be standardized and open.
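To make the real-time transmission piece concrete: below is a minimal sketch of how a telepresence layer might push pose updates over a WebRTC data channel. RTCPeerConnection and RTCDataChannel are standard browser APIs, but the channel name, the message shape, and the userId field are illustrative assumptions — not Pluto VR's actual protocol — and the offer/answer signaling is omitted.

```typescript
// Sketch: a WebRTC data channel for low-latency presence updates.
// (Offer/answer signaling between peers is omitted for brevity.)

interface PresenceUpdate {
  userId: string;                              // hypothetical identifier
  position: [number, number, number];
  rotation: [number, number, number, number];  // quaternion
}

const peer = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Unordered, lossy delivery: for head and hand poses, the newest packet
// matters more than retransmitting a stale one.
const channel = peer.createDataChannel("presence", {
  ordered: false,
  maxRetransmits: 0,
});

channel.onmessage = (event: MessageEvent) => {
  const update: PresenceUpdate = JSON.parse(event.data as string);
  // Apply the remote user's pose to their avatar here.
};

function sendPose(update: PresenceUpdate): void {
  if (channel.readyState === "open") {
    channel.send(JSON.stringify(update));
  }
}
```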

[00:44:53.763] Forest Gibson: Yeah, I mean, we're participating in IDEA for the ITMF format. It's still looking towards the future, but it's going to be an important piece of that light field technology.

[00:45:02.110] Kent Bye: And what about avatars? I've heard some stuff about VRM here. I know that's something avatar creators have started to use, and in Japan there are quite a number of companies forming consortiums and standardizing around VRM as an avatar format. So I'm curious if that fits into what you're doing, or if it has the potential to become more of a generalized W3C standard — avatar embodiment and representation seems to be a pretty significant aspect as well.

[00:45:27.831] Jared Cheshier: Yeah, so there's an analogy out there that Avaer mentioned — actually, I don't know if I can nail the analogy, but basically VRM is based on glTF. glTF is a standard that's already out there, and VRM is extremely powerful. So in Pluto, we have a glTF loader, and we're working on VRM. It's one of the things we're really excited about with VKit, because there's this adoption that we're seeing in that region, and we're really excited to support it as users have been requesting it, because people are using these VRMs to represent themselves across different places. The opportunity is to have a representation that you're able to bring around with you. So we're really excited about VRM getting more adoption, and we're definitely working on it.
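Because VRM rides on top of glTF, a generic glTF loader can be extended to handle it. Here's a hedged sketch using three.js and the community @pixiv/three-vrm plugin — illustrative tooling only, not Pluto VR's actual loader.

```typescript
// Sketch: loading a VRM avatar. VRM files are glTF with a VRM extension,
// so a stock glTF loader plus a VRM plugin gets an avatar on screen.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";
import { VRMLoaderPlugin, VRM } from "@pixiv/three-vrm";

const scene = new THREE.Scene();
const loader = new GLTFLoader();

// Register the plugin so the loader parses the VRM extension data
// (humanoid bone mapping, expressions, look-at) alongside the glTF.
loader.register((parser) => new VRMLoaderPlugin(parser));

loader.load("avatar.vrm", (gltf) => {
  const vrm: VRM = gltf.userData.vrm;
  scene.add(vrm.scene);
});
```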

[00:46:07.243] Kent Bye: Yeah, especially in the VTuber community — having these really high-fidelity expressions and being able to translate that seems to be a pretty big thing there too. So, you have OpenXR being implemented in SteamVR, and also by Unity and by Epic with the Unreal Engine. What do you expect to be enabled by having, say, OpenXR in a Unity app? What can you do with that that you can't do now?

[00:46:35.584] Jared Cheshier: I think the portability that it's meant to provide is going to be really powerful. As a developer, you can create applications and they can run in all kinds of places, even on hardware you may not have. It's going to be really interesting. And then OpenXR as a system allows for extensions, for us to experiment in really interesting ways. The way that we've contributed, I think, is helping create that overlay extension. So for the ability to run multiple applications simultaneously with OpenXR, there's a very clear path to doing that. And as a developer, if you want to use Unreal or Unity, those clearly need to be tools that folks can use. So we're really excited about where we are in this moment, in which we have the ability to run WebXR and Aardvark apps, but we're also really excited by Unreal Engine and Unity being able to come in there too. And the way that's happening is through that overlay extension. It's just a big moment as OpenXR gets adoption, because it allows for experimentation in lots of different ways that would be really difficult if you didn't have that extension system, which is a key part of OpenXR.

[00:47:41.008] Forest Gibson: Yeah, the method by which we're currently rendering and producing this AR-on-VR system is still really early, is based on some older technology, and is definitely not ideal. It's not going to be as performant as these new standards, and it's also hard to implement. So really, our goal is to make what we're doing here — what you experienced with Pluto — accessible to the widest audience possible. What's really important is that we need more people in here experimenting and making AR and VR before AR devices are pervasive out in the world, so we can learn these design patterns and make great AR apps.

[00:48:17.582] Kent Bye: And on the accessibility part — I know Pluto VR was involved with the W3C accessibility workshop, which I was really impressed to see; I'm generally impressed by how far the reach of your little startup has been in helping shape the future of this entire medium by participating in these open standards. But accessibility seems to be one of the really big issues in terms of where this is going in the future and making sure that we're not just creating applications for able-bodied people. So maybe you could talk a bit about that W3C workshop and how you came to be involved with the accessibility initiatives there.

[00:48:54.513] Jared Cheshier: Definitely. So W3C has been really, really welcoming to us. As a startup, it might seem a bit strange for us to be participating in all these different open standards, but it really is the area we wanted to start with first, because they're offering ways that we can learn and then help. We were onboarded and got just the warmest welcome from folks there, and they're very excited about things. If you think about the opportunity that exists with the open ecosystem, standards are a huge part. So we joined, and as we learned and found opportunities, we showed up in the ways we thought would be best. We hosted the workshop in Seattle because we have access to a really cool portion of the building our office is in, and so we were able to — this was pre-COVID — offer that space up. People from all over the world came in, and we worked on the accessibility user requirements for the immersive web. Some of the work we did there has led to user requirements that help us make good choices in the standards and think about people who have different types of capabilities. I learned so much — it was just a huge opportunity. And we're doing our best to take in those lessons and put them into the way the software works, not just for us, but across every opportunity that we have. We're not great at it yet, but we're getting better and better. By utilizing the existing standards of the web, there are a lot of best practices that can be implemented, and we're really interested in continuing to work on and learn about those. So yeah, it was a really cool experience to get onboarded and continue to work in W3C. Accessibility is such a key component, because our purpose is to help humanity transcend physical location — which means everybody — and there's an opportunity to keep doing that kind of work. I'm really excited about it. They're very welcoming, so definitely participate if you're interested: there are community groups at W3C that people can get involved in. As a small company, I think it's been awesome to be a part of these different working groups and standards bodies.

[00:50:54.534] Kent Bye: Yeah. The thing that gets me really excited about where this is going, and what you showed me in this demo, is that WebXR has had a hard time really taking off. Firefox has shuttered a lot of their WebXR team — they've implemented some stuff and maybe they'll continue — while Chromium and Chrome have implemented a lot, but Apple and Safari haven't done the same level of implementation, so they're kind of in their own closed walled garden, making their own way of doing things. Most immersive applications are running either Unity or Unreal Engine — there are other engines out there, but for the most part it's one of those two. So to use the solid experience of those apps as a baseline, and then on top of that pull in WebXR to augment the experience while still having that self-contained app dimension — I think that's going to get people involved with writing WebXR: keep the solid experience, but find ways to remix it and modulate it. Because the difference in quality between something like Half-Life: Alyx and Mozilla Hubs is like going back to 1993 and comparing GeoCities to the cutting edge of Valve doing their first VR game. That's a world of difference from the quality of experience you get with a native app. Not that it's impossible within WebXR — with Three.js and everything else, it's certainly possible — but in terms of the economy, the distribution platforms, the bugs, and everything else, it's just not attracting the same level of talent yet. So to have these apps, even just experiential apps where there's not much of a game — an ambient experience where you're in a shared context together, roaming around — and then to pull in other people and other dimensions, to pull in more open-standards development with WebXR and Three.js, and to put those apps into these contexts.

[00:52:58.771] Forest Gibson: Yeah, I mean, one way that I've been thinking about it is: when you talk about the high-end game and entertainment experiences, you're going for the highest level of graphics fidelity, and those are really powerful and needed to help grow the ecosystem and push the technology forward. Gaming on PC pushed graphics cards to the point of even supporting VR as we have it today, and we wouldn't have that otherwise. But there's this new thing forming, which is the idea that it's more than just gaming and entertainment, right? It's for connectivity, for collaboration, for work, for play. And I think the way to think about it is: developers don't have to go it alone. You can be part of a greater ecosystem. Instead of having to create a Half-Life: Alyx — which is an amazing game title, so immersive and such a powerful experience — you can create the best ball app. And no one's made the best ball app yet. There's not an AR ball that I can hold and throw to Jared, because I'm on Pluto with him right now. There's no ball app where you go, man, that's the best ball app I've ever used. That doesn't exist, right? Huge opportunity for a single developer to go make the best ball. And you could say that for any object — think about all these AR things you could create. You don't have to build a Half-Life: Alyx to make a meaningful application that changes someone's life, that makes something fun or easier or more productive. You can make an object, and maybe that'll grow your AR empire, but you can start there. That's something this multi-app world really facilitates, and that creates something really powerful and accessible, right? You don't need millions of dollars and a startup and funding and all this stuff to make the ball — but no one's made the ball yet.
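To make that "ball app" idea concrete: here's a toy sketch of what such a single-purpose object might start as in WebXR, using three.js controller events to grab and release one ball. Everything here — the sizes, the structure, the interaction — is illustrative, since (as Forest says) no such app exists yet; the camera and render loop are omitted.

```typescript
// Toy sketch of a single-purpose "ball app": one grabbable ball in WebXR.
import * as THREE from "three";

const scene = new THREE.Scene();
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.xr.enabled = true;

const ball = new THREE.Mesh(
  new THREE.SphereGeometry(0.08, 32, 16), // ~8 cm radius, hand-sized
  new THREE.MeshStandardMaterial({ color: 0xff4422 })
);
ball.position.set(0, 1.2, -0.5);
scene.add(ball);

const controller = renderer.xr.getController(0);
scene.add(controller);

// Grab: reparent the ball to the controller so it follows the hand,
// keeping its world transform.
controller.addEventListener("squeezestart", () => {
  controller.attach(ball);
});

// Release: reparent back to the scene, again keeping world position.
controller.addEventListener("squeezeend", () => {
  scene.attach(ball);
});
```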

[00:54:47.001] Kent Bye: Yeah, well, when you look at, say, a WebXR experience — you have a website, you go to a URL, and you have a whole self-contained experience. Do you imagine a time when you might go to, like, the Weather Channel app and pull in the weather, then go to another WebXR URL, and have a mashing together of websites? Because right now, when you go to a website, you have a 2D frame and you're looking at that one website. I don't know of any metaphor for opening three websites at the same time where, instead of three tabs, they're all within the same spatial context. So you could have a web application that's some sort of VR game, then open up another website and get these spatial objects that you can see but other people can't.

[00:55:30.775] Jared Cheshier: Yeah, you mentioned progressive enhancement before. If you think about application behavior on a spectrum, it goes everything from 2D, up to desktop, up to AR and VR — one application that has all these different capabilities. And I can imagine an application that would be a really cool weather application. I certainly imagine providers just adding this functionality in that progression, so it may not even be a separate app that they have to build in the future. You could add this functionality in this progressive web app way that's starting to be discovered and documented — I know there's a handful of articles and tutorials out there about progressive enhancement specifically for WebXR. And as a designer, when you start to think about busting out of the screen and into the space, you have to make design decisions about what information you know about the space. Sometimes you know a lot, and sometimes you know nothing. One really powerful opportunity is that there are a lot of applications that are bound together — if you look at comment systems like Disqus, or widgets for voting on and liking things, those are binding agents across the web. I think spatial binding agents are going to be things we get to discover. And the first thing to do is just make it work in real time when you're there together — that's the synchronized application that we experienced, like the pen, or even the ball that Forest described. But I think weather applications, or applications that provide data at some frequency and interval, are really interesting things to share together. I could bust out an Aardvark gadget that shows the weather in the region that I'm in and show it to Forest, and it wouldn't have to send very much information to him at all. It's totally irrelevant where I'm getting that from — the Weather Channel or what have you. It's just going to be a really interesting opportunity to create a web that's spatial, and we'll discover how it gets bound together and how apps interact with each other — not quite discovered yet, but on the same spectrum as the web apps that exist today.
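The progressive enhancement Jared describes has a standard entry point in WebXR: feature-detect the richest mode the device supports and fall back gracefully. A minimal sketch — the three session modes are the real WebXR session types, while the "flat-2d" label and the weather-app framing are illustrative (WebXR typings would come from @types/webxr).

```typescript
// Progressive enhancement for one spatial app: probe from the richest
// display mode down to a plain 2D page. navigator.xr.isSessionSupported
// is the standard WebXR capability check.
type DisplayMode = "immersive-ar" | "immersive-vr" | "inline" | "flat-2d";

async function pickBestMode(): Promise<DisplayMode> {
  const xr = (navigator as any).xr;  // undefined in non-WebXR browsers
  if (!xr) return "flat-2d";         // no WebXR: render a normal web page
  if (await xr.isSessionSupported("immersive-ar")) return "immersive-ar";
  if (await xr.isSessionSupported("immersive-vr")) return "immersive-vr";
  return "inline";                   // 3D content inside the page's 2D frame
}

// The same app branches its whole presentation on one capability check.
pickBestMode().then((mode) => console.log(`weather app mode: ${mode}`));
```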

[00:57:26.335] Forest Gibson: Yeah, and I was actually going to circle back with a counterpoint to what you said. There are lots of times when I've been in Photoshop and went to a web browser and brought up a color wheel — I would copy and paste hex codes from that website into Photoshop, because I'm not necessarily great with color. That's a situation where I had multiple windows open to accomplish one use case: I want this thing in Photoshop to look good, but I need another app to help me choose which colors are complementary. And we're mostly limited by screen real estate, right? That's actually the reason you don't do that a lot in 2D — you don't have enough monitors. But I know people who have tons of monitors, and they do this constantly.

[00:58:08.789] Kent Bye: Yeah, well, one of the things you're doing here is creating a shared coordinate system, so you have spatial relationships to each other — you have this virtual space, and it's an augmentation on top of that, but you're creating a shared coordinate system. And as you're pulling in objects, those objects have a certain scale. Do you have the capability, either now or planned for the future, to resize things, so that you could create miniatures of entire websites or maps? Like the Grand Theft Auto map that you pulled in — could you rescale it down? Do you expect to be able to not only scale objects, but potentially change the scale of the entire world relative to yourself, and kind of go between the big and the small?

[00:58:49.393] Forest Gibson: Yeah, I think that's stuff that's coming online fairly soon. And as we think about this as the next generation of a computing ecosystem, this idea of being able to resize the windows — resize your volume — has to be a foundational layer of what this whole computing ecosystem looks like. In the same way that I can take my Chrome window and make it full screen, or make it small and change its shape, that's going to happen, but with volumes of space.
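In a scene-graph engine, the "resize a volume" idea Forest describes can be as simple as scaling one parent node. A minimal three.js sketch — with the caveat that in a shared coordinate system the scale factor itself becomes shared state that has to be synced to everyone; the pivot-based math here is a common pattern, not Pluto VR's implementation.

```typescript
// Sketch of "resizing a volume like a window": keep an app's content
// under one parent group and scale that group uniformly.
import * as THREE from "three";

const worldRoot = new THREE.Group(); // everything the app renders lives here

function setVolumeScale(scale: number, pivot: THREE.Vector3): void {
  // Scale about a pivot point (e.g. the user's hand) instead of the
  // origin, so the content shrinks toward where you're holding it:
  // newPos = pivot + (oldPos - pivot) * (newScale / oldScale)
  const ratio = scale / worldRoot.scale.x; // assumes uniform prior scale
  worldRoot.position.sub(pivot).multiplyScalar(ratio).add(pivot);
  worldRoot.scale.setScalar(scale);
}

// Shrink a city-sized map down to a 1:1000 tabletop miniature.
setVolumeScale(0.001, new THREE.Vector3(0, 1, -0.5));
```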

[00:59:15.958] Kent Bye: Hmm. This is all very mind-blowing. What types of experiences do you expect people to have? I can already say, from Virtual Market and VRChat: when you're in a loading screen, you're all there by yourself. Last year at Virtual Market, people were talking to each other, and during those interstitial moments and liminal spaces — when you're in a loading screen and don't have anything going on — with these overlays, you could still have a persistent conversation. So there are certainly use cases like that. But where do you see people adopting a tool like this? What's actually going to make them go through the trouble of installing all this, connecting to their friends, and being able to do stuff?

[00:59:56.438] Jared Cheshier: Yeah, so initially there's a big giant list of ideas in the Aardvark Slack channel right now. If you go in there, there's just a giant list of all these different concepts that we and folks from the community have thrown in. It's pretty much physical goods made digital. So you're like: watch, clock, ruler, tape measure — all these different things you might physically have. And then, okay, you have toys: fidget spinners, cards, game pieces, things that can be chained together. All this stuff is really interesting as it exists, but we're in this phase where we're sharing it as early as we can. You can create things that replace physical objects you might want to have with you, and I think there's a lot of low-hanging fruit out there in these digital items. There are so many ideas that I think you can make something very interesting very quickly. And that's when it gets interesting — when it starts to have a lot of value and you want to have it running with you all the time. But we're not quite to the point where there's a big repository of items yet. We're at the stage where we're saying: come make an item. You can make a die, you can make a jack. It's at that stage right now.

[01:01:03.398] Forest Gibson: Yeah. And in terms of how Pluto fits into the use case: for us, you have to already be in a headset, in a spatial context, to be connecting with and talking to other people. That's definitely an area of accessibility we're working on, but we're trying to focus on what makes this the most impactful, in-person-like experience, and build from a spatial-first mentality.

[01:01:24.444] Kent Bye: I know this has been a big problem in terms of just getting together with people. There's a paradox: either you're on Discord and you have a call going, or you have to coordinate with someone. In some ways this is like the Discord of VR — a social layer you can use to connect to people or pull people in. Right now there's a lot of thrashing, where you have to be in the VR experience and then connect, and before you can even talk to someone you have to find each other in the same world and go to them. Instead, you could get to talking to people first, and then in the background start loading up all this stuff. That, to me, seems to be a pretty strong use case: whatever the application is, you can have a direct social connection first and then dive into the VR.

[01:02:09.568] Forest Gibson: Yeah, I think we see this as a flipping of sort of the fundamental paradigms of computing where instead of being app first, it's people first. So you don't go to the app and then find people, you find the people you want, and then you can just bring the apps to you.

[01:02:25.095] Kent Bye: Hmm. Great. And I guess as we're starting to wrap up here, what are the types of experiences that you each want to have?

[01:02:32.228] Jared Cheshier: Yeah, I just want to be able to comfortably do all the work that I do, and have spatial apps for the things that I do every day. I want a spatial text input system that I can comfortably use. One of the things I've done is experiment with the Nreal glasses — I've worn them all over the place, and the use case I've used the most is plugging them into my PC with a folding, tiny Bluetooth keyboard. It's the most amazing workstation I've ever experienced. I was able to go into the airport — when airports were a thing I went to — and sit there, and people would come over like, what is that guy doing? And I'm like, I have the magical workstation that's all around me and lets me do all my work anywhere. It's kind of silly, because I've had this pretty similar Bluetooth keyboard since Pocket PC days. But yeah, that's my killer use case at the moment: a folding Bluetooth keyboard and all the apps I want running around me, and then I can put it in my pocket and go anywhere. And it's also available when I go into VR. I don't want different apps — I want the same apps.

[01:03:37.365] Forest Gibson: Yeah, and to echo that — very similar here, and that it's something that can be shared seamlessly. So if we're working together, we can just sit down next to each other as if we're at our workstations at work: his desk is there, my desk is here, and we can just be working on stuff. That barrier of where he is versus where I am goes away. Having those lightweight AR glasses, a Bluetooth keyboard — maybe a mouse, maybe not — my virtual monitors in front of me, some cool 3D gadgets, and then Jared sitting next to me.

[01:04:06.745] Jared Cheshier: Yeah, just replace my desk with digital items, please. Yeah.

[01:04:12.315] Kent Bye: Great. And finally, what do you each think is the ultimate potential of virtual reality and what do you think it might enable?

[01:04:21.546] Jared Cheshier: I mean, I can't help but go straight to full-on science fiction, where you look at all the trends with brain-computer interfaces and what's possible. I think it gets really interesting when you can start to craft reality and share things in interesting ways. As communication progresses — being able to gesture and so on — I think these are things you might eventually be able to do without gesturing, especially if you follow accessibility research. Shared presence is the kind of thing you could start to do without having the physical ability to provide input, and I think that's going to be really powerful on the journey to the things that look like sci-fi. I don't know if it's going to look exactly like sci-fi, but it's going to start to occupy the slots that sci-fi describes, as those capabilities come online. And as that occurs, connectivity is going to become incredibly powerful and significant all over the world. I'm really excited about Starlink being able to bring high-speed internet to more people. All of those things combined, alongside the general progression of technology, lead to an opportunity for a really cool outcome of capabilities for a lot more people. And that's just going to be awesome.

[01:05:31.494] Forest Gibson: Yeah, I mean, it's the next phase of computing. It's the next kind of industrial revolution — the next revolution for humanity — making where you are no longer relevant, for the physical world or even for virtual worlds, so that they become interchangeable. I think that opens up such new possibilities that I don't even claim to know what it could do. But having grown up with the rise of the internet and the rise of the personal computer, and having seen how that changed things — I always ask people: name one business or industry that hasn't been impacted by computers. You can't find something that really hasn't been. I see spatial computing, in this next revolution, doing a very similar thing: there will be no industries that it hasn't significantly impacted.

[01:06:16.299] Kent Bye: Hmm. Great. Is there anything else that's left unsaid that you'd like to say to the immersive community?

[01:06:23.104] Jared Cheshier: I don't know. I feel like the thing to do is use your imagination. Think about the possibilities of being able to have like multiple applications running at the same time and think about that as an open ecosystem and what that opportunity looks like. I think Tim Sweeney mentioned that thing, which is like, it could be a dystopia or it could be a utopia, right? And if we really work together, I think it can be a utopia. And I think that there's a clear path to make that happen. And I think that there's an opportunity to start thinking about things in that way, in which you can start to count on multiple applications running simultaneously. And just use that as a starting point when you're thinking about a problem you want to solve with spatial computing.

[01:07:02.853] Forest Gibson: Yeah. Collaborate with people. Don't go it alone. Be a part of the ecosystem and share with other people, and you'll get it back tenfold.

[01:07:14.562] Kent Bye: Yeah, well, I'm super inspired by where this is all going and what you've been working on — how much you've been working with all these standards and bringing all these different layers and applications together. I hope there are folks at Apple listening to this, and that once they start to implement OpenXR — or even WebXR, I don't know if they have plans — all of the different devices can be fully enabled with the types of things you can prototype here. And I'd encourage anybody who wants to see a more open future to get involved with this application and start tinkering around, playing with what it means to mash together all these realities. I think it is rapid prototyping for the future of augmented reality, and also for VR, as you add in all these different layers of augmentation. To me it's inspiring to see how you can blend all these things together — to collapse the contexts and put them all into this one shared spatial reality — and how they start to interact with each other in ways that I didn't necessarily imagine until I actually experienced it. I think there are going to be a lot more of those moments that you could philosophically think about in advance, but once you experience them, it's like: yeah, of course it's going to be like this. We're still in the phase of figuring out what those new affordances and paradigms even are, and as I see where this is going, it's going to unlock conceptual ideas that get implemented into all these other contexts of spatial computing as we move forward. Really excited to see where it's at and where it's going. So yeah, thanks Jared and Forest for joining me today on the podcast.

[01:08:51.079] Jared Cheshier: Yeah, we really appreciate it. I want to make a shout-out and thank you to the community — a lot of these things would not exist without them. This is definitely something that's more than the sum of its parts, and it's contributed to by many different people in the community. So thank you.

[01:09:06.311] Forest Gibson: Yeah, thanks so much for having us on.

[01:09:08.233] Kent Bye: So that was Forest Gibson and Jared Cheshier, co-founders of Pluto VR. So I have a number of different takeaways from this interview. First of all, I had some pretty paradigm-shifting experiences while going through this demo, because it really started to open up my mind to where this is all going. Pluto VR as an entity is a telepresence application that's trying to be agnostic to whatever other applications you're running, creating this modular social interaction. You can think of it as sort of like Discord or Slack, where you have a persistent layer for back-channel communications and conversations. Within VR, usually whatever application you're running is taking up all of your CPU and GPU, and you don't have multiple applications running at the same time — although I will say that within SteamVR there are various overlay utilities you can run, for checking your frame rate or for advanced settings. So the thing they're building on top of is this overlay layer. Within OpenXR, there's an extension to the standard that allows you to do overlay windows. Think of SteamVR: when you hit the menu button, the screen that comes up is overlaid on top of whatever immersive experience you're having. Pluto VR exists in that overlay layer, and you can overlay it even when each of you is within your own virtual experience. So you can actually be in multiple virtual reality experiences at the same time, pull up Pluto VR, and have people in the same coordinate system relative to each other while each of you is in your own VR experience, sharing your screens back and forth with each other. I think of it through this fractal geometric metaphor — fractals are layers of self-similarity. There's the lower level of being in a shared instance within, say, a VRChat world, where everybody's in that same experience with other people, but within that instance you can have local variables that other people don't have. So within that, there's a fractal geometric instance of your own experience that you control, and as you zoom out, everybody in that instance is experiencing pretty much the same thing. I think of it as these fractal, self-contained, self-similar layers of reality nested within each other: you're in these instances, but you're having your own personal experience, and at some level there's this telepresence application through which you have shared interactions. Now, the other thing is that they're pulling in these other applications. Aardvark and Metachromium are both Chromium-based, meaning they're using web technologies. Metachromium is trying to push forward some innovations when it comes to open standards — more or less trying to be a spatial-first implementation of the existing open standards that are out there, and eventually to have those integrated into the mainline of Chrome and Chromium.
Aardvark, by contrast, is a more experimental app platform that doesn't yet have open standards behind it. It's trying to figure out what, in some sense, some of those overlay layers are, and what ends up being a lot of these other interactive components. Jared pointed to a talk he had seen Tim Sweeney give at Steam Dev Days back in 2016 — on October 12, 2016, Tim Sweeney gave a talk on the future of VR where he really laid out his vision of the metaverse. I actually had a chance to see Tim Sweeney speak at SIGGRAPH 2019, on July 30, 2019, which I think was a bit of an updated version of the same talk, going into more detail about some of the protocols he laid out in that original Steam Dev Days talk. In that 2016 talk, Sweeney starts to talk about the future of the metaverse, and I did an interview with him back in 2015 about his early inclinations on it. One of the things he told me then was that a lot of the science fiction authors were talking about the metaverse before the whole gaming industry had really evolved and matured, so there were far fewer mental models around how games and gaming would become the primary metaphor for the interaction design of the metaverse. We still think about the metaverse in terms of static, self-contained 2D apps — individual applications within their own contexts, interfacing through copy-and-paste metaphors. But the real paradigm shift of spatial computing is breaking out of that 2D frame, out of the app mindset where everything is sealed into a closed walled-garden context. In his talk, Sweeney describes a future metaverse of interoperable entities — more like an ecosystem, defined by how things are in relationship to each other. And Sweeney is in a very good position to talk about this, because he developed the Unreal Engine: all of that interaction, those game mechanics, that same kind of interaction-design code is going to be the future of the web and the metaverse — the different ways in which objects will be able to interact with each other. So I think Sweeney is actually someone to really listen to on his visions of where the metaverse is going, and what protocols and standards need to be made available. He's pretty clear that it's been bad to focus only on a vision of the internet, mobile computing, and computing in general with that closed walled-garden mindset. And he's really put his values to the test by going to war against Apple, suing them on antitrust grounds over closing down innovation — taking everything that's happening with computing, putting a proprietary wrapper around it, where anything that happens within the platform carries a 30% tax that goes to Apple. If we stay only with that model, we're not going to live into the full potential of where VR and AR are eventually going to go.
So you have these major companies like Apple and Facebook who are only living into that closed walled-garden ecosystem mindset. When I had the opportunity to have a direct, embodied experience of the kind of interactions that get out of that mindset — being able to run multiple applications at the same time, with modular interactions between them — you have this shared spatial context. That means when you put a car into your environment and then pick up the pen tool, as you're drawing lines, the car object can occlude your drawing, or you get different levels of opacity, or the lighting starts to interact with all the objects in the scene. You have this shared rendering context — the physically based rendering principles, the lighting, the camera angle — and as you pull in these different applications, they start to interact with each other. But it goes beyond that: eventually it's about what happens when these things collide with each other, what kind of game logic and code they carry, and whether emergent behaviors arise — where combining apps produces behaviors you couldn't even imagine until you actually experience them. That's the phase Jared and Forest are in: creating this social glue layer. When I think about the principles of a communication medium and a communication technology, I break it up into four different phases: you have the new technology with new capabilities; the artists pushing the limits of what that technology can do; the distribution platforms that allow people to experience it; and the audience that actually experiences the specific things you create. In some ways, what Pluto VR is doing is letting people tinker within the realm of OpenXR and WebXR applications, while also providing a distribution dynamic where you can bring people in to experience the things you've created in a shared social context — without having to build and host an entire social VR networking platform. You can go into these existing platforms, or even just into your own SteamVR loading screen, and have ways to do peer-to-peer sharing in this sort of decentralized architecture — sharing models back and forth with each other through applications like Aardvark, as well as Pluto's own app, to create these shared social contexts. And then you have Metachromium, which can load up WebXR applications: people can put out a URL, you load it in, and you pull in different dimensions of spatial computing — widgets, objects, whole scenes and experiences. Or you can go into Half-Life: Alyx or into VRChat and exchange these little widgets there.
There's this concept of bundling up all these interactions between objects using open web technologies, with web bundles. I think it's actually going to catalyze a lot of innovation in WebXR and OpenXR — all these HTML, JavaScript, CSS, and WebAssembly bundled entities containing objects that interact with each other in specific ways. The downfall of WebXR has been that it's just been hard to have as compelling an experience overall. You go into a Mozilla Hubs experience, and it's nowhere near the quality of experience you get in, say, Half-Life: Alyx or VRChat or anything else that's out there. Most of the attention and most of the creators are putting their energy into having the best-quality experience, which means most of that happens within the context of self-enclosed Unity and Unreal bundles. Pluto VR is able to inject this social dimension on top of that: you can go into these really high-fidelity experiences and then add layers of reality on top, and really start to prototype the future of where augmented reality is going to go. This is what I think is probably one of the biggest innovations of this whole ecosystem approach: you have a scene graph with all the assurances of what's happening within that scene, and then you add layers of augmentation on top of that virtual reality, prototyping the types of applications we're eventually going to have within augmented reality — but without having to mess with computer vision and all the complexities of building a world map and measuring where everything is. You know for sure that you have other people in the same shared social context and shared coordinate system, where you can have objects together. Because of that, it creates a context that allows people to tinker with creating these different WebXR applications and see how you can add these layers — whether it's a ruler or a clock or a watch or an X-ray machine, or whatever ends up being the killer app among these interactive components: playing card games, balls, whatever it ends up being. We're still in the very early phases of replicating all the dimensions of reality and the things you want to share with other people in a shared social context. So I'm super excited to see where this is going, because I think what Pluto VR is doing is actually quite visionary compared to a lot of the other XR startups out there, just because they're so intimately involved in all these open standards, and so intimately connected to hacker-maker groups like the Metaverse Makers Mixer, which has folks who are really tinkering with the future of the metaverse.
I think they're really quite inspired by this open model and trying to create a potential for what's possible. For anybody who's complaining about closed walled gardens and these big major companies — whether that's Facebook or Apple, or eventually Google if they enter in — this is a chance to really start to live into what an open ecosystem looks like, and to unlock the types of experiences you wouldn't even imagine were possible within a closed walled garden. There's also a dimension of limited resources with something like the Quest: the Quest really can't run multiple applications at the same time. I will say I think it may eventually be possible to add these WebXR bundles on top, although whether something like Aardvark or a Chromium browser within Oculus is ever going to get to the point of running multiple applications simultaneously — I think they're so concerned about performance that they've limited it so you can really only run one application at a time. So if you do want to do this on the Quest, you'll have to use a streaming solution, or maybe even cloud rendering, to see all these things coming together. There are a lot of limits in compute power and resources, and business reasons for Facebook, that prevent you from running these multi-app setups. Because Valve has created a generalized context on PC VR for experimenting and tinkering with this, I think it's actually going to push forward a lot of innovation around what's even possible when you start to tinker with the combination of all these things coming together. So yeah, I'm really excited to see where this ends up going. It provides a dynamic where you can create these little applications — you could already be within Virtual Market 5 and want some sort of application that gives you a shared map you can annotate with where the cool stuff is, a collaborative way to discover what's happening in this massive social VR space with nearly 2,000 different exhibitors across all these different worlds. There's a discovery process there: how do you communicate across all these people and say, okay, where's the cool stuff? Where's a map? How do you use these additional layers of augmentation within the context of a VR app to create new social dynamics and social behaviors? I think it's an area ripe for innovation in terms of what kind of stuff you want to do and what kind of widgets you want to produce. And they're starting to do things with medical applications and the like — scientific visualization, robotics integration — and then there's the social VR app side.
Instead of a vision where, in order to do anything with OpenXR, you have to learn all the nuances of networking and the social dimension yourself — just as, when we go into these different social apps and experiences, we have a back channel within Slack or within Discord, one application that does that job best — the idea is to create that social glue, that people-first orientation: a modular approach to spatial computing that integrates all these open standards and applications, and really experiments with the future of spatial computing. What is this going to afford? What can you do? I'm just really excited to see where this all goes, because it's starting to open up a lot of insight, tinkering, and surprising innovations that will be completely obvious in hindsight but that maybe we didn't notice beforehand. Someone like Tim Sweeney has been thinking about this quite a bit because he works in game engines. When you start with the metaphors of game engines and apply them to the web, it's a lot different than starting from the more static, closed 2D frame of the web and trying to project out to the metaverse. It makes more sense to think about Unity or Unreal, ask how the web starts to become more like that, and then ask what open standards you need. That's where groups like IDEA, the Immersive Digital Experience Alliance, come in, with the Immersive Technology Media Format (ITMF) — covering everything from the container to the data encoding to the scene graph — alongside all these other open standards: WebRTC, OpenXR, glTF, the W3C standards, WebXR. Eventually there will be self-sovereign identity standards for identity, cryptocurrency dimensions for the economy and how you exchange there, and VRM for avatar representations. In his SIGGRAPH talk of 2019, Tim Sweeney really goes through all the other protocols he lays out, including a programming language and how you'd have interoperability between all these different things — and I actually expect something like WebAssembly to potentially play that role, although what that exact programming language is going to look like, we don't exactly know. But at the beginning, just having the experience of a modular multi-app setup up and running — there's a bit of friction getting things going right now. It's not completely user-friendly; there's still a lot of rough-around-the-edges user interface design, and you almost need somebody to step you through it. Hopefully there will be tutorial videos covering what it takes to open up Aardvark, Metachromium, their file-sharing app, and Pluto VR itself. It's still very early days on all of that. That's my biggest caveat: it took an engineer from Pluto VR half an hour to get me bootstrapped enough to even have this demo with them.
I think all of that will get streamlined, and as people start to tinker and experiment, having these compelling experiences is going to drive the future evolution — making it easier, and getting more people involved in creating these different types of widgets with Aardvark. Hopefully at some point I'll be able to talk to Joe Ludwig to see where that is and where it's going, in terms of this future-forward framework for pulling in these specific kinds of web bundles using those overlays, and for having these different ways of interacting with these objects. If folks are interested, definitely check out the Aardvark Slack, which is where a lot of the latest discussion is happening. But overall, it was really mind-blowing for me to have a direct, embodied experience of this vision of spatial computing. I'm sure, if I go back through my archive, people have talked about these different dimensions before. But to have your own direct experience of breaking out of the 2D frame, and also breaking out of the closed walled garden of separate contexts, and to start mashing contexts together — you have that on the web with APIs, application programming interfaces, but your experience is still that you're on a website, you're in an app, and that's basically it. You don't have a context where you're mashing things together. Like the metaphor Forest mentioned: being in Photoshop spatially and having a color picker tool that's a completely different tool from Photoshop itself. You have these functions broken out in different ways, combined together, interoperating in a way where they just work together. That's the vision of where this is all going, and Pluto VR is the social glue — the social layer — facilitating that. It creates these distribution-medium dynamics that I think are unique for catalyzing innovation: rapid iterations where you can show things to people and get feedback. I think it's going to refine the distribution platforms, the technology itself, the user interface, and everything else. Having the people dimension there is going to be a pretty significant catalyst for pushing forward what's happening in WebXR — in these self-contained bundles, and eventually in whole WebXR websites that can be pulled in as well. So yeah, Virtual Market 5 is coming up; they're going to be tinkering and playing around with all of this there, and if you're interested, definitely check it out and see how they're pitching it and how they're selling it. I'm just excited to have run into both Jared and Forest. They're really thinking about the future and where this is all going, and I love seeing how they've been involved with all these different communities and groups and open standards, and how all these things are starting to come together — really catalyzed by the talk Tim Sweeney gave back on October 12, 2016, but also by this vision of the metaverse and the open web, and how you can create a completely new paradigm.
And I think part of that paradigm is getting out of that reductive, materialistic mindset — we are these isolated entities — and into a more ecological, process-relational dynamic, where you're trying to see how all these things play together, with that more open mindset and all these open standards. So that's all I have for today. If you enjoy this podcast, then please consider entering into some sort of relationship with me: reach out and connect on the different social media channels, or consider becoming a financial supporter through Patreon. That's how I make most of my money to be able to sustain this type of work. If you find it valuable, please do consider becoming a donor, to help sustain this work and to grow and expand it as well. You can do that at patreon.com/voicesofvr. And more than anything, check out what's happening with Pluto VR: go to Virtual Market 5, see their booth, tinker around, see what kinds of apps they have, and see what kinds of new social dynamics you might discover there. So that's all that I have. Thanks for listening, and I hope to see you in the metaverse. Take care.
