#1699: Announcing “The Institute of Immersive Preservation” with Avinash Changa & His XR Virtual Machine Wizardry

I interviewed Avinash Changa about The Institute of Immersive Preservation on Tuesday, November 18, 2025 at IDFA DocLab in Amsterdam, Netherlands.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So, continuing my coverage from IDFA DocLab 2025, today's episode is with Avinash Changa of WeMakeVR. Avinash was announcing a brand new Institute of Immersive Preservation at 3:37 p.m. Central European Time on Monday, November 17, 2025. As part of the DocLab Playroom, Avinash was giving a presentation around this need to preserve immersive experiences. The approach that he's taking is creating a virtual machine: you plug a piece of hardware into a computer and it basically clones everything that's on that computer, and then you can plug that into a virtual machine and create these ports and interfaces so that you can directly connect into these virtual machines and create these time capsules. So basically the challenge is that with all these different Unity projects and Unreal projects, at some point somebody may update a driver, or something changes with a new upgrade of Unity, or even on the hardware side, an update gets pushed to the Quest headset with no way to control whether or not you want to deploy it, and all of a sudden all of your stuff is broken. This is something that a lot of immersive creators have had to deal with, and it's difficult to maintain this kind of source control with all these moving parts. It's not a simple ecosystem. Everything's different, and it's very brittle, which can make the process of creating XR a pain, but also the idea of preserving it over time
even more intractable of a problem. And so Avinash has found a solution for this problem. This is a brand new idea. It's not fully funded right now; he's trying to figure out the plan and whether it is going to get funding. He's got his existing thing with WeMakeVR, and this is kind of like a thing that's born out of that. He's supporting it through WeMakeVR, but it's also something that merits its own existence. And so we talk around the evolution of how he got to this point and some of where he hopes to take this from here. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Avinash happened on Tuesday, November 18th, 2025 at IDFA DocLab in Amsterdam, Netherlands. So with that, let's go ahead and...

[00:02:40.731] Avinash Changa: Dive right in. Hi, good to see you, Kent. My name is Avinash Changa from WeMakeVR. And I've been in the immersive industry since 2012. And our original claim to fame was that I happened to invent one of the first professional VR film cameras, a 360 camera that solved the problem of proper stereoscopic convergence without making it weird. But this was 2012. This was before VR headsets were a thing. And I came to that because as a kid, I wanted to become an astronaut. I wanted to explore strange new worlds. So I grew up with lots of different VR, which was all crappy, never something that excited me. Or, no, that's not right. Everything excited me, but it wasn't the promise that we saw in movies like The Matrix. And when I tried the early DK1, I was like, oh, this was supposed to fulfill the promise of VR gaming, but it looks like crap. Why can't we make this look photoreal? So I had a random flash of inspiration that ended up a year later leading to that first camera. So that was our initial claim to fame. But I didn't get into this for technology. I got into this because I wanted to create worlds. I wanted to tell stories. I wanted to create experiences that allowed myself to enter these worlds that I would see on TV. But also, you know, take my friends, my family, the people that I love into these worlds with me. Because as a kid, my mom was really worried. She thought I was not that smart. And she was worried because I was living in this fantasy world, like, oh my god, what's going to happen to this kid? But my imagination was my safe space, so I always had that desire to take people into that. So fast forward to now. After that initial camera, we ended up doing a lot of 360 videos in the early days, and that then evolved to doing a lot of research, working with a lot of people, coming up with new immersive solutions. The key thing of what I do in this industry is I'm not in the production realm. I'm in the innovation area.
So every project that I work with, I work on with WeMakeVR needs to find something new in the language of what is immersive. And that new could be emotional involvement, could be social. It could be narrative concepts. And in some cases, if we come up with an idea and the technology to make that possible doesn't exist yet, there's always that balance like, oh, could we research that? Could we invent this technology? Or is someone working on it and can we integrate that into what we're doing? So as long as we explore uncharted territory, as long as we expand the language, that's an area that I'm interested in. And that's what I try to do with Immersive.

[00:05:23.311] Kent Bye: Nice. And so we sort of picked up in 2012, but maybe you could give a bit more context as to your background and your journey up to that point.

[00:05:29.513] Avinash Changa: So as a kid, like I said, I wanted to become an astronaut. But because as a kid, I was told I was not smart. That was a bit problematic because you don't get the right stimuli. You don't get to be placed in the right environment. This needs a bit of context. So I was born in Amsterdam. Both my parents came from South America, from Suriname. Hopeful young people came to the Netherlands to study, to build a life for themselves. They ended up both getting addicted to hard drugs and alcohol. And so I grew up with a very unsafe, violent household. When my parents broke up, when I was four, my mom started dealing drugs to provide for the family and her own addiction, which pretty much led to me fending for myself, which is why... creating my own safe space through video games and comic books and storytelling was essential to my emotional well-being and survival in that sense. So I started getting into programming games when I was seven or eight, again, to create my own stories. I did my first, I think, branching narrative game on a weird Casio mobile device that wasn't meant for that. So I was always trying to push tech, even as a kid. But because I was so in my own world, my parents, my teachers, no one really picked up on, oh, this kid has an interest in certain things in storytelling. They just thought because I was asking a lot of questions in classes, I was always disagreeing with teachers. They just thought I didn't understand. So when I was 10, I left my home. I started living on my own because of reasons. And then my teachers told me, well, because then you get to the point where you have to go to high school. And they told me, well, you know, go into a vocational education. I mean, you know, become a carpenter or something manual labor-ish because you're not that smart. And at that time, because I wasn't living at home anymore, but on my own, I also needed to work. 
So I got a job at age 11 to provide for food and rent and things like that. So my high school years were a bit of a struggle in terms of just basic life facilities, but not in terms of education because the level wasn't that high. So I wasn't very challenged. When I was 17, a little side story, there was a film festival called The Weekend of Terror. Science fiction, cult movies, fantasy. And I'd been going to that for years as a teenager. When I was 17, that festival stopped. So I called the festival like, hey, you stop doing this. I could do this, this 17-year-old kid coming up with a plan. And lo and behold, they said, yeah, sure, we're stopping anyway, so go ahead, have at it. Suddenly, I became the director of this film festival, and I started programming all of these sci-fi films and all of these things that make me passionate. The festival wasn't doing well financially, so I put together an ad campaign with a startup ad agency, and that ad campaign ended up winning a bunch of awards. So suddenly my face was in these magazines and an ad agency called me like, hey, you know about tech, you know about films, can you come and work for us? So then I spent a couple of years bridging the gap between traditional creativity and digital technology to see how we could creatively make these things possible. So that then led me to starting up a small VFX agency and consultancy. So I started more and more bridging these gaps between creativity, digital technology and my imagination. 
I did that for a bunch of years, and in that process of exploring new technology and creativity, VR was always a thing. I've always been trying the different generations of VR headsets, even, you know, in arcades and things that weren't consumer facing. But there was always that fascination, like, when is this going to be better? When is this going to be actually interesting? And that then led us to getting an early prototype of that DK1, and that's where everything started to click. I could use my technical proficiencies, my creativity, my storytelling desires, and my knack for tinkering with technologies. And if anything is in my head and there's no technology out there, I just try to make something work. I mean, it's always hacky. I never do proper consumer-level builds. But it works. I do that to make a prototype that works. And when that first 360 camera worked, I realized, hang on. There's no one here that makes 360 cameras. There are no players in Unity that can allow us to do 360 playback in a VR headset, because the DK1 was meant for games. It wasn't meant for video at the time. There wasn't even any stitching software. So I ended up doing photo stitches frame by frame, and then creating a first sequence. So that led to my first prototype. And I have a little boat here in Amsterdam. And I put that prototype camera on the boat, and we did a little tour. I spent about three, four months manually stitching, gluing everything together. And then I started getting people into our studio. I would put them in the headset. They would look around in this boat tour in the canals of Amsterdam, and they wouldn't realize at first that they could look behind them and above and left and right. And then they saw bikes. And then, when I was shooting this, there was another boat coming in the opposite direction. And what do people in Amsterdam do when they're on a boat in the canals? They wave to each other.
So suddenly, just by pure instinct, people started waving to the boats in VR, which was a surprise to me. But the most surprising thing was after about three, four minutes, the tour was over, the headset would be taken off, then the guests would go like, oh, that was very cool. And they would look around and go like, okay, so now where's the fan and where's the heat blower? And I was confused. I was like, what do you mean? We don't have one. No, no, no, you're messing with me because I felt the heat from the sun and I felt the wind in my face. That's when I realized, oh, hang on. If there's enough auditory and visual cues and if it's immersive enough, your brain starts filling in these blanks. Your brain starts filling in the gaps that you think should be there but are not really there. So that was kind of my first eye opener, like, oh, there is a lot more to this than I know. So that's when I also started working with researchers and universities, like, how can we map this? How can we not just create stuff, but also understand why this works in the way that it does? So that led to the last 10 years of continuously working with lots of different research projects, which is part of exploring that new language.

[00:12:00.860] Kent Bye: Hmm. Nice. Well, yesterday you just made a big announcement of this new immersive preservation institute. And before we get to that, though, I want to get a little more context for WeMakeVR and just give me a bit of a sense of what kind of led to the need for trying to preserve some of these immersive experiences.

[00:12:19.462] Avinash Changa: So with WeMakeVR, we started doing 360 videos back in 2012. But quickly that became kind of restrictive because, yes, you can look around. But the first thing that we always noticed at the time, when we would put people in a VR headset and they would see that 360 video: first, they're amazed. They're looking around and like, oh, wow. The next thing they do is they try to step forward or they try to grab and interact with things. So there's that instinctive urge to interact with a world that is fully immersive. As a user, if you don't have any technical knowledge, you kind of expect that to be possible. Because that's what happens with new technology. When it's so advanced or so alien to you, it becomes akin to magic. So if you're not restricted by technical knowledge, you just assume that all of these things are possible. So that led to, okay, let's make these things interactive. Let's do hotspots that you can trigger with gaze, because we didn't have controllers at the time. So that was the first advancement. Let's change the code. Let's make these things interactive. Then we got controllers. We actually cobbled together a six-DoF system on the DK1 with magnetic resonance trackers and things like that, just to get to that point where we could fulfill the instinctive needs of users. So with these early steps, we started introducing new technology. But that would work on our prototypes, that would work in our office. And it was really hard to capture that, to reproduce that, to take that to venues or festivals. So that was already a challenge. And at the time, there simply was no solution. So we then moved from doing 360 video and 180 video into more six-DoF-type experiences, gaming, interactive narratives. And we then ventured into multiplayer. So we kept adding new technologies, making this domain broader and broader and broader.
But then you also realize that at WeMakeVR, since our role became, okay, we're kind of explorers, we're innovators, we're pioneering new things, we're working in environments that don't have running ecosystems. There is no ecosystem for preservation. That wasn't even a thing at the time. Everyone was just running full speed ahead in this new exploration domain. But at the same time, you do have to create value. You have to tell stories. You have to create educational experiences, experiences for the elderly. We were very broad. We still are. So in terms of the works we make, we span from education to healthcare to storytelling in a linear sense, to interactive, to multiplayer, to creative tools. It's super broad. But because everything moves so fast, every four, five, six months we make something new. And we then get to the point like, oh, we've created something that people really like, but it's an old work, it's from three years ago, and it doesn't play anymore. We would send people an EXE, and they were like, oh, hang on, it doesn't work on my computer. We would look at it and go, oh, right, we made this with the DK2 runtime, because we weren't at the Rift CV1 stage yet. So everything was changing fast. It didn't work. In addition to that, a lot of the creative works at the time were not funded, so they were self-funded. And in some cases, these works would be funded or supported by things like the Film Fund or other creative industries. There was a requirement for these funds, like, oh, when the work is done, same as with film, send us a copy of your work. So we would send them, like, oh, here's an executable. Here you go. And they're like, well, how do we play this? We don't have your VR headsets. We don't have... Just capture it as a video and send that to us as a file. And we got into these conversations like, well, sure, we can technically capture a VR experience on a flat screen. But whenever people look at that, they come away very unimpressed.
They're like, oh, yeah, well, what's special about this? And we would see the same thing at festivals. Festivals would ask, like, oh, we have this cool VR experience, but it's only one person in a headset at a time. So can we not just get the feed from the headset and display that on the screen next to the installation? So in the early days, yes, we would do that. And guests would walk by, they would stop, look at the flat screen, and then not queue up. They were like, oh yeah, I've seen it. And they'd move on without grasping what the power of that immersive experience was. So we became very vocal, like, let's not ask makers that are funded to supply a video. Let's find ways to get them to actually present the work in its native, original form. And we started having these conversations back in 2016, 2017, 2018. And it was just very challenging because there were no tools. Everything was changing really fast. And when we were speaking to Valve and Oculus and organizations like that, no one really was focused on that. It wasn't on the radar to preserve these things. And for me, as a maker, and for us as a studio, that also became an issue. Like, oh, we've got this work from 2016, 2017, but we don't have working HTC Vive Gen 1 headsets anymore because they're all busted, the cables are broken. Can we not just play this on something new? And it wouldn't work, because, you know, we still had the Unity projects, but we would open these projects on a modern version of Unity and the project wouldn't open. We would get a gazillion errors, a lot of which we couldn't solve because the drivers weren't there, the runtimes weren't there. So for us internally, there was also this need, like, oh, how could we come up with a better workflow to allow ourselves as a studio to update projects a couple of years down the line? So in that process, we also tried to start collecting libraries, drivers, runtimes, just for our own uses.
But from that era, from 2016 even to a couple of years back, we encountered dozens of projects that simply were lost. We had all the Unity projects. We had all of the 3D models, all of the files. But often it was like, if we still want to use this, if we want to present this again, if we want to update this, we're going to spend months rebuilding this entire project from scratch. So that didn't work. So there was always this need, at least in my mind, like, there should be a better way to do this. But unfortunately, there weren't any tools in the industry for that. And then what happened? So early on, I was thinking, like, oh, but couldn't we just use simple things like virtual machines? Because, you know, the whole concept of virtual machines is, of course, not new. You use that in software development. Just for context on how development usually goes: most developers basically go out, they buy a desktop computer, they put in a graphics card, a CPU, and all of that, they install Windows, and they install Unity, and they start developing their VR projects. And as a developer, what you often encounter is like, oh, there's this new feature. We now can do hand tracking or eye tracking or something. So let's update the version of Unity, or let's update the Oculus SDK, and we can use all of these new features. So you install that on your Windows PC, but that means that some of your older projects simply don't work anymore. So then you end up, okay, let's update this computer, and then we have another physical computer next to it, and we're not going to touch that older computer until the project is completely done, because then we can compile the build and we can ship it and we're safe. So don't run any updates. This is kind of a common practice, not just for developers, but also for bigger companies. For example, Steam, they do a lot of updates. But whenever there is a really big...
Games Festival, like back in the days when we still had the E3, they would lock the Steam release branch, so they would prevent any new updates from being rolled out, just to ensure that every game developer that was presenting a new game or prototype game at the conference would not run into issues of their demo not working. So they don't want to risk rolling out updates in these crucial moments.
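That kind of release freeze can be sketched in miniature at the studio level: record a pinned "lockfile" of the toolchain a project was built against, then flag any drift before a demo or premiere. The sketch below is purely illustrative; every component name and version string is hypothetical, and in practice the live values would be detected from the machine rather than hard-coded.

```python
def check_drift(lockfile: dict, current: dict) -> list[str]:
    """Compare pinned component versions against the live environment.

    Returns human-readable descriptions of every component that has
    drifted from, or gone missing since, the pinned lockfile.
    """
    problems = []
    for component, pinned in lockfile.items():
        actual = current.get(component)
        if actual is None:
            problems.append(f"{component}: pinned {pinned}, but not found")
        elif actual != pinned:
            problems.append(f"{component}: pinned {pinned}, found {actual}")
    return problems

# Hypothetical pinned environment for a festival build, and the
# (also hypothetical) state of the machine after a driver update.
lock = {"unity": "2021.3.16f1", "gpu_driver": "531.29", "oculus_sdk": "50.0"}
live = {"unity": "2021.3.16f1", "gpu_driver": "535.98", "oculus_sdk": "50.0"}

for line in check_drift(lock, live):
    print("DRIFT:", line)
```

Run before a showcase, a non-empty report is the cue to roll back or rebuild, which is the same instinct behind locking a release branch during a conference.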

[00:20:42.799] Kent Bye: I wish that Meta would follow the same practice. They don't necessarily have the same approach. Sorry, I just wanted to throw that in.

[00:20:48.524] Avinash Changa: No, absolutely. Well, that, okay, little side story. A good friend of mine, Tupac, who you know, Tupac Martir, very creative, very talented maker, has been working on a big multiplayer project. And early on, he was already talking to me like, oh, you've done multiplayer frameworks, and can we talk? Can you help us out? So he had been developing this project for the last two years. Then a couple of months back, he calls me up, like, hey, Avinash, Meta just released a new update. It pushed new software, a new firmware, to all of the Quest headsets. And now our project is broken. Help, what can we do? Unfortunately, of course, you cannot roll back firmware on a Meta Quest headset. Meta is always looking like, oh, how can we improve the user experience? How can we improve functionality and get new features on these headsets? So they develop stuff like new hand tracking or improved tracking systems. They roll that out, they push that to the headsets, and there's not really a way to prevent that from happening. That software is going to be forced on your Quest headset. And you cannot say, well, we'll just keep the headset offline and avoid these updates, because then the headset doesn't work. So you are kind of forced into that ecosystem of having a Quest headset, having it online, accepting these updates, and just running with it. But what then happens is that in some of these updates, existing functionality or backwards compatibility breaks. So a project that you might have in development stops working, which is exactly what happened with this project. So Tupac was, of course, completely stressed out, because the thing that he worked on for two and a half years suddenly didn't work anymore. So luckily we were able to help out, because I hacked together a couple of solutions, but it was a temporary solution, and they did end up moving away from Meta headsets.
Which is, of course, mind-boggling, because here you have Meta, the company that went from Oculus to Facebook to Meta, which was the key factor in VR becoming a thing in our modern days. I mean, without those early days, VR wouldn't be where it is now if we hadn't had Facebook investing in that, which I will be eternally grateful for. But what they have done now is create such a closed and forced system that it hinders creativity. It breaks these efforts of these pioneering makers simply by forcing updates that we don't want. So that was a problem. And it was unfortunate that they had to move away, because there's a lot of good things about the modern-day headsets, including the Quest. But this was definitely a problem. So yeah, I fully agree. I wish Meta would be a bit more flexible, a bit more open in their software policies. But hey, what can you do?

[00:23:36.527] Kent Bye: So let's go back to E3 and Steam, and how Steam would normally handle that. So continue building up to the point of creating a system that solves a lot of these issues.

[00:23:45.586] Avinash Changa: So as an industry practice, everyone kind of knew, like, okay, you have to be very careful with rolling out updates, because it can break projects. But that, of course, is a very temporary thing. Like, after E3, after these big conferences, sure, the next updates will roll out and the industry moves forward. But with every step forward, that means that compatibility with projects in development or legacy projects is at risk of breaking. And that's not just Steam or Meta. As developers, when we create an immersive piece, there are so many moving parts. And it starts with, you know, when we buy a new PC, can we get a new graphics card? Can we get a new CPU? Sure, we can. That graphics card requires a specific driver. And let's say you would want to get the best graphical quality, like we want to have ray tracing. So, okay, we need to install the newest graphics driver for that GPU. So that's one part. Then we have Windows. Okay, certain functionalities. We had to move from Windows 10 to Windows 11 because Windows 10 was not supported anymore. Right, but does Unity still work with Windows 10 or with Windows 11? And what version of Unity does that involve? So then within Unity, oh, we want to use hand tracking for this piece. So do we use OpenXR? Do we use the Oculus or the Meta SDK? And this kind of grows exponentially. We end up with so many moving parts, both in the hardware and the software domain, that it's kind of impossible to keep track of everything. It's impossible to keep a log of every little piece of software and runtime version and library that a certain project requires.
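The moving parts listed here, OS build, GPU driver, engine version, XR SDKs, are exactly what such a log would need to record. As a minimal sketch of a per-project manifest (every project name and version value below is invented for illustration, and a real tool would detect them rather than take them as input):

```python
import hashlib
import json
from datetime import datetime, timezone

def snapshot_manifest(project: str, components: dict) -> str:
    """Serialize a timestamped record of every moving part of a build
    environment, plus a deterministic fingerprint over the component set
    so two snapshots of the same environment can be recognized."""
    body = json.dumps({"project": project, "components": components},
                      sort_keys=True)
    return json.dumps({
        "project": project,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "components": components,
        "fingerprint": hashlib.sha256(body.encode()).hexdigest()[:16],
    }, indent=2)

# Hypothetical environment for an in-development piece.
manifest = snapshot_manifest("canal-tour-vr", {
    "windows": "10.0.19045",
    "gpu_driver": "NVIDIA 531.29",
    "unity": "2021.3.16f1",
    "openxr_runtime": "1.0.27",
    "meta_sdk": "50.0",
})
print(manifest)
```

Written alongside each build, a file like this at least answers "what exactly was this compiled against?" years later, even if it cannot resurrect the environment itself.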
So while we're in development, at some point something needs to be updated. Okay, well, let's back up the project, update Unity, and just, fingers crossed, hope to God that it'll still work after the update, or we spend months patching things and trying to debug. So we knew, like, oh, this is a problem. So that became a practice for us in our studio: okay, let's have a lot of different computers, one computer per project, and leave it at that, don't update. But it's really expensive and really cumbersome to work in that way. So for software developers, not for immersive developers, but for software creators, it's very common practice to use a virtual machine. So basically you have a software version running on your computer and, hey, we can boot it in this way and we have Windows 10. Boot it in that way, you have Windows 11 or Linux or what have you. So you have one physical computer running different operating systems and all these different types of software, which kind of works, but again, it doesn't solve anything. It still introduces problems. So then the alternative solution was, let's do computers with multi-boot OSs. So you start it up, and then you have a choice of different OSs that run native on the hardware. So let me explain this. If you run a virtual machine, there is basically an emulated version of your Windows and your Unity and whatever you have running on that machine. That virtual machine is also emulating the CPU, the memory, and the graphics card, which is fine for software development, but not for VR. Because with VR, you have your headset connected to your GPU. That headset and everything attached to it needs to talk to the real hardware. Otherwise, it simply won't work. Which is why, when you do a multi-boot computer, so you have multiple hard drives with multiple OSs, every copy of that computer will still directly talk to all the hardware. So that kind of became a practice.
Okay, let's have multiple OSs, and that way we can still apply this sort of virtual machine process from software development best practices to immersive development. So that works, but that still means that for every copy of this multi-boot system, you then need to have different copies of Unity, different runtimes, so the problem doesn't really go away. You just move the problem. So for a production environment, it's still a nightmare. And I started talking to people years back, like, how do you guys do this? Other studios, other indie developers, how do you deal with all of these different software components? And everyone kind of shrugs and goes, like, well, we just have to deal with all of the problems, allocate time to do bug fixes, patch, and with every update, we accept it, and we just hope things work. Or we accept that we're going to have two, three months' delay in our releases. So, same as in the early days of the VR camera, I had that question: why is there no solution? Why is there no VR camera? And in this case, why is there no better solution for capturing our development environments? And then I also started talking to people that work with virtual machines, with software developers. And I started learning, like, oh, virtual machines are a thing, this runs, but the biggest hurdle would be hardware. Virtualized hardware is not a thing. You just emulate what you can. So three years ago, I started trying to come up with better best practices, better workflows that I would document. I would just make all these big white papers, like, okay, guys and girls, if we're developing: keep a log of every version of the driver. Keep a copy of this driver. Download the new installer from Meta. Download this SDK. Keep it on the server as a copy. Don't overwrite. Don't work in a destructive way, which is something that often happens. Like, oh... We just overwrite our previous version. Why bother keeping the backup copy?
Because we're not going to roll back anyways. Well, I would demand, yes, we have to do that. We have to create these backups. But that just didn't work. Because then we would have, okay, we have backup A of a project, which has this version of Unity and that version of the GPU driver and that set of plugins. And then we have the other backup. Oh, yes, the Unity version is updated, but that driver wasn't updated. And we ended up with dozens and dozens and dozens of different copies with different iterations of a project, and we would just get lost. We wouldn't know, like, oh, we want to roll back this functionality because something broke. We roll back one version, it doesn't work. Two versions, it doesn't work. It just became impossible. And it was a little frustrating, because I would just end up telling everyone, like, get more and more of these copies on our server. It would just become, what's the expression? You can't see the forest for the trees anymore. So very messy. But we tried. I tried to come up with white papers. I tried to come up with best practices. I tried to come up with backup solutions, our internal data archives. Nothing really solved that problem. Nothing made it convenient and easy for us to say, like, oh, we broke something. We can just roll back one version and fix it. Because any project that we worked on generally runs a year, a year and a half, two years. Over that time, there's of course also a lot of new hardware that's introduced. There was a project that we started on the Valve Index, but by the end of the project, we had to run it on a Quest because wireless, mobile, easier. So with all of these challenges that are beyond our studio, but that are in the realm of the hardware makers and the SDK developers, we have zero control over that. So we have to allocate for that as well.
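One common way out of that pile of indistinguishable copies is content addressing: index each archived artifact by a hash of its bytes, so byte-identical copies collapse to a single entry and the snapshots that genuinely differ stand out. A rough sketch of the idea, with file names and payloads invented for illustration:

```python
import hashlib

def index_backups(artifacts: dict) -> dict:
    """Group backup artifacts by a hash of their content.

    Byte-identical files map to the same key, so duplicates across
    backup folders become visible (and could be stored only once).
    """
    index = {}
    for name, payload in artifacts.items():
        digest = hashlib.sha256(payload).hexdigest()[:12]
        index.setdefault(digest, []).append(name)
    return index

# Hypothetical backup folders: two of these "copies" are byte-identical.
backups = {
    "backup_A/oculus_sdk_50.zip": b"sdk-50-payload",
    "backup_B/oculus_sdk_50.zip": b"sdk-50-payload",
    "backup_B/gpu_driver_531.exe": b"driver-531-payload",
}
index = index_backups(backups)
for digest, names in index.items():
    print(digest, "->", names)
```

This is essentially what version control and deduplicating backup tools do internally; the point here is only that it makes "which of these dozens of copies actually differ?" answerable.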
So at the time, my thinking was, what if we would have a virtual machine that captures everything and that also emulates the hardware and all of the drivers, so we can have a full snapshot of the work as it is? And there was a little aha moment here. There was a little bridge. Because going back to our relationship with creative funds, they of course want a copy of the work when it's done. And that screen capture is not really a valid version, representative of the experience that the user has. So there was always a wish from the creative funds to have a copy of the work for the archives, but also to present it. So what would often happen is we would go to the Film Fund with our own development computers and our own headsets, install it there, and then show the work on our own systems, because we would know this is going to work. Because they don't have the knowledge, they don't have the capabilities, they don't have the hardware and software to run our projects, even if we send them the build. And the same thing applies still to date to a lot of the immersive festivals. A lot of makers that get accepted into a festival end up traveling there themselves, bringing their own computers, running the presentation themselves, because that's kind of the only way to be sure that your premiere is going to go off as smoothly as possible. If you end up just sending your executable with a PDF, like, hey, this is how to install it, the chance of something not working is high. It happens. And we've seen it often that makers call, like, oh, are you at that festival? Can you help? Our project doesn't work. And so from that angle, we got feedback from festivals, from presentation venues, from makers on all these points of friction. Like, how can we ensure that what leaves the studio will also be good at the consumer presentation end? So bringing all of these components together kind of, in my mind, led to the logical solution.
We need to find a way to not just capture and back up the VR project or the Unity or Unreal project, but capture the entire development system, to present it properly. So I even dabbled with the idea at the time, like, oh, when a project is creatively funded, can we include a bit of budget just to buy a new computer and just ship the computer into an archive? Because then we know that, theoretically, it should still work. That would require, of course, that computer remaining offline. And that introduces a new problem, because certain moving parts of the immersive industry don't work in offline mode. As we were talking about before, a Quest doesn't work if it's offline. Or, well, it sort of does, but there are a lot of limitations to its functionality. So that question of, can we virtualize this, was there a couple of years back. So I started playing around with virtual machines, and I kept running into the same problem. Like, okay, there's a lot of the software development environment that we can capture. And for us as developers, that works, but not to present. There's still no way to use a virtual machine to actually present the work, because it doesn't talk to the headsets. It doesn't talk to the runtime libraries on a hardware level. And all of the experts that I spoke to said, well, yeah, sure, but that's simply not going to be possible. You cannot virtualize a lot of these components. So over the last two years, I did start moving our development process, doing a combination of virtual machines and, of course, our backup process of all of our Unity projects, to kind of have a working system. So that worked fine for us as a studio, but we still didn't have a way to properly present these things. So then, about a year, nine months ago, I had this weird flash of inspiration. Something popped in my mind, like, oh, what if we do this, this, and this, and if we do it in this way, that would actually allow a virtual machine to think that it's running on the bare metal hardware.
So I started doing a bit of research, looked at a lot of Microsoft papers and other tech papers on virtualizing hardware. And pretty much everything stated, no, this is not possible. You cannot create this. But that weird flash of inspiration got me on another path of programming. So I just started dabbling with some very hacky code, just as a weekend experiment. And nine months ago, suddenly, I exported my project, I hit run, and it started working. I picked up my Quest and was like, hang on, I've got an image. That was totally unexpected. I'm getting goosebumps right now if I think back to that moment. That was the same feeling that I had when I had my first 360 player working. I was like, hang on, this is not supposed to work, but it works. So my heart started beating fast, like, okay, what did I do, what steps did I take, let me retrace my steps. But I knew, because I saw it right in front of my eyes, like, this is working. So I spent two years coming up with the basis of the system, but the last nine months or so getting it to actually work throughout the entire pipeline, from development to actually showing a legacy work or even a modern work on a non-physical machine. That is something I did not see coming, and it hit me by surprise. But the moment that I had the first thing working, and in the next couple of weeks when I knew, like, oh, I can get this to actually work in a stable environment, I thought, I have to share this with people. I mean, if I had known this before Tupac called me with his problem, I could have solved his problem before it actually became a problem. But I started implementing this in the way I work, and I started bringing back some of my legacy projects, complicated projects, things with custom multiplayer libraries, things that are very hard to even get working on your bare metal hardware. And yeah, before I realized it, it became second nature.
It became very common for me to work with these virtualized machines, to test different things, to roll back, to try different hardware. It very quickly became for me a very normal way to work. And then I was like, okay, if I can do this, this is going to solve so many problems. Because, as I was saying, over the past, I'd say, five to seven years, so many makers have asked me this. Like, oh, can you help us present the work, because it doesn't work anymore? Can you debug? This is one of the things that I do often. I often get asked to coach or mentor new makers, both in their creative process, but with immersive, the creative and the technical process are highly intertwined. So I end up being a technical consultant, I end up being a facilitator for festivals, for museums, other studios ask me for help. So I'm in the fortunate position that I have a very broad scope of the challenges that everyone is dealing with. So then I was like, okay, since I know that everyone is dealing with this, and I know that so many works have come and gone in the past 10 years from amazing creators that can never be shown again to modern audiences, what can we do with that? So I knew that this need for preservation existed, and I was like, now I have a solution. Then the next step was, how can I turn this into something that we can actually share, that we can bring to the industry? So of course you could say, oh, you know, I can write a white paper, publish it, and everyone can try and reproduce this. Turns out it's not that simple. There are so many moving parts, so many different combinations of hardware and software and custom parts. I was like, okay, I can build this piece of software, I can build this piece of hardware, and that will allow us to preserve and present these works, but it's not something that we can just do in a white paper, or as a piece of software that we can distribute.
It ended up becoming a full-blown suite of hardware and software tools. And it's like, well, I never got into this industry for the technology side. I got into this industry to tell stories, to create experiences. So now it's like, okay, I don't want to do a tech startup. I don't want to be a vendor of hardware. But this is valuable enough and important enough to share, because I went through the past 10, 15 years of learning and developing by collaborating with other makers, by seeing their works, by mentoring them, by seeing their works at festivals. I mean, there were so many works in the early days of Oculus Share, when people just went nuts. They started doing experimental stuff, crazy stuff, things that had no commercial value, but were fun, were beautiful, were weird. I learned from that. I was inspired by that. And so many makers that I mentor nowadays have never seen these works, have not gone through that discovery phase.

[00:39:56.964] Kent Bye: It's a great tragedy that Facebook at the time just nuked it. They just deleted it. No sort of sense of obligation to do any cultural preservation of this history?

[00:40:07.029] Avinash Changa: No. And from a business standpoint, I get it. I mean, Facebook comes in, they buy Oculus and they're excited about the promise of immersive experiences, but they're also running a business. They're like, okay, we need to, you know, make money, sell headsets. And their focus at the time was like, okay, let's do both the hardware and the software because we're going to sell games on our own hardware. And all of these earlier projects, they're just experiments. They don't have, in their eyes, they don't have commercial value. No one's going to pay X amount for a weird piece of software that requires you as a maker to get all of these weird drivers, these weird controllers, solder something together and then have a cool experience. Like doing early days, multiplayer, you know, get five headsets, glue them together with cables and see what happens. There's no commercial value in that. There's Creative value, there's the value of stepping into uncharted territory. So for Facebook at the time, their focus, I'm assuming, was okay, we need to sell headsets, we need to have commercial games. So we have this library of these weird experiments, eh, there's no value for us to keep that as a company. there is value for the creative community, I don't think that was on their radar clearly enough at the time. And what we also saw in the transition from Oculus to Facebook and then to Meta is that a lot of the developer relations kind of went away. In the early days, there was a very close relationship between the creators and the very motivated, inspired team of Oculus. When it became Facebook, even though I'm very grateful that it helped accelerate the industry, That part was lost. And even today, just a couple of weeks ago, someone called me like, hey, you're working on this preservation thing. Do you still have some of these Oculus Share projects? Do you have Titans of Space or other of these things that I saw back in the days? 
So, yeah, it would be nice if that would still be available. But it's not. And this is not just Oculus Share. This is a lot of the other platforms and weird pieces of software from back in the days that are not really out there. So I knew, okay, I have this tool set, and over the years, everything that I've been doing, especially a lot of the creative experimental stuff, has been self-funded. There's not really commercial value in that. So now I was like, okay, I'm sitting on this tool that is going to make the lives of a lot of people and museums and other organizations a lot easier. I cannot afford to pay for bringing this to the industry and just, like, hey, here it is, I'll share it, have at it, because it requires so many specific components, so much hardware, so many software modules that we've created. So my desire was to share this, to solve this problem. But the other desire is also, I don't want to go personally bankrupt doing this. And a lot of people were like, oh, yeah, just try and get some commercial funding. But there is no commercial funding, because there's no business model for this. Then, try to get creative funding. Anything in that area, in that domain of creative funding, is of course limited. You get a grant that will give you six months, a year, maybe a year and a half of runway. But in order to make a real change for our industry, this needs to be sustainable. That means that we need to generate money by preserving these projects as a service and generating revenue every year, so we can keep developing and updating these tools, we can pay for the hardware, we can pay for the storage, we can pay for collaborating with makers that don't have the money to pay for this. So then, in my mind, the logical conclusion was, let's set up an institute. So I'm setting up the Institute of Immersive Preservation. And the mission of that is to work with three different tiers.
So we would work with the presentation environments like festivals, museums, et cetera, by giving them a very easy option: like, oh, here's a computer, plug in this device that has everything needed to create this virtual machine to present this piece of VR work, press a button, and it'll play. That decreases the cost of presenting works. Then there's that second tier of the media archives. Media archives traditionally don't have the hardware or software, and specifically, they don't have the knowledge of the immersive industry. And even if they have it, it's really hard to keep up with the fast pace of developments. So we're like, okay, let's take away that heavy lifting. Let's provide that as a service. So we'll just work with them in a way like, oh, if you have a project that needs to be archived, we will use our tool, we will work with the makers, we make sure it's properly archived, we store a copy, you store a copy, and with that first layer, our presentation machine, we can guarantee that it'll be playable five or ten years from now. And then there's that third layer, working with the makers directly. As I was saying, I have this workflow in our own studio with these virtual machines, of backing up our projects, virtualizing our own production environments. The makers, of course, don't have money to pay for this. They don't have the money to buy these expensive machines and run this software. So I would love to provide this to them as a service, but they don't have the money. So my idea is, the presentation venues and the archives can collaborate, because they're generally funded by the creative funds.
If we then create these budgets for new creative projects that we can assign to makers, that allows the institute to collaborate with these makers, provide them with the best practices, keep them up to date, mentor them, but also provide them with a virtual production machine, so that they don't have to worry about having five or ten different computers in their studios, or paying for that. So that's going to save cost on their end. The only way in my mind to properly facilitate this is to have a separate institute. For now, I'm funding this myself from WeMakeVR, but I want this to be long-term sustainable. Even if WeMakeVR as a studio goes in a different direction or just focuses on creating immersive experiences, that institute for preserving and presenting and producing everything, that needs to stay. That needs to be here 50 or 100 years from now. So that desire to set that up as a self-contained entity, something that is self-funded or funded by the industry but is long-term sustainable, that is what needs to happen. And that is what I tried to do in our presentation yesterday: to convince these entities, the funds and the museums, like, this needs to happen. So I hope that my message landed. The reality is that we have a long way to go, but I think we've taken a good first step. And what also is helping is that yesterday, people saw the proof. People saw that this is not just a pipe dream. It actually works. I was so happy that you actually got to see it, to try it. You asked a really good question. Like, oh, this is emulation. What is the performance? Is it actually working the way it should?

[00:47:05.858] Kent Bye: Does it have additional latency?

[00:47:08.281] Avinash Changa: Exactly, which is a very fair question. Anytime you try to emulate something, we see that in game emulation, you lose performance, your frame rate drops, etc. But someone yesterday called it voodoo magic or witchcraft. The fact that it runs in these virtualized systems and it has pretty much the exact same performance as running it on bare metal, that was that aha moment, that moment that gave me goosebumps and it's still working. I'm super happy and excited about this solution myself as a user, but I'm now very happy that here at Edify I was able to present it and that I got a bunch of experts from the industry to see it and to validate it. So it's not just me shouting, like, this works. No, it's actual objective people that have no interest in my company or in what I do that can say, like, oh, yeah, this works. This is a real solution now.

[00:47:55.287] Kent Bye: Yeah, I was able to hop in the experience and it works. And I do have like a million questions around it. However, we're sort of running out of time for my next interview that's coming up here shortly. So I'm excited that there's at least a solution that's there that solves the systems as they exist now. I guess one of the things that comes to my mind is that you have like... HTML websites that are still able to be rendered within a browser, but a browser is a lot more controlled than, say, a computer environment that has a lot more moving parts. So I love the idea that you could create like an open source format that has all the open standards that then is creating stuff that gets outside of the proprietary systems that are resulting in all these other things. side issues that there's no backwards compatibility, not sort of an awareness for those proprietary systems to think around preservation. It's always around like what to deliver that works right now on this machine. And so, yeah, I think you're solving a lot of these kind of intractable problems that exist in the current developed environments. 
And I'm hoping, as we move forward, we can also think around creating systems that kind of prevent the need for all this virtualization in the future. So that's sort of my gut reaction: I'm super excited that this is possible. And I have a game jam experience I made back in 2014 that, I don't know if it still runs, and so it's just like, oh wow, this project may still be able to be played. And also all these kind of Oculus Share games that have been lost to history. You know, just the idea that, amongst all the different people that have them on their hard drives, maybe there'll be an opportunity to get access to some of that early history as well. And another thing that comes to mind is the Internet Archive, Brewster Kahle, and a lot of their work in emulation and other stuff to kind of preserve these games. And so it's exciting to know that there might be some of these systems that are allowing people, even at home sometime in the future, like what they've done at the Internet Archive, to be able to experience some of this. So those are just some of my first thoughts. I'm really excited. But I guess as we start to wrap up, I'd love to hear some of your thoughts on kind of like the ultimate potential of this medium and what it might be able to enable.

[00:49:50.652] Avinash Changa: In terms of my dreams for the future, well, of course, having this as an institute, once we get that rolling, is that we can keep expanding support for different types of systems. But I also hope that as this institute would grow, we get into more of these conversations with the powers that be, with the metas and whatever other companies are going to come, to advocate for more open standards. And I think Valve is doing a great job with that, with the upcoming Steam Frame system, headset and the entire frame ecosystem having something that is open that they are saying well this is our hardware if you want to run a different os on it go right ahead if you want to create something that you cobble and and you get it to work have at it that mentality i think is going to really empower creativity because you're not restricted by just the sdks that the bigger players give you so i'm hoping that once we start getting into these bigger conversations that it's going to be a dialogue, a dialogue with PowerShell B to collaborate, see if we can find open standards. And there have been efforts in the past to come up with open standards. Sony and Meta and others have tried to put together consortia to come up with this, but no one has the Metaverse standards consortium. No one has really committed. No one has really opened up. And I think, I mean, if, and this is a dream, if we get to a point where Institute of Preservation becomes kind of a partner or as big as what the Internet Archive is or the Wayback Machine is, then it shows to the industry like this is a need. We need to work on open standards. We need to be more collaborative. And yes, I understand the business needs that the players have, but there must be a middle ground that we can find. And Yeah, one of my goals would be to get to at least start these conversations. 
Another dream that I have is that at a certain point, and that is actually very possible, because we're now talking about immersive media, what I've already heard is we can apply this to other creative mediums as well. Because when you're dealing with film, with VFX, when you're dealing with music, there are, again, so many moving parts. We've already tested this. The current solution can handle those kinds of complex environments. We don't know what new complex environments are going to be developed in the next 10, 20, 30 years. But I know that from a creative standpoint, there's so much stuff coming down the line. And what I would hope is that we can keep updating our system and expanding that, so we kind of get to a solution that is universal: whatever you're creating in the digital realm, whatever custom software solutions you're doing, this will allow you to preserve it, to show it to future generations, but also not just present it. Even if 50 years down the line, someone sees your project and is inspired, they can actually take your project, open it up, unpack it, and build on what you've built 50 years ago. That's a dream. Not sure if we'll get there, but we'll see.

Kent Bye: Nice. And is there anything else left unsaid that you'd like to say to the broader immersive community?

Avinash Changa: I do. Anyone in the immersive community, come talk to us, come help. I mean, I can come up with this idea, I can build this, but this idea will only work with the support of all of the players in the community, and that is the bigger platforms, but that's also the indie makers. I want to have these conversations, I want to hear what people think, I want to get their feedback, because I don't have all of the wisdom in the world. I don't know where this is going to go. But I know there are a lot more problems out there and challenges out there that might not be on my radar. So to them, I'd say, yes, come find me, contact me. But even more right now, I'm making a call to the larger funding organizations.
Help get this started. Help get the ball rolling and help turn this into a sustainable institute. Because without you and without this help, this is not going to work.

[00:53:45.745] Kent Bye: Yeah. Awesome. Well, Avinash, thanks so much for joining me here on the podcast to give a little overview of the Institute of Immersive Preservation. I'm really excited that you went through all the trouble to figure out some of these problems that I'm having. It's something that has given me great pains to see all of this cultural history get lost. I feel like part of my role is to try to go and bear witness to them, capture conversations about them. But that's certainly not the same as creating a context for people to have their own experience with some of this work that has been so ephemeral and really in this context. Festival circuit and things that are not accessible for people to really get access to. And with all the distribution limitations and everything, there's been a lot of work that has been created that's inspired other creators, but not really made available for the larger audiences. And so it just it's really exciting to hear that it's going to potentially create a solution for something like that, but also to go back and all these projects that have gotten broken through all these different updates and all this stuff. system that's not really backwards compatible that is always progressing but yet you lose so much cultural history and so it's really great that something like this exists and so thanks so much for going through the trouble to make it happen and for joining me here today on the podcast to help break it all down thank you for having me and one final note all of the makers from the oxley shared days if you still have copies of your project come and find me we'd love to help you Nice. That's all that we have for today. And I just wanted to thank you for listening to the Voices of VR podcast. If you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. 
This is a listener-supported podcast, so I do rely upon donations from people like yourself in order to continue to bring you this coverage. You can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.
