#1523: XR & AI Ethics & State of the Industry Discussion with Polys Ombudsperson Winners

The latest Polys Ombudsperson of the Year is Ingrid Kopp, and previous winners Brittan Heller, Micaela Mantegna, and myself gathered again on December 8, 2024 to reflect on the current state of the XR industry as well as specific ethical concerns at the intersection of XR and AI. We also discuss the power of storytelling to unpack some of these moral dilemmas, tech policy in the context of the existing political climate, and some of the broader trends of the XR industry.

In my takeaways at the end, I also elaborate on the leaked memo from Meta CTO Andrew “Boz” Bosworth, which says, “We need to drive sales, retention, and engagement across the board but especially in MR. And Horizon Worlds on mobile absolutely has to break out for our long term plans to have a chance. If you don’t feel the weight of history on you then you aren’t paying attention. This year likely determines whether this entire effort will go down as the work of visionaries or a legendary misadventure.”

A lot of Meta’s commitment to XR seems to be directly tethered to the success or failure of their own first-party app. Zuckerberg’s AR/VR strategy memo from 2014, leaked to TechCrunch by Blake Harris, now seems particularly prescient: “for now keep in mind that we need to succeed in building both a major platform and key apps to improve our strategic position on the next platform. If we only build key apps but not the platform, we will remain in our current position. If we only build the platform but not the key apps, we may be in a worse position.”

Zuckerberg later elaborates that apps and experiences are their top priority in XR: “I think you can divide the ecosystem into three major parts: apps / experiences, platform services and hardware / systems. In my vision of ubiquitous VR / AR, these are listed in order of importance.”

As time has passed, it’s become more and more clear that the success of Meta’s Horizon Worlds may be directly tethered to Meta’s overall commitment to the XR industry. Just yesterday, Meta announced a $50 million fund for Meta Horizon Worlds creators, with special awards for mobile: “By investing in mobile content, we can reach a lot of new people who don’t yet own a Quest headset and ultimately grow the pie for everyone. That’s why, as part of our $50 million Creator Fund for creators of mobile and MR worlds, we’re also announcing our first creator competition of the year with a focus on mobile.”

UploadVR did a great follow-up report, “From Quest To Horizon: How Meta’s Shifting Priorities Are Affecting Developers,” that includes many anonymous statements from XR developers about the impact of Meta’s shifting priorities.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, I'm going to be airing a panel discussion that I had with the other award winners of the Polys Ombudsperson of the Year. It's an award that was initially given to me back in 2021 for the year of 2020. It's at the Polys Awards, where they award a lot of the WebXR experiences. That's actually how they started. And then they expanded out and created the Academy of Immersive Arts and Sciences as a nonprofit that is running both the Polys and all these other initiatives and expanding beyond WebXR. But they award the person who is doing a lot of ethical work within the context of the XR industry, but also XR and AI. And so it was initially given to me for the calendar year of 2020. The ceremony was actually in 2021. And then I gave it to Avi Bar-Zeev, and then he gave it to Brittan Heller, and then to Micaela Mantegna, and then to Ingrid Kopp as the latest recipient. And so everyone but Avi was able to come together again this year. We had a first Mission Responsible conversation last year where we just had a general discussion around what's happening in the broader sphere of XR from an ethical lens. But this year, we're also talking about XR and AI as well, since a lot of those are interwoven together. You have, in my perspective, XR on the front end and AI on the back end. And so these systems are often intermixed together. And so what are some of the different ethical and moral dilemmas that we're seeing, some of the concerns that we have, also some broader issues within the industry, how we're going to be navigating the larger political context, not only here in the United States, but around the world.
So this is actually recorded a couple of months ago, before the full-on onslaught of what feels like the daily chaos of all that's happening in the wider world. So this is our sense of where things are headed when it comes to addressing some of these larger issues. And so, yes, it's kind of like a sampling of some of our current thoughts on what's happening in the XR industry. It's certainly a kind of strange, weird time, I think, especially for XR developers. Meta has been focusing a lot more on Horizon Worlds. UploadVR did a great survey just looking at some of the impact. Meta just announced a $50 million fund for creators for Meta Horizon. So very much they're focused on their own first-party app rather than supporting the broader ecosystem and content. We kind of see that playing out in different ways. And so I'm going to be at South by Southwest here in a couple of weeks, and I'm going to be covering the XR immersive exhibition, but also just generally being available, talking to different people in the XR industry to get some of their latest thoughts. I'm trying to get a beat on what's happening in the broader XR industry, which I think is a topic that's on everybody's mind. And so this conversation is also tapping into some of those deeper themes. So we're covering all that and more on today's episode of the Voices of VR podcast. So this conversation with the Polys Awards Ombudspersons of the Year, called Mission Responsible 2, happened on Sunday, December 8th, 2024 on the Engage platform. So with that, let's go ahead and dive right in.

[00:03:18.022] Julie Smithson: It's time for the main event. Now, without further ado, please welcome Kent Bye, Brittan Heller, Micaela Mantegna, and Ingrid Kopp as the fifth Polys Ombudsperson of the Year. Congratulations again, Ingrid. Please take it away and share your knowledge with everybody here. And once again, thank you everybody for joining us today.

[00:03:41.368] Kent Bye: All right. Thanks so much, Julie. And yeah, so I guess I'll kick it off and then we'll have each of our introductory statements, what's on top of our minds, and then we'll have a bit of a discussion about different topics. So again, my name is Kent Bye. I do the Voices of VR podcast. And I think my way into this is through the practice of oral history, going to these different events and having conversations with people, talking about the potentials of where this is all going. But I think the flip side of all the potentials are all the negative, less exalted ethical and moral dilemmas that these technologies are raising. I really see it as two sides of the same coin: where there are the potentials, there are all these new risks at the same time. And I think over time, I have tried to map out a lot of the landscape of what those ethical and moral dilemmas are. I did a whole XR ethics manifesto. I helped to lead the IEEE Global Initiative on the Ethics of Extended Reality, trying to help people navigate this landscape effectively. So I'll have a couple of comments in terms of, one, privacy, then AI, and then where my own focus has been over time. So I think privacy is probably obviously the biggest issue in terms of the threats that these technologies are presenting us. And if you look at some of the news that happened this year, there were a couple of students who took some Ray-Ban Meta glasses and started to stream everything that they were seeing and then also do facial recognition. And Meta's response to this was like, well, you could do that with a phone. And I think the difference is that we're going to be wearing these computers on our faces. And I think we're entering into new implications for what it means to have these computers with somewhat always-on cameras recording everything that we're doing.
And I think that project actually catalyzed a lot of public discussion around this idea of bystander privacy, which I've been talking about and covering on the podcast for a number of years, but it wasn't until it was made real by an actual project that it catalyzed all this media coverage. And at the end of the day, we're going to need some sort of new laws to actually address some of these issues. And so a framework that I use a lot to navigate these types of discussions is Lawrence Lessig's, where he says that there are these four major dials that we can start to turn when we talk about these types of issues. The cultural layer is what we're doing here. We're having discussions. It's the academics who are doing research, trying to come up with these different frameworks. And then at some point you have the legislators come in and create new laws to enforce best practices. And then there's the economic layer, which is, if you don't like surveillance capitalism, you can choose to support companies that aren't in those models. And the last one is the actual technological architecture and the code: what are the different types of technologically mediated solutions to some of these problems? I think any sufficiently complicated ethical issue is going to be hitting all those different areas, whether cultural, legal, economic, or technological. And so over time, I feel like I've become disillusioned about the degree to which legislation can actually make meaningful change, especially privacy legislation, and especially as we have a new administration that's going to be even more laissez-faire capitalist and less interested in doing more regulation. And so I did do the whole extended reality law article on contextually aware AI and the threats of this always-on AI that's paying attention to everything that we're doing.
And I think my interest lately, in terms of where I can find the most leverage for where I put my energy, is not in the legal realm. Like, I did some of these white papers, but I'm not sure how much impact they're going to have over the long term. I'm glad that they exist for other folks who are making those policies to have as a reference. But the real leverage point that I've been really interested in lately is looking at what artists and storytellers are doing and how they're addressing these different issues. Because I do think that storytelling and having a direct and immersive experience of some of these topics is a way that takes it from this abstract realm and really grounds it down, just like the students had done. And so, just going to IDFA DocLab, where Ingrid was as well, there were a lot of projects exploring the potentials of AI. And some projects that are, you know, like you sit down in front of the camera and you have the AI that judges you. And it's basically looking at all these different issues of large language models and the biases that are included there. What if you had an art project that tried to remove all those biases, and you just try to see to what degree those biases exist and that you're being judged in a very stereotypical way? So it's kind of a provocative project where people feel compelled to be judged by these technologies and by AI. I think there's another layer of to what degree does that art project really deconstruct and critique, and really analyze and come up with solutions by curating different data sets. So I feel like AI as a topic has experiential elements, but there are all these other elements in terms of the deeper contextual, relational aspects of how these data sets are being connected, and the impact of when there's missing data.
And that's the type of experience where the creators were like, you know, we don't even suggest that people do this experience, because it could be harmful for the ways that AI is judging them in a way that is very stereotypical. So it's those different types of projects, which then lead to conversations that I'm covering, that I feel like I am a little bit more interested in personally right now, rather than trying to figure out how these laws are going to be changed in a realm where I feel like I have less and less agency. So I just wanted to do a little bit of ground setting for how I'm thinking about it. As we go on, we can talk about some of the other issues that I see coming up in terms of AI and archives, the sanctity of those archives, and whatnot. But I think that's probably a good ground setting to pass it over to Brittan to hear a little bit about what's on top of your mind and where you've been putting your focus.

[00:09:38.029] Brittan Heller: Thank you. Hi, everybody. Thank you for having me here today. The sort of things that I've been looking at over the last year: one has been really the alignment or misfit of existing laws with extended reality. And it's kind of evolved into more of a concept that shouldn't be new to anyone in this room, but to people at law schools, it kind of shakes them up a bit. I've been talking about the future of the internet looking more like an embodied web and what that means for the laws that we use to govern digital spaces. The short version of that is looking at AI regulations that are coming out, looking at Web3, and looking at laws that are trying to address the Internet and XR spaces. If you regulate one part without considering how it interacts with the others, then you're missing the bigger picture. I was having a discussion with Ben a couple of weeks ago, and he told me that he and Neil Trevett had this analogy of a constellation. I said, that's so funny, because that's how I have been talking and thinking about this as well, where you have a constellation of emerging technologies. If you regulate for one part in lieu of the rest, you miss the bigger picture. So I've been trying to think about what the bigger picture might look like. This means I've been doing a lot more international work. I'm with you, Kent, in that I am kind of disheartened looking at the United States and the patchwork of state-based regulations that are supposed to be governing this, but are really wholly insufficient. So I saw that Verity is in the audience, and I've been working off of her and Catherine Allen's good work, where I'm doing a follow-up study for the Council of Europe about freedom of expression in immersive spaces and whether the existing legal frameworks in Europe, in the Council of Europe, and in the existing human rights mechanisms are sufficient to protect freedom of expression,
or whether they need to create new mechanisms, new laws, new human rights bodies. I just testified before the committee this week, and the report should be live in June. I'm trying to look more granularly at the ways that we need to protect the things that make XR spaces valuable and meaningful, especially as we're entering into a period of political instability and volatility going forward. That's what's been keeping me up at night over the last year. Like Kent, I also am haunted by the ghost of Lawrence Lessig. So I put out a report about updating “code is law” for spatial computing and AI systems. The one takeaway that I think is the most important for groups like this is that he talks about having four different pillars that make up the way we govern spaces. And he talks about the social pillar or, like you said, the cultural pillar, but he gives it short shrift in his writing. He talks a lot about code as digital architecture, and about code and law as being equivalent to each other. He talks a lot about economics, but he doesn't talk a lot about how social rules are the way we govern spaces where the law doesn't touch. And so I think that that's a very important thing to look at, as generative AI is allowing us to create interoperable 3D spaces from a static image. So it's going to be more important going forward to think about not just what the laws are, but how the social dynamics work in these spaces and what we can do to create places where everybody belongs. So that's what I work on. Thanks, everybody.

[00:13:32.754] Kent Bye: Micaela.

[00:13:34.960] Micaela Mantegna: Hi, everyone. So happy to be here. One year, and many things have happened this year. I have been thinking and taking notes on what you were saying. I think a lot of us have been kept up late at night with all the changes. For those who don't know, I'm living in Argentina, which is in a particular political situation at the moment. I tell people to pay attention to what's happening here, because I feel it's a laboratory for things to come in the world in terms of the shift in politics and policies, particularly a downgrade, we might say, of a lot of the things we used to take for granted. I agree with Kent and what he was saying about storytelling. I have the feeling that we have been talking about, for example, the ethics of AI for many, many years. And a lot of the things that are being discussed today in policy environments are things that we have been warning about for the past, I don't know, six, seven years, things that we have been forecasting, and nothing has been done in time. And I fear that we don't have time to put the cat inside the box again. And maybe we should take this lesson for our coming technologies. And with Ingrid, we have been talking a lot about this narrative that the metaverse is dead, and this idea that the metaverse was just a passing fad. And to quote something that my grandmother used to say: the devil's best work is to make everyone believe that it doesn't exist. Going into this invisibility is like a double-edged sword, because it's a blessing in disguise. On one hand, we can continue to work and create this quiet metaverse at a more responsible pace. But at the same time, this shifting focus takes away the importance of creating the legal architecture and policies for this industry to build on solid ground. I also agree with what people were saying about art and storytelling in terms of how we reach broader audiences to include them in this discussion.
This year, I have been doing a lot of work on the intersection of generative AI, art, and culture. And it's intriguing to see how people allow themselves to sit down with generative AI. For example, there was an experience at Sonar in Barcelona this year, an interactive exhibition, where people were sitting down to be roasted by an AI. And for me, the real experiment is not the results and the images that were coming from the AI, but why we allow this: the behavioral changes that we as a humanity are seeing. I'm going to make a lot of nerd references. Those who know me know that pop culture is my love language. And I was thinking about what Kent was saying, that there are no shadows, that we are being recorded all the time, in every place. And I was thinking of the series What We Do in the Shadows. It's like we don't have any more space to have these conversations in privacy, and this is shifting the way we behave. People are now getting used to being recorded, but also used to recording other people as a means of defense: to take out your phone without expecting consent from other people and start recording as a means to put something out on social media. This, for me, is one of the behavioral impacts that all this perpetual recording is having on our cognition. And also, thinking about the invisible architecture of generative AI and immersive experiences: people who are not in this environment think about immersive experiences and digital experiences as something where you are completely free, where you can create anything from scratch. And we know that there are a lot of regulations already in place that prevent that, a kind of invisible architecture that limits what we can do. And I'm very concerned, particularly in my field, when it comes to copyright, about how this is going to be weaponized against our culture and create a more unsustainable living for artists.
The last thing, well, I have many in my head, but another thing that keeps me up at night is: what's the next step of this, while we're also building in the shadows, while the focus is elsewhere? For me, the next shift is the intersection of neurotechnology and immersive technology: neural devices to access immersive experiences that are going to gather even more information about ourselves. And what is the cognitive impact of all this information being used against us? What is the long-term impact on our memory, on the formation of our relationships with each other, and how does this intertwine with the economic business models of digital societies? So the book I'm writing now, for the past two years, is called Brain Dancing in the Metaverse: A Capitalism of Cognitive Surveillance. And in my mind, it's trying to unite all these elements in terms of culture, in terms of surveillance, in terms of business models and economics. Because for me, it's always puzzling that we are talking about all these beautiful technologies and their potential, not just the downsides, in terms of access to education, in terms of democratization of information, that everyone can access even more information. And for me, this is truly important, coming from a really small town in Patagonia in Argentina. But what is keeping us from these post-scarcity models, in Star Trek language? It's the business models we adopt for this technology. So maybe I'm not as coherent as I want to be, but these are the kinds of things that are keeping me up at night. And with that, I will pass to Ingrid.

[00:19:52.093] Ingrid Kopp: Thank you so much. It's funny, you know, I was coming here thinking about the fact that I was going to make a case for storytelling, and then all of you made the case for storytelling for me. So thank you. Because I've actually been kind of feeling it the other way around. You know, I've been in the immersive storytelling space, well, not forever, but definitely since 2011. And it's always felt a little bit like it wasn't enough. You know, we felt like we weren't really moving the needle. And there were so many interesting things we were exploring, and some of the work you were doing on ethics, Kent. And I was thinking, you know, we really need to be doing more of this kind of work in South Africa and in Africa. And it's funny, because the more I do the storytelling work, the more I am actually seeing how it is opening up these spaces. But it just never feels like quite enough, because, you know, especially right now, given the state of the world and how precarious things feel, it just feels like it's not happening fast enough. So, you know, through some of these projects, it feels like a space is opening up where you can explore a lot of these things around access and making sure that, like I was saying earlier, we are imagining a future. I mean, I love, Micaela, what you were saying about it, you know, like maybe it's more Star Trek and less sort of, you know, cyber dystopia. And I think about this all the time: what is this future that we are creating together? And also the fact that we're in this space, but we're also in our other world, in real life, with these real issues, one of which is actually that my headset battery says that it's low. So if I disappear, that's why, but I'll go and plug it in after I speak. So, yeah, I think for me, it's like, how do we move the needle, given the fact that everything does feel particularly precarious right now?
You know, I'm also thinking a lot about environmental costs. When we look at the fact that a lot of these metaverse technologies are converging with the blockchain and with generative AI, there is also a cost, an environmental cost, there. And I think for me, and I'm sure we all feel it in this space, there's always this kind of, oh, isn't this exciting? There are so many amazing things that we can do now as we gather in these spaces. And I'm so inspired to see some of the communities that have sprung up. You know, at the lab that we just ran for Electric South, we were looking at communities in VRChat: deaf communities, communities around trans identities. And it was so inspiring, because actually seeing what people are doing in these spaces, things they can't necessarily do in the other world (I don't know what to call it now), is really, really powerful. But there is also the reality that a lot of these headsets don't last very long and end up in landfills, and we're really worried about all the data centers that are required for the compute to do all of this work. So I don't have any answers, but I think for now, all I can say is it is sitting very heavily with me. There is this huge potential and so much creativity in the sector, but there is also this big, big stuff that we have to deal with. And I guess the only thing that I can really offer is that I do think the answer is to expand the circle, because there is still an access issue with what is happening in these spaces. For example, you can't actually buy a Quest headset in South Africa the regular way. You have to import it and pay extra customs. And then, of course, you need the Internet and all the rest of it. So there are still all these barriers to entry. And I think for me, a lot of these solutions and a lot of these imaginings require more people in the space. We need more diversity here.
We need to really be thinking more expansively, so that this future that we're all building together is truly all of us building it together, rather than, as right now, quite a small subset of us. So I think that's what I'm trying to do with Electric South. We're a tiny non-profit. We started in 2015 by accident. I was asked to curate an exhibition of virtual reality storytelling work at the Goethe Institute in Johannesburg, and I couldn't find any pieces made in Africa. There were a lot of pieces made about Africa, but no pieces actually made by African artists on the continent. And a lot of the stories were kind of familiar tropes of war, famine, and disease. And I just thought, oh, God, not again. There's got to be a better way to make sure that this new medium doesn't repeat all the mistakes of, you know, film and other mediums. So that's why Electric South began. And we're now, you know, a nonprofit, and we run lots of labs and produce work. And now we're doing policy and research work as well, because I'm trying to move the needle in other ways. But yeah, it always feels like a bit of a drop in the ocean, and I kind of wish we could go a little bit faster. The one last thing I want to say is that I come from a documentary background, and I actually do a lot of work with journalism. And so one of the other things that I'm worried about, particularly now, given the sort of political climate that many countries are in around the globe, is trust. How do you trust the media, whatever media it is that we're talking about? So things like synthetic media and trust in journalism are all really, really important issues for me. So that's another thing that I just want to bring to this conversation. And I think with that, I'm going to have to go and plug in my headset and come back. Thank you.

[00:25:20.896] Kent Bye: Awesome. Thanks, Ingrid. So I have a couple of points I just want to make and then throw it back. So at IDFA DocLab, there were a number of different documentary projects exploring all these different issues. There's a book by William Uricchio and Kat Cizek, and one of the quotes that William Uricchio wrote was that documentary as a form represents 90% of all the films that were made in the first decade of film. And so MIT Open Doc Lab is also identifying that documentary as a form is always on the frontiers of emerging technologies, whether that's XR, or the first use of audio that was disconnected from reels, or color film. But now we have XR with Nonny de la Peña's Hunger in LA. And AI ends up being featured in a lot of the different documentary projects that were at IDFA DocLab this year. Before I get into the AI themes and trends that I'm seeing, I also just wanted to make a shout out to a project called Drinking Brecht, which was this mix where you're in a classroom, it's like a speakeasy, and you're learning how to do one of the very first biohacking experiments, where you're using strawberries and pineapple with alcohol to basically separate DNA. And you end up doing this whole lab experiment and drinking your experiment at the end. But it was a project that was really exploring this idea of Brecht and what it means to take political action here in the world of technology and politics. There's this kind of Aristotelian model where you go into the theater, you cathartically release all your emotions, you leave your emotions in the theater, and you go back and live your life. The idea is that you're purging your emotions. And Brecht was really trying to get at this idea of how you actually become politically engaged and bring about change.
And I think that's kind of a subtext here: okay, now that there's education, what is the political action that you're going to take, especially if the legislators aren't actually implementing any laws that are protecting us? And I think that's the place where we're going to be struggling, especially here in the United States over the next four years: how do we have a resistance to what's going on, while also looking at these larger issues and trying to think about them systemically? I don't know what the answer to that is going to be, but for me, I'm going back to these stories and these experiences, just because that's at least a place where I feel like I have some agency to see where artists and creatives are exploring some of these issues. So I just wanted to point out some of the other big trends that I was seeing at IDFA DocLab. All these themes of, what's truth? What's reality? The opening night film of IDFA DocLab was a documentary that featured a large language model trained on the entire corpus of Werner Herzog's films. It created a fake true crime documentary, and then they cloned the voice of Werner Herzog to narrate this fake documentary. And then they were featuring experts seamlessly blended into this world. But you didn't quite know at any given moment what was real or wasn't real, until the end, where they go through the disclosure of what was generated by generative AI. And so the fact that that was the opening night film, I think, speaks to the fact that we're in this realm of: what's real? What are the alternative reality bubbles? How do we navigate this with media literacy? What are the ethics of disclosure? There was an Archival Producers Alliance that was having a meeting at MIT Open Doc Lab saying, look, as we have documentaries that are out there, if we start having generative AI create fake archival images, then how do we preserve the integrity of the archive, of what is real and what isn't real?
And so there were a number of other projects, like one from someone who was living in France but also had Vietnamese heritage, whose grandmother had destroyed photos of the family. And so she took some of the photos and used generative AI to create all these fake family photos and tell the story of her family. So she was using real audio interviews along with this constructed reality of these family memories, and then she ran into having to overcome all the biases of the models: if she was trying to generate images of Vietnam in the 1970s, then the overall bias is going to be toward creating images of the Vietnam War. And so she had to use other languages to prompt things that weren't triggering all these stereotypical tropes, tropes influenced by the Western culture that is directing how all of these different images are being produced. And so talking to some of these different artists who are at these edge cases, I think, starts to reveal some of the limitations of the Western influence on all these AI models. So there are projects like that that start to elucidate some of the larger dynamics that may otherwise be invisible. I guess the last point I'd make is that this has driven me toward a paradigm shift to process-relational philosophy, which tries to preserve the relational dynamics of all these contextual dimensions. XR is the front end and AI is the back end. And as we have these invisible back ends, these types of experiences can start to allow us to become more aware of those gaps in the contextual dimensions of how these data sets were put together, or how these biases may be impacting certain populations or certain experiences. And I think that's part of the emerging fluency that creators are going to have to develop: being aware of that, and knowing how to either disclose it or create their own data sets in order to overcome some of these existing biases.
So yeah, those were some of the different trends in AI that I was seeing. What's true, what's not true, and how do we navigate that as we move forward were some of the big themes that I saw coming up.

[00:30:50.148] Michaela Mantegna: No, no, I was thinking about an exhibition I saw at Ars Electronica this year, because there's an essay from last year called "AI and the American Smile." This is what you were saying, Kent, about how we overimpose our Occidental imagery onto history and how in that we create new realities and new histories. It also connects with this cognitive thing: maybe it's not a true reproduction of your grandmother or your lost family, but in your head, it becomes the de facto representation of your family. This exhibition applied that to food. There was, for example, the blood cake, which has a very specific meaning in some countries, referring to a particular dessert. But if you use models trained on Occidental imagery, of course, it's going to replicate a literal blood cake. And for me, this comes to the point of something that I have been researching, and I published a paper this year, on the aesthetics of AI: how a homogeneous visual imaginary is being created, and how these new images that AI is creating are redefining ideas of beauty and success, with very real consequences for our human cognition. Because, as you were talking about, this division of online versus offline, we know that it doesn't exist, this kind of magic circle. We are not able to switch off our emotions. We are not able to disengage from the emotions that sometimes get provoked, particularly when it comes to social media. So how do we come back from even more immersive experiences? And this is something that for me is very worrisome, because I have been thinking of this in terms of how divers decompress when they go back to the surface, or how people who do BDSM practices come back from that state of play into the quote-unquote reality, and you have stages to readapt yourself. So for me, it's like, how do we enable some kind of immersion aftercare protocols?
And also in terms of how we get our brains to think logically about the images that we have seen, not in terms of how we have learned these ideas of the world, these ideas of who is beautiful, particularly thinking about teenagers and the impact on teenagers and their mental health, and how we're creating these new relationships with generative AI. It's kind of ironic, because we have been talking about the Turing test for many, many years, badly named the Turing test because it's not actually that. And now we're seeing this kind of reverse Turing test, where we are constantly asking, what is reality? Is this real? Is this synthetic? And at the end of the day, one of the things that for me is really concerning is that even if we can think about it and understand it, maybe there is something subconscious that doesn't connect the dots, and the images persist in our psyches.

[00:34:04.753] Brittan Heller: Are you familiar with the study that came out from Amazon Web Services in June that said that 57% of the content on the internet is now created by generative AI or run through a generative AI filter? So for me, what you're saying is really, really salient, because it's not just going to impact immersive worlds. It's the wider concept of digital spaces. And this question of authenticity that underlies it is increasingly salient as the wave of generative AI swallows up these spaces. I don't mean that to sound negative, but I think it does belie, like you're saying, more fundamental questions of truth and how that impacts people's minds.

[00:34:53.137] Ingrid Kopp: Yeah, I mean, absolutely. And I think, you know, it's interesting. I always feel like when I start talking about generative AI, I end up sounding really negative. And actually, you know, I'm very excited about a lot of the things that you can do with generative AI. I don't mean to always come across as sounding so negative when I bring it up, but it's really difficult. I mean, when you're looking at the slop, I think they're calling it slop now, the AI just feeding on itself, it's kind of rubbish in, rubbish out, and the rubbish actually gets more and more rubbishy. It's like an Ouroboros, right? This circular AI eating itself, but it's also eating the web. And I think one of the things that I keep remembering is that part of the reason why I'm so excited about this space is because I was so excited about the internet. I mean, I'm old enough that when I was at university, the internet was only just being introduced into classes. So it was so exciting. I was really in all those chat rooms, and I was just so excited to get my first email account. I was really part of that DIY web culture. It was really exciting. And I find a lot of that excitement in this immersive space as well. But I am really more and more worried about what is happening to these spaces as they become more closed off and gated, as AI starts eating everything, literally. It does make me worry about what these spaces are going to look like, not just immersive spaces, I mean all these digital spaces, in five years, in ten years. And Kent, going back to your reflections on IDFA DocLab this year, I found the whole process of being there this year really moving.
Because there were so many projects that brought up so many of these ideas right now, and I love that so many of them were so personal and reflective. But what I kept thinking is: how many people are going to see this, and how much is it going to move the needle? And I think that's the frame of mind I'm in right now, where part of me is feeling like we need these very personal, very intimate stories to connect with, to really make us think about our role in the world and how we want to participate as citizens, as voters, you know, as people with families and loved ones, as citizens of the earth. But the bigger picture is: are people going to see these projects? Is it enough? And I'm saying that because it's something I'm really struggling with right now with Electric South. It just feels like these problems are so big. How do we even address them? Yeah. Maybe someone has an answer for me.

[00:37:27.481] Julie Smithson: And hello, everyone. Thank you so much for your comments. We're kind of wrapping up on time, so we'd love to hear all of your final thoughts before we do close off. And then, you know, if anybody wants to stick around and continue this conversation, we want to respect everybody's time. So back over to you, Kent, to wrap up this roundtable, such an important conversation that obviously needs to be lifted off from here and taken back into the world for all of us to consider. So thank you.

[00:37:56.738] Kent Bye: For sure. I have two thoughts. One, just to follow on what you were saying, Ingrid, in terms of distribution and getting access to these different experiences: that's a hot topic within the context of the broader festival circuit for these different immersive stories. Venice Immersive had a think tank on these distribution strategies, and I'm actually in the process of writing up that report, doing a state of the industry in terms of distribution and getting these different types of stories more broadly out into the world. Also, the MIT Open Documentary Lab is actively looking at distribution as a problem to be researched and solved. But I guess my closing thought would be that I recently wrote an article for Ultra Magazine out of Italy titled "From Possibilities to Actualities: The Reality of These Speculative Futures." Substance metaphysics would say physical reality is the ultimate reality, but from a process philosophy perspective, this range of possibilities is also seen as being real. So it's like the quantum wave function: what's possible is mutually implicative with what gets collapsed into what becomes actual. And if you want the fundamental building blocks of reality, you can't just use the stuff of physical reality. You actually need to look at these realms of possibilities, these processes, these potentials, these relational dynamics that are in these math functions, which are also completely real. And so there's this idea that these speculative futures have real impact on the world. We can just look at virtual reality in terms of how science fiction over time has influenced and inspired so many different people to actually go out and build these things and bring them into this reality.
So it's more of a process of going from the mental pole into the physical pole, where these ideas and these possibilities will potentially become grounded into physical reality. So there's the potential for these immersive technologies to explore these speculative futures that allow us to prototype new aspects of culture, to explore contexts beyond the existing economic and political dimensions and constraints of our reality right now, to look at Indigenous futurism, Arab futurism, feminist futures, Afrofuturism, all these different ways of using the immersive technologies to escape the constraints of our reality. And Lisa Messeri, the anthropologist from Yale, said she's seeing more and more of these techniques being a valid way of exploring dynamics that get beyond the constraints of our rational thinking and allow us to tap into our imagination and our creativity, creating this playground for us to explore these new possibilities. And I feel like that gives me a lot of hope for the way that these immersive technologies can leverage these types of speculative futures to allow us to imagine a world that we want to live into. And so I guess that's kind of my closing thought.

[00:40:46.949] Brittan Heller: I think my closing thought is more of a call to action. Since we all feel like we're on a tipping point around the world this year, use the way that you interact with immersive spaces, and the way you share your gifts of creativity or critical analysis or connecting people, to create the next version of the internet that you want to see. I always joke around and say that with XR spaces, we literally have a second bite at the apple here, so let's not waste it.

[00:41:23.739] Michaela Mantegna: In my head, many, many things, but trying to look really far into the future, for me, there's something that maybe we are not going to see. But if you look at the historical cycles of our civilization, there were at certain points systems that seemed to be there forever and were unmovable. I was thinking about AI out of fashion, because I also have been thinking a lot about how capitalism is eating itself, and how there is not going to be enough labor for everyone in the future because we are going to see more and more automation. And on the other hand, we have these beautiful technologies that allow us to have digital abundance. So maybe, maybe, looking into the far, far future, what we can expect to see is a shift in economics. And maybe that's the hopeful thought that I would love to leave the table with: that we are maybe at the moment of the birthing pains of a new economic moment, that in the future we can collectively decide that this hasn't worked out for us, in terms of climate change, in terms of inequality, in terms of bias, and maybe take the technology and change the business models in a way that we can truly become a post-scarcity society.

[00:42:43.132] Ingrid Kopp: I think for me, and I'm trying to think of a way to say this so it sounds like a positive, optimistic ending. But, you know, if you look at the rightward shift in Silicon Valley recently, there is a tendency of some of these digital technologies to kind of take you off-world, where you're sealed away in these bubbles and your brain is in a jar. It's kind of the most out-of-body experience possible. It's almost like bodies are yucky things that we want to dissociate from. And I think my plea would be to remember, I mean, right now, as I'm saying this to you, there's a mosquito biting my ankle. We are people. We are bodies in a world that we share with animals and with the planet. And we are embodied. We are embodied humans in these metaverse worlds in wonderful ways, but we are also bodies in the real world, and what we do here has ramifications there, and vice versa, right? So I absolutely agree with Brittan: we do have a chance to build the future internet we want, but this is all connected to the world we're in, and we've got to remember that we are bodies in real worlds with real-world problems. And I think the more that we can connect what we're doing in these spaces with what is happening in the world, and not feel like these spaces are an escape, that for me is where we will start to see more progressive, more expansive and open spaces in the future. And that's what I want. I want us to feel like we are actually building something great and not just trying to escape from a world that we're all trying to run away from. So yeah, I guess that was both positive and negative, I think. Thank you.

[00:44:20.868] Julie Smithson: Amazing. Amazing. Thank you so much. I can honestly say that we're leaving here on a positive note. Having the four of you here as a community, we really have a mission we're all responsible for, not just those sitting at the table. I think with all of us working together, we keep up these conversations. It doesn't end once you take your headset off or you leave the Polys world here. It's a constant way of building humanity first, ethics first, so that we all live in a place that we want to, and that is protected and guarded, as well as, you know, just putting the right things into play. And there are so many different considerations now. So thank you, everybody, for joining us for Mission Responsible 2 in celebration of the Polys, on behalf of the Academy of Immersive Arts and Sciences. Let's not close off these conversations. There's lots to talk about. So on behalf of Kent, Brittan, Michaela, and Ingrid, thank you from all of us for what you're doing. Keep up the fight. We're fighting right there with you. This has been Mission Responsible 2, the Polys Colloquium of Ombudspersons. And yes, that is a hard statement to say. I'm Julie Smithson. Thanks for being here. Please enjoy the rest of the day, and we'll see you at the Polys 2025.

[00:45:48.399] Kent Bye: So that was myself, Kent Bye, as well as Brittan Heller, Michaela Mantegna, and Ingrid Kopp at the Ombudsperson of the Year conversation, Mission Responsible 2, that was run by the Polys. The Polys awards are actually going to be on Sunday, March 25th, 2025, so you can go watch that full ceremony where they're going to be awarding all the different awards for this past year, for all the different WebXR awards and also some broader XR experiences as well. So, a number of different takeaways from this conversation. First of all, in this conversation we talk a lot around the intersections between AI and XR. Like I said in the conversation, I see XR as the front end and AI as the back end. So we'll be seeing lots more integrations of these systems as we move forward. And a surprising thing in some ways is that everyone had something to say around the power of storytelling, just because we are in a time where the stories that we're telling are helping to frame a lot of these different issues and ground them in a way that people can access beyond the abstractions of some of these discussions. I think whenever you're talking around XR ethics or AI ethics, it tends to be a little bit abstract. But when things are grounded in an immersive experience or a specific story, it sets a certain bound and context where it may be a little bit easier to dive into the specifics of how these things play out. And so, yeah, there's a great article that Brittan Heller wrote called "Revisiting Code as Law: Regulation and Extended Reality," in the Vanderbilt Journal of Entertainment and Technology Law, unpacking a lot of these different issues. Michaela had mentioned "AI and the American Smile," an article that was unpacking a lot of the Western influence on these large language models.
One of the things that was happening at IDFA DocLab this year was an R&D Summit event. I wrote up a whole report that hopefully will be coming out here soon. And Ingrid had actually led a group discussion about AI and artists working with it. During that discussion, there were people coming forth and saying, look, we should just avoid using any of these systems altogether; these systems are created by these huge technology companies. And I think there's this feeling of not having a lot of agency. But part of the other discussion that was happening there was that part of the artistic practice was actually to curate your own data and to train your own AI models on data sets that cover what seem to be gaps in existing large language models. So part of the future of AI art is more of an archivist role, where you're gathering in the data and trying to create something that's different from these homogeneous large language models, just this idea that you have more of a diversity and plurality of many different types of perspectives embedded into a large language model. With Gödel incompleteness, you're never, ever going to get a complete AI model that includes everything. And so having a diversity and plurality of lots of different models, I think, is one way of addressing some of those AI limitations. However, with each of these existing AI architectures of large language models, there are certainly a lot of limitations around hallucinations and the lack of reasoning, and a whole set of other limitations and constraints that I think it's just helpful for artists to be aware of.
One of the things I found covering the different experiences at IDFA DocLab is that these artists would be creating something, but it wouldn't be until the conversations that I had with them that I would start to unpick some of the different limitations, like this conversation I had with Emmeline Corsier about her piece called Burned from Absence, where she was using generative AI to reconstruct these family photos. She ran into a lot of issues trying to generate images in the context of Vietnam in the 1970s, where most of the images in the models' training data come from the context of the Vietnam War. But she was trying to show aspects of these villages independent of that war context and just found it really difficult, having to find creative ways of prompting in the Vietnamese language rather than English. But it wasn't until the conversation I had with her that a lot of those different types of limitations she was running into were explicated. You could just watch the piece and not even realize that some of these photos were created with generative AI, unless you maybe read the synopsis or have a little bit more of a trained eye for identifying when things are synthetic versus not synthetic. But obviously, as time goes on, it's going to be more and more difficult for folks to be able to tell the difference. So, yeah, I think overall we were covering a lot of the broad discussions and context. I don't expect to see a lot of legislation coming out of the United States regarding any of these different types of tech ethics or AI ethics over the next four years. And so Brittan was saying that a lot of the work that she's doing is looking at these issues individually. From an international perspective, the European Union is certainly at the bleeding edge.
And so a lot of the work that she's looking at goes beyond the US context, just because it doesn't seem like there's going to be much legislation constraining the American companies that are putting out all these different technologies. Michaela was also bringing up these larger issues around capitalism and the possibility for these digital spaces to produce these types of post-scarcity models: hey, we don't have the same types of constraints, so are there new economic models? And I think we certainly see that in some of these different digital spaces. In a lot of ways, you see some of the tensions playing out in the default gifting culture on a platform like VRChat, as an example, where you have people just creating and exchanging this kind of virtual culture independent of any monetary exchange. And then you have other pressures of needing to find ways to make a platform like VRChat sustainable. So how do you monetize some of these different things? How do you support creators, but also how is the platform of VRChat itself going to be able to sustain itself in the long term, as it works against that default of the post-scarcity, abundant gifting culture that has been cultivated over many, many years of not having any economic system built in? And I've been spending a lot of time diving into Meta Horizon Worlds over the last number of weeks, just because there was a letter from Andrew Bosworth who said, look, this is kind of a make-or-break year for XR, and they're sort of putting all their eggs into the one basket of Meta Horizon Worlds. In this memo, leaked to Business Insider on February 3rd, 2025, the Meta CTO said that the metaverse could be a "legendary misadventure" if the company doesn't boost sales. The memo says, 2025, the year of greatness:
"Next year is going to be the most critical year in my eight years at Reality Labs. We have the best portfolio of products we've ever had in market and are pushing our advantage by launching half a dozen more AI-powered wearables. We need to drive sales, retention, and engagement across the board, but especially in MR." And on MR, Bosworth clarified on Twitter that even though MR means mixed reality, they're treating VR as a subset of MR, which is quite a choice, because it's just confusing: people think MR is just mixed reality, but for them it also includes VR. But anyway, he continues: "And Horizon Worlds on mobile absolutely has to break out for our long-term plans to have a chance. If you don't feel the weight of history on you then you aren't paying attention. This year likely determines whether this entire effort will go down as the work of visionaries or a legendary misadventure." So it sounds like they're pinning a lot of the hopes for the future of XR onto the success of Meta Horizon Worlds, which is really quite sad on a number of different fronts. It reminds me of the memo that Mark Zuckerberg sent out on June 22nd, 2015, on VR/AR strategy. It was reported by TechCrunch in an article; if you want to find it, you can search for Mark Zuckerberg buying Unity. It was a discussion where he was thinking about whether they should buy Unity. But in this article, there was a memo that Blake Harris was able to acquire because he had this amazing access to all this information as he was writing the book The History of the Future, and he gave this memo to TechCrunch to publish. It's this larger strategy document, and in that strategy document, Zuckerberg says that we'll discuss the main elements of the platform and key apps further below, but for now keep in mind that we need to succeed in building both a major platform and key apps to improve our strategic position on the next platform.
"If we only build key apps but not the platform, we will remain in our current position. If we only build the platform but not the key apps, we may be in a worse position. We need to build both." So from the perspective of Meta, the reason why they're doing this is that they need to have both the XR platform and the key apps. And they put all their eggs in the basket of the key app being Meta Horizon Worlds, but it's never really taken off. There's so much more compelling stuff happening in Rec Room as well as in VRChat, which have a VR-native perspective and people actually creating these interesting worlds. So in this memo, Boz is putting a lot of weight on the success of Horizon Worlds. But going back to Zuckerberg's memo, one of the things he says is that you can divide the ecosystem into three major parts: apps / experiences, platform services, and hardware / systems, and that in his vision of ubiquitous VR / AR, these are listed in order of importance, although it's worth noting that Apple has built the world's most valuable company with a high-end vision by reversing that order. So he's essentially saying the most important thing is their first-party apps, the second most important thing is the platform services, and the third is the hardware. So certainly they've been subsidizing the hardware. And he's saying that Apple has traditionally focused more on profiting from hardware sales, but they're going to reverse that: their apps and experiences are going to be the way that they make the most money. And they've acquired a number of different apps over the years. But like I said, they've put a lot of stock into having the killer social app, and they really haven't had that so much.
So anyway, that's overall just a little concerning, because of how much they're pinning on the success of not only Horizon Worlds, but Horizon Worlds on mobile, which is not even XR. It's a completely different type of experience: most of these experiences were built to be done in VR, and when you're trying to play them in the mobile version of the Meta Horizon Worlds app, it isn't a compelling experience. The design constraints for a mobile-first design versus a VR-first design are completely different. You also have this weird hierarchy when you're in a game with people who are on mobile and you're in VR; you can certainly tell that they're kind of moving around in a robotic way. So Meta just announced a whole $50 million fund, and they're certainly putting a lot of their funding into trying to build up the different worlds in Horizon Worlds. And also, when you go into the Meta Quest, you just get all of these Horizon Worlds worlds promoted to you, and they're kind of put on the same level as what's happening with the larger Quest store. You know, one of the things that happened last year is that the Meta ecosystem put App Lab apps on the same level as the paid apps that had to go through this really tight curatorial process of app store curation, and that flooded the system. And at the same time, they flooded it with all the Horizon Worlds worlds. And so you have this kind of double hit from App Lab apps on top of the default screens on Quest promoting a lot of these Horizon worlds. And then if you open up what used to be the Oculus app, it's now called the Meta Horizon app, and again, you have to click on the upper right-hand corner of the store just to even discover the paid apps that are there.
And so the prime real estate for advertising the different immersive experiences made by third-party game developers is falling by the wayside in favor of Meta promoting their first-party apps. Upload VR did a really great article by Henry Stockdale, published on February 5th, 2025: "From Quest to Horizon: How Meta's Shifting Priorities Are Affecting Developers." They talked to over a dozen developers, many of them off the record, about the impact of this kind of curatorial shift. Anyway, this is a lot of independent research above and beyond what's happening in the conversation that we're having here about the ethics of XR. But I think the fact that Meta is putting so much effort into their first-party apps and into what's happening in Meta Horizon Worlds is worth mentioning here, as people are trying to get a sense of what's happening in the broader XR industry, especially as just yesterday they announced a $50 million fund to continue to try to push forward what's possible in Meta Horizon Worlds. They also launched a desktop app to design more sophisticated applications in Meta Horizon Worlds, so we'll see how that continues to develop. And I'll likely be diving a little bit deeper into what's happening in Meta Horizon Worlds, as I've been exploring around, going through all their incentives. There's a lot of gamification of different achievements: if you do this, then you get that. There are a lot of ways that they're using avatar representation to drive engagement: if you do this experience for a half hour, then you get this many style points, and with those style points you get avatars. So they're creating like an in-game economy, but rather than generating revenue and money, they're trying to drum up engagement, so they can eventually turn on more of the in-world economies that are there as well. So anyway, that's all I have for today.
And I just want to thank you for listening to Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon at patreon.com slash voices of VR. Thanks for listening.
