Consensus Gentium took home the Grand Jury Prize for SXSW Immersive 2023, and it’s the most immersive phone-based experience that I’ve ever had. The Latin title translates to “If everyone believes it, it must be true,” and it’s a near-term speculative sci-fi piece that explores what China’s social scoring system might look like in the context of the United States, where mobility is restricted by algorithms but can be expanded if citizens agree to be surveilled by a phone app. The experience automatically shifts between mobile app videos onboarding you into a surveillance state, FaceTime calls where your face appears in the lower right corner, and text messages that pop up seamlessly throughout as you jump between different tasks and cut scenes that build up the world and immerse you deeper into what director Karen Palmer describes as a “reality simulator” of an entirely plausible near future.
The piece also carries the logline “the film that watches you back,” as it integrates facial tracking technologies and an eye gaze mechanic that allows you to make a few conscious and unconscious choices throughout the piece. Its themes include algorithmic bias in facial detection algorithms and how that impacts marginalized communities (and why the EU’s AI Act bans facial detection in certain contexts like police enforcement). It also explores agency, self-determination, and biometric threats to freedom of thought, as you are scored on a spectrum between compliance with the state and deviant or resistant behaviors.
Overall, the multi-channel communication affordances of the phone create an entirely plausible portal into the near-future world that Palmer has built, which I found deeply, deeply immersive. It’s no wonder that this piece took home the top prize, as the SXSW jurors seemed to agree with that sentiment. The piece also presciently covers many relevant topics around AI, bias, and threats to our cognitive liberty. Notably, I spoke to Palmer on the same day that Nita Farahany’s book The Battle for Your Brain was officially released, a book I had a chance to unpack with Farahany ahead of SXSW.
One critical note that I’d like to make is that there are moments of unconscious agency that Palmer has coded into this piece that most audience members probably did not notice at all. The piece critiques the government’s use of algorithmic decision-making while at the same time experimenting with the very same type of algorithmic decision-making in a way that is fairly occluded from the end user. Days after SXSW ended, a paper was published on the idea of “visceral notice” for eye tracking, which could be provided to end users to give more transparency into when biometric tracking technologies like eye tracking are being used in XR experiences. That paper came out only after SXSW, and Consensus Gentium has been years in the making, but I’d like to encourage artists like Palmer and others to consider how to use immersive storytelling and art to provide paradigmatic examples of how to escape the dark patterns that are subtly undermining our agency and self-determination as we sleepwalk into a biometrically-driven future where AI and algorithms will shape our lives in more and more drastic ways. What this piece does is translate this very real potential future into a visceral and immersive story that has the real potential of bringing broader awareness to these vital issues around the future of our cognitive liberty.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing and the structures of immersive storytelling. You can support the podcast at patreon.com slash voicesofvr. So in today's episode, we're covering what ended up being the grand jury prize winner of South by Southwest. It's called Consensus Gentium, which translates roughly to the common understanding of all mankind. It's a phone-based, speculative, near-future piece. So you're sitting down in front of a phone, and you interact with the piece through your facial expressions or your eye gaze, which determine different things that are happening, either through your conscious awareness or sometimes your unconscious awareness of what you're looking at and what you're paying attention to. So there's some branching elements with three different endings. So it's a really timely piece that's covering a lot of themes around the future of these types of algorithmically driven technologies that are making inferences and decisions based upon our behaviors. So it's by Karen Palmer. She calls herself the storyteller from the future, and she's come back to enable us to survive what is to come through the power of storytelling. She calls these experiences that she creates reality simulators that transport you into a near-future reality. And so it's a lot of speculative sci-fi that is a little bit more near term. So she's taking inspiration from what's happening in China with the social scoring system, on top of the ability for us to move around or not move around, and the ways in which these algorithms sometimes make decisions against people of color through facial recognition, and what the impacts of that are.
And the piece actually uses facial recognition technology to allow you to engage and interact in this experience, and to have the experience look back and observe you as well. And so the thing about this piece that I think made it so compelling was that you're looking at it on a phone. And the phone has a lot of different existing apps and communication modalities, whether it's text messaging or FaceTime, or, you know, you're watching a video, you're opening up different apps. And so within the context of all those existing affordances that we have on the phone, it's mixing between all of these different things, and you'll go from opening up an app and watching a little video, to text messages coming in, to having a FaceTime with somebody where you see your own face. And so you're listening to these stories that are unfolding in front of you, and it feels like, through this phone, you're getting a portal into this speculative near future. So a couple of things. One is that it was deeply, deeply immersive. One of the most immersive stories that I've seen on the phone; even though it wasn't virtual or augmented reality, it was still deeply an immersive experience. And it's covering a lot of really timely topics. I happened to do this conversation on March 14, 2023, which was the release of Nita Farahany's The Battle for Your Brain, where she's arguing that we need a new human right of cognitive liberty. And so there's a lot of themes that repeat throughout the course of this piece in terms of our self-determination, our free will, and to what degree we have the freedom to think about things, or how much these technologies are increasingly encroaching on our digital lives and our experiences in physical reality. So there's this collapsing of the digital and the physical, and ways that some of these algorithmically driven decisions can bring real impact on our lives. And so those are some of the deeper themes that we dive into. Hopefully this piece will be made available for people to watch at some point. Karen described it as an early prototype, and there were some things that I could see they're still working through, maybe to get it ready for primetime to be out on all these different phones. But I do think this is an experience that could get out there and make a real impact, trying to connect the dots between some of these aspects of algorithmic bias and the future of AI, as well as our interface between these political uses of technology and how they are impacting our lives. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Karen happened on Tuesday, March 14th, 2023 at South by Southwest in Austin, Texas. So with that, let's go ahead and dive right in.
[00:04:04.580] Karen Palmer: Hi, my name is Karen Palmer and I'm the storyteller from the future, and I've come back to enable you to survive what is to come through the power of storytelling. So I create films that watch you back using artificial intelligence and facial recognition technologies. I call them reality simulators that transport you into a near-future reality, so that the people who experience it (you're not observers, you're participants in the narrative) are able to experience a near-future reality and then understand their role within it. My current project here at South by Southwest is Consensus Gentium, which is Latin for "if everyone believes it, it must be true," and is set in a near future of biased AI and surveillance. Within the narrative, you have an objective to visit your sick nana, and to do that you have to download a Global Citizen app within the storyscape of a film. Within that, you are going to be evaluated through your eye gaze and face detection to see if you are a more dissident or a more compliant citizen. That in turn will lead to different branching narratives at certain parts within the story, and there are also three potential endings for the story based upon your dissidence or compliance.
[00:05:28.502] Kent Bye: Wow, OK. And so maybe you could give a bit more context as to your background and your journey into this space.
[00:05:33.726] Karen Palmer: Yeah. So my background is that, well, way back in the day I was a fine artist. And then I moved into film. And I actually then moved into making documentaries, TV commercials, and music videos, because I like authenticity, but I also like things to be cool and cutting edge, and I like how that juxtaposed with music. And then maybe 15 years ago, I kind of felt, you know, the future of storytelling is film and tech. But I felt like a linear film, in the future, particularly with young people, is not going to be interactive enough. Back in the day, that would probably be called transmedia or interactivity. So about 15 years ago, I started to work with film and tech in particular, and then I made a series of films. One was part of a commission for the Olympics in 2012, as part of the Cultural Olympiad, and I've always kind of worked within film with a strong positive message. And then, as a black woman, I started to really... I mean, we've always been very aware and conscious, but I really started to have an aspect of social justice within the background and context of my work. It became very pervasive, particularly with the murder of Eric Garner, which really had a profound effect upon the kind of work I was making. So in 2015 I made a project called Riot that was kind of inspired by that experience, and the sense for me, at that time, of a huge responsibility with what was happening in America, because even though I'm black British, there's a connection with it. And that really impacted my work and what I wanted to do with my work. So I was thinking, you know, what is the role of the artist? What is the role of the storyteller? We have a responsibility beyond making these really cool type projects. So in 2015 I made Riot, and that was my first foray specifically into AI, and it came out in 2016 on a world premiere stage.
I started using artificial intelligence and facial detection technologies in a storytelling scape where it puts the participant in the middle of a riot, because it wasn't just what happened to Eric Garner but how black people were being portrayed in the media in terms of protesting. It was like we were looters, you know? That was the focus, as opposed to the focus on what had happened to Eric Garner. So that was the influence around Riot. So I made Riot in 2016, and in 2019 I made Perception.io. What happened between 2016 and 2019 is that I was based in America and I was doing a lot of work in AI. I was working with a company called ThoughtWorks, and I was their AI artist in residence. So in 2016 and 2017 I really went down the rabbit hole of bias in AI. And I feel America is very much at the cutting edge of this type of conversation about bias in AI. Like, people in England weren't really there yet; they were like, what are you talking about? And people like Ruha Benjamin, I was very aware of and went to her talks, and I was working with technologists to understand bias in AI, and how it's basically impossible not to have bias in AI. And I also worked with neuroscientists and behavioral psychologists. So that influenced my next project, Perception.io, which came out in 2019 and which was about the future of law enforcement, which would be biased AI, because the same biased, racist, prejudiced people and institutionalized racism are going to be creating these automated systems. So I would kind of watch in horror in 2020 when there were all the demonstrations and they were talking about, you know, we need to get rid of police. And I'm like, no, no. What are they going to replace them with? Automated systems of oppression. Like, you can kind of argue with a biased or racist cop, but you can't really argue with a biased or racist algorithm.
So I made this system called Perception.io, which was about the future of law enforcement, where, as an artist, I see my responsibility as making people aware of their role within it, and bringing down from up here these very intellectual concepts for people who are just trying to pay their bills, who may not even read the Times or really understand these things, to make it really accessible for them by putting them in a storytelling experience so they can experience it first hand for themselves. So with Perception.io, you are watching a film and you have the first-person perspective of being a cop, as if it's a training simulation, and you come into contact with a white person and a black person. I filmed different scenarios where the white person is either a criminal or has mental health issues, and the same with the black person, and you don't know which one you're going to get. You watch the film, and if the facial detection determines that you, as the cop, are angry with the person coming towards you, maybe the situation will escalate and you may kill them. If it deems that you are calm and rational, you'll be able to pick up the cues, and the branching narrative will get to the optimum result, which could be, if it's a criminal, you arrest them, or maybe you call for backup if it's someone who has mental health issues. And if you're scared, then as the cop you come out of it badly; maybe you would end up being shot by the criminal, because there are repercussions and consequences for all this decision-making in these highly volatile conflict situations. And Perception.io
was, as I said, made working with behavioural psychologists, and with Emil Blassets at NYU, who's an associate professor in biology there. I'm really using these experiences to make people very conscious and aware of their own potential implicit bias, as well as aware of how these systems are created. Because by putting people in these scenarios, they can experience it; they're experiential. They're not theoretical, they're not intellectual, they're not academic. And I feel, in the times we're living in, that to me is my responsibility as a storyteller. Which brings me to the present day, so Consensus Gentium. So Consensus Gentium, as I said, is Latin for "if everyone believes it, it must be true." This project took me three years to make. I'm so proud of it, I can't tell you. It's very emotional being here, because when you've been working on something for three years, and you've gone through all the ups and downs, and the points where most people would probably have given up, and you don't even have the strength to keep going but you're like, I'll just go to bed and wake up tomorrow, it's so fulfilling to have someone, as you did, acknowledge the many levels to it. So it was very important, first of all, that it's on a mobile device, because first of all I need to democratise art by bringing it to the masses. I need to bring it to the young people, to the demographic of people who are going to be affected most by these themes. I'm very honoured and privileged to be at such an esteemed place as South by Southwest, but the people I'm trying to reach do not have that privilege of being here, so it's imperative that it be on a mobile device, so that in the future we're able to create an app that can reach them directly. It was also imperative that when people do experience the story in the palm of their hand, they feel this kind of intimacy and this authenticity that you have only with your mobile phone.
So it had to be a vertical storytelling experience. So all those elements weren't arbitrary decisions; it was like, this is the audience I'm trying to reach, so this has to be the platform that I take. So the story is basically, as I said, the complications and repercussions and consequences of trying to just get somewhere as a regular person, which is to increase your mobility. I believe we're moving more towards a kind of controlled state on a global level. That's why I use the word global citizen often throughout my work. And you have this objective to get somewhere, you know, while you're constantly being evaluated through your eye gaze and your facial detection. This evaluation is constantly already out there. I kind of redeveloped the existing functionality on the iPhone with my developer from Immersify Studio, but used it to prove a point. It's like, this is here guys, this is on your phone when you go home. I'm using it within the storytelling experience to let you know that this already exists, but I'm using it to make you aware of the levels of manipulation that are going to be coming. And maybe already here. Most probably already here; they just haven't clicked the button in this country. You know, there are definitely influences here from the social credit system in China, with a slightly different twist. But yeah, I think I should actually let you ask a question.
[00:14:23.225] Kent Bye: Yeah, well, first of all, I want to say that I do think the format of the phone made it a really interesting, intimate experience that allowed me to imagine this potential near future, or present day, of kind of sleepwalking into a dystopia, from my perspective, that we're headed towards. And I love how you're going back and forth between the different video vignettes, and then the calls, and then the ability to have my face show up, but also with the text messages that are coming in. So there are layers of the story that are happening and unfolding, and it feels like the type of multi-channel communication environment that we already exist in, really leveraging this future scenario where you're in communication and contact with all these different characters in the story, but also on this journey of trying to go through all the different hoops to get authorized to be able to travel to see your Nana. But I also really appreciated all the aspects of the biometric surveillance, and the future where people are deciding to forgo different aspects of their privacy in order to have more autonomy in their movement, but not necessarily fully recognizing what they're giving up. And so I feel like it's a great visual metaphor, because a lot of the time that's how privacy works: we have a society where we're trading away our privacy, and on the other end we get, you know, free stuff. People are consenting to it, but they don't know what they're losing. And I feel like a piece like this really demonstrates what we're losing when it comes to our freedom to assemble, our freedom of thought, and all these different human rights that are slowly eroding away. So I feel like your piece is really at the heart of so many of the big issues of our time, and I really appreciated how you're really of the moment, exploring all these different issues in this format.
[00:16:07.540] Karen Palmer: Thank you. I've put so much in; I'm a research-based artist, so I do a lot of research into this, and I work with amazing people as part of my R&D, like techno-activists and anarchist authors. So I'm so excited. There are so many themes and influences in there, and I'm so happy that you were able to acknowledge all of it, because I can't overload people, so I have to find all these little subtle nuances or characters or themes or little commercials that go in that are really fully loaded with all these associations. So there's a part in there with the Fourth Industrial Revolution and the World Economic Forum, where they've talked about how you will own nothing and you will be happy. I've actually gone out there and looked at some of their films where they've actually said this. There's this film they have on the internet, and it's 15 minutes long, and about 12 minutes in, for 20 seconds, they say something very pivotal, where they say, well, you know, in the future when we access your thoughts, don't worry, we're going to create a little area where you can still have your divergent thoughts, and that will be for you. And then they just keep talking, and in the comments it'd be like, what the fuck? Did everybody hear that? And then all of a sudden the comments.
[00:17:19.064] Kent Bye: So this is an actual video that came out. Okay.
[00:17:21.145] Karen Palmer: A real, real, real video. And then I was reading all these comments like, what the fuck, what the fuck? And then when I went back, the comments were disabled. And it was buried in there. And I was like, oh my God, this is actually a PR video. I would almost think it was satire; like, if I put it in my piece, it would read as satire. So there's this kind of own nothing and be happy. It's like, someone's always going to fucking own something; it just won't be you. And then within my piece, I have an influencer. And I'm like, well, who would the influencers be sponsored by in the future? In my work, I think it would be the Global Citizen. It's this fusion between the state and tech, right? And maybe medical, pharma as well, but I didn't go there this time. And so, as an influencer, that would be your biggest sponsor. So I was like, yeah, they wouldn't have a commercial; they would have an influencer fronting it, you know? And I remember when I thought of that two years ago, I was like, oh, that's perfect. That's so perfect. That's what would happen. So what I do is that I immerse myself in what's out there, and I use my imagination and I project myself into the future. So I developed this concept and started on this journey three years ago, and it's out today, and now it's supersonically relevant. I'm kind of fortunate, because it could have come out and I could have been one year too late, behind the curve, you know?
So what I do is that I run these Hack the Future workshops, because, as you may have noticed, I love talking, but I also love listening to people. I'm fortunate that I've gone to quite a few of these events and I meet amazing people like yourself, and I ask them if they want to create these think-and-action tanks where we use storytelling as a tool for liberation. So there are people like Ruha Benjamin, as I said, techno-activists, Emilie Bloquet, who's a neuroscientist, people doing PhDs in police and surveillance systems in the Netherlands, the Waag in the Netherlands, a social and arts organization, the New Institute in the Netherlands, the Stop LAPD Spying Coalition, all these amazing people from around the world. And it's not speculative. It's like, what do we need to do today to reach the future we want to reach? And we did one of these events in the Netherlands, with people coming in remotely. And there was, like I said, Ruha Benjamin; I had techno-activists, all these academics, authors, institutes. And we were looking at, what is the answer to this? How are we going to prevent bias in tech? And one person was like, well, why do we want to help them make it better? Why do we want to even be invited to that table? We want to break that table in two, right? And it was like, well, whatever the question is, the answer is revolution. And all these people from all these organizations just nodded like, yeah, we're all in agreement. It's like with Marie Antoinette in the French Revolution: they cut off her head, but they kept the state, right? I mean, France is kind of different because they still have that spirit of, yeah, we can do it, but there's still a system there. And that really inspired me for the next piece, actually, because this one is about the democratization of AI and making people aware; they should be scared when they come out of this, like, what the fuck?
I want to put you in the middle of this experience, because my objective is to make you aware of your responsibility. If we continue down this current path, are you going to be compliant? Are you going to be dissident? Are you going to acquiesce? I want you to see your role. So you have three different narratives at the end: if you are compliant, you get a certain ending; if you are halfway, you get a certain ending; and if you are dissident, you get a certain ending. And if you're happy with that ending, that's fine. Then you know where you're going. If you're not happy, you can go into the experience again and change your reaction. It's kind of like reprogramming yourself, in a way, because I want people to understand that your eye gaze, not your thoughts but what's expressed in your face as a reflection of your thoughts, or the conscious or subconscious decisions you make with your eye gaze, determines the narrative, along with your emotions and your feelings. I want you to understand that in the exact same way, these thoughts, these emotions, these feelings determine the narrative of your life. So I want you to be aware of that. It's not just a film. You have a role to play. So the objective of this piece is basically to make you conscious of your subconscious behavior, enable you to move through fear, which is why I put the parkour in there, to be more focused and more aware of your critical and cognitive thinking, and then activate agency in you at a time when we've never been more scared or nervous about the future. So that's my agenda, through seducing you into this really cool piece of work. The same way that, you know, people selling advertising or social media platforms have an agenda, my agenda is to empower you, you know, by transporting you to the future and saying, you know, all the endings are screwed. We're all totally fucked either way.
If you decide that you want to fucking fight the system, that's a hard fight, but at least you know what you're up against, and at least you're like, you know, that's not a great ending, but I can live with that. If you decide to acquiesce and be compliant, you may be happy with that right now. I don't know how you're going to feel in the next six months to a year, you know, because to me that's a harder route to travel, right? So I'm not saying to you, oh yeah, this is one size fits all, or this is really easy, or it's good or bad. We're all screwed either way. We're living in that time, and as the storyteller from the future, I'm saying that we've come back now because we don't realize our power, you know? So I'm using storytelling, returning to the source of it as mythology, as the rites of passage, you know, to activate that agency in you, to shake you, you know, to say, look, what the fuck? What are you doing? What are we doing? You know? If you can't manage it, it's just a little 22-minute film, chill. But if you're ready, if something in here connects with you, or you're feeling the hair stand up on the back of your neck, like, yeah, you know, let's do something about this, you know? Because, as I say, I'm the storyteller from the future. I came back to enable people to survive what is to come through the power of storytelling. But I say I did not come back alone. So I've come back to reconnect with all the people that came back with me, because we've got some serious work to do.
[00:23:36.731] Kent Bye: Yeah, well, I mean, I'm seeing this piece today on March 14th, 2023, and today happens to be the launch date of Nita Farahany's The Battle for Your Brain, where she's talking about the need to establish a human right of cognitive liberty, which includes the right to self-determination, the right to freedom of thought, and the right to mental privacy, which covers our physiological and emotional reactions to things. So it's basically trying to establish a new human right in order to work through the system and have some things that are off the board of what's possible. I also just did a whole 10-episode series looking at XR privacy, talking to Daniel Leufer of Access Now. And in that, he's talking about the AI Act, which has three different tiers: unacceptable applications of artificial intelligence technologies, which are banned; high risk, which are the areas that need to have monitoring from the government and some sort of transparency reporting obligations; and then medium risk, which includes different aspects of biometric data right now. All of the AI Act is still going through the trilogue process, so it's not really settled, but at least in the European Union they have some awareness of trying to apply human rights principles to some of these technologies. Hopefully, eventually, that'll come to the United States, but the United States is like 5 to 10 to 20 years behind, and so I feel like a piece like this is at the forefront of showing how this is an urgent issue. And as I was watching it, I was reminded of the film Coded Bias, which I'm sure you're aware of. I saw it at Sundance 2020, with Joy Buolamwini and the Algorithmic Justice League and all the stuff that they're doing with that.
And I feel like what Daniel Leufer was saying is that there's a utilitarian approach, where a lot of people will say, oh, if we just do self-directed optional guidelines, then it works for 95% of the people. But for the 5% it doesn't work for, it is essentially amplifying injustices for the people who are already marginalized. He says we need this human rights approach that is able to address these different issues. And so I'm happy to see that there are at least some solutions in the European Union. But I feel like in the United States, we're just, again, kind of sleepwalking into this area. We're not really addressing this. And I think your piece is really highlighting the urgency of some of these issues.
[00:25:42.898] Karen Palmer: Wow, you said so many amazing things. I'm so excited. What you just said is, to me, the equivalent of going out on a Saturday night. I'm really excited. I don't know how to respond to all of that. Moving back from the last thing you said: it is so vital to bring the themes you just mentioned, which are so vital and complicated, to a mass audience, because they're not part of this narrative, and we need them to be, you know. That's why, when I have the Future Labs, I have academics, I have philosophers, I have scientists, I have techno-activists, you know, and authors, because it has to be interdisciplinary. I can't be like, oh, I'm an artist trying to do this. No, no, no, we all need to work together. I can create amazing stories, but the nuance of what you were just talking about, and the book that's coming out, is just like, my God, that's so amazing. I just need to absorb myself in that book. But I'd also love to invite that person to the Future Lab, because this is part one of a trilogy. This one is highlighting to everybody, shaking you: this is coming. So it's about democratizing AI. Part two is about the solution: okay, we've got to do it this way, guys. And part of that narrative is about decolonizing AI. Okay, so everything you said has made me just so excited, because these are like comrades in action. And I'm an artist; that is my expertise. And I can absorb and absorb, but the level of what you've just said, even just the terminology, that's different terminology than was being used a couple of years ago. And I just want to continue to create alliances with some of these individuals you've mentioned. It has to be a multi-pronged approach. And, as I said, my concern is that the conversation could stay in just one area, when the masses have to come on board to understand the complexity of what's happening here.
And it has to be brought down in a very rudimentary way, which is, I feel, my objective with this. And that's like my background: it's documentary, so it kind of feels a bit influenced, a bit real, but then it's kind of music video meets TV commercial, so, you know, their attention span's really short, so I'm not gonna lose them, you know? And it looks cool and it looks sexy. But the themes are really... like, you really got to the heart of it. Some other people might just leave with one or two thoughts, but, you know, they stayed to the end and they're still thinking about it, you know? Yeah, I have no idea how to respond to what you said. It was just so exciting. I just want to go and read the book.
[00:28:17.296] Kent Bye: Yeah, I guess one of the things that Nita Farahany talks about in The Battle for Your Brain is that we're on this cusp of the neurotechnology revolution where we're going to have earbuds that are able to read our EEG. And as we move forward, there are already companies that are monitoring people's brainwaves, but how it's being deployed within China in the context of like... [Venue announcement: Attention exhibitors. Please be aware that we are about to open the doors. The line currently has around 150 to 200 people in it. There will be a mad rush. Be prepared.] So one of the things that she's talking about is that it's already being deployed for workplace monitoring, but in places like China to do things like cognitive warfare, and also potentially in the future to start to test people's allegiance to the Communist Party and to actually see to what degree they are compliant and in alignment with what the rules are within China. And so she's explicitly making these connections that you, as an artist, are implicitly also tuning into with the same type of dissident nature in your piece. So I'd love for you to expand on that.
[00:29:25.611] Karen Palmer: Oh my god, you just get me so excited. Okay, expand on compliance and dissidence. So I was doing quite a bit of research in terms of the psychology of the individual versus the group mentality, and how the group is kind of stupid in a way, because the more people you have in the group, the simpler the message has to be, and the more that you can be swayed within the dynamics of a group. And groups are more swayed by emotionality than logic, and this kind of, you know, concept of the sheeple or mass versus the individual. So I did a lot of research over the time of my R&D in terms of that, and how media is so good at manipulation, like people like Edward Bernays, who's like the godfather of advertising, and understanding the manipulation tactics used within advertising and media and other formats today as well. So that's why I'm putting you... you're not an observer in the experience, you are a participant, right? Because it's down to you. It's not like, oh, it's going to be other people or whatever. It's always somebody else's fault in life, right? Why didn't they do this or they shouldn't have done that? I'm saying the onus is on you as the participant. So this kind of dissidence or compliance, I'm putting the spotlight on you, like, which are you? Because it's going to come down to that. And it's not going to come down to the person to your left or the right of you. It's going to come down to you. So I'm putting a very soft spotlight on you now to see how you feel about that in an unconfrontational way, in the format of a story that's going to start to stimulate that part of you and see how far you're going to go to be compliant or get what you want or what you think you want.
This whole thing is just designed to make you conscious of your subconscious behavior so that you can then start to become more self-aware, and, you know, just see if you're happy with who you're looking at in the mirror. And it's just a film now, but maybe something else will come up in the future where you'll be like, hmm... get you thinking. You know, I really need this to come out of this storyscape into real life. So it's very much a neurological, associative, behavioral, very specific impact. So when they talk about VR, right? Well, you're the expert, you're the VR king. You know, children can't tell the difference between VR and reality in terms of their memories. It's like what we're doing with these types of XR, AR, MR, AI experiences: we are basically messing around in your head to some level and some degree as storytellers. And I don't know if many of us are aware of the levels of that happening. I'm a storyteller first, I'm not a scientist or technologist first, but as a storyteller I'm very deliberate and intentional in the impact I'm trying to have on you. But I am one of the good guys, so I'm doing that for good. So I understand from a manipulation point of view. I fully understand from a technological point of view. And it's not just like, oh, I want to manipulate your emotions. I want to empower you. I want to make you self-aware. I'm very, very, very deliberate in what I'm doing here, you know, really deliberate, and I've got a responsibility, and it's going to constantly be fine-tuned and fine-tuned and fine-tuned until it's... I don't know. They're making soldiers of people, they're making consumers of people; I'm kind of freeing you from that. So when I first started doing this work I had this analogy. When I used to work in music videos, one time I actually went out and saw someone wearing the style of fashion that I did with the stylist. I was like, oh my God.
And they were like, yeah, they started to kind of create a look. And I was like, wow, this is really big. This is a big responsibility. And I was like, you know what? I don't want people to kind of be influenced like that. It's like projecting back on them an image or an ideology or a thought or a belief, and then you become that thing. I kind of want the film to be a feedback loop, so that you're like, OK, if I do this thing, I'm this person. That's the impact. Oh, I don't like that impact. Oh, shit. OK, let me try something else. Oh, OK. I like that impact. I'm not projecting onto you. I'm kind of enabling you to expand yourself. And yeah, it's really heavy out here, and it's just warming up, right? Things haven't even started; the hors d'oeuvres haven't even come out yet, the starters haven't come out yet, they're just preparing it in the kitchen. And I'm just like, guys, can you just please wake up? It's not about that person or this person or this situation or that, it's about you. I need you to start to activate your full potential, or at least be aware of what's going on in your life, so you can't say, I didn't know. You know? So that's like with my work, at the least, I'm like, you know now. So back to the dissidence and compliance. That's what it's about. You know, I feel most people are just on the fence. They can breeze through life like that. They don't have to. But maybe you've done that up to now, and there's going to come a point where you can't do that anymore. And it's come in my life with this project, where I had to make certain moral decisions that people would be like, you know, you shouldn't do that. And I'm like, well, you know, I know I shouldn't do that, but based upon my integrity, I don't have a choice. So I've got to now put my money literally where my mouth is in terms of some decisions I'm going to start to make in life.
And before, I could just talk, but in the times we're going into, you really have to start to live it, you know. And this is kind of shaping you to be prepared for a future that you are happy living in, and not just for yourself as an individual, but for your extended family and community as well.
[00:35:12.032] Kent Bye: Yeah, I think it's really landing when it comes to achieving that. And, you know, one of the things that you were just saying in terms of persuasion and manipulation, this is a section that Nita Farahany is exploring in her book, because a subsection of cognitive liberty is self-determination, which philosophically is the idea of free will, but it's also agency. And from a legal perspective there's different aspects, like informed consent and informing people through disclosure. In the AI Act, there's certain obligations that say, if you're talking to an AI through an AI voice, it has to be disclosed to you that you're talking to an AI and not to someone who you think is a human. And so there's certain transparency aspects that are going to be included. And so I think we're still on the forefront of where this is going to go. But even in pieces like this, as a storyteller, I mean, the line between, say, persuasion, which is a part of freedom of thought, and manipulation: what is the line where you violate someone's right to self-determination? When does it flip over into manipulation, and when is it just normal rhetoric and persuasion that, as a storyteller, you're doing in a more ethically aligned way? What is that threshold? Back in 2017, when I was at a neurogaming conference talking to a behavioral neuroscientist, he was saying that, as a behavioral neuroscientist, the line between persuasion and manipulation, when it comes to the neuroscience, is that it's actually using the same research. So how do you know what is unethical manipulation versus persuasion? So in a piece like this, you're making choices, and sometimes you're looking explicitly, but sometimes you may be making unconscious choices.
So maybe you could elaborate on the different ways that people are expressing agency in this piece, through eye gaze or through emotions or other things. Sometimes I was aware of making a choice and other times I wasn't, or maybe I wasn't fully aware if there was something I was doing that was happening behind the scenes. So I'd love for you to maybe elaborate on all the different degrees of agency being expressed in this piece.
[00:37:01.678] Karen Palmer: Oh, wow. As you said something in the first part, I was thinking something, but it's gone. I was going to say something. I guess I want to say first, just as we're going along, I'm really enjoying this interview because, yeah, it gets to the meat and bones. And this is an amazing conference, but it's more like the high-end commercial side, and the themes you're talking about, they're not niche, but they're not normally part of much of the narrative here. And I'm just so thrilled, because I haven't been able to have these conversations. This is what has inspired the piece, you know, and you're getting into the real nitty gritty of it. So just to be acknowledged, it feels so good, because I haven't had these types of conversations here. I've had some other great ones, but this is exactly why I made the piece. So agency and autonomy. It's so interesting, because you kind of develop these approaches and this terminology, and I was talking to not just behavioral psychologists and neuroscientists, but other types of psychologists. And you come to these terms and you're like, oh, yeah, you're talking about Farahany. That's the terminology she has. You're like, OK, cool. We're so aligned. We've kind of distilled it down to the same word. Right. So conscious and subconscious. So this is a high-level prototype. It's not even quite finished yet.
So there are some things which are conceptual that I'm still intrigued by in terms of what we're getting. So there are parts in there where you're asked to make a decision consciously and look at a certain part of the screen, which will determine a certain narrative. You're making a choice, like if you want to be part of auto self-surveillance, which is a concept that I came up with, that you can buy into auto-surveillance for more privilege, more mobility, more benefits and freedoms from the state, right? And you can consciously make the decision with your eye gaze. There's a part in the film where it's more subconscious, and it's still not as clear-cut as I thought. So when I was working with Emily Balcetis from NYU, she was working with eye gaze and measuring bias. And her test, when she brought participants in, was that they would watch a film with someone being apprehended by the cops. And if you spent more time focused on the person being apprehended, that reflected your empathy with them. And that's what I would do when I watched a film, and she tested me. And I'd be going, oh, God, because I'd be imagining what it's feeling like to be that person being arrested. If you spent more time focused on the cops and what they were doing, you felt more empathy with them. That's why often when people are looking at these images, particularly of different races, and they're coming to these different conclusions, they're basically looking at two different parts of the screen. So in that time when I was working with her, like before COVID, in 2020, my objective at the time, this was to do with Perception iO, was to get people to look at different parts of the screen so I could then expand their perception of reality. So that was kind of a test with that piece that was kind of inconclusive, because there's so many triggers. With this one, I took that directly into this example.
But when I tested it on people, it wasn't as conclusive, because I don't know if the format of the phone made it different, because it made it more real. And when I was testing, like, on my nieces... So the original testing was: if you looked at the black man being apprehended, you're more empathetic with him, and that means you're more dissident. If you spent more time looking at the white cop doing it, then that shows that you are more compliant and more empathetic with him. When I did the original user testing, one of my nieces was doing the user testing, and she's a young black girl, like 21, 22, and it branched, and I said, where were you looking? She goes, I was looking at the cop because I wanted to take in as much information as possible in case I needed to be a witness. And I was like, fucking hell, this is not what Emily Balcetis got in her research. But that was a controlled environment in a certain time. She was doing her research maybe from 2015, 2016, up to 2020, and the climate's changed, I feel, a little bit in terms of social issues and more public cases coming out. So people's sense of responsibility is different. So we were like, oh shit. And we kind of hadn't taken that into account so much. We were like, if you look a bit more up there, then... as opposed to it being as cut and dried as we had it before. So we kind of took it into account, but it wasn't as conclusive as we thought. So when you're doing things like this, and you're doing storytelling and you're doing science, it's new territory. Because when I do what I do, it's not like I can Google storytelling, branching narrative, bias, facial detection; it's a whole new terrain as far as I'm aware. When I first made Riot in 2016, as far as I knew, I was the first artist using AI, definitely with facial recognition technology, 100%; there weren't even many using AI compared to now.
It's kind of like, you work with a scientist and they say, this is it, and then as a storyteller you say, it's actually not as absolute, because you're in a lab giving me that. So then we're kind of figuring it out together. So, to answer your question, that's how I'm doing the dissidence and compliance from a scientific perspective, but when you're putting it on the phone and it becomes more authentic, there's other ramifications to bear in mind. So this is a continuation of experimentation, where we can go back and evaluate it to know how putting it on a mobile device with eye gaze is going to be slightly different from a controlled environment, in terms of whether you are more dissident or more compliant. Or do we have to create a whole new set of testing, which I think will be the case in the next part, to really understand it. Even, like, when you're out and people may video something if something's happening, not if it's to do with race, and they don't want to get involved, you know, rather than somebody who would run up and help. That shows, I think, a lack of empathy, or maybe it's fear or whatever. There's this new kind of dynamic around eye gaze and agency and a mobile device. It's very unique, you know. So I've kind of gone off a little bit, but I think you understand: I have a barometer through science and tech, but it's not as clear-cut as I had imagined it was going to be.
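The dwell-time mechanic described here can be sketched in a few lines of code. This is a hypothetical illustration of gaze-dwell branching, not Palmer's actual implementation: the region names, frame rate, decision margin, and branch labels are all assumptions for the sake of the example.

```python
# Hypothetical sketch of a dwell-time gaze branching mechanic: the viewer's
# gaze is sampled once per video frame, dwell time is accumulated per
# on-screen region, and the dominant region picks the narrative branch.
# Region names, frame rate, and margin are illustrative assumptions.

from collections import defaultdict

def pick_branch(gaze_samples, frame_dt=1 / 30, margin=0.1):
    """gaze_samples: iterable of region labels, one per video frame.
    Returns 'dissident' if the viewer dwelt mostly on the person being
    apprehended ('subject'), 'compliant' if mostly on the officer, and
    'undecided' for the inconclusive middle ground the interview mentions."""
    dwell = defaultdict(float)
    for region in gaze_samples:
        dwell[region] += frame_dt  # seconds spent looking at each region
    total = sum(dwell.values()) or 1.0  # avoid division by zero
    subject_share = dwell["subject"] / total
    officer_share = dwell["officer"] / total
    if subject_share - officer_share > margin:
        return "dissident"
    if officer_share - subject_share > margin:
        return "compliant"
    return "undecided"

# Example: 60 frames dwelling on the subject, 30 on the officer
print(pick_branch(["subject"] * 60 + ["officer"] * 30))  # → dissident
```

As the interview notes, real viewers defy such a simple mapping (her niece watched the cop in order to witness, not out of compliance), which is exactly why a fixed threshold like this proved inconclusive in testing.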
[00:43:11.953] Kent Bye: Yeah, I feel like in some ways you're addressing and critiquing the algorithmic decisions, but you're also implementing them in some ways in the storytelling. And so both the companies and yourself as a storyteller are experimenting with the same types of ideas. And I think what the European Union is trying to do with things like the AI Act is, as we move into this more algorithmically driven future, what kind of transparency or auditing capabilities are we going to need for these different types of systems that are making these types of judgments about us? And you're doing the same in terms of, like, how are you going to drive the narrative? But all these companies are making decisions as to whether or not people are going to get insurance or whatnot. So we're already in an algorithmically driven society. And then what the Algorithmic Justice League is trying to do is bring awareness to these broader issues. And I think the AI Act is trying to at least create these tiers of harm, saying, OK, here's where this can do the most harm, so we should just get rid of these. Now, here are ways that you need to report to the government to show that you're not bringing harm. And so I feel like we're at the very early phases of the regulatory bodies, and then enforcement is going to be a whole other aspect. But from a storytelling perspective, you're trying to identify these potential harms and give people a direct experience of that.
[00:44:22.609] Karen Palmer: Yes, yes. Okay, so AI governance, regulation, and policy. This was a big part of the initial conversation at the Future Labs, where some of the people, like from the Stop LAPD Spying Coalition, were like, we're just kind of going to enable them to police us even more through their systems. And so I also don't know if that is the right answer, but it is a relevant answer. But also, through doing this work, I'm kind of bringing people's awareness to the need for that. But also I'm trying, as an artist, to discover the right question. Because how can I get the right answer if I don't know the right question? Yes, we do need to have more transparency, more regulation, more governance, but of systems which we don't control. So we're kind of tweaking them, making them a little better at controlling us or influencing or interacting with us. There's other questions to be asked around that. So my part two is going to be almost, what are the questions we should be asking? And we're all going to have different questions to ask, but as a creative, I'm just trying to figure out the right questions. I did a talk in Germany a few years ago, and then I went back, and I'm also a futurist as well, but through the lens of a black woman. And I find that traditionally, a lot of futurists tend to be more affluent white males, and I've sat on many panels with them saying how our future's gonna be so bright and golden. I'm like, maybe for you and your friends. Most people I know probably won't have a job, and they're going to be victims of automated bias. Because when I made Perception iO about the future of law enforcement in 2019, in 2020 there was actually the first case of a black man, Robert Williams, I believe in Detroit, who was arrested and apprehended because an algorithm misidentified him. That was in the piece as well. So that's the lens where I'm kind of imagining this future.
And when I was in Germany, I did a talk there, I think 2017, 2018. I remember the following year, this young lady said, oh my god, I've got to tell you what happened. I said, what happened? She goes, well, she was on an online chat to do with her insurance, and she's having this really lovely conversation with someone. And then she said something a bit sarcastic, and then there was like a gap, a breakdown in the communication. And she was talking, talking, and she says something sarcastic again, there was a breakdown, and she goes, oh my god, I'm chatting with an algorithm, and it didn't understand the nuance of sarcasm. Right, right? So when you're talking about the transparency of just declaring this is a bot or something: yeah, absolutely, 100%, because it's a whole new world. I know it sounds cheesy, but there's so many different conversations and discussions and things to be looked at as we move into this world, which is just being created and constructed for us under the guise of efficiency, speed, and our safety. It's all being sold to us in that way. But they're not telling you that if you have a Ring, and if everybody on the street has a Ring, there's like an emergency law where the police can actually take control of those Rings and turn them into their own personal surveillance cameras. They don't tell you these things because, you know, it's sold to you on, you know, it's for your safety. But really most of this is about security. As a futurist, I talk about how smart is really another word for security. It's not another word for safety, it's security. So there's so many conversations to be had around this, from so many different perspectives, and that's why I am really into collaboration and partnership. Like, I love to do my shit, you can see this is me, but I haven't got the whole picture, you know?
And it's like, none of us have the whole picture, but if we come together, we're like, oh shit, you realize that, yeah, that that's happening over there. Oh, that's happening there. Oh, that happened six months ago. Oh, that's planned for two months. It's like, then we can kind of maybe have a chance. this you know. So that's why I say connect with those that came back with me because that's our only chance. Yeah I actually can't remember what you asked.
[00:48:20.418] Kent Bye: Well, I think it's a conversation. I'm throwing a lot of associative links, you're having a lot of associative links, and I think we're getting at the landscape of what you're covering here, which is really at the forefront of where our society is at and where it's going. And it's a cautionary tale that is meant to, as you walk away, give you this sense of unease, and then that unease gets transformed into action, rather than despair and nihilism, like, we're all screwed and there's nothing that we can do. So, you mentioned that you had been in conversation with anarchist thinkers, but also with the process of decolonization. There have been abolitionist movements, especially with the AI Act, which has been eliminating certain implementations of facial recognition technology, for police and for identifying people as an example, because of the harms that can happen to people who are misidentified and get sent off to jail. If it's 95% effective and it's sending 5% of people to jail as false positives, then the harm that happens to those people is disproportionate enough to say we just need to eliminate this and have an abolitionist movement. So there have been different abolitionist movements, but you're talking to a lot of different people, a lot of different interdisciplinary intersections in your work and aspects of decolonization. So, yeah, I'd love to hear what those intersections are for you.
[00:49:31.567] Karen Palmer: Yeah, so democratizing AI, like democratizing the arts and bringing the arts to the masses, taking them out of amazing spaces like this and bringing them to the people, particularly young people, particularly people of color, women, working-class people. And I've got a thing in my heart particularly for hard-to-reach young people, disadvantaged young people in particular. Democratizing AI is a big part of my work as well. Normally we're talking about it with this one, but with my previous work, we made the AI available open source, because we just really want to put it in the hands of the people, because we believe that what we will be doing with it would be slightly different to what's being done at the moment. And then we're building on that further for the second part of this one. So Consensus Gentium Part 1 says there's a problem. Part 2 is presenting you with one solution. Part 2 is called Source Code, where, as part of the research and at the Future Labs, we looked at the origins and basis of most forms of technology, which is something that has developed largely in the West, or at least from white men, and a lot of that is based upon colonization. So, as part of part two, we're going to be... this is really exclusive, I haven't told anybody, particularly in the public realm, just for you because you're awesome. And all my podcast listeners. Oh, but they won't tell anyone. They're part of the resistance. Or they'll just live it. They won't talk about it. I'm working with Kambale, who's an activist in the Congo, and we're going to be going to Ghana. And we've kind of... well, if the current systems are built on systems of oppression, we can kind of see that, oh, this is why the inevitability of how it's being used is more about surveillance and an extension of that through the tech.
If we as people of color, or minorities, or just people with a different holistic perspective of the world, were to start to build code from the bottom up, what would that look like? What would that be? So we have this kind of... it's not even speculative. We're going to go to Africa next year, or this year, to look at how we would build code, because from that we would have a different ecosystem with a different agenda for us. So he's already been looking into that, and he's like a very high-level, experienced activist and techno-activist, and we're going to go and take that narrative as far as we can go, and see how that will influence part two. See how that process works, working with someone who's trying to build code, to connect with our indigenous ancestral roots as an influence for creating code which is going to empower us within the present day, and what that code would look like, and look at what we would want to build through technology. What would be useful to us as human beings? What would we even want it to be useful for? Because right now we kind of have this efficiency, and speed seems to be, whether it's ChatGPT or something, it's going to make everything so much easier for you, better for you, you know; as a human, it's there to serve you. You know, is that what we would want technology for, servitude? Would we want to find a way to kind of amplify our humanity? Who knows? Because we haven't ever had these conversations. We've just kind of said, oh, yeah, iPhone, this is cool. We've never really looked at it; we've just been able to kind of appreciate someone else's vision. But if we as people of color were to say, well, if we were to build our tech from scratch, what would we really want technology for? What would be the purpose of technology as an extension of our existence? So that's the next step in part two, using storytelling. I mean, what is storytelling?
It's like, you know, everything around us was once an idea that someone made a story that, you know, I want to have a microphone or I want to build a dustbin. That's what storytelling is. People don't realize the power in it. Like everything around you is a story that someone's told someone else that they're going to do or in your life you've told yourself that you're going to do or not do. So I want, as I said, using storytelling as a strategy of liberation to build these new systems. Imagine then, once you've imagined them, as I said, they're thinking action tanks. Once we've imagined them, how do we then build them? You know, that's what I'm here as a storyteller from the future to do. You know, let's get this shit done, you know, and it's so hard. So let's just use our imagination. It's the power of the imagination to do that. It's the next stage, part two to build that tech. As outlandish as it might sound, why fucking not? Why not start there?
[00:54:07.213] Kent Bye: Yeah, there's a guy that was at the Stanford Cyber Policy Center, an indigenous man named Michael Running Wolf, who's working on indigenous AI in partnership with his wife, Caroline Running Wolf. They're trying to create immersive virtual reality experiences to document the cultural heritage of the languages of first peoples, to be able to use AI, because AI doesn't normally get trained on indigenous languages. And so he's doing a lot of work that I think might be resonant with some of the perspectives of that decolonized indigenous perspective on AI. So he's here at South by Southwest on a number of different panels. And so, yeah, a good connection to make, because I think there's a lot of overlap. But I want to just ask you quickly about world building. I know Alex McDowell has a whole process of world building, bringing together all these different experts and having these intersections of expertise, and looking at specific places and times that he sets, and then what happens if we project ourselves into this place at this time. And so I'd love to hear about your own process of world building.
[00:55:02.598] Karen Palmer: Thank you for asking that. It's so nice to talk about arty shit, you know? Ah, what a nice question. So I have this concept of world building for world building, that I'm creating this world because this world is going to ultimately impact the world around us, in the same way that Philip K. Dick created his Minority Report and influenced many different kinds of hand gestures with technology. And I think with the film Minority Report, I think like 15 or 21 patents came out.
[00:55:30.268] Kent Bye: It was like over 100, yeah.
[00:55:31.808] Karen Palmer: 100, even more, right. And that's one of Alex McDowell's amazing examples. And I met him actually at an event in London, a talk on the future of film we were both involved in, and I was talking to him about his world-building approach, and how you stand on the street corner, because the street corner is a really universal way to kind of interact with the world. So in terms of my world building... and it's funny, I just did a masterclass on futurism for a university in Brazil a few days ago. I try and just create my own strategies and techniques, because it's so easy to kind of jump on someone else's. So I just try and do what feels natural to me. So I kind of tend to absorb a lot of information and then just not do anything. Like, I haven't really looked at or read much for like a year. I wanted to kind of stay in my own head and not be overly influenced by things, to really let my imagination just guide me on it, and just imagine: how do I want the participant to feel? Right? So that would be my guide. And then that was like a whole thing. How do I want them to feel? And it took me maybe nine months to say, I want to activate agency in them, you know? And then I came up with this terminology, because in the arts they talk about impact, impact. And I was like, well, what type of impact? And I was like, affirmative impact, which is a very specific type of action impact. So I was like, OK, if that's how I want them to feel... The world that I build is kind of like a near future, like three to five years out. And with COVID going on at the time, I was in development with this in 2020, and a lot of the ideas that I was speculating about were, like, here. And I did a panel talk called, Is Fear the Most Dangerous Pandemic? And I did that within the first couple of months of COVID, and I brought someone from the military, a high-ranking general. I then brought an elite parkour trainer. I brought a naturopath.
I brought an associate behavioral psychologist. I brought a conflict negotiator. All these different people live at this intersection of fear, you know. I always try to be two to three years ahead. So hold on, let me just answer your question about world building. I try to keep myself in the zone of feeling. I'm a very feeling, imaginative person. I do these workshops and I just go away and percolate and then imagine these future worlds, based on things like when they said the answer is revolution. That stuck with me for like two years, and I knew it was going to be the basis for part two. But you have to prepare for revolution now. If there's a revolution tomorrow and we haven't got a system to replace it, then you're just going to use what exists at that time to put a band-aid over it. So you have to be ready: spend the first two years preparing the paradigm, which is like a belief system, that could then replace the existing system. It's quite abstract. And then I go away and think, OK, well, how could I clothe that? What would that look like? So I've gone a little bit into part two and its world building. But with this one, it was very much about living in the phone. That was the number one priority: it has to be in the world of a mobile device. And it sounds so simple, right? But it's not a common form of storytelling now, and definitely not with the facial detection technology. I believe in the next two to three years you're going to be seeing loads of vertical storytelling experiences after this. So it had to be within that world, but slightly in the future. And then other aspects of world building were just being aware of what's happening with the government, what I'm seeing. So I see the Tory government in the UK doing a photo shoot with Bill Gates.
I'm just like... Marshall, hold on, let me get his name right, it's something Marshall, he's a Canadian philosopher. Marshall McLuhan? Yes. And he had this idea called pattern recognition: artists, he said, are greater than scientists because they have to come up with original thoughts, and they have to have this thing called pattern recognition. So if I see Bill Gates doing a photo shoot with Matt Hancock of the Tory party, maybe in 2020 or 2021 around COVID, I'm like, yep, tech and the government in bed together. So it's more like connecting the dots around me and being hyper aware of this reality, always being switched on. And then from that kind of unequivocal point, that trajectory will send me on: OK, what will that look like in two to three years' time? I've got my imagination loaded up and ready to go; I'm just looking for that point to send me there. And it can't even be something which someone's told me. It has to be something which I've seen, because I'm a very critical, cognitive person and I question everything, and I hate to do what I'm told. I really hate it. You know, if someone says something, I'm like, well, why? OK, that makes sense, but I'm very much born to be anti-establishment. So I like to figure things out and then go, ooh, I don't feel comfortable about that. What would that look like in three years' time, four years' time? It would look like the character in this film, this symbiosis of the state and technology, which is where we're going, ultimately, you know? So that pattern recognition, I feel, and understanding how my brain works individually, because as an artist, well, as anything, it's really easy to hijack someone else's process, but you have to have the confidence to develop your own. And through years of honing my craft as an artist, much of that is honing the confidence to do it your way.
So when people look at this, they're like, wow, I've never seen anything like it before. I might love Alex McDowell and watch a couple of films about him, but I'm not going to... I'm going to appreciate what he's doing, but then I'm going to go away and make my own methodology, because that works for him, the type of person he is and the world he works in. He's the best in the world at what he does. For what I want to do, I need to create my own process, my own methodology. Otherwise, I'm just going to be a weaker version of him. You can't take his techniques and apply them to what you're doing; you have to stand firm in you. So I would say pattern recognition, being aware, talking to people, understanding, but much of that was more indirect. They helped me be hyper aware of this world, to see more patterns. So when I saw things, I'd be like, yep, this is a real person. This is a real person in the future. And that person, I have no doubt, is going to exist. And oftentimes, when I create these works two to three years out, I start to see these signs around the world. Even the term global citizen, I was using that, writing that up, and then I saw, I think, Prince Harry use it on some stage. And I was like, oh, God, yeah, because that's this kind of eradication of countries as we move towards states. You won't be a British citizen; you're going to be a global citizen. So I would say it's about gradually becoming more and more hyper aware. Wow, that's a really long answer. I got there in the end, right?
[01:02:59.687] Kent Bye: Yeah. Yeah, I think starting with the phone, there's a certain immersion, and it's plausible because we're used to the phone's communication modalities. You're using all those things but setting them in a near future that is also very plausible, and I feel like that creates this very immersive experience, this dream state into the future. But yeah, just to start to wrap things up, I'd love to hear what you think the ultimate potential of this type of immersive storytelling and world building might be and what it might be able to enable.
[01:03:28.425] Karen Palmer: You mean of my work?
[01:03:29.305] Kent Bye: Yeah, your work or just in general?
[01:03:32.327] Karen Palmer: I think, um... I don't know how I ended up here, I gotta tell you, in terms of immersive. Like, I started off as a fine artist, and then I moved into sculpture, and I wanted to take photographs, and I wanted my pictures to walk and talk, so I made film and video. And film is like when you go to a church, to worship, you know? No one can talk in the cinema. It's reverence. That's why film has stars but TV has celebrities; it defines a different type of persona, almost a regality. And that homage to film is powerful. As a filmmaker, I was like, yeah, that's amazing, but the times are changing, and this thing in the palm of our hand has a different type of power, right? For me personally, it's about connecting with the masses with a very particular agenda and objective, because this is basically mind influencing, and the more we combine tech with film, the more we're getting into influencing people, and programming, manipulating, persuasion, whatever it is. And I'm aware of what that is. I'm conscious, I'm deliberate. I have a very specific agenda: I'm for us, we the people. And that's fine. Within immersive, XR, VR, everything, I feel it's very easy for it to be escapism, for good, not so good, or something in the middle. Who's to say what's good or not so good, right? It's probably going to be like the internet, where a lot of the advancements were on the back of porn, for example. I'm sure that's going to happen with this kind of media as well; that's the industry that wants to invest and make the advancements. So it's impossible to even imagine the level of influence it's going to have on us. And different people are going to have different agendas and objectives and different ways.
And this is really going to become our new world. But my focus is: you go in there to come back. You go into that world to come back and make this world a better world, your world, not to check out for as long as possible and become a new person and then not really want to be in this world much. I don't think I was born for that. I don't think a lot of people were born for that. It's going to be something for everybody, you know, and that's good. But I have an agenda, an objective, and these new worlds are where we're going to reside, soon most of the time. So I'm just urging people: don't forget this world. This is the real world. That world might be quite cool and cozy and surreal and visceral, but I'm just here to remind you that it's this world, this planet Earth, these people around you, where you can create a beautiful world as well, as much as you can in that other world, and have that humanity, that human connection. Instead of a really cool ChatGPT doing all your work for you on the side, you can balance everything. It's not all bad, right? But I feel it's going to become a new frontier of not just storytelling but living. And that's why I made this. This is like the living space of the future, where you can check in for 24 hours and just come out. This is how people will interact; they won't be interacting with each other, they'll just be around some column drawing them in. That's the digital life. So something for everybody. And that's how it should be. But just don't lose sight of what you're here to do.
[01:07:11.687] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the broader immersive community?
[01:07:15.123] Karen Palmer: Generous. The immersive community is a powerful community. I don't even know how I ended up here, seriously. I just kept going on this course, and you know, a few years ago it was interactive, and before that it was transmedia. It just seems to have a lot of power in its hands, you know? And it's a weird industry, because from the outside this looks so amazing, but there are people I've spoken to here who are like, yeah, I've maxed out my credit cards just to be here, you know? People don't come into it because it's the most lucrative thing they could be doing, but people can swoop in from organisations and companies, and there's so much money to be made here. So my point is a lot of us are here through passion, and I'm also in it because I want to make some good money too. So what I'd say to people is just that: it's for us to continue to find a way through that's going to benefit us as individuals and as a community, whether that way through is financially lucrative or personally satisfying. I know for lots of people it's hard in this community, and it's been hard for me. I've been at home crying, like, how am I going to do this thing tomorrow? Just get a good night's sleep and get up tomorrow. Whatever helps you, you know, I like green juice, whether you like weed or going to talk to someone, just keep going. Just get your story out there. And hopefully I'll meet you somewhere with a bit of money in your pocket at an event like this so we can share stories together. Yeah.
[01:08:43.312] Kent Bye: Awesome. I was really excited to see your piece, and it was so much a pleasure to break it all down with you. So thank you for joining me today.
[01:08:49.709] Karen Palmer: Thank you so much. I mean, people have walked past like, hey, I need to talk to you, and I'm like, I don't care. This has been one of my best conversations, and I've had some great conversations, but I really, really enjoyed this conversation so much with you. Particularly, as I said, that you've gone straight to the exact themes that are really important to me. So I've really enjoyed this conversation. I can't wait to listen to all your other podcasts and really get into it, because you're really on the money. So thank you so much for taking the time to speak to me today and being generous with your time. Thank you.
[01:09:19.067] Kent Bye: Yeah, thank you. So that was Karen Palmer. She calls herself the storyteller from the future, and she's come back to enable us to survive what is to come through the power of storytelling. She had a piece at South by Southwest called Consensus Gentium, which ended up winning the grand jury prize out of all the different immersive experiences that were in competition this year at South by Southwest Immersive. So I have a number of takeaways from this interview. First of all, the topics and themes of this piece are extremely relevant and timely. I feel like it does a really great job of translating these theoretical or abstract ideas into a near speculative future that connects you into the story of what it would mean to have a social credit score in the context of the United States, where your mobility is being impacted by your behaviors, and to increase your mobility you have to have this surveillance technology on your phone, which ends up doing a lot of really creepy things. I mean, there were some moments in there, like starting to see your text messages disappear, and just the immersive quality of understanding the ways that we communicate on our phones with the existing affordances, and all the sound effects and everything, how all that mashes up together. It gave this sense of plausibility: you could imagine yourself looking into this portal into the other world, because our phones are these portals into these alternative realities.
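As an aside, the compliance-score mechanic described here can be sketched as a toy model. Everything below is a hypothetical illustration; the function names, weights, and tier cutoffs are my own assumptions, not the piece's actual logic:

```python
# Hypothetical sketch of a compliance-vs-resistance "citizen score" of the
# kind the piece depicts. All names, weights, and thresholds are illustrative
# assumptions, not details of Consensus Gentium's real implementation.

def update_score(score, action, weights=None):
    """Nudge a 0..100 score up for compliant actions and down for resistant
    ones, clamped to the valid range."""
    weights = weights or {"comply": +5, "ignore": -1, "resist": -10}
    return max(0, min(100, score + weights.get(action, 0)))

def mobility_tier(score):
    """Map the score to the mobility the state grants, echoing how the piece
    restricts a citizen's movement by algorithm."""
    if score >= 70:
        return "free_travel"
    if score >= 40:
        return "local_only"
    return "restricted"
```

The point of the toy model is how opaque it is from the inside: the viewer only ever sees the tier they land in, never the weights that put them there.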
We already have this doom scrolling and everything else; we're totally addicted to our phones en masse in our culture. So if you use this mechanism that everybody is used to, and all the different affordances of going between different apps and into FaceTime, to mash together this near speculative world building that she's created in this piece, and to tie these deeper themes down into the personal stories that you have access to through this story, it's a really brilliant job of fusing everything together. And I can totally understand why the jury voted for this piece to win the grand jury prize, not only because of the timely topics being covered here, but also because of the immersive nature of it. Like I said, I haven't seen an immersive story that was this immersive on a phone before, and it was just really well executed. Now, the critique I'd make is that there were some moments in the calibration process where, as I was looking up or down, I had a hard time seeing real-time feedback. So I don't know, even within the context of the calibration process, whether I didn't follow those instructions correctly or whether it just wasn't triggering. I was trying to make decisions and it was kind of making the opposite decisions. Basically, you look up or down to make a choice, and I was looking down sometimes and it didn't trigger. That could be because I wear glasses, or maybe it was something in the calibration process, or it could be that doing really sophisticated eye tracking on a phone, where you're just basically using the camera, is hard. Even eye tracking with dedicated eye-tracking cameras requires a lot of specific technology to get right. So just using the phone, I'm not sure how effective the implementation was, or if I just didn't calibrate it right.
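To make the mechanic concrete, here's a rough sketch of how a phone-based look-up/look-down choice with a calibration step might work. This is purely illustrative and not Palmer's actual implementation: a real app would estimate eye position from camera frames with a face-landmark library, and the function names, thresholds, and "comply"/"resist" branch labels are all my own assumptions.

```python
# Hypothetical gaze-choice sketch: calibration records a viewer's neutral gaze,
# then later readings are classified as up/down relative to that baseline.
# Input samples are simulated normalized vertical iris positions
# (0.0 = top of the eye region, 1.0 = bottom).

from statistics import mean

def calibrate(neutral_samples):
    """Average several neutral-gaze readings into a per-viewer baseline."""
    return mean(neutral_samples)

def classify_gaze(sample, baseline, threshold=0.08):
    """Return 'up', 'down', or 'neutral' relative to the calibrated baseline."""
    delta = sample - baseline
    if delta < -threshold:
        return "up"
    if delta > threshold:
        return "down"
    return "neutral"

def choose_branch(samples, baseline, hold_frames=5):
    """Pick a story branch only when the same gaze is held for several
    consecutive frames, rejecting the jitter of single-frame camera estimates."""
    streak, last = 0, "neutral"
    for s in samples:
        g = classify_gaze(s, baseline)
        if g == last and g != "neutral":
            streak += 1
            if streak >= hold_frames:
                return {"up": "comply", "down": "resist"}[g]
        else:
            streak, last = 1, g
    return "no_choice"  # timeout: story falls back to a default branch
```

A sketch like this also hints at the glasses problem I hit: if calibration captures a distorted baseline, every later reading is classified against the wrong neutral point.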
The other thing is that there are certain aspects of unconscious agency, where whether you're looking up or down, at the cop or at the person of color who's being arrested on the ground, where you're looking is deciding what branching happens. So in some ways this piece is critiquing automated algorithmic bias, while on the other hand it's also implementing those very same types of algorithmic decisions in a way that's occluded, that we're not seeing. There are also algorithmically driven aspects of the story that may or may not be working in the way that's elaborated in the piece. There's actually a very recent article that came out after South by Southwest, though I was slightly aware of research like this when I went to the Existing Law and Extended Reality Symposium at the Stanford Cyber Policy Center, where I was able to talk to some people who were extending research that had yet to be published. There's a piece published in the Privacy Studies Journal called "Eye-tracking in Virtual Reality: A Visceral Notice Approach for Protecting Privacy," authored by Evan Selinger, Ely Altman, and Sean Foster. What they're essentially asking is whether there are ways to disclose to people that eye tracking is being used in the context of an immersive experience, with what Ryan Calo calls a visceral notice, to make it clear that some of these things are happening on the back end.
There's a deeper need for this level of disclosure as these algorithmically driven decisions are being made, because maybe you're making decisions without really knowing it. This unconscious decision-making is part of the agency: you are making decisions you aren't even aware you're making. And the piece overall is about self-determination. So there are aspects of how the technology is implemented that, in the absence of this visceral notice idea, propagate the same type of problem, this unconscious decision-making that's happening in technology. It's already happening, and it'll be happening at scale, but I think it's difficult for the audience to know they're making these decisions, especially without some type of visceral notice that the technology is running. Trying to uncover this type of invisible mechanism, I think, is part of the potential of a piece like this. It's a challenging issue, and there's existing research continuing on that front in the context of VR, whereas here it's a phone-based application without the benefit of dedicated eye-tracking technology. But this idea of visceral notice, I think, is part of where things need to go in the future. And as an artist, I think Karen Palmer is right on the nose with a lot of these things and in identifying these problems. Part of the potential for where immersive storytelling could go in the future is to also demonstrate these possibilities and implement them in a way that the larger community of technology companies could use as an example and design inspiration, to find ways to preserve our sense of self-determination when we're making decisions in these immersive technologies.
Because as we move on, there's going to be all this tracking of what's happening with our bodies. This idea of cognitive liberty has three subsets: self-determination, freedom of thought, and mental privacy, and that aspect of self-determination, the ability to take intentional actions or make free-will choices, means that there's a choice that you have. I think the challenge with these technologies is that they're undermining that choice. So in some ways she's trying to bring awareness to that, but in the absence of a visceral notice, it's happening on the back end, and I think that message isn't as clear as it could be if it were made more explicit. She mentioned that the piece is still in a prototype phase, so there may be aspects of the interactive mechanics, whether the facial tracking or eye tracking technologies, that need to get really pinned down so that people can understand that they're able to make these decisions without actually touching the screen, that it's invoking the camera to do this type of facial or eye tracking. So anyway, overall, I think the piece is quite amazing and really on the nose in hitting these different topics, and, like I said, really quite immersive in exploring these deeper issues, especially around algorithmic bias and the work of Joy Buolamwini and the other researchers featured in the film Coded Bias, who are highlighting this aspect of algorithmic decision-making that happens with facial detection. On the AI Act in the EU, I have an interview with Daniel Leufer, who lays out different applications of AI technologies that should be banned.
And some of the things covered in this piece are things that are actually being banned, with the argument from a human rights perspective being that the harm done to the people for whom this technology breaks and doesn't work is disproportionate, to the degree that we need an abolitionist approach and should just ban certain applications of the technology. The conversation I had with Daniel Leufer of Access Now takes that human rights approach, looking at how the European Union is in the trilogue deliberative process of trying to figure out how to regulate some of these AI-driven algorithmic decisions in the various contexts that this piece is also covering. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.