#869 VR for Good: Using AR to Explore Moral Dilemmas of Anthropomorphized Virtual Beings with Asad J. Malik’s ‘Jester’s Tale’

Asad J. Malik is an augmented reality artist and director who created Terminal 3, which premiered at Tribeca 2018, and Jester’s Tale, which premiered at Sundance 2019.

Malik’s AR works typically play with the boundaries between reality and augmented reality, and there’s usually some sort of provocative twist at the end. These experiences typically have a film festival run, and then they may be translated into mobile AR apps at some point in the future, when the production pipeline makes it easier to port and distribute them.

With that said, the rest of this post and my interview with Malik at Sundance 2019 about Jester’s Tale is packed with spoilers.

Malik wanted an opportunity to unpack some of the moral dilemmas he’s exploring in his work, since it’s not always immediately clear what the deeper intention or meaning is. Our conversation helped me unpack his artistic intentions and gave me an even deeper appreciation for the message he’s exploring.

In a nutshell, AI agents and virtual beings are already starting to have embodied, anthropomorphized representations, but what happens when these AI agents start to hijack our social body language cues in order to manipulate and control us into doing things that are not in our interest?

Specifically, in Jester’s Tale you’re asked to sacrifice your own body and life to be subjected to a lab experiment, rather than the virtual hologram character featured in the piece, who also happens to be an actual child actor in a hidden cage embedded in the wall. The larger point is that as AI continues to move toward photorealism and beyond the uncanny valley of realistic interactions, will we project a false sense of agency onto virtual beings that hijack our social-engineering loopholes in order to serve the marketing or persuasion motives of a wide range of potential bad actors?

It’s a provocative conceit that opens up some profoundly important discussions about the benefits and limitations of anthropomorphized representations of virtual characters.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So continuing on in my series of looking at different VR for Good projects, now I'm going into my second part, which is looking at specific projects about VR for Good. So in the previous interview, I was talking to Amy Zimmerman from Unity, and she talked about how Unity for Humanity came about in part to support a project called Terminal 3, which was Asad J. Malik's first augmented reality piece that debuted at Tribeca in 2018. So Asad is an augmented reality artist and director, and his second augmented reality piece actually debuted at Sundance 2019. So I didn't have a chance to see Terminal 3, but I did have a chance to see Jester's Tale at Sundance. And there's a bit of a twist in this experience, and we're going to unpack the experience fully. I think it's had a chance to make its rounds a little bit, and he might end up releasing it. But just be forewarned that there's going to be some spoilers, and we're going to be unpacking a lot of the intention for why he created this experience. He's looking broadly at issues around artificial intelligence: when you start to give an AI a physical embodiment, like a virtual being, when you anthropomorphize it, then what are the impacts when it starts to manipulate different aspects of your emotions? And he starts to play with that concept in a very interesting way. That's what we're covering on today's episode of the Voices of VR podcast. So this interview with Asad happened on Wednesday, January 30th, 2019 at the Sundance Film Festival in Park City, Utah. So with that, let's go ahead and dive right in.

[00:01:42.715] Asad J Malik: My name is Asad J. Malik, and I consider myself an augmented reality artist and director. I had a piece at Tribeca earlier last year called Terminal 3, and I'm here at Sundance right now premiering a piece called The Jester's Tale. And basically the core of all my work is that it's narrative-based AR. It's actually stories that unfold with characters. Terminal 3 was more documentary; this is more fictional. And generally there is some political or philosophical purpose to the pieces. With Terminal 3 it was a bit more obvious, like the politics or the personal narrative comes through a bit more clearly. With this you have to do a bit more digging. But yeah, I'm hoping to do some digging.

[00:02:24.264] Kent Bye: Yeah, let's do some digging. So first, just for full disclosure, we're going to be talking about content that happens in the experience. And because this is an installation piece that's happening at Sundance, there's only a handful of people that can see it here. Maybe we could first set the context as to where people would be able to see it, or if this is, at this point, maybe the only place it's ever shown, so we can feel free to talk about everything.

[00:02:45.844] Asad J Malik: So it's interesting because we're in conversation with a couple of other festivals and venues, so we will take it to other places over the next few months. And one of the things with doing this piece in particular, one of the intentions from the start, was to release it to the public using ARKit or ARCore. That's kind of the bigger thing. It's likely that we'll end up publishing on the Magic Leap as well, although that's slightly less interesting because the audience is so limited. So with the Magic Leap, mostly it's LBE experiences, but I do hope to push it out on ARKit and ARCore. Having said that, I do regardless want to delve into the actual storyline, because I feel like unless I start doing that and start building some context around the piece, it will be hard for people to really decipher the meaning of it. So with that in mind, I would say this will probably have a lot of spoilers, and if you see yourself getting into a festival anytime soon, maybe Tribeca, maybe something like that, I don't know, do it at your own risk.

[00:03:44.346] Kent Bye: Yeah, so fair warning for anybody, stop listening now if you want to ever see this piece, because we're basically going to spoil it. So this is a piece that I saw, and I had a lot of thoughts about it, but there are certain moments, especially the ending, that were just a shock to me. I was like, what the fuck? What is going on here? So, first of all, I feel like there's a lot of things that you're doing where you're starting to blend these holograms, and then you're taking a key and actually interfacing with them. So you're starting to blur the line between what's real and what's a hologram. There's rats in a cage and different dimensions of what I see as a metaphor for something. But at the end, I'll just share my own personal experience. So at the very end, I'm having this whole interface with this volumetric hologram kid and his mother, and there's a whole story that's unfolding with different dynamics. And then it's a conversational interface, so I'm being asked to participate and to speak, and I get into this flow of speaking and responding to the questions, and there's presumably these various different branches that are guiding me toward different things. So then at the end, I'm set with a certain ethical and moral dilemma where it's asking me to make a choice. It's essentially asking me whether I want to subject myself to something or the kid to something. And then before it asks me the question, or at the same time, there's this wall that opens up, and there's a kid behind it in a cage in the installation, who looks exactly like the same actor who was volumetric. So I just had this whole experience with this volumetric actor, and then all of a sudden he's in the wall and I'm looking at him, and I'm being asked to decide whether or not to subject him to some sort of lab experiment. First of all, I was just shocked, because I was looking at it like, I can't believe there's a kid in the wall right here. The logistics in my mind were like, what, is the same actor sitting here for 14, 15 hours a day in the wall? How could this be happening? So then I was left with: do I want to subject this boy that I'm looking at in the wall to an experiment? And the boy was sitting there looking right at me. And after I made my decision, I was thinking about it, like, oh, I wonder what my decision would have been if it had been the hologram boy versus the boy that I was looking at. And then I made the decision to go with myself, because I couldn't find it in myself to say I'm going to subject this boy in a cage in a wall to an experiment. And then the experience is like, OK, your experience is over. That's it. Game over. And I'm left with, well, shit, I wonder what would have happened. And I start to question my own decision as to whether it would have been a better story or a better ending if I had chosen the boy and experienced something different. And then I start second-guessing myself, like, did I make the right ethical and moral decision? I was like, well, no, I feel like I made the right decision, but I'm kind of pissed off that I don't know what happened. So it was that experience that I had, and I appreciated that I'm sure you were trying to be provocative in that way. And I kind of wanted to just talk to you to be like, what the hell's going on?

[00:06:46.279] Asad J Malik: Yeah, well, so one of the things is that the end is, in a way, anticlimactic, because kind of nothing happens and you fail. And yeah, to spoil it for you, you fail regardless. Either way, you would have failed. And I mean, if we start unpacking some of the themes that I was trying to tackle, one of the big things was, of course, what is real and what is not real; that always does end up coming up in conversation. Generally in my work, I attempt to put it in, but also with a medium like augmented reality, especially at the start of this medium, those questions are inherently built into it, right? The whole history of art and mediums, so much of it is about what is real, what is not, what is a simulation, what is the essence of some real object. And this is a medium that is literally dealing with those questions. It's not just some conceptualization anymore. So that stuff does come in. But the big theme of the whole experience for me is that it's about AI, and it's trying to criticize the agency we so easily associate with AI. It's very easy for humans to personify inanimate objects in general, and even more so animate objects like dogs. Like, we look at a dog and we give it more credit for what is probably happening in the dog's head. And the same happens with AI. Usually, AI unfortunately right now has no agency at all. It's data crunching on steroids, with usually a layer of storytelling on top, which is usually branching narratives and interactivity, similar to how we do these kinds of experiences. But this false sense of agency that big companies are able to associate with it, and the conversations that come out in pop culture about the fears of AI, are so misleading because they distract from the real fear of AI, which is where big companies are able to do what they want without claiming any responsibility for what their so-called agents go out and do. So that was one of the big things we wanted to tackle, and we thought the ultimate way to tackle it is with this really absurdist reverse Turing test, in which the viewer is interrogated by an AI in this bedroom, with emotional stories and things happening that you can't quite put your finger on. You know that every question is trying to judge you somehow and see if you're responding either like a human, or like a robot, or like a robot that's pretending to be human. And the last question had to be a paradox, and the response had to be that you lose and you fail, because the test is not valid to begin with. The setup in which an AI in any scenario would interrogate you to see your humanity is just messed up and should not happen anyway.

[00:09:25.505] Kent Bye: Yeah, this is a little bit of a take on the evolution of the CAPTCHA, where on a CAPTCHA you are basically clicking "I am not a robot," and you have to say you're not a robot. So it's almost like we're conditioning ourselves to say we're not robots. Rather than saying we're human, we're saying we're not this other thing. And so I could see how there are these volumetric stories that are unfolding, and then the AI would be detecting how you're responding to the story to see if you're responding like a human would. And I do see that we're in the midst of this huge ethical crisis when it comes to technology. Virtual reality, augmented reality, and artificial intelligence, all of these technologies are actually presenting these huge ethical and moral dilemmas, and unless we completely shift the way that we do our economics, moving from a more reductive way into more of an ecosystemic and holistic way of looking at things, then we're going to start to create these situations where these companies are essentially using AI to dominate us. So there's also the aspect of anthropomorphizing AI technologies. I had the Micah demo with one of the creators from Magic Leap, and she was really insisting on referring to Micah as "she," and that it's a human. And there was something about that, in my gut, I was like, that actually feels a little wrong. To equate it with a conscious being, when in some ways these are tools for us; it shouldn't necessarily be exalted as a superintelligent being that should be treated better than animals.

[00:10:52.161] Asad J Malik: Man, I've been having this conversation in the most heated format over the last few days with some of my friends that are visiting, who helped me write the story. One of my good high school friends was an academic advisor on the project as well. And this is such an interesting time, because the big difference in what Sundance looks like to me this year is this idea of interactive characters and AI coming in. Micah is of course a good example. On top of it, the Edward Saatchi announcement and Fable, like what they're calling virtual beings, I believe. That whole idea coming in, and then of course you have Lil Miquela and those kinds of social media influencers that don't really exist. And with a lot of these virtual beings, they don't really have AI capabilities necessarily. Half of them are actually just models. They're just 3D models. That's kind of the limit of it. And it's the same idea as telling a story. Mickey Mouse was a written character, you know? And now if you can interact with Mickey Mouse, or if Mickey Mouse has kind of more of a branching presence, it doesn't make it an AI. My problem with most of this stuff is not that we should not do virtual characters. I think it's a very interesting space, and being able to build interactive characters that are not just one storyline but have more complex arcs, that people can build relationships with, I think it's great. I think it's a good step. The problem is when a false sense of agency is associated with these characters. We have to understand these are written characters. People are behind them, they're making narratives, they're trying to use make-believe to get you to interact and be involved with the story of the character, the same way as everyone has done it for years. We just have a more complex way of doing it now. And sure, you can use more machine learning and whatnot to make this character even more flexible and adaptable to interact with you, to make these interactions feel even more natural. But as soon as people start marketing these things as intelligent agents with their own agency, that's where the problem lies, because that's when you give up responsibility yourself as a creator and hand it over to this agent, while everyone knows, or everyone should know, that it's not an agent.

[00:12:55.849] Kent Bye: Yeah, it's like machine learning, which is basically doing a lot of statistics and making inferences, but based upon a lot of data that's been fed to it. So it's being curated by humans, with data labeled by humans, with algorithms that are tuned by humans, to be able to take in information that's more qualitative, in the sense that they have to make inferences and find these different patterns with neural network architectures. But at the core, it's still essentially just a tool that humans are creating. And I think, to me, it gets back to what consciousness is and where consciousness is. Because if you just think that consciousness is an epiphenomenon of the brain, and that it's all an illusion, then you're going to exalt AI as actually superhuman. You're going to say that this AI is like a god, because it has more intelligence, if it can be a Wikipedia or look up things or do fast calculations; you're essentially saying that this is better than a human being, better than a conscious being. And in fact, I think one of the things that you're trying to flesh out in this experience is juxtaposing how we treat animals versus how we consider this technology. We're exalting its super intelligence, yet animals arguably may have more levels of conscious awareness in their bodies, but we treat them like there's a threshold by which we're going to have ethical treatment, and it kind of stops at dogs and cats; anything that we can eat is treated as not even a living being in some ways, in how we produce food. So yeah, I'm just curious to hear your thoughts on that.

[00:14:23.731] Asad J Malik: Yeah, definitely. So, I mean, one of the things was, we wanted to show a world in which these things have gone wrong to a point where there's this fictitious Department of Digital Integrity that is out here to make sure you're human or not. And there's a certain power in this AI being able to prove to you that you're not human. That's kind of the ultimate power, in my opinion, anyone can have: convince the masses that they're not even real, they don't even have agency, they don't even have free will, they're just not even human in that sense. So we wanted to strike a balance between showing a world where AI is personified in that way, where this AI gets to decide, but at the same time, the particle cloud that represented the AI was a thought experiment in looking at what happens when we don't personify it or give it humanoid forms. So this was a complex representation of an AI that shapeshifts and becomes different scenes, and is the rat and is the child and is the mom. The AI is all of them, and can place itself against itself to create these scenarios where you become the subject and you don't quite know what's happening or where everything is placed. So the reason, for example, the child was wearing white clothes, the mom was wearing white clothes, the rats were white, the rat queen wore white, is I wanted to draw a common thread between all of them being this one entity that the AI was presenting to you. Although there were conflicts and dilemmas among those characters, it was all the AI's attempt to manipulate you and your thoughts.

[00:15:56.027] Kent Bye: I see, so the AI is more like a floating organism that is clearly not humanoid in any way, and then it morphs into humanoid depictions through these holograms, or captures, to show you a piece of story, and then you're supposed to react to it to see whether or not you're reacting like a human would. So do you think it's dangerous for this trajectory to actually create humanoid avatars with AI and have these body language representations? Do you think that we should go the more abstract route, as you depict in Jester's Tale?

[00:16:28.286] Asad J Malik: So I think there are two different things here, right? One of them is AI as a tool. AI as a tool, I think, is better off being a more amorphous kind of being. I think giving human forms to it once again goes to associating this false sense of agency and building kind of empathy with these beings or avatars, which I think is false and not necessary and not needed. But on the other hand, the whole idea of virtual beings for storytelling is, I think, a whole different conversation. It's not a tool anymore. It is actually more interactive, more interesting avatars. And as long as it's clarified that there's no actual agency, we're not saying that these avatars are actually real and have thoughts and feelings; they have thoughts and feelings the same way Harry Potter has thoughts and feelings, you know? And people can appreciate it for that storytelling and engage with these things in more complex fashions. I think that's totally fine, but it is a thin line, and I think defining the frameworks of how that is done is actually very critical right now, in this moment. I have a really weird idea, kind of like, I want to do a character. And over the next year, let's see how it unfolds. But I have an idea of a weird self-portrait kind of character that goes a bit too far with a lot of things, and we get people in motion capture suits, multiple people, very talented people. And the weird thing about this character, and it's a bit too early to talk about, but I'm calling it Young Mesh. And the idea of Young Mesh is that Young Mesh is the best at everything. Like, there are viral videos of Young Mesh shot on an iPhone, shaky, like a band performing, and Young Mesh is playing the drums in the background, but Young Mesh is killing it. And the video goes viral because the drummer is so good. But the drummer then in another video is now a football player, and he's just killing it on the field. And we do it by getting the best of the best in motion capture suits and having them give us an epic performance and make these videos. This is an idea for a project that I'm talking about. And the tricky thing here is what I want to do, and I think it might be a bit too problematic, but I think maybe talking about it is not the worst call. I want Young Mesh to be a slave. I want Young Mesh to be a literal slave, my slave, my virtual slave, in chains, visually attached to me. I understand that this imagery in this country is especially provocative, especially in line with the history of slavery that is very real, real people being abused and chained and their free will being set aside. I want to bring up this challenge, in a way, to all these people that are trying to suggest that these virtual beings need AI rights. I want to play with that idea of thinking about what a robot is. The word robot, I believe, if I'm correct, is a Czech word that literally means slave, or the idea of an android. When we think of slavery, it's about an entity having free will that's taken away from it. So what happens when there is an entity with no free will? I want to start asking those kinds of questions in ways that are confrontational.

[00:19:32.810] Kent Bye: What would the ethnicity of this AI character be?

[00:19:37.813] Asad J Malik: Same as me.

[00:19:38.470] Kent Bye: Okay. Can you describe your ethnicity just for people who...

[00:19:40.611] Asad J Malik: Sure. So I'm Pakistani, and, I mean, that's a good question, right? In fact, that question has come up. Maybe we'll take a slight detour and come back to this if you find it appropriate, but going back to Jester's Tale, that last moment when you decide between terminating yourself or the child, some interesting feedback has been that, first of all, I've seen at this point around 80% of women decide to terminate themselves, while 80% of men decide to terminate the child, even though the child is in front of them. I'm sure that this data would be significantly skewed if the real kid was not there. But on top of it, I've had conversations with people where people have said, if the kid was a girl, I may not have killed her. Or, if the kid's race was the same as mine, I might have felt more empathy. So things like that are coming up, which I find absolutely mind-blowing. I have to figure out a way to capture some data about it. Maybe we could do some iterations. Maybe we will next time.

[00:20:41.093] Kent Bye: Well, yeah, I just wanted to ask about the ethnicity because if you made it an African-American in slave imagery, that would be a completely different context, and because, as an artist, the intent is unclear, people would have all sorts of different visceral reactions. If we just take the AI part, AI as a slave: one of the things that one of the representatives from Magic Leap was talking about on my panel was how people speak to these virtual assistants like Alexa, and they'll actually treat them really horribly, and how a lot of the assistants have women's voices, and so what is that training us in terms of these gender stereotypes? But it's also perhaps providing us a mirror, because it's a disassociated robot, so we're maybe able to treat them at our worst or be really negative with them; if they were an actual embodied person or avatar, we probably wouldn't have the same reaction to them. So to me there are some interesting aspects of exploring what it is about these virtual assistants: if we do treat them as if they're human, is that actually training us in a positive way? If we go the completely dissociated route, then maybe that is going to encourage us to speak in a way that doesn't have as much empathy as we could have if we were to treat it with the utmost respect of an elder.

[00:22:00.715] Asad J Malik: I mean, the question becomes, why do you need empathy for this machine? Empathy is for people, or agents with agency and with free will. Empathy comes from this idea of being able to associate someone else with yourself and imagine that this other being or entity sees the world in at least somewhat of a similar fashion as you do, or as your mind operates. I don't think there's anything inherently wrong in treating Alexa like a servant or whatever. I think the problem in that scenario is, why is Alexa personified to begin with, right? Because then it becomes this weird thing where you are treating this character that is attempting or presenting itself as human poorly, and then the problem lies in why you should treat characters that are human-esque in that poor way. But if it was amorphous, or it was in a form that is not really very human, then in that scenario, I personally don't find anything particularly wrong with that. My problem is exactly what you pointed out, this idea of female voices, or building characters, and who gets to build these characters, who are usually the people behind them. And usually it is men that go ahead and make these characters female, and sometimes disembodied, sometimes embodied but without a voice, like in the case of Micah, for example. I have mixed feelings about Micah, in the sense that there's a lot happening there that I can really appreciate. I think they're really being thoughtful about the relationships that people have with this whole AI servant kind of element, talking to Google or Alexa assistants. They were trying to differentiate themselves from that. They're trying to present this strong woman character kind of a thing.

[00:23:47.232] Kent Bye: Yeah, they were really resistant to calling Micah a virtual assistant.

[00:23:50.633] Asad J Malik: I mean, from what I know about it, I think it started like that. I think the initial goal, if I'm correct, or that's my understanding, was maybe to make a virtual assistant. I've talked to some people from Magic Leap, and my understanding is they actually got to show this character to one of the women CEOs, who was like, what are you guys doing? What is this? Once again, still, we're stuck at this moment. And that made them realize, and then they did this great thing of hiring a bunch of women writers for Micah, and they shifted the whole thing to being this more balanced power dynamic, where Micah has certain control, or certain things that she can do that you cannot. So you're the one who puts the frame up on the wall, because she can't do it, but then she's the one that makes The Treachery of Images appear on it, because you can't do that. And I like where that's coming from and where that's going. Sometimes it does feel like somewhat of an afterthought to me, which is still fine. We're at a new time, and if people are figuring out things as they go, it's totally reasonable. I think the problem lies when you start saying this is an agent, or this is an AI, or this is an intelligent humanoid figure that now we have to build respect for. I think building respect for a character is enough on its own, and I think it should be presented as a virtual being with writers and interactivity, and not necessarily as an AI.

[00:25:13.854] Kent Bye: Well, the argument that I would give in terms of why you should treat these virtual assistants or AI characters, whatever the proper term for them is going to end up being, why I think you should treat them with empathy, is because it's habit-forming. As soon as you start to classify in your mind that this is a second-class citizen, and this is a servant who's here to serve my needs, then you start to dissociate your own sense of empathy for how to speak to other entities, whether it's AI or other people. I think there's a pretty direct translation: if you start to talk to AI assistants like they're second-class citizens, it's likely that you're going to treat a whole class of people as second-class citizens and not see their humanity.

[00:25:56.847] Asad J Malik: I think that's true, and I think that using these kinds of characters, it's the same as a game or a story that's told: you could listen to a story that makes you want to go be a serial killer, or you might listen to a story that makes you want to be more empathetic with people or be more generous. To that extent, I totally agree with you. I think the whole conversation about whether games make you violent and whatnot is a very similar conversation to this. But I think the distinction is very important as to where the problem lies. The problem in that scenario would be that it's habit-forming. You might go and do the same thing with real agents, and it probably reflects something about you that that's how you enjoy talking when there is no repercussion. But inherently, is there something wrong? I think that distinction is really crucial to make, because it's very easy to be confused into thinking, no, actually, the problem is that this character has some agency, and now I have to go around being nice to Google. Literally, that's what it's implying.

[00:26:56.827] Kent Bye: Yeah, and there's obviously power dynamics that are there in terms of economics, for sure. I'm curious to hear you talk a little bit about the rats and what the rats mean to you.

[00:27:05.775] Asad J Malik: Yeah, so, I mean, the rats are, you know, quite the metaphor. Everyone has an association with rats in one way or the other, whether you live in New York or not, everything from the rat race to rat experimentation and whatnot. So it started off as somewhat of a joke, honestly. The whole rat funeral thing kind of started as a joke. We were writing this thing, and suddenly we were thinking of how we bring in moral dilemmas and interesting ideas that people can think about and consider their humanity with. We were thinking about what factors differentiate a machine from a human. Simple things, like computers are really good at random number generation. Humans are not very good at that. With humans, you can pretty much give them a few numbers here and there, and you have skewed and biased them to a point that they might very well say the number you predict them to. That's how magicians and mentalists make a living. So we were thinking about these kinds of elements, and we didn't want to do really simplistic utilitarian tests, or trolley-problem-esque tests. The rats suddenly were something we were talking a lot about. That whole scene of the rats killing each other comes from a Thomas Nagel thought experiment. I might have mentioned this to you before, but it's pretty much direct; we were inspired by that. It's called Spider in the Urinal. He said, I went to the urinal, there was a spider in it, and I peed. And when I went back to the urinal later that day, the spider was still in it. I was like, wow, what a miserable life the spider has, just sits in the urinal all day. So I decided to give it a whole new world. I put it down on the floor and I left it there. And when I went back to the bathroom, the spider was still on the floor, this time squished and dead. And this idea that he was projecting what is good for the spider onto the spider, and it resulted in a completely opposite result, I found very interesting. And that's exactly what happens with the rats. You think they're going to reunite, and their lonely rat friend is finally going to get to play with them. But no, when you open the cage, they attack and kill their lonely rat friend. It's not all that pleasant. And it's this moment where, you know, generally a lot of these experiences end up having gestures or voice control and end up feeling rather gimmicky. This moment I'm very proud of, because literally you turning the key is you taking responsibility for what's about to happen, in the most physical sense of it. And as soon as you open it, there is just this brutal rat murder. And when we were talking to the animators, it was like, okay, how brutal does this murder have to be? And I was like, as brutal as we can make it in the time we have. And then it's just on you. And when the rats rise up, you kind of have to take some responsibility. And that whole scene is supposed to skew, once again, the last decision. The kid kind of convinced you to open the cage, but you're the one that actually did it. There is a branching version where, if you say no, the child opens the cage, which of course then changes what's going to happen as well. But that's kind of where the rats were mainly coming from.

[00:30:00.208] Kent Bye: Interesting. Well, maybe you could talk a little bit about Poppy and who she is as a pop culture icon. I wasn't familiar with her at all, so I don't have any context or background on who she is, or what she was saying, or how she was related to the rest of the experience. Maybe you could talk a bit about that.

[00:30:15.765] Asad J Malik: For sure. That's a good question. I mean, a lot of people come out being like, okay, that was really cool, but where did that come from? There's nothing that builds up to suddenly there being a human-sized rat queen. So basically, it started as a joke as well. It was like, okay, so if the rat is dead, we should do a funeral, and there should be a rat priest, and it should be Shia LaBeouf, and he should have a tail. That's how it started, really. And then we became really serious about it. We wrote dialogue for Shia LaBeouf and everything. We reached out to him, and it was a whole thing, and he had a movie here, and he was going through some personal things as well. It wasn't going to work out. Then we approached a bunch of people, and eventually, pretty last minute, I was like, okay, we don't have time. We had the money to pay an actor for this one monologue, and the space at the volumetric capture stage, Metastage, saved for it as well. In the last moment, we almost did kind of a man-bear-pig version, a half-human, half-rat animated motion capture suit version. I'm glad that we managed to get Poppy right at the end. The thing with Poppy is, I put her in a similar trajectory, or I think her contemporaries are Micah and Lil Miquela. That's how I look at it, right? Is she a real human? She's a real human, and her persona, now that I've worked with her and spent some time with her, a lot of it comes from herself. It's there even in normal interaction; you see where it's coming from. It's like Poppy is trying to be more of an AI robot, like a CG kind of asset, while Lil Miquela is a CG asset trying to be as real as possible. And same with Micah. That's kind of how I look at it.

[00:31:56.241] Kent Bye: So this is a real human who's trying to act robotic, like she's trying to act like an AI.

[00:32:01.111] Asad J Malik: So Poppy went viral probably a year or two ago. Essentially, she works with her creative director, his name is Titanic Sinclair, and they were doing these really bizarre videos, usually with white, overexposed backgrounds and really creepy ambient music playing, and she's there and she says something unexpected or creepy. There's one video of her where she's saying, "I'm Poppy, I'm Poppy, I'm Poppy," just jump cuts of her saying "I'm Poppy" in different ways for 20 minutes or something, and it has around 18 million views. That's the kind of following that she built for herself, this really absurd, performative, Warhol-esque kind of YouTube star who never breaks character. When she goes to an award show, she goes in a glass box. When people ask her why she makes YouTube videos, she says, "They made me do it." And all her pop music is also dealing with the perversion of fame and the perversion of this idea of being an AI or being a computer. And everyone in the comments is usually like, is Poppy real, or is Poppy a computer? That's kind of the idea. So I thought she would be perfect for this role, because that's the kind of stuff we're playing with.

[00:33:05.960] Kent Bye: Because she's like the AI morphed into these different entities. And so it's sort of a reflection of this amorphous AI who is now playing out the psychodrama between the rats and the rat queen, who's played by Poppy.

[00:33:18.183] Asad J Malik: Exactly. So the AI voice is actually someone else; that's not Poppy. But yeah, eventually it was like, okay, well, Poppy is going to play this rat queen. And I kept saying this was going to be a Braveheart-like speech. And when I got down to writing it, it was like, okay, either we can go the direction where Poppy is clearly a villain. She's wearing black contacts. There are these rats. They're soon going to want to kill you. So clearly they're against you; you don't want to be their friends. But I thought about what it would do, and how complex it would make the experience, if Poppy were actually somewhat convincing. If her speech were actually somewhat connected and meaningful, and you got to feel bad as a human that just allowed the rats to kill themselves. Because that's funny: you didn't even kill them. The rats killed themselves. You shouldn't, in a way, have to deal with the responsibility for that, but somehow she convinces you and the rats that it is your fault, and I thought it would be really strong if she was able to do that. And you're right, it comes out of kind of nowhere, and I'm getting mixed feedback about that. Some people love it; some people don't quite get it and come out really confused, because they don't know quite what happened in the narrative. Basically, I had two reasons for not giving her much of a warm-up time and just diving into a Braveheart speech. The idea was, I want to establish a new way of telling stories in AR. It doesn't need to be the same way that film does it. Everything doesn't need to be defined and one after the other, linear. I think it can be a lot more visceral, and it could be a lot more metaphorical, about what you feel and what happens to you in the moment. It all doesn't need to make sense in that way. So part of it was that. Part of it was to highlight the absurdity of the fact that you're being interrogated by a machine to begin with, and that there's any end to this. The fact that it's so absurd was kind of the point. And the absurdity of that scene, when she's talking to the rats that are just going apeshit crazy, just screaming and being excited about the fact that now they're going to attack the humans, it's so weird, it's almost funny. And it leads into a child sitting in the wall in the right way. I think the child sitting in the wall, in another way, could have actually been very problematic and traumatic for people. But because that scene is so absurd, it kind of works out.

[00:35:31.972] Kent Bye: Well, I think there's something about your work that's using these moral dilemmas to create this visceral experience, these impossible decisions that you then have to reckon with in some ways. Philosophically, there's the concept of a Hegelian dialectic, where there's a thesis and an antithesis, and as you have those two polarity points in tension and opposition with each other, there's a paradox where they can't be combined in any way. You have to somehow find a way to transcend the limitations, include the best parts of both, and come to some sort of synthesis. And so I do see that there's this possibility to use a Hegelian dialectic and, as an artist, take an extreme polarity point, you know, really be absurdist in where you're taking people. But the hope is that you would potentially have people swing back to the other side, or have them deal with, oh, well, if there's only the thesis, you're presenting them with a metaphoric description of that antithesis that is really trying to catalyze them into synthesizing and finding a new way.

[00:36:31.293] Asad J Malik: I think that polarity is exactly what the slave idea would get at. And I want to talk to some people before I do anything about this, because I don't think I'll be able to get funding from anyone. But we'll make it work somehow. But I definitely want to dive into characters. I think that's a very interesting space, and to do it right is actually very challenging, and I appreciate that challenge. Another thing about this piece, because it was so absurdist and kind of left things unresolved: part of the intention from the start. When we did Terminal 3, it was literally an airport terminal. Whenever we would go to a festival, we would build a space that would look like an airport terminal, with one-way mirrors and CCTV cameras and everything. And I was really thinking it's really important for augmented reality for the context and the space to be specific. You cannot just view this anywhere and have it work. That's not utilizing the power of AR to blend with your actual reality. So I wanted to figure out a way to find a space that is specific yet still scalable, because terminals are not scalable. So the bedroom just seemed to be the right call. A bedroom is just the right kind of specific, personal, intimate space that is still scalable, because almost everyone that has a house has a bedroom, and bedrooms have similar features: you know, a bed is flat, a flat surface, usually has some texture, generally good for placing holograms. So we wanted to work with that idea. But part of why we wanted to leave the story somewhat unresolved is because we wanted, when your app is turned off, or when your device is off, when the experience is done, these characters to have some ghostly presence left behind in the space. And hopefully you're still thinking about them, and hopefully you get to dream about them. How do you penetrate someone's dreams with an AR experience? The best way to do it is make it somewhat unresolved and put it on their bed.

[00:38:23.850] Kent Bye: Have you generated any good dreams?

[00:38:26.739] Asad J Malik: Well, I think this is where that idea comes in of mobile distribution and ARKit, ARCore, and what the advantages and disadvantages of that are. A lot of people are like, wait, you won't be able to do the key then, for example. And it's such a crucial moment. How are you going to replace that? And I think that's where my creative work has to come in. I have to go back and do some more conceptualization of what we can do with mobile that is unique. First of all, just the fact that it's going to be in your own space and will penetrate your dreams, that's a big plus. I think you can't do that with a space that you have no other relationship with besides this piece. And then the other thing is, I think there are some really interesting network things that can happen, like thinking about this AI particle cloud as every particle being a node, maybe every particle being a cell phone that is experiencing this piece in a different part of the world, and how somehow that can be built into the whole experience.

[00:39:22.543] Kent Bye: And obviously, it's going to be very difficult for you to embed a kid into people's walls as they watch it in their homes. So maybe you could just debrief us a little bit on how you're handling the logistics of having kid actors at Sundance.

[00:39:33.625] Asad J Malik: So there are two big challenges with this whole piece. One is volumetric assets that are 60 gigs running on a Magic Leap. It's disastrous. It's been quite a pain. We know we're pushing the device to its 99% limit. And the second thing is having real kids in a wall. I did a similar thing with Terminal 3. I don't know if you're familiar with it, but basically we had two rooms, and in the first room you would interrogate this hologram, and they would initially be very abstracted, and as you would talk to them and ask more and more humanizing questions, they would start appearing more realistic. At the end, you would make a decision about whether or not you're going to allow them into the country, and then you would get up and walk out into the second room. When you walked into the second room, it was actually an exact mirror of the first room, and the real person was sitting in exactly the space where their holographic self was sitting. It was really powerful. People came out crying, because these were real people. These were actual people who went through all these real stories. It was a documentary piece in that sense. And I just found that so powerful that I wanted to copy it for myself. And I've realized that with immersive theater, you expect actors, and it's a different scenario. But when you build a relationship and association with a hologram, and have them then become real in a space where you were alone being interrogated to test your own humanity, in that one moment, simply because you have a real observer, a real agent, your humanity sinks in for yourself. That's really powerful, and I really needed to make it happen. So we delved into child labor laws in Utah. What do you want to know? Kids can't work more than four hours a day. If they're going to school, there are limitations; they can't miss school. So we went ahead and got doppelgangers for the same kid that we have in the experience. We did a great job with our selections.

[00:41:20.557] Kent Bye: I was completely fooled.

[00:41:21.924] Asad J Malik: So we have the actor as well. The actor is also back there doing it. But we have two other doppelgangers. They all look very much the same. I confuse them from time to time. And they have shifts; they come every four hours, and we change their positions. Their parents are usually behind them. That's a legal requirement; they always have to be there, their parents. Now we have the kids trained to actually open the cage themselves, which is even more bizarre. So, I mean, it's interesting, because we're dealing with this dichotomy of what is real and what is not, and when you get to see the real kid, it very well may not be the real kid that was in the experience, just a copy of the real kid, as in a doppelganger. And the kid is sitting playing Nintendo on the sofa all day, and then when Poppy says "silence," everyone gets up, and everyone puts the kid in the cage, and then there's another cue, and then the mom opens the cage. That's what's happening backstage.

[00:42:13.420] Kent Bye: That's good to know that they're not just sitting in the cage the whole time. Okay. Wow. Okay. That's great. I mean, it was a very powerful, visceral moment for me. I was not expecting that. My mouth must have been wide open in shock. So what have been some of the reactions from people?

[00:42:27.741] Asad J Malik: Yeah, I mean, people do have strong reactions. I am surprised at the number of people that come out and are like, okay, I need some time to process it. I understand where that would come from. But a lot of times, people kind of don't know what to say. Rather than going crazy and being like, oh, that happened, that's so crazy, people just want some time. I think it can be quite striking a lot of times. And what I was telling you about how people are responding and actually answering: whenever there's an exception, whenever a man says no or a woman says kill the child, they usually have very strong reasons for it. And of course, it's a weird way to create this weird gender binary to describe this kind of situation. But I think there is something to be said about how people are conditioned, both culturally and because of their gender preferences as well, to go and make certain decisions. A lot of women come out being like, look, I'm a mother, you can't expect me to have done this. While a lot of men come out being like, I'm a father, but that kid was evil. Some people come out asking whether the kid was real, because it's this wall, and there's a light, and it's kind of standing out in the whole room, and we close the cage soon enough.

[00:43:41.505] Kent Bye: They're wearing a Magic Leap, so they already have layers of holographic reality overlaid.

[00:43:46.645] Asad J Malik: That was kind of the idea. We wanted them to have that confusion.

[00:43:49.566] Kent Bye: I think I may have even lifted up my Magic Leap a little bit, just to double-check, like, is this really happening?

[00:43:56.148] Asad J Malik: A lot of people think it's an android or something, just a statue dressed up, slightly moving, that we have sitting in there, because having a kid sitting in there for all this time would definitely be more problematic than a robot. I think that, on its own, says something about all these things that we're talking about.

[00:44:12.512] Kent Bye: Great. So for you, what's next? What do you want to experience in either augmented or virtual reality?

[00:44:18.196] Asad J Malik: I graduate in three months, so I really look forward to that. I'm moving to LA. I already leased a place for a year, and I'm excited to be working on this stuff full-time, in a way, and be able to take on multiple projects. I'm very interested in a lot of artists and characters that already exist in pop culture, that already have followings, that are actually very well primed for this medium, and they don't quite know it yet, maybe. So, I mean, like Poppy. I've talked to Poppy's team, and they do understand the potential of VR, and they understand the potential of simulation and how well that fits in with her character. They've even been doing some stuff; they've been doing a lot of VR180 videos with YouTube. So characters like Lil Miquela and Poppy, and even going beyond that, musicians like Grimes, for example. There are a lot of these public personas and characters that I think are very well primed for an immersive future. And I want to start more collaborations, and I want to start building my own characters that live beyond just one experience at a time. That's one of the big things I'm interested in. I've really come to appreciate volumetric capture. I've now done two really prominent pieces with volumetric capture, all the way from Depthkit to now the best-in-class volumetric capture. And I think the kind of authenticity it captures is really unique and interesting. I think we're starting to mess around with retracking and, you know, the holograms being able to track your eyes and follow your gaze and whatnot. Those kinds of things really add a whole new level of presence to these characters. So I'm going to be playing with a lot of that. I think 2019 might be the first year I get into motion capture as well, because there's a lot of potential, in my opinion, in doing humanoid characters that are not fully human. Like Young Mesh: I imagine him having face tattoos and everything, but also really absurdly long fur, potentially fur that glows in the dark or something. I want to experiment with what these characters could look like and what kind of future we can build for ourselves with these new presences that we're going to have.

[00:46:22.966] Kent Bye: Great. And finally, what do you think the ultimate potential of augmented and virtual reality are and what they might be able to enable?

[00:46:32.283] Asad J Malik: I mean, ultimately, yeah, it all goes very metaphysical. I think we can get to a point where our own physical reality can be challenged, or a new dominant reality might emerge. It might be multiple; it might be multiple realities for multiple people. I do think all those things are possibilities. And I just think it's important to think about: why is that interesting? Why should that happen? What is the best way for that to happen? Because all these things could totally happen in terrible ways. A lot of things over the last couple of centuries have happened in terrible ways. So it's not like we should just trust technology, or the market, or people's opinions and views on their own to operate and lead us to an equitable, nice virtual future of any sort. I think all these things require conversation and experimentation and really asking questions of yourself. It requires a lot of introspection, unlike mediums like film, in my opinion, where I think Herzog says that he really avoids introspection. And I think this is a medium where it's very hard to avoid introspection, and it should really be embraced and indulged in.

[00:47:43.586] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the immersive community?

[00:47:48.587] Asad J Malik: I would say that I think this is a really weird time for the community overall. A lot of funding opportunities that existed over the last year are shutting down. A lot of new ones are coming up. I was just having a conversation with Fred, the CEO of Atlas V, which has a piece here; they always have a piece here. When I was here last year, there were pieces like Spheres and Wolves in the Walls and Battlescar, and in a lot of these pieces you were starting to see some similarities and standardization. They would always have a title screen and a billing block; there would be a 10-minute animated experience, sometimes slightly interactive, sometimes mainly narrative; they would have a similar credit sequence. You would start to see the industry come together to form a type of content. And that's completely gone now. Once again, we're here and everything's just weird and different. Probably the only piece that fits into that category is Gloomy Eyes, a more polished, complete, controlled kind of simple narrative piece that is really well done. And I think people have to really be considerate that this is still a time of experimentation. These things still should not be expected to go out and make money. I think it's a worthwhile goal, a worthwhile aim, but it's going to take some time. So if you're going to be in this industry, you have to be able to work for not much money and a lot of headache. Like someone was saying on a panel, how much we envy filmmakers who can make a film with either an iPhone or a nice DSLR; it goes to post-production and you're done. Because I still get calls saying the Magic Leap broke down and I have to come in, and I still make Unity project changes and push new builds onto the devices. It's not an easy space to work in, but that is why it's so rewarding.

[00:49:32.167] Kent Bye: Awesome. Well, I just wanted to thank you for the philosophical provocations that you're doing. I really enjoyed the piece, and I actually really enjoyed unpacking it a bit more — I feel like I understand your deeper intentions and appreciate the overall piece a lot more as well. So thank you for your time.

[00:49:45.818] Asad J Malik: Wonderful. Thank you so much.

[00:49:47.679] Kent Bye: So that was Asad J. Malik. He's an augmented reality artist and the director of Terminal 3 as well as Jester's Tale. So I have a number of different takeaways from this interview. First of all, this was a bit of a mind-blowing experience that I had at Sundance, and it did end abruptly, so there was this question as to whether or not I had made the right choice. I think what Asad said is that he really wanted to provoke this unsettledness in the viewer, to leave it with them as this dreamlike quality that just keeps working through them — to invoke this moral dilemma, this paradox, and to catalyze this decision that they have to make. When you take a step back, in some ways you're being tricked by artificial intelligence into potentially sacrificing your own life in order to save what you presume is the virtual representation of the hologram. He's trying to point to the dangers of anthropomorphizing artificial intelligence: now that we're starting to interface with these virtual beings, how much are they going to be able to manipulate us through our tendency to want to connect with other human beings, when these are actually just digital representations and algorithms that could be used in ways that are not in our best interest? So he's critiquing this larger movement towards virtual beings, and maybe pushing for a more abstract representation of some of these entities — not anthropomorphizing them so much — because there could be so many problematic aspects of having AI entities that we're engaging with and trying to please in some ways. And I think there are legitimate counterarguments, like the idea that we want to be as empathetic with all beings as we possibly can, and that the way we treat our virtual assistants could in some ways reflect bad behavior. If you are a complete asshole to your virtual assistant, is that reflective of a deeper problem in how you interact with other people as well? That's a good possibility, and I think this piece is trying to provoke that in different ways. The deeper principle I think he's getting at is using augmented reality to give you a sense of interacting with holograms. So there's a volumetric capture of these actors, and then at the end he reveals that the actor is right there. You have this relationship that you've been cultivating with this virtual entity, and then a boy in a cage in the wall is revealed to you, and you're asked to make this moral judgment as to whether or not you want to subject this human being to a lab experiment. For me, I think I might have actually lifted up my glasses a little bit, because I didn't know whether or not this was real. I thought it was real, but it was just so implausible that they would have a boy trapped in the wall in a cage. Going through the details of how they actually did it, obviously he wasn't there the whole time, but it was just such a shock.
And I didn't see Terminal 3, but he was using a similar conceit there: you're essentially a border guard trying to figure out whether people trying to immigrate from a Middle Eastern state should be able to immigrate into the United States or not. And then you end up seeing that exact person after you've had this interaction with the virtual hologram, which grounds it and turns it into a nonfiction piece in that way. The other thing I thought was interesting was that Asad was talking about how he was using Poppy, and that he wasn't doing a lot of exposition or explanation as to who this Rat Queen character was — she just kind of pops up. I wasn't familiar with Poppy's work, but then I watched some of her videos, and it's kind of a creepy reverse Turing test: she pretends she's a robot to make people question whether or not she's an AI. So you have this person with this whole persona, and then she enters into this experience that has a meta-discussion around artificial intelligence. She carries this larger cultural meaning as a symbol, so if people are familiar with Poppy, they take the associations they have with that persona and potentially start to project them into the scene. Now, I personally wasn't familiar with Poppy, so that didn't necessarily happen for me. But I think the deeper point of what Asad was saying is that in augmented reality you're going to be able to do things like that — take these cultural symbols, these memes, these affordances of interaction, and put them into immersive experiences without explaining a lot, and hopefully they'll convey a deeper meaning for people who get the larger associations of that symbol. So that was an interesting point for me. And the last point is that Asad was making the case that you have to set a very specific context. For Terminal 3, they recreated the feeling of being in Terminal 3 at the New York airport, where you're taking in people and asking them questions — kind of like the video game Papers, Please, in a lot of ways, where you have to make these moral judgments as to whether or not people can come in. And in this experience, he had recreated an entire bedroom. So you're in this person's room, and you're seeing these volumetric capture performances right there in front of you, blurring the line. Because it's in that context, you're able to really be transported into this whole other realm. And because it's a bedroom — presumably everybody has a bedroom in their home — he'd eventually have a place to anchor the experience once he has an augmented reality version that you could see on your phone. So I like the deeper moral dilemmas that Asad is trying to evoke here, because I think there are some really valid points about this future of virtual beings — he talks about Fable Studio, and Edward Saatchi has been doing a lot of this work of gathering up these larger communities and talking about virtual beings.
And there's Lil Miquela and other completely digitally architected entities that we sense are real because they look like humans. But what Asad is saying is that they're complete fabrications that don't have any real agency, and that by anthropomorphizing these virtual beings and AI characters, we project onto them the sense that they have their own free will and agency, when it's all being scripted underneath this storified realm. So he's questioning to what degree we need these different levels of embodiment for AI characters. For me, I think it's a bit of an inevitability that we're going to move towards this personification of AI. But I do think there are risks he's pointing out that we have to navigate very closely — especially when, as in this case, you're essentially taking a CAPTCHA, trying to prove your humanity as the experience shows you these different stories and sees how you react. What the continued evolution of these CAPTCHAs looks like is an interesting question that he's exploring, but also: to what degree are you going to release your own sovereignty in order to protect these virtual beings? Because that's essentially what you're being asked to do if you decide to sacrifice yourself within an experience like this. So it's very provocative, and I look forward to seeing this type of work get out there a little bit more. I think he's raising a lot of really deep and interesting questions. So that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon your donations in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash Voices of VR. Thanks for listening.
