The Columbia Digital Storytelling Lab brought a design research project to Sundance that explores new models of cooperative storytelling and collaborative sensemaking, and it was also on the bleeding edge of integrating technologies including machine learning, innovative projection-mapped visual displays, IoT-driven instruments, computational dance, and AI as a collaborator for cultivating group dynamics. Frankenstein AI: A Monster Made by Many unfolded over three acts that used immersive theater components to facilitate a range of different social experiences reflecting on what it means to be human and what it means to be connected.
Act I featured one-on-one conversations built on an empathy conversational model designed to explore vulnerable experiences of connection and isolation, and it concluded with participants matching feelings to body parts. The emotional associations from Act I were fed into the AI, which determined an overall emotional sentiment and influenced the questions about human nature that the AI posed to that same group when they moved into Act II. The AI was trained on Mary Shelley's Frankenstein, but it was also scouring the Internet looking for clues about what it means to be human and what it means to be connected. The collective emotional sentiment from the entire week was then aggregated, and this drove the questions that were asked of a large group in the final Act III. The audience's answers were fed into the AI, tagged with sentiment, and then translated into instructions that were sent to a performer doing interpretive dance. The final performance had more of a narrative arc that explored the polarities between isolation and connection, and it had the audience share takeaways that they could feed back into the AI to help it understand humanity and connection.
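The Lab hasn't published its pipeline, but the flow described above (tag each emotional association with a sentiment, aggregate the tags into a dominant mood, and use that mood to pick the next question) can be sketched in a few lines. This is a minimal, hypothetical illustration; the question bank, sentiment labels, and function names are assumptions, not the project's actual code.

```python
# Hypothetical sketch of the Act I -> Act II flow described above.
# The question bank, sentiment labels, and data shapes are illustrative
# assumptions, not the Lab's actual implementation.
from collections import Counter

QUESTION_BANK = {
    "sad": "When did you last feel truly alone?",
    "hopeful": "What makes you feel connected to a stranger?",
    "angry": "Have you ever been betrayed by someone you trusted?",
}

def aggregate_sentiment(associations):
    """Return the dominant sentiment label from (feeling->body part, sentiment) pairs."""
    counts = Counter(tag for _, tag in associations)
    return counts.most_common(1)[0][0]

def next_question(associations):
    """Choose the question the AI poses next, based on the group's dominant mood."""
    mood = aggregate_sentiment(associations)
    return QUESTION_BANK.get(mood, "What does it mean to be human?")

# Example: Act I participants mapped feelings to body parts; each pair is sentiment-tagged.
act_one = [("vulnerability->heart", "sad"), ("longing->eye", "sad"), ("wonder->brain", "hopeful")]
print(next_question(act_one))  # -> "When did you last feel truly alone?"
```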
I had a chance to sit down with a couple of the co-creators of Frankenstein AI from the Columbia Digital Storytelling Lab, Lance Weiler and Rachel Ginsberg. We talked about going beyond transactional data with AI, how AI holds up a mirror to humanity, the narrative design intentions driving the project, and exploring yin storytelling structures beyond the 3-act and 5-act structures.
LISTEN TO THE VOICES OF VR PODCAST
Joseph Campbell's monomyth is arguably biased towards more of a yang, outward journey, while this emerging yin story structure model has more to do with facilitating an inner journey of transformation. Specifically, Frankenstein AI focused on creating shared culture, fostering common understanding, driving empathetic conversations, facilitating a future-thinking practice of worldbuilding, and inspiring new mythologies about our relationship with artificial intelligence. These are all about fostering yin behaviors that are cooperative and pro-social.
We are seeing a lot of evidence of these yin storytelling structures in the cultivation of embodied presence within VR experiences, and Frankenstein AI is starting to explore what this might look like at larger scales of collective transformation by working with larger group dynamics. To create shared culture, shared experiences are facilitated by an AI driving a collective Socratic dialogue about what it means to be human and what it means to be connected. And simply by reflecting on the deeper patterns of our humanity and our connections, we become more human and more connected.
This is a listener-supported podcast through the Voices of VR Patreon.
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to the Voices of VR Podcast. So at the Sundance New Frontier section this year, there was a piece that was probably integrating the most cutting edge technologies that are out there today. It was a piece by the Columbia Digital Storytelling Lab. It was called Frankenstein AI, a monster made by many. And they had all sorts of different components of like artificial intelligence training, had these immersive theater components, and had like Internet of Things drums, as well as dance performances, and all these different social interactions that they were facilitating. And so they're really trying to blend all these different things together to create the frontiers of what the next generation of storytelling is going to look like. And they're really focused on how do you cultivate and generate these conversations, as well as create these collective meaning structures based upon these group dynamics that are facilitated by interacting with artificial intelligence. So this is a design research project from the Columbia Digital Storytelling Lab, and I had a chance to talk to a couple of the creators, Lance Weiler and Rachel Ginsberg, about their process and their intentions of what they were trying to do with this piece, but as well as, you know, the different innovations they were doing and what they were able to learn after the course of the week of training AI to be able to interact with these groups of people. So we're covering all that and more on today's episode of the Voices of VR podcast. So this interview with Lance and Rachel happened on Wednesday, January 24th, 2018 at the Sundance Film Festival in Park City, Utah. So with that, let's go ahead and dive right in.
[00:01:42.492] Rachel Ginsberg: My name is Rachel Ginsberg. I'm the creative strategist and experience designer and a lead artist on Frankenstein AI, a monster made by many.
[00:01:49.674] Lance Weiler: My name is Lance Weiler. I'm a storyteller. And I am also the director of the Columbia University School of the Arts Digital Storytelling Lab. And I'm a lead artist on Frankenstein AI, a monster made by many.
[00:02:02.176] Kent Bye: Great. And yeah, maybe you could tell me a bit about how did this project come about and some of the deeper questions that you were asking in order to do this experiment.
[00:02:12.114] Lance Weiler: The project comes out of the Columbia University School of the Arts Digital Storytelling Lab. It's a design research project. And so a lot of the work that we do at the lab explores new forms and functions of storytelling. So Frankenstein AI makes use of a number of different methodologies that we use at the lab, and it's an opportunity for us to kind of push at the edges of what a story can be. One of the things that we were looking at with this project, and it's a very multi-layered project, it's a multi-year project, here at Sundance is the world premiere. In this particular version of it, we're looking and working to build a kind of a human corpus. And so we do that through an empathy conversational model that we've been developing at the lab. And so people will come through the experience, they interface with an AI, but instead of it being like the way that when somebody normally interfaces with an AI, where it's kind of as a personal assistant, we kind of flip that relationship. And so the AI has questions for humanity because it wants to understand what it means to be human. And the AI, Frankenstein's monster in this case, has been wandering the wilderness of the internet in search of human connection. And so the narrative conceit here is that participants are coming in and helping an AI to understand what it means to be human. And so that's done through human-to-human contact, because one of the things that we've realized through the work that we've been doing is that a lot of algorithms are shaped with transactional data. And we thought it would be interesting to do something with human-to-human contact, using that as a way to surface data. What if we were to build algorithms from memories and emotions, fears and hopes? What if we were able to build them from human data? What would that look like, and would that change what an algorithm could do? So in a sense, the project kind of holds a mirror up to humanity, and has a lot of elements that are deeply rooted within Shelley's original text and themes that are derived from that work. I'm curious if you could kind of talk about your entry point into the project and some of the things that you were looking at.
[00:04:24.787] Rachel Ginsberg: Yeah, absolutely. So I mean, definitely, what Lance was saying is all quite true. And, you know, to add yet more layers on top of it, I think, just sort of being conscious of how pervasive AI already is in the world that we live in, and how it will continue to become more pervasive. And in addition to the dystopic narratives around it that have been kind of dominating what we think of when we think about it, there is also this just sort of pervasive fear of being replaced or of being made obsolete. And I mean, that's true. There are definitely a lot of jobs and a lot of situations that AI will eventually replace. But given that really quickly impending reality, I think that what we're really trying to do here is to give people an entry point into the conversation around artificial intelligence that is not simply another Hollywood narrative about it. And so really sort of putting people through an emotional experience where they engage with a stranger, where they understand what the stakes are in the conversation around AI, and then encouraging them to interact with an artificial intelligence, which, as Lance said, flips the script on what we've come to expect of AI, at least in our lives to date, really feels like a good jumping-off point for inviting people into the narrative around AI, encouraging people to really claim their own agency and find their voices and what they would like the future to look like, in the hopes that we can all sort of come together and collectively imagine a future that's going to work for everyone.
[00:05:54.163] Kent Bye: Yeah, the thing that I'm really struck by, by both the Frankenstein AI as well as the TendAR experience by Tender Claws, is that in both examples, they're using AI as a way to create social interactions between two people. And in the case of TendAR, it's to actually embody these different emotions so that you're kind of training AI through your emotional expression. In this case of Frankenstein AI, it's a lot more about exploring how to connect people in a very vulnerable kind of one-on-one, almost like an immersive theater type of setup, as we're sitting here at Sundance in the midst of this room with these glowing red candles everywhere. You create this one-on-one interaction, but then you have these other interfaces where the full focus is actually trying to answer the question of what it means to be human. And so, yeah, I'm just curious to hear about that design process of that social element that you were trying to connect people in new ways, and then what is being generated as input into the AI neural networks that you've been creating.
[00:06:56.460] Lance Weiler: Sure. I think we are deriving at least the first act in the experience's three-act structure. The first act is very much where you have two people, ideally two people that don't know each other, sitting down in a very intimate space. And there's two different questions that they're presented with. One is a question about isolation and the other is a question about connectedness. And so the environment and the prompts are designed to try to help evoke not only a sense of connection, but an ability, you know, hopefully to create some sense of reflection. You know, so they're able to kind of open up and share with each other these interesting stories that touch into memories, emotions, fears, and hopes. And in doing so, I think that first step is very interesting because it aligns a lot with the core of Frankenstein, the core of the text, where a lot of the drive of that story is about connection. Frankenstein's monster wanting to connect, wanting to understand or have connection with, you know, the family that he ends up taking shelter with for a period of time within the book. And then when he feels betrayed, he lashes out at Victor Frankenstein's relationships and decides that he's going to try, or, you know, the monster is going to try, to make Victor Frankenstein feel the same way that he feels that he's been treated, you know? And so I think in that respect, it's fascinating to kind of think about, okay, well, what is it really, truly, to be human? And prior to them coming into the room, we have them kind of almost do like a survey. They're asked and posed a question. There's a dropdown of a whole bunch of different questions that relate to what it means to be human. When they select the one that they feel the most connected to, it takes them through a couple steps where they're kind of considering if they have a memory associated with that question, you know, if there was a hope that they had, a fear that they had, so forth and so on. So we collect some data from that first step, and then when they come in and they sit down across from each other and they start, you know, sharing these stories, at a certain point, there's these beautiful tables that feel like they're from the 1800s. The table awakes and, through microfiche, starts to create this mapping interface that allows them to map words to body parts, which is meant to provide an opportunity for them to think about the conversation that they had just had. So for instance, if they decided that they both felt some level of vulnerability when they were talking to each other, they would be looking at these various body parts that are illustrated in an 18th century kind of style, like medical drawings. And so there's like a heart, a mouth, eye, hand, and brain. And so they select the one. So they might say, oh, vulnerability, you know, I felt that. So they decide, well, what does feeling mean? Well, I think feeling means the heart, so let's map it to the heart. And so they do that, and then that data is collected, and then that affects how the AI responds to them when they walk into act two. And so the AI takes on the emotional state of what input it's received from that survey and from this interaction at the surface table.
[00:10:23.232] Kent Bye: Yeah, and Rachel, coming in as a creative strategist, and there's a lot of experiential design here, I'm just curious to hear a bit more about your participation and the different aspects of the experience, since I had a chance to go through all three of the different experiences, and kind of the different things that you were looking at.
[00:10:38.035] Rachel Ginsberg: Yeah, so my touch is in pretty much all of the experience, but in different ways. For me, a lot of my role in general, and in various collaborations that I have with Lance, and things actually that I'm working on with Nick too outside of this project, I'm really focused on sort of: what are the themes that we're exploring? What is the narrative through line? Are we giving the audience, are we giving festival-goers or participants or collaborators, depending on who they are and what we're designing, enough information to really engage with the story in a way that they're taking away the information that we're trying to communicate? Because obviously this project is quite cerebral. There's a lot of things to consider and a lot of sort of areas of inquiry. And so being able to bring people through the experience in a way where they understand and learn what it is that we'd like to demonstrate, and sort of have the opportunity to think about, and to continue thinking about kind of beyond the point of the experience, the questions that we're exploring. So specific aspects of that were, for example, in the conversation model that takes place in the parlor, which is the room that we're all sitting in right now, writing the actual prompts and writing the intro monologue. That was, for the most part, an effort I was leading. Thinking about the order of the interaction and the tone of the interaction in the lab room was something that I was involved in. So it's really a matter of kind of setting the emotional state, managing the information flow in a way that's effective for participants. Thinking about communication around the project, too. A lot of the language that we developed, I was closely involved with. And then also, I mean, thinking about the relationship of this project to the Columbia University School of the Arts Digital Storytelling Lab and our mission there, which is to explore future forms and functions of storytelling, and how many different aspects of sort of that mission are really manifest in this project. And so thinking about what is the work that we're doing outside of the lab, as far as like the creative work that we're bringing to other places like Sundance, how is that really fulfilling the mission of the lab and sort of continuing to further our inquiry there. Which, to sort of dig into that just a little more, it's pretty interesting actually. So when we talk about forms and functions of storytelling as we move into the future, forms is a conversation that's pretty well established, and we define it as sort of what are the media through which stories are told and what kind of form do they take in the world. So that would be something like AI-powered storytelling, for example, or virtual reality or an alternate reality game or whatever sort of new manifestation that happens to be. But functions of storytelling is something that, though there are lots of people exploring it, they don't tend oftentimes to be sort of located in the school of the arts. And our exploration around functions of storytelling is really: how can we surface stories in these sorts of collaborative, many-to-many, kind of play-driven experiences and actually use those stories to create culture and to create common understanding and to drive conversation, and ultimately to kind of feed into this sort of future-thinking practice that we've been building, which is really like visioning for the future by kind of creating commonality through shared storytelling in the present.
And that can be used in a lot of different ways. We're exploring it with the narrative medicine program at Columbia University. We have done some work in technology companies. We've done some work with UNICEF and with the State Department. And so thinking about how can we create culture shift through storytelling. And so this project is sort of this interesting intersection between exploring a new form, like machines and humans collaborating together to tell stories in new ways and bringing audiences along. but also then using that shared storytelling to create a shared narrative around AI very much falls into the kind of functional aspect of the mission.
[00:14:24.693] Kent Bye: Interesting. That's really fascinating. And the thing that I'm really struck by seeing this project and all the people that are involved is that it's a bit of a, I guess, Frankenstein entity within itself of all the different people that are coming from all different backgrounds and disciplines and domains. I know, Lance, you had said that there's so many different levels at which you're really kind of pushing the bleeding edge of technology. And I'm wondering if you could kind of maybe give a map of the landscape of the collaborators and the participants that all had to come together and collaborate on this project in order to really bring it together.
[00:14:56.002] Lance Weiler: Sure. I think what's interesting about a lot of the work that we do at the lab is it is focused on this notion that the best collaborative environments are when there is such a diversity of practitioners, you know, from all different walks of life, all different backgrounds, all different industries. And with Frankenstein AI, it's like, it is very much like a system. And I think you're right on. You're very pinpointed in saying that it's like a Frankenstein unto itself. Because if you looked at it on paper, it's such an eclectic mix. You know, you have people who are doing data science, you know, wrestling with machine learning and crafting, you know, algorithms. And then you have people who are building visual systems, you know, and are really, you know, like Klip Collective, like at the top of their game in terms of projection mapping and visual systems. And then you have Peter English in terms of what he's doing with the score and also the sound design. And then Peter and Jeff are doing a really wonderful job in terms of the IoT-based instruments, you know. And so the physical forms that the AI embodies, it's almost like a ghost in the machine. We made use of a lot of practical-based things. And in order to use practical-based things with code-based things, it becomes like a cross-section of like, oh, OK, we have a production designer. We have normal producers. But then we also have a choreographer and a dancer and computer scientists, as I was saying, and visual artists. And it's a really interesting mix. And, most importantly, kind of almost at the center of that is a collaborator that's a machine. And that machine is driving a lot of what we're doing, and we're feeding off of it and kind of seeing how it responds, because we kind of send it off, see what it's gonna say, and then we're like, wow, that wasn't expected, or, oh, you know, that's not going to work, or whatever it is, and we kind of continue to refine it. So it's really interesting in terms of collaborative practice, because you have such a diverse group of people, but then you also have the AI in the center of that, right? And the way that we're using the AI is in a way of augmenting the creative process. So I think that this project has really stretched all the different participants, stretched all our collaborators. And I think that's fascinating because it's forced us to try to shape new grammar to help us to be able to collaborate. So I think in regards to the project, some of the things that we're working to do at the lab, and that I'm working to do outside of the lab too in my own work, is to try to find a process that allows you to come into these projects so you can really hit the emotional resonance, you know, that's what you're trying to do. But then to be able to communicate that vision across all these different stakeholders who have different language in terms of how they work. And then you have to kind of lay a collaborative language over top of it. You have to quickly try to find it and refine it so then you can make sure, like, oh, we're all moving towards the same goal. Because sometimes you can be talking past each other and not even realize it.
[00:18:16.862] Kent Bye: Yeah, and when I think about storytelling and interactive storytelling, what I think of is this spectrum between authored narrative and generative narrative. And I could imagine how you could have done all of this experience, and from a phenomenological perspective of the people going through it, you could have done it without using any AI at all. You could have just completely authored the experience. But yet, there's a certain amount of control that you're giving up, and the trajectory of what's happening here with using AI in that way is sort of allowing things to be generated that you can't predict and that are totally surprising to both you and everybody else. And so I'm just curious to hear a little bit of that process of watching this project evolve over the week, as well as kind of watching the performance, and kind of some of the big takeaways that you have from that.
[00:19:03.176] Rachel Ginsberg: So many. So, yeah, I mean, the AI is an unpredictable collaborator, and unpredictable collaborators are exciting and challenging. I think all of us for the most part on this project, to varying degrees, but particularly those of us who, you know, the lead artists, Lance, myself, and Nick Fortugno, all work with the Digital Storytelling Lab, we are all pretty accustomed to a level of ambiguity, as Lance was saying, that I think is, and Lance wasn't saying this, but I'm gonna say this, much higher, I think, than many people in the world. Just because there's so much, it's such a priority in our work, in our sort of more education-driven work, to actually increase people's capacity to tolerate ambiguity, because there's just so much of it in the world we live in right now. It's really become kind of key to resilience in the world that we live in. So I think to some extent that ambiguity around the AI is a little more manageable for us than it is for some people. That said, I think one of the challenges for me, and one of the things that I've observed over the last few days especially, is, I've said this probably every time we've talked about the project, but it's just so true that AI does not work the way a human brain works, despite the fact that neural networks are inspired by human brains. But the logic of artificial intelligence is not the logic of the human brain. And it is very unintuitive, and it fails in unintuitive ways, and it succeeds in unintuitive ways. So we wanted to have a really light touch in terms of authorship ourselves in this process, as you observed. But we had to have enough that we could create an environment for people where they could understand what was happening narratively, with the understanding that the AI couldn't take them through an intuitive journey through a narrative. And that's sort of the challenge. A lot of what we were trying to figure out early on is what is the right size of the role for the AI in the experience? Because, I mean, as I said, the workings of artificial intelligence are fairly opaque to the human mind. And the way that people can identify with stories is by understanding them and having the setups be right and seeding things at the beginning and picking them up. I mean, all of the sort of laws of good storytelling, the AI kind of can't really do that. I mean, it can, and we're continuing to push in that direction. And sorry, just to be clear, I'm talking about our algorithm, because it's still quite young. So we're still training it, and that's sort of part of the process too. But I think in general, the process we're going through in terms of how much to author it and how much white space to leave for participants and when to bring in the AI at the right time, it's a lot reflective of how we feel about artificial intelligence in general. And the reasons that a lot of the decisions are being made around those same topics around AI right now are really transactional. And so our decision-making process is much more artistic and driven by design towards an emotional aesthetic, as opposed to: we want to get somebody to do something, therefore we're going to design this interaction around this AI that's very particularly sort of pointing someone in a direction, like clicking like or clicking buy or handing over information, whatever that may be. So I think you bring up a really nuanced question.
And I think that in terms of this work, we've seen a lot of success already as far as the amount of white space that we leave in the experience for people to really fill it up with their own stories, which is so much the point: creating a space where people feel comfortable narrating themselves into the story around AI. But as it evolves, and as we continue to build a corpus and gather data and get more people engaged in the project, I do think that we're going to need to continue to refine the role of the AI in the various kinds of experiences we design, and really sort of, I think, be very pointed about its purpose. But what that purpose is yet, I don't think that we know, because it's still so early in this process, and the research is still sort of, it's exciting, and it feels like it's moving in a positive direction, but it still has to kind of gel for me.
[00:23:19.122] Kent Bye: Lance, what about you? I'm just curious to hear your takeaways from the week.
[00:23:22.769] Lance Weiler: Well, I think that Sundance is an amazing platform to be able to show work and exhibit work at. And I think coming together in the timelines that you have, it creates a really amazing pressure cooker that allows you to work with a group of people to, quote unquote, put on a show. So you have the barn. Now you've got to get everything together. You've got to get it all in there and have to figure out how it is all going to work in a very condensed period of time, which I think is really interesting in terms of the creativity that comes from that and working within those constraints. And so I'd say some of the things that I have been surprised by, I've definitely been surprised by collaborating with an AI. That's been a really wild process, you know, like the give and take, and like, okay, I'm not really sure why it's doing that, but that's interesting. There's something on the fringe there. There's something that feels maybe uncanny even, but let's lean into it. To watching the rest of our collaborators experiment with that. And then I think also watching festival goers realize that they're actually not just passive participants, but they're actually helping to craft a monster made by many. And even some of the nuances, I think, of an iterative project that, you know, as we lean into that ambiguity of what it is, giving it the space to grow and being okay with not knowing exactly where it's headed, you know, being open to the possibilities. And I think a lot of the work that we see these days is at odds, you know, with this notion of ownership and authorship of stories. And I think there is such a wide range of possibilities in terms of the way stories are told now. You know, and this project is very much about the physicality of narrative. You know, you walk into these rooms, it has wonderful immersive theater elements to it, but then it also, at times, is something that is different than that. You know, it's, yes, the AI, I guess, has some type of character that it takes on. But it's almost like there's a little bit of mental illness within the machine, you know, because it swings kind of so erratically, you know, and you're not sure. Like, when we first started the week, a lot of what it was doing was very angry. But a lot of that had to do with the fact that it was initially trained on Frankenstein. Well, it was angry and sad. And a lot of that had to do with the fact that it was trained on Frankenstein, like, on the original text. So the models that we were working on kind of started there, used some things from Reddit, you know, as a way to kind of look at the internet, you know, because it's out there scraping the internet, trying to understand humanity. So I think that those things are interesting. And then I think it's like, what's the language or the grammar for this type of work when you're trying to help, at the same time, you know, people are opening up, at the same time they're in an immersive experience, at the same time they realize that they're part of what that is, that they're actually shaping the narrative of what it is that they're actually in, you know, that their sentiment has impact. That's really fascinating in terms of the opportunity for storytelling. Because now it becomes like a whole different set of parameters. It's not necessarily one-to-many, it's many-to-many. And when you're creating with so many different people, I think that's fascinating.
So it's been really wild to watch the reaction, the reaction to people feeling like at a certain point that they are being reflective, that they are connecting in interesting ways with strangers, but then also they're having deeper philosophical thoughts about what a ubiquitous technology like AI might mean to them and the world that they live in.
[00:27:08.152] Kent Bye: Great. And finally, just curious to hear what you think is kind of the ultimate potential of immersive storytelling and artificial intelligence and what it might be able to enable.
[00:27:20.254] Rachel Ginsberg: Oh my gosh, what an exciting question and really exciting thing to think about. I don't know, I mean, I think kind of the sky's the limit. I've somewhat recently been nerding out on Nordic LARPing, particularly as it relates to sort of educational possibilities. I'm really, in general, encouraged by methods and practices that bring people together and have them telling stories together in ways that are helping to facilitate futures. And I think that immersive storytelling is this really interesting opportunity to make people feel things, and not just make people feel things, but bring them along with you in the building of things to feel, almost, as sort of abstract and bizarre as that sounds, in this way that I just don't know how many other media really can. I mean, I was actually telling Lance the other day, it's really easy to make me cry in a movie. I'm sort of an emotional person, and I get nostalgia sensitive and whatever. So it's not hard to get an emotional response out of me, but I want to make the kind of work that changes people, not just in the moment, but changes the way that they think about things permanently. And I feel like the potential of immersive storytelling is that. And I think with regard to AI, I mean, oh my God, the possibilities are endless. And to be frank, like I don't even know that I know enough about AI to start talking about what I think the possibilities of it are. But what I know definitively about it right now is that we have the opportunity to influence this trajectory. And with the knowledge that, moving forward, the kinds of changes that AI is going to bring about in the world, particularly around the value of human labor, it's gonna be really significant. And as artists, I think we have a responsibility to weigh in on what we think that means. And in particular with this project, we have a responsibility to, and have taken on the responsibility of, really starting to help to drive that conversation. So, I mean, I think the possibilities for the media, you know, for the immersive storytelling medium and for AI as a medium, I think are really like kind of endless. And I feel overwhelmed even thinking about how I might answer that question in specifics. But what it makes me think is that the world is changing a lot and that we have to do as much as we can to create as much space as possible and sort of flexibility as possible for all of us to figure it out in ways that will bring all of us along with it.
[00:30:01.043] Lance Weiler: I think from my perspective what's truly exciting about it is the infinite possibilities. The idea that stories could be personalized in some way, that they could allow for multiple authors, that they could go and become physical in ways that were never expected, or they could really embrace the ubiquitous nature of what artificial intelligence is. It's not screen-dependent. It's like a ghost in the machine. It can embody physical objects. It can be around you at all different times, which is really fascinating and has a lot of narrative possibilities. I also think that with a project like Frankenstein AI, as we're leaning in and kind of experimenting in this mad scientist lab, you know, we're trying to figure out like, oh, what's the right balance of these things? How can we build creative systems that allow people to become better storytellers? And I think AI offers a really unique opportunity to hone that, to model that, to play with it in interesting ways that challenge, you know, a three-act or five-act structure, you know, that, you know, a beginning, middle, or end might not necessarily even matter anymore, you know, that this thing is living, breathing. It's a window into another world, and it's a world that we can all build together. So that's fascinating to me, and I think the creative possibilities of that, as we've seen just in our limited time over the last year with this project, I'm really excited about it. As a storyteller, having worked for well over 20 years as a writer, director, producer in film and television, the opportunity to kind of step into a realm and collaborate with a machine is something that is so foreign, yet so incredibly exciting. It's interesting. It makes me think of that moment in Her where Joaquin Phoenix finds out for the first time that Scarlett Johansson, she's not just talking to him, but in fact is having concurrent conversations with well over a thousand other people or something like that. And that's like mind-blowing, right? You know, like the potential of what artificial intelligence could do in terms of the stories. Not only the way that we're telling the stories, but the way the stories are being delivered and the way that the stories are evolving. You know, so I think it's an interesting creative companion. And I do think to Rachel's point earlier, I think there's an ethical consideration to it too, you know, like, can we use the arts as a way to better understand these challenging emergent technologies and to make sure that they are inclusive in terms of design practice? And I think storytelling is a wonderful way to make sense of things that we maybe don't fully know. And I'm optimistic in the end. I hope that if we can infuse humanity into what we do, maybe that has some small part in helping us to shape technology that's more reflective of the needs, not just of some, but of as many as possible.
[00:33:07.582] Kent Bye: Awesome. Well, Rachel and Lance, I just wanted to thank you for joining today on the podcast.
[00:33:12.224] Rachel Ginsberg: Thanks so much, Kent.
[00:33:13.704] Lance Weiler: Thanks a lot. Always have me back. I love coming back.
[00:33:18.486] Kent Bye: So that was Lance Weiler and Rachel Ginsberg of Columbia's Digital Storytelling Lab. And they had a piece at Sundance this year called Frankenstein AI, A Monster Made by Many. So I have a number of different takeaways from this interview. First of all, they're really inverting what you would normally expect to be kind of an AI character within a story. Usually every time I've thought about artificial intelligence within a story context, you're interacting with a character and then you're having some sort of story that unfolds around these sort of non-player characters. But this is much more of using AI as somebody who's asking questions, trying to investigate the nature of humanity and what it means to be connected. They have been able to train the AI on a different corpus, as well as scouring the internet. And they were coming up with a series of questions, and they had a little bit of a Wizard of Oz in the background, someone who was actually choosing the question that was going to be asked to the general audience. So there was a couple of times that they were doing that, especially in Act 2 and Act 3. And in the first act, it was much more about you interacting with a person one-on-one and having this really intimate and vulnerable conversation. And they were doing these different prompts about times that you felt isolated and times that you felt really connected. And so, you know, from that, then you're doing this training of the AI by matching up an image with the emotions that were coming up from the context of the conversation that you were just having. And that was being fed into the second act, where the AI was then trying to take the emotional tenor of the discussions that were happening as a collective, and then from there, be able to choose different questions that were reflecting that emotional tenor. And that was something that was very similar to what happened in the third act, which is that you had a group of people that were all together in a room, probably 40 or 50 people. And again, there were questions that were being asked to the audience, and then they were answering those questions, and those answers were being semantically tagged in real time with different emotional intents, and then that was being fed in real time into a dancer who was then trying to embody the emotions of the AI. And so they had figured out a whole type of choreography in order to do that dance performance. I actually did an interview with both the choreographer and the dancer of that piece, and I'll be focusing on that third act exclusively here in a future episode. But just unpacking this episode a little bit, a couple of other things that really jumped out was that Lance had said there at the end that he's really curious as to whether or not, working with these new immersive technologies, there's going to be new story structures that are emerging, moving beyond just the three-act or the five-act structure. And so what I would say is that this is something that I've been thinking about a lot as well. And you have Campbell's, what's typically known as the hero's journey or the monomyth. So people were thinking, well, is that something that is very gendered to a male hero? And is there an opposite heroine's journey or something that's much more into the feminine archetype?
So I think personally that using the words hero and heroine is a little problematic just because it puts it into a gendered frame, you know, almost like essentializing the gender into saying that only men can have a hero's journey and only women could have a heroine's journey, when, in actuality, there's, I think, much more of a blending of masculine and feminine archetypal energy in all people. And there's people that have different combinations of what you could characterize as yin and yang, maybe a more neutral term to talk about the differences between the expression of agency outward versus the receiving of information, either with your body or through your emotions. And so you have this outward journey, and then an inward journey. The difference, as I kind of see it, is that the monomyth from Campbell tends to be much more of an outward journey. But there's always kind of a parallel inward journey that's happening. And there's an inner transformation that's happening as well. And so you can use that same framework of Campbell's monomyth to be able to go into the inner journey. But I would say that there's also this difference between the yang and the yin, which is that the yang is much more into competition and conquering and a zero-sum game where there's one winner, whereas within the yin, there actually could be much more about cooperation and collaboration and collective understanding, and much more of that interpersonal transformation. So either it's an individual transformation of you getting really present into the experience, or it could be sort of a collective transformation. And I think that some of the things that Frankenstein AI, A Monster Made by Many, is trying to do is really explore this process of cultivating this cohesion within a group, where they're creating shared culture. They're creating a group dynamic by which someone's going to have a shared experience of this conversation about AI. So when you take a look at some of the functions of storytelling that they were describing, it kind of reflects that. They're trying to create a common understanding. They're trying to drive connections. They're trying to do a future worldbuilding process. We're trying to birth a vision of the future of a world that we want to live into. And so we're doing this collective imagination of what is even possible. And I think that is much more of a yin process than a yang process. It's this imagination of a world that we want to live into. But it's also this conversation generator to be able to actually stimulate and have these conversations with people and to come up with a new mythology or a new story about what AI is and how it's going to relate to our lives. I think the mythology and the story of what AI is and how it's going to fit into our future is a story that we can either let Hollywood tell through these dramas like, you know, The Terminator, where these AIs are going to take over and they're going to have this power over us. Or it could be a separate model by which AI is simply just a tool. It's allowing us to do useful things. And it's actually more of a servant that's actually making our lives much more connected and better.
And so there's these two competing myths that I see kind of reflected within science fiction: this, you know, centralized, dystopic future where this entity of artificial intelligence has so much power that it's controlling us, versus the more utopian or collaborative and cooperative vision by which we're able to actually find a way to work together and create shared meaning, and where the AI is actually, you know, facilitating these deeper conversations with people. And I think that is what they were trying to do with this design research project: start to really push the edges to see, well, how can we start to use mediated technologies like AI in order to generate these novel interactions, whether in one-on-one interactions, small groups, or large groups. I think there's generally a lot of anxiety that comes around artificial intelligence in terms of, like, is this something that's going to either displace or completely radically change the relationship between us and, you know, how we get our livelihood and our living and what our jobs are. So it seems like artificial intelligence is this entity that is going to be within our lives. And I think that it was really brilliant to actually look at Mary Shelley's Frankenstein and to look at how Frankenstein was this entity that was created by lots of different people, sort of like this monster that was made by many. AI is kind of doing that: we are doing these machine learning processes by which we're sort of feeding it all this data that it's going to be able to then parse through and then come to some sort of understanding about, you know, either making judgments or decisions that would be more akin to human intelligence, or human intuition, in a way that has been difficult to do with traditional sorts of algorithms before. And so that's part of the reason why Rachel said that AI, you know, has a certain amount of logic that is way different than human logic. And if you were to try to characterize human logic, sometimes, like, the logic of how humans act is not something that can be very easily modeled by a consistent set of axioms and mathematics, because there's a lot of internal contradictions and incomplete information that really requires something like paraconsistent logic to really handle all the different paradoxes and nuances of humans. And I think one of the really interesting things that Rachel said is that working with AI is teaching them how to deal with ambiguity. And that ambiguity is dealing with an incomplete amount of information and dealing with things that are going to be completely unpredictable. And there are paradoxes that are going to arise within that ambiguity. And it reminds me of this interview that I did with Marilyn Schlitz at the Institute of Noetic Sciences conference, where she said, you know, one of the biggest skills in the 21st century is being able to handle paradox and being able to navigate people with different worldviews. And I think that if individually working with AI allows us to have a little bit more tolerance for that ambiguity, then maybe we're going to be able to have more tolerance for ambiguity when we're dealing with people who have different perspectives and worldviews than us. So this is a huge cooperation between lots of different disciplines, and they're pulling in all sorts of different interesting things, from machine learning to data science to Internet of Things instruments.
You have the spatialized, immersive sound that completely surrounds you. You had choreographers and dancers, and then all these various immersive theater elements that are generating these different social dynamics and situations. And so there's all this kind of cutting-edge narrative design that went into this experience that they created. So I'm really curious to see where this project ends up going and how it continues to evolve. I think it's something that is super important in terms of how do you use this technology in order to create these unique types of interactions with people. And it's less about the AI being the center of attention. It's more about how can the AI facilitate this process of cooperation and collaboration when it comes to the creation of shared meaning. I would say that they're on this path of trying to discover what the underlying structures of this yin model of narrative design are going to end up looking like. So, that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon your donations in order to continue to bring you this coverage. So, you can become a member and donate today at patreon.com. Thanks for listening.