#1514: Creating GenAI Flower Ecosystem with “Future Botanica” AR App

I interviewed co-directors Marcel van Brakel and Hazal Ertürkan about Future Botanica, which showed at IDFA DocLab 2024. See the transcript below for more context on our conversation.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the structures and forms of immersive storytelling and the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So continuing my series of looking at different experiences from IDFA DocLab 2024, today's episode is about a piece called Future Botanica, which is a part of the Digital Storytelling Competition. This is an augmented reality application that uses some generative AI techniques. The idea is that you're creating a specific type of biological organism, and then you're able to give it a number of different properties. It's then put within the context of this larger ecosystem, with all these different simulations running in the background. It's got this whole streamlined process where you're choosing different features, creating your own plant, and then having these augmented reality interactions where you're able to plant it into the ground. In this installation, they also had a number of different screens, 16 of them, showing images of what other people have created along with some of their stats. And at the side of the installation is a model or simulation of all these plants together in this ecosystem. So that's what we're covering on today's episode of the Voices of VR podcast. It's worth noting that throughout the course of doing interviews at DocLab this year, I was progressively losing my voice, so you'll hear my voice as particularly crackly and raspy in this conversation. This interview with Marcel and Hazal happened on Wednesday, November 20th, 2024 at IDFA DocLab in Amsterdam, Netherlands. So with that, let's go ahead and dive right in.

[00:01:45.003] Marcel van Brakel: Hi, I'm Marcel van Brakel from the design studio Polymorph. We create immersive experiences. Most of them are focused on the body and how the body can be engaged through different modalities than only the visual. So we've done a lot of experimentation with soft robotics, touch, smell, food, snacks and VR, all kinds of different combinations, robotics.

[00:02:06.964] Hazal Ertürkan: I'm Hazal Ertürkan. I am currently mostly working with immersive media, but previously I was more in the research part of design. At that time, I was working with living materials, which made me more interested in how people interact with the more-than-human, and in the illusion of having control while actually being afraid to face the reality that ecologies are complex and that we are not actually the dominant species. So that made me interested in these kinds of experiences. Then I experienced Marcel's work Symbiosis, and I was super impressed by how immersive stories can actually make you experience those kinds of realities and change your perceptions. So then I wanted to work with them. So yeah.

[00:02:56.311] Kent Bye: Great. Maybe you could each give a bit more context as to your background and your journey into the space.

[00:03:02.113] Marcel van Brakel: So I started as a filmmaker, basically, in art school. I did documentary films and small short films. Then I moved into theater, and I did a lot of big immersive theater projects here in Holland: opera, music theater kinds of things where the audience would be on location, walking around, discovering, maybe sometimes interacting with the actors. But theater is also complex and a slow, slow process. The immersive field got my attention because I also had a teaching position at the University of Applied Sciences, and that made me very interested in this technology, using technology for performance, for creating experiences.

[00:03:47.136] Hazal Ertürkan: Yeah, I actually studied product design and service design, and then I moved to more interaction design in my master's. It made me really interested in how to create different emotions through design, and how in-depth interactions change people's perceptions. Afterwards I started a PhD on immersive technologies and how they can transform people's perceptions about living materials, from bacteria to algae, because people have prejudgments about them, and how we can create a more positive feeling about them. So that's how I actually got more into technology and its relation to storytelling and immersing people in new realities. So yeah.

[00:04:30.961] Kent Bye: And so are you working for Polymorph, or as an independent producer or designer? Maybe you could just give a bit more context as to your profession.

[00:04:39.355] Hazal Ertürkan: Yeah, I am working now as a freelancer. I established my own studio last year, Studio Bilal. And yeah, I am collaborating with different studios and mostly doing my own work. So yeah: designer, researcher, different things I'm doing.

[00:04:58.092] Kent Bye: Okay, great. So where did Future Botanica begin, the piece that you're showing here at IDFA DocLab?

[00:05:04.602] Marcel van Brakel: So one of the starting points for the project was a previous work, Symbiosis, since all of our work kind of spawns from previous works. There we explored symbiotic relationships with nature and symbiotic futures, and how that would affect people on a political level, on a social level, on an emotional level. For that, we created a multi-user VR setup. But of course, VR is very exclusive. You can only be there one-on-one, or in our case with six people at a time. So we had to leave out a lot of people. At that time we were in COVID and doing research for the Sundance Institute, where we had a fellowship, and we were constantly debating how we could bring the ideas we had in Symbiosis to a larger audience. That brought us to think about augmented reality as a medium, because it's very accessible: you can have it on your phone, you can do it everywhere. So in the beginning, it was more about an add-on for Symbiosis so a larger group could connect with it. But it finally became an independent project, where we thought: can we make this kind of Lego or digital system where people can piece together new kinds of nature within scenarios of the future? In Symbiosis, of course, we are the main author as an artist group, envisioning the future. But we were also interested in how other people might have different ideas, different perspectives, and whether we could gather those perspectives and play around with them. Because everything starts in your mind, with the power of thought and of framing what you think. So that was the core idea. Then Hazal started working together with us on that project, which became a very fruitful collaboration, and after some different iterations it turned into what it is now, including AI in the formula of the project to generate new kinds of nature.

[00:07:04.992] Kent Bye: At what point did you enter into this project? I know Polymorph has had a number of previous projects like Symbiosis, so I'm curious to hear about your own journey into this project and your entry point into it.

[00:07:20.687] Hazal Ertürkan: Yeah, as I said, I was amazed by the project when I had a chance to experience it. So I contacted them and asked: okay, is there any way to collaborate? Because I'm also working with living materials, and I'm interested in immersive technologies. We were trying to find which project we could collaborate on, and Future Botanica was one of them at that moment. But then suddenly that project picked up speed, so I got more involved. When I got involved, I think two years ago, they were thinking of a quite complex, modular system. I was doing more research about ecologies, how we could really make some kind of system that creates its own creatures. And then the idea of using AI came out, and with the technology itself, we changed the concept a bit, using AI technologies also to imagine what future creatures would look like. So my involvement in the project came out a bit spontaneously.

[00:08:24.212] Kent Bye: Nice. And just a clarifying question in terms of the timeline, because last year at IDFA DocLab you presented a project in the planetarium, giving a sneak peek of some of the things you were thinking about. Now you have Future Botanica here, and you said you've been working on it for at least one or two years. So did the project you talked about last year come after Future Botanica, or did this project come out of some of the ideas you were talking about last year at IDFA DocLab?

[00:08:49.500] Marcel van Brakel: No, it's all parallel, to be honest. I mean, already during Symbiosis we had the initial idea for Future Botanica, but prior to Symbiosis we already had the core idea of the Microbiome Restaurant, which we pitched together last year here at IDFA DocLab in the dome theater. And since we worked together quite well, we started developing these two projects simultaneously. We were just lucky to win a prize with Future Botanica, which gave us some starting money to begin with that one first. We used that money for the initial research. Now we're also happy to have a large part of the money needed for the Microbiome Restaurant, so after this we're hopefully going to start the realization phase of that project. So that's really cool.

[00:09:39.638] Kent Bye: Okay, so you're working on that project as well?

[00:09:41.684] Hazal Ertürkan: Yeah, basically, as I said, we were looking for ways to collaborate. Marcel was saying, okay, tell me what you're interested in, and maybe something can come out of it. And I went to him about how human bodies are actually huge ecologies and how we don't have agency, blah, blah — I was reading books about it. And he said, oh, actually, we have a project, we were thinking about the Microbiome Restaurant. So we kind of combined what was already there, and then Dinner for Two came out.

[00:10:09.671] Marcel van Brakel: We developed something together. When you collaborate, you exchange ideas and things grow out of that. So some stuff was already there and developed together into something new, and some stuff came along the way because we were talking to each other. I also found a very cool research inspiration about how memories are informed by the microbiome, or stored within the microbiome, and I thought that's such a poetic and awesome idea that we should use it. So it was very organic.

[00:10:42.223] Hazal Ertürkan: Yeah, it was really nice. We shared the same interests and the same questions, so we shaped that concept together. Yeah.

[00:10:51.144] Kent Bye: Okay, so I guess I'll stay tuned for the Microbiome Restaurant. So you're working on these two parallel projects. Let's go back to Future Botanica, then, because that's what you're showing here. You had this expansion from Symbiosis, since only five or six people could see Symbiosis at a time, and it takes a lot to onboard and offboard because there's a lot of gear. But if you have something like augmented reality in an app, then it's a lot more accessible, and these ideas can go a lot further. So maybe you could expand a little bit on where you began in terms of actually starting to prototype this: the core of the experience of creating this ecology, having people generate these plants, how they interact with each other, having an AR interaction component through your phone. What were some of the first steps that you took to actually iterate and prototype this?

[00:11:44.193] Marcel van Brakel: So first we thought about creating life forms more as a Lego system: you have different modules, you can piece them together, and then you can build something on the spot within the AR application. But then, of course, everything starts with research, and Hazal had great input on that. We realized it's so huge and so complex. We tried to make a lexicon of how nature creates a leaflet or a stem or a flower, and how it does that with different algorithms in different ways. We came quite far with that, but we also realized it's really hard, because it would take a lot of time from the user to make anything. If you give the audience a lot of freedom, they also need a lot of time to make something really cool, so that was a problem for the project. One of the next steps came because generative AI took such a great leap forward. Already in the beginning we thought, well, this might be the answer, because it basically already has all of the knowledge about nature, since it learns from the material that we feed it. The machine learning learns these patterns almost unconsciously, you might say. And it could create way more biodiversity than maybe your own imagination can. We thought that was very interesting. So that was the next step.

[00:13:07.421] Hazal Ertürkan: Yeah, while working on the project we also realized ourselves how complex nature is and how differently each creature evolves. And we were a bit concerned: if we create that Lego system, future plants would maybe look too much like plants now, and maybe it wouldn't trigger people's imagination toward something that isn't there yet, which is what we needed to find out. AI was creating really interesting visions at that time, so we came up with the idea: let's try putting in all this research, keywords like what kind of plant it is, and see if it creates something that really isn't there yet, to trigger people's imagination in picturing those kinds of ecologies. So it was a nice change in the direction, I think. Yeah.

[00:13:58.135] Marcel van Brakel: I think one of the important steps after making the decision to go with AI was to build our own prompt generator, to be honest, because we also want to have some control. If you give total freedom, it will produce a lot of stuff that we cannot use. So how do you narrow down the possibilities for the user so the creation process is fast and pleasant, but also kind of in tune with the realities of biology, and at the same time logical, so it works for the AI system? It was really interesting in the research how you sometimes have to trick the AI into making what you want. That was a real learning process. And as we worked more and more with AI, a lot of new possibilities spawned, like: how can we create future scenarios? At this moment, here at IDFA, we created them ourselves — Hazal made some, we made some. But in the future, we want to have the audience create these scenarios. With generative AI, it's really cool that you can take a prompt from a scientist, or from children, or from a scholar, and turn it into a story, into a worlding. Not only in words: things can happen, and you can use AI to generate that stuff, because that's also what we do in the project. It ultimately turns into 3D meshes that create a worlding around you. Of course, these are tiny baby steps at this moment. It's not yet where you might want it to be, but it is a first step toward on-the-spot generated worlds created by the user — a first tiny step toward a future where immersive stories can be completely tailored to the individual who's doing the experience. And I think that's a really cool insight for us too, one that I think we didn't realize in the beginning when we started this journey.

[00:15:58.171] Kent Bye: Yeah, so it sounds like in order to have this app that allows you to generate new life forms, one of the first steps is to look at existing life and come up with the different features or taxonomy you're going to use, so you can have different vectors of control for modulating a number of different variables. Was that based upon existing biological research, or is this something you had to simplify in a way that would generate these artistic life forms, maybe not nearly as complicated as actual biology? I'd love to hear about the process of trying to find that middle ground: both identifying those features, and then making your own model, your own prompt pipeline, to create these generative AI images. I don't know if you trained your own datasets, or what the process was to tag and label those sets, but I'm curious about the first steps of identifying the different features you wanted to give users control over so they could create their own life forms. So yeah, I'd love to hear about that process and where you even began with that.

[00:17:01.266] Hazal Ertürkan: Yeah, there was a research phase in the beginning to understand what we needed to involve — in the plant kingdom, for example, how plants are classified. We started from real scientific classifications: for example, how they are divided according to reproduction, or according to body structure. Then we were defining, okay, these should be the variables, the ways they evolve. So it's really based on scientific classifications, findings, keywords. The most important thing for us was creating the prompt to make the AI imagine that life form. In the beginning we were really using those scientific terms, whatever they mean — 'mutualistic,' or 'reproducing through spores.' But then we realized using scientific vocabulary doesn't really work with the AI, because it was taking all the images from scientific sources, which were very animated or illustrated. So we tried to find a way to define these features not scientifically, but in a way that is understandable and will be reflected in the output. For example, there was a word we were really having trouble with: camouflaging. When you said 'camouflaging plant,' super weird visuals were coming out — suddenly reptile pictures and things. So we figured out what plants actually use to camouflage: for example, looking like surrounding leaves, or changing color. So now the person sees 'camouflaging,' but in the back end we are randomly sending the AI one of those physical-feature keywords to create a camouflaging thing. So it's based on the scientific research, but we needed to find our own way to make the AI understand it.

[00:19:03.151] Kent Bye: Yeah, it sounds like you're giving a prompt to the AI, and the prompt is really complicated, but you're simplifying it down to one word, like 'camouflaging,' and that word then gets translated into a much more sophisticated prompt that gets fed into the AI to get the effect that you want.

[00:19:17.003] Hazal Ertürkan: Exactly, because the interface also shouldn't be so complex that it confuses people. We kept thinking it shouldn't feel like a science survey or something. It should be in understandable, daily language, but then in the back end we need to make the end result scientific and realistic. So yeah.
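
To make that translation layer concrete, here is a minimal sketch of the kind of mapping Hazal describes: a user-facing trait like "camouflaging" is swapped in the back end for one of several concrete physical descriptions that image models respond to more reliably. The trait names and phrasings below are invented for illustration, not Polymorph's actual mapping.

```python
import random

# Hypothetical mapping from user-facing traits to concrete physical
# descriptions that image models tend to render more reliably.
TRAIT_TRANSLATIONS = {
    "camouflaging": [
        "leaves that mimic the surrounding foliage",
        "bark-textured stems that blend into the forest floor",
        "petals that shift color to match nearby plants",
    ],
    "spore reproduction": [
        "fronds dotted underneath with spore capsules",
        "puffball-like pods releasing dust-fine spores",
    ],
}

def expand_trait(trait: str) -> str:
    """The user sees the friendly trait name; the back end randomly
    sends the image model one of its physical descriptions."""
    options = TRAIT_TRANSLATIONS.get(trait)
    return random.choice(options) if options else trait

# The user picks "camouflaging"; the model receives a physical feature.
print(expand_trait("camouflaging"))
```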

[00:19:34.854] Marcel van Brakel: This is so funny, because in the beginning, for instance, we wanted to have the spiral body shape of a bacterium. We could give the Latin name for that, the scientific name, and we didn't get any good results. But if you frame it like 'fusilli pasta,' it gave great results. It made us realize most of these datasets are, of course, pop culture; it's not scientific data that was used. So we had to rethink that. And with the gardens, one of the things we also try to achieve is to run simulations with them. In the beginning we were focusing more on how to generate something at all, how it looks, and how you can have control over that. Later in the process, that shifted toward: how can we build relationships? What is meaningful for a plant? Can you input things into an ecosystem, or output things from an ecosystem, and what does that mean? That's still hard to tune, because we have to simplify a bit to pull out the most important factors of an ecosystem in a certain scenario, and filter a lot of things away from the user, because otherwise there would be way too many choices and you would get bored. So it's still hard to find that sweet spot where it is still valid, or kind of valid, and at the same time accessible enough. That's where we're currently at, trying to fine-tune that. In the experience, we represent these resources with world emitters. When you create things, you're at some kind of molecular level — a cell level, a microscope level — where there's some kind of robotic system that shoots out resources like water or UV light and makes them accessible to the plants in that environment. In the simulation, we use that; it's depicted as an ecosystem, and of course it needs to function if you want to calculate whether together you've made a healthy or sustainable ecosystem.
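
As a rough illustration of the resource bookkeeping Marcel describes, a simulation like this only needs each plant to declare what it consumes and produces per step, with emitters topping up a shared pool. The resource names and rates below are invented for the example; they are not taken from the actual app.

```python
from dataclasses import dataclass

@dataclass
class Plant:
    name: str
    consumes: dict  # resource -> amount drawn per tick
    produces: dict  # resource -> amount returned per tick
    alive: bool = True

def tick(pool: dict, plants: list, emitters: dict) -> None:
    """One simulation step: emitters top up the shared resource pool,
    then each living plant draws what it needs (dying if the pool
    can't cover it) and gives back what it produces."""
    for resource, amount in emitters.items():
        pool[resource] = pool.get(resource, 0.0) + amount
    for plant in plants:
        if not plant.alive:
            continue
        if any(pool.get(r, 0.0) < need for r, need in plant.consumes.items()):
            plant.alive = False  # its needs outstripped the ecosystem
            continue
        for r, need in plant.consumes.items():
            pool[r] -= need
        for r, out in plant.produces.items():
            pool[r] = pool.get(r, 0.0) + out

# Invented resources and rates, purely for illustration.
pool = {"water": 5.0, "uv_light": 5.0}
plants = [
    Plant("shade-fern", consumes={"water": 2.0}, produces={"oxygen": 1.0}),
    Plant("uv-feeder", consumes={"uv_light": 3.0}, produces={"water": 0.5}),
]
for _ in range(5):
    tick(pool, plants, emitters={"water": 1.0, "uv_light": 1.0})
print(pool, [(p.name, p.alive) for p in plants])
```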

[00:21:41.144] Kent Bye: Yeah, so I see that this is a project with a number of different phases that I'm going through: getting onboarded into this world, some context setting, and then some AR bits that are also the worldbuilding part. Then there's a generative AI part where I can choose the different features, and by default it creates something for me, kind of randomly. I can see what it looks like, then go back and start to get a sense of how I want to tune it. I also saw it in the context of the exhibition here at IDFA DocLab, where I got a chance to see what some of the keywords I was shown might look like, with some examples. Then I created my own plant. And then there was an RPG-like step of giving it points for health and strength and resiliency and energy and reproduction — all these different features I could score across different vectors. It then turns into a little bit more of an abstraction: okay, now you're going to be put in relationship to another plant, now you're going to be part of an ecosystem, with some visualizations showing how I fit into a larger plant ecosystem. So I guess the first question I have is: is this a piece that you think is finished, or is there more you're going to do to flesh out some of these later phases, where your plant is in relationship to these other plants?

[00:23:01.246] Marcel van Brakel: I mean, like always, work is never finished, especially when we do a piece. There's technical stuff that we want to improve, but that's not so interesting, I think. One of the things that we tried, because we like the embodiment that comes with AR, is to use the smartphone as a camera. We want to create this three-dimensional space where you have to really move with your body, go close to see something up close, meet each other. We did a test in the Vondelpark here in Amsterdam with 15 people, and it also became more like a social tool, because you're standing next to each other, sitting on your knees in the grass, figuring out where your seed is going, or maybe someone created a very tiny plant or something like that. It starts a conversation with each other; it's a social tool. I think that was very interesting, and that's something we want to touch on more: can we do the creation phase better, so you have more of this working together in the ecosystem? Also, the simulation mode is now very 2D. That was just a time thing; we wanted to have it here for IDFA, but the first idea was to make that three-dimensional too, so you can much more easily see how you affect each other. You could walk in the space and see something more like a mycelium: what's feeding what, what is giving something to someone, what is receiving something. That's stuff we would really love to do if we had the finance to finish it off. That's also a dimension I have to take into account as a producer; it's high on our wish list, I think. And here at IDFA we show it almost like an installation piece. You have, of course, the smartphone, which was the core starting idea of the project, but now it's been enlarged because we have 16 screens outputting live data of user-generated images being created there on the spot. So you also get this huge archive, not only of scenarios and different perspectives on nature, but of how people deal with that. I think it would be super cool to have some anthropologist study that material and figure out what it means, because it's full of data, and we store all the data. You can see patterns of what people are perhaps fearing or wishing for, what kinds of creatures they create within a certain ecosystem, and what that tells us about us. I think that's hugely interesting. One feature we didn't touch on is that we pre-created the scenarios. Technically, given how we produced it, it's possible for the user to also add scenarios — not completely freely, but still with a lot of freedom. It would be very complex, because if you add a random future scenario to the system, we have to translate it into laws of nature, into pre-planted plants, into different creatures. So there's a lot of work there, but if we can get that going, I think it would be even richer, because then we'd really have what we really want. Automated, too.

[00:26:23.294] Hazal Ertürkan: Yeah, for me one aspect of Future Botanica is also quite important: that people understand how complex ecology is — this interchange of materials, energy, all those things. That's actually why we give people a chance to define a strategy for their plant. We keep seeing ourselves as the ones having agency, but actual plants also keep having strategies to survive and adapt. Making people aware of that is an important aspect of it, I think. Right now the simulation part is 2D, and people don't have too much interaction there — to see, okay, I put my plant here, and how is it interacting with the other one? The first idea was to see: what I am producing is used by the other plants around me, or there is something dangerous, like a bad gas, and I'm seeing that molecules are taken up by another plant, and I'm connected and sharing information with other plants. Those kinds of visualizations would also make people understand the intelligence of nature. And then maybe it can help us not see ourselves as the only intelligent creatures. So that's also an important next step for me, I think.

[00:27:43.208] Marcel van Brakel: I think that fragility is very important. We often forget that we are part of a very interconnected, interknitted entanglement with everything. You can have the illusion that you have agency over that, but in fact you haven't. And I think that's something we can learn from ecosystems: that we have to be more humble, because we are not alone. We don't decide; we are informed by the system. On a philosophical or political level, that tunes in with the message of Symbiosis, but also with the Microbiome Restaurant, which we're currently developing after this: that we have to have a different relationship toward nature, but also toward things like individuality and freedom and the idea that you're just living for your own. We're all interconnected with each other, and we all need each other and are dependent on each other. Like plants, humans are too.

[00:28:44.622] Kent Bye: Yeah, part of the experience that I had in seeing the piece was that, well, first of all, I have an older version of the iPhone, so I immediately got a message saying this project may break or not have enough RAM to run. Then I ran into a point where it stopped running, so I borrowed a phone and was able to get through the experience fine on their device. I did download the app, but it's also gated with a QR code, so I can't go past a certain part to explore it further. When I got to the end of the experience, there was the process of divvying out a certain number of points toward these different variables, which put me in a relationship to another plant. Then I was in this ecosystem, with these visualizations, but I think it was a little bit difficult for me to see the impact of that. I would need to see the process of it, or it was kind of abstracted into 2D. So yeah, I'm wondering about other ways of viscerally conveying that part of the ecosystem — some of the things you were talking about — to really flesh that part out.

[00:29:40.206] Marcel van Brakel: I totally relate to that. I mean, that's also what we experience ourselves. It has to do with time and money, and with it being a really fresh project — to be honest, it's a world premiere. But definitely, one of the things we try to do in the simulation is to give you story prompts, because when we run the simulation, we can say: okay, the water is at 10%, the world is in drought, so you might die, or it's not good for your garden. We already have that in the system, but we want to explore it even more, so you'll have these prompts inside. And I actually like the initial idea of having that simulation more as a 3D world, with more mycelium-like structures growing toward each other, sharing things, receiving things — also as a dynamic thing. If your plant's children die off, you will see them shrink, that kind of stuff. It becomes way more emotional, I think, and also way less rational and distant. So that's definitely something we want to include in the next versions.
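
A mechanism like the drought prompt Marcel mentions can be as simple as thresholds on simulated world variables that fire narrative messages when crossed. The variables, thresholds, and wording below are made up purely for illustration.

```python
# Hypothetical story-prompt triggers keyed to simulated world state.
STORY_TRIGGERS = [
    # (variable, threshold, direction, message)
    ("water_level", 0.10, "below",
     "Drought: the world's water is at 10%. Your garden may not survive."),
    ("oxygen", 0.90, "above",
     "Oxygen is spiking. Fast-growing plants are crowding out the rest."),
]

def story_prompts(world_state: dict) -> list:
    """Return every narrative prompt whose threshold the current
    simulation state has crossed."""
    fired = []
    for var, threshold, direction, message in STORY_TRIGGERS:
        value = world_state.get(var)
        if value is None:
            continue
        if (direction == "below" and value <= threshold) or \
           (direction == "above" and value >= threshold):
            fired.append(message)
    return fired

print(story_prompts({"water_level": 0.08, "oxygen": 0.5}))
```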

[00:30:35.780] Kent Bye: And so there are some AI components and AR components that I want to dive into. I want to first dive a little bit more into the AI pipeline, because you mentioned you had to create your own prompt generator. I'm also wondering what kind of models you were using. Were you using existing models? Did you have to train your own model and tag it? What was your baseline — basically, some details of your AI pipeline and what you did to put this piece together?

[00:31:09.520] Marcel van Brakel: Well, there was a lot of trial and error there, because we're not specialists. In this version, we use existing generative AI systems. We use Stable Diffusion to generate images, and we use those images to prompt 3D meshes. We found out that if we run the prompt directly into the 3D software, it gives really poor results, but going via the image is way better. That's something we didn't anticipate; we had to discover it through trial and error. And then we also have a loop with ChatGPT. We built a prompt generator: the prompt is sent, but it also takes into account the scenario, which is already there as context. That gives you a wide range of biodiversity, and adaptation across all of the AI systems, so they really produce different results while still fitting the scenario. That worked out pretty well, I think. Sometimes we don't have control, of course, in this system, so sometimes things go wrong. But I think we're at almost 90% accuracy, something like that, where it really does what we want it to do. Then there's a lot of fine-tuning in the pre-prompt and the backend prompt that we created, as Hazal already touched on. For instance, transparency or translucency: we thought that was a very interesting feature for future botanica, or even for current botanica. But if you put that in the prompt, you get all kinds of faces and pots and shiny things that don't have anything to do with plants. Filtering that out was a challenge for us. It has to do with the exact framing, and with giving more weight to certain parts of the prompt. So it's still a fine-tuning process. In the end, we would love to use machine learning to optimize toward the right decisions, so you create a feedback loop there too that produces better results. We haven't included that yet, but it's not impossible; I think it would be very possible to do.
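
Pieced together from what Marcel describes, the pipeline is roughly: a prompt generator (with an LLM in the loop) expands the user's choices plus the active scenario into a detailed image prompt, Stable Diffusion renders the image, and a separate image-to-3D step produces the mesh. The sketch below uses the real `diffusers` Stable Diffusion API, but the prompt-expansion and meshing functions are hypothetical stand-ins, since the team didn't name their exact models or tools.

```python
from diffusers import StableDiffusionPipeline

def expand_prompt(traits: list, scenario: str) -> str:
    """Stand-in for the team's prompt generator: fold the user's trait
    choices and the world scenario into one detailed image prompt.
    (Hypothetical; the real system loops through an LLM.)"""
    return (f"A plant with {', '.join(traits)}, growing in a future world "
            f"where {scenario}. Botanical photograph, plain background.")

def image_to_mesh(image):
    """Hypothetical stand-in for the image-to-3D step; the interview only
    says meshing from an image worked better than text-to-3D directly."""
    raise NotImplementedError

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
prompt = expand_prompt(
    ["leaves that mimic surrounding foliage", "spore capsules"],
    "oxygen levels have doubled",
)
image = pipe(prompt, negative_prompt="face, person, pot, illustration").images[0]
# mesh = image_to_mesh(image)  # final step: a 3D mesh generated from the image
```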

[00:33:16.898] Kent Bye: Yeah, in the process of prompting, I did have a face come through, and I was like, oh, this must be an error or something — it was supposed to be a plant. So I just went back and generated something else. But in terms of Stable Diffusion, there are different models you can use off the shelf: a default model, custom models, and then models you can train yourself. So I'm wondering which kind of model you used, whether you did a custom one, or whether you found one that worked really well.

[00:33:43.117] Marcel van Brakel: I'd have to ask my tech team, to be honest. I know we don't use the standard model but a separate one, though I don't know the exact name of it. So sorry for that. That's fine.

[00:33:52.501] Kent Bye: It has a really nice look and feel, so I imagine it was something a little more tuned from biological samples or something. I know there's the possibility to build your own datasets and training, but it sounds like you were taking something that already existed and using it from there.

[00:34:06.946] Hazal Ertürkan: Yeah, we are using the data that's already available, the way that system works, but adapting it to what we want to achieve. Also, as I mentioned, in the back end we are trying to filter, to prevent those kinds of faces and things. But sometimes you can't guess that a bit of vocabulary will lead to that, and whenever we realize it, we add new filters, like negative prompts. So yeah, it's again a trial-and-error thing.

[00:34:37.497] Marcel van Brakel: But I think the machine learning will help a lot if we include that. For the moment, it's just a time thing. Also, it's a new field for us — it's hugely exciting, but I'm not a specialist, and we don't have real specialists on the AI side on the team. But we definitely know it's possible to do this stuff, and I think it will help the project a lot when we have it included. And sometimes, yeah, we put in 'moldy,' for instance, and 'moldy' is maybe also a person, so then you sometimes get a face. We also discovered that if you do a zero prompt — you send a prompt, but it has no content — you get a picture of a casually dressed lady. That is also interesting: the AI system shows how it is biased by the material it was trained on. Over some of that we don't have complete influence, of course, except when you train it yourself. But that takes way more time than we have with the budgets that we have.
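
In Stable Diffusion toolkits such as `diffusers`, this kind of filtering is exposed as a negative prompt. One plausible shape for the growing filter list Hazal and Marcel describe is a simple blocklist that gains a term each time a new failure case (like 'moldy' summoning a person) turns up; the terms here are invented for illustration.

```python
# A running blocklist of terms that have produced unwanted images;
# every newly discovered failure case appends to it over time.
NEGATIVE_TERMS = [
    "face", "person", "portrait",       # 'moldy', zero prompts -> people
    "pot", "vase", "glass",             # 'translucent' -> shiny objects
    "cartoon", "diagram", "watermark",  # scientific terms -> illustrations
]

def negative_prompt() -> str:
    """Joined into the negative_prompt argument on every generation,
    e.g. pipe(prompt, negative_prompt=negative_prompt())."""
    return ", ".join(NEGATIVE_TERMS)

print(negative_prompt())
```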

[00:35:39.419] Kent Bye: Right. Awesome. Yeah. Just curious to get a little more context on that, because, yeah, obviously it's going to take more time, but you have more control at the end of it. Yeah.

[00:35:47.343] Marcel van Brakel: It's also something you have to acknowledge. If you work with generative AI, the fun part is that you don't have control — and it's also a pain in the ass sometimes that you don't have control. But it's part of the deal, I think, that sometimes things like this happen; you should maybe laugh about it instead. One of the things that we realized is to create this feedback system within the creation mode: if you get something that really doesn't fit what you want, you can run the prompt again or go back, and that already filters out a lot of trash.

[00:36:22.370] Kent Bye: Talking to Octavian about AI, he was commenting on how these AI systems are much more like alchemy or magic or spellcasting — in a mathematical sense, it's more of a non-binary logic. It's like a realm of possibilities with these archetypal potentials, where you have to come up with the right words and spellcast, and you don't have that heuristic, rational amount of logical control over it. So it's a new mode of thinking, in that more relational context, but also, yeah, you have to come up with these magical spells to get it to work. And that type of spellcrafting is what you've done on the back end, abstracting it out into what was really quite a nice interface for people to slide different vectors and experiment and prototype. So it feels like a nice interface onto those abstractions, but it's kind of like casting a spell and seeing what you get.

[00:37:13.086] Marcel van Brakel: I really love that analogy, because I think it's not so different from how we use our own creativity to create worlds in our brain. In that sense, the system spawns things into reality that really have an agency, at least in the app, but also in the world. In the same way, we create by interconnecting things in our brain: we come up with new concepts because we glue things together that might not need to be glued at all, and by combining things, new realities spawn from it. That has agency, and I think that's also the core of the storyteller, that you are engaged with that. And that's also the beauty of it, the magic of it. So I really like that effect — that the alchemists were right in the end.

[00:38:01.550] Kent Bye: I'd love to hear some of your thoughts in terms of your background working with living systems and working in these kinds of speculative realms, and how that plays into some of your other interests — why you wanted to get involved in this project, connecting the dots between the speculative realm and the physical realms you're working in.

[00:38:20.017] Hazal Ertürkan: Yeah, I was working on really down-to-earth product design, but on how we can integrate these living materials into daily products — maybe a curtain made from algae. But then, rather than opening the curtain in the mornings, maybe we would close it to let the algae see the sun. So adapting our lives to those other living things in our daily life was interesting. But these kinds of material developments are still at a very early stage, and I was always having difficulties in user studies getting people to imagine these different alternative lives. That's why, when I experienced Symbiosis, I was in a completely different future, an alternative reality, and in my character I was experiencing not being the dominant species. During the experience, I was feeling the fear that I needed to follow what the other, more superior creature told me. There was food involved, and although it didn't seem nice, I was accepting it, because in the story it was saying, okay, I need to accept this to show my good intention. So it really turned me around: whoa, okay, you can really make people feel differently than they mostly do in their daily lives. That's what I'm interested in, actually: making people imagine different realities, and then maybe, hopefully, it will create some spark — okay, there are alternative ways of living; this is not the only one. So yeah, that's my relation to the project.

[00:40:01.198] Kent Bye: Yeah. And I remember when we were talking about Symbiosis here at IDFA DocLab, actually a couple of years ago — I had just seen it in Portland, and then I came and did an interview with you and your collaborator. I remember you were also talking about Donna Haraway, a lot of her ideas and philosophical thought around these speculative futures, but also this merging of our bodies with other species. The role of technology seems to be a consistent theme coming up in this project as well: technology melding with organic or biological systems. So I'm wondering if you can elaborate on some of those core themes that may have started in Symbiosis and are continuing here in Future Botanica, in terms of this intermixing and fusion of humans and technology, and also the work of philosophers like Donna Haraway and how it influences the work you're doing.

[00:40:52.048] Marcel van Brakel: I was thinking, of course, we dominantly work with AI systems here, and a lot of people are afraid of that, because they think AI will spawn into consciousness and start conquering. The tendency is to think very dystopianly about it, or to see it only as a tool. But for me it's a bit in between. I think it is, of course, a great tool that enables creatives like us to do things I cannot do at all, and even better. But at the same time, I think our fear also comes from the fact that current AI systems are trained on human bias, on human data, on human dreams and fears, and they reflect that back. What I find really interesting is: if we change that to different datasets, to different forms of intelligence — let the grass train the AI, or let the ecosystem train the AI — you would have a really different way of perceiving reality, predicting reality, mirroring back things we don't have a clue about. That would introduce new voices into our realms that are alien but also inspiring, and I think can give us a lot of insights and wisdom. So for me that's very interesting. And one of the other threads, which you already touched on, is that I think future nature will be a hybrid nature. The current divide between technology and biological systems as two separate things that will never merge — it's an old idea. It will change. We will genetically change; evolution is already doing that on a daily basis. At the same time, there's super cool research, for instance, the xenobots. I really love that stuff: they make robotic systems with frog cells, robotic systems that are alive. They can crossbreed, they can reproduce themselves. At the same time, they're not alive, because they're not a frog. It's something in between, a hybrid thing that is both nature and technology. I think we will see a lot of that in the future, and it will blur the full boundaries and possibilities of everything — what it means to be a human body, what it means to be an ecosystem, what it means to be a technological system. I think those will completely merge. And that's nothing to be scared of; it is something to embrace, to be cautious about, but also curious about. That's the cool thing about speculative design, that you can do that. At the same time, I hate speculative design, because it's speculative. So one of the things that we always want to do with Polymorph is do it for real as much as possible. The next project, the Microbiome Restaurant, is speculative, but it is also real, and we want to do it as real as possible — the real experience, with the real stuff. So that's what I hope for. I think it's also a thread in the Polymorph portfolio.

[00:43:55.410] Hazal Ertürkan: Yeah, with what you're saying about AI, I agree. While writing the stories, I realized I was always starting a bit negatively about future ecology: ah yes, plants became super efficient, but that created another danger, because now there was too much oxygen, blah, blah, blah. I realized I was always going toward the worst scenario. But we were using AI in that, too: okay, now imagine something unpredictable that could happen in this scenario. The AI was coming back with different things, and it actually made me think: oh, okay, nature keeps finding its way somehow. Although I kept giving it negative things, it was saying: yes, but then plants adapted and started doing this, and humanity changed like that. So AI kind of helped me see the positive side of what is possible, too. And that's also interesting: we keep thinking maybe quite negatively, but these kinds of tools are also helping us get out of our current setups, or fears, and see something different.

[00:45:08.643] Kent Bye: Nice. Yeah. I want to come back to the AR parts, and then we can start to wrap up. In the experience, you have some interactive components with AR, where you have your seed and you're planting it. Some of it was a little bit confusing, in terms of a lot of abstractions of other entities in the far field. And there were some interactions where it felt like I was asked to do something, and I did a couple of things and it failed, and then I got it to work, but it was like, okay, that wasn't exactly clear. But I guess the idea with AR is that you can have people move their body through space, and also interact in a way that goes beyond just a 2D interaction, so it had a spatial component. It was also really cool to see a spatial representation of the plant that I had created — it's interesting to hear that you're creating an image and then generating from it; I thought that worked well — and then putting it in relationship to other plants. So I'm curious to hear a little bit more about how you started to design the interactions with AR so it isn't just a phone-based 2D experience, but involves people's bodies and uses the affordances of AR to give a spatial representation of some of these things. And I imagine in the future, especially getting into more of the ecological relationships between these plants, you can really start to push that. So yeah, I'd love to hear your thoughts on both the intention to include different AR components, and what you feel you were able to achieve with AR from a design and experience perspective.

[00:46:38.206] Marcel van Brakel: Well, in the beginning, we were thinking to do all the interaction and all of the user interfaces in the 3D world, so there would be no flat screens at all. We started with that, but it was quite complex and it stressed the systems. Like you mentioned, it already crashed on certain phones; in the beginning, it crashed on every phone, because it was just too demanding. At the moment, we're right at the edge of what the smartphone can do, because it demands processing but also data — a lot of stuff is happening all of the time, especially when you work three-dimensionally on a phone. So the choice to use the smartphone as a device is very good, because it expands the audience, but it's also limiting. We had to step back quite a few times from our intent because it was just too heavy on the devices or on the processing side. That said, we're still trying to find our way with it. Personally, I would love to have it almost like VR, but as AR, stripped of a lot of interface stuff. But as we moved things to flatter interface parts, it really helped the user journey: it sped up processes and was much nicer for people to interact with. So there, too, we have to find the sweet spots. And I think we can still do more with feeling the relationship with each other in the app, because you forget that quite fast when you're working in your own app and your own world. It's sometimes almost as good as VR, but it also makes you lose track of each other, and sometimes of your own stuff. So I don't have a full answer to that yet; it's part of the research we want to continue doing, how to optimize that. Maybe you have more?

[00:48:28.203] Hazal Ertürkan: Yeah, currently it's a bit of an individual experience, but in the beginning we were always emphasizing that people should be able to interact and make connections between plants. We were imagining that you would walk around between the 3D models and say: oh, this plant is producing this, I also want my plant to copy that; or maybe I will produce something that helps that plant, so I'll place my plant next to it and have a relationship; and maybe in the 3D world I'll see an emitter, an energy source, and I will just bring it to my plant. More embedded interaction, exactly, is what we were imagining. But the app becomes quite heavy, so for now we made flat screens — what it consumes, what it produces, those kinds of things — to speed it up. But yeah, if we can manage in the future to make it more 3D, that would also be more meaningful for people. Because if you don't know anything about biology, it's also a bit challenging: okay, what is it producing? And then what will I consume? So yeah.

[00:49:34.881] Kent Bye: Yeah, I think it would have helped to have a bit more context as to where my plant was going to be, and to be introduced to what the ecosystem needs or how I would be in relationship to it. I felt like the 2D simplifies it, but also abstracts it and removes it from its spatial context. So it was sort of a random assignment of variables that you can play with, and it was then difficult to see the impact of that, either immediately or through the simulated nature in another 2D abstraction. So I feel like there's a lot of potential in a spatial representation of that, but it's also a hard design problem — and, like you said, it's going to be difficult to have a fully rendered ecosystem of all these plants, because that's just going to be a lot of spatial data. It was great to be able to see it at IDFA, because those high-resolution images were really powerful, knowing that they were coming from other people who had participated in the project. If I just had the app alone, I wouldn't get that other aspect. So you're able to offer that type of experience in the installation as a stopgap, but having that spatial context within the app itself — walking around and looking at other people's plants and creations, seeing how they unfold over time, or even swiping through time — is important, because a big part of this is a process. I saw a simulation that said: okay, this plant died; okay, the whole ecosystem just collapsed. There were different vectors of environmental variables you can play with. Again, a lot of that was in these 2D simulations that I had trouble entering myself into, other than as a bunch of random data points that were difficult to translate into what they actually mean for this ecosystem. So I think the AR spatial context has a lot of possibilities to flesh that out more. But yeah, it's a hard design problem, and, like you said, it's something that's going to be a little bit of a stretch goal in terms of what you're potentially working on next.

[00:51:28.445] Marcel van Brakel: I think that's great feedback. But to elaborate a bit on that: we constantly have to make trade-offs. Now we create scenario scenes where people participate. Initially we thought maybe 30 people could be in one play room, for instance, within a scenario. But if you add a 3D model — even if you only generate one — it's so much data your phone is exploding. So we still have to tune that down: where can we use 3D models, and maybe fewer people, or higher-quality models? We can do way better than what we can show you now, but that's a trade-off against how many users are in a party. And if you have a small party, sometimes you enter a space and make something, but you're alone there because nobody has entered yet, and that's very boring. So we already had to create a bit of pre-made content, so there's always something to see. That's again data we have to subtract from the number of people who can enter, so it's still fine-tuning. A lot of things have an impact. Also the particle system stuff — it's all very cool, but it eats the phone's resources, and the simulation stuff does too. That was a big concern for us. I'm really quite happy that we already succeeded in having something that functions and has all of the elements in the app kind of working. But I agree there's still a lot of space to explore to make it even stronger.

[00:53:04.096] Kent Bye: Are those social elements implemented now? Because when I saw it, it was just me doing it; I didn't have other people. Is there a social component, in terms of people doing it at the same time and being able to interact with each other, so it becomes more of a social AR experience rather than a solo one? Or is it mostly a solo experience at the moment?

[00:53:23.042] Marcel van Brakel: At this moment, we do use each other's live generated data, so you are in a social system. But in the experience itself, I think it's more of a solo experience at this moment. We'd love to do more, but yeah, I'm constantly in a fight with my tech team. I say, I want to have this, and they say, oh, it's crashing, cannot do it. So, for instance, me connecting with Hazal, Hazal connecting with you, showing your offspring, showing how things grow over time — that's all stuff we would love to have in there, but at this moment, in the current state of the technology, or at our capability, it's just too much. So we have to reduce, and make strategic decisions about what we can do. Okay, great.

[00:54:08.008] Kent Bye: Yeah, that's helpful. And so, as you continue to either work on this project or move on to the Microbiome Restaurant, I'm excited to see where it continues to grow and develop. But as we start to wrap up, I'd love to hear what each of you think the ultimate potential of these immersive technologies and AI might be, and what they might be able to enable.

[00:54:30.787] Marcel van Brakel: This is the first time that we've created an augmented reality application. We already talked about the strength of AI systems. I think it will flood the immersive industry; we cannot stop that, and I think it's also a good thing. It opens up so many possibilities to enrich experiences — make them tailor-made, make them dynamic, make them layered, live, and interactive. So I think that's really cool. And I'm a fan of augmented reality, because I think in a way it's much stronger than VR: it doesn't close you off from reality. It's always hybrid, or you can choose for it to be hybrid or not. We play with that in our own app now — some parts are hybrid and some are just closed off. I really think that's a really cool tool. Being able to connect with reality in the in-between space that AR is, is very exciting, and we're definitely going to do more with that.

[00:55:35.362] Hazal Ertürkan: Yeah, and what I really like about AR is that you're seeing it in your environment. You're not completely somewhere different; there's still something relatable and familiar, so it doesn't scare people too much — it's kind of like grafting something new onto something existing, like a passageway. I like that. And in the future, these technologies will shape us quite a lot, I think, because when you look at design history, with each tool, humanity changes its daily practices, its life. So these tools will probably also shape us, and we will become, I don't know, probably something else in the future. All our practices, I think, will evolve around these. Yeah.

[00:56:24.096] Kent Bye: Nice. And is there anything else that's left unsaid that you'd like to say to the broader immersive community?

[00:56:30.620] Hazal Ertürkan: Okay, as I keep saying, I'm really interested in making people aware that we are not actually the most superior species, having all the control. Even in this app, you seem to be in a kind of god mode — you're creating something — but it can end up completely different. That's also what I want people to be aware of: we feel like we are fully in control, but we are not. And as a community, I don't know, we need to be positive about the future, because these days it's a bit hard to be hopeful about change, about whether it's going to be better or worse. But, I don't know, life always finds its way, and that gives me hope. Yeah, these are my thoughts.

[00:57:22.040] Marcel van Brakel: I mean, we're here at IDFA, and there's always this great playing room of all kinds of stuff happening — stuff that you never anticipated was possible. I'm always very thankful when I visit something like IDFA DocLab or another festival in the immersive industry; I always get surprised by new things, things I hadn't thought about. And I hope the industry stays that open, and also stays invested in hybrid stuff — the in-between technologies, the in-between possibilities. There are a lot of festivals that are only VR, VR, VR, VR, and that's kind of boring. What's really nice at a festival like this one is having a performance about drinking DNA and that kind of stuff. I think that's the cool stuff. We can have more of that, hopefully. Awesome.

[00:58:17.259] Kent Bye: Yeah, I'm a big fan of IDFA DocLab, and coming here each year always expands my idea of what's possible at the intersection between art and technology and storytelling, reflecting on the creative treatment of actuality, as in John Grierson's definition of documentary. And I just really appreciated Future Botanica, especially looking at the images in the installation. With generative AI, you tend to get a lot of stuff that has the same aesthetic or look and feel, but I feel like you were able to tune it in a way that gave me a sense of awe and wonder, which is really the magic of generative AI when it's able to mash up these different archetypal forms into something new or novel or interesting. And yeah, I think it's a challenge to create a whole ecosystem experience out of it. The app is a great start to even generate and prototype that, with the next steps being to make you feel like you're part of a larger ecosystem and see how the plants interplay with each other — some of the next steps we talked about. But I just really appreciated having a chance to play through it, and to have a chance to talk to both of you today to help break it all down. So thanks so much for joining me.

Marcel van Brakel: Again, thank you so much for your time, and for giving us such a platform and time as well. It's really nice to have a long conversation.

[00:59:37.133] Hazal Ertürkan: Yeah, it was also super nice talking with you, getting different insights from a person who experienced it. Thank you very much. I really enjoyed it.

[00:59:45.898] Kent Bye: Thanks again for listening to the Voices of VR podcast. And I really would encourage you to consider supporting the work that I'm doing here at the Voices of VR. It's been over a decade now, and I've published over 1,500 interviews, all of them freely available on the VoicesofVR.com website with transcripts. This is just a huge repository of oral history, and I'd love to continue to expand it and keep covering what's happening in the industry — I've also got over a thousand interviews in my backlog. So there's lots to dig into in terms of the historical development of the medium of virtual and augmented reality and these different structures and forms of immersive storytelling. So please do consider becoming a member at patreon.com slash voicesofvr. Thanks for listening.
