Tosca Terán (aka Nanotopia) is an interdisciplinary artist working at the intersection of ecology, bioart, mycology, and sound, and she collaborated with the Metaverse Crew and Sara Lisa Vogl (aka _ROOT_) in creating the “Mycelia” performance, bringing her fungi biosonification music into an amazingly immersive, audio-reactive VRChat world.
After the piece premiered at the AMAZE festival in July, Terán held a number of musical performances in which she translates the conductance of oyster mushrooms into MIDI, producing a form of ambient electronic music that she layers with her own live playing. My mind was pretty blown after hearing her describe the process of how she generates her music, and I wanted to sit down with her to do a super deep dive unpacking each step of the process (see the show notes down below for more details as you listen in). I had a chance to catch up with her the day before her Venice performance in September, and I saw the piece a second time at the Raindance Immersive festival, where it ended up winning one of the Spirit of Raindance awards for its innovative, independent, and pioneering spirit.
This ended up being quite a wonky deep dive into the audio production pipeline of fungi biosonification, but it also gets into some deeper thoughts about the implications of interspecies communication, the potential of using haptics, and how spatialized ambisonics and sonification could further explore biometric or physiological data from humans or non-human species. The Mycelia performance has been one of the more magical experiences I’ve had in VR, especially considering that it provides a portal into the biorhythms and proxies of consciousness of a non-human intelligence. So hopefully this conversation will not only help explain Terán’s creative process, but also help to inspire other bio-artists to continue experimenting and exploring the potentials of biosonification within the context of these immersive worlds.
SHOW NOTES
- InterAccess Slime Mold as Capacitive Storytelling
- Electricity for Progress – Sam Cusumano
- MIDIsprout – MIDI Sprout Biodata Sonification system
- Midnight Mushroom Music – SoundCloud
- April 2020 – Goethe-Institut New Nature Exchange
- An Immersive Media and Climate Science Exchange between Canada – Germany – Mexico – US
- Symbiosis and Dysbiosis – working with Haptic – bHaptics
- Ganglion Board from OpenBCI
- EmotiBit: Wearable biometric sensing for any project!
- HerpDerpinstine VRCBhapticsIntegration
- 555 Timer – acting as a galvanizer
- Pleurotus ostreatus – Oyster Mushroom
- 16MHz Crystal selection for ATMega328P
- Animoog app
- Euroracks
- CV Tools
- Nanotopian bio-sonification modules (outputs MIDI)
- Biosonification images
- Mycelia at Raindance Oct 29, 2021
- Mycelia at Venice VR Expanded
- Lingzhi mushroom – Ganoderma lingzhi
- Chaos fungorum: Nuit Blanche live performance & exhibit Sept 29 2018
- Coalesce: Center for Biological Art at the University at Buffalo
- Tangerine Dream
- Isao Tomita
- TouchDesigner
- Open Sound Control (OSC)
- Max (software)
- Holobiont is an assemblage of a host and the many other species living in or around it
- MIDI 2.0
- Brendan Lehman
- Mycelia in VRChat
- Poiyomi is creating Shaders
- Grone Drone Synth
- Moog Mother 32
- JOVE Filter
- Mutable Instruments Yarns
- Bastl 1983 Eurorack Module – Polyphonic MIDI to CV Interface
- Forest UnderSound Installation at The Museum in Kitchener, Ontario, for SONICA21
- Mellotron
- Moog Grandmother Semi-Modular Analog Synthesizer and Step Sequencer
- Moog Mother-32 Semi-modular Eurorack Analog Synthesizer and Step Sequencer
- Metaverse Crew
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye and welcome to The Voices of VR Podcast. So there's been a lot of club scenes that are happening within these virtual worlds, and it's a whole vibe in terms of going into these underground clubs and having these different sets and finding ways to translate the music into different visualizations. Well, there is an interdisciplinary artist named Tosca Terán who also goes by the name Nanotopia, and she was doing this biosonification of fungi. So in other words, hooking up these mycelia and the fruit of mycelia with oyster mushrooms, taking whatever signals they're putting out, and translating them into MIDI. And then that MIDI gets translated into a musical performance, which is then beamed into a VRChat world that was created by the Metaverse Crew. There's different audio-reactive shaders that are in this world, as well as some full-body-tracked interpretive dance that was happening by Sara Lisa Vogl, aka Root. It's a musical performance where, at the base, there's these mycelia that are playing music. And then on top of that, Nanotopia is doing other musical performances to create this biosonification and communication with these alien species of fungi. And this is all happening within these virtual worlds. So it was a pretty mind-melding experience to really unpack and understand all that went into creating this piece. At the end of each of her shows, she explains all the different ways in which she's hooking this stuff up. And so this premiered at a number of different festivals over this past year. I first saw it in July at the AMAZE festival. It also showed at the Venice Film Festival, and it also had a screening at the Raindance festival, where I had a chance to see it for the second time. And it actually just won the Spirit of Raindance Award, Raindance being a very independently driven festival, and the overall vibe and the ambiance of the experience was very much in that independent spirit of what Raindance is all about. So I had a chance to sit down with Tosca Terán to be able to unpack a little bit of like, okay, how do you take these fungi and then translate whatever they're doing into music? And so it's a little bit of a deep dive into lots of the different technical, wonky aspects of music production, but there's also deeper things about how do you start to take biosensors and start to do these different translations and visualizations and sonifications and create these virtual environments that allow you to connect to these deeper patterns of consciousness and nature of reality that may not be visible to you. So it's making the invisible visible and the micro macro. So thanks for coming on today's episode of the Voices of VR podcast. So this interview with Tosca happened on Sunday, September 5th, 2021. So with that, let's go ahead and dive right in.
[00:02:41.453] Tosca Teran: So my name is Tosca Terán and I'm an interdisciplinary artist, and I work in the confluence of ecology, like bio art, some craft, I suppose, because I do build, like, I work with metal and glass and I've done that for decades. So for perhaps maybe the past seven years or so, I've been really focusing a lot on biomaterials as an alternative to my metal and glass past, and kind of landed in mycology in a way. I'm not a mycologist, like, schooled or anything like that. It's more just a passion of mine, and looking at different substrates, things like this, and sound. Sound has always been a huge component in all of my work, as a soundscape or things that make noise when you touch them and things like that. So what I'm doing is I build purpose-built modules, circuits. So it's a biosonification module using electrodes that are placed within or on top of, depending on the type of electrode you use or make. It picks up micro-fluctuations in conductivity, and that's between a thousandth to a hundred-thousandth of a second. And it sends those into the module, which are then translated into MIDI, which is Musical Instrument Digital Interface. And that is how that information can be broken out to, like, synthesizers or DAWs, like a digital audio workstation, visuals; you can control video. So you can hook it up to yourself, of course, and have your own biorhythms controlling things, like people do that with OSC and brainwaves. And I just thought, wow, wouldn't it be kind of interesting to hook up fungi and see how it might control or what would happen, depending on, of course, I mean, you're setting the parameters and things like that. Yeah, so mycelium is just really fascinating to me. So I took a workshop in working with slime mold, Physarum polycephalum, at a place called interaccess.org. It's here in Toronto. It's like a kind of a new media makerspace. It's very cool. And the instructor was showing us how to have our own kind of slime mold pet, if you will. And slime mold is very fascinating. It's not a fungi. It's more of an amoeba type of organism, but it loves oat flakes. So if you place oat flakes around a maze, for instance, it will not necessarily do what you would automatically think it would. It might not go from oat flake to oat flake. It might circumvent where you want it to go. People work with it computationally and things like this. But anyways, this instructor suggested the slime mold perhaps being like a way for us to work with or collaborate with an alien species. And that just, for whatever reason, blew my mind in thinking about, wow, like I'm growing these gourmet mushrooms at home. And I had been looking at the substrate they grew on as a sculpting material, but I hadn't at that point really thought about what if I hooked them up, or sound, or what could they do. And so the modules and, like, the circuits, like I'm building these circuits, it is from an open-source schematic and Arduino code that's available out there through Electricity for Progress and Sam Cusumano, who originally designed it. And these were Kickstarted way before I even found out about it, actually. And it was called the MIDI Sprout. And the MIDI Sprout has evolved to become, I think, the PlantWave, and things that you can hook up via Bluetooth and things like this. So anyways, I found Sam's information through this person in Germany who has kind of expanded upon the circuit to add, like, control voltage capability out, to be able to hook up to, like, a Eurorack type of situation.
And then there are some Eurorack module builders, I believe in Scotland, that actually make a module that's based off of the schematic as well, but also kind of broadening it, just with more CV out. So back to my mind being blown by this slime mold and working with an alien species. So after I built this module, I had my electrodes with the biomedical pads on it. My partner and I had everything hooked up, and I'm like, okay, well, so I set these electrodes onto the mycelium. And to be honest, like, we didn't think... we thought maybe we'd get a ping or a blip or a bloop. We were completely blown away. It was like this melody, like this music started happening. And so, yeah, here I am. So I started up this SoundCloud, Midnight Mushroom Music, a kind of archive where I would set parameters, you know, like use the same synth voice, if you will, and not really alter anything at all. Just curious: do all fungi sound the same? Do different mushrooms have a different sound? And things like this. And just through different residencies in Iceland and Australia and places like this, I've been hooking up all kinds of mushrooms and trees and spending 24 hours with them just to listen to what happens. And then VR happened. I've always been interested in VR, like back into QuickTime VR days, which is kind of hilarious to even think about now.
[00:08:42.153] Kent Bye: And then the 90s with Apple's QuickTime. Yeah.
[00:08:44.514] Tosca Teran: Oh my gosh. Yeah. And just, I was really interested in that and what Jaron Lanier was doing, like with these instruments and these virtual instruments and things like that. But of course the technology just was not there until, I think, even 2016, when things were starting to emerge and people were starting to do really interesting things with it. So last year, 2020, when everything kind of came to a screeching halt, interestingly enough, to me, around, I guess it was April, the Goethe-Institut Montreal reached out to a number of artists throughout Canada, Germany, Mexico, and the States, creating what they were calling the New Nature Exchange. And so they brought us all together to kind of share ideas. And so the artists and technologists they brought, mainly the technologists, I should say, were a lot of people working in mixed reality, in VR. And I've created immersive environments, but not in VR, more just physical things that would take people into a different world. And so it was during this exchange that I was introduced to Sara Lisa Vogl, aka Root of NBRC. And I did a little bit of a performance for this event where I had mushrooms hooked up, and I would touch them and they would hear the sound changing. And I had this idea for, actually, a project that's being worked on called Symbiosis Dysbiosis, where we're creating a virtual forest of, like, a boreal region in Canada and a coastal rainforest region of British Columbia, where people can go in. This also involves a headset where we're tracking emotion and brainwaves. I'm working with a scientist on that called Brendan Lehman. He's a neuroscientist and he's worked with Muse and things like this. So he actually knows how to bring that kind of biodata into Unity. And so for that, we're working with a Ganglion board from OpenBCI and an EmotiBit, which was Kickstarted earlier this year. So that enables us to also bring in human emotion. So the concept with that is that people will have a one-of-a-kind experience, because it will be their own brainwaves and their own emotions. But then the fungi that's in it is sending its biodata in. So it's like the forest responding to people, the impact of human beings in a shared environment. Because this is such a long and convoluted project, you know, we wanted to see if we could get something like this happening in VRC, but there's just timing, and also with restrictions happening and stuff like that. But with AMAZE, I was wondering with Sara, like, I wonder if we could create kind of a situation that might be similar in the thought of mycelium controlling something, but in VRChat. Like, could we do that? How would that even happen? And so I think it was Sara who stumbled upon the Meta Crew in South Africa through South by Southwest.
[00:11:57.027] Kent Bye: The Metaverse Crew?
[00:11:58.994] Tosca Teran: Yeah, the Metaverse Crew, they're like wizards out there for sure. And so we'd put this mood board together of, like, a crystalline cave, you know, thinking of Werner Herzog's Cave of Forgotten Dreams, and mycelium and things like that. And it was just so amazing how quickly this world started forming. So meanwhile, I'm like, okay, well, I've said I'm going to somehow connect fungi to VRChat. How in the world am I going to do that? Because something, again, like for Symbiosis Dysbiosis, is we're working with a lot of haptic sensors, and not just like bHaptics, though bHaptics is sponsoring that project, which is awesome, but just kind of breaking out our own type of sensors and things that will be triggered by people touching certain things in the space, which sends data into the mushroom, and the mushroom receives it as a form of touch. And wow, like, it was my first experience with VRC. And I think VRC is fantastic and interesting in what you can do in there, and just how it can be, okay, real world versus entire fantasy psychedelic craziness, like just really wild. So I kind of stumbled upon, I think, HerpDerpinstine, their code for bHaptics and how they've been able to bring in sleeves and vests and all the different aspects. So I thought, okay, well, that might be, you know, a fairly quick and straightforward way, like we could just do this without having to figure out our own code. And so I'm working with sleeves. I mean, we do have a vest and, like, the whole thing, but what I was thinking is, so people come into the space, they can touch an object or touch my avatar, which will have these sleeves on. I don't actually have them on. I have them around the mycelium. And if you're familiar with the bHaptics gear at all, you know that those haptic motors are pretty intense. Like they really, when they go off, they're just like, you know, I think there's like 12 or something altogether in those sleeves. So it acts absolutely as a form of touch, perhaps intense touch. And then the mycelium can respond sonically. So the only issue that we're still working on, particularly me, you know, as a sound person, is the latency, like trying to figure out how to work around latency, like sending a stream to Twitch or YouTube, and then that going into VRChat. There's a little bit of it still, but we've updated a bunch of things recently in the world of Mycelia for the Venice performance, which is taking place tomorrow. And so far, I mean, I was doing all kinds of stuff and the people were hearing it. It was less of a delay, maybe a couple seconds versus 30 seconds, which, you know, is significant.
[00:15:00.603] Kent Bye: Yeah, that's a really good context setting the stage, because I think I first came across your work at the Mycelia performance that was performed by you as Nanotopia. I heard about it through Mike Salmon, who I think is helping on the Venice Film Festival, helping to produce, and he gave us the hot tip that the Metaverse Crew had created this really amazing world in VRChat. And what was striking to me, just going to this event, was that I saw the whole performance, and it wasn't until afterwards that I realized the degree to which everything that I was hearing was being generated by these fungi, which really just blew my mind: to learn about how you could have this biosonification for these fungi, to have whatever they're doing be translated into electrical signals that are then transmitted into MIDI, that's then put into a DAW, that's then turned into music, that's then input into this show that's being performed live. And I should mention that there's shaders all around the world in VRChat that are also a visualization of that biosonification, which gives you this additional layer of being able to visualize what kind of things are happening. And so there's another translation there of those input signals that are also somehow being translated, whether it's by frequency range, or we can get into what exactly is being input there. But I really want to have a better sense in my mind of this chain between the fungi, the sensor, what DAW, you know, digital audio workstation, you're using, and then if there's anything you have to do to kind of feed it into this live performance within VRChat.
[00:16:28.028] Tosca Teran: Okay. Yeah. So what I have here is a scaled-down version, because I do have to deal with something that happens as the mycelium kind of cultivates whatever substrate it's consuming and essentially living on: a lot of phenols start happening. So liquid can build up, and liquid is great, like, you want a humid environment so they don't dry up, but it can also be kind of a breeding ground for bacteria and other things. So our music studio space is kind of small, so we don't want, like, a fly sanctuary in here or anything like that. There are, like, electrodes here that are going within and plugged into the module. You know, you could hook up as many modules as you like, if you had a MIDI splitter for different channels or things like that. And there's a 555 timer chip on here, for people out there that know what that is. Here it's acting as a galvanizer. So it's kind of like a zero and one, you know, note on, note off. So there's a threshold that can be set to what's being picked up. The mycelium here is Pleurotus ostreatus. So that's oyster mushrooms, just your basic oyster mushroom. And then there's an ATmega chip. So that's our Arduino chip, and it has the code on it and stuff like that.
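To make the chain she's describing a little more concrete, here is a minimal Arduino-style sketch of the pulse-timing idea: the 555 oscillates at a rate set partly by the mycelium's conductance between the electrodes, and the ATmega times those pulses and watches for fluctuations. This is an illustrative sketch of the technique, not the actual Electricity for Progress firmware; the pin number, sample count, and threshold are made-up values.

```cpp
const int PULSE_PIN = 2;            // 555 output wired to a digital input (assumed pin)
const int SAMPLES = 10;             // pulses averaged per reading
const unsigned long THRESHOLD = 40; // microseconds of change that counts as a fluctuation

unsigned long lastAvg = 0;

void setup() {
  pinMode(PULSE_PIN, INPUT);
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(31250);              // the standard MIDI baud rate over the 5-pin DIN
}

void loop() {
  // Average several pulse widths to smooth out timing jitter.
  unsigned long sum = 0;
  for (int i = 0; i < SAMPLES; i++) {
    sum += pulseIn(PULSE_PIN, HIGH);  // microseconds the 555 output stays high
  }
  unsigned long avg = sum / SAMPLES;

  // A change in the organism's conductivity shows up as a change in pulse width.
  unsigned long delta = (avg > lastAvg) ? avg - lastAvg : lastAvg - avg;
  bool fluctuation = (lastAvg > 0 && delta > THRESHOLD);
  digitalWrite(LED_BUILTIN, fluctuation ? HIGH : LOW);  // "note" indicator LED

  // A fluctuation like this is what becomes a MIDI note downstream
  // (see the scale-mapping sketch further down).
  lastAvg = avg;
}
```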
[00:17:58.957] Kent Bye: So how's the pitch being changed? Because, you know, you have the on and off, but how is the mycelium changing the pitch?
[00:18:05.840] Tosca Teran: So what it's doing is, I mean, I can do this too. So, okay. The DAW that I'm working with is Ableton Live. And Ableton's really great for what we do in general. And so in the Arduino code, you can either create your own scale or you can choose, like, Western chromatic, and major pentatonic, minor pentatonic, things like that. Or, again, you could input your own, and then you can break out from that, of course, if you want, within Ableton. Or you could use something like... sometimes what I'll do if I'm out in the field, let's say, and I'm in a forest, I'll have maybe an iPad with me or something with just a straight MIDI recorder on there. So it'll just take in the raw data that I can later bring into a DAW or an app or something like that. Apps I work with, like different things people could use: pretty much anything that accepts MIDI in. So like Animoog, or, you know, Moog has a whole bunch of apps. Reason has some things like that. You can work with Logic, I think Reaper, but we've been with Ableton since the beginning of Ableton. So it just keeps getting better out there. So then, what we do as far as, like, bringing it into Euroracks: again, I mean, you can break out from Ableton now because they have CV Tools and things like that. But we also have modules that allow you to bring MIDI in. And so you can directly hook up, say, this biosonification module into the MIDI in. And from there, you can really go out into any kind of synth voice, filters, all kinds of different things with control voltage, or gates to trigger drums. For the DAW, I would say what I usually do, or even sometimes with Animoog, is, like, this mushroom or mycelium is incredibly active right now, and you can turn up the threshold on that. So each LED represents a note on a particular scale. And to be honest, at this moment, I don't even recall what scale I have on this chip. We often will work with, like, minor pentatonics and things like that. So it's most likely in a minor scale. And sometimes I might pitch it down in Ableton, like in the MIDI tools. And that's just me, like, preferring lower frequencies. So if I make a comparison: when I've set electrodes into a huge container, let's say, of slime mold versus mycelium, the slime mold is sending kind of like tonal clusters. Like it really is just a ping. And maybe you might get a couple more. My partner will call it, like, a one-note wonder. It's really just a ping, or even by chance. And this is something, I mean, I should say to you that I know other people are researching further, and I hope to as well, to really figure out what exactly, I mean, okay, we know every living thing has conductance. So it's really how, I think, you're setting up the scale and how each note, like, the notes are being read when they come through the Arduino. So you're sending it through that scale. And as the mycelium's information comes through, I'm trying to figure this out, it's hitting all these different notes or not. You know, sometimes it's very quiet. Oyster mushrooms tend to be very active, I find, compared to, say, Ganoderma lucidum, which is reishi mushroom, very popular right now in the medicinal mushroom thing.
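Here is a hedged sketch of the scale-mapping step she describes: quantizing the size of a conductivity fluctuation to notes on a minor pentatonic scale and writing them out as raw MIDI bytes over the five-pin DIN. The scale table, root note, and mapping function are assumptions for illustration, not the values baked into her chip.

```cpp
const byte SCALE[] = {0, 3, 5, 7, 10};  // minor pentatonic intervals, in semitones
const byte ROOT = 45;                   // A2 as an arbitrary root note

void sendNoteOn(byte note, byte velocity) {
  Serial.write(0x90);      // status byte: note-on, MIDI channel 1
  Serial.write(note);      // 7-bit note number
  Serial.write(velocity);  // 7-bit velocity
}

void sendNoteOff(byte note) {
  Serial.write(0x80);      // status byte: note-off, MIDI channel 1
  Serial.write(note);
  Serial.write((byte)0);
}

// Map the size of a conductivity fluctuation onto the scale: bigger
// changes reach higher scale degrees and octaves (capped at three octaves).
byte deltaToNote(unsigned long delta) {
  byte degree = delta % 5;
  byte octave = min(delta / 200, 3UL);
  return ROOT + 12 * octave + SCALE[degree];
}
```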
[00:22:07.082] Kent Bye: Yeah. So just to kind of recap, and to expand and ask a few more questions: you have the oyster mushroom, you have some electrodes in it, it's going into, like, an Arduino board that's translating it through a number of different chips and other things along that chain that are also creating different binary on-and-off signals, but eventually it gets into MIDI signals. Like, is it actually generating the MIDI from the Arduino board, or is it going into the DAW and then getting translated into MIDI?
[00:22:31.880] Tosca Teran: No, it's translated through the bio module itself. It's translated to MIDI. Like it's going out here. So like there's a five pin DIN that's coming directly out of the board here. And that is going into the computer. I don't know if you can hear this. Like I have it right now. Can you hear that? Yeah. Probably through my microphone.
[00:23:01.895] Kent Bye: It's like a drumming kind of rhythm.
[00:23:04.097] Tosca Teran: Yeah. I have it on a drumming beat, so you can better kind of hear it, and hear it change as well when I touch it.
[00:23:13.866] Kent Bye: Oh, so you're touching it and then it changes. Yeah. So in some ways you're kind of playing it like a musical instrument. One of the things I was wondering was, if you just let it sit there, is it able to kind of generate its own music, and can it hear the music that it's generating? I mean, it's hard to know what the consciousness of a fungi is, to really know if it's able to hear. But do you get the sense that it's able to play music or communicate to humans in that way?
[00:23:38.854] Tosca Teran: Absolutely. So when it does its own thing, like, we just have it hooked up, and just, like, right now I'm not touching it. I turned off the drum part. I don't know if you can still hear that. It's very kind of ambient sounds, but it does change throughout the day. I'm feeling these changes have to do with... well, it really depends. I mean, if you have the room lit, it's fairly active throughout that time while it's being bombarded by this light, versus if it's just in a general cycle, because usually mycelium is underground, right? It's fairly dark, I'm guessing, under the soil where it grows. So when we did the performance, I hope it's obvious, like, we're playing some keyboards and things like that. That's not the mushroom playing the keyboard. We're playing the keyboard, but we have it hooked up to filters and things like that in these Eurorack systems. And then, you know, we have keyboards and stuff like this that we're playing, and we're just letting it go. Sometimes I might've reached over and kind of poked it gently, and that would elicit, like, an immediate response, where otherwise it's just kind of... this energy is just being fed in. I'm hesitant to say it's like brainwaves or a heartbeat of a human, but, like, did you just hear that?
[00:25:06.363] Kent Bye: Yeah.
[00:25:06.643] Tosca Teran: It was a big change. I didn't even touch it. That's just... it's doing its own thing over here. So as far as it hearing itself, that's something I'm very interested in. I am asked that often. We did a performance for an event called Nuit Blanche, where we had the mycelium literally controlling six synthesizers. And it did feel kind of like this call and response was happening. It was really interesting. Was it also, though, picking up on the people in the room? For some open studios that I've had, where I'll have mycelium in, like, a large container or a sculptural piece connected with electrodes, playing, people could come in and they see it. As people approach, sometimes, like, they're not touching it, they're not doing anything, they're just observing it. And this is in an entirely contained structure, so the mycelium doesn't get contaminated, but the sound will change. So somebody comes up and the sound instantly changes. They move away and it goes back to the same thing. So, like, I had a whole group of people actually witness this: suddenly the music stopped, like, entirely quiet. So we all looked over at this doorway, and somebody was standing there. I don't know who they were. It just... the vibe, if you will, was really intense. And they came in, and the whole time they were in the space, there was no sound emanating at all from the mycelium. They were wanting to, like, open up a greenhouse I had in the studio and all this and that. I'd asked them, like, kindly, like, if they want to see, I'm happy to show them. They were a little hostile, and then they left. The moment they crossed through the door, the music instantly came back on. And there were a whole group of people in there to witness this, which was fantastic. It wasn't just me going, like, what in the world just happened? That was wild. It freaked some people out. They left immediately because they didn't understand. I wasn't standing by any synthesizers, like, tricking people. And I've noticed, too, sometimes when kids were approaching, it seemed to get very active and very frenetic. So I guess with the Symbiosis Dysbiosis project, we're also trying to figure out, like, is this a positive or is this a negative response? If it's in this fully contained polycarbonate structure, how is it seeing or recognizing or sensing, like, motion? Is it feeling vibrations? So it's super fascinating. I really don't know. So as part of this other project, I received a residency at the Coalesce Center for Biological Art at the University at Buffalo. They're going to train me in working with atomic force microscopes and scanning electron microscopes. And the concept is to really look at frequencies and more or less research and see what we might find out, like, what is really going on. Like, it doesn't seemingly have eyes. It clearly senses a lot, because, you know, they're the communicators in the forest. So that's definitely for Mycelia, and for other things that I'm working on that involve the biosonification of fungi, that it's being implemented in there as a communicator. So is it communicating? Some people definitely interpret it as that, like it's communicating with them. I think for the beautiful world, Mycelia, and the audio reactivity, like, even my avatar is audio reactive, and some of the dancers' avatars for the performances are also audio reactive. So they're responding to the mycelium sending out this music and things like that.
So we're trying to create that in a way that there's these communicative non-human things that are taking place, kind of, like, bringing the invisible to the visible, the micro to the macro. That's what I love about VR: I feel that you can create these worlds and people can enter them and, you know, experience it and also learn more about this fungi communication. Yeah, so it's really new, I think, too, in bringing the fungi, connecting it to VR and creating, you know, this mycelium network within a virtual world.
[00:29:35.673] Kent Bye: Yeah, well, maybe we could turn off the sonification, just because I got a sense of it, and some of it is getting passed through, like, a filter, so it's sort of cutting in and out a little bit over the Zoom call. But just by listening to it, you can kind of hear it has some variation, that sometimes it'll kind of go off and do its own thing for a moment. What it reminds me of is this generative art, or art that is kind of a perpetual machine of, you know, music. There's this whole movement of generative music that tries to kick stuff off, but have enough randomness that it's kind of generating the music from a lot of these processes without any human interaction, but it's able to kind of, on its own, generate whole levels of musicality. But here, it seems like you're not only trying to tune all the synths and everything to make it sound like it's not just noise music, although that's a whole genre and it's valid in its own right. You're not trying to create something that is in that type of genre, but it's more of a... I guess, what genre of music would you say that this most closely fits into, then, that type of stuff that you're generating?
[00:30:37.852] Tosca Teran: What we're doing? Some of it is ambient. I guess you could call it ambient for the Mycelia world. What could we call it? It's kind of like old school in a way. Like, I'm having maybe some difficulty thinking here of a specific genre, because, for instance, Tangerine Dream or Tomita used to, way, way back, be referred to as new age. And I wouldn't say it's new age at all, right? It was before maybe an electronic category really started, where people were thinking ambient music. I don't know if it's ambient. Like, what we like to do when we are collaborating, say, with mycelium, and in general, even if we aren't, is create kind of like a journey, like we're taking people through kind of a story. And that's how we're looking at it. We love the sequences and things like that. Like, we definitely are tuning everything, because I'm more into noise and glitch and things like that. My partner doesn't really like that at all. They're definitely someone that wants everything to be tuned, you know, like, let's make sure everything sounds really great. And a lot of things do shift, and even having the mushroom, like, plugged into it, things are shifting and stuff like that. So usually right before a performance, we're trying not to get too stressed out 45 minutes beforehand, making sure everything's tuned and sounding really great. So I guess ambient electronic? I apologize, because I don't really know off the top of my head. I think it's just electronic, but electronic is so wide and varied.
[00:32:15.023] Kent Bye: Yeah, well, I guess what it reminds me of is that in the context of electronic music, you have these random number generators that can be translated into music. So in some sense, the mycelia is putting out abstract signals, and it's really up to the musician to kind of figure out how to use that within whatever combination or genre. There's the human element there, you know. Who knows, if it were up to the mycelia, what kind of genre they would prefer to play?
[00:32:39.215] Tosca Teran: Right. Yeah. Field recording. Yeah. I don't know. Bad jazz. But yeah, I think that, okay, so because it's going through the biosonification module, and that is putting it into a scale, like, collecting that information, and that's being fed through a scale, then out through MIDI. And then of course the musician can change that and things within their DAW, within their Eurorack, or whatever they want to put it through. That's what I think is so cool about the MIDI aspect of this, is that you can then use it in something like TouchDesigner, or you could use it with OSC or, like, Max. You can do different things. So again, it can control visuals in Unity, similar to how brainwaves work and people use them in Unity. Like, you can have it moving objects and doing different things like that. Yeah, I think that's pretty fascinating. And the human element, or the holobiont element. And that could be, we're adding reverb or we're adding some delay to something. And that's usually the most I will do. For instance, when I just take the straight MIDI recordings and I upload those to SoundCloud, I'll say, like, this is what I'm doing to this. Okay, I say I'm not doing anything to it, but of course the synth voice I'm putting it through is adding an element, like you were kind of saying. You know, like, the synth voice is doing something to it, but you can definitely hear differences throughout different fungi and species and trees. And that's really interesting to me, just how you set all these things up. If you keep the same parameters, it's different. I haven't had a repeat performance, let's say, from the mushrooms.
[00:34:29.757] Kent Bye: Do you happen to know if this board, is it outputting MIDI 1.0 or 2.0?
[00:34:36.861] Tosca Teran: 1.0. I haven't updated anything yet. So it's just standard MIDI.
[00:34:43.372] Kent Bye: Yeah, just in looking at some of the new spec, it seems like there's going to be a lot more bandwidth. Like in MIDI 2.0, you can do pitch bending per note, or, you know, have this huge amount of information. And so as we move forward, I imagine that there's going to be just higher bandwidth of different types of input that you could take from, say, an organic entity like mycelia, and then take that information and have even more opportunities to do different types of spatialization or other things that you could start to use from MIDI 2.0, above and beyond what 1.0 gives you.
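To make that contrast concrete: in MIDI 1.0, pitch bend is a single 14-bit message that applies to a whole channel, so every note on the channel bends together, which is why per-note pitch bending had to wait for MIDI 2.0 (which also raises the bend resolution to 32 bits). A sketch of the 1.0 bytes, with an illustrative function name:

```cpp
// bend14 runs 0..16383, with 8192 meaning "no bend". In MIDI 1.0 this
// message affects every sounding note on the channel at once.
void sendPitchBend(byte channel, int bend14) {
  Serial.write(0xE0 | (channel & 0x0F));  // status: pitch bend on this channel
  Serial.write(bend14 & 0x7F);            // 7-bit least significant half
  Serial.write((bend14 >> 7) & 0x7F);     // 7-bit most significant half
}
```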
[00:35:15.551] Tosca Teran: Yeah. Yeah. Okay. So we do workshops, actually, myself and Brendan Lehman, the neuroscientist that's working with us. We're going to be holding a workshop in November, Biodata into Unity, and doing some other things with that. So it'll be biodata from living non-human subjects and also the human. So we'll be talking about our work with the EmotiBit, as well as OpenBCI boards. Like, the whole idea is just to expand that. You know, I know there's a lot of people really interested in: how in the world is this working? How can I use it in my project, or hook up plants or trees or things like that, and have them doing interesting things in my world? So yeah, and Brendan's held a Biodata into Unity workshop before. My partner and I show people how to build the biosonification modules in different ways, like either through getting a PCB board and buying all the components, or getting components from us, or also just breadboarding everything themselves with an Arduino Uno or similar.
[00:36:25.280] Kent Bye: Yeah, I just recently took a MIDI file of Beethoven's Fifth, and I put it in Reaper to do a whole ambisonic mix of that. And so I was playing around with what MIDI as a format gives you. And one of the things that I noticed is there's a way to translate MIDI into JSON to be able to look at the data. And MIDI has kind of built into it what the tempo is, and the tempo actually ends up dictating a lot of the pacing of how things are playing in it. I was able to translate the MIDI file of that song, which actually has like 60 different tempo changes for this specific mix of Beethoven's Fifth. But I changed it into SMPTE timecode information, to translate it into something that is easier to make sense of on a timeline, rather than kind of MIDI's tempo translation, which can be wonky if you have a lot of tempo changes. But is that something where it just sets a tempo and then from there it's able to play? Because I'd imagine if it were up to the mycelia, maybe it would have a completely different tempo for how it would be playing stuff, but everything's kind of quantized down to whatever the MIDI tempo is set on that chip.
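Here is a sketch of the tick-to-seconds walk Kent is describing: a Standard MIDI File stores tempo as microseconds per quarter note, so placing a tick on a real-time (SMPTE-style) timeline means walking the tempo map change by change. The 500,000-microsecond (120 BPM) default is from the MIDI spec; the struct and function names are made up for the example.

```cpp
#include <cstdint>
#include <vector>

struct TempoChange {
  uint32_t tick;          // absolute tick where the new tempo takes effect
  uint32_t usPerQuarter;  // microseconds per quarter note from that point on
};

// Convert an absolute tick to seconds, given a tempo map sorted by tick
// and the file's resolution in pulses per quarter note (PPQ).
double tickToSeconds(uint32_t tick, const std::vector<TempoChange>& map,
                     uint32_t ppq) {
  double seconds = 0.0;
  uint32_t prevTick = 0;
  uint32_t usPerQuarter = 500000;  // MIDI default tempo: 120 BPM
  for (const auto& tc : map) {
    if (tc.tick >= tick) break;
    seconds += double(tc.tick - prevTick) * usPerQuarter / (ppq * 1e6);
    prevTick = tc.tick;
    usPerQuarter = tc.usPerQuarter;
  }
  seconds += double(tick - prevTick) * usPerQuarter / (ppq * 1e6);
  return seconds;
  // Example: with ppq = 480 and no tempo changes, tick 960 lands at 1.0 s.
}
```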
[00:37:29.661] Tosca Teran: Right, we do quantize. Like, well, if we're performing, we're definitely quantizing to be able to tune everything and get everything. We don't have a lot of percussive things, perhaps because we're not wanting things to go off on, like, strange beats or things like that. So if there are percussive things, we're doing that with the oscillators themselves. But yeah, so with tempo, that's interesting now, because now I'm thinking about that and wanting to look at that and go back into the code. So we will set that, definitely. Yeah. And I don't know, but the scale is in there in the code, and velocity, like, you can of course play with that. And you can play with threshold, and, like, there's a number of different things, but the tempo is in the DAW and stuff like that, kind of what you're saying.
[00:38:26.186] Kent Bye: Yeah, for any new project in Ableton, it asks you to set the tempo. And I was having problems because this MIDI file had all sorts of different tempos, and it was, it was weird. The concept of how tempo works in MIDI actually models time in a way that a lot of other formats don't model time. So MIDI as a format actually is very flexible, to be able to drive all these other things in a time-based medium. Music is obviously happening over time, but I expect to see, perhaps with MIDI 2.0, more use of that within the spatial context, because it does have a lot of flexibility to be able to drive something that may have more of a time-based performance, synchronizing the visuals as well as what you're hearing. But it feels flexible enough that people are already using MIDI to drive visualizations like that. And maybe that's a good segue to talk about the other types of visualizations of the biosonification that you have within the shaders in this Mycelia performance that you're doing. Like, what type of input are you using to do that type of music visualization within the VRChat world and your performances?
[00:39:27.299] Tosca Teran: Okay. Well, there's definitely some other people involved that could speak better on that front. We are working with Udon, and we're working with shaders. There's a number of different shaders. Poiyomi is one shader that we really love, at least for a number of the avatars and the audio reactivity aspect. I have to say, I think someone that is better equipped to really talk about that is Tyson Cross, because Tyson, who's with the Metaverse Crew, South Africa, kind of did, if you will, the set dressing, a lot of that in the space, and also with Tontastic, who is really the audio person, like, working with Tyson and making sure certain things, you know, are implemented. I know that, you know, with the updates and everything recently with VRC and Unity, there were some hiccups, I believe, with the Udon link and things like that, just trying to figure out how we get the sound in. So I apologize that I am unable to really delve into that deeper. I know that Udon is what's being used, and then Poiyomi, who's amazing. And I think we all hooked up to their Patreon to get ahold of their shaders. And as far as, like, you're talking about time: so we do have things clocked. Like, we definitely work with a module, at least in the Euroracks, that is clocking, you know, so setting at least, like, BPM and things like that, so we can link the two at least together, and even Ableton Link. And I know, I think Ableton's standard beats per minute always wants to be 120, but then there's things, you know, you can change that. But if I think of it on the BPM level, generally, if we plug in the MIDI and we have the mycelium going in and we look at that, it's usually around 87 to 97. It's fluctuating back and forth. And often, I'm always very open to leaving it to just whatever this, I'll say, non-human being wants to kind of go with, instead of kind of upping it or doing things like that.
[00:41:45.750] Kent Bye: Well, I can just speak from my experience of the different shaders, which is that, you know, there's different theories of music visualization for what you're looking at in terms of frequency or rhythm or whatnot. And, I mean, rhythm detection within music visualization is actually a lot more difficult than you would think it would be; to actually detect beats is very algorithmically complicated. But I guess what I noticed was that they were very subtle. Like, when you're in those spaces, it's not so clear as to be able to correlate the music to what you're seeing, but when it shut off, you could notice that they're not there anymore. So it was almost like an environmental, subtle context that was there. It was pointed out to me, but unless it had been pointed out to me, I don't know if I would have noticed it, because it was so subtle. So it's the type of thing where, within virtual environments, you have the capability to translate musical input into some sort of visualization that helps you actually potentially hear the music at a deeper or more subtle level. And so finding different ways to translate what's happening from the mycelia into these sorts of visual representations: there's the sonification that you can hear, but seeing it visually could also potentially, in some ways, help you notice some of the more subtle patterns that you may not be hearing on their own. So I think it's an interesting conceit, and there's a lot of potential there. I think people also were saying, hey, when you play normal drum and bass music or whatever, if you have a wider frequency spectrum, then there's more for that music visualization to do and more to respond to. But because the frequencies were more bounded, I saw a little bit less visual complexity when it came to the music visualization in the piece. So that was a little bit of my own direct experience of that. And it's not important in terms of what was actually being fed into it, but that was, at least at the end of it, what I was experiencing.
[00:43:29.599] Tosca Teran: So, something like, for instance, earlier today, we were doing some tech checks because of updates in VRC and things like that; we want to make sure. We did find there was kind of a little glitch in the Udon at first, but we managed to fix that. So there is, like, a control panel in Udon. So you can, with the shaders, you know, attach it to whatever you want it to be making reactive. And Tyson was saying to me that what he found really interesting was, like, looking across the board at this controller. So it's showing all these different frequency levels. So we really are in this mid-tone area. Even if we have, like, these lower frequencies, it's not necessarily registering as this deep bass. Like, kind of tweaking these or whatever is how we can make things look more reactive and stuff like that, with the high frequency or low frequency. But even not doing anything at all, he was just showing me how intensely the sounds were registering in this mid-tone kind of area. And yeah, I think, too, we really wanted it to be not too obvious or in people's faces. Also, in Mycelia, I mean, there's a lot to explore within the environment. And so we're just really wanting people to feel comfortable, first of all, in there. And, you know, now you can go swimming. I don't know if you've been in there since, but you can actually literally swim in the water that's in the center, using the triggers on your controllers, and you're buoyant and you can splash. There's fish in there, and there's mushrooms people can pick and things like that. I don't think that was something that was very obvious unless people went directly up, and then you'd see that, you know, you could use that mushroom and carry it around with you, and it's sporing the whole time, which is kind of fun. And then also, right, we've created some avatars. Okay, I should say we created these because, with all the different elements that are happening in this world, it makes it kind of heavy. And so the more people come in, say, you know, in a giant dragon avatar or something like this, it could potentially crash the world. And when we're doing the performances, we were really hoping and striving to allow as many people as possible to experience that and not get kicked out into an instance where nobody's there. We have, though, I should add, set it up in the event that does happen, say, you know, 64 people are in the space tomorrow, and the 65th person comes in and goes to a different instance. While we won't be there, we have the performance from AMAZE streaming in there. So they will at least be able to experience that visually and also sonically, like, hear these things going on. I hope that doesn't happen. I mean, in a way, I guess we hope it happens, because that means we've sold out, but then we also really want everyone to be able to have the full... here's everybody in there, and this great dance performance and stuff like that that's happening. But yeah, so we're really registering on these mid-tones and trying to, I guess, kind of tie it into the environment, so it had more of a look of, say, this mycelium communication, you know, like things like energy moving throughout the space and stuff like that. So it's interesting to hear, definitely. Like, I'd love to hear more feedback from people, like, if they notice these things, what it made them feel like, you know. Am I seeing things? Is this really happening? Like, what's going on?
And just, if they did get that it's really tied in with the music and stuff like that, especially for newer people. Because I did have friends come in, and this is kind of their first time really experiencing VR. So that's interesting, to just be dropped right into, you know, a world like this in VRC.
[00:47:30.450] Kent Bye: Yeah. I think the challenge is to see what you're hearing, and what the visual corresponding aspect was. And it was, for me, too much of a gap between hearing what I was hearing and seeing what I was seeing; it was hard for me to kind of know that that's exactly what was happening. I think when you have pieces of music that have, like, you know, the beat drops or something that's real dramatic, I think that's an opportunity for you to have that kind of visual synchrony, and the multimodal synchronization of what you're hearing and what you're seeing all tied together. But the absence of that with the ambient music made it harder for me to really grok what was actually happening, what it was showing. So that was at least my experience. I'd be surprised if people really had very finely tuned awareness and perception to be able to correlate what they're hearing with what they're seeing. But I wanted to bring in a little bit of what I'm seeing on your screen, just talk about that for a moment, and then dive a little bit more into the VR experience, and then wrap up. But I'm seeing behind you, you mentioned earlier, these Euroracks, which are all these different physical synthesizers. So you're getting the MIDI signal out of this Arduino board that's connected to the oyster mushrooms. The oyster mushrooms are giving you the signal that's getting into the DAW, and then it's being fed out this MIDI signal that's being synthesized through the DAW, also being fed through MIDI to all these other physical synthesizers that are all these blinking lights and wires. It's like a whole epic rack of different types of physical synthesizers that you have there. So maybe you could describe just a little bit, like, what's happening after the DAW and interfacing with all these kind of physical...
[00:48:57.432] Tosca Teran: Okay, so the DAW is not being used in this performance. These synthesizers are entirely what's being used, over on this one Eurorack that's in the back, at least on the screen. So that's kind of my setup over there, next to the PC that's over there. And you can see my headsets up there. And so I set it up a little bit differently so I can wear the headset. I have it set up in a way, so, like, my hair and my head's pretty big, so I can wear it in such a way that it sits and still is tracking what I'm doing, so I can turn knobs and do different things like that while I'm performing in the VR world. So generally, under this desk is a large container of mycelium, which has electrodes coming out into different biosonification boards that have MIDI, like the five-pin DIN connectors, coming out of that bio module, going into a MIDI module in the rack. And then from there, I have control voltage going out to some of the synth voices and some of the filters, and there's gate triggers as well. So, you know, I can just be triggering a Plaits Mutable Instruments module, for instance. Or, like, in the beginning of the performance, we usually have drone sounds happening. That's through a module I have called the Grone Drone. So I have the mycelium triggering that, like an envelope, you know, there's envelopes kind of attached. And so the envelope opens and closes. So it's not exactly turning the module on and off, but it's modulating it a bit. And so we start off with the drone, just so it's kind of like, you know, when people enter the movie theater or a concert or something, you know, you're hearing something's happening. And so this is really the mycelium happening there, and all those sounds. So it's directly into that. And then we have different sequencers. We don't really have it doing anything to the sequencers, because that's a little trickier. There's some Moog Mother-32s. We have three of them set up, and we have those set so it's triggering some low frequencies that happen there. And I'm curious, I mean, later, after this talk, I'm doing more tests with the haptics and things like that for tomorrow, because there will be a big Q&A afterwards, and I'm going to somehow demonstrate in VRC, like, how this works. But anyway, so some of the low frequencies seem to come through, but not as deeply felt. You know, I think you'd have to have a subwoofer set up in the VRC space or something. But anyway, so we have the fungi controlling the drone, controlling one of the Moog Mothers, and controlling some of our other filters that do kind of harmonics that's happening. And that's through a JOVE filter, which is entirely controlled by control voltage out. So the module I use, which I'll just say is the Mutable Instruments Yarns, which I think is fantastic. There's also a Bastl 1983 that I have in an installation right now, for an installation called Forest UnderSound. And that Bastl 1983 only takes MIDI in, and then you send that out. So I have these huge containers, one that has native plants from Canadian boreal forest regions, and the other has mycelium and things like that. So that's been playing and changing since January this year at this museum. And that's pretty cool. For this, we're working with the Yarns MIDI in. And so with the Yarns, you can set up whether it's polyphonic or not, like, how it's sending output, which is great, I feel, at least for the Eurorack. So that's what I'm using to then send... you see all these patch cables, it's total chaos.
But yeah, so they're breaking out and going into different things. And really, I want to say, just for the Grone Drone, what's really cool is it's not a synth voice. It's just literally, like, an oscillator. It has an oscillator in it and some LFOs and stuff like that. But it's just taking really the fungi MIDI data in directly. And so those sounds, these very strange, kind of dark sounds, are being created. And then we build upon that as we start playing. And primarily, too, we're really playing with it because we do have the whole interpretive dance performance that is happening with Root and some of the particle dancers. So that's something, too, that maybe might be good for me to mention. So you see there's this mushroom fungi being that's, like, dancing above me on these invisible platforms that we've created on these different levels, with portals that they can, like, jump in and zip around to these other portals and things like that. So I believe just the primary dancer, Root, is fully tracked, like, their feet are tracked, you know, everything's going on there. So you can really see that they're dancing. And so they're interpreting the whole mycelium, and they're sporing, and there's spores coming off of them. And then there's these particles, or these, you know, flashing lights and ribbons and things you see moving around them. Those are actually people that are in these very cool avatars that were created by, I believe, Deke online. And I think Tyson created some of those, in fact. Like, they look like they're swirling around her and things like that. So I think what's happening is there, you know, when you pass through avatars and stuff. So if people just look at this, I don't think in the immediate you would realize that all those particles that are flying around, or these spores, if you will, are actually dancers, but they are people that are in these really cool avatars that are dancing around this incredible-looking fungi woman. They're very beautiful avatars that were created for that. So yeah, so the Euroracks, they have a lot of synth voices in them. We are just working with three. Like, I play the Mellotron, which, we have a Mellotron back there, and then we have the Grandmother. So the Grandmother is played, and then the Mother-32s, and then we're turning some knobs.
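For reference, the core math a MIDI-to-CV module like the Yarns or the Bastl 1983 performs is simple, because Eurorack pitch follows the 1-volt-per-octave standard: each semitone is 1/12 V. A minimal sketch, assuming middle C as the 0 V reference (modules differ on this choice):

```cpp
const int REF_NOTE = 60;  // middle C mapped to 0 V (a common but not universal choice)

// Convert a MIDI note number to a pitch control voltage on the 1 V/oct standard.
// Example: noteToCV(69) == 0.75, since A4 is nine semitones above middle C.
double noteToCV(int note) {
  return (note - REF_NOTE) / 12.0;
}

// A gate output is simpler still: it stays high while the note is held,
// which is what opens an envelope or triggers a drum voice downstream.
double gateVoltage(bool noteHeld) {
  return noteHeld ? 5.0 : 0.0;  // typical Eurorack gate levels
}
```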
[00:55:42.282] Kent Bye: So just to clarify: underneath this table is all the mycelia, and it's hooked up into these Arduino boards with a MIDI output. How many different MIDI inputs are you taking in from the mycelia in order to do a performance?
[00:55:54.567] Tosca Teran: What I like to do is have at least four or five hooked up, so we can really split it out and have it, if you will, controlling more, like, actually making more of this sound performance. For the last performance at AMAZE, I had two boards coming in. There's some Euroracks above as well. Oh, so I have, like, maybe three right now that I can see in those cables. So there's three inputs that are going in.
[00:56:23.932] Kent Bye: Okay. So in some ways it's an orchestration of mycelium that you're taking in, and those inputs are going into a variety of different places, and then, you know, you're getting the sound. But then, presumably, is there like a master mix that you're then feeding directly back into VRChat, or where does the mix go?
[00:56:39.570] Tosca Teran: Oh, okay. Oh, so we have a big mixing board. That's also a chaos tangle of all kinds of inputs. There's also another mixing board up there that's taking some of the elements from the Mellotron that I'm playing, the Grandmother, you know, there's an effects pedal on the Grandmother. So yeah, we have another mixer over there, a mixer here, and then those are coming into our sound card. Also, too, I mean, right now we have everything, and what I'm talking to you on, off a very old Mac. And that's what's feeding it into Twitch, which then goes to VRC. And so I disconnected everything from the PC. I mean, it's a pretty mighty PC. It does run a 3090 on it. But I'm thinking you didn't make it to that performance, or you probably would have commented on that. I think it was the second performance. There were issues in the first performance where we couldn't figure out what was the cause of it, but there was a lot of strange feedback that was happening. So the tech crew were based in Berlin, then Tontastic had to fly to Dubai. So actually, that audio unit was completely in Dubai. Then we're over here in Toronto. And so, of course, we're trying to make sure everything's synced and working. So people were using Discord to chat directly into VRC. And I think it was the guys at FOMO TV who came in and said, okay, we're ready to go live. And then maybe five seconds: okay, we're ready to go live. Five seconds: okay, we're ready. And I went, oh no. And you heard this: oh no, oh no. And I couldn't laugh because... so I muted, I mean, I had my mic, I had everything muted. I even threw Discord off the PC, because I thought maybe somehow, like, I wasn't tied in; they were using it like walkie-talkies, right? I threw it off because I thought, I don't want to be... am I responsible? You know, is it somehow feeding back? Then we didn't have any mics on on the Mac. The people in Berlin said they didn't either, so we didn't know what happened there. So just to be safe, nothing is running through, you know, if I'm going to be performing, like, moving around and moving certain things here, definitely I'm not miked. But still, I wasn't sure if somehow Discord, you know, there's some hidden aspect that was feeding in some kind of weird loopback. And I actually noticed that happened yesterday in one of the tech checks, when I think we had YouTube open and Twitch open and something else. And the YouTube was just sending a stream link from my YouTube channel, but another video had opened in another window that wasn't even directly related, and somehow, even though that wasn't connected to OBS, I heard the feedback loop of this woman speaking about, I don't know, some commercial cereal or something, in VRC. So, you know, we're just trying to be as careful as possible so people aren't going, what in the world is this weirdness? Yeah, there's a cereal commercial happening. But yeah, so that's just standalone. So I know, like, we keep going in and double-checking and making sure there's none of that weird feedback loop, because that was very bizarre.
[01:00:07.108] Kent Bye: Yeah, as you go through all of that, it's like a snapshot of the early days of VR, all the different things that go wrong that you have to kind of debug. You know, with audio, because there are these different issues of latency and whatever else, we haven't seen as many of these types of live performances. Like, as you say, you end up having to send it all out to Twitch and then just pipe in the Twitch stream, and that's kind of a roundabout way to do it that introduces all sorts of additional latency, but it's at least a way that you can isolate it from having input that's coming in and have a little bit more control. So that makes sense. But that'll be a nice little snapshot of all the different things that people have to go through just to even get it to work.
[01:00:44.760] Tosca Teran: It's a lot. You know, I know there are other people out there that wish for this too, that we could just directly connect. Like, I've done a lot of different searches trying to figure out if there is a way to directly connect straight into VRC. Admittedly, this was just something I was talking about this morning, going, oh my gosh, could we have been doing it this way the whole time? So there is a Max patch that allows you to bring real-time Ableton into Unity. I mean, the Symbiosis and Dysbiosis project isn't going to be a VRChat build, but that is a way that we're able to bring some of that audio in. And so spatialization and things like that are going to be really pretty cool, I think, for that project. But this Max patch enables you to have everything happening in real time. And so I was talking about that with a colleague this morning, because they asked, can you use that in VRChat? I'm like, oh my God, I don't know, and I don't know how that would work, right? Because you're having to upload the world. So I don't know.
[01:01:57.099] Kent Bye: Yeah, everything's through Udon in VRChat.
[01:01:59.340] Tosca Teran: Yeah, so I don't know how you could keep any kind of connection in real time. But then there's the bHaptics, which is in real time. So there must be a way.
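To make that concrete: the usual pattern for streaming real-time control data from a Max patch or DAW into Unity is Open Sound Control (OSC). Here is a minimal sketch of that pattern in Python using the python-osc library; the host, port, and address path are hypothetical stand-ins for whatever a Unity-side OSC receiver would actually listen on, not details from the patch discussed above.

```python
# Minimal sketch: stream real-time control values into a game engine over OSC.
# The address path and port are hypothetical; a real setup would match whatever
# the Unity-side OSC receiver is configured to listen for.
import time
import random

from pythonosc.udp_client import SimpleUDPClient

UNITY_HOST = "127.0.0.1"  # machine running Unity
UNITY_PORT = 9000         # hypothetical port for the Unity OSC listener

client = SimpleUDPClient(UNITY_HOST, UNITY_PORT)

while True:
    # Stand-in for a value derived from live audio, e.g. an amplitude
    # envelope or a MIDI CC coming off a bio-sonification module.
    amplitude = random.random()
    client.send_message("/mycelia/amplitude", amplitude)
    time.sleep(1 / 30)  # roughly 30 updates per second
```

This kind of direct socket connection is exactly what a VRChat world can't do, since Udon doesn't expose arbitrary networking, which is part of why the audio ends up taking the Twitch roundtrip described earlier.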
[01:02:10.275] Kent Bye: Well, that's more taking the input from the world rather than putting something into the world. So it's taking what's being presented by the world and translating that, which maybe we should mention briefly. What is it exactly that you're doing with bHaptics in this performance?
[01:02:24.066] Tosca Teran: So the bHaptics is placed around, and there's a GitHub VRChat haptics integration that somebody has kindly built and put out there into the world for people to use, to be able to receive hugs from friends during the pandemic, you know, touch, just people missing each other and being able to do things like that. So I thought, what a perfect way to integrate touch, essentially people coming up and touching an object, or touching, well, it has to be an avatar right now. It's really cool: you drop in this bHaptics SDK and then it shows you the sleeves or the vest or whatever gear you might have from bHaptics, and you place that onto an avatar, and you can make it visible or invisible. I have them set to invisible, otherwise it looks like something's wearing some kind of tactile vest or arm sleeves or something. It can also be audio reactive, and that's something you can set up individually; bHaptics has its own special little app where you can change the sensitivity or change how the sensors behave. And the vest, for instance, is something outside of VRChat that we're working on to have with the mycelium, so people can kind of feel the mycelium moving around them. For VRChat, though, the haptic sensors are placed around the mycelium. So people can come up after the performance if they want to touch it, or even certain sounds can set it off. I've tried to change the threshold so it's more along the lines of people's voices rather than these deep basses or low frequencies, so people could potentially talk to it and have these motors go off. The motors are intense, like I mentioned a little bit earlier, and I found that especially if music is playing, they're constantly vibrating. So for this type of event it's a good idea to mess with those thresholds and have it either be on or off. For the Q&A that will take place, I'll jump into that avatar that has the sleeves on, and so people can hopefully interact with the mycelium. I've become, I should say, more sensitive towards, you know, I don't want to be like, on command, obey me, or whatever. But they're going to be around the mushroom, even though my avatar, and that's, in a way, a weird little disconnect: the avatar's wearing them in VRChat, but I'm not literally wearing them in the physical. I have them instead placed around the mycelium, to act when people touch me or talk to me. I'll hold out my arms so people can touch where the virtual sleeve is, and we'll see how we can hear that sonically. And of course, we've talked about latency, so there will unfortunately be a little bit of latency. But actually, when we did these tests yesterday, the latency wasn't so bad; the other people in there were hearing it fairly immediately. Maybe that has to do with connectivity, you know, or wherever you might be based.
[01:05:54.120] Kent Bye: Yeah, just to recount that, because it is a little subtle. I didn't notice the first time, but the second time I think I walked through it: you have the people in VRChat that have virtual avatars that are enabled with bHaptics integration, which means that when other people touch their avatar, they're wearing the actual vest and able to feel that hug or touch or whatever. But you're wearing a virtual avatar that has these virtual sleeves of the haptics, and instead of it going to your body, it's going directly into the mycelia. It's a way for people that are in this VRChat world to be able to touch your virtual avatar, but then to be directly stimulating the mycelia with a bHaptics vibration, to be able to have this kind of virtual touch from people around the world directly stimulating the mycelia and then seeing what kind of sound that produces.
[01:06:37.558] Tosca Teran: Yeah, absolutely.
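For a sense of what the threshold tuning described above might look like in software, here is a rough sketch that fires a trigger only when energy in a voice-range frequency band crosses a threshold, ignoring the low frequencies that would otherwise keep haptic motors buzzing constantly. It uses Python's sounddevice and numpy libraries; the band edges and threshold are illustrative guesses, not settings from the actual bHaptics app.

```python
# Rough sketch of audio-reactive thresholding: trigger only on energy in a
# voice-range band, so speech sets it off but bass and rumble do not.
# Band edges and threshold are illustrative, tuned by ear in practice.
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100
BLOCK_SIZE = 2048
VOICE_BAND = (300.0, 3000.0)  # rough speech range in Hz
THRESHOLD = 0.02              # arbitrary starting point; tune for the room

def on_audio(indata, frames, time_info, status):
    mono = indata[:, 0]
    spectrum = np.abs(np.fft.rfft(mono))
    freqs = np.fft.rfftfreq(len(mono), 1.0 / SAMPLE_RATE)
    band = (freqs >= VOICE_BAND[0]) & (freqs <= VOICE_BAND[1])
    energy = spectrum[band].mean() / len(mono)
    if energy > THRESHOLD:
        # In an installation this is where a haptic motor would fire;
        # here we just log the trigger.
        print(f"haptic trigger, band energy {energy:.5f}")

# Listen on the default input device for one minute.
with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=BLOCK_SIZE, callback=on_audio):
    sd.sleep(60_000)
```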
[01:06:39.399] Kent Bye: Okay. Well, as we start to wrap up, I want to get your sense of what it's like: you're doing this biosonification and really advanced experimental stuff with mycelia and music, but then on top of that, it's being transmitted into VR. So for you, what is it like to actually be in the virtual reality world in that context, and also to be able to bring in people from around the world to have this performance?
[01:07:05.718] Tosca Teran: Well, for me, it's really meaningful to be able to share these experiences with people, and VR, definitely, I think it's amazing. And I think the potentials are just, geez, I mean, your imagination is the limit, right? Just everything that you can do in creating fantastical worlds, different instances. What I have found through physical, in-real-life installations, unexpectedly, often, is people's reactions. For instance, in this one installation I created, the space was very kind and allowed me to bring in tons of dirt, literally, to try and create this underground world that people had to go through these scrims to enter, take off their shoes, put on special shoe covers, to go within this area where this huge mycelium sculpture lived and was creating the whole soundscape, and people could interact with this soundscape. And I would find people, if I left on a break and came back, sobbing and crying, so moved, and so wanting to talk to me about their experience. People have been having these incredible empathic responses to the non-human, and, interestingly enough, thinking differently about the whole shared environment: oh my gosh, I had no idea mycelium, or these things, are alive. I've had people say, when I've worked with plants, children too, surprisingly, that they didn't know it was alive because they didn't see it moving. And so it just creates this really great dialogue. Because I think now more than ever, it's really kind of vital, I feel, in my heart, because humans have just tried to remove themselves so much. And, okay, VR is creating these virtual environments where perhaps you could say people are removing themselves from nature, but I feel that VR, again, just allows you to create these experiences that can actually bring people closer to certain aspects of nature, because you can take them into the microscopic. You can introduce people that had no idea, that weren't even considering, what in the world is mycelium, because often people think it's a mushroom. Well, it's not the mushroom yet. It's this network, almost like neurons in your brain, of these tubes of hyphae that send nutrients around. So being able to create these virtual spaces, and work especially with the Metaverse Crew, we're all just very passionate about this project and really interpreting these things in different ways, and being able to have talks with people and share biosonification too. It's super exciting. I'm so excited about it. I just love VR so much. And then on the flip side of that, I'm a member of ecoartspace, and for some people that consider themselves kind of eco-artists, VR terrifies them. Somebody literally said, it terrifies me, the other day. And so I was just talking to him about, what is it about that that terrifies you? What movie did you see? Did you see Lawnmower Man, like, ages back or something? What really freaks people out? And again, okay, it's usually the unknown, or they haven't experienced it. And for me, I don't want to replicate too much of the real world in VR. I should say, with Symbiosis and Dysbiosis, something that we're doing, and we are actually going to take a little element of this and see how we can feed it into VRC, is we're working with a lot of point cloud data.
We're turning these forests into point clouds, which I just think is absolutely gorgeous. But the reason we're doing this is we're wanting to show these microbiomes, so the fungal biome, the forest biome, and the human biome, and how they interact. What do we leave behind? There are so many things we might not see that are in the air, well, that we definitely don't see, or we mediate. And I think with the biosonification, being able to bring that into Unity and have, say, these non-human organisms controlling these environments in that way, and then people's brainwaves being able to set up, are they in a good mood, a bad mood, are they scared? I mean, we don't want to scare anybody. I should say that with these worlds we're wanting to create nice, beautiful, calm, and safe environments, and just introduce people, I guess, to these other possibilities and other intelligence or cognizance and sentience. So yeah, I think for me, VR just has a lot of potential. And I think especially because of the restrictions and the pandemic, it feels to me like it kind of kicked it into this next level. There are a lot more people that I know of, anyway, that are definitely going into VR and very interested in it. A lot of artist friends are wanting to explore it more and see the possibilities of it. I think people are scared of this whole concept of the blockchain, you know, and what they hear about NFTs now, and I think the environmental impact is what they're thinking about. You know, we were all talking on a Zoom, so we're all in the computer already, and I don't think people need to necessarily be terrified of VR.
[01:12:49.906] Kent Bye: Yeah. Well, for you, what do you think the ultimate potential of virtual reality might be and what it might be able to enable?
[01:12:57.827] Tosca Teran: Well, I tend to think super science fiction. So I know that there are applications that are being used now in certain areas, I'm not so sure where, maybe Montreal, probably Europe more, MIT, I'm not really certain, where it's being used as a way to look at internal organs of people, like for surgeries, or being able to go into, say, the brain prior to a surgery. I think that if it's not already happening, it will be happening, I believe, where, imagine you're a brain surgeon and you can go inside of, literally, this virtual depiction of a brain. What is that movie where everybody shrunk and went inside of a human body? Kind of like that. I mean, you could create a simulation of something for surgical reasons, or for cancer even, to be able to go in and scan. Technology is getting there, right, where you can scan people and you could scan different things. I don't know if there's anything yet where you could scan the living internal organs inside of a person, but if you could go in and look at something on that level, for cancer reasons or even a virus like COVID, in a virtual realm, maybe that could help with how a diagnosis happens or how a surgical procedure happens. And space travel. I think even for people that might never be able to physically go to a place and experience something, even on this planet, like the pyramids, or going to Mexico, you can do this virtually. And, I mean, of course you have to have a headset and things like that, but there are programs now, lending libraries, where people are getting access to Quest 2s, I think, so the technology is becoming more accessible price-wise for a lot of people, or makerspaces and things like that have these headsets and ways for people to experience it. It's like in Dune: you can travel without moving. I know, maybe that's cheesy, but I think that it can give people access to places they'll never be able to afford to go to, or are physically unable to go to. And I think on a professional level, definitely for medicine, and being able to even explore, like say they map Mars. I know that kind of technology is being worked on here in Canada through the Canadian Space Agency, where they're going to be scanning asteroids and scanning planets and creating a virtual replication of that, and whether they're then going into that with a VR headset, that's really fascinating to me. Yeah.
[01:15:57.076] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the broader immersive community?
[01:16:02.158] Tosca Teran: Oh, just keep on making that incredible work. And yeah, I think the Meta Crew, South Africa, for people that are interested in VR, they're very open, the Meta Crew, and I think that they're worth looking up and maybe becoming a member, if you want to learn more about certain technologies, or how to create an avatar, or how to even create a world in VRC. VRC is a great place even just to use as a testing ground for a potential world you might want to build that's not necessarily in VRC.
[01:16:40.625] Kent Bye: Awesome. Well, thank you so much for coming on and sharing your process and your collaboration with these fungi and oyster mushrooms. It's just really fascinating to hear the whole workflow and this whole concept of communicating with an alien intelligence, and coming back to that in different ways, either the biosonification and listening and hearing, or the haptics integrations and the way that we can feel other entities, or they can feel us, or they can hear us. Really fascinating frontiers in all this stuff you're working on. So thanks for being on the frontiers of it, experimenting and tinkering and bringing it into a VRChat world, and into some performances that are being featured here at Venice. Thanks for coming on and unpacking it all here on the podcast.
[01:17:22.855] Tosca Teran: Thank you. Thank you for inviting me and having me on. Yeah, thanks. I'm honored, actually, I really am. I love sharing this work, so thank you. Yeah.
[01:17:34.385] Kent Bye: So that was Tosca Terán, also known as Nanotopia. She put on a musical performance that premiered at the AMAZE festival in Berlin, was shown at the Venice Film Festival, and also won the Spirit of Raindance award at the Raindance festival. And she's an interdisciplinary artist that's looking at the intersections of ecology, bioart, biomaterials and alternatives to metal and glass, mycology, and sound. So I have a number of different takeaways from this interview. First of all, there's just the idea that you could be communicating with these alien intelligences through this type of biosonification: finding the different biorhythms that you can measure, translating that into MIDI, and then going through all these different sequences to be able to put a timing aspect to it and to shape the different pitch sequences. And so there's certainly a lot of collaboration here from the artist and the musician to be able to determine what the overall quality of the sound is going to be. But on top of that, you have multiple fungi that are being fed through all these various chains, getting translated into MIDI, and then going into these different Euroracks. I made a big long list of all the different biosonification modules and gear that she mentions here; it's a little bit much to recount all the specific things that she's pointing to, so it's a little bit of a how-to, but I really wanted to dig deep because I just wanted to understand, okay, how is this happening? One of the things that was interesting is that there's a 555 timer chip, and also within the DAWs and everything else in MIDI, you set the BPM. So I was really wondering if it was getting quantized to that BPM, or if there was the ability for these mycelia to start to have their own rhythm, kind of like the polyrhythmic performances that they may want to do if it were left up to them. Also, there's a big question as to whether or not these different fungi and mycelia can actually perceive and have consciousness about how these different signals are being translated into this type of sonification. Can they hear it? Can they have feedback where they're actually deliberately and intentionally controlling what is being made? That's pretty much impossible to test, and we may never actually know the answer to that question. But I think it was very interesting to hear how there's almost a little bit of extrasensory perception when it comes to these mycelia: as they're playing and different people come into the room, it may change the vibe. Maybe they're reacting to that in some way, or maybe it's just correlated and there's nothing more going on. But it's kind of fun to think about: what if there are some deeper layers to the nature of reality, and these mycelia are able to tap into non-local fields of consciousness that may be there? That's more speculative, but there's the idea that you're able to do these translations of what are seemingly random biosignals, translate that into music, set up the different parameters across all these different synths that she has set up and walks through in detail within this conversation, have different fungi hooked up to it, and hear the different quality of the sounds that come from these different species. And so are there certain fingerprints that we can start to determine based upon these fungi intelligences?
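To make the basic signal chain a little more concrete, here is a minimal software sketch of the idea, with changes in a conductance reading mapped onto MIDI notes, using Python's mido library. To be clear, this is not Terán's actual module, which is hardware built around components like a 555 timer and an ATMega328P; the read_conductance() stand-in, the pentatonic scale choice, and the scaling constants are all invented for illustration.

```python
# Software analogue of bio-sonification: map changes in a conductance
# reading onto MIDI notes. read_conductance() is a hypothetical stand-in
# for the real sensor front end; scale and constants are invented.
import time
import random

import mido

PENTATONIC = [0, 3, 5, 7, 10]  # minor pentatonic intervals in semitones
BASE_NOTE = 48                 # MIDI note number for C3

def read_conductance():
    # Placeholder for a real reading, e.g. over serial from a module.
    return random.gauss(0.5, 0.1)

port = mido.open_output()  # opens the default MIDI output port
previous = read_conductance()

while True:
    current = read_conductance()
    delta = abs(current - previous)
    if delta > 0.05:  # only sonify changes above a noise floor
        step = int(delta * 100) % len(PENTATONIC)
        octave = int(delta * 10) % 3
        note = BASE_NOTE + 12 * octave + PENTATONIC[step]
        velocity = min(127, int(delta * 500))
        port.send(mido.Message('note_on', note=note, velocity=velocity))
        time.sleep(0.1)
        port.send(mido.Message('note_off', note=note))
    previous = current
    time.sleep(0.05)
```

Notice that the note timing here is driven entirely by when the signal changes; quantizing those events to a DAW's BPM, or letting them fall where they may, is exactly the rhythmic question raised above.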
It was also really interesting to hear about the different arts organizations that are either funding this or involved with it: the Goethe-Institut with the New Nature exchange; different performances like Chaos fungorum, which was at Nuit Blanche; and different festivals that were showing this work, like the AMAZE festival, the Venice Film Festival, as well as the Raindance film festival. There's Coalesce, the Center for Biological Arts at Buffalo University, where there are residency programs for these different bioartists. There are different museums showing this type of work, like the Forest UnderSound installation at The Museum in Kitchener, Ontario for SONICA21. And yeah, there are just a lot of different musical modules that she's got hooked up here and wired together, plus lots of various open source projects, like using the haptics vest to do the bHaptics integrations, to have colliders and avatars with haptic integrations that would normally give you the sense of being hugged and touched, but translating that into stimulating the mycelia, so that touching an avatar activates it and lets you communicate with these mycelia. The other thing I think was striking, that I wasn't quite aware of, were some of the difficulties in getting the music into these VRChat worlds, and how there can be a danger of feedback loops where you're taking in information and maybe it's not as easy to globally mute people, so you may be getting feedback when you really just want information coming in. I'm sure there's a way to get it set up where you wouldn't have to rely on something like Twitch to avoid those types of feedback, but it was interesting to hear that they can't just pipe in the raw feed from the mixer. It sounds like there are some patches, like from Max for Live, that are able to do that, but they may not have proper integrations with the Udon scripts and whatnot. So it's still pretty early when it comes to all this stuff and trying to integrate these technologies into these performances. But I think there are broader trends here within VRChat, which are clubs and dance performances, trying to take the music and translate that into live-mixed VJ visuals, and something like the Ghost Club is doing a lot of that, with lots of different types of clubs and different approaches. I'm really curious myself about how to do this spatialization and turn things into these more volumetric or 3D sonifications that are somehow tapping into the deeper levels of the chord structures and the frequencies that are being played, rather than the amplitude, which is what you see a lot of. And I'd love to see a little bit more tuning into the circle of fifths and seeing how music could be spatialized in that way; there's a rough sketch of that mapping below. But I think that's maybe the next frontier of thinking about how to really spatialize these 2D shaders that are looking at the amplitudes and covering the whole area. But again, like we were talking about, the shaders were integrated into this world, which was created by the Metaverse Crew within VRChat, and it's a world that you can go visit. I'll put a link in the show notes. It was created by Tyson Axe and some other folks within the Metaverse Crew.
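As a hedged illustration of that circle-of-fifths idea, here is a small numpy sketch that estimates the dominant pitch class of an audio frame and maps it to an angle, so that harmony rather than raw amplitude could drive where a sound element sits in a spatialized scene. The input frame is synthetic; none of this comes from the actual Mycelia shader setup.

```python
# Sketch: map the dominant pitch class of an audio frame onto the circle
# of fifths, yielding an angle that could position a visual or spatial
# element by harmony instead of amplitude. The frame below is synthetic.
import numpy as np

SAMPLE_RATE = 44100

def dominant_pitch_class(frame):
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / SAMPLE_RATE)
    peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    midi = 69 + 12 * np.log2(peak / 440.0)     # Hz to MIDI note number
    return int(round(midi)) % 12               # 0 = C, 1 = C#, ... 11 = B

def fifths_angle(pitch_class):
    # Each step of 7 semitones advances one position around the circle
    # of fifths; 12 positions cover the full circle.
    position = (pitch_class * 7) % 12
    return 2 * np.pi * position / 12  # angle in radians

# Synthetic test: a 440 Hz sine wave (the note A).
t = np.arange(2048) / SAMPLE_RATE
frame = np.sin(2 * np.pi * 440.0 * t)
pc = dominant_pitch_class(frame)
print(f"pitch class {pc}, angle {np.degrees(fifths_angle(pc)):.1f} degrees")
```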
The world itself is really quite beautiful for doing these different types of underground performances. Also, I think it's worth pointing out some of the interpretive dance by Sara Lisa Vogl as she was dancing up in the air, with these particle effects going around her that were embodied by different characters. There's a moment, the second time that I saw it, where she started pole dancing, and she's just twirling around this pole, but you don't see a virtual pole; you just see her body moving around like she's clearly on a pole in real life. And it was just this surreal moment of seeing the blend of the virtual and the real, and knowing what was happening in real life. Yeah, just having that full body tracked, and the fluidity of the motion there. And I think it also added a lot to the show to have her dancing around up on these invisible platforms. And in the actual room, there's the ability to find all these different nooks and crannies. I liked to just go up to the second floor to be able to look down and watch the dancing from above, and then jump down. It's just kind of fun to explore around this space and see it from lots of different angles. It created this whole vibe; it was just a really amazing, immersive experience. And it was something else to know that the sounds I was hearing were coming from these other intelligences that were not human intelligences, with the way it was all wired up to be heard as this biosonification of all these different mycelia and fungi. So it's a really interesting, innovative project, and it got me interested in checking into what other types of bioartists are working on, and this concept in general: taking biometric and physiological information and translating it into either sonifications or visualizations, so that you're able to walk into these immersive experiences and get a deeper sense of yourself, or, in this case, a sense of these other intelligences, these fungi. And what kind of quality of experiences are you able to have there? And in the future, I hope we find more ways for the audience in these virtual worlds to have more interactive and participatory dynamics with these fungi. So that's all I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. If you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening!