I’m diving into my backlog to publish three unpublished interviews with Alex Coulombe, Creative Director of Agile Lens Immersive Design, starting with one recorded at Magic Leap's LeapCon 2018. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Podcast: Play in new window | Download
Rough Transcript
[00:00:05.458] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So I'm actually going to be diving into my backlog to air three other interviews that I did with Alex Coulombe leading up to a conversation that I had with him at Meta Connect. So I've been doing the Voices of VR podcast for over a decade now. I was doing a daily podcast for a while, and then at some point I asked my listeners, hey, is this too much? Should I slow down the amount that I'm publishing? And I got a pretty resounding yes, you should definitely slow down. But I didn't slow down on actually recording all these different interviews. And so I've still been capturing all these slices of oral history, kind of real-time oral history. And sometimes when I air them in real time, then it becomes real-time journalism. But sometimes I don't have a chance to air all the different conversations and interviews from a trip. I've been trying to do that a little bit more lately, so that when I come back from an event, I try to release a big batch, sometimes 12 to 13 hours' worth of content covering all these different conversations that I've had. But I've literally got hundreds, probably well over a thousand, different conversations that have not aired. So it always brings me great pain to realize that I have not been able to publish all the conversations that I've had a chance to have over the years. And so this was a situation where I wanted to have yet another conversation with Alex Coulombe, but still had three unpublished interviews with him. 
And so I wanted to air this as kind of a backlog series in the middle of my Meta Connect coverage, just because some of the stuff that he's working on ended up being a pretty key turning point in not only his career, but also in trying to be the mediator between entities like Apple and Epic, as well as Meta and Epic, and in getting a chance to check out the Orion AR demo after bringing some of the Meta executives to Austin, Texas, to see the high-end real estate Four Seasons experience that he's had a chance to work on. So I'm going way back into my archive to Magic Leap's LeapCon, which, from other conversations that I had at the Snap Summit, Aidan Wolf said was the greatest AR conference that we've had yet, because there were so many creatives there. It was really Magic Leap's launch party, where they were launching their first Magic Leap One, but there were also a lot of demos showing the potentialities for Magic Leap and augmented reality. And Alex was there in the hallway showing some demos of the stuff that he was working on. And so I had this whole conversation with him at this intersection between theater, architecture, immersive storytelling, and the cutting edge of virtual beings, and where all this technology was going with volumetric capture. And this is the theme that I think evolves over the years that I've had a chance to watch Alex and Agile Lens Immersive Design. So I'm going to start with a conversation that I had with him at Magic Leap LeapCon in 2018, then a conversation I had with him at Tribeca in 2023, and then another conversation after having a chance to actually demo some of these different experiences at Filmgate International in Miami in 2023. So that's what we're covering on today's episode of the Voices of VR podcast. So this conversation with Alex happened on Wednesday, October 10th, 2018 at Magic Leap LeapCon in Los Angeles, California. So with that, let's go ahead and dive right in.
[00:03:19.280] Alex Coulombe: Hey, Kent. I'm Alex Coulombe. I'm the creative director of Agile Lens Immersive Design. We're a New York City-based consultancy that does all sorts of XR solutions, so VR, AR, originally just for the field of architecture, but now we're doing a lot more with theater, too.
[00:03:32.529] Kent Bye: So theater and architecture. Is it kind of pragmatic, like you do the architecture to get paid, and then you do the theater for the love and passion of the art?
[00:03:42.552] Alex Coulombe: Oh, I never thought of it that way. I'm definitely passionate about both. I studied architecture and theater and then worked for a few architecture firms, eventually working at an architecture firm that only designed theaters. And that was my way into VR. So when the Oculus Rift DK1 came out, I started using virtual reality to test sight lines in the theater, put different things on stage, an orchestra versus a play, and then start to make design changes, like maybe we want to move the balcony up a little bit. So it was a really active design tool in the architecture process. And then that led to doing other kinds of architecture projects that weren't theaters. You know, I go right to nightclubs for some reason, because that was a fun one, but all sorts of projects: museums, residential units. And then eventually we got a few opportunities to do more on the theater side. The first big theater project was for Kenneth Branagh, who was bringing a production of Macbeth from Manchester in England over to the Park Avenue Armory in New York City, where we're based. And so that was an opportunity to use early virtual reality to put a set design on stage, block out the actors, and change some of the locations in a way that was very custom to that production. And that was really satisfying, because architecture, of course, can take many, many years to see the fruits of that labor, whereas a theater production shares many of the same processes but can go up in a few weeks. And you can see those changes having a ripple effect on the final production very fast.
[00:04:57.915] Kent Bye: Yeah, I've actually been talking about and am very interested in this translation of space into experience. And I feel like people who design physical, tangible objects, or people who have been trained as architects, will have some unique insights into this spatial medium, because you have a lot of design philosophy for how to design geometries in space and translate that into experiences. And so it sounds like you've been focused on theater, and I'm curious to hear what types of insights you're taking from your combined theater and architecture background, and how those apply to virtual and augmented reality experiential design.
[00:05:33.103] Alex Coulombe: That's a fantastic question. I'd say one of the early pitfalls I saw with a lot of VR experiences being developed was a lack of narrative, a lack of guidance. I'd see people dropped into a VR experience and feel very overwhelmed or motion sick. And just like a good theater production can be a really powerful narrative, a good architectural experience, a good VR experience of any kind, is going to make you feel like there's some sense of clear progression for what you're supposed to be getting out of it. So we learned some early cues for early virtual reality, everything from how we would position the player at the beginning to the onboarding of it. We had some early experiments where we'd put you in a room that looked very similar to the real room you were in, and then you might open up a portal or something that would take you into the virtual version. It's the VR equivalent of what happens when the curtains close in the theater and then the lights come up. That's a transition that we find is very, very helpful.
[00:06:25.932] Kent Bye: So you're also looking at performance capture and the actual performative aspect of theater. Because it seems like we have film, and there's a certain acting style that you do in film, where you can try to piece together and construct a performance in post-production. But in theater, you don't have the luxury of cultivating and crafting that performance. You really have to do it live. And a lot of what I find with the virtual and augmented reality mediums is that the live nature of that performance is something that is much better suited for this medium, where it's long takes and you just do it all. But the mixed reality capture of that, of how to translate a character into an experience, is something where either you're doing motion capture or Windows Mixed Reality capture or DepthKit. Maybe you could say, from your perspective, what are the things that you're seeing here at LeapCon that are really pushing the edge of what's possible for doing performance capture within a mixed reality experience?
[00:07:25.293] Alex Coulombe: Yeah, and I'll give a quick background on what I've seen already in this kind of field. My company, in partnership with Kira Benzing and David Gochfeld, has been working with Philip Rosedale's company, High Fidelity, to start to explore what kinds of live events could really translate into VR, and that's primarily avatar-based. So you have people maybe wearing Vive trackers to start to have more natural limb motions, and there are some different technologies for facial translation, though honestly, in our experience, we've found that just giving actors masks can be much more effective in that kind of setting. But what's going on here at LeapCon is there's some really exciting work being done with volumetric capture for the Royal Shakespeare Company, and then the Imaginarium, Andy Serkis' company, has a really exciting demo that is very similar to something they showed at GDC with their Vicon setup, where you can have an actor who, in real time, is translating everything from their body to very minute details on their face onto an avatar's performance that's much more high fidelity than anything you would get with Vive trackers or other currently existing technology, especially on a platform that's meant to support many, many people right now, like High Fidelity or VRChat or Altspace, which needs to reach a certain common denominator to accommodate all the people in there at the same time.
[00:08:38.601] Kent Bye: Yeah, I had a chance to see the Royal Shakespeare Company's experience of, I guess it was Shakespeare's Seven Ages of Man. It was one little snippet, but it just had a really powerful emotional impact for how they were able to add the character with some animation in the background that was symbolically reflecting the death and birth cycles being represented by nature metaphors, I guess you could say. But also there was Mica, the live-action immersive theater virtual avatar that had this stylized dimension to it. So I think there's going to be this combination of the photorealistic Windows Mixed Reality capture, and I've seen the early demos of Reggie Watts, and something like what's here from the Royal Shakespeare Company is really high fidelity. It looks great. But there's something about the stylistic animation that it's juxtaposed with, where there can sometimes be a little bit of a fidelity mismatch that my brain's expectations don't quite match. And so there are things like DepthKit that can allow you to do low-cost volumetric capture but add a level of stylization to that, like the Imogen Heap performance that she did for TheWaveVR using DepthKit, adding all these different particle effects and different aspects of a style on top of it. And so that's what I'm curious to see: there are these trade-offs between the photorealistic on one end, where you have a high expectation for what that needs to reach in order to really make it feel like it's there, 
versus doing something that's stylized, where your brain kind of knows that it's not real, and it allows you to have a little bit more suspension of disbelief. That's a little bit the approach that the Mica experience took here at LeapCon, which was to have sort of a cartoonish stylization, but it allowed me to feel like I had more of an intimate connection to this interactive character that was responding to me, rather than something that was captured. So I think there's a spectrum of different things, and depending on what you're trying to do, there are different technological decisions that you can make to get different reactions.
[00:10:36.839] Alex Coulombe: Yeah, you're hitting on so many exciting things that I'd love to dive into in detail, probably at a different date. But I'd love to talk very briefly about how you felt, because you've seen so many of these. I'm thinking about film festival experiences where there might be a live event that's maybe kind of a theater piece, but it's very small. You might be the only person in the experience. And to me, one of the powerful things about theater is that it's live and that you're experiencing it as a shared experience with all these other people. You're there together in the moment, maybe everyone crying. I saw the most world-shattering theater of my life when I studied abroad in London 10 years ago, and I've kind of been chasing, is there any way to do something like that in VR as a shared experience? And so, you know, when you've seen something like Draw Me Close or Dinner Party, those are pretty private experiences, but they do have a live-action component. And I'm curious how you feel something like that might potentially scale. Do you think something like that could be effective with a lot more people?
[00:11:28.231] Kent Bye: I think the trend that I suspect things are going to go toward is live, real-time, one-on-one, intimate interactions. Something like Then She Fell in New York City can only have like 14 people, but it's really this clockwork architecture, meaning that there's a lot of one-on-one time that you have with the actors. With something like Sleep No More, you go there with the hope that you might be selected to have this one-on-one intimate encounter, but it's really kind of random, and only a couple of people get it. But Then She Fell was really architected to have that one-on-one intimacy. And something like Draw Me Close was one-on-one with an actor, and that actor can respond to you, and you were co-located in the same environment, and there's actually some physical touch: she gives you a hug in the course of this experience. But it's in a stylized virtual reality, completely occluded experience. And so, in talking to Yelena Rachitsky, one of the things that Oculus is working on is working with live actors from Third Rail Projects, which produced Then She Fell. Those same immersive actors were involved with Wolves in the Walls. There's so much lived, embodied experience for how to cultivate using your body relative to other people in space, and how to read body language and react to that, and they're starting to systematize that into CGI characters. In Wolves in the Walls, they're using the motion capture but doing a little bit more of a dynamic interaction for how you cultivate engagement, because there are different player archetypes that are going to have different reactions. And so the theater actor has to be able to respond in the moment. And one of the actors that I have an unpublished interview with was actually featured as Lewis Carroll in Then She Fell. 
The thing he said is that in all of the hundreds and hundreds of shows he did in immersive theater, there was something in every single show that was unique and had never happened before, and he had to respond to that. And so I think that when we think about this, it's going to be less about how do we create this AI character that's going to be so robust as to be able to replicate a human, and more about how do you create these immersive theater experiences, which may be something like The Diamond Age, where there are these actors, or "ractors," who are going in and actually facilitating these types of experiences. So that's where I see things kind of going: real-time interactions, real-time volumetric capture, the motion capture that's going to allow you to do that stylization, but with the interactivity where you're going to be able to have this dynamic response, something that's going to be difficult to code into AI. So that's kind of what I see. Do you think immersive theater is a good template for what could be possible in VR or AR? 
I think so, yeah, because I think that immersive theater is the spatial experiential design practice right now that is really focusing on narrative and agency in a way that comes from more of a storytelling perspective. Some of the most cutting-edge experiences happening right now aren't in VR or AR yet, but they're not scalable at the scale of immersive theater, because some of these things are only one-on-one intimate interactions. And so what I see happening is that these immersive theater actors are starting to collaborate with companies like Oculus and Fable, who did Wolves in the Walls, and they're starting to systematize some of those ideas and concepts into CGI characters. The user experiences and types of things happening in immersive theater are really at the forefront of what's possible, because you can just have pop-up experiences and there's not a lot of technological cost. And the people who are the storytellers with the theater background aren't necessarily Unity coders who know how to optimize 3D graphics and make a compelling VR experience, because they're all about the performance and not about the technology yet. But there's this fusion. The Immersive Design Summit, I think, is at the forefront of bringing together those immersive theater people and the technologists. More and more, I think we're going to start to see this cross-pollination between immersive theater and virtual reality and augmented reality, and I've already started to see it. But yeah, it's exciting to see how the theater and architecture that your background is in are kind of on this collision course, and that's already happening.
[00:15:34.571] Alex Coulombe: Yeah, and I'd love to put out just a few ideas for anyone listening, because I think there's so much more exploration that could be happening in this space, and me and my cohorts can't do it all ourselves. So anyone who's interested in this stuff, I want to see people doing these explorations. One thing that I want everyone to be thinking about is what happens when there's a live experience where the audience is mostly live, but a lot of the characters are NPCs, so the equivalent of an MMO. Or I think everyone should read a book called The Invention of Morel. Have you read that before? I'll give a very brief background on it. It's about a guy who lands on a deserted island. He thinks he's alone on the island. Then he thinks there's a native population there. I'm totally spoiling the book, by the way. And then he finds out it's not real people; it was a recording, like a 10-day loop or something like that, of all these people performing their actions over and over. And when the guy realizes he's alone, by this point he'd fallen in love with a woman. And to stop himself from going insane, he kind of inserts himself into the narrative. Because he remembers everything everyone's doing over that loop, he just starts to pretend he's a part of it. It's like Westworld. Yeah, it's very like Westworld, actually. And so, you know, there's something interesting to me about that. Could you have a theater performance where, A, most of the characters are prerecorded with a few live ones, and would the audience be able to tell the difference between the live ones and the prerecorded ones? But then also, High Fidelity has the ability for a performer to loop or build a performance all by themselves, the same way a musician might start to loop music over themselves and then add beatboxing and different instrumentation and different vocals. So you could have one actor start to play 12 characters, all part of a live show. 
And first of all, there's something really exciting about that for the actor, but I'm also curious what that becomes for the audience. And then, what's the ideal actor-audience relationship? In an immersive piece, of course, there is a limit. You don't want to get too close to the actors and invade their space. In something like Sleep No More, where you're kind of a ghost in the experience, it's something a little bit more passive. You want to be a part of it, and you chase everything around. But then I got to experience an immersive Gatsby in London, which was very participatory. You are a character living in a Jay Gatsby party, and that's something where you're talking to the characters, and you're running off and hearing about how Daisy feels about a certain thing, and everyone's confiding in you. And that could be a different kind of immersive experience that could translate well into VR or AR. But one thing that I really love is something that happens in both those experiences in the real world: there's a lot of running. You're actually moving around from space to space to keep up with the narrative. And as I said, you don't want to distract the actors. But in virtual reality, we've been playing with, what if your avatar is invisible, or your avatar is a speck of light or a speck of dust? One of my favorite things about a live theater piece is seeing world-class acting up close. If I can sit in the front row for a really powerful live piece of theater, that's world-shattering to me. And so I love the idea that as the fidelity of an actor's performance gets better in VR or AR, there could be the potential to get right up close to that and experience it in a way that is not disruptive to the actor, but still allows the audience to have this really powerful experience. And I don't think we're there yet, but I want to see these explorations happening.
[00:18:31.365] Kent Bye: Yeah, and it also reminds me of Fire Escape by iNK Stories. They did this experience where you're standing outside of a building, voyeuristically, like a Hitchcock film, where you're looking into the rooms of other people, but you can zoom in and listen in, and there's kind of this murder mystery that you have to solve over three different hours, or acts. But you have the ability to go back and listen, because everything's happening in parallel, so it's impossible to see everything. So I feel like there are these models in which you're able to use space and have parallel storytelling, where you are exploring an area or making choices as to what to pay attention to. But it's a fragmented experience, where you have to have the experience and then puzzle it together, either by yourself over multiple viewings, or you watch it with some friends and then share what each person saw. Then you're able to have this kind of postmodernist take where whatever each person experienced was their experience, but then you have to kind of meld it together and see if some sort of consistent narrative emerges, or whether you each hold different puzzle pieces. So it encourages that interaction afterwards. And they're actually going to be doing some live screenings of Fire Escape, with some people in VR and other people watching them experience it on a 2D screen. So the audience is going to be able to get one experience, and then I was like, wouldn't that kind of spoil it? And they're like, no, if they want to go back and see it again, then they're going to know what to look for when they go see it. So it's sort of this concept that there's so much happening in the experience that you could see someone else's experience of it on Twitch or whatever else, but then that would inspire you to go have your own direct experience of it, and then you'd be able to make your own choices and have your own uniqueness. And so I think how to architect a narrative around that is where things are going.
[00:20:09.212] Alex Coulombe: That's so exciting to me. And then it's also worth reminding people, of course, that as we start to move into this age where more of these live events, theatrical or otherwise, can take place in VR or AR, that allows for replayability and looping in a way that is more difficult in the real world. I've got kids now, and I work in New York City, but I would love to go back to Sleep No More five times and get every ounce of that experience. And for any kind of experience where there is a puzzle quality and you want to piece together a narrative, with a VR version of that, even if it's live every time, it's so much easier to take two hours out of your night from the comfort of your home to be a part of that live shared experience than to hire a babysitter and go through the hassle of all that. So I'm really excited for exactly the kinds of narratives you're talking about to make their way into immersive mediums where anyone can experience them.
[00:20:54.693] Kent Bye: Yeah, and after watching the Royal Shakespeare Company's experience in the Magic Leap and seeing Reggie Watts with the Windows Mixed Reality capture, I will say that there's something about photorealistic volumetric capture of a theatrical performance where viewing it on either the HoloLens or Magic Leap is a way better experience than seeing it in virtual reality, because you have the context of reality, and you're able to, I guess, see more of that human emotion, those micro-expressions, and that performative aspect that may not always translate into a VR experience, where there are different fidelities. So I think the fact that AR can have a fidelity match with the rest of reality is something that I see as a huge strength.
[00:21:37.452] Alex Coulombe: Yeah, absolutely. And the last thing I'll just mention is the actor side as well. You know, an actor feeds off their audience, and so they get so much energy from how they feel the audience responding. And so there's something really exciting to me about the different ways that an actor might be able to participate with the audience. The Reggie Watts experience, you say it was Windows Mixed Reality? I saw a Reggie Watts experience in Altspace. Is that different?
[00:21:57.661] Kent Bye: It's a Windows Mixed Reality capture. So it's Windows' sort of photogrammetry approach to doing a live performance. Christina Heller, I think, has a place here in LA to do that, but it's basically their proprietary method of doing capture. It's one of the best captures that I've seen out there. But yeah, it was on the HoloLens that I saw it, at Microsoft Build this past year.
[00:22:16.008] Alex Coulombe: Okay, and all I was going to say about the Altspace Reggie Watts experience was that it was kind of interesting, because Reggie, of course, could see maybe 50 audience members during that show, but because there were more than 50 people at that performance, they were all divided up into duplicates of that comedy club. And so there were some weird things that would happen where Reggie, as a comedian, would want to interact with the audience and say, like, hey, look at that guy's shirt or funny hat, and you'd be in a room where that guy wasn't there. So there's an interesting balance as well for how you allow the actor to have those kinds of moments where they can feed off an audience, while still allowing the shared experience to be as consistent for everyone as possible.
[00:22:51.738] Kent Bye: Great. And finally, what do you think is kind of the ultimate potential of virtual or augmented reality and what it might be able to enable?
[00:23:00.322] Alex Coulombe: I think it will be people sharing their souls. It will be an extension of ourselves, when someone can really personalize a VR or AR experience. Let's use Magic Leap as an example: we're headed in a direction where you can start to populate the world around you, the physical world around you, with digital content. And so if I create a world where I've left certain art pieces or certain tokens or puzzles or notes, like, here's something special that happened to me at this place five years ago, here's the place my wife and I met, something like that, that could be wonderful for me, in the vein of something like a memory palace. But then there's the idea that I could give that to someone who I care very much about, maybe my children or my grandchildren, and they could walk through New York City and have this experience of what was important to me during my life. There's a legacy or an immortality kind of quality to that that I think is really powerful. I'm excited for it.

Kent Bye: Great. Is there anything else that's left unsaid that you'd like to say to the VR or AR community?

Alex Coulombe: Kent, I've been listening to you since the very beginning, and I think you are the most valuable person in the VR and AR industry, and I would just like to take this opportunity to thank you for everything you've done.

Kent Bye: Awesome. Well, thank you so much.

Alex Coulombe: Thank you.
[00:24:08.188] Kent Bye: Thanks again for listening to the Voices of VR podcast, and I would like to invite you to join me on my Patreon. I've been doing the Voices of VR for over 10 years, and it's always been a little bit more of like a weird art project. I think of myself as like a knowledge artist, so I'm much more of an artist than a business person. But at the end of the day, I need to make this more of a sustainable venture. Just $5 or $10 a month would make a really big difference. I'm trying to reach $2,000 a month or $3,000 a month right now. I'm at $1,000 a month, which means that's my primary income. And I just need to get it to a sustainable level just to even continue this oral history art project that I've been doing for the last decade. And if you find value in it, then please do consider joining me on the Patreon at patreon.com slash voices of VR. Thanks for listening.