Ramez Naam is the author of The Nexus Trilogy of sci-fi novels, which explore the moral and sociological implications of technology that can directly interface with the brain. He gave the keynote at the Experiential Technology Conference in March, covering the latest research into how these interfaces could change the way that we sleep, learn, eat, find motivation to exercise, create new habits, and broadcast and receive technologically-mediated telepathic messages. I had a chance to catch up with him after his talk, where we survey existing technologies, where invasive technologies are headed, the philosophical and moral implications of directly transferring data into the brain, and whether or not it’ll be possible to download our consciousness onto a computer.
Kevin Mack is an Oscar-winning visual effects artist and abstract artist who creates digital spaces with fluidly moving textures that are awe-inspiring in their ability to create a novel experience unique to VR. In Blortasia you float weightlessly, exploring the ins and outs of a series of tunnels that have a consistent topological sculpture but an ever-changing shader of patterned frequencies of rainbow colors that cultivate a sort of visual neural entrainment. It aspires to recreate a psychedelically transcendent or transpersonal experience that goes beyond what your verbal mind can easily understand, as there’s no content, message, story, game, or objective beyond providing an experience that’s only possible in these virtual worlds. It strikes a unique balance: an exciting and novel visual experience that’s also simultaneously relaxing, with the power to induce powerful trance states that may have unique healing properties now being discovered in medical applications for distraction therapy.
LISTEN TO THE VOICES OF VR PODCAST
Mack has a neuroscience background, and so he’s been collaborating with brain surgeons who are experimenting with using his Zen Parade 360 video as a hypoalgesic to decrease sensitivity to painful stimuli. It also suppresses the normal thought processes of the left brain so that neuroscientists can map out and discover new properties of our right brains. Preliminary studies are showing that his abstract design approach to distraction therapy in VR is actually more effective than other VR apps that were specifically designed for pain management.
Mack describes himself as a psychonaut, having experimented with a lot of psychedelic experiences, but he’s also studied meditation, lucid dreaming, and a number of other esoteric and mystical practices. His career has been in the visual effects industry, where he won an Academy Award for his work on What Dreams May Come, but with virtual reality he’s finally able to synthesize all of his life experiences and interests by allowing people to step inside his immersive VR art experiences, which are designed to expand the blueprints of our minds. He sees that verbal language has allowed humans to evolve our science and technology up to this point, but that it’s also limited us and constrained us with a whole host of verbal neuroses. He hopes that virtual reality experiences like Blortasia and Zen Parade can help free us from the shackles of our left brains, which he sees as inhibiting the deeper parts of our intuition and unconscious levels of awareness. He’s personally had a number of amazing but also traumatizing experiences with psychedelics, and so he’s trying to use virtual reality to replicate those transcendent feelings of awe and wonder that come from mystical experiences in a safer and more controlled fashion.
Mack also shares his out-of-this-world, retrocausality backstory, which includes a substance-free psychedelic experience with a time-traveling artificial consciousness that he’s just starting to create now with neural networks embedded within his art. Is it possible that Mack is in the process of actually developing a sentient level of artificial consciousness that will evolve to master the structures of space-time and bend the arrow of time? Or was it just the vivid imagination of a four-year-old that has provided him with a powerful inspiration for his entire life? Either way, his Blortasia experience has stumbled upon some important design principles stemming from the desire to create art that pushes the boundaries of consciousness.
Former employee Danny Bittman wrote about his brief time at Upload in a recent Medium post, and there were some women who spoke out in a BuzzFeed article in July, but beyond that not many people with first- or second-hand knowledge of the lawsuit allegations have made statements on the record. (You can find my Facebook posts about Upload since May here: 1, 2, 3, 4, 5.) Few people have been willing to talk about this issue on the record, but that seems to be changing after the latest round of news about the lawsuit settlement, which has left segments of the VR community very unsettled.
One woman from the VR community who was willing to talk to me about the community fallout from the UploadVR lawsuit was Selena Pinnell, who is the co-founder of the Kaleidoscope VR festival and fund. She is also a producer and featured participant within the Testimony VR project. I previously interviewed the director of the Testimony VR project about their efforts to use VR to create an immersive context for women and men to share testimony about their experiences of sexual assault so that audiences can bear witness to those direct experiences. Skip Rizzo has said that healing from PTSD involves being able to tell a meaningful narrative about your traumatic experiences while remaining emotionally present, and Testimony VR is trying to create a new form of restorative justice by capturing these stories within VR so that viewers can have a one-on-one level of intimacy while they bear witness. Pinnell talks about how powerful it was to have over 150 co-workers and friends witness her testimony about being a rape survivor within the context of a VR experience.
LISTEN TO THE VOICES OF VR PODCAST
While VR holds potential for the future of distributing new forms of restorative justice, this issue with Upload feels like it’s a long way from achieving a state of justice and a full accounting of the truth of what happened. Members of the Women in VR communities privately do not feel that justice has been served, and Pinnell voices those common concerns as to why she can no longer support Upload, as well as why, in her assessment, the leadership team of Upload never fully accounted for what exactly they did wrong and what they’ve learned.
She also says that it’s hard to trust the leadership after they originally declared that the allegations in the lawsuit were “entirely without merit.” Pinnell talks about how crushing it can be to have your testimony of your direct experience be so explicitly denied in this way, especially when it comes to taboo topics like sexual harassment or sexual assault. (Note that the original allegations against Upload were harassment, gender discrimination, hostile work environment, unequal pay, and retaliation; there weren’t any allegations of sexual assault.) Pinnell emphasizes how important it is to try to listen to women when they are providing testimony about not feeling safe within a work environment, and to try not to go directly towards demanding objective proof from a frame of skeptical disbelief. Learning how to listen, empathize, and reflect the truth of a direct experience is a skillset that is needed here, and it’s something that the unique affordances of the virtual reality community can help to cultivate through projects like Testimony VR. But there are many more unresolved issues and open questions that Pinnell and I discuss in a deep dive into new models of restorative justice and the community fallout surrounding the Upload lawsuit settlement.
ARQUA! was one of the ARKit launch applications designed by VR veteran Isaac “Cabbibo” Cohen, and it has the same indie charm and shader art aesthetic as his previous VR experiences Blarp! and L U N E. ARQUA’s gameplay involves creating a rainbow aquarium by placing kelp plants, schools of fish, and 3D rods around your space, turning your body into the controller. Cabbibo is really interested in providing users of his AR experience with an experience of agency, creation, and beauty in a way that recontextualizes their relationship to their surrounding environment. I had a chance to catch up with Cabbibo after a presentation about art in AR/VR in Portland, OR, where we talked about ARKit, what makes a compelling AR experience, lessons that VR has to teach AR, and how data is the ‘R’ in MR/AR/VR/XR in that it’s the transformation of real objects into data that allows us to have mediated experiences within a symbolic reality.
LISTEN TO THE VOICES OF VR PODCAST
Intel is investing in the future of immersive computing through their Virtual Reality Center for Excellence. They’re pushing the boundaries of high-end VR gaming experiences, pursuing initiatives to help VR reach critical mass, and exploring how RealSense depth-sensor cameras and WiGig wireless technologies fit into the VR ecosystem. I was able to demo an early prototype of an HTC Vive game rendered on a PC and transferred wirelessly to a mobile headset, part of a research project searching for additional market opportunities for how high-end PCs could drive immersive experiences.
I was able to sit down with Kim Pallister, the director of Intel’s VR Center for Excellence, to talk about their various initiatives to advance immersive computing, their WiGig wireless technology, RealSense and Project Alloy, and some of the experiential differences between their lower-end and higher-end CPUs. He predicts that immersive gaming markets may mirror the differences between the mobile, console, and PC markets, and that there will be a spectrum of experiences with tradeoffs between price, performance, and power consumption. Intel is initially focusing on pushing the high-end of VR gaming experiences, but they believe in the future of immersive computing and are looking at how to support the full spectrum of virtual reality experiences.
The White House VR documentary People’s House by Felix & Paul Studios won an Emmy for outstanding original interactive program, and I had a chance to talk with Paul Raphael about the challenges of producing such a high-profile piece. They didn’t know how many rooms they’d be able to shoot, and President Obama was such a fan of the project that he literally opened doors for the crew to record more than twice the number of originally scheduled rooms. They were limited to only two 15-minute interviews with Barack and Michelle Obama, and so they collaborated with speechwriters to capture memories and stories for this virtual guided tour.
Felix & Paul Studios create their own VR camera hardware, and they’re starting to use their fourth generation cameras while designing a next-generation, digital lightfield camera. Raphael said lightfield VR shoots are essentially visual effects shoots, which require shooting in different wedge segments that need to be composited in post-production. He also said that they’ve been consulting with most of the major HMD manufacturers including Facebook on an open standard for immersive 3D audio. Even though they’ve been creating a lot of hardware, they’re more interested in using it to stay on the bleeding edge so that they can continue to innovate and push the creative limits of what’s possible in immersive storytelling.
Do patients with anorexia nervosa suffer from body image distortion due to how they perceive their body, or is it due to attitudinal beliefs? Betty Mohler has been using VR technologies to study whether body representation is more perceptual or conceptual. She captures a 3D body scan of patients, and then uses algorithms to alter the body mass index of a virtual self-avatar within a range of plus or minus 20%. Patients then estimated their existing and desired body using a virtual mirror screen, which tracked movements in real-time and showed realistic weight manipulations of photo-realistic virtual avatars. Mohler’s results challenge the existing assumption that patients with anorexia nervosa have visual distortions of their body, and suggest that body image distortion may be driven more by attitudinal factors, where patients consider underweight bodies to be more desirable and attractive.
Mohler works at the Space & Body Perception Group at the Max Planck Institute for Biological Cybernetics. She collaborates with philosopher of neuroscience Dr. Hong Yu Wong to research foundational questions about self-perception like: “Who am I? Where am I? Where is the origin of my self? Where is the frame of reference? What is the essence of me? How do we know that there’s an external world? What does it mean to have a shared self where multiple people share the same body experience? What does it mean to have a body? How big is my body? Is it possible to be at multiple locations at once while in VR?”
I interviewed Mohler for the third time at the IEEE VR conference in Los Angeles this past March, exploring all of these provocative questions (see my previous interviews on the uncanny valley and avatar stylization).
Marshmallow Laser Feast is a collective of artists who are interested in using VR technologies to capture the aesthetic beauty of nature and provide immersive experiences that inspire people to cultivate an even deeper relationship with the reality that surrounds them. Their Treehugger provides an immersive experience of the lifecycle of water in trees, as rain makes its way up from the roots of a Sequoia tree and is released as oxygen, rendered in a highly-stylized & beautiful point-cloud aesthetic. The experience included smells and passive haptic feedback to make the simulated volumetric time-lapse even more immersive, and it won the Storyscapes award at the Tribeca Film Festival. I caught up with co-founder Barnaby Steel to talk about how VR could be used to inspire us to cultivate an even deeper relationship with the world around us.
The National Theatre has created an Immersive Storytelling Studio to better understand the practices, protocols, and opportunities of how virtual and augmented reality technologies are creating new storytelling possibilities. They collaborated with the National Film Board of Canada on an immersive theater piece called Draw Me Close that premiered at the Tribeca Film Festival. It featured a one-on-one interaction with a live actor in a mixed reality environment: while wearing a virtual reality headset, you play the archetypal role of a son or daughter as your mother embraces you, draws with you, and tucks you into bed while she narrates a memoir of her life. I talked with Immersive Storytelling Studio producer Johanna Nicolls about the reactions, intention, and overall development of Draw Me Close, which is their first immersive theater VR piece.
LISTEN TO THE VOICES OF VR PODCAST
The spatial storytelling techniques and skills that theater has been developing for hundreds of years translate really well to even more immersive, 360-degree VR environments. But with Sleep No More and Then She Fell, there’s also a whole other “immersive theater” movement within the theater world that is bringing new levels of embodiment, choice, and agency into authored theater performances. No Proscenium podcast host Noah Nelson wrote up a great introductory primer on immersive theater that explores the nuanced differences between immersive theater, site-specific performances, and environmentally-staged theater. One differentiation that Nelson makes is that immersive theater has much more of an explicit experiential design that “feels more like an event you experienced than a performance that you witnessed.”
The version of Draw Me Close that I saw at Tribeca took a powerful first step in exploring how live actors sharing the same physical space within a mixed reality context provide a new dimension of emotional and embodied presence. The haptic feedback of an embodied hug from a co-present human is something that may never be fully simulated in VR, and so this illustrates a clear threshold to me of what can and cannot currently be done in VR. I also saw the Then She Fell immersive theater piece, which featured a lot of one-on-one interactions with performers, and so I think that there’s a profound depth of emotional presence and intimacy that you can achieve with another person without the barriers of technology. You still can’t see the more subtle microexpressions of emotion or perceive the more nuanced body language cues when interacting with other humans while you’re in VR, but feeling the actor touch me provided a deeper phenomenological sense that I was interacting with an actual human in real-time. Directly interacting with another physically co-located person and feeling their touch closed some perceptual gaps and took my sense of social presence beyond the normal levels I have in distributed social VR experiences.
This was also such a new type of experience that I didn’t know the rules of engagement for how much I was expected to speak or interact. There weren’t a lot of prompts, invitations, or space made available for dialogue, and so I mostly silently received the story as each moment’s actions were being actively discussed, analyzed, and contextualized by a steady stream of real-time narration. There were, however, a number of interactive actions I was invited to do, ranging from opening a window to drawing TiltBrush-style on the floor. There was a deliberate decision to be fairly vague in casting the magic circle of rules and boundaries of what to expect, since the story, characters, and loving embrace of a motherly hug were all designed to be a surprise. This shows the challenge of balancing how to receive explicit consent to being touched while also maintaining the integrity of the mystery of a story that’s about to unfold.
Draw Me Close is an ambitious experiment to push the storytelling possibilities that are made available within a one-on-one interaction of an immersive theater piece while the audience is within virtual reality. It was a profound enough experience that a number of people needed some level of decompression and help transitioning back after exploring some of the deeper issues that were brought up within the experience. There are obvious limitations to how this type of experience could be scaled up so that it would be logistically feasible to show on a wider scale, but it’s refreshing to see the NFB and the National Theatre’s Immersive Storytelling Studio experiment, explore, and push the limits of what’s even possible. If too much effort is focused on what’s sustainable or financially viable, then it could hold back deeper discoveries about the unique affordances of combining immersive theater with immersive technologies.
Here’s some footage from Draw Me Close at Tribeca by Steve Rosenbaum
One of the most emotionally-moving experiences that I’ve had in VR was bearing witness to Holocaust survivor Pinchas Gutter sharing his experiences at the Majdanek concentration camp in The Last Goodbye. Gutter not only provides a guided tour, but he also achieves a level of emotional catharsis through the process of sharing his story, and his virtual presence within the experience amplified my own sense of emotional and social presence, which is what helped make it such a profoundly moving VR experience. The Last Goodbye uses a unique blend of photogrammetry and billboarded stereo video that helped to transport me into a room-scale experience of multiple key locations at the Majdanek camp as Gutter shared his memories of being there as an 11-year-old child.
The Last Goodbye premiered at the Tribeca Film Festival, and it was an epic collaboration catalyzed by co-creators Gabo Arora and Ari Palitz that included HERE BE DRAGONS, MPC VR, and OTOY on the technical production side, as well as the USC Shoah Foundation, which oversaw the process of capturing testimony about the Holocaust. Arora is the founder and creative director of LightShed; he previously founded the UN’s VR initiatives, where he was able to gain access to the Syrian refugees featured in his famous empathy piece Clouds Over Sidra, and he now serves as an advisor to them. In this interview, Arora shares his collaborative process, how they pushed the boundaries of volumetric storytelling by blending photogrammetry-based, room-scale VR with live-action, empathy-based storytelling, as well as how he had to guide Gutter to achieve the depth of emotional presence that makes the piece so powerful.