quillsfest
The Oregon Shakespeare Festival, in collaboration with Artizen producers Ana Brzezińska and René J. Pinnell, commissioned & produced four VR pieces that premiered at the inaugural Quillsfest, a two-day event held November 19-20 within the Museum of Other Realities. Each of these commissions paired established theatre makers with veteran XR artists to produce four different types of VR experiences that blend practices from a broad range of design disciplines. There were also five behind-the-scenes exhibition installations giving insight into the creative process of a variety of different XR productions, some of which had more live performance aspects. All of these pieces remain on view within the Museum of Other Realities until December 18th, though a PC VR system is required to see them.

ScarlettKimPhoto2021
I had a chance to talk about all of these XR pieces with Scarlett Kim, who is an Associate Artistic Director & Director of Innovation & Strategy at the Oregon Shakespeare Festival. We talked about OSF’s journey into immersive technologies, and their collaboration with Artizen to help pair veteran XR artists with established theatre makers. There was a lot of exploratory process in merging the different design disciplines in these distributed and at times asynchronous collaborations, which spanned five different time zones amongst all of the creators. We also talked about how XR technologies are part of a larger accessibility roadmap for OSF to make immersive art and live performance more accessible to people who are not able to attend their destination theatre offerings in Ashland, Oregon.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

Below is more information provided by Oregon Shakespeare Festival about the four commissioned VR pieces and their creators as well as the five behind-the-scenes exhibitions.

Here’s a teaser trailer of some of the pieces shown at Quillsfest:

Guardian of the Night

Dede Ayite – Lead Artist
Michael Joseph McQuilken – Lead Artist
Joel ‘Kachi Benson – Lead Artist
Michael Thurber – Composer and Sound Designer
Jennifer Harrison Newman – Movement
Tyler Alexander Arnold – Set PA

Costumes Built by The Public Theater Costume Shop in New York

Within a number of West African traditions, there is a belief in spiritual guardians known as Zangbeto, Coumpo or Kwagh-hir. These beings act as spiritual guardians and emerge in a whirlwind of energy during festivals to speak to the people.

This project creates a virtual reality experience that immerses the viewer in a dark, unfamiliar forest. A guardian will appear and through dance and movement, illuminate and guide the viewer.

Raffia and a variety of recycled materials (solid worldly materials) are used to create the body of this otherworldly guardian, who appears only for a brief moment to reflect our existence back to us, and in so doing help us find our way.

This experience is an exploration of movement, and an embodiment of our oneness with the earth and nature.

Anakwad

Ty Defoe – Creative Director
Dov Heichemer – Creative Co-Director
alpha_rats – Developer
Clara Luzian – 3D Artist
Suzanne Kite and Devin Ronnenberg – Music

An Anishinaabe tale brought to life in an Indigiqueer dreamscape summoning the shapeshifter queer ghost as it nullifies linear time and learns to unlock syllabic truths to regain balance in the destructive destiny of now.

Ordinary Gesture

Raja Feather Kelly – Lead Artist, Writer, Co-Director
Illya Szilak – Co-Director, Creative Producer
Cyril Tsiboulski – Art Director, Technical Director, Lead Developer
Christoph Mateka – Sound Design & Score Composition

Ordinary Gesture is a virtual reality theatrical experience that intersects theatre, meditation, and movement. The experience seeks to surrealize the experience of empathy by situating the player in five scenes that expand from their body out to space-time (the universe) and back again. Inspired by the films Magnolia, Melancholia, and Waking Life, the poem “You Are Never Ready” by Nicole Blackman, and the writing of cultural anthropologist Ernest Becker, Ordinary Gesture asks the player to contemplate existence, suffering, compassion, and gesture as both ingredients for creating theatre and a means to perhaps better understand empathy.

O-DOGG: An Angeleno Take on Othello

Performed by Tariq “Black Thought” Trotter
Executive Produced by Nataki Garrett for the Oregon Shakespeare Festival, Kirkaldy Myers, Shariffa Ali in partnership with Artizen & the Museum of Other Realities
Creative Producers: Joe Brewster & Michèle Stephenson, Rada Studio
Production Company: AliAlea Productions
Producer: Adrian Alea
Line Producer: Emma McSharry
Co-Directors: Shariffa Ali and Brisa Areli Muñoz
Writer & Dramaturg: Alex Alpharaoh
VR Engineer: Sagar Patel
Costume Designer: Tanaka Dunbar Ngwara
Sound Designer/Composer: Josh Horvath
Performer: Tariq “Black Thought” Trotter
Director of Photography: Kris Pilcher
Assistant Director of Photography: Kevin Laibson
Sound Mixer: Christian Guiñanzaca
Video Editor: Micah Stieglitz
Teleprompter Operator: Rudy Dedominicis
Production Assistant: Tanéyah Jolly

In this immersive experience, users are launched into a cacophony of chaos during the first night of the 1992 Los Angeles Uprising. Unable to look away, the participant is forced to contend with a city divided along race, class, and immigration lines as revolt fills the streets, provoked by the acquittals of the four White LAPD officers who beat and nearly killed Rodney King. From Shakespeare to Shakur, Black Thought to Alpharaoh, these poets and lyricists have a visceral way of speaking honestly about the history of our times with critical precision. This piece, as effusive as it is restrained, offers a heated conversation about race, colorism, bias, and culture in America through the liberatory practices of hip hop, spoken word, lyricism, rhythm, and flow, inspired by Shakespeare’s well-known Othello.

Here are the five behind-the-scenes exhibitions:

  • Laila is an interactive work created by Esa-Pekka Salonen, Paula Vesala, Tuomas Norvio and the Ekho Collective for the Finnish National Opera, in which music and visuals evolve and change in interaction with the audience.
  • Satore Studio’s Cosmos Within Us delivers a sense of hope and understanding to anyone affected by Alzheimer’s.
  • Dazzle by Gibson/Martelli + Peut Porter recreates the optimistic, rebellious spirit of the 1919 Chelsea Arts Club ‘Dazzle Ball’.
  • POV by GRX Immersive Labs is a hyper-digital sci-fi virtual reality series immersed in a near future Los Angeles where personal data is the new currency and weaponized A.I. Police drones enforce the law.
  • Finding Pandora X by Double Eye Studios is a modern take on an ancient myth that shifts the perspective on a narrative that has long been misinterpreted.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

exploring_home

sara-lisa-vogl
Exploring Home is a full-body-tracked dance performance in a VRChat world that premiered at Venice VR Expanded 2021. It explores issues around identity, avatar representation, and being outcast and discriminated against, and it’s a unique blend of different genres and affordances of performance, world design, theater, sculpture, and audio production. It was written, produced, and performed by Sara Lisa Vogl, and was catalyzed after she experienced discrimination for not wearing an anime avatar within a VRChat dance club. This brought up traumatic experiences of racism and discrimination for being a half-Iranian immigrant in Germany, and so she turned to the medium of VR to create an immersive experience that allows you to step into her avatar skins, go through the nested stages of her life, and listen to audio reflections of her own journey and experience with identity, community, and the feelings of shame from not being accepted. It’s a piece where you can fearlessly break through barriers, leave things behind, and see things from new perspectives.

I had a chance to talk with Vogl after seeing the world premiere performance at the Venice VR Expanded festival, where she shared her journey in creating this unique piece that’s part interpretive dance, part immersive theater, part guided tour, part social VR worldhopping, and part deep reflection on identity, embodiment, and avatar representations. It’s also worth noting the evocative soundtrack by Iranian musician Ash Koosha as well as another Iranian artist who cannot be named.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


hope-for-haiti-cover-image

sarah-liza-porter
Hope for Haiti is a Haitian-based non-profit with the goal of improving the quality of life for the Haitian people through education, medical, clean water, and economic development programs. After a visceral experience with a number of VR projects at VR World in New York City, Sarah Porter, who is the Director of Business Development & Strategic Partnerships at Hope for Haiti, set out on a journey to explore how to use immersive technologies for outreach, education, and fundraising. She collaborated with Max Noir from China-based FXG Studios to create a Unity-based Hope for Haiti VR world (available on SideQuest) featuring a classroom (with interactive chalk), a clean water pump, informational videos, a waterfall, a beach, and a campfire. It serves as a social VR platform for holding events and guided tours through some of the projects, people, and stories of the non-profit’s work. They also have an NFT gallery featuring a number of artists who donated art to be sold to raise money for the non-profit.

Porter spoke alongside Noir at AWE in a session titled Virtual Reality for Social Impact in Haiti, and also had a spot on the AWE Expo floor playground showing off their experience to get feedback and support from the broader XR community. They held an initial event on October 20th, and have plans to expand their virtual Haiti work with more outreach, education, and fundraising events. I had a chance to catch up with Porter at the end of the Augmented World Expo to recap her experiences on the expo floor as well as the feedback she was receiving about the project.

The Hope for Haiti experience is now available on SideQuest.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


alien-rescue

The MetaMovie Presents: Alien Rescue is one of the most immersive experiences I’ve had in VR. It’s part cinematic adventure, part immersive theater, and part game, fusing the affordances of each of these different mediums in a unique way. One interactor is cast as the protagonist, who meets up with three immersive theater actors who take you on a fairly linear adventure to rescue some aliens. The main character gets to do some live-action role playing, customize their character’s identity and expertise, and make a number of different choices along the way that can potentially lead down some forks in the story. There are also a number of other audience members who are cast as iBots, who don’t have as much narrative agency since they can’t really speak, but they have more embodied & locomotion agency in terms of being able to explore the environment as well as choose which characters and storylines to follow as the party splits up at different points.

I’ve had a chance to see MetaMovie twice now. I saw the first half at its initial premiere at the Venice Film Festival 2020, and then the full experience again as a judge for Raindance, where it won best multi-player experience and was runner-up for best game. It’s part story, part game, and part immersive theater adventure, and so I’m glad that it was able to win the multi-player award, as the overall sense of social presence has been some of the deepest that I’ve experienced in any immersive VR storytelling or immersive theater piece so far. It’s quite a unique blend of agency and story, and it’s a mix that has taken a lot of time to develop through years of experimentation on previous MetaMovie Project experiments.

meta-movie-alien-rescue
I had a chance to talk with most of the cast and crew after I saw the experience at Venice in September 2020, where I got a lot of context and history about how the series of MetaMovie Project experiments have evolved. This interview includes director and creator Jason Moore, producer Avinash Changa, and actors Nicole Rigo, Kenneth Rougeau, and Marinda Botha. The fourth actor, Craig Woodward, was unfortunately not able to join us for this interview, but we were able to explore the past, present, and future of immersive storytelling in this wide-ranging conversation from the perspectives of the director, screenwriter, producer, and actors.

Moore tells me that they’re currently planning on doing weekly runs of The MetaMovie Project, and I HIGHLY recommend checking it out. I’ve been able to be the hero twice now, and it’s super compelling. But I also hear how satisfying it is to be an iBot, being able to help solve puzzles, track different characters, and have a bit more agency to explore the environments and storylines that are the most compelling.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


mycelia
Tosca Terán (aka Nanotopia) is an interdisciplinary artist working at the intersections of ecology, bioart, mycology, and sound, and she collaborated with the Metaverse Crew and Sara Lisa Vogl (aka _ROOT_) in creating the “Mycelia” performance to bring her fungi biosonification music into an amazingly immersive, audioreactive VRChat world.

mycelia
Premiering at the AMAZE festival in July, Terán held a number of musical performances where she translates the electrical conductance of Oyster Mushrooms into MIDI, generating a form of ambient electronic music alongside her own musical performance. My mind was pretty blown after hearing her describe how she generates her music, and I wanted to sit down with her to do a super deep dive unpacking each step of the process (see the show notes down below for more details as you listen in). I had a chance to catch up with her the day before her Venice performance in September, and I saw the piece a second time at the Raindance Immersive festival, where it ended up winning one of the Spirit of Raindance awards for its innovative, independent, and pioneering spirit.
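
The general idea behind biosonification, turning a slowly varying electrical signal into musical MIDI events, can be sketched in a few lines of Python. Everything below (the sensor range, the pentatonic quantization, and the velocity mapping) is an illustrative assumption rather than Terán’s actual pipeline, which runs through dedicated biodata hardware and her own synthesis rig:

```python
# Illustrative sketch: mapping conductance readings (e.g. from electrodes
# on fungi) onto MIDI (note, velocity) pairs. Ranges and scale choices
# are assumptions for illustration only.

# A pentatonic scale keeps arbitrary sensor data sounding musical.
PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within an octave
BASE_NOTE = 48                # C3 in MIDI note numbers


def conductance_to_midi(sample, lo=0.0, hi=1.0, octaves=3):
    """Map a conductance reading in [lo, hi] to a (note, velocity) pair."""
    # Normalize the reading to 0..1, clamping out-of-range values.
    t = max(0.0, min(1.0, (sample - lo) / (hi - lo)))
    # Quantize onto scale degrees across the chosen octave span.
    degrees = len(PENTATONIC) * octaves
    idx = min(int(t * degrees), degrees - 1)
    octave, degree = divmod(idx, len(PENTATONIC))
    note = BASE_NOTE + 12 * octave + PENTATONIC[degree]
    # Here velocity simply tracks the signal level (MIDI range 40..127).
    velocity = int(40 + t * 87)
    return note, velocity


# Example: a slowly fluctuating signal becomes a stream of note events.
readings = [0.12, 0.35, 0.50, 0.81, 0.97]
events = [conductance_to_midi(r) for r in readings]
```

In a real pipeline these (note, velocity) pairs would be sent to a synthesizer as MIDI note-on messages, with the event timing driven by the sensor’s sample rate.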

This ended up being quite a wonky deep dive into the audio production pipeline of fungi biosonification, but it also includes some deeper thoughts about the implications of interspecies communication, and the potential of using haptics, spatialized ambisonics, and sonification to further explore biometric or physiological data from either humans or non-human species. The Mycelia performance has been one of the more magical experiences I’ve had in VR, especially considering that it provides a portal into the biorhythms and proxies of consciousness of non-human intelligence. So hopefully this conversation will not only help explain Terán’s creative process, but also help to inspire other bio-artists to continue experimenting and exploring the potentials of biosonification within the context of these immersive worlds.

SHOW NOTES

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


tiltfive

jeri-ellsworth
One of the most magical AR demos I’ve ever had a chance to see was Tilt Five at Augmented World Expo 2021. Jeri Ellsworth stumbled upon a complete paradigm shift towards AR while she was at Valve: a beam splitter that had accidentally been turned around was shooting beams of light that happened to hit some retroreflective material another Valve engineer was experimenting with. She saw a beautiful, high-contrast image that inspired her to develop this idea into a fully-fledged AR headset focused on tabletop gaming. She was able to acquire the technology from Valve after being let go, went through another startup cycle with castAR, which began and ended with too broad of a focus, and then eventually founded Tilt Five, which has been much more laser-focused on tabletop gaming.

I had a chance to catch up with Ellsworth during a busy AWE showing, where she shared quite a lot of details about her journey into XR, starting with helping to bootstrap Valve’s hardware division and some of the internal dynamics there, then her journey with castAR, and finally her latest efforts with Tilt Five. They should be shipping out their Kickstarter units soon. It’s a completely different and unique approach to AR, and it’s one that creates some pretty magical experiences when focused on tabletop gaming with retroreflective material. It really feels like this is a device that is going to bootstrap quite a lot of innovation with AR gaming and the affordances of tabletop holograms, and I look forward to seeing how it continues to develop. Definitely keep an eye on Tilt Five, and try it out for yourself to see how they’ve been able to bring the magic of holograms to life with their Tilt Five glasses.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

Here’s a previous interview with Ellsworth conducted by Valve News Network’s Tyler McVicker that I mentioned in this interview.


lynx-r-1-awe-stan-larroque
I had a chance to do a demo of the Lynx R-1 standalone mixed reality headset at the Augmented World Expo, which was a really compelling experience that blended the virtual and real better than any other headset I’ve seen do before. Part of the magic was having a headset specifically engineering to minimize the distance from the the cameras from my eyes, therefore minimizing the normal proprioceptive disconnect between what I’m feeling in my body vs the offset that I’m seeing in my arms and hands. It’s the first XR HMD that I’ve seen that can legitimately call itself a Mixed Reality device — as opposed to Windows Mixed Reality headsets, which are really just VR HMDs with the aspiration to maybe eventually live into their names of doing mixed reality.

stan-larroque
I had a chance to catch up with Lynx CEO Stan Larroque on the last day of AWE to recap his first public demos of the Lynx R-1, their collaboration with Qualcomm on using the full capacity of the mixed reality features built into the XR2 chip, their unique four-fold catadioptric freeform prism optics design, what their OpenXR runtime integration will be able to unlock with Qualcomm’s Snapdragon Spaces, their successful Kickstarter that raised 2.4x its target amount, how they’re leaning into and leveraging the broader open source communities to compete with the biggest players in XR, and his own first real experience of mixed reality with the Lynx, which only happened about a month ago.

The Lynx R-1 is expected to ship its first units in April 2022, and I’m excited to see what innovations in mixed reality and augmented reality prototypes can be created with this headset. They’re one of the few remaining independent headset makers out there competing against the biggest tech companies in the world, yet taking a leap of faith on how compelling a headset can be when it’s truly optimized to be a standalone, all-in-one device capable of virtual reality, augmented reality, or mixed reality. Their optimizations to focus solely on mixed reality yield some really interesting tradeoffs, but the end experience is totally worth it for a device that starts to show the real power of seamlessly blending the virtual and the real.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


qualcomm-snapdragon-spaces

On November 9th, at the beginning of the Augmented World Expo (AWE), Qualcomm announced Snapdragon Spaces, which is a series of AR tools striving to cultivate an “open, cross-device horizontal platform and ecosystem.” The Snapdragon Spaces tools include “environmental and user understanding capabilities that give developers the tools to create headworn AR experiences that can sense and intelligently interact with the user and adapt to their physical indoor spaces.” This specifically includes spatial mapping & meshing, local anchors & persistence, positional tracking, plane detection, image recognition & tracking, object recognition & tracking, occlusion, & scene understanding.

What’s especially interesting to me is that all of this is “based on the Khronos® OpenXR™ specification to enable application portability and is the first headworn AR platform optimized for AR Glasses tethered to smartphones with an OpenXR conformant runtime.” The Snapdragon Spaces platform will enable Qualcomm to start leveraging the additional computational resources of mobile phones in order to enable a distributed compute and rendering capabilities that allows AR to become like an extended spatial display to existing mobile phone apps.

It’s also starting to build out the more software-driven potential for innovation that comes from specific application developers, who will be empowered to write OpenXR extensions and modules that benefit not only their specific application, but potentially the broader XR ecosystem. It’s really exciting to see Qualcomm go down this path of cultivating an open ecosystem like this, and it makes a lot of sense why they wanted to become a big sponsor of AWE 2021: an opening keynote slot with Hugo Swart announcing Snapdragon Spaces (registration required), a couple of sessions by Steve Lukas diving into more specific details of Snapdragon Spaces in the Ramp to the Future of AR session (registration required), as well as more generalized tips on what types of AR applications they’re looking for to grow the AR ecosystem in this session on Designing Your Mobile App for Qualcomm’s Tools (registration required).

They also announced an early access program for XR developers called The Pathfinder Program, which is “designed to give qualifying developers early access to platform technology, project funding, co-marketing and promotion, and hardware development kits they need to succeed.” General availability of Snapdragon Spaces won’t be until the spring of 2022.

Going into AWE 2021, it was made really clear to me the impact that Qualcomm has had on the cultivation of the standalone VR and AR HMD market. So many of the latest standalone devices use either the XR1, including Snap Spectacles 4, Ray-Ban Stories, Lenovo ThinkReality A3, and Vuzix M4000 & M400, or the XR2, including Quest 2, Vive Focus 3, Pico Neo 3, iQIYI QIYU 3, Magic Leap 2, and HoloLens 2.

In fact, since 2016, there have been over 50 devices that have launched on either the Snapdragon 820 (announced September 1, 2016 at IFA), Snapdragon 835 (announced January 3, 2017 at CES), Snapdragon 845 & VR dev kit reference design (announced February 21, 2018 at MWC and shown at GDC in March 2018 + my previous Voices of VR interview with Qualcomm at GDC 2018 after seeing that reference design), and then their XR-specific XR1 chip announced at AWE on May 29, 2018 and their XR2 chip announced December 5, 2019.

Here’s a graphic that Steve Lukas presented at AWE that shows 33 out of the 50+ XR devices that have launched with Qualcomm chips since 2016.
Snapdragon-Devices-Launched_November-2021

Hugo-Swart
I had a chance to catch up with Qualcomm’s VP & GM of XR Hugo Swart during AWE on November 11th, where I was able to get more context on their new Snapdragon Spaces platform and the open ecosystem they’re cultivating, but also to recap the evolution of standalone VR and AR devices since 2016, when the Snapdragon 820 was announced as the first chip capable of handling the needs of standalone XR devices.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


Protopia-Futures-Framework

monika-bielskyte
Monika Bielskyte is a world designer, immersive artist, science fiction critic, and cultivator of Protopia Futures (@ProtopiaFutures on Instagram), which is an intersectional design practice that was distilled down into the Protopia Futures [Framework] manifesto, published on May 18th, 2021 and created with over 30 collaborators. I’ve previously talked with Bielskyte in 2017 about Designing The Future through Sci-Fi World Building and in 2018 about Sci-Fi Worldbuilding to Collaboratively Shape Protopian Futures, and she’s actually been moving away from the concepts of “building” and “world building” and towards the more relational and organic metaphors of world growing and world cultivating.

There’s also been a distillation of the Protopia Futures Framework that’s anchored in seven principles:

  • Plurality — Beyond Binaries
  • Community — Beyond Borders
  • Celebration Of Presence
  • Regenerative Action & Life As Technology
  • Symbiotic Spirituality
  • Creativity & Emergent Subcultures
  • Evolution Of Cultural Values

I had a chance to do a pretty epic deep dive into these seven principles with Bielskyte on November 2nd, in a nearly 3-hour conversation that is broken down into four major parts:

  1. Background context, creative process, & intersectional inspirations for the Protopia Futures Framework.
  2. Deconstructing Denis Villeneuve’s Dune through the lens of Protopia Futures.
  3. Detailed unpacking and breakdown of the seven principles of Protopia Futures.
  4. Deconstruction of Metaverse sci-fi, how Meta is basing their vision of the Metaverse on Ready Player One, & deeper decolonial critiques of their techno-utopianism and denial of context.

This is a pretty extensive, nearly 3-hour-long episode, but it articulates so many deep insights about how those who control the fantasy control the future. Given the power of these science fiction narratives in how they guide, direct, and shape our technological futures, it’s worth having some critical frameworks not only to deconstruct the deeper patterns of the stories that are being told, but also to provide a design framework, map, checklist, and blueprint for a world design, world growing, world cultivating, and future dreaming practice that nurtures radical tenderness, radical hope, and our imaginations for what types of Protopian Futures might be possible.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


virtual-being-summit

edward-saatchi
The third annual Virtual Beings & Virtual Societies Summit happened on October 29-30, and I had a chance to attend and unpack it with Edward Saatchi, co-founder of Fable Studio and instigator of a new Decentralized Autonomous Organization (DAO) called The Culture DAO, which officially launched as part of the Summit. They also launched the social token $CULTUR, a token-powered Guild for virtual beings, avatars, and metaverse identity; you can find more info in the “$CULTUR – The Beginning” manifesto published on October 6th.

Saatchi was one of the co-founders of Oculus Story Studio, which brought together storytellers and game designers on the frontiers of experimenting with immersive storytelling. After Oculus Story Studio was shuttered in May 2017, Fable Studio was able to continue development of the Neil Gaiman-optioned piece Wolves in the Walls, which premiered at Sundance 2018. Wolves in the Walls featured an embodied virtual being character named Lucy, who distilled many embodied lessons of communication and body language from the immersive theatre company Third Rail Projects.

Wolves in the Walls was released on the Rift in 2019 and the Quest in 2020, and Saatchi said that people felt such an embodied connection to the character that they would want to try to communicate with her. This triggered a long journey aspiring towards virtual beings that have a personality, character, incentives, and motivations that make them interesting to talk with in the context of a story world. In other words, the ultimate destination is Artificial General Intelligence. In my previous discussion with Andrew Stern, one of the co-creators of Façade, he said that the constructs of a story world create a bounded context that allows you to have conversations with an AI character that feel more real and capable than what AI can otherwise achieve today, given the limitations of contextual awareness and common sense reasoning.

After many years of experimenting with GPT-3 and one-on-one chatbot types of experiences with Lucy, he’s come to the conclusion that none of the current approaches are getting us any closer to the dream of AGI. This is a big reason why he’s looking to radical approaches of incentive design from the web3 & DeFi worlds of cryptocurrencies, social tokens, DAOs, and NFTs. Rather than data mining the Internet to train massive language models like GPT-3, Saatchi is hoping to create simulation worlds dedicated to cultivating mentorship relationships between humans and AI, perhaps with a train-to-earn cryptocurrency token or other gamified incentive structures, in order to move beyond the current limitations of big-data scraping and training models that have reached a dead end for now.

I had a chance to talk with Saatchi this past Tuesday after his Virtual Beings & Virtual Societies Summit to unpack his quest towards AGI virtual beings, and some of his recent inspirations from the web3 world: multi-modal learning, incentive design, and The Culture DAO Guild to cultivate a cooperative training community to escape the extremes of AI Winters and AI Summers, blitz-scaled companies that lead to bankruptcy, and siloed information within innovation spaces. It’s part of a larger movement towards decentralization, digital ownership, and DAOs that are gaining more leverage through community organizing and mutually financially-incentivized, cooperative action.

I still have lots of cautious skepticism about whether these technologies are always in right relationship to the world around us, but there are also a lot of exciting potentials to create new social dynamics that just may allow us to start to escape from the more negative aspects of the consolidated power of Big Tech companies and their more settler/colonial mindsets of seizing our private data to fund the underlying immersive technologies of whatever may evolve into our ideas of the Metaverse.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

After attending the Virtual Beings Summit, I was inspired to buy my first EcoNFT from XR artist Sutu. I’m excited to see where he takes the VR and AR integrations for Neonz, and I also now have a bit better sense of what Saatchi was talking about in terms of being part of a community that is mutually financially incentivized to contribute to a project.
