fox-buchele-twitter-2021-01-14
On January 14, 2021, VR developer Fox Buchele announced on Twitter that he had been fired by Owlchemy Labs in 2020: “I’m sharing this now because I’m done with this industry. And I don’t see myself coming back.” He described being a VR developer as “hard, thankless work – made worse by toxic, business-first development cycles, authoritarian ‘trickle down’ management and design, and a general lack of respect for the grunts in the trenches,” adding, “I can’t in good conscience continue to support an industry so broken and exploitative while pretending everything is normal.”

I reached out to Buchele to do an exit interview of sorts, exploring not only his side of the story of what happened during his time at Owlchemy Labs, but also his perspectives on toxic work environments for VR developers within the game industry at large, what he sees as a stagnation of innovation and creativity due to authoritarian creative practices and a lack of diversity and inclusion, and the larger context of the abuses of Big Tech.

This interview is an oral history of Buchele’s experiences and perspective. In the long run I’d love to capture other perspectives as well in order to get the full picture, but the types of things that Buchele talks about are aspects of the games industry that others have been speaking out about as well.

Also, because Owlchemy Labs was purchased by Google on May 10, 2017, this was an opportunity for me to ask Buchele about that acquisition, and what types of insights it could provide into Google’s overall XR strategy. Google has had a lot of XR projects come and go, including Daydream, Google Cardboard, Google Expeditions, Google Poly, and Project Tango. Buchele points out that Google’s unwillingness to produce its own VR hardware, combined with an already fragmented ecosystem within Android, did not create a compelling platform for VR developers to buy into Google’s ecosystem. As a result, Google has yet to build up any serious traction within the broader VR industry, and it has been focusing its efforts on AR and ARCore within Android.

Fox’s time at Owlchemy Labs also coincided with Donald J. Trump’s presidency of the United States, and so he also talked about the dynamics of shutting down polarizing political discussions in the workplace during that period. He talks about how he consciously and unconsciously shut down the more political parts of his social media presence, in part because he didn’t want his private thoughts to reflect poorly on the upbeat, positive, fun, and lighthearted brand of Owlchemy Labs.

He characterizes the working environment as one of “toxic positivity” that started with the stifling of polarizing political discourse, but ended in resistance to having deeper critical deliberations about creative decisions. He claims that as time went on, there was also a hierarchical creative decision-making process where only a couple of the leaders were involved in making critical decisions, without listening to the creative feedback of the development team.

There are certainly a number of things that Buchele discusses that are unique to his experiences at Owlchemy Labs, but also likely a lot of experiences that other developers in the VR industry and games industry at large have had as well. There are a lot of taboos that Buchele is breaking in order to speak out about some of his experiences, and so I’m grateful that he was willing to elaborate on his Twitter thread. Sharing his story is in the spirit of being able to reflect upon some of the cultural aspects of the VR industry that contributed to his experiences, and what we can do in terms of resisting Big Tech’s consolidation of power, the stifling of creativity through a lack of diversity and inclusion, and being willing to speak out about toxic elements of culture and how those can be changed.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

Here’s a Twitter thread where I reflect upon the deeper cultural and technological dynamics of political polarization and filter bubbles

https://twitter.com/kentbye/status/1348073773568716800

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

sundance-2021-film-party

Sundance New Frontier 2021 features 14 immersive experiences that will be accessible via the $25 Sundance Explorer pass from Friday, January 29th until Wednesday, February 3rd, 2021. The New Frontier site will also be hosting virtual premiere parties on its “Film Party” platform, featuring six films at a time within a 3-hour window. This will be a chance for the directors and producers of the 71 Sundance films and New Frontier pieces to have their own virtual premiere party throughout the festival. The schedule will be displayed within the Film Party virtual environment, but you can also see it in this PDF of the premiere schedule.

sharifrilot
I talk with Shari Frilot, Senior Sundance Programmer and Chief Curator of New Frontier, in order to get a sneak peek of the 14 experiences at Sundance New Frontier 2021, as well as an overview of all of the virtual gatherings & film parties, the New Frontier Gallery Space, and the four virtual cinema house screenings (details below).

There will be a Virtual Film Party Bar that is VR-enabled as well as having webcam support, where clusters of 8 people will be able to have conversations within an opt-in audio bubble that needs to be initiated. The capacity will be 250 people, but new instances will be spun up once it reaches capacity. The Film Party premieres for each of the films will be capped at 250 people. All of the New Frontier directors will be having their premiere parties on either Monday, February 1st or Tuesday, February 2nd.

The New Frontier Gallery will also have infinite instances of 250 people, but there will be no webcam support. VR chat will be enabled, though, and so this will be a great place for the VR community to gather.

There will be an unofficial New Frontier Opening Night Party in the Virtual New Frontier Gallery on Friday, January 29, 2021, starting at around 7 or 8 pm PST.

There will also be a virtual cinema showing four films in virtual reality as a part of the Explorer Pass. Capacity is 200 people on a first-come, first-served basis. Here’s the schedule for the Virtual Cinema House for Sundance 2021:

  • Friday, January 29, 8 pm MST (7 pm PST) – Documentary Shorts
  • Sunday, January 31, 3 pm MST (2 pm PST) – Station to Station
  • Monday, February 1, 7 pm MST (6 pm PST) – Users
  • Tuesday, February 2, 4 pm MST (3 pm PST) – Mother of George

Also, for fans of simulation theory, there’s a midnight documentary called A Glitch in the Matrix, which premieres on Saturday, January 30, 2021, 10 pm MST (9 pm PST). Individual movie tickets are $15 each, and there are still tickets available for a number of screenings.

Facebook sent all of the Sundance 2021 directors Oculus Quest 2 headsets, and so these Sundance New Frontier virtual spaces could be a really good opportunity to connect and network with the independent filmmaker community.

You can catch all of the other details for Sundance 2021 in my interview with Shari below (or you can also check out No Proscenium Podcast #278 for more context & information from Shari & Active Theory).

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

aardvark

joe-ludwig
Aardvark is an AR platform within the context of virtual reality applications. Here’s how Aardvark is described on its Steam page:

Aardvark is a new kind of web browser that allows users to bring multiple interactive, 3D “gadgets” into any SteamVR app. It extends the open platform of the web into VR and lets anyone build a gadget and share it with the community. Aardvark gadgets are inherently multi-user so it is easy to collaborate at the Aardvark layer with the people you are in a VR experience with.

Aardvark was originally announced by Joe Ludwig on March 19, 2020, and had its first early access release on Steam on December 19, 2020. I did a Pluto VR demo in December that integrated their telepresence app with Aardvark AR gadgets and Metachromium WebXR overlays, and I got a taste of how multiple applications will start to interact with each other within a spatial computing environment.

Aardvark and Metachromium are both overlaying objects and layers on top of virtual environments, but they are taking different approaches. Metachromium uses WebXR depth maps to composite the pixels on top of the existing virtual environments. Aardvark is tracking your head and hand poses, and attaching declarative web app objects to these reference points or the room center.

Ludwig says that Aardvark is effectively his white paper for why he thinks his approach could be easier to scale in the long run. Metachromium runs WebXR at framerate, which has a lot more overhead. With Aardvark, only the core Aardvark app runs at framerate, while each gadget takes a declarative web application approach using the React framework, only running JavaScript code when the user takes an action. Ludwig is skeptical that JavaScript will be able to run within the context of a 90 to 120 Hz render loop on top of pushing more and more pixels to displays in VR apps that are already pushing the GPUs and CPUs to their limits, and Aardvark gadgets reflect this design philosophy.
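To make that contrast concrete, here is a minimal TypeScript sketch of the declarative, event-driven gadget model described above: a gadget is data attached to a tracked reference point, the compositor re-resolves its position every frame, and the gadget’s own JavaScript only runs on user events. All of the names and types here are hypothetical illustrations, not the actual Aardvark API.

```typescript
// Hypothetical sketch of a declarative gadget model (NOT the real Aardvark API).

type Anchor = "head" | "leftHand" | "rightHand" | "roomCenter";
type Vec3 = { x: number; y: number; z: number };

interface GadgetNode {
  anchor: Anchor;       // which tracked pose the node follows
  offset: Vec3;         // fixed offset from that pose
  onGrab?: () => void;  // gadget logic runs only on user events
}

// The compositor (running at framerate) resolves world positions from the
// latest tracked poses each frame; no gadget JavaScript is invoked here.
function resolvePosition(node: GadgetNode, poses: Record<Anchor, Vec3>): Vec3 {
  const base = poses[node.anchor];
  return {
    x: base.x + node.offset.x,
    y: base.y + node.offset.y,
    z: base.z + node.offset.z,
  };
}

let grabs = 0;
const clock: GadgetNode = {
  anchor: "leftHand",
  offset: { x: 0, y: 0.1, z: 0 },
  onGrab: () => { grabs += 1; },
};

const poses: Record<Anchor, Vec3> = {
  head: { x: 0, y: 1.7, z: 0 },
  leftHand: { x: -0.3, y: 1.2, z: -0.2 },
  rightHand: { x: 0.3, y: 1.2, z: -0.2 },
  roomCenter: { x: 0, y: 0, z: 0 },
};

// Simulate 90 rendered frames: the position is re-resolved every frame,
// but the gadget's own code never runs.
let pos: Vec3 = { x: 0, y: 0, z: 0 };
for (let frame = 0; frame < 90; frame++) {
  pos = resolvePosition(clock, poses);
}

// One user interaction: only now does gadget JavaScript execute.
clock.onGrab?.();
// pos.y ≈ 1.3 (left-hand pose plus offset), and grabs === 1 --
// the gadget's code ran once, not once per frame.
```

The design point is that the per-frame work is a cheap, data-driven transform lookup, while arbitrary script only runs in response to sparse user input.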

I had a chance to catch up with Aardvark creator Joe Ludwig on January 12, 2021 to get some more context on Aardvark, how it started, where it’s at now, and where it’s going in the future. Ludwig is still in the early phases of getting all of the component parts in place in order to bootstrap this new platform.

communication-medium-as-process

It’s still early days in fleshing out the flywheel of this communications medium feedback loop, but the potential is pretty significant. Ludwig says that Aardvark has the potential to start to prototype the user interface design and functionality of augmented reality applications within the context of a virtual reality app.

There’s still a lot of missing information to fully manifest this vision, especially in not having any equivalent of a virtual positioning system to get the X, Y, & Z coordinates within a virtual world instance, along with its specific map and conditional states. Ludwig expects that this may eventually be provided through an OpenXR extension, but for now these AR gadgets will need to exist relative to the head or hand poses, or be localized to the center of your play space.

When Aardvark was first started, Ludwig conceived of it as an overlay layer. And so it’s been surprising to him to discover that there’s been a lot of work in trying to get these spatialized gadgets to be able to communicate with other gadgets, especially within a multiplayer context. The early experiments show the power and potential of a multiple-application AR ecosystem, but there isn’t yet a single killer app or utility that’s tightly focused on a specific use case or context. This leaves a lot of room for exploration and discovery, starting with a backlog of ideas, but without a lot of clear direction as to what will be compelling or build momentum within a specific community.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Mark-Pesce-Augmented-Reality

Mark Pesce’s new book Augmented Reality: Unboxing Tech’s Next Big Thing was released on Friday, January 8th, 2021. It’s a lucidly-written look at the past, present, and future of augmented reality. He contextualizes AR within the history of computing and the evolution of human-computer interaction, while also looking at the underlying principles of adding metadata to space, spatial computing, and what it means for a physical space to go viral.

Pesce also looks at how AR is a technology that has to be constantly watching us, and the open ethical and technology policy questions around privacy. He says that the closer the technology is to our skin, the more it knows about us and the more capacity it has to potentially undermine our agency.

He also points out that as you change the space around us, you’re also changing us. There will be a lot of emphasis on feedback loops, with consumers wanting specific information and context about the world around them, and an aspirational aspect of the world providing that information. Pesce describes this interaction as a combination of the space itself, how the space expresses itself by radiating out information, how the people who are present in that space interpret and make meaning out of that information, and then how they feed more information back into the space, changing the meaning of that space.

He is grateful for Netflix documentaries like The Social Dilemma that start to provide metaphors for some dynamics of technology companies and the role of algorithms in our lives, but this role is only going to become more important as augmented reality technologies are able to overlay context, meaning, stories, and metadata onto physical reality, which could lead to a lot more physical collisions between differing perspectives in ways that were not possible in cyberspace.

I had early access to the book, and I was able to conduct an interview with Pesce back on November 19, 2020. Pesce wanted to contextualize AR within the history of computing and human-computer interaction, but also to catalyze some technology policy discussions around the privacy and inherent surveillance aspects of augmented reality. With consumer AR on the horizon in the next couple of years, there are a lot of deeper questions around how to navigate the relationship between humans and machines, and Pesce’s Augmented Reality book provides a lot of historical context that brings up some of the first discussions and writings on this topic, like Norbert Wiener’s The Human Use of Human Beings (1954) and J.C.R. Licklider’s Man-Computer Symbiosis (1960).

Pesce is able to not only contextualize the history of AR, but also give us some pointers as to where the technology is heading in the future. If you’re interested in some deeper discussion and analysis of spatial computing, then this is a must-read account that’s grounded in the history and evolution of consumer augmented reality.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

metl-residents-2020

Note: This is a sponsored content post from the University of North Carolina School of the Arts

I was a mentor for the UNCSA Media + Emerging Technology Lab’s Immersive Storytelling Residency program in 2020, a 6-month program featuring a technical artist, software engineer, and writer who collaborated on an immersive story called BonsaiAI. I wanted to have the first cohort come on the podcast for a retrospective, and so I invited Trent Spivey, Fernando Goyret, and Alex Moro along with METL director Ryan Schmaltz to reflect on their journey, challenges, and lessons learned from their immersive storytelling artist residency.

This was a really interesting experience for me as well in 2020, as I got more involved with providing my intuitive reactions and feedback along the way on a project that was still in development. One of the hardest problems in experiential design is matching the top-down design with the bottom-up actual experience. Films are able to have a small gap between the representations of a pitch, idea, and overall story relative to how it ends up on the screen. But games and immersive experiences that involve user agency require more of a bottom-up approach with a lot of user testing, while still needing to find a way for the overall experience to have a narrative throughline from beginning to middle to end. Being able to predict how changing certain conditions within an experience affects the overall feeling of the experience is the essential and existential challenge of all experiential design. So there are a lot of reflections upon that process within the context of bringing three strangers together for six months in the middle of a pandemic in order to figure some of that out.

If you’re interested in the 2021 residency, then January 31st, 2021 is the deadline for the six-month Immersive Storytelling Residency Program that runs from May 1st to November 1st, 2021. There are three different residency roles that will be collaborating over the six months: a game developer/programmer, a 3D modeler/technical artist, and a screenwriter/producer. There’s a monthly $3500 stipend, and a requirement to relocate to Winston-Salem, North Carolina for the duration of the 6-month residency. If you’re interested in applying, then be sure to also check out this conversation with METL director Ryan Schmaltz to get more context and details for the 2021 edition.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

The-Devouring
The Devouring is an epic, 5-6 hour horror adventure game within VRChat that launched on August 14, 2020. It quickly went viral in the VRChat community because it was a unique, shared experience that had a vast world to explore, an evocative soundtrack, enemies that catalyzed unique reactions, a recursive map that allowed groups to split up and reconvene, and other technological innovations like late joiner capability that facilitated new social dynamics and group gameplay. Overall, it was a unique bonding experience as users had to commit to the full 5-6 hours of the experience as there was no way to save progress, and it ended up being a very satisfying and memorable journey.

The Devouring caught the attention of Raindance curator Mária Rakušanová, who created a new category of Best Immersive World so that VRChat worlds like this could compete in the Raindance Immersive Festival. I was invited to be a juror, and after going through the 5-hour experience, the jury awarded The Devouring Best Immersive World, Best Immersive Game, and runner-up for Best Multiplayer Game. We were all impressed with the novel social dynamics, gameplay innovations, and level of polish and interaction that went beyond anything else we had seen from a VRChat world.

4ponies
The Devouring started off as a much smaller project for the Spookality Halloween worldbuilding contest in 2019, but the 4 Poneys Team blew through the initial deadline, and the project soon expanded in size and scope over the course of an epic, 10-month production timeline. This volunteer team includes CyanLaser, Legends, Lakuza, and Jen Davis-Wilson (aka Fionna), who all met each other through the VRChat Prefabs community of worldbuilders. That community started on the VRChat Discord, but was spun off into its own Discord server in order to share knowledge, keep track of free worldbuilding assets, and explore the newest worlds through the community meetup.

After watching CyanLaser’s Devouring Tech Overview at the Prefabs TLX conference on December 5th, I reached out to The Devouring team to do a full retrospective of their creative journey and the launch of one of the most impressive and successful VRChat immersive experiences yet. It’s not only an epic recap of this project, but also a window into the mission-driven Prefabs worldbuilding community that’s pushing the edges of innovation and knowledge sharing within VRChat.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

Here’s the trailer for The Devouring:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

virtual-market-5
Virtual Market 5 opens today, December 18th, 2020 at 6 pm PST (Dec 19th at 11 am JST), and will run until January 10th at 6 am PST (11 pm JST). It will feature over 1500 VR artists and creators in 30+ worlds in VRChat, featuring avatars, meshes, clothes, music, prefabs, shaders, tutorials, and all sorts of other virtual goods for VR enthusiasts and immersive creators. It’s the most impressive virtual expo of the year, with some of the most awe-inspiring worlds and virtual experiences.

The first Virtual Market (aka Vket) started in the Japanese VRChat community on August 26, 2018, and featured around 80 artists and creators. HIKKY is the Japanese company that’s been running the virtual markets, and so there’s been a distinct imprint of Japanese culture and language in Vket 1-4. But the Vket Global Team has been bringing in more international creators and creating English language options to make it more accessible to non-native Japanese speakers.

pfp
There’s an impressive amount of innovation and creativity when it comes to worldbuilding and experiential design for the consumer experience, especially as the Virtual Market is run and managed with 80% volunteer labor. There are a number of corporate sponsor worlds with companies from around the world who are interested in experimenting with experiential marketing, but also in connecting to the bleeding edge of virtual culture within VRChat.

lhun
I talk with the Director of the Vket Global Team, LilBagel, as well as the CTO and Process Manager, Lhun, on Wednesday, December 16th, after getting a 2.5-hour tour and sneak peek of some of the new worlds for Vket 5. I’m really blown away by the increased level of worldbuilding and experiential design, and the size, scope, and professional polish of the Virtual Market, which won a well-deserved AIXR award for best marketing experience in 2020 with Virtual Market 4. We explore some of the deeper context for why Vket originally came about, and how it’s evolved over the years. I also get some of the history of how Japanese anime culture has become so ubiquitous in apps like VRChat.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

Check out Shanie MyrsTear’s preview of Virtual Market 5

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

uncsa-immersive-storytelling

Note: This is a sponsored content post from the University of North Carolina School of the Arts

January 31st, 2021 is the deadline for a six-month Immersive Storytelling Residency Program that runs from May 1st to November 1st, 2021 as a part of the University of North Carolina School of the Arts’ Media + Emerging Technology Lab (METL). There are three different residency roles that will be collaborating over the six months including a game developer/programmer, 3D modeler/technical artist, as well as screenwriter/producer. There’s a monthly $3500 stipend, and a requirement to relocate to Winston-Salem, North Carolina for the duration of the 6-month residency.

I talk with METL Director Ryan Schmaltz to get more context on the emerging media and XR programs at METL, but also do a bit of a recap of the first artist residency program that ran from March to September of 2020. We talk about some of the lessons learned from the first cohort with the biggest shift of moving the program to span May to November (rather than March to September) in order to make it more accessible to graduating seniors.

We also talk about some of the open problems and challenges from design to implementation of an immersive storytelling project, and the opportunity to get feedback from a number of different mentors. I served as a mentor for the first cohort, and I’m planning on returning again for the next cohort to help provide feedback and guidance along the way. METL’s Immersive Storytelling Residency Program is a unique opportunity to get paid a living wage to work on an immersive project with a team for six months, and you can get more information on their website as well as more context from Schmaltz in this episode.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

pluto-vr
Pluto VR is a general-purpose VR telepresence application that hopes to provide the social presence glue for a standards-driven, multiple-application ecosystem using SteamVR on the PC. I had some new conceptual breakthroughs about the potential future of spatial computing after getting a demo of how Pluto VR works with other applications like Metachromium or Aardvark. Metachromium can run entire WebXR applications as an overlay on SteamVR applications, and Aardvark is a future-forward framework that allows for the development of augmented reality “gadgets” that run on top of virtual reality experiences.

All of these technologies are utilizing the overlay extension of OpenXR in order to overlay augmented layers of reality on top of VR experiences, and they’re working together in a way that will facilitate emergent behaviors that break out of the normal 2D frames of our existing computing paradigm. When you run an application on mobile or on your computer, there’s usually only one application that’s in focus at any given moment. You can context-switch between apps, or copy and paste, but our computing paradigm has all been happening within the context of these 2D frames and windows.

The promise of spatial computing is that we’ll be able to break out of this 2D frame, and create a spatial context that allows for apps to directly interact with each other. This will originally happen through lighting, opacity shifts, or occlusion between 3D objects, but eventually there will be more complicated interactions, collisions, and emergent behaviors that are discovered between these apps.
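As a rough illustration of what the opacity-shift layering involves, here is a generic straight-alpha “over” blend of an overlay pixel onto a scene pixel in TypeScript. This is the standard Porter-Duff compositing formula, shown purely as a sketch; it is not the specific compositing path that Metachromium, Aardvark, or SteamVR actually uses.

```typescript
// Generic straight-alpha "over" compositing of an AR overlay pixel onto a
// VR scene pixel -- an illustration of opacity-based layering, not the
// actual pipeline of any of the apps discussed above.

type RGBA = { r: number; g: number; b: number; a: number };

// Porter-Duff "over": the overlay (top) is composited in front of the scene.
function over(top: RGBA, bottom: RGBA): RGBA {
  const a = top.a + bottom.a * (1 - top.a);
  const blend = (t: number, b: number) =>
    a === 0 ? 0 : (t * top.a + b * bottom.a * (1 - top.a)) / a;
  return {
    r: blend(top.r, bottom.r),
    g: blend(top.g, bottom.g),
    b: blend(top.b, bottom.b),
    a,
  };
}

// A half-transparent red overlay gadget over an opaque blue scene pixel:
const overlay: RGBA = { r: 1, g: 0, b: 0, a: 0.5 };
const scene: RGBA = { r: 0, g: 0, b: 1, a: 1 };
const out = over(overlay, scene);
// out = { r: 0.5, g: 0, b: 0.5, a: 1 } -- an even purple mix.
```

Occlusion is the harder part of the problem, since it requires depth information from both layers rather than just per-pixel opacity, which is one reason depth maps come up in the Metachromium approach.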

Moving from 2D to 3D will allow application developers to break out of this metaphorical frame, but it also means that app developers won’t have as much control over the precise environmental and spatial context in which their application will be running. This is a good example of where I think the more process-relational thinking of Alfred North Whitehead’s Process Philosophy has a lot to teach XR application developers, in starting to think in terms of the deeper ecological context in which their spatial computing app is going to exist.

Facebook has not even made it possible to run multiple VR applications at once on either its PC or Quest platforms. It’s Valve’s SteamVR that is providing the platform on PCs for experimentation and innovation here. It’s admittedly a bit cumbersome to launch and connect each of these disparate applications together, but over time I expect the onboarding and overall user experience to improve as value is discovered for what types of augmentations can be provided with these overlay layers. Pluto VR has an opportunity to become the Discord of VR by providing a persistent social graph and real-time context for social interactions that transcends any one VR application. It’s an app that you can hop into before diving into a multiplayer experience, but it’s also enabling players to stay connected during loading screens and other liminal and interstitial virtual spaces, like the Matrix home screen of SteamVR.

Pluto VR has been working with a number of open standards that will be driving innovation on XR as an open platform, including WebRTC, glTF, VRM, OpenXR, WebXR, Web Bundles, and XRPackage (XRPK), as well as the Immersive Technology Media Format (ITMF) from the Immersive Digital Experience Alliance (IDEA). They also hosted the W3C workshop on XR accessibility to get more insights for helping to cultivate accessible standards in XR. The Pluto VR team is taking a really future-looking strategy, hoping to help kickstart a lot of innovation when it comes to creating AR widgets that could be used in VR environments, and perhaps eventually be ported to proper AR applications.

Covering the emerging technologies of augmented and virtual reality since May 2014 has helped me to isolate some of the key phases in the development and evolution of a new medium.

communication-medium-as-process

First, there’s a new emerging technology platform that enables new affordances; then artists, creators, makers, & entrepreneurs create apps and experiences that explore the affordances of the new technology; then a distribution channel gets these experimental pieces of content into the hands of audiences; and finally, audiences are able to experience the work and provide feedback to both the tech platform providers and the content creators.

The OpenXR and WebXR standards are enabling distribution channels of immersive content through apps like Metachromium and Aardvark, and the OpenXR overlay extension allows this more modular AR gadget content to be used within the context of existing VR applications running on SteamVR. Then Pluto VR is connecting creators directly with their audiences to share their WebXR apps or Aardvark AR gadgets and get that real-time audience feedback loop. This has the potential to complete the cycle and catalyze a lot of experimentation and innovation when it comes to what types of AR apps and widgets prove to be useful within the context of these VR experiences.

Here’s a demo video that shows how a variety of WebXR applications can be launched within a shared, Pluto VR social context:

https://www.youtube.com/watch?v=uuN3KnUIJgw

I’ve had a number of interactions with the Pluto VR team over the past couple of years, and I’ve just been super impressed with their vision of where they want to take VR. They also likely have a lot of cash reserves, as they’ve kept a small footprint after raising a $13.9 million funding round announced on April 13th, 2017.

I had a chance to talk with two of Pluto VR’s co-founders, Forest Gibson and Jared Cheshier, on Friday, October 11th, after getting a demo that blew my mind about the future concepts of spatial computing. We cover their journey into VR, and how Tim Sweeney’s The Future of VR & Games talk at Steam Dev Days on October 12, 2016 laid out some of his vision of how the metaverse is going to include a lot of different applications within the same game engine-like, spatial context. So much of the VR industry, mobile computing, and PC applications have been stuck inside of a 2D, windowed frame and closed context that it was really refreshing to get a small taste of where all of this is going to go. They share some of their early surprises about spatial computing, and they know that there will be many other key insights and innovations to be discovered with the multi-application technology stack that they’ve been able to set up. This is a very community-driven effort, and they’ll be showing off their technology and connecting to the wider VR community during Virtual Market 5 (Vket 5) in VRChat starting on December 19th.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

matt-segall-process

matt-segall
Virtual reality has the potential to catalyze a paradigm shift in our concepts about the nature of reality, and one of the most influential philosophers on my thinking has been Alfred North Whitehead. His Process Philosophy emphasizes unfolding processes and relationships as the core metaphysical grounding, rather than static, concrete objects. The Stanford Encyclopedia of Philosophy entry contrasts some of the fundamental differences with Western philosophy:

Process philosophy is based on the premise that being is dynamic and that the dynamic nature of being should be the primary focus of any comprehensive philosophical account of reality and our place within it. Even though we experience our world and ourselves as continuously changing, Western metaphysics has long been obsessed with describing reality as an assembly of static individuals whose dynamic features are either taken to be mere appearances or ontologically secondary and derivative…

If we admit that the basic entities of our world are processes, we can generate better philosophical descriptions of all the kinds of entities and relationships we are committed to when we reason about our world in common sense and in science: from quantum entanglement to consciousness, from computation to feelings, from things to institutions, from organisms to societies, from traffic jams to climate change, from spacetime to beauty.

Virtual reality is all about experiences that unfold over time, and the medium is asking us to interrogate the differences between how we experience the virtual and the real. A slight shift of our metaphysical assumptions from substance metaphysics to process-relational metaphysics allows us to compare and contrast the physical and experiential dimensions of our experiences. Process philosophy opens up new conceptual frameworks to look at the world through the lens of dynamic flux, becoming, and experience as fundamental aspects of reality, rather than treating these processes as derivative properties of static objects.

I think process philosophy makes a lot more sense when thinking about the process of experiential design. Human beings are not mathematical formulas, which means you have no idea how other people will experience your immersive piece until you playtest it. There’s an inherently agile and iterative nature to game design, software design, and experiential design, where you have to test it lots of times with lots of people. This is different from the linear, waterfall approaches of constructing physical buildings or producing films, where there are clearly demarcated phases of pre-production, production, and post-production. For Whitehead, these iterative processes aren’t just metaphoric at the human scale; he’s suggesting that they reveal deep insights about the fundamental nature of reality itself as having a dynamic and participatory aspect of navigating non-deterministic potentials that’s really “experience all the way down.”

The thing that I love about Whitehead is that he was a brilliant mathematician who turned to philosophy later in his life, and so he has an amazing ability to make generalizations that deconstruct the linear and hierarchical aspects of language and build more sophisticated models of reality. He replaces physics as the fundamental science with biological organisms, or more abstractly with unfolding processes that are in relationship to each other. This creates a scale-free, fractal, geometrical way of understanding reality across the full range of microscopic and macroscopic scales.

Whitehead’s thinking has also impacted a wide range of areas including ecology, theology, education, physics, biology, economics, and psychology. Some specific examples include work in quantum mechanics, new foundations for the philosophy of biology, the psychedelic musings of Terence McKenna, and new pathways for integrating insights from Eastern philosophies, like Chinese philosophy.

On October 31st, I attended The Cobb Institute’s 2-hour program on Process Thought at a New Threshold, which brought together an interdisciplinary group of scholars, researchers, and practitioners who summarized how Whitehead’s Process Philosophy was transforming their specific domains. One of the presenters was philosopher Matthew D. Segall, one of my favorite Whitehead scholars, who writes and shares videos on his YouTube channel.

I wanted to get a full primer on Whitehead, his journey into philosophy, as well as how his thinking could facilitate a fundamental paradigm shift that the world needs right now. My experience is that VR and AR can provide an experiential shift in how we relate to ourselves and others, but Process Philosophy brings a whole other conceptual level that has the potential to unlock far more radical shifts in all sorts of ways. I also think that the spatial nature of VR and AR is particularly suited to producing embodied experiences of process-relational thinking, but also to helping artists and creators find a cosmological grounding that connects them more deeply to their own creative process of unlocking flow states and using the medium to communicate about their experiences in new ways.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality