Holonet is an open-source project that implements the Decentralized Identity specifications for open web VR platforms like WebXR. A self-sovereign identity system could enable the seamless portability of your avatar identity across multiple sites without having to use centralized authentication methods that would require you to log in to every site with a unique username and password, or to re-upload assets onto every metaverse world that you visit. The Decentralized Identity Foundation formed in May 2017 with a number of blockchain companies and bigger companies like Microsoft that came together to open source their blockchain identity IP in order to create a number of open standards for decentralized identity.

Investor Chris Dixon recently published an essay titled “Why Decentralization Matters” where he argues that some of the most exciting entrepreneurial & development opportunities are in building out a robust decentralized Internet architecture that leverages blockchain technologies and cryptonetworks. These decentralized systems are a counterbalance to the aggregated power of centralized companies like Google & Facebook, which currently dominate the online advertising market. Dixon argues that these companies initially collaborated with third-party developers to grow their ecosystems, but they all eventually started to focus more on “extracting data from users and competing with complements over audiences and profits.” In order to drive their advertising-based revenue models, Google and Facebook have pioneered methods of “surveillance capitalism” that track what users do online to form unified profiles, model behaviors, and ultimately match advertisers with potential customers.

VR & AR technologies will provide access to even more powerful biometric, emotional, embodied movement, and eventually eye tracking data, which sits at an unknown ethical threshold between predicting and controlling user behavior. Be sure to check out Voices of VR episode #520 for a more comprehensive write-up, discussion, and links to other episodes covering the complicated privacy issues that VR and AR introduce.

Holonet developer Elias hopes that one antidote to companies tracking your every movement and action in virtual worlds is to build compelling user experiences that leverage decentralized identity technologies to put control of your identity and data back into your own hands. He’s released a sparse toolkit to start integrating self-sovereign identity tools within WebVR sites on the open web, and he’s planning on working on integrations with the popular A-Frame WebVR framework.
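
To make that idea more concrete, here is a minimal sketch of what a password-less, DID-based login for a WebVR site could look like. This is not Holonet’s actual API: the resolver and wallet below are hypothetical stand-ins, and only the browser’s standard Web Crypto calls are real.

```typescript
// Hypothetical sketch of a challenge-response login with a decentralized identifier (DID).
// The wallet and resolver are illustrative stand-ins, not Holonet APIs.

interface DidDocument {
  id: string;                  // e.g. "did:example:123abc"
  publicKeyJwk: JsonWebKey;    // public key the site uses to verify signatures
}

interface IdentityWallet {
  did: string;
  sign(challenge: Uint8Array): Promise<ArrayBuffer>;  // signs client-side; the private key never leaves the user
}

async function loginWithDid(
  resolveDid: (did: string) => Promise<DidDocument>,  // assumed DID resolver
  wallet: IdentityWallet,
): Promise<boolean> {
  // The site issues a random challenge instead of asking for a username and password.
  const challenge = crypto.getRandomValues(new Uint8Array(32));
  const signature = await wallet.sign(challenge);

  // Look up the DID document and import the public key it advertises.
  const doc = await resolveDid(wallet.did);
  const key = await crypto.subtle.importKey(
    "jwk", doc.publicKeyJwk,
    { name: "ECDSA", namedCurve: "P-256" },
    false, ["verify"],
  );

  // If the signature verifies, the visitor controls the DID: no password, no central account.
  return crypto.subtle.verify({ name: "ECDSA", hash: "SHA-256" }, key, signature, challenge);
}
```

In a flow like this, the private key stays in the user’s wallet in their browser or headset, and the same DID could be presented to any WebVR site that supports the handshake, which is what would make an avatar identity portable rather than tied to one platform’s account system.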

LISTEN TO THE VOICES OF VR PODCAST

It’s still early days for where the open immersive web is headed, but High Fidelity is probably the most robust example of what’s possible with open web technologies. Co-founder Philip Rosedale told me that they’re planning on implementing a self-sovereign identity system, and High Fidelity also recently launched a beta of their own High Fidelity Coin cryptocurrency. Elias hopes that Holonet can provide tools for open web developers to create compelling user experiences that leverage the power of the open web with a decentralized user identity. There aren’t a lot of compelling experiences just yet, but if Dixon is right, then we’re going to be seeing a lot more decentralized cryptonetworks in the future, and infrastructure tools like self-sovereign identity are going to be a crucial ingredient for an open and portable metaverse that’s architected for privacy.

Other decentralized services mentioned in the podcast:

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality



Rose Colored by INVAR Studios & Adam Cosco won the award for best live-action VR experience at the Advanced Imaging Society’s Lumiere Awards on February 12th. I previously interviewed Cosco at VRLA last year, and I had a chance to talk with INVAR’s co-founder & chief creative officer Vincent Edwards & creative director Austin Conroy about their thoughts on the future of storytelling in VR at Kaleidoscope VR’s FIRST LOOK VR Market.

Rose Colored is a near-future speculative sci-fi story in the same cautionary-tale vein as Black Mirror, but with a bit more of an optimistic bent. Edwards identifies as an inveterate optimist, and enjoys the process of world-building potential futures in VR and exploring the moral quandaries of the logical extremes of how AR & AI technologies will impact our lives and romantic relationships. Conroy identifies as a storytelling geek, and is really interested in VR’s capability to allow you to embody a character using the visual storytelling affordances cultivated by cinema. It’s an open question how you can get the audience inside of a fictional character’s head, which he compares to building a mind.

LISTEN TO THE VOICES OF VR PODCAST

Edwards says that VR storytelling reminds him of the early days of the DIY punk rock scene in Los Angeles, where there was a lot of experimentation and a willingness to forget everything you know. There are a lot of lessons about visual storytelling that will come from film, while the interactive storytelling innovations for VR are more likely to come from game developers.

As far as where VR & AR go in the future, both Edwards and Conroy take inspiration from Buddhist and Hindu concepts. Conroy cites a passage from Eknath Easwaran’s translation of the Dhammapada saying that our experiences could be thought of as a projection, similar to how we experience the continuity of a story when a movie projects 24 frames per second onto a screen. Edwards says that if that’s true, then perhaps VR could provide us with training wheels to be able to cut through the matrix and “awaken from the dream that is Maya.” They acknowledge that these are some dense philosophical and metaphysical ideas, but that they’re part of the deeper motivations for INVAR Studios to create multi-platform stories that help us reflect on our identity and experiences in life, and to give us stories about potential futures that help us reconcile with the nature of reality today.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality



At SIGGRAPH 2017, NVIDIA was showing off their Isaac robot, which had been trained to play dominoes within a virtual world environment built with NVIDIA’s Project Holodeck. They’re using Unreal Engine to simulate interactions with people in VR to train a robot how to play dominoes. They can use a unified code base of AI algorithms for deep reinforcement learning within VR, and then apply that same code base to drive a physical Baxter robot. This creates a safe context to train and debug the behavior of the robot within a virtual environment, but also to experiment with cultivating interactions with the robot that are friendly, exciting, and entertaining. This allows humans to build trust in interacting with robots in a virtual environment so that they are more comfortable and familiar with interacting with physical robots in the real world.
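
As a rough illustration of that “unified code base” idea (this is not NVIDIA’s Isaac or Project Holodeck API, just a hypothetical sketch), the same rollout loop can drive either a simulated environment or a physical robot, so long as both implement a common interface:

```typescript
// Hypothetical sketch: the same agent code runs against a simulated environment in VR
// and, once it behaves safely, against a robot-backed implementation of the same interface.

interface Environment {
  reset(): number[];                                         // initial observation
  step(action: number[]): { obs: number[]; reward: number; done: boolean };
}

// Backed by a game-engine simulation, where failures are cheap and safe.
class SimulatedDominoEnv implements Environment {
  reset(): number[] { return new Array(16).fill(0); }
  step(action: number[]) {
    const reward = -action.reduce((sum, a) => sum + Math.abs(a), 0);  // toy stand-in for simulated physics
    return { obs: new Array(16).fill(0), reward, done: Math.random() < 0.05 };
  }
}

// The rollout loop never needs to know which environment it is driving, so a policy
// debugged in simulation can later be pointed at an Environment that talks to real hardware.
function runEpisode(env: Environment, policy: (obs: number[]) => number[]): number {
  let obs = env.reset();
  let totalReward = 0;
  while (true) {
    const { obs: nextObs, reward, done } = env.step(policy(obs));
    totalReward += reward;
    obs = nextObs;
    if (done) return totalReward;
  }
}

// Example: evaluate a trivial policy in simulation.
const episodeReturn = runEpisode(new SimulatedDominoEnv(), (obs) => obs.map(() => 0));
console.log(`simulated episode return: ${episodeReturn}`);
```

A policy can be trained and stress-tested against the simulated implementation, and only once it behaves safely would the same loop be pointed at an implementation that sends commands to the physical robot.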

I talked with NVIDIA’s senior VR designer on this project, Omer Shapira, at the SIGGRAPH conference in August, where we talked about using Unreal Engine and Project Holodeck to train AI, using a variety of AI frameworks that can use VR as a reality simulator, stress testing for edge cases and anomalous behaviors in a safe environment, and how they’re cultivating social awareness and robot behaviors that improve human-computer interactions.

LISTEN TO THE VOICES OF VR PODCAST

Here’s NVIDIA CEO Jensen Huang talking about using VR to train robots & AI:

If you’re interested in learning more about AI, then be sure to check out the Voices of AI podcast, which just released its first five episodes.

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality



The Sun Ladies VR is an amazing story about a group of Yazidi women from the Northern Iraq community of Sinjar who escaped sexual slavery and started an all-female unit to fight ISIS. ISIS raided the Sinjar District in August 2014 and massacred over 2,000 Yazidis, selling many women into slavery. A group of women who escaped decided to fight back, in part because ISIS fighters believe that they’ll go to hell if they’re killed by a woman. These women call themselves “The Sun Ladies,” and their story inspired activist and actor Maria Bello to form a team of war journalists and VR creators to travel to Iraq to capture their story of women’s empowerment.

The Sun Ladies premiered at Sundance, and I caught up with co-directors & producers Céline Tricart and Christian Stephen. Tricart is a VR cinematographer & director who recently published Virtual Reality Filmmaking: Techniques & Best Practices for VR Filmmakers, and Stephen is a British war journalist who directed the first VR piece from a war zone in 2015 with Welcome to Aleppo. I talked with Tricart & Stephen about the process of traveling to northern Iraq, building trust with the Sun Ladies in order to share their stories of empowerment, their creative use of illustrations, and what they see as the strengths and limitations of using VR to tell stories within these areas of conflict.

LISTEN TO THE VOICES OF VR PODCAST

Their production process was a fusion of lessons from their backgrounds in war journalism and VR cinematography, and they used VR’s ability to transport you to another place. VR cameras are easily mistaken for bombs, and so it was difficult to capture footage from the frontlines. This limitation inspired Tricart to reach out to Dear Angelica’s lead VR artist, Wesley Allsbrook, to create illustrated representations of the Sun Ladies fighting using Quill. In order to make that transition more seamless, they added a unique blend of illustrations on top of the cinematic 360-degree footage to emphasize the characters within the sparse landscape of Northern Iraq.

Both Tricart & Stephen wanted to avoid the trope of focusing on the tragedy and trauma of these women’s previous experiences, and instead tell the story of how they’re empowering themselves to fight back and protect their community. Stephen has a lot of deep insights about the dynamics of the region, and he points out that it was vital to have Tricart there to be able to listen and capture the stories of these women. The project was the brainchild of Bello, and she put together an amazing team that’s pushing the boundaries of immersive storytelling by going into the trenches to capture these types of stories. They blended in 2D footage, drawing upon Stephen’s knowledge from reporting in the region to establish the backstory, and then leveraged VR’s ability to transport you into other worlds to open a window into the Sun Ladies’ journey of recovery and empowerment. The Sun Ladies is a really inspiring story that captures a lot of intimate moments, and it fuses in an artistic style with Allsbrook’s Quill illustrations that really captures their fierce warrior spirits.

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality



Tender Claws’ Virtual Virtual Reality is a Daydream-exclusive interactive story that won Best VR Experience at the 2017 Google Play Awards, and it’s one of my personal favorite interactive narratives within VR. Virtual Virtual Reality (aka VVR) does a great job of balancing your control as a player with the authorial control of the story. You have the ability to openly explore the worlds, but it’s all within the context of a narrative that subtly guides you through a futuristic world where AI has taken over and is providing you with an immersive hero’s journey experience.

The main mechanic in VVR is putting on VR headsets while you’re in VR such that you incept into deeper and deeper layers of simulated reality, and you have to navigate through these worlds and do specific tasks to progress through the story world. It’s a deeply satisfying mechanic, and Tender Claws was able to create an entire story that hinges on this process of going deeper and deeper down the simulated rabbit hole.

I talked with Tender Claws co-founders Samantha Gorman and Danny Cannizzaro at VRLA in April 2017 about their experiential design and interactive storytelling insights from creating VVR. Gorman has been creating immersive stories in VR since 2002, and so there’s a level of polish and elegance of experiential design and storytelling that comes through in VVR, in their TendAR AR narrative at Sundance 2018, and in their upcoming follow-up to VVR, Scottsdale, which had an early prototype showing at Kaleidoscope VR’s FIRST LOOK Market.

LISTEN TO THE VOICES OF VR PODCAST

VVR is not only a fun and entertaining look at the future of our relationship to AI and immersive technologies, but Tender Claws is also using quite a lot of cutting-edge AI techniques for content generation within the experience. They’re also pushing the boundaries for storytelling in AR with TendAR. Cannizzaro said that Tender Claws is really interested in blurring the lines between the digital and real worlds, and that VR has been an excellent prototyping tool to develop some of the storytelling ideas they hope to apply within AR, which they have started to put into practice with TendAR.

Tender Claws is one of the most progressive and forward-thinking content studios in the VR and AR space, and they’re pushing the boundaries of interactive storytelling with each of their projects, starting with an award-winning iPad-based story called PRY. They really got the most out of the limited 3-DoF Daydream controller by using interactive leashes for what felt like seamless gameplay and interaction mechanics within Virtual Virtual Reality. Their writing strikes a great balance between humor, surrealism, and character development, and they found a great combination of exploration and storytelling in their experience. Be sure to check out Virtual Virtual Reality on the Daydream headset, and I’m looking forward to learning more about their TendAR and Scottsdale projects later in 2018.

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality

Tender Claws, the creators of the award-winning interactive VR narrative Virtual Virtual Reality, premiered a new site-specific, interactive AR narrative experience called TendAR at the Sundance New Frontier. It was a social augmented reality experience that paired two people holding a cell phone and sharing two channels of an audio stream featuring a fish that guided the participants through a number of interactions with each other and with the surrounding environment. The participants were instructed to express different emotions in order to “feed” and “train” the AI fish. TendAR used Google’s ARCore technology for the augmented reality overlays, Google’s Cloud Vision API for object detection, as well as early access to some of Google’s cutting-edge Human Sensing technology that could detect the emotional expressions of the participants.

Overall, TendAR was a really fun and dynamic experience that showed how the power of AR storytelling lies in doing interesting collaborative exercises with another person, but also in becoming aware of your immediate surroundings, context, and environment, where objects can be discovered, detected, and integrated as part of an interaction with a virtual character.

I had a chance to talk with Tender Claws co-founder Samantha Gorman about their approach to experiential design for an open-ended interactive AR experience, the unique affordances and challenges of augmented reality storytelling, their collaboration with the interactive storytelling theater group Piehole, the challenges of using bleeding-edge AI technologies from Google, and some of their future plans for expanding this prototype into a full-fledged, 3-hour, solo AR experience with a number of excursions and social performative components.

LISTEN TO THE VOICES OF VR PODCAST

Here’s a teaser trailer for TendAR

Gorman said that they’re not planning on storing or saving any of the emotion recognition data on their side, and this is the first time that I’ve ever heard anything about Google’s Human Sensing group. I trust Tender Claws to be good stewards of my emotional data, and their TendAR experience shows the potential of what types of immersive narrative experiences are possible when integrating emotion detection as an interactive biofeedback mechanic. Mimicking a wide range of emotional expressions can evoke a similarly wide range of emotional states, and so I found that TendAR provided a really robust emotional journey that was a satisfying phenomenological experience. TendAR was also an emotionally intimate experience to share with a stranger at a conference like Sundance, but it demonstrates where AR storytelling starts to shine: creating contexts for connection and opportunities to create new patterns of meaning in your immediate surroundings.

However, the fact that Google is working on technology that can capture and potentially store emotional data about users introduces some more complicated privacy implications that are worth expanding upon. Google and Facebook are performance-based marketing companies that are driven to capture as much data about everyone in the world as possible, and VR & AR technologies introduce the opportunity to capture much more intimate data about ourselves. Biometric data and profiles of our emotional reactions could reveal unconscious patterns of behavior that could be ripe for abuse, or be used to train AI algorithms that reinforce the worst aspects of our unconscious behaviors.

I’ve had previous conversations about privacy in VR with behavioral neuroscientist John Burkhardt, who talked about the unknown ethical threshold of capturing biometric data, and how the line between advertising and thought control starts to get blurred when you have access to biometric data that can unlock unconscious triggers that drive behavior. VC investor and privacy advocate Sarah Downey talked about how VR could become the most powerful surveillance technology ever invented, or it could become one of our last bastions of privacy if we architect systems with privacy in mind (SPOILER ALERT: most of our current systems are not architected with privacy in mind, since they’re capturing and storing as much data about us as possible). And I also talked with VR privacy philosopher Jim Preston, who told me about the problems with the surveillance-based capitalism business models of performance-based marketing companies like Google and Facebook, and how privacy in VR is complicated enough that it’s going to take the entire VR community having honest conversations about it in order to figure it out.

Most people get a lot of benefit from these services, and they’re happy to trade their private data for free access to products and services. But VR & AR represent a whole new level of intimacy and detail of information that is more similar to medical data protected by HIPAA regulations than it is to data that is consciously provided by the user through a keyboard. It’s been difficult for me to have an in-depth and honest conversation with Google or with Facebook/Oculus about privacy because the technological roadmap for integrating biometric data streams into VR products or advertising business models has still been in the theoretical future.

But the news that Google’s Human Sensing department is building products for detecting human emotions shows that these types of products are on the technological roadmap for the near future, and that it’s worth having a more in-depth and honest conversation about what types of data will be captured, what won’t be captured, what will be connected to our personal identity, and whether or not we’ll have options to opt out of data collection.

Here’s a list of open questions about privacy for virtual reality hardware and software developers that I first laid out in episode #520:

  • What information is being tracked, recorded, and permanently stored from VR technologies?
  • How will Privacy Policies be updated to account for Biometric Data?
  • Do we need to evolve the business models in order to sustain VR content creation in the long-term?
  • If not, then what are the tradeoffs to privacy in using the existing ad-based revenue streams that are based upon a system of privatized surveillance that we’ve consented to over time?
  • Should biometric data be classified as medical information and protected under HIPAA?
  • What is a conceptual framework for what data should be private and what should be public?
  • What type of transparency and controls should users expect from companies?
  • Should companies be getting explicit consent for the types of biometric data that they capture, store, and tie back to our personal identities?
  • If companies are able to diagnose medical conditions from these new biometric indicators, then what is their ethical responsibility to report this to users?
  • What is the potential for anonymized physical data to end up being personally identifiable using machine learning?
  • What controls will be made available for users to opt out of being tracked?
  • What safeguards will be in place to prevent the use of eye tracking cameras to personally identify people with biometric retina or iris scans?
  • Are any of our voice conversations being recorded for social VR interactions?
  • Can VR companies ensure that there are any private contexts in virtual reality where we are not being tracked and recorded? Or is recording everything the default?
  • What kinds of safeguards can be imposed to limit the tying of our virtual actions to our actual identities in order to preserve our Fourth Amendment rights?
  • How are VR application developers going to be educated and held accountable for their responsibilities regarding the types of sensitive, personally identifiable information that could be recorded and stored within their experiences?

The business models of virtual reality and augmented reality have yet to be fully fleshed out, and the new and powerful immersive affordances of these media suggest that new business models may be required that both work well and respect user privacy. Are we willing to continue to mortgage our privacy in exchange for access to free services? Or will new subscription models emerge within the immersive media space where we pay upfront to have access to experiences, similar to Netflix, Amazon Prime, or Spotify? There are a lot more questions than answers right now, but I hope to continue to engage VR companies in a dialogue about these privacy issues throughout 2018 and beyond.

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality



The HaptX Glove that was shown at Sundance was one of the most convincing haptics experiences that I’ve had in VR. While it was still primitive, I was able to grab a virtual object in VR and, for the first time, have enough haptic feedback to convince my brain that I was actually grabbing something. Their glove uses a combination of exoskeletal force feedback with their patented microfluidic technology, and they’ve significantly reduced the size of the external box driving the experience since the demo that I saw at GDC (back when they were named AxonVR), thanks to a number of technological upgrades and ditching the temperature feedback.

I had a chance to talk with CEO & co-founder Jake Rubin and Chief Revenue Officer Joe Michaels at Sundance, where we talked about why enterprise & military training customers are really excited about this technology, some of the potential haptics-inspired interactive storytelling possibilities, how they’re refining the distribution of haptic resolution and fidelity to provide the optimal experience, and their collaboration with SynTouch’s texture-data models in striving towards a haptic display technology that can simulate a wide range of textures.

LISTEN TO THE VOICES OF VR PODCAST

HaptX was using a Vive tracker puck for arm orientation, but they had to develop customized magnetic tracking to get the level of precision required to simulate touch, and one side effect is that their technology could start to be used as an input device. Some of HaptX’s microfluidic technologies, combined with a new air valve that is 1000x more precise, could also enable unique haptics technologies that could have some really interesting applications for sensory replacement or sensory substitution, or start to be used to assist data visualizations, similar to the way sound can convey data through a process called sonification.

Overall, HaptX is making rapid progress and huge leaps with their haptics technologies, and they’ve crossed a threshold of being useful enough for a number of different enterprise and military training applications. Rubin isn’t convinced that VR haptics will ever be able to fully trick the brain in a way that’s totally indistinguishable from reality, but they’re getting to the point where it’s good enough to start to be used creatively in training and narrative experiences. Perhaps soon we’ll be seeing some of HaptX’s technology in location-based entertainment applications created by storytellers who got to experience their technology at Sundance this year, and I’m really looking forward to seeing how their haptic texture display evolves over the next year.

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality



Dinner Party is an immersive exploration of Betty and Barney Hill’s alien abduction story that premiered at the Sundance New Frontier. Rather than using normal alien abduction tropes, writers Laura Wexler & Charlotte Stoudt chose to use the spatial affordances of VR to present a symbolic representation of each of their experiences to highlight how vastly different they were. Betty and Barney were an interracial couple in New Hampshire, and the encounter with aliens was a positive peak experience for Betty, but Barney had the opposite experience, which Wexler & Stoudt attribute to his experience as a black man in the early 1960s. Inspired by passages of Barney’s hypnosis recordings posted online, Wexler & Stoudt expanded the Hills’ story into an immersive VR narrative at the New Frontier Story Lab, and collaborated with director Angel Manuel Soto to bring this story to life in VR.

Dinner Party is the pilot episode of a larger series called The Incident, which explores the aftermath of how people deal with a variety of paranormal or taboo experiences. Wexler & Stoudt are using these stories to explore themes of truth and belief such as: Who is believed in America? Who isn’t? What’s it feel like to go through an extreme experience that no one believes happened to you? And can VR allow you to empathize with someone’s extreme subjective experience without being held back by an objective reality that you believe is impossible?

Dinner Party is a great use of virtual reality storytelling, and it was one of my favorite VR experiences at Sundance this year. It has a lot of depth and subtext that goes beyond what’s explicitly said, and I thought they were able to really use the affordances of VR to explore a phenomenological experience in a symbolic way. It’s a really fascinating exploration of radical empathy using paranormal narrative themes that you might see in The X-Files or The Twilight Zone, and I look forward to seeing what other themes are explored in future episodes.

LISTEN TO THE VOICES OF VR PODCAST

Here’s a teaser for Dinner Party

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality

Sundance New Frontier had a solid line-up of VR experiences this year with a number of immersive storytelling innovations, including SPHERES: Songs of Spacetime, which takes you on a journey into the center of a black hole. It’s a hero’s journey that provides an embodied experience of the evolution of a star from birth to death, with a poetic story written and directed by Eliza McNitt, narrated by Jessica Chastain, and produced by Darren Aronofsky’s Protozoa Pictures.

SPHERES made news for being acquired in a seven-figure deal, and it represents a unique collaboration between science and art. There were a number of scientific collaborators, including the National Academy of Sciences and physicists who study black holes, and so the VR producers had to come up with creative interpretations of mathematical descriptions of the edges of spacetime that push the frontiers of our scientific knowledge.

I had a chance to sit down with McNitt at Sundance to talk about the inspiration for this project, her journey into creative explorations of science, the challenges of depicting gravitational lensing in Unity, what’s known and not known about black holes, how listening to gravitational waves for the first time inspired the sound design, and crafting an embodied hero’s journey story in collaboration with Protozoa Pictures. The acquisition deal by CityLights was secured on Kaleidoscope’s funding platform and includes this first chapter shown at Sundance as well as two additional chapters yet to be produced, which will be released later this year by Oculus.

LISTEN TO THE VOICES OF VR PODCAST

Here’s a promo for SPHERES produced by Sundance.

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality

Jeremy Bailenson is the founding director of Stanford University’s Virtual Human Interaction Lab, and his latest book, Experience on Demand, traces his academic journey through virtual reality. It’s an intellectual memoir that focuses on his personal work in VR and the insights that VR provides into human communication dynamics, as well as the impact of VR on our identity, empathy, education, medicine, and our ability to understand complex issues such as global warming and our impact on the environment. (UPDATE: Becoming Homeless: A Human Experience is now available on Steam.)

I had a chance to sit down with Bailenson to talk about his journey into VR, the major insights that VR has provided into human communication, and how STRIVR, the company he co-founded, is moving from training elite quarterbacks in the NFL to landing major corporate training contracts, including training Wal-Mart employees. STRIVR is gathering one of the most robust data sets for using VR for education and training, which is enabling them to build statistical models that make connections between unconscious biometric gaze data and the process of learning. We also talk about how AI and machine learning will help build powerful models from biometric data, but also some of the privacy implications of this data, as well as what we know and don’t know when it comes to the risks and dangers of virtual reality technology.

LISTEN TO THE VOICES OF VR PODCAST

This is a listener-supported podcast; please consider making a donation to the Voices of VR Podcast Patreon

Music: Fatality

