Dream was a series of 10 live performances over 8 days featuring motion-captured actors embodied as virtual characters within an immersive storyworld based on Shakespeare’s A Midsummer Night’s Dream, powered by Unreal Engine. The project was a research & development initiative funded by the United Kingdom’s Audience of the Future programme, involving the Royal Shakespeare Company, Marshmallow Laser Feast, the Philharmonia Orchestra, and the Manchester International Festival.
The team originally planned a site-specific, location-based experience that would play with different haptic and sensory experiences for audience members, but they had to make a digital pivot to an online performance in the midst of the pandemic. They set a goal of reaching 100,000 people with a show that had two tiers: a paid interactive experience and a free livestream of the live performance, mediated through the simulated environment and broadcast onto a 2D screen.
I had a chance to break down the evolution and journey of this project with Pippa Hill, Head of the Literary Department at the Royal Shakespeare Company, and with Robin McNicholas, Director at Marshmallow Laser Feast and director of Dream. We talked about the constraints and goals they were setting out to meet, which included featuring some of their R&D findings within the context of an experience. There was a lot of work in figuring out how to translate real-time motion capture into the puppeteering of virtual characters, along with some very early experiments in audience participation and limited interactivity, with an underlying goal of making the show accessible to a broad demographic ranging in age from 4 to 104 years old.
We explore some of the existential tradeoffs and design constraints they had to navigate, but overall Hill said that nothing was left on the cutting room floor in terms of the potential for these immersive technologies to continue shaping future experiments with live theater in the context of virtual reality, augmented reality, or mixed reality. There are also lots of exciting and difficult narrative challenges in figuring out different ways for the audience to participate in and interact with the story.
There are also opportunities to further explore a tiered model of participation with differing levels of interaction, as well as deeper underlying narrative structures and opportunities for audiences to exercise either individual or collective agency over how the story or experience unfolds.
In the end, there are probably more new questions than firm answers on a lot of these existential questions of interactive and immersive narratives, but the scale of Dream and the positive response it has received so far help to prove out that there is a potential market for these types of interactive narrative and live performance experiments. There was also a 60-question survey that I filled out afterwards, and so I expect even more empirical data and research insights to be digested and reported on in the future as well.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
Here are some behind-the-scenes video clips sent to me by members of the production team.
— Kent Bye VoicesOfVR (@kentbye) March 24, 2021
This is a listener-supported podcast through the Voices of VR Patreon.