For the past three years, Cris Miranda has been taking VR pioneers down the rabbit hole of the deepest philosophical implications of virtual reality on his Enter VR podcast. He has been seeking out the boundaries of our understanding of reality through these conversations, and much like Elon Musk did a few weeks ago, Cris has concluded that it’s most likely we’re already living within a simulation.

I had a chance to sit down with Cris at the Silicon Virtual Reality Conference to explore some of these deeper thoughts about the nature of reality through simulation theory, our predictions for the future of VR, how artificial intelligence will be dismantling the structures of society, the role of VR in cultivating empathy and shared realities, and the overall information war over what the story of reality is and whether we’re going to take control over our own destiny or let someone else define it for us.

LISTEN TO THE VOICES OF VR PODCAST

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Christine Heller is the co-founder and CEO of VR Playhouse, and she gave a talk with her co-founder Ian Forester at SVVR 2016 titled “How Fantasy Becomes Form” about the process of starting up a 360-degree video production studio. I had a chance to catch up with Christine at SVVR to talk about some of her main points, as well as how VR Playhouse wants to bring a psychedelic consciousness to VR in order to potentially activate dormant portions of our brain and ultimately bring more liberation into our everyday lives. We discuss some of these larger visions of VR, what Burning Man has to teach VR, her favorite narrative VR experiences, and the future of interactive storytelling within VR.

LISTEN TO THE VOICES OF VR PODCAST

Below are some of VR Playhouse’s 360-degree video productions:

Missing Pieces – Honda 360 music video

D∆WN: ‘Not Above That’ VR Experience

Profile of VR Playhouse

VR Playhouse Sizzle Reel

Teaser of “Surrogate,” an interactive CGI mixed with 360 video

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

One of the PlayStation VR titles at Sony’s E3 booth this week is RIGS: Mechanized Combat League, a competitive, 3-player-vs-3-player game that plays like a mixture between an FPS and basketball. The overall goal is for your team to run up a ramp and jump through a hoop, but you have to kill a member of the opposing team before you can do that. I had a chance to catch up with Guerrilla Games Lead Designer Gareth Hughes at Sony’s press event at GDC to talk about the game design process, some of the VR locomotion motion sickness issues that came up for me, and the process of cultivating an eSport within VR.

LISTEN TO THE VOICES OF VR PODCAST

Only time will tell whether or not people are able to cultivate enough individual and team skills within RIGS to the point that competitive leagues are formed. But from what I’ve seen so far, RIGS does have a lot of the key components in its 3Pv3P gameplay that make it perfectly primed to become one of the first major eSports in VR. If that’s the case, then either the future of competitive VR eSports will be populated with professional VR players who are immune to motion sickness, or there will need to be some VR design compromises that make this type of intense VR locomotion experience comfortable for everyone.

Another key component for the future of eSports is whether or not it’s interesting and exciting to watch for spectators. Forbes recently reported that “The studio is working on a spectator mode for virtual reality viewers to watch the action when they’re not one of the six players competing in the arena.” This is a game where the audience would probably have a better view of everything going on than an individual player, who can’t see the entire playing space from a first-person perspective.

RIGS: Mechanized Combat League is currently slated to be a launch title for the PlayStation VR in October.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

There were a lot of VR games announced over the past two days at E3 by AAA game publishers. I watched all of the press conferences from Sony, Ubisoft, Bethesda, Microsoft, EA, and Oculus’ appearance at the PC Gaming Show in order to capture audio highlights of all of the VR news that was presented. I distilled over 8 hours’ worth of content down to 41 minutes of the most interesting audio highlights and analysis in today’s episode of The Voices of VR podcast. You can listen to all of the major announcements in their own words, and find all of the new VR game trailers down below.

LISTEN TO THE VOICES OF VR PODCAST

Here’s the video of the Sony PlayStation E3 conference:

Sony PlayStation VR will launch on October 13th and have at least 50 games by the end of 2016, and below are the new trailers that were just released.

Resident Evil 7 biohazard will be released in 2017

A new horror game Here They Lie

Some portion of Final Fantasy XV will be in VR

Batman: Arkham VR

Farpoint is a sci-fi VR thriller that uses a new PSVR Aim controller

Star Wars: Battlefront X-Wing VR Experience from EA. More details from Criterion Games Producer James Svensson here, and more details on EA’s Frostbite Labs here.

Bound is an amazing-looking game that just got a new trailer:

Here’s a list of the 29 games that PlayStation VR will be showing at E3.

Here’s the archive of Ubisoft’s E3 Press Conference

Palmer Luckey and the Oculus team beat the Ubisoft developers in a game of capture the flag in Eagle Flight.

Here’s the trailer for the social game Star Trek: Bridge Crew

Oculus’ Anna Sweet announced SUPERHOT VR and Killing Floor: Incursion at the PC Gaming Show

SUPERHOT VR

Killing Floor: Incursion

Giant Cop

Wilson’s Heart

Serious Sam VR: The Last Hope

Oculus posted a blog post with 30 different launch titles for the Oculus Touch.

Here’s a list of Oculus Touch Launch titles named in the blog:

  • Luna by Funomena
  • Unspoken by Insomniac
  • Dead & Buried by Oculus Studios
  • Rock Band VR by Harmonix
  • VR Sports Challenge by Sanzaru
  • The Climb (now Touch enabled) by Crytek
  • Pro Fishing Challenge VR by Opus
  • I Expect You to Die by Schell Games
  • Job Simulator by Owlchemy Labs
  • Fantastic Contraption by Northway Games
  • Giant Cop by Other Ocean
  • Oculus Medium will also ship with Touch

Bethesda Softworks E3 Conference, where they announced Fallout 4 & Doom in VR

Microsoft E3 press conference is currently archived here.

Here’s a portion of the Minecraft update

Here’s Microsoft’s VR-ready console named Project Scorpio launching during the holidays of 2017

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip


AMD said the RX 480 can run VR on the Oculus Rift and HTC Vive headsets. It supports the DirectX 12 and Vulkan graphics standards, enabling high-end graphics effects. The Radeon RX series supports new monitor technologies as well as the HDMI 2.0b and DisplayPort 1.3/1.4 standards, and it has accelerated H.265 encoding and decoding for better video streaming at 4K resolutions at 60 frames per second.

Virtual reality brings us one step closer to realizing this amazing dream of piloting an X-wing in space. With this incredible new platform, you will be instantaneously transported from the comforts of your living room to an X-Wing cockpit in space. And with the artists at Criterion and DICE meticulously crafting every authentic detail, the experience feels more real than you can imagine.

Once you look outside your ship, you’ll start to take in and appreciate the sheer magnitude of the environment in which you’re sitting. Sure, you may think you know that a Blockade Runner or a Star Destroyer are big, but once you’re firing at Imperial forces alongside their hulls as you maneuver around their defenses, then you’ll truly comprehend the enormity of these starships. It’s intensely exciting!

When Ben Throop went to the Boston VR hackathon in June 2014, he didn’t know that Valve was going to be showing off prototype VR hardware with positional tracking. The Oculus DK2 had not shipped yet, so he drew on his soccer experience to build a VR game about heading soccer balls into a goal. He wanted to see how it felt, and was surprised that it was actually a lot of fun. He decided to continue working on it, and last year the game was first announced as a PlayStation VR launch title at E3. Frame Interactive will be back again this year showing off Headmaster at PlayStation VR’s E3 booth.

I had a chance to catch up with Ben at Sony’s GDC press event, where I talked to him about the game design principles behind Headmaster, why even non-gamers love to play it, and why the physics of things flying at your face are so compelling in VR.

LISTEN TO THE VOICES OF VR PODCAST

Here’s the trailer for Headmaster

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

KonceptVR has produced a couple of 360 videos for Broadway theater productions, including a promo with the cast of Hamilton as well as a music video with the cast of School of Rock. I had a chance to catch up with KonceptVR’s Max Bolotov at SVVR 2016, where he shared a lot of his lessons learned from working on Broadway, some history of 360 video rigs with Freedom 360, and one of the very first 360 music videos, produced by yellowBird in 2010, that inspired him to get into spherical video.

LISTEN TO THE VOICES OF VR PODCAST

Here’s the School of Rock video:

Here’s the Hamilton promo video:
hamilton-360-tonys-2016

And here’s the rocket launch video that KonceptVR was showing at SVVR 2016.

And here’s an unwrapped version of Professor Green’s “Coming to Get Me,” which was one of the first 360 music videos.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip



Xárene Eskandar is an artist who has been exploring representations of time through the mediums of film and photography for many years, and she has started to design VR experiences that alter the user’s perception of time based upon insights from cognitive science research. She’s also a Ph.D. candidate at UC Santa Barbara’s Department of Media, Arts, & Technology, and she presented a paper titled “Aesthetics of temporal perception and scale in virtual reality” at the Experiential Technology & Neurogaming Conference in San Francisco in May. I had a chance to catch up with Xárene to talk about filmmakers Béla Tarr & Andrei Tarkovsky’s experiments in time perception, as well as the cognitive science research into why awe, self-regulation, and the Extended Now impact time perception.

LISTEN TO THE VOICES OF VR PODCAST

Here’s Xárene’s poster titled “Aesthetics of temporal perception and scale in virtual reality”:
Aesthetics-of-temporal-perception

Filmmaker Andrei Tarkovsky wrote a book titled “Sculpting in Time” where he argues that people go to see films in order to experience time, and that the role of a director is to sculpt the time that the audience experiences:

What is the essence of the director’s work? We could define it as sculpting in time. Just as a sculptor takes a lump of marble, and, inwardly conscious of the features of his finished piece, removes everything that is not part of it—so the film-maker, from a ‘lump of time’ made up of an enormous, solid cluster of living facts, cuts off and discards whatever he does not need, leaving only what is to be an element of the finished film, what will prove to be integral to the cinematic image.

Here’s the opening scene of Andrei Tarkovsky’s The Sacrifice, which feels about 2-3 times longer than the actual run time.

And here are two of the potato peeling scenes in Béla Tarr’s The Turin Horse that also give an expanded sense of time:

For more Voices of VR interviews about time perception and time dilation in VR, be sure to check out my interviews with Gerd Bruder, Owlchemy Labs, Karl Krantz, and Sarah Northway.

Below are some links and excerpts from some of the cognitive science research that Xárene cites.

Rudd, Vohs, and Aaker connect awe and time perception in their paper titled “Awe Expands People’s Perception of Time, Alters Decision Making, and Enhances Well-Being”

Two psychological theories are also relevant. The first involves the extended-now theory (Vohs & Schmeichel, 2003), which suggests focusing on the present moment elongates time perception. Awe captivates people’s attention on what is unfolding before them, which the extended-now theory predicts would expand the perception of time. The second is Socioemotional Selectivity Theory (SST), which posits that people are motivated to acquire new knowledge when time feels expansive (Carstensen, Isaacowitz, & Charles, 1999). Awe triggers in people a desire to make new knowledge structures (Keltner & Haidt, 2003). Thus, a speculative suggestion from SST is that awe’s ability to stimulate the creation of mental schemas may be a signal that the mind perceives an expanded amount of time in response to awe.

Rudd, Vohs, and Aaker also provide a specific definition of awe with references to previous research on the topic:

One, awe involves perceptual vastness, which is the sense one has come upon something immense in size, number, scope, complexity, ability, or social bearing (e.g., fame, authority). Two, awe stimulates a need for accommodation, meaning it alters one’s understanding of the world (Keltner & Haidt, 2003)… Experiences involving awe, such as optimal athletic performances (Ravizza, 1977), peak experiences (Maslow, 1964), and spiritual or mystical events (Fredrickson & Anderson, 1999), often also involve a sense of timelessness (Csikszentmihalyi & Hunter, 2003). The phenomenology of awe, therefore, suggests it might expand perceptions of time.

Vohs & Schmeichel make the argument that “regulating the self can elongate the felt duration of time” in their paper titled “Self-Regulation and the Extended Now: Controlling the Self Alters the Subjective Experience of Time.” They define self-regulation as “operations by the self to alter its own habitual or unwanted responses to achieve a conscious or nonconscious goal.”

Because people who are self-regulating tend to monitor their behavior (Baumeister, Heatherton, & Tice, 1994), they are likely to be attuned to the passage of time (“How long has it been?”). These monitoring responses and resultant attention to time are not found among people who are not regulating. Stated simply, it is likely that the act of self-regulation is associated with close attention to time.

Does our conscious perception have a framerate? According to the model of Herzog, Kammer, & Scharnowski, it does. They say that our perceptions show signs of being “quasi-continuously and unconsciously analyzed with high temporal resolution” and that “consciousness arises only in time intervals of up to 400 milliseconds, with gaps of unconsciousness in between.” They elaborate on their model in their paper titled “Time Slices: What Is the Duration of a Percept?”

Even though we may only be synthesizing all of the unconscious streams of perceptual input every 50ms to 400ms, our ability to detect discrete changes differs across the senses:

Whereas two clicks can already be separated if they are only 1–3 ms apart [33], two taps need to be about 10 ms apart [33], and two flashes about 25 ms [34]. However, for trains of stimuli, the presentation rate at which the sensation of flicker ceases is similar for the visual and auditory systems, i.e., at around 16 ms inter-stimulus-interval (ISI) [35,36].
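To put those perceptual windows into a VR context, here’s a quick back-of-the-envelope calculation of my own (not from the paper): at the 90 fps refresh rate that the current consumer headsets target, dozens of rendered frames can fall within a single 400-millisecond window of conscious integration.

```python
# Back-of-the-envelope: how many rendered VR frames fit inside the
# perceptual integration windows described by Herzog, Kammer, & Scharnowski.

REFRESH_HZ = 90  # PSVR, Rift, and Vive all target 90 fps

def frames_in_window(window_ms, refresh_hz=REFRESH_HZ):
    """Number of whole rendered frames inside a perceptual window."""
    frame_ms = 1000.0 / refresh_hz
    return int(window_ms // frame_ms)

frame_ms = 1000.0 / REFRESH_HZ
print(f"Frame time at {REFRESH_HZ} fps: {frame_ms:.1f} ms")  # ~11.1 ms
print(frames_in_window(50))   # 4 frames in the shortest window
print(frames_in_window(400))  # 36 frames in the longest window
```

In other words, even the shortest 50ms window of unconscious analysis spans several rendered frames, which is one way of thinking about why headset framerates well above our conscious “framerate” still matter for presence.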

Finally, The Atlantic did a profile on Donald D. Hoffman, a professor of cognitive science at the University of California, Irvine, titled “The Case Against Reality,” in which he argues that the world is nothing like the one we experience through our senses. Our perception of time and reality has been shaped by many thousands of years of our senses evolving through interaction with the real world.

After reading this profile of Donald D. Hoffman, I’ve really started to question the underlying fabric of our reality and how much we’ve trained ourselves to completely ignore. Does VR have the capacity to provide virtual experiences that require our perceptual systems to evolve before we’re ready to receive them?

Virtual reality is uniquely poised to serve as a medium to continue to explore the nature of time perception and our perception of reality. There’s still a lot to learn, and this cognitive science research into awe and self-regulation is providing a lot of different ways to start putting these theories into practice.

UPDATE: Xárene wanted to make the following clarification:
I meant to say that there is not any research that relates body scale to time perception rather than time perception research in VR doesn’t exist. Specifically, there’s 2D research where the movement of a dot is used for the perception of duration, or the movement of the sun, and we accept those objects scaling on the screen but not relative to our body scale. What if instead we view those objects the same size relative to each other with the body scaling to bring them to their correctly understood scale? Then what is the relationship of the scaling body with time and how is duration perceived?

In psychology research where the scale of the body is studied, it is studied as a mental disorder, such as Alice in Wonderland syndrome or body dysmorphic disorder. So any view of the body other than what is considered ‘normal’ is a disorder. I’m looking for research that acknowledges that our body exists in different scales just as time and space have different scales. This discussion also brings up another topic on how the trans body is seen.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

I had a chance to try out VREAL’s VR streaming technology a few weeks ago, and I think they’ve made some key technological breakthroughs that make it possible to livestream an entire virtual reality experience. Rather than render out a 2D, first-person video or a third-person mixed reality video, VREAL sends a smaller set of serialized data so that you can render all of the VR objects, cameras, and internal state of a VR experience at 90 fps by utilizing your own PC hardware. I had a chance to catch up with CEO and founder Todd Hooper to talk about VREAL’s streaming plans, key features, and functionality, including video exports, the website ecosystem, and future plans. Just as streaming has become a big deal in video games, VR streaming is going to be an even bigger deal, since it combines the magic of social VR interactions with watching your favorite personalities be immersed within VR experiences.

LISTEN TO THE VOICES OF VR PODCAST
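To get an intuition for why streaming serialized scene state is so much cheaper than streaming rendered video, here’s a rough sketch. The message format and the object counts are hypothetical illustrations of the general approach, not VREAL’s actual protocol:

```python
# Hypothetical illustration (not VREAL's actual protocol) of streaming
# serialized scene state instead of rendered video frames.
import struct

def pose_message(object_id, position, rotation):
    """Pack one object's pose: a uint32 id, 3 floats of position,
    and a 4-float quaternion rotation -- 32 bytes total."""
    return struct.pack("<I3f4f", object_id, *position, *rotation)

# Suppose 100 dynamic objects update their poses every frame at 90 fps.
msg = pose_message(1, (0.0, 1.6, -2.0), (0.0, 0.0, 0.0, 1.0))
state_bytes_per_sec = len(msg) * 100 * 90

# Compare against a typical ~10 Mbps compressed video stream.
video_bytes_per_sec = 10_000_000 // 8

print(len(msg))                 # 32 bytes per object pose
print(state_bytes_per_sec)      # 288,000 bytes/s of scene state
print(state_bytes_per_sec < video_bytes_per_sec)  # True
```

Even with a hundred objects updating every frame, the state stream stays well below a single compressed video stream, and the viewer’s own GPU re-renders the scene locally at a full 90 fps from any vantage point.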

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Darshan Shankar started BigScreen VR with a deceptively simple premise of wanting to have virtual LAN parties with other gamers in VR, but using your computer screen as a center point of conversation has catalyzed a wide range of different use cases. In the first 3-4 weeks since their launch, the initial intended audience of gamers are indeed holding virtual LAN parties with 2D and 3D games, but there are also some surprising emergent behaviors. These range from business meetings, co-working rooms, ad-hoc training and technical support, application user testing, and the cultivation of business partnerships and niche communities, to enabling the expression of identity through your collection of digital artifacts. The computer screen is a “social lubricant” that kick-starts a conversation and allows people to connect with other people through playing, sharing, and exploring.

LISTEN TO THE VOICES OF VR PODCAST

The constraints on locomotion and the limit of four people per room encourage intimate conversations, but there are also some unexpected usage patterns. People hang out in BigScreen playing their own games and in some cases don’t directly connect with anyone else, just as one might go to a coffee shop to be able to focus on work. Playing a 2D game in BigScreen can mean a lower resolution, but the ambient presence of other gamers with similar interests and values is enough to have users come back again and again.

I had a chance to do a demo with BigScreen co-founder Hayden Lee, and then catch up with him after talking to some random users within BigScreen. We talked about the wide range of social behaviors, being able to rapidly iterate on a product where the developers can literally watch their users’ every move via their shared screens, and some of their future plans. There’s a clear list of features requested by their users, and because of the wide range of use cases and applications, I expect to see BigScreen VR become a big player in the future of social VR.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Philip Rosedale has been working with virtual worlds for a long time, having founded Second Life nearly 13 years ago in 2003. He co-founded High Fidelity in 2013, and he previously told me that he wanted to fix a lot of the things that were holding Second Life back. One big limitation was having all of the virtual worlds hosted on Linden Lab’s servers, and so Philip wanted to flip that model on its head by going with an open source model. Users will be able to host their own virtual spaces in a more scalable fashion, much like the Internet, but High Fidelity also has plans to leverage the idle GPU power of millions of machines to help render a high-resolution metaverse.

I’ve spoken to Philip at SVVR 2014 & 2015, and I had a chance to catch up with him again at SVVR 2016, where High Fidelity announced beta access to their Sandbox client. We talk about physics in VR, body tracking, 3D audio & social presence, experiments with SMI eye tracking, scaling the metaverse, avatar continuity for group collaboration, and the future of using VR as a neutral meeting ground to interface with AI robots.

LISTEN TO THE VOICES OF VR PODCAST

Here’s the Live Demo that Philip Rosedale gave at SVVR 2016

You can download the Sandbox client from here.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

