Cymatic Bruce is the head of developer relations at AltSpace VR, and he talks about the three main goals for their company:

  • Enable social interactions within virtual environments
  • Connect the 2D web to VR in order to share media and livestreamed events
  • Enable developers to bring 3D web content into shared VR spaces

Part of what makes AltSpace VR unique is their mission to create integrations with the open web. They’re building their application on top of Unity at first in order to ensure a stable, accessible platform that performs well, but their long-term plan is to enable developers to create their own customized virtual environments.

Some of the topics that we discussed are:

  • Experiencing livestreamed events in VR with a collaborative web browsing component, as well as sharing animated GIFs and viral videos in a group context.
  • Plans to integrate with multiple HMDs and various input devices.
  • Their focus on non-verbal cues and limb tracking, since facial tracking with an HMD is a difficult problem.
  • Their unique point-and-click teleportation solution to VR locomotion.
  • Integrating an open-source audio solution with volume attenuation and positional VoIP to create realistic sound profiles (see the attenuation sketch after this list).
  • Plans for virtual environment customizations via their SDK and API to create Ready Player One-like environments for expressing identity.
  • Their pragmatic approach of tackling the low-hanging-fruit features of the Metaverse first.
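
To make the volume-attenuation idea concrete, here’s a minimal sketch of how a positional VoIP system might scale a remote speaker’s voice by distance, using the clamped inverse-distance model that’s common in game audio (it mirrors the Web Audio PannerNode’s “inverse” formula). AltSpace VR hasn’t published their audio code, so the function and parameters below are purely illustrative.

```typescript
// Hypothetical distance-attenuation gain for a remote speaker's voice.
// This mirrors the Web Audio "inverse" distance model; AltSpace VR's
// actual implementation is not public, so treat this as an assumption.
function voiceGain(
  distance: number,  // meters between listener and speaker
  refDistance = 1,   // full volume at or inside this radius
  rolloffFactor = 1  // how quickly volume falls off past refDistance
): number {
  const d = Math.max(distance, refDistance);
  return refDistance / (refDistance + rolloffFactor * (d - refDistance));
}

// Example: a speaker 5 meters away comes through at 20% volume.
console.log(voiceGain(5)); // 0.2
```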

Cymatic Bruce says that part of what makes social VR so compelling is that memories are created based upon locations in VR, and it’s made even more salient when interacting with our friends and family. There are already people who are creating shared social experiences of watching TV shows and sports events together, and he foresees AltSpace VR being a part of how people experience these types of events in the future.

Be sure to sign up for AltSpace VR’s beta in order to check out some of their social experiences.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Nick “RedOfPaw” Pittom talks about the process of translating scenes from Studio Ghibli films into VR experiences, like the Boiler Room scene from Spirited Away and the Bus Stop scene from My Neighbor Totoro.

He makes the observation that translating a scene into VR doesn’t mean that it’s the best way to experience a film’s story or characters, and that the best VR experiences will be custom-made to take into account the pacing and environmental strengths of VR.

Nick also talks a little bit about his new Crystal Rift experience that he’s helping out with, as well as his successfully Kickstarted trip to Oculus Connect and the current plan for his “RedOfPaw’s Big Crazy Stupid VR Adventure and American Pie.”

Theme music: “Fatality” by Tigoolio

Tom Kaczmarczyk is a developer of SUPERHOT, an FPS where time moves only when you move, which recently raised $250k on Kickstarter. It started as a popular 2D game developed as a part of the 2013 7-Day First Person Shooter game challenge.

Tom talks about the evolution of the game, and how they first started the VR version after Aaron Davies from Oculus VR reached out and offered them a DK1 HMD to experiment with. He talks about the challenges of creating a 2D game first and then adding VR support, as well as some of the ways that SUPERHOT might have been developed differently had it been designed for VR first rather than having VR compatibility added to a 2D game.

Theme music: “Fatality” by Tigoolio

Jens Christensen is the CEO and co-founder of Jaunt VR, a VR startup that has raised over $35 million to develop the toolchain for live-action, 360-degree spherical, 3D video. Jaunt’s technology combines “computational photography, statistical algorithms, massively parallel processing, cutting-edge hardware and virtual reality.”

Jens talks about Jaunt VR’s mission to capture the world around you and put it within a VR experience. It requires capturing a ton of video data, and they’ve developed a massively parallel data-farm pipeline to stitch the footage together into fully spherical video. They’re also developing a method for capturing ambisonic 3D audio.
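
To give a sense of what “fully spherical video” means at playback time, here’s a minimal sketch of the equirectangular mapping, the common storage convention for spherical video, which turns a view direction into texture coordinates in the video frame. Whether Jaunt uses this exact projection internally is an assumption; the math below is just the standard convention.

```typescript
// Map a unit-length view direction to (u, v) texture coordinates in an
// equirectangular video frame. This is the common convention for storing
// spherical video; Jaunt's internal format is an assumption here.
function directionToEquirect(x: number, y: number, z: number): [number, number] {
  const longitude = Math.atan2(x, -z);       // [-π, π], 0 = straight ahead
  const latitude = Math.asin(y);             // [-π/2, π/2], assumes |(x,y,z)| = 1
  const u = longitude / (2 * Math.PI) + 0.5; // [0, 1] across the frame width
  const v = 0.5 - latitude / Math.PI;        // [0, 1] from top (up) to bottom
  return [u, v];
}

// Looking straight ahead (down -z) samples the center of the frame.
console.log(directionToEquirect(0, 0, -1)); // [0.5, 0.5]
```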

The Jaunt VR Twitter account has been posting photos from covering a wide range of sporting events, and Jens says that Jaunt is very interested in eventually providing live broadcasts of immersive content. Their primary focus in the beginning is sports and music events, along with experiments in narrative content ranging from a horror story to a story set in World War II. They haven’t released any VR content to the public yet, but are looking forward to releasing some experiences soon.

Jens said that there is no full positional tracking yet; at the moment the video is only rotationally tracked, but it’s something they’re still looking into. Storytelling within VR is going to require developing a new language and syntax, and in the beginning they prefer to keep the camera steady to minimize motion sickness.

Jaunt VR is more interested in leveraging the existing video post-production toolchain for editing, compositing and color correction than in building new tools from scratch. It sounds like they’re likely working on plug-ins or video players to directly integrate and preview VR content while you’re editing it, but Jens didn’t provide any specifics and said that it’s still an open problem. Finally, he sees that VR is going to change a lot of industries ranging from news and music to travel, and he’s looking forward to playing a small part with the work that Jaunt VR is doing.

Theme music: “Fatality” by Tigoolio

Danfung Dennis talks about the process of making the first commercially available, 360-degree, fully immersive VR cinematic experience called Zero Point, which was released today on Steam. I see Zero Point as a historic VR experience that will likely catalyze a lot of other ideas and experiences within the realm of cinematic VR, and so it’s definitely worth checking out.

Danfung is the CEO of Condition One, and he actually started looking at 360-degree video back in 2010 when creating some immersive experiences for the iPad and iPhone. He sees Zero Point as a camera test and experiment that shows the evolution of 360-video capture technology, starting from monoscopic 180-degree video and maturing to 360-degree stereoscopic video shot with RED cameras at 5k and 60 frames per second.

Danfung hopes that Zero Point will be a catalyst for other experiments with 360-degree cinematic experiences that help to develop and cultivate the new syntax, language and grammar for creating immersive VR experiences. Zero Point uses the constructs of a traditional documentary film, yet I was personally left craving something focused more upon putting the physical locations at the center of the story rather than the other way around. Danfung seemed to agree with that sentiment, saying that moving forward what will matter more is using the strengths of the VR medium to create a more visceral and emotionally engaging experience where the location, visuals and audio are all synchronized together.

In terms of the language of VR, Danfung had many insights, ranging from seeing that cuts between drastically different scenes are a lot more comfortable than moving around within the same scene. And while fading to black seems to be a safe way to navigate between scenes and minimize the disorientation of teleporting instantly to a new place, he also sees that it could be overused, and he’s interested in experimenting with longer shots and fewer cuts. He’d also like to iterate more quickly by creating shorter experiences that are still long enough to create a sense of presence, enabling a wider range of focused experiments to learn more about the language of cinematic VR, which is still in its very early days.

There’s also an interesting experiment in blending computer-generated experiences with live-action footage, and Danfung wants to explore this more. He believes that when the resolution gets high enough, 360-degree video has the potential to provide a sense of presence more quickly than a computer-generated environment.

Finally, Danfung sees a tremendous potential for VR to transcend the limitations of mass media, and to provide a more advanced language for sharing stories that take people into another world, another mind, and another conscious experience. There is a lot of potential to explore more abstract concepts and to use the VR medium for pro-social causes to help make the world a better place.

Zero Point is now available on Steam here.

TOPICS

  • 0:00 – Introduction
  • 0:35 – Process of innovating and creating 360-degree video. Wanted to create powerful immersive experiences, tried an early prototype of the DK1, and knew that VR was going to be his medium. Started with astronomy photography, using 180-degree fisheye lenses to capture the night sky with a DSLR. Saw limitations to 180-degree lenses and started to add more cameras; now has an array of RED cameras shooting at 5k and 60 fps.
  • 3:20 – Mix of different video styles and techniques. It’s an evolution of different capture systems and playback. Started with 180 degrees in mono and kept innovating the technology to the point of a full 360-degree sphere in stereoscopic 3D, so there is quite a mix throughout the VR experience. Get it out and get feedback; it’s a signpost of what’s to come. It’s a camera test and an experiment.
  • 5:00 – It’s easy to break presence. Seams and black spaces are immediately noticed, pulling viewers out of the experience and breaking the sense of presence. A large part of the process is trying to iron out the glitches. You have to have a very seamless image because people will notice the errors. The display resolution needs to be higher than the DK2’s to take full advantage of the footage.
  • 6:54 – Mixing the form of documentary with being transported to different locations within VR. What is VR film strongest for? We draw on the techniques we knew before, but those film techniques don’t really apply to VR. Zero Point takes a more traditional documentary approach, but moving forward the core experience is going to matter more. How emotionally visceral an experience is will be what makes it most compelling. The scenes and environments will be key. How to best convey emotion with the VR medium is an open question. It’s a new medium and a new visual language that’s being developed.
  • 9:00 – Zero Point as a “stub article” that will catalyze ideas about the strengths of the VR film medium. Designing around the location first. VR will be visuals-first, and then audio; design around that. This is what high-resolution and high-framerate VR looks like. Hopes that Zero Point is a catalyst for putting out experiments to see what works and what doesn’t work with the medium. What would happen if you put audio first?
  • 11:14 – Length of Zero Point as a VR experience, and optimizing around that. What would the optimal length be? That’s still to be determined; it has to be comfortable. The DK2 is a lot more comfortable than the DK1. You need enough time to really get that sense of presence, at least a couple of minutes, but 15 minutes does feel like a long time to be in VR watching a video. Hopes to iterate more quickly on shorter experiences.
  • 12:58 – The introductory sequence is a CGI experience, but most of it is 360-degree video. Interested in blending CG with film; they’re very different. The brain can be more quickly convinced of being in another place with video IF the resolution is high enough. CG is more acceptable at the resolution of the DK2, while video resolution is roughly at the level of a DK1 at the moment. Need another iteration of panels at a higher resolution.
  • 14:58 – Cuts, fading to black, and the language of VR for what works and what doesn’t. It’s a new syntax and new grammar that needs to be developed. Fading to black works, but isn’t satisfying. Long, unbroken, uncut shots are the starting point: not too slow, still dynamic, but without camera motion. It’s easier to cut between scenes that are very different than between similar scenes. Longer shots and fewer cuts are where he’d like to go, but it took years for the language of cinema to develop. Still in the research phase of what works in cinematic VR experiences.
  • 16:50 – Movement within VR. Had some shots with motion, but also some stationary shots. Trying to push movement and moving through space. Need instant acceleration to make it comfortable: in film you ease from stationary into movement, but in VR you need to go straight to speed (see the sketch after this list). Head bob is too much when moving and causes motion sickness. Movement needs to be stabilized, e.g. by a three-gimbal drone rig, and moving through space gives you more 3D information. But VR needs new camera rigs.
  • 19:00 – Experiments with audio. Different techniques used throughout the film. Put binaural earphones within your ear canals. Had a binaural mic in his ears recording the scene, but there’s no head tracking.
  • 20:32 – Different locations and different experiments with the scenes. The driving force is: what is an effective VR experience? Direct engagement? Having things happen around you? Being a character with an avatar and some interaction? There will be a wide range of different experiences. People, animals and complex environments are compelling. When someone directly addresses the camera, you start to feel like you’re there in real life, and if you react as if you’re there and respond to social cues, then you get that sense of presence.
  • 22:34 – The power of VR is that it can put you in another conscious experience, into another mind and another world. It’s a new form of communication and a new form of emotion. It can be applied to abstract situations like global warming, and potentially used for pro-social goals. We can go beyond the type of content that we’re used to, which is mainly controlled by mass media, and it could provide a more advanced language for sharing stories in a new way. VR has tremendous potential, and it’s still such early days.
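
Here’s a small sketch of the contrast discussed at 16:50 between film-style eased acceleration and the instant acceleration that tends to be more comfortable in VR. The speed profiles and parameters are hypothetical, chosen only to illustrate the point.

```typescript
// Two hypothetical camera-speed profiles (t = seconds since movement
// began, speeds in meters per second), illustrating the 16:50 point.

// Film-style easing: ramp smoothly up to full speed over rampSeconds.
function easedSpeed(t: number, rampSeconds = 0.5, maxSpeed = 3): number {
  const s = Math.min(Math.max(t / rampSeconds, 0), 1);
  return maxSpeed * s * s * (3 - 2 * s); // smoothstep ramp
}

// VR-comfortable profile: jump straight to full speed, since it's the
// sustained visual acceleration of the ramp that tends to trigger
// motion sickness.
function vrSpeed(t: number, maxSpeed = 3): number {
  return t > 0 ? maxSpeed : 0;
}
```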

Zero Point Official Trailer from Condition One on Vimeo.

Theme music: “Fatality” by Tigoolio

Ally Maque of PixelWhipt & ASMRrequests talks about her new VR video show called VirtuAlly.

Ally wants to help bring virtual reality to the mainstream by producing a show that covers the weekly highlights of VR news. She talks about coming from the world of producing Autonomous Sensory Meridian Response (ASMR) videos, where there is a heavy focus on producing binaural audio content, and she shares some of her VR insights from the process of producing first-person, immersive media like her sci-fi ASMR video called Departure.

She talks about some of her favorite VR experiences, including Senza Peso, RedofPaw’s Totoro and Spirited Away demos, Windlands, and the Alien Makeout Simulator. Finally, she sees that VR could eventually be used for almost anything and will change many aspects of a variety of different industries.

Here’s the first episode of VirtuAlly:

If you’re curious to learn more about ASMR, then this Vice article on the roleplay subculture of ASMR is a good primer. Or here’s a NSFW cartoon explaining ASMR that Ally calls “Hands down, best explanation of ASMR I’ve ever seen.”

Bilago is one of the developers of Riftmax Theater, a 4D, interactive social experience. What started as a way to watch YouTube videos in a theater environment has turned into what was voted Best Social VR Experience at the Proto Awards.

Riftmax hosts events like Gunter’s Virtually Incorrect talk show and has a regular karaoke night. The longest length of time that I’ve spent in virtual reality was 2 1/2 hours while being a guest on Virtually Incorrect. The fact that there were other real human beings there in that virtual space made it one of the most powerful VR experiences that I’ve had to date, and a testament to why Facebook acquired Oculus VR to do even more social experiences. I don’t think it’s an overstatement when Mark Zuckerberg says that “Oculus has the chance to create the most social platform ever, and change the way we work, play and communicate.” Riftmax Theater is one of the applications on the leading edge of proving this out. My body has a memory of having a conversation with these avatars as if we were sitting on a theater stage, because there was enough body language with the head tracking and Razer Hydra-enabled hand movements.

The developers of Riftmax have started a Kickstarter campaign with a modest $1 goal in order to raise as much money as they can to continue development. Development will continue regardless, but they’ve set up a number of stretch goals to add even more features and functionality.

This video of testimonials from VR personalities is a great summary of the types of experiences that Riftmax has been able to provide to the VR community:

Mark Schramm talks about his Darkfield space dogfighting VR experience, which includes cooperative missions that you can do with your friends.

For Mark, part of the biggest potential of VR is being able to go anywhere and do anything with your friends, and so he wants to design VR experiences that you wouldn’t be able to complete on your own. He talks a bit about the networking options for hosting your own game with your friends, and some of the design considerations that go into creating a more satisfying social experience in VR.

There are currently 4 days left on the Darkfield Kickstarter, so check it out and help make it happen.

Theme music: “Fatality” by Tigoolio

Oliver “Doc_Ok” Kreylos is one of the most prolific VR hackers and a very popular participant in the Oculus subreddit. He talks about his quest to roll his own Linux SDK for the Oculus Rift, since there hasn’t been an official one released yet. In that process he’s been building support for different input devices for his Virtual Reality User Interface (Vrui) Toolkit.

Since Oculus Connect, he’s had a number of epic follow-up posts on his blog (Intro, Part 1, Part 2, Part 3) about this quest toward a Linux SDK. He also discusses some of his biggest takeaways about the VR video rendering pipeline, as well as some possible limitations of the graphics API in the Windows operating system that came out of this process.

He also talks about some of his findings from hacking the Microsoft Kinect v2 depth sensor with the goal of using it as a VR telepresence tool. He talks a lot more about this in my first Voices of VR interview with Oliver back in May.

Oliver then shares some of his reviews of the VR hardware and input devices from SVVRCon, and some of his WILD SPECULATION™ around what type of sensor-fusion combinations may make the most sense within a VR input device.

Finally, he talks about moving from the hard sciences to the soft sciences in his VR research by moving to the UC Davis Technocultural Studies department in the process of creating a VR research lab as a part of the UC Davis ModLab.

Theme music: “Fatality” by Tigoolio

Josh Carpenter is a VR researcher at Mozilla looking at how to combine the best of the web with what the VR communications medium can offer. Mozilla led the effort to release a WebVR API in June, and it’s supported in Firefox as well as Chrome.
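
For context, here’s a minimal sketch of how a page discovered a headset with the experimental 2014-era WebVR API discussed in this episode. getVRDevices(), HMDVRDevice and PositionSensorVRDevice were the names in the early Firefox and Chrome builds, and the API changed substantially in later revisions, so treat this as a historical sketch rather than current documentation.

```typescript
// Sketch of headset discovery with the experimental 2014-era WebVR API.
declare class HMDVRDevice {} // provided by VR-enabled browser builds

const nav = navigator as any; // the experimental API is absent from DOM typings

nav.getVRDevices().then((devices: unknown[]) => {
  // A headset appeared as an HMDVRDevice (rendering parameters) paired
  // with a PositionSensorVRDevice (head-orientation tracking).
  const hmd = devices.find((d) => d instanceof HMDVRDevice);
  console.log(hmd ? 'VR headset found' : 'no VR headset detected');
});
```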

Mozilla is trying to answer what makes a great web experience in VR by doing a number of research experiments that they will be releasing sometime in late October or early November.

They’re trying to answer the question of “What’s the strength of VR on the web?” They want to move beyond just adding more screen real estate and look to the “Jobs-To-Be-Done theory” for insights about what people are actually trying to get from the web. People want to learn, socialize and be connected, and so Mozilla is thinking about what it means to connect to a friend or look up a piece of information in VR. Some of their experiments are more transient VR experiences via the web, some are more integrated social aspects, and they’re also looking at mash-ups of the web and VR.

Josh talks about building out collaborative browsing experiences with tools like WebRTC for real-time communication and TogetherJS for collaborative browsing and chat. They’ve also been working with the directors of ROME: “3 Dreams of Black” on a new VR experience.
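
As a rough illustration of the plumbing underneath collaborative-browsing tools like TogetherJS, here’s a minimal WebRTC data-channel sketch for sharing navigation events between two browsers. The ‘navigate’ message shape is made up for this example, and the offer/answer signaling needed to actually connect the peers is omitted.

```typescript
// Minimal co-browsing channel over WebRTC. Signaling (exchanging the
// offer/answer and ICE candidates via a server) is omitted for brevity,
// and the message format is purely illustrative, not TogetherJS's.
const peer = new RTCPeerConnection();
const channel = peer.createDataChannel('co-browsing');

// Tell the remote peer where we are once the channel opens.
channel.onopen = () => {
  channel.send(JSON.stringify({ type: 'navigate', url: location.href }));
};

// Follow the remote peer's navigation events.
channel.onmessage = (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'navigate') {
    console.log(`Peer is now at ${msg.url}`);
  }
};
```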

Josh is seeing a lot of interesting things happen in the JanusVR community: if you give people easy syntax and tools to create VR experiences with a low barrier to entry, then people will create a wide variety of VR experiences. He’s also interested in tracking the progress of mixing the web into social spaces with AltSpaceVR, and some of the video experiments by the eleVR team.

Finally, he talks about the prospects for WebVR support in Safari and Internet Explorer, and what he sees as some of the educational possibilities and the ultimate potential of creating the Metaverse from the foundational components of URLs, View Source and APIs.

Theme music: “Fatality” by Tigoolio