I talk with Ikrima Elhassan of Kite & Lightning about their new tabletop VR game called Bebylon Battle Royale, which is a “Vehicular Melee Party Brawler.” Kite & Lightning has made a name for itself creating high-end cinematic VR experiences such as Senza Peso as well as the Insurgent movie VR experience. Bebylon is K&L’s first foray into developing a comedy game, and so we talk about their game development philosophy as well as the challenges of creating innovative gameplay with support for both gamepads and touch controllers.

Listen

Ikrima says that VR game developers can choose two out of three of the following: innovative VR gameplay, support for gamepads, or support for touch controllers. By choosing to support both gamepads and touch controllers, they’re forced into lowest-common-denominator gameplay that both controllers can support. They can do innovative gameplay design with either the gamepad or the touch controller, but not with both. Because they’re planning on creating a launch title for the Rift, they’re choosing to support gamepads and not commit to touch support until they’re further into the development cycle.

Ikrima also talks about the choice to go with a miniaturized tabletop aesthetic in order to keep all of the action within the near-field sweet spot of VR, which maximizes the parallax effects and magic of VR. The competing player will be represented in a 1:1 depiction, enabling players to express their creativity through a number of taunts, humiliation animations, and overall boasting. Ikrima says that a smoke bomb released on the race track could also temporarily block the first-person perspective of what your avatar can see. The miniaturized VR action will also lend itself well to spectators watching the action from within VR.

The Kite & Lightning team has relocated to Paris in order to focus on prototyping and developing this game. They’re still early within the development process, and so they don’t have any gameplay footage or trailers to show just yet beyond a piece of concept art which shows how the immortal “beby” characters are trapped within the bodies of 2-year-old babies. Ikrima says that the surreal and humorous nature of these “beby” characters helps to defy your expectations and overcome the uncanny valley with their stylized cinematic-reality art direction. Ikrima describes the game as a combination of Mario Kart party mode and Super Smash Bros, and they are targeting their launch to be within a month or two after the consumer release of the Oculus Rift.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Hao Li is an assistant professor at USC who has been collaborating with Oculus Research on facial tracking while wearing a virtual reality head-mounted display. They presented their initial prototype and research paper “Facial Performance Sensing Head-Mounted Display” at SIGGRAPH 2015. Hao says this prototype proved that it’s possible to extrapolate occluded facial expressions with a combination of strain sensors and machine learning algorithms. They are now moving forward on the next iteration of prototypes, which should be more consumer-ready. I previously covered Hao’s research in my write-up on my interview with Martin Breidt. Hao says that eye gaze is really crucial to having a successful social interaction in VR, and so it’s very probable that Oculus is working on integrating eye tracking into future consumer headsets. Hao talks about some of the next steps in his facial tracking research, and he’s really optimistic about the metaverse given how his research is helping facilitate the future of telepresence and social VR applications.
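The core idea of the prototype — mapping strain-sensor readings from the HMD’s foam liner to facial expression parameters with a learned model — can be sketched as a simple least-squares regression. Everything here (sensor count, blendshape count, the synthetic training data) is a hypothetical stand-in; the actual paper combines strain sensing with an RGB-D camera and a more sophisticated training pipeline:

```python
import numpy as np

# Hypothetical training set: paired recordings of strain-gauge readings
# (8 sensors in the headset's foam liner) and facial blendshape weights
# captured beforehand without the HMD occluding the face.
rng = np.random.default_rng(0)
n_samples, n_sensors, n_blendshapes = 500, 8, 20
true_map = rng.normal(size=(n_sensors, n_blendshapes))
strain = rng.normal(size=(n_samples, n_sensors))
weights = strain @ true_map  # synthetic ground-truth expressions

# Learn a linear map from strain readings to blendshape weights via
# least squares -- a minimal stand-in for the paper's learned model.
learned_map, *_ = np.linalg.lstsq(strain, weights, rcond=None)

# At runtime, fresh strain readings predict the occluded expression,
# which can then drive a digital avatar's face.
new_strain = rng.normal(size=(1, n_sensors))
predicted_blendshapes = new_strain @ learned_map
```

The point of the sketch is that once such a mapping is trained, the upper face hidden behind the headset can still animate an avatar in real time from sensor data alone.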

Listen


Mary Spio is the president and founder of Next Galaxy, and she’s working on a number of different VR experiences including music videos, medical training, and enterprise applications. They’re also developing the Ceekars 4D Headphones in order to increase the immersion from audio. They’ve included haptic feedback on the ears and band of the headphones in order to do things like amplify the feeling of wind blowing or a passing train. Mary loves going to concerts, and so she’s channeling her passion for music into the virtual reality experiences that she’d like to have. Next Galaxy is working with some heavy metal bands, and Mary talks more about how she sees VR helping to increase the engagement and distribution of music for bands who adopt the technology.

Listen


There are a number of different 3D audio plug-ins available for game engines like Unity, but there have not been any solutions for mixing 3D audio for cinematic VR productions. Two Big Ears just released the 3DCeption Spatial Workstation, which is a digital audio workstation that integrates with professional audio production tools. The Spatial Workstation includes an Intelligent 360 Video Player to be able to preview audio edits in a VR environment, as well as the ability to mix ambisonics audio channels with additional audio inputs. Varun Nair is the VP of Products and co-founder of Two Big Ears, and I had a chance to talk about the 3DCeption Spatial Workstation, ambisonics, performance metrics, and how he sees virtual reality facilitating the combination of audio post-production techniques from traditional game design and film production. You can register your interest in the Spatial Workstation here.
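For readers unfamiliar with what an ambisonics mix actually contains, here is a minimal sketch of first-order ambisonic (B-format) encoding — the standard math for panning a mono source into the W/X/Y/Z channels that a tool like the Spatial Workstation works with. The function name and the −3 dB convention on the W channel are my own assumptions for illustration, not Two Big Ears’ API:

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into first-order B-format channels (W, X, Y, Z)
    for a source at the given azimuth/elevation, in degrees."""
    theta = np.radians(azimuth_deg)
    phi = np.radians(elevation_deg)
    w = mono * (1.0 / np.sqrt(2.0))          # omnidirectional channel (-3 dB convention)
    x = mono * np.cos(theta) * np.cos(phi)   # front-back figure-of-eight
    y = mono * np.sin(theta) * np.cos(phi)   # left-right figure-of-eight
    z = mono * np.sin(phi)                   # up-down figure-of-eight
    return np.stack([w, x, y, z])
```

Because the whole sound field lives in these four channels, a player can rotate the mix to follow the listener’s head tracking before binaural rendering, which is what makes ambisonics a good fit for 360 video.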

Listen


Aashna Mago talks about the first Women in VR event in San Francisco that she helped organize on September 20th, 2015. Over 200 people attended the event, which featured 13 women luminaries working in virtual reality, demos from Bay Area start-ups, as well as brainstorming sessions about how to get more women into VR. Aashna talks about what gets her excited about getting more women interested in VR and adding more diversity to the virtual reality field. Aashna has had an interesting path into virtual reality that’s included teaching herself programming, taking a number of strategic mentorship positions, and studying computer science at Stanford University. She aspires to become a virtual reality engineer, get more involved with education using VR, and continue to evangelize VR to new users.

Listen

Check out a video of the first Women in VR event produced by Upload VR.

For more coverage of the first Women in VR meetup, be sure to check out these two articles: “The movement to get more women in VR” by Sophia Dominguez of AllThingsVR and “Virtual Reality is about to get very real for women” by investor Anarghya Vardhana.


Gunter S. Thompson has been hosting talk shows in virtual reality since April of 2014, and has been participating in many of the early social VR experiences, including Minecrift, VRChat, and Riftmax Theater. He talks about some of the unique social experiences that are happening with user-generated content, including hover bus tours and various social games, as well as some of the regular events ranging from karaoke and meet-ups to talk shows like Gunter’s Universe. Gunter is the events manager for VRChat, and he talks about some of his visions for the types of events he’s looking forward to hosting, as well as reflecting on the connections and friendships that he’s been able to cultivate over the past year and a half through social VR. People from all over the world are able to connect through these social VR experiences, and Gunter says that they have been some of the most compelling experiences he’s had; he looks forward to exploring the metaverse with his virtual friends.

Listen

Here’s the episode of Gunter’s Universe where he interviews the creators of THE VOID

And here’s the episode of Virtually Incorrect that I participated in with Gunter back in July 2014


Geoff Skow is a co-founder of Fish Bowl VR, which has over 200 early-adopter VR enthusiasts available to do user testing for VR experiences. VR developers can get a monthly subscription for on-demand user testing with Let’s Play videos as well as quantitative and qualitative feedback on their experience. Fish Bowl VR provides feedback ranging from the framerate and performance across a spectrum of different hardware to ratings on the GUI and gameplay, as well as open-ended survey questions about what could be improved or added to the experience. One of the biggest open problems that Geoff sees VR developers face is how to teach users to play their game, and he talks about some different tutorial approaches that are embedded within the VR environment. Fish Bowl VR’s testers are paid to play and record their playtest sessions, and the company is always looking for more users to get paid to play VR experiences and offer their feedback. Getting objectively detailed feedback from people experienced and familiar with VR is certainly filling a market need, and Geoff says that VR developers can use the service to track their development progress over time.

Listen


The Oculus Touch Toybox demo was shown to its largest audience yet in a single day at Oculus Connect 2 on September 23rd. This was the first time that a lot of developers were able to get their hands on the Half Moon prototype VR input controllers. But more importantly, it was a watershed moment for so many developers to be able to experience social and emotional presence with another person within virtual reality. It became less about the technology and tech specs, and more about the experience of playing, having fun, and connecting with another human in ways that were never possible before. The Toybox demo felt like a real turning point and “Aha!” moment for a lot of VR developers to see how compelling social experiences in VR are going to be. I had a chance to capture some candid reactions from Ken Nichols and Ela Darling moments after they experienced Toybox for the first time.

Listen


For the past three years, Will Smith and Norman Chan have been playing the latest virtual reality demos and tracking the evolution of VR with their Tested.com reviews. Will was able to see Oculus VR’s Toybox demo at E3 this year, and the amount of presence he was able to feel with the Oculus Touch controllers combined with dynamic social interactions convinced him to quit Tested to start his own VR company. Will isn’t talking about the specifics of his new venture just yet, but he alludes that it has something to do with telepresence communication as inspired by the interactions from the Toybox demo.

Listen

Will talks about some of the big turning points in the story of the consumer VR revolution, starting with first seeing the Doom 3 demo at PAX shortly after the Kickstarter was successfully funded and after the initial buzz from Carmack showing it off at E3 2012. He says that the DK1 release was a huge turning point for developers, and that the amount of embodied presence with hand-tracked controllers in the Vive demo at GDC this year was another.

Will says that the significance of the Vive demo was that it opened up more verbs that are available to game developers and VR experience designers. While previous actions had to be abstracted into a keystroke, button, or joystick movement, hand-tracked controllers allow us to move in a more intuitive and natural way, creating a sense of immersion that fools many more parts of our brain into believing in the reality of these virtual worlds.

Will talks about noticing how he backed up to avoid hitting his head on a virtual table that didn’t exist in real life, because his primitive brain still believed in the reality of it. The addition of real-time interactions with another person in the Toybox demo only increased that feeling of immersion and helped convince him that virtual reality is a technology ready to be a game changer today, rather than one that might have an impact 5–10 years from now.

In terms of experiences that Will is looking forward to having, he’s really interested in doing all of the things that are either impossible, too dangerous, or too scary to do in real life. He’s also really interested in telepresence applications of VR, and sees that VR is going to fundamentally change the way that we communicate with each other. He’s looking forward to being able to spend time with his wife while he’s away traveling, and that this is one of the things that he’s working towards in his new venture.

You can continue to listen to Will on the Tested Podcast, and if you’d like to keep in touch with his next venture, then you can sign up to his newsletter here.


Timoni West is a principal designer at Unity Labs, and she’s working on creating professional tools for building VR scenes in Unity from within VR. These tools are still in the prototype phase of development, but it’s something that Unity is actively working on implementing. There will be an API for developers to extend the VR creation process within Unity, as well as a new Director Sequencer tool that could be used for cinematic VR, which is on Unity’s public roadmap for the Unity 5.4 release. Timoni and I talk about these new VR features as well as design inspiration from the VR creation tools Tiltbrush and Oculus Medium.

Listen

The Unity Labs team is focused on future technologies, and they’re currently spending a lot of their effort on creating some of the first pro tools for virtual reality. Because Unity’s intention is to democratize game design and make games easier to create, they don’t have to worry about creating a sense of presence within the VR scene editor tool itself. Their goal is to create the tools so that game developers can make something that feels real in VR.

Because developers will potentially be using these VR creation tools within Unity as a direct part of how they earn a living, the tools need to be customizable. They’re planning on having a flexible UI with smart defaults that will allow you to really customize your workspace environment. They still want it to feel like you’re using Unity, and so many familiar design elements and features should carry over.

Their plan is to create integrations with 6-degree-of-freedom controllers as well as support for the standard keyboard shortcuts. They also heard at different VR conferences that a lot of people were working on VR creation tools, which helped them decide to create a robust API allowing plug-in developers to build their own variations of VR creation tools within Unity.

Timoni did say that the actual modeling of 3D objects is beyond the scope of what they’re currently working on, and that tools like Tiltbrush, Medium, and others will likely tackle that problem.

One of the things that Timoni really likes about the Tiltbrush interface is that all of the options are always visible to you. She found the Oculus Medium approach of hiding and changing the controls depending upon which tool was selected to be a bit more confusing. She’d like to reveal as many of the options as possible in the VR creation tools for Unity, as well as represent 3D objects as they would appear within the scene rather than depending upon file names.

There will also likely be a number of tasks that will still be more optimal to do within the 2D interface, and other aspects that will be easier to do within a VR creation environment in Unity. She talks about a chessboard interface that would put a miniaturized model of the scene in front of you alongside the full-scale environment, so that you could have a large range of fidelity for altering the scale of objects within a scene.

Another upcoming feature that Timoni mentioned could have a huge impact on cinematic VR is a Director Sequencer that is currently scheduled for the 5.4 release on March 16, 2016. This cinematic sequencer tool will allow the authoring and playback of sequences of animation and audio clips. She said that she’d love to see this Director tool also get direct VR integration so that you could start to create cinematic VR sequences directly within Unity.

They’re still early in the prototyping and development phase for a lot of these VR creation tool features, and at this point they haven’t even been announced on Unity’s public roadmap, so they’re likely to appear in the 5.4 release or beyond in 2016. If you have ideas or feedback for what you’d like to see in a VR creation tool within Unity, feel free to reach out to Timoni West via her website.
