Rod Haxton is the lead software developer for VisiSonics, which created the RealSpace™ 3D Audio technology that Oculus has licensed to put 3D audio into VR.

My experience is that having 3D audio in a VR experience is a huge component for creating a sense of immersion, especially when you’re able to go beyond panning the audio between the left and right channels as you turn your head. RealSpace™ 3D Audio simulates elevation as well as whether a sound is in front of or behind you. They process audio in a way that’s analogous to doing ray-tracing for your ears, taking true material audio reflections into account and doing calculations based upon Sabine’s reverberation equation.
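
For reference, Sabine’s reverberation equation estimates a room’s reverberation time (RT60) from its volume and the total absorption of its surfaces. Here’s a minimal Python sketch of that formula; the room dimensions and absorption coefficients below are illustrative assumptions, not values from VisiSonics:

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine's equation: RT60 ≈ 0.161 * V / A, where V is room volume (m^3)
    and A is the total absorption, i.e. the sum of each surface's area (m^2)
    times its absorption coefficient."""
    total_absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / total_absorption

# Hypothetical 5 m x 4 m x 3 m room
print(sabine_rt60(60.0, [(54.0, 0.02),    # painted concrete walls
                         (20.0, 0.30),    # carpeted floor
                         (20.0, 0.10)]))  # plaster ceiling
```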

Our ears filter sound in a way that helps us locate sounds in space. Everyone’s ears are different, and VisiSonics can create a specific profile for your ears in what’s called an HRTF, or head-related transfer function.

They have a database of HRTFs, and they use a default profile that works pretty well for 85% of the population. Rod talks about how VisiSonics has patented a fast-capture process for a personalized HRTF where they put speakers in your ears and use an array of microphones in a room. He envisions a future where you’d go into a studio to capture the HRTF data for your ears so that you could have a more realistic 3D audio experience in VR.
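
To illustrate what an HRTF is used for at render time: once you have the pair of head-related impulse responses (HRIRs) for a given direction, spatializing a mono sound is essentially a per-ear convolution. This is a generic sketch of that idea in Python, not VisiSonics’ actual pipeline, and the function and variable names are hypothetical:

```python
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with the left/right head-related impulse
    responses for one source direction to produce a binaural stereo signal."""
    left = fftconvolve(mono, hrir_left, mode="full")
    right = fftconvolve(mono, hrir_right, mode="full")
    return np.stack([left, right], axis=-1)  # shape: (samples, 2)
```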

Rod also talks about:

  • Special considerations for spatializing audio & a special tool that they’ve developed to evaluate how well a sound will be spatialized.
  • Oculus’ SDK integration of RealSpace™ 3D Audio technology
  • Unity integration & VisiSonics direct integration
  • Options available for 3D audio that are provided by their SDK
  • Maximum number of objects that you could spatialize & what’s a reasonable number
  • Future features planned for the RealSpace™ 3D Audio SDK
  • Unreal Engine support coming soon
  • Originally funded by the DoD to help develop a way for nearly-blinded soldiers to do wayfinding
  • How Tesla is using their panoramic audio cameras to improve the sound profiles of cars
  • How Rod helped get RealSpace 3D audio into a game engine & how they connected with Oculus at GDC 2014
  • How they’ve developed a panoramic audio camera to be able to visualize how sound propagates
  • Good examples of 3D audio integration can be found in Technolust & demos from Unello Design & Bully! Entertainment
  • How poorly-implemented HRTFs have given the technology a bad name over time

This week, VisiSonics announced that Unity 5 integration is now available in their latest v0.9.10 release.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Tobias Baumann is the director of game development for Perception Neuron, which is an IMU-based motion capture system. It might eventually be used as a VR input device, but Tobias acknowledges that the current price for the full system is a bit steep for it to be used as anything other than a motion capture system at first.

The 32-sensor full-body suit with finger-tracking runs at $1,499, and the 18-sensor full-body version without fingers has an academic price of $799. The single-hand version is more affordable at $100, and a two-handed option is available for $200.

The Perception Neuron Kickstarter more than doubled its funding goal of $250k, and the team is getting ready to ship out backer rewards within the next couple of weeks.

In the interview Tobias talks about:

  • The mechanics of how their wireless IMU-tracking system works and how it’s mapped to an inverse kinematic skeletal model (see the sketch after this list)
  • A bit of the history of Perception Neuron & how he first got involved
  • Some of the preliminary prototypes and design issues in grabbing virtual objects
  • Space requirements for roaming freely and cautions about Gear VR locomotion issues
  • The production pipeline for using Perception Neuron for motion capture and the Unity plug-in & Perception Access data capture software
  • Rigging considerations for getting Perception Neuron to work with motion retargeting
  • Hopes that it might eventually become cheap enough to be used as a viable VR input device
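
For a flavor of how IMU-based tracking like this works under the hood (see the first bullet above), each sensor node fuses gyroscope and accelerometer readings into an orientation estimate, and those per-segment orientations then drive the skeletal model. Below is a minimal single-axis complementary-filter sketch in Python; Perception Neuron’s actual sensor fusion and skeleton solver aren’t public, so everything here is an illustrative assumption:

```python
def complementary_filter(prev_angle_deg, gyro_rate_dps, accel_angle_deg, dt, alpha=0.98):
    """Fuse a gyroscope rate (deg/s) with an accelerometer-derived tilt angle (deg):
    the gyro term tracks fast motion, the accel term corrects long-term drift."""
    return alpha * (prev_angle_deg + gyro_rate_dps * dt) + (1.0 - alpha) * accel_angle_deg

# Example: integrate a short stream of (gyro_rate, accel_angle) samples at 100 Hz
angle = 0.0
for gyro_rate, accel_angle in [(10.0, 0.2), (9.5, 0.4), (9.0, 0.5)]:
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
print(angle)
```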

Finally, Tobias talks a bit about how he first got into VR and how he got involved with the Perception Neuron project.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Anush Elangovan is the Founder and CEO of Nod, which produces the Nod Gesture-Control Ring. The Nod ring detects motion, gestures, and swipes, provides tactile feedback, and enables user interactions ranging from tap and double tap to swipe up and swipe down. Anush says that he started Nod because he saw that touchless interactions and gesture controls would bring about a revolutionary new computing paradigm.

Nod was showing a demo at GDC that used the Nod ring as an input controller in virtual reality. It offered a simple and intuitive way to interact within a VR environment, but it also felt like the latency was pretty high. Anush says that the extra latency was likely due to also broadcasting their demo to a screen so that others could see the VR action at GDC, and that their benchmarked latency is around 7.5ms per packet via Bluetooth.

The current price of a Nod ring is around $149, which seems a bit high. But it’s definitely worth keeping an eye on in the future as the VR input controller ecosystem evolves, especially for the mobile VR experience. Nod is integrating with OSVR, and so it could have a pretty straightforward integration with VR experiences through the OSVR SDK.

At GDC Nod also announced the Nod Backspin controller, which combines motion, gesture, touch, analog and haptic feedback. They had an early prototype on display at GDC, but didn’t have any demos with it yet.

Finally, Anush sees that reality and VR will blend into a mixed augmented reality where it may be difficult for our perception to discern what’s real and what’s augmented. In the end, he’d like to see Nod play a part in helping to capture human intention through various gestures and pointing behaviors.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Joel Green is a producer and audio director at Cloudhead Games working on The Gallery: Six Elements adventure game. He talks about the process of being invited to a secret meeting at Valve to experience the HTC Vive for the first time, and then going home with a development kit to create room-scale VR experiences.

Cloudhead Games has been working on consumer VR for a long, long time, and they’ve tried out a number of different hardware solutions. They see that the HTC Vive system has really solved the input and tracking problems for the first time in a viable way. Up to this point, it’s been a lot of incremental progress and half-steps, but Joel says that Valve has really nailed this solution and that it will take VR to another level of immersion and presence.

Joel talks about the process of collaborating with the other development shops who were invited by Valve to develop a demo for GDC. They were debugging the hardware and software together because Valve was still developing the underlying SDK while the studios were already building experiences. There was a private forum that the developers used in order to share their lessons learned for developing a room-scale VR experience.

Joel also says that Cloudhead Games still intends to support as many different VR platforms as they can, since they know that not everyone is going to be able to have the full room-scale VR experience. He talks a bit about the Lighthouse tracking solution and what’s involved with setting it up. And finally, he says that room-scale VR and this new tracking solution from Valve have the potential to bring a lot more physical interactions into VR, and that he’s excited to continue to experiment and develop more experiences that explore the realm of what’s possible.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Patrick O’Luanaigh founded nDreams, based out of Farnborough, UK, in 2006, originally creating content for PlayStation Home. After seeing some of the early consumer virtual reality head-mounted displays, he quickly pivoted to focusing exclusively on creating immersive virtual reality content.

One of the first games that nDreams announced is called The Assembly, which is a sci-fi adventure game that presents the user with a number of different moral dilemmas. They’re not ready to talk too much about it yet, but Patrick did give some of the background story, intention, and motivation behind exploring the medium of virtual reality through the process of making this adventure game. They’re planning on having it as a release title for both Project Morpheus on the PS4 and the Oculus Rift.

Patrick talks about the wide range of different initiatives ranging from developing for the Gear VR to helping AMD make some announcements about Liquid VR at GDC.

Patrick also talks about providing the seed funding for VRFocus.com in order to build excitement for virtual reality by posting very frequently about all of the latest developments and announcements in the VR space. I had the chance to interview VRFocus editor-in-chief Kevin Joyce at the Immersion 2014 gathering in Los Angeles. VRFocus has established itself as the AP of the VR world posting 6-7 updates per day on all of the latest incremental developments, and Patrick talks about the editorial freedom that he provides to Kevin and the rest of the VRFocus team.

Patrick is certainly a key figure in the consumer VR space, and so I was definitely glad that I had a chance to catch up with him at GDC to hear more about his background and where he thinks the overall virtual reality space is headed. nDreams is definitely focusing on the gaming market at first, but he sees that there are a lot of really exciting opportunities for education in VR.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Danny Woodall is the Creative Director at Sixense, and he talks about how the electromagnetic STEM controllers can help to bring full-body presence into VR. He talks about the SDK improvements that they’ve been making, and some of the updates that were made to the lightsaber demo including updating it to UE4.

I had the opportunity to first try their Lightsaber demo back at Oculus Connect in September, and I had another opportunity to try it again at the VR Mixer at GDC. I often mention this demo as being one of the most immersive experiences that I’ve had in VR because it makes all the difference in the world to be able to track your body within a VR space.

Mel Slater’s research into the virtual body ownership illusion has shown that the minimal thing you need to be convinced that your virtual body is your body is a 1:1 correlation between your body movements in real life and what you see in VR. With the ability to have your hands tracked relative to your head, Sixense has been able to create an IK skeletal model that really feels great.
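
As a toy illustration of the kind of inverse-kinematics problem involved in placing a tracked hand on a virtual arm, here’s a minimal 2D two-bone IK solver in Python. It’s a generic sketch of the technique under simplifying assumptions, not Sixense’s actual skeletal model, and all of the names are hypothetical:

```python
import numpy as np

def two_bone_ik(shoulder, target, upper_len, fore_len):
    """Return (shoulder_angle, elbow_angle) in radians for a planar arm so
    that the hand reaches `target` (both points are 2D numpy arrays)."""
    delta = target - shoulder
    dist = np.clip(np.linalg.norm(delta), abs(upper_len - fore_len), upper_len + fore_len)
    # Law of cosines for the elbow's interior angle
    cos_elbow = (upper_len**2 + fore_len**2 - dist**2) / (2 * upper_len * fore_len)
    elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    # Shoulder angle: direction to the target, offset by the triangle's corner angle
    cos_corner = (upper_len**2 + dist**2 - fore_len**2) / (2 * upper_len * dist)
    shoulder_angle = np.arctan2(delta[1], delta[0]) - np.arccos(np.clip(cos_corner, -1.0, 1.0))
    return shoulder_angle, elbow

# Example: reach for a point 0.5 m away with a 0.3 m upper arm and 0.3 m forearm
print(two_bone_ik(np.array([0.0, 0.0]), np.array([0.4, 0.3]), 0.3, 0.3))
```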

Sixense also had their STEM controllers in the Tactical Haptics demo, and it made all the difference in the world to be able to track a physical representation of the weapon that you’re holding in real life with the weapon that’s being tracked in VR.

After the recording of this interview, Sixense announced to their Kickstarter backers that they failed the FCC testing. They said:

The reason the Base is failing is specifically due to our design of housing the five radio frequency (RF) dongles inside the Base. The RF dongles require grounding, but this grounding interferes with the electromagnetics (EM) of the tracking.

To address this issue we redesigned the Base electronics to keep the RF dongles located internally but not conflicting with the EM. This will require the production of new Base PCBs and further testing to ensure everything is working properly.

This will push delivery of the STEM back to at least July, and if they fail again, then they’re looking at September to start shipping their units.

The STEM controller may have the advantage of working without exact line-of-sight, but there are other potential issues of EM drift or interference from other electronics. For more of an in-depth discussion about some of the potential issues, I highly recommend listening to this interview that I did with Oliver “Doc_Ok” Kreylos at Oculus Connect.

And for more background on Sixense, here’s an interview I did with founder and CEO Amir Rubin back at SVVRCon.

The STEM controllers are something to keep an eye on, especially considering that Danny mentioned in this podcast that they’re adding Gear VR support. If the price point can come down, then it’ll be a valuable contribution to solving the VR input problem, because having your hands in the game with a physical controller and physical buttons opens up all sorts of applications that can create an incredible sense of immersion within virtual reality environments.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Henrik Johansson of Tobii Technology talks about some of their eye tracking technology that’s started to be integrated into different video games like Assassin’s Creed in order to provide an “Infinity Screen” user interface for 2D video games.

Eye tracking in VR has a lot of applications, ranging from foveated rendering to dynamically adjusting the focus of the object that you’re currently looking at. At GDC, Henrik says that Tobii was talking about some of their collaborations with virtual reality head-mounted display manufacturers for the first time. He wasn’t specific in terms of who or what exactly they’ve been working on, but said that sometime by the end of June they would be announcing more collaborations and work within both AR and VR. They do already have the Tobii Glasses 2, an AR-style eye tracking product, ready for the consumer market.

Henrik talks about some of the existing implementations of eye tracking with video games, and one of the exciting new game mechanics that becomes more possible with eye tracking is to give a more realistic and human behavior to interactions with NPCs. You can either trigger responses or change behavior of NPCs based upon whether or not you’re looking at them. He also shares some of the other applications and implementations of their eye tracking solutions in different software programs, and a bit more information about their SDK.
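
One common way such a gaze-triggered NPC mechanic gets implemented is a simple cone test between the eye-tracked gaze ray and the direction to the character. This is a generic sketch of that check in Python; it’s not Tobii’s SDK, and the function and parameter names are hypothetical:

```python
import numpy as np

def is_gazed_at(gaze_origin, gaze_dir, npc_pos, max_angle_deg=5.0):
    """True if the NPC lies within a small cone around the gaze ray."""
    to_npc = npc_pos - gaze_origin
    to_npc = to_npc / np.linalg.norm(to_npc)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    return float(np.dot(gaze_dir, to_npc)) >= np.cos(np.radians(max_angle_deg))

# Example: an NPC 3 m ahead and slightly to the right of where the player is looking
print(is_gazed_at(np.zeros(3), np.array([0.0, 0.0, -1.0]), np.array([0.2, 0.0, -3.0])))
```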

Eye tracking in VR is going to be able to add a lot of engagement within social situations and collaboration within VR, and so be sure to keep an eye on Tobii near the end of the second quarter of this year for more information on how they plan on using their technology within the VR and AR space.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Alex Chu is an Interaction Designer at Samsung Research America, and he talks about the process and insights that came from designing the Milk VR 360-degree video player for the Samsung Gear VR that’s powered by Oculus.

He talks about the importance of space when designing for virtual reality experiences, which he sees as a blend between real space and digital space. The Gear VR prototype team had been designing for mobile apps, and it took them some time to realize the importance of putting architectural space around the content to help define the surrounding environment. For example, here’s a comparison that shows the difference between designing a gallery for 2D space and designing a gallery as a 3D space in VR:
[Image: Samsung gallery design, 2D vs. 3D]

At the Samsung Developer Conference, Alex gave a talk titled “VR Design: Transitioning from a 2D to a 3D Design Paradigm.” He gave an abbreviated version of this talk at the Immersive Technology Alliance meeting at GDC, but you can see the full presentation slides here.

Alex showed a number of different ergonomic graphics showing how users engage with content and giving metrics for comfortable viewing ranges. They also researched field of view in order to help determine how to size their elements, as well as how to use peripheral vision to help guide the user’s eyes. Finally, they also looked at depth cues and the range of visual disparity at different distances to see how the strength of stereoscopic cues varies depending on the depth. Here are a number of slides with more info:
[Slides: layout, 3D feeling, depth perception, spheres, sphere depths]
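
The falloff of stereoscopic cues that Alex describes follows from the standard small-angle approximation for binocular disparity, which shrinks roughly with the square of the viewing distance. Here’s a quick Python sketch; the 63 mm interpupillary distance and the example distances are assumptions for illustration, not figures from the talk:

```python
import math

def disparity_arcmin(fixation_m, object_m, ipd_m=0.063):
    """Approximate angular disparity (arcminutes) of an object relative to the
    fixated distance: disparity ≈ IPD * (1/d_fixation - 1/d_object)."""
    disparity_rad = ipd_m * (1.0 / fixation_m - 1.0 / object_m)
    return math.degrees(disparity_rad) * 60.0

for d in (0.5, 2.0, 5.0, 10.0, 20.0):
    print(f"object at {d:4.1f} m vs fixation at 1 m: {disparity_arcmin(1.0, d):7.1f} arcmin")
```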

He also talks about the user interaction options for a Gear VR, and why they decided that a physical tap interaction paradigm felt more reactive than just a gaze-and-wait type of interaction.

Some of the advice that Alex gives about user testing is to make your test meaningful by focusing on something very specific, and to eliminate all of the other variables that you’re not trying to test.

One of the biggest open problems with designing for VR is that there’s a huge range of how people react to VR experiences, and it’ll take some time for everyone to learn more about these differences and share data about how to best design for everyone.

Finally, Alex sees that the sky is the limit for designing for VR right now, and that he’s really looking forward to seeing how using architectural representations in VR will change the process of designing physical spaces.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

You can watch Alex’s full presentation from the Samsung Developer conference here:

Here’s a YouTube playlist of all of the VR-related presentations from the Samsung Developer Conference.

Julie Heyde & Mariam Zakarian talk about the process of bringing the stories of Norse mythology to life with their RagnarökVR experience. Julie talks about going from hacking in Unity3d by herself to doing a series of game jams to eventually recruiting a total of a dozen people from the indie gaming scene.

Julie talks about the inspiration coming from facing one of her childhood fears of wolves, but also from exploring some of the Norse myths that she grew up with. They talk about the process of crafting an exploration-focused horror experience, and how they’re going beyond using jump scares.

A big motivation for Julie is to create the type of virtual reality experiences that she wants to lose herself in. Their team has been taking a very iterative, game-jam approach of rapidly prototyping ideas and then getting a lot of user feedback by showing it frequently at various public gatherings.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Neville Spiteri is the founder and CEO of WEVR (formerly Wemo Labs). The company was originally started to create immersive experiences before the Oculus Rift Kickstarter. They had created an immersive underwater experience called The Blue, and then they ported it to virtual reality in 2013. They also started developing a mobile version in 2014 after getting a sneak peek of a Gear VR prototype. On March 1st, they rebranded to WEVR to show the world that they’re now 100% focused on virtual reality, and they announced their partnership with Valve in creating the theBlu VR: Encounter experience for the HTC Vive.

Neville says that experiencing The Room demo in 2014 was one of the most profound experiences of his life, and that it was really moving to experience the sense of true presence in VR for the first time. After meeting with Valve to experience the Vive for the first time, they then spent six weeks building theBlu VR: Encounter, a room-scale VR experience that was shown as a part of Valve’s GDC demo reel. They wanted to create an introductory experience for room-scale VR where you could have a soft landing and not be bothered by figuring out the controllers, but rather focus on exploring a space by moving around.

There’s a quantum leap from mobile rotational systems to desktop positional systems, and then another quantum leap to designing a VR experience for room-scale. He says that it engages more of your physicality, which allows more of your body to react to the VR environment. They also learned that if there’s a large object coming towards a visitor and they are having a sense of presence, then they’ll move out of the way as if it were real. They also included various interactive elements, including having schools of fish react to you and being able to create currents with your hand movements, to help increase the sense of presence. He also talks about the importance of establishing the sense of place, and of giving you a human-scale reference so you have an idea of how big a space you’re in when creating a room-scale VR experience.

WEVR has been creating experiences for the three tiers of mobile, desktop, and room-scale VR in order to discover the strengths and constraints of each tier. They suggest starting with the audience that you’re trying to reach with your VR experience, and then deciding which VR platform is going to best express and reach that audience.


They created The Blue experience for all three VR platform tiers, and Neville says that a lot of the assets and interactions need to be specifically designed for each platform.

WEVR has also been exploring the medium of cinematic & 360-degree VR experiences. Overall, they’re focusing on non-game, storytelling creative experiences.

They’re also creating a VR playback system, currently in private beta, in order to help experiences play on all of the VR platforms. They’re hoping that it’ll help solve the problem of distributing your VR experience as well as help build an audience for your work.

WEVR is more of a studio/publisher than a work-for-hire development shop, and they’re focusing on the underlying distribution platform. But since it’s still early days in the VR world, they’re also investing a lot in creating different VR experiences in order to better understand all of the different facets of this new medium. He also alluded to the fact that they’re working on a number of different launch titles for the HTC Vive, the Oculus Rift on desktop, and the Gear VR.

WEVR has also been doing some experiments in interactive storytelling and cinematic VR. They’ve been collaborating with different filmmakers including Roger Avary, an Academy Award winner who co-wrote Pulp Fiction. They’ve also been spending just as much time with game designers in order to get their insights into storytelling in an open world. WEVR also announced at SXSW that they’ve created a million-dollar virtual reality grant program for immersive storytellers.

Finally, he talks about some of WEVR’s other VR experiences, including the Space Shuttle Endeavour and Above & Beyond at Madison Square Garden. He sees that we’re moving beyond being able to share moments to being able to share full experiences with each other. Neville sees that VR has the potential to make communicating and learning more efficient and more entertaining, and he’s excited to be a part of the mainstream resurgence of virtual reality.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.