Neville Spiteri is the founder and CEO of WEVR (formerly Wemo Labs). The company was originally started to create immersive experiences before the Oculus Rift Kickstarter. They had created an immersive underwater experience called The Blue, and then they ported it to virtual reality in 2013. They also started developing a mobile version in 2014 after getting a sneak peek of a Gear VR prototype. On March 1st, they rebranded to WEVR to show the world that they’re now 100% focused on virtual reality, and they announced their partnership with Valve in creating theBluVR: Encounter, a VR experience for the HTC Vive.

Neville says that experiencing Valve’s Room demo in 2014 was one of the most profound experiences of his life, and that it was really moving to feel a sense of true presence in VR for the first time. After meeting with Valve to experience the Vive for the first time, they spent six weeks building theBluVR: Encounter, a room-scale VR experience that was shown as part of Valve’s GDC demo reel. They wanted to create an introductory experience for room-scale VR where you could have a soft landing and not be bothered with figuring out the controllers, but rather focus on exploring a space by moving around.

There’s a quantum leap from mobile rotational systems to desktop positional systems, and then another quantum leap to designing a VR experience for room-scale. He says that room-scale engages more of your physicality, which allows more of your body to react to the VR environment. They also learned that if a large object comes toward visitors while they’re feeling a sense of presence, they’ll move out of the way as if it were real. They included various interactive elements to help increase the sense of presence, such as schools of fish that react to you and the ability to create currents with your hand movements. He also talks about the importance of establishing a sense of place, and of giving you a human-scale reference so you have an idea of how big a space you’re in when creating a room-scale VR experience.

WEVR has been creating experiences for the three tiers of mobile, desktop, and room-scale VR in order to discover the strengths and constraints of each tier. They suggest starting with the audience that you’re trying to reach with your VR experience, and then deciding which VR platform will best express your work and reach that audience.


They created The Blue experience for all three VR platform tiers, and Neville says that a lot of the assets and interactions need to be specifically designed for each platform.

WEVR has also been exploring the medium of cinematic & 360-degree VR experiences. Overall, they’re focusing on non-game, storytelling creative experiences.

They’ve also created a VR playback system, currently in private beta, to help experiences play on all of the VR platforms. They’re hoping that it’ll help solve the problem of distributing your VR experience as well as help build an audience for your work.

WEVR is more of a studio/publisher than a work-for-hire development shop, and they’re focusing on the underlying distribution platform. But since it’s still early days in the VR world, they’re also investing a lot in creating different VR experiences in order to better understand all of the different facets of this new medium. He also alluded to the fact that they’re working on a number of different launch titles for the HTC Vive, the Oculus Rift, and Gear VR.

WEVR has also been doing some experiments in interactive storytelling and cinematic VR. They’ve been collaborating with different filmmakers, including Roger Avary, an Academy Award winner who co-wrote Pulp Fiction. They’ve also been spending just as much time with game designers in order to get their insights into storytelling in an open world. WEVR also announced at SXSW that they’ve created a million-dollar virtual reality grant program for immersive storytellers.

Finally, he talks about some of WEVR’s other VR experiences, including the Space Shuttle Endeavour and Above & Beyond at Madison Square Garden. He sees us moving beyond being able to share moments to being able to share full experiences with each other. Neville believes VR has the potential to make communication and learning more efficient, and entertainment more engaging, and he’s excited to be a part of the mainstream resurgence of virtual reality.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Yuval Boger is the CEO of Sensics and a founding partner of the Open Source Virtual Reality (OSVR) project in collaboration with Razer.

The vision for OSVR is to create a standardized middleware software layer that helps VR developers integrate a wider range of VR peripheral input devices, as well as a hackable platform that VR hardware developers can use to add their specific hardware customizations or implement specialized VR algorithms. Having an open-source VR HMD is a great vision that has gathered a lot of interest from a variety of industrial VR manufacturers as well as VR peripheral manufacturers. As the VR ecosystem grows, I think there will be an increasing need for something like OSVR.

Boger talks about some of the features of OSVR including:

  • Implications of having open hardware
  • The difference between OSVR’s abstraction layers for sensors and rendering
  • The negligible latency tradeoffs of using OSVR, and its benefits
  • Whether OSVR standards will limit or encourage innovation
  • Support for over a dozen different HMD manufacturers and many different input controllers
  • Android mobile integration, and using OSVR for sensor integration on Gear VR with the Oculus SDK for the rendering layer
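
The middleware idea Boger describes can be sketched as a thin device-abstraction layer: applications read from semantic paths rather than calling vendor SDKs directly, and each hardware plugin registers itself behind a path. This is only an illustrative Python sketch of the concept; the names and the pose format are hypothetical, not OSVR’s actual C/C++ API.

```python
# Illustrative sketch of a VR middleware abstraction layer: the application
# reads a semantic path like "/me/head" without knowing which vendor's
# tracker is actually plugged in. All names here are hypothetical.

class DeviceRegistry:
    """Maps semantic paths to whatever hardware plugin currently backs them."""
    def __init__(self):
        self._sources = {}

    def register(self, path, read_fn):
        # A vendor's plugin registers a callable that returns the latest pose.
        self._sources[path] = read_fn

    def read(self, path):
        # The application stays hardware-agnostic: it only knows the path.
        return self._sources[path]()

registry = DeviceRegistry()
# A hypothetical HMD plugin registers its head-tracking source:
registry.register("/me/head", lambda: {"yaw": 0.1, "pitch": 0.0, "roll": 0.0})

pose = registry.read("/me/head")
# pose == {"yaw": 0.1, "pitch": 0.0, "roll": 0.0}
```

Swapping in a different HMD would then only mean registering a different callable behind the same path, which is the portability the OSVR abstraction layers aim for.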

It feels like OSVR is at about the point where Linux was a year or two after it was first released, compared to the other VR HMDs on the market. But in the long run, an open-source model is something to keep your eye on. You can check out the OSVR GitHub repos here and their GitHub landing page here.

It looks like a lot of the software code is licensed under the Apache License, Version 2.0. You can also sign up to download some of the OSVR hardware schematics from their website, which are licensed under Google’s Project Ara Module Development Kit License, since open-source hardware licenses are less well-defined.

One announcement that came out at the IEEE VR conference is that OSVR will be collaborating with 28 leading VR labs at universities around the country. I’d expect OSVR to become a great baseline platform for hardware and software hackers, as well as VR academics, to experiment and innovate on.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Ethan Schur is the Chief Marketing Officer of ImmersiON-VRelia, which produces three different virtual reality HMDs. “The GO” is a mobile HMD for either iPhone or Android phones, the PRO G1 has higher resolution and some AR features, and the Blue Sky Pro version has some extra ruggedized features for the military or for use in digital out-of-home entertainment applications.

My earlier interview with Kevin Williams took place right after he announced ImmersiON’s collaboration with VRelia to produce a mobile HMD suitable for the digital out-of-home entertainment market.

For more information on VRelia and AlterSpace, be sure to check out their website here.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Brandon Jones is a WebGL developer who started implementing WebVR into the Chrome web browser as part of his 20% project at Google. He’s been working on it for the past year in order to get VR into the browser. You can find the latest WebVR-enabled Chrome build here.

Brandon talks about the momentum and growth of WebVR over the past year, and how he sees VR as an area of computing that people are very interested in. WebVR and WebGL are very interrelated areas, so in a sense he’s working on WebVR all the time.

He talks about the Khronos Group standards committee for WebGL, and the fact that WebVR is currently homeless in terms of a standards committee; it’s uncertain whether the W3C or the Khronos Group will be its governing body. You can check out the latest WebVR spec here.

Reducing latency is the number one focus of the WebVR work in Chrome: the latest measurement with a DK1 was 64 ms of motion-to-photon latency, and it’s likely lower with the DK2’s faster framerates. They’ve also integrated timewarp into WebVR in Chrome to help reduce perceived latency. He talks about some of the ongoing work in Chrome to make realtime WebGL rendering a lot faster, as well as some of the other optimizations that could make the browser more efficient if it’s known that the output display is VR.
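
Timewarp hides latency by re-projecting an already-rendered frame with the freshest head pose just before the display scans out. Real implementations reproject in 3D on the GPU; this toy one-dimensional column shift only illustrates the latency-hiding principle, and all values are made up for illustration.

```python
# Toy 1D sketch of timewarp: shift an already-rendered "frame" (a row of
# pixels) by the head rotation measured after rendering started, so the
# image shown matches the user's latest orientation.

def timewarp_1d(frame, yaw_delta_px):
    """Shift frame columns by the late-measured yaw, zero-filling edges."""
    n = len(frame)
    shifted = [0] * n
    for i in range(n):
        src = i + yaw_delta_px  # sample from where this pixel "came from"
        if 0 <= src < n:
            shifted[i] = frame[src]
    return shifted

rendered = [1, 2, 3, 4, 5]           # rendered with a slightly stale pose
corrected = timewarp_1d(rendered, 1)  # head turned ~1 pixel-column since then
# corrected == [2, 3, 4, 5, 0]
```

The key point is that the shift is cheap compared to re-rendering the scene, so it can run at the very last moment before display, which is why it reduces perceived rather than actual rendering latency.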

Google is not targeting Gear VR with WebVR at first, because WebVR isn’t meant to be an end-to-end VR experience. In other words, they’re not creating browsers that work in VR, but rather making VR experiences that work in the browser.

Brandon talks about Google Cardboard, and some of the initial reactions to it and the growing interest around it. His own personal reaction to seeing Cardboard for the first time was to laugh in his manager’s face, but he very quickly went from “This is crazy!” to “This is brilliant!” after trying it out and seeing its potential. He talks about some of the more compelling Cardboard experiences he’s had, and how he sees it filling a need for consuming 360-degree videos, 360-degree photos, and other ephemeral content.

He talks about some of the future optimizations that Unity and web browsers could make to streamline the viewing of WebGL content on the web. The current downloads are fairly monolithic, and could be made more web-friendly by dynamically streaming assets and content.

Finally, Brandon doesn’t see the web as a platform for AAA VR gaming, since it’s not optimized to maximize the use of native resources. Instead, he sees that the web will be great for sharing ephemeral content that you don’t want to download. He also sees that a lot of the framework for the metaverse is already here with the Internet, and cites Vlad Vukicevic, who said, “We already have the metaverse; it’s the Internet in 2D!”

For more information on WebVR, be sure to check it out here.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Ikrima Elhassen of Kite & Lightning talks about the process of developing the INSURGENT: Shatter Reality virtual reality experience to promote the Lionsgate film The Divergent Series: INSURGENT. He talks about how the project came about, as well as many of the lessons learned along the way.

Ikrima also talks about the two plugins that they developed for Unreal Engine 4 in order to complete this project:

  • UE4 Stereo 360 Movie Export Plugin – to easily create Gear VR ports of their passive desktop experiences. It’ll be available as an open-source & free plugin here soon.
  • Alembic Cache Playback – enables playback of Alembic files in UE4 so that they can import vertex cache animation such as water simulations or rigid motion animation to handle up to 10k fragments.
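
Vertex-cache playback of the kind the Alembic plugin enables stores a full set of vertex positions per frame and interpolates between the two nearest cached frames at playback time. The sketch below is a generic illustration of that idea in Python, not Kite & Lightning’s actual UE4 code; the cache contents and function names are invented.

```python
# Illustrative vertex-cache playback: animation is baked as per-frame vertex
# positions (e.g. from a water simulation), and playback linearly
# interpolates between the two nearest cached frames.

def sample_cache(frames, t):
    """frames: list of per-frame vertex lists; t: playback time in frame units."""
    i = int(t)
    if i >= len(frames) - 1:
        return frames[-1]          # clamp at the end of the cache
    alpha = t - i                  # fractional position between frames i, i+1
    lerp = lambda a, b: a + (b - a) * alpha
    return [tuple(lerp(a, b) for a, b in zip(v0, v1))
            for v0, v1 in zip(frames[i], frames[i + 1])]

cache = [
    [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],  # frame 0: two vertices
    [(0.0, 2.0, 0.0), (1.0, 2.0, 0.0)],  # frame 1: both moved up by 2
]
# Halfway between frames 0 and 1:
# sample_cache(cache, 0.5) == [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
```

The appeal of this approach is that arbitrarily complex simulations become a fixed per-frame data cost at runtime, which is what makes effects like water feasible in a real-time VR scene.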

They wanted to have the widest release possible for the INSURGENT experience, and so it’s available via:

  • A DK2 Version
  • A Gear VR Movie Theater experience to watch the movie trailer and GearVR port of the DK2 Version (via Oculus Home on GearVR)
  • A traveling city tour with a custom designed/built chair from the VR experience with full haptic feedback and 4D components
  • Google Cardboard mobile apps (Android & iOS)

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Elyse Bromser-Kloeden is a 3D environment artist and VR developer who is working on a 360-degree video streaming site called Vrideo. Vrideo is focused on making VR streaming work, and they’re doing some special optimizations to provide a better-quality VR streaming experience.

Elyse also has her own personal educational VR experiences, including Meso VR and Painted Cave. She’s worked in the games industry as an environment artist and art director on games like Assassin’s Creed Brotherhood and Til Morning’s Night. She’s also been freelancing on different projects where she’s able to create real-time VR environments.

Elyse is working on the user interface at Vrideo so that there’s a consistent cross-platform VR experience from web to mobile to PC. She’s noticed that text-heavy content needs special attention in VR because it can be overwhelming. Other open questions include how realistic versus how abstract to make the environment so that the focus stays on the video content.

She talks about her educational experiences with Painted Cave and Meso VR. She used Autodesk’s free 123D Catch photogrammetry program to capture textures and 3D models at Vasquez Rocks to create Painted Cave. In the process, she learned the importance of optimizing your VR experience from the beginning to create a comfortable experience first, and then focusing on making things look good from there.

Elyse talks about how she’s combining photos found online with Google Street View’s photo sphere captures of Mayan ruins to create her Meso VR experience. She has some educational features that she’ll be adding to the experience, including a lot of historical insights based upon working directly with archeologists studying Mayan artifacts.

Her favorite educational experience is 6000 Moons, which is a guided tour of all of Earth’s satellites. She makes the great point that a lot of the iOS educational experiences are geared more towards youth, while the VR educational experiences tend to target an older adult audience.

For more information, be sure to check out Vrideo, which just launched yesterday as well as Elyse’s portfolio site.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Trevor Claiborne is a product manager for Google Cardboard, and he talks about the evolution of Cardboard from being a 20% side project to now shipping over 500,000 Cardboard units. Google Cardboard was a very secretive project within Google, and a lot of people worked on it in their spare time leading up to the initial announcement at Google I/O in 2014.

He talks about some of the design evolution of Cardboard, including how they first discovered that they could use a magnet as a button via the phone’s magnetometer. There have also been some small tweaks and improvements to the design over time.
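
The magnet button works because sliding the magnet past the phone produces a sharp spike in the magnetometer’s field reading, which software can interpret as a click. Here is a toy sketch of that detection idea; the threshold and sample values are made up for illustration and this is not Google’s actual implementation.

```python
# Toy Cardboard-style magnet "click" detection: watch the magnetometer's
# field magnitude and flag any sample that jumps sharply above the previous
# one. Threshold is an arbitrary illustrative value.

def detect_clicks(magnitudes, threshold=30.0):
    """Return indices where the field magnitude jumps by more than
    `threshold` relative to the previous sample."""
    clicks = []
    for i in range(1, len(magnitudes)):
        if magnitudes[i] - magnitudes[i - 1] > threshold:
            clicks.append(i)
    return clicks

# Steady ambient field, then a spike as the magnet slides past the sensor:
samples = [50.0, 51.0, 50.5, 95.0, 52.0, 50.0]
# detect_clicks(samples) == [3]
```

A real implementation would also debounce and filter the signal, but the core trick is the same: reuse a sensor every phone already has as a zero-cost input device.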

He talks about the initial perception of Cardboard being a joke, but once people get a chance to try a Cardboard VR experience, they understand it a lot more. Trevor says that Google is serious about virtual reality; they’re just going at it in a different way than other companies, trying to produce VR experiences that are more accessible to more people.

He talks about how Google Cardboard was deliberately designed without a headstrap, because holding it with your hands limits your head motions in a way that helps prevent nausea. The phone’s sensors are still limited, and they’re just trying to create an optimized VR experience given those constraints.

He talks about some of the future plans for creating a Cardboard design that fits a “phablet,” and some of the emerging interaction models. A couple of stand-out Google Cardboard experiences that he’s had include VR Cosmic Roller Coaster and Titans of Space for Cardboard.

Trevor says that there are no current plans to add positional tracking, so they’re sticking with 3DOF head tracking. He’s looking forward to when VR can transport us to another world that’s indistinguishable from reality, and he sees a lot of opportunities for fun and education in VR.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Kyle Monti talks about HapTech’s haptic peripherals for VR. They have a patent for electronic recoil for haptic feedback using linear motors, and they’re creating a generalized haptic device that can simulate everything from pool cues and baseball bats to tennis racquets.

They were showing off a gun system in a really fun arcade-style robot shooter game at GDC. Having haptic feedback in sync with a positionally tracked object does an incredible job of increasing the sense of presence within VR. Their demo, created by Otherworld Interactive, used STEM controllers.

One of the iterative innovations that came from the GDC experience is that HapTech modified their demo so that the person standing next in line could spawn and control the robots attacking the user in the demo. It was a great way to engage and involve the people watching the person in VR, and a human-controlled enemy was also able to do things that would be a lot more difficult to program for.

Kyle also talks about some of the more military-grade haptic weapons that they’re involved in helping to create through DEKKA Technologies. They license out the patented recoil system so that they can focus on the haptic technology portion, but also so that they can invest more energy into VR gaming rather than virtual military training equipment. HapTech is an interesting company in that the founders have a foot in each world.

HapTech was one of the more immersive VR experiences and demos that I got to try out at GDC, so keep an eye out for their upcoming SDK and for information on how to get more involved with creating experiences for their generalized haptic technologies.

Here’s Road to VR’s write-up and video on HapTech.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.


Paul Bettner is the founder and CEO of Playful Corp, and he’s best known for creating the iPhone game Words with Friends. Dan Hurd is the studio design lead at Playful and led an epic 3-4 month jam that produced 38 VR game prototypes exploring various VR game mechanics.

I had a chance to play Lucky’s Tale for the first time at GDC, and I have to say that I was blown away. I had a half hour for the interview, and the best intentions of getting a brief feel for the game and then doing the interview. But once I got into the Lucky’s Tale world, I was so compelled that I couldn’t stop playing. Playful has discovered something really key about what they call the “sweet spot” of VR, which is optimizing all of the gameplay to happen about arm’s length away by increasing the IPD and using a set of finely tuned third-person camera controls.

Bettner has been an early adopter of new technologies, and he talks about discovering VR through a friendship with John Carmack and then the process that Playful went through in order to discover what works and what doesn’t work in the VR medium. They couldn’t fall back on their old tricks, and they had to find the rules where there are no rules yet.


Paul talks about how developers have decades of expectations and fantasies about what VR should be, and how we have to be willing to let go of some of the things that don’t work as well within VR. At GDC, Paul was starting to see some experiences that go beyond what our fantasies of VR might be. Dan suggests that developers have to be candid with themselves when something they expected to be amazing in VR isn’t, such as flying, which suffers from the lack of depth cues for objects at far distances.

Dan & Paul talk about their iterative process of trying to discover the fun mechanics within VR. Like Job Simulator, they also found that doing mundane tasks with your hands is fun, especially if they’re physics-based.

Paul talks about discovering the “sweet spot” of VR, which ends up being about arm’s length away. They found research suggesting that our brains devote more neurons to processing information at higher fidelity within the reach of our arms, so all of the game mechanics happen within that range. Lucky’s Tale was created to optimize the gameplay to happen within this VR sweet spot. They ended up increasing the IPD in order to give a giant’s perspective from a third-person viewpoint. They also found that the diorama mode of an entire level looks really amazing in VR.
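
The IPD trick Paul describes follows from simple stereo geometry: widening the separation between the virtual cameras makes the whole world read as proportionally smaller, which is what gives the player a giant’s-eye view of a diorama sitting at arm’s length. A rough sketch of that relationship, with illustrative numbers that are not Playful’s actual values:

```python
# Hedged sketch of the "giant's perspective" trick: perceived world scale is
# roughly inversely proportional to the rendered inter-camera distance (IPD).
# 64 mm is a typical human IPD; all values here are illustrative.

def apparent_world_scale(render_ipd_mm, human_ipd_mm=64.0):
    """Roughly, how large the world appears relative to real-world scale."""
    return human_ipd_mm / render_ipd_mm

apparent_world_scale(64.0)   # normal human IPD -> world at 1.0x (life size)
apparent_world_scale(640.0)  # 10x-wide cameras -> world appears 0.1x: giant view
```

Scaling the cameras up rather than the world down achieves the same perceptual effect while leaving the level geometry untouched, which is likely why the technique pairs naturally with a third-person diorama camera.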

They broke some of Oculus’s best-practices guidelines in order to discover this table mode of placing all of the action within the sweet spot of VR. Paul says that the parameters controlling the third-person camera operate in a very narrow, unforgiving band of comfort.

Finally, Dan talks about how light can bring a sense of reality to these virtual worlds and being in a magical place that causes surprise and delight in a way that’s the most comfortable and impactful. Paul says that virtual reality is the ultimate expression of being able to create your dreams, and that we’ve actually been able to create a technology that can directly interface with your brain in a way that’s indistinguishable from reality.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Josh Carpenter is a researcher and interaction designer figuring out how to use virtual reality on the web with WebVR. Mozilla has been increasing the size of the team working on WebVR, because they’re betting that immersive experiences like virtual and augmented reality will be a huge part of the future of computing. Mozilla wants to give web developers the tools and infrastructure to help build out the metaverse of interconnected virtual worlds.

Mozilla was showing a number of different WebVR demos on a DK2 at GDC that were running natively in a Firefox Nightly web browser along with a WebVR Oculus Rift Enabler.

Among the lessons learned for user experience design in VR: they found that designing UI elements onto curved surfaces works really well, and that text size has to be fairly large, which reduces the amount of information density available to work with. They also found that lean-in-to-zoom with the DK2’s positional tracking is analogous to pinch-to-zoom on a multi-touch device: effective yet somewhat annoying, so they try to avoid relying upon it too much.

Josh discusses Leap Motion integration into WebVR and virtual reality, and warns against designing skeuomorphic interactions with your hands, suggesting instead thinking about extending your reach and optimizing the interaction design for what works best within VR. Josh also talks about the concept of progressive enhancement and how it applies to designing VR experiences that work across a range from mobile, to sitting at your desktop with positional tracking, all the way to room-scale tracking with two-hand interactions. On the web, an experience should work with the most basic input device, and then progressively enhance itself if more capable input devices are detected.
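
The progressive-enhancement idea can be sketched as a simple capability check: the experience always works in its most basic form, and unlocks richer tiers as more input devices are detected. The tier names and capability keys below are hypothetical placeholders, not part of the WebVR specification.

```python
# Illustrative sketch of progressive enhancement for a web VR experience:
# richest tiers are listed first, and we fall back to the first tier whose
# requirements the detected hardware can satisfy. Names are hypothetical.

TIERS = [
    ("room-scale", {"hmd", "positional_tracking", "tracked_hands"}),
    ("seated",     {"hmd", "positional_tracking"}),
    ("mobile",     {"hmd"}),
    ("flat-web",   set()),  # baseline: works in any browser, no VR hardware
]

def select_tier(capabilities):
    """Pick the richest experience tier whose requirements are all present."""
    for name, required in TIERS:
        if required <= capabilities:   # set containment: all requirements met
            return name

select_tier({"hmd", "positional_tracking"})  # -> "seated"
select_tier(set())                           # -> "flat-web"
```

This mirrors how progressive enhancement works elsewhere on the web: the baseline is guaranteed, and better hardware only ever adds capability rather than gating access.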

Josh talks about the range of WebVR demos being shown at GDC, from games created in Flash, to a 360-degree video, to Epic Games’ Couch Knights, which was exported from Unreal Engine 4 using a WebGL export plugin.

The WebVR Web API specification is being designed so that you can experience the web through any of the virtual reality HMDs, and they’re also figuring out the user interaction paradigms that will allow people to be able to go to any destination in the world wide web-enabled Metaverse.

He talks about how Unity 5 now supports one-click WebGL export. Unity exports WebGL 1, and WebGL 2 is on the horizon with even more next-generation graphics capabilities. For example, Mozilla was showing off a demo at GDC of the level of graphics fidelity that’ll be possible with WebGL 2.

Josh also talks about what they’re doing to increase performance for real-time, immersive 3D experiences. There are a lot of optimizations that can be made to the browser if it’s known that the output will be a virtual reality display. It will take more development resources, and Mozilla has recently committed to growing the WebVR team, both to enable more VR on the web and to help create the Metaverse.

He also talks about the power of the web for ephemeral experiences, and about some of the collaborative learning experiences that he really wants to have within browser-powered VR. He also talks about how WebVR is pushing innovation on the audio front, and cites A Way to Go as an experience that pushes the audio API to its performance limits.

Finally, Josh talks about the future plans for making WebVR easier for developers to use, making sure mobile VR experiences are possible, and creating responsive web design standards so that websites can flag that they should be experienced as fully immersive VR. He also sees investing in virtual reality as a safe bet, because immersive experiences will be a key driver of innovation in the future of computing.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.