Nick Whiting is a lead engineer at Epic Games and has been a VR evangelist there. He had previously worked with Brendan Iribe and Nate Mitchell at Scaleform, so they sent him an Oculus development kit to integrate into Unreal Engine 4. He worked on the integration in his spare time and eventually got it running.

Then Oculus brought them the HD prototype, and they collaborated on creating a UE4 demo for E3 2013, which helped Oculus beat out the Xbox One and PlayStation 4 for a Game Critics E3 award. This was a turning point that helped legitimize VR at Epic Games.

Then founder Tim Sweeney and director of engineering Dan Vogel saw the Valve VR demo room, which helped seal the deal for VR at Epic. Some people could see the potential beyond the DK1, but others needed to see something closer to a consumer-ready product. The Valve Room VR demo was a clear turning point for the leadership at Epic.

Over the past couple of years, Nick has started to get more resources to make VR demos, including the Showdown demo that was the final demo scene in the series of Crescent Bay demos.

VR started as a side project at Epic, and now Nick says that it’s pretty huge there. Most recently Tim Sweeney said that he believes that virtual reality “is going to change the world.”

In this interview Nick talks about:

  • How opening up UE4 to a subscription model at $19/month brought it to a wider variety of developers and VR experience creators.
  • The process of integrating open source contributions from the community back into UE4
  • The Public Trello board for the UE4 Roadmap, and how that plays into their release cadence
  • How help from Oculus in the original UE4 integration came from Andrew Reisse, and how Artem Bolgar is now the dedicated resource doing much of the engineering work to get the Oculus SDK integrations working
  • Epic’s approach to superior visual fidelity
  • The possibility of SLI GPUs and need for more GPU power for VR
  • How the Showdown demo was being run on the NVIDIA GeForce GTX 980 at 90Hz at the Crescent Bay demo resolution
  • Experimenting with integrating different hardware for VR input controls. The more integrations, the better

Nick also talks about some of the lessons learned from doing VR demos. He says that VR makes developers more honest because the tricks that work in 2D don’t work as well in 3D. Couch Knights was about creating a shared social space and it was more impactful than they expected.

Epic’s visual style has also traditionally been more realistic and gritty, but they found that within VR, people tend to make stronger emotional connections to abstract characters with a more stylized art style. Hyperreal rendering risks the pitfalls of the uncanny valley, while a low-poly scene tends to be easier for the mind to accept: there is more room for mental projection, and less attention is drawn to what isn’t 100% correct.

Finally, one of the most powerful experiences in VR for Nick was a social VR experience where he felt presence with another person who had limb tracking enabled. He sees that humans being present with each other is really powerful and compelling, and that being present in a world that’s not our own has a lot of potential that he finds really exciting.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

This video is what has since been referred to as the Showdown demo, and it was the final demo scene of the Crescent Bay series.

Here’s Nick Whiting and Nick Donaldson from Epic Games discussing lessons from integrating the Oculus Rift into Unreal Engine 4 at Oculus Connect:

Sergio Hidalgo is an indie game developer who developed the Dreadhalls horror experience, which received an honorable mention in the Oculus VR game jam in 2013. He’s in the process of making a full commercial version as well as a mobile version for the Gear VR.

Some of the things Sergio talks about are:

  • Designing horror experiences that go beyond jump scares
  • Being able to experience a physical presence of the creatures in VR
  • Using procedurally-generated maps in order to keep the user surprised
  • Triggering creature movement based upon the movement of your head, a mechanic unique to VR
  • The change blindness mechanic and playing with a peripheral vision mechanic
  • The pitfalls of a developer building up a tolerance to their own horror experience, and tips to help with this
  • Design approaches for when a player breaks through walls with positional tracking
  • The tradeoffs for supporting various input controllers
  • Lighting considerations for Gear VR optimizations
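
The procedural map generation Sergio mentions can be sketched generically. The depth-first "recursive backtracker" below is an illustrative example of keeping layouts fresh on every playthrough; it is a standard maze algorithm, not Dreadhalls' actual implementation, and all names here are my own.

```python
import random

def carve_maze(width, height, seed=None):
    """Depth-first "recursive backtracker" maze carve.

    A generic sketch of procedural map generation (NOT Dreadhalls'
    actual algorithm): cells live at odd grid coordinates, and walls
    between visited cells are knocked out as the walk proceeds.
    """
    rng = random.Random(seed)
    grid = [['#'] * (2 * width + 1) for _ in range(2 * height + 1)]
    stack = [(0, 0)]          # current walk, starting at cell (0, 0)
    visited = {(0, 0)}
    grid[1][1] = ' '          # open the starting cell
    while stack:
        x, y = stack[-1]
        # Unvisited orthogonal neighbors of the current cell.
        neighbors = [(nx, ny)
                     for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                     if 0 <= nx < width and 0 <= ny < height
                     and (nx, ny) not in visited]
        if not neighbors:
            stack.pop()       # dead end: backtrack
            continue
        nx, ny = rng.choice(neighbors)
        grid[y + ny + 1][x + nx + 1] = ' '   # wall between the two cells
        grid[2 * ny + 1][2 * nx + 1] = ' '   # open the neighbor cell
        visited.add((nx, ny))
        stack.append((nx, ny))
    return [''.join(row) for row in grid]
```

For example, `carve_maze(4, 3)` returns a 7-row grid of `'#'` walls and `' '` corridors, different on each run unless you pass a fixed seed.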

Sergio also mentioned how the first VR game jam provided a lot of inspiration for him in terms of exploring different VR gameplay mechanics.

This morning Oculus announced that there is a Mobile VR Game Jam 2015 happening specifically for the Gear VR. You can sign up for it here.


Justin Moravetz of Zero Transform is one of the most passionate and dedicated indie VR game developers I’ve met. He’s been thinking about virtual reality since the ’90s, and he talks about the evolution of developing both Proton Pulse and Vanguard V. Some of the other topics discussed were:

  • Differences between designing for the Sony Morpheus, Gear VR & Oculus Rift.
  • User experience and input control innovations with tapping
  • Lessons learned from his Kickstarter campaigns
  • The challenges of making it as an indie VR developer
  • The iterative process of game design and tracking progress with video logs
  • Using Gear VR to spread his VR development work beyond the VR community
  • The implications of being able to manifest your imagination to scale
  • His journey into VR and game development
  • Importance of teaching through interactions in VR

Since the recording of this interview, Justin has announced that he’s helping out on a VR Rock Opera called [NUREN].


E McNeill is the developer of Darknet, a cyberpunk hacking game developed for the Gear VR. It was originally developed in three weeks as Ciess, which won the game jam that Oculus sponsored back in the summer of 2013.

E’s intention is to create a VR experience that has a deeper gameplay mechanic and keeps people coming back to play more. He resisted calling it a puzzle strategy game, but he’s now resigned to describing it that way.

E reflects upon the design process of creating the rules that form the basic gameplay of Darknet. It’s an iterative process that sometimes forces him to abandon gameplay that isn’t working. He went through a phase of creating rules that were too hidden and random for players to figure out, so they’d have to resort to random guessing in order to progress through the game. He ended up shortening the feedback-loop cycles and experimented with a variety of gameplay mechanics so that a player could progress from novice to elite hacker. He’s even embedded some advanced optional features where you can gain an advantage if you’re willing to learn how to read the matrix.

Finally, he talks a bit about how Darknet worked more naturally as a mobile game, and some of the optimizations he needed to do to get it ready as a launch title for the Gear VR. He found that playing the game in a swivel chair provided the best experience, but he also added a comfort-mode-style turning option to avoid feelings of simulator sickness.

Thinking about VR too far into the future is a bit too speculative for E, as he’s not sure if VR will be able to live up to the potential of everyone’s expectations. But for now, he’s just happy to think about VR in the short-term and help pioneer some of the gameplay mechanics that work really well within this medium.


Cymatic Bruce is the head of developer relations at AltSpace VR, and he talks about the three main goals for their company:

  • Enable social interactions within virtual environments
  • Connect the 2D web within VR in order to share media and livestreamed events
  • Enable developers to bring 3D web content into shared VR spaces

Part of what makes AltSpace VR unique is their mission to create integrations with the open web. They’re building their application on top of Unity at first in order to ensure a stable, accessible platform that performs well, but their long-term plan is to enable developers to create their own customized virtual environments.

Some of the topics that we discussed are:

  • Experiencing livestreamed events in VR with a collaborative web browsing component, as well as sharing animated GIFs and viral videos in a group context.
  • Plans to integrate with multiple HMDs and various input devices.
  • Their focus on non-verbal cues & limb tracking since facial tracking with an HMD is a difficult problem.
  • Their unique point-and-click teleportation solution to VR locomotion.
  • Integrating an open source audio solution, volume attenuation, positional VoIP & creating realistic sound profiles.
  • Plans for virtual environment customizations via their SDK and API to create Ready Player One-like environments for expressing identity.
  • Their pragmatic approach towards solving the low-hanging fruit features of the Metaverse.
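
The volume attenuation mentioned in the audio bullet can be illustrated with the standard inverse-distance model used by positional audio engines. The function and constants below are purely illustrative assumptions on my part, not AltSpace VR's actual implementation.

```python
def attenuate(volume, distance, ref_dist=1.0, rolloff=1.0, max_dist=20.0):
    """Inverse-distance gain: full volume inside ref_dist, silence
    beyond max_dist, and a smooth 1/d-style falloff in between.
    All constants are illustrative, not AltSpace VR's real values."""
    if distance <= ref_dist:
        return volume
    if distance >= max_dist:
        return 0.0
    gain = ref_dist / (ref_dist + rolloff * (distance - ref_dist))
    return volume * gain
```

With these defaults, a speaker 3 meters away comes through at a third of full volume, and anyone past 20 meters is silent.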

Cymatic Bruce says that part of what makes social VR so compelling is that memories are created based upon locations in VR, and it’s made even more salient when interacting with our friends and family. There are already people who are creating shared social experiences of watching TV shows and sports events together, and he foresees AltSpace VR being a part of how people experience these types of events in the future.

Be sure to sign up for AltSpace VR’s beta in order to check out some of their social experiences.


Nick “RedOfPaw” Pittom talks about the process of translating scenes from Studio Ghibli films into VR experiences, like the Boiler Room scene from Spirited Away and the Bus Stop scene from My Neighbor Totoro.

He makes the observation that translating a scene into VR doesn’t mean that it’s the best way to experience the story or the characters from a film, and that the best VR experiences will really be custom-made to take into account the pacing and environmental strengths of VR.

Nick also talks a little bit about the new Crystal Rift experience that he’s helping out with, as well as his successfully Kickstarted trip to Oculus Connect and the current plan for “RedOfPaw’s Big Crazy Stupid VR Adventure and American Pie.”


Tom Kaczmarczyk is a developer of SUPERHOT, an FPS where time moves only when you move, which recently raised $250k on Kickstarter. It was a popular 2D game originally developed as part of the 2013 7-Day First Person Shooter challenge.

Tom talks about the evolution of the game, and how they first started their VR version after Aaron Davies from Oculus VR reached out offering them a DK1 HMD to experiment with. He talks about the challenges of creating a 2D game first and foremost and then adding VR support, and about some of the ways SUPERHOT might have been developed differently had it been designed for VR first rather than having VR compatibility added to a 2D game.


Jens Christensen is the CEO and co-founder of Jaunt VR, which is a VR startup that has raised over $35 million to develop the toolchain for live-action, 360-degree spherical, 3D video. Jaunt’s technology combines “computational photography, statistical algorithms, massively parallel processing, cutting-edge hardware and virtual reality.”

Jens talks about Jaunt VR’s mission to capture the world around you and put it within a VR experience. It requires capturing a ton of video data, and they’ve developed a parallel-processing data farm to stitch together a fully spherical video. They’re also developing a method for capturing ambisonic 3D audio.

The Jaunt VR Twitter account has been posting photos covering a wide range of sporting events, and Jens says that Jaunt is very interested in eventually providing live broadcasts of immersive content. Their primary focus in the beginning is sports events and music events, as well as experiments with narrative content ranging from a horror story to a story set in World War II. They haven’t released any VR content to the public yet, but are looking forward to releasing some experiences soon.

Jens said that there is no full positional tracking yet; at the moment it’s just rotationally tracked, but it’s something they’re still looking into. Storytelling within VR is going to require developing a new language and syntax, and in the beginning they prefer to keep the camera steady to minimize motion sickness.

Jaunt VR is also more interested in leveraging the existing video post-production toolchain for editing, compositing and color correction. It sounds like they’re likely working on plug-ins or video players to directly integrate and preview VR content while you’re editing it, but Jens didn’t provide any specifics and said that it’s still an open problem. Finally, he sees that VR is going to change a lot of industries, ranging from news to music and travel, and he’s looking forward to playing a small part with the work that Jaunt VR is doing.


Danfung Dennis talks about the process of making the first commercially available, 360-degree, fully immersive VR cinematic experience called Zero Point, which was released today on Steam. I see Zero Point as a historic VR experience that will likely catalyze a lot of other ideas and experiences within the realm of cinematic VR, and so it’s definitely worth checking out.

Danfung is the CEO of Condition One and started looking at 360-degree video back in 2010 when creating immersive experiences for the iPad and iPhone. He sees Zero Point as a camera test and experiment that shows the evolution of 360-degree video capture technology, starting from monoscopic 180-degree video and maturing to 360-degree stereoscopic video using RED cameras shooting at 5K and 60 frames per second.

Danfung hopes that Zero Point will be a catalyst for other experiments with 360-degree cinematic experiences that help to develop and cultivate the new syntax, language and grammar for creating immersive VR experiences. Zero Point uses the constructs of a traditional documentary film, but yet I was personally left craving something that was focused more upon putting the physical locations at the center of the story rather than the other way around. Danfung seemed to agree with that sentiment, and that moving forward what is going to matter more is using the strengths of the VR medium to create a more visceral and emotionally engaging experience where the location, visuals, and audio are all synchronized together.

In terms of the language of VR, Danfung had many insights, ranging from seeing that cuts between drastically different scenes are a lot more comfortable than moving around within the same scene. And while fading to black seems to be a safe way to navigate between scenes and minimize the disorientation of teleporting instantly to a new place, he also sees that it could be overused, and he’s interested in experimenting with longer shots with few cuts. He’d also like to iterate more quickly by creating shorter experiences that are long enough to create a sense of presence, enabling a wider range of focused experiments to learn more about the language of cinematic VR, which is still in its very early days of development.

There’s also an interesting experiment of blending computer-generated experiences with live-action footage, and Danfung wants to experiment with this more. He believes that when the resolution gets to be high enough, then 360-degree video has the potential to provide a sense of presence more quickly than a computer-generated environment.

Finally, Danfung sees tremendous potential for VR to transcend the limitations of mass media and to provide a more advanced language for sharing stories that take people into another world, another mind, and another conscious experience. There is a lot of potential to explore more abstract concepts and to use the VR medium for pro-social causes to help make the world a better place.

Zero Point is now available on Steam here.


  • 0:00 – Introduction
  • 0:35 – The process of innovating and creating 360-degree video. Wanted to create powerful immersive experiences; tried an early prototype of the DK1 and knew that VR was going to be his medium. Started with astronomy photography, using 180-degree fisheye lenses to capture the night sky with a DSLR. Saw limitations of 180-degree lenses, started adding more cameras, and now has an array of RED cameras shooting at 5K and 60 fps.
  • 3:20 – Mix of different video styles and techniques. It’s an evolution of different capture systems and playback. Started with 180 degrees and mono, and kept innovating the technology to the point of a 360-degree sphere in stereoscopic 3D. There is quite a mix throughout the VR experience. Get it out and get feedback; it’s a signpost of what’s to come. It’s a camera test and an experiment.
  • 5:00 – It’s easy to break presence. Seams and black spaces are immediately noticed; they’ll pull people out of the experience and break the sense of presence. A large part of the process was trying to iron out the glitches. Have to have a very seamless image because people will notice the errors. The resolution needs to be higher than the DK2’s.
  • 6:54 – Mixing the documentary form with being transported to different locations within VR. What is VR film strongest for? Drawing on the techniques we knew before, but those film techniques don’t really apply to VR. Zero Point takes a more traditional documentary approach, but moving forward the core experience is going to matter more. How emotionally visceral an experience is will be what’s most compelling. The scenes and environments will be key. How to best convey emotion in the VR medium is an open question. It’s a new medium and a new visual language that’s being developed.
  • 9:00 – Zero Point as a “stub article” that will catalyze ideas for what the strengths of the VR film medium are. Designing around the location first. VR will be visuals-first, then audio; design around that. This is what high-resolution, high-framerate VR looks like. Hopes that Zero Point is a catalyst for putting out experiments to see what works and what doesn’t with the medium. What would happen if you put audio first?
  • 11:14 – The length of Zero Point as a VR experience, and optimizing around that. What would the optimal length be? It’s still to be determined. It has to be comfortable; the DK2 is a lot more comfortable than the DK1. You need enough time to really get that sense of presence, at least a couple of minutes. 15 minutes does feel like a long time to be in VR watching a video. Hopes to iterate more quickly on shorter experiences.
  • 12:58 – The introductory sequence is CGI, but most of the experience is 360-degree video. Interested in blending CG with film; they’re very different. With video, the brain can be more quickly convinced of being in another place IF the resolution is high enough. CG is more acceptable at the resolution of the DK2; video resolution is roughly at the level of a DK1 at the moment. Need another iteration of panels at a higher resolution.
  • 14:58 – Cuts, fading to black, and the language of VR for what works and what doesn’t. It’s a new syntax and grammar that needs to be developed. Fading to black works but isn’t satisfying. Long, unbroken, uncut shots are the starting point: not too slow, still dynamic, and without camera motion. It’s easier to cut between scenes that are very different than between similar scenes. Longer shots and fewer cuts are where he’d like to go, but it took years for the language of cinema to develop. Still in the research phase of what works in cinematic VR experiences.
  • 16:50 – Movement within VR. Had some shots with motion, but also some stationary shots. Trying to push movement and moving through space. Need instant acceleration to make it comfortable: in film you ease from stationary into movement, but in VR you need to go straight to speed. Head bob is too much when moving and causes motion sickness. Movement needs to be stabilized, like with a three-gimbal drone, and you get more 3D information. But VR needs new rigs.
  • 19:00 – Experiments with audio, with different techniques used throughout. Put binaural earphones within your ear canals. Had a binaural mic in his ears recording the scene, but there’s no head tracking.
  • 20:32 – Different locations and different experiments with the scenes. The driving question: what is an effective VR experience? Direct engagement? Having things happen around you? Being a character with an avatar and some interaction? There will be a wide array of different experiences. People, animals and complex environments are compelling. Directly addressing the camera, and you start to feel like you’re there in real life. If you react as if you’re there and respond to social cues, then you’ll get that sense of presence.
  • 22:34 – The power of VR is that it can put you in another conscious experience, another mind and another world. It’s a new form of communication and a new form of emotion. Apply this to abstract situations like global warming, and potentially use it for pro-social goals. We can go beyond the type of content we’re used to, which is mainly controlled by mass media. It could provide a more advanced language for sharing stories in a new way. VR has tremendous potential, and it’s still such early days.

Zero Point Official Trailer from Condition One on Vimeo.


Ally Maque of PixelWhipt & ASMRrequests talks about her new VR video show called VirtuAlly.

Ally wants to help bring virtual reality to the mainstream by producing a show that covers the weekly highlights of VR news. She talks about coming from the world of producing Autonomous Sensory Meridian Response (ASMR) videos, where there is a heavy focus on producing binaural audio content. She also shares some of her VR insights from the process of producing first-person, immersive media like her sci-fi ASMR video called Departure.

She talks about some of her favorite VR experiences, including Senza Peso, RedofPaw’s Totoro and Spirited Away demos, Windlands, and the Alien Makeout Simulator. Finally, she sees that VR could eventually be applied to almost anything and will change many aspects of a variety of industries.

Here’s the first episode of VirtuAlly:

If you’re curious to learn more about ASMR then this Vice article on the roleplay subculture of ASMR is a good primer. Or here’s a NSFW cartoon explaining ASMR that Ally says is “Hands down, best explanation of ASMR I’ve ever seen.”