Layla Mah is the lead architect of virtual reality and advanced rendering at AMD. She talks about AMD’s LiquidVR technology, which is built to help bring comfort, compatibility, and content to virtual reality. VR requires a lot of graphics processing resources, and Layla has been looking at different architectures, scaling strategies, and display technologies that can meet the growing graphics processing demands. AMD is not only making sure that VR can work out of the box today, but also continuing to innovate in order to meet the growing graphics demands of VR over the next 5-10 years. She talks about some of the GPU hardware innovations, multi-GPU strategies, overcoming the limits of LCD displays with virtual retinal displays and digital lightfield technologies, as well as how game engines will need to evolve in order to handle up to 16 GPUs.

LISTEN TO THE VOICES OF VR PODCAST

Layla Mah thinks a lot about the future of virtual reality and how to solve the exponentially increasing graphics processing demands of driving a 90 Hz display across two eyes, which amounts to 180 images per second at a resolution of 2160×1200. She says that the brute-force approaches taken today are not sustainable as displays move to 4K and 8K resolutions. The most important thing is to not drop frames, and so AMD is collaborating with content creators to debug their CPU and GPU pipelines in order to consistently hit the 90 Hz spec.
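
To make those rendering targets concrete, here’s a quick back-of-the-envelope sketch using the display figures quoted above (the 4K figure is my assumption about a future panel, not a number from the interview):

```python
REFRESH_HZ = 90

def frame_budget_ms(refresh_hz: float) -> float:
    """Time available to render one frame before it gets dropped."""
    return 1000.0 / refresh_hz

def pixel_throughput(width: int, height: int, refresh_hz: float) -> float:
    """Pixels that must be shaded every second to hold the refresh rate."""
    return width * height * refresh_hz

budget = frame_budget_ms(REFRESH_HZ)          # ~11.1 ms per frame
cv1    = pixel_throughput(2160, 1200, 90)     # combined panel across both eyes
four_k = pixel_throughput(3840, 2160, 90)     # hypothetical 4K-class panel
print(f"{budget:.1f} ms budget, {four_k / cv1:.1f}x more pixels at 4K")
```

The takeaway is that every frame has roughly 11 ms to finish, and a move to 4K-class panels roughly triples the pixel load before the refresh rate even increases, which is why the brute-force approach stops scaling.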

Layla also points out that LCDs have inherited the scan-line approach of CRT monitors, and that a lot of the cables and hardware have been architected around the assumption that a single frame containing all of the data will be updated at 90 frames per second. When looking to scale out to as many as 16 GPUs, there are diminishing returns and inefficiencies in trying to break up an individual scene into different sections. Not only may an object span across 3-4 different sections, but there’s also overhead in recombining the pieces into a coherent final image.

Layla says that photons are asynchronously streaming into our retinas, and so she’s investigating digital light fields as a solution that’s potentially more sustainable. This could mean that Magic Leap’s approach with a virtual retinal display may be better suited to meet future graphics processing demands.

In traditional games, adding multiple GPUs could be handled by alternating frames between the GPUs. But motion-to-photon latency is more important in VR, and this alternate-frame approach is not viable there. Splitting a scene into multiple slices also introduces inefficiencies. There are going to need to be both hardware and software changes at an architectural level in order to scale up to future rendering needs.
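
A deliberately simplified latency model (my own illustration, not AMD’s numbers) shows why alternate-frame rendering clashes with VR: pipelining frames across GPUs raises throughput, but the head pose sampled at the start of each frame gets staler as more GPUs are chained:

```python
FRAME_MS = 1000.0 / 90  # one refresh period at 90 Hz, ~11.1 ms

def afr_input_age_ms(num_gpus: int, frame_ms: float = FRAME_MS) -> float:
    """With alternate-frame rendering, each GPU gets num_gpus refresh
    periods to render its frame, so the head pose it sampled at render
    start is num_gpus periods old when the frame reaches the display."""
    return num_gpus * frame_ms

for n in (1, 2, 4):
    print(f"{n} GPU(s): input is ~{afr_input_age_ms(n):.1f} ms stale")
```

Throughput scales linearly, but the added motion-to-photon latency is exactly what VR cannot tolerate, which is why slicing a single frame across GPUs, with all of its recombination overhead, gets considered instead.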

However, there’s currently a chicken-and-egg problem that Layla is facing. The hardware companies are waiting on the game engine companies to support a more scalable multi-GPU processing architecture, but the software companies are also waiting on hardware that could actually support it. So Layla is stuck in a situation of trying to design for a future that doesn’t exist yet, and she recognizes that the software and hardware will need to reach a convergence point to provide a viable solution.

There may need to be a leap in either the hardware or software side first, while also moving away from the largely brute-force implementations and taking advantage of both perceptual hacks as well as best practices for creating beautiful graphics from the gaming industry.

Layla talks about how the new Vulkan API has been derived from and built upon components of AMD’s Mantle. She also laments how there hasn’t traditionally been a lot of collaboration between AMD and Nvidia, and that some of the innovative features that each company is creating don’t typically get ubiquitous adoption until these features become standard features for both AMD and Nvidia. If there was more collaboration earlier in the process, then perhaps they’d be able to reach that place of ubiquitous adoption of innovation much faster. But she also recognizes that this competition is what has continued to put pressure on each company to continue to innovate.

Finally, Layla believes that VR has a lot of potential to change all aspects of our civilization, ranging from applications in education, medicine, and social connection to increasing empathy. There are still a lot of challenging problems ahead to meet the graphics demands of VR, and she’s excited to be working on the newest features of AMD’s LiquidVR to help meet those insatiable demands.

Become a Patron! Support The Voices of VR Podcast Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Daniel “SvenViking” Fearon is the developer behind Scorched Battalion, a turn-based artillery game that won the Silver prize in the Oculus Mobile Game Jam. Daniel talks about some of the design principles behind alternating between first- and third-person perspectives, and the evolution of that project. He also talks about his path into becoming a VR developer, starting from getting into VR because it might help with his neck injury to then spending too much time having a text-to-speech program read him the posts on the Oculus subreddit. He’s now developing a full version of Scorched Battalion in collaboration with his brothers, and he recommends Vernor Vinge’s Rainbows End as a sci-fi book that points towards where this is all headed.

Specular Theory CMO Ryan Pulliam and founder Morris May talk about some of the 360-degree video experiences that they’ve produced over the last year, including an edgy Sundance film called Perspective: The Party, which tells the story of a sexual assault from the points of view of the victim and perpetrator.

Specular Theory has had a busy year of producing 360-degree videos for different clients, including Terminator Genisys: The YouTube Chronicles in 360 featuring Arnold Schwarzenegger, a music video for The Donnies The Amys, as well as a Jeep® Brand VR Surf Experience.

Morris May also collaborated with Go Fish director Rose Troche on an experience called Perspective: The Party, which showed in the New Frontier section of Sundance this year. The description of the experience reads: “A young woman attends a college party with the intention of shedding her shy girl high school persona. At the same party a young man is after a similar reinvention. They meet. Add booze. Misinterpreted signals and do things that cannot be undone.”

Morris talks about how Perspective: The Party is an exploration of empathy, and how it generated a lot of buzz at Sundance. It’s the first part of a longer series that is described as:

Perspective is a Virtual Reality series created by Morris May and Rose Troche. Together they are setting out to use the inherent power of the virtual realities 1st person point of view to create live action narrative films that challenge viewer’s perception of social issues and constructs that might otherwise never have been. With the use of the Oculus Rift you experience both realities, and in doing so begin to understand the slight differences in perception that sometimes lead to misguided assumptions.

There’s a lot of potential in this type of series, and Specular Theory is on the leading edge of experimenting with storytelling with the live action, 360-degree medium. Morris explains that they have made a conscious choice to be really aggressive when it comes to moving the camera within a 360-degree experience. I’m personally really sensitive to movement within VR, and I definitely experienced some motion sickness after watching their music video experience. But Morris says that he’s more concerned with trying to be innovative and push the boundaries of what type of storytelling experiences are possible despite the fact that it may not be comfortable for all people. That said, it’s worth paying attention to Specular Theory’s work since they’re doing some of the most innovative and interesting experiments with 360-degree video.

Jason Storey (aka apieceoffruit) is a VR developer, Left-Handed VR podcast co-host, and all-around VR enthusiast who has worked on a number of different VR experiences. He talks about working on the gaze-control mechanism in the cinematic VR experience Colosse, which won a Silver experience award in the Oculus Mobile Game Jam. He’s also working on Echo Red with Nick Pittom, an experience that uses the Oculus Touch and Vive motion controllers to make you feel like a superhero. Jason has worked on projects spanning the full spectrum from passive cinematic VR experiences to fully interactive motion controls, and so he’s got a lot of interesting perspectives and ideas about the strengths and weaknesses of VR as a medium for expression.

Tony Parisi has been working towards creating an open ecosystem for virtual reality since helping to co-create the Virtual Reality Modeling Language (VRML) in the early 90s. He recently took a position at WEVR, and while he’s not revealing too many details about what he’ll be working on there, it’s likely to involve building out components of the VR ecosystem using open web technologies like WebVR. In this interview, Tony talks about some of the disadvantages of VR experiences being created solely within walled-garden and proprietary contexts, and the advantages of moving towards making VR a first-class citizen on the open web.

I talked with Tony about the current VR distribution paradigm, which is a walled-garden approach where there are guardians of the different platform stores. He says that there are going to be multi-million-dollar VR games and a lot of indie developer experiences, but then there’s going to be a huge drop-off for everything else, which won’t have any distribution options and will likely disappear into obscurity. If content is too experimental or explicit, then there’s not really a good outlet for that type of long-tail material. At the moment it kind of just disappears. Or if there is awareness around the content outside of the official channels, then users have to resort to specific hacks and workarounds, such as the sideloading installation process for mobile VR, in order to get access to it.

Tony acknowledges that watching VR experiences or playing VR games via the web is not quite viable yet due to latency issues as well as the fact that web browsers are locked in at 60 frames per second. He says that one current blocker is that it’s a political decision within each of the browser companies to make the architectural changes required to optimize browsers for the delivery of real-time 3D and immersive virtual reality experiences. However, he points out that one huge step within the last year towards the goal of VR being delivered on the web is that all of the major PC and mobile web browsers implemented WebGL natively.
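
The 60 fps cap matters because a 90 Hz headset cannot evenly display 60 fps content. Here’s a small sketch (my own illustration of the frame-pacing mismatch, assuming no reprojection fills the gaps) of how many refreshes each browser frame would sit on screen:

```python
def repeat_pattern(content_fps: int, display_hz: int, n_frames: int = 6):
    """How many display refreshes each content frame stays on screen
    when the content renders slower than the headset refreshes."""
    pattern, err = [], 0.0
    step = display_hz / content_fps  # refreshes owed per content frame
    for _ in range(n_frames):
        err += step
        shown = int(err)   # whole refreshes this frame occupies
        err -= shown       # carry the fractional remainder forward
        pattern.append(shown)
    return pattern

print(repeat_pattern(60, 90))  # -> [1, 2, 1, 2, 1, 2]
```

The alternating one- and two-refresh hold times produce visible judder, which is one reason delivering VR from the browser needs architectural changes rather than just faster JavaScript.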

There’s still a long way to go with the open web and VR, but Tony is committed to helping the creation of VR experiences go beyond just the 1 million+ Unity and Unreal developers to the more than 10 million web developers.

Tony sees that a lot of the consumer push for VR is around gaming, but he thinks that there are ultimately going to be other, more compelling applications of VR. WEVR is obviously focused very heavily on the production of cinematic VR experiences, but Tony is also personally excited about creation tools like the Tilt Brush 3D painting application and the Oculus Medium sculpting tool.

Dora Cheng is one of the co-founders of uForis VR, a VR company focused on creating pragmatic applications of virtual reality. One of their first big VR projects was to create virtual tours for Domus Student Housing. They had to figure out a production pipeline that could operate at scale in order to capture thousands of photo spheres of apartment units within a 3-month period. Dora talks about all the production and post-production challenges they had to solve, including processing, stitching, color correcting, and creating a VR application wrapper that’s viewable within the Gear VR.

Domus Student Housing had a high turnover rate, with a lot of students changing apartments at the end of the year, and they saw that virtual reality could help students be more efficient in their apartment hunting. Instead of seeing dozens of units in real life, prospective tenants can go into an office and very quickly view a dozen units to get a sense of the look and feel of the apartments. From there, they can narrow down the options and then go see only a handful of their favorites in person.

Dora breaks down some of the challenges that uForis VR had to overcome in order to both capture and process thousands of photo spheres over many months. They’re looking to expand to other real estate applications, but also expand into other pragmatic applications of VR.

I had a chance to catch up with E McNeill at Oculus Connect about his design process for Tactera, the real-time strategy game he developed for the Oculus Mobile Game Jam. He talks about the design constraints of having only a single touch input, as well as how he simplified and streamlined the gameplay mechanics to be better suited for virtual reality.

One of the strategies E is taking is to focus on creating a strong AI component so that the game holds up as a single-player experience. He’s also adding multiplayer capabilities that will bring another dimension of strategy, but he wants to be sure that the core gameplay mechanics hold up in single player first.

E talks about some of the differences he expects in how the game will be played on the Gear VR vs. the Oculus Rift. The biggest difference is that positional tracking on the Rift makes it a little easier to find more interesting perspectives. E also says that it’s just really cool to put your head right into the action, and that it’s even cooler when you can hear the 3D positional audio effects kick in.

E has also developed Darknet for the Gear VR, and was told by Oculus that it has one of the highest average play times of any of the games released so far. He says that it doesn’t feel fully launched yet, given that there hasn’t been an official Gear VR launch. Between launching both Tactera and Darknet on both the Gear VR and the Oculus Rift, he’s going to be kept plenty busy between now and their official launches this Fall and next Spring.

Blair Renaud is the creator of Technolust, and he talks about his strategy for creating different experiences out of the Technolust universe for the Vive, PlayStation VR, Oculus Rift, and Gear VR. Technolust has been one of the early VR adventure games, and so Blair has been on the forefront of trying to find a VR locomotion solution that is comfortable for the majority of users. He explains the flexible dungeon-crawler locomotion scheme that they’re calling “Cloudstep,” as well as some of their strategies for overcoming the uncanny valley with realistic social behaviors.

In this video, Mark Schramm describes their solution for VR locomotion, named “Cloudstep.” They took inspiration from a combination of comfort-mode yaw turning and the step locomotion found in dungeon crawlers like Crystal Rift. Cloudstep’s innovation is that you have more freedom to explore, because you can move in four directions relative to your head orientation as opposed to the four fixed cardinal directions.

Mark also describes how they use four colliders oriented around the head position to calculate whether additional processing is required for moving up or down, or whether movement should be restricted to prevent walking through walls. Cloudstep eliminates vection, which is suspected to cause simulator sickness, and therefore provides a more comfortable locomotion experience for people who get sick easily while moving around in VR.
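
The head-relative part of the scheme can be sketched in a few lines. This is my own illustration of the idea, not code from Technolust; the function names and the Unity-style axis convention are assumptions:

```python
import math

def step_vector(head_yaw_deg: float, input_dir: str, step_len: float = 1.0):
    """World-space (x, z) step for a four-way input, taken relative to
    the current head yaw rather than to fixed world cardinal directions."""
    offsets = {"forward": 0.0, "right": 90.0, "back": 180.0, "left": 270.0}
    yaw = math.radians(head_yaw_deg + offsets[input_dir])
    # Yaw 0 faces +z; positive yaw turns toward +x (Unity-style convention).
    return (step_len * math.sin(yaw), step_len * math.cos(yaw))

demo = step_vector(90.0, "forward")  # head turned right: step along +x
```

Because the four step directions rotate with the head, a player looking off-axis still steps the way they’re facing instead of along a world-aligned grid, which is the extra freedom described above.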

Here’s a video of Cloudstep in action with footage from Technolust:

Blair has also been involved in helping to start up and advise a photogrammetry studio called Quantum Capture, where they’re able to produce super-high-resolution avatars. One potential issue with high-fidelity avatars is that they could easily fall into the uncanny valley, and so Blair told me about some of the things they’re doing to give these NPCs more lifelike behaviors. They’ve been collaborating with Coffee Without Words developer Tore Knabe, who has been mining psychology research for different non-verbal behaviors that can be coded into Unity scripts and attached to the NPCs.

A couple of the research papers Tore has used include “Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention” and “Eyes Alive.” He’s written more about how he coded these behaviors into Unity in this blog post. Having these types of behavioral cues can help make characters seem less creepy and help climb out of the uncanny valley. Blair is very confident that their approach is going to make some advances in creating a coherent feeling of presence with these characters.

Finally, Blair talks about winning the Best Art Direction and Best Sound Design honors at the Proto Awards, and his approach of adapting Unity Asset Store assets to create a cyberpunk world with a consistent, polished look and feel.

Envelop VR is a productivity tool that provides a virtual shell allowing users to run all of their existing 2D applications within the context of virtual reality. I had a chance to talk to founder Bob Berry at the SEA-VR Expo, which he also founded last year. Envelop VR announced last week that they raised a $4 million round of funding from Madrona Venture Group. Bob talks about their strategy to create a generalized solution for working all day within VR, and some of the challenges they had to solve along the way.

One big issue with a virtual desktop solution is being able to see your hands on the keyboard, especially for people who are not touch typists. Envelop VR points a webcam down at your hands, and then renders a virtual representation of your hands in VR laid over the keyboard.

Some of the other issues that we discuss include how the screendoor effect is negligible on the CV1 and Vive, the possibility for immersive visualizations, being able to increase productivity by having as many monitors and screen real estate as you want in VR, and the exciting potential of being able to code VR experiences while being within VR.

Bob believes that immersive computing is going to provide a complete paradigm shift in how we interact with computers and he hopes that Envelop VR can provide the tools to be able to help make that happen.

I talk with Ikrima Elhassan of Kite & Lightning about their new tabletop VR game, Bebylon Battle Royale, which is a “Vehicular Melee Party Brawler.” Kite & Lightning has made a name for itself by creating high-end cinematic VR experiences such as Senza Peso as well as the Insurgent movie VR experience. Bebylon is K&L’s first foray into developing a comedy game, and so we talk about their game development philosophy as well as the challenges of creating innovative gameplay with support for both gamepads and touch controllers.

Ikrima says that VR game developers can choose two out of three of the following: innovative VR gameplay, support for gamepads, or support for touch controllers. Choosing to support both gamepads and touch controllers forces them toward lowest-common-denominator gameplay that both controllers can support. They can do innovative gameplay design with either the gamepad or the touch controller, but not with both. Because they’re planning on creating a launch title for the Rift, they’re choosing to support gamepads and not commit to touch support until they’re further into the development cycle.

Ikrima also talks about the choice to go with a miniaturized tabletop aesthetic in order to keep all of the action within the nearfield sweet spot of VR, which maximizes the parallax effects and magic of VR. The other competing player will be represented in a 1:1 depiction, enabling players to express their creativity through a number of taunts, humiliation animations, and overall boasting. Ikrima says that a smoke bomb released on the race track could also temporarily block your first-person perspective of what your avatar can see. The miniaturized VR action will also lend itself well to spectators watching the action from within VR.
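
Why the nearfield matters can be quantified with the convergence angle the two eyes subtend at a given distance, a rough proxy for how strong the stereo depth cue is. This is my own arithmetic, not from the interview, and the interpupillary distance is an assumed average:

```python
import math

IPD_M = 0.064  # assumed average interpupillary distance, ~64 mm

def disparity_deg(distance_m: float, ipd_m: float = IPD_M) -> float:
    """Binocular convergence angle subtended by a point at distance_m."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

near = disparity_deg(0.5)   # tabletop range, roughly 7.3 degrees
far  = disparity_deg(10.0)  # room-scale backdrop, under half a degree
print(f"near {near:.2f} deg vs far {far:.2f} deg")
```

A tabletop scene at half a meter gives roughly twenty times the stereo disparity of a backdrop ten meters away, which is why the miniaturized action makes the parallax effects feel so pronounced.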

The Kite & Lightning team has relocated to Paris in order to focus on prototyping and developing the game. They’re still early in the development process, and so they don’t have any gameplay footage or trailers to show just yet beyond a piece of concept art showing how the immortal “beby” characters are trapped within the bodies of 2-year-old babies. Ikrima says that the surreal and humorous nature of these “beby” characters helps to defy your expectations and overcome the uncanny valley with their stylized cinematic-reality art direction. Ikrima describes the game as a combination of Mario Kart’s party mode and Super Smash Bros, and they are targeting a launch within a month or two after the consumer release of the Oculus Rift.
