Valve’s Chet Faliszek has been a key virtual reality evangelist and developer relations liaison over the past couple of years for the HTC Vive, and I had a chance to sit down with him at GDC as they were showing off many of their launch titles. Chet talks about what convinced him to move his desk and start working on VR, the emerging new genres of VR games, launch title highlights, some of his personal favorite VR experiences, and stories of people discovering room-scale VR for the first time. There’s been a lot of growth and maturity in developing room-scale VR experiences over the past year, and Chet is continually humbled and surprised by what developers come up with. With over 50 launch titles for the HTC Vive today, there’s a wide range of different types of experiences to choose from.
LISTEN TO THE VOICES OF VR PODCAST
Here are some of the launch titles that Chet mentions:
HoloLens development kits started shipping last week following Microsoft’s Build conference and there were also a number of NDAs that were lifted, which allowed some companies to start talking about their HoloLens projects for the first time. One of those companies was Portland’s Object Theory, which was founded in July 2015 after HoloLens engineer Michael Hoffman left Microsoft Studios to co-found Object Theory with serial entrepreneur Raven Zachary, who sold his previous iPhone app development company to Walmart Labs.
I had a chance to do 90 minutes worth of HoloLens demos on Friday, and then talk with Raven about their mixed reality collaboration service and early HoloLens client work with CDM Smith. We talked about AR vs VR, designing around HoloLens’ relatively small field of view, and why he decided to exclusively focus on developing enterprise applications for the HoloLens.
Virtual Reality has the power to transport you to anywhere in the world, and NextVR has been one of the leaders of giving you a front row seat to live sporting events including hockey, boxing, NASCAR, basketball, soccer, and golf. NextVR recently signed a five-year deal with FOX Sports, who have the rights to broadcast the NFL, Major League Baseball, and the U.S. Open Golf Championship.
I caught up with NextVR’s VR evangelist Helen Situ at GDC to talk about the highlights of the past year of their livestream VR broadcasts, and where they’re going next. They’ve been experimenting with a lot of different camera positions, broadcasting real-time binaural audio, augmenting the experience with graphical overlays, and exploring how to give the viewer more agency in choosing the different perspectives that are available.
LISTEN TO THE VOICES OF VR PODCAST
I think that live streaming is going to be one of the more compelling use cases for VR that will help bring it to the masses, and one indication of this is that more broadcasters are starting to stream live sporting events. It was announced this week that the NCAA Final Four and championship games will be livestreamed to the Gear VR by NCAA March Madness Live, which is managed by Turner Sports, partnered with CBS and Oculus, and has sponsorship from Capital One. Here are more instructions for installing the Gear VR app. These semi-final VR broadcasts happen on Saturday, April 2nd, and the championship is Monday, April 4th.
NextVR has been a pioneer in VR livestreaming, and they expect that the content offerings will be compelling enough that sports fans will be willing to pay to have a front-row seat within virtual reality. David Nathanson is the head of business operations for FOX Sports, and he was quoted as saying, “For sponsors there will be naturally an opportunity to create immersive opportunities whether it’s pre-roll video, banners, VR commercial units, or integrating brands into the experiences we create. It’s uncharted territory.”
Timoni West is a principal designer at Unity Labs, and she’s been working on tools within Unity for creating VR within VR. I first talked with Timoni in October about these content creation tools, which were revealed to the VR community for the first time at Unity’s VR/AR Vision Summit in February. I had a chance to catch up with her at the Vision Summit to talk about creating VR in VR for both developers and non-developers.
LISTEN TO THE VOICES OF VR PODCAST
One of the really interesting aspects of this discussion was Timoni’s discovery that there are a number of different perceptual illusions that become really evident when trying to work while immersed within a 3D medium. She says that bats process information in 3D, but that humans really process information in 2D, and that a lot of our cognitive mapping of 3D spaces is extrapolated from contextual cues within the environment.
Timoni ended up doing a deep dive into cognitive science research to learn more about some of these perceptual illusions that were revealed to her in the process of creating VR in VR, which she presented in her Vision Summit talk about the “Cognitive Implications of Widespread VR.” She’s trying to understand our perceptual limitations in order to design around them, but also to see if we might perhaps be able to evolve and overcome some of them. Her research hasn’t yielded any clear answers yet, but she’s at least starting a dialogue and getting feedback from the wider VR and research communities.
You can watch Timoni’s presentation of Unity’s VR scene editing tools at the VR/AR Vision Summit here:
I got my first glimpse of what advertising within a VR experience might look like when I saw a demo of some ads playing on the Immersv platform. I was within a movie theater similar to Oculus Cinema, watching a 30-second video of a VR game being advertised on a 2D screen. It’s still early days for advertising in VR, but Immersv co-founder & CEO Mihir Shah expects that VR has the potential to become a content marketing platform on steroids once it becomes easier to produce 3D spherical videos of VR gameplay, movies, and tourist destinations.
With the amount of immersion and engagement that virtual reality provides, Immersv expects that VR advertising will eventually draw some of the highest CPM rates compared to other mediums. I had a chance to talk to Mihir at GDC about their initial per-view ad rates, how much VR developers are getting per view, their plans for growing the number of VR developers using their platform, and how they’re initially targeting casual VR experiences.
Part of the challenge of creating a first-person perspective game in VR is that there are limited locomotion options for moving around in a way that’s comfortable for most people. Damaged Core is an Oculus Rift launch title that uses a unique and very effective locomotion method where you move between different first-person robots as well as third-person security cameras. This ends up being a very comfortable way to provide a lot more agency in movement than most other survival wave VR shooters, and it also has a lot of elegance in that it’d actually be possible in a world of AI-driven robot assassins. You can’t hack into every robot enemy, and so you have to strategically move around the battlefield in the right order.
I caught up with High Voltage Software’s creative director Eric Nofsinger at the Oculus Game Day event during GDC to learn more about their R&D into VR locomotion, FPS game design strategies, and how they were able to make a scoping feature work despite it being against what Oculus recommends as a best practice.
When most people think about the types of things that they want to do in virtual reality, they almost always think of experiences from the first-person perspective. But Lucky’s Tale proved to me that there are going to be a whole range of experiences that people don’t know they want until they actually have them in VR. It also proved to a lot of VR developers that not only could a third-person perspective work, but that it could work so well as to cause a fit of VR giggles for how surprisingly compelling and delightful it could be.
I talked with Playful Corp’s Paul Bettner in January about the decision to bundle Lucky’s Tale with every Oculus Rift, and I also had a chance to catch up with him again a few weeks ago at GDC for some pre-launch thoughts as he was showing off the final build of Lucky’s Tale.
LISTEN TO THE VOICES OF VR PODCAST
Having the camera in the third-person perspective with finely tuned movement algorithms solves a lot of the problems of VR locomotion, but it also creates a very stimulating experience for your brain. Lucky’s Tale places all of the action within the near-field sweet spot of VR, which feels like it’s about an arm’s length away. This allows you to really see a lot of the stereoscopic effects that are the strength of VR, and it’s so integrated within the gameplay of Lucky’s Tale that the game really wouldn’t work as well if you tried to play it in 2D.
I’ve been a huge fan of Lucky’s Tale since first playing it at GDC 2015 because it gave me that sense of nostalgic awe and wonder that I remember feeling from playing through Super Mario Bros as a kid. I had a chance to play a demo build back in September, and I ended up playing a single level over and over again for four hours, inventing my own achievements. The final build of Lucky’s Tale will have at least a couple of different modes for playing each level, and Playful Corp wanted to provide different objectives to encourage players to revisit and continue to explore each level, whether for hidden coins or to find the fastest path to the end.
Lucky’s Tale is rated Moderate for comfort and is included as a bundled launch title for the Oculus Rift, which officially launches today.
Slightly Mad Studios has been making racing games for the past 10 years, and they raised over $3 million in crowdfunding to produce their own racing game called Project CARS. It’s a AAA racing simulation game that was first released in May 2015, and they recently added VR support to become one of the 30 Oculus Rift launch titles. I had a chance to catch up with creative director Andy Tudor and game director Stephen Viljoen at the Oculus Game Days to learn more about the extent to which they’ve modeled the cars, tracks, physics, and dynamic weather systems in their goal of creating the most authentic racing simulation.
LISTEN TO THE VOICES OF VR PODCAST
The VR support allows you to be fully immersed within a variety of different types of cars, including Indy cars, open-wheel cars, track day cars, road cars, and karts. With VR you have full situational awareness in that you can see cars in your mirrors, you really get to feel the size and perspective of your car, and you can take in the full ambiance of each of the 35 unique locations and 110 different track layouts.
Stephen and Andy emphasized the lengths they’ve gone to in order to model how the grip of the tires changes as they wear down, incorporate realistic sound field recordings, recreate actual tracks, build physics simulations that take into account the atmospheric temperature and weight of the cars, and even factor in a fully dynamic weather system and how the light changes throughout the day.
They claim that their simulation is accurate enough that there are actual race car drivers who use Project CARS to practice on different tracks, and here’s a video of Carmen Jorda using Project CARS in a racing simulator using three different 4K monitors.
Project CARS has a comfort rating of Intense, and I did experience some motion sickness during the brief time that I played the game. I am really sensitive to simulator sickness, and there are some things in Project CARS that trigger it for me, such as the tilting horizon line on a banked curve, suddenly stopping when crashing, or going up and down hills. All of these produce a disconnect between my visual and vestibular systems and start to make me feel a little nauseous.
They do use a cockpit which helps to reduce vection, but the dense textures on the tracks still produce enough optical flow to potentially be another trigger for some people. This would be a difficult VR experience for me personally to play for an extended length of time, and so just be aware if you know that you’re susceptible to simulator sickness from VR locomotion. But if you’re not, then this is bound to be an intense racing experience — especially with the multiplayer mode.
There are also a lot of motion platform integrations available for Project CARS, and so I imagine that incorporating more 4D haptic feedback and movement could actually make this an even more immersive and potentially more comfortable experience. I expect to see a lot of digital out-of-home entertainment arcades featuring this game with a steering wheel, pedals, and a fully integrated motion platform.
Project CARS is being released on March 28th, and sells for $49.99.
James Green is the creative director at Carbon Games, and he was inspired to create AirMech: Command from playing a 1989 game called Herzog Zwei, which was one of the first real-time strategy games that also used a gamepad as the primary control scheme. It’s usually recommended to design a game from the ground-up with VR in mind, but the VR port of AirMech’s tabletop aesthetic works really well in VR and gives you the feeling of being able to watch your childhood war toy battles come to life. I had a chance to catch up with James at the Oculus Game Days event at GDC where he talked about his inspirations and VR design process.
LISTEN TO THE VOICES OF VR PODCAST
AirMech: Command allows you to either focus on strategy by hanging back to produce units and direct the battle from your base, or you can take a more active approach by fighting on the front-lines or moving units around. There are multiplayer modes available where you can play against other people, or you can do a co-op mode where one player focuses on offensive strategy while the other focuses on defensive battle. There will also be spectator modes available for people to watch and learn from other expert players.
James says that playing against the AI can help you get to a certain level of understanding the basic mechanics and strategy of the game, but that it gets really interesting when you start to play against other humans who are a lot more unpredictable.
Because most RTS games are played with a mouse and keyboard, there were a lot of controls that needed to be translated to the gamepad. There’s an extensive tutorial at the beginning to teach you all of the controls. While this shouldn’t be too much of a barrier for most experienced gamers, I wouldn’t expect this to be a good first-time VR experience for non-gamers.
AirMech: Command is launching on March 28th for $39.99, and is rated Comfortable.
Jeep Barnett is a programmer at Valve Corporation who has been on the VR team for the last couple of years. He was at GDC showing off The Lab, a series of mini-games demonstrating different design principles within VR. I had a chance to catch up with Jeep after trying out four of their Lab experiments in order to learn more about how they came about, as well as some of the VR principles that each one was trying to demonstrate. There are a lot more experiments left to be discovered when the Vive launches in the next couple of weeks, and these mini-games and experiences definitely have a lot of replay value and will make quite the impression as many people’s first-time VR experience.