#101: Denny Unger on the HTC Vive & designing walkable, two-handed VR experiences


Denny Unger is the President and Creative Director at Cloudhead Games. He talks about seeing the HTC Vive prototype for the first time alongside other developers, and about what 360-degree, full-room walkable tracking and two-handed interaction mean for VR game design on The Gallery: Six Elements.

Denny & Joel from Cloudhead Games, along with Alex from Owlchemy Labs, talk about the following:

  • How it’d change the gameplay for The Gallery. They always wanted players to use their hands and spin around 360 degrees, but the technology wasn’t there yet. The HTC Vive matched their vision and gave them the freedom to design the VR experience as they’d always intended. They started by designing for a seated experience, then moved to standing, and are now designing for a full-room, 360-degree turning experience; they have to account for each of these scenarios.
  • VR locomotion is still an open problem, because you don’t want to simply warp around. They’re trying to build systems that also let you move the tracked volume with a joystick.
  • Some of the other experiences chosen by Valve. Cloudhead Games had a vision of where VR could go with interaction in a 3D space in an adventure-game context.
  • The first time that he’s experienced presence for an extended period of time. Joel from Cloudhead Games talks about his own experience of presence in VR.
  • Simulating jobs in Job Simulator by Owlchemy Labs is about two-handed interactions in VR. They watched what people actually did. People expect natural interactions within VR, and if you account for that, it’s delightful, like throwing something at a robot and getting a reaction that was accounted for.
  • New user interactions: building interactions into the VR world that parallel what people want to do in real life, and gameplay that requires two hands. People tend to use just one hand at first; it’s hard to grasp that you can use your whole body. Helping people realize that they finally have the freedom to use both hands in VR, and that leaning and crouching are now natural body movements.
  • It’s going to apply to so many experiences; devs will enable what you’ve always wanted to do in VR.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

I'm Denny Unger, the President and Creative Director at Cloudhead Games. We've obviously just been working with Valve on the HTC Vive VR headset and coming to terms with this great new technology.

And so yeah, maybe describe to me what it was like to go into this secret meeting and see this prototype HMD for the first time.

It was a surreal experience for sure, and I don't think any of us expected to be there. But Valve was really great about observing the VR landscape, seeing who was ahead in different areas, and then pulling together the developers that they thought would match their vision for the technology. So it was a huge honor, honestly, just sitting there listening with all these other developers, having them roll out their plans and seeing their vision for how you could do full positional tracking in a room, at room scale. It was just exciting.

And so I imagine that with the game you're doing, The Gallery: Six Elements, how did that change the way that you're designing something that was already on a pretty strong trajectory? Did you have to re-architect anything, or what was the approach to actually integrating this new technology?

When we started The Gallery, we had this kind of crazy, high-level vision of what VR could and should be. It was always about using your body, interacting with your hands, being able to spin around 360, and the technology simply wasn't there. I mean, Oculus had some bits and pieces in play, and we had the Hydras to work with, but all of it was a bit substandard, right? It didn't quite take us across the threshold. So when the HTC Vive project with Valve was brought to our attention, it just matched that bigger vision for what we could do. It's something we've been working towards in a crazy, hoping-it-would-be-there kind of way.
And ultimately, it's just given us freedom. The freedom to design towards that vision. And this is actually kind of the best retroactive design process we've had, because we had to start with a very limited sit-down experience, then moved towards a stand-up experience, and now we're moving to a full room-scale, 360-degree experience. So it put us in a unique position to understand what we have to account for in all of those scenarios.

Yeah, because I know that you were one of the innovators of the VR comfort mode, where whenever you rotate, you do a jump of 10 to 15 degrees so you don't get motion sickness. VR locomotion seems to still be somewhat of an open problem, even if you're in a limited-size room. So in an adventure game, how are you letting people walk around and still move through the larger VR space?

Yeah, we've talked about it a lot, especially in the last few months. Unless you're going to restrict all of your games to a single space that you can't move beyond, that's not really a viable solution. The benefit of going through this research process for the last three years is that we understand how we're going to approach that problem. Valve has a number of ideas about moving beyond the volume, like shuffling the volume or warping to different locations. For an adventure game, you don't really want to warp, because what we found is that if you're just warping around, you get really lazy and you don't actually end up physically moving; you go back to that standard cheating-with-the-joystick kind of thing. So we're thinking about a number of different solutions, but one of them is basically no different than if you're sitting.
If you're sitting, you're using the left stick and pushing your character through an environment, right? So we're trying to build systems that might account for it that way. You can either walk around the space that you've set up as your safe zone, or you can use the stick to actually locomote the volume and push forward into the environment. So you're kind of bringing the environment to you, and then you stop, and then you walk around.

And so it sounds like it was a very collaborative environment with all these other development shops that were also getting this technology working with Lighthouse and the new Vive headset. You said that each of the development shops had what Valve saw as a specialty. What was Cloudhead Games' specialty, and what were some of the other specialties that you saw emerge from the other shops?

Well, I think the smart thing that Valve did in bringing that group of people together is that they really scrutinized the different types of experience. Tilt Brush is a painting program in 3D, you know, so you're moving around, painting in a space. Surgeon Simulator kind of has its shtick with weird, wacky controls, and you're operating, and that works really well for that. I guess what Cloudhead brought to it was our higher-level vision of where VR could potentially go with great hand interactions and object interactions, trying to take away as many barriers to entry as possible when interacting in a virtual 3D space. So I think they just tuned in on the things that we had been working on before. We'd been showing the beach demo at previous conventions, which was a very atmospheric, very immersive thing involving your hands and, you know, puzzle solving and that kind of stuff. They tuned in on that and brought us in because of it, I think.
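The "comfort mode" turning mentioned in this exchange, rotating the camera in discrete 10-to-15-degree jumps instead of smoothly, boils down to a few lines of input handling. The sketch below is a hypothetical illustration, not Cloudhead's actual implementation; the class name, method names, and the 15-degree step are all assumptions.

```python
# Illustrative sketch of VR "comfort mode" snap turning: instead of
# smoothly rotating the camera (which tends to cause motion sickness),
# each flick of the stick applies one discrete yaw jump.

SNAP_DEGREES = 15.0   # a typical comfort-mode step (10-15 degrees)
DEADZONE = 0.5        # stick must pass this threshold to trigger a snap

class ComfortTurner:
    def __init__(self, snap_degrees=SNAP_DEGREES):
        self.snap = snap_degrees
        self.yaw = 0.0             # current camera yaw in degrees
        self.stick_centered = True

    def update(self, stick_x):
        """Apply one snap per stick flick, then require re-centering."""
        if abs(stick_x) < DEADZONE:
            self.stick_centered = True       # stick returned to center
        elif self.stick_centered:
            self.stick_centered = False      # consume this flick
            direction = 1.0 if stick_x > 0 else -1.0
            self.yaw = (self.yaw + direction * self.snap) % 360.0
        return self.yaw

turner = ComfortTurner()
turner.update(1.0)   # flick right: yaw jumps by 15 degrees
turner.update(1.0)   # stick still held: no additional turn
turner.update(0.0)   # re-center the stick
turner.update(-1.0)  # flick left: yaw jumps back
```

Requiring the stick to return to center between snaps is the key detail: a held stick cannot spin the camera continuously, which is exactly the smooth vection the technique is meant to avoid.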
When I hear people like Sebastian Kuntz or Mel Slater talk about presence, they say there are two key components: the place illusion, that you're actually in another place, and the plausibility illusion, that you can actually interact, have agency, and impact the world that you're in. And it seems like, with this new Vive demo, it's really combining those two into creating this deeper sense of presence. So I'm curious, from developing and then experiencing all these other demos from the other shops, what your own personal sense of presence was.

Presence is a term that gets thrown around a lot, and really, with this system, it's the first time I've felt present for a consistent amount of time. I could spend an hour in any one of these experiences and still feel present. With every other iteration of VR tech up to this point, you got these glimpses of presence, but it wouldn't stick, whereas with this system you're really there, because it's tracking you so precisely, and you have such freedom to move around and interact with the worlds, that you're completely present. Joel has a story, actually; maybe he could talk about that. We know inside out this level that we developed as part of our demo for the Valve demo loop, but even though we know every in and out of how it's designed and what to look out for, there are things that still surprise us, because your brain is completely fooled into being in this place. So, I don't know, maybe Joel could talk more about that.

It's just a simple little story about finishing the last few bugs that we had on the level. This level we built from scratch; we've seen it through all the iterations, from super rough to the more final, polished version.
And we had this performance issue looking towards the room from a certain angle, and you had to be backed right up against the wall. It happens that on that wall there's actually a drop-off into a deep cavern. I was backing up into that wall to test our performance issue there, and I totally had that sense that I was going to fall. I actually braced my whole body and scared myself, even after building it from scratch for months and being in it hundreds and hundreds of times. Still, you know, my brain thought I was going to fall off the elevator that I know is not a real elevator.

Yeah, one of the demos included in there is Owlchemy Labs' Job Simulator, which, when you think about it, you think, well, why would you ever want to simulate a job in VR? But maybe you could talk about your experience of being able to actually do things, manipulate objects, build recipes, and do jobs simulated in VR.

Yeah, it's, yeah. Hey, Alex.

[00:07:50.408] Denny Unger: The future is spicy. Someone was telling me before that in their demo loop they had a demo with two-handed interaction with things. Both of our games obviously have great interaction: you can touch everything, you can feel everything. And in the demo they had after it, there was no hand interaction, and there was something in front of them that they wanted to touch. They were very upset that they couldn't play with everything around them. I feel like once you get your hands onto something, you can't go back, and you can never do a non-interactive demo after that. And so, specifically about the Sriracha stuff, basically we would watch people and see what they did. Like, someone threw a plate on the ground, and it didn't break. That was actually my wife. She threw a plate on the ground, and then she was like, this is garbage. So that was the next task on my list: make the plate break. And then the same thing with the salt. We saw, you know, all the natural affordances of the items in the world; people would do exactly what you'd expect them to do, and when it didn't actually happen, that's where that little frown on their face came about. And then the frowns basically became task-list items.

[00:08:56.242] Kent Bye: With every action that you do in real life, because you're so present in the experience, you expect it to behave the way normal reality does. So you have to account for all of that stuff; otherwise you break presence. That's really what it comes down to. And we found, too, that when things did work the way people hoped they would, it was delightful for them, right? When they threw something at something and it actually reacted the way they wanted, they were like, wow, that actually worked, you guys accounted for that. So when you go through and put all those little details in and try to simulate how the world should really react, sometimes over the top of what it actually would do, it brings so many little moments of joy into the experience. It's awesome.

[00:09:36.409] Denny Unger: Throwing something at the robot, which we added at the last minute, multiple people called out in reviews, saying, you know, I threw an item at the robot because, why not? And it had a little quip: it promptly greeted me and said, you know, thank you, sir, may I have another? And we just threw that in there. I think our VO guy was doing the lines, and I came up with this little idea right at the last moment. I sent an addendum email: can you do one more line? Just say this. And I thought no one would do it. But it's those little things, I guess.

[00:10:04.473] Kent Bye: And so, just thinking of the user interactions now that you have two hands in the game, what type of new things can you do that you couldn't do before?

The whole thing's emergent, but it's emergent in a real way. You just start taking real-world parallels and try to build them into your game. Before, especially for a smaller team, you couldn't necessarily afford to put that into production, because you had to account for animations and then a whole system to animate all that stuff. Whereas once you have a good handle on motion control, you can just build it into your world and do all kinds of crazy, fun things that you would do in normal reality.

So what type of things, I guess? When you have an Xbox controller, you can basically just select, but do you have things that actually require both hands, to, like, put something together? I'm just curious about some of the gameplay mechanics you're able to do with these controllers that you couldn't do before.

In this specific demo, one of the things we do is we have these lockboxes on a wall that you can interact with, and there are these two sort of radial controls on the side. You twist them, and you have to reach in with your left hand to grab the item while you hold the other one open, because you can't access that item unless you twist it open. There are just really weird little things you can do.

[00:11:16.181] Denny Unger: I remember the first time that I played the demo and I was twisting it with one hand and I went to reach with the same hand and it closed. And then I was like, oh, okay, hold on. Maybe I messed something up. And then I did it again and then it closed. And I actually, out loud, I was like, damn it, Denny. Because it dawned on me that you wanted me to do the two-handed thing. And it was so genius.

[00:11:33.787] Kent Bye: When I thought of it, I was like, I'm going to take credit for that one.

[00:11:37.548] Denny Unger: But I just like saying, damn it, Denny. So that's kind of...

[00:11:41.487] Kent Bye: Exactly what we were hoping for, because we noticed, I don't know if you did, but we noticed that people play with one hand. For some reason, especially at the beginning, when it's their first time in the system, they tend to just use one hand. They're so used to doing that, and they haven't figured out that, no, this is your whole body you can use. So we tried to put things into the demo that forced exactly that: those lockboxes that force you to use your other hand. And all of a sudden, as they realize through that interaction that they can use both hands, they start grabbing items with both hands and throwing them between their hands and all kinds of stuff.
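The two-handed lockbox gate described in this exchange, where one hand must hold a dial twisted open while the other hand reaches in, reduces to a small piece of interaction state. Here is a minimal sketch; the class and method names are invented for illustration, and a real engine would drive this from tracked controller poses rather than string hand labels.

```python
# Illustrative sketch of a two-handed lockbox gate: the hatch stays open
# only while one hand keeps the dial twisted, so the item inside can be
# grabbed only with the *other* hand.

class LockBox:
    def __init__(self):
        self.dial_held_by = None   # which hand is twisting the dial open
        self.item_present = True

    def twist_dial(self, hand):
        self.dial_held_by = hand   # hatch is open while this hand holds on

    def release_dial(self):
        self.dial_held_by = None   # hatch snaps shut

    def grab_item(self, hand):
        """Grabbing only works with the hand NOT holding the dial."""
        if self.dial_held_by is None:
            return None            # hatch is closed
        if hand == self.dial_held_by:
            # Reaching with the twisting hand lets go of the dial,
            # so the hatch closes and the grab fails.
            self.release_dial()
            return None
        if self.item_present:
            self.item_present = False
            return "item"
        return None

box = LockBox()
box.twist_dial("right")
box.grab_item("right")   # same hand: hatch closes, nothing grabbed
box.twist_dial("right")
box.grab_item("left")    # other hand: the item comes out
```

The failure case, reaching in with the same hand that holds the dial, is the "damn it, Denny" moment from the anecdote: the mechanic itself teaches players that both hands are available.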

[00:12:15.511] Denny Unger: The first recipe says to put in two tomatoes and two mushrooms. One of the things we wanted was for people to realize there's a fridge and you have to get items from it, so we didn't put enough tomatoes on the counter; you have to actually find the other one in the fridge. But we put two mushrooms right next to each other, and you have to put two in. You see the two, you see the pot. And most people actually use two hands and throw both in at the same time. So that actually helped start getting people to use the second hand. I guess the baked-down version would be if there were just two bottles on the table and you had to move them both; I would guess a large percentage would try to do them both at once, or maybe they'd do one at a time and then go, oh, maybe I should use my other hand. So yeah, I guess we kind of have to teach people to do things they normally do in real life, while they start to realize, hey, this is actually like real life.

[00:13:02.694] Kent Bye: You finally have that real freedom to use both hands, which seems obvious to us, but... And once people know that and play our demo, it's hilarious: they'll walk over to the shelf that has all the bottles on it. Whenever I go in, I just start grabbing them with both hands and throwing them over my shoulders, because, why not? You might as well; you have all that freedom.

[00:13:23.429] Denny Unger: My favorite question someone asked me came a long time ago, in this little tech demo I was showing them. I think it was a DK2, so you had positional tracking and you're hiding behind a wall, and I said something like, oh, you lean out and you shoot, or whatever it was. And they go, what do I hit to lean? And I'm like, you do it. You just do it.

[00:13:44.813] Kent Bye: One of our devs was looking for the crouch button on a controller, I remember, because we have a crouch button from our older demos. The first time we got this system set up, I remember he dropped an item on the ground, and then I saw him feeling around for the crouch button. And then he was like, oh. And he just bent down and picked it up. And he was like, oh yeah, we don't need crouch buttons anymore.

So what do you hope this new technology is going to enable, having a fully tracked body in the room and the hands? What type of experiences do you want to have, and what do you see it enabling?

Man, that's a broad question, but I just think it's going to apply to so many different experiences that we can't even really sit here and rattle them off for you. Now that it's all about accessibility, and now that the system as it is is so easy to use, devs are just going to go crazy with it. They're going to create all kinds of amazing stuff. This is all about enabling us to do what we've always wanted to do in VR, and now the system is actually here to accommodate that. Basically, your high-level holodeck vision of what VR could be, it's there, everybody can use it, and it's going to be incredible.

Awesome. Well, thank you.

You're welcome.
