#323: GDC Round Table: Valve’s ‘The Lab’ & Survival Wave Shooters as VR’s FPS

Road to VR co-founder Ben Lang and writer Scott Hayden join me in our third and final GDC round table discussion, talking about our demo highlights, including Valve's The Lab and a series of different shooters. There's a theme emerging in VR gameplay where you have to survive waves of enemies coming at you, sometimes with the ability to teleport to different areas, which gives you some feeling of agency around locomotion. These demos include: Arizona Sunshine, Raw Data, John Wick: The Impossible Task, Bullet Train, Damaged Core, & Space Pirate Trainer. We also talk about the Basemark Cinematic VR benchmark demo from Crytek that blew Ben's mind, The Climb, Manus VR gloves, the OptiTrack demo, and the beautiful La Péri from Innerspace VR. We also give a sneak peek at the Unreal Engine VR Editor, Vanishing Realms, Budget Cuts, Audioshield, & Unseen Diplomacy.

LISTEN TO THE VOICES OF VR PODCAST

Here’s what we talk about and when we talk about it:
00:00 Arizona Sunshine
03:04 Raw Data & the emerging wave-survival trope as the FPS of VR
05:25 John Wick: The Impossible Task + how far can we go with the Teleporting shooting gallery
06:08 Bullet Train
06:31 Future of VR FPS gameplay mechanics, slow motion, turret defense + survival wave gameplay, locomotion
08:33 Teleportation in Damaged Core & Bullet Train
09:23 Damaged Core
10:46 Valve’s Photogrammetry in VR – How-to from Valve
12:08 Valve’s The Lab: Slingshot
14:34 Valve’s The Lab: Longbow
16:21 What’s compelling in VR: Physics, Multiplayer, and Social interactions. Sony’s Social VR
19:11 Valve’s The Lab: Xortex arcade shooter
21:36 The Climb
22:07 Basemark Cinematic VR benchmark demo with Crytek
25:34 Manus VR gloves
28:51 OptiTrack’s basketball demo
33:23 La Péri
35:27 Preview of Vanishing Realms, Budget Cuts, Audioshield & Unseen Diplomacy
38:14 Space Pirate Trainer
39:07 Preview Unreal Engine VR Editor

Here’s the behind-the-scenes of La Péri, the beautiful VR ballet by Innerspace VR:
https://vimeo.com/155369625

Here’s a video of Colin Northway making it to wave #20 in Space Pirate Trainer. His highest as of March 12th was wave 22.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello and welcome to The Voices of VR Podcast. My name is Kent Bye and I'm here today with

[00:00:18.435] Scott Hayden: Scott Hayden from Road to VR.

[00:00:20.894] Ben Lang: And Ben Lang from Road to VR.

[00:00:22.775] Kent Bye: Great. So this is our third installment of our GDC updates. And so today is Wednesday, and this will probably be airing on Friday. So by the time people are listening to this, you guys will be actually wrapped up with everything. So let's just kind of recount some of the highlights that you had today on Wednesday and anything else that we haven't talked about so far. So why don't we start off with some of the experiences that you guys may have seen today?

[00:00:46.360] Ben Lang: Sure. So today, let's see, the first experience I got to see was Arizona Sunshine, which is an HTC Vive demo. It's actually also an Oculus Rift demo. Well, I say demo, it's a game in development, and I believe they're also coming to PlayStation VR. And this is a zombie shooter, which is kind of what you would think, but that's a great thing, because who doesn't want a zombie shooter in virtual reality? In particular, what I saw today was a multiplayer version of this. So there were two HTC Vives, I was in one, and one of the developers was in the other, and we were having this kind of wave-style, survive-as-long-as-you-can zombie attack. It's a solid game, and I'm looking forward to getting to play a lot more of it. Interestingly, I learned that they're planning to support cross-platform multiplayer between HTC Vive and Oculus Rift, which I think is awesome. So, yeah, the experience is you're using motion controllers to pick up guns, pick up ammo, use items, and then you have zombies that are coming for you, and you have to take them out before they get you. And they have a pretty interesting reload mechanic: when you start, there's an ammo belt in the world, and you put that ammo belt on your body wherever you want your reload point to be. Once you place it, whether it's on your hip or on your chest, that becomes the point where you place your gun to get a new clip. So it's kind of like the old arcade shooter games where you shoot off the screen to reload, except you just put your gun wherever that thing you positioned on your body is. But it's interesting because you get to customize it, right? I could put it on my foot if I wanted to for some reason. And that's how you reload. So that was pretty cool. But yeah, they're working on a campaign game that's also going to be fully co-op enabled, which I'm really excited about, because we've been talking through this whole series of GDC recaps, and I think the highlight has been that games with you and other people are just more fun in VR. So I'm definitely looking forward to getting to play through this game with another person and have that sense of holding out and surviving together and having fun with it.
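To make the reload mechanic Ben describes a bit more concrete, here is a minimal, engine-agnostic sketch of how a body-anchored reload point might be checked each frame. This is purely illustrative; the names, numbers, and structure are assumptions, not anything from Arizona Sunshine's actual code.

```python
# Illustrative sketch of a body-anchored reload check (not Arizona Sunshine's
# actual code): the player places a reload anchor on their body at setup time,
# and bringing the gun within a small radius of that anchor refills the clip.
from dataclasses import dataclass
import math

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def distance_to(self, other: "Vec3") -> float:
        return math.sqrt((self.x - other.x) ** 2 +
                         (self.y - other.y) ** 2 +
                         (self.z - other.z) ** 2)

@dataclass
class Gun:
    ammo: int
    clip_size: int = 8

RELOAD_RADIUS_METERS = 0.15  # hypothetical tolerance around the anchor

def try_reload(gun_position: Vec3, reload_anchor: Vec3, gun: Gun) -> bool:
    """Refill the clip when the gun is brought to the player's chosen anchor."""
    if gun_position.distance_to(reload_anchor) <= RELOAD_RADIUS_METERS:
        gun.ammo = gun.clip_size
        return True
    return False

# Example: the player placed the anchor on their hip at calibration time.
hip_anchor = Vec3(0.2, 1.0, 0.0)
gun = Gun(ammo=0)
print(try_reload(Vec3(0.25, 1.05, 0.0), hip_anchor, gun), gun.ammo)  # True 8
```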

[00:03:03.854] Kent Bye: Yeah, I also had a chance to try Raw Data, which is Survios' kind of similar experience, with different waves of robots, in this case, coming at you, and all sorts of different ways of reloading and different weapons. So yeah, I had a lot of fun, and like you said, one of the more poignant moments was at the very beginning when the other player from Survios was there, and he just gives me a high five, and then we start shooting and working together in some ways. We weren't really communicating or talking, but yeah. The thing I wonder is, you've had a number of these different types of experiences where you have the trope of standing in the middle while things come at you. How many different variations can you have to make it really interesting? Did you also have a chance to try Raw Data?

[00:03:50.404] Ben Lang: Yeah, I did. And also very fun. I think these things are appealing because, with first-person shooters being such a popular genre in the existing gaming market, we're used to the paradigm of having items and having abilities, ammo management, oh, there's the power weapon. These are things that are familiar to people, and so I think what we're seeing is this kind of room-scale survival thing where you have cover and you have weapons. This is maybe starting to become the VR FPS. It'll probably advance beyond this state, I think. The Arizona Sunshine guys were, at this point, starting to show kind of a blinking teleportation system where you could see these dots around the environment, and between rounds you could say, I want to go there and get the items that were over there, and blink around. And I think as time goes on, just like in the early days of the first-person shooter, when you had controls that were all over the board and eventually coalesced into something that is pretty much the foundation of movement in every first-person shooter, we're going to start to see, and are starting to see very early on, these wave-style experiences settle on conventions. How do you pick up guns? What buttons should be used for that? Do you have to hold the button to make sure that the gun stays in your hand, or do you just click it once to toggle? I think these things are going to become very normalized, very standardized. And this way of doing a first-person shooter in virtual reality is going to be different for virtual reality, but it's kind of starting to establish itself almost along genre lines already. And these games aren't even out yet, which is pretty interesting.

[00:05:24.843] Scott Hayden: You know, in terms of first-person shooters, the most recent one I tried, just today, was John Wick: The Impossible Task, which is built by Starbreeze in partnership with Lionsgate and WEVR. Essentially it brings you down into the shooting gallery, and I'm wondering how much farther we can go with that and have it still be a really cohesive and fun thing, because that's essentially what Arizona Sunshine is, too: a teleporting shooting gallery. What other elements did you get out of that demo? Because you said you could teleport around, and they were starting to show the blinking and getting ammo and stuff. How do you think they're going to continue on with that? Did they tell you anything?

[00:06:06.559] Kent Bye: And just to also throw in there, I also played Bullet Train again this morning, which is another kind of teleport shooter. They have an ability to slow down time, catch bullets, and throw them back, which I think is a very unique game mechanic in the Bullet Train demo that was showing there at Epic Games. So I just wanted to throw that in, because that was another teleportation first-person shooter mechanic that I happened to play this morning as well.

[00:06:31.739] Ben Lang: Yeah, absolutely. And I think it's a very desirable game space because, again, first-person shooters are just massively popular, and the masses immediately think, you know, Call of Duty is going to be great. But of course you need to adapt it. You need the VR FPS, not just the FPS in VR; otherwise, it's going to be uncomfortable. So to answer your question, Scott, about the further places I see this going: like I said, we're going to hit some standard, and then you just kind of evolve it from there. FPSs from one to the next in the standard gaming market are not vastly different; they're kind of just mechanically different. And we'll get there. I think there's a lot of potential for lots of different exciting items and different settings. In particular, Arizona Sunshine also had a slow-motion mechanic that they're working on. It's really kind of neat: you take this huge syringe thing on the map, and to activate it, this is your slow motion, you have to stick it into where your heart would be and then press the button. And so you get that sense of, I don't normally take big, giant, pointy objects and put them through where my heart should be, so it's really funny in that way. It took me back to some really good movie where some guy had to save himself by injecting something into his heart. But interesting things like that are uniquely VR. I was personally at that point wanting, oh, can we start setting up turrets for defense, almost starting to merge turret defense into the survival wave setting so that I could help define the environment in which I'm fighting and set up my defenses on one side so I don't have to worry about that side. You might have things where you have some currency and you have to pay money to set up lights, so that when it becomes nighttime you have to strategically place your lights so that you can see zombies and they can't sneak up on you. Lots of evolutions like that, and hopefully more locomotion, and not just... Ultimately, I think the survival-style gameplay, whether it's in VR or not, gets a little dull. So I'm much more excited, actually, about the campaign aspect of being able to move through the environment and have more of a linear objective that ramps up in difficulty and complexity.

[00:08:33.153] Kent Bye: Yeah, and Damaged Core, also, I guess you could technically call that another kind of teleportation mechanic, where you're actually embodying the consciousness of a robot, and it takes a little time for them to recognize that the robot's been hacked, so you get a little bit of time before they recognize that you've changed. That seems like another strategic way of going to different places and then knowing how you're going to attack the different waves. And as you teleport in Bullet Train, when you're looking around and hovering over places you could teleport to, they're telling you, in a little augmented reality overlay, oh, there's a rifle here, there's a pistol here. So you start strategically moving around the space to get a weapon or something like that, in order to eventually beat the level by going to the right places at the right times.

[00:09:23.162] Scott Hayden: I actually had a chance to play Damaged Core twice. I played it once at the pre-GDC Oculus event, and one time today, so I got a better chance to sort of sit down in a quiet place and replay it. My initial appraisal of it was that this was probably the coolest mechanism of teleportation that I've seen in a long time, or at least ever since I've been following teleportation as a locomotion technique. It works in the sense that you're this disembodied artificial intelligence that is pitted against this core, this artificial superintelligence that has gone awry. And it's your job to go across the battlefield, getting, just like Kent said, into the body of a robot. And for me, it was one of those moments where I understood the game immediately. I understood that this is the way that I'm going to play: I'm going to jump from person to person. On my second appraisal of it, I realized we're not going to have quite the sort of agency that we might want. Even with a highly populated map full of these essential teleportation points that are constantly moving, you're still not going to be able to get exactly where you want to go. And so we're just seeing that this game style, just like you said, is going to evolve into something completely different. It's going to be a VR FPS. And that's exciting. It's exciting.

[00:10:42.320] Kent Bye: Yeah, so one of the other things that I got to try this morning was Valve's The Lab, which was a series of different minigames, and it was super compelling. I feel like I could have spent a number of hours in some of these little minigames. The first one was called Postcards. They did this amazing photogrammetry of this mountaintop. And when I talked to some of the developers afterwards, they were saying that they literally just pick one spot and take a couple hundred photos and then use this photogrammetry technique to be able to recreate the full 3D scene and then be able to teleport to different parts of that scene. So they didn't necessarily even have to go to each spot you can teleport to, but they just do one mass capture and then generate this amazing scene.

[00:11:27.058] Scott Hayden: For me, that was one of the most profound senses of presence that I've felt in a long time. I think the fun thing about that is they did all that, they made a giant mountain that looks beautiful, and then they added a little cute robot dog, just to throw sticks at and to pet his belly, and he wags his butt at you. This is very much what I get from Valve when they create these little nice things: they know not to just give you something beautiful and pretty, but something engaging too. And so I like that aspect of it a lot.

[00:12:01.413] Kent Bye: Yeah, I love playing fetch with dogs, so I could have played fetch with this artificial intelligence dog for a long time. And then the second experience was this catapult where you're basically shooting balls into this warehouse and it's exploding. Talking to Jeep afterwards, one of the developers at Valve, I was asking him, why are physics interactions so amazing? Just in general, why are we so compelled by them? And he had this really profound insight. He's like, I think it's because we like to look into the future, and physics interactions give us a way to look into the future and predict what's going to happen. There's something about making a prediction, then doing it, and having it be correct, where you're like, yes, I saw the future. And Angry Birds is just shooting a catapult; this is like a first-person perspective on Angry Birds, completely destroying an entire warehouse of boxes and seeing if you can go through and strategically destroy everything.

[00:13:01.596] Scott Hayden: Right. And at the same time, they create this really fun tech demo where you're shooting into this very Portal-ified area with robots. And they have the same sort of beautiful, wry sense of humor; Justin Roiland of Rick and Morty was doing two of the voices. And I wanted to just hold them, because they're in this hopper, these little core-sphere robot people, and you load them up into your slingshot and they're telling you about their mission in life. And really all you're going to do is just launch them directly at a pile of crap. They're telling you about the intricacies of their personality, and I just wanted to hold them back for a while longer and hear everything they had to say before dispassionately lobbing them at this broken fan or whatever. And again, the thesis there is just an incredible number of pieces put together. And seemingly, they're not selling this. This is free. This is for you to just look at and inspect, and for it to become a part of your new bar for what you expect VR interactions to be.

[00:14:10.622] Kent Bye: Yeah, part of the story of why they created The Lab was because they've been doing all these different experiences at Valve trying to figure out what's fun, what's compelling, what keeps people coming back to different experiences. So The Lab is kind of like the distillation of all their lessons learned: these are the things that people find compelling. And I've got to say, they are super on it in terms of dialing in the super fun interactions. The next one was the Longbow demo, which they've kind of fully fleshed out into this tower defense game where you have those Portal-like characters that have either shields or crowns, and you just have to shoot them with the bow and arrow. Talking to Chet at Valve, he was saying that doing a longbow motion is actually kind of like a test case for a Vive controller, because you're essentially doing the extreme of putting one hand out in front of you and the other hand behind your head, so you could have occlusion from all different types of angles. It's actually a technique that I expect to see way more on the Vive than on the Touch, because the Touch is probably going to lose a lot of tracking when you do this kind of motion. Shooting a bow and arrow on the Vive is going to be way easier than it is with the Oculus Touch, which is kind of interesting to think about technologically. But yeah, it's this game mechanic where I felt myself get way better, again, with the physics interactions. I was able to really start hitting these characters as they come toward the gate, more and more as I went through. And it feels like one of those things where, as people play more and more, they're actually going to see that they get better at it.

[00:15:47.431] Scott Hayden: I think what's also interesting is, of course, you have your haptics in the controllers themselves, and those little things are really important, because if you're right-handed, you'll hold the bow with your left hand and pull the string with your right hand, and you notice that when you pull it back you have this really gentle feel in your hand of pulling the string. And it's that little tiny bit. I didn't know that was actually the extreme test for it, what Chet had told you; it was just one of those intuitive things. You pull back and you let it fly.

[00:16:21.215] Ben Lang: Yeah, I didn't have a chance to try The Lab, but I'm hearing you guys talk and kind of comparing it in my head with what we've been saying this whole recap session. For developers out there, listen to this stuff that is super compelling: physics, multiplayer, and interaction between people. These are the things that people think are awesome in VR, that are just fun and make people laugh. I was at the Sony PlayStation VR technical session today, and at the end they showed this social experience where they got three people on stage. It was really pretty similar to something like AltspaceVR or Convrge, just allowing people to talk and kind of interact with each other in a social space. But the audience was just loving watching them throw blocks at each other. And you could tell there's something about people interacting with each other, and VR makes it so natural to do when you just create the right foundation in terms of game design. The audience, many of whom may be only vaguely familiar with social VR or may have never tried a social VR experience, were just watching on a screen. And when somebody would throw, basically, a paint-filled water balloon at one of the other people and it would splash and change the color of their face, it was a very basic demo, it didn't really look that convincing, but people immediately said, oh, you just hit him right in the face with it, now he's got paint on his face. It's just such a silly thing, and a whole bunch of people in the audience laughed at it. I think it would be very, very different if the action had come from some NPC, some AI, or just some button you press to get a paint-filled water balloon launched at your face. That is so much different than somebody else taking the agency to do that; there's something human about those interactions with each other. So I think there's something incredibly compelling about connecting people in worlds that they could not be in without virtual reality.

[00:18:12.642] Kent Bye: Yeah, just a short anecdote on that. I was in the Sony social experience and we were around this table of blocks, and so I, of course, wanted to start playing with it and tossing things around. I picked up a block and just threw it at the guy speaking, and he just took his hand, stuck it out, and caught it. Everybody just started laughing, because it actually takes some dexterity to judge where it's going to be, to put your hand in the right place, and to catch it. I didn't know that he was going to be able to catch it; I just threw it at him to see what would happen, and he reacted like that. I just think that when people interact in that way, they're going to be able to do things that they normally wouldn't do, which I think the Toybox demo explored a lot, like throwing a firecracker at someone's face. That's not something that's a very cool thing to do, or it's not safe. But in VR, you can start to do all these things in a highly dynamic physics environment, and it becomes this sandbox and playground with lots of joy and lots of giggling. So yeah, just to kind of wrap up The Lab, there was one last demo that I just came back to and played forever. It was this amazing little drone game in this kind of miniaturized space, and these things are shooting at you, these red balls are shooting, and you have these different robots flying toward you. So all these things are flying right toward your face, coming right into the near field, and you have to use the controller to dodge them and weave around them and shoot everything. To me, I just felt like I was using my peripheral vision. This is a game that could never really work on a 2D screen, because you just wouldn't have all the depth cues that you need to judge three-dimensional space and weave around and avoid all the lasers and all the shots. It was one that I played until they had to shut it down, because I just kept going and going and going. These types of minigames start to almost stimulate my brain in a way that's never been stimulated before, because I've never been in an environment that looks like this, where I actually move my body and feel like I'm building some skill at playing this game.

[00:20:22.320] Scott Hayden: I think the direct correlate of the game you're thinking about is just Asteroids. It's Asteroids in 3D. Playing Asteroids, you break up these large pieces, they become smaller, and you have to make sure that you don't hit them, and you can move around this 2D map, and you have to do that in order to not lose, and you get hit once and you're dead. So they very much scaled this idea up to the third dimension, and it's very much the same play, except the way I thought about it wasn't necessarily as a drone, because you stick out your hand and you grasp this little toy, and it's a little bit larger than a Matchbox car, and it has these little thrusters on it. And what got me was, as soon as I picked it up, in my mind I knew I would be moving it around like a kid again, except now I'm seeing everything I would have imagined as a kid. And of course you have to move past all these oncoming lasers, just like in the Asteroids that I used to play as a kid. So it was this perfect storm of very cool stuff. Yeah, I played it for as long as I possibly could too, and I could see myself playing it for much longer.

[00:21:34.157] Kent Bye: What about you, Ben? What did you see today?

[00:21:37.000] Ben Lang: One of the other things I got to see was at Crytek's booth, where I played The Climb, which they've been talking about for a little while now. I'm personally really excited to see how that feels once they get Touch integrated, because although it's surprisingly natural to direct your hands in that experience of climbing up a wall with your head, where you look where you want your next grab to go, I think motion input is the ultimate destiny that will make it even more natural and comfortable. Another thing that I saw there was a collaboration between Crytek and a benchmarking company called Basemark. They came together to work on a VR benchmark: basically a scene, built on a certain version of CryEngine, that gets played back consistently over and over, and then you play it, see what framerate you get, all rendered in VR, so that you can compare system performance. So I go into this thinking, basically, they're showing just a cinematic mode. When they're actually going to play it back for testing, it's going to be exactly the same view every time, because they want consistent results, so they're showing you just getting to go in and look, just to see the experience for yourself, not really a test. So I go in with the expectation that this is a tool for testing the performance of VR hardware and, in a way, the performance of CryEngine for VR rendering. And I came out of it, literally I turned to the guys and said, guys, this was the most spectacular cinematic VR experience I've ever seen. And I was just not expecting that at all. So, I mean, CryEngine, when used correctly, can create some absolutely stunning visuals. And the experience that I saw was kind of like a real-time animated short, where I'm this little android guy, kind of like an android maintenance droid, and there's this big sky battle going on with these huge futuristic ships, and you crash at one point, and you as a character are pinned down under this thing, and you're watching this other droid go and save the day. It was just incredibly beautiful. It was well directed. The scale of the ships that I was on was just massive, and they were crashing into each other and things were blowing up. It was crazy, and I was just not expecting this. Maybe there's a whole lot more to this that they're going to reveal later, and it turns out they have this crazy internal cinematic VR team, I don't know. But to the best of my knowledge, part of the reason for creating this experience was for the benchmark. And then I come out of it and I'm like, what the hell, guys, you just created something that was incredibly compelling, and I want to know, is there going to be more of this? Because the end of the story leaves you with, basically, the other droid saving the day, preventing the big ship from crashing into you and killing you all, and then this humongous, city-skyscraper-sized stone golem just comes rising up out of the ground, and you're looking up and you're like, wow, it's huge, and it looks great, it's just beautiful artwork. They kind of fade you out at the end there, alluding that there might be more to this.
And I totally hope there is, because I was blown away. And I'm really excited, too. I'm trying to get these guys to send me some more information on this, because I want to share it with people. It was completely unexpected. And at this point, I'm like, I hope Crytek puts out some super cool 15-minute action film like this. In VR, it would be amazing.
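As a rough illustration of what a VR benchmark like this measures, here is a small sketch of the timing side: replay a fixed workload, record per-frame render times, and report both the average and a worst-case percentile, since dropped frames matter more in VR than average framerate. Everything here (the frame count, the stand-in render function) is hypothetical and has nothing to do with Basemark's or Crytek's actual tooling.

```python
# Rough sketch of the measurement side of a VR benchmark: replay a fixed
# workload, time every frame, and report average FPS plus a worst-case
# percentile, since a few dropped frames hurt far more in VR than on a monitor.
import time
import statistics

def fake_render_frame(i: int) -> None:
    """Stand-in for rendering one frame of a canned benchmark scene."""
    time.sleep(0.02 if i % 90 == 0 else 0.011)  # simulate an occasional slow frame

def run_benchmark(num_frames: int = 90) -> None:
    frame_times = []
    for i in range(num_frames):
        start = time.perf_counter()
        fake_render_frame(i)
        frame_times.append(time.perf_counter() - start)

    avg = statistics.mean(frame_times)
    worst = sorted(frame_times)[int(0.99 * len(frame_times))]
    print(f"average FPS: {1.0 / avg:.1f}")
    print(f"99th-percentile frame time: {worst * 1000:.1f} ms "
          f"(needs to stay under 11.1 ms for 90 Hz VR)")

run_benchmark()
```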

[00:25:25.656] Scott Hayden: What I'm really hoping is that they keep the name, like, Benchmark Demo No. 2, no matter how giant and beautiful it gets.

[00:25:34.043] Kent Bye: Yeah, one of the demos that I did today that was actually really compelling and beautiful was the tech demo that Manus VR had. They collaborated with another content creator to create it. The Manus VR is this pair of data gloves, these kind of fabric gloves that you put on, and instead of IMUs, they have some way of using the material itself to detect when your fingers are bending. And in the short term, at least, they have a Vive controller that's strapped to your arm, so you can actually get pretty good positional tracking. The thing that was noticeable for me is that usually with a Leap Motion, you have to have your hands come into a certain field of view, and your hands will pop up and appear, but if you move them out of that, they just disappear. In this, I have my hands, and I'm able to reach completely over to the left and completely over to the right, and it can still see them, even in my peripheral vision. I was experiencing a little bit of latency in the gloves; it didn't feel like I had complete hand presence, because I was able to move my fingers and they were delayed a little bit. The person who was running the tech demo said, oh, there might be something wrong, it usually doesn't have that much latency; they usually see less than 20 milliseconds of latency. But for me, there were moments where I tried to get completely immersed into this little diorama world where you have to solve these puzzles and do different types of interactions with your hands in order to get through each of the different sections to the end. Just a simple grabbing motion: you reach your hand out, you grab things, and then you can put them into different places. Usually when you're grabbing, it's an abstracted version where you're holding a controller and pushing a button, and a lot of times that's okay, but there was something really visually satisfying about being able to just close my hand, feel like I'm gripping something, see that it's detecting that grip, and then pick up and move an object. That's the first time I feel like I've had that level of detailed tracking with my hand to manipulate virtual objects in that way. They have some dev kits that they're making available for around $250 for developers, and they're obviously not always going to have a Vive controller strapped to your arm; they want to make a full Vive integration that uses the Lighthouse sensors to talk to the tracking system directly. The tech wasn't so precise that I could sit down at a piano and feel like I was actually playing a piano. I think that would be a good test: okay, they've got this finger dexterity worked out so it feels like I'm actually playing this virtual piano. But it's getting there, and I think it's an interesting approach that's different from what we typically see with finger tracking with IMUs. So what about you, Scott? Did you have any other big experiences today?
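For a sense of how flex-sensor gloves like this might translate finger bend into a grab, here is a tiny illustrative sketch. It is not the Manus VR SDK; the threshold and the two-finger rule are assumptions chosen just to show the idea.

```python
# Illustrative sketch (not the Manus VR SDK): turning normalized flex-sensor
# readings into a simple grab gesture. Positional tracking would come from the
# Vive controller strapped to the forearm; here we only look at finger bend.
from typing import List

GRAB_THRESHOLD = 0.6  # hypothetical: 0.0 = finger straight, 1.0 = fully curled

def is_grabbing(finger_bend: List[float]) -> bool:
    """Treat the hand as grabbing when the thumb plus at least two fingers are curled."""
    thumb, *fingers = finger_bend
    curled_fingers = sum(1 for f in fingers if f >= GRAB_THRESHOLD)
    return thumb >= GRAB_THRESHOLD and curled_fingers >= 2

# Example readings: [thumb, index, middle, ring, pinky]
print(is_grabbing([0.8, 0.7, 0.75, 0.2, 0.1]))  # True: close enough to pick up an object
print(is_grabbing([0.1, 0.2, 0.1, 0.1, 0.1]))   # False: hand is open
```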

[00:28:30.990] Scott Hayden: Actually, I did a lot of writing today. I did writing, and I did a lot of running around, so I didn't get a chance to go see much more than ILM. I saw ILM as well, and we covered that pretty well in our last one, along with The Lab. So no, not really. I mean, what about you, Ben? Did you have anything else that was really super compelling?

[00:28:50.856] Ben Lang: Yeah, one cool thing that I saw right toward the end was from a company called OptiTrack, a motion capture company. They do your typical professional high-end motion capture systems; we're talking tens of thousands of dollars, a bunch of cameras, a guy in a mocap suit covered with balls so that the cameras can bounce light off them and track him very precisely for movie-grade motion capture. But what they had done was take a DK2 and attach those same kind of ball markers to it, so that somebody who's not in a full suit can come over and put their head in. They also attached reflective markers onto a basketball, enough of them, and they had so many cameras, that the markers were always in view of at least one camera. The cameras are shooting at 240 hertz, so basically that basketball can be tracked with extremely high fidelity. So I put on the VR headset, with just my head tracked, and the guy in the area had the full mocap suit on. They're taking this data and streaming it into Unreal Engine, and then using that data to animate things inside the engine. So I had the camera view through the DK2, with my head tracked to determine where it should be, and the guy in the suit in real life was animating this robot android person inside of the engine. When I'm looking at it, it's good, professional-grade movie tracking, but then they brought the basketball into that experience too. So I see this guy playing with this basketball, and it looks perfect, because it's basically a one-to-one mapping of his motions onto that model of that android. Then he throws the basketball to me, and I catch it, and I've never played with an object inside of virtual reality that is so physics-dependent, dependent on real physics, where I need to actually be able to grab the ball to dribble it effectively. That was really, really cool. That's where the 240 hertz tracking comes in: you can throw the ball up and spin it really fast. And what they did was actually map the Spalding logo and embossing correctly to the real ball too, so I could see it in the game and then put my hand on it and feel it in exactly the same spot. You throw that ball up and spin it really fast, and it tracks the spins perfectly. It was a really good demonstration, because that sort of thing requires a lot of hand-eye coordination, and all the physics have to be just right. Thanks to the real world providing the physics of the basketball, as long as the cameras can track it correctly and the latency is low enough, you can just use your existing ability to handle an object like that and have no problem. And it was no problem at all. It was easily within the latency threshold where I just couldn't tell; it was just like interacting with a real basketball, the way I would expect to. It was really cool to see, and there are just a lot of possibilities with what you could do with this technology.
It's not really consumer home-use stuff. But if you think about the motion capture and green-screen technology that they use today, generally the actors have mocap suits on and they're on this crazy big green-screen stage, just running around in a big blank environment, maybe with a couple of props to jump on, and they have to imagine the world that their character is in, and even the other person they're acting against. In Planet of the Apes, for instance, those people were motion captured to eventually be turned into apes, so they have to imagine that their co-actor is an ape. You throw them in a VR headset with a system like this, you stream the data into something like UE4 with, you know, not movie quality, but pretty darn good real-time graphics, so that not only do they see all of the virtual props that they can move around, not only could you bring in additional props like a sword by sticking on tracked markers, but their co-actor could already look like whatever that character is going to be, whether it's Smaug in the Hobbit films, so that you can pre-visualize all this stuff and bring that actor into that world and have them act in that world, instead of having to imagine it. So yeah, that was a bit of a long rant, but very exciting. Very exciting not just for your personal home VR use, but for those sorts of things too.

[00:33:10.103] Kent Bye: Wow, that sounds really awesome, and it's amazing to think about how that's going to change the mocap industry, to be able to, like you said, have these scenes where they're co-acting with other people, with lots of them at the same time in the same virtual space. I just want to also bring up one experience that I saw at the VR Mixer on Monday night, which was Innerspace VR's La Péri. They have a video of some of the behind the scenes of what they did, but basically they have this huge OptiTrack grid with a dancer who's doing this amazing dancing, and she's on a harness, so she's getting flown through the air and flying. Just imagine capturing that real, actual mocap data and then completely stylizing it into the most beautiful art you could possibly imagine. Just seeing that dancing, you know that it's a human being; there's no way anybody would have created that from their mind by changing all the different numbers in a Maya animation. You can tell when it's a human embodied presence doing it. And dancing that close up, with a story around it, is probably one of the most beautiful and epic cinematic VR experiences that I've seen. I'm really excited to see the full 15-minute version; I just got the three-to-five-minute teaser, but it's a full experience. So, yeah, I think Innerspace VR is doing some of the most sophisticated motion capture, in addition to narrative storytelling in VR, really using the strengths of VR in terms of the near field and bringing human presence and embodiment into these experiences.

[00:34:51.627] Ben Lang: Yeah, I think the applications for that kind of tech are really exciting for creatives. Very similar to what you're saying, imagine real-time performance art where you have people on a stage, but they have motion capture suits on. If you can stream that data into an engine, you can turn those people into a fairy, a dragon, any number of things, and project that behind them; you can see your real actor there and then have that experience of them being transported into a whole other world, and have the audience watch what happens in that world. There are just really incredible possibilities coming up with this technology.

[00:35:26.758] Kent Bye: Great, and just to kind of wrap things up here, I thought I might share a few buzz-worthy experiences. At this point it's Wednesday, and we have Thursday and Friday left. Valve is going to be showing a lot of other different experiences, but I don't think there's going to be that much other new news or new experiences at this point; it's going to be more like people checking out a lot of the launch titles. Chet told me that they have about 50 launch titles for the Vive, which is kind of interesting juxtaposed next to Oculus' 30. And I don't know how full these games are going to be; it kind of still feels like a lot of these Vive experiences are at the tech demo stage, where I'm not sure how much full gameplay you're going to get out of them. But a couple of ones that people kept mentioning over and over as I was talking to the Valve employees were Vanishing Realms, Budget Cuts, and, I think it's called Audioshield, which is another kind of meditative experience that people were really responding to. So I don't know if you guys have any others that you're looking forward to, or ones that you've heard some good buzz about.

[00:36:31.088] Ben Lang: Yeah, so there's one that I haven't had a chance to try yet but would really love to, which is Unseen Diplomacy. It uses the room-scale space in a really creative way, with kind of micro-spaces that trick you into thinking you're exploring this big facility. So, for instance, you might start in a small little closet that fits one person, basically. You open the door and go into a room and see that there's an air vent that you can unscrew with a screwdriver and pull away. You literally get down on your hands and knees and crawl through the air vent, go straight a couple feet, take a 90-degree turn, pop out into another room, open the door, and now you're in this other small room. They are dynamically changing the environment, so you're technically staying within this rectangular box, but they're using all these virtual environments changing around you to make you feel like you're a spy crawling in the air ducts of this big facility and actually traversing a lot of space, when really it's essentially a form of redirected walking, which your listeners are probably familiar with from experiences like The Void. It's just kind of a less transparent version of that, but still a super creative way to use the space and make the player think they're using so much more space than they really are. For me, it took me back to when I was a kid: my siblings and I would build these box forts that were no more than a square of boxes with an entrance and an exit, and yet we would envision the sorts of things that VR is now visualizing for us. That was a really cool thing for me to see. So I'm looking forward to trying that and kind of living the thing that was once in my imagination when I was young.
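The core constraint Ben describes, every virtual closet, vent, and corridor having to fold back into the same fixed tracked area, can be pictured with a simple bounds check like the hedged sketch below; the room dimensions and names are made up for illustration and are not from Unseen Diplomacy.

```python
# Hedged sketch of the constraint behind this kind of room-scale level design:
# every new virtual segment has to fit inside the same fixed, rectangular
# tracked play space, so the level keeps folding back on itself.
from dataclasses import dataclass

@dataclass
class Rect:
    min_x: float
    min_z: float
    max_x: float
    max_z: float

    def contains(self, other: "Rect") -> bool:
        return (other.min_x >= self.min_x and other.max_x <= self.max_x and
                other.min_z >= self.min_z and other.max_z <= self.max_z)

PLAY_SPACE = Rect(0.0, 0.0, 3.0, 4.0)  # hypothetical 3 m x 4 m room-scale area

def can_place_segment(segment: Rect) -> bool:
    """A segment (closet, vent, corridor) is only valid if it stays in bounds."""
    return PLAY_SPACE.contains(segment)

print(can_place_segment(Rect(0.5, 0.5, 1.5, 3.5)))  # True: fits inside the room
print(can_place_segment(Rect(2.0, 2.0, 4.5, 3.0)))  # False: would cross the wall
```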

[00:38:14.298] Kent Bye: Yeah, one other that came up over and over again with people who seem to play a lot is Space Pirate Trainer, which is a game where there are waves of robots coming at you and shooting, but you can actually use the full room scale to walk around and dodge, and they come at you in different ways. I think it's one of those games where you can just build up a real strong aptitude for pointing and aiming with both hands at the same time, and just get through as many waves as you can. Colin Northway from Fantastic Contraption posted a video where he got to something like wave 23, and as things are firing at you, the bullets turn into slow motion so you can start to dodge, and you feel like you're this action movie hero. So that's a game that I heard a lot of Valve employees talking about as one they tend to play over and over again. What about you, Scott? Do you have any ones that you're looking forward to?

[00:39:07.674] Scott Hayden: I'm looking forward to tomorrow. I think we're going to go see the Unreal Engine VR Editor, and that's one I've been looking forward to since before I even knew what it was. I've always been looking forward to having a moment like this, because I've never had any experience with developing. I think I downloaded Unity and Unreal, opened them up, and then closed them immediately, because it's just too much for me right now to even think about starting. Maybe that'll change. But having the chance to get in somewhere and just toggle through some pretty much ready-made assets and physically put them places, make them longer, put them higher, connect them the way you want: it's a really human way to interact with things. I know people will appreciate that whether they actively develop or not. And one thing I heard about it that's also really exciting for me is that you're going to be able to get around your terrain while you're building whatever it is you're building. Their locomotion strategy, from what I heard, is pretty unique, because instead of teleporting around and looking at things from different angles, what you do, and of course we'll make corrections if this isn't right, because I haven't tried it yet, is essentially pull the world as if it were on a rope. So you'll extend your controller, grip, and pull, so that you can physically pull the world toward you while you're still in the same place. It's not a game locomotion; it's very much a fine-tuning sort of way to move around your environment. But I'm looking forward to all of that, and really, what I want is to make something obscenely stupid. That sounds like a great note to end on.
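For readers who want to picture the "pull the world" locomotion Scott describes, here is a rough, engine-agnostic sketch under the assumption that it works like a grab-and-drag: while the grip is held, the viewpoint moves opposite to the hand. This is not Epic's actual VR Editor implementation, just one plausible reading of the description, with invented names throughout.

```python
# Rough, engine-agnostic sketch of grab-and-drag ("pull the world") locomotion:
# while the grip is held, the viewpoint moves by the opposite of the
# controller's motion, so pulling your hand toward your body drags you
# forward through the scene.
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

player_offset = Vec3()      # accumulated viewpoint translation in the scene
last_controller_pos = None  # controller position on the previous frame

def update_world_pull(grip_held: bool, controller_pos: Vec3) -> Vec3:
    """Call once per frame; returns the accumulated viewpoint offset."""
    global last_controller_pos
    if grip_held and last_controller_pos is not None:
        # Move the viewpoint opposite to the hand's motion since last frame,
        # which feels like dragging the world past you on a rope.
        player_offset.x -= controller_pos.x - last_controller_pos.x
        player_offset.y -= controller_pos.y - last_controller_pos.y
        player_offset.z -= controller_pos.z - last_controller_pos.z
    last_controller_pos = controller_pos if grip_held else None
    return player_offset

# Example: with the player facing +z, the hand is pulled 0.3 m back toward the
# body (z goes from 0.0 to -0.3) while gripping; the viewpoint advances +0.3.
update_world_pull(True, Vec3(0.0, 1.2, 0.0))
print(update_world_pull(True, Vec3(0.0, 1.2, -0.3)))  # Vec3(x=0.0, y=0.0, z=0.3)
```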

[00:41:01.901] Kent Bye: So Scott and Ben, thank you so much for joining me on this series of podcasts. I'm about to head off on a plane to South by Southwest, and you guys will continue to cover things here for the next couple of days. Everybody who's listening to this will hear it on Friday, so all of this will already be done. So just look to Road to VR for all the updates. And yeah, thank you so much for joining me. Excellent. Thanks, Kent.

[00:41:23.407] Ben Lang: Thanks, Kent. Safe travels.

[00:41:25.349] Kent Bye: And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voicesofvr.
