Epic Games’ Nick Whiting paired up with Nick Donaldson again before Oculus Connect 2 to build the Bullet Train demo over the course of 10 weeks. They wanted to dogfood the Unreal Engine to optimize the rendering for VR, but also experiment with Oculus’ motion-tracked Touch controllers. They had the Toybox demo to use for inspiration, and so they set out to maximize the interaction fidelity for a game that had a lot of guns and explosions. I had a chance to catch up with Nick at the Seattle VR Expo where he told me about their design process as well as some of the technical limitations that drove some of their design decisions.
LISTEN TO THE VOICES OF VR PODCAST
Nick says that when people get motion-tracked controllers in a VR experience, one of the first things they do is try to touch everything. Bullet Train was a very iterative design process, and they noticed that people wanted to interact with the environment, so they started to add more and more Toybox-inspired interactions.
A lot of people don’t have time to discover the full scope of these interactions on their first time playing through Bullet Train. It’s a brief 6-minute demo where it feels like you’re stepping into a first-person perspective of an action movie, and so it can be a little overwhelming. They decided to add the ability to slow down time to make it less overwhelming, but they had also discovered with their 2014 Showdown demo that experiencing your environment in slow motion is really compelling and something completely unique to VR.
Nick talks about some of the action hero-inspired combinations that he was able to pull off in Bullet Train. The most inspiring one was when he once threw a gun at someone’s head, teleported to be right next to him so that he could catch the gun, and then shot someone else. It’s these types of combinations and high-fidelity interactions that he thinks are going to keep people coming back to play an experience like Bullet Train again and again. He says that the most compelling games and experiences he’s seen in VR are set in a world that you can interact with and where you have a lot of agency.
Nick also talks about some of the technical limitations that they faced in building their demo, which drove some of their design decisions. Specifically, Oculus recommended to developers leading up to Oculus Connect that having two forward-facing cameras would maximize tracking fidelity, which meant that if the player turned around 180 degrees, they’d start to lose hand tracking due to occlusion. So Nick talks about how they crafted the teleportation mechanic to ensure that the action was always directly in front of you, so that you didn’t have a reason within the game to turn completely around.
Since the last Oculus Connect, there’s been a lot of talk and debate about using the Oculus in a room-scale configuration with the cameras in opposite corners of the room, but Nick says that they got around this constraint by designing Bullet Train so that there was always a compelling reason to look forward, optimizing the experience for a front-facing camera configuration.
Nick mentioned that part of his team’s focus leading up to the consumer launch was to ensure that the Unreal Engine was optimized for all of the other companies developing launch titles. In my interview with Tim Sweeney, he alluded to the fact that Epic was indeed likely working on expanding a Bullet Train-like experience into a full game to be released.
Nick did not explicitly mention how his VR development has continued since Connect, but it’s likely that the team is continuing to build it out. Perhaps they’re investigating adding more multiplayer functionality, since it’s something they didn’t have time to fully implement before Connect. VR can be isolating without it, and he says that having a shared space and community within VR instantly makes an experience more compelling.
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio
[00:00:05.412] Kent Bye: The Voices of VR Podcast.
[00:00:12.022] Nick Whiting: I'm Nick Whiting. I work for Epic Games. I'm the lead VR and visual scripting engineer. And Bullet Train was really an effort. Last year at Oculus Connect we introduced the Showdown demo and that was really an effort to kind of dog food the engine and give it some tender loving care, put it through its paces and really see what we could do. Bullet Train was kind of a year later we wanted to do that again and make sure that the tools are really up to the job. There were some optimizations that we'd always wanted to do and it's really good to have practical content to do that with. And so it was kind of a two-fold process. We wanted to push the rendering technology and make sure we can make the most pretty demo there. Now that Oculus has released their recommended spec, we wanted to target that and see what we could do with it. As well as we got our hands on the Oculus Touch controllers. So we really wanted to kind of do a practical test of our motion controller implementation.
[00:00:54.705] Kent Bye: Great. And so I know you've been giving some lessons learned about creating this experience at a number of different conferences now. And so maybe you could kind of share what were some of the highlights in terms of the big takeaways that you got from doing this experience.
[00:01:06.610] Nick Whiting: So I think the biggest surprise to me wasn't even necessarily technical. It was, you know, once people get the motion controllers in their hand, it's kind of like the first time you use positionally tracked VR, right? You know, you started out with the DK1s and it was rotational only, and then everybody used the DK2, and you suddenly had the ability to kind of move around the scene a little bit. And that was awesome, and it's really hard to go back once you kind of experience that. Once we got our hands on the motion controllers, it was kind of the same thing. Once you have that kind of interactivity and presence in the world, it's really hard to go back without it, right? So the biggest surprise to me, though, was once you have that, you want to touch everything, you know? Like in a normal video game, people just kind of accept, like, I can't always, you know, flush the toilet or turn on the sink, the little tiny things. But when you do that with motion controllers, people want to touch everything in the world. So, you know, we started out in a subway train and there's little rings above your head. The first thing everybody did was they wanted to jump up and just kind of boop the rings around, right? So we had to make those physics objects. Like, there's just a much higher bar of fidelity that you have to hit once people can actually move around and interact with things in the world rather than just be kind of a passive experience.
[00:02:08.490] Kent Bye: And so how did the main different gameplay mechanics evolve and develop then in terms of what you found to be kind of compelling and interesting with using the touch controllers?
[00:02:18.267] Nick Whiting: So with the touch controllers, when we got them, the only demo we had really seen for the Oculus Touch was the Toy Box demo that was made by Oculus themselves. So our first inclination was, you know, it was really fun just to interact with things, right? You know, you can pick up robots, drive cars, there's a lot of just very simple but very fun kind of interactions with that. So what we really want to do, since we're Epic and we have a history of making things with guns and things that go boom, take that and kind of make a toy box but with guns, right? So our first prototypes were just kind of an alleyway where you were shooting things and it was very static, but it was a lot of fun, right? You could tell how much fun it was to just kind of interact with the guns. We called it gun porn because you were, you know, just kind of pull the slide back on the pistol and feel the haptics in your hand. The details just really sold the experience. So we were like, we've got that part down, but it really kind of feels a little bit empty. So what we wanted to do was, we had plans to make a really cool environment. We're like, we need to move around this environment and kind of explore it and go around the world. So we started playing around with different mechanics. The first thing we tried was going the kind of showdown route where you're just kind of being pulled through the world as a passive observer and then kind of beating things up as you go along. But we found that the problem with that was you only have tracking when you're facing front. So as soon as you would go past people, you would lose tracking and it would kind of take you out of the experience because you have these hands and then when you can't use them, you feel very robbed of it. So we're like, we need to keep people facing forward. So we came up with the teleporter system, which, you know, wasn't a new original idea. 
But what we decided was if we kept all the action in the center of the map, we could kind of put teleporters in a ring around them and always face you towards the center of the action. And that's really where the gameplay evolved from. We wanted you to feel kind of badass, so we took influences from kind of Hong Kong cinema, Korean cinema, and the Matrix and stuff like that. And just after we got the basics of the gunplay down, we just put people through the experience. And whenever they tried to do something and were disappointed, or reached out and tried to grab something and it didn't react, that was when we would go back and implement, you know, something that was interactable.
[00:04:11.931] Kent Bye: Yeah, and it seems like one of the big innovations in terms of gameplay mechanics, I think, is that you're able to slow down time, grab bullets, and throw them back. So how did that come about?
[00:04:22.005] Nick Whiting: That was really, the slow down time mechanic was something we had played around with Showdown, but it's also kind of a response to just how overwhelming the experience can be. Like, people aren't used to being in an action movie, right? You have bullets and explosions flying all around you. It's very overwhelming and people want to, you know, slow down for a second, right? So, adding the slow down mechanic gets people that don't have super quick reflexes the ability to check out the scene and, you know, kind of react to things without feeling as much pressure. And slow-mo in VR, it's one of those things you can't do in real life, so having that kind of slightly surrealistic element to it is something new and different, and it's something people don't normally experience. It kind of adds to the novelty of it.
[00:04:58.395] Kent Bye: Yeah, and I know that when you gave a presentation at a hackathon, you were talking about moving forward through the Showdown demo, you have these explosions that are flying through the air and that it actually kind of helped with locomotion sickness to have some reference points of these particle effects, you know, these pieces of exploding shrapnel kind of flying by your face. I think that being able to slow down the bullets so that you can actually reach out and grab them kind of has the effect of creating this near-field VR effect, where things that are really close to you are really super compelling and you can then reach out and interact with them. And so I guess that's a little bit of a challenge of finding ways to have that near-field interaction with a full-scale first-person perspective.
[00:05:43.427] Nick Whiting: Yeah, that's definitely the case. I mean, the bullets slowing down around you are kind of one of those things we wanted you to feel okay. You're being shot at, and that's not something people are usually used to. But, you know, when they go by slow and you reach out and you can touch the bullets, you can actually bop them around with the gun itself. People feel a little bit better, then the first thing they do is try to reach out and grab them. And, you know, through playtesting, we found out that was something we should add. But the other thing that we tried to do to get the near field, because like you said, it's so compelling to be up in kind of the near field of various objects and characters, was we added the teleport mechanic so that you can point at a guy and actually teleport up to him and be about a meter away from him and really interact up close. You can grab the gun out of his hand, you can punch him, you can grab his hand, you can do all the sorts of stuff. That again was super-duper compelling. I mean, one of the most magical moments of the demo for me was I had a pistol, it ran out of, or I only had like a bullet left, so I threw the pistol at the guy, it hit him in the head, I teleported to him, and then I was in my near field and I grabbed it right back from him after I'd knocked him out and shot the guy next to him. And it's really cool because that very up-close tactical interaction, I mean, we as humans experience the world very up-close, and so that feels the most natural for us. When you have to kind of reach out and grab, people feel very awkward, you lose your balance, especially when you have a mask on, like you do in VR, so it was really, the more we could kind of pull players closer to the action, the better.
[00:06:59.719] Kent Bye: And so, Oculus Connect was an opportunity to show Bullet Train to a whole number of really experienced VR developers, and then here we are at CVR. Have you been able to make any changes or updates since then?
[00:07:12.168] Nick Whiting: This is pretty much the straight-up version from the Oculus Connect. Once we get a demo that works and is tested, like Oculus Connect had almost 2,000 people, I think. When it's been through that many hands, we're pretty confident in it. We don't want to shake the jello, proverbially, so much, so this is pretty much the same thing. But we have been working on minor modifications. There's still a few bugs that shipped when we went to Oculus Connect. We've done things like improve the reflection materials and a few little tweaks here and there just to really kind of buff it up and do things that we didn't have time for before Oculus Connect, because it really took the entire 10 weeks that we had. We were up to the very last moment making builds and modifications, so it's nice to have time to patch it up.
[00:07:49.733] Kent Bye: Yeah, you know, I had a chance to do both the Toy Box demo and the Bullet Train, and, you know, something about the Toy Box for me is that it was very low-fidelity, low-poly. There wasn't a lot of textures or anything that was really making it look realistic, but yet, to me, I almost had a more immersive experience in that. I don't know if it was because it was with other people, or if I was able to have more things in the near field to be able to play with. But when I was in Bullet Train, having kind of like this really photorealistic environment, there was something in my brain that just didn't feel like I was completely there. And I didn't know if you have thought about that or played with that at Epic Games or what some of your thoughts are in terms of like the level of expectation that you have when you have something photoreal, if you feel like it's going into the uncanny valley or if you want to keep pushing forward with that.
[00:08:37.383] Nick Whiting: I think the thing about the Toy Box demo, they really hit a really magical peak where they traded a little bit of visual fidelity for just the interactive fidelity. Everything that you can see in the Toy Box demo is interactable, and it's interactable in a cool way, right? You want to reach out and play with it, and you get rewarded for that. And you have kind of the social experience, too. I think, you know, it's easy to understate the value of having another human there present talking to you and guiding you through the experience. You get a lot more out of it, I think. We had played around with doing multiplayer in Bullet Train, and we didn't have time to actually pull it off, but that was one of the things that we really wanted to replicate, was having a guided experience to ensure that you're touching all the cool things. That's the thing about Bullet Train, especially the first time people go through it, there's so much just getting used to the controls that you have to do, that really the second or third time you go through it, people start discovering the little kind of toy box bits of it, doing combinations of things, like I want to throw the grenade and then shoot it out of the air, or grab the rockets and then shoot those out of the air, or aim them at a crowd of guys, that you just really don't discover because you're limited to a six-minute demo the first time around. So that's something that we feel is really kind of fertile ground to improve upon the experience, but we really have to hand it to Toybox. They did a really good job of making every little detail in that world interactable. We just, you know, with a 10-week demo, you can't make every little bit. But like I said, kind of the bar has been raised, I think, in terms of how much you have to be able to interact with the world, even in the most simple of ways, like bopping physics objects in the world or, you know, reaching out and turning handles, things like that.
Those are now requirements as opposed to just kind of candy flourishes.
[00:10:13.512] Kent Bye: Yeah, and what were some of the other big lessons learned that you were trying to communicate at, like, say, Oculus Connect in your presentation there?
[00:10:20.457] Nick Whiting: So, I think some of the biggest lessons that, you know, not necessarily learned, but that, you know, became very hard, brutal truths are trying to control the player in VR is a very, very hard thing to do. As I kind of mentioned, the teleportation system was really an effort to try to direct the player's physical orientation towards the two tracking cameras that were in there. We had an early prototype that was kind of based off of the Hong Kong and Korean action movie one shots where you're kind of moving through a space as a character and then interacting with things around you. The problem was once they went past you, if you, you know, didn't get to the guy quite in time, you'd start to turn around and then you'd lose tracking and it would break the presence of the experience. So I think we kind of at the beginning were a little cocky and underestimated the immersion-breaking potential of that. We were hoping that, you know, you could interact in a full kind of 360 thing. And then even with the teleportation system, we've noticed a lot of people still turn around, people still lose tracking, people still do crazy stuff all the time. And there's just no way you can control it. So the only way that you can kind of get around it is to try to mitigate that through kind of clever design tricks and trying to orient people and give them a reason to look forward. So yeah, people are full of surprises, I think is the most hard-fought lesson of that.
[00:11:29.320] Kent Bye: Is that something that Oculus is recommending, that they only have two front-facing cameras? It just seems a little weird that they wouldn't have something, maybe one in front and one in back.
[00:11:38.646] Nick Whiting: We'll see what they get up to. The reason they do the two front-facing cameras is to kind of give you the maximum breadth of trackability in that volume. So that's why they encourage you to have interactions that are all up in front of you and whatnot. It just works the best for the current setup. But I'm sure they're probably thinking about the problem long and hard because, you know, it's an issue.
[00:11:57.117] Kent Bye: And when it comes to, you know, the future of this VR team that you have, it sounds like it's kind of like a scrappy, you know, use the best out of limited resources. Maybe talk a bit about, like, how many people were on the team and then what you see as kind of moving forward.
[00:12:11.575] Nick Whiting: So, Bullet Train was about a 10-week project. For the first few weeks, I was kind of talking with Nick Donaldson, who does all the design work on Showdown and Couch Knights and all. He's been my VR buddy through the entire adventure here at Epic. He was kind of prototyping ideas, getting guns set up, and kind of the basics of the interaction for about four weeks, and then I rolled on to help him with some of the engineering stuff, and we had a... the guy responsible for the environment came from the film industry, actually, and so he built all of that in about six weeks, and we kind of had a core team, I'd say an average of six to seven people for that last six weeks there. We swelled up to 18 on our biggest day, when we had begged, borrowed, and stolen as many people as we could to fill in the VFX content and the animation and stuff. It was really kind of an effort. Whoever we could get our hands on, we would, you know, take their time. And so about 18 to 20 people contributed to Bullet Train in some way, but it was highly variable day to day on when we could actually steal their time.
[00:13:03.837] Kent Bye: I see, and it seems like as the consumer launch of VR is coming up here, is your team just mostly focused on, you know, the next big demos or are you actually starting to, you know, build things out to actual games?
[00:13:15.883] Nick Whiting: So, you know, the reason we did this demo was really to, like I said, dog food the engine and make sure everything was kind of production ready for it. As all the headsets are going to release late this year, early next year, our focus on the engineering side, at least right now, is to really make sure that we're keeping up with all the SDKs. We're kind of rounding out our framework for people to build these experiences on, so the people that are going to release experiences in the launch window for all these platforms have a good experience and everything is smooth. And we've started working really closely with some of our licensees, like CCP, who has recently been shipping Gunjack on the Samsung Gear VR. We're trying to identify licensees that are really close to shipping and then asking them where their pain points are, revving the SDKs, trying to help them through those last issues, in addition to kind of taking the learnings that we got from Bullet Train and adapting those to save people time on the framework side and the engine side. So right now I'd say our two biggest focuses, at least from the engineering standpoint, are A, keeping up with the release version of the SDK and then just kind of making sure that all the optimizations and all the knobs are tight for the release early next year.
[00:14:18.104] Kent Bye: And in that 10-week development cycle, are you running into bugs with the Unreal Engine and then submitting them to the core engineers that are responsible for that, and then hoping you get a fast enough turnaround that you can actually complete the demo in time?
[00:14:31.012] Nick Whiting: Well, we're certainly running into bugs. But we try to be as self-sufficient as possible. With that compressed a time frame, it was mostly me looking into issues. And I was playing firefighting engineer for a good bit of it. If they would run into a bug, I would find the fix, poke other people on the team. We try not to rely on people external to it because everybody's, you know, we've got our own internal projects at Epic, multiple ones going on in addition to maintaining the engine. So everybody's on a very tight time frame. So as much as I could solve those problems for people, I tried to be very kind of in tune with the artists and the design team and remove any barriers that they had.
[00:15:05.503] Kent Bye: And in terms of like designing VR experiences now that you've had a chance to kind of play around with the HTC Vive and also the Oculus Touch controllers, are you trying to figure out what you can do with both of them? Or are you trying to actually do things that you can only do with, say, the Touch controllers that you wouldn't be able to do with the Vive controllers?
[00:15:22.479] Nick Whiting: So we kind of play it a little bit both ways. Because we are an engine, we want to support everything out of the box. So we have the Oculus support, the Morpheus support, and the Vive support out of the box. And the kind of infrastructure that we built around that for motion controls and whatnot is platform agnostic. So because we were very resource constrained when we were building Bullet Train, you know, we only had one set of Touch controllers for the longest time because that was all Oculus had to give us. So, you know, we were using Razer Hydras and, you know, Vive controllers and whatnot, and everything was working under the same system, so it made kind of no difference. But the differences really come in kind of the ergonomics and the tuning for each platform. So the Oculus Touch has very different ergonomics than the HTC Vive or the Sony Move controllers. So there's a lot of specific tuning towards that, the kind of longest mile thing. But we really wanted to keep the kind of gameplay systems agnostic so that you can build the content once and then run it anywhere just because that's what, you know, people do. We have a bunch of people on Oculus. We have a bunch of people on the Vive. We have a bunch of people on the Sony Morpheus, or PlayStation VR now. Sorry, I'm still used to the code name. And it's to their advantage if they're able to kind of test and deploy anywhere they need to without having to rewrite a bunch of platform-specific code. So we really feel that that's our job to kind of keep things agnostic.
[00:16:33.354] Kent Bye: And so have there been other gameplay mechanics that you've seen or played with touch controllers that you find to be super compelling?
[00:16:40.239] Nick Whiting: I think just the idea of interacting in the world is super compelling. There's a particular demo at Oculus Connect. I can't remember what it was called. It was like Cloud Nights or something like that, where there's a bunch of floating islands and whatnot, and you can kind of scoop up your little knight characters. It was kind of a simple strategy game. That was really cool, because everybody's done the kind of tiny world thing now, and that's fun, and it's cool, and it's a neat experience. But finally being able to kind of interact and poke with those and scoop people up was just a very natural interaction. Everybody that I saw play that demo just got it immediately. So that was pretty cool. And I was also actually very impressed with Medium from Oculus, the sculpting tool. Like, I'm not an artistic guy at all, in the very least. I mean, I had a little bit of programmer art in my presentation today, and it looked awful. But, you know, I could go in there and I could have fun. And, you know, I could kind of see where it was going, right? It's like being a kid and playing with a lump of clay, but it looks better because it's, you know, all digital and 3D rendered and stuff. So I was very surprised with how easy and intuitive they had made the controls for that. Same thing with Tilt Brush, which is another one where I'm not an artist, but I have fun doodling in those things because it feels so natural.
[00:17:42.731] Kent Bye: Yeah, I also just want to mention that Tim Sweeney was at Oculus Connect, right? I've seen some quotes of him being very supportive and helpful, but what can you tell us, from actually working within Epic, about how he is being supportive of the future of virtual reality?
[00:17:57.128] Nick Whiting: I think it's very telling that, you know, two and a half years ago, this was my after-hours project at Epic, right? You know, I had known Brendan and Nate from Oculus for a long time. They sent me a prototype and they gave me the freedom, like, hey, we've got these cool tech demos. I want to hook them up. And they're like, yeah, sure. Whatever, right? And then since then, we've grown into an official kind of division, right? I've been able to hire up a team. The fact that we had 18 people on Bullet Train at one point in one day, I mean, that's kind of not cheap to do, for one thing, right? You know, you have to pay all those people's salary, but the fact that they were so supportive and let us steal from other projects for that, I mean, it really shows a lot about their commitment. So, you know, Tim Sweeney and Paul Meegan and Kim Libreri, especially, have been just so supportive and really believe in the technology and have just given me a lot of kind of free rein to do what I see fit, and it seems to have worked out. So far, I hope I can keep doing them proud. I mean, Tim was one of my heroes growing up. You know, I had the PC Gamer with him in it and I always wanted to meet him and now the fact that I get to work with him and that he believes in what I'm doing just means so much to me personally. So, it's been a really cool experience.
[00:18:59.891] Kent Bye: And finally, what do you see as kind of the ultimate potential of Virtual Reality and what it might be able to enable?
[00:19:06.062] Nick Whiting: Oh man, that's a tough question. I'm going to put on my futurist hat right now. But I mean, to me, the most compelling experience is, you know, a lot of people that I don't think have tried very many virtual reality experiences always talk about it as isolating because they see people putting on masks and not being able to make direct eye contact with people. But I don't think those people have tried kind of multiplayer experiences where you can get just a little bit of body language or an emotional cue from another person, and that is instantly compelling, right? You know, chat is good, like I play video games with my friends and we chat online with voice, but you know, it's not the same as being there because you don't have that body language. You can't tell the frustration when you beat them up a little bit or, you know, get, you know, pummeled by them and they have the victory. That kind of human element to VR, I think once we really start networking things together and making more compelling multiplayer experiences, I think that's where it's really going to be truly revolutionary, right? People cooperating on a project, even if it's not games, you know? Imagine a bunch of designers or architects or something working in a shared space, right? The shared space and shared community is really where I see it taking off. It has a lot of potential, the same way that the internet kind of changed telecommuting. People can work from home now and we don't have to work in offices. I really think that the shared presence really opens up a lot of opportunity for things that the internet necessarily isn't great for, and our current collaboration tools aren't great for, but being there and feeling that presence with people, both on the social and professional level, I think that's really where it's gonna take off in the next few years.
[00:20:26.354] Kent Bye: Okay, great, well thank you. Thank you. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voicesofvr.