Nick Whiting is a lead engineer at Epic Games and has been a VR evangelist there. Nick had previously worked with Brendan Iribe and Nate Mitchell at Scaleform, and so they sent him an Oculus development kit to integrate into Unreal Engine 4. He worked on it in his spare time and eventually got the integration working.
Then Oculus brought them the HD prototype and they collaborated on creating a UE4 demo for E3 in June 2013, which helped Oculus beat out the Xbox One and PlayStation 4 for the Game Critics' Best Hardware award at E3. This was a turning point that helped legitimize VR at Epic Games.
Then founder Tim Sweeney and the director of engineering, Dan Vogel, saw the Valve VR demo room, which helped seal the deal for VR at Epic. Some people could see the potential beyond the DK1, but others needed to see something closer to a consumer-ready product. The Valve VR room demo was a clear turning point for the leadership at Epic.
Over the past couple of years, Nick has started to get more resources to make VR demos, including the Showdown demo that was the final demo scene in the series of Crescent Bay demos.
VR started as a side project at Epic, and now Nick says that it’s pretty huge there. Most recently Tim Sweeney said that he believes that virtual reality “is going to change the world.”
In this interview Nick talks about:
- How opening up UE4 to a subscription model at $19/month brought it to a wider variety of developers and VR experience creators.
- The process of integrating open source contributions from the community back into UE4
- The Public Trello board for the UE4 Roadmap, and how that plays into their release cadence
- Help from Oculus in integrating UE4 originally came from Andrew Reisse, and now Artem Bolgar has been the dedicated resource doing a lot of engineering work to get the Oculus SDK integrations working
- Epic’s approach to superior visual fidelity
- The possibility of SLI GPUs and need for more GPU power for VR
- How the Showdown demo was being run on the NVIDIA GeForce GTX 980 at 90Hz at the Crescent Bay demo resolution
- Experimenting with integrating with different hardware for VR input controls. The more integrations, the better
Nick also talks about some of the lessons learned from doing VR demos. He says that VR makes developers more honest because the tricks that work in 2D don’t work as well in 3D. Couch Knights was about creating a shared social space and it was more impactful than they expected.
Epic’s visual style has traditionally been more realistic and gritty, but they found that within VR people tend to make stronger emotional connections to abstract characters with a more stylized art style. Hyperreal rendering risks falling into the uncanny valley, while a low-poly scene tends to be easier for the mind to accept because there is more room for mental projection and less attention drawn to what’s not 100% correct.
Finally, one of the most powerful experiences in VR for Nick was a social VR experience where he felt presence with another person who had limb tracking enabled. He saw that humans being present with each other was really powerful and compelling, and that being present in a world that’s not our own has a lot of potential that he finds really exciting.
Theme music: “Fatality” by Tigoolio
This video shows what has since been referred to as the Showdown demo, which was the final scene in the Crescent Bay demo series.
Here’s Nick Whiting and Nick Donaldson from Epic Games discussing lessons from integrating the Oculus Rift into Unreal Engine 4 at Oculus Connect:
[00:00:05.412] Kent Bye: The Voices of VR Podcast.
[00:00:12.012] Nick Whiting: My name is Nick Whiting, and I'm the lead of the VR group at Epic. And VR was kind of a very scrappy kind of upstart thing. I had a personal relationship with a bunch of the guys that started Oculus, Nate and Brendan and those guys, because we worked with them at Scaleform before. So then when they said they were doing this crazy Oculus thing, they called me up. They were like, hey, can we send you a kit? Would you integrate it into UE4? And I'm like, yeah, you know, it sounds cool to me. I had my background in kind of biomedical engineering stuff, so it was kind of cool to kind of get into the human interface stuff again. And it really kind of tickled my fancy. After work, you know, just in the late hours I would, you know, slowly start to integrate it in and integrate it in, and, you know, we had a working integration at one point, and then Oculus came to us with the HD prototype and said, you know, do you want to show it at E3? We'd love to have some cool, sexy content to show. And we said, yeah, you know, we'd love to work with you guys. And so many late hours after I started integrating that as kind of a side project, it became a legit thing, and they launched it at E3, and that started to kind of build momentum. People were seeing it as legit, especially when they won the best hardware in show versus the Xbox and the PS4. Like, you know, it was suddenly legitimized. And then we had more traction internally in Epic after that, but the thing that really sold it was we had a few guys like Tim Sweeney, our founder, and Dan Vogel, the director of engineering, go to see the Valve VR demo room, and that's an amazing experience. It's the religious moment for a lot of people. They put it on and then all of a sudden they believe. And that was what really sealed the deal. And after that, I got another programmer, J.J. Hoesing, to join the team.
So we have two guys now and we got more resources from Nick Donaldson and Paul Motter to start making these demos that we did with Oculus over the past year and a half. And it's just kind of bloomed from there.
[00:01:48.400] Kent Bye: Interesting, and so you really pinpoint a turning point as the Valve room and sort of really convincing the higher-ups at Epic Games that this is a real thing that should be thought about and invested in.
[00:02:00.373] Nick Whiting: Yeah, absolutely. Because, I mean, there's kind of two types of people. You know, people that put on the DK1 and can see the potential and see past the technical limitations that were there. And then other people are still, you know, kind of unsure, you know, is the delta between what we have here and an actual consumer product, you know, too big to surmount? And then they kind of see the research that Michael Abrash and Atman Binstock and stuff are doing at Valve with the VR. And, you know, they kind of see the light. They can see the finish line consumer product and they have a good experience in it. And then all of a sudden they're like, okay, maybe that delta isn't as big as I thought it was. And they kind of turn into believers. That was really kind of one of the turning points for Epic with VR to convince people that it was a legit thing. They let me do it as a side project, starting it up, but it was when we really got people to believe in it that it really started gaining official traction. And now it's gigantic. It's huge for us. We love working with Oculus and they've been super supportive of us and we've got a bunch of other partners in the VR space and it's been going great. It's more work than we can keep up with sometimes.
[00:02:53.353] Kent Bye: And so, have you had a chance to experience both the Valve Room and the Crescent Bay series of demos? And maybe you could sort of give your sense of how the two compare.
[00:03:03.116] Nick Whiting: They're really starting to converge really quickly. I mean, the Crescent Bay is super impressive. The better optics they have on there, the screen, and the 90 Hertz refresh rate are really nearing what the Valve room is, you know, especially back when I saw it for the first time. The two are starting to converge, so I think it's an exciting time, because now the dream of the Valve VR room is kind of becoming reality on the Oculus side as a consumerized product, and that to me is really cool. There's not a huge delta anymore.
[00:03:27.077] Kent Bye: And so I imagine that when you first started to work on some of the integrations, that was before you had released the full source code of the Unreal Engine. And so maybe talk about that evolution in terms of when you started work on it, and then the changes that happened, and the excitement that came from the virtual reality community after that.
[00:03:44.920] Nick Whiting: Yeah, like you said, we started well before, leading up to E3. So it was May of last year that we'd started integrating the Oculus stuff, and GDC in March of this year is when we released the source code completely. And, you know, before that we had a limited number of licensees, right? I mean, you know, a bunch of triple-A people and a bunch of cool people in the industry using it, but you didn't have the volume of, you know, just kind of unleashing it upon anybody who wants to give it a try. So after we made the engine $19 a month, a lot of people downloaded it because it was a nice cheap way to get some really high quality visuals in VR, and a whole lot of people have been experimenting with it. And one of the coolest things is, you know, not only are people making cool kind of indie experimental demos that are really kind of pushing the bounds of what are cool experiences in VR, but a lot of companies that are doing like high-end visualizations and architectural visualizations and kind of other peripheral companies are starting to do integrations because they want to work with, you know, virtual reality and, you know, kind of touch on the periphery of controls and haptics and all these other things. So seeing that kind of, you know, opening the door to everybody has let all these people that wouldn't previously have, you know, used our engine or even known that it existed, now have the opportunity to integrate into Unreal Engine. And it's been really remarkable to see all the cool projects that come out of that.
[00:04:52.437] Kent Bye: And have you been able to expand the community of developers that are also contributing to the source code of the Unreal Engine and having some of those innovations be fed back in upstream into the core project of the engine?
[00:05:04.947] Nick Whiting: Absolutely. I mean, we read the forums almost every day. I don't get as much chance to reply to them personally, but we keep abreast of all the kind of community projects that are doing and what their pitfalls are. We've talked to a lot of the developers, the kind of small groups of guys that have been using Unreal and, you know, where are their pain points? What are they being really successful at? We kind of take that feedback and a lot of people have given us code contributions that fix bugs in the Oculus integration and stuff like that. So it's been really nice to kind of build that sense of community because, you know, the game industry is sometimes very secretive and very you know, kind of clandestine and you don't get a lot of that kind of sharing and cross-pollination. But now since, you know, our code is out there and anybody can see it, you know, it kind of lifts that burden off of you. You don't have to feel like a secret agent all the time protecting, you know, trade secrets. And, you know, people feel that kind of reciprocity, like, you know, you gave me this and I'll give you back some contributions to try to help you. So it's been overwhelming. It's hard to keep up because we get so many of them, but it's been really rewarding to kind of see that knowledge share go two ways now.
[00:05:58.573] Kent Bye: And the other thing I find really interesting is like a public Trello board that sort of lists out your roadmap of, you know, this is what we're working on next in conjunction with rapid iterations of frequent releases. And so maybe you could talk about that open process of development, but also the type of cadence that you're trying to hit when it comes to improving the engine continuously.
[00:06:18.008] Nick Whiting: Yeah, it's been a pretty dramatic shift from our production because, I mean, before we had the luxury of just, you know, kind of working on features and when they were done, they were done, and when we were ready to release them, we could release them, but the Trello has kind of kept us honest. You know, we try to broadcast about a three-month timeline out and commit to features that we're going to deliver to people. So, you know, we don't always make them, but as soon as we realize that we're not going to make them, we try to update it and be as transparent as possible. And the cool thing is people will see what we post on the Trello board and they'll be like, oh, I've been really looking forward to that, or, you know, I really wish we had this in conjunction. And that gives us ideas to feedback. So we've changed direction based on what people want and what people desire, based on the feedback from that Trello board. So it's been something of a blessing and a little bit of a stress meter. But I think it's overall positive. As we've adjusted to it over the past few months, it's gotten a lot easier. And we've gotten more into the rhythm of the public releasing of that sort of information. And it helps us plan, honestly, internally as well.
[00:07:10.889] Kent Bye: And it does seem like Unity does have a larger market share in terms of the number of developers and because of that, Oculus is putting a little bit more effort in terms of writing up their own sort of Unity integration. Have you found that the Oculus engineers are also jumping in and helping do the integration? Are they kind of leaving that all up to you and your team to make everything work with their latest HMDs?
[00:07:34.129] Nick Whiting: No, they've actually been super supportive right from the beginning. I mean, we originally worked with Andrew Reisse to get the original integration up and running for the E3, and then Artem Bolgar has been incredibly helpful making the integration. He's really taken it and just run with it. So, he does a lot of the engineering work on his side and we just integrate it back in and then, you know, modify it and try to make the HMD work with the Morpheus and all the other things we're trying to support. But they basically have a dedicated resource over there for UE4, and now you've seen the other demos they're showing here that were UE4 based as well, so they've invested heavily in UE4, and it's great to see. It's a lot of love both ways.
[00:08:07.782] Kent Bye: What is it about the Unreal Engine that has a different quality of visual fidelity?
[00:08:11.623] Nick Whiting: I think it's kind of the product of we have a really nice tool chain. It's a very powerful set of tools. We've been focused, especially since we've released it to the public, on making those more accessible and usable to people so that they can hit a high level of visual fidelity and quality with a lot less effort than they could before. It's a combination of that and we just have some super talented engineers at Epic. It's a fun place to work because everybody is brilliant. It's very humbling coming into a company like that. You feel outclassed, but it's amazing because you learn so much from them. I think just the quality of the people we have there really gives us an edge on the engineering side to make an excellent engine and empower people to make their own really cool content.
[00:08:48.352] Kent Bye: And in terms of graphics cards, you know, NVIDIA just came out with a new 980 series with, you know, some specific customizations for VR. Do you foresee any time in the future having any sort of like dual graphics card SLI, or do you feel like we're going to be on a single GPU for quite a while?
[00:09:04.655] Nick Whiting: Well, they were kind of talking about that a little bit in their Maxwell event. You know, I hope it comes to fruition. I'd love to see that because, I mean, VR is obviously a growing market and, you know, it's something that you really need to have a powerful GPU for. You know, for quite some time we've been able to run games at a high frame rate with, you know, older cards. But now we really have a need for more GPU power. So I'm hoping that kind of the popularity of VR will kind of encourage the graphics card makers and manufacturers to really focus on it as a legitimate thing. And they seem to have been. I mean, NVIDIA announcing All their VR support from Maxwell is an excellent first step in that.
[00:09:35.770] Kent Bye: And have you got a chance to use it at all? And what sort of benefits do you get from that?
[00:09:39.981] Nick Whiting: Yeah, we actually showed the Showdown demo last week at the Maxwell press event, so we were fortunate enough to have a little bit of early access to some of the cards and to run on those. And, you know, we were really shooting for high quality visual fidelity. We really tried to push it, and we really wanted to run to the ends of, you know, even what the 980 cards could do for us. So it was a fun experience just to try to push the boundaries a little bit. I mean, the demo running at 90Hz at the resolution we did, you know, was possible because of the new NVIDIA cards.
[00:10:06.045] Kent Bye: Oh, I see. So even here, you're running the latest cards in order to get to that. And so what do you see as the biggest open problems yet to solve within virtual reality with using the Unreal Engine?
[00:10:19.311] Nick Whiting: Using the Unreal Engine, what I really want to see is, you know, we've done a lot with the HMD stuff and I'd really like to work a lot with, you know, kind of experimenting with the controls and stuff, you know, to have motion control support, try to get as many integrations with as many of these little bits and pieces of the VR hardware community as we can so that, you know, people can piece together the experiences that they think are cool and they think are compelling. and support as broad a spectrum of hardware as we can. And we try to do as much as possible, but we're only so many people, so we can only support as many. So it's nice, because we have a lot of really good relationships with a lot of the hardware companies out there, and we get to work with them pretty directly to support as much as possible. So that's one thing I really want to see, is just kind of the cornucopia of VR hardware that's out there right now. The more integrations, as far as I'm concerned, the better. And we're happy to help people. If they have an interface that they need that isn't hooked at the engine level, they just email me, and we put it in if it's a good thing to do.
[00:11:09.739] Kent Bye: And so you've done the Couch Knights and the Showdown and potentially other demos with Oculus. And what were some of the biggest lessons learned from this experience that you're going to take back into doing future virtual reality experiences there with Epic Games?
[00:11:22.223] Nick Whiting: There are a lot of lessons learned. Like, yeah, one of the great quotes I think coming out of the conference is that, you know, VR makes game developers into honest people, because, you know, we can't do as many tricks with billboarding and, you know, normal maps and things that we're kind of used to and we've kind of started crutching on. So we have to kind of go back and rethink how do we get more visual fidelity in the actual, you know, assets that we're creating and how do we keep those running at a high level. And we've kind of done that a little bit with the Showdown demo, you know, how can we do something that's very cool to look at and whatnot, but still working within the constraints of the demo. So a lot of the things that we fake, like fake shadows and, you know, the particle systems and stuff like that, using those to kind of build upon for the next demos is really going to be the most valuable lesson. Plus, there's just so many crazy design considerations, even from a gameplay standpoint. Couch Knights was a really cool experiment of having two people in the same shared virtual space, and just via the fact that we were kind of sending body motion and body language from one person to another person and that you could actually pick up on those cues was a huge lesson to us. We didn't think that it was going to be as impactful as it was, but you can really feel that that avatar is inhabited by a real living human being on the other side. And it's incredible. But on the same scope, we found that the little knight girl that was more kind of cartoony and looked like a toy, people really bonded with her because she was a little more abstract. And, you know, we got a lot of bang for the buck by just making her smile or making her do a little hop and a dance. And just adding these little lifelike animations to her really helped people kind of develop that emotional bond.
So Epic's historically had a very, you know, kind of realistic and gritty focus. But now we've kind of started to get a little more into the stylized stuff with, you know, Fortnite, our game coming out, and with the Couch Knights demo. I feel that the VR experiences that we do going forward, you know, might not necessarily be super realistic. They might play on that boundary of, you know, can we take something that's slightly more abstract and imbue it with a sense of, you know, animation and life and really get people to bond with it emotionally.
[00:13:07.257] Kent Bye: Yeah, that was one of the more striking things about going through the Crescent Bay demos, the really low-poly world scene with a couple of animals, where there did seem to be a greater sense of vitality than in something super realistic. So why do you think it is that you get that different feel from something that's a little bit more stylized versus something that's photorealistic and hyperreal?
[00:13:27.979] Nick Whiting: I think it's kind of the classic problem of the uncanny valley. Once you start making something that's so realistic, when you're sharing a virtual space with it, you know, when you're looking at a game and you see a character that's maybe not, you know, perfect quality, you can kind of forgive it because you're looking at a screen and you're projecting yourself into that screen. But when you're actually present in the experience itself, you become a lot more critical. You start looking for the minutia that lets you know that that's a human, and, you know, that they're not faking me out, that it's an actual living, breathing being. But when you have something that's completely abstract, you kind of hold it to a lesser standard and you start to project more onto it. It's kind of like, you know, the book versus movie debate. A lot of people enjoy the book better because they kind of project their own mental images on top of it. Even though everybody's reading the same book, you kind of get a different experience based on, you know, what your mood is at the time. And I think that's kind of the same thing that's going on there. There's a really cool experiment on the internet. It was a group of researchers at the University of Calgary, and they took a 2x4, basically, and hooked it up to a little actuated motor that started waving around, basically just moving in a rotational dimension. And they put a bunch of people in the room with it, and they said, just interact with it. And almost every single one of the people that came out had some sort of emotional connection to it. They started playing with it. Some people were scared of it. Some people thought it was playful or helpless or like a little child, and they tried to teach things to it. So everybody kind of bonds with it differently.
But the fact that humans naturally are able to project onto less lifelike creatures these kind of qualities that we look for in each other is really an amazing thing. I think we can really leverage that in VR to a significant level.
[00:14:51.053] Kent Bye: Is there anything in VR that really sticks out for you that you've experienced that you know is like something that you feel like is going to be really important for the future of the field?
[00:14:59.615] Nick Whiting: I think one of the coolest moments, like I said, was in Couch Knights when we had the two avatars hooked up for the first time. I was with Nick Donaldson. He was in North Carolina and I was in Seattle. We each had one of the DK2 prototypes. And we were working on the actual characters that you were controlling, the game portion of it. And I didn't even realize that the IK animation for the avatars was hooked up. And I kind of saw him out of the corner of my eye and I noticed his avatar was moving. I'm like, that's really weird. And so I looked up at him and he looked up over at me and kind of gave me the like, what's up bro nod. And it was like a magical transformative moment, even though it was something so stupid and so cheesy. It was the first time that I realized that, you know, I was sharing this space with a real other living, breathing human. And something so simple being so powerful to me is super compelling. So I can't wait till we get more of the kind of multi-user virtual shared environments. I mean, I don't know where we're going to go with it, but the fact that we can replicate humans being present with each other in the network space, I mean, that's sci-fi, right? It's amazing, and it's really cool.
[00:15:52.688] Kent Bye: And so what do you see as the ultimate potential of virtual reality then?
[00:15:57.133] Nick Whiting: I mean, just being able to be present in a world that's not like our own. I mean, we can recreate, you know, realistic scenes and stuff like that, but to me the really fun thing is, like, I want to go into a cartoon and see what it's like to live in something that's toon shaded everywhere. I want to go to a non-realistic environment. I want to have something that's surreal and just a little bit south of reality so that you can experience something that you can't experience in real life. Because if virtual reality crests with just trying to replicate reality, that's kind of boring to me. I'd really like to see us push what's possible and become a little more abstract and kind of poke at the boundaries of what's cool and what's surreal and those sorts of things. To me, I'd really love to see demos that kind of go in that direction.
[00:16:34.598] Kent Bye: OK, great. Well, thank you so much.
[00:16:36.059] Nick Whiting: Yeah, no problem. Happy to talk to you.