Valve premiered a prototype of a new type of VR input controller at Steam Dev Days in order to get some preliminary feedback from developers. They’ve created a capacitive-touch controller that is attached to your hand so that you can open and close your hands to mimic the feeling of grabbing a tangible object. They used a modified scene from The Gallery, Episode 1 demo to show off this new controller, and I had a chance to talk with Cloudhead Games President & Creative Director Denny Unger about it at the VR on the Lot conference. We talked about Valve’s new input controller prototype, the growing ecosystem of lighthouse-tracked peripherals, his thoughts on the future of non-linear narrative, and an update about The Gallery, which recently won best narrative VR experience at the Proto Awards and has surpassed $1 million in sales.
LISTEN TO THE VOICES OF VR PODCAST
https://twitter.com/shawncwhiting/status/786350329356886016
Nice hands-on demo video of the capacitive touch features of the new SteamVR controllers by Valve's Jeremy Selan. https://t.co/6Wd2Jc6unk
— Kent Bye (Voices of VR) (@kentbye) October 13, 2016
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. So October has been a huge month for virtual reality. Starting with October 4th, we had the launch of Google's Daydream View. Then October 5th to 7th was the Oculus Connect 3. And then October 13th was the launch of the PlayStation VR. And then October 12th to 13th was Steam Dev Days happening in Seattle, which was Valve's developer conference. And at Steam Dev Days, Valve announced a new prototype that they were showing to different developers to get feedback. And so they're doing a completely different approach to input controllers, which allows you to essentially open up your hand and because it's kind of attached to your hand like a glove, you could grab objects and it feels like you're actually grabbing and throwing objects. And so Denny Unger of Cloudhead Games was a part of helping create the demo that was shown to all the Steam Dev Days attendees, taking one of their scenes from the Gallery Episode 1 and just taking a lot of objects and allowing people to interact with them. And so at VR on the Lot here in Los Angeles, I ran into Denny at the Upload VR party and had a chance to pull him aside and talk about both the new prototype controllers that Valve has been working on and showing for the first time at Steam Dev Days, as well as Gallery Episode 1, Call of the Starseed, which just recently won Best Narrative VR Experience at the Proto Awards. So we'll be talking about the controller as well as the future of nonlinear storytelling on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by Fishbowl VR. Fishbowl VR provides on-demand user testing for your VR experience. They have hundreds of VR playtesters who record their candid reactions with a turnaround time as fast as 24 hours. You can solve arguments, discover weaknesses, and get new gameplay ideas. 
User testing is a vital part of the development cycle, and Fishbowl VR takes care of all the logistics so you can just focus on the creative process. So start getting feedback today at fishbowlvr.com. So this interview with Denny happened at the VR on the Lot conference that was happening on October 13th and 14th in Los Angeles. So with that, let's go ahead and dive right in.
[00:02:36.546] Denny Unger: I'm Denny Unger. I'm the CEO and Creative Director at Cloudhead Games. We built The Gallery: Call of the Starseed, which is a four-part episodic VR game. It's bundled with the Vive. You probably know us as the company that invented blink locomotion and VR comfort mode and gestural input things. That's kind of our...
[00:02:57.517] Kent Bye: Great. Well, I want to dive into The Gallery here in a bit, but I want to first address some news that was announced yesterday there at Steam Dev Days, which was basically some new tracking solutions, some prototypes that Valve has been working on. So maybe you could talk a bit about this new controller and what's different about it.
[00:03:14.364] Denny Unger: Yeah. It was a really neat opportunity. Like, Valve contacted us literally just a few days before Steam Dev Days. They're like, you know, we have this new thing we're really excited about. We weren't sure if we could show it. Can you put together a demo so we can expose what this is to the public? And so we just kind of cordoned off a little area from the first episode and just shoved a bunch of interactable objects in there so people could play with things, you know, just in a really natural way. Got it done in time, thankfully, and then went to the show, got to try it on for the first time, and it's really a game changer. It's an interesting input method because it's like a hybrid between, like, Leap Motion and having really solid, stable, tangible controllers, right? And it takes, like, a minute for your brain to realize, you know, for so many years we've been having this death grip on controllers like a 360 pad or whatever. And suddenly having the freedom to open your hands is like a weird thing cognitively. You know there's a controller floating there, but you don't have to hold it. But once your brain gets that, it's this really unique sense of freedom. And putting it on, I was actually quite surprised at how comfortable it was. It was obvious that they had spent a lot of time on ergonomics. The controller itself is quite small. It fits in the palm of your hand. It's got sort of the one big touch pad and two other buttons to interact with. And it's got this really nice padding on the back of the hand. So it just kind of fits like a soft mitten, you know? It feels really comfortable on your hand. So much so that you kind of forget it's there after a while of using it. It's capacitive, but there is some granularity to it. So, you know, it'll find sort of the mid state for opening and closing fingers. And there's some other stuff I probably can't talk about that I know that they're working towards.
This is the first prototype, and for me, it was like the ultimate VR controller. It's the thing you want. You want a natural interface as much as possible so that you don't have to over-explain input to people. You don't have to hand them the controller and say, A, B does this, or the grip trigger does this, or the trigger trigger does this. You just put it on and you get it. It's an immediate point of feedback.
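The open/close sensing described above, with its analog mid states, maps naturally onto a small hysteresis filter: treat the capacitive grip reading as a 0–1 signal and only fire grab/release events when it crosses two separated thresholds, so a hand hovering near a single threshold doesn't cause flicker. A minimal sketch in Python; the class name, thresholds, and value range are illustrative assumptions, not Valve's actual API:

```python
class GrabDetector:
    """Debounce an analog grip value (0.0 = open hand, 1.0 = closed fist)
    into discrete grab/release events using hysteresis, so the mid states
    a capacitive sensor reports don't cause flicker at a single threshold."""

    def __init__(self, grab_threshold=0.7, release_threshold=0.3):
        self.grab_threshold = grab_threshold
        self.release_threshold = release_threshold
        self.holding = False

    def update(self, grip_value):
        """Feed one sensor sample; returns 'grab', 'release', or None."""
        if not self.holding and grip_value >= self.grab_threshold:
            self.holding = True
            return "grab"
        if self.holding and grip_value <= self.release_threshold:
            self.holding = False
            return "release"
        return None
```

With the two thresholds apart, a half-closed hand (say 0.5) keeps whatever state it was in, which is what makes "open your hand to let go" feel stable rather than twitchy.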
[00:05:19.971] Kent Bye: Yeah, I know that I was just watching some videos of people trying it on, and they were throwing objects and they were like, oh my god, I can, like, pick up an object and throw it. And so when you say it's a little bit like Leap Motion, it's not necessarily doing optical tracking, but I think what you mean by that is that you can open your hand and close it, and that you have the haptic feedback of actually feeling this thing. So basically it's kind of like this thing that's strapped to your arm. You have tracking on the back of your hand, and then it goes from the back of your hand, like over your thumb, kind of like a little harness, and then you have this little pod-like thing that fits in your hand when you close it. So you open your hand and you let go of it. When you close your hand, it sounds like it does the capacitive touch to detect that you've closed your hand. And then you've got a few triggers. And so I just saw people juggling and throwing, and people's minds were pretty blown. I wish I could have been there to actually try it out myself. The Lighthouse controllers, they felt like a wand. They don't feel like something that's intuitive to hold. The Oculus Touch controllers actually fit into your hands a lot better, and it just feels more comfortable. They kind of disappear and melt away in your hands. It sounds like this is the next iteration, and it's a little bit more comfortable in the hands, but also this completely new interaction design where you're just able to open up your hands to let go of objects, and you're able to just grab things. And when you grab things, you actually get the haptic feedback of grabbing things.
[00:06:39.513] Denny Unger: That's the curious thing, is that the Touch ergonomics are great. You know, it's a comfortable controller to hold. But what's really interesting about the Valve design is that you're not holding something, and you've got that really weird, tangible feedback point where when you grip your hands, all of a sudden you're holding something. You know what I mean? Like, so reaching out for a gun handle, you reach out and then you close your fist and you're actually grabbing onto something. It sounds stupid to explain, but just having that ability in VR is kind of a game changer in a weird way.
[00:07:09.554] Kent Bye: Well, I think that it's probably working at the primitive level of our brains, you know. Like, when we grab something, we kind of know what that haptic feedback is, and it's a little bit of a passive haptic feedback. Given the visual feedback, our visual field dominates, so that with all these different things that you're grabbing, whether it's a pot or a pan or a bat or whatever, even though we're not feeling the weight of it, we're still feeling this passive haptic feedback. I presume that when you're handling different objects, our brains are kind of filling in the gaps and saying, oh yeah, that kind of makes sense of what this might feel like. Was that kind of true with touching different objects? It just felt like you were holding each object?
[00:07:45.257] Denny Unger: Exactly. That's all it takes, is that little bit of tangibility between your hands, and it suddenly feels like you're touching real things, right? The other really cool thing I think that's kind of been overlooked so far is that once your hands are free like that, you can put on the headset. For one thing, you feel really badass, because they look cool. The controller itself is kind of a neat design. It's still a prototype, but it's neat. And the greatest thing is that you can actually physically grab the headset, put it on, while you have controllers in your hands, because your fingers are free to interact with tangible things in the real world. So you can go to your keyboard, you can type things, you can put the headset on, all without ever taking the controllers out of your hands. So that's pretty cool.
[00:08:25.642] Kent Bye: It's always been a little awkward coming out of a VR experience of having the wristbands on. I guess you don't really need wristbands anymore because it's strapped to your hand. The other announcement that I haven't really talked about on the podcast yet, but it feels like there's a number of different partners that have opened up and licensed the Lighthouse technology as well. Just from what I hear, it sounds like around CES is probably when a lot of these first peripherals will be starting to hit the market. But to me, being at Steam Dev Days, it sounds like there was a big initiative to saying, hey, we're really trying to create an open ecosystem here with the technology that's out there. We want lots of people creating this and kind of making the lighthouse technology like the Wi-Fi of VR and just allowing all these other different peripherals and stuff to be out there. To me, That's some of the most exciting things about being able to potentially even track more points in the body. So you could do more sophisticated inverse kinematics, or being able to get your feet in the game, or get all these new different peripherals, and all sorts of different things that people can start to do within VR.
[00:09:26.334] Denny Unger: So they brought up an interesting point about the advantage of Lighthouse is that you can track real-world objects. So no matter how sophisticated camera-based tracking gets, it's still going to take a number of years until they get to the point where they can fully track your body and things in the room that you can interact with and move around. Lighthouse, you can kind of stick to anything and make it a tangible thing in the game world, right? They're also saying that something like over 300 companies have already applied for the program, and some of them are quite notable. So yeah, around CES, you're going to start seeing things. Oh, and they also brought up privacy as an interesting kind of wrinkle in what Lighthouse brings compared to a camera-based tracking solution. You know, it is an inherently private system. It's using lasers, not a camera, right?
[00:10:12.704] Kent Bye: Yeah. Oh, that's a dimension I never even thought of. So what you're saying is that the Facebook and Oculus solution, you're going to be having cameras that could potentially be recording more information other than just lasers. That is really interesting. I never even thought about that.
[00:10:28.938] Denny Unger: Yeah, there's a huge potential for a huge invasion of privacy there, I think, yeah.
[00:10:33.602] Kent Bye: Wow, okay, well, that's duly noted. Well, just one quick question about the actual trackers for Steam. I can imagine, I want to have more points in my body tracked. I want to have my full body, because I think that that'll invoke the virtual body ownership illusion, and it's going to take the sense of presence to the next level, because I don't think that you can really do inverse kinematics justice with just those two controllers. You actually need more points. My understanding with the Lighthouse is that it has to somehow transmit that data back through the Bluetooth connection into the headset back into the computer. Is there a limit in terms of how many sensors you can have, or do each of these sensors need to be powered by batteries? Is it even going to be possible to have these little dime-sized sensors, or is there going to be a minimum? I'm just curious to hear a little bit more about that.
[00:11:22.847] Denny Unger: I don't know what the sensor limit is, but I know it's enough that you could conceivably track feet and elbows in the loop of the system. Sorry, I've been petitioning Valve for foot tracking. You know, as much as I want elbows for proper IK, I think feet open up a really wide range of gameplay possibilities and just another method for grounding you, right, in the environment. I don't know if they'll do that, but that's what I want. When they showed the new controller, it's interesting because they have a wrist strap that keeps the controller from flying away if you really huck something, right? And it seemed to me like it's not that far away from adding an option to have another strap that goes up to the elbow, right? With one more sensor. And I'm hoping many people told them that. They were kind of fielding for advice and suggestions during that entire conference. To me, that would be amazing. Because once you have proper elbow tracking, then you can introduce upper body avatars, and now we're getting somewhere, right?
[00:12:20.354] Kent Bye: Yeah, absolutely. Valve, if you're listening to this, add elbows. I think it's going to make a huge difference with being able to actually feel it, because right now you can only put into the game what you can for sure track, and that's just hands. And so you have this kind of floating hand model, and yeah, having the head with the hands and elbows I think is just going to take the level of presence to the next level.
[00:12:41.465] Denny Unger: A funny story about that was, you know, when we started with The Gallery, we had full body persistence, and that was with the Razer Hydra, right? But it was using an IK system, it was before room scale was a thing, it was like seated joystick movement, you know, all those bad things. And it kind of worked, because that's what we expected back then. And as the technology got better, and the tracking got better, we started losing parts of the body that we had in our game. So, originally, when we got the prototype hardware from Valve, we lopped off the legs, because we're like, oh, we can't track feet. Or if we put an IK system in there and we just have an animation for the feet, it's going to look stupid and it's going to feel weird. So let's get rid of the legs, and we'll just have the upper body and the arms. So we got rid of the legs, and then our elbows felt funny. And we had to come up with a system to detect a lean versus a dip, right? So all these weird IK systems with three-point tracking. And we got close, but we couldn't get it perfect, so we took away the body, and then we just had arms. And so we worked on a number of systems that stacked on top of IK to try to predictively understand where your elbows might be based on wrist orientation and everything else. But ultimately it's impossible. So we got rid of the arms, and then we were left with hands, right? So now I'm hoping that the industry pushes back the other way and gives us better sensing, and it might come from Oculus. They might do full camera tracking of the body as an avatar, and that would be amazing. But I think in the short term, the easiest way to do it would be with these little nodal dongle things that are part of the Lighthouse system. At least for elbows.
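The "lean versus a dip" problem comes from having only three tracked points: with just the headset's pose, a crouch (dip) and a forward lean can put the head in similar places, so you have to classify the motion before running IK on the torso. A toy heuristic under that framing, comparing horizontal displacement against vertical drop from a calibrated standing pose; the function and threshold names are hypothetical, not Cloudhead's actual system:

```python
import math

def classify_head_motion(dx, dy, dz, tilt_threshold_deg=30.0):
    """Given the head's displacement from its calibrated standing pose
    (x/z horizontal, y vertical, in metres), guess whether the user leaned
    (mostly horizontal motion) or dipped/crouched (mostly a vertical drop).
    Purely illustrative three-point-tracking heuristic."""
    horizontal = math.hypot(dx, dz)
    drop = max(0.0, -dy)  # only downward motion counts toward a dip
    if horizontal < 1e-6 and drop < 1e-6:
        return "idle"
    # Angle of the motion vector measured from vertical: small angle => dip.
    angle = math.degrees(math.atan2(horizontal, drop))
    return "dip" if angle < tilt_threshold_deg else "lean"
```

In practice an estimator like this is ambiguous exactly as Denny describes, which is why the studio eventually dropped the inferred body rather than ship elbows that "feel funny."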
[00:14:13.578] Kent Bye: Yeah, it seems like with the Lighthouse sensors mounted up high, you have a higher likelihood of tracking the feet than with other tracking solutions.
[00:14:23.127] Denny Unger: Yeah, feet for sure. It's what we all want to happen, but baby steps I guess.
[00:14:29.312] Kent Bye: Well, so let's talk about The Gallery a little bit more, because this is something that is a narrative experience, and it seems like you've hit some milestones. One, you won an award at the Proto Awards for Best Narrative, and also, kind of in response to Raw Data mentioning that they had reached a million dollars, you also came out and said, oh, well, actually, we've also reached a million dollars. So maybe you could talk about the release and how it's been going so far.
[00:14:51.730] Denny Unger: Yeah, the Survios thing. It was a weird, weird situation. We didn't really want to talk about the money, because, you know, a million dollars is kind of a drop in the bucket with respect to the video game industry in general. But it's an important marker for VR and the idea that you're actually generating meaningful revenue. And we spent more than that building The Gallery, and we continue to do so, but there is revenue to be made. So we just kind of had to say it. It's a positive thing to say. So we jumped on the bandwagon and said, yeah, well, we kind of did that, too. Yeah, and we recently received an award for Best Narrative Experience at the Proto Awards. We were up against some really tough competition, quite honestly. I was quite flabbergasted that we won. You know, there was The Martian, and some stuff Google had done, and a few other titles. And they were all excellent, amazing titles. So that's a pretty big honor.
[00:15:43.372] Kent Bye: For you, how do you think about, like, narrative in VR in terms of, like, you know, do you really focus on, like, the environmental storytelling? And, you know, what are some of the things that you've learned in terms of, like, how do you tell a story but also allow people to give a highly dynamic interactive experience within an immersive experience?
[00:16:01.838] Denny Unger: So, like, right now we're at VR on the Lot in LA, and I spoke today on a panel about, you know, VR game production and what that means to Hollywood and blah blah blah. It's a really interesting question, because I think Hollywood is doing this thing right now where it's all 360 video, and I feel like that's a half step to what it really will be in a pretty short period of time. I'm a little worried that Hollywood's going to invest tons of capital into 360 video experiences and not generate the kind of traction they really need to make that format work. So, I don't know, I like the scenario where you're using, like, light field 360 captures of actors, and you're doing photogrammetry to capture an environment. Like, I want to be on the wall with Legolas, and he's shooting a bow and arrow into the orcs. And I want to sit there with my wife and my kids, and they're sharing that experience with me. And if they get bored, they can get up, and they can grab the bow, and they can start shooting the orcs, too. But it has no impact on the progression of the narrative. And me and my wife can be super lazy and just sit there and watch the action play out and go to the next scene. Like, I think Hollywood has to start wrapping their heads around that new kind of paradigm in telling a story where you have this light, non-consequential interaction as an option. But give the player freedom to move around. Sorry, the observer. Give them the agency to be the camera if they want. Let go of strictly controlling the frame. Just forget it. This is a new medium. Start experimenting with what works now and push that forward.
[00:17:33.406] Kent Bye: Yeah, I think the challenge there is, a lot of times when you are embodied within an experience, you kind of have this moment where you're trying to push the limits and really test to see how much real agency you actually have, you know? It's like, is it kind of an afterthought that you're there with a body and you're able to then interact with different scenes? A lot of times, when you start to be immersed within an experience and you realize, oh, well, I can't really do anything in here, it kind of breaks the plausibility of it all, just like you're not able to interact with the characters or interrupt them, or be able to engage and interact with all the objects. So it seems like it's really a choice that I see: either you're really going to go all in and make it so you can interact with anything at any moment at any time, even interrupt the characters, or you're going to be kind of a passive ghost and you're just there observing. So I feel like it's actually a tricky balance, because when you do give them an opportunity to do a little thing, then do they want that to make a difference at all? So I think it's kind of a tricky balance that people have to make.
[00:18:32.095] Denny Unger: Yeah, but I think that's where it's going to fracture. It'll be like these willful engagements with interactive narrative, where you're actually the main character and you're participating in that way. But I think the general public will understand that there's two kind of different implementations of that: one being where you're just kind of voyeuristically watching the action unfold, and one where you're actually engaging with the content. And I don't know, like, I'm just spitballing, but I think that those are the kind of verbs we have to start understanding with the medium. We don't know what that is yet, but I haven't seen Hollywood really dedicate a ton of time into figuring it out yet. They're so obsessed with 360 video, thinking that that's the future, yet we're in a YouTube generation where everybody consumes content and they expect everything for nothing, essentially. And I don't think there's a model there that really works for that type of content yet. Maybe there is. Someone will prove me wrong, but I think mobile VR, just in general, is quite a few steps back from kind of the top-tier VR experiences. There's no positional tracking, the FOV is not as wide, the hardware itself doesn't generate the same type of experience. So my worry is that if that's the entry-level experience for a lot of users using mobile, they'll look at it as an interesting novelty and then just kind of walk away. You know what I mean? Because it doesn't push the edges like room scale does, for example.
[00:19:54.433] Kent Bye: Yeah, to me, I feel like the future of interactive drama within a VR experience is going to have all of these artificially intelligent non-player characters where you're able to talk to them with natural language input, and then they're going to be able to talk back. So I've actually kind of spun up a whole other podcast, Voices of AI. I've yet to launch it at this point. I just got back from Artificial Intelligence for Interactive Digital Entertainment, so that's essentially, like, all of the creative AI people. One of the people that I had a chance to sit down with was Michael Mateas of Façade, who, with Andrew Stern back in 2005, created this experience where you're able to have these natural language conversations with different characters, and from those, you're able to kind of set off into, like, a specific set of, like, five different outcomes. And in being able to sit down with him and talk to him, he was, like, saying, well, there's two ends of the spectrum. One is, like, authorship. And I think that right now, all of Hollywood and storytelling has been completely obsessed with controlling every dimension of that authorship. And the other extreme is, like, emergent, kind of, like, bottom-up behavior that is in the moment, and, like, you have to kind of think of it as building trust and rapport with a character, which means that you have to slowly interact with somebody and build trust with somebody, rather than just have that other extreme of the authorship, the narrative spat out at you. And so, to me, finding that balance point between authorship as well as, like, that emergent kind of high-agency type of highly dynamic interactive environment, that to me, I think that's where VR is going in the future. And it's, like, moving away from the obsessive control of the frame, as well as what people are looking at, as well as the story.
Letting go of that story and only having one potential outcome of it.
[00:21:43.061] Denny Unger: Yeah, so I think you hit it on the head, but I think we have to, like, slowly kind of push Hollywood into first baby steps into VR and thinking about letting go of some of that control. And once we've done that, then it moves into, like, David Cronenberg eXistenZ territory, where suddenly you've got, like you said, interactive AI sort of pushing a story forward, and you're really existing within a simulated reality in the truest, purest sense, but there's some narrative force that's always driving it, right? And kind of fracturing the different choices. Yeah, I mean, that is the ultimate goal, right?
[00:22:19.103] Kent Bye: Yeah, I think so, and I think that that's the strength of the medium, and I think that's part of the reason why The Gallery is winning these narrative awards, rather than something on the other extreme. Pearl is an amazing experience, but it is completely authored; you have no agency within it, right? And I think there's a role for that, in terms of the four different types of stories you could do in VR; that's one where you're a passive ghost. But having an environment where you actually have an embodiment, you're a character, you're able to move around and have some direct experience of exerting your will and agency into the experience, I think that's the future of where things are going. But like I said, culturally, it feels like you're kind of coming from the games industry, which has been on that other side of it, which is, like, complete agency. You know, for you, like, how do you think about character development or these other story narrative elements? In talking to Rand Miller, he's talking about, like, having kind of non-linear story across the whole world of Obduction, where you can go anywhere in the world and kind of have the story revealed to you. But, yeah, for you, I think it may be a little bit more of a linearized path through that, rather than, sort of, like, a completely open-world version, but kind of blending environmental storytelling. And how do you kind of balance all of that, of combining that agency with that narrative in The Gallery?
[00:23:36.257] Denny Unger: Well, with Episode 2, we actually wanted to break that mold because... So we have these, like in Episode 1, we have these moments where we've kind of ideally gated the player to accept a narrative beat, like there's some information coming your way. In a couple cases, we actually lock the blink volume so you can't move away from the action. You can walk away, but we'll just teleport you right back to where you were. But that feels a bit like Twilight Zone, where you can never escape the Hell House, or whatever it is, right? So, in Episode 2, we've come up with a narrative system that lets the player move freely anywhere they want, and it never matters. I won't tell you what that is yet, but I think it's a good half-step to something better in the future. And it's a non-linear experience, Episode 2. It's more circular. So you're traveling to different kind of nodes and several environments and trying to progress the narrative that way. So we're taking a lot of feedback we got from EP1, obviously, and trying to come up with new ways to tell the story, but still feel like, you know, you have engagement with the main characters and it's visceral and it's progressing. But this is just, it's so primitive. We're just in the baby step days of building narrative VR experiences, and this will be our next kind of experiment with what that means and what we can do with it.
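The locked blink volume Denny describes in Episode 1 amounts to a simple containment check: during a narrative beat, if the player physically walks outside the allowed region, snap them back to an anchor point. A toy 2D sketch under that assumption (the function and parameter names are illustrative, not Cloudhead's code):

```python
def enforce_blink_volume(player_pos, volume_center, volume_radius, anchor_pos):
    """If the player walks outside the locked blink volume during a
    narrative beat, teleport them back to the anchor point; otherwise
    leave them where they are. Positions are (x, z) tuples in metres."""
    dx = player_pos[0] - volume_center[0]
    dz = player_pos[1] - volume_center[1]
    if dx * dx + dz * dz > volume_radius * volume_radius:
        return anchor_pos  # snapped back, Twilight Zone style
    return player_pos
```

The comparison uses squared distance to avoid a square root per frame; the "teleport right back" behavior is exactly the escape-proof gating Denny says Episode 2 is replacing with a freer narrative system.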
[00:24:50.913] Kent Bye: Have you had a chance to try out the Rick and Morty simulator yet at all? No. Okay. That's, to me, like adding the Job Simulator engine on top of a narrative, where you kind of have the ability to interrupt the characters as they're speaking, and maybe sometimes there's a reaction when you pick up something. Having that super extreme of, like, you know, maybe there's an arc that happens in that five-minute demo that I saw, but they're thinking about, like, if you threw a shoe at this character's face, you know, what would be the reaction? And I think that accounting for all of the many different ways that you could interrupt a character, you know, they had to build an entire interrupt system in order to do that. And so it sounds like you're kind of developing your own sort of, like, interrupt system, which is trying to maintain that plausibility of, like, allowing the player to express their agency and their will within the experience, but yet still present the narrative information, and finding a way to kind of coax them into really wanting to receive it.
[00:25:46.447] Denny Unger: Yeah, we've come up with a very interesting abstraction of how you give the player agency and freedom to move around a scene, even though there's a narrative beat happening, even though there's other characters moving around you and delivering dialogue. But you'll just have to see it when it comes out.
[00:26:03.527] Kent Bye: Have you been doing any other kind of locomotion stuff? I know that back at the Silicon Valley Virtual Reality Conference in 2014, you were talking about VR comfort mode, which I think has really taken off in a lot of different ways. And at PAX West 2015, the Blink system was the first time I had a chance to talk to Joel about that. It seems like locomotion and moving around VR space is something that you're continually trying to refine and work out, maintaining this balance between a sense of physical presence that you're in the world, because there's a trade-off: whenever you teleport, you can break that presence in a certain way, but you have a limited amount of space that you want to allow people to explore. And so you've been really innovating with these different blink teleportation methods. So what's kind of the latest iteration of an experiment, or maybe the open problems that you're still trying to solve with locomotion in VR?
[00:26:50.408] Denny Unger: I would call locomotion in VR my pet project at the studio, and I'm always obsessing that there's always got to be a better way. Teleportation works, and yes, you don't get sick, and it's been widely adopted as a great method for reducing vection and all those problems. But I also recognize that there are VR diehards out there like me that still want a different system. They want that half step between teleportation and stick movement. And so we are working on something right now. There's no name for it; we'll come up with some smart, stupid name that people make fun of us for. And we plan on sharing that with the public, just like we did with Blink and VR Comfort Mode, in the exact same way. It'll be closer to launch. But we do have a method of traversing an environment without ever teleporting, and it reduces vection very effectively. And it feels natural, as natural as we could possibly make it. The thing about it is that it uses techniques that everyone has seen. It's kind of like the culmination of all of the experiments everybody has done, put into a singular system, and it's just about tweaking all of the values that engage within that system. I'm not making myself clear because I can't say what it is right now, but it feels really good and we're excited to reveal it.
[00:28:06.643] Kent Bye: When is it planning to be released then?
[00:28:09.004] Denny Unger: So episode two is coming out near the end of the first quarter next year. And so I hope that we can show what we've done for locomotion a month or two prior to that. Yeah.
[00:28:19.083] Kent Bye: Okay, great. I look forward to it. I know that Eagle Flight has been doing some things with reducing the peripheral view, and there have been a lot of different experiments out there. So if there's anybody that's been going around YouTube looking at all the different VR locomotion techniques, it sounds like you've been trying to ingest all of it, synthesize it, and keep innovating.
[00:28:37.912] Denny Unger: So a year and a half ago in the studio, right before we committed to Blink, we started experimenting with what we called Vection Portals. And it was exactly what Eagle Flight eventually showed. But it's a totally smart technique. And yeah, it's cool.
[00:28:53.845] Kent Bye: Well, we'll leave it at that. So what's next for you and Cloudhead Games?
[00:28:59.430] Denny Unger: So we're promising four episodes, but like everybody, we're just kind of watching as the industry goes on, and I have nothing but positive vibes about where it's going to go. So we're continuing with that journey. We're also working on two other confidential projects that I'm really excited to talk about, and I'm hoping that over the next four months we can say what they are. One is with a major media conglomerate and a cherished IP. That could all fall apart, who knows, but so far so good. And the other is something that democratizes VR creation, I guess, is a really simplistic way of saying it. So we're excited about both things, and we're just trying to scale up to meet those projects.
[00:29:44.903] Kent Bye: The thing that's really striking to me about the process of these four episodes is that it's going to be a little bit like a written documentation of the most cutting-edge VR design techniques at each moment. And I can imagine a time when you release the fourth episode and look back on episodes one, two, three, and four, and going back to play episode one may be like, oh my God, it's almost... I don't know if you've thought about that.
[00:30:07.139] Denny Unger: Yeah, a lot. I think when EP2 ships, I think going back to EP1 from EP2 will feel like that, you know. And it's coming out soon, you know, in relative terms. But what we're trying to do is backport some of the information and lessons learned into Episode 1 as we move forward. So I'm hoping it won't feel too disjointed, you know. But yeah, absolutely. Every episode is going to feel like a step, you know.
[00:30:32.862] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:30:41.470] Denny Unger: You know what? I honestly don't know anymore. I see this momentum building. I worry a little bit about where some of it is going. But at the same time, there's equal opportunity for this to be the most amazing, connective technology we've ever had, or the darkest, most sinister technology we've ever had, right? So I see a lot of emphasis from all of the major players to build their own version of the Oasis, and it's just a question of what corporation you want helming that, you know. And so that's the part that scares me. I just hope it remains open enough to be democratic, I guess.
[00:31:23.466] Kent Bye: Awesome. Well, thank you so much, Denny. Thanks. So that was Denny Unger. He's the CEO and founder of Cloudhead Games, and they created The Gallery: Call of the Starseed, which was bundled with the HTC Vive, and they also just recently created a demo for the new prototype controller from Valve for Steam Dev Days. So, I have a number of different takeaways from this interview. First of all, I think the thing that Denny said about having over 300 companies in the process of creating Lighthouse-enabled tracking technologies is pretty staggering and telling. And I think the big takeaway from Steam Dev Days is that they're really trying to foster this open ecosystem, where Valve really wants Lighthouse to kind of be the Wi-Fi standard of virtual reality technologies. And so if we go back to August 4th, 2016, that's when Valve initially announced that they were going to be opening up the Lighthouse tracking technology with a royalty-free license. There was a fee to pay in order to get trained officially, but aside from that, any company could come along and start to create these different peripherals that are going to be put into the ecosystem. And I think this is absolutely huge. And I was really looking forward to Oculus Connect 3 to see if Oculus was going to have any response to this. You know, were they going to open up their Constellation tracking and allow third-party vendors to create peripherals in a royalty-free fashion? And, you know, there was no response at all from Oculus. And I was pretty shocked that there wasn't anything at all. And I just kind of thought, wow, Valve is going to start to lap and really leapfrog everyone in terms of their rapid innovation with these input controllers, as well as with the Lighthouse tracking.
So I have a couple of other thoughts about this. I'm going to go way back to some of the very first interviews that I did, one with Nonny de la Peña, when I asked her about the VR ecosystem. And I think Nonny really nailed the different classes of VR. She was essentially saying that there's going to be some mobile VR that comes along that's going to be kind of tetherless and not fully positionally tracked. And then up from that, there's going to be sit-down-in-front-of-the-computer or console VR, where you're going to have an Xbox controller, maybe some motion-tracked controllers, but you're going to be mostly sitting down and not moving all around the room. And then she said, beyond that, there's kind of the room scale, where you're walking around in an experience. And that was from her experience of creating different experiences at USC's ICT, where they had the tracking technologies to be able to do that. But at that point, before anybody knew about what was going on with Valve, that was kind of like the IMAX experience, the digital out-of-home entertainment. Well, it turns out Valve came out with room scale first: everything they're designing for is room scale first, and then anything that's sit-down they'll be able to support. Oculus kind of took the opposite approach. They were doing sit-down first, because they just expected that no one was really going to want to either use the space or fully walk around. And so, for the longest time, they had been pushing these sit-down-first types of experiences. And there was a bit of a disconnect: whenever we would go to these Oculus Connect conferences, all the demos had you standing up, but yet, publicly, they were telling all the developers that, you know, actually, you should be developing these experiences for sitting down.
There's just kind of a qualitative difference in presence when you are able to stand up. You just feel like you're more in the experience, and I think that was one of the innovations that Valve has been going with. And with these new prototype controllers, I think they're going even further, in terms of being able to open up your hand and grasp at things, giving you the sense, at a kind of primal, limbic level, that you're immersed in the game and that you're able to actually pick up objects and throw them. And so, of all the different moments in the history of VR, this week has been a pretty important one, just with the whole Steam Dev Days and people's minds getting blown yet again with some completely new, radical approach to how to do input within VR. But I think Denny's right that camera-based solutions actually could do some more sophisticated detection of someone's body, using machine learning approaches to detect where they're moving, and with those two cameras I think they'll perhaps be able to give you a full kinematic model of your body. And within the demo from Mark Zuckerberg, they were showing emotional expressions. The idea is that eventually they want to somehow get your facial expressions into the VR experience. And so for social experiences, I think that's going to be really huge, which is kind of Facebook's wheelhouse, so it makes sense that they're really focusing on that. So they could implement that through a camera-only solution, which is not something that is going to be all that feasible within the Vive. So those are some of my thoughts on that.
And just with Denny and Cloudhead Games, they've been true innovators in the space of VR, in terms of locomotion within virtual reality and some of the storytelling that they've been doing so far. In the next iteration, they're really starting to move away from a linear pathway through an environment toward something more nodal and nonlinear, a little bit more like what Rand Miller described in my interview about Obduction: using the environment in a nonlinear way, being able to go anywhere at any point to discover different parts of the story. So it sounds like they've also come up with some new locomotion techniques that they're going to be talking about a little closer to the release of Episode 2 of The Gallery. So we'll look forward to seeing what they've actually done to make it more comfortable to locomote around. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And also, just on a personal note, my Patreon did actually cross the $2,000-per-month threshold, and that is awesome. And I've got other goals as well; that was a goal that I had set up at the very beginning. So it's not like the end of being able to be fully funded, but it's a super important milestone. And one thing I'll say about that is that there have been some people that have written and said, hey, I'm going to give a big chunk of money for a couple of months, but then, once those months are over, it will kind of fluctuate. So I do need people who want to contribute at just a few dollars a month. That just helps give a level of stability, so that if people decide to drop in and out, it doesn't have such a huge impact.
And so if you think it doesn't matter, it certainly does: it has been a huge safety net for me through doing this podcast and helps with all the expenses that it takes to run it. And so if you enjoy the podcast as a service to you and the wider virtual reality community, then become a donor at patreon.com slash voicesofvr.