#368: Google’s Nathan Martz on Developing with the Daydream SDK

Google's Nathan Martz is a product manager for the Daydream SDK, and I had a chance to catch up with him to talk about what developers need to know in order to get started developing for Daydream, which will be released in the fall. Road to VR has a great article covering some of what's needed to build the DIY dev kit, which Nathan refers to as the "Build Your Own Dev Kit" (BYODK), or you can follow along with the instructions here. I talk to Nathan about the 3DOF Daydream controller, SDK features, Android VR, as well as some other tips for getting started in developing for Daydream.

LISTEN TO THE VOICES OF VR PODCAST

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. Today, I talk to Nathan Martz, who is a product manager at Google, working on the software SDK for Daydream. So in this interview, we talk a bit about some of the functions and features that are available to developers within the SDK, as well as the 3DOF controller and how to integrate with that, and what you can do right now to get started in terms of developing for Daydream. It doesn't sound like there's going to be any pre-release access to hardware. And so we cover a lot of the content that VR developers need to know in order to get their content ready for Daydream for the launch that's coming up here in the fall. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Virtual World Society. The Virtual World Society wants to use VR to change the world. So they are interested in bridging the gap between communities in need and researchers, creative communities, as well as a community of providers who could help deliver these VR experiences to the communities. If you're interested in getting more involved in virtual reality and want to help make a difference in the world, then sign up at virtualworldsociety.org and start to get more involved. Check out the Virtual World Society booth at the Augmented World Expo, June 1st and 2nd. So this interview took place at the Google headquarters, at the Googleplex, in the Partnerplex room. And actually, this interview also took place within the boardroom. So with that, let's go ahead and dive right in.

[00:01:49.494] Nathan Martz: My name is Nathan Martz, and I'm the product manager for developers on Daydream.

[00:01:54.016] Kent Bye: Great. So why don't you tell me a little bit about what is Daydream?

[00:01:57.626] Nathan Martz: So Daydream is our brand new platform for immersive smartphone-based VR.

[00:02:02.088] Kent Bye: Okay, and so this was just announced at Google I/O and it's going to be coming out in the fall, and so maybe first talk a bit about the input controls that are available, because, you know, we have Google Cardboard, there's the Samsung Gear VR, which has got this trackpad on the side, but this is an actual 3DOF controller with a trackpad. Maybe you could talk a bit about that and what it enables.

[00:02:22.733] Nathan Martz: Yeah, the new controller is actually one of my favorite parts of Daydream. I spent a lot of my career as a game developer, and input is obviously super important in gaming, and I've kind of carried that respect on to VR. I think that for me there's a real challenge in input in VR, which is that most users can't see their hands, like on most platforms when they're in VR. And for us, we want to make sure that we build VR devices that are extremely accessible. VR for everyone, right? So you need a controller that is simple enough that anyone can feel comfortable picking it up and using it, even when they can't see their hands. On the other hand, developers want an expressive platform, right? They want enough degrees of freedom and input that you can actually interact with the world in compelling ways. And there's kind of a tension there, and I'm really, really happy with the solution that we landed on for the Daydream controller. I think it's a great balance of being expressive and being accessible. The other thing I would say is that, if you look at a lot of, you know, you mentioned both Cardboard and Gear VR, in mobile VR right now, your head and your hand are kind of attached to one another, right? Like, the input is on your head. But we believe, like, fundamentally, VR is really at its best when you have your head and your hand separate from one another. And Daydream gives you that, right? You can look around and you can move the controller independently of one another. And with just a little bit of creativity and cleverness from the developers, and that's a lot of what our developer-centric sessions at I/O have talked about, you can create an incredible sense of hand presence with that controller, even though it's not positionally tracked.

[00:03:49.804] Kent Bye: And so is Daydream VR going to be shipped with one controller or two?

[00:03:54.406] Nathan Martz: One controller. But the cool thing is we are bundling the viewer and the controller together and we're also requiring the controller to be active when users enter Daydream Home. A big part of that is that we know developers need a stable foundation, right? They need to be able to say, okay, like, I don't want to know, is it this or that or this? Like someone launches my app, I really want to have a solid baseline that I'm going to build my whole experience around. So we've worked very, very hard to make sure that that promise is for users and for developers both.

[00:04:25.385] Kent Bye: And so this is a three degree of freedom controller, and it also has a trackpad. But I'm curious if you see, in the long run, if we'll start to see positional tracking in both the VR headset and in the controller.

[00:04:37.351] Nathan Martz: That's a great question. So one correction, it's actually a clickable trackpad, which is kind of cool. So you have essentially like three button states where like your thumb is off, your thumb is touching, and your thumb is pressing. So you can do sort of like swipey-browsy things, but also grabby gestures. And that duality is actually part of what opens up a lot of different interactions. At the limit, I think we're going to see lots more positional tracking, right? But there's a lot of challenges for doing that on an untethered device that you can take with you anywhere. And that's part of the reason why our focus for Daydream initially has been very much on the sort of accessibility, portability, which is about three degrees of freedom right now. One thing I would also add that's really cool is even when you have three degrees of freedom, you can do a lot with it. For example, the simplest thing to do is think about the rotation around the center of the controller. But if you actually assume the rotation is actually around someone's wrist, you can create what's called a wrist model. And that actually lets you do things like point a wand and cast spells in a way that's actually more believable than just simple controller rotation. You can take that even further and do what's called an elbow model. And that's how you get like swinging tennis rackets. And if you see the demos that we shared, it's really, you can feel like you're playing tennis and the tennis racket is moving both rotationally and positionally in VR. And we do that because even though like the platform doesn't know the position of the controller, individual apps know the user context. And you can combine that user context with sensor data to get incredibly compelling interactions.
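
To make the wrist/elbow idea concrete, here's a minimal sketch of how an app might derive an approximate hand position from nothing but the controller's 3DOF orientation quaternion. This is not the Daydream SDK's own arm-model code; the class, the fixed shoulder and forearm offsets, and the assumption of a right-handed user are all illustrative.

```java
// Minimal "elbow model" sketch: the platform only reports the controller's
// orientation (3DOF), so we approximate a hand position by rotating a fixed
// forearm offset by that orientation and hanging it off a fixed shoulder point.
// Class name and offsets are hypothetical, not part of the Daydream SDK.
public final class ElbowModel {

    // Rough, fixed offsets in meters, relative to the user's head (right-handed user).
    private static final float[] SHOULDER = { 0.19f, -0.19f, -0.03f }; // head -> shoulder
    private static final float[] FOREARM  = { 0.00f, -0.12f, -0.25f }; // elbow -> hand

    /** Returns an approximate hand position from the controller's orientation quaternion. */
    public static float[] handPosition(float qx, float qy, float qz, float qw) {
        // Rotate the forearm offset by the controller orientation,
        // then attach it to the assumed shoulder position.
        float[] rotated = rotate(FOREARM, qx, qy, qz, qw);
        return new float[] {
            SHOULDER[0] + rotated[0],
            SHOULDER[1] + rotated[1],
            SHOULDER[2] + rotated[2]
        };
    }

    // Standard quaternion-vector rotation: v' = v + w*t + cross(q.xyz, t), t = 2*cross(q.xyz, v).
    private static float[] rotate(float[] v, float x, float y, float z, float w) {
        float tx = 2f * (y * v[2] - z * v[1]);
        float ty = 2f * (z * v[0] - x * v[2]);
        float tz = 2f * (x * v[1] - y * v[0]);
        return new float[] {
            v[0] + w * tx + (y * tz - z * ty),
            v[1] + w * ty + (z * tx - x * tz),
            v[2] + w * tz + (x * ty - y * tx)
        };
    }
}
```

A wrist model is the same trick with a shorter offset pivoting near the wrist; a neck model, which Nathan mentions below, applies the same idea to the headset.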

[00:06:06.508] Kent Bye: And as I was thinking about it, I actually kind of realized that there's not positional tracking on the headset. And so if you did have a six degree of freedom controller, it might actually encourage people to move around more and then not have their head move. So it actually sounds like a good constraint for mobile VR.

[00:06:22.175] Nathan Martz: Yeah, I think that's very insightful. I think, you know, we definitely believe that the head and hands should be designed together, right? And it's natural having 3DOF on the head and 3DOF in your hand. In fact, even on your head, there's actually a neck model, which is a similar kind of thing, where we use the rotation to give you a little bit of translation in the same way that you can do an elbow model or a wrist model on the controller.

[00:06:43.963] Kent Bye: And could you talk a bit about like, what does it look like in terms of a six degree? Like I'm trying to figure out the bounds and the limits of what you can and cannot do with the three degree of freedom controller. So like, what are the constraints with that?

[00:06:56.803] Nathan Martz: Yeah, that's a great question. So again, I'm a game developer by background, and kind of the essence of game development is knowing when and how to cheat like crazy. You know, a lot of games work by emphasizing context and what people will notice and not notice to produce incredible effects. And I actually haven't seen any ideas currently that are like, oh, you just can't do that at all on the controller. In fact, for me, it's not so much the like 3DOF versus 6DOF, it's the fact that we focused on a single controller, that that's really what we're building around.

[00:07:27.291] Kent Bye: And so it was announced here at Google I/O that there's actually kind of like a DIY kit that people can get if they have like a special Android phone that can run Android N as well as like a spare phone. Maybe you could talk a bit about for developers out there, what does it take to get up and running to start developing for Daydream?

[00:07:44.286] Nathan Martz: Yeah, sure. So all of those materials, there's actually a BYODK, build your own dev kit, section of our developer site, which is developers.google.com slash VR. And that has a bill of materials, instructions for how to set everything up. In particular, you need a Nexus 6P specifically as your primary headset phone. That's the one that you're going to be doing your development on. You can use a Cardboard V2 or there are some headstrap cardboard viewers that you can pick up. We have recommendations for those. And then you actually use a second phone to emulate the Daydream controller. And we have a custom app that runs on that phone that emulates the controller and pairs with your headset phone. So you can get that sense of 3DOF on your head and 3DOF in your hand. In fact, there's even a sticker in the Build Your Own Dev Kit that you can print out and put on your controller phone so that it gives you a feel of cutouts of where the buttons are and where your thumb should rest and things like that.

[00:08:38.771] Kent Bye: And so in order to develop for Daydream, do developers need to get a special preview of Android N?

[00:08:45.156] Nathan Martz: Yeah, so you need to use what's called DP3, the public developer build that just came out also at I/O. And also, the thing I would add with the dev kits is that we could have held back on all of this and been like, OK, wait for the final hardware and then get started. But we really wanted to get people out there as soon as possible. As soon as we talk about this platform, let developers sink their teeth into it and be creative. And even Cardboard is about this sort of like scrappy ethos of like, build stuff and be creative and get it into the hands of real people and see what they make. And we've really tried to carry that forward with the BYODK on Daydream.

[00:09:21.065] Kent Bye: And so with the SDK, what are some of the other components that developers should know about?

[00:09:26.380] Nathan Martz: Yeah, so there's a new part of the SDK, which is the Daydream NDK. So historically we provided a Java API on Android and engine integrations. But we know a lot of VR developers, especially ones that like to write their own game engines, program in C++. So now there's a C++ interface available for all of those developers, which is great. It's been one of our most popular requests over time. A lot of other developers, though, don't build their own engines. They use engines like Unity and Unreal. And so we've been working really, really closely with those engine providers to have what are called native integrations of that NDK into their game engine. So the first native integration to come out is in the UE4 4.12 dev preview. People can get that right now. And there'll be a Unity version of that native integration coming out later this summer.

[00:10:15.436] Kent Bye: And at the very beginning of Clay Bavor's keynote, the beginning of Google I/O, he kind of made this subtle announcement that VR is coming to Android with Android VR. Is Android VR anything specific, or is everything kind of encapsulated within Daydream?

[00:10:29.796] Nathan Martz: Yeah, so there's a VR mode in Android, which is like part of the Android operating system and has some of those lower level features like sustained performance mode, things like that. Daydream is a set of services and developer tools that sit on top of Android's VR mode that actually, like, create the Daydream platform itself, and that's everything from things like the Daydream-ready phone specification, which is basically a hardware spec, to the VR hardware, like the viewer and the controller, and then also the apps, and that includes both things like Daydream Home, as well as the tools that developers use to build apps.
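
For developers curious what that lower-level Android piece looks like in practice, here's a rough sketch of an activity opting into the Android N features Nathan mentions: sustained performance mode and VR mode. This is based on the Android N developer preview APIs rather than anything Daydream-specific, and MyVrListenerService is a hypothetical service the app would declare itself.

```java
import android.app.Activity;
import android.content.ComponentName;
import android.content.pm.PackageManager;
import android.os.Bundle;
import android.service.vr.VrListenerService;

// Sketch of an Android N VR activity. Assumes API level 24 (the N preview APIs).
public class VrModeActivity extends Activity {

    /** Hypothetical listener service; it would also be declared in AndroidManifest.xml. */
    public static class MyVrListenerService extends VrListenerService {}

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Ask the OS to hold CPU/GPU clocks at a level it can sustain without
        // thermal throttling, trading peak speed for consistent frame times.
        getWindow().setSustainedPerformanceMode(true);

        try {
            // Enter VR mode while this activity is resumed; the system routes
            // notifications and system UI through the given VrListenerService.
            setVrModeEnabled(true, new ComponentName(this, MyVrListenerService.class));
        } catch (PackageManager.NameNotFoundException e) {
            // The listener service isn't declared or enabled; continue without VR mode.
        }
    }
}
```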

[00:11:04.900] Kent Bye: Cool. And you've been on a number of different talks here at Google I/O. Maybe you could give a little short summary of some of the big points that you were trying to make to the developers here.

[00:11:13.320] Nathan Martz: Yeah, yeah. Well, I mean, I think the most basic point is, like, get started, right? Like, be creative. Like, you can do it right now. We've really, like I said, tried to embrace that creative scrappy ethos to get things in the hands of developers as soon as possible. In general, I would stress that we're really early in VR. If you look at a lot of the talks that we gave at I/O, they're focused on, like, exploration, innovation, trying crazy things. You know, one of my favorite prototypes that we shared was a team that did like a gardening prototype in VR. And they took a watering can and they chopped off part of the spout. They put a Vive controller in there and they built a prototype around it. And it was super fun and like surprisingly compelling to like water plants in VR with that setup. And so I really think developers, it's way too early to get locked into like, this is the thing that works, and this is the thing that doesn't work. We're all learning together. I think if you want to be creative, now is the time. And the mentality is try lots of things, do lots of things, be excited, and get out there. And I think the last thing I'd emphasize is that we talk about this a lot, this sort of VR for everyone, but another way of saying that is VR at scale, right? Like having VR in a form factor, at a price point, that's really broadly accessible. And that's super important to Google, it's super important to Daydream, and I think important to a lot of developers. And we really believe that scale in VR is gonna be about mobile. That if you look at the computing form factors that are dominating people's lives, it's mobile, it's portable, it's a thing you can take with you. There's certainly technical challenges with building VR on a smartphone, but we think any developer who's really thoughtful about addressing a large audience, reaching a huge number of people, really wants to pay attention to Daydream and that mobile form factor.

[00:12:56.567] Kent Bye: Daydream Labs had a prototype where they were doing a language teaching exercise. There seemed to be some sort of integration with speech recognition, perhaps the Cloud Speech API. Maybe you could talk a bit about the importance of speech moving forward in terms of an input.

[00:13:13.976] Nathan Martz: Yeah, yeah, it's a great example. And that's one of the things I love about that team is they just try so many different prototypes. Yeah, speech is super cool. It's an area that Google has invested tremendously in, right? You saw a lot of our announcements around OK Google and things like that at I/O this year. In VR, it's particularly interesting. You can do a lot in VR, but text entry is actually more difficult than other kinds of interaction, because we're used to entering text on a keyboard, and you don't have a keyboard in VR, at least not a physical keyboard. And so speech can really complement things like the Daydream controller as a way of doing text input when you actually really need it, like searching through a movie catalog. And one of the great things about being at Google is that we can combine Daydream with features of the existing Android platform, like speech recognition, and stitch them together. And developers can do this, too. It's not private in any way. You can use the Android platform and Daydream to do things you can't do anywhere else.
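
As an illustration of pairing speech with the controller for occasional text entry, here's a small sketch using Android's standard RecognizerIntent API. It's the stock platform API rather than anything Daydream-specific, and a real VR app would likely render its own in-headset UI instead of launching the system dialog; the catalog-search hook is hypothetical.

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

// Illustrative use of Android's built-in speech recognizer as a text-entry
// fallback, e.g. for searching a movie catalog by voice.
public class VoiceSearchActivity extends Activity {
    private static final int REQUEST_SPEECH = 42; // arbitrary request code

    private void startVoiceSearch() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Search the catalog");
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK && data != null) {
            ArrayList<String> results =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (results != null && !results.isEmpty()) {
                runCatalogSearch(results.get(0)); // best transcription hypothesis
            }
        }
    }

    private void runCatalogSearch(String query) {
        // Hypothetical hook into the app's own search.
    }
}
```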

[00:14:06.873] Kent Bye: And just to make a quick clarification, because the Daydream Labs team showed a prototype of this drum keyboard, but that was on a Vive and not the Daydream, correct?

[00:14:16.397] Nathan Martz: Yeah, that's right. And actually, it's not even just the Daydream Labs team. There's a number of people who contribute to that effort in different ways. But yeah, the mission of Daydream Labs is to innovate and explore and be creative in VR. And it's basically like, give that team hardware, and they will try lots of different things out with it. Like I said, I mentioned a little bit ago that we feel like now is the time to be creative and to explore in VR, not to kind of lock into a particular methodology. And that applies nowhere more truly than in the Daydream Labs group itself.

[00:14:47.630] Kent Bye: And there was some coverage of one of the Google I/O talks where they were talking about 10 or 13 different recommendations for building applications for Daydream. And one of them was that you really are encouraging people to use the controller within these experiences. Maybe you could expand on that a little bit.

[00:15:04.150] Nathan Martz: Yeah, yeah. I mean, there's like a promise that we're making to users when they buy this hardware that it's going to be useful, right? And there's a promise that we're making to developers that they can be guaranteed that the hardware exists. And so we really want to make sure that as people are building and thinking about designing Daydream apps, they're taking full advantage of the features of the platform. And we really think the controller is like one of the most compelling things for users, as well as one of the most interesting things for developers to work on. Google's a very user-centric company, and especially as a product manager, you're like, it's always user first, that's what matters most. And we really think that the experiences that will be best for users are the ones that make creative use of the controller, not treat it like a very large Cardboard button.

[00:15:50.612] Kent Bye: And so, what else can we expect between now and fall?

[00:15:55.304] Nathan Martz: So we've got a lot more to talk about, I'm sure. We'll be talking about our catalog, specifics on hardware, but that's all to come.

[00:16:03.607] Kent Bye: And so what else can you say about the Daydream Home, as well as some of the other applications that are coming out that are coming specifically from Google?

[00:16:12.362] Nathan Martz: Yeah, I mean Daydream Home is really interesting, and the thing I think is most relevant for developers is, you know, certainly many platforms have home screens with the ability to promote apps in them. But we're taking that a step further with deep linking, to actually promote individual experiences within apps. Because there's sort of two challenges for developers, right? One challenge is helping users find the content in the first place, right? Just knowing about your app is difficult. But often people will, like, know about your app, they'll install it, they'll try it, they'll really enjoy it, but then other things will come up and maybe they'll forget about it even as you're releasing updates. And so the ability for us to promote not just whole apps but experiences within apps allows us not just to engage users but to re-engage users. And also, even when we promote apps that you maybe don't have installed, we can promote them in a more contextual, specific way. So rather than like, hey, here's a news app and news in VR, it's like, no, there's a breaking story right now. Come check it out. And we think that's going to be great for developer discovery and great for users who are curious to understand more about what's out there in VR.
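
To make the deep-linking point concrete, here's a sketch of how an Android app typically handles a link that targets a specific piece of content rather than the app's front door. The URI format, activity, and helper methods are hypothetical; the specifics of how Daydream Home hands off deep links weren't covered in this conversation.

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;

// Sketch of handling a deep link that points at a specific experience
// (e.g. example://story/breaking-news-id) rather than the app's home menu.
// The matching intent filter would be declared in AndroidManifest.xml.
public class DeepLinkActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        Uri data = getIntent().getData();
        if (data != null && "story".equals(data.getHost())) {
            // Jump straight to the promoted experience instead of the home menu.
            openStory(data.getLastPathSegment());
        } else {
            openHomeMenu();
        }
    }

    private void openStory(String storyId) { /* hypothetical: load this story in VR */ }
    private void openHomeMenu() { /* hypothetical: default entry point */ }
}
```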

[00:17:14.472] Kent Bye: So I do imagine a world where we're going to have a lot of VR experiences kind of embedded within the context of a web page.

[00:17:21.181] Nathan Martz: Yeah, that's another great question. In fact, one of the talks I'm giving today is about a product we call VR View. And it's a really simple way for developers to embed immersive media in existing websites and actually native applications. And that's an area I think is personally really, really interesting. When you think about the applicability of VR, there's a range of ways that you can use it, and that includes long-form immersive content, like Daydream apps, but it also includes short-form snackable content, which is what Cardboard is all about. And I think especially for companies that are looking to just get started in VR or augment existing properties in VR, something like VR View, with snackable 360 content, is a great way to do it. And as a simple example, if you're a home builder, you sell homes that don't exist yet, and you have this problem that some of the people who want to buy homes from you have a hard time doing that because they can't walk around the house, and it's hard to buy a house if you can't feel the tile in the kitchen. But in VR, you can actually walk around the house before it's built. Now, is a home builder going to build a brand new app from the ground up? Maybe not, but they've got apps, they've got websites, and we're really focused on helping them expand those properties with immersive media. And that's another one of many ways we're trying to execute on that vision of VR for everyone, for every business, for every developer, for every user.
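
As a rough illustration of the native-app side of that, here's one way an existing Android app might surface a 360 photo by loading a hosted VR View embed page in a WebView. The embed URL below is a placeholder (see the VR View documentation for the real hosted-embed URL and parameters), and the whole snippet is an assumption about one possible integration path, not official sample code.

```java
import android.app.Activity;
import android.os.Bundle;
import android.webkit.WebView;

// Sketch of dropping "snackable" 360 content into an existing native app
// by loading a VR View style embed page inside a WebView.
public class PanoramaActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        WebView webView = new WebView(this);
        webView.getSettings().setJavaScriptEnabled(true); // the embed page is JS-driven
        setContentView(webView);

        // Hypothetical example: a 360 photo of a kitchen in a model home.
        webView.loadUrl("https://example.com/vrview/index.html?image=kitchen-360.jpg");
    }
}
```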

[00:18:42.294] Kent Bye: So maybe you could tell me a bit about how you first got involved in VR here at Google?

[00:18:46.903] Nathan Martz: Yeah, I was a game developer for a lot of years. I joined Google for the chance to work in Google Search, which is amazing and totally different in a really fun way. And I actually joined just after we first did our Cardboard giveaway at Google I/O in 2014. And, you know, that was such a success that the team, which had done it as a 20% project, that kind of famous initiative at Google, kept going. And they're like, oh, you know, we're getting interest, people are more curious, we want to work with some developers. And so they sent an email saying, hey, if anyone can help out with this initiative, we'd love some people who know more about developers, who can help make connections, think about the stuff we should build. And so I was like, yeah, I'd be happy to help. You know, I spent a lot of my career doing this. And so I volunteered early on, helped do training for the team, helped them with their strategy, stayed in touch, and then as we built more steam around that effort, joined, and haven't looked back since.

[00:19:37.858] Kent Bye: Do you have any favorite memory or story of being in VR?

[00:19:42.381] Nathan Martz: Oh, good question. My personal favorite is Expeditions, actually. And for people who aren't familiar with it, that's our VR field trip app. So even though I spent a lot of my career as a game developer, I love applications of VR that are not just engaging, but are useful, or enriching. And I love that Expeditions is enriching, right? You get this great sense of going somewhere else. And actually, in a cool way, it's social as well, right? Because you take a trip together with other people. And that's really fun. And you can see what they're seeing. And even if you don't have enough cardboards, you can take it off and hand it to other people. It's really a great demonstration. Even though it's, in many respects, one of the simplest applications of VR, the fact that it's useful, enriching, social, for me, is super cool.

[00:20:29.005] Kent Bye: Speaking of social, are there any specific social implementations within the SDK?

[00:20:34.068] Nathan Martz: That's a great question. So the SDK today doesn't have specific social frameworks built into it. The engines that we integrate with, like Unity and Unreal, have plenty of existing tools and technologies for doing that. One of the things that we're trying to be careful about is not be prescriptive early on. We don't want to say, there's one way to do social in VR, here's the API to do it. Because if you look at social, you have things like synchronous VR-to-VR social, right? I'm in VR and you're in VR. You have synchronous asymmetric social. I'm in VR and you're on your phone, right? Maybe we're in the same room and maybe we're in different rooms. You have, you know, asynchronous engagement where I see something you did and then I interact with it later, and maybe that's also in VR and out of VR. So we're really trying to support a lot of experimentation through Daydream Labs, support developers in a broad variety of things. And as we see dominant patterns emerge, then we'll build up more kind of infrastructure underneath that.

[00:21:30.306] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:21:36.645] Nathan Martz: Yeah, that's a great question. Clay uses this phrase that part of our vision for VR is to help the world see the world, to help people see facets of reality that they've never seen before. And for me, this is true of both VR and AR together, that they're fundamentally enabling and additive technologies. I'm kind of like a nerd for data, like I love understanding how the world works and understanding it in a really quantifiable way. And one of the things that I'm really fascinated about with VR and AR is, you know, we live in this world that's more and more instrumented, right? We have sensors, you know, in the phones that we carry with us, in the cars that we drive, in the homes that we live in. These sensors are increasingly connected and aware of one another. But for the most part, they're invisible to us. You know, even if your car has a problem, there's a sensor that knows it's a temperature problem at this location, but right now my only visualization of that is like a light turns on in my dashboard. It's like, something bad happened, and you've got to plug in a diagnostic computer just to figure out what's going on, and you can't even read it unless you're trained to use that computer. I think that VR and AR are going to make the world of information that we live in dramatically more accessible and useful. And that, to me, is super cool and part of the reason I'm doing VR at Google specifically, because we're all about lots of information, understanding the world, making it universally useful and accessible.

[00:23:00.666] Kent Bye: Awesome. Well, thank you so much. My pleasure. Thank you. So that was Nathan Martz. He's a product manager at Google working on the software SDK for the Daydream mobile VR headset that's coming out from Google in the fall. So a couple of quick takeaways from this. First of all, I do think that with the constraints of mobile VR and not having positional tracking, it actually does kind of make sense to not have a six degree of freedom controller, because the hand and the head are so connected that if you did have a six degree of freedom controller, you may be tempted to start to move around where your head wouldn't be tracked, and it's just going to cause a lot of motion sickness. So I think going with that constraint and design decision is actually going to kind of create very unique and different experiences. If you look at mobile, desktop, and room scale, as a VR developer you kind of have to pick one, because you can do a lowest common denominator experience for mobile and port it over to desktop and room scale, but it's not necessarily going to be the best for each medium. And so I'm sure there will be cross-platform experiences that work equally well on each of the different platforms, but they'll probably have to be designed for mobile first with those constraints, because there's just a lot more you can do on desktop with being able to move your head around with positional tracking. And then the sense of embodied presence and activity that you're able to achieve within a full room scale is just completely different than doing something in mobile. So from a VR developer perspective, I can imagine that people who are already developing room-scale games look at something like Daydream VR and say, yeah, well, it'd be nice to have six degree of freedom movement with your head and hands, but in the absence of that, we're not going to bother trying to port our game into a medium where it's just not going to fully work. I mean, just think of Tilt Brush, being able to draw in 3D, and trying to do that with the constraints of a 3DOF controller. It's just not going to be the same type of experience, being able to really draw with that level of precision and 3D depth. So I think there's probably some frustration for a certain type of developer who wants to do something that is fully room scale, but I think overall, Daydream is going to create these creative constraints that are actually going to create experiences that go way beyond what is possible within a full room scale. You know, I've been thinking a lot about different types of presence, and I kind of think that there are four different types of presence. One is active presence, which is actually physically doing things, and I think to a certain extent the Vive is doing that really well, and the Rift is starting to get some of that with experiences like The Climb. Another type of presence is embodied presence, so having your hands and feet tracked and being able to have haptics and mixed reality experiences. I think that the Vive is starting to do that a lot with room scale just on its own, being able to see your hands and body in there, and Oculus with Oculus Touch is certainly moving in that direction as well. And it's a little bit harder to have embodied presence without hands.
Just having your head, you can only go so far. And within Daydream, I think it's going to start to go beyond what the Gear VR is able to do in terms of some of the active presence and perhaps some of the embodied presence, although with only one hand. I think you really kind of need to have both hands in there to fully feel like you have an embodied presence, invoking what they would call the virtual body ownership illusion. In order to really get that virtual body ownership illusion, you really need to have feet and perhaps even elbows tracked to be able to fully have a fluid-feeling type of avatar. The other two types of presence, though, I think are social presence and emotional presence. I think that with the constraints of the Oculus Rift of not being able to go and move around a lot, it's actually going to have a lot more social presence, to be able to have interactions, and the gesture controls within the Touch controllers are going to start to give a little bit more hand movement as well, so it feels like the Oculus Rift is going to really have its strength in social presence. And also, within Daydream VR as well as Gear VR, there's going to be a bit of a constraint with not really being able to fully move around in a scene, and so social presence may actually be something that's also a strength of having the constraints of Daydream VR, with not being able to have positional tracking. And then finally, the last type of presence that I see is emotional presence, so really getting engaged in media and story. And in some ways, I think that the strength of mobile VR is that it's going to be so constrained in the other types of presence that you're going to really start to see a lot of narrative, as well as YouTube and video and photo content, and, you know, just experiences that really engage the emotions. I think those are going to have their strength within Google's Daydream as well as in the Gear VR. So I've been really starting to think about the experiences that I've had in VR and the different qualities that I've seen in terms of what's really activated different levels of presence and immersion for me. With these four types of presence, embodied presence, active presence, social presence, and emotional presence, there are going to be different strengths and weaknesses, where perhaps the Vive is going to fully encompass all of them with room scale, and then perhaps the Oculus Rift is going to start to expand that. But mobile, I think, is going to be constrained in some of these levels of presence, while it amplifies other ones that may be harder to achieve, like emotional presence or social presence. So in real life, we kind of have access to all four of these different dimensions of presence, but yet, you know, when we go into VR, we start to take out our embodied presence, our active presence, and social and emotional, and we have to kind of reconstruct them from the ground up, which starts to really amplify the moments where we really feel like we're engaged with life. But the overall point that I'm trying to make here is that, just by looking at these different types of presence, you can see that these different platforms, with their different input controls and constraints, are going to drive different types of experiences.
And with a mobile, tetherless, no-cord headset, and with one 3DOF controller, that's going to be a certain level of constraint that's going to create some casual games that are a little bit more immersive than perhaps other experiences. And we'll see, I think, different types of experiences emerge because of that. So I'm really just personally excited to see something like Daydream VR, with those constraints, be able to create a certain level of experience that really gets VR out there into what is actually going to be a lot of people's first, you know, higher-quality VR experience. So I'll be excited to see how this develops over the next couple of years. So for more information, check out some of the links in terms of where to go to get the software SDK and to get started with your DIY kit in order to start developing for Daydream VR, which is coming out in the fall. So with that, I just wanted to thank you for listening. And if you are enjoying this podcast, please tell your friends, send them to thevoicesofvr.com, and subscribe on iTunes. You can follow me at Kent Bye on Twitter, and I'll also be doing some daily video blogging over at Snapchat, so follow me over there, at Kent Bye. As well, if you do feel called to contribute to the podcast, then consider becoming a donor at patreon.com slash voicesofvr.
