#373: Highlights from 60 VR Prototypes by a Google Daydream Labs Developer

Rob Jagnow is a senior software engineer who was a part of Google’s Daydream Labs team that produced over 60 VR prototypes in 30 weeks. This team was originally revealed in a WIRED magazine profile of VP of VR Clay Bavor, and the Daydream Labs prototyping team was officially announced by Bavor in his Google I/O talk on “VR at Google.” The Daydream Labs team gave a really amazing talk titled “Lessons Learned from VR Prototyping,” which had a lot of great VR design insights across the three areas of interactions, immersion, and social VR. I had a chance to catch up with Rob Jagnow at Google I/O and dig a bit deeper into some of his favorite prototypes, lessons learned, and design principles that were driving these VR experiments.

LISTEN TO THE VOICES OF VR PODCAST

To keep track of VR developments at Google, you can follow their Google VR Twitter account, and keep an eye on any VR-tagged posts on the Google Developers Blog.

Here’s the Daydream Labs Drum Keys prototype for text entry in VR that uses Vive controllers:

Here’s a direct link to the “Daydream Labs: Lessons Learned from VR Prototyping” session, along with all of the other sessions from Google I/O.

You can watch more videos about VR from Google I/O by looking at the playlist on the new Google VR YouTube channel.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR podcast. Today's episode, I talk to Rob Jagnow, who is a senior software engineer at Google, and he worked on the Daydream Labs prototyping team. So there was an article in Wired Magazine about Clay Bavor talking about this rapid prototyping team who was building two VR prototypes a week. And at the Google I/O conference, they announced this team as the Daydream Labs team. And they did 60 prototypes over the course of 30 weeks. And so they were sharing a lot of their lessons learned at Google I/O for the first time. And they're going to be announcing more of their findings over time. So on this rapid prototyping team within Daydream Labs, there would be essentially one engineer and one designer, and two of those teams. So they would create two prototypes a week. And Rob was the software engineer on one of these teams. And so he was involved in creating a lot of different prototypes over the last 30 weeks. So I had a chance to sit down with Rob at Google I/O to talk about this process of creating all these different VR prototypes, as well as some of his biggest lessons learned and some of his favorite experiments and findings from this time period. And so that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by Unity. Unity has created a tool set that has essentially democratized game development and created this lingua franca of immersive technologies, allowing you to write it once in Unity and be able to deploy it to any of the virtual or augmented reality platforms. Over 90% of virtual reality applications use Unity. So to learn more information, check out Unity at Unity3D.com. And so this interview happened on the last day of Google I/O in May 2016 and happened at the press area of the amphitheater. So with that, let's go ahead and dive right in.

[00:02:13.219] Rob Jagnow: I'm Rob Jagnow and I work with a small team within Daydream that makes these little tiny prototypes. So we work in pairs, one engineer and one designer. So I'm one of those engineers in those pair teams. And what we try to do is ideally turn out a new VR prototype every single week. So we just go with crazy ideas, and since it's just one week, it's not a whole lot of risk. We just crank out fun stuff.

[00:02:36.445] Kent Bye: Yeah, maybe you could talk a bit about what you were able to accomplish over the last 30 weeks.

[00:02:40.868] Rob Jagnow: Oh, man, it has been such a ride, and we're glad to finally be able to share a lot of this stuff. We've been kind of sitting on these lessons for a long time, and we have looked at everything from, you know, it could be as simple as: what makes for a great, comfortable reading experience? How big should the text be? How much contrast should it have? All the way up to really complicated: can we make a layered timeline animation system that would make it easy enough for a 10-year-old to do storytelling in VR? And we did. We did that in a week. And it was freaking amazing. So now we're finally getting to share all this stuff. And we're just happy to be getting these lessons out to the VR community. It's one of those rising-tide-raises-all-boats situations. We really think that it's important, especially at this point in the lifecycle of VR, that we get those lessons out there so that everyone can benefit.

[00:03:25.927] Kent Bye: Yeah, so maybe you could give me some numbers in terms of how many experiences you created over this time.

[00:03:30.468] Rob Jagnow: The one that we've been advertising is 60-plus. We're not that far past 60. We've actually slowed down a little as we started to prepare for I/O. But let's just say we've done more than 60 experiences. And like I said, some of those are really small, like using a controller as a fishing rod, or just experimenting with using that same controller as a frying pan, and just kind of tweaking the parameters about how to make that a great experience. But some are big: an immersive language-learning experience where you're having a conversation in Mandarin with two young girls, and they turn to you and you're expected to answer their questions. Some of these are actually pretty content heavy.

[00:04:06.779] Kent Bye: And so in talking to other people, I heard that there's a lot of these prototypes that are actually done on the HTC Vive in full room scale to be able to have positional tracking and everything. Maybe you can talk a bit about using this highest level of VR in order to plan for the future of what's coming.

[00:04:21.455] Rob Jagnow: Sure, and I'm going to almost interpret that question as: why, since you're using smartphone VR for a bunch of stuff, why are you prototyping on the Vive? And the reason is basically it keeps the technical hurdles from getting too much in the way of designing our ideas. In some cases, we actually dumb down the Vive. We actually rewrite pieces of the SteamVR code so that it acts more like a smartphone. But the Vive, it's got a really smooth, easy SDK. It's pretty reliable hardware. And we also know that it's not gonna be that much time before VR moves in that direction and we're able to make much higher-end experiences. So we're just kind of pretending we're a few years down the road in the future and designing for that target.

[00:04:57.885] Kent Bye: And so there was a talk where you were able to present for the first time some of your big lessons learned. And I really sat down and let it marinate. I watched the live stream and did a ton of tweets; there were so many insights across three major areas. Maybe you could kind of summarize those three categories of lessons learned that you were just sharing here at Google I/O.

[00:05:18.844] Rob Jagnow: Oh man, let me see if I can come up with some of our best learnings and really boil it down. I'm actually going to talk about it from a little bit of a different perspective. The four other members of my team did an amazing job with the talk, talking about very specific things, like how to design your environment and how to design your interactive space to lead to delight and intuitive interactions, and some of the use cases. I want to talk about something that I'm a little bit more passionate about, which is the process for actually creating those experiences, which is its own separate talk. But I think one of the things that's been really amazing is that Google has allowed us the freedom to actually take these crazy ideas that we are super passionate about, that any company in their reasonable mind would say no to, and they say yes. And when we say, you know, can we make a collaborative language learning app? They say, why are you asking permission? And the result is that we let the passions of the designers drive the whole team: we solicit ideas from the entire team, stack rank those ideas for our favorites, and then implement the best ones, the ones that seem most reasonable to do in a week. And the learnings are just unbelievable.

[00:06:29.901] Kent Bye: Yeah, I know that Paul Bettner with Playful Corporation got in very early with Oculus, and they did a very similar thing where they were doing rapid prototypes to try to prove out different game mechanics. And that's how they stumbled upon Lucky's Tale, to have that kind of super tiny near field, or what Paul would say is kind of like the sweet spot of VR. But they were trying to do something that was very fun for gameplay. And so when you're trying to design these different rapid iterations of these prototypes, what is the kind of design principle that's driving it?

[00:06:58.892] Rob Jagnow: We're definitely testing a different hypothesis. We're asking a different question for every design that we do. But it's interesting that you use that word gameplay, because specifically, we know that game studios are going to make great stuff. They don't need any help with that. But we also think that there are a lot of avenues and use cases outside of games that are kind of being ignored or brushed aside for virtual reality right now. There's so much low-hanging fruit in games that people are going for that. But we think productivity and storytelling and education are really underexplored areas in VR that have huge promise. And so we're taking a little bit of a closer look at those areas.

[00:07:36.046] Kent Bye: Yeah, maybe you could talk a bit about the drum keys and text input.

[00:07:39.817] Rob Jagnow: Yeah, I mean, text input, we knew from the beginning that this was going to be one of the requirements of a lot of VR experiences. You're going to have to put in text, whether you like it or not. Maybe you can do voice recognition, but we feel like we might still be a little ways away from that. And also, we wanted to try to keep with this idea of doing prototypes that are small enough to do in a week. It was one of these things we put off for a long time because we just didn't feel like we had much wisdom to offer in that space. A lot of people are exploring text input. And then one of our teams did this drum set and it was so fun. And people, they would stack drums up and use them in ways that you can't use them physically, like by swiping a drumstick through them and playing a bunch at a time. And we started to think, okay, this is a little crazy, but what if we had a drum typewriter, right? Like, where every key was a little tiny drum. And, you know, we already had this foundation, so we could prototype it up pretty quickly. I was very skeptical that something on a drumstick could give you the precision or the speed that you would need. But once we saw it in action, we were like, actually, this works really well. I mean, I don't know if it's going to, you know, I'm not going to pretend like, oh, yeah, this is going to be the standard for text input in the future. I don't know. But the fact that we discovered something new makes me feel pretty proud of that.
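A drum-key press like Rob describes typically comes down to a crossing test plus a velocity gate: the drumstick tip has to pass down through the top of the key, and fast enough that resting contact doesn't fire. Here's a minimal sketch of that idea in Python (an illustrative stand-in, not the Daydream Labs implementation; the function name and thresholds are assumptions):

```python
def detect_drum_hit(tip_prev_y, tip_y, tip_velocity_y, drum_top_y, min_speed=0.5):
    """Register a keypress when the drumstick tip crosses a drum's top surface
    moving downward fast enough. The speed gate filters out resting contact."""
    # Did the tip pass down through the drum's top plane this frame?
    crossed = tip_prev_y > drum_top_y >= tip_y
    # Is it moving downward at least min_speed (meters per second)?
    fast_enough = tip_velocity_y <= -min_speed
    return crossed and fast_enough
```

Because each key only needs a crossing test, a swipe through a whole row of drums naturally triggers several keys in one stroke, which is exactly the behavior people discovered with the stacked drums.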

[00:08:51.884] Kent Bye: So how fast can you type in VR then?

[00:08:54.250] Rob Jagnow: I don't think I ever figured that out. When I was playing the game, it recorded your time in seconds, and then later they adapted it for the words per minute, so I don't actually know what I was getting. But I can tell you when you watch somebody play, you will be really surprised at the dexterity and the accuracy. That's what really surprised me, is that people were very precise with it.

[00:09:13.748] Kent Bye: What were some of the other prototypes that you made that really stand out for you personally?

[00:09:18.904] Rob Jagnow: Like I mentioned previously, the animation system where you can do a timeline layer animation. I like that one because it's something that's so hard to do ordinarily. If you want to create an animation, there have been a couple ways to do that in the past. You can do motion capture, which requires a $100,000 system and all these cameras, and you have to climb into a suit with little infrared buttons all over it. Or you can do keyframed animation, which anybody can do. You can do that in your Unity editor or whatever, and you move curves around, but trying to get it to look natural or trying to do it quickly is really hard because you're tweaking one parameter at a time. And here we figured out that with VR, you grab a controller, you move an object, you record its movement, and you get this thing that feels so natural that when you play it back, you see your emotion and your hand movement in that animation. It's simple, but it's fun. It feels great for kids and for adults. It's so easy to do. And I think little areas like that are these gems that are going to build up. We're going to find a bunch of little gems, and other people are going to figure out how to put those gems into a full-fledged application. That's just a great experience.

[00:10:31.807] Kent Bye: So I think a lot of the goals of a VR experience is to cultivate this sense of presence and full immersion. So what were some of the big takeaways and lessons learned that you found from all these prototypes in order to cultivate this presence?

[00:10:45.051] Rob Jagnow: You know, some of these are almost anti-learnings. Some of the most interesting things we learned were ways to destroy presence. For instance, one of the things that we played around with was full-body avatars. So I wrote a little heuristic so that as you move your head around, which is the only thing that's getting tracked in VR (well, your head and your hands), so as you're moving those around, this heuristic would try to guess where your waist is and how much you're bent over and how tall you are and where your feet ought to be. Now, when you apply that heuristic to somebody else's avatar, it looks pretty natural, it looks pretty good. But when you apply it to your own avatar, and you look down, and that's not where your feet actually are, that really breaks immersion. I mean, the great thing about VR is that, to some degree, it's hard to not make an immersive experience. You've got this beautiful world, and you're surrounded by it, and you feel like you're there, and as you move, it moves, and as you turn your head, the world moves around you. So almost the best things to learn in immersion are what not to do: how not to teleport your player, how not to implement motion in a way that makes you sick. And there's still a lot to be learned there.
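A minimal sketch of the kind of heuristic Rob describes, guessing a body pose from the tracked head alone, might look like this. All the proportions, names, and the calibration scheme here are illustrative assumptions, not Google's actual code; the point is that any such guess will be visibly wrong when you look down at your own feet:

```python
def guess_body_pose(head_pos, standing_head_height):
    """Guess waist, feet, and crouch amount from a single tracked head
    position. standing_head_height is a one-time calibration of how tall
    the user's head sits when standing upright."""
    x, y, z = head_pos
    # How far the head has dropped below standing height: 0 = standing,
    # approaching 1 = fully crouched.
    crouch = max(0.0, 1.0 - y / standing_head_height)
    # Waist rides a little above half of current head height.
    waist = (x, y * 0.55, z)
    # Feet: planted on the floor, roughly under the head, hip-width apart.
    feet = [(x - 0.12, 0.0, z), (x + 0.12, 0.0, z)]
    return {"waist": waist, "feet": feet, "crouch": crouch}
```

On someone else's avatar, a guess like this reads as plausible; on your own, the mismatch between the guessed feet and your proprioceptive sense of where your feet actually are is exactly the immersion break described above.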

[00:11:45.955] Kent Bye: And so there's also a whole section of different lessons learned around social VR. And so what were some of the big things that you were doing in terms of social?

[00:11:54.258] Rob Jagnow: So let me give you an example of one that we didn't cover explicitly in the talk, which is the importance of voice communication. So we started out with a bunch of really simple co-presence experiments. You put two people in Vives; they're in different rooms, but you see each other's head and hands in the same space. One of the things we learned is that you can be surprisingly expressive with just a head and hands. A lot of emotion can come through. And when you're in that space, let's say there's somebody else in the same physical space who's talking to you, asking you questions, you don't really hesitate to respond to them. You still are in some ways anchored to that physical space that you're in. But the lesson came when we added in voice chat. And all of a sudden when you have that other person, that other avatar speaking to you, the physical world and that place where you started completely falls away and immersion is complete. And if I'm in that physical space and you're in the virtual space and I try to ask you a question, it's in one ear and out the other. It's like I don't even exist anymore. So it really demonstrated that having voice chat is like an immersion multiplier. It absolutely makes you feel like you're there.

[00:13:01.685] Kent Bye: So maybe you could talk a bit about some of the lessons learned in terms of the environments, in terms of cultivating a strong sense of taking you to another place in VR.

[00:13:09.872] Rob Jagnow: I think some of the interesting things in environments were just kind of being creative about it. So a few things. One of the things that we found is that you're always going to be constrained by the real physical space that you start in. And if you enter a virtual space and you don't know what the physical bounds are that you're constrained to, it can be a little intimidating. You might be a little nervous to go anywhere. You're sort of waiting for a virtual wall to appear in front of you when you get too close. People are a lot more comfortable in that virtual space if they have something, even if it's a six-inch-tall little fence around them on the floor, that gives them an indication of where they can go. And you can weave that virtual boundary in, if you can make it feel appropriate for the space. One of the examples that we gave was that you're riding a little cart and you turn a crank, and that cart moves forward and backward. If you can make it feel appropriate for the space, then it completes everything. You feel comfortable, you feel like you can move and you're unconstrained, and you just stop worrying about the physical constraints of the space. The other place where we really saw creativity being effective was what we call the Slides and Ladders demo, where we took our physical space, which is pretty small, and divided it up in a creative way. We sort of said, well, this is going to be where the carriage starts, and this is going to be where the cloud starts, and this is going to be where the slide starts. And then the way that you interact with that space in the virtual world is that when you walk to the slide and kneel down, you go down it, and you see yourself start moving. And then when you walk to the carriage, you enter it, and it starts going up a hill. And then you walk over to the cloud, and you step onto it, and it starts taking you to the next location.
And the result is that we've basically designed the space very carefully to keep you physically safe, but the space feels enormous. And because we've fulfilled this contract with the user, we basically told them, we're going to keep you safe. We're going to keep you constrained in the physical space. So don't worry about it. But it's almost impossible for them to believe. It feels like I've traveled a mile. How can I possibly be in the same physical space?
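One common way to implement a safety boundary like the fence described above is to keep it invisible in the middle of the play area and fade it in as the player approaches the physical wall. This is a hypothetical sketch of that variant (a rectangular play area, with all bounds and the fade distance as illustrative assumptions, not Daydream's implementation):

```python
def fence_opacity(player_pos, bounds, fade_distance=0.5):
    """How visible the virtual safety fence should be: fully transparent in
    the middle of the play area, fully opaque at the physical wall.
    player_pos = (x, z); bounds = (min_x, max_x, min_z, max_z), in meters."""
    x, z = player_pos
    min_x, max_x, min_z, max_z = bounds
    # Distance to the nearest physical boundary.
    d = min(x - min_x, max_x - x, z - min_z, max_z - z)
    if d >= fade_distance:
        return 0.0  # safely in the middle: hide the fence entirely
    # Ramp linearly from invisible to fully opaque as d shrinks to zero.
    return min(1.0, max(0.0, 1.0 - d / fade_distance))
```

The always-visible six-inch fence in the prototype is the even simpler case: opacity pinned at 1.0, with the art direction doing the work of making the boundary feel native to the scene.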

[00:15:05.027] Kent Bye: And so can you walk me through a little bit of your process of how you go from the beginning of the week to the end of the week and having a new prototype?

[00:15:12.573] Rob Jagnow: Sure. So I could walk you through two versions of the process. This is the clean version, which is we have this nice clean five-step process. We get our ideas, we rank our ideas, we implement them, we share the findings, and we document the findings. That's the clean version. The reality is that because we're doing this rapid prototyping, it's not like it's clear where one prototype ends and the next one begins. So all this crazy stuff is overlapping. So the way that a week actually looks for us is that on Monday, we're doing our demo party, we're showing off the findings from the previous week, we're zipping up the files, creating movies because that documentation is really important, we're sharing it with the team, and we're also starting to narrow down our ideas and choose our idea that we're going to implement for that week. Tuesday, Wednesday, and Thursday we're implementing that idea, we're refining our thinking, we hit some dead ends early, we refine the question that we think that we're asking, we decide where we want to apply the polish, what technology we want to use to implement these features, and also during that time we're reaching out to other teams, because we're finding that a lot of the great ideas coming in for prototypes don't come from us anymore. We have a little bit of a decent reputation on the team for building stuff quickly. And so people come to us all the time and say, hey, I have this framework that's too heavy-handed for me to do something quickly in, but can you mock this up and see if it's fun, see if it's interesting. So then on Friday, we're kind of testing and polishing and wrapping things up and starting to get ready for the next week and narrowing down our list of ideas. So that's kind of what a typical week looks like. Of course, then you throw in Google I/O and meetings and preparation for other things and some emergency that comes up, and it gets a little bit more chaotic than that.
But generally speaking, when things are going smoothly, that's what a typical week looks like.

[00:16:49.990] Kent Bye: Do you have any specific examples of ideas that you thought were going to be great but ended up being really terrible in VR?

[00:16:56.374] Rob Jagnow: Yeah, we have this really beautiful little London tabletop model. It's this really cool slice of London at maybe 1:200 scale. And you see the little cars moving around and trains moving around. And you really just want to lean in close and look at the beautiful detail in this world. So we thought, well, we'll use this as a foundation for virtual tourism, and you can click on pinpoints and it'll bring up this cool wall of information to show you all of the stuff that you can do at that location. So if you click on the Globe Theatre, it'll bring up a history of the Globe Theatre and pictures from inside the Globe Theatre, and you can buy tickets and you can make a phone call. And it turns out, when you go from inspecting this beautiful, delicate city to bringing up this wall of information, it's a terrible experience, and it totally breaks this connection that you have with this cute little town. I'm sorry, I just called London a cute little town. With this adorable slice of this major city. And so the lesson from that is that, you know, sometimes we have a hard time taking these lessons that we learn and boiling them down into something bite-sized. But I think the lesson that we learned is that just because you have this basically infinite screen space that can wrap around you doesn't mean you should or have to use it all. You still need to be conservative with how you present data to your users.

[00:18:10.187] Kent Bye: Well, calling it a cute little town, I think, from a VR design perspective, I would say that anything in the near field at that small scale lets us create these dollhouse kind of figures that go beyond what we've ever seen before, with things being animated, and that stimulates our brain. Paul Bettner says we have more neurons dedicated to the space between our eyes and our outstretched fingertips than to any other region. So that's the peak amount of stereoscopic effect, and anything that's in that sweet spot of VR just feels adorable. And there was another demo that you showed where you're kind of going back and forth between the small scale and the large scale. So maybe you could talk a bit about playing with scale in VR.

[00:18:50.232] Rob Jagnow: I think scale in VR is still hugely unexplored. I mean, VR is so good at conveying scale in both the huge and the tiny. And we've actually done, I think, four different prototypes, a couple of which we shared in our talks, where you're building something that's kind of a tabletop scale in front of you. But then we have this thing that we call the chess piece that you can put into that model and then, bink, press a button, and you are now at the size of that model in that virtual space. And you have a totally different impression of the space. And it's just such a magical experience. And it's something that I think every kid dreamed about. You make the dollhouse and you move the dolls through it, but you think, what if I could be in that dollhouse? Well, now you can. Now it's super easy. And there are all of the different ways in which you can just crank up and crank down scale and play with that. I'm really looking forward to seeing games take advantage of different ways of representing scale, operating at different senses of scale at the same time, and teleporting between those spaces. I think there's a lot of really cool ideas to be explored there.
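Mechanically, the chess-piece trick described above is small: scale the player's camera rig down to the model's scale and move it to where the marker was dropped. Here's a hypothetical sketch of that (the `PlayerRig` class and function names are illustrative, not the Daydream prototype's code):

```python
class PlayerRig:
    """Minimal stand-in for a VR camera rig: a world position and a uniform
    scale applied to the whole tracked play space."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.scale = 1.0

def shrink_to_model(rig, marker_pos, model_scale):
    """'Chess piece' teleport: the player drops a marker into a tabletop
    model and presses a button; the rig is scaled to the model's scale and
    moved so the player stands where the marker was placed."""
    rig.scale = model_scale   # e.g. 0.01 for a 1:100 tabletop model
    rig.position = marker_pos # stand at the marker, now life-sized relative to the model
    return rig
```

Scaling the whole rig (rather than the world) keeps head and hand tracking consistent, so walking a real meter moves you a model-scaled meter, which is what sells the size change.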

[00:19:53.397] Kent Bye: Another thing that really stuck out for me was a lot of these new user interface gestures and throwing and snapping and maybe you could talk a bit about what did it take to be able to implement some of these kind of like detecting the intent of a user?

[00:20:06.409] Rob Jagnow: The good news is most of this stuff came about organically fairly quickly for things like our photo arranging app, where you can take a photo that's on your virtual couch in your living room and you throw it against the wall, and the application detects that that action has happened and it projects it and orients it correctly. As it turns out, just the way the framework that we have for building a lot of these applications works, you have this thing and it can collide with other things and this thing detects collisions. And in a lot of ways, it's actually easiest to align that thing in a nice clean rectilinear way at a very particular scale when this thing intersects this thing. As it turns out, the nature of those frameworks made it really easy for us to prototype those kinds of experiences. But then when you actually step into those experiences and try them, it feels so magical. It's just reading your mind. You just take a folder and you just like... I mean, originally the idea, for instance, was you grab photos from the sofa and you put them in the folder and the folder contains them. But as it turns out, the way that the app was written, you can take the folder and drag it through the stack of photos and you end up with this beautifully organized stack of photos. So many of these things we discover accidentally along the way. I think, and this is going to be true of VR for a while, we're going to just try stuff out and discover, you know what, either that was really terrible or that was really great, and it wasn't even what we were looking for. This early in the days of VR, there's a lot of space for those kinds of happy accidents to come.
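The snap-to-wall behavior described above is essentially a collision callback that places the photo at the hit point and reorients it along the wall. A minimal sketch of the math, assuming axis-aligned vertical walls (the function name, the tiny surface offset, and the flat-wall assumption are all illustrative, not the actual prototype):

```python
import math

def snap_to_wall(hit_point, wall_normal, offset=0.01):
    """When a thrown photo collides with a wall, rest it a hair off the
    surface and turn it to face back out along the wall normal, upright.
    Returns (position, yaw_degrees) for the snapped photo."""
    px, py, pz = hit_point
    nx, ny, nz = wall_normal
    # Nudge the photo slightly off the wall so it doesn't z-fight the surface.
    position = (px + nx * offset, py + ny * offset, pz + nz * offset)
    # Yaw that faces the photo out along the (horizontal) wall normal; keeping
    # roll at zero is what makes the result read as a neatly hung picture.
    yaw_degrees = math.degrees(math.atan2(nx, nz))
    return position, yaw_degrees
```

Discarding the photo's incoming spin and keeping it upright is the "mind-reading" part: the physics engine supplies the where, and the snap supplies the clean rectilinear how.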

[00:21:40.793] Kent Bye: Maybe you could talk a little bit about how Tilt Brush originated.

[00:21:44.394] Rob Jagnow: It's perfect because that was basically an accident. From what I have heard from Drew and Patrick, who were the originators of Tilt Brush, they were making some other game completely and they were having trouble debugging something and they added what they call a tracer. So basically as you pull this physical object through space it draws a line behind it. And they had so much fun with those tracers that they started drawing pictures in space and realized, this is actually a better experience than the original game that we were creating. Let's just make this the experience. I mean, it wasn't just like a pivot, it was like a 180-degree turn. This is like a totally new experience that they were designing. But they were open enough to see that there was potential there. And they turned it into something that is absolutely magical. I mean, I think just about everyone that I've talked to who's experienced anything in VR has said Tilt Brush is their favorite experience. And when I give demos, people come in and we'll show them something, and they'll, like, 60 seconds in, they'll be like, OK, that's enough, that's good. I'll show them Tilt Brush, and they're like, can you just leave me here and come back in an hour? You just get lost in this creative space.
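A debug tracer like the one that accidentally became Tilt Brush is conceptually just a rolling trail of positions rendered as connected line segments. Here's a minimal, hypothetical sketch (names and the fixed-length trail are illustrative; the original was presumably Unity-side rendering code):

```python
class Tracer:
    """Trail an object's path through space: keep its recent positions and
    expose them as line segments for rendering."""
    def __init__(self, max_points=1000):
        self.points = []
        self.max_points = max_points

    def update(self, position):
        # Called once per frame with the tracked object's position.
        self.points.append(position)
        if len(self.points) > self.max_points:
            self.points.pop(0)  # oldest part of the trail fades away first

    def segments(self):
        # Each consecutive pair of points becomes one line segment to draw.
        return list(zip(self.points, self.points[1:]))
```

The leap from here to a painting tool is mostly a matter of never expiring the points and giving the segments brushes and color, which is why the pivot could happen so naturally.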

[00:22:48.634] Kent Bye: Well, with these prototypes, I know that Valve released the Lab, which is sort of the series of their demos that they felt were kind of highly polished and trying to show different mechanics and gameplay. Does Google have any plans at some point releasing some of these prototypes to the public to be able to kind of play around with for design inspiration?

[00:23:07.634] Rob Jagnow: I can't speak to what the total plans are yet. What I can say is that both of these, what Google has been showing off at I/O and what Valve did, is just an example of how open these communities are. I think everybody feels like the rising tide will raise all boats. And we think that by being open about our process and our findings, the whole ecosystem is going to benefit. And especially, we've seen VR come about as a fad before in the 90s, and we think that it's absolutely here to stay. We're pretty confident in that. But we know that the best way to do that is to share findings so that that ecosystem is healthy across the board from the get-go.

[00:23:47.092] Kent Bye: Do you have any specific plans for sharing more information on the VR developer blog?

[00:23:51.716] Rob Jagnow: That's a great question. We already have a bunch of pre-written blog posts ready to announce. So I'm really excited that we've opened the doors on that. I/O is basically the kickoff. And we hope that at least every couple weeks, we're going to be announcing new findings that we've discovered from our Daydream Labs explorations, and that we'll be sharing those with the rest of the community.

[00:24:13.232] Kent Bye: Do you have any personal favorite stories or memories of being in VR?

[00:24:18.023] Rob Jagnow: Oh my gosh, so many amazing VR experiences, some of which I can't talk about yet. And let me tell you, some of my favorites are those things that we can't talk about yet. Let me just tell you there's some fun, fun stuff coming down the pipeline. I mean, I can talk about some of my favorite apps that I've worked on. The Animator was one where I think when we pitched it, others didn't really see the promise. But the first time that I grabbed an object in virtual reality, moved it through space, and then rewound that animation and played it back, it was just like there was a part of me that had been bestowed upon that object. I mean, it's so magical in a way that you just have to experience to understand it. It's like you've taken a little piece of your personality and put it into this inanimate object, and now it's animated. I don't know. There are so many magical experiences that I've had like that.

[00:25:14.025] Kent Bye: What are some of the things that you want to do in VR?

[00:25:17.002] Rob Jagnow: Oh, man. Some of the things that Expeditions is doing, and Expeditions is still covering this in a fairly light way, visiting places all over the world and off of this planet, going to Mars. I want to go to Europa in VR. I have a real passion for travel, and there's no way I'll be able to visit all the places that I want to see in my lifetime. And I kind of hope that VR is going to make that possible for me in an increasingly engaging way.

[00:25:43.691] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:25:50.861] Rob Jagnow: You know, everybody's looking at games right now. I really believe in the potential of virtual reality to revolutionize education. To have a classroom where the teacher can make eye contact with every single student in VR. To allow you to learn from the absolute best professor in the world in the most engaging multimedia presentation from a remote location in Africa. I think that the power to teach through empathy, to understand what people are going through in a refugee crisis by standing in their shoes, almost literally, all of these things, I think, are really going to change things. It's not going to be easy. And we need to figure out how to measure how this is working. We need to rethink how we measure the impact of VR on education. But if we can do it right, I think that it could change everything.

[00:26:44.175] Kent Bye: And just to follow up on that, I haven't really covered the Google Expeditions on the podcast at all. Maybe you could describe what that is and what some of the new information that you were sharing about that today.

[00:26:53.482] Rob Jagnow: Sure. So Google Expeditions, it's sort of a complete set of technology that we send around to classrooms. So the teacher has a tablet, and she can pick a location, basically a photosphere at that location. So let's say you take an expedition to the Great Barrier Reef, and the teacher on her tablet takes you down to dive with the sharks. Every student in that classroom has a Cardboard, so a VR viewing device with a phone in it, and as soon as the teacher clicks that location, the students all see that same location in their Cardboard. And they can turn around and look around, and the teacher sees exactly where they're looking. And the teacher can then, say, select a particular point of interest that has teacher's notes available at that location, and she'll see all the students' heads turn to look at that point. So that's kind of what the framework looks like. The amazing thing about Google Expeditions is that we sort of thought originally it would be used to show off locations that people can't go to, but due to contributions from people like the First Lady of the United States, we're now doing things like college tours, or looking at the refugee crisis, or looking at careers, like letting people do career expeditions to see what it's like for a day in the life of a veterinarian, or an airline pilot, or an aquarist, careers that I didn't even know existed. So this is already, I think, showing the willingness of people to try new things in education, and also the effectiveness. These kids are so engaged and learning things in an empathetic way that they didn't have access to before.

[00:28:23.606] Kent Bye: Is there anything else that's left unsaid that you'd like to say?

[00:28:29.613] Rob Jagnow: Just that I think everyone should be prepared for surprises. We're still really early in VR, and right now we're tending to take existing media forms and adapt them to create these first early VR experiences. The best VR experiences are going to be the ones where we throw away everything that we think we know about media, about engagement, about interaction, and they're going to be built from the ground up. They're going to understand the strengths and weaknesses of VR, and they are going to look like nothing you have ever imagined.

[00:28:58.396] Kent Bye: Awesome. Well, thank you so much, Rob.

[00:28:59.969] Rob Jagnow: Thanks for having me.

[00:29:01.362] Kent Bye: So that was Rob Jagnow. He's a senior software engineer at Google who worked on the Daydream Labs team, working on a number of different virtual reality prototypes over the last 30 weeks. So a number of different takeaways from this. It is interesting that they decided to do most of these prototypes on the Vive and not on their own hardware that they're building, and that they're trying to push the limits of what's going to be possible with VR user interface and design once they do have fully tracked six-degree-of-freedom controllers. At the moment, the Daydream controller is only three DOF with a clickable trackpad that gives some additional input, but more or less it's a three-degree-of-freedom controller, and the Google Daydream VR headset does not have any positional tracking at all. So will Google eventually try to add this with Project Tango or some sort of external sensors? It's really unclear at this point, but you can see that Google has already released a Vive experience with Tilt Brush, and I imagine that they will be releasing more room-scale experiences on the Vive that go beyond what they're able to do for the Daydream headset. And so if you haven't had a chance to watch the talk from Google I/O from the Daydream Labs team about their lessons learned, it's really quite an amazing talk about VR design. I highly recommend it because it has a lot of really salient insights in terms of VR design and all sorts of different design inspirations. The Drum Keys keyboard in particular is something that I think is really inspired, and I really hope that they consider releasing some of these experiences for people to try out on the Vive and, like they said, share as much information so that a rising tide lifts all boats. And so I look forward to hearing a lot more of their insights and write-ups on their blog.
You can keep track of a lot of the latest information from the Google VR team since they did switch over their Twitter account from Google Cardboard. They deprecated that account and moved everything over to Google VR. So if you go over to Google VR on Twitter, you'll find a lot of the latest links to their blog. They also have a Google VR YouTube channel with a lot of the VR-related talks that were at Google I/O. You can find them in a playlist there, and that's probably the easiest way to find them, since a lot of the original talks are actually on the Google Developers YouTube channel. Go check that out; there are lots of really great talks that happened at Google I/O. And if you are interested in starting to develop for the Daydream VR headset, you can build your own dev kit: download a sticker to put on an Android phone that's compatible with Android N, get the latest operating system, and you can start to do your own rapid prototyping with the Build Your Own Dev Kit, BYODK. There's more information on that in the previous interviews that I did with some other Google employees last week. And just on a personal note, I am really just super excited and honored to have Unity as one of my sponsors on the Voices of VR podcast for the coming month. You know, they've just done a lot to support and enable this virtual reality revolution, and I look forward to helping tell that story. And I still am really relying upon, in a lot of ways, the Patreon support to help with travel, as well as my own living expenses and the hosting and other expenses that come with producing the Voices of VR podcast. And so my hope is that I'll be able to combine both sponsors and Patreon support to be able to sustain and continue to do the Voices of VR podcast full-time, and to not only sustain it but grow it over time.
So if you enjoy this podcast and you feel like you're getting a lot out of it, then please consider spreading the word and telling your friends about it, but also consider throwing me a tip to help cover a lot of the out-of-pocket expenses that I have to be able to bring you this Voices of VR podcast. So please consider becoming a donor at patreon.com/voicesofvr.