On October 4th, Google revealed Daydream View, its reference design for a mobile VR headset. Google CEO Sundar Pichai also announced the first Daydream-ready phone, called Pixel, which also has native hardware support for their AI-driven Google Assistant technology. Pichai emphasized that Google and the wider tech industry are moving from mobile-first to AI-first, and so they showed off more demos of their AI conversational interface with Google Home.
I had a chance to go hands-on with the Google Home, Google Pixel, and Google's Daydream View devices, and then had an opportunity to sit down with Google Vice President Clay Bavor, who is heading up Google's VR initiatives. We talked about Daydream View and the 3DOF controller, the special optimizations they made to seamlessly stream 360 videos on YouTube, their Street View implementation and the future of Earth VR, and how AI and conversational interfaces will start to be integrated into Daydream. We also talked about how Google Jump was their first initiative after Cardboard to improve upon stereo 360 video quality, the future of digital light fields, and Google's push for open ecosystems and their support for WebVR.
I also give some more of my hands-on impressions of Daydream View, and add some of my thoughts and analysis on how Google's VR immersive computing platform is starting to converge with their AI initiatives.
LISTEN TO THE VOICES OF VR PODCAST
Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to the Voices of VR Podcast. So on October 4th here in San Francisco, Google held a press conference where they announced the Pixel phone for the first time. It's the first phone that is Daydream ready. So Daydream is something they announced at Google I/O; it's their mobile VR platform. And here at this press conference, they announced the Daydream View, which is their reference design for a mobile VR headset. So I'll be sharing a little bit more of my hands-on reaction to the Daydream View at the end of the podcast, but first I wanted to share this interview with Clay Bavor, who is the Vice President of Virtual Reality at Google. So we talk about some of the Daydream applications that they're showing off, including Street View VR, as well as some of the optimizations they did for 360 video streaming on YouTube, as well as some of the big news of the day from Google CEO Sundar Pichai, which is that Pixel is going to be the first phone that has the Google Assistant built in. So in other words, there are going to be a lot of artificial intelligence features built in throughout the entire operating system, including into Daydream. And so we talk a bit about that, and how Clay sees the future of conversational interfaces and AI working with virtual reality. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. This is a paid sponsored ad by the Intel Core i7 processor. VR really forced me to buy my first high-end gaming PC, and so Intel asked me to come talk about my process. So my philosophy was to get the absolute best parts on everything, because I really don't want to have to worry about replacing components once the second-gen headsets come out and the VR min specs inevitably go up at some point. So I did rigorous research online, looked at all the benchmarks, online reviews.
And what I found was that the best CPU was the Intel Core i7 processor. But don't take my word for it. Go do your own research. And I think what you'll find is that the i7 really is the best option that's out there. So this interview with Clay happened at the Google Press event that was happening in San Francisco on October 4th. So with that, let's go ahead and dive right in.
[00:02:24.660] Clay Bavor: I'm Clay Bavor. I lead the VR team at Google. And we've been up to a few things. What we announced today is what we call Daydream View. And it's the first Daydream-ready VR headset, which means that when you pair it with a Daydream-ready phone, you get a great immersive VR experience powered by your smartphone. As we showed today, the headset comes with a controller as well, the Daydream controller. That's actually a motion controller. It uses several different sensors to give you the sense that you're holding a magic wand, or a fire hose to put out fires. You can aim with it, you can point with it. It's actually precise enough that you can draw your name with it. So there's this really, really nice combination of a comfortable headset that's really easy to use and this intuitive, powerful controller, and we're offering it up in November for $79.
[00:03:15.062] Kent Bye: Great, so one thing that I noticed with the 3DOF controller was that there seemed to be some recalibration process that needed to happen. Especially when I would go in and out of programs, I would come in and the pointer might not be pointing directly straight. So maybe you could talk a bit about why there needs to be some recalibration process?
[00:03:34.982] Clay Bavor: So, at the start of the experience, there's a kind of centering process where you align the headset and the controller. And that basically makes due north for the headset the same as due north for the controller. And in general, the two stay really nicely coupled. Occasionally, especially if you're doing very aggressive motions with it, you'll need to recenter, just as sometimes you recenter the headset view in similar devices. What we've found in general, though, is that they stay nicely coupled and you can point naturally with it, even in and out of things.
[00:04:06.404] Kent Bye: So today the Pixel and Pixel XL phones were also announced. Is there a difference in field of view, or any other differences in the kind of VR experience you get depending on which phone you use?
[00:04:17.853] Clay Bavor: Yep, so from the SoC perspective, the phones have the same GPU and CPU in them, both very powerful Qualcomm processors. The larger phone has a quad HD display, the smaller phone has a full HD display, and with the smaller phone you have a slightly narrower field of view. You have about a 90-degree field of view with the larger device and a bit less with the smaller phone.
[00:04:38.988] Kent Bye: Content-wise, there seem to be some applications that you were really focusing on here, one of which was YouTube. I've done YouTube streaming on Cardboard, and one time when I did it, it ended up buffering mid-stream, which, when you're watching a VR experience, can be very disruptive. So is there anything specific that you're doing with YouTube streaming to optimize it and ensure that it doesn't start buffering?
[00:05:06.061] Clay Bavor: We are really excited about YouTube, and for us it represents so much of what we want VR to be about, which is a bit of content for everyone, something you can do, something you can see that's new every day. And on the technology side, a couple of things. First of all, the streaming part doesn't matter if you can't actually capture the content. And so we've been hard at work at something we call Jump. That's kind of our reality capture system, where we've partnered with GoPro and other companies to make these camera arrays that then work with our special processing software to synthesize, in essence, every possible viewpoint of the scene, to create really, really nice omnidirectional stereoscopic VR video. So, we actually had a lot of those cameras out in the wild, in the hands of some of the top YouTube creators, building new content, from unboxing videos to taking you to Iceland and other exotic places. On the streaming side, first of all, to stream down the equivalent of a 4K video, you're going to need a better internet connection, right? Don't try to do that on EDGE over your cell connection, and definitely don't do it on roaming data, I guess. But we have made a number of optimizations, in particular around the way we project video. And by project, I mean the way we unwrap the world in a video that's all around you and put it into a video rectangle. Traditionally, if you kind of unwrap that video naively, you end up with a lot of resolution at the poles, the top and the bottom. Of course, that's not where you spend a lot of your time looking. And so we've developed a different projection that much more evenly distributes the resolution across the entire scene. And so, while if you're streaming that down at 4K, it's still going to take, you know, a lot of bandwidth, you can get better quality even at lower resolutions while still having, you know, nice fill of the pixels and so on.
Again, I think you're going to want a good internet connection for now.
[00:07:03.999] Kent Bye: They were also showing some Street View as well. So maybe talk a bit about what you're excited about in terms of Google Street View when people are looking through the Daydream View.
[00:07:13.537] Clay Bavor: Well, we sometimes joke that we've been accidentally building one of the great VR content libraries for the last decade with Street View and with Earth, and we are really excited about what you're going to be able to do in Street View. And it's both the curated tours that we've developed, tours of special places like the Taj Mahal or the Pyramids of Giza, but it's just as much places that may matter to you or to me, like the street you grew up on or a vacation spot you're thinking about. We've driven a bazillion miles in Street View cars and we've taken Street View backpacks places and covered a lot of the world, and so it's not just these kind of narrow special places but really anywhere you care about, and I think we're really just scratching the surface of how people will use it. You can imagine uses in real estate, uses in tourism, all sorts of things like that using the existing data. What we have seen, though, is that just the mere act of going home in VR, to a place that matters to you that you haven't been to for a while, connects with people. It means something because it's something you're familiar with, it's a place you've been to before, and getting to see it immersively, as opposed to, you know, in just a photo or a video, does something different to you.
[00:08:32.008] Kent Bye: Yeah, I was just looking at Google Maps recently, and I flipped into a 3D view, and was surprised to see that there's actually a lot of 3D geometry that's been added to Google Maps. And I just was like, wow, I can't wait to see this in VR. And so I'm curious how that 3D geometry was generated, if it's some sort of combination of Street View combined with, like, satellite imagery and machine learning to extrapolate geometries. Or maybe you could just share a little bit about what's happening when you look at some of the actual volumetric shapes in those map views?
[00:09:05.771] Clay Bavor: So I guess to start just to say it, the geometry you saw there won't be a part of this first launch of Street View on Daydream, right? It's panoramic images without all the geometry. I don't know the details of all of the techniques used to create that geometry. But there's a blend, including techniques similar to photogrammetry, where you use photos of the same object from different angles to infer geometry and then texture map it. Some other more sophisticated techniques that we use as well. And they're used in different ways at different levels of kind of where you are in the earth imagery. One approach for kind of building level geometry, maybe a different approach for something that's more street level, where you may have different sensors available. I'll just say that we notice and like that geometry too and we think there's some interesting things we could do with it and some pretty neat experiences to unlock.
[00:09:55.033] Kent Bye: Yeah, definitely. That's probably one of the biggest things that I'm personally looking forward to: being able to actually have this near-field view of different cities in that way. But in terms of the big announcements here today, I think there's a lot of emphasis on AI, specifically the Google Assistant and Google Home. And I'm just curious to hear your thoughts in terms of how you see these new conversational interfaces mixing with this virtual reality, 3D immersive computing.
[00:10:22.920] Clay Bavor: Great question. I come at it from this perspective of, what has Google been about? Why are we doing VR? Why are we doing AI? And from the beginning, Google's been about information: organizing it, making it sensible, giving people access to it. And so on the VR side, I really think about VR as kind of this evolution along this continuum of computing experiences, from punch cards to the command line to the GUI to touchscreens to, in the future, being able to just interact with things in the same way that, you know, you or I interact with things in the real world. So, information gets richer, right? There's a reason that, you know, you're here talking to me in person as opposed to, you know, us having a phone call. And the computing interfaces become more natural. And I think we're only just scratching the surface of what you can do with kind of volumetric computing interfaces enabled by VR, by AR. AI similarly makes the computing vastly more intelligent and, we think, more useful, helping you wade through data that otherwise you wouldn't have the time to, pull up facts, do things for you. The intersection of those two things, I think, as you pointed out, is really exciting. And you can imagine in the near future simple things like, OK, Google, take me to Paris. That would be neat. I think that just begins to scratch the surface. I envision, many years out, what would an interface that is all around you, that uses your voice, your hands, your eyes, other forms of input, backed by an intelligent assistant, what would that system look like, right? If you were thinking, you were building something, you were doing something, and you had the ability to have someone, some agent, the Google Assistant, pull up any fact, any piece of data, any object, any place, and a far higher bandwidth interface to create, to think, to do, and so on. And so, I think it's early days for both VR and for these kinds of AI-based assistants.
I see some neat things happening even in the relatively short term. And further out there, I think we end up just engaging, interacting with computing in a fundamentally different way. And I think that is incredibly exciting.
[00:12:29.369] Kent Bye: Yeah, the thing that I found really striking just playing with the Google Assistant on both the phone and the Google Home was that there was a conversational interface and I was asking it, okay, tell me the latest books from Neal Stephenson. I had a little bit of trouble with the Google Home making a list of different things; it was trying to pick one of those things, but on the phone, when I asked that same question, it would just show the list and then I would be able to kind of scroll through it. To me, it feels like there are different outputs depending on the interface: whether it's just a voice conversational interface with the Google Home, whether you're doing it on the phone screen, and then potentially even with VR, which is perhaps going to have even more affordances for output. So I'm just curious to hear some of your thoughts on where you see that going.
[00:13:12.403] Clay Bavor: Well, I love the idea, and I think you've perfectly captured how, as the canvas for the Assistant to express itself gets richer, wider, better, it can do different things. Coming back with a voice answer, it can come back with a single fact. On a phone, it can come back with an image, right? It can come back with a list, as you say. I haven't thought that much about, well, what if the Assistant can come back and put any number of things in space around you, in any environment? I think that's pretty powerful, right? Coming back to you with not just words, not just an image, but, like, here's what it's like there. Or here are all the different models of that shoe, right, on a table in front of you that you can look at and inspect. And so, again, I think the possibilities at the intersection of these Venn diagram circles are many and very exciting.
[00:14:01.651] Kent Bye: Well, it seems like another big thing is just this knowledge graph, which it seems like a lot of the answers are coming from the knowledge graph. So would there be a 3D immersive exploration of a knowledge graph? Because a lot of times when you get the answer, it's sort of the top result, you know, but you may actually want to see the full spectrum of all the different results. So I'm just curious, like, how you think of what this knowledge graph is and how you might be able to interface it in VR.
[00:14:25.089] Clay Bavor: I think it would take an Edward Tufte or a master data visualizer to think about how to travel through the knowledge graph in VR, explore it. I think it's interesting. I actually think a bit the other way, which is, what new things do we need to bring to the knowledge graph in order to make it more useful in VR, and in VR's sister technology, AR? And I think we'll be seeing more and more of an emphasis on 3D, on objects, on the physicality of things in space. Not just kind of their GPS coordinates, but it's here, exactly. And enabling things in the knowledge graph, just as today they relate to one another conceptually, to relate to one another spatially. So I think five years from now, the knowledge graph looks like a knowledge graph with, call it a space or location graph, on top of it that relates things to each other in size, relative position, location, and so on. I think we're just scratching the surface on that bit, though.
[00:15:23.449] Kent Bye: Can you talk a bit about your own personal experiences of using this new 3DOF controller and what you're able to kind of do within the context of different VR experiences and content that you've seen developed so far?
[00:15:35.912] Clay Bavor: First of all, I think some of the most compelling experiences in mobile VR right now are just these immersive, high-field-of-view VR videos where you can go someplace else and really feel like you're there. And the reason for that is everyone has something in the world that they care about. A sports team, a performer, an event, a place. And your smartphone, a mobile VR headset, something like YouTube, can connect you with that in a way that photos and videos can't. And I've spent a huge amount of time just in YouTube VR experiencing those places and so on. And there's something really nice about being able to sit back, relax, hold the controller loosely at your side, and use it just as you would a pointer, a remote, and swipe through things comfortably. It's incredibly relaxing and transporting. It's really, really nice for that. I think where it's more interesting is where you have a lot of interactivity, and one of the experiences I feel really showcases the controller, and why it's important to decouple the controller from your gaze, is Gunjack 2. So we worked with CCP on the next version of Gunjack, and with it you can look all around you, but the aim of the laser cannons is independent, right? So you can look at what's coming in over there, but be finishing off the aliens right over there. And, like, how do you aim something? You point, right? You point, and that's exactly the way the controller works. And so that's very compelling. The other one that I think is neat, because it so directly connects the VR experience with the controller itself, is the Fantastic Beasts experience, where you're a wizard, you're holding a magic wand, and it feels like you're holding that magic wand, looking around the room and casting spells with it. And so it's that kind of direct connection between the controller and the magic wand that I think showcases what you can do if you do a 3DOF controller thoughtfully. And I think it shows some of the promise.
I'm most excited to see what people do with it that we didn't anticipate, whether it's piloting spacecraft or, as I talked about, drawing with it, other things like that. So I'm looking forward to that.
[00:17:48.118] Kent Bye: Is it technically feasible for people to buy, like, two Daydream View systems, use two controllers, and then potentially create an experience that uses both controllers in some way? Or is that not part of the SDK yet?
[00:17:59.648] Clay Bavor: It's not part of the SDK. One of the things that's been really important to us with Daydream are standards: quality standards for latency, resolution, all the things that come together in creating a comfortable VR experience, but also standards for developers, so developers know what to target. And our goal with Daydream is not to make one device, one phone, but rather to do a lot of the heavy lifting, the hard work in software and hardware, to enable the amazing hardware manufacturers, smartphone makers out there, to VR-enable their own devices. In order to enable developers to build great experiences that they know will run across the gamut of devices, we really want to point people to, hey, this is the controller. Will the controller evolve over time? Of course, right? But for now, we want to point developers to, hey, here's what the user will have. Every user of a Daydream device will have access to this thing. And we think that's going to create a level of consistency and quality, both for users and for developers, that we think is very important.
[00:19:01.894] Kent Bye: So a lot of times when people watch a YouTube video or other experiences, there's a little symbol of a headset, and right now that mostly indicates support for Google Cardboard. Do you foresee a time where you're supporting both Daydream View and Google Cardboard, at least in the short term? And in the long term, are you going to continue to maintain support for Google Cardboard viewers?
[00:19:27.927] Clay Bavor: Well, we're going to continue to support Cardboard. There are millions and millions of them out there in the world, and we expect there will be millions and millions more. And the reason is, this piece of cardboard, for a couple dollars, unlocks this capability in your phone, and it gives people a taste of VR. So I think that will continue for many years to come. Our focus, though, has really shifted towards what you can do if you design a phone, a controller, a headset, the operating system, everything in conjunction with each other, to build really, really high-quality VR. And, you know, I expect we'll see a lot of the best Cardboard apps being upconverted to Daydream to make use of the controller, to bring in all of the APIs that enable low-latency rendering, and all of those things. And I think the two will live on happily, at least for the foreseeable future.
[00:20:17.904] Kent Bye: And can you comment on where you see the technology going in terms of digital light fields, if Jump is using any digital light fields, or if Google is looking at how to deal with digital light fields and real-time ray tracing, where things may be going in the future?
[00:20:33.651] Clay Bavor: So Jump was actually the first project we started within our team after Cardboard. And we started it because we thought, A, the real world is interesting, and we're all excited to see something. And B, content creation in VR is hard. It's hard to create novel 3D environments. And at the time, it was torture to create omnidirectional stereo video, and even the results then weren't very good. And so the ability to capture the real world in a way that's visually compelling and realistic has always been important to us. We're exploring a bunch of techniques, from Jump on up, to more and more accurately reconstruct scenes. You know, in the limit, a densely sampled light field is kind of the, quote, correct way to capture the world. You basically have every possible viewpoint on the scene. I think there are a lot of still-unsolved problems in that area, though. What is the camera geometry? How do you get enough cameras together? How do you synthesize the views? How do you represent it? How do you compress it? And these are hard problems that, you know, we see a bunch of companies working on and making progress on. I'm really excited about what Lytro's doing. Otoy has shown some interesting things in light fields. So it's something we're keeping an eye on, again, with, like, this goal of, hey, the real world's interesting, let's help people see it, in the back of our minds.
[00:21:53.040] Kent Bye: So WebVR is something that there's going to be a big workshop here in a few weeks. A lot of the big players are coming together to talk about WebVR. I'm curious, since you're Google and you have scanned and searched through the whole internet, if you have any ideas of how you're going to search and be able to discover WebVR experiences and be able to deliver those to people.
[00:22:14.803] Clay Bavor: I think the first step in searching and discovery is actually getting a lot of the foundations of WebVR in place, and I think that's what this past year has been about, and that's what this upcoming event is about. I think for Google, which was born of the web, born in the web, the search engine was useful because the web became bigger, more valuable, and at the same time harder to search. We really believe in this notion of an open VR ecosystem, one where things can be brought up, loosely connected, and interoperate without the developer, the creator of an experience, needing to use this specific thing for this specific platform and so on. And so there are a lot of things that the web got right. I think before we worry a lot about searching it and discovering it and so on, we need to figure out, well, what is identity in WebVR? We need to think about, how do you represent a scene? How do you relate scenes to one another? How do you get the performance up to a level that feels really, really comfortable? And so I think for the next year, that's going to be the focus for us, and for the larger community, all pulling together on WebVR. But if we get those foundations right, I think we build something that's as organically useful, that can grow and flex as much as the web has. And my hope is that it maintains some of the most important attributes of the web: its flexibility, its openness, and its ability to adapt as technology evolves and people figure out how to use it for different things.
[00:23:42.334] Kent Bye: Yeah, it sounds like what you're saying is in the short term you're doing an app store, but yet in the long term you really believe in the potential of the open web and what WebVR could do.
[00:23:50.397] Clay Bavor: Well, even in the short term, we're doing both. In fact, we have some pretty neat experiments coming with WebVR. I can't share too much about that just yet, but it's something we're really pushing on. What we found is that the app model, the APK specifically on Android, gives you access to lower-level hardware features that are available today, lets you do performance at the level that we think is important, and provides hooks into the operating system that the web can't quite yet. But, you know, like, we have Chrome. So we've been thinking about, you know, Chrome and WebVR and Daydream and how those might fit together. And so stay tuned.
[00:24:24.593] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:24:31.756] Clay Bavor: Well, that's a big question, and we should talk for another 40 minutes. I guess just to say at first, I think it's important that folks in the VR industry are kind of honest about what the ramp of VR is going to look like. Look, I am one of the most excited people in the world about VR. I see its potential. But I'm also honest that it's going to be a while before it realizes its full potential. You know, today, very few people have experienced VR. Next year, slightly more will have, but it will still, on a relative basis, be a small fraction of the world. And so, I can't remember who said it, but someone said, people tend to overestimate the impact of a new technology in the short run and underestimate the impact in the long run. And I think that's right with VR. So, anyway, just as a backdrop to my answer here, I think this is not stuff that happens, you know, next year or the year after that. I think when you fully realize VR, the technologies will get to a place where the experience, the visual experience, the auditory experience, and, in time I hope, the haptics and touch experience, approach, you know, the experience that you and I are having right now. Like, we're here. And then you have the capability to put someone in a place that truly feels real. Not where your brain is kind of tricked into feeling like it's somewhere else, where it's like, oh yeah, I'm in another place, I'm present. But where, if you think about it, you can't actually tell, am I there or not? For me, what's most exciting about that is what it will do to bridge space, to connect people with places and experiences that they care about but might not have access to. And I'll just give a couple examples. Basketball. One of my best friends has recently gotten me into basketball. I'd only ever watched it on TV. I went to one game and I saw it, not quite courtside, but with good seats. I was like, this is amazing. It's a different sport. It's incredible.
But how many courtside seats are there? Like a couple hundred. And unless they change the rules of basketball and play on a much bigger court, there will never be more than a couple hundred courtside seats. And so in time, VR will let anyone who's into basketball be there with courtside seats. Ten million people could go to the next Warriors championship, you know, knock on wood. You know, a hundred million people could go to the World Cup or the Olympics. And there's so much happening in the world, beautiful, interesting places to see, things happening, that most of us will only ever see a small fraction of. And I love this idea that, here, with a pair of goggles, you give everyone a ticket to the best seat in the house at whatever they care most about. People can explore, see things that they otherwise wouldn't be able to afford, have time to see, or that are just inaccessible. Like, what does the inside of an active volcano look like? I don't know, and I don't want to go see for myself. But I would like to know, you know, via VR. There's that. Something that's related, and I'll just finish here: I'm very into photography. I've always thought about photography in kind of two ways. One, how do I see this thing beautifully? How do I represent this scene beautifully? But also, hey, this is a nice moment. How do I preserve this moment? And how do I basically give myself a cue to recreate this moment in my mind? And for a lot of us, that's what our photos are about. It's a little prompt to, oh, you know, I'm proud of that. Or, oh, that was a great time with that person, that friend. That was beautiful. And I think in time we'll end up with, you know, personal VR memory cameras, right? Where you can record a slice of life and come back to it. And I think people will find that incredibly powerful. And, you know, the home movie from the birthday or Christmas will be replaced with, you know, the home memory of the birthday or Christmas.
And I think that's pretty profound: how we experience our lives, how we look back on our lives, and kind of take in, make sense of, and go back to the most important things that have happened to us in our lives. And I think VR will do a lot more than that; we'd need a bit more time to cover it all. But those are two of the things that really, really motivate me to make this thing great, to bring it to more people, and to push it forward.
[00:28:54.328] Kent Bye: Awesome. Well, thank you so much, Clay.
[00:28:55.889] Clay Bavor: Thank you, Kent.
[00:28:57.224] Kent Bye: So that was Clay Bavor. He's the Vice President of Virtual Reality at Google. So I have a number of different takeaways from this interview. First of all, one of the smoothest experiences that I had within the mobile VR headset was the YouTube app. It was actually very impressive how good the quality was, and I think that's probably going to be one of the biggest overlooked things to come out of today's press conference: just how solid the YouTube application and integration actually is. Because if you look at mobile VR headsets, I think one of the things that has been a problem is that you've had to download these videos in order to get good quality, which means you've had to have all the space cleared, you have to wait for the download, and it's not just a dead-simple push play and it works. And I think the overall experience of YouTube has really improved over the years in terms of streaming video. You just don't see as many buffering issues as you did in the early days of online video. And I think in VR it's going to be even more important to have a solid stream of the content. Now, one criticism that I have about the Daydream View headset is that I don't think the optics were necessarily really synced up with the software barrel distortion correction. In other words, there was a bit of a sweet spot where if you look straight forward, it looked good. But if I were to look down at the edges, then I started to see a lot of distortion in what should look like straight lines. If I were to move my head, then the whole world would kind of have this warping type of feel. That warping was, for me, a little bit motion-sickness inducing, just because it felt like I didn't know if it was the world that was moving or if I was moving. It's just something where I hope they take a little bit of a deeper look at the optics and the software barrel distortion that comes out of that.
Overall, the headset's super light. It's a lot easier to actually put in the phone than other headsets like the Gear VR, where you actually have to plug in the USB connector and kind of click it in. With this, you just open it up, do as good a job as you can lining it up, and there are some sensors in there that do some automatic centering so that you don't have to, you know, get it precisely right. And the thing I'd say about actually using the 3DOF controller is that there are four different types of presence that I like to talk about: social and emotional presence, but also active and embodied presence. Now, the difference between active and embodied presence I think is pretty instructive here, because with embodied presence you get the feeling that as you move your hands around, those are actually your hands, and you feel like your body is actually in VR. If you play a game like Audioshield and you're just, like, punching these balls that are coming at you, you really just feel like it's you punching those balls. Well, with a 3DOF controller, you don't have that sense of embodied presence. You know immediately that when you're moving your hand, it's some sort of abstracted representation of your movements, but it's not actually your hand. And to me, it was a little bit of just testing that to see, like, okay, how does this actually feel? Does it feel like I'm actually present with my body? And the answer was no, I don't feel a sense of embodied presence within the Daydream View. And I don't think you're able to with a 3DOF controller.
However, active presence is when you start to do things actively, like actively manipulating different objects within the experience. You may be holding, like, a magic wand, and you start to feel like you're actually holding the wand and doing things with that wand. But at no time did I feel like it was me actually holding that wand. It was more like I'm able to exert my will and agency within the experience. And I think that's going to be the strength of mobile VR, especially with Daydream: it's going to be really focusing on that type of active presence. So one example of one of the experiences they had was Resolution Games' Wonderglade, which was kind of like this mini world with lots of different carnival park games. They were really just showing one of their demos, but this particular game was using the 3DOF controller. So it was kind of like a marble race game where you're able to control the x and y axes of this table plane, tilting it up and down, and you're basically guiding this marble-like device around a loop on this obstacle course, having a race with a computer-controlled ball. And as you're moving it around, it was kind of an interesting feeling, because your hand is moving the table. With the 3DOF controller, you're kind of tilting it in different ways, and with that tilting you're actually impacting the world. And so that's another expression, I think, of active presence, where it's more of an expression of your agency, but not necessarily feeling like it's your body that's doing that. One thing that I did notice, at least in the demos I was doing, is that it kept needing to get recalibrated. So for example, if you were pointing the pointer straight forward and then went into an app and out of an app, the pointer might be pointed off, like, 45 degrees to the right. So if you needed to point forward, you'd have to kind of overcompensate with your pointing.
Or, I didn't realize that you could do this, but you could just recalibrate and recenter. So that was a surprising and new thing for me in actually using it. In terms of how it actually fit onto my face, there was a lot of extra light leakage happening. It wasn't completely snug to my face. Once you're in the experience and fully immersed, it's usually okay, but I was seeing some glare from sunlight, which kind of takes me out of the experience a little bit. Overall, it does feel like an upgrade from the Google Cardboard. I can see how this is going to be a great platform for watching 360 videos. They didn't show anything from Netflix, Hulu, or HBO, but they announced that they're planning on having those entire libraries of content available for the Daydream. And I think YouTube is really kind of a killer app in a lot of ways, because you're just going to have access to all these different videos and be able to see them on a big screen. But also the 360 videos, I think, are going to take off a lot more. I know that Brandon Jones is one of the leads of WebVR; he works on the Chrome team. And one of the announcements that he had recently was that Microsoft actually came in late into the game and started to add more things into the WebVR spec. The good news is that there's going to be a lot more support for augmented reality applications within WebVR. The bad news is that the intended launch date of WebVR is going to be delayed a little bit, in terms of needing to let the WebVR spec bake. And so they're going to be having a big meeting here in a couple of weeks with all the major players of WebVR, and they're going to be getting together and kind of hashing out the spec a little bit more, I think. And so it's something that's really gained a lot of momentum throughout the entire industry.
Just reading some of the tweets from Josh Carpenter (you know, he was at Mozilla, and now he's at Google as well, working on WebVR), I think Google actually has a lot of big plans for WebVR. I can expect that a lot of their applications and websites are going to be WebVR-enabled, so that you're going to be able to jump into, like, an immersive computing platform. And especially with Google Tango, we're eventually going to be moving to a world where we're not going to have a lot of screens, but just, like, augmented reality glasses, and I could imagine that a lot of this content that's on 2D screens now is going to be made available within a 3D immersive environment. The thing that I really took away from playing around with the Google Home today was that there are different mediums for the Google Assistant, and depending on the medium, it's actually going to change its output. So something that's in a list format doesn't actually translate very well to speaking. I asked one of the Google representatives, hey, can this, like, read a website for me? And they were like, no, it actually isn't a very good experience to just have it read a random website. And so I think they're moving more towards, like, snippets, and audio snippets as well. So actually, I think the podcasting part was really cool to see: saying "OK Google, play me the latest episode of the Voices of VR podcast," and that command alone would just pop up and start playing the latest episode of the Voices of VR, which was very delightful for me. To me, it's really curious to see this Google Home moving more and more towards audio interfaces and audio output. There's a higher bias towards content that's natively in audio. You know, it's a lot of just playing your music, but also playing videos. You can hook up the new Chromecast to the Google Home and say, "OK Google, play me the latest videos on this TV," and then it'll pipe the videos from the Google Home through the Chromecast onto your television screen.
So you really see this Google Home being, like, this central interface, this conversational interface that's going to be your way of interacting with the Internet of Things. I think eventually we're going to have more augmented and virtual reality interfaces to the Internet of Things, where we could actually have visual feedback of what's actually happening, so we don't have to just rely on the conversational interfaces. But conversational interfaces, I think, are going to be a lot more important. Just going to the O'Reilly AI conference, it's a whole new paradigm of how you do a conversational interface, because the existing paradigm of using the back button is not going to be the same type of way that you're interacting. Like, when you're navigating a web page, you could go back two pages and you know what that means. But in the context of a conversation, which is more serial in the time domain, if you say "back, back," that doesn't actually mean much to you. And so it's going to be more difficult, with the context switches and moving around. This is something that I'm actually really curious to start to play around more with on the Google Home, to see how this conversational interface user experience is going to evolve. Because I think eventually it's going to be moving into virtual reality a lot more, and I think we're just seeing the early days of that right now. But I could see a time in the future where a lot of your interactions with the Google Assistant are going to be through an embodied non-player character, an artificially intelligent being within virtual reality. And you're going to be able to get a lot more contextual information through nonverbal body language, and it's just going to feel a lot more like interacting with another human. So it's going to be moving from audio-only into, like, a more visual manifestation of these Google Assistants.
So given the same question to the Google Assistant, whether it's through the Google Home, through the phone, or through virtual reality, I think eventually all three of those answers are going to be different, because those answers are going to play to the strengths of whatever that medium is. And just from the conversation I had with Clay today, I don't think they've really fully fleshed out what that's going to look like, and all the new capabilities that are going to be possible within virtual reality. But I can imagine having an entire immersive experience of the data and the websites, and not just kind of the top Google results that the Google Home is giving you. Whereas when you do a Google search on the phone, you get the top ten results, what do you get when you do it in VR? What does that preference? If the Google Home is preferencing audio, is VR eventually going to preference the visual information, or is it going to preference the 3D information that's available for that content? So it's still early days, but these are really the basic foundations of the 3D immersive computing platform that's out there. And having it available at large scale, I think, is the ultimate goal of what Google is going for, and Daydream is going to be the first step towards that. And I hope that the quality is going to be on par with some of the other mobile VR headsets that are out there, like the Samsung Gear VR. Right now, I'd say the Gear VR wins out in terms of the quality of the experience that you get. But in terms of the content and the things that you're able to do, operating with Google and the conversational interfaces, the artificial intelligence integration, and being able to have a 3DOF controller rather than just using the trackpad on the side, I think it's actually a better experience. So I think over time, it's going to get better.
And so finally, the thing that I'm most looking forward to with the Google line of products is Earth VR. For anybody who hasn't checked out Google Maps, go check out just zooming in, clicking the 3D button, and then starting to look at your residential neighborhood, wherever you may live. You start to see that they've added quite a lot of 3D polygons, giving a lot of shape to something that used to be very flat. You know, they have something that sits somewhere in between the satellite photos and textures and the Street View imagery, and they're able to use all the data that they have to extrapolate actual 3D models of a lot of the metropolitan cities. And I think it's going to look pretty amazing when you're able to kind of fly across that within Earth VR. Right now, Google has a few things on the Vive. Tilt Brush is one experience that they have been continuing to improve in a lot of different ways, adding all sorts of new features, with multiplayer and all sorts of ways to do animation. They're just taking that experience to the logical extreme of what you could do with it. And it's absolutely amazing what is going to come out of Tilt Brush in terms of being able to go into VR and actually express yourself creatively, and being able to generate entire VR experiences from within Tilt Brush, which I think is amazing. But something like Earth VR, I don't know if that's going to be compatible with the Daydream. There wasn't any new information about it. It may be something that requires the horsepower of your computer and might be a Vive-only experience, but I'm still holding out hope that they'll have more information about that after the launch of Daydream, where you'll be able to actually roam around and do these virtual flyovers over cities and be able to travel anywhere in the world, which I think is a big dream of what's possible within VR.
So if any company is able to really pull that off, I think Google is one to really look out for. I was told: stay tuned, more information soon on that front. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then spread the word, tell your friends, and become a donor at patreon.com/voicesofvr.