The keynote from Unity’s Vision VR/AR Summit yesterday showcased just how far Unity is reaching into non-gaming content when it comes to augmented and virtual reality. Unity’s xR strategy is led by Tony Parisi, a co-creator of VRML and a long-time open web advocate. Tony has long been interested in using VR for artistic expression and storytelling, and the keynote speakers highlighted the diversity of immersive technology applications, ranging from NASA JPL to car companies to the NFL to graphic novel comic books to immersive storytelling to construction to the big tech players, including Facebook, Google, and Microsoft.
I had a chance to catch up with Tony at Sundance earlier this year to talk about his approach to leading VR & AR strategy at Unity and moving immersive technologies beyond just entertainment. Anyone who knows Tony can say that he wouldn’t have taken this job at Unity if there weren’t some long-term open web strategy involved, but he wasn’t prepared to provide any specifics on it yet. It’s safe to assume that it’s on the roadmap, though, especially with the news that Vladimir Vukićević, a co-creator of WebGL & WebVR, recently joined Unity’s emergent technology group.
LISTEN TO THE VOICES OF VR PODCAST
Here’s the two-and-a-half-hour Unity Vision VR/AR Summit 2017 Keynote
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR podcast. So on today's episode, I'm going to be featuring an interview I did with Unity's head of AR/VR strategy, Tony Parisi, back at Sundance of this year. Right now, happening in Los Angeles is Unity's Vision Summit. This is where they are gathering both augmented reality and virtual reality communities together to talk about some of the best practices. I am not attending the Vision Summit this year because I've already been to like eight VR events this year and I have a healthy backlog of over 200 episodes that I'll be working my way through here on the podcast. But I do have this interview with Tony, and it's worth digging up because we talk about both virtual reality as well as augmented reality. And looking at the keynotes that were happening today, there were three hours' worth of keynotes that they were showing. And quite a big portion of them were featuring augmented reality technology. So I'll be talking to Tony about some of his vision of what he's trying to do with heading up the VR and AR strategy at Unity. And I really see his fingerprints on bringing together the entire ecosystem of what was just shown during the keynotes at Unity's Vision Summit. And Tony also comes from a background in WebVR. So he's somebody who, at heart, really wants to eventually meld different open standards into the virtual reality workflow. So I trust that he's going to be able to bring that vision to pass at Unity at some point. And so we talk a little bit about that and some of the technological blockers for actually making that happen. So, that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Voices of VR Patreon campaign. The Voices of VR podcast started as a passion project, but now it's my livelihood. And so if you're enjoying the content on the Voices of VR podcast, then consider it a service to you and the wider community and send me a tip. Just a couple of dollars a month makes a huge difference, especially if everybody contributes. So donate today at patreon.com slash Voices of VR. So this interview with Tony happened at the Sundance Film Festival that was happening in Park City, Utah from January 19th to 29th, 2017. So with that, let's go ahead and dive right in.
[00:02:41.881] Tony Parisi: I'm Tony Parisi, head of VR and AR strategy at Unity Technologies. Great.
[00:02:46.404] Kent Bye: So the last time I talked to you was at the Oculus Connect 3, and you were just working on a lot of web VR things. So maybe you can tell me a little bit about that process of you deciding to go join Unity and what you hope to do there.
[00:02:58.356] Tony Parisi: Yes, I've known a lot of folks at Unity for several years, and I've been talking to them on and off about various opportunities, but at that particular moment in time, the last time we got together, Kent, we were at Oculus Connect, and I was working on WebVR technology showcases; some of them were actually featured, there was a session during the day, and it was featured in the keynote, and I was pretty excited about doing VR on the web. And I was thinking of starting my own startup, actually getting really close to doing that, when I met the executives at Unity, and they explained to me that they really needed someone to lead the strategy for the company from a business standpoint, but it touches kind of everything: product, marketing, how we tell the world about VR and AR and our involvement in it, business development, and our partnerships across the board. And they needed to look at that in a holistic way, because they looked up one day and realized that most VR and AR development was happening using their software platform, the Unity Editor and Engine. And the company had done such a good job at democratizing game development, so indies all the way up to big development houses could do gaming. And the technology overlap between game software and virtual reality is so big that Unity became a natural place for people to start their VR and AR development. Also, over the preceding couple of years, all the folks from Oculus and Valve and all the headset makers had partnered closely with Unity to get first-class support for being able to author for their devices. That had been done sort of as a strategic bet on Unity's part that this was maybe going to be something big and interesting, and it became so much bigger and more interesting than the company had thought. All of a sudden, people are trying to build car configurators and real estate walkthroughs and cinematic VR, like the kind of stuff we're experiencing here at Sundance. And the company needed to figure out what to do with that as a business, and that's why I'm here.
[00:04:48.320] Kent Bye: I see. And just recently, Unity has announced Editor VR and the decision to open source it as a way to actually create VR within VR. So maybe you could tell me a bit about that project and the decision to open source it and create it as this community tool.
[00:05:03.763] Tony Parisi: Okay, yeah, so Unity has a division called Advanced Research, or Unity Labs, as it's more commonly known, and Unity Labs does research projects that don't necessarily make it into product; they're looking at things one to three to five years out, and even beyond, and they do a lot of work in VR and AR. And there's this project called Editor VR, which is a way to be inside an immersive, positionally-tracked, room-scale system, such as the HTC Vive, and actually build your virtual reality inside of it. Or you could even build a 3D video game that is just rendered on a flat screen, but you're using VR as the editor. So you're inside the environment, and you can pick up and manipulate objects, at the moment with your Vive controllers. I think they have it working for Touch now as well. And that's pretty exciting, and I hadn't thought about it this way, and it was actually Philip Rosedale from High Fidelity who pointed it out when he and I were at an event recently. He put up a slide that just slayed me. It was a picture of an Etch-A-Sketch and a mouse. And he pointed to the Etch-A-Sketch and said, think about how hard it is to draw in 2D with this 1D input device. You get your two one-dimensional knobs and you have to kind of coordinate them just right to draw in two dimensions. We have the same problem of trying to make 3D content with a two-dimensional, two-degree-of-freedom input device, the mouse. And as an industry, we've been doing it that way for three, four decades already. So the idea that we could actually use hand controllers and haptics and other inputs that are native to VR to create 3D is a really exciting one. And that's why Editor VR came about. And so now we have decided to open source that project. We haven't open sourced our entire editor, just to be clear. It's just the Editor VR project, which is a set of extensions and add-ons to our core Unity 2D editor.
[00:06:45.981] Kent Bye: Yeah, and also I just saw recently that Tilt Brush had created some tools to integrate more tightly with Unity to be able to create scenes and animate them and then potentially even output them into an app. Maybe you could talk a bit about what that Tilt Brush SDK connection to Unity is going to be able to enable.
[00:07:01.987] Tony Parisi: So, I was so excited to see that news come out just last week around the Tilt Brush SDK for Unity. So imagine that you're a Unity developer and you want to make an interactive experience. Now you can actually incorporate Tilt Brush-created art into it to do that. Or think about it from the other standpoint: you can think about Tilt Brush now as a VR application creation tool, where you do most of your design using a Tilt Brush-style tool, then you bring it into the full-featured Unity editor and integrate it with the engine to make the bits of your environment art interactive. I just can't even imagine what the possibilities are going to be for that. So we've seen Unity become this development platform over the last few years, beyond just a simple game engine. It's really become a full-service platform. And now it's combining with something like Tilt Brush, which is an art creation platform and a sort of social platform where people are sharing all their creations. I just can't even imagine what people are going to be able to build with it. Super exciting.
[00:07:57.060] Kent Bye: Yeah, and I know we've had a number of different discussions going back to May of 2014, talking about this potential of WebVR and the tension between the closed app ecosystem mindset versus the open approach. And I know that Unity's had an approach for being able to export into WebGL, but, you know, it had a lot of different limitations in terms of having to download the entire engine and just the loading issues. And it feels like WebVR's got this momentum and is really going to be taking off in the future. I'm just curious, as you've been intimately involved in both of these communities now, working with web technologies for a long, long time, and now also looking at Unity as that app ecosystem, how those two worlds are going to come together and kind of play off each other. I'm curious to hear your thoughts.
[00:08:44.309] Tony Parisi: So it's a really big landscape, and I think the idea that all content and application functionality will be delivered via a packaged app through an app store, I think that's such a limited idea, and I don't believe it's going to be that way. I never have. Certainly I didn't believe that while I was working closely with the folks in the WebGL and WebVR communities, and I'm still working with a lot of those people now. I still communicate with everyone at Oculus, and Mozilla, and Google, and all the people at Samsung and Microsoft making WebVR implementations. It's a really exciting area. The idea that you could access VR experiences just at the touch of a hyperlink, or through things that are shared in a social feed or via email, means you don't have to make a conscious decision to download a packaged application. I think a lot of use cases are going to benefit from that. Maybe in areas like travel, like we were experimenting with back at OC3, where you could do a virtual TripAdvisor kind of experience via the web. Or other areas where you just know the end user is not going to take the time and bother to access an experience if they have to go download and install it, but if they were just shared a link by some trusted source like a friend, they will open that up and check it out. There's no friction that way, right? I think for a lot of different use cases we're going to see that, so WebVR is really exciting. Now you mentioned Unity's approaches to doing WebGL in the past, and yeah, that was done with a set of technologies, and I think I've talked to you about this even on this podcast, a set of technologies called Emscripten and asm.js. That's a very techie way of saying that basically what they did was cross-compile all of the code for the engine itself to get it to run in a web browser as this really low-level JavaScript. The net result of it is a very large set of JavaScript code, about 10 megabytes. The problem with that is the only people who are going to wait for that thing to download and watch a loader bar would be people who are going to play a game. And those people don't need web technology. They can just go download a game or go get it on an app store. They're comfortable with that use case and that usage pattern. They'll pay their $1.99 for the mobile game or their $50 on Steam for a SteamVR title or anything in between. And they're willing to wait for the download, they're willing to install the app, because they know what they're going to get. It's a game, you know, that's the world that they're in. And from the developer standpoint, it's a packaged piece of content they can monetize, so the developers have a motivation to do that. And doing all that through a web distribution model, it's not necessary. But for all these other casual applications, you want that web distribution, but you can't have the pain of waiting a couple of minutes to download the engine code, plus then getting the experience. We're going to need to figure out ways where there's much quicker access to Unity-created web experiences. Unfortunately, I can't actually get into specific details about what those might be or when the company would do that, but we're looking at it, obviously. I'm there now, and I've had a deep set of experiences around that. And I'm talking to everybody strategically about how we do that. But beyond that, it would be getting into that kind of product strategy and timing.
And, as is typical for a company like Unity, we don't talk about those until we're ready to announce the product. And that's when it's ready, right, in some beta or preview form. And we're not there yet.
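To make the "touch of a hyperlink" idea above concrete, here is a minimal sketch of how a page could enter VR with the WebVR 1.1 API that browsers were shipping experimentally around this time. The element IDs, button, and render loop are illustrative placeholders, not anything from Unity's exporter or a specific site.

```typescript
// Minimal sketch of link-to-VR entry using the WebVR 1.1 API of this era
// (navigator.getVRDisplays / requestPresent). Element IDs and the render
// callback contents are placeholders for illustration only.
const canvas = document.querySelector('#vr-canvas') as HTMLCanvasElement;

async function enterVRFromLink(): Promise<void> {
  const nav = navigator as any; // WebVR 1.1 was never in the standard DOM typings
  if (!nav.getVRDisplays) {
    console.warn('WebVR is not available in this browser.');
    return;
  }
  const displays = await nav.getVRDisplays();
  if (!displays.length) return;
  const display = displays[0];

  // Presentation must start from a user gesture, e.g. clicking the shared link.
  await display.requestPresent([{ source: canvas }]);

  const frameData = new (window as any).VRFrameData();
  const onFrame = (): void => {
    display.getFrameData(frameData); // per-eye view/projection matrices
    // ...render the left and right eye views into `canvas` here...
    display.submitFrame();
    display.requestAnimationFrame(onFrame);
  };
  display.requestAnimationFrame(onFrame);
}

document.querySelector('#enter-vr')?.addEventListener('click', () => {
  enterVRFromLink();
});
```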
[00:11:44.684] Kent Bye: Well, just following the discussion around glTF, there seems to be a little bit of a fundamental infrastructure issue around being able to do dynamic streaming of content, just like the web is able to progressively download something. Is that something that you feel is going to be happening within the web standards community? When you go to a website now, the images are kind of loaded in in parallel, and the longer you wait, the more comes in. And look at something like the Google Earth experience; it's probably the first time I've really seen it executed in a really elegant way, where the geometry of a scene keeps forming, and the longer you wait, the more defined it gets. That seems to be the approach you'd need to really have something where you're able to just go to a site and go right into it. That's where I would hope for it to go. I was just in VRChat the other day, and going between worlds there's a big loading screen. I was like, okay, I know there's a world on the other side, I know there's people in this world, I want to go there, so I'll be willing to wait. But I kind of wish it would just progressively download. And I'm just curious to hear what you think are some of the biggest standards blockers and technological limitations for being able to kind of seamlessly go between these types of experiences in that way.
[00:12:54.631] Tony Parisi: Well, yeah, we're not quite there in the infrastructure to do this progressively yet. glTF really enables a way to universally access the content. It's a 3D file format similar to JPEG, but for 3D scene data. And that's getting a lot of industry support, so you see a lot of publishers starting to support it in these web settings. But it currently doesn't have any streaming technology built into it. It's sort of streaming friendly. As we were designing it, we wanted to make sure that you could build streaming layers on top of it. But that would be essentially downloading more and more bits and pieces. Like you were saying, you'd see more points come up over time. But there's nothing super advanced in there yet in terms of, say, compressing data and/or streaming it in a way where you could see something that starts as a lump, a little sphere or cube, and turns into the fully defined model of a dinosaur. There are technologies to do that. They're not wired into glTF yet, but they will come, assuming the basic use cases get out there and there's more and more demand for these kinds of experiences, so that the technology partners and tool makers will all invest in the technologies required to deliver them. If it becomes commonplace in three years for people to hit a link and be able to enter a virtual building that has been published by a real estate developer, to get a walkthrough of it before it's been built so they can maybe buy a condo or suite in there, then you'll see the demand rise for the ability to deliver bigger and bigger models, right? So we will get there, but it's just a little ahead of where we are today because the market demand isn't quite there yet. But all of the collaboration around the initial technologies, around glTF and WebVR and the tool makers, is happening. So that's kind of like just laying the roads, right, if you want to use a metaphor. We'll go with that one for a while. We'll lay the roads, we'll have a couple of pit stops, but there's none of the buildings or other infrastructure in there yet.
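As a rough illustration of the "streaming friendly" property Tony describes, here's a small sketch of how a client could take advantage of glTF's structure: a compact JSON scene description references its heavy geometry and texture data in separate binary buffers by URI, so those buffers can be fetched in parallel and handed to a renderer as they arrive. glTF itself doesn't define a streaming protocol; the function and callback names below are illustrative assumptions, not part of the spec or of any Unity tooling.

```typescript
// Sketch only: exploit glTF's split between a small JSON scene description and
// the external binary buffers it references, fetching the buffers in parallel
// and notifying a (hypothetical) renderer as each one lands.
interface GltfBuffer { uri?: string; byteLength: number; }
interface GltfDocument { buffers?: GltfBuffer[]; [key: string]: unknown; }

async function loadGltfProgressively(
  url: string,
  onBuffer: (index: number, data: ArrayBuffer) => void
): Promise<GltfDocument> {
  // The .gltf JSON is small and arrives first, so the node hierarchy and
  // materials are known before any heavy binary data is downloaded.
  const doc: GltfDocument = await (await fetch(url)).json();
  const base = new URL(url, location.href);

  // Kick off all buffer downloads in parallel; a renderer could refine the
  // scene incrementally inside onBuffer instead of waiting for everything.
  await Promise.all(
    (doc.buffers ?? []).map(async (buf, index) => {
      if (!buf.uri) return; // e.g. a GLB-embedded buffer has no external URI
      const response = await fetch(new URL(buf.uri, base).toString());
      onBuffer(index, await response.arrayBuffer());
    })
  );
  return doc;
}
```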
[00:14:41.492] Kent Bye: I'm curious to hear your thoughts about augmented reality and how that's maybe changing some of the existing technology stack at Unity. I know that VR obviously has huge penetration and support for all the major VR headsets, and that we're kind of moving into this immersive computing paradigm where a lot of the 3D user interface principles are being proved out in VR. But I'm curious to hear what the unique affordances of AR are that you see, and what Unity enables that, you know, you can do in AR but you can't do in VR.
[00:15:11.665] Tony Parisi: So yeah, one of the big things, obviously besides not being cut off from the outside world, is that you can see all your surroundings and then have 3D content, you know, objects placed in that environment in some real-world context. There's also this persistence to it, which is, like, if you've played with a HoloLens at all, most of the apps let you pin an object somewhere, and you walk out of the room, you come back tomorrow, that object is still there, right? So there are ideas of integrating the real world that's scanned in by, you know, computer vision technology; that's a fundamental thing around mixed reality, the headset-based AR. Combine that with the context of objects placed in there, and it's a magical integration of the real and virtual worlds, which is very different from VR, which is taking you somewhere else. AR is bringing the magic into the actual environment that you're in now, with everything from animated little objects all the way up to buildings you can look at in the distance that may not exist yet, but placed where they would be. It's a very different set of use cases that are all leaning on a common set of technology in terms of real-time 3D and stereoscopic rendering of that. And the inputs around hand gesturing or other controllers are very similar. But what's different is that tight integration with the real world. And as far as the technology stack goes, there's a little more you need to do in AR to be able to scan that in, and in Unity's case, you know, somehow bring it into the editor environment and make the app aware that it has the AR capability in it. Now, Unity is working with all the major AR headset providers as well, including even phone-based AR like Vuforia. We have a big partnership with them, and so Pokemon Go was built on Unity. So, you know, there's a debate about whether that's AR or mixed reality in terms of the headsets, headsets like Meta or HoloLens versus Vuforia. I look at those personally as a continuum. And especially when you look at phone-based AR, you can think of that as analogous to the way Cardboard brought VR to a mass market. It might not be the be-all and end-all, or may have a limited set of experiences around it, but it's a great way for consumers to become really comfortable with the idea that we're making this magical blend of real and virtual, right? So for most people, AR is Pokemon Go and experiences like that with this overlaid content. And that's okay, because while these other headsets are not even consumer-available and ready, given their price points and form factors and tethering in some cases, this is a great way for consumers to get comfortable with this and start expecting it, and the usage patterns will emerge around that. So when the AR headsets become more mainstream in terms of price and form factor, there's going to be content there and there are going to be consumers who know how to use it. So I think that's pretty great.
[00:17:39.545] Kent Bye: So you've been here at Sundance checking out some of the experiences. What have been some of the things that have been really jumping out for you?
[00:17:45.689] Tony Parisi: So first I want to say that I'm feeling like it's kind of a watershed for cinematic VR, and now AR. If we go back to New Frontier last year, let's say, VR was definitely the belle of the ball. A lot of people were really excited about it, but it was mostly still the early experiments of stick a camera in a room, record that, put a Gear VR on someone's head, and they look around and they get some video. And some experiments with interaction on the Vive and all that, and sort of world building and storytelling, cinematic as in not games, where you don't have a win condition, right? You're exploring in free form and you have some amount of agency, a range of agency, depending on the experience. But it still felt like those were experiments. And what's amazing to me this time is it really feels like we're getting to real content now, when you see Within's Life of Us, and I'm gonna rattle off three or four of the things that were made with Unity. More than half of the VR and AR in the world is made with Unity, and at Sundance that's true, too. I don't know the exact percentage, but nine or ten pieces featured in New Frontier were made in Unity. So I'm really excited about that, having joined the company a few months ago, that it's being used in these ways. So Life of Us, made by Within, with Chris Milk, an amazing director, and Aaron Koblin, the CTO, and their amazing team, created a whole new level of interaction and presence in my mind. If you haven't seen this piece, it's amazing. You inhabit a character in a multiplayer setting, and you communicate with the other people, and they're sort of your companions. It's a four-person experience, and I was doing it. My wife's here at Sundance with me, so we got to do it together, which was great. We're calling this VR couples therapy this week. Not that we had any real need for couples therapy, but it's really fun. It really brought us closer together. We start out as these little protozoa, and then we grow up to be fish, and then we become mammals and primates and people and eventually these gleaming robots. And along the way we're flying, running, looking at each other, talking to each other; it's voice-activated as well. But you're on rails, so you don't have to labor too hard at how do I move and how do I interact in this thing. You use your Vive controllers to kind of flap your wings when you're a pterodactyl. You can fly a little. And in some cases, when you're dancing as a robot, you're doing this sort of Tilt Brush-style swirly drawing, but it's ephemeral. You can draw these swirls in the sky, and it's set to disco music. It's so great. And it was an amazing way to communicate. And I hadn't been in anything that made me feel not just present, but like I was inhabiting the character. I was those creatures. I do think, actually, I probably have to think about the theory behind this a little bit. I think a big part of that is because I had that other person in there with me, in this case my wife, that it was a social VR experience. But again, you didn't have to worry so much about how do I operate, how do I locomote, how do I wave my avatar hands. A lot of that was done for you. You just got to emote and enjoy it. And being on rails made it part amusement park ride. But again, you were there with somebody else. And so I really felt truly intimate and present in a way that I hadn't before. So I think Within has kicked that up. And I believe that's not released yet.
I mean, that's a sort of early demo, and I don't know if they're selling it or turning it into a title yet. But it's just showing the way. These are going to become killer experiences, to the point that people have to have them. And so I was really stoked about that. And that's just one. I just saw Asteroids, made by Baobab. They hadn't made anything in Unity before, but they made this one in Unity. And they're really stress-testing all our new tools for doing cinematic types of creation. And we're learning a lot working with them. And that's the follow-on to Invasion. So that's still the Mac and Cheez characters in a space cartoon. You're in there with them. You're a little helper robot and you get to do various little tasks. So you're not super involved in the storyline, but you do play a critical part in a couple of places in the storyline. And in that one, you're inside a cartoon. You're basically in an animated cartoon. You feel like a part of the action. I felt like a six-year-old again watching the Jetsons, watching my space cartoons, but I was actually part of the action, so that was pretty amazing. Also Vive-based, running around and using the controllers. And then I've just got to give a shout-out to Tyler Hurd for the other piece, his new piece, Chocolate. I don't know if you got to try that one yet, but like his other pieces, Butts and Old Friend, he brings together comedy and whimsical animation and gets you involved in the story. In this case, I won't give too much away because there are some fun reveals in there, but you're shooting cats out of a gun, and they sort of float around in space around you as primitive creatures dance around you, and there's very cool music, and it makes you want to dance and laugh. I was doing a lot of dancing and laughing. So I really feel like we've hit this watershed where cinematic VR and storytelling-based VR, world-creation-based VR, is really real. We've moved beyond gaming as a use case for real now, and that's, again, one of the reasons I've come to work at Unity: to look at the other opportunities beyond just straight game development. VR and AR games are great, and I played a few of those while I was here, too, but I'm mostly focused on how people are telling stories and expressing themselves creatively while we're here at Sundance. And I gotta tell you, I can't think of a better fit for New Frontier and that program here at Sundance than these kinds of technologies like Unity's, which really enable anybody, from independents all the way up to the big houses. It's democratizing development of interactive media in a big way, and I'm super excited about that.
[00:22:54.375] Kent Bye: Awesome. And finally, what do you see as kind of the ultimate potential of virtual reality, and what it might be able to enable?
[00:23:02.960] Tony Parisi: Don't you ask me this every time? Do I get to answer it again? Are people going to go back and check and see if I've answered the same thing or if it's different three months from now? Things change. Yeah, it's true. And people change, right? Well, let me see. I mean, these days I really am feeling like, with applications like Tilt Brush and the worlds I was just describing that I've seen here at Sundance, that creative impulse, and being able to support that level of creativity, is going to happen in ways that we've never seen before because of VR and AR. I don't even know how to get into it, but, you know, people are thinking about VR as an empathy machine, and that's all wonderful, and there's no doubt there are some experiences that make you very empathic, that make you want to cry, they really touch you emotionally. And I'm all for that, and I think that's going to continue. I think the ability to have just pure-ass fun with games is great. I think there's a lot of really cool stuff. I saw a documentary piece, and I should mention that for a second, because this is along the lines of where I want to go with answering you. It's called Zero Days, by a shop called Scatter in New York. It's a new form of documentary that integrates traditional broadcast media, audio, video, voiceovers, with data visualization and handcrafted rendered things to help support the story, in this case the story of a cyberattack that happened a few years ago. And it blew my mind. It was actually a lean-back experience, there was no interaction in it at all really, but I was watching a documentary of the future, the way it's going to be done. And so I think the ways to tell stories and present information and express creativity are where VR is going to really shine in the next few years. Awesome, well, thank you so much. Thank you, always a pleasure, Kent.
[00:24:40.855] Kent Bye: So that was Tony Parisi. He's the head of AR and VR strategy at Unity. Well, just a quick note about AR and VR: you know, within the VR community, there's been this discussion about what to even call it. And I think there's an emerging consensus to call it XR, where the X means cross-functional or mixed reality, or, you know, it's really kind of a placeholder to mean either virtual or augmented reality. So just about any immersive experience is kind of being called XR today. Me, personally, I like virtual reality. Although, you know, on the show I'm going to be covering both. But, I don't know, I just like saying the Voices of VR. It just sounds weird to call it Voices of XR, and then I have to explain to people what the X means. Anyway, it's just a larger discussion that's been happening within the community over the last couple of months. And as you look at the different programs, you start to see a lot more of the XR. For example, OpenXR from the Khronos Group, they went with the XR.
[00:25:39.123] Kent Bye: So anyway.
[00:25:40.364] Kent Bye: Some notes about the overall strategy of Unity in the long run: I think it's interesting to look at the glTF bits as a bit of a blocker in some sense. I mean, we're kind of getting into the technical weeds there a little bit, but the basic essence of it is whether or not you're able to go to a website and have things progressively downloaded to you in a way that you could stream it down without having to wait for a big download. I think the big blocker for the current workflow for being able to export WebVR-type stuff from Unity is that things have to be compiled down, and those files are pretty monolithic binary files that are just difficult to send over the wire dynamically. So with glTF streaming, you know, the vision would be that, just like when you go to a website today, it dynamically streams a lot of the images and JavaScript. It's all happening in parallel, and you start to see the framework of the site and the text of the site even as things are still loading in. And for VR, that doesn't happen. You either get the whole thing or you have nothing, so you end up just having to wait. And as Tony was saying, I think that is kind of like the biggest technological blocker, for, I don't know, the next year or two at least, for seeing any viable WebVR export from Unity. I think it's going to take a while, and there's nothing for them to announce yet. However, back on April 25th, it was announced that Vladimir Vukićević, who was one of the original creators of WebVR along with Brandon Jones and Diego Marcos at Mozilla, is now going to be working at Unity on their emergent technology team. So for me, that's a huge sign, for somebody like Vlad to be moving over into Unity, that we're going to start to see more of the open web integrated into Unity's stack. And because Tony has been a long-time advocate for the open web, I think you're going to start to see the strategy for Unity also move in that direction. So for me, that's super exciting. Also, just generally, Unity moves really slowly. They have their plugin ecosystem that allows people to plug the gaps in a lot of the functionality that may not be there yet. And I think they've architected it that way, but they tend to move fairly slowly. And some of this stuff, if you look at Unreal Engine, for example, some of their sequencer and being able to edit stuff in VR, they're actually kind of outpacing Unity in different ways. And some of the user interface innovations and some of the user experience of the sequencer within the Unreal Engine are just at least a year ahead of where Unity's version is at. But that said, Unity still has a huge market share of the apps that are out there. So the new peripherals and other things that are out there tend to be integrated first with Unity. So what has happened within the VR community is that, at the very beginning, a lot of the applications were done with Unity. But I think that number is maybe coming down a little bit. I think Unreal Engine is kind of catching up in different ways. But when it comes to AR, over 90% of AR applications are using Unity, and we're not seeing a lot from Unreal Engine just yet. So just generally, anything new that you're doing integrations with is going to have Unity support first. So if you're going to be on the bleeding edge of technology, that's a big reason why people are going with Unity.
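As a side note on the "stream it down without having to wait for a big download" point, here is a small, hypothetical sketch of the browser primitive that makes progressive delivery possible: fetch() can expose a response body as a ReadableStream, so a client can start handing chunks to a parser or renderer before the whole file has arrived. Whether or how a Unity export pipeline would ever use this is purely speculative; the URL and the onChunk handler are placeholders.

```typescript
// Sketch only: read a large asset incrementally with the Fetch + Streams APIs,
// processing each chunk as it arrives instead of waiting for the full download.
async function streamAsset(
  url: string,
  onChunk: (chunk: Uint8Array) => void
): Promise<number> {
  const response = await fetch(url);
  if (!response.ok || !response.body) {
    throw new Error(`Unable to stream ${url}`);
  }

  const reader = response.body.getReader();
  let received = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.byteLength;
    onChunk(value); // hand partial data to a streaming parser or renderer
  }
  return received; // total bytes delivered incrementally
}
```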
Now, just a quick note on some of the different content things that we talked about there at Sundance, because there is a little bit of a mirroring with what was actually happening in the Unity keynote. So first of all, the Life of Us experience from Within is absolutely amazing. I did an interview with Aaron Koblin back in episode 504. During the Unity keynote, they actually announced that Life of Us is going to be playing at the IMAX arcades. So why is this important? Well, I think that this is actually going to be probably one of the biggest hits of these VR arcades out there, for a number of reasons. You get a social experience, but you're also getting thrown into this embodied experience where, as you speak, your voice is being modulated. That gives this weird feedback mechanism to your brain that actually changes the way that you're communicating, so you're touching into new parts of your personality and your psyche. And when you do that with other people, with your friends, it creates this very surreal and unique experience that you share with them. Because of that, my prediction is that Life of Us is going to be one of the absolute smash-hit blockbusters of 2017 for people going out and experiencing VR at these arcades. I think eventually they're going to be releasing it on Within, so you'd be able to see it at home, but being able to see it with four people on networked computers, with some of the haptics and the wind blowing in your face, gives you a much deeper embodied experience. And if they add that into the IMAX experience, I think it's just going to put it over the top; people are going to go crazy for it. So that's something to look out for. Also, Baobab Studios' Maureen Fan was part of the keynote today at Unity. She was recounting some of the lessons learned from mixing interactivity with storytelling. This is something that I explored quite in depth with Eric Darnell back in episode 290 from Sundance 2016, and from this year, we dive into it again in episode 501. And I originally talked to Maureen back in episode 258, where she talks about chasing her animation dreams. That was just at the very beginning of the journey into Baobab. Baobab was also just showing one of their latest experiences, called Rainbow Crow, at Tribeca. And that is one where, again, there's the production quality, the big-name talent they have with John Legend attached to it, and their storytelling skills, mixed with the fact that they're using Native American folklore to tell a creation myth of the crow. These oral traditions and these stories, I think they have a certain power. And as you go into VR and start to have these interactive experiences within the story, I think that this is another experience to really look out for, and I think it's going to be really powerful and compelling as well. You know, with everything that's going on with the Baobab Studios team, they're definitely a production studio to keep an eye on, to see how their work evolves. And then finally, Zero Days VR was mentioned by Tony there at the end, and they just got awarded both the Cinematic VR Award as well as the Narrative Achievement Award. So I have an interview with Yasmin that I did at Sundance 2016.
I've been waiting until the experience is actually going to be released, which I think should be within the next couple of weeks or so. But Zero Days VR, to me, is one of the best volumetric storytelling experiences that I've had, and definitely the best documentary experience that I've had within VR. Just how they use space to tell the story is something that's quite spectacular. The team at Scatter includes James George, who I talked to back in episode 407, as well as Alexander Porter, who I talked to in episode 453. Scatter was also just at Tribeca with Blackout, so I'll have another interview about that Blackout experience coming out soon as well. So, to me, I'm super interested in the content and how the storytelling is evolving within virtual reality. That's something that Tony was talking about, and I'll be doing these deep dives into these different experiences as well. So again, there was a lot more emphasis on augmented reality at the Vision Summit, which to me was a little bit surprising; we saw maybe just a few AR pieces last year. NASA JPL was showing what they're doing with this rover experience, being able to actually work with 3D models. And the other thing that was really impressive was Vuforia showing some of their latest phone-based AR, which was kind of doing what the HoloLens does, where you're able to scan the space and start to overlay CGI interactions on top of it. And, you know, I love what Tony was saying in this interview, that Pokemon Go is a little bit like the Google Cardboard of augmented reality. So we're kind of at that Google Cardboard level of augmented reality. And with the emphasis that Facebook is putting on it from their F8 conference, as well as some of the tools that are coming out with Vuforia, I think we're going to start to see a lot more phone-based augmented reality games. And that's going to start to bootstrap the overall AR ecosystem as well. And finally, there was kind of an interesting keynote selection with Richard Dawkins, who's this big evolutionary biologist guy who tends to be very much into, you know, science and rationalism and skepticism. And so I was kind of surprised to hear him talk about virtual reality. I was really wondering what his take was going to be. It was actually really quite interesting, because he had this whole thing about constrained reality, just talking about how the construction of our reality is already kind of virtual. Richard was basically making the argument that we already have this virtual reality within our minds. So that was a fun, mind-blowing treat to hear somebody from the mainstream evolutionary biology world talk about virtual reality in a somewhat esoteric way. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a donor. Just a few dollars a month makes a huge difference. So you can donate today at patreon.com slash Voices of VR. Thanks for listening.