#316: Bringing VR Content Creation to the Masses with Unity’s Carte Blanche

Dioselin Gonzalez is a senior VR engineer at Unity Labs who is working on bringing VR creation tools to the masses. Project “Carte Blanche” will be launching sometime in 2017 and will target desktop VR consumers with an Oculus Touch or HTC Vive who are not programmers. I had a chance to catch up with Dioselin at Unity’s VR/AR Vision Summit, where she talked about these next-generation VR authoring tools.


Unity is working on creating smart assets so that users will be able to use voice commands to find 3D models & scripts from the Unity Asset Store, and then instantiate objects & behaviors within virtual worlds using a virtual deck of cards manipulated with tracked controllers. Unity imagines that people will be able to create VR games and worlds to explore without having to know any programming. It will also allow you to export a Carte Blanche project and import it into Unity so that you could do rapid prototyping of virtual worlds.

Dioselin mentioned to me that you’ll be able to share virtual worlds to Facebook so that you could have your friends join you, but it’s unclear whether this social media integration goes beyond a text notification or whether it’s a sneak peek at some of Facebook’s plans for hosting native VR experiences.

Dioselin’s graduate school research was in making collaborative art projects within linked CAVE environments around the country, and she says that Project Carte Blanche will also incorporate collaborative world-building functionality. Sylvio Drouin told me that the Carte Blanche tools should be expected to launch sometime in 2017, and that Unity is starting to think about how to move beyond the five million Unity developers and create something that more like forty-five million people could use to tell stories within VR.


Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.947] Dioselin Gonzalez: Hi, my name is Dioselin Gonzalez and I'm a Senior VR Engineer at Unity Labs. We're focusing right now on authoring tools, on creating in VR. So at Unity, we have the project that we presented today in the keynote, which is the authoring tools from the developer point of view. That one is led by Amir Ebrahimi. And the project that I'm leading is also on authoring in VR, but it's from the point of view of the consumer, the non-technical people. So pretty much we're solving very similar issues, but from different perspectives. So I'm working on this project, code name Carte Blanche, which is an authoring platform for VR worlds for non-technical users.

[00:00:49.898] Kent Bye: So who would be the ideal target demographic for Carte Blanche?

[00:00:54.345] Dioselin Gonzalez: We foresee two types of users. One is definitely the non-programmers, the non-technical people that want to create VR worlds. Because with this platform you're going to be able to create, you know, bring objects and create a whole experience, a game, etc. But then you can share it. We're going to have connection with social platforms, you know, Facebook etc., and your friends will be able to come and join you and experience it. But also, the other type of user would be prototypers. You should be able to quickly prototype your world, your game, etc., and then bring it into Unity. So it's also for the developers.

[00:01:30.828] Kent Bye: And is this sort of the experience that you would see would be primarily used within virtual reality then?

[00:01:36.819] Dioselin Gonzalez: Yes, from scratch. We're thinking that this will be a virtual reality experience. What the users would want to do ultimately is to create a VR experience, and so from the get-go, they will be inside.

[00:01:52.983] Kent Bye: Is this going to be targeting like any of the mobile gear VR or is this like strictly going to be like desktop for like say the Oculus Rift or perhaps the PlayStation VR or the HTC Vive?

[00:02:06.343] Dioselin Gonzalez: Initially, the platform where people are going to go and build these worlds, we're envisioning it as a seated experience. So we're targeting Oculus, of course, and Vive. But then the idea is that these worlds that you create, you bring them into Unity, you can finish editing, but then you can deploy them pretty much on any platform. So if you want to, you could deploy to Samsung Gear.

[00:02:30.988] Kent Bye: I see. So it sounds like you have integrations with the Unity Asset Store as well. So talk about being able to tie in these 3D assets, either free or ones you have to pay for, and the interface that you have for doing that.

[00:02:46.645] Dioselin Gonzalez: Yes, that part is really exciting because it's a complicated problem from the technical point of view, but it also adds a lot of value. So this is not just an app, right? This is not going to be just another VR app where you just go and play and create. The idea is that it's going to be integrated with the Asset Store, and it's going to have a lot of components. There's going to be a helper, we're calling it U, I don't know if maybe Timoni mentioned that. And there's going to be voice recognition, so you're going to be able to actually query and talk and say, find me zombies. So that is going to connect to the Asset Store and find all of those assets that are tagged like that. That means that we're going to have to come up with a new taxonomy, a new data model for virtual reality assets. Developers are going to need to prepare whatever geometry, characters, etc. and then add the correct information to be integrated into the Asset Store, and then our platform will take it from there.

[00:03:42.009] Kent Bye: So there's a lot of 3D assets. What are some of the characteristics that would make it a VR ready asset?

[00:03:48.453] Dioselin Gonzalez: The very first thing that you can think of is for static geometry, for example: if you want to do it for VR, you need to create the interior. So that's the very first example that we give. If you're going to create a building for VR, it has to also have the interior. The other part is we have this really cool metaphor, because again, the users are non-technical people. So our designer, Greg, came up with this metaphor of using cards. In order to bring assets in and instantiate them into the world, you're going to have a deck of cards, and whenever you want to instantiate one of them, you take it, literally grab it in VR and put it on your desk, and then it's instantiated. So these assets are going to have this metadata that is going to allow our card system to read it and then render automatically, for example, the corresponding card and see all of the properties, just like a plug-in. Just like when you write a plug-in. So they need to have the correct information so our system is going to take it, and then whatever parameters you need to change are shown there in the cards in the UI.

[00:04:53.025] Kent Bye: And so what's the primary input for this program? Is it going to be like the Oculus Touch and Vive controllers? Is it going to be like a Xbox 360 controller? Or is it going to be a mouse and keyboard or some combination of all those?

[00:05:05.560] Dioselin Gonzalez: Yeah, initially we're targeting, again, because we're envisioning for now that it's going to be a seated experience, Oculus Touch controllers and Vive. But of course, we already support the Hydra, for example; that's one device that we use a lot in our everyday work. And mouse and keyboard, Unity already has that incorporated. However, the experience will definitely be better when you have tracked controllers, like the Touch or the Vive. But of course, we're going to support, you know, any others.

[00:05:35.694] Kent Bye: So one of the challenges with creating a 3D environment is that you put a lot of objects there, but they may not be interactive in a way, like responding to you or having actions that are responding to you. So what kind of simplified scripting or ways of making this interactive are going to be built into Carte Blanche? Is this something that you would need to go back into Unity to integrate, or are you going to have ways to make it so that it can be really interactive and dynamic?

[00:06:01.804] Dioselin Gonzalez: Yeah, that's another one that Greg designed. Greg is really cool, you should talk to him too. In his metaphors, these cards can be anything. It can be a light, it can be, you know, anything, and then you change it. For the scripting, for the behaviors, if it's a character, the card system will read the information from the character about all the possible animations, and the 3D interface is going to show you, okay, which animation do you want to activate right now? The other thing is we have the console, we're calling it right now the game-mechanics type of card, which again will be available to you in your deck. You select it and you say, oh, I want this game to be capture the flag, so you instantiate the card, put it on your desk, and then the game becomes capture the flag. So there's got to be a lot of intelligence in that sense. But yeah, at this point, there's not going to be a scripting interface there, because again, we're targeting this for the non-technical people. However, if you want to, I can see that as well. Once you prototype with these game-mechanics cards that you instantiate, you can export it and then import it into Unity. And if you want to add the scripting, you will be able to.

[00:07:08.069] Kent Bye: Can you give me an example of an experience that was created in Carte Blanche that you think is a really perfect example of what's possible with this platform?

[00:07:16.395] Dioselin Gonzalez: Okay, I can tell you, though not the specific one that we're building. I cannot talk about that one yet. But one cool experience, I mean, there's definitely games, right? That one I can tell you, I can tell you a lot. And I'm sure anybody can think of, for example, a town. Imagine a kid that wants to create a town and brings all the geometry and all the buildings, and then instantiates a dragon, and then instantiates treasure. And then instantiates a game card, a game mechanics that says, okay, this game is about protecting the treasure. And then immediately the dragon starts trying to get the treasure and capture it, and you become the player, because that's the idea. The system will know that the controllers then become how you navigate and all. So, of course, games is one. The other one that would be really cool is how about exploring places? You know, in VR, this is a very traditional project that in all research, we've all done that as graduate students: a virtual tour of our university. Right? We've all done that. But imagine in Carte Blanche, people are going to be able to do that without scripting. They instantiate the university buildings and all, and then a tour. The game mechanics would be a tour guide. And maybe the parameters are simply, OK, I want people to visit first the student center, and then, you know, the theater, et cetera. So that would be another one that is not necessarily games.

[00:08:46.709] Kent Bye: I have a filmmaking background, and so I'm familiar with non-linear editing tools like Premiere, being able to have a timeline, and it's been a little frustrating that there hasn't been really sophisticated timeline sequencer integration within Unity. But it sounds like there's going to be a new sequencer coming out in Unity 5.4, and maybe you could talk a bit about how you see the sequencer's capabilities expanding the possibilities for telling stories within VR.

[00:09:12.640] Dioselin Gonzalez: Yes, I know, and I've seen it as well with my background in the animation industry at DreamWorks and Pixar, that my friends are already asking me. We have a project, we're already working on that. It's been called Director, among several different names. Again, the idea is to enable people that are not programmers and developers to come in and tell the stories. Imagine being able to, when you're in VR, set action points, like, you know, I want different points. I know because animators do that a lot. They set key frames, right? When they're creating the stories, the way they do it is very important for them. They call it blocking, which is setting, you know, the specific points of the sequence where things are going to be happening. So imagine doing that in VR as well. That's one thing that we envision would be like super awesome, bringing that in. Initially, Director is going to be, of course, in the Unity editor, but then imagine bringing that into VR, where you're actually going to be able to walk around, and then, I don't know what would be the best way to visualize it, but you know how animators work, you know, they create the key frames and then do the splines, and actually being able to create the animation curves in VR in 3D. That's something that I'm really, really excited about and hopefully will help.

[00:10:27.986] Kent Bye: I think one of the challenges with a system like this would be that I would want to have a way to actually jump in and use the controllers to animate and kind of record my own mocap, just by using either the Oculus Touch or the Vive controllers, and really bring a performance into these characters that's uniquely my own. And yet I don't know if this is something that Unity is already working on, or is that something that you would kind of leave to the Asset Store, to have somebody build a plug-in to use these motion-tracked controllers to do low-fidelity motion capture?

[00:11:03.167] Dioselin Gonzalez: No, no, that one definitely, Unity, we're already working on that. I cannot give too much information at this point. But the idea is that we are focusing on authoring tools. And I'm going to repeat what I said in the keynote today, which is the idea is to empower creative minds. So authoring tools is definitely something that we're working on a lot. It's our focus.

[00:11:26.961] Kent Bye: And when you think about the future of storytelling and virtual reality in these immersive mediums, what are some of the things that you think are going to help define this language of virtual reality?

[00:11:37.528] Dioselin Gonzalez: One thing is the visual language. Because right now, again, from movies, the way to tell stories is through the camera. And it's so essential for people to define the shot, what the shot looks like. I think now people are going to have to come up with a new visual language to tell stories that somehow adds those extra dimensions. And I don't know what they're going to call it, but instead of the shot, it's going to be the 3D scene. And then I can hopefully envision people, right now the way they do storyboarding, right? It's just 2D. They just draw frame by frame. It's going to be instead in 3D. And that's how they're going to pitch the story and tell it, by actually showing the 360 degrees of how each scene goes. That's one thing that I see is definitely going to change. The other thing that is going to have to change, and I'm excited, like I get really giddy about that, is a new language for interaction. There's already a visual language in movies, right? Straight lines with edges mean something a little more aggressive, while round lines mean more harmony, you know, lighting, dark versus light. There's going to be a new language of interaction with the user now, and it's going to be based both on perception but perhaps also a little bit of learning on our part as spectators. Maybe it'll become a standard that when music starts playing behind you, that means something really important is coming in the story. Because now that you have the user being able to look in absolutely any direction, there has to be a way, again, a language to be able to, without invading the experience, tell the user, hey, there's important action going on right now, maybe behind you, maybe to your left. Being able to direct the user to where the action is happening.

[00:13:35.522] Kent Bye: Yeah, the thing that Eric Darnell from Baobab Studios, he was the director of Antz and Madagascar, and the way that he phrased it is probably the thing that translates it in a way that really makes sense to me. What he said was, in films you're telling a story of an experience through kind of a singular perspective that you have control over. Whereas in virtual reality, you're kind of giving somebody an experience for them to generate their own stories from. So there's a little bit less control over what people are actually seeing or doing, and then it's about cultivating and crafting an entire experience, and then trusting that people are going to have their own ability to pay attention to what they see, and that they're going to be able to generate their own unique experience out of that.

[00:14:18.253] Dioselin Gonzalez: Yeah, I guess filmmakers now have to think of the audience as partners. It's kind of like, let's go together, and there's a story here, but let's go and leave it together, which right now is let me tell you the story. Sit there, and this is what I need to say to you. The filmmakers now have to bring and take more into account, OK, what do you have to say to me as the viewer?

[00:14:40.369] Kent Bye: And so what type of research have you done in graduate school in virtual reality? What were some of the things that you were looking at?

[00:14:45.721] Dioselin Gonzalez: Oh yeah, I specialized in collaborative virtual reality in CAVE environments, actually. Because again, at that time there was no Rift, no headsets. And like I mentioned, I always say that I'm the granddaughter of Carolina Cruz-Neira, a VR pioneer. So I was very lucky. While I was at Purdue University, I started this assistantship in the Envision Center. It had just been created. And it was being run by Dr. Laura Arns. She's the academic daughter of Carolina Cruz-Neira. So I specialized in CAVE environments, actually, collaborative experiences in the CAVE. Particularly, my work was about bringing VR into environments for video conferencing. There was the Access Grid, a very popular video conferencing framework at the time that included not just video and audio, but actually a framework and an SDK for creating collaborative applications. So my work was about bringing virtual reality into that environment. I created an SDK that people could import into it and then run their applications. It was really cool. We showed it at SIGGRAPH, and we had people connecting from Alaska, from the UK in Manchester, and from Purdue University. That was really cool. Another thing that I did that I get very giddy about, if you ask me, I can talk about this forever, was I was part of a group called Art in the Grid, because I'm a programmer, but I'm fascinated by artists. I love working with artists. So Art in the Grid was about creating immersive, geographically distributed art experiences. We did one that was called In the Box. That was the idea. There were people at the University of Florida, we were at Purdue, and there were scientists in Alaska. There were at least those three sites. So each one of them took their interpretation of being in a box and created an experience, and then we all performed it at the same time. So at Purdue we had the CAVE environment.
It was in a theater, people were watching. And so I developed an application, an environment designed by one of our artists, of being in a box. So there were avatars, and people with their laptops could connect to it. We had the simulator mode, so they could go into that environment and walk around this box with avatars. But then at Purdue, we had a dancer with motion capture in the CAVE, dancing inside the box, and her avatar was moving around. Then during another portion of the performance, that same dancer would go to the side, and with the motion capture suit she was able to control music that was being streamed, I believe from Alaska. So that's how we did the whole collaborative performance. There was streaming music that she was able to control through her movements. That was fascinating. That one, I loved it. So that was kind of the work that I did.

[00:17:45.629] Kent Bye: And with that collaboration, is that something that you foresee eventually becoming a part of this Carte Blanche project, is to be able to create these virtual worlds collaboratively with other people?

[00:17:55.434] Dioselin Gonzalez: Yes, exactly. That's another thing, again, because of my background, the collaborative part of VR, that social aspect, is something we need to bring in order to make it really, really mainstream. So, in Carte Blanche, I envision people coming, even while they're editing their world, you know, friends come and connect and help. And maybe the owners of the world can say, hey, you know, my friends, give them permissions. My friends can actually come and edit, and then collaboratively do that. That would be amazing.

[00:18:27.767] Kent Bye: Okay, so finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:18:33.690] Dioselin Gonzalez: Ooh. Oh my goodness. I guess it's in my head. One is education, and the other one is visiting remote places, just because they're very close to my heart. Being able, for example, because I like traveling, to travel to different places without necessarily having to be there, you know. And the social aspect is the other one that is big for me, that I believe is going to be huge: being able to have new experiences together, like with avatars and, you know, embodiment. I think that is going to be huge and very important in VR. It has to be. And it's being able to have social experiences, going to worlds that don't even exist, that we just created. But then enabling something that we cannot do in the real world, because that's the value, right? It's about what is it that the physical world doesn't give us that we can do in VR. So it's that creation, even generation of new ideas. Does that make sense?

[00:19:43.212] Kent Bye: Yeah, yeah, totally. And the thing it makes me think of is Google Street View. There's an app called Street View VR, which integrates with Google's Street View. And you're able to go around the world and look at these photo spheres. They're 2D, and they're not stereoscopic yet. But I imagine a time of Google, instead of Street View, having digital light field cameras in cars that are actually capturing light fields of the same coverage as Street View, to be able to then walk around in a room-scale or beyond-room-scale environment of this. But the other thing that makes me think of, in terms of Unity and building, is that the environment and the context is actually very important. And I'm just imagining being able to pull up a photosphere and be able to paint over it in an abstract way, but also actually generate the environment. It seems a crucial key to making you feel like you're in another world is actually creating this environment that's unique. So I don't know if that's also something that you're... thinking about trying to actually create these refined world-building tools to actually, with fine-grain control, build these imaginal worlds.

[00:20:43.407] Dioselin Gonzalez: Exactly. And again, it goes back to empowering the creative people. And that's what I like, we take it very seriously. We have to create tools that are flexible, powerful, but at the same time natural, so that the learning curve is small. Because I'm sure people are going to come up with these crazy worlds, right, that I couldn't even think of. One of the things for me is teaching geometry to kids. That is such a visual activity. I remember when I was learning calculus in school and we were learning about parametric surfaces. That is such a visual art. And I remember having markers of different colors to draw it in my notebook to be able to see it and learn. That's one of the powers that I see, for example, in virtual reality and augmented reality. You know, even while the kids are actually being taught, they can actually see the curve and manipulate it. And there's the social aspect. Another thing that fascinates me and that I see will be there is creating art that only exists in virtual reality. Again, back in my days more than 10 years ago, when I was in Second Life, you know, at the peak of its popularity, I was working with one of my colleagues, an artist — I was living in Singapore at the time, by the way. I showed her Second Life: hey, you see, you can easily create sculptures. And she was like, oh my god. Some people already do it, but imagine in VR, when we really have the right creation tools, there are going to be artists that, just like right now some photographers only do digital, will be digital sculptors, for example, in VR. That would be amazing; that is one of the things that really excites me.

[00:22:23.438] Kent Bye: Awesome, well thank you so much.

[00:22:24.919] Dioselin Gonzalez: Thank you, thank you very much.

[00:22:27.329] Kent Bye: And thank you for listening! If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voicesofvr.
