#226: New VR Creation & Cinematic VR Tools Being Prototyped in Unity

Timoni West is a principal designer for Unity Labs, where she's working on professional tools for creating VR scenes in Unity while in VR. These tools are still in the prototype phase of development, but Unity is actively working on implementing them. There will be an API for developers to extend the VR creation process within Unity, as well as a new Director Sequencer tool that could be used for cinematic VR and that's on Unity's public roadmap for the Unity 5.4 release. Timoni and I talk about these new VR features as well as design inspiration from the VR creation tools Tilt Brush and Oculus Medium.


The Unity Labs team is focused on future technologies, and they're currently spending much of their effort on creating some of the first pro tools for virtual reality. Because Unity's intention is to democratize game design and make games easier to create, the team doesn't have to worry about creating a sense of presence within the VR scene editor tool itself. Their goal is to build the tools so that game developers can make something that feels real in VR.

Because developers' paychecks will potentially depend on these VR creation tools within Unity, the tools need to be customizable. Unity is planning a flexible UI with smart defaults that will allow you to really customize your workspace environment. They still want it to feel like you're using Unity, so many familiar design elements and features should be available.

Their plan is to create integrations with 6-degree-of-freedom controllers as well as support for the standard keyboard shortcuts. They also heard at different VR conferences that a lot of people were working on VR creation tools, which helped them decide to create a robust API that allows plug-in developers to create their own variations of VR creation tools within Unity.
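To make the extensibility idea concrete, here is a minimal sketch of what a plug-in registry for VR editing tools could look like: third-party tools register themselves, and the editor dispatches controller input to whichever tool is active. All class and method names here are hypothetical illustrations written in Python for brevity — Unity's actual extension API is C# and had not been published at the time of this episode.

```python
class VRTool:
    """Base class a plug-in developer would subclass (hypothetical)."""
    name = "unnamed-tool"

    def on_select(self, target):
        # Called when the user grabs an object with a 6-DOF controller.
        raise NotImplementedError


class ToolRegistry:
    """Editor-side registry that dispatches input to registered tools."""
    def __init__(self):
        self._tools = {}

    def register(self, tool):
        self._tools[tool.name] = tool

    def dispatch(self, tool_name, target):
        # Route the interaction to the tool the user currently has active.
        return self._tools[tool_name].on_select(target)


class MoveTool(VRTool):
    """Example third-party tool: moves the grabbed object."""
    name = "move"

    def on_select(self, target):
        return f"moving {target}"


registry = ToolRegistry()
registry.register(MoveTool())
```

The point of the design, as described in the episode, is that Unity doesn't have to build every VR creation tool itself — other developers can slot their own variations into the same editor.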

Timoni did say that the actual modeling of 3D objects is beyond the scope of what they're currently working on, and that there will likely be other tools like Tilt Brush, Medium, and others that tackle that problem.

One of the things that Timoni really likes about the Tilt Brush interface is that all of the options are always visible to you. She found Oculus Medium's approach of hiding and changing the controls depending on which tool was selected to be a bit more confusing. She'd like to reveal as many of the options as possible in the VR creation tools in Unity, as well as represent 3D objects as they would appear within the scene rather than depending on file names.

There will also likely be a number of tasks that are still more optimal to do within the 2D interface, and other aspects that will be easier to do within a VR creation environment in Unity. She talks about a chessboard interface that would put a miniaturized model of the scene in front of you while also keeping the full-scale environment around you, giving you a wide range of fidelity for altering the scale of objects within a scene.
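The chessboard idea boils down to a simple coordinate mapping between a fixed-size miniature and the full-scale scene. Here is a rough Python sketch of that mapping — all names and numbers are illustrative assumptions, not Unity's actual API:

```python
def world_to_board(world_pos, board_origin, scale):
    """Map a full-scale scene position onto the miniature chessboard."""
    return tuple(b + w * scale for w, b in zip(world_pos, board_origin))


def board_to_world(board_pos, board_origin, scale):
    """Map a point picked on the chessboard back into the full-scale scene."""
    return tuple((p - b) / scale for p, b in zip(board_pos, board_origin))


# A mountain 150 m away, shrunk onto a ~2 m board floating at chest height.
board_origin = (0.0, 1.0, 0.5)   # where the board sits relative to you
scale = 0.01                     # 1 cm on the board = 1 m in the scene
mountain = (150.0, 40.0, -80.0)

on_board = world_to_board(mountain, board_origin, scale)
back = board_to_world(on_board, board_origin, scale)
```

Moving a distant mountain then just means dragging its marker a few centimeters on the board, while a nearby object can be grabbed directly at full scale — the "large range of fidelity" the interview describes.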

Another upcoming feature that Timoni mentioned could have a huge impact on cinematic VR is a Director Sequencer that is currently scheduled for the 5.4 release on March 16, 2016. This cinematic sequencer tool will allow the authoring and playback of sequences of animation and audio clips. She said that she'd love to see this Director tool also get direct VR integration so that you could start to create cinematic VR sequences directly within Unity.
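Conceptually, a timeline sequencer schedules clips with start times and durations and answers the question "what is playing at time t?" A hedged Python sketch of that core idea follows — this is a conceptual illustration, not the actual Director tool's API, which had not shipped at the time of the interview:

```python
from dataclasses import dataclass


@dataclass
class Clip:
    name: str
    start: float      # seconds on the timeline
    duration: float


class Sequence:
    """Toy timeline: clips sorted by start time, queried by playhead."""
    def __init__(self, clips):
        self.clips = sorted(clips, key=lambda c: c.start)

    def active_at(self, t):
        # Return the names of all clips playing at time t.
        return [c.name for c in self.clips
                if c.start <= t < c.start + c.duration]


seq = Sequence([
    Clip("establishing_shot", 0.0, 5.0),
    Clip("ambient_audio", 0.0, 12.0),
    Clip("character_walk", 4.0, 6.0),
])
```

At t = 4.5 all three clips overlap; by t = 11 only the audio bed is still playing. Authoring such a sequence directly in VR, with the playhead scrubbed by hand, is the kind of cinematic workflow the interview imagines.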

They're still early in the prototyping and development phase for many of these VR creation tool features, and at this point they haven't even been announced on the public roadmap, so they're likely to appear in the 5.4 release or beyond in 2016. If you have ideas or feedback for what you'd like to see in a VR creation tool within Unity, feel free to reach out to Timoni West via her website.

Become a Patron! Support the Voices of VR Podcast on Patreon.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. My name is Timoni West, and I'm Principal Designer at the Labs team at Unity. The Labs team is a fairly new team that was created to sort of focus on future technology stuff, some pie-in-the-sky stuff, and some things that we're actually actively building and making. And since VR is coming up over the next year, that's pretty much been the focus right now. So we're working on a couple of different things for VR, but I am focused on VR editing tools in VR. If you're a game developer, if you've ever made content for VR, you know how much of a challenge it is right now to actually go in and edit your scene and so on, put the headset on, take the headset off. So we're actively working to make, I think, some of the first VR pro tools, figuring out a good UX for that and how best we can make tools for developers and creators. So that's what I'm working on. That sounds really awesome. I know that at Oculus Connect, I had a chance to try out the new Oculus Medium. And also just a week after that, I tried Tilt Brush for the first time. So I was able to use both of these. What I see is kind of like the leading user interfaces in terms of creating things within a VR environment. So I'm curious if you've been able to try both the Oculus Medium as well as Tilt Brush and kind of your thoughts on how they did the user interface. Totally, yeah. I have tried out both. I probably spent maybe three hours in Tilt Brush myself. Well, maybe a little bit more. And I've actively shown it off to maybe like 40, 50 people at this point, which is pretty great. So with Tilt Brush, I have a lot of experience using it. Good job on the new brushes, guys, if they're listening. But I only got about seven minutes with Medium at Oculus Connect, unfortunately. So I found Tilt Brush instinctively easier to use, but at this point I've used both. It's such a difference in usage right now, it's hard for me to tell exactly.
But I will say this, Tilt Brush is... amazing for just drawing and sketching out things that look brilliant from the get-go. You know, you're in a dark room, everything's fluorescent or neon or rainbow, you know, you could just draw a squiggly little line and be like, it's a masterpiece. Whereas with Medium, you could tell just from the screenshots that people had made that this is definitely much more of a tool where you need to work within it for, I don't know, maybe a half an hour or so at least to get something where it doesn't just look like a pile of, like, slimy Play-Doh, you know. Yeah, that's the thing that I noticed is that it took quite a long time just to get the instructions. With Tilt Brush, you go in there and it's very intuitive. And to me, the trade-off seems to be that, like, I don't know how long I would be able to really rotate my wrist in that Tilt Brush environment before I really got tired. Aha! So, did you mean the menu or did you mean just actually drawing itself? Well, I mean, just the menu. So I guess what I mean is that over time, I would imagine that that could get a little bit tiring to be able to rotate your wrist like that just to be able to go through all the different options in the long run. Right. So the reason I said, aha, is because I feel like a lot of people don't really instinctively get that you can actually just use the Vive controller's thumb touchpad to move around the panels. So you don't actually have to move your wrist. But for the record, not a lot of people get that. I usually have to tell them. But yeah, if that were the case, that would be a significant flaw, right? One weird thing I had, I had quite a few weird things with Medium. Being able to select using the joystick and a combination of the triggers felt very odd. Once I knew you could do it with your secondary hand, then it seemed like it'd be a little bit more intuitive.
But also the fact that the tools changed based on what mode you were in was a little difficult for me to grasp right away. But on the other hand, it seems like Medium is a tool that's designed to get out of your way and let you just sculpt. And Tilt Brush is definitely a little bit more user-friendly from that perspective. And so it seems like Tilt Brush is really optimized from, like, perhaps a vector drawing approach of painting within 3D, whereas maybe Medium could be sculpting 3D objects that you may actually import into a VR project potentially. And so from Unity's perspective, what type of tools are you trying to create to make the process of creating environments within VR easier? Sure. First, for the record, I'd like to say I really want Tilt Brush Pro, where you can make actual primitives. I think that would be totally sick. Either that or Medium, I guess, should go back the other Tilt Brush direction. Some combination of the two, I think, would actually be more the killer app for creation for assets and 3D worlds. So one thing that really struck me about Medium is that it was a VR tool for sculptors who already knew how to sculpt in 3D space. So if you didn't really know how to sculpt very well, you weren't going to walk away from Medium having suddenly made something awesome. And in that sense, it is definitely leaning towards the pro tool side. And they might change this over time. One cool thing about what we're working on right now with scene editing tools in VR for Unity is we don't have to care about presence. We're literally just trying to make a pro tool that is incredibly easy for people to use and to get their job done. Like our job is to make devs' lives easier, right? To democratize game design. And so to that end, we just need to make something that allows people to do everything in VR that they want or need to do in order to make their game.
And that's actually kind of opened up the way we think about UX a lot because we don't have to worry about things like presence or making it feel real. In that sense, we just have to have the tools around so that you can make things that feel real. So it sounds like when you're in Unity, you have the ability to move around lighting and objects and materials, potentially even do some debugging. Describe to me the interface that you're going for. I'm imagining you have to have some sort of 6DOF controllers to be able to actually move things around. Given that, what do you expect to see? Okay, so we're assuming that it's going to be game creators for the most part, for the pro tools, for consumer tools, it's like anybody's guess. But we assume that you're either going to have a dedicated set of controllers, or you're going to be sitting at your desk with a keyboard and mouse. So we need to be able to work with both of those interfaces, but we really don't need to support one-touch Google Cardboard-esque situations. So that kind of makes it a bit easier for us. So we're going to have, I think, you know, this is all still kind of up in the air because you never know until you actually try it. And we've only done initial prototyping so far. But I think it'll be a series of panels that can be connected either to you or toggled on and off and moved anywhere in space. One of the weirdest things about VR, and I feel like people don't really talk about this enough, but it is the weirdest thing is how natural it feels to have everything just hanging around your head. You know what I mean? Yeah. Like you can just put stuff in space and it feels great. You're like, cool. Like the cat toolbox and fantastic contraption. You just put it up right next to your head and you're like, Hey, little buddy, this is great. You just live here now next to my head. Why does that feel good? I think that to me, there's a, what they call near field VR. 
So it's anything that's within arm's length distance. It has the highest amount of stereoscopic effects. And so when you have things that are in that near field, it just is super compelling. So. Like, I expect you to die if you're able to just kind of lock tools or objects in the air like that. Yeah. When I tried that out, I was surprised they did that, because it is kind of going for something like realism otherwise in the game. But yeah, I mean, I loved being able to do that. I was thinking more, like, philosophically, as humans who have lived in a gravity-filled world our whole lives, why is it when we get into VR, suddenly we're like, cool, everything should just float. That makes sense. It's just a weird quirk of our brains, I guess. Okay, so on to the UX. So when we started working on this project, I did a bunch of concept UIs that I have no idea still will necessarily work or not. But as we went to a bunch of conferences, we realized that a lot of other people are actively working on similar tools. We don't necessarily want to get in their way, so what we decided to do is make a really robust API so anyone can start building out. really awesome VR tools in VR, but we'll have a really flexible sort of buildable UI with smart defaults. But Pro Tools are always sort of by necessity very customizable, right? Like people will customize the hell out of Maya. They'll customize any given screen. They'll have different workspaces. Almost every window is tab-able. You can pin it. You can put it somewhere else. So creating that same sort of extremely flexible, Interface in VR, I think, is going to be pretty key for Pro Tools. And again, we don't have to worry about presence. So we really can just let you build up any sort of interface you want. So I think that is what we're going to move for right now. I guess the question that I have for you is that I imagine that there's going to be some optimal combination for if you want to create a scene in VR. 
There may be some things that are just faster to do in the 2D interface and other things that it would be faster to just rapidly iterate while you're in VR. So for you, what type of things would you do with the normal 2D Unity interface? And then what type of things would you want to do actually in VR within Unity? Sure, so all the procedurally generated stuff, I feel like you set that up in 2D, right? Then you just... I guess you could tweak it pretty easily in VR once you're in there. Honestly, I've had quite a few devs tell me they would love to just have the terminal in VR. So, okay, you can have the terminal in VR. And we were sort of playing around with maybe just giving you like a 3D version of your keyboard. You could just stick some sensors on the edge of it. And at least you know what your keyboard is then, right? Like, anything's a step up. But in terms of things that you actively want to do, I think a lot of... Well, we don't do, like, for example, modeling in Unity exactly, right? You'd probably use another tool for that in any case. So that's an obvious one. I think things like setting up large-scale environments, like if you have mountains, if you have specific lighting things, skyboxes, that kind of thing, you'd want to do that. Probably just set it up first in Unity and then put it on the headset and kind of go from there. But honestly, that's something we'll figure out as we go along and we actually get people in and testing and get beta testers and I'm really curious to hear the feedback about that. Because there's nothing more surprising than getting into VR and finding out what feels good and what doesn't. And from your perspective of designing this user interface, what have you been taking your inspiration from and what kind of elements have you been pulling from different experiences? 
So a lot of it was initially just thinking about how in 2D space now we have things that are levers that magnify movement, things like joysticks and mice and so on, and figuring out what would be an equivalent lever for VR, so I've been playing around with that a lot. Basic building blocks. Unity has a specific style and we're continuing to evolve that style, so I'm also going to work within that style. When you're using Unity in VR, you should still feel like you're using Unity. In terms of the tools themselves, I feel like I'm a big fan of just show everything and have it be close to what you're going to get. So for example, if we show like a list of game objects, I would prefer to have the actual 3D game objects, maybe like on a turntable or something, so you can actually just pick up stuff and place it in the scene and not have to worry about what the file name was. In terms of, I guess, other inspiration, I don't know, I spend a lot of time in VR seeing what everyone else has done too because I don't necessarily like to reinvent the wheel of someone who already has a pretty good wheel. Especially this early on, I like figuring out patterns so someone could seamlessly go from one app to another. I don't necessarily think it's the job of pro tools to be the most inventive. That being said, if we come up with some clever stuff, we will definitely put it in there. Yeah, and you keep mentioning this concept of pro tools. For you, what does that mean? I would say pro tools are the things... I was asking myself that same question earlier because I mentioned it so much. Pro tools are things that people depend on for their paychecks. So if you're a designer, pro tools would be Photoshop or Sketch, whatever it is. I know a lot of developers depend on Unity, and hopefully an increasing amount of VR developers.
And I guess one question as developers are going to be listening to this, when are some of these pro tools within Unity to be able to create VR within VR going to start to become available? Yeah, that's a good question. Honestly, right now we're still in the prototyping stage. So we have a tentative schedule. It's not quite public yet, and we're still sort of working on the details. So if you're listening and you're excited about this idea, I'm sorry that I can't be a little bit more precise right now. But if you are interested, I would say actually just email me and let me know if you have thoughts around what you'd like to see or what you'd like to see prioritized. I'm definitely trying to hear from everyone about this and get as much information as I can. So it's on the public roadmap, but there's a new director tool that sounds like it's going to be doing some timeline stuff to be able to potentially make it a little bit easier to do like some cinematic VR stuff. And so I'm just curious what you can tell us about that. Sure, yeah. The director tool is a timeline-based animation tool. I anticipate it to be very useful for cinematic VR. So that does beg the question, should we also have in-VR director tools? I think the answer is yes, because I've continually heard feedback from people that they would like this. And honestly, cinematic VR is going to be really big, and people are already using Unity to make it. So we should be supporting what the people want it for. I've seen some interesting concepts of how to do timelines and things in VR. So we might take inspiration from that. Honestly, it seems like it could get messy very quickly. So I think we'll start off with the basics. You know, you'll be able to sort of play back scenes and perhaps edit in the middle of them. 
I would love to see, not necessarily from Unity, just in general, I would love to see animation tools that allow you to do onion skinning or have full-size maquettes that you can actually just manipulate and see the frames change. Stuff like that is fascinating because then you can really take advantage of this virtual space to see essentially in four dimensions, right? Like be able to see movement over time and actually see things move in the frames as you manipulate one frame, you can sort of see it. For example, like you have a maquette who's waving. If you move his hand, you could see all the other hands sort of adjust slightly in an expanded timeline situation. That's really cool. I'm definitely looking forward to someone working on that.

[00:14:27.433] Timoni West: Nice.

[00:14:28.273] Kent Bye: Going back to the API for the creation tools within VR, what type of things are you going to expose to that? And I guess you're going to take an approach, but I guess I'm trying to get in my mind, what are people going to do that's better? What are people going to do that's better than what they would have? Oh, OK. So well, I guess that's a two-part question. The API, I think, is going to support... Actually, at this point, I think a better question would be figuring out what exactly it ought not to support. I think we're going to be fairly thorough about what we expose. That's a broad answer, but I honestly can't tell you what we for sure wouldn't include at this point. So I guess that's your non-answer for you, unfortunately. I think a lot of things are going to be better in 2D for sure. The things that I know will be better are things like storyboarding, things like scene editing, actually moving around objects in the scene, previewing and so on. Being able to sort of have this hybrid view where you could sort of play out a scene and then edit on the fly is, I think, going to be pretty awesome. And so since your job at Unity is to create tools to create virtual reality experiences and make that easier, are there any things that you're kind of creating as a side project to kind of eat your own dog food? Not so much right now, actually. Not personally, anyway. There's some other people on the team that might be doing a little bit more of that. I'm very interested in the creation tools myself, although, I don't know, I've been thinking about that, like, what is the next thing that I should be working on on my own? Have any ideas, anything you want to see? Yeah, just make it more frictionless. Like the approach of Medium, where it has what seems to be a few more buttons that are more confusing, that takes more of a learning curve, whereas Tilt Brush seems like it was more intuitive.
And so it's just kind of finding that sweet spot where you could do an intuitive interface And I'm wondering, you know, in the long term, how six degree of freedom controllers, if that's going to actually be viable to do for eight hours a day. So looking at like potentially like multi-touch controllers or other things to be able to have a little bit more non-fatiguing interfaces that are still intuitive and that you could actually do for like eight hours a day, you know? Yeah, no, that's a really good point. I mean, although we shouldn't be sitting on our desks eight hours a day either, right? So maybe let's go four and four, four in VR and four at your desk. So one thing, this is just a theory, obviously, like I said, it really did not get a lot of time in Medium. And I know they also pared down the feature set substantially for Connect, they mentioned that. in the retrospective, just not being able to see the controls seemed like part of the reason why I kept messing up. I kept accidentally drawing what I meant to select using the same trigger button. So I'd look at the joystick menu, the radial menu, see what I wanted to do, and then it would disappear. And I appreciate that they wanted to do that because they wanted you to be in the flow. And the idea was that over time you would just sort of instinctively get what the tools meant. But having the tools available all the time until brush is honestly super helpful. Like the cool thing about VR is when I'm, everything is light as air, right? Like maybe I'm carrying around, you could literally like put a house on your head. And it doesn't mean anything, you know, because it's not really there. So having things attached to you or connected to you or off to the side of you, it really makes no difference once you get in the groove. 
So Tilt Brush's sort of persistent menu, I think, is at least more useful at the beginning there when you're still sort of learning what you want and trying to make sure that you remember which options are available to you. So I think we're going to go the persistent menu direction for sure, just because there's no reason for us to hide stuff. We're not looking for people to be in sort of a creation flow. Yeah. I mean, both of them, when I use them, I just wanted to just spend forever in them. I think that having the creation tools in VR is super powerful, even if I didn't really have an idea of what I wanted to create, just to kind of like be able to move stuff around. So yeah, just to be able to easily import objects into Unity and start putting them into a scene and arranging them. The thing I loved about Medium is that I could like draw kind of a small version and then scale it up. And I couldn't do that in Tilt Brush. And so to be able to design it in kind of a small, like near-field version of it, and then be able to scale it up. And so if you're not able to change the scale within VR, then I think it's going to be a little bit more inefficient to be able to really do some of the things. So to be able to kind of zoom in and out. Yeah, okay, so that's actually one really big piece of the UI that I am betting on right now, again, because we've played around with it in prototype, but need to actually start hammering away on it in user testing, is something that's called a chessboard. And actually, I think you made a game that had a similar structure, where you see a small version of the large version. So in this scene, you've got a chessboard, and it'll be, like, say, two by two, and all it contains is the smaller version of the scene. The size of the chessboard itself will never change. It'll always be about two by two, but you can zoom in and out within that chessboard. You can see yourself on it and use it to move yourself on the chessboard. 
And actually, I think it was in another podcast, someone mentioned that if you divert people's attention away from the background moving around while they're moving, it makes them less sick, which I thought was pretty cool, because the chessboard does the same thing, right? You have to be looking at the chessboard in order to pick up yourself and move yourself. So hopefully that should help with some locomotion issues. And then, for example, if you want to move a mountain, you can either move it on the chessboard or you can actually just select it in the space and move that. If you want to, you know, move a mountain and move a rabbit, both to scale, you can do one using the chessboard and the other one using the in-real-life stuff, you know, pick up the bunny and move it two feet or whatever. Yeah, that was the sort of locomotion mechanism that I used in my crossover mobile VR game jam where there was kind of a model of the room in front of you. And I did find that when you look down at the model to choose the room, it was giving you that center of focus. So that by the time you looked up, it was kind of like this change blindness effect where it was kind of a seamless way to teleport between different areas and navigate them. So yeah, sounds like doing something like that to be able to change your scale. Because you can only stretch your arms out so wide. And so you want to be able to have a way to vary the scale so you can just really zoom around. And it's like you're becoming God in some ways. Oh my gosh. You're so in God mode. It is the best. It is so much fun. Just being able to pick up stuff on the end of a laser and be like, this is lighter than air. I am clearly God. That is great. So what type of experiences do you want to have in VR then? I really like creation tools, honestly. Like, Fantastic Contraption and Tilt Brush, and hopefully more Medium soon. Those, for me, are the killer things.
I mean, I am a designer by trade, so even in my free time, I do a lot of sketching and designing more on the artsy side. I've also always been interested in architecture. I would love to have... I know architecture is actually really something that VR app makers have been looking at, for obvious reasons, right? You could build up an entire world and that's very compelling. I also, you know, like I backed SpaceVR, I want to go into space in VR. I want to go to magical worlds I've never been, you know, the Wevr thing where you get to go to the bottom of the ocean. Now I want to go to the top of the mountains. I want to be in magical fairy lands and Middle Earth and, you know, I want to go everywhere, so. Those kinds of experiences, creating things and interacting with things. Oh, also, have you read The Diamond Age or Otherworld? I have not, no. Okay, so in both of those, is it Otherworld, Otherland by Tad Williams? Anyway, in both of those books, they have actors who are virtual actors in that they are sort of a series of interchangeable actors all playing the same character, sort of like Mark Ruffalo in The Hulk. I feel like VR has the potential to be a serious boom town for actors right now because they could just go in and sort of occupy the skin, this VR skin, but actually interact with people directly. Maybe kind of staying on script, sort of like, you know, the Disney princesses at Disneyland, but actually being able to interact directly with people and react to them. I think that could be an amazing potential for VR. Nice. And speaking of potential of VR, what do you think is the ultimate potential of virtual reality? A while ago, I gave a talk on a passion project, on a thing that I was most interested in for design. And I had to think about that long and hard. And I ultimately decided that I feel like the potential of design was to allow the seamless transmission of information more easily.
So if you're reading about World War II, you could also potentially learn about other wars and put everything into context. Like essentially, I think the more you know about the world and other people, the less afraid you are. of other people, the more you feel like things are connected. And I think I agree with Chris Milk when he said that VR is the ultimate empathy machine. It's one thing to be able to go scuba diving and tell a friend about it, but if you could actually show them that, show them where you were, show them what it's like to jump out of the airplane, or show them what it's like to just even be in your house. I think in the future, weddings will be filmed in 360, and it will feel so much more like you were actually there. And that takes that empathy level off the charts, the ability to learn things quickly when you feel like you're actually in the space. So often we say you don't know it until you try it. And VR gives us a crazy amazing ability for all sorts of people to actually be able to try things and learn things. So I think the ultimate potential in VR is really just sort of the simultaneous potential upgrade of humanity, really, in the same way literacy was. Awesome. Wow. Well, thank you so much. Sure. Yeah. Thanks.

[00:24:19.402] Timoni West: And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash Voices of VR.
