Timoni West is a principal designer at Unity Labs, where she's been building tools within Unity for creating VR within VR. I first talked with Timoni in October about these content creation tools, which were revealed to the VR community for the first time at Unity's VR/AR Vision Summit in February. I had a chance to catch up with her at the Vision Summit to talk about creating VR in VR for both developers and non-developers.
One of the really interesting aspects of this discussion was Timoni's discovery that a number of different perceptual illusions become really evident when trying to work while immersed within a 3D medium. She says that bats process information in 3D, but that humans really process information in 2D, and that a lot of our cognitive mapping of 3D spaces is extrapolated from contextual cues within the environment.
Timoni ended up doing a deep dive into cognitive science research to learn more about some of the perceptual illusions that were revealed to her in the process of creating VR in VR, which she presented in her Vision Summit talk on the "Cognitive Implications of Widespread VR." She's trying to understand our perceptual limitations in order to design around them, but also to see whether we might be able to evolve past and overcome some of them. Her research hasn't yielded any clear answers yet, but she's at least starting a dialogue and getting feedback from the wider VR and research communities.
You can watch Timoni’s presentation of Unity’s VR scene editing tools at the VR/AR Vision Summit here:
Become a Patron! Support the Voices of VR Podcast on Patreon
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast.
[00:00:11.934] Timoni West: I'm Timoni West. I'm a principal designer at Unity. And right now, we're focused on two things, the first of which you may have seen at the conference earlier today, scene editing tools in VR. And the second is consumer applications and continuing the democratization of game development for everyone.
[00:00:30.376] Kent Bye: Great. So tell me a bit about what specific features for scene editing people can expect within Unity?
[00:00:38.642] Timoni West: So today you saw a few things. You saw instantiation of the asset library. You saw the beginnings of the inspector. You saw locomotion using a joystick. In the near term, we expect to have vastly different types of locomotion. And if you've seen other VR editing tools that have come out recently in VR, you'll notice that they use a lot of scaling, zooming in and out of the scene to preview what you're doing and be able to move large objects. You can't do that very effectively with a gizmo, honestly. Like the things I was doing today, when things get too big, when they get to be like a mile high, it's not workable. So, we have a couple of solutions in place long-term that I think will be really cool, where you can actually see the scene that you're currently in, sort of mini-map style, and be able to directly manipulate objects either in room-scale VR, the way that you saw today, or just use the board. If you saw the demo today, you'd see that we had this gigantic whale. I mean, that thing was probably, like, I don't know, half a kilometer or something. But if you had it on the mini-map, it would probably be about six inches long, and you could move it easily anywhere you wanted in the scene.
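To make the mini-map idea concrete, here's a minimal Unity C# sketch of how dragging a tabletop miniature could drive the full-scale original: a position in the map is just a position in the scene, divided by a scale factor. The class and method names are invented for illustration; Unity has not published an API for this.

```csharp
using UnityEngine;

// Hypothetical sketch of mini-map manipulation: drag a six-inch miniature,
// and the half-kilometer original follows. Assumes both root transforms
// are unscaled and the map's scaling is applied via mapScale.
public class MiniMapProxy : MonoBehaviour
{
    public Transform sceneRoot;      // root of the full-scale scene
    public Transform miniMapRoot;    // root of the tabletop miniature
    public float mapScale = 0.001f;  // 1 mm on the map = 1 m in the scene

    // Call when the user moves a miniature; mirrors the move at full scale.
    public void OnMiniatureMoved(Transform miniature, Transform fullScale)
    {
        // Express the miniature's position in map-local coordinates...
        Vector3 localInMap = miniMapRoot.InverseTransformPoint(miniature.position);
        // ...then undo the map scale and re-express it in scene space.
        fullScale.position = sceneRoot.TransformPoint(localInMap / mapScale);
    }
}
```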
[00:01:41.767] Kent Bye: I see. And so there seemed to be a little gizmo that you were using to, like, move things on specific planes. So what are the advantages of this plane tool?
[00:01:50.856] Timoni West: So I think long-term, snapping to surfaces is really key. So we're going to have surface detection and also some clever ways of speeding up that process for you. Like, honestly, it's easy to snap to a plane in real life. All you have to do is drop your phone on the ground and you're like, sweet, snapped to a plane. So we can do that, but we can do that on any given surface, actually. One of our designers has come up with a really interesting way of thinking about this, sort of how shadows can snap to any given plane. We can do that in VR very easily. In general, actually, it is kind of cool, especially for things like architecture or, as I did today with paintings, it's very nice to just isolate a dimension and move objects along that dimension. Having easy ways to quickly do that and place things in the scene to block out scenes, or if you're doing level design or scene design, or if you're a director coming in to make final edits, it's a really handy, quick tool.
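As a rough illustration of the snapping and plane-isolation behavior described here, a surface snap can be as little as a raycast plus an orientation fix, and isolating a dimension is just a vector projection. This is a sketch under those assumptions, not Unity's shipped implementation:

```csharp
using UnityEngine;

// Minimal take on "drop your phone on the ground, snapped to a plane":
// raycast from the held object and align it to whatever surface is hit.
public static class SurfaceSnap
{
    public static bool TrySnap(Transform obj, float maxDistance = 10f)
    {
        RaycastHit hit;
        // Cast along the object's own "down" so walls and ceilings work too.
        if (Physics.Raycast(obj.position, -obj.up, out hit, maxDistance))
        {
            obj.position = hit.point;
            // Align the object's up axis with the surface normal.
            obj.rotation = Quaternion.FromToRotation(obj.up, hit.normal) * obj.rotation;
            return true;
        }
        return false;
    }

    // The plane-tool idea: keep only the component of a controller drag
    // that lies along a single chosen axis.
    public static Vector3 ConstrainToAxis(Vector3 dragDelta, Vector3 axis)
    {
        return Vector3.Project(dragDelta, axis.normalized);
    }
}
```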
[00:02:42.905] Kent Bye: And so are these going to be integrated within Unity 5.4, which is going to be released sometime in March?
[00:02:48.488] Timoni West: Can I break your heart? No, I'm sorry. We've been a little backlogged, honestly. No, it won't be out for 5.4. But we're going to try to get it out into people's hands as soon as possible. And I'll promise you, the second that it's ready, we'll get it out there so people can use it. We know people really want these tools.
[00:03:09.463] Kent Bye: Is it going to be tied to a major release, or is this something that you can release independent of the major release cycle of Unity?
[00:03:15.129] Timoni West: Interesting. I don't know. TBD. We'll see. Yeah.
[00:03:20.394] Kent Bye: So just from using some of these tools and creating in VR, from your own personal experience, what does it feel like, and what is it like to actually be able to build VR within VR?
[00:03:31.178] Timoni West: I mean, there are a couple of different sandbox tools, so I think anytime you use these tools, you're like, yeah, I'm God, I'm just moving this giant thing around and it feels great. And the same is true of doing this in Unity. God mode is awesome, anytime you can do it in VR, and I think this is actually a lot more fun to use than we had anticipated. It was supposed to be, you know, a useful dev tool. So, I think it feels very satisfying. I mean, I've been using it for a while, I know the hotkeys, I know how to move around in 2D, in 3D, but actually just being able to walk up to a bench and just move it like an inch, that is something I could do in 2D as well, but it involves using my mouse, continually turning to check where I am, or typing in the numeric value if I happen to know where it needs to go. It involves, you know, holding down several keys at the same time and really having that mapped physically as opposed to just being able to carefully move it in VR.
[00:04:25.013] Kent Bye: Yeah, it seems like to me the power users of Unity may actually still want to use all their hotkeys in 2D because they may actually be faster, but tools like this to me seem like when, for one, you're refining a scene and there's something about the size, proportion, and scale of VR that doesn't translate to the 2D screen. You really do need to be in VR to move stuff around. But also for people who aren't whizzes at doing this translation of doing 3D manipulation through a 2D interface, it's going to be a lot more natural, I think, for people to just dive into VR and make it easier for them to start creating scenes.
[00:04:58.888] Timoni West: Yeah, and I think there's real value in having people who don't know Unity particularly well in 2D still be able to go in and do their job in VR. If you are a set designer and you don't know Unity well, you can just put on the headset and pretty effectively create something very quickly. It's true that a lot of professional tools where you need to move quickly generally involve a lot of hotkeys right now. And so that is a big open question in VR. This is like a large-scale question for anybody, I think, who's working in VR right now. Like, do you just switch modes continually? How do you represent those modes? Because we don't have a ton of buttons. How many buttons are on a normal keyboard? You have all of those buttons available to you when you're working on a computer, but you have anywhere from, like, six buttons and up a little bit when you're using the motion controllers. So having the equivalent of hotkeys in VR, I think, is an open question. And honestly, I'm curious to see what the hardware will do and how the hardware will change.
[00:05:58.617] Kent Bye: Yeah, you know, I did an interview with Mike Alger about user interface design for immersive media, virtual reality in particular, and he foresaw there could be a time when, just as we use two hands on a keyboard and take our left or right hand off to move the mouse when we want to move around in 2D space, there could be similar combinations in VR. Could you foresee a time when you're sitting at a keyboard using both hands, and then you pick up the 6DOF controller and move it in 3D space? Because a lot of times when you use the Vive, it almost seems like you should be standing up. Can you actually be sitting down and using it?
[00:06:44.065] Timoni West: Okay, so first of all, yeah, we think people will still want to use their keyboard a lot, especially for input stuff, quickly typing out scripts or whatever. So we plan on supporting that from the get-go. But that's an interesting thing, because I've noticed it myself, using the scene editing tools we've built so far: I often just want to stand up, and I'm not sure why, because I could be sitting. Sometimes I'll be lazy and use the joystick to move around, but just standing up feels so much better. So yeah, again, open question, I think. I guess if you're in a room and you just know you need to walk over and move something, it just feels correct to walk over there rather than to use a joystick to move things.
[00:07:19.463] Kent Bye: Yeah, well, going to IEEE VR and the 3DUI conference, Rob Lindeman has a lot of opinions about non-fatiguing interfaces and actually moving more towards using tablets as an input controller, where even though it's like a touchpad controller, you're sitting down and comfortable. I guess when I see things like this, I wonder: is this something that you could foresee someone working in for five, six, eight hours a day, or is this something that you may do in bursts? So you do a lot on the computer, and then you dive into VR and do highly active activities that may actually tire you out in the short term, but you're only doing it in short bursts.
[00:08:00.435] Timoni West: So, well, two parts here. I don't think, no, not eight hours, four and four or whatever. But I mean, people do have jobs where they stand around and are very physically active all day. And frankly, it wouldn't be a bad thing for humanity if we started doing that more. Certainly not for me personally. But here's the thing. If you do happen to check out the keynote later on, and I was surprised by this myself when I was practicing and someone video recorded me, notice how little I move. I actually don't move very much. I mean, I barely ever even raise my arm to full extension. I kind of piddle around a little bit, but most of the time I already know where I need to be and stuff comes to me. So I'm moving my arms a bit, but probably no more than you would in your average conversation. So I've already got the muscles for that.
[00:08:43.689] Kent Bye: And so the Vive is potentially going to be coming out in the second quarter, and the six degree of freedom controllers for Oculus have been delayed into at least the second half of 2016. I'm curious whether, in developing these tools, you're targeting them to work agnostic of the differences between the Touch controllers and the Vive controllers?
[00:09:07.195] Timoni West: Heck yeah. I actually meant to mention that during the keynote and I didn't, but I tweeted it out as well: we will be supporting all types of inputs. If you got your Razer Hydras off eBay, we already support that. We actually have already built an input switcher, so we're hoping to push that out fairly soon, maybe even sooner than the rest of the tools, just because we know it's something that devs want. Yeah, being agnostic is kind of a pain in the butt, really, because the button mappings are so different for the different controllers. That's part of the reason why we want to make everything as customizable as possible as well. I think that as people continue to use different tools in VR, they're going to come up with their own standardization for what they think a thing should be. And with the double triggers on the Oculus and only one trigger on the Vive controllers, you're already getting into a question of what is shoot, what is select, which button is it, right? So we're going to have smart defaults, or we're going to try to have smart defaults, but we appreciate that one person's smart is another person's dumb. So then you can just go remap it to whatever you want.
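A controller-agnostic input layer like the one described here usually boils down to binding logical actions to per-device buttons, with user remapping on top. Below is a hedged sketch; the enum, class name, and button indices are all placeholders, not Unity's actual input switcher:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Logical actions bound to per-device buttons, with remappable defaults.
public enum VrAction { Select, Grab, OpenMenu }

public class InputMapper
{
    private readonly Dictionary<VrAction, KeyCode> bindings =
        new Dictionary<VrAction, KeyCode>();

    public InputMapper(string deviceName)
    {
        // "Smart defaults" per device; the joystick button numbers here
        // are placeholders, not real Vive or Touch mappings.
        if (deviceName == "Vive")
        {
            bindings[VrAction.Select]   = KeyCode.JoystickButton14;
            bindings[VrAction.Grab]     = KeyCode.JoystickButton4;
            bindings[VrAction.OpenMenu] = KeyCode.JoystickButton0;
        }
        else // assume Oculus Touch
        {
            bindings[VrAction.Select]   = KeyCode.JoystickButton15;
            bindings[VrAction.Grab]     = KeyCode.JoystickButton5;
            bindings[VrAction.OpenMenu] = KeyCode.JoystickButton7;
        }
    }

    // One person's smart default is another person's dumb one.
    public void Remap(VrAction action, KeyCode key)
    {
        bindings[action] = key;
    }

    public bool IsPressed(VrAction action)
    {
        return Input.GetKey(bindings[action]);
    }
}
```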
[00:10:00.001] Kent Bye: Oh, interesting. And there seemed to be a moment where you took your hand-tracked controllers and kind of flipped them over to the other side, revealing some performance statistics, like whether the scene you were creating was pegging out and dropping frames, and being able to dynamically track performance with these live animations that you see within the Unity live editor. So talk a bit about what type of things you're showing there and why.
[00:10:26.313] Timoni West: OK, so that was just the profiler and the console. So I mentioned this in the keynote as well. We want to rethink the Unity UI. We don't just want to be drawing in windows. But that is the number one most requested feature. And actually, that was Amir's idea. Amir, our lead developer on the project, he was the one who built that. He came up with the flip idea and implemented it, and I was like, that is a really good idea. Thanks. That's awesome. So yeah, it's just something that has been requested from day one by every single partner we've talked to, and we knew it was high on the list. It was just a matter of: would we be able to take the data and rebuild it in Unity VR in a more VR-friendly way, or are we just going to have to pull in the window? And for now, it's the window. Long term, it will be a little bit easier to read, a little bit more structured, but it will still be just as accessible.
[00:11:19.789] Kent Bye: What types of questions or problems would looking at that information help people answer while they're in the environment, dynamically editing it live?
[00:11:27.670] Timoni West: Oh, I mean it tells you a number of things, but essentially I think the thing people are most concerned with is frame rate, right? Like if you just added something to your scene that's dramatically reducing your processing speed and everything starts to judder, you want to know why. So that's what that's for. And then on the other hand, it's a console, it has errors. So if you have errors, it tells you what the errors are. So you can just look in and instantly know what you did and how that has affected what you're doing.
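For a sense of what a readout like that watches, here's a minimal frame-time monitor; smoothing the number keeps it legible while you edit. It uses OnGUI purely to keep the sketch short; a real VR tool would draw to a world-space canvas:

```csharp
using UnityEngine;

// A minimal frame-time readout of the kind you might flip up on the back
// of a controller: a smoothed milliseconds-per-frame number, so you can
// see when something you just added to the scene starts to judder.
public class FrameStats : MonoBehaviour
{
    private float smoothedMs = 11.1f; // start at the 90 Hz frame budget

    void Update()
    {
        float ms = Time.unscaledDeltaTime * 1000f;
        // Exponential moving average keeps the number readable.
        smoothedMs = Mathf.Lerp(smoothedMs, ms, 0.05f);
    }

    void OnGUI()
    {
        // A 90 Hz headset budgets roughly 11.1 ms per frame.
        GUI.Label(new Rect(10, 10, 300, 20),
            string.Format("{0:F1} ms/frame ({1:F0} fps)",
                smoothedMs, 1000f / smoothedMs));
    }
}
```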
[00:11:51.856] Kent Bye: Yeah, it's interesting. I could imagine a time where you do fully immersive 3D plots as well, to be able to do all sorts of sophisticated data analysis that transcends what you're even able to do on a 2D screen.
[00:12:04.907] Timoni West: And hopefully manipulate it. I think it'd be cool to be able to attach an object to this representative data. That might be a little aspirational, but if you could, for example, choose an asset and then see directly how it affects performance, sort of a really interactive way of getting feedback on different types of game objects, I think that's a really helpful tool that people want, while frame rate is still such an issue in VR.
[00:12:28.291] Kent Bye: What are these consumer tools? You mentioned a consumer product. What does that mean?
[00:12:33.132] Timoni West: The consumer project that we have right now, Project Carte Blanche is the code name, is designed to let anyone create scenes and games in VR. Project Carte Blanche takes a lot of the things that you would use to build out a game in Unity and puts them into another metaphor, one that's a little bit more familiar. So we're going with a card metaphor. So you have your asset package, your asset library, and each asset is a card in your deck. So for example, say you're laying out a medieval scene. The cards all appear in front of you, and you can just pick up a card and place it on your scene plane, and it turns into the actual 3D model. So you've got a cabin that appears, you've got a castle that appears, and so on. And then you can select the game objects and, still using cards, add behaviors to them. I think it's like Magic: The Gathering combined with, I don't know, Dominion, combined with node-based programming. So that's the basic idea.
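One plausible way to model the card metaphor in Unity is a ScriptableObject per card that wraps a prefab, so "playing" a card onto the scene plane instantiates the real model. The names below are invented for illustration; Carte Blanche's actual design hasn't been published:

```csharp
using UnityEngine;

// Sketch of the card metaphor: each card wraps a prefab from a themed
// deck, and playing it onto the scene plane spawns the actual 3D model.
[CreateAssetMenu(menuName = "Deck/Asset Card")]
public class AssetCard : ScriptableObject
{
    public string title;       // e.g. "Castle" in the medieval deck
    public GameObject prefab;  // the model the card represents

    // Called when the user drops the card on the scene plane.
    public GameObject Play(Vector3 position, Quaternion rotation)
    {
        return Instantiate(prefab, position, rotation);
    }
}
```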
[00:13:28.085] Kent Bye: So are these like augmented reality cards? Do they have some sort of markers that are being optically tracked to be able to pull these 3D assets into the virtual space?
[00:13:38.360] Timoni West: No, sorry, other people have thought that as well. Actually, it's just a VR card deck. So you're in VR, and you get to choose which card deck you want. They animate in front of you, show you what they've got. And then you say, I would like the space deck, please. And then you get your space deck, and it appears in front of you. And then in VR, you're laying out the cards. But I do think having physical cards would be cool. Certainly other games have that, right, where you buy the card deck and you get the 3D version of it. So I don't know. We'll see. Long term, that could be an awesome addition.
[00:14:08.020] Kent Bye: Yeah, because it seems like, you know, augmented reality is maybe a year or two behind where virtual reality is at this point. It's going to take a few more years to get there. But what I can imagine is that Unity becomes the lingua franca of all the different immersive technologies, and we're going to be using it to create both in VR and in AR. And so is there anything about these tools you're building that you'd see being better or different in an AR context rather than VR?
[00:14:36.171] Timoni West: I mean, yeah, I think it'd be dope if, okay, so for the consumer project, it's a seated experience. We assume you have a desk or a flat surface in front of you. You don't need it, but it's helpful, because then you can actually calibrate your desk height to where your actual desk is and feel more like you're placing objects. But how much better would that be if you could actually have that card deck in place, actually just put it on there and see it? I mean, I really do think that AR is the future future. VR is going to be for certain experiences. But every single day of our lives, we can't walk around blind to the world, right? So we need AR. And is there anything that we're building now that would be better in AR? No, I think it would be a whole different set of challenges, honestly. For example, again, I had this gigantic whale, and you could see that when I moved the whale, sometimes he just cut into buildings. In AR, you're going to have to fix that, right? So then it's just a different set of tools. Where do the occlusions occur? How do you fix them? How do you cut away the building, and define where the building gets cut away, if the whale is behind the building? A lot of the stuff we're using is very applicable to AR, but as AR ramps up, we're going to need to build more AR-specific tools.
[00:15:45.331] Kent Bye: So just to clarify something, because Unity Labs is kind of like this experimental cutting edge: you're building things like these tools for VR developers who are using Unity to create VR experiences. It's kind of meta, tools to build tools, like world-building tools. And this consumer product sounds like it's a game, just like any other experience that would be created by an independent game developer. So is that what this is? Is it just sort of a consumer-based experience that's in VR?
[00:16:22.537] Timoni West: Yeah, actually. I mean, yes, it is. But two things. First, it'll link with the Asset Store. So you can actually just download Unity asset packages, a new kind of Unity asset package that's very VR-specific. And secondly, you could actually save whatever you do in Carte Blanche and open it in Unity. So, yeah, it's a fun game, it's a consumer thing, you can share your scenes with your friends, but it is very tightly integrated if you wanted to then take it to the next level and start going into actual game design.
[00:16:53.597] Kent Bye: It's like a type of experience to give people an immersive interface to the Unity Asset Store with all these 3D assets. What's the game? What's the object?
[00:17:03.321] Timoni West: It's a game maker, I guess, a world maker, so it's just fun to make stuff, right? It's like Lego, insofar as Lego could possibly be a game. But you can share scenes with other people, so there's that. But also, you can make it a game. So you could build your own little VR game using Carte Blanche.
[00:17:21.125] Kent Bye: I see. So it sounds like world-building tools are going to be a huge thing. And the big question I would have, my theory at this point, is that the company that creates the world-building tools for making VR experiences within VR and interlinking them is laying the foundations for the metaverse: building tools that can create these worlds and give people ways to dynamically interact with them. Perhaps they're hosted on the internet, or perhaps they're, at this point, Unity apps that you download. It's kind of like this different model between the walled-garden app infrastructure versus the open infrastructure of the internet, which is all based on open technologies. And so from Unity, is this like your early stake in trying to create the foundations of the metaverse?
[00:18:12.765] Timoni West: Interesting. Well, one nice thing about Unity, I have to tell you, by the way, I want the metaverse to be a real open metaverse personally. So I'll say that first of all. But I think, sure, you could use Carte Blanche to share worlds, but you could also then send them out into the rest of the world. I mean, Unity stuff works on any platform. So I imagine the metaverse being much more like the internet. I really don't want it to be like AOL circa 1993. So my hope is that people using all types of platforms, not just Unity, have the ability to create their own worlds, their own phenomenal worlds, and share them any way they want. That's my hope there.
[00:18:56.307] Kent Bye: Great. And finally, to sort of revisit this question again, what do you see as the ultimate potential of virtual reality, and what do you think it might be able to enable?
[00:19:05.269] Timoni West: So the last time we talked about this, I talked a lot about the empathy and just the sheer power of being able to share these experiences and to learn more quickly, you know, teach a man to fish, et cetera, et cetera. But lately, I've been doing a lot more research into the cognitive implications of VR, which is, incidentally, the title of a talk we're giving tomorrow. There's a lot of interesting stuff around how the brain works in this very fuzzy, nebulous way right now. How it takes in data and translates that data is largely just a series of neurons twitching, and your brain makes the best use of it that it can. But I think there's a really interesting potential to be able to isolate certain senses or certain things in the brain and basically reteach ourselves, or almost super-evolve ourselves. So for example, humans don't process in 3D. Bats process in 3D. We only get things in two dimensions, and we use triangulation and a series of other cues to actually create cognitive maps. Now, I have no plans to actually build this product, but I'd like to think that there is some way we could use VR to start teaching ourselves how to truly think in 3D, because the number of weird optical illusions that have come up just working on the scene editing tools is so significant. My brain wants things to be a certain way that they are not, and I know that that is the case, but I can't help but think the longer I spend in VR, the more likely I am to be able to actually parse what's really going on. So this is kind of a new thing that I'm thinking about lately, but I think it could go up, up, up.
[00:20:41.046] Kent Bye: Wow, that is really fascinating. What's an example of a perceptual illusion that you think you might be able to train your brain to overcome, given more time?
[00:20:53.162] Timoni West: OK, so the gizmo is a really easy one. So we have this gizmo in VR. You can see in the screenshots. So the gizmo has to maintain the same space in your field of vision, no matter where the object in question is. So that means, hilariously, when you get something like a mile away, the gizmo looks like it's a mile high. But it's not a mile high. It's the same size it was before. It just looks like it's a mile high. So at what point does my brain get that the gizmo didn't change size, if ever? We're really used to motion and velocity as a sign of the height of an object, because that is the world in which we live. But in VR, that's not the case. So, interesting stuff.
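The gizmo behavior she's describing is the standard constant-screen-size trick: apparent (angular) size is roughly world size divided by distance, so rescaling the handle in proportion to its distance from the eye keeps it a fixed size in your view. A minimal Unity C# sketch, with invented names:

```csharp
using UnityEngine;

// Keep a gizmo handle at a constant apparent size: scale it linearly
// with distance from the eye. At a mile away it really is mile-sized
// in world units, which is exactly the illusion described above.
public class ConstantScreenSizeGizmo : MonoBehaviour
{
    public Transform eye;              // the HMD camera transform
    public float sizePerMeter = 0.02f; // world-space size per meter of distance

    void LateUpdate()
    {
        float distance = Vector3.Distance(eye.position, transform.position);
        transform.localScale = Vector3.one * (sizePerMeter * distance);
    }
}
```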
[00:21:30.796] Kent Bye: So if I understand that correctly, given something at some size, we're taking in all of the contextual information, like the perspective lines. I've seen a lot of perceptual illusions with objects that are physically the same size if you were to measure them, but if you put them in an environment with perspective lines that are converging, your brain automatically assumes one is smaller.
[00:21:54.968] Timoni West: Triangulation. Yeah, exactly. So that's really interesting, because in real life, it doesn't matter. It's like, oh, that's an interesting quirk of the brain, but it doesn't really matter. But in VR, it actually kind of does matter, because you need to make sure that you know that the gizmo, or maybe you don't need to know. So that's the part where I'm not sure how to force the brain to determine that the thing is actually the original size that it was, despite all obvious visual cues. There have been some studies done with people who had eye replacements, who were basically blind and then could see later. And at first they were unable to see objects as a whole. So if you asked them, for example, about this floor right here with a bunch of shadows on it, they would say that this is one, two, three things. The brain hasn't yet learned that it's all the same object. Eventually they get it and they can move around the space. But the fact is our brains are kind of a blank slate from the beginning. That's the takeaway, right? The brain didn't know that there was a shadow here; it really thought that there were three different objects. And I think maybe we could retrain the brain a little bit. I don't know if that's necessarily the most useful thing we could train; it's just a very obvious example.
[00:22:55.674] Kent Bye: Could you elaborate a bit on why you started doing this research into cognitive processing, what you're presenting here, and what you hope to communicate to people who are developing within VR?
[00:23:06.085] Timoni West: Well, Labs has a few different goals, but one of them is to really just do deep dives into how people use things in general. And I've always been interested in cognitive psychology and cognitive behavioral science. So this is sort of a natural next step. We're seeing these weird quirks, but why do we see them? What is happening in the brain where these things seem like they should work, and they're not working in VR? I mean, if you're going to try to troubleshoot stuff, it's really helpful to actually know what's going on in the brain.
[00:23:31.482] Kent Bye: Wow. So what are some of the big takeaways that you're going to be telling people?
[00:23:35.170] Timoni West: You know, we're not coming to any real conclusions. We're more just talking about, so it starts off with, here's how the brain works so far as we know. Here's what's going on in academia right now. And then the last part is on interesting, weird quirks that we've seen. And obviously, the things that other VR developers are going to have to deal with at some point if they're doing something similar. So the actual discussion around how to get around that, we're not quite there yet. We're still in the research phase.
[00:24:01.057] Kent Bye: OK, great. Well, thank you so much.
[00:24:02.197] Timoni West: Sure, OK. Yeah, thanks. Nice talking to you again.
[00:24:05.575] Kent Bye: And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voicesofvr.

