#30: Kite and Lightning team talks about Senza Peso, Unreal Engine 4 vs. Unity, 3DUI, motion capture technique tradeoffs, VR lessons learned & future of cinematic VR storytelling

I’m joined by the Kite and Lightning team including co-founders Cory Strassburger & Ikrima Elhassan as well as developer/VFX artist John Dewar.

They talk about the creative process behind converting the mini-opera Senza Peso into a 2D motion graphics film and then into an immersive virtual reality experience, which created some impressive buzz within the VR community.

They also discuss several reasons why they chose Unreal Engine 4 over Unity, including how it lets them more rapidly prototype the look and feel of their VR experiences and gives them more control through access to the source code. They also talk about the decision to record stereoscopic video of the characters rather than using motion-captured avatars.

Cory also talks about his background working on the sci-fi film Minority Report, and his interest in helping develop 3D user interfaces in VR, as demonstrated in The Cave & the K&L Station experience.

Finally, everyone talks about some of the major takeaways and lessons learned from working on all of their VR experiences over the past year, where they see VR going, as well as how many exciting, open questions there are right now.

To keep up with all of the latest developments at Kite and Lightning, be sure to sign up for the newsletter listed at the bottom of their website here.

Reddit discussion here.

TOPICS

  • 0:00 – Intros
  • 0:51 – Backstory behind Senza Peso. Getting a DK1 changed everything. Switching to Unreal Engine
  • 2:56 – Comparing Unreal Engine to Unity, and what UE4 provides
  • 5:25 – Translating the story to a 2D motion graphics film, and then translating it into a cinematic VR experience
  • 9:35 – How they did the character capture with stereoscopic video
  • 11:06 – Programming challenges for creating this cinematic VR experience
  • 12:47 – Visual design considerations & working with Unreal Engine 4 in contrast to what the workflow would’ve been with Unity
  • 15:29 – Ikrima’s and Kite and Lightning’s takeaways from working on this project
  • 17:14 – 3D user interface prototypes in the Cave & insights from working on sci-fi films like Minority Report
  • 21:51 – Other 3DUI interface insights from the VR community including Oliver Kreylos’ Virtual Reality User Interface (Vrui)
  • 25:56 – Tradeoffs between file sizes in using different motion capture techniques
  • 31:38 – Experimenting with experiences that are on-rails, trigger-based, or completely open-world
  • 35:17 – What type of innovations they’re working on in terms of motion capture and graphics. Optimizing their production pipeline processes.
  • 37:14 – Lessons learned for what works and doesn’t work within VR
  • 44:51 – The ultimate potential for what VR can provide
  • 52:35 – What’s next for Kite and Lightning

Theme music: “Fatality” by Tigoolio


Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.938] Cory Strassburger: So my name is Cory Strassburger. I am one of the co-founders at Kite & Lightning and come from VFX. I'm part of the creative team.

[00:00:20.525] Ikrima Elhassan: I'm Ikrima Elhassan. I'm the other co-founder of Kite & Lightning and I do development. I'm a programmer by trade. I used to do VFX R&D.

[00:00:32.472] John Dewar: Okay, and I'm John Dewar. I started in VFX, moved over to web development for about five years, and now I'm combining them back together. I'm half a VFX artist and half programmer. Well, hopefully more than half of each, but...

[00:00:51.210] Kent Bye: Great. Well, so Kite and Lightning made quite a sensation here recently within the virtual reality community with the Senza Peso virtual reality experience, which was a little mini opera. And maybe you could start there: how that project, which sounds like it was going on for about five years, was happening and then wrapping up, and then the decision to convert it into a virtual reality experience.

[00:01:15.624] Cory Strassburger: Right. Yeah. So, you know, it's funny because the project itself, the majority of it had been done after sort of the three-year mark. And then Ikrima sort of showed up one day with the DK1, you know, right fresh from the Kickstarter, and suddenly it was like nothing else mattered. You know, the five shots that I had left to finish Senza Peso, that could have been done in like a month, just stretched out to the two-year mark. So that kind of got it up to the five years. And, you know, we had just kind of wrapped up with building the K&L Station, the VR experience. And I was just finishing up that last shot on Senza Peso, and more like actually cutting out a few shots just so I could get it done, you know, because a lot of awesome people played a part in working on it. And at that point, I was like, all right, I've got to just finish this thing. So as I was finishing it, we were kind of talking about, you know, what we were going to do next. And one of the ideas was: let's take all the awesome assets in Senza Peso, including, you know, the music, which was so powerful, and use them to build an experience out of. The other thing that kind of played a role was Ikrima had, I think, come back from GDC and was like, dude, forget Unity, we're switching to the Unreal Engine. And I was like, okay. And so Senza Peso became, I think, in our opinion, a great sort of test bed to jump into the Unreal Engine, because it's super strong on the visual front. Like all those things sort of lined up, and then we, I don't think we thought about it for very long. We just kind of said, yeah, let's do it. And then we just jumped into it.

[00:02:55.943] Kent Bye: Yeah. So maybe talk a bit about that decision in terms of switching from Unity to Unreal Engine 4. What were the big selling points in terms of what you're able to do graphically and visually that you're not able to do in Unity?

[00:03:07.901] Cory Strassburger: Well, on the graphic front, it's funny because I think Unreal has sort of always been on our radar from a graphical standpoint, right? We're always trying to achieve the highest quality graphics, and it takes a lot of work, I think, to get that out of Unity. So Unreal has always sort of been on the radar from that standpoint. And Ikrima can speak to the other side of, I think, why we actually jumped into Unreal this early on. So if you want to fill in that blank.

[00:03:37.968] Ikrima Elhassan: Yeah, sure. I think for us, the biggest change motivator was the fact that we could get source code access. Having source code access enables us to make some changes pretty quickly. For us here, we have the technical chops as well as the art chops, so a lot of times we're able to come up with really clever, creative solutions as well as R&D technical solutions to things, and with Unity it was always kind of constrained by the fact that it was this big black box, and trying to figure out how to make Unity do what would be really easy to do if we could just modify the source. And we've been loving Epic and what they've been doing with monthly updates. They're really open and transparent with their roadmaps, they really have their finger on the pulse of what the community is needing and wanting, and it's just been an awesome experience so far being on the receiving end of that. And even in Senza Peso, we were able to get our download size down from its initial four gigs to one gig, because someone from the community had released a video texture plugin that used the WMV codec. And so we were able to use that instead of this homegrown thing that we cobbled together over a day that had like three gigabytes of video files. And it's kind of been great so far, but I always say at the end of the day, it's just a paintbrush in our creative toolbox. Depending on what it is that we're trying to create, we'll use the right tool to achieve that goal.

[00:05:24.528] Kent Bye: So when I'm looking at your website for Senza Peso, I'm seeing that there's a whole story, a backstory to this opera, and the original intention. And I sort of see that it went through this translation into the first CGI film that you put together, and then yet there was another translation into the VR experience. Can you maybe talk a bit about that process of converting this idea, this story, through those different iterations, and the process of figuring out what to focus on in each of those mediums?

[00:05:56.887] Cory Strassburger: Yeah, sure. You know, the song was primarily sung in Italian, and there's a little spoken section in Russian. So we kind of, you know, me and my buddy Alan, who I co-directed it with, got super inspired by this song. You know, some friends of ours were making it as part of another project, and we were helping kind of record the making of it. So once we heard it, we kind of fell in love with it. And then we knew we wanted to make a music video, or that's initially what we had thought. And so, you know, all the performers were like, yeah, let's do it. So we got them to translate the Italian into English so that we could kind of figure out what the story would be. And then as we went down that road, you know, the song itself lyrically is very ambiguous in some ways. It's very sort of poetic and nonlinear. And so we kind of extracted a bunch of little pieces that started to turn into this Greek mythology based, you know, River Styx kind of journey through the afterlife. You know, so we were just kind of having fun and going wherever we went with it, and we ended up with that original story. And then we immediately kind of jumped into animatic mode, so we started building some of these models, and we kind of got the boat in there, and then we started blocking out what the scenes would be like, you know, from the traditional standpoint. So from that point it was really just like, we've got this animatic, you know, we shot the green screen stuff, and then after that it was spread across the five years of just building the environments and all that. And, you know, without getting too heavily into that part of the project, it was one of those things where you're like, all right, you know, we'll knock this out in like six months or something.
And, you know, there was a lot of support from this company that was giving us render farm access and all this, but it just turned into this sort of huge thing, you know, with like no client, no deadlines. You know, I had so much energy to put into it and was having fun just exploring, like, weeks on end just making a particle emitter, you know, it's pretty ridiculous. So that was kind of the story of the movie. And then I think the translation into VR was fairly straightforward. I mean, I don't think we even really conceptualized too much about what that was going to be. It was sort of like, all right, let's see what assets we have that actually would work in VR, because as you know, like in cinema or like 2D, a lot of that stuff is just comped and faked, and a lot of those worlds just wouldn't translate into VR space, because, you know, it's like matte paintings and things. So I kind of went through the whole asset folder of all the geometry that I had. And, you know, if you watch the movie and you go through the VR experience, you can see definite similarities in assets, you know, from the movie, but there are a lot of differences in terms of the realms. So it's basically a really freeform thing, you know, where we just started with the assets. Originally, I think we had like six realms that we were going to go through, and then we kind of cut one out based on time and kind of based on, you know, there wasn't a whole lot of interesting things happening in it. I think it was just a really kind of organic thing. And so we just started mocking up these little worlds and then we started flying the camera through it, you know, all with the intention that you're going to be the girl from the story. You know, you're going to go into the boat and kind of travel through the realms for yourself and then go through the tunnel, you know, the light of the unknown at the end.

[00:09:34.677] Kent Bye: Yeah. Maybe talk a bit about the actual animations that you have in there, in terms of the process that you go through, either motion capture or photogrammetry, in order to do some of those effects with the characters that are in that Senza Peso VR experience?

[00:09:51.239] Cory Strassburger: So it's actually not photogrammetry or motion capture, both of which, you know, we've been kind of heavily working on, because in a lot of ways the holy grail is to be able to do like 4D motion capture that you can fly all the way around in VR space. In this case, because we knew we were going to be on rails, we knew that we were going to be able to control sort of where the user went, so we ended up doing a trick where we just shot stereoscopic live action over a green screen. So it's just left-eye camera, right-eye camera, and then we mapped them onto a billboard, or two billboards, that always face the camera. When you do that, there's a lot of ways you can kind of cheat dimensionality; you can cheat the fact that, you know, when you know what the user is going to be doing, you can cheat the motion or the sense of perspective into those images. You know, there's just a lot of room for slop where your brain kind of fills in the gaps; you kind of just believe it up to a certain degree. So it's kind of a cheap way to get some cool live action characters in there, and it worked really well for the last few things that we've put them in.
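The camera-facing billboard trick Cory describes can be sketched in a few lines. This is an illustrative, stand-alone Python sketch, not Kite and Lightning's actual Unreal implementation: given the billboard's position and the camera's position, compute the yaw that keeps the quad's forward axis pointed at the viewer each frame.

```python
import math

def billboard_yaw(board_pos, camera_pos):
    """Yaw (radians, about the vertical axis) that turns a quad's
    forward axis (+X) toward the camera on the horizontal plane."""
    dx = camera_pos[0] - board_pos[0]
    dz = camera_pos[1] - board_pos[1]  # second component = horizontal depth
    return math.atan2(dz, dx)

def forward_after_yaw(yaw):
    """Unit forward vector of the quad after applying the yaw."""
    return (math.cos(yaw), math.sin(yaw))

# Each frame: re-aim the billboard at the (possibly moving) camera.
yaw = billboard_yaw((0.0, 0.0), (3.0, 4.0))
fwd = forward_after_yaw(yaw)  # now points from the quad toward the camera
```

Yaw-only (cylindrical) billboarding like this keeps the video characters upright; because the experience is on rails, the viewpoint never swings far enough around for the flatness of the quad to break the illusion.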

[00:11:04.554] Kent Bye: And John, from the programming perspective, what were some of the things that you were thinking about in terms of putting this together?

[00:11:09.836] John Dewar: I think for me, I mostly was dealing with just getting that experience to flow together. So I spent a lot of time playing with the Unreal feature called Matinee, which lets you put together cinematics almost like an NLE, like a Final Cut Pro or something. But it's still fairly primitive, so there are quite a few sharp corners on that to watch out for. And it was a big challenge to push that tool beyond what it was intended to do to get this experience to work. We had this very long camera animation in it. Normally, it would just be a few quick linear camera moves, but here we have this camera move that is going on for five minutes. And a lot of events are getting fired off at specific moments in the song to keep everything in sync and make sure the levels are loading and unloading throughout the experience smoothly, without everything grinding to a halt at some point. So there were a lot of challenges along that line for me with this project. And it's our first Unreal experience. I think the best way to learn something is to have a really tight deadline, which we had on this, and then also to push the boundaries of the tools, because it forces you to get beyond the tutorials and dig into certain features and learn how the tool is structured and how it works in a deeper way than you could if you were just dabbling with tutorials.
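The event-track pattern John describes, firing cues off a long music-driven camera move, can be sketched outside the engine. This is a hypothetical Python sketch, not Matinee's actual API: each cue fires exactly once when playback time passes its timestamp, which is how level loads and unloads stay in sync with the song even when frames skip.

```python
class CueTrack:
    """Fire each (timestamp, callback) cue exactly once, in order, as
    playback time advances, even if a frame jumps past several cues."""

    def __init__(self, cues):
        self._cues = sorted(cues, key=lambda c: c[0])
        self._next = 0  # index of the first unfired cue

    def update(self, song_time):
        # Called once per frame with the current song time.
        while self._next < len(self._cues) and self._cues[self._next][0] <= song_time:
            self._cues[self._next][1]()
            self._next += 1

log = []
track = CueTrack([
    (95.0, lambda: log.append("unload realm 1")),
    (10.0, lambda: log.append("load realm 1")),   # order in the list doesn't matter
    (100.0, lambda: log.append("load realm 2")),
])

for t in (0.0, 12.5, 101.0):  # simulated frame times
    track.update(t)
# log is now ["load realm 1", "unload realm 1", "load realm 2"]
```

Keying cues off song time rather than frame count means a dropped frame can delay a load slightly but can never desynchronize the sequence from the music.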

[00:12:46.162] Kent Bye: Yeah, and I think certainly working from an evocative song that really stirs up a lot of emotions and also having all those assets to play with, I think, you definitely see how it comes together to be a really compelling experience. And I'm curious in terms of the graphics perspective, what were some of the considerations that you were really going for in terms of the look and the feel and adding that extra element of immersion within virtual reality?

[00:13:15.104] Cory Strassburger: So on the visual side, I mean, you know, a lot of it was kind of like where John was coming from: we were figuring out Unreal as we went. I usually tend to fall for certain aesthetics that I really like, you know, in terms of lighting and just general mood and atmosphere. So naturally, regardless of the platform I'm on, I'm always trying to push the level of that stuff, with VR too. Because it's really easy to render that out in 2D, traditional style, and get all those things, but suddenly in VR, in a real-time engine, there are a lot of things that, you know, you can't really take for granted. So for me, I think a lot of it was just driven by the Unreal Engine itself. And, you know, at its core, it does things really well. Like, you know, it's got built-in ambient occlusion, it's got really great lighting algorithms, and the shaders are pretty phenomenal. So a lot of it just kind of happened as I was going, you know, just placing lights and working up a scene the way that I would want, and it all sort of was looking good to me as I was going. So a lot of it's a testament to the Unreal Engine. Because if I was going to try to get that same quality in Unity, the pipeline would be going back to Maya and baking textures. And it really just wouldn't have been possible, I don't think, for us on our timeline to get that level of quality with Unity. I think we definitely could have gotten near it, but it would have taken a lot longer and there would have been a lot less flexibility, right? So we could change something in Unreal just like that and rebake the lighting in seconds and it's just there, which is a godsend for me, because in Unity, if we move something around, then I've got to go back to Maya and rebake a texture. It's very draining on the creative side of things, you know?
The look really just came out of what Unreal does well, and then sort of the aesthetic, I guess, that bridged over from Senza Peso, the movie, or at least the theory behind the look of that movie.

[00:15:30.826] Kent Bye: And Ikrima, what were your takeaways from working on this project?

[00:15:35.349] Ikrima Elhassan: I think it was pretty awesome in that we were able to accomplish this pretty big visual feat that's by far our most elaborate and, visually, highest quality thing that we've put out there. I think for me the biggest thing is that we always talk internally about how we have a very high internal quality bar, and we're pretty critical of ourselves internally, and for something we've created to impress us internally, it has to meet a really, really tough bar. And, you know, we're cognizant that, in the beginning as we're making all this content, what's really amazing and impressive for us is such a high bar, above what most people would be really impressed with and happy with. And we were really considering whether it would be kind of overkill in a way to really soup up the fidelity and the visuals and the aesthetic of Senza Peso by going through the Unreal Engine, or if people would really appreciate that. And it's been kind of an overwhelming surprise in that it really hit that note and resonated with people: it was visually awesome, and it did look cool, and it was aiming for that AAA game title level of quality, and people really resonated with that, whether they were in the VR audience or whether it was people who have been reaching out to us to see what we've been working on. That was my biggest takeaway from the project.

[00:17:13.798] Kent Bye: Yeah, and I guess we haven't really mentioned it, but a lot of you have worked in the special effects industry on some pretty big Hollywood films. And when I'm looking at the Batcave and I'm seeing some of those 3D user interface prototypes that you were doing in there, I get really excited about what could be possible in virtual reality. And not just for special effects professionals, but eventually having that level of fidelity of a 3D user interface to be able to do the things that you see in Iron Man or Minority Report. And maybe talk about that in terms of the sci-fi visioning of working on some of these projects in Hollywood, but also how you can translate that into something that's pragmatic and real within virtual reality.

[00:17:59.945] Ikrima Elhassan: Yeah. Maybe Cory would be a pretty good person to talk to about that, because he actually worked on Minority Report many a year ago. He can kind of talk about the stuff that he was envisioning and ideating for that.

[00:18:13.826] Cory Strassburger: Well, I mean, sort of, Kent, to your question, I think I'm ultimately in the same boat as you and the rest of us. I sure as hell would love to have an interface like that in VR. Because for us, it's like, you know, we kind of mock something up and you kind of get the feeling of it and it's really cool. But to actually put functionality behind that and make it a usable thing where you can, Iron Man style, pull up holograms of things... aside from the cool factor alone, to be able to create these experiences in a similar manner is super high on my list. It's sort of the thing at the top of my list that I need the rest of these guys to implement, and it's like at the bottom of their list. So I don't know how quickly it's going to turn into reality, but the idea of being in VR space, like in Maya, for example, and then having some kind of interface where I can literally just say, okay, I need a tree here, I need a mountain here, and start pulling from these assets in that space, and then spatially move things around, is like a dream come true that's right in front of me, you know. So the idea of actually trying to implement all that stuff, like, I can't wait for that to happen, whether we start building some of those tools as we go, which I'm going to really be pushing for, or, you know, somebody else starts and then we get to utilize it. I mean, I'm ecstatic about that. And then just kind of jumping into the Minority Report side of things: it's funny, you know, originally when we were working on that movie, in the story it was supposed to be set in the year 2085. And so we started conceptualizing what the future would be like at that time. The movie kind of got put on hold at a point, and then when it came back in front of Spielberg to actually start, they switched it to 2045 so that we didn't have to tackle so many sort of ultra-futuristic concepts, I guess.
But we came up with some pretty amazing things. We had guys from MIT and a bunch of the universities contributing to what the future was going to be, what our interfaces were going to be like. And I think a lot of my favorite stuff is stuff that never made it into the movie, that conceptually I think would be awesome for everybody to see; the behind-the-scenes of that stuff alone, because the interface, the future computer concepts, were indescribably awesome. And what ended up in the movie was really cool visually, and I think that came from some basic concepts from Imaginary Forces that were really pushed on the visual side, just because it's like a movie. But it all kind of boils down to, I guess, this sort of holographic interface with gestural movements to control it, along with voice. And like I said, now with VR, we actually have the ability to create that. We've got the gloves and all the stuff coming out to do it. And I feel like it's funny that it's really the only thing I haven't seen a lot of progress made on, and maybe you or the other guys have. But I haven't really seen anybody coming out and saying, hey, we just got a million dollars in funding, we're creating this sort of virtual reality interface. And it's probably because all the different components needed to do it aren't really out yet either. But that hasn't stopped anybody else on any of the other fronts, it seems. So I don't know, I'm curious what you guys think about the progress of that kind of interface actually happening in VR.

[00:21:53.378] John Dewar: And there are a couple of cool things out there, right? There's the CAVE VR interface that they've had for years and years. What's that? That's kind of this user interface built entirely for VR, for academic research, and I believe you can download that and play with it on the Rift now. It's kind of interesting. It's not super great, but it is something established and actually useful. And what I think another part of it is, is kind of the Apple issue with the Minority Report interface. 'Cause, you know, their argument against Windows 8, basically, is that we've seen a lot of people building giant touchscreens and we haven't seen any of them really catch on. And I think if you're doing real work, you need to have some kind of surface to rest on, because you need to rest on something to have a lot of precision, like your Wacom tablet. So to me, you need some kind of prop. So it's almost like it goes beyond VR and becomes an AR problem. If you really want to do work on things very precisely and for a long period of time, we're going to need some kind of prop that we can use to give ourselves that precision. I saw something recently, I don't know if I can find it, but it was like this tablet, and you wear AR glasses with it. It was basically a Wacom tablet, and you can move it around in space, so you have a 3D model floating in space in front of you, and you move your tablet around, and then you can adjust things on the model very precisely with the pen and the tablet, and you're kind of holding it in two hands. So I think that's a really cool concept, and maybe that could be part of the solution to making a really practical spatial interface.

[00:23:36.579] Cory Strassburger: Right, right.

[00:23:38.400] Kent Bye: Yeah, and from what I've seen, one thing that I'd point to would be Oliver Kreylos, also known as Doc_Ok on Reddit. He's got what he calls the Virtual Reality User Interface, or Vrui; he kind of says it all together. But he will take two Razer Hydras, and he'll take like a globe, an Earth, and there'll be GPS data for where earthquakes were, and so it'll be actually below the surface of the Earth, and he'll be able to kind of move it around and have a little panel there to the right that has little menus where he'll be able to do different stuff. But he's actually even taken CAT scans and been able to produce 3D models from a bunch of 2D slices and then do different things, like: here, isolate this organ, then construct it in 3D and be able to zoom around in it. And so I think the other thing I'd say is that right now VR input is in such flux. There are no standardized input controls, and there are things like the STEM controllers, which seem to be on the leading edge in terms of physical controllers, and then the Leap Motion and Kinect, but more along the lines of the Leap, which is more lightweight and focused on just the hands rather than a full skeletal body, which is kind of what the Kinect is focusing on. So once you start to have more of these input controllers out there and people using them, I think you're going to start to see more interactions with data visualizations. I think a lot of the stuff that has been happening has not been public yet, just because, you know, the Rift has only been out for a little over a year. DK2 is coming out. You're going to get a lot more corporate interest with companies that are looking into stuff. You see things like Bloomberg is going to be doing terminal visualizations with the Rift.
And so I think that, you know, once it gets into those types of businesses, that a lot of innovation is probably going to come from the corporate side rather than the gaming side, just because they have real business problems to solve from that data, rather than a gaming application.

[00:25:32.629] Cory Strassburger: Yeah, that's a great point.

[00:25:34.311] John Dewar: And actually, that really is what I was talking about. That's the CAVE interface, where when you have the actual CAVE, you have these optically tracked motion controllers that are very precise, using professional mocap cameras. Oh, and the tablet, I just found it. It's called Gravity Sketch.

[00:25:56.765] Kent Bye: So I'm curious: in your experience of doing an on-rails VR experience, it was around five or six minutes and almost a gigabyte or so, or maybe a little bit more when you uncompress it. And so I'm curious about what you see as the limit in terms of the type of techniques that you're using, in terms of the trade-offs of the file size versus how long of an experience you can have. What do you foresee as the future of these types of experiences and those trade-offs?

[00:26:25.492] Cory Strassburger: You know, it's interesting to think about, because on one hand, the file sizes only really come to our attention when we actually get to the end. And we're like, how big is this thing anyway? It's like 500 megs, that's not too bad. Or, oh shit, it's a gig and a half or two gigs. So we haven't really been that concerned about it up till now, at least with the projects that we've had. You know, as we start to get into more, like, shipping types of projects, obviously that becomes something you have to consider right out of the gate. You know, it's an interesting thing, too, because, like you're saying, with the DK2 and the evolution of VR, how long can somebody really be in here? How long does somebody really want to be in here? What are people expecting out of these kinds of experiences in terms of length? I mean, I have a vague idea. But in terms of really long things, I feel like the only thing I could actually spend a long time in right now is this sort of experimental VR experience in our office called the Genesis. It's like a big eight-foot-tall, sort of cube-octahedron-shaped audio sensory thing. But you lay down in it, and we kind of created this very spiritual, sort of meditative experience. And that's actually the one thing in VR that I could probably spend an hour in if the experience kept going on and on. And it's super relaxing. It's pretty wonderful to be in. But outside of that, man, I think people's tolerance, and then how the story unfolds, it's sort of a mystery to me in terms of where that's going to net out. And then if it's long, are people going to be that concerned about download size? I suppose if it's a mobile-based thing, then definitely. I think a lot of what contributed to Senza Peso's file size, and Ikrima can correct me if I'm wrong, is the video that we ended up putting in there. Was that a big contribution to that gig file size?

[00:28:26.235] Ikrima Elhassan: At the end, it didn't end up being. In the beginning, we had a four or five gig download, and three or four of those gigs were the video files, so they were definitely a significant portion of it. By the end, they might have been a couple hundred megs total.

[00:28:40.579] Cory Strassburger: What do you think, Kent? As far as, like, does file size, relative to the experience length, seem like something that people are thinking about?

[00:28:49.865] Kent Bye: Well, I think my concern would be something along the lines of the decision to use something like stereoscopic video rather than character animation and motion capture, where it's more vector-based and more efficient in file size. And so, you know, you kind of have this situation where you could potentially have longer experiences that are smaller if you use different techniques. And I'm not sure if you've thought about that, or experimented, or if that's even on your radar.

[00:29:19.086] Cory Strassburger: Yeah. Yeah. I mean, you know, there's a lot of factors actually that kind of go into those decisions. And definitely the idea of having, like, a 4D-captured character that is more vector-based and less video-based, and you can travel all around it. I mean, that's definitely the holy grail and certainly on our radar, and actually something we've spent a bunch of time kind of developing and pushing forward in-house. But to me, I think that that's sort of fairly far off still, at least in terms of being able to do it in a timely manner and therefore cost-effectively for real projects. I think it's possible today, but it's like super expensive. And then when you get into full body capture, that's something that hasn't really been tackled yet. So with that being kind of like the holy grail ideal scenario, motion capture would sort of be the thing that's possible today. And, you know, the quality difference, at least in, like, Senza Peso, for example: if we had motion captured one of those characters, it wouldn't have been anywhere near as good, or as believable, or as real-feeling in the subtle ways as the video technique. And also, the video technique's incredibly fast, right? So for us, the trade-off is, yeah, the file size is gonna get bigger, and you can't fly around it.
But in this particular case, those limitations weren't really a big thing for us, and the fact that we could literally knock it out in an afternoon, in terms of shooting somebody, and then getting a real performance, I don't think can be compared to any other sort of technique, you know. And so to me, if file size is the only downside at this point, I guess to your point, if it's longer and we have a lot more of them, which I definitely wanted, you know, I wanted to have this whole world populated with characters front to back, and mainly we just ran out of time. And then initially, before Ikrima magically solved the video file size issue, it was like every second we added was adding these massive megabytes. So yeah, I mean, I guess what's sort of that cutoff in terms of this technique and getting a lengthy experience? Yeah, I'm not really sure.

[00:31:36.901] Kent Bye: Yeah, that's really interesting. I mean, the other thing that I think about is that in talking to different people, they kind of create three categories, where either you're completely on rails, you have some combination where you have some freedom in your triggering events, or you have complete freedom where you're just walking around in open worlds and you kind of just have things happen to you. So there you have the least amount of control from a director's perspective. And so I kind of think of it in those three ways. And Senza Peso is definitely, like, on rails, you know, you don't have much triggering and it's just kind of the same every time. You can look all around, but more or less, there's no events that are being triggered and you don't have a lot of autonomy. So I'm curious if you guys have thought about, you know, those other two types of experiences, where you have at least some sort of on-the-rails component, but things are triggered by actions, either by looking at things or by proximity, and then ultimately getting to the open-world component.

[00:32:32.740] Cory Strassburger: Have you seen the first two demos we created, which one of them was called The Cave, and then the other one was K&L Station?

[00:32:40.682] Kent Bye: Yeah, yeah.

[00:32:42.442] Cory Strassburger: So I think, and correct me if I'm wrong, those are somewhat more in alignment with what you're saying, right? In K&L Station, you're kind of free to walk around, but you're very kind of constrained. So it's almost like being on rails, but you can sort of trigger the timing of getting on the train and you can kind of walk around. And then in The Cave, there's not a lot of places you can walk around, but, you know, you do trigger things. I mean, I think, at this stage, the things we've been coming up with happen to fall into the category of, you know, more of a rail-based thing, more of a limited experience-based thing, partly because of time and just, okay, we're just knocking out a demo, so this is the extent of it. And partly, like with Senza Peso, really the only way you could create that experience, you know, based on the movie, was to put you in the boat, which ultimately is being controlled by, you know, the oarsman, at least in the actual movie, and then kind of be on rails. So, you know, I think we definitely are heavily into interactivity and the idea of interaction. I know for me, a big missing component right now is, like, you know, the AI-based stuff. Like, just being able to run around in a world doesn't thrill me personally that much. And interacting with characters, you know, through the keyboard and things, it just, on a cinematic level, I don't get as excited as, like, if I could actually have a conversation with somebody. Like, if I had, like, Siri-like abilities, and I could, you know, get feedback, and then the story evolves with that level of communication between the characters, then interactivity becomes a hugely powerful thing from my creative standpoint, you know? Otherwise we're sort of just, you know, kind of exploring: is it boring to not be able to have control, because you actually can?
And I think in some cases you can enjoy a VR experience just by looking around. I mean, I love Pirates of the Caribbean as much as anybody, you know, and it's the same ride every time. But, you know, you go back to it every once in a while, and it's just really fun to sit there and look at the crazy things with the crazy music. Like, I think there will always be something to that style. But yeah, I mean, dude, we definitely want to mess with every little bit of VR there is, every, you know, all the different kinds of interaction, as long as it's sort of the right thing for really creating an impactful, sort of mind-blowing, emotional experience. At least as much as we can with each project, you know?

[00:35:16.476] Kent Bye: Totally. And Ikrima, you've done quite a bit of heavy-duty graphics research, and I'm curious about what type of innovations or insights you're bringing into virtual reality and the type of work that you're doing at Kite & Lightning.

[00:35:31.899] Ikrima Elhassan: We were doing a lot of performance capture stuff, kind of something similar to what Cory was mentioning earlier. For us, it was really exciting. You know, a lot of the techniques have kind of been maturing in from the VFX world of, you know, being able to make digital doubles and clones of people. You know, we always say, like, we want in-house the technology that they used to make The Curious Case of Benjamin Button, so that we can kind of bring a human element to all the experiences that we're creating in a way that kind of doesn't require an eight or nine figure budget to make. So after five months, we re-evaluated and realized that, again, our internal bar for what we'd really like to do is so up there and high, and it's gonna be awesome and we're going to get there, but right now it kind of makes a lot more sense to tackle the low-hanging fruit, kind of experiment creatively, like what works in the VR medium, flesh out kind of our pipeline and figure out our basics, our basic building blocks, before we kind of get into the really, really exciting stuff, you know, about doing really heavy-duty, you know, real-time GI, and doing performance capture and material capture and all sorts of stuff that's just, like, super exciting and, you know, all the AAA studios are doing. But, you know, it's just not the best use of our time at the moment. And I'll also say, one of the really big benefits of moving to Unreal is that my list of things that I needed to implement in Unity went from 100 pages long to two. So it's kind of been a godsend on my end for that purpose.

[00:37:17.792] Kent Bye: I'm curious if each of you could go through and talk a bit about some of the insights that you got from working with the medium of virtual reality in terms of what works, what doesn't work with the projects that you've done so far.

[00:37:29.515] Cory Strassburger: Some of the insights. I mean, we've kind of explored a lot of things, and, you know, just like anybody, it's just the tip of the iceberg. So there's this underlying feeling, at least that I have, that I'm not really putting too much weight towards any one thing that's sort of, like, you know, working or not working, because I feel like there's so many other missing pieces that something that might not be working today, you kind of, like, find this piece that comes out a year from now, and all of a sudden that earlier thing actually does work in some way. So I've been trying not to really decide one way or the other. You know, we just kind of know what's working today with what we've been doing, and we kind of know some of the things that haven't been working. A lot of it has to do with motion. You know, as you know, in some of the earlier stuff we did, we discovered, just like everybody, that turning and all these things are really not good. We played around with, like, doing cuts and things like that and kind of felt like they didn't work, but recently I saw something where cuts were sort of working in VR. So it's kind of all up in the air, and I think there's a lot of things that play into whether something would work or not. Relative to other mediums, I think that from a story standpoint, I mean, it couldn't be more exciting thinking about VR as a medium, because you gain so much just by being present somewhere, right? Like, we've been skating by on some of these things, like in-house projects that people have seen where the quality is, like, fairly low for our standards, but the second you see it in VR, suddenly it's, like, it's amazing. Like, you watch it in 2D on the screen and it looks terrible, but when you're present in a world, you're gaining this whole other feeling, like you're tapping into this feeling that you never get from any other medium other than, like, reality, you know?
So then you start to build on that and, you know, I feel like we're just really just like not even in a place to have much of a knowledge of it because it's all very short form stuff. It's all very, we're still excited about being, you know, immersed in something. So it's like, when you cut to five years from now, everyone's been in so many VR things, and what's exciting you then? What are the things that are actually holding up through the test of time? I don't know. To me, I think it remains to be seen, and I'm super happy to be a part of the group of people trying to figure it out, because it's super fun. Even failures are kind of fun, because it's really not that bad, in a way. And like I said, some of those things, I think, could be made to work in actually a cool way.

[00:40:17.915] John Dewar: Yeah, I mean, I think there's kind of two parts to it. You know, if we're kind of banking on the idea, which I think we are, that VR is kind of the next big entertainment medium, right, and it's going to absorb gaming, it's also going to absorb film. And we're looking at it from a cinematic perspective. There's almost, like, these two separate areas of concern, right? There's the VR-specific area of concern, which is: how do you do things like walk around, without a person needing to physically walk around in their room, and without them getting sick? Which to me, so far, has been really hard. Like, every experience I've done where I have to walk around, it just hasn't worked for me. I get sick pretty quick. So that's, you know, a huge challenge. Like, the on-rails or cockpit games solve that problem, but we're going to need to find a solution for that eventually. The other half of this problem is the problem that gaming has been struggling with for a while still, and that's telling a story from a first-person perspective. We've arrived at this solution that almost every first-person game is using. There's an opportunity, I think, for us to do that in VR. What I'm talking about is the idea where you walk around, you collect things, you learn from the environment the story that's going on. People talk at you a lot, but you're not really interacting with them. There's an opportunity where we have all these people who might get interested in this medium who weren't previously interested in video games, because it's virtual reality. I think that's a huge thing that I've noticed: my girlfriend, my mother, my grandparents, everybody. Some people who came to SVVR who hadn't really experienced VR before, people who are interested, who suddenly see why they would want to do this, who'd never had any interest in playing a first-person shooter before.
We've opened up this form of storytelling to them, and we could probably reuse all of these things that we've learned from these first-person shooters that have been developed over the past 20 years, but I'm still not satisfied with that as a storytelling form. It still feels very stilted, if you know what I mean. It's not natural. It doesn't replace watching a movie for understanding what you as a character are going through. It's very hard to do when you're inhabiting somebody. And in virtual reality, I think that is where the power is: that you're inhabiting a space. And you feel that you're physically present in the space, so that you feel there's some stakes involved, and that you're physically there means that perhaps you might be in some danger from the environment around you. I think that's a lot of what gives VR its power. So I don't know if the solution of just putting people in a room and kind of watching a play in VR is going to be ultimately what gets us beyond movies as a medium. So, I mean, I have no answers yet, but I think it's very fascinating to think about, you know, what can we do to move first-person storytelling forward and make it more interesting.

[00:43:38.393] Ikrima Elhassan: I guess I'll answer it with kind of a meta answer, like an insight about insights. And I think the biggest realization I've had and have seen doing this over the last year or so is that everything is so brand new. Your intuition is kind of really not a good guide yet at this stage. And so to me, what that means is that the most important thing is being able to cut the distance between being able to ideate something and then being able to prototype it. So internally, we've been building everything and our pipeline so that we can quickly iterate and quickly prototype things. Someone will throw out an idea of, like, oh, is this going to work? And people will discuss back and forth what they think about it, and then we'll just try it out. And being able to just test things really quickly and make very quick prototypes of our ideas and see what works and doesn't work, if it feels good or it doesn't, if it hurts or it doesn't, I think is gonna be the fastest way to build that intuition muscle about the VR medium.

[00:44:52.722] Kent Bye: Awesome. And the final question that I'd love to ask everybody, where do you, each of you, see the ultimate potential for virtual reality and what it can provide?

[00:45:02.397] Cory Strassburger: It can definitely provide, like, endless amounts of distraction from reality. Man, to me, I think it really spans across practically every single industry on the planet in some form or another. Socially, I mean, from just a humanity standpoint, I think it's going to change everything on a fundamental level, you know, like on the path to becoming digital, and in the Kurzweil sense of, you know, merging with the machine, which seems a lot closer to reality than it did, you know, five years ago to me. VR is sort of that bridge. It's kind of an out-there answer, I guess. But, you know, for me, it's like when you have the ability to kind of create worlds and create stories that you're in and a part of, and you think about how the quality of this whole format's going to evolve with higher res and wider field of view, it's just going to start blurring the line between reality and these worlds. And I think these worlds are going to be fantastical enough that it's a very real concern and benefit to spend time in them, you know, lots of time, and meet other people in them. Regardless of what industry you're in and why you're in VR, I think it's going to be everywhere, like, on a massive scale. I think that's sort of what we all, to one degree or another, are kind of feeling the potential of. And so at least for me, it's like trying to milk it every step of the way, because I know this early stage isn't going to be here for very long, and there's, like, a lot of cool things at this early stage in terms of trying to define, or play a part in defining, what the language is, which is always going to evolve, and just sort of get every little bit of it and have as much fun as we can in the early stages. Because then it's going to just infiltrate everybody, just kind of like the iPhone did, but on such a bigger scale. And what role it plays, it will be fun to see.

[00:47:10.653] John Dewar: Well, it's kind of hard to beat him on everything. It's going to affect everything. Sorry about that, John. I don't know. How can I plus this more? I kind of have this weird feeling. It's not just VR, but we're kind of in this new era. What's going to happen? I just went to see How to Train Your Dragon 2. And of course, you leave the theater, you really want a pet dragon really badly. What's going to happen when Kurzweil succeeds at his goal here of making an artificially intelligent computer? You know, suddenly that would be possible, right? You could have an artificially intelligent pet in VR that learns and does everything a real animal would. You could interact with it. And then what if that extends? What if you end up with people having children in VR? These people that exist only on Google servers, but are otherwise indistinguishable from real people. It could be a very weird world in which you could have a kind of a sub-race of humanity that only exists in virtual reality. So anyway, sometimes I lay awake at night and think about that kind of thing.

[00:48:28.798] Kent Bye: Yeah, kind of like the movie Her in a lot of ways, exploring those ideas of what it means to have artificial intelligence that evolves and grows and becomes its own entity in a lot of ways. Right. What about you, Ikrima?

[00:48:43.205] Ikrima Elhassan: I think for me, I mean, like everyone said, it's going to have some far-reaching implications. VR is kind of like a hovercraft technology, right? I forget who said it, but all start-ups, all technologies, can be bifurcated into two categories: they're like hovercrafts, or they're like toasters, right? Toasters, you see it, you know, the first time someone brought a toaster to, like, the consumers, they're like, why do I need this? This is such a useless redundancy of my oven, right? And then, you know, they don't get it. They can't really foresee its use case. Whereas, like, hovercraft, it's like, yes, I want this. It's going to be awesome. Everybody's going to use it. Does it work? You know? And, you know, I'm sure at one point we're all going to be in the Matrix, having babies in the Matrix, and, you know, Google and Facebook are going to run our lives. And that's where we're going to end up. But I think in the near term, the thing that I would want to see the most come out of VR is, I want Inception. I want to be able to go into this world that I create, and I want it to be very frictionless for me to create whatever I want to create, and then I want to get to play in it. It's like my own ultimate sandbox. It's like Legos on steroids. Then I can put my favorite characters, movie people, my friends. I could put dragons in there if I want. Just everything. I want my unlimited creative potential to be realized.

[00:50:19.197] Cory Strassburger: You better watch out what you wish for, buddy.

[00:50:24.501] Ikrima Elhassan: Totally. Realizing we are contributing to making this a reality, what I always joke with people, whenever we put first-timers through the Rift and show them Senza Peso, you know, they walk away being like, oh my God, this is amazing. Did you guys make this? This is so cool. And we're like, yep, totally made it. And I just want you to relish this moment, because

[00:50:47.629] Kent Bye: This is the moment where you are experiencing the downfall of humanity starting. I definitely see that there's the unlimited untapped potential and then obviously the shadow side. And I don't think it's going to be a clear shot either way. I think we're going to be living with both of those, the great potential and the negative exploitation of this technology for sure. I think it's the intention that we put into it as well. What do we want people to experience? What do we want people to do? Do we want people to grow and people to be more of themselves? Or do we just want to distract them with getting their money and taking up their time? So I don't know, I feel like I can see it going both ways. But yeah, I don't know if I want to end on that type of note.

[00:51:44.064] Ikrima Elhassan: All of this is tongue-in-cheek. Really, you know, I don't worry about that stuff at all in reality, because if you go back to when Doom was first being created and launched, and you see the reactions that people were having, it is extremely comical that people were calling Doom, which had pixels the size of bathroom tiles, too real, and an effective murder simulator that's going to ruin our children. And you look back at Doom, and you're like, how did anyone play this for more than 30 seconds without wanting to gouge their eyes out? So I think we're going to be fine. And it's always funny to see people's overreactions about how the youth is going to be totally ruined by these new technologies.

[00:52:34.637] Kent Bye: Yeah. Well, I'm curious if you guys have anything you want to say in terms of what we're going to see next from Kite and Lightning or some of the big problems or things that you guys are going to be looking at moving forward.

[00:52:46.165] Ikrima Elhassan: Sure, sure. I think there's a lot of really exciting stuff in the works that people would really enjoy. We can't really talk publicly about any of it at this moment, but we are gearing up to create and launch our own kind of game that we're going to put out there. So I think the best way for people to keep in touch, and the way we communicate with our audience and people who like the stuff that we're doing, is through our website. If you sign up to our newsletter, that's where we send out early betas, you know, get feedback, have that dialogue, you know, party invites for every once in a blue moon when we have them here in L.A. And our website is kiteandlightning.la.

[00:53:32.512] Kent Bye: Great, well, thank you guys so much for joining me today. And yeah, I look forward to all the new VR experiences that you guys have in store with us. I'm sure it'll be quite a treat.

[00:53:43.538] Cory Strassburger: Awesome, Kent. Dude, thanks for having us, man. We appreciate it. It was a lot of fun, good conversation.

[00:53:48.361] John Dewar: Awesome. Yeah, thanks, Kent. Yeah, thank you.
