When I was at SIGGRAPH this year, one of the hottest topics was figuring out workflows for dealing with the complexity of capturing, processing, and distributing volumetric digital light fields. Otoy's Jules Urbach had a lot of insights, and a number of high profile VR agencies like Framestore VR appear to be quietly working on their own processes for how to best capture and display live action volumetric footage.
I had a chance to talk with a couple of people from Framestore VR about this: Johannes Saam, a senior software developer, and Michael Ralla, a compositing supervisor. They weren't ready to provide a lot of specific details just yet, beyond the fact that it's a hot topic and that they're possibly working on their own workflow solutions with some insights gathered from deep image compositing.
LISTEN TO THE VOICES OF VR PODCAST
Here's some recent footage from the Lytro Immerge camera that shows how they're compositing 6DoF volume capture inside Nuke:
https://vimeo.com/179944822
Here's the final VFX build of "Moon" by Lytro, which blends the live action 6DoF footage shot with the Lytro Immerge on top of the composited environment.
https://vimeo.com/179833357
It's unclear whether or not the Lytro light field camera will only be usable within a green screen environment. A previous marketing video carries the disclaimer that "conceptual renderings and simulations are used in the video," so it's unclear whether or not this camera can actually capture this type of live action footage with objects in the near field while still accurately rendering the background parallax for any occluded portions:
https://vimeo.com/144034085
What is known is that there are still a lot of open problems with digital light field capture and workflows, and that Framestore VR is one of the production studios actively investigating them.
Johannes and Michael also talked about some of the high-profile ad campaigns that Framestore VR has been a part of, including one for the BMW M2 that plays like a race car shell game, challenging you to keep your eye on the right car as a 360 camera races down a runway. It's received over 5 million views on YouTube, and it's a great introductory experience for people new to VR, helping train them to look around.
https://www.youtube.com/watch?v=OwDtPixqJLc
They also worked on Lumen, an interactive meditative application created with Dr. Walter Greenleaf, which uses procedurally generated content to let you grow trees, harvest blossoms to plant new trees, and grow a forest around you. It is part of the TIME Life VR initiative that launched today.
Framestore VR also created a Field Trip to Mars as part of a STEM initiative from Lockheed Martin, replacing all of the windows of a school bus with transparent LCD screens. They created a Mars environment within Unreal Engine and then matched the real-life bus movements with virtual Mars rover movements to create a collective virtual reality experience for a busload of school kids.
They also produced the Game of Thrones Ascend the Wall VR experience that premiered at SXSW 2014, which was one of the first high-profile advertising campaigns using virtual reality.
Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to the Voices of VR Podcast. So when I was at SIGGRAPH this year, there were a whole lot of virtual reality technologies, both in the VR Village as well as on the expo floor. And one of the trends coming up was digital light fields and how people are going to be integrating digital light fields within their workflows in the future. This seemed to be a big open question that I had a chance to talk to a number of people about. So in this interview today, I'm going to be talking to a couple of technical contacts at Framestore, which has done a number of high profile VR experiences for Marvel Comics. They did a Game of Thrones experience back in 2014 at South by Southwest, and they have a couple of other experiences with BMW and Time Life that's actually launching on Tuesday, September 20th. So I'll be talking to Johannes Saam as well as Michael Ralla about digital light fields and some of the projects that they've been working on at Framestore on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. This is a paid sponsored ad by the Intel Core i7 processor. You might be asking, what's the CPU have to do with VR? Well, it processes all the game logic and multiplayer data, physics simulation and spatialized audio. It also calculates the positional tracking, which is only going to increase as more and more objects are tracked. It also runs all of your other PC apps that you may be running when you're within a virtualized desktop environment. And there's probably a lot of other things that it'll do in VR that we don't even know about yet. So Intel asked me to share my process, which is that I decided to future-proof my PC by selecting the Intel Core i7 processor. So this interview with Johannes and Michael happened in Anaheim, California during the SIGGRAPH conference, which ran from July 24th to 28th. So with that, let's go ahead and dive right in.
[00:02:04.792] Johannes Saam: Hi, I'm Johannes Saam. I work for Framestore as a software developer, and I'm part of a team that operates out of London and New York for VR, AR, and any sort of real-time development. And also, as I'm based in Los Angeles, we are part of the integrated advertising group of Framestore, and we're there to provide any sort of creative solution for whatever your interactive or offline problem could be in the media world.
[00:02:33.863] Michael Ralla: Hi, my name is Michael Ralla. I am a compositing supervisor at Framestore in Los Angeles. We primarily work in commercials, because it's the integrated advertising division that we're both working for, but we basically do a whole bunch of different things: not only commercials, but also an increasing amount of rides and 360 video experiences, as well as true virtual reality projects.
[00:03:01.811] Kent Bye: And you said that you're one of the compositing supervisors, which sort of implies that you're doing a lot of special effects, so maybe you could talk a bit about translating those special effects from what was traditionally 2D into these 3D immersive environments.
[00:03:18.042] Johannes Saam: I mean, to translate feature film techniques into the VR realm, first of all, it's what everybody probably tells you: it's all storytelling of some sort, which is true. But there's also more detail, I think, if you understand how pictures are composited or composed together, or what makes a good experience, and that all translates over. It's always evolving and always continuing. What happens technically, I think, is that from a 90s effects-loaded movie to now a VR experience, we can cheat less and less, so you have fewer and fewer possibilities for classic smoke-and-mirror visual effects where you hide things or diffuse the situation. It started with stereo movies, where suddenly you cannot cheat things in depth anymore and the distance from an object to the camera has to be accurately represented. And now we have VR, where not only do you have to be correct, but you also cannot predict where somebody's looking. So it has to be more and more real. In a sense, it has to be based in reality and move away from the Méliès smoke-and-mirror experience, which is exciting and challenging. With more computing power and more brain power getting put into this problem, it's just the way it naturally progresses. And yeah, the holodeck is the next step, hopefully, somehow, someday.
[00:04:38.782] Kent Bye: So are you guys looking into digital light fields, being able to shoot with digital light fields, and actually creating the pipeline to be able to do that?
[00:04:48.068] Michael Ralla: That's one of the hardest topics in, do we want to call it the VR industry? Yes. We are definitely, and we're doing that on different levels. I mean, light field, I don't want to call it light field compositing, but light field workflows are possibly going to change everything about how we deal with images. Is a light field still an image? It's basically a 3D point cloud representation of an existing environment. I'm pretty sure this is echoed by most people: light fields are going to get us a lot closer to actually recreating realities virtually to a photoreal level, which is currently not too often the case. You either have the very photoreal-looking 360 video experiences, or you have the real-time stuff that usually still has at least a little bit of a gamey look. But yeah, of course we are. Capturing light fields is one thing, and then working with light fields, it's an absolutely interesting field, definitely.
[00:05:50.665] Johannes Saam: I would like to take this opportunity to tell you a little bit more about how we currently work at Framestore internally. We have a very interesting dynamic where we have a set of interested, very creative, and driven people who either get the opportunity to experiment at work or do that at home, and then work back and forth. For example, we've written a VR demo, which we call Coral, which is an interactive fractal explorer. It was only done to explore whether we could do it and how it would look, and we got the time, you know, the two weeks, to explore this. Now, with offshoots based off of that, we use the technology and the experience that we gained from it to influence all our decisions with clients. And it's the same with light fields. I can't really talk a lot about it, but we have a very good collaboration with the science world, and we are driven by our executive creative director, who is very interested in the whole subject. We are actually trying to, you know, come up with our own solutions for that, but it's very early stages. It's one of those topics where we all know this is interesting, and we try to experiment as much as we can and are allowed to, to come up with new technologies to educate ourselves first, and then take that experience and put it into the creative work with clients.
[00:07:08.207] Kent Bye: Yeah, to me it seems like the biggest open question with digital light fields is how to output them and give them to people to experience, because it's essentially a volumetric experience that you could potentially walk around in at room scale, whereas most of the stuff that you've been working on has been 2D. So is that going to be in a game engine like Unreal Engine or Unity, or some completely new kind of output format for exploring these digital light field worlds?
[00:07:33.458] Johannes Saam: I was majorly involved with a little technology called deep image compositing, which me and a group of other people came up with. It's a similar concept, where you have an image, but it has more to it than just color: there's also depth information, and multiple depth samples, so you can recreate the image with the highest fidelity possible. So the next extrapolation from there is obviously to completely detach yourself from the camera point of view and have a point cloud-ish representation. And the hardest part is to manage the amount of data, and to manage it smartly, and, again, to not make it gimmicky, to use it for what it's good at. That's the same with any kind of new technology, I think. It's always exciting, and it has its value for the problems you're trying to solve. Just don't try to come up with the one-button solution for everything. There will probably be multiple different approaches that will all coexist until the next big technology leap comes out, and then we do the whole thing over again.
[00:08:32.548] Kent Bye: Well, it seems like with light fields, it's sort of demanding an entirely new workflow for special effects as well. I imagine that you might be able to take a point cloud of a character, and it might actually be easier to cut out the background if you have that depth information, to then put whatever you want in the background. But if you're actually doing special effects on the characters, it seems like there would need to be some sort of new workflows for that.
[00:08:56.844] Johannes Saam: Absolutely, and again, it started with the deep image idea. If you think of it from a computing standpoint, not from a filmmaking camera standpoint, we always have the problem that we have a beautiful animated 3D scene. If you take a character as an example: you have the front and the back of the character, the whole thing is there, the whole thing is lit, the whole thing is shaded, and you make a flat image out of it and pass it down the pipeline to the next guy, who then gets a flat image in compositing. All this precious 3D data has been discarded, and now he has to paint and do stuff to make it still work. So deep image was the first driver: this whole filtering paradigm has been shifted down the pipeline, and you keep somewhat of a 3D representation to salvage that data, to, you know, use different workflows for depth of field, for example, or for matting, all this kind of stuff. Light fields are, again, the next level up. First of all, you try to capture a similar data set live with a camera, which is obviously the next step to composite images properly, and also to keep more information, because now suddenly you don't just have color, you also have the entire light information from a sampled direction, so you can go even further. So it's an ongoing process, and it's very challenging, but thanks to computers getting faster and memory getting more available to us, it keeps progressing. It's always the cat and mouse game; one chases the other. As soon as we have enough render power to render the current technology, the next piece of technology comes out, which makes our render farm way too small.
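To make the deep image idea a little more concrete, here is a minimal sketch of how a single deep pixel, a list of depth-sorted color samples rather than one flattened value, can be collapsed into a flat pixel with the standard front-to-back "over" operator. This only illustrates the general technique (as in OpenEXR 2.0 deep data), not Framestore's actual pipeline; the sample layout and function name are made up for the example.

```python
# Minimal sketch of deep-pixel flattening, as used in deep image
# compositing. Illustrative only, not a production pipeline.
# Each sample is (depth, r, g, b, alpha) with premultiplied color.

def flatten_deep_pixel(samples):
    """Collapse depth-sorted samples into one flat RGBA pixel
    using the front-to-back 'over' operator."""
    out_r = out_g = out_b = out_a = 0.0
    for depth, r, g, b, a in sorted(samples, key=lambda s: s[0]):
        # Nearer samples have already occluded (1 - out_a) of the light.
        w = 1.0 - out_a
        out_r += r * w
        out_g += g * w
        out_b += b * w
        out_a += a * w
    return out_r, out_g, out_b, out_a

# A pixel where a half-transparent red object at depth 2.0 sits in
# front of an opaque blue one at depth 5.0 (hypothetical values):
pixel = [(5.0, 0.0, 0.0, 1.0, 1.0), (2.0, 0.5, 0.0, 0.0, 0.5)]
print(flatten_deep_pixel(pixel))  # red over blue: (0.5, 0.0, 0.5, 1.0)
```

Because the per-sample depth survives until this final flattening step, a compositor can merge new elements at the correct depth first, which is the "precious 3D data" Johannes is talking about salvaging.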
[00:10:25.272] Kent Bye: So maybe you could tell me a bit about some of the big projects that you've been working on lately.
[00:10:30.115] Michael Ralla: So most recently, we completed what's basically an entire campaign for BMW's new M2. The campaign features Gigi Hadid as the long-legged supermodel getting into one of the cars. There are five identical cars and you have to follow the one she's in, so it's basically an adrenaline-fueled adaptation of the well-known shell game. That was the most recent piece. It's a pre-rendered 360 video experience that was designed specifically for a YouTube 360 release, but we just released an app today, which Johannes in fact wrote for the whole piece, which transports this entire experience into a different reality.
[00:11:18.028] Johannes Saam: I just wanted to add that the app part of the whole campaign, which was written partly by me in Los Angeles, was basically a wrapper around the video that Michael was supervising. And it features the same component, which is a big 360 video. But the reason why we made our own app is because we wanted to give them, them being the client, BMW, the highest quality experience, so we can load an actual 4.5K video with adaptive audio and real-time lens flares and all this kind of niceness in our own app, in an ecosystem that is not YouTube.
[00:11:53.788] Michael Ralla: And that's actually an interesting aspect, because making that specific app for that 360 commercial was born out of the necessity to stay in control of the release quality. We found that, especially with YouTube releases, you're highly dependent on internet connections and also, obviously, the actual processing power of the device that you're using to look at the experience. On top of that, there's just added sweetening and certain things that we were missing in everything that's currently out there, as Johannes just said. You know, the ability to always deliver 4K playback, to me that's a really big thing. And then the other feature that I really like, being a 2D compositing person, is the fact that there are real lens flares accurately getting rendered in real time while you're looking at the video, which is something that I've never seen before, and that was something that we've all been very keen on implementing for quite some time.
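As a rough idea of how a real-time flare like this can respond to your gaze in a 360 video, the core calculation can be as small as fading the flare by the angle between the viewing direction and the light source's direction on the sphere. A minimal sketch follows, with made-up tuning values that are not drawn from Framestore's actual app.

```python
import math

def flare_intensity(view_dir, light_dir, tightness=8.0):
    """Fade a lens flare by how directly the viewer looks at the light.

    view_dir, light_dir: unit 3D vectors (x, y, z).
    tightness: hypothetical tuning exponent; higher = narrower flare.
    """
    d = sum(v * l for v, l in zip(view_dir, light_dir))
    return max(0.0, d) ** tightness  # 1.0 looking straight at it, 0 behind you

# Looking 30 degrees off the light: intensity drops sharply (~0.32).
a = math.radians(30)
print(flare_intensity((math.sin(a), 0.0, math.cos(a)), (0.0, 0.0, 1.0)))
```

In practice the flare elements would also be drawn along the line from the light toward the view center, but a falloff term like the one above is the piece that makes the flare respond to where you're looking.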
[00:12:55.200] Johannes Saam: What excited us as a team to write the app, I think, is that VR can be very gimmicky, and this campaign has been designed with the whole VR 360 concept at its core, and it only works there. As Michael described the outline of the piece, somebody gets into a car and then you have to follow that car in 360, spinning around trying to keep track of it, and that only works in 360 video. So it's a very purposeful piece for the campaign.
[00:13:23.653] Michael Ralla: A lot of 360 experiences out there, as Johannes just said, I don't want to say most, are not only low quality but also relatively gimmicky. You're standing in some kind of environment or a room and there's stuff happening around you, and very often people are trying to motivate you to turn around and look into a different area or section with sound. To me, that seems to be the number one motivator to get people to actually use the 360 space. Whereas in this experience, just the fact that Gigi is getting into one of the five identical BMWs, and then it's up to you to keep your eyes on the one that you think she's in, that's the biggest motivator that I've seen so far across the pre-rendered 360 landscape.
[00:14:17.905] Kent Bye: So if we go back to South by Southwest in 2014, Framestore had The Wall, which I think is probably one of the first big advertising and marketing campaigns to do a fully immersive experience with the Oculus Rift, and it got a lot of attention at the time. So I'm just curious if you guys could talk about, you know, starting with that experience, and then some of the big experiences that you've done since then.
[00:14:41.304] Johannes Saam: See, I joined Framestore roughly a year ago, so I cannot talk much about this, but as you say, it was one of the first projects. The most recent attention that Framestore got was the Mars bus experience, which got a lot of attention at Cannes this year and was done out of our New York office. It's a piece where kids go into a school bus and suddenly all the windows, which are actually screens, transform the bus into a Mars rover experience. It's a similar concept; I guess it's purposeful VR again. You're educating people by putting them in the right environment and exciting them, not just letting them experience something they could get any other way. Maybe gimmicky is too strong a word, but usually if I get a VR experience of something I could actually attend, then that's very nice as a documentary piece; but for an ad or for an experience, I love pieces that I could not experience any other way. It's very hard for me to be in the middle of a five-car stunt drive adrenaline madness, or on Mars. Those are two scenarios I have a hard time transporting myself into, and that's why I think Framestore's VR experiences add to this part of the entire 360 VR realm.
[00:15:56.085] Kent Bye: So what's next for Framestore then?
[00:15:58.352] Johannes Saam: We are collaborating right now, Framestore's VR studio in New York and LA, with Time Magazine on an interactive piece for their VR app that is launching later this year. We have just announced this collaboration. I cannot really go into more detail about what it's going to be like, but if you stay tuned to the Time Life VR app over the next couple of months, you will definitely see our work come up there.
[00:16:27.353] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:16:36.749] Johannes Saam: I like augmented reality in the sense that I like reality itself a lot already. I don't need a lot more; if it's virtual, then it should be completely abstract and completely different. As I said before, I don't want to do the pool bar VR experience and transport myself into a seedy bar to play pool, when I could just go to Silver Lake and play pool in a seedy bar. But stuff that is more abstract, that is not experienceable, if that's a word, any other way, is what I think is the big potential. Or augmented reality, where you take reality and make it better or help people out with this kind of technology.
[00:17:14.259] Michael Ralla: I think that's a key point. As cheesy as it may sound, it's about making the world a little bit better, or just making the world a better place. One experience that I've had: I was taking one of the headsets home, and my parents live in Germany, and I showed them some 360 still images, taken on a cheap Theta camera, in the headset. They were actually able to look around and see where I work, and they could get an idea of what it is like to be there. As simple as that was, it was really cool and almost rewarding to see them get that kind of glimpse into my daily life by just strapping on that headset. So I think there's a lot more potential down the line, and I'm really curious to see what else is going to happen and what ideas people come up with.
[00:18:12.703] Kent Bye: Awesome. Well, thank you so much. Thank you. Thank you so much. So that was Johannes Saam, a senior software developer at Framestore, as well as Michael Ralla, a compositing supervisor at Framestore. So I have a number of takeaways from this interview. There are two big areas I want to talk about: first, digital light fields and the workflow, and then a little bit about the actual content that they've been producing. So let's dive into digital light fields here for a second. In talking to Otoy founder Jules Urbach, the big challenges with digital light fields seem to fall into three major areas. One is capturing the light fields. The second is processing those light fields into some sort of ray-traced environment that you can actually see within an immersive experience. And finally, how to compress and deliver that content efficiently. These three areas all point to the fact that digital light fields involve a whole lot of data, and it's going to be a challenging thing to figure out. But a couple of things have come out since I did this interview. The big one is that Lytro showed some of the initial footage from their Immerge digital light field camera. One of the things I was really curious about is how they are actually going to integrate this light field camera into the production workflow of the visual effects industry. And what I noticed is that they took a very similar approach to what Jules was recommending to them: instead of having a single digital light field camera capture an entire scene, they're just focusing it on the subjects. In the demo footage they showed of the Lytro Immerge, they're essentially just showing things that are in the near field, the things that are actually going to have stereoscopic effects. Part of the reason is that if you only have a single camera and you're trying to comprehensively capture an entire scene, then whoever you are shooting is going to be blocking the background in some way, and so you're going to have all these occluded parts in your footage. So what Jules was recommending is to do some sort of photogrammetry capture for a static scene; if there's not a lot of other action happening within that environment, you could do something like that. But if you have something that's moving around and is dynamic, then you may actually want to do more of a 360 video that you then composite over. Alex Chu gave a great lecture back in November of 2014, when he was still working at Samsung, where he did an experiment trying to figure out stereoscopic effects. What he found was that there are essentially three zones of stereoscopic effect. The first zone is that anything from 0 to 10 meters away is going to have a strong 3D effect. Anything from 10 to 20 meters is going to have a little bit of 3D, but not much. And anything 20 meters and beyond, that's about 65 feet away, is going to have essentially no discernible 3D effect. In other words, you don't really need stereoscopic cameras to capture the background if it's 20 meters or more away, and even something 10 meters away, just over 32 feet, is still within the realm where it may make more sense to do a static, even monoscopic, 360 video capture, and then composite some of this digital light field footage on top of it.
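Those three zones fall out of simple geometry: the angular disparity between your two eyes for a point at distance d is roughly 2 * atan(IPD / (2 * d)), which shrinks toward nothing past about 20 meters. Here's a quick sanity check, assuming a typical interpupillary distance of about 64 mm (my numbers for illustration, not Alex Chu's):

```python
import math

IPD = 0.064  # typical interpupillary distance in meters (assumed)

def disparity_deg(distance_m):
    """Angular difference between the two eyes' views of a point."""
    return math.degrees(2 * math.atan(IPD / (2 * distance_m)))

for d in (1, 5, 10, 20, 50):
    print(f"{d:>3} m: {disparity_deg(d):.3f} degrees")
# ~3.7 deg at 1 m, ~0.18 deg at 20 m -- below what most viewers
# notice, which is why a mono background plate can work out there.
```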
Anything that's from like 10 to 20 meters is going to have something that's a little bit like some 3d But not much and then anything that's like 20 meters and beyond that's like 65 feet away Then that's gonna essentially have no discernible 3d effects so in other words You don't really need to have stereoscopic cameras to be able to capture the background if it's like either 20 meters or that's 65 feet away and even if something is like 10 meters away that's like just over 32 feet and then that's still within the realm where it may actually make more sense to just do some sort of more static, even monoscopic type of 360 video capture that you're then compositing on top of that some of this digital light field footage. So I expect to see in the future when people are using these digital light field cameras, they're not going to be using them to capture the entire scene just because it's not going to be feasible. It's not going to be able to, with a single camera, be able to comprehensively capture everything. perhaps if they had an array of cameras capturing an entire scene then you're just dealing with a whole lot of other data and trying to combine these different perspectives and from the footage that they showed initially they just kind of showed these astronauts walking on the moon and they basically were showing this example of a faked moon landing and They had the protagonist astronaut who was there and essentially everything that was in the scene was green screened in except for the talent and they had a whole model of a spaceship and they were just using these special effects in order to really do a lot of this digital light field footage. So I expect to see a lot of this blending of the special effects industry and digital light fields and compositing. I think this is actually going to be a big part of even making these digital light fields possible. So those are some of the big takeaways that I had. I just wanted to kind of synthesize that a little bit. just because I think that Framestore seems to be working on their own kind of solution. They could very well be collaborating with different companies like Otoy, who they're creating some plugins for Nuke. They're trying to kind of create this standardized solution for all sorts of different processing and compression of the digital light fields. So on to the content of Framestore. I wanted to talk a little bit about the BMW M2 experience that Framestore released because it's actually a really fun experience. It's like a short minute and 50 seconds or so and it's essentially just a tracking shot that's going from the beginning into a finish line and straight down this runway and they've got these five cars that are all look the same. They're all the BMW M2s and they're kind of weaving in and out of each other and the goal is to watch this model who is wearing a red dress and she gets into a car and your goal is to basically keep your eye on that car as you go through this experience. And so it actually does a really great job of forcing you to really move around in a 360 degrees. I think at the very beginning in these 360 degree videos, people have to be a little bit trained into how to actually watch them and to learn to move their head around. And this is a great and fun example for that. For anybody that's not really seen a lot of different VR before, this might be a great experience for them to see. 
Some people may get motion sick from this, but it's a pretty consistent velocity, and I tend to be pretty sensitive and it was okay for me. The other thing that I wanted to talk about a little bit is the fact that they also had a companion app available for download, so I went ahead and downloaded that and checked it out. My thought is that I'm hesitant to move towards a world where you have to download an application and fill up 495 megabytes worth of space on your phone in order to see one of these experiences. I think right now they're really going for quality, and by having a download they can control both the frame rate and the consistency of the experience. YouTube does have the ability to stream 4K video, and I tried it on my phone in Google Cardboard, and it started to buffer a couple of times. I don't have particularly speedy, lightning-fast internet, but that's the common use case for a lot of people: if they want to watch one of these experiences, then they're either going to have to preload it to make sure there are no buffering interruptions, or they're going to have to roll the dice and hope it doesn't get interrupted in the middle of the experience. And whenever it starts to buffer in a VR experience, I think it is a lot more disruptive than when watching a video on a 2D screen. It can really take you out of the experience. And especially with the motion in this experience, it can actually be a little bit jarring and motion sickness inducing when you're suddenly stopping and starting in that way. So I can totally see why, at this point, it's necessary for them to also produce an app. I just think, in the long term, for an experience that's essentially two minutes, I'm not sure that people are going to want to manage the download space and then do the cleanup afterwards to make sure their phone doesn't get all clogged up. I think that's the big thing with VR right now: if you have a Gear VR or mobile VR and you're downloading and checking out everything that's available, then if you don't do that cleanup, all of a sudden a phone that may have 64 gigabytes worth of space can very quickly get filled up, when people are expecting to be able to download the higher resolution versions of either videos or other interactive experiences. The other experience that I wanted to mention briefly is the Field Trip to Mars, which was mentioned on the podcast. What they did was take a school bus and replace all of its windows with transparent LCDs. They put a bunch of kids into the school bus and started to drive around Washington, DC, and then they turned on the screens, and it made it look like the bus was driving on Mars. They had created this whole Unreal Engine experience of a vehicle driving around on Mars that was mapped to a route similar to the one the bus was actually physically driving through DC. So they're mapping this one-to-one on the bus, creating this giant shared virtual reality experience, with all these kids freaking out as they look out the window and feel like they're actually traveling on Mars.
It kind of reminds me of immersive dome experiences, which are shared immersive experiences where you sit inside a geodesic dome and different immersive experiences are projected onto the dome. At VRLA there was a really amazing experience by Android Jones that to me was really quite moving, and just a great shared experience, because you are actually physically co-located with people, kind of like watching a movie in a movie theater, except instead of a movie you're lying on your back, looking up at this dome and having this immersive experience. The other big experience that Framestore has worked on is the Game of Thrones exhibit that was at South by Southwest, where they mimicked the elevator going up the huge wall that's shown in Game of Thrones. People were really just freaking out and screaming. I remember a lot of videos coming out in March of 2014 of people wearing these Oculus Rift DK1s and experiencing the additional 4D effects, a lot of people seeing VR for the first time and having this whole immersive Game of Thrones experience. So, that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast, and if you'd like to help out the podcast, then tell your friends, help spread the word, and become a donor at patreon.com slash voicesofvr.