#440: Spherica’s Stabilization Technology Used in HBO’s ‘Westworld’ VR Experience

HBO premiered a VR experience for their new Westworld series at TechCrunch Disrupt, and they used Spherica's camera stabilization technology to pull off an extended live-action tracking shot in VR. Common advice given to new VR filmmakers is to not even attempt to move the camera, since any shaking or sudden unexpected movement can be a motion sickness trigger. But Spherica has been able to create a stabilization platform, using a GoPro mount and a remote-controlled rover, that can comfortably move a VR camera through a tracking shot.

I had a chance to catch up with Spherica's CEO Nikolay Malukhin and managing partner Alina Mikhaleva at TechCrunch Disrupt, where we talked about their rover, drone, and cable camera stabilization solutions, collaborating with HBO on the Westworld VR experience, scaling up their rig to Blackmagic and eventually RED Epic cameras, and some of their upcoming content and hardware projects, including a first-person perspective helmet mount.

LISTEN TO THE VOICES OF VR PODCAST

You can watch a high-res demo of their Spherica technology in this Immersive Combat demo for the Gear VR, or watch it on YouTube here:

The marketing agency Campfire was responsible for designing the physical Westworld booth experience at TechCrunch Disrupt, which created the feeling that Delos was a real travel agency. The actors running the booth were telling attendees that they were showing a virtual reality experience that featured one of their destinations, and so I didn’t have any idea that what I was about to see was really an immersive advertisement taking me into the surreal and dystopian world of a new HBO series starting on October 2nd.

Here are some photos of the booth and the travel brochure they were handing out:

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. So I was at TechCrunch Disrupt last week, and I was roaming around the expo floor checking out all the different VR experiences, and there was this nondescript booth that said Westworld, a Delos destination, and it looked like they may have had some private rooms in the back. And so on the off chance that they might be showing some VR stuff, I went up and said, hey, what is Westworld? And I was told that they were a travel agency and that they were showing a premier VR experience to show off one of their destinations. And so I was like, okay, that sounds great, sign me up. And so I got an appointment and was able to go in and see this experience, which ended up being this really flashy Unreal Engine experience where you're engaging with this travel agent who's showing you into this Western world. You get transported into what appears to be the middle of the desert, and you have these interactions with these characters who are kind of telling you about this destination, and then you're suddenly shot and killed, and you go into this live-action sequence, which is kind of like this science fiction dystopian future. You start moving through this scene and just being utterly confused as to who are these people and what is going on, and oh my God, what kind of advertising for a travel agency is this? And so I get done with this experience, and I'm like, okay, so what is this advertising? And the person who's running the show is like, oh, this is an HBO show, it's called Westworld. So Westworld is essentially this world where you can go into these different places and there are these artificially intelligent robots. Anyway, there's this experience where they didn't even say it was a TV show, and I, as a VR journalist, was completely sucked into this alternative universe.
Anyway, the major point of this podcast is that I'm going to be talking to Spherica. They're the technology that HBO used at the end of their experience in the live-action sequence, which actually seamlessly moves the camera through the scene. Normally within VR productions, it's been advised not to move the camera, because any shake or movement can cause a lot of motion sickness, but Spherica has created this really amazing stabilization technology to be able to put cameras either on a rover, on a drone, or even a cable camera. And so I'll be talking today to Alina Mikhaleva, as well as Nick Malukhin of Spherica, talking about some of their camera technology and what it's enabling in terms of storytelling. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsors. This is a paid sponsored ad by the Intel Core i7 processor. If you're going to be playing the best VR experiences, then you're going to need a high-end PC. So Intel asked me to talk about my process for why I decided to go with the Intel Core i7 processor. I figured that the computational resources needed for VR are only going to get bigger. I researched online, compared CPU benchmark scores, and read reviews over at Amazon and Newegg. What I found is that the i7 is the best of what's out there today. So future-proof your VR PC and go with the Intel Core i7 processor. Today's episode is also brought to you by VR on the Lot. VR on the Lot is an education summit from the VR Society happening at Paramount Studios October 13th and 14th. More than 1,000 creators from Hollywood studios and over 40 VR companies will be sharing immersive storytelling best practices and industry analytics, as well as a VR Expo with the latest world-premiere VR demos. This is going to be the can't-miss networking event of the year, with exclusive access to thought leaders of immersive entertainment.
So purchase your tickets today while early bird pricing is still in effect at VROnTheLot.com. So this interview with Alina and Nick happened at TechCrunch Disrupt, which was happening at Pier 48 in San Francisco from September 12th to 14th. So with that, let's go ahead and dive right in.

[00:04:17.871] Alina Mikhaleva: My name is Alina Mikhaleva. I'm working at the VR startup Spherica, and we are aiming to change the way we see VR content and live-action content in virtual reality, because I truly believe that there is a huge market for that, and we just need to make live-action content right. That's what we're working on.

[00:04:40.244] Nikolay Malukhin: My name is Nick Malukhin, I'm the CEO of Spherica, and we're really happy to think about ourselves as a missing link of virtual reality evolution. So basically what we're trying to solve is the biggest problem that the industry faces at the moment, which is the lack of movement, lack of stabilization of the camera. So we not only built the system that manages and fights this problem, but we also produced a lot of content which is a proof of concept and it's an exciting time because it's going to change the whole agenda for the industry, hopefully.

[00:05:19.130] Kent Bye: Yeah, so one of the biggest challenges with shooting cinematic VR is that with locomotion there's a lot of movement when you're moving the camera around, and anytime your eyes are seeing things that your body isn't feeling, especially shake, I think for me is a big motion sickness trigger. So a lot of the footage that I've seen on drones, or even dolly shots that are moving around, tends to still be a little bit too shaky for me. But in seeing some of your footage, I think it was one of the first times that I've really seen comfortable movement within VR that was stable enough that it felt like it was moving at a consistent velocity, which I think is also important. Starting and stopping can be a trigger for some people, but as long as the movement is at a constant speed, then with your Spherica platform you're able to stabilize the shot in a way that is comfortable.

[00:06:10.903] Alina Mikhaleva: I think you're absolutely right, this is one of the main challenges for cinematic VR, and I'm quite disappointed to hear a lot of people dismissing live-action VR experiences as not virtual reality. I'm personally very excited when I see really good content in a VR headset and I feel quite immersed. Also, we need to remember that there are a lot of content verticals that can be done only in live-action VR, and if we dismiss them, we are actually cutting out a very large percentage of the audience. That is crucial for mass adoption of virtual reality overall, because at this point in time we're still not in the interactive part of VR. Consumers are only starting to test mobile VR technologies, and Samsung Gear VR and solutions like that are the most popular on the market. So they are starting with video. Their first experience most of the time is video. And if the video is of low quality, or even worse, if it's not stabilized and it creates unpleasant effects and brings you motion sickness, this is the last time people are going to try anything connected to VR. So that is why it's so important to solve this problem, to actually bring high-quality live-action content into VR. And at a later stage, we can definitely merge it with some interactive experiences, but still live action. We cannot do tourism in CGI. We cannot do celebrity content in CGI. There are so many things that interest a mass audience and need to be in VR and have a place in VR that we need to solve. So for that, we've specifically developed our solutions. Our stabilization systems allow you to move the camera while at the same time being absolutely comfortable for the viewer. Not only is it solving the problem of live-action capturing and motion, but it also opens up another problem, a very interesting conversation about storytelling in VR.
Because I think that the conversation about how to tell a narrative in VR, how to drive the story in VR, is one of the most challenging for the industry, for Hollywood, for big content studios to actually go into this space. I think that they're hesitant because they haven't seen quality content, and that's exactly what we're trying to change. We want to show them that it's possible to drive the story, and motion is one of the crucial parts of that, because if storytellers have tools available where you can follow the character, where you can go after him, create different kinds of movement, different kinds of scenarios even, combining live-action content with interactivity, that's a completely different story. And I truly believe that as soon as more and more content creators see this opportunity, they will end up going for larger production projects in virtual reality, which is not happening right now.

[00:09:08.977] Kent Bye: Yeah, so you have a couple of different systems that you are able to stabilize the camera. There's the one that you're showing here at TechCrunch Disrupt, which is kind of like a robot that's on the ground moving around. You have a stabilization platform for the drones. Maybe you could talk about the different options that you have available.

[00:09:29.174] Nikolay Malukhin: Oh yeah, sure. So the rover is a really clever solution which utilizes our stabilization technology. We put the stabilized rig on top of the rover, and we run the rover at speeds of up to 40 miles per hour, which means we can shoot not only live action but something like, just think about commercials for cars. We can shoot car racing, we can shoot drifts in real time, just following the real action. So we're not talking about CGI. It's a completely different experience, and this is what the audience expects. And we can satisfy this demand. The drone is an amazing technology in itself, but when we add the stabilized 360 rig on the drone, that makes the whole story even more engaging and interesting, because now we can do flyovers in full 360 imagery and you can enjoy the feeling of being there. It's like literally flying. It's amazing. We also managed to build something absolutely extraordinary, which is the cable cam. The cable system is commonly known as SkyCam; this is what they use in sports, when the camera flies over the stadium at a football field, for example. We succeeded in inventing a 360 stabilized solution for the SkyCam. Again, it opens up great possibilities for storytelling when the drone is not an option, for example, or the rover might be too much. That's when the cable cam joins the game. And now we're thinking about building a POV solution. So we're talking about helmets with full hardware stabilization of 360 cameras. And I think that would be the most anticipated thing on the market. This is what virtual reality is really about: the first-person 360 experience.

[00:11:27.170] Kent Bye: So you're getting a lot of demand from content creators who want to be able to shoot from the perspective of an individual, have them in a scene, perhaps even acting, having the ability to look down and see their body and their movements, and then have some sort of helmet that's able to maintain a stabilized point of view no matter how much they move around.

[00:11:47.171] Nikolay Malukhin: Yeah, that's true. And actually, while we were talking, I was thinking that what we are experiencing in the virtual reality industry at the moment seems like a sort of déjà vu. What I mean by that is that we already had all these problems in traditional filmmaking. If you remember the early days of filmmaking, we started with black and white, we started with a static camera. The taboo was to move the camera; the rule was not to move the camera. We are now facing absolutely the same problems, and companies and content producers are hesitating. They consider moving the camera a taboo, and they are being slow in taking decisions, in grasping these new technologies. But we already went through this in the past, so we really need to focus on the future, I think. And it's already here, and that's the most amazing thing.

[00:12:41.545] Kent Bye: Looking historically at film, was it a matter of motion sickness, that people were getting motion sick back then? Or was it more of a stylistic thing, something that you just shouldn't do?

[00:12:52.166] Nikolay Malukhin: Oh yeah, definitely that was a matter of style, but now we're talking about a new era of filmmaking where it's not only the style, but also the scale of the picture that you can show to your viewers, and the matter of engagement. So we're talking about much more serious and deep engagement with the story and the picture. This is a new era for filmmaking. I really hope that the future of film is the 360 virtual reality film.

[00:13:24.493] Kent Bye: So what are the storytelling options, or the impact that you see in moving the camera through a scene? What does that give you?

[00:13:33.578] Alina Mikhaleva: From my point of view, there are a lot of things like that we're definitely planning to experiment with. First of all, it gives you the option to move as you move in regular life, like following a character. I understand that it's still not movement initiated by the person, like in interactive VR experiences. But if you're following the story and the main character is inviting you into the story and you're following him, it's an absolutely different experience. At the same time, drone footage can give you a kind of experience that you are not able to have in real life, which is flying: you don't see anything around you, you're not sitting in a plane, you don't see any helicopter next to you, you don't understand how it was filmed, and that's another very exciting part of it. And of course, the next step would be combining live action with interactivity. I truly believe that either with light field cameras or just using live action and adding some interactive points, we'll get there, and we'll find the solution to combine this engaging VR that gives you a sense of presence, in terms of deciding where to go and what to do, with actually realistic live-action footage. For example, already right now, we're dreaming about building experiences where you, as a viewer, are choosing how to continue the story, where to go, which door to open. You need movement for each and every step of building the story. Without movement, you cannot even think that it's possible.

[00:15:12.047] Kent Bye: So with these prototypes that you're showing here, you're using the GoPro cameras and looking at some of the Lytro light field cameras that are on the horizon, it seems like something that would be pretty static and not moving around as well. So have you started to look at applying this type of rig to one of these larger light field cameras?

[00:15:34.280] Nikolay Malukhin: Oh, yes, we actually built a prototype which works with Blackmagic cameras already. Now we're talking about a serious professional level of filmmaking, and the Blackmagic camera is already a film-quality camera which can be fitted with additional professional lenses, which is not an option with GoPros, unfortunately. It just proves that our solution is fully scalable, so we can scale it up to RED Epics, for example, if necessary. We're ready to meet the demands of the market and to adapt our solution to any type of camera.

[00:16:08.495] Alina Mikhaleva: Maybe I can add that we are filming right now, and most of our rigs are built with GoPros, because the whole industry started with GoPro and it's the main camera. Right now, whether you film with a RED Epic or a GoPro, at the end of the day the headsets are scaling the quality down, so you wouldn't see the difference. But as the industry develops the headsets, the quality is getting better. Of course, we'll be ready, and even ahead of the curve, to embrace all new technologies. Stabilizing 3D 360 cameras is also another very big challenge, because it's kind of already here. A lot of people are shooting with 3D 360, but even going from frame to frame, it's already challenging for your eyes in 360 3D. And if you move the camera, it's even harder for human eyes. So my feeling is that 3D technology is not there yet for a moving camera, but we're already testing, and we definitely plan to continue doing that, to be there when 3D is finally perfectly comfortable for camera movement in VR.

[00:17:16.463] Kent Bye: Yeah, and I had a chance to try out a VR experience here at TechCrunch Disrupt that was using some of your technology. Maybe you could talk a bit about that project and what you were doing in there.

[00:17:27.090] Alina Mikhaleva: It's hard to talk about this project, because it's probably the client who should be talking about that. But it's a project produced by HBO. It's called Westworld. And probably as much as we can say is that we contributed to this project by doing the live-action part: filming, post-production, everything. And we enjoyed a great collaboration with the HBO team. Unfortunately, we cannot tell anything else, because the project is not released and we don't know what the plans are for promotion of this upcoming series. But it's a great experience. There was definitely stabilized camera movement, so if you see this experience in the future, I hope you enjoy it. We had incredible fun working on the set of the Westworld series, and hopefully once again this experience will show other professional content creators and studios that there is a place for more VR content, and especially original storytelling in VR. There's such a huge difference between just trying to film the backstage, or trying to fit existing filming scenes and shoot an addition in VR, and specifically thinking about building a scene in 360 all around you. That's a whole different story, and I'm sure that in two years' time we will be talking about VR original series launched by the leading studios. It's absolutely there. We just need to not scare the users before we get there, and we have high chances of doing that.

[00:19:10.991] Kent Bye: Yeah, I can talk to my own personal experience of it, not specific to the content, but just the ability to be in another world, kind of a fantasy science fiction realm, where I don't really understand who the characters are or what's going on; it's confusing. Normally in these experiences, you are very limited by having a singular perspective of a space, but having the ability to locomote through the space makes it kind of a guided tour, where it's showing me a lot of the different highlights of this space and different scenes, and gives me a lot more of a dynamic type of experience that allows the whole scene to unfold in a much more complex way. Rather than artificially choreographing an entire scene so that everybody is moving around a singular point, you can start to move the camera through a space in a much more natural way, where people are naturally interacting with the scene, and you can really look around and see what's happening and be utterly confused and trying to figure out what's going on.

[00:20:12.182] Nikolay Malukhin: Yeah, that's true, and actually it's an interesting point from a psychological point of view. This is what we see each time people try our content. When they test it for the first time, I can see it in their eyes when they come back; I need to tell them, welcome back to reality, because they are teleported into another dimension, that's for sure. And they all say that it is thanks to the moving camera, because a static camera cannot teleport you, cannot transport you to a new realm. So it's the combination of the moving shots, the static shots, probably some CGI effects. It's a lot of work, and the result is just absolutely amazing.

[00:20:52.635] Alina Mikhaleva: And of course, we started by creating our own original series, so those are our experiments. We absolutely understood that we had to showcase our technology, hopefully at its best. We produced action films that we're going to release soon. The trailer and the application are already available for Gear VR users and coming to other VR platforms soon. It's a niche product. But by creating this niche product, we also wanted to do what others wouldn't risk doing. And we definitely reached that point, because no one had ever filmed a beautiful flyover where you're flying over fighting ships. Maybe some people would say that it doesn't make sense in VR, but nobody ever tried. We at least tried to show it. And I think, hopefully for a lot of people in the content business, it will showcase that it's possible.

[00:21:46.709] Kent Bye: And we're here at TechCrunch Disrupt. There's a lot of different startups that are here. So you're here presenting, figuring out the next steps for your company. And do you plan on eventually having this as a technology that companies can license? Or is this something that you would imagine selling to companies? Or what's sort of the plan moving forward?

[00:22:06.025] Nikolay Malukhin: Well, the plan here is to introduce this technology to as many companies as possible. So we really want to... The big idea behind all this is to change the current situation in the market and to actually make the content much more appealing to the viewer, to make the content really high quality, to introduce camera movements, all this stuff. So we really want to have as many companies as possible use this technology and make the most of it. So we plan to license and to rent this technology to content producers and to companies who work in the field. And hopefully we'll also look at raising funds. So this is what we need at the moment.

[00:22:51.604] Alina Mikhaleva: Basically, the main idea is that major content producers need to have access to this technology right away. And definitely, this will be our priority. The next step might be selling it to the clients and studios who are interested in that. But we also plan on producing our own original content, hopefully, in the future, if it all works out.

[00:23:16.932] Kent Bye: Great. And finally, what do you see as the ultimate potential of virtual reality, and what it might be able to enable?

[00:23:25.942] Alina Mikhaleva: For me, absolutely, it's the future medium. I want to be more active in the experiences, in the information. It's not about entertainment only. It's about news. It's about the way we consume information, the way we interact with the world. For me, virtual reality is going to change the way we interact with the world. And of course, content is coming as the first signal of how this interaction is changing. There are a lot of things that we need to learn, to figure out how we're going to interact, how active we want to be there, how passive we want to be there, but it's ultimately going to change everything for me.

[00:24:09.010] Nikolay Malukhin: Well, for me it's a pure fantasy world, and that's actually why, when we first thought of doing our own content series, the idea was to do something like a fantasy world. So this is what we created, and this is the idea of mine that I really like seeing introduced to the masses. Great, well, thank you so much. Thank you, Kent, thank you very much.

[00:24:31.704] Alina Mikhaleva: I'm a very big fan of your podcast. I'm very excited to do this interview, thank you so much. At last we're here.

[00:24:31.704] Kent Bye: So that was Alina Mikhaleva, as well as Nick Malukhin of Spherica, and they've created stabilization technology to be able to create tracking shots within VR. So I have a number of different takeaways from this interview. First of all, I was actually pretty skeptical that I would be able to see tracking shots within 360-degree videos that would be comfortable, but I've got to say that Spherica's technology is really pretty solid for somebody who is pretty sensitive to motion within VR. They're able to stabilize the camera in a way that makes a huge difference. There are some caveats, I think, when talking about some of these camera movements. First of all, I think it's really important to move at a consistent velocity; starting and stopping, slowing down, or accelerating is something that can be a big motion sickness trigger for people. So as long as the movement's at a constant velocity, then it is a lot more comfortable. Also, there's one point within the Westworld experience where they're taking the rover through this tracking shot that is moving in a straight line, and then at some point they rotate the camera and move the entire rover in a different direction. And while the rover is able to do that, I think the effect on the actual video is that it feels like someone's taking control of the camera and starting to move it for you.
Yaw rotation within VR that is automatic and that you're not controlling is a big motion sickness trigger for a lot of people. It was very subtle, and it wasn't a very long moment, so I don't think most people would notice it, but I definitely noticed some discomfort, and moving the robot through corners or anything like that is not something I would recommend. So I think the limitation of going in a pretty much straight line is one constraint that should still be adhered to. Overall, what this showed me, though, is that there have been a number of different rules that people have been setting forth in terms of creating a comfortable experience within VR. And I think that Nick is right in saying that there are some taboos that need to be broken, and there may be actual ways of getting around some of the discomfort. So even though there are a lot of different best practices and guidelines for creating a comfortable experience, there can be solutions that are exceptions to those rules. And I think that people need to try to break some of those rules, but also respect them and know what some of those motion sickness triggers are. So for example, anytime they're doing a yaw rotation, that's a big trigger. In their demo for Spherica, they also had a camera shake when somebody within the show hit a giant hammer into the ground, and they did a kind of artificial camera shake. Camera shake is another trigger for me. One example is Smash Hit, a Gear VR game where one of the power-ups actually shakes the entire world, and for me that's a little bit motion sickness inducing.
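The constant-velocity point above can be made concrete in code. This is a purely illustrative sketch, not anything from Spherica's software: a camera mover that steps along a straight track at a fixed speed, with no ease-in or ease-out, so the viewer never perceives sustained acceleration that their inner ear contradicts.

```python
def constant_velocity_path(start, end, speed, dt):
    """Yield camera positions along a 1D track at a fixed speed.

    Comfort heuristic: the velocity is constant for the entire move
    (no ease-in/ease-out ramps), so there is no ongoing acceleration
    signal for the viewer's vestibular system to disagree with.
    """
    pos = start
    direction = 1.0 if end >= start else -1.0
    while (end - pos) * direction > 0:
        yield pos
        pos += direction * speed * dt
    yield end  # land exactly on the target

# Example: a 10 m tracking shot at 1.5 m/s, sampled at 90 Hz
path = list(constant_velocity_path(0.0, 10.0, speed=1.5, dt=1 / 90))
```

An easing curve would look smoother on a flat screen, but in a headset the acceleration phases at either end are exactly the moments most likely to trigger discomfort, which is why the sketch deliberately starts and stops instantaneously.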
But overall, if you just take a step back and look at the Spherica technology, they're able to create this amazing stabilization, which I think does give a lot more comfort within these VR experiences and opens up a lot of different storytelling mechanisms. You are pretty limited from a stationary first-person perspective, but if you're able to start to move through a scene, then you can have a lot more dynamic action happening. So it sounds like they have support for a rover and a drone; I didn't see any footage from a cable cam yet. And it sounds like they're also working on a helmet mount, which I'm still not quite sure about: whether a camera mounted on a human is going to be stable enough to be comfortable for people watching. I know that Specular Theory has been very ambitious in doing a lot of different first-person-perspective camera shots. I think they even created their own helmet mounts for the shooting of their latest Perspective, version two, which premiered at Sundance this year. And that was basically like a handheld shot from a first-person perspective. It was fairly uncomfortable to watch, but the content was so compelling that I kind of fought through a lot of those motion sickness triggers. But a helmet mount, I think, is something that a lot of content creators are wanting and that they're working on. I'll be curious to see how that ends up coming out, and whether they're able to actually create something that's stable enough to still be comfortable for people. So I just wanted to comment a bit on Alina's comments about light fields and interactivity. Lytro did show some of their footage from the Immerge camera.
And what was really striking is that a lot of the action happening in the shots they were showing was essentially a faked moon landing type of scenario: if you just looked forward, you were seeing the lander and the astronaut walking in this moonscape, and then you turned around and saw that it was a whole faked shot. But within this shot, what's notable is that only the primary actors moving through the scene were shot with Lytro's Immerge light field camera. Everything else within that scene was composited within Nuke. So they were creating the backdrop and the other things in that scene within a 3D program, possibly through photogrammetry techniques, where they were able to create this whole 3D scene by taking static camera shots. But I think at some point some of that scenery in the background could feasibly be shot with some of these cameras, especially if there's a lot of dynamic action happening in the scene that would be hard to really simulate within a virtual environment. So the thing to really note here is that there are different stereoscopic distances: the things that are near you are going to have some sort of stereoscopic depth, while for the things that are far away, the pixels basically converge and your eyes can't really discern any depth information. So it's perfectly feasible to have combinations of 360 camera footage as well as digital light fields. I just wanted to explain that a little bit, because when you take a step back to look at camera movement through a scene, I think Alina said that one of the open problems they have right now is still how to deal with compatibility with 3D 360-degree cameras. Right now, they're just supporting the GoPros, which is mono.
It's not in stereoscopic 3D, as well as other solutions like maybe Blackmagic cameras that they've been able to build a prototype for. And they say that that kind of proves that they can scale up to something as big as a Red Epic camera. But all those solutions are still within a monoscopic 3D. And so like I said, if there's a certain amount of distance of the main action that's happening, as long as it's a certain distance away where the stereoscopic effects basically are not noticeable, then that is totally fine. You don't really actually need that stereoscopic camera unless someone's right up into the near field where you can actually see some of that stereoscopic depth. I think the big problem that a lot of these cameras deal with right now is that whatever they shoot in is at a fixed IPD and if then you show that footage to somebody who has a different IPD then it's going to look like off and not quite right. And so I think there's a lot of solutions like the Google Jump camera which is essentially taking a lot of artificial intelligence and we don't know quite sure exactly what they're doing but I presume that they're essentially kind of reconstructing the whole 3D scene perhaps even creating a mesh and then artificially creating two cameras within that 3D scene and then outputting the footage. So even though it's just 360 footage that's shot on a singular jump camera, they're somehow doing this magical processing in the background in order to simulate some of these stereoscopic effects. The other 360 cameras that are out there like Jaunt are perhaps doing something similar to be able to create some of their stereoscopic effects. So they didn't mention that they had any support for the Google Jump or something like the Jaunt cameras. So right now the technology stack that they're using is going to be something along the lines of both the GoPro, the Blackmagic, as well as perhaps eventually the RED Epic camera. 
So I think the big takeaway for me from seeing what they're able to do is that it really showed me that there are technologies out there that can stabilize the camera and actually create comfortable camera movement, within some limitations. Some of the drone shots that I saw, for example, were high up in the air and flying along, and they were fairly comfortable, as opposed to a lot of the other drone footage that I've seen, which has enough shake to make me motion sick. They also didn't have a lot of optical flow happening near the bottom of the screen; if you looked at the HBO experience, for example, most of the ground was pretty black and darkened out. It was not a well-lit scene, and so it kind of made sense to have the floor be this black abyss, so you're not seeing a lot of the optical flow that can be a trigger for people when they're watching different VR experiences. So that's all that I have for today. I'd like to just thank you for listening to the Voices of VR podcast, and if you'd like to support the podcast, then tell your friends, help spread the word, and become a donor at patreon.com/voicesofvr.
