#436: Providing Wide-Area Tracking for VR Arcades with OptiTrack

OptiTrack premiered a new demo at GDC that shows the precision of their tracking technology. They put passive tracking markers on a basketball and a football so that people could toss a ball back and forth to each other in VR. I had a chance to catch up with OptiTrack's Chief Strategy Officer Brian Nilles at SIGGRAPH, who talked about how OptiTrack is being used as the primary tracking solution within the different VR arcade solutions including The VOID, VRcade, and Holovis. He also talked about OptiTrack being used for motion and facial capture for AAA studios, and for indoor GPS systems for robots and drones. There are a number of yet-to-be-announced VR arcade solutions out there that are pushing the limits of OptiTrack's technology, and Brian gives us an idea of what's possible by saying that he's seen solutions that use as many as 75 HMDs within a space up to 165ft x 120ft.

LISTEN TO THE VOICES OF VR PODCAST

Here’s a video of their Basketball demo that premiered at GDC:

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to The Voices of VR Podcast. So at GDC this year, OptiTrack had this new tracking demo to be able to show the extent of their tracking technologies. They actually had these passive markers on a basketball. And so you had the opportunity to jump into a VR experience and have somebody toss you a basketball and you could throw it back and forth. They even had a football that you could throw around. And so they're kind of showing the extent of their tracking systems that are used in a lot of beyond-room-scale VR experiences, including The Void and VRcade. So I had a chance to try out this OptiTrack demo at SIGGRAPH and talk to the Chief Strategy Officer of OptiTrack, Brian Nilles, and talk a little bit more about some of the technology and how he's seeing a lot of the beyond-room-scale VR companies really push the limits of what the OptiTrack technology can do. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsors. This is a paid sponsored ad by the Intel Core i7 processor. You might be asking, what does the CPU have to do with VR? Well, it processes all the game logic and multiplayer data, physics simulation and spatialized audio. It also calculates the positional tracking, which is only going to increase as more and more objects are tracked. It also runs all of your other PC apps that you may be running when you're within a virtualized desktop environment. And there's probably a lot of other things that it'll do in VR that we don't even know about yet. So Intel asked me to share my process, which is that I decided to future-proof my PC by selecting the Intel Core i7 processor. Today's episode is also brought to you by VR on the Lot. VR on the Lot is an education summit from the VR Society happening at Paramount Studios October 13th and 14th. More than 1,000 creators from Hollywood studios and over 40 VR companies will be sharing immersive storytelling best practices and industry analytics, as well as a VR Expo with the latest world-premiere VR demos. This is going to be the can't-miss networking event of the year with exclusive access to the thought leaders of immersive entertainment. So purchase your tickets today while early bird pricing is still in effect at VROnTheLot.com. So this interview with Brian happened at the SIGGRAPH conference that was happening in Anaheim, California from July 24th to 28th. So with that, let's go ahead and dive right in.

[00:02:42.132] Brian Nilles: I'm Brian Nilles, I'm Chief Strategy Officer at OptiTrack. We're a wide area tracking company. Motion capture, but for VR, wide area tracking for HMDs, objects, controllers, etc.

[00:02:54.002] Kent Bye: So maybe you could talk about some of your biggest markets that you use OptiTrack tracking in.

[00:02:58.878] Brian Nilles: Oh, so, well, we've had a long history in optical motion capture, so the ones that people know us from are things like character animation for games and film. When we put reflective markers on a performer, we can pick up their movement in 3D and we drive a digital character. So all the sports titles, the sports games, hockey, football, baseball, basketball, soccer, all that realistic motion is derived from character animation that's performed by a performer with our motion capture. And then movement sciences includes a long list of research and injury prevention and also patient assessment of injuries and disease like cerebral palsy. So the idea is that we can measure subjects and produce a digital skeleton that's very accurate in 3D and in motion. We can marry that with peripheral data like EMG and force plate data for ground reaction forces, and we get these wonderful dynamic models for doctors to look at and decide how they're going to change these kids in surgery. And then there's a giant research arm that goes along inside that movement science vertical as well. And then the other ones include robotics. That's a big market for us now, both ground robotics and drones, aerial robotics. And basically we're indoor GPS for drones. Our accuracy, sub-tenth-of-a-millimeter accuracy and very low latency, enables us to be the six-degree-of-freedom measurement device for input into their control systems. And then for ground robots, we're either ground truth systems for their other proximity sensors and things like that, or we're an end effector verifier. We're so precise that if they're trying to hit a bolt hole that's a tenth-of-a-millimeter tolerance, and they can't do it through their normal encoders, then they'll readjust with optical data from us. And then the VR stuff is super exciting. So we don't do in-home VR. Oculus and HTC Vive, they all have their own tracking devices. We take over outside that 8-foot or 9-foot diameter in wide-area tracking. So we're used in the out-of-home experiences, arcade experiences, arcade-like experiences, like The Void and VRcade and Holovis. These guys are making grand environments where you throw on an HMD and you walk into an experience that is basically gameplay. And especially The Void, they do a bunch of physical effects as well. So when you reach out to touch a torch and you light something else on fire, you get heat in your face because there's a flash heater there, and they have wind effects from fans, and that mixture with the VR experience is really compelling. The Void just launched a new experience at Madame Tussauds in New York for the Ghostbusters movie, in conjunction with Sony Pictures, and the reports I get are lines out the door and people paying $30 or $35 apiece to go through the experience. So that's our position, and we've got customers who want to track in very wide areas in China, Western Europe, Eastern Europe, North America, and some of these guys are going really, really big. One customer wants to manage 165 feet by 120 feet and have 75 HMDs. But that's stuff that's relatively easy for us to do.

[00:06:13.805] Kent Bye: Wow, and so maybe you could talk a bit about the specifications. I know the Lighthouse, they have lasers, but they have to have active sensors, where the device is actually receiving the signal and then sending it back to the computer. But for OptiTrack, it seems like there are passive markers, so you just have a sticker essentially, and then somehow an array of cameras is taking that information and feeding it directly into the computer.

[00:06:40.597] Brian Nilles: Yeah, it's actually pretty simple. So we have high-speed, high-resolution cameras that we design and build, and we do both active systems and passive systems. But for the passive system, IR LEDs surround the lens and throw light onto the subject, we get a big bright return from the reflective marker, and we compare those views across multiple cameras, and we find their positions in 3D, you know, plus or minus less than a tenth of a millimeter. And then, once we find the point clouds, we've got a bunch of unidentified markers. We've got software that picks out the skeletons, if you've got markers on a person, or the basketball, in the case of what we're showing here at SIGGRAPH today, or the head-mounted display, or the gun, or the torch, or whatever it is we're measuring. So basically, we're tracking either skeletons or what we call rigid bodies. Rigid bodies, we have to have at least three markers on it to define translation and rotation, six degrees of freedom. And what we do is we end up putting a couple extra on so we can lose some, they can be occluded, they can fall off. And as long as we're left with three at any given moment, then we give great 6DoF information. And it happens at high frame rates, the cameras here are running at 240 frames per second. So we get a snapshot, a freeze frame of the scene with all the markers in it, 240 times a second. And we can adjust that, you know, if there's a native frequency that the headset's running at, then, you know, we can make that anything we want it to be. But you get a freeze-frame snapshot of 3D points in space hyper-accurately, and then we do that 240 times a second, or more with some of the other cameras that we have.
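To make the rigid-body idea concrete, here is a minimal, hypothetical sketch (not OptiTrack's actual code) of how a six-degree-of-freedom pose can be recovered from three or more reconstructed marker positions using the standard Kabsch algorithm, assuming the markers' positions in the object's own frame are known from a calibration step:

```python
# Hypothetical sketch: solve for rotation R and translation t so that
# R @ model_point + t matches the observed 3D marker positions.
import numpy as np

def rigid_body_pose(model_pts, observed_pts):
    """model_pts, observed_pts: (N, 3) corresponding marker positions, N >= 3, non-collinear."""
    model_c = model_pts.mean(axis=0)
    obs_c = observed_pts.mean(axis=0)
    # Kabsch: SVD of the cross-covariance of the centered point sets.
    H = (model_pts - model_c).T @ (observed_pts - obs_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = obs_c - R @ model_c
    return R, t

# Made-up example: a 4-marker rigid body rotated 90 degrees about Z and shifted.
model = np.array([[0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]])
R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
observed = model @ R_true.T + np.array([1.0, 2.0, 0.5])
R, t = rigid_body_pose(model, observed)
print(np.round(R, 3), np.round(t, 3))
```

With a couple of redundant markers on the object, the same solve still works whenever any individual marker is occluded or falls off, which matches the redundancy Brian describes.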

[00:08:10.912] Kent Bye: Yeah, I think it's pretty impressive to see a basketball with markers and to see it spin around and still track. Are you able to actually track with precision how many times the ball is actually spinning around? Because for most other systems, I'd imagine that they wouldn't be able to actually track that.

[00:08:26.055] Brian Nilles: Yeah, so we use a flat disc on the basketball, so it's just like the spherical markers, same tape, it's just flat. And when you say, can you track the rotation of it, we're tracking each one of those markers, I think we've got 14 on it or something like that, each one of them with sub-tenth-of-a-millimeter accuracy. So, RPMs we're getting to probably four decimal places. So, tracking the basketball, finding the center, we did some special software that does that really, really automatically. And it's the precision of the tracking and the low latency that allows you to put on an HMD and have a basketball thrown at you and for you to catch it before it hits you in the face. So, have you tried it? Yeah. Okay, so what did you think?

[00:09:09.629] Kent Bye: Yeah, no, I think it was pretty impressive. I mean, one thing that I would have liked is to actually see my hands within the VR experience, because I had to just guess where my hands were going to be. But it was good enough that, with the ball bouncing to me, I could go ahead and pick it up.

[00:09:22.542] Brian Nilles: So all we have to do is put markers on it. We've got some gloves that you could have thrown on, and then we can track your hands. And there are other ways of doing that as well. Of course, Leap Motion has something that goes on the front of the HMD, and when it's in your field of view, it tracks hands and fingers. We talk to that team quite regularly, and we're integrated with systems that are using Leap Motion as well. There are a couple ways to do it, but we can track anything with markers on it, so if you want to track your hands, throw markers on them.

[00:09:46.542] Kent Bye: So we're here at SIGGRAPH, and a big part of the visual effects industry is here, and I'm just curious if you have any sort of metrics in terms of how much market share you have with OptiTrack tracking being used in motion-capture types of contexts.

[00:10:01.604] Brian Nilles: So like the traditional character animation market, so again, separated between film and games. Film is different from games in that films often use service providers. So we have a number of partners out there, people like Metric Minds in Europe and Animatrik in North America. They've got offices in Vancouver and here in LA. And they did all the work for Warcraft, which was the most stunning live-action set indoors I've ever seen, the largest motion capture volume by a factor of two. Animatrik did that work with 110 Prime 41 cameras from OptiTrack. And then on the game work, we've got a bunch of headliner customers. Rockstar has five of our systems for the Grand Theft Auto line. Activision has the largest performance capture system in the world that tracks 120 3mm markers on each performer's face for the cutscenes in Call of Duty. So we are newer to the market than some of the other entrants. We started doing optical systems in 2008. But we have quickly become the sweetheart. We've got higher-performing equipment and lower prices. So in terms of market share, we're over 50% in almost all markets. And in things like VR and drone tracking, these are markets we've already run away with.

[00:11:15.690] Kent Bye: So when you say you're putting markers on your face, what kind of things do you have to put on your face in order to track your face?

[00:11:20.411] Brian Nilles: In that case, they're three-millimeter and four-millimeter hemispheres. So it's a cross between the flat marker and a spherical marker. Very small, you know, they're like that big, three millimeters. And they've got an adhesive, and they put them in different patterns around the eyes and the lips to capture the expressions that they want from the performers. For performers like Kevin Spacey, you know, they've got A-list performers that are doing these cut scenes. So it becomes a film shoot for a game.

[00:11:47.160] Kent Bye: And so, for you, what's next on the horizon in terms of where the improvement or growth is? What is OptiTrack going to do to continue to expand into the VR market?

[00:11:56.702] Brian Nilles: Yeah, so in the VR market, we've got work to do. It's a great system for VR. We need to make it more of an appliance where somebody plugs a cord in the wall and it goes to work. So our development is focused around that. For instance, a lot of these VR arcade experiences are run by high school kids or summer hires. So they don't have any motion capture experience whatsoever. So we're into making it a complete appliance where if somebody plugs it in, it goes to work and tracks, and does exactly what we do best, which is low-latency, wide-area, precision tracking.

[00:12:29.847] Kent Bye: And are there different levels of cameras, or are they all pretty much the same when somebody buys an array of cameras? Are there consumer or prosumer or professional versions of the actual OptiTrack cameras, or are they essentially the same and it's just a matter of how many you have in the array?

[00:12:48.291] Brian Nilles: We've got a whole lineup of cameras starting at $600 apiece, which makes a VR tracking system, a wide-area VR tracking system, start at about $4,000. A full-body tracking system, that's character animation for a film or a game in a 20-by-20 area, that's about $7,500. So we're not a consumer company, right? We're all business-to-business. But what we've done is we've enabled the low end to do character animation for very small game developers. We've developed systems that can be set up in a conference room. These guys put them on speed rail. It dresses up the conference room a little bit. Always nice having a mocap room if you're a game developer. And they pull the conference table out and shoot mocap. All the way up to the most expensive systems in the world, like the Activision Call of Duty system, which is the largest of its kind ever built.

[00:13:37.363] Kent Bye: So can you talk about the evolution of even the last three or four years, in terms of how you've seen the industry change with consumer VR coming onto the market, and how that's changed what's happening here at OptiTrack?

[00:13:50.277] Brian Nilles: I mean, SIGGRAPH is a great example. So this is, you know, the world's best, the guys who are deceiving you visually to make films and games compelling. And everybody on the show floor is trying to figure out where their spot in VR is, whether they're a content producer or a tool provider, and VR is changing everything. And I'm sure we're on an exuberance curve that will be dampened over time and will settle into the really valuable propositions. But right now, in terms of visual entertainment, this is the best thing that's happened since the TV. I don't know, I haven't thought about that before, but it's staggering the scale at which VR can change everything in the graphics industry.

[00:14:35.122] Kent Bye: What about from the context of, even within OptiTrack, from your perspective of seeing how it's changed the company?

[00:14:39.650] Brian Nilles: Well, so we sell into a diverse set of markets. All of them are growing, but VR is growing much more quickly than the others. So it's a massive opportunity for us. I mean, you know, with these out-of-home experiences, we could deliver thousands of systems next year, which is a lot.

[00:14:57.278] Kent Bye: Great, and do you have any personal favorite memories or stories of being in VR?

[00:15:01.579] Brian Nilles: Well, I haven't been in all our customers' experiences, so it's not fair, but I did do The Void's experience with the Serpentine, and it's kind of an Indiana Jones experience, and it's stunning. One of the principals and founders is a magician, and he's good at tricking people, and they're a clever bunch of guys and great technicians, and the experience that I tested blew my mind compared to, you know, the other VR that I've seen.

[00:15:27.710] Kent Bye: Yeah, I had a chance to try out The Void and talk to Curtis Hickman as well. And he told me they were working on some sort of their own RF tracking, trying to figure out a way to prevent occlusion, because I think that was one of the big issues that they were dealing with, since they were dealing with actual physical walls in The Void. Do you know if they're going to stick with OptiTrack, or if they're going to try to do their own RF system to get around some of those occlusion issues?

[00:15:54.862] Brian Nilles: Yeah, well I won't answer for them. So we work closely with them on the most recent installations and we made some adjustments to make their experience better, their tracking experience better. So we feel confident that we're going to deliver the tracking systems they need into the future as well. But that's a question for them.

[00:16:12.882] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:16:20.166] Brian Nilles: So I'm a fan of augmented reality; it has long legs. Some of the use cases I see with HoloLens and some of the others are really compelling, and I see a future for that. In VR, you know, I've got young kids who are gamers, and the idea that I'm going to sit with my family in a living room with headsets on or play games that way, I think there's a real novelty there. There's no question you can make really compelling content that is stunning compared to a display or a couple displays and a computer. I'm just not sure if my kids wouldn't do that for 15 minutes and then put it on the couch. But I'm not Mr. Zuckerberg. He has visions well beyond gaming, right? So I'm not going to judge it, and I'm not going to tell you that I'm right. I do think that, given my experience with VR, the out-of-home experiences in VR are really compelling. Because if you think about it, it is laser tag. I do laser tag with my kids all the time, and it's the same thing over and over again. If I even say the words, they're in the truck. They're going. Like, they love laser tag. But if you think about VR, the content can be something new every week, every month. So every time you go there, it's something new. That's way more compelling. So I think that's the shiniest object I see, and that's just our view; our view of the VR world is fairly narrow compared to how big VR is. But from my view, I think that out-of-home VR has real entertainment potential.

[00:17:43.835] Kent Bye: Well, one thing I just want to add on to that, because the Void, all their experiences are cooperative. There's no kind of going against each other because they're doing some redirected walking tricks. And so in order to have a big space, they kind of have space constraints and also throughput issues in terms of safety and people not running into each other. But have you seen other companies starting to think about actually competitive laser tag type of experiences within a bigger open space using OptiTrack?

[00:18:11.133] Brian Nilles: Yes. I can't talk about it because we're under NDA, but yes, very, very large areas, and that's what we do best. Typical mocap setups have cameras on the edges of the room. One of the cool things about VR is we can put them in a grid in the ceiling at some regular spacing, and we can go as big as you want. So if you want to track this convention center, this would be a stretch because I think this would take 500 cameras or so, but you know, we've built systems with over 225 cameras before. So for all the people who have come to us and said, I want to track in this area, we can do all that they've asked for so far. And a lot of it has to do with finding gameplay with, you know, high character counts and things like that.

[00:18:52.263] Kent Bye: But do you have to do like parallel computers? Because I'd imagine that there would be a bottleneck to have all 500 cameras all run into one single computer.

[00:18:58.848] Brian Nilles: You can run them on one computer. I know that sounds insane, but if you think about it, there are a couple things that happen. One, we process on the camera, so the centroid calculation, where we see that bright spot from the marker, is happening on the camera. And all we do is transport X and Y for the centroids from the camera, so it's super lightweight coming to the PC. The other thing that happens is we can capture thousands of markers. So, for instance, the Activision system, 176 cameras, many of them seeing the action with thousands of markers. So that throughput is much more complex than 500 cameras that are seeing whatever it is, 400 markers with 100 objects. So we have less overhead per camera because they're just not populating certain parts of the space. So the tracking part, it's relatively easy for us to do that on one PC. Wow, that's impressive.
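To illustrate why that pipeline stays so lightweight, here is a hypothetical sketch (illustrative only, not OptiTrack's firmware, and the function name and threshold are assumptions) of the kind of on-camera step Brian describes: threshold an IR frame, find the bright blobs from the retro-reflective markers, and send only their 2D centroids to the PC.

```python
# Illustrative sketch only; names and thresholds are assumptions, not OptiTrack's firmware.
import numpy as np
from scipy import ndimage

def marker_centroids(ir_frame, threshold=200):
    """Return an (M, 2) array of sub-pixel (x, y) centroids of bright marker blobs."""
    mask = ir_frame >= threshold                       # keep only the bright retro-reflective returns
    labels, num_blobs = ndimage.label(mask)            # group bright pixels into connected blobs
    if num_blobs == 0:
        return np.empty((0, 2))
    # Intensity-weighted center of mass gives a sub-pixel centroid per blob.
    centers = ndimage.center_of_mass(ir_frame, labels, range(1, num_blobs + 1))
    return np.array([(cx, cy) for cy, cx in centers])  # convert (row, col) to (x, y)
```

Per frame, each camera would transmit only these few (x, y) pairs, plus a camera ID and timestamp, rather than the full image, which is why hundreds of cameras can feed a single PC; the PC then matches centroids across camera views and triangulates the 3D positions.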

[00:19:48.147] Kent Bye: Is there anything else that's left unsaid that you'd like to say?

[00:19:50.149] Brian Nilles: No, we covered a lot of ground there, and you're very knowledgeable about the whole program. I think the credit is due to our customers, people like VRcade, and I'm sure you talked to those guys up there, and The Void, and Holovis, our friends in China, and all the people that are pushing forward. We're going to try very hard to keep producing the best tracking systems for this market, and we've got the product roadmap to do that. But I'm not a content guy. I think all that stuff that those guys do is magic. So they deserve the tip of the hat. Awesome.

[00:20:21.758] Kent Bye: Well, thank you so much. Yeah, good. Thanks for having me. So that was Brian Nilles. He's the Chief Strategy Officer for OptiTrack, and they do optical tracking for beyond-room-scale VR as well as for motion capture. So I have a number of different takeaways from this interview. First of all, Brian seems to be one of the people that has really got the inside scoop on what some of these beyond-room-scale VR companies are starting to do with OptiTrack technology. There's a lot of times where I'd ask a question and I could tell that he knew that there are some companies out there doing it, but he couldn't give me any specifics other than to kind of confirm that he's seen some of these things grow bigger and bigger. For example, he said that there was one company that was starting to track in a space that was like 165 feet by 120 feet and have up to 75 different HMDs. I know that The Void was doing something that was just 30 by 30, and perhaps something that was as big as 60 by 60 feet. So I think they have different levels of scale that they're able to build out. There's a specific configuration which is kind of designed to do redirected walking. You kind of have to have a minimum radius to be able to walk in a circle where they're giving you feedback that you're actually walking in a straight line.

So just a note on The Void is that because it's using this kind of redirected walking technique, a lot of the VR experiences that are done at The Void tend to be a little bit more cooperative rather than competitive. Because if you think about it, if you're designing a space where you're supposed to be able to walk in circles, and you have multiple people kind of competing against each other, then they may be physically co-located in real space, but in the virtual space they could be on completely different parts of the map. And so you really start to need a lot bigger spaces in order to do multiplayer and perhaps a little bit more competitive types of interactions. And so we haven't heard too much about some of these companies, but in talking to Brian, there's a lot of people that are doing some pretty crazy and amazing things where they're tracking up to 100 objects or 100 different HMDs.

So also I think that in terms of performance, OptiTrack is certainly running at a frame rate of 240 Hz and being able to do the tracking at large volumes, way beyond what even the Lighthouse is kind of specified for. I know one of the big problems with the Lighthouse within some of these expo halls is that if you have multiple Vive systems that are set up, then they can start to interfere with each other, and the full extent of how far the lasers are specified to operate isn't as large as the spaces you can get with the OptiTrack. I don't think that I've seen anything that's equivalent to what the OptiTrack cameras are able to do, especially since a lot of their markers are passive, so they're just reflecting back to the cameras and all being fed into a single computer, whereas with the HTC Vive, all the information that's coming from the receptors of the lasers is being sent back into the HMD and then back into the computer to be processed.
So it was interesting to hear Brian say that he's not necessarily convinced that his kids will be playing a lot of VR games just yet, but he does know that they love laser tag and would get really into some of these beyond-room-scale VR experiences. He's not explicitly saying that these definitely exist, but I could kind of read between the lines and extrapolate that these types of competitive games are certainly around the corner. So another big use case for these OptiTrack cameras is motion capture as well as facial capture. I think that the bigger studios are certainly using that a lot more. And there's a lot of other options that are out there in terms of doing motion capture that are a little bit more on the cheap and may not be as precise. One of the big things to consider when doing motion capture, I think from some of the solutions that I've seen at least, is that there tends to be a lot of data cleanup that ends up having to happen with some of these different solutions. So if you decide to go with the cheaper solution, then you may end up having to do a lot more work on cleaning up the data. From my impression, it seems like the OptiTrack may not need as much cleanup because it seems to be pretty solid. So that's all that I have for today. I'd like to just thank you for listening to the Voices of VR podcast. And if you'd like to support the podcast, then tell a friend, help spread the word, and sign up to the email list on the Voices of VR. I should be having some announcements and more virtual events coming up here soon. And become a donor at patreon.com slash Voices of VR.