#119: Danny Woodall on creating full-body presence with the Sixense STEM controller

Danny Woodall is the Creative Director at Sixense, and he talks about how the electromagnetic STEM controllers can help bring full-body presence into VR. He also covers the SDK improvements they've been making and some of the updates to the lightsaber demo, including porting it to UE4.

I first had the opportunity to try their lightsaber demo at Oculus Connect back in September, and I got to try it again at the VR Mixer at GDC. I often cite this demo as one of the most immersive experiences I've had in VR, because being able to track your body within a VR space makes all the difference in the world.

Mel Slater's research into the virtual body ownership illusion has shown that the minimum needed to be convinced that a virtual body is your body is a 1:1 correlation between your body movements in real life and what you see in VR. With the ability to have your hands tracked relative to your head, Sixense has been able to create an IK skeletal model that really feels great.
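To give a rough sense of what "hands tracked relative to your head" means in practice, here is a minimal sketch (not Sixense's code; the function and frame names are illustrative assumptions): both poses come back in the tracking base's frame, and the hand is re-expressed in head-local coordinates so it lines up with what the HMD shows.

```python
# Illustrative sketch only: express a base-relative hand pose in head-local coordinates.
import numpy as np
from scipy.spatial.transform import Rotation as R

def to_head_space(head_pos, head_quat, hand_pos, hand_quat):
    """All inputs are in the tracking base's frame; quaternions are [x, y, z, w].
    Returns the hand pose expressed in the head's local frame."""
    head_rot = R.from_quat(head_quat)
    # Hand position relative to the head, rotated into head-local axes.
    rel_pos = head_rot.inv().apply(np.asarray(hand_pos) - np.asarray(head_pos))
    # Hand orientation relative to the head.
    rel_rot = head_rot.inv() * R.from_quat(hand_quat)
    return rel_pos, rel_rot.as_quat()

# Example: head 1.7 m up in the base frame, hand half a metre in front of it.
pos, quat = to_head_space([0, 1.7, 0], [0, 0, 0, 1], [0.2, 1.4, -0.5], [0, 0, 0, 1])
print(pos, quat)
```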

Sixense also had their STEM controllers in the Tactical Haptics demo, and it made all the difference in the world to have a physical representation of the weapon in your real hands tracked one-to-one with the weapon you see in VR.

After this interview was recorded, Sixense announced to their Kickstarter backers that the STEM had failed FCC testing. They said:

The reason the Base is failing is specifically due to our design of housing the five radio frequency (RF) dongles inside the Base. The RF dongles require grounding, but this grounding interferes with the electromagnetics (EM) of the tracking.

To address this issue we redesigned the Base electronics to keep the RF dongles located internally but not conflicting with the EM. This will require the production of new Base PCBs and further testing to ensure everything is working properly.

This will push delivery of the STEM back to at least July, and if they fail again, then they're looking at September to start shipping their units.

The STEM controllers have the advantage of working without requiring exact line-of-sight, but there are other potential issues with EM drift or interference from other electronics. For a more in-depth discussion of some of these potential issues, I highly recommend listening to this interview that I did with Oliver "Doc_Ok" Kreylos at Oculus Connect.

And for more background on Sixense, here's an interview I did with founder and CEO Amir Rubin back at SVVRCon.

The STEM controllers are something to keep an eye on, especially since Danny mentions in this podcast that they're adding Gear VR support. If the price point can come down, they'll be a valuable piece of the VR input puzzle, because having your hands in the game with a physical controller and physical buttons opens up all sorts of applications that can create an incredible sense of immersion within virtual reality environments.

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.080] Danny Woodall: My name's Danny Woodall. I'm the creative director at Sixense, and we're working on creating full-body presence inside of VR.

[00:00:18.972] Kent Bye: OK, great. So yeah, tell me a bit more about the input controllers and what you're able to do with those.

[00:00:24.095] Danny Woodall: Sure. We use an electromagnetic tracking solution. So we have a base that emits a low-power electromagnetic field, and there are coils in all of our tracked devices that can sense where they are within that field. So we get really accurate position relative to the base and very low latency. And we've been using it now to provide full-body presence for VR. But what you can actually do with the system is limitless. We have people doing rehab. We have people doing simulation, training. But for VR, it's amazing because we put a tracker on your head, and then we know where your hands are relative to your head. So you get this perfect hand-eye coordination. It really opens up what you can do inside of VR. Instead of viewing VR, you're actually in there interacting with VR. So it's pretty amazing.

[00:01:06.107] Kent Bye: Nice. And so I guess the last time I talked to Amir was back at SVVRCon in May of 2014. So maybe you could fill me in a little bit on what's happened since then over the last year.

[00:01:16.315] Danny Woodall: Sure. Well, we've been working hard to get the hardware ready to get out to all of our Kickstarter backers and the rest of the community that has bought developer kits. Of course, along with that comes lots of software development so that developers have something to use when it comes out. So we've been working hard on our core API SDK, making sure that it's cross-platform. We support Mac, Linux, Windows, Android, and it's all using basically the same driver for everything. So now we have an Android SDK. So Gear VR, we've been working on Gear. And we have our lightsaber demo now running on Gear. We have our shooting gallery running on Gear. We have a network bridge that we've added to our SDK so that people can start actively developing for Gear VR. And then, of course, tons of work on the VR SDK and integration both in Unity, and now we're showing here at the show Unreal Engine 4.
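The interview doesn't describe how the network bridge works, but the general shape of such a bridge is easy to sketch: the PC attached to the STEM Base streams pose samples over the local network to the phone running the Gear VR app. Everything below, including the port, packet layout, and update rate, is an assumption for illustration, not Sixense's protocol.

```python
# Hypothetical sketch of a tracking "network bridge" sender running on the PC.
import socket
import struct
import time

PHONE_ADDR = ("192.168.1.50", 49000)   # assumed address of the Gear VR device
PACKET = struct.Struct("<Id3f4f")      # device id, timestamp, position xyz, quaternion xyzw

def send_pose(sock, device_id, t, pos, quat):
    """Pack one pose sample and send it to the phone over UDP."""
    sock.sendto(PACKET.pack(device_id, t, *pos, *quat), PHONE_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(600):                               # roughly five seconds of samples
        # In a real bridge these poses would come from the tracker SDK on the PC.
        send_pose(sock, 0, time.time(), (0.0, 1.7, 0.0), (0.0, 0.0, 0.0, 1.0))
        time.sleep(1 / 120)                            # stream at roughly the tracker's update rate
```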

[00:02:03.693] Kent Bye: I see. And so what type of things can you do with the Sixense SDK for the STEM controllers?

[00:02:08.036] Danny Woodall: Sure. So like I mentioned, we have two that we're working on. One is just the core API. So you as a developer, maybe you're working in your own native application, and you just want to get the data yourself and do whatever you want with it. That type of plug-in from Sixense has been around since the Hydra days, and it's going to be very familiar for anyone that's used it already. The VR SDK, at its core, is a C API. So it's, again, very cross-platform, but we take the animation pose of the character, and then we feed it all the tracked device data and use a custom IK solution to spit back a very plausible, relaxed pose to the game engine. So you can influence the pose based on animations, so, you know, you might change the idle pose from standing feet together to maybe like a fencing stance if you're doing some sort of sword kind of simulator or something. And we can add more tracked devices to the avatar, like you could add two to the feet. So now you could be walking around and look down and see your feet. So again, we've been kind of focusing on VR because that's where this kind of technology is really shining. When you look down and you see the one-to-one kind of correlation where your virtual and physical hand line up, that opens up these amazing possibilities where I can ask you to grab something in VR and you aren't playing this input game trying to find it. You just reach out and grab it and pick it up.
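Sixense's solver is their own, but you can get a rough feel for what an arm solver has to do from a standard two-bone IK step: given the shoulder and the tracked wrist, pick an elbow bend that spans the distance. A minimal sketch follows; the bone lengths and helper name are illustrative, not values or functions from the SDK.

```python
# Illustrative two-bone IK: find the elbow bend that reaches a tracked wrist target.
import numpy as np

def two_bone_ik(shoulder, wrist_target, upper_len=0.30, fore_len=0.28):
    """Return the elbow bend angle (radians) needed to span shoulder -> wrist."""
    d = np.linalg.norm(np.asarray(wrist_target) - np.asarray(shoulder))
    # Clamp to the reachable range so the arm neither folds through itself nor overextends.
    d = np.clip(d, abs(upper_len - fore_len) + 1e-6, upper_len + fore_len - 1e-6)
    # Law of cosines gives the interior angle at the elbow.
    cos_elbow = (upper_len**2 + fore_len**2 - d**2) / (2 * upper_len * fore_len)
    return np.pi - np.arccos(np.clip(cos_elbow, -1.0, 1.0))

# Hand close to the shoulder -> sharply bent elbow; hand far away -> nearly straight.
print(two_bone_ik([0, 1.4, 0], [0.1, 1.3, -0.2]))
print(two_bone_ik([0, 1.4, 0], [0.0, 1.4, -0.55]))
```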

[00:03:18.133] Kent Bye: Yeah, I've got to say, back at SVVRCon, doing the gun demo was probably where I had the most sense of presence, just by having the hands move around. It really fooled me, in the sense that I had the STEM controllers in my hand, but in the game I didn't have anything in my hands, and so I was just like, oh, I just took off my HMD and yet there was something in my hand. It kind of tricked me enough. And so I think there's something about having both the place illusion and the plausibility illusion, where you know you're in a place and you're in the world, but the plausibility is the fact that you can actually interact and engage and have an impact on the world. So yeah, maybe you could comment on your own experiences of actually having agency within VR.

[00:03:58.853] Danny Woodall: Yeah, I mean, I think just being able to reach out and touch things in a very intuitive, natural way is huge. But like I said, the trackers themselves can be used for anything. So we worked with a company called Striker VR, or actually that was their product, Striker VR. But they do realistic weapon simulation recoil for the military, and I think they can get within 15% or 20% of the actual weapon recoil. So they came to us because they didn't have tracking, but they had this really cool recoil system. So we just took one of our STEM Packs that we use for the head for the position and orientation tracking, and we just put it on the gun itself, their physical gun. And then instead of you picking up two controllers, you're actually holding this real physical weapon. And that was really powerful. You could almost convince yourself that you're really holding a weapon, because it was physically modeled to be exactly the same proportions as the weapon itself. So you can hold it near your face and look at all the little parts of the weapon, and if you were to take the headset off, you'd be looking at just that part. So that means you could actually touch it and feel it. That was pretty amazing. You know, we have rumble motors, so we can do a lot of haptic kind of touch. And one of the cool things that we've done with the lightsabers is, when you scorch the floor, there's a rumble that happens. And we've been talking about how we should keep going with that, so that when you move your hand a little bit faster through it and scorch a little bit more, you can change the frequency of the rumble motor to make it feel like you're really kind of cutting through something. So having haptics come into play is going to add a lot as well, I think.
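The rumble idea Danny describes, scaling the buzz with how fast the blade sweeps through the floor, boils down to a simple mapping from hand speed to frequency and amplitude. Here's a minimal sketch; the specific numbers are guesses for illustration, not values from the lightsaber demo.

```python
# Illustrative mapping from hand speed to rumble frequency and amplitude.
def scorch_rumble(hand_speed_m_s, min_hz=40.0, max_hz=180.0, full_speed=2.0):
    """Map hand speed (m/s) to a rumble frequency (Hz) and a normalized amplitude."""
    t = max(0.0, min(hand_speed_m_s / full_speed, 1.0))   # normalize speed to 0..1
    frequency_hz = min_hz + t * (max_hz - min_hz)
    amplitude = 0.2 + 0.6 * t                              # keep it subtle even at full speed
    return frequency_hz, amplitude

for speed in (0.1, 0.8, 2.5):
    print(speed, scorch_rumble(speed))
```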

[00:05:20.288] Kent Bye: Yeah, I think that was probably a very subtle thing that, to my subconscious mind, was tricking it even more, that when there were lasers being shot at me and I was blocking them with the lightsaber, you know, to feel that haptic feedback, I think, can do a lot, even if it's very simple, a very simple rumble.

[00:05:36.773] Danny Woodall: I think that's kind of key, too, the subtleness of the haptics. You know, you don't want to overpower, you don't want to be buzzed by a haptic motor, a rumble motor. And I think, you know, they've kind of gotten a bad rap for that kind of reason. Like somebody's like, oh, we need some rumble for this weapon shot or something, so you kind of get buzzed by it. But I found you really just kind of dial it back, you know, and you get it so that you feel it, but it's not so powerful that you notice it, almost, right? Like, just like you said, you can block a few bolts or something and then you're like, wow, this feels really good. And you almost didn't even notice that it was happening, but it was just enough to make it feel right. You know, and the same thing with 3D audio, right? You put on 3D audio in VR and you don't really notice, because it's working correctly. It's just how audio should sound. But when you try one demo with it and then one demo without it, it's a glaring difference between the two, right? So.

[00:06:21.318] Kent Bye: So yeah, I just talked to Yuval of OSVR about this kind of middleware platform, to be able to have one SDK and have all the different third parties feed into that. Now, since you are developing your own VR SDK and your own SDK for all the different platforms, maybe you could talk about what you can get in OSVR's integration for Sixense and what you may need to go to your own custom SDKs in order to do.

[00:06:45.415] Danny Woodall: Sure, so our SDK, the VR SDK specifically, is written so that it's device agnostic. So we could work with any tracked device that can provide position and orientation. I think that's key, because right now there are all these players coming into the field trying to find their spot, and a lot of input systems are trying to come and fill this void that we have in VR where there isn't input yet. And so as a developer, it's, you know, which one do you choose? So being able to have a common SDK that can provide some sort of unified API you can go to is a great, great idea, especially in these early days of VR. You know, being able to support as many platforms, if you will, or third-party devices out there is going to be big for both the developer and for the end user, because now you can develop for multiple hardware and people at home can see it. And of course, that will then kind of show people which is the better way to go, I think, because there are going to be a lot more people that comment on how great this is and how the tracked device that they chose is maybe a little inferior. So I think it's going to drive us all to finding the right input, finding the right solution for all this. So it's a good thing.
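The abstraction Danny is describing, where the application codes against one pose interface and any tracker that reports position and orientation plugs in behind it, can be sketched in a few lines. The names below are illustrative only; they are not OSVR's or Sixense's actual API.

```python
# Toy device-agnostic tracking layer: any backend that can report a pose plugs in.
from dataclasses import dataclass
from typing import Protocol, Tuple

@dataclass
class Pose:
    position: Tuple[float, float, float]             # metres, in the tracker's frame
    orientation: Tuple[float, float, float, float]   # quaternion x, y, z, w

class TrackedDevice(Protocol):
    def get_pose(self) -> Pose: ...

class FakeStemController:
    """Stand-in for a real driver; a Sixense, Vive, or other backend would go here."""
    def get_pose(self) -> Pose:
        return Pose((0.2, 1.4, -0.5), (0.0, 0.0, 0.0, 1.0))

def update_avatar_hand(device: TrackedDevice) -> None:
    # The application only ever sees the common Pose interface.
    pose = device.get_pose()
    print("hand at", pose.position)

update_avatar_hand(FakeStemController())
```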

[00:07:52.079] Kent Bye: Yeah, I totally agree about the abstraction layer being able to take any input device you want to throw at it and figure it out. I guess the question I'm asking is whether, for every single function that you have in your native Sixense SDKs, there's a one-to-one correspondence to all the features that are also implemented in OSVR.

[00:08:13.190] Danny Woodall: So it will be basically the same kind of feature set, where, again, it's better for us to have the same developing environment for the developer, whether they're working just with our SDK or they're working with our API in some other abstracted layer, right? So it's going to be very much the same working on any other platform dealing with our controller data. We'll expose it so that it's familiar, it's the same API. So yeah, there shouldn't be any reason to change it at this point. I don't see it anyway.

[00:08:43.577] Kent Bye: And Amir had mentioned that there's a new experience, maybe an educational experience, that you guys have been working on as well. Maybe you could tell me a bit about that.

[00:08:52.441] Danny Woodall: Well, we've done a few lately. Probably the one Amir was talking about was a retail platform that we've kind of shown what it could be like. And it's very compelling to be inside of VR and have these kind of endless rows and columns of items that you could have put on shelves, and then to actually reach out and grab one and look at it. And then, you know, maybe it's a toy and you can set it on the ground, and maybe it's a robot toy or something, and you can drive it around or whatever. Or maybe it's some kind of UI interface for maybe a phone or a camera or something, and you can actually kind of try these interfaces in the toy itself before you buy it. And it gives you a better affirmation that what you're looking at is really what you want, right? And so you can feel better that this is the right product you're purchasing, instead of, hey, this looks good, I'm going to order it, and if it's no good, I'm going to return it. That's so easy to do now, like with Amazon, you buy something and you can just return it very easily. I think we could help out in that whole area.

[00:09:42.840] Kent Bye: And what are some of the other experiences that you've seen that you find really compelling that integrate hand tracking?

[00:09:48.903] Danny Woodall: Again, like I was mentioning earlier, there's this bunch of simulation stuff. This company was doing a welding simulator, and they can use the data from the actual controllers to figure out how good of a weld this user would be able to do, and they can do this out of harm's way. They're not using the real torch, right? We have people doing rehab, so you're using muscles with the controllers to recover from some kind of stroke that you've had or something. And we have people doing training. Even the shooting gallery, like we talked about, that's, you know, people could be learning how to shoot guns in kind of a safe environment. We have other people that are very interested in gaming and just, you know, interacting inside of VR. So, you know, the kinds of possibilities of what you could do with this stuff are pretty endless. But one of the cool things that we did recently was like a golf kind of simulator, leaning over and putting into, you know, holes. I mean, it feels really pretty amazing. And because the tracking is so accurate, it's something you could improve your game on at home.

[00:10:46.694] Kent Bye: Nice. And finally, being here at GDC this year, it seems like virtual reality is really kind of crossing the chasm into the mainstream, or at least going to the next level. And I'm curious about your own reaction to being here and where you see VR going from here.

[00:11:01.878] Danny Woodall: Sure. I mean, I think obviously Valve is kind of one of the big highlights here. Coming out with a system like they have really puts VR into the homes of everyone, potentially sooner than maybe expected. And so that's a big thing. You know, there are other players here that are doing cool stuff. Oculus is doing great stuff, of course. Their demos are amazing as well. I think the mobile space is interesting. I haven't seen a lot of mobile stuff here, but Samsung is here, and I think that's a viable kickstart for people to get into VR on a mobile platform, since they already own a device, or, you know, it's a little bit cheaper to kind of get into that realm. There's, like I mentioned earlier, Striker VR. They have some cool haptics stuff over there. And same with Tactical Haptics. They have some really interesting stuff with haptics. So I think getting more senses into VR is important. 3D audio is getting big, right? So, again, we did an implementation that we weren't quite ready to show here, but with the lightsaber demo, we had 3D audio. And there was a point there where I was playing with the droid and getting all the sounds tuned just right. And then, for whatever reason, I decided to shut my eyes and try to play, just like in the movie. And I could do it. It was amazing. There's a barrage mode where the droid goes away kind of far and shoots a bunch of bolts at you, but I turned that feature off, so he just kind of stayed around me and shot at me. And I could feel where he was. I could hear him. And I would just follow him around. And eventually, I took off the headset and peeked, and he was right there. And I was blocking the bolts. I could tell, because the flashes were still bright enough in the screen to go through, and I could see it with my eyes closed. That was pretty magical, you know. So we've been joking around about adding that as a feature for someone who's doing really well, like having the blast shield come down like in the movie, and still being able to kind of peek out the bottom of it so you can see your feet and stuff, and then try to play. So with all these technologies coming together, you know, once we stop trying to work in isolation and we all start coming together, I think that's what we're going to see in this next year in VR: everyone starting to collaborate a little bit more to define what this VR system should be. It's going to be amazing.

[00:12:58.648] Kent Bye: Is there anything in VR that you're really looking forward to experiencing?

[00:13:01.389] Danny Woodall: I mean, there was this one moment, and I don't know how we're ever going to do this in the near term, but there was this one moment in development where I actually gently brushed a plant in VR and actually hit my plant in my home that was right there. And that was shocking, almost. I thought I had somehow wired that up in the game for half a second, and then I took the thing off, and it set off, and I was like, wow, that was so amazing. Adding something like that, where you can feel things, that was amazing. Again, it's another sense, right? So all the senses: first we did sight, right, getting the headset working and stuff, and now we're doing touch input and 3D audio. Getting those all just right is going to be pretty amazing. Maybe people aren't going to want to come out of VR, which is a little scary, but when VR becomes cooler than real life, that's a very interesting, but also very scary, kind of future. So I don't know. If I had to pick one thing I'm actually looking forward to, I would say multi-user VR. Right now everything's been fairly isolated and nobody has avatars representing who they are. In the few demos that we've tried where we put two players in the same scene, you're looking down at yourself and the other guy's watching you do that, and it's clearly a human movement, not scripted in any way. And once the eyes start tracking inside of the headsets and we can make eye contact and we have this real social element that works in VR, it's going to be super powerful and super amazing to not only develop for, but just experience, I think. These guys are doing a really cool dancing kind of simulator where they would invite people in, and no one had avatars or anything, just kind of little blocky stuff. But again, it's kind of mesmerizing when you're inside a space with multiple people. It's pretty cool. Awesome. Well, thanks so much. Yeah, thank you.
