#434: Tracking Your Hands using Flex Sensor Technology with Manus VR

I had a chance to try out the Manus VR hand-tracked controller on the expo floor of GDC this year, and saw that there are a couple of really strong use cases for having your hands and fingers tracked in VR. You can be a lot more expressive within social VR, and in mixed reality experiences where passive haptic feedback is available, having your hands tracked can actually increase the level of embodied presence.

I had a chance to catch up with the lead designer of Manus VR, Stijn Stumpel, at GDC where we compared Manus VR to Leap Motion, talked about how the flex sensors work, the use cases where having tracked hands makes sense, their extremely polished demo called Pillow’s Willow, and where they’re going in the future.

LISTEN TO THE VOICES OF VR PODCAST

At GDC, Manus VR strapped an HTC Vive controller to the back of my wrist, and it gave a lot more consistent tracking of the location of my hands as a result. I didn’t have to worry about keeping my hands within my field of view like I do with optically tracked solutions like Leap Motion. There was some uncanniness in not being able to actually physically grab objects, which can break presence. And I also experienced a lot more than 20ms of latency in my finger movements, which was a presence breaker. But I was told that they are able to achieve much better latency performance in their lab environment.

Manus VR just announced in a press release that their “gloves are being used in experiments to train NASA astronauts in mixed reality to prepare them for the International Space Station.” Here’s some footage of some of that training that they’ve released.

They also announced that Manus VR is joining the first SteamVR Tracking class being taught by Synapse on September 12th in order to create a version of their glove that has the SteamVR Tracking sensors built in. So I expect to see the next iteration remove the stopgap solution of attaching a SteamVR controller onto the back of your arm. With the increased amount of tracking on the arm, they might also start to be able to do a lot more accurate inverse kinematic tracking of your body and have a powerful invocation of the virtual body ownership illusion.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye and welcome to The Voices of VR Podcast. So back in the early days of VR, I think that there were a lot of people who had this conceptualization that they wanted to have their hands in VR and that was going to give them a lot more presence. And so since then there's been a lot of motion track controllers and it tends to be that in most cases you can just grab something and have a button and it feels pretty good. But there's still some cases whether you're in social VR or in mixed reality situations where you still kind of really want to have your full hands within the experience and it can just give you an extra level of presence. And so today I'll be talking with Manus VR, which I had a chance to try out at GDC this year in the spring. And so we'll be talking about some of their technology, which uses flex sensors rather than any kind of optically tracked solution like Leap Motion or some other IMU-based solutions like Control VR. So I'll be talking to the lead designer today, Stijn Stumpel, about ManusVR and some of their design process and where they're going in the future on today's episode of the Voices of VR podcast. But first, a quick word from our sponsors. This is a paid sponsored ad by the Intel Core i7 processor. If you're going to be playing the best VR experiences, then you're going to need a high-end PC. So Intel asked me to talk about my process for why I decided to go with the Intel Core i7 processor. I figured that the computational resources needed for VR are only going to get bigger. I researched online, compared CPU benchmark scores, and read reviews over at Amazon and Newegg. What I found is that the i7 is the best of what's out there today. So future-proof your VR PC and go with the Intel Core i7 processor. Today's episode is also brought to you by VR on the Lot. VR on the Lot is an education summit from the VR Society happening at Paramount Studios October 13th and 14th. 
More than 1,000 creators from Hollywood studios and over 40 VR companies will be sharing immersive storytelling best practices and industry analytics, as well as a VR expo with the latest world premiere VR demos. This is going to be the can't miss networking event of the year with exclusive access to thought leaders of immersive entertainment. So purchase your tickets today while early bird pricing is still in effect at VROnTheLot.com. So this interview with Stijn happened at the Game Developers Conference that was happening in San Francisco from March 14th to 18th. So with that, let's go ahead and dive right in.

[00:02:44.939] Stijn Stumpel: I'm Stijn Stumpel, and I'm the lead designer for ManusVR. We make the first consumer virtual reality glove. It's a glove that puts your hands inside of the virtual world.

[00:02:56.928] Kent Bye: Great, so I just had a chance to try it out here at GDC, and the first thing I'll say is that I was seeing some latency issues in terms of, like, the finger wasn't responding. But the interactions, I think the thing that's unique is that I'm able to track my hands with the Vive controller, so with the Lighthouse. So I don't have issues that I may have with, say, Leap Motion, where if I'm not looking at my hands I can kind of have them out of the range — here they're just strongly tracked wherever they are. And there's some moments where I just really got lost in terms of just having my hands present in the scene, which is really nice, to be able to just kind of grab at things and then use the natural motion of my open-hand grab and just kind of feel like I was able to manipulate the virtual world by grabbing with my hands in that way. From your perspective, what were you trying to do in terms of creating this hand presence?

[00:03:45.031] Stijn Stumpel: Well, we are all about virtual reality as immersion, and your hands are obviously a very important way that you interact with the world. There's direct controllers now, which weren't there when we started the company, and they're really good for playing video games at the moment. They're backwards compatible, they have a familiar form factor. But if we're trying to simulate the real world, which is what virtual reality is doing, then you kind of want your hands there. Because you can imagine going through your regular life holding HTC Vive controllers, trying to make an omelette wearing sticks. That doesn't really work, because there's a lot of different ways of gripping objects. and we start developing our hands from the moment that we're born and then we go into the virtual world and we get rid of them and we don't want that. We want you to keep a hold of all of the dexterity that you have developed throughout your life.

[00:04:38.313] Kent Bye: So maybe you could describe to me why not Leap Motion and why Manus? What's the difference and what does your approach give?

[00:04:44.798] Stijn Stumpel: Yeah, so we get that question a lot about Leap Motion, and it's obviously a very low-threshold solution. It's just a camera and there's no gloves to wear. But it has a lot of limitations. So Leap Motion has a very small viewing angle. You have to look at your hands for the computer to know where your hands are. And for a lot of ways, you interact with the world, you're not looking at your hands while you're doing things. So if I throw something, I have to extend my arm behind my head. And if the computer doesn't know what my hand is doing at that point, I cannot throw like a grenade in a shooting game or things like that. And a fun experiment you can do is you can go through your life, try to go through a day by first looking at your hands before you touch something with them. And it becomes very tedious. So you want to have your hands everywhere. The other advantage is that Leap Motion uses computer vision for detecting what the hand is doing, which is prone to error. It has to infer from an image what the hand is doing, and because we have physical sensors around the hand, we have specific data points that correlate to a specific joint on the hand that give us a number of degrees, and that also makes the amount of data that we send to the computer much smaller. So, in theory, it should make the latency a lot lower. You are having some issues in your demo. We'll definitely look into that. But we can have, with our custom wireless protocol, we can get the latency down to way below 20 milliseconds using our glove.

[00:06:09.578] Kent Bye: And so, maybe you could describe to me, like, why not IMUs? What is the downfalls of IMUs and what's the advantage of the approach that you're taking?

[00:06:17.323] Stijn Stumpel: So IMUs, they require calibration, they handle more data, so they're more computationally intensive than what we're using, and they have many more components. So we can measure all of the fingers with a custom sensor that's a single manufactured component, which allows us to make the glove for a price that consumers actually want to pay for it. IMUs do have their advantages, but the flex sensors are very consistent in what kind of data they output and they are very accurate. They have an accuracy of one degree in each direction, and that's an appropriate degree for us. We don't notice inaccuracies there.

[00:06:55.680] Kent Bye: You had mentioned that the fabric that you're using was able to detect up to two joints, so you just have a single piece of this specific sensor that is on the sides of your finger, and that is able to detect whether both your knuckle and the other joint in your finger are being bent.

[00:07:10.927] Stijn Stumpel: Yeah, it's a flexible material, a plastic material that has a carbon and silver ink material inside of it that changes its resistance on how much it's bending. And we can divide that surface area into different segments and position those over the joints. So we can decide what parts of the hand we want to measure movement in. And because it's one piece, it's a very sturdy, reliable way to measure fingers.
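As an aside for developers, the sensing model Stijn describes — one resistive strip divided into segments over the joints, with resistance changing as each segment bends — can be sketched in a few lines. This is purely an illustrative sketch, not Manus VR's actual firmware or calibration: the resistance range, the linear response, and the function names are all assumptions.

```python
# Illustrative sketch of resistive flex sensing: a sensor strip's
# resistance rises as it bends, and one strip is divided into segments
# positioned over the joints. The resistance range and linear mapping
# below are assumptions, not Manus VR's actual calibration.

def resistance_to_angle(r_ohms, r_flat=25000.0, r_bent=100000.0, max_angle=90.0):
    """Linearly map a sensor resistance (ohms) to a bend angle (degrees).

    r_flat: assumed resistance with the segment straight
    r_bent: assumed resistance at full flexion
    """
    t = (r_ohms - r_flat) / (r_bent - r_flat)
    t = min(max(t, 0.0), 1.0)  # clamp readings to the calibrated range
    return t * max_angle

def finger_pose(segment_resistances):
    """One reading per segment (e.g. knuckle and middle joint) yields
    a small list of per-joint bend angles for one finger."""
    return [resistance_to_angle(r) for r in segment_resistances]

# A straight finger vs. a half-bent knuckle with a fully bent middle joint.
print(finger_pose([25000.0, 25000.0]))   # → [0.0, 0.0]
print(finger_pose([62500.0, 100000.0]))  # → [45.0, 90.0]
```

Because each sensor segment maps directly to one joint, the glove only has to send a handful of angle values per hand, which is the small-data-size advantage Stijn contrasts with camera-based hand reconstruction.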

[00:07:41.859] Kent Bye: Well, first of all, I've got to say that the demo that you have is absolutely beautiful. And you're able to have an amazing diorama scene that you're interacting with. And so for you, what do you see as the sweet spot of the use cases that you feel like, using your technology, that you're able to do interactions that you wouldn't be able to do without your hand presence?

[00:08:02.132] Stijn Stumpel: Right, so there's a whole lot of things that we can do with our glove. Even outside of VR, we can control robots, we can fly drones. But if we're thinking about virtual reality, there's a lot of things that are good to do with a controller, like shooting a gun, because you're already holding an object. But basically for every single thing that you do with the world where you're not continually holding something like a sword or a gun, a glove would be preferable. In this game, what we've tried to do is — because a lot of people still find virtual reality a bit intimidating, and you put people into an experience and you give them cues on what to do, but they kind of ignore them and just look around and kind of experience the space — we've tried to make a game that's very calm, and you can just kind of play around with stuff a little bit and look around, and there's no real hurry to do things. And I think that's the most interesting thing to do now, is just to kind of pick up an object and see what the affordances of it are and how it reacts. Just like a baby does with objects. Like you put a baby in a room and it starts picking up things and kind of playing with them. And that's the base level at the moment. We have a lot of ideas on playing basketball and making creative building games, all sorts of interesting applications. But we are also counting on the developer community to come up with those types of ideas. We obviously have some stuff that we really wanted to try and we put some of them in this demo. But there's a lot of creativity out there that we want to leverage. That's also why we're here at the Game Developers Conference.

[00:09:39.938] Kent Bye: Yeah, another thing that you have is in order to get really specifically tracked hands, you are strapping HTC Vive controllers on the side of your arms. And so do you imagine that would be kind of like the preferred method of getting positional tracking for your gloves?

[00:09:55.983] Stijn Stumpel: Well, it's definitely not the preferred method. It's an intermediate solution. So, we just got the controllers in, so we 3D printed a thing that straps it onto the arm, because it's a very good positional tracking technology, HTC. And it'll work great for developers. It works, but it's a lot of hassle, and there's a lot of hardware attached to the thing that you don't need once you strap it onto your arm, like the trackpad. We only really need the far end of the controller, because that's where all of the diodes are that track the position. So, in the future, when we start rolling out to consumers, consumers don't really differentiate between tracking the pose of a hand or tracking the position of a hand. They just want to put something on that works and that's integrated into a single object that they don't have to calibrate and put their hands in. So the end solution that we're working towards is one where basically you can imagine that the tip of the Vive controller goes around the wrist, gets attached to the glove and it becomes a single unit that you can put on and it stretch your hands completely.

[00:10:57.117] Kent Bye: Yeah, I think that would ergonomically make a lot more sense, because wearing that solution for a long time would get pretty fatiguing; it's just kind of awkward to have all that weight on your arm. So I imagine that the final solution will have some way of getting that information back to the computer, right? Because the thing with the HTC Vive controllers is that they have to send that information back through the headset, I imagine.

[00:11:19.344] Stijn Stumpel: Yeah, so the positional information goes to the game engine and the pose information goes to the game engine and there it gets combined. So the glove generates a skeletal model inside of the game and the Vive positions the skeletal model somewhere in the scene at the proper place where your hands are at that moment.
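The split Stijn describes — the Vive supplying where the hand is, the glove supplying what the fingers are doing, and the engine composing the two — can be illustrated with a tiny forward-kinematics sketch. This is a hedged, simplified 2D example with hypothetical names, not ManusVR's SDK: a real engine would use full 3D transforms and the wrist's orientation as well as its position.

```python
import math

# Illustrative sketch of combining two data streams: a tracked controller
# gives the wrist's position in the room, the glove gives joint angles,
# and the engine places the finger skeleton at the wrist. A flat 2D
# finger chain is used for simplicity.

def finger_joint_positions(wrist_xy, joint_angles_deg, bone_lengths):
    """Forward kinematics for one finger: accumulate the glove's joint
    angles along the bone chain, starting from the tracked wrist."""
    x, y = wrist_xy
    angle = 0.0
    points = []
    for theta, length in zip(joint_angles_deg, bone_lengths):
        angle += math.radians(theta)  # each joint bends relative to the last
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        points.append((round(x, 3), round(y, 3)))
    return points

# Wrist tracked at (1.0, 1.5) in room space; glove reports a straight
# finger (0 deg, 0 deg) with two 4 cm bones.
print(finger_joint_positions((1.0, 1.5), [0.0, 0.0], [0.04, 0.04]))
# → [(1.04, 1.5), (1.08, 1.5)]
```

The point of the sketch is that neither stream alone is enough: the controller knows nothing about the fingers, and the glove's angles are only relative to the wrist, so the scene-space hand exists only after they're combined.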

[00:11:39.122] Kent Bye: So one of the other features that are on these gloves is that you're doing a little bit of haptic feedback, like a little rumble, whenever you are triggering specific grabs, to give you some type of feedback. I have a little bit of mixed feelings with that, just because when I'm grabbing something in real life I hardly ever have something buzzing on the back of my hand. And there's something with haptic feedback where sometimes, if it's low fidelity, it's actually better than trying to perhaps stimulate the fingertips or something like that. But I'm just curious about some of your experiments, where you kind of feel like it really works, or where it may feel too artificial and presence-breaking.

[00:12:15.372] Stijn Stumpel: Yeah, haptic feedback is a very interesting topic. There are a lot of different ways of doing haptic feedback that we've investigated. It's really, really difficult to do very well fleshed out haptic feedback in a consumer product. Because our main focus is to make a glove that we can sell for a certain price and that we can manufacture and that'll keep working day after day. And every single component that we try to integrate into the fabric, that just makes it a lot more difficult to manufacture. So now we have chosen for basically the most minimum viable thing that we can do, and that's a rumbler on top of the hand. And it's not really an immersive haptic feedback, but it's very functional. You want to know when you go towards an object and close your hand, when you can pull back your hand and take the object with you, when you've actually held on to it. And a small vibration can be a very good way of directly, at the right hand, indicate where you're going. But you can do some more creative, immersive things with it as well. You can imagine playing a racing game where you're driving on the road and you suddenly go off-road and your hands start rumbling. Well, that sort of happens when you're holding a steering wheel as well. So you can do immersive things, shoot a gun, have it rumble really heavily for a short period of time. But most of it is functional, haptic feedback.

[00:13:35.563] Kent Bye: Yeah. There's an interesting thing with some of these HTC Vive games, like Job Simulator, are things where you have a sense of hand presence and you're grabbing things. And in some ways you did this same thing of what Alex Schwartz would call tomato presence, which is when you grab onto the objects, your hands actually disappear so that your presence of your hand transfers to the object. It's interesting because in Job Simulator, you just have the trigger and when I'm in Job Simulator and playing it enough, I kind of forget that I'm holding the controllers at all. And so, I guess that kind of begs the question of where is the real sweet spot of actually having your full hands presence and being able to do things that you can't just do with holding a Vive controller with a trigger.

[00:14:15.545] Stijn Stumpel: Well, this is a very early demo, and we went for the tomato presence solution because it's a safe way of making it work and having it be properly playable. Another way of doing it is by keeping the hand rendered while you're holding the object, but it'll collide with the object and it'll go through it. Our hands are rendered as ghost hands, which if you render your hand as a ghost instead of a solid hand, the user will accept most of the time that his hand is clipping through an object. If you're rendering a solid hand, which is even more difficult, you need to do some physics interaction between the hand model and the object, and you need to track the way in which the user grips the object, because there's a lot of ways of holding a single object, and if it doesn't correspond with what the real world is doing, then it'll break presence even more. So the general rule is, if you don't have accurate information on what something is doing, then don't try to approximate or display it at all. It's better to get rid of it completely. So the same goes for the lower arms. If you don't know exactly where the lower arms are, it's best to just have floating hands, because if the lower arms are in the wrong position, it's even more deteriorating for the experience. So that's something that we, and I think the whole VR community at this point, are still doing research in — what the optimal place is there. But we've done some interesting experiments so far that we might demo in the future.

[00:15:41.815] Kent Bye: Cool, and so what's next for ManusVR? I mean, you have some dev kits that are available, maybe you could talk a bit about that and then going forward.

[00:15:50.102] Stijn Stumpel: Well, we are opening up the developer kits for pre-order reservation at this moment. We are not accepting payments, we will do that when we are ready to ship. Yeah, that will include a pair of gloves for 250 euros, an open source SDK that supports Unity and Unreal and all of the different game engines. We are very much focused on becoming compatible with the big HMD systems. So we now have a demo that works with the HTC Vive. Our goal is to have one that works with the Oculus, with the PlayStation VR, with mobile VR, so that we are the universally compatible go-to data glove for people that want accurate and reliable hand tracking in virtual reality and that want to have a bit more than just a game controller for more presence experiences and stuff like that. We are also partnering with Pillow's Willow Studios for this demo to create more direct partnerships between us and game developers. Because usually when you put out a developer kit you just send them to a lot of developers and they make content. But we try to be very close to the developer because a lot of the work they do informs the design decisions that we make on the hardware. And in the next year, once more of the developer kits start rolling out, we will be gathering more feedback and tweaking the design and the configuration of the sensors to where we have the optimal solution for a virtual reality glove.

[00:17:20.503] Kent Bye: And so what do you want to do in VR with your hands?

[00:17:24.125] Stijn Stumpel: What I want to do? Anything that I'm doing now with my hands. I played the drawing game of the HTC Vive. I forget what it's called. It's... Tilt Brush? Yeah, Tilt Brush. And that kind of blew me away. I'm a designer and product designer and a graphic designer. And I think there are a lot of opportunities for designers. Virtual Reality is a tool for designers. So drawing in three dimensions, modeling in three dimensions, collaborating in work environments. So I'm thinking a lot about creative applications. I'm also not really big of a hardcore gamer, but I think virtual reality outside the gaming market has a very big potential as well. I can go to any location in the world, I can look at any artwork in the world, at its proper scale, in its proper lighting, and those are the type of solutions that I'm thinking about a lot. And of course, a big thing that I've been telling every single developer, if there's a developer listening out there that wants to make this, please call me and we'll get something together. I want to be sitting in a cockpit, flying above planets, be able to land on the planet and press buttons in my cockpit with my hands that do different things and turn knobs. And I just want to have a lot of buttons in front of me and a panoramic window and a beautiful universe below me and flying around and just seeing the sights.

[00:18:46.740] Kent Bye: Awesome. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:18:53.465] Stijn Stumpel: That's a big question. I think, Palmer Luckey said this in one of his interviews as well, and I completely agree with this, is that the internet democratizes information and virtual reality democratizes experience. So if I want to learn something, I get the same Wikipedia that Barack Obama gets. but I cannot visit the Sistine Chapel because I work in a startup and I get paid very little money. And if I want to go there, I have no option but to go to Google Images and type Sistine Chapel and kind of look at images. But with virtual reality, I can not only go in there, which people that actually go there can do, but I can fly up to the ceiling and I can see what Michelangelo saw when he was painting the thing, which some people that actually go into the Sistine Chapel aren't even able to do. So I think humans are a very curious species by nature. We want to experience things. And if I look at the impact that the internet had on human curiosity, just with information, and I extrapolate that to experiences, I don't even know where it's going to go. But it's going to be very interesting. Awesome. Well, thank you so much. No problem. Thank you.

[00:20:04.190] Kent Bye: So that was Stijn Stumpel, he's the lead designer for ManusVR, which is a solution to be able to track your hands within VR. So, I have a number of different takeaways from this interview. First of all, I do think that there are specific situations where having your full hands in the game is just a lot better than having motion track controllers. And one of those I think is actually in social situations, because having your whole hands within the experience gives you that extra dimension of expressiveness, being able to really gesture and point, and just gives a lot more body language that is hard to communicate when you just have motion track controllers. You can get some sense about the motion of the hands within social VR experiences with just the motion track controllers, but I found that when I'm in Altspace and I see people who have their full hands in the game, it just gives a lot more expressiveness. So the other thing is that at GDC they had a bit of a stopgap solution, which is that you essentially kind of strapped a Lighthouse controller onto the back of your wrist. And with having your hands motion tracked, I did find that you could start to do different things that you couldn't do with the Leap Motion. So since Leap Motion is optically tracked, you really kind of have to see your hands within the field of view in order to have any type of action with them. As soon as they go outside of that field of view, they essentially disappear, and you can't do any other kind of motions. An example that Stijn gave in this interview is throwing a grenade. So in the gameplay that they had within the demo at GDC, they didn't have anything that was really taking advantage of this, but I just kind of found that the tracking of my hands was a lot more consistent and steady. 
So on August 4th, Valve actually announced that they're going to be opening up the Lighthouse to have different peripherals be able to be tracked with the Lighthouse system, which means that instead of kind of just strapping your controller to the back of your hand, they're going to be able to do that natively. And in the interview, Stijn was saying that they're also going to be compatible with the Oculus Rift and potentially other VR systems as well. And so they're trying to be the one data glove that works with all the different VR headsets. In the press release that ManusVR sent out, they just recently announced that they're working with NASA to be able to use the ManusVR within mixed reality experiences. So in other words, they have a room that has some bars on the wall and you're in a virtual environment that has a similar set of bars that mimic what the environment is going to look like within outer space. And so being able to have your hands within a mixed reality environment where you're able to actually reach out and touch things, I think is a really super powerful way to build presence within VR, specifically your sensory presence and your embodied presence. And you start to really invoke the virtual body ownership illusion. I think if they actually start to track more points along the arm that go all the way down to the elbows, you'll be able to do a pretty good inverse kinematics, I think. They'll be able to go beyond just having what they had in the demos that they were showing at GDC, which is just kind of like floating hands. And I think eventually they're going to start to be able to actually track your full upper body and have it pretty accurate, which I think once you get to that point, then you're going to really start to invoke the virtual body ownership illusion. You're going to have a much deeper sense of presence within these VR experiences, especially if you have your hands tracked. 
So the biggest downfall, though, is that when you start to actually grab objects, it is a bit of a presence breaker, because you're not getting any haptic feedback unless it's a mixed reality experience. Like I said, with this NASA training, you actually have that haptic feedback when you're grabbing stuff. But to not actually have the things that you're grabbing can feel a little bit within the uncanny valley: you expect it to feel a certain way, and when it doesn't, then you kind of get taken out of the experience. So having that high fidelity of tracking with your hands can backfire if you're not able to live up to all the different haptics that you're expecting. And so I think thinking about the Manus VR as a way to really take mixed reality to the next level is probably a little bit better than thinking that this is going to be a general-purpose solution for all sorts of different hand interactions. Because a lot of times, if you are just holding some sort of either Touch controller or Vive, having that haptic feedback in your hand makes you just kind of forget that you're holding a motion track controller, and you start to really feel like you're holding whatever object you're having, whether it's a gun or a sword or whatever else. You just have this sense of hand presence, or tomato presence as it were, like with Job Simulator, and Alex Schwartz discovering that when you grab an object, they actually make your hand disappear. So rather than actually seeing your hand anymore, you're just seeing the object that you're holding. And for a lot of people, they don't even necessarily notice that, because when you are grabbing and holding stuff, you usually start to ignore your hands anyway. And so within VR, by explicitly making your hand disappear, you start to have a deeper presence with the correlation between your hand moving around and the object that you're holding. 
And so they had taken this similar approach within this Manus VR demo that was showing at GDC. Now in terms of the haptic feedback that they did have, they had like this buzzer on the back of your hand and I felt that it was super uncanny and just felt really weird and awkward like whenever I'm picking up an object you hardly ever feel something kind of buzzing on the back of your hand and every time it happened, it just kind of took me out of the experience more. And so I think the advantage of having a physical controller like the Oculus Touch or the Lighthouse controller is that the haptic feedback that it gives is a lot more natural and low fidelity and kind of giving it to the palm of your hand rather than the back of your hand. So again, this is another one of the challenges I think that moving forward, we'll see whether or not they end up delivering and shipping some of these kind of haptic buzzing that is happening with your hand. But it's one of the design challenges, I think, when you start to do this higher level fidelity of tracking your hands. The other thing is that we're using these kind of flex sensors rather than IMUs. And IMUs have to be calibrated, and they also can drift and be susceptible to interference from other electromagnetic waves and whatnot. And so they tend to be a lot more tricky and nuanced to be able to work consistently all the time. 
In the demo that I was seeing at GDC, they were having some issues with latency, and so I can't necessarily give my final full take on it, because it wasn't as reactive as I would want it to be. But according to some of their other tests, they are claiming that they get the latency lower than 20 milliseconds. I just remember playing a demo of playing the piano with the gloves, and I think that's a good test, actually: if you feel like you're actually playing a virtual instrument, that's going to be something that you're not necessarily going to be able to do with some of these other controllers. Being able to do things like playing an instrument that you couldn't do with just a controller and buttons, and having degrees of freedom and movement with all of your fingers, is kind of like the ultimate test of going into a VR experience and feeling like you're able to actually play one of these virtual instruments. But again, like I said, without having that haptic feedback, it can be a little bit like you're not quite sure whether or not you're pushing the button down. And so in the press release that just got sent out from ManusVR, they're saying that the class being taught by Synapse, to train people how to use a lot of the SteamVR Lighthouse tracking, is starting on September 12th. And if you look at the timing, I expect to see a lot more of these peripherals start to launch when it comes to CES, the Consumer Electronics Show that's coming up in January. You can expect to see a lot more of these VR peripherals that are going to be compatible with the Lighthouse launching there. We'll see what Oculus announces at Oculus Connect 3, whether or not they're going to open up their Constellation tracking. I expect to have some new announcements in terms of technology and what's coming next. 
There's been some information that's leaked out already in terms of when those Touch controllers are coming out, but no final date; it's going to be probably sometime in November, and for sure in December we'll start to have access to some of these Touch controllers, as well as games that are going to be launching around that time. So I think in the first couple of weeks of October we're going to have Oculus Connect 3, and then Steam Dev Days is going to be the next week, so I think at the end of that we'll know a lot more information as to where the trajectory of some of these controllers is going to be going. And I did actually go back and look at Control VR, which was one of these hand-tracked controllers that had a lot of buzz the year before, at GDC in 2014. They had launched it and they had a successful Kickstarter where they raised over $400,000, but it looks like Control VR made an announcement to their Kickstarter backers back in August of 2015 basically saying, sorry, we ran out of money, there's been some restructuring. And since then, they've had radio silence with no further updates. So it looks like, for all intents and purposes, the Control VR initiative has failed and imploded, until we hear more information as to whether or not that IMU hand-tracking solution is going to go anywhere. But at this point, the Manus VR seems to be one of the leading solutions for being able to track your hands within VR. And I'm looking forward to trying out another demo to see how their latency is doing and how it feels. And like I said, the big use case for this type of hand controller is likely going to be in social VR as well as in mixed reality situations. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you'd like to keep posted with all the latest news from the Voices of VR, then go sign up for my email list. 
We've got some other projects that I'll be announcing here soon, so sign up to the email to get more information about some of those when it's ready to be announced. And if you enjoy the podcast, then spread the word, tell your friends, and become a donor at Patreon.com slash Voices of VR.
