Anush Elangovan is the founder and CEO of Nod, which produces the Nod gesture-control ring. The ring combines motion tracking, gestures, and tactile input, enabling interactions such as tap, double tap, swipe up, and swipe down. Anush says he started Nod because he saw touchless interaction and gesture control as the trigger for the next computing revolution, much as the mouse triggered the PC era and capacitive touch triggered the mobile era.
Nod was showing a demo at GDC that used the Nod ring as an input controller in virtual reality. The interaction within the VR environment was simple and intuitive, but the latency felt pretty high. Anush says the extra latency was likely due to also broadcasting the demo to a screen so that others at GDC could watch the VR action, and that their benchmarked latency is around 7.5 ms per packet over Bluetooth.
The Nod ring currently sells for around $149, which seems a bit high. But it's definitely worth keeping an eye on as the VR input controller ecosystem evolves, especially for mobile VR. Nod is integrating with OSVR, so it could have a fairly straightforward path into VR experiences through the OSVR SDK.
At GDC, Nod also announced the Nod Backspin controller, which combines motion, gesture, touch, analog input, and haptic feedback. They had an early prototype on display, but no demos with it yet.
Finally, Anush sees reality and VR blending into a mixed reality in which it may be difficult for our perception to discern what's real and what's augmented. In the end, he'd like to see Nod play a part in capturing human intention through gestures and pointing behaviors.
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast.

Anush Elangovan: I'm Anush Elangovan. I'm the founder and CEO of Nod Labs. Nod is a gesture and motion tracking company. Our first product is a gesture-control ring, and we've just announced a new product called the Nod Backspin, which is a gamer-focused device that is the first controller to bring together tactile input, analog input, touch, motion, and gestures in a very small and sleek package.
[00:00:37.768] Kent Bye: Did you start Nod specifically to develop input controls for virtual reality, then?
[00:00:43.773] Anush Elangovan: It wasn't specifically for virtual reality. When we started, our focus was input, right? We approached it from a computing-revolution paradigm. If you look back in history, there's always been a computing revolution around input. You had the mouse, which triggered the whole PC era, and then you had capacitive touch, which triggered the mobile era. So we're like, hey, what's next? What's going to trigger the touchless era? There's speech, and there's gestures. Speech has its inherent problems, ambient noise, things like that. But gestures, until now, have not been very accurate. And now we've been able to get it to a point where it's so accurate that you can control every pixel on a screen or every voxel in a 3D space.
[00:01:26.809] Kent Bye: Great. And so maybe you could talk a little bit first about the ring, in terms of all the different types of user interactions you can have with this little single ring on an index finger.
[00:01:35.856] Anush Elangovan: Right. So the ring exposes gestures, motion, and tactile and touch input. By that I mean you can do simple gestures like swipe forward, swipe back, swipe up, swipe down, enter, exit. Then you can build on other things like tap, double tap, tactile buttons, things like that. And you can build anything on top of it. We have an open SDK, so on top of it you can build anything you want for your games.
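To make that event model concrete, here is a minimal sketch of how an application might consume gesture events from an SDK like this. The interview doesn't show the Nod SDK's actual API, so the names here (GestureEvent, RingClient, on_gesture) are illustrative assumptions, not Nod's code.

```python
# Hypothetical sketch of consuming gesture events from a ring-style SDK.
# The real Nod SDK API is not documented in this interview; all names
# (GestureEvent, RingClient, on_gesture) are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable

@dataclass
class GestureEvent:
    kind: str         # e.g. "tap", "double_tap", "swipe_up", "swipe_down"
    timestamp_ms: int

class RingClient:
    """Stand-in for an SDK client that delivers gesture events."""
    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[GestureEvent], None]] = {}

    def on_gesture(self, kind: str, handler: Callable[[GestureEvent], None]) -> None:
        self._handlers[kind] = handler

    def dispatch(self, event: GestureEvent) -> None:
        handler = self._handlers.get(event.kind)
        if handler:
            handler(event)

# Map the interactions mentioned in the interview onto game actions.
ring = RingClient()
ring.on_gesture("tap", lambda e: print("fire"))
ring.on_gesture("double_tap", lambda e: print("reload"))
ring.on_gesture("swipe_up", lambda e: print("next weapon"))
ring.on_gesture("swipe_down", lambda e: print("previous weapon"))

# Simulated event stream, since no hardware is attached here.
ring.dispatch(GestureEvent("tap", 1000))
ring.dispatch(GestureEvent("swipe_up", 1450))
```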
[00:02:07.442] Kent Bye: And so in this little demo experience, you're in a world where all these little bunny-rabbit monsters are coming towards you with targets on their chests, and you're pressing on one side of the ring to shoot, and you can also move forward and backwards. Maybe you could talk a little bit about the different ways of actually doing VR locomotion with this ring.
[00:02:28.655] Anush Elangovan: Right. So one of the things people really like in gaming is that they don't want one-to-one motion for locomotion. You don't want to actually run through the 3D space. So we mapped a simple touch button: you touch it and you start running. On the Nod Backspin, the same thing could be mapped to the analog controller, because you get good feedback. And the other control, what we call the side touch button near the thumb, is just ergonomically easy for casual gaming. Any kid can pick it up and they're just tapping to fire. And even though we have other gestures we could map, like switching a gun, switching things, we wanted a very simple experience that people could play with the least amount of tutorial: hey, this is the controller, press this, press this. I think we did achieve that to a certain extent. You tried it, so you should let us know what you thought about it.
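As a rough illustration of the hold-to-run mapping Anush describes, here is a minimal sketch that turns a touch-button state (or an analog axis, on a Backspin-style controller) into forward locomotion. This is not Nod's code; the function names, update loop, speed, and deadzone values are all assumptions for illustration.

```python
# Minimal sketch of hold-to-run locomotion, as described in the interview:
# a touch button runs while held, and on an analog stick the same mapping
# scales speed by deflection. All names and values here are assumptions.

RUN_SPEED = 3.0  # meters per second, arbitrary for illustration

def locomotion_velocity(touch_held: bool, analog_y: float | None = None) -> float:
    """Forward speed from either a touch button or an analog axis."""
    if analog_y is not None:
        # Analog path (Backspin-style): deadzone, then scale by deflection.
        if abs(analog_y) < 0.1:
            return 0.0
        return RUN_SPEED * max(-1.0, min(1.0, analog_y))
    # Touch path (ring-style): binary hold-to-run.
    return RUN_SPEED if touch_held else 0.0

def step(position: float, dt: float, touch_held: bool,
         analog_y: float | None = None) -> float:
    """Advance the player's forward position by one frame."""
    return position + locomotion_velocity(touch_held, analog_y) * dt

# Example: 90 Hz frames, ring touch held for three frames.
pos = 0.0
for _ in range(3):
    pos = step(pos, dt=1 / 90, touch_held=True)
print(f"moved {pos:.3f} m")  # ~0.1 m over three frames
```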
[00:03:17.607] Kent Bye: Well, I thought it does give you a sense of immersion to have that input. I did notice that there was some latency. So in terms of measuring latency with an input controller like this, where is it at now, and what's your target for where you want it to be?
[00:03:31.684] Anush Elangovan: Yeah, so what you saw today is mostly latency from our display side. If you know Nightmares, our demo is actually based off of that Unity sample project. We took that and put in a character from Unity assets just to show the game. For benchmarking our latencies, we have a sniffer at the Bluetooth level to see what our latency is, and our latency is at about 7.5 milliseconds per packet. What that means is that it's good for gaming, right? There are game controllers that have the same 7.5-millisecond latency. But we have some new technologies that mask latency even better. You can think of it like lossless compression on a link, something like that. That's the roadmap for where we're going. Essentially, we want to get to a point where you don't care about the fact that it's over Bluetooth Low Energy. You're just so immersed that you're just playing.
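For a sense of what a sniffer-based benchmark like the one Anush describes might compute, here is a small sketch that derives per-packet latency statistics from capture timestamps. The capture format, timestamp pairs, and numbers are hypothetical stand-ins; real data would come from a BLE sniffer log, and this is not Nod's tooling.

```python
# Sketch of computing per-packet latency from Bluetooth sniffer captures,
# in the spirit of the benchmark described above (~7.5 ms per packet).
# The (send_ts, receive_ts) pairs below are hypothetical; real values
# would be parsed from a BLE sniffer capture, not hard-coded.

from statistics import mean, quantiles

# (send_timestamp_s, receive_timestamp_s) per packet
packets = [
    (0.0000, 0.0074),
    (0.0150, 0.0226),
    (0.0300, 0.0377),
    (0.0450, 0.0523),
]

latencies_ms = [(rx - tx) * 1000.0 for tx, rx in packets]

print(f"mean latency: {mean(latencies_ms):.2f} ms")   # ~7.5 ms
print(f"worst packet: {max(latencies_ms):.2f} ms")
# A percentile needs many more samples to be meaningful; shown for shape only.
print(f"p95 (approx): {quantiles(latencies_ms, n=20)[-1]:.2f} ms")
```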
[00:04:23.249] Kent Bye: And so how did you get into Nod Labs and doing this product, and eventually into VR?
[00:04:29.912] Anush Elangovan: Yeah, so I was at Google prior to this, and we were working on the Chromebooks. We worked on the first Chromebook that was fanless, which was a pretty out-there experience, because until then laptops had loud, noisy fans. As for VR itself, I've been involved in some sort of VR since I was 12 or 13. This was the late '80s, early '90s. You know, in my junior high we built a model of our whole school, and you could actually walk through it. The funny thing is, it was on a CGA monitor, written in GW-BASIC. That's what we had in India at that time, and you had to plot every line. So you had these huge graph sheets on which you rendered the whole scene, and you'd walk through each room, and when you went to the chapel there was music playing or something like that. So VR came on early for me, and we did a lot of graphics then. For a while I dabbled with VRML, back in '94, '95, when VRML 1.0 first came out. Then for much of the intermediate period I was focused on networking and network security. But this gave me a good opportunity to get back to graphics and user experiences.
[00:05:37.673] Kent Bye: Great. And so this new controller has a few more buttons and an actual joystick on it, but it still fits ergonomically in your hand. So maybe describe all the different things you can do with this next iteration.
[00:05:48.984] Anush Elangovan: Right. So the Nod Backspin is a perfect confluence of motion, gestures, touch, tactile, analog, and haptic feedback. That's quite a few input and output methods to fuse into a small form factor that fits in your palm, right? And it's ergonomically designed; we went through a lot of pains to design this. The honeycomb structure is to give you a little aeration when you hold it, because people usually grasp it for a while, and even the buttons are placed ergonomically. And all of this with capacitive touch gives you so many other things: you can swipe to switch your weapon, go up, go down, there are so many other things that come in here. I think with the Nod Backspin we have more ways to express ourselves, and it's actually the inverse problem now, where the content hasn't caught up. We have accurate pointing, we have accurate motion, we have accurate gestures, and now it's time for content to catch up. That's our primary focus right now: we're trying to get content developers to be aware of this, we're seeding developers with it, and we want them to play with it and vouch for what they can do with it.
[00:06:59.860] Kent Bye: When I was talking with Yuval of OSVR, he mentioned Nod as one of the partners integrating your SDK into the OSVR SDK. So maybe talk a bit about your involvement with OSVR and where you see that going.
[00:07:13.850] Anush Elangovan: Yeah, so we like what the OSVR team is doing. They're trying to federate the platform layer. Of course, we have our own input platform, the Nod Immersive platform, which we make sure works across all HMDs so content developers don't have to worry about it. But what OSVR is doing is trying to get broad, industry-wide support, and we're happy to plug into that API. Anything that furthers input in VR or presence in VR, we are all for. And it's just a race: whoever runs fastest wins, so we'll just run.
[00:07:45.007] Kent Bye: Great. And what are some of the experiences that you want to see in virtual reality with some of these input controls?
[00:07:50.811] Anush Elangovan: What I'd like to see is input disappearing and fusing with the body. My dog knows that when I point somewhere, that's what I'm pointing to. That's when it gets very natural. There will be a blend of different technologies, cameras, sensors, however we do it. And I think this is a 50- or 100-year problem: trying to understand human intent and communicate it to the surroundings, so that your ambient compute, be it VR or AR or whatever else, can listen in on what the person wants to do. That's the holy grail. Hopefully Nod is taking the next step towards that, and we'll get there.
[00:08:28.778] Kent Bye: And finally, what do you see as the ultimate potential for virtual reality and what it might enable?
[00:08:33.945] Anush Elangovan: I don't think we can see that far, to say what the ultimate potential is. What I see, in my limited view, is that reality and virtual reality will blend, and with augmented reality it's all just going to be a mixed reality, where at some point I won't know whether the person standing in front of me is real or not. Maybe even when I touch them, I won't know if I'm touching them or not. It's going to get to a point where it's all perception in the brain, right? When we get to that point, it'll be a little scary for people who live in this era, but I think that's where we'll get to.

Kent Bye: Great, well thank you.

Anush Elangovan: Thank you.