At GDC this year, SensoMotoric Instruments (SMI) showed a couple of new eye tracking demos at Valve’s booth. They added eye tracking to avatars in the social VR experiences of Pluto VR and Rec Room, which provided an amazing boost to the social presence within these experiences.
There are so many subtle body language cues that are communicated non-verbally through someone else’s eye contact, gaze position, or even blinking. Since it’s difficult to see your own eye movement due to saccades, it’s best to experience eye tracking in a social VR context. Without having a recording of your eyes in social VR, you have to rely upon looking at a virtual mirror as you look to the extremes of your periphery, observing your vestibulo–ocular reflex as your eyes lock gaze while you turn your head, or winking at yourself.
I had a chance to catch up with SMI’s head of OEM business, Christian Villwock, at GDC to talk about the social presence multiplier of eye tracking, the anatomy of the eye, and some of the 2x performance boosts they’re seeing with foveated rendering on NVIDIA GPUs.
LISTEN TO THE VOICES OF VR PODCAST
It’s likely that the next generation of VR headsets will have integrated eye tracking, and it’s the goal of both SMI and Tobii to be the primary providers, but neither Tobii nor SMI is commenting on any specific licensing agreements that they may have reached with any of the major VR HMD manufacturers. I will say that SMI had some of the more robust social VR eye tracking demos at GDC, while Tobii had more nuanced user interaction examples and more involvement with the OpenXR standardization process in collaboration with the other major VR hardware vendors. You can read more about their integration with Valve’s OpenVR SDK in SMI’s GDC press release.
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. So at GDC this year, I had a chance to do a couple of eye tracking demos. One was with SMI, and the other was with Tobii. And I'm going to start today with the interview that I did with SMI. I had a chance to try out their latest demo at the Valve booth, where they actually had me in a social VR situation. I was in Pluto VR, where I was interacting with a couple of people who were working with Pluto VR, and it was just an amazing sense of social presence that I was able to get by actually having eye contact with these other people. So I had a chance to catch up with Christian Villwock, the head of the OEM business at SMI, asking him different questions about the biology of the eye, what they can do with foveated rendering and why that's important, as well as some of their early experiments with social VR. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Voices of VR Patreon campaign. The Voices of VR podcast started as a passion project, but now it's my livelihood. So if you're enjoying the content on the Voices of VR podcast, then consider it a service to you and the wider community, and send me a tip. Just a couple of dollars a month makes a huge difference, especially if everybody contributes. So donate today at patreon.com slash Voices of VR. So this interview with Christian happened on Wednesday, March 1st, 2017 at GDC in San Francisco. So with that, let's go ahead and dive right in.
[00:01:50.477] Christian Villwock: My name is Christian Villwock and I'm the head of the OEM business at SMI. SMI is an eye-tracking company. We've been doing it for a very long time and have a lot of experience in VR since the '90s. We know about the value of eye tracking in virtual reality headsets: foveated rendering, social presence, social interaction, making quality even better. And our goal is to get eye tracking into every headset that's out in the market as a standard feature.
[00:02:16.165] Kent Bye: Great, yeah. So just here at GDC, I had a chance to have my first experience with social interaction with eye tracking. And so maybe you could talk a bit about what is striking for you with having eye contact in a social situation.
[00:02:31.203] Christian Villwock: How we normally communicate is a lot of facial and eye contact. So we know exactly what natural eye movements need to look like. And if you have multi-user avatar environments, you also want to have the same natural social interaction in virtual reality, and copying the eye movements of a person onto an avatar is one way to create this sensational feeling of really having a human person in front of you, even though it's an avatar. It might look strange, it can be a robot, it can be something else, but as you have experienced, it's absolutely natural and you really feel that you are interacting with a real person.
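To give a rough sense of what "copying the eye movements of a person onto an avatar" involves, here's a minimal sketch of applying a tracked gaze direction and eye-openness value to an avatar each frame. This is not SMI's actual pipeline; the data structure and the avatar's setter methods are hypothetical stand-ins for whatever the engine exposes.

```python
from dataclasses import dataclass

@dataclass
class EyeSample:
    """One frame of (hypothetical) eye tracker output for a single eye."""
    gaze_dir: tuple   # unit vector in head space (x, y, z)
    openness: float   # 1.0 = fully open, 0.0 = fully closed (blink)

def apply_to_avatar(avatar, left: EyeSample, right: EyeSample):
    """Drive the avatar's eyes directly from the tracked values.

    The avatar object is assumed to expose simple setters for eye rotation
    and eyelid closure; a real engine would rotate eye bones or drive
    blendshapes instead.
    """
    avatar.set_eye_direction("left", left.gaze_dir)
    avatar.set_eye_direction("right", right.gaze_dir)
    avatar.set_eyelid("left", 1.0 - left.openness)    # eyelid blendshape weight
    avatar.set_eyelid("right", 1.0 - right.openness)
```

The point is how little interpretation is needed: the tracked signal is passed through more or less directly, which is what makes the result read as natural eye movement rather than a canned animation.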
[00:03:04.836] Kent Bye: Yeah, I find that just the way that where people are looking when they're talking, it just adds a punctuation, as well as the blinking. I was really extra surprised to see how much communication is made when people blink. And I know that film editors actually look at blinking, and they try to edit on or just before a blink, because it usually indicates that people are changing thoughts in some way. So maybe you could talk a bit about your own experience of what it's like to have blinking in a VR experience.
[00:03:35.079] Christian Villwock: So the whole facial expression and eye expression is important. It's not only about the eye movements; we also have our eyelids, they are moving, and we are blinking. The whole eye is important in this type of situation because it represents how we communicate.
[00:03:50.787] Kent Bye: Yeah, I wanted to see a mirror of myself, and I actually had a chance to do that in Rec Room, but there's sort of a dilemma there of having a saccade and not actually being able to ever really see your own eye movement. I think you could probably do some recording of that to watch it, but maybe you could talk about the saccade and what that does in terms of being able to watch your own eyes move.
[00:04:12.907] Christian Villwock: If you look in a real mirror, you cannot see that your eyes are moving. It's the same sensational feeling that you have in VR with eye tracking. If eye tracking is really fast and low latency, then you have the same experience that you have if you look in a real mirror.
[00:04:27.895] Kent Bye: Yeah, the thing that I could detect, though, was blinking, as well as, if I were to focus on one specific spot and move my head left and right, I started to see my eyes continue to look at that spot. And I think that just added a whole other level of immersion and presence. And so maybe you could talk about your plans at SMI. Do you hope to, you know, in the second generation of headsets, strike a deal with, you know, major headset manufacturers to have it actually built in? Or what's kind of your strategy to get this into the hands of consumers?
[00:04:58.175] Christian Villwock: We are talking to all the major headset manufacturers. Eye tracking is definitely on the priority list of all of them. We're working with Valve and other companies like NVIDIA and so on in order to show and to prove the value of eye tracking in virtual reality headsets. We have our reference designs here, which can be acquired and purchased by nearly anyone, so we really know how good eye tracking works and what you can do with it. The feature that will pay for eye tracking is probably foveated rendering, which allows you to get better performance out of the GPU and save on rendering, or to push quality up. Once this is in, you get all the other features that eye tracking can bring for free, like natural social presence and gaze interaction interfaces. You can also measure the interpupillary distance automatically; you don't need to set it up manually. So you improve quality in general when you add eye tracking.
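The automatic IPD measurement Christian mentions is conceptually simple once the tracker can report 3D pupil (or eyeball-center) positions in a shared headset coordinate frame: the IPD is just the distance between them. A minimal sketch with made-up sample values; the assumption that the tracker exposes 3D positions in a common frame is mine, not a statement about SMI's actual API.

```python
import math

def interpupillary_distance(left_pupil_mm, right_pupil_mm):
    """Distance between the two pupil centers, in millimeters.

    Assumes the eye tracker reports 3D pupil positions in a shared
    headset coordinate frame (an assumption for this sketch).
    """
    return math.dist(left_pupil_mm, right_pupil_mm)

# Example with made-up positions (mm, headset frame):
print(interpupillary_distance((-31.0, 0.0, 0.0), (30.5, 0.2, 0.1)))  # ~61.5 mm
```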
[00:05:48.507] Kent Bye: Yeah, I found that evaluating foveated rendering performance is really difficult because, you know, if it's done well, it's almost imperceptible and you can't even see that it's happening. And so maybe you could kind of describe the mechanics of the fovea and what's actually happening in the eyeball there.
[00:06:04.568] Christian Villwock: Yeah, so basically how our visual system works: we have about five degrees of really sharp foveal vision, and then it goes to peripheral vision. Acuity goes down in peripheral vision, but our motion sensitivity goes up. And one critical part here is, if you render the central foveal region, where you're paying attention and gazing, in high resolution and use low resolution in the peripheral view, you can save a lot of power and performance, and also increase quality if you like, with the current generation of GPUs. One thing that needs to be done right is that our motion sensitivity in the periphery goes up, and if you have lower resolution rendering, then aliasing effects come into the game. Aliasing is recognized as motion. So some intelligent filtering in the periphery is necessary to have really compelling and working foveated rendering. And as you experienced right now, foveated rendering is working very well. And it's a boring demo, because if you don't see a difference between normal rendering and foveated rendering, which is the goal, then it means it really works.
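Those numbers, roughly five degrees of sharp foveal vision with acuity falling off toward the periphery, are what the technique exploits. Here's a minimal sketch of the core decision: pick a shading-resolution scale based on angular distance (eccentricity) from the current gaze direction. The band boundaries and scale factors are purely illustrative, not SMI's or NVIDIA's actual parameters.

```python
import math

def resolution_scale(pixel_dir, gaze_dir):
    """Pick a shading-resolution scale from angular distance to the gaze point.

    pixel_dir and gaze_dir are unit vectors in eye/view space.
    The thresholds below are illustrative, not production values.
    """
    cos_angle = sum(p * g for p, g in zip(pixel_dir, gaze_dir))
    eccentricity = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    if eccentricity <= 5.0:      # foveal region: full resolution
        return 1.0
    elif eccentricity <= 15.0:   # near periphery: half resolution
        return 0.5
    else:                        # far periphery: quarter resolution
        return 0.25
```

The filtering Christian mentions matters because the periphery, despite its low acuity, is very sensitive to motion, and unfiltered low-resolution rendering produces crawling aliasing that registers exactly as motion.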
[00:07:06.402] Kent Bye: What kind of metrics do you use to measure the impact of foveated rendering in terms of, you know, looking at the reduced load? What kind of numbers are you seeing there?
[00:07:14.967] Christian Villwock: It depends a little bit on how it's been implemented and on what type of GPU, but as a rule of thumb, you can get a factor of two in performance with the current solutions.
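That factor of two is plausible from simple pixel arithmetic. If a small full-resolution inset covers the foveal region and the rest of the frame is shaded at reduced resolution, the number of shaded pixels drops sharply; here is a back-of-the-envelope calculation with illustrative numbers, not SMI's measured figures.

```python
# Per-eye render target (illustrative resolution, e.g. a supersampled Vive target)
full_w, full_h = 1512, 1680
full_pixels = full_w * full_h

# Foveal inset: 25% of the width/height, shaded at full resolution
inset_pixels = (full_w // 4) * (full_h // 4)

# Periphery: the whole frame shaded at half resolution in each dimension
periphery_pixels = (full_w // 2) * (full_h // 2)

foveated_pixels = inset_pixels + periphery_pixels
print(full_pixels / foveated_pixels)  # roughly 3.2x fewer shaded pixels
```

Shading is not the whole frame cost (geometry, post-processing, and compositing still run at full rate), which is one reason the on-paper pixel savings land closer to the factor-of-two frame-time win Christian cites.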
[00:07:25.882] Kent Bye: And so does this foveated rendering require like a composited rendering approach where you're just rendering in two different layers and then kind of merging them together in some way? Or maybe you could describe the mechanics of how you're actually reducing the load in terms of having a higher resolution in the foveal region and then lower resolution elsewhere, how you actually do that technically?
[00:07:47.933] Christian Villwock: So we as SMI cannot do it alone. We have the enabling technology with eye tracking, but it requires working with the GPU vendor, the graphics card driver owner, and we just call it multi-resolution shading, so that you can basically render one scene in different resolutions. That's a basic functionality for foveated rendering. And then you apply other features and filtering in order to make it really invisible.
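Conceptually, the multi-resolution idea can be approximated as rendering the scene at reduced resolution for the periphery plus a small full-resolution inset around the gaze point, then compositing the two. Here is a minimal CPU-side sketch of that composite step using NumPy; it's purely illustrative of the idea, since real implementations like NVIDIA's multi-resolution shading do this inside the rendering pipeline rather than as an image paste.

```python
import numpy as np

def composite_foveated(low_res, inset, inset_origin):
    """Upscale the low-res frame and paste the full-res foveal inset on top.

    low_res:      (h//2, w//2, 3) image rendered at half resolution
    inset:        (ih, iw, 3) full-resolution crop around the gaze point
    inset_origin: (y, x) top-left corner of the inset in full-res coordinates
    """
    # Nearest-neighbour upscale of the peripheral image to full resolution.
    frame = np.repeat(np.repeat(low_res, 2, axis=0), 2, axis=1)
    y, x = inset_origin
    ih, iw = inset.shape[:2]
    frame[y:y + ih, x:x + iw] = inset  # a real system would blend/filter the border
    return frame
```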
[00:08:12.690] Kent Bye: And so with the SMI tracker, are there multiple cameras that are getting a 3D model of the eye in order to increase the accuracy? Or is it just a single camera that's looking at the eye and tracking it?
[00:08:22.574] Christian Villwock: We are using a single camera per eye, but we have a real 3D eye model behind this in our algorithm set. So we can reconstruct the 3D eye model even from one camera.
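The payoff of a 3D eye model is that once the algorithm has an estimate of the eyeball center, the reconstructed pupil position on that eyeball gives you the optical axis directly. A heavily simplified sketch of just that last step; the hard part, fitting the eye model from 2D camera images, is omitted, and the function and argument names here are hypothetical rather than SMI's algorithm.

```python
import math

def gaze_direction(eyeball_center, pupil_center):
    """Optical axis as the unit vector from the eyeball center through the pupil.

    Both points are assumed to be 3D positions (mm) already reconstructed by
    the eye-model fit; a real tracker also calibrates the per-user offset
    between the optical and visual axes.
    """
    v = [p - c for p, c in zip(pupil_center, eyeball_center)]
    norm = math.sqrt(sum(x * x for x in v))
    return tuple(x / norm for x in v)
```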
[00:08:33.190] Kent Bye: So if people wanted to integrate SMI eye tracking into the Vive today, is there a kit that they can buy? And if so, how much is it?
[00:08:41.557] Christian Villwock: It's not an aftermarket kit that you can buy. You can buy retrofitted units from SMI, which we are selling mainly to scientific researchers, high-end developers, and everybody who is seriously interested in working with this type of technology. It's not that SMI is going into consumer markets ourselves. So these reference designs are basically for everybody who is interested in integrating eye tracking and bringing that to market, or in doing research work with eye tracking.
[00:09:09.083] Kent Bye: Are there any sort of pending open problems with eye tracking? I imagine that it's mostly an issue of power consumption and reducing it down for mobile. Is that kind of like the next frontier of optimizing your system, so you can get it into mobile headsets as well?
[00:09:23.309] Christian Villwock: Our focus from the beginning was to have eye tracking enabled for PC, console, and mobile VR as well. We are working with Qualcomm on reference design headsets and all-in-one devices. We have optimized our algorithm set to also work very efficiently on specific platforms.
[00:09:38.036] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:09:46.067] Christian Villwock: Talking about virtual reality in general, I believe in this. It has huge potential, and if we look forward a few years, it will be a device similar to the phone we carry with us today, something which we carry with us, which will probably replace the phone. It will probably also replace the PCs that we currently use, since if we have a very powerful VR headset in our pocket, we can do a lot of nice things which we can only do with several different devices right now.
[00:10:13.079] Kent Bye: Awesome. Well, thank you so much. Thank you. So that was Christian Villwock. He's the head of the OEM business at SMI, and he's talking about some of the latest eye tracking features that were being shown off at the Valve booth at GDC. So I have a number of different takeaways from this interview. First of all, in comparing what's happening at SMI and Tobii, some of the demos that I've been seeing from SMI are a little bit more advanced. They're actually showing their foveated rendering demo. I had a chance to first see this at SIGGRAPH back in August and did an interview with SMI about it back then. It's tricky to really do too much evaluation within a foveated rendering demo, because when it works, it just works and you can't actually really see it. But that said, the thing that Christian is saying is that he's really seeing foveated rendering as one of the primary early wins with eye tracking. And there's a couple of points to that. One is that, first of all, for people who already have a high-end VR-ready machine, foveated rendering can potentially give them even higher-resolution graphical fidelity. And people who don't have a VR-ready machine may be able to start to use VR, given that foveated rendering could provide enough of a performance boost. So I've been asking SMI and Tobii about their plans for integration into the second generation of VR headsets, and they really can't say much of anything. I think there's potentially a battle between some of these companies, like SMI and Tobii, as to whether or not they're going to be the preferred solution within some of these headsets. I expect that once the second generation of headsets comes out, it's already going to be built in. So it's a little bit of an open question. But in terms of the advancement of the demos that I'm seeing, I'm seeing a little bit more innovation in the features from SMI than from Tobii, at this point at least, from what I saw at GDC this year. The social interactions with eye tracking just take it to the next level. Again, it's something that's really difficult to test within a social VR experience, but I did get a chance to go into Rec Room, where they've been modifying the avatars in order to really amplify the impact of eye tracking. Right now in Rec Room, you essentially have this black dot and an eyebrow, but it's hard to tell if you're looking up or down. You basically get a left or right, and it's kind of a canned animation right now. But with the eye tracking, they actually have the whites of the eyes, and you can do all sorts of things, like look up and down or roll your eyes. And there's just so much more fidelity of communication that you can have, but also body language and subtle, unconscious body language that's going to be transmitted. Now, I didn't get a chance to ask SMI anything about privacy, but I did get a chance to talk to Tobii about that, so they have some interesting perspectives on that. I think that, you know, eye tracking is really on the front lines of what it means to actually start to have access to all this information about our body for the first time. So it's a question that I'm going to be unpacking a little bit more in the next couple of episodes here. So some of the big wins from eye tracking, SMI says, are that you get this sense of natural social presence, which we talked about.
The gaze interaction interfaces are another interesting one, because I think you can actually do a lot more movement with your eyes quickly than with your hands, so in some ways an eye-based interface can be less fatiguing. And I think that Tobii actually had a little bit more interesting user interaction paradigms that they were demonstrating than what I saw at SMI at this point. But just to be able to look at objects, select them, and then manipulate them I think is something that we're going to see a lot more of once you get eye tracking built into these headsets. I know that I did an interview with Eyefluence back at TechCrunch Disrupt. Since that interview they've been purchased by Google, and so I expect to see some of this Eyefluence type of user interface technology built into some of the future iterations of Google's mobile VR headsets. And it was also just really interesting to hear that you can automatically measure the IPD. That's the interpupillary distance, the distance between your two pupils. Why that's important is that when you put on a headset, if the IPD is way off, then the world doesn't really converge in the way that you're used to. Your eyes can adapt, but it puts a lot of strain on them as they cross over. The fact that you could automatically detect that raises the question: what are they going to do with that? Are they going to actually feed that back into the software and have the software fix it? Because right now, both the Vive and the Rift are using a mechanical IPD adjustment, a little knob on the right-hand side of the Vive. Every time I put on a Vive I kind of tweak the IPD so that it's at 61 millimeters for me. And then for the Rift there's a little slider on the bottom, and you can more qualitatively adjust it so that whatever you're looking at converges. Now, the problem I see with automatically detecting it is that there's a mechanical solution. So are you going to have some sort of automatic mechanical adjustment of that IPD whenever you put on a headset that has eye tracking? I think these are open questions as to whether that would actually be solved at the software layer or whether it would feed back into the headset and automatically adjust it. So that'll be interesting to see what they're actually able to do with that data. So that's all I have for today. I just wanted to thank you for joining me on the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and become a donor. Just a few dollars a month makes a huge difference. Go to patreon.com slash Voices of VR. Thanks for listening.
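On the software side, the most direct use of an automatically measured IPD would be to set the virtual stereo camera separation to match the user's eyes, even if the physical lens spacing still needs the mechanical knob. A minimal sketch of that idea, with a hypothetical engine API rather than any headset's actual SDK:

```python
def apply_measured_ipd(render_camera, measured_ipd_mm):
    """Offset the left/right virtual eye cameras by half the measured IPD.

    This only fixes the rendered convergence; on current headsets the physical
    lens separation still has to be adjusted mechanically. The render_camera
    interface here is hypothetical.
    """
    half_offset_m = (measured_ipd_mm / 1000.0) / 2.0
    render_camera.set_eye_offset("left", (-half_offset_m, 0.0, 0.0))
    render_camera.set_eye_offset("right", (+half_offset_m, 0.0, 0.0))
```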