#428: SMI Eye Tracking VR Applications & Foveated Rendering

SensoMotoric Instruments (SMI) is a German eye tracking company that has released eye tracking kits for the Oculus DK2 and Gear VR, and most recently for the HTC Vive. At SIGGRAPH this year, NVIDIA was showing a foveated rendering demo that renders at high resolution only in the section of the scene that you are actually looking at. The difference is essentially imperceptible, which could allow mobile hardware to render higher resolution scenes, or potentially make it more feasible to wirelessly transmit data to a desktop VR HMD.

At SIGGRAPH, I had a chance to talk with Walter Nistico, Head of R&D and Lead Architect of Computer Vision, as well as Tom Sengelaub, Manager of Solution Delivery, about SMI's eye tracking and some of its applications in foveated rendering, medical research into autism and concussion detection, marketing and analytics, and even deception detection with Converus' EyeDetect.

LISTEN TO THE VOICES OF VR PODCAST

Researcher Hao Li told me that eye tracking is pretty essential in order to take VR social presence to the next level, and so I expect that the second generation of the Oculus Rift and HTC Vive will both include eye tracking technologies. When I talked to Walter and Tom at SIGGRAPH, they were also very confident that we'll start to see eye tracking technologies in the next generation of VR headsets.

From SMI's perspective, they're hoping to be able to license their eye tracking algorithms to the big headset manufacturers. In my interview with eye tracking company Tobii, they told me that they've also been in discussions with some of the major VR HMD manufacturers. SMI says that the hardware required for eye tracking is not a huge barrier, and so it will likely be a matter of whether the eye tracking algorithms are going to be developed in-house or licensed from one of the big eye tracking players.

Here’s a video of NVIDIA’s foveated rendering demo shown off at SIGGRAPH

Here’s a shadertoy fovea visualizer demo that illustrates how your fovea works (be sure to watch it in full screen).

Here’s a recent demo of using SMI eye tracking with the HTC Vive:
https://www.youtube.com/watch?v=iNGKAEBlQ-E

Here’s a demo of eye tracking of a spatial search task within VR:
https://www.youtube.com/watch?v=16VtA0YIddA

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to the Voices of VR Podcast. So at SIGGRAPH this year, there were a couple of eye tracking demos that were using the SMI eye tracking technologies. SMI is SensoMotoric Instruments, a company based out of Germany. And they had just announced that they have an eye tracker kit that you can add to the HTC Vive. And so NVIDIA was showing SMI eye tracking with the Vive and showing off a demo of foveated rendering. And so I'm going to be talking today with a couple of people from SMI, Walter Nistico and Tom Sengelaub. We'll be discussing what they've been working on and some of what they see for the future of eye tracking and virtual reality. So that's what we'll be talking about on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. This is a paid sponsored ad by the Intel Core i7 processor. If you're going to be playing the best VR experiences, then you're going to need a high end PC. So Intel asked me to talk about my process for why I decided to go with the Intel Core i7 processor. I figured that the computational resources needed for VR are only going to get bigger. I researched online, compared CPU benchmark scores and read reviews over at Amazon and Newegg. What I found is that the i7 is the best of what's out there today. So future proof your VR PC and go with the Intel Core i7 processor. So this interview with Walter and Tom happened at the SIGGRAPH conference that was happening in Anaheim, California from July 24th to 28th. So with that, let's go ahead and dive right in.

[00:01:49.083] Walter Nistico: I'm Walter, I'm Italian and I'm the head of computer vision. I'm an engineer and I'm, let's say, behind the eye tracking algorithms that we have developed. So, I'm working for SensoMotoric Instruments, which is a German company, which does eye tracking systems and solutions. We have been doing that for 25 years, originally more in the field of medical applications, so eye trackers which are used for eye surgery, like for example laser refractive surgery, and now we are focusing on different types of applications, including VR, which is very hot right now.

[00:02:28.018] Tom Sengelaub: Yeah, so my name is Tom. I work for SMI too. I am heading the application development at SMI. I also do most of the like hardware integrations and that. I used to also work on the algorithms a couple of years back, but now my main focus is on applications and I also like work with OEM customers and getting our eye tracking integrated in that and customized to their needs.

[00:02:51.690] Kent Bye: And so I saw just today that there was a press release from SMI that there's been some integrations with Vive explicitly, but you also have some kits for Oculus Rift as well as a Gear VR. Maybe you could talk a bit about what is available out there if people want to start to play with eye tracking within VR, what type of options are out there?

[00:03:10.756] Walter Nistico: Yeah, so we made an integration with the Oculus DK2 more than a year ago. Right now, the Oculus DK2 is not available anymore, so we are just right now announcing the integration with HTC Vive, which is probably the hottest VR system at the moment. We have also made an integration with the Gear VR, which is also a very nice system. I'd say the main difference between the two is the Gear VR is like more of a mobile kind of application, and the HTC Vive is more like a high-power gaming kind of thing.

[00:03:47.659] Kent Bye: And so what type of applications or specific functionality is this eye-tracking enabling on these different VR technologies?

[00:03:55.490] Tom Sengelaub: So, one of the biggest promises eye tracking holds for VR is most likely foveated rendering. It's a technique where you exploit the human visual system in a way where you only render high detail in a very small area in the center of the field of view of the user, and then you can drop resolution and detail going towards the periphery, because this is how our human visual system works. So we have like a five degree area where we can see fine detail, and this is due to the density of cones, which is highest in the fovea, and then the amount of cones drops going towards the periphery of the vision. So definitely this is the most promising one, because it makes virtual reality more accessible by allowing it to be rendered on less expensive hardware. It will eventually enable us to go to higher resolution displays, or it might even make wireless VR and wireless headsets a possibility. I believe that this will certainly be the application which will pay for the integration of eye tracking in consumer headsets, and once we have that in, we can use eye tracking, like the same sensor, the same technology, the exact same software, for also interacting in virtual worlds. So what we're showing here is a way where you can use your point of gaze to precisely select objects, and then you only do relative manipulation with your hand controllers, for example from the Vive, which actually works really, really well, because this is how we interact in our environment naturally anyway, because before you grab something, you always look at it. And by bringing that to VR, we kind of re-establish this hand-eye coordination. Another very, very interesting application is creating social presence with eye tracking.
So if you look at platforms like High Fidelity or AltspaceVR, where we have avatars representing us in the virtual space, adding eye movements to that, which we've done with both companies, makes the avatar so much more alive. You actually feel like there's someone real behind that.
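The idea Tom describes, full resolution inside a small foveal window with detail falling off toward the periphery, can be sketched as a shading-rate function of angular distance from the gaze point. This is a minimal illustration under my own assumptions: the inverse-linear falloff, the minimum rate, and the function name are hypothetical, not SMI's or NVIDIA's actual model.

```python
def shading_rate(eccentricity_deg, fovea_deg=5.0, min_rate=0.125):
    """Fraction of full resolution to shade at a given angular distance
    from the gaze point (hypothetical falloff, for illustration only).

    Inside the ~5 degree fovea we keep full detail; beyond it, visual
    acuity falls roughly inversely with eccentricity, so the shading
    rate does too, clamped at a floor so the far periphery still gets
    some samples.
    """
    half_fovea = fovea_deg / 2.0
    if eccentricity_deg <= half_fovea:
        return 1.0
    return max(half_fovea / eccentricity_deg, min_rate)

for ecc in (0, 5, 15, 45):
    print(ecc, round(shading_rate(ecc), 3))
# prints: 0 1.0 / 5 0.5 / 15 0.167 / 45 0.125
```

A real renderer would map a rate like this onto coarser render targets or variable-rate shading tiles; the takeaway is simply that almost the entire frame can be shaded well below full resolution without the user noticing.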

[00:05:58.421] Kent Bye: And here at SIGGRAPH, NVIDIA was showing a demo of using foveated rendering within their pipeline. And kind of the ironic or funny thing of watching some of these foveated rendering demos is that it's actually really difficult to tell whether it's working or not. They're actually kind of freezing the frame and allowing you to kind of look around and see it. But it's pretty imperceptible when it's on or off. It's kind of a fascinating thing that there's only a small part of our eye, the fovea, that we're really actually focusing with. And so when we see these demos that are running at 90 frames per second, and I've seen eye tracking update rates of up to 240 frames per second, it's fast enough that it's really imperceptible that it's changing anything.

[00:06:42.439] Walter Nistico: Yeah, I mean, foveated rendering is a technique which is very hot right now, because it allows you to either reduce the computational load and power consumption of rendering by a large factor, in a way which is imperceptible to the user. Or you can use it the other way around: you can still, let's say, use the full power of your GPU, but then increase the level of detail of your rendering or of your game, also by a large factor. You could see it a bit like, I don't know, MP3 sound compression; you are just exploiting the weaknesses of our perceptual system. So traditional rendering is essentially wasting power by rendering everything at full resolution, where our eye is not able to perceive that extra resolution in the periphery. That's why foveated rendering, and eye tracking, which is the key enabler for foveated rendering, is so important now, because this will probably allow the next generation of VR systems to have a massive increase in terms of image quality and fidelity, while always keeping high frame rates, without needing to have the most expensive GPUs which cost a thousand dollars. Because this was also one of the main things that in the beginning people have been complaining about concerning VR: oh, I need to buy a very powerful PC, right? So it's expensive.
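Walter's MP3 analogy can be made concrete with a back-of-envelope estimate of how little of the frame actually needs full resolution. The field-of-view and fovea figures below are rough assumptions on my part, and treating both regions as a simple ratio of angular areas ignores lens distortion and any transition band around the fovea.

```python
fov_deg = 110.0    # assumed per-eye field of view of a 2016-era headset
fovea_deg = 5.0    # angular region the eye resolves at full acuity

# The ratio of the two angular areas is approximately (fovea / fov) squared.
full_detail_fraction = (fovea_deg / fov_deg) ** 2
print(f"{full_detail_fraction:.4%} of the frame needs full resolution")
# prints: 0.2066% of the frame needs full resolution
```

Even after adding a generous falloff region around the fovea, the shading work saved is dramatic, which is exactly the headroom Walter says can instead be spent on higher detail or lower power.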

[00:08:09.899] Kent Bye: And it seems like with being able to track the eyes, you're able to really get a lot more detailed analytics in terms of what people are looking at, what they're paying attention to. And so maybe talk about some of the applications in terms of getting analytics or analysis in terms of marketing purposes.

[00:08:25.207] Tom Sengelaub: So this is one of our more traditional markets. Before eye tracking started to get popular for the future of VR, one of the traditional applications was actually market research or psychological experiments, right? Where you could use an eye tracker to understand where people pay attention, be it for optimizing a website or optimizing advertising, or to use it for nicer use cases, to understand how the human brain works and how we interact with people. At the moment our eye trackers are used in autism research in order to help affected kids learn how to better interact in social scenarios. Or one of our partners is using it for detecting if a football player is concussed in a game. They developed a 30 second test with our eye tracking where they show you a certain target, and then by how well you're able to follow it, they can diagnose, or give a hint on the diagnosis, of whether you might be concussed or not.

[00:09:22.637] Kent Bye: So football players who have a concussion are able to go into like a VR environment and then they're given an automated eye test and then they can decide more accurately and faster whether or not they have a concussion or not, it sounds like.

[00:09:34.598] Tom Sengelaub: Yeah, because at the moment, the way it's done, you get a questionnaire, and the player will never, like no player in any sport will ever say, oh, I'm too sick to go back in, because you want to go back in and you want to play. So with eye tracking, you're actually objectifying that test, and then you can better decide if a player might need treatment or not, or whether it's better to take him out, because it's very, very dangerous if you have a concussion and go back in the game.
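The screening test Tom describes, showing a moving target and scoring how well the eye follows it, essentially boils down to comparing recorded gaze samples against the target's path. Here is a simplified sketch on synthetic data; the function, the gaze traces, and the thresholds are all hypothetical illustrations, not the partner's actual clinical protocol.

```python
import math

def pursuit_error(target_path, gaze_path):
    """Mean distance between a moving target and the gaze samples
    recorded while the subject tries to follow it."""
    errors = [math.dist(t, g) for t, g in zip(target_path, gaze_path)]
    return sum(errors) / len(errors)

# Synthetic 30-second test: a target moving on a circle, sampled at 10 Hz.
target = [(math.cos(t / 10.0), math.sin(t / 10.0)) for t in range(300)]
# A subject tracking smoothly with a tiny constant offset...
steady = [(x + 0.02, y) for x, y in target]
# ...versus erratic pursuit with a large oscillating error.
erratic = [(x + 0.5 * math.sin(t), y) for t, (x, y) in enumerate(target)]

print(pursuit_error(target, steady) < 0.1)    # prints: True
print(pursuit_error(target, erratic) > 0.2)   # prints: True
```

An objective score like this, rather than a self-reported questionnaire, is what makes the test hard for a player to game, which is the point Tom makes about objectifying the sideline assessment.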

[00:09:59.826] Kent Bye: And what are some of the other applications or things that you can tell from looking at somebody's eyes? I mean, are you able to determine how interested somebody is, or whether or not they have some sort of emotional states by looking at the eyes alone?

[00:10:12.936] Walter Nistico: Well, one obvious further type of application is interaction based on eye tracking, for example in terms of hands-free menu navigation. I think this one, especially on mobile platforms like the Gear VR, for example, can be very helpful. Otherwise, let's say in the field of analytics, there are a lot of companies which are doing a lot of research in very interesting areas, also trying to detect, for example, states of relaxation or cognitive activity. There is actually also one of our customers which uses eye tracking for a lie detector. Actually, some of these guys were the inventors of the polygraph, so the original lie detector. And they claim that their eye tracking-based lie detector is about 80% accurate, while the polygraph is only like 55-60%. So, let's say somebody says that the eye is like a window to the soul, right? But just by analyzing eye behavior and patterns, there is a lot that can be learned about a person's mind as well, let's say.

[00:11:21.827] Kent Bye: So it's 2016 and we have consumer VR and none of them have eye tracking built in. And I expect that this is one of the features that will inevitably be built in into either the second or third generations. And so just curious to hear your thoughts of where this is going, because right now we have the SMI eye tracker kits as a bit of a stopgap for people to get the eye tracking technologies, but it's not going to have an all-pervasive adoption rate until everybody has it built into the headset, I imagine.

[00:11:52.230] Tom Sengelaub: So what we're trying by exhibitions like this and by also obviously talking behind closed doors with all of the headset manufacturers is that our goal is to get it integrated in each of the consumer headsets. We don't have the timeline under control unfortunately, otherwise it would be out there already, but we are also quite positive that we will see it in the second generation of headsets and once one of them has it, then all of them will have to follow.

[00:12:17.958] Walter Nistico: Yeah, I mean, let's say it's four years, right, that we have been investing a lot of time, energy, and money in creating what we call our OEM business unit, with the explicit goal of taking our eye tracking platform, our algorithms, our expertise out of just the professional space and making it available for consumer applications. Because we strongly believe that right now the time is right, the technology is mature, and we are able to work with components like cameras, LEDs and so on, which used to be very expensive and right now are very cheap. With a few dollars now, the hardware side of eye tracking is there. And we also have the algorithms for making it work. So, right now it's just a case of making everybody understand that there is a clear fit, let's say, between eye tracking and mass market applications. And we think that now with VR, and especially with foveated rendering, we are seeing a lot of traction, we are seeing a lot of interest, we see that people see our demos and they understand the value. And so that's why we are confident that in a matter of a year or two, we'll be able to convince some major players in the industry to adopt our technology, license it and bring it to the masses.

[00:13:43.497] Kent Bye: So for you, what are kind of the biggest open problems that you're trying to solve that are really driving you moving forward? I imagine that there is optimizations in terms of perhaps power and efficiency, other things along those lines, but what do you think are the big things that you're trying to work on in terms of improving this technology moving forward?

[00:14:02.353] Walter Nistico: Yeah, so you named a big one, power. For PC-based systems, this is not a problem. However, for mobile systems like the Gear VR, or whatever its successors are going to be, you have a battery, right? So everything that you can save in terms of power is going to be extremely valuable, because you want to have a system which works not for one hour or two, but ideally maybe for eight hours or ten hours or whatever. And the other aspects that we think are still, let's say, areas of potential improvement are just in making the technology itself even more easy to use, even more accessible, even more natural. Right now, we think we're already at the point where anybody can play with it for five minutes, then they get the hang of it, and then they get proficient with it. But there is still a very small learning curve, let's say, in using it, like with any new sensor. But I think everybody is now going towards, let's say, we want the machines to understand humans better and better. And so, obviously, we're just going to be going more and more in this direction. Our system is becoming smarter and smarter, and the interaction is becoming more and more natural.

[00:15:14.682] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?

[00:15:24.085] Tom Sengelaub: So within virtual reality, we can use the technology not only to entertain ourselves, but also to start understanding humans better and also to help them learn. I see it as a big educational platform, in academic as well as industrial applications. So VR has a great potential in being used for training people in either dangerous or hard to reproduce scenarios. If you're a power plant operator, you might not want to have a hazardous event simulated in a real power plant. So you can do that in VR. And by making the systems better and better so that we feel more present, this will allow us to transfer more of the knowledge we gather in VR to the real world.

[00:16:10.282] Kent Bye: Awesome. Well, thank you so much.

[00:16:11.502] Tom Sengelaub: Thank you. Thank you so much.

[00:16:14.018] Kent Bye: So that was two employees from SensoMotoric Instruments. That was Walter Nistico, he's the head of R&D and the lead architect of computer vision, as well as Tom Sengelaub, he's the manager of solution delivery. So I have a number of different takeaways from this interview. First of all, I think it was pretty interesting to hear how confident SMI was that eye tracking is going to be coming to the second generation headsets, presumably the Oculus Rift and HTC Vive. Now, it sounds like the hardware for the actual eye tracking has come down in cost. The big open question I have is who kind of owns the algorithms to be able to actually pull off this type of eye tracking. I know that SMI has got a lot of years of research and development, and there's also other eye tracking companies like Tobii. I know I did an interview with them back in 2015 at GDC, and they also seem to be pretty confident in having different discussions with different virtual reality HMDs. And so I think eye tracking is one of those things that is absolutely necessary for social interactions and to take social VR to the next level. In my discussions with Hao Li at Oculus Connect 2 back in 2015, he was essentially saying that eye tracking is basically a requirement for social VR: you really need to see where people are looking in order to really determine their attention as well as to have better social interactions. In terms of social VR applications, I know that High Fidelity as well as AltspaceVR have been using some of the SMI trackers with the DK2 kit that they've already released and started to do some early prototyping. And just hearing back from them as well, eye tracking takes social VR to the next level of social presence that you can achieve with other people.
So another reason why I think that eye tracking is pretty much guaranteed to be included within the next generation of VR headsets is that there are just so many analytic possibilities in being able to track somebody's eye gaze and do all these different heat maps, both for analytics but also for advertising and marketing. And I think eye tracking is actually one of the first real biometric technologies that's going to be integrated within VR headsets, and it starts to bring up a lot of really interesting privacy issues. Do you really want these big corporations having access to your attention data? What are you looking at? What are you paying attention to? Especially when talking about being able to use eye tracking technology to determine whether or not you're lying. With previous polygraph tests, the accuracy was only around 55 to 60 percent, and like they were saying in this interview, they're claiming accuracy of up to 80% with eye tracking. And so there's just a lot of information that you may be communicating with your eyes that you may not even fully be aware of. So what happens to that data? Where's it going? And how will it be used to learn more information about you and to better advertise to you? And I think the issues of privacy here are going to be ones where you're going to actually want to look a little bit closer at some of these terms of service: where is this eye tracking data being sent, and how are people using it? So some of the privacy implications of eye tracking, I think, are perhaps going to be a little bit more of a concern for people moving forward. The other side of eye tracking is that it does enable a lot of really helpful applications. One is foveated rendering. Now, for people who have been early adopters of VR technologies, you've already got these VR-ready GPU cards, and the graphics are pretty good.
And I think that there's probably even more fidelity that you could go to if you start to use things like foveated rendering. But I think the two big applications that jumped out to me for foveated rendering for VR at this point is, for one, doing wireless connections, so to be able to not have to process the entire scene and to just send across the wireless technology, as well as mobile. I think that mobile is going to want to have something that looks great but doesn't have the same level of graphic processing power as the desktop PCs. And so perhaps something like foveated rendering can allow you to have a little bit higher visual fidelity in some of these experiences. The big question being whether or not it's going to be cost effective and power efficient to be able to actually do that. And so I'd imagine that a lot of these eye tracking technologies will first be integrated into the desktop PCs. And then as the technology gets more efficient and better, then perhaps we'll start to see it in more of the mobile VR headsets, such as the Samsung Gear VR or Google's Daydream. So there's a lot of really interesting applications for eye tracking that ranging from training applications to helping to research and perhaps train people with autism to be able to do better within social interactions, as well as detecting concussions for sports players. So having a VR headset on the sidelines of the NFL to be able to do a more objective test of concussion rather than something that's a little bit more subjective. So a lot of really interesting medical applications that are out there. So that's all that I have for today. I just wanted to thank you for listening. And if you'd like the podcast, then please help spread the word, tell your friends, and you can also help out the podcast by becoming a donor at patreon.com slash Voices of VR.
