Henrik Johansson of Tobii Technology talks about the company's eye tracking technology, which is starting to be integrated into video games like Assassin's Creed Rogue to enable an “Infinity Screen” user interface for 2D games.
Eye tracking in VR has a lot of applications, ranging from foveated rendering to dynamically adjusting the focus of the object that you're currently looking at. At GDC, Henrik says, Tobii talked for the first time about some of its collaborations with virtual reality head mounted display manufacturers. He wasn't specific about who they've been working with or on what exactly, but said that by the end of June they would be announcing more collaborations and work within both AR and VR. They do already have an AR-style eye tracking product, the Tobii Glasses 2, built on a platform designed with the consumer market in mind.
Henrik talks about some of the existing implementations of eye tracking in video games. One of the exciting new game mechanics that eye tracking makes possible is giving a more realistic, human-like quality to interactions with NPCs: you can trigger responses or change an NPC's behavior based upon whether or not you're looking at them. He also shares some of the other applications and implementations of their eye tracking solutions in different software programs, and a bit more information about their SDK.
Eye tracking in VR is going to add a lot of engagement to social situations and collaboration within VR, so be sure to keep an eye on Tobii near the end of the second quarter of this year for more information on how they plan to use their technology within the VR and AR space.
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast.
[00:00:12.039] Henrik Johansson: So my name is Henrik, I'm working for Tobii Technology, a company from Sweden. We've been in the eye tracking business since 2001, and now we've finally been able to bring an eye tracker to the consumer market, which we started doing at CES. We launched the first consumer eye tracker peripheral together with SteelSeries, the Sentry, which included a streaming solution and a training solution. That was in January, and now here at GDC we are launching a few more games that deliver the immersion that eye tracking can give you. One of those is Assassin's Creed Rogue for PC, where we have an Infinity Screen solution. If you look to the right, the screen pans to the right; if you look to the left, the screen pans to the left, and up and down and so forth. So we have created an environment where your eyes are actually controlling the movement of your view and how you carry yourself in that game.
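To make the Infinity Screen mechanic concrete, here is a minimal sketch of gaze-driven camera panning. This is not Tobii's actual SDK or Ubisoft's implementation; the normalized gaze coordinates, dead zone, and pan curve are all illustrative assumptions.

```python
# Minimal sketch of an "Infinity Screen" style gaze-driven camera pan.
# Hypothetical inputs: gaze_x/gaze_y are normalized screen coordinates
# in [0, 1] (as many eye tracking SDKs report them); this is NOT the
# real Tobii API, just an illustration of the mechanic.

DEAD_ZONE = 0.35   # central region where the camera stays still
PAN_SPEED = 90.0   # degrees per second when gazing at the screen edge

def pan_velocity(gaze: float) -> float:
    """Map a normalized gaze coordinate (0..1) to a pan velocity.

    Looking near the center returns 0; looking toward an edge ramps
    the velocity up smoothly, so the view "pushes in" the off-screen
    environment in the direction you look.
    """
    offset = gaze - 0.5                      # signed, -0.5 .. +0.5
    if abs(offset) < DEAD_ZONE / 2:
        return 0.0
    # Rescale the remaining range to 0..1 and apply an ease-in curve.
    t = (abs(offset) - DEAD_ZONE / 2) / (0.5 - DEAD_ZONE / 2)
    return PAN_SPEED * (t * t) * (1 if offset > 0 else -1)

def update_camera(camera_yaw, camera_pitch, gaze_x, gaze_y, dt):
    """Advance the camera angles for one frame of duration dt seconds."""
    camera_yaw += pan_velocity(gaze_x) * dt
    camera_pitch -= pan_velocity(gaze_y) * dt  # screen y grows downward
    return camera_yaw, camera_pitch

# Example frame update at 60 fps, gazing near the right edge:
yaw, pitch = update_camera(0.0, 0.0, gaze_x=0.95, gaze_y=0.5, dt=1 / 60)
```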
[00:01:09.428] Kent Bye: I see. And so one of the things that is challenging with virtual reality is that the typical eye tracking solution is external to the HMD, so it won't be able to actually see the eyes. Have you been considering actually putting eye tracking within a virtual reality HMD?
[00:01:24.987] Henrik Johansson: Yeah, we are actually researching it at the moment, and already at CES we announced that we are working with partners today on VR and AR solutions with our glasses. One of the business units we have at Tobii works on research and marketing, and there we already have a glasses implementation with eye tracking. When we developed that platform, we already had a consumer market in mind, so it is actually a very small implementation of eye tracking, and it's an implementation you can put into an AR or VR environment. We are doing that today with key partners at an R&D discussion level, focused on direct interactions. But by the end of Q2, we plan to have our full SDK ready for AR and VR environments. So yes, we have the solution, and we have a platform small enough that you can actually implement it today in different AR and VR solutions.
[00:02:24.408] Kent Bye: Oh wow, nice. And so you said that you announced a number of partners. Have you announced who you're working with, or just the fact that you've been looking into doing eye tracking with virtual reality and augmented reality?
[00:02:36.258] Henrik Johansson: No, we haven't announced any partners yet. We are working on finding solutions that we believe are good enough to put on the market, and we are dedicated to working with key partners to make integrations that combine our eye tracking with the platform that comes from the Tobii Pro organization and the Glasses 2. We take that solution, bring it into their AR and VR devices, and from there we can build on it and develop a full SDK for the broader market, which will then be available by the end of Q2, as we have already stated.
[00:03:08.946] Kent Bye: I see. One of the things with eye tracking combined with virtual reality is that you can do something like foveated rendering: based upon where you're looking on the screen, you only render that portion at full detail, so it could be more efficient. Talk a bit about some of the benefits that you could get once you have eye tracking within a virtual reality head mounted display.
[00:03:31.672] Henrik Johansson: You're touching on exactly the right point. Foveated rendering is an obvious thing you could do by implementing eye tracking, because you will always know where the user is looking, and then you can render the display, or the monitor, or whatever you're projecting the game on, in a way that follows the gaze. So that's one obvious use case. Another obvious use case for games in general, and of course also in VR, is eye contact. Today when you go into a game and play, the characters are actually quite rude: you look at them, they don't look back at you, they talk in a different direction. But if you implement an eye tracking solution in a game, whether a normal game or a VR environment, then when you look at a character, that character knows you are looking at him and can respond with eye contact, just as we do in normal life. If I talk to one person, that person looks at me and nods. And if I then shift my eyes to another person, that person of course starts nodding and feels included in the discussion.
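As a rough illustration of the foveated rendering idea Henrik describes, the sketch below picks a render resolution scale for each screen tile based on its distance from the gaze point. The tile grid, eccentricity thresholds, and scale factors are made-up example values, not figures from any shipping foveated renderer.

```python
import math

# Sketch: choose a per-tile render resolution scale based on distance
# from the gaze point. Thresholds and scales are illustrative only.

def foveation_scale(tile_cx, tile_cy, gaze_x, gaze_y):
    """Return a resolution scale for a tile centered at (tile_cx, tile_cy).

    All coordinates are normalized to [0, 1]. Tiles near the gaze point
    render at full resolution; peripheral tiles render at reduced
    resolution, saving fill rate where visual acuity is low anyway.
    """
    dist = math.hypot(tile_cx - gaze_x, tile_cy - gaze_y)
    if dist < 0.10:       # foveal region: full detail
        return 1.0
    elif dist < 0.25:     # near periphery: half resolution
        return 0.5
    else:                 # far periphery: quarter resolution
        return 0.25

# Example: an 8x8 tile grid with the gaze at the screen center.
scales = [[foveation_scale((x + 0.5) / 8, (y + 0.5) / 8, 0.5, 0.5)
           for x in range(8)] for y in range(8)]
```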
[00:04:38.510] Kent Bye: And so do your eye trackers just use a 2D camera, or are you actually doing depth sensing as well in order to get a full model of the iris?
[00:04:47.327] Henrik Johansson: I mean, the technology is constantly evolving, so today we can also do depth, but it's not as accurate as the gaze tracking left, right, up, and down, because depth is always a challenge. But it can definitely be done with a correct implementation, without a doubt.
[00:05:07.148] Kent Bye: I see. And so does that mean that with the SDK you'd be able to say how far away someone's eyes are focusing, based upon their gaze or the size of their iris? Being able to determine what they're actually focusing on just by looking at their eyes?
[00:05:20.994] Henrik Johansson: We are developing the SDK at the moment and hope to be ready by the end of Q2, and by that time we will know what exactly is implemented. But yes, it's definitely possible to do even depth with eye tracking in a VR environment.
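One plausible way an SDK could estimate gaze depth, hinted at in this exchange, is vergence: intersecting the two eyes' gaze rays and seeing how far out they converge. The sketch below works in a simplified 2D top-down view; the eye positions, angle conventions, and example numbers are assumptions for illustration, not Tobii's method.

```python
import math

# Sketch: estimate fixation depth from binocular vergence, viewed
# top-down in 2D. Eye positions and gaze angles are hypothetical
# inputs that a binocular eye tracker could provide.

def fixation_depth(ipd_m, left_yaw_rad, right_yaw_rad):
    """Intersect the two gaze rays and return the fixation point (x, z).

    ipd_m:         interpupillary distance in meters (e.g. 0.063)
    left_yaw_rad:  left eye gaze angle, 0 = straight ahead, + = rightward
    right_yaw_rad: same convention for the right eye
    Eyes sit at (-ipd/2, 0) and (+ipd/2, 0), looking toward +z.
    """
    # Each ray is (x0 + sin(yaw) * t, cos(yaw) * t); solve for equal points.
    lx, rx = -ipd_m / 2, ipd_m / 2
    sl, cl = math.sin(left_yaw_rad), math.cos(left_yaw_rad)
    sr, cr = math.sin(right_yaw_rad), math.cos(right_yaw_rad)
    denom = sl * cr - sr * cl          # 0 means the rays are parallel
    if abs(denom) < 1e-9:
        return None                    # gazing at (effectively) infinity
    t_left = (rx - lx) * cr / denom    # distance along the left ray
    return (lx + sl * t_left, cl * t_left)

# Example: each eye converging inward by ~1.8 degrees with a 63 mm IPD
# puts the fixation point about 1 meter in front of the viewer.
point = fixation_depth(0.063, math.radians(1.8), math.radians(-1.8))
```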
[00:05:35.755] Kent Bye: And so some of the solutions that you have right now are mediated through a 2D screen. You know, talk a bit about the immersion that you get just from adding eye tracking to a 2D game like Assassin's Creed.
[00:05:47.761] Henrik Johansson: There are many ways you can implement eye tracking in games, but the way it's implemented in Assassin's Creed specifically is what we define as Infinity Screen, because the screen no longer limits where you can look. The moment you look at the right side of the screen, the screen pushes in the next part of the environment, the part that theoretically would be outside of the screen. You look to the right and the view pans to the right, just as you and I do in real life: we look to the left and see something on the left, we look to the right and see something on the right, or up or down. That is the way it's implemented in Assassin's Creed Rogue. You look in one direction and that direction comes into the center of the display; then you look in the other direction and that centers in. So you have created an environment where you actually see outside the boundaries of the display. That is one of the features. The other benefit of eye tracking is that we now know whether you are actually in front of your computer, laptop, or monitor, because as long as the eye tracker can see your eyes, we know that you're there. If you look away from the screen, the eye tracker recognizes that your eyes are no longer in front of the display, and the game pauses. So if you're in the middle of a fight and someone rings the doorbell and you need to go answer it, you don't lose the fight: when you leave, the game auto-pauses, and when you come back to your computer, it starts where you stopped. There is an automatic pausing feature, because we know whether you're in front of the screen.
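The auto-pause behavior Henrik describes reduces to a small presence state machine: pause once the tracker has lost the eyes for longer than a grace period (so a blink doesn't pause the game), and resume when they return. A minimal sketch, assuming a hypothetical eyes_detected signal rather than any real Tobii API:

```python
import time

# Sketch of gaze-presence auto-pause. `eyes_detected` would come from
# the eye tracker each frame; here it is a hypothetical boolean input.

GRACE_PERIOD_S = 1.0   # don't pause on a single blink or dropped frame

class PresencePauser:
    def __init__(self):
        self.paused = False
        self.last_seen = time.monotonic()

    def update(self, eyes_detected: bool):
        now = time.monotonic()
        if eyes_detected:
            self.last_seen = now
            if self.paused:
                self.paused = False    # player is back: resume the game
        elif not self.paused and now - self.last_seen > GRACE_PERIOD_S:
            self.paused = True         # eyes gone past the grace period

# Per-frame usage, with `tracker_sees_eyes` supplied by the eye tracker:
#   pauser.update(tracker_sees_eyes)
#   if pauser.paused: show_pause_menu()
pauser = PresencePauser()
pauser.update(eyes_detected=True)
```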
[00:07:25.798] Kent Bye: And Tobii had a game jam recently. Were you having people experiment with the different user interactions that you could have using eye gaze? And maybe you could talk a little bit about some of the innovations you saw come out of it in terms of adding eye tracking to gameplay.
[00:07:41.220] Henrik Johansson: The implementation in Assassin's Creed Rogue is of course one. Another obvious one is eye contact, where you look at people and you can get a response. Like in wizard-type games, you need to talk to different people to get different answers, and if you don't look at them, they don't tell you their secret. Another very cool feature you can implement is multidimensional movement. Characters in games today move more or less one way at a time: they can walk straight, and they can go left, right, or backwards. So if you want to pick up something in a game today, you normally need to stand right in front of the object before you can pick it up. What you can do with eye tracking now is just like real life: you can walk in one direction, see something to your left or right, and by just looking at it and putting out your hand, you can actually pick that object up while continuing to walk in the other direction. For example, in one of the games we are showcasing here, Son of Nor, you can walk in one direction, look at a stone, pick that stone up, and continue walking straight ahead; if you want to throw it to the left, you keep walking straight, look to the left side, and throw it there, without changing the movement of the character. That's another obvious one. A little bit deeper of an integration into games, but one that could be really beneficial, is graphics and sound that react to where you're looking. Again, as a human, if I look at one person, that person comes into the center of my focus, and the rest of the environment is in my peripheral view: it's a little bit more foggy, and I see the person I'm looking at very clearly. Then when I change my focus to something else, that object becomes very clear and the person I looked at before becomes a little bit foggy. This you can now implement in games as well, to make them more human and correct. The same goes with sound. If you look at a person, you normally hear what that person is saying, and if you look away at another person, you will hear that person more clearly. This can also now be implemented in-game, because you know where the player is looking. So through both vision and sound, by looking at things and seeing where people are pointing their eyes, you can create environments that are more natural and human.
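Two of the mechanics Henrik lists, eye contact with NPCs and gaze-dependent sound, reduce to the same primitive: resolve the gaze point to a character, then react. Below is a minimal sketch using hypothetical screen-space NPC bounds; a real 3D game would instead cast the gaze ray into the scene.

```python
# Sketch: resolve the gaze point to an NPC, trigger an eye-contact
# reaction, and weight audio toward the gazed-at character. The
# screen-space rectangles are hypothetical stand-ins for a real
# gaze ray-cast into the 3D scene.

from dataclasses import dataclass

@dataclass
class Npc:
    name: str
    bounds: tuple          # normalized screen box: (x0, y0, x1, y1)
    audio_gain: float = 0.5

def npc_under_gaze(npcs, gaze_x, gaze_y):
    """Return the NPC whose screen bounds contain the gaze point, if any."""
    for npc in npcs:
        x0, y0, x1, y1 = npc.bounds
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return npc
    return None

def update_attention(npcs, gaze_x, gaze_y):
    """Make the gazed-at NPC respond and sound clearer than the others."""
    focused = npc_under_gaze(npcs, gaze_x, gaze_y)
    for npc in npcs:
        if npc is focused:
            npc.audio_gain = 1.0   # heard clearly, like in conversation
            # Here the NPC could also turn its head, nod, or share a secret.
        else:
            npc.audio_gain = 0.4   # attenuate peripheral voices
    return focused

npcs = [Npc("guard", (0.1, 0.3, 0.3, 0.9)), Npc("wizard", (0.6, 0.3, 0.8, 0.9))]
looking_at = update_attention(npcs, gaze_x=0.7, gaze_y=0.5)  # the wizard
```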
[00:10:05.140] Kent Bye: And finally, here at GDC this week, there's been a lot of buzz around virtual reality and a lot of big announcements. I'm curious, from Tobii's perspective, where you see virtual reality going and how eye tracking is going to play a part in that.
[00:10:19.022] Henrik Johansson: Yeah, I mean, it's obviously super cool that VR is finally kicking off; it's been talked about for such a long time. It's definitely one of the technologies we believe in, and that's also why we are now entering that arena as well, trying to be part of it, and we're already in good discussions with a lot of partners. Eye tracking technology is good for both current implementations in games and future ones, just because it is actually how humans interact. It's the correct way of doing things, because you and I and everyone else look at things, walk toward things, and pick things up by viewing them. So implementation in a game in the PC market, or anywhere else in a 2D environment, is equally important in a 3D environment. It's not either/or: eye tracking is needed in both if you want to create a truly immersive, human experience. Great, well, thank you so much. Thank you.