#514: Tobii Recommends Explicit Consent for Recording Eye Tracking Data

The eye tracking company Tobii had some VR demos that they were showing on the GDC Expo Hall floor as well as within Valve's booth. They were primarily focusing on the new user interaction paradigms made possible by eye tracking, including using gaze to select specific objects, direct actions, and drive locomotion. I had a chance to catch up with Johan Hellqvist, VP of products and integrations at Tobii, where we discussed some of the eye tracking applications being demoed. We also had a deeper discussion about what types of eye tracking data should be recorded and the consent that application developers should secure before capturing and storing it.

One potential application that Hellqvist suggested was amplifying someone's pupil dilation in a social VR context as a way of broadcasting engagement and interest. He acknowledged that there isn't established science connecting dilation to someone's feelings, but this example brought up an interesting point about what types of data from an eye tracker should or should not be shared or recorded.
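
Purely as an illustration of what that kind of amplification could look like in code (a sketch of my own, not anything Tobii showed), the raw pupil diameter could be normalized against a slowly adapting baseline before being exaggerated on an avatar. Every name here is hypothetical, and, per the consent discussion below, such a signal should presumably only be broadcast with the user's explicit opt-in:

```python
class PupilAmplifier:
    """Exaggerate short-term deviations of pupil diameter from a running baseline.

    Illustrative only: pupil size is dominated by lighting and fatigue, not
    emotion, so treat the output as a stylized avatar effect, not a signal.
    """

    def __init__(self, smoothing: float = 0.99, gain: float = 3.0):
        self.baseline = None       # slowly adapting "ambient" pupil size
        self.smoothing = smoothing
        self.gain = gain           # how much to exaggerate deviations

    def avatar_pupil_mm(self, measured_mm: float) -> float:
        if self.baseline is None:
            self.baseline = measured_mm
        # Track the slow baseline so lighting changes wash out over time.
        self.baseline = self.smoothing * self.baseline + (1 - self.smoothing) * measured_mm
        deviation = measured_mm - self.baseline
        # Amplify the momentary deviation, clamped to a plausible range.
        return max(1.5, min(8.0, self.baseline + self.gain * deviation))

amp = PupilAmplifier()
for mm in (3.0, 3.0, 3.6):             # a brief dilation event...
    rendered = amp.avatar_pupil_mm(mm)
print(f"avatar pupil: {rendered:.2f} mm")  # ...shown larger than measured
```

The slow baseline matters because pupil size mostly tracks lighting; only short-term deviations are even candidates for this kind of stylized "engagement" effect.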

Hellqvist says that, from Tobii's perspective, application developers should get explicit consent for any type of eye tracking data that they want to capture and store. He says, "From Tobii's side, we should be really, really cautious about using eye tracking data to spread around. We separate using eye tracking data for interaction… it's important for the user to know that's just being consumed in the device and it's not being sent [and stored]. But if they want to send it, then there should be user acceptance."
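
A minimal sketch of that separation, with hypothetical names (this illustrates the stated policy, not an actual Tobii API): gaze samples always feed local interaction logic, while anything that leaves the device is gated behind an explicit opt-in.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One eye tracker reading (hypothetical fields)."""
    timestamp: float
    x: float                 # normalized gaze point, 0..1
    y: float
    pupil_diameter_mm: float

class GazeDataPolicy:
    """Consume gaze on-device by default; transmit only with explicit consent."""

    def __init__(self):
        self.recording_consent = False  # off until the user explicitly opts in

    def grant_recording_consent(self):
        # Call only from an explicit, user-facing consent dialog.
        self.recording_consent = True

    def handle_sample(self, sample, interact, transmit):
        # Interaction (selection, aim, teleport) always runs locally.
        interact(sample)
        # Raw data leaves the device only if the user said yes.
        if self.recording_consent:
            transmit(sample)

policy = GazeDataPolicy()
policy.handle_sample(
    GazeSample(0.0, 0.5, 0.5, 3.2),
    interact=lambda s: print(f"select object under gaze ({s.x:.2f}, {s.y:.2f})"),
    transmit=lambda s: print("send to server"),  # never reached: no consent given
)
```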

Hellqvist notes that our eye gaze is semi-conscious data that we have limited control over, and that what to do with that data will ultimately be up to each application developer. Tobii has a separate part of its business that does market research with eye tracking data, but he cautions that using eye tracking within consumer applications is a completely different context than market research, and one that should require explicit consent.

Hellqvist says, “It’s important to realize that when you do consumer equipment and consumer programs that the consumer knows that his or her gaze information is kept under control. So we really want from Tobii’s side, if you use the gaze for interaction then you don’t need the user’s approval, but then it needs to be kept on the device so it’s not getting sent away. But it should be possible that if the user wants to use their data for more things, then that’s something that Tobii is working on in parallel.”

Tobii will be actively working with the OpenXR standardization initiative to see if it makes sense to put some of these user consent flags within the OpenXR API. In talking with other representatives from OpenXR about privacy, I got the sense that the OpenXR APIs will be a lot lower level than these types of application-specific requirements. So we'll have to wait for OpenXR's next update in the next 6 to 12 months to see whether Tobii was able to formalize any privacy protocols and controls within the OpenXR standard.

Overall, the Tobii and SMI VR demos that I saw at GDC proved to me that there are a lot of really compelling social presence, user interface, and rendering applications of eye tracking. However, there are still a lot of open questions around the intimate data that will be available to application developers, and around the privacy and consent protocols that will inform users and provide them with some level of transparency and control. It's an important topic, and I'm glad that Tobii is leading an effort to bring more awareness to this issue within the OpenXR standardization process.

Subscribe on iTunes

Donate to the Voices of VR Podcast Patreon

Music: Fatality & Summer Trip



Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to The Voices of VR Podcast. So on yesterday's episode, I had a chance to try out the eye tracking demo from SMI. And on today's episode, I'm going to be talking about the eye tracking demo that I saw from Tobii at GDC. So let me just describe briefly the actual demo that I was able to have. So they were showing off a lot of user interactions and user interface by using eye tracking. So one example would be like if you're in this field and you see all these pillars, and on the top of these pillars are these bottles. And if you pick up a rock and you just throw it without looking at anything specifically, the rock doesn't hit anything. But if you are looking directly at this bottle on the top of a pillar and you throw the rock, then the rock will just automatically hit whatever you're looking at. So it's an interesting little mind trick that you're throwing it and you feel like it's because of your arm and your aim and how you're throwing it, but it's really more about what you're looking at. So they're starting to use eye tracking gaze for these types of interactions to focus on not only whatever you want to select in a scene, but if you want to teleport to some location, then you look at that place and you can teleport there. And so I had a chance to talk to Johan Hellqvist about some of the features and demos that they're showing about eye tracking, but also taking a step back and talking about some of their involvement in the OpenXR process of trying to standardize these types of peripherals into the ecosystem, as well as talking about privacy and their philosophy around privacy. Since we're starting to have all this really intimate information about our eyes and what we're looking at, then how are they going to approach treating this information and data? So we'll be talking about that, and I'll be unpacking that a little bit more at the end of this episode. So that's what we'll be covering on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Voices of VR Patreon campaign. The Voices of VR podcast started as a passion project, but now it's my livelihood. And so if you're enjoying the content on the Voices of VR podcast, then consider it a service to you and the wider community and send me a tip. Just a couple of dollars a month makes a huge difference, especially if everybody contributes. So donate today at patreon.com slash Voices of VR. So this interview with Johan happened at the last day of GDC on Friday, March 3rd, 2017 in San Francisco. So with that, let's go ahead and dive right in.
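
The bottle-throwing trick Kent describes is a form of gaze-assisted aim correction. As a rough sketch of one way such a mechanic could work (my illustration, not Tobii's actual implementation), the physical throw direction can be blended toward the gazed-at target, so the throw still feels like your own aim:

```python
import math

def normalize(v):
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def assisted_throw_direction(throw_dir, hand_pos, gaze_target, assist=0.8):
    """Blend the physical throw direction toward the gazed-at target.

    throw_dir:   direction vector from the controller's throwing motion
    hand_pos:    world position the projectile is released from
    gaze_target: world position of the object the eye tracker reports
                 the player is looking at, or None if nothing is gazed at
    assist:      0.0 = pure physics, 1.0 = always hit what you look at
    """
    if gaze_target is None:
        return normalize(throw_dir)   # no gaze target: ordinary physics
    to_target = normalize(tuple(t - h for t, h in zip(gaze_target, hand_pos)))
    throw_dir = normalize(throw_dir)
    blended = tuple((1 - assist) * d + assist * t
                    for d, t in zip(throw_dir, to_target))
    return normalize(blended)

# A throw aimed slightly off still flies at the bottle the player is looking at.
print(assisted_throw_direction((0.2, 0.3, 0.9), (0.0, 1.5, 0.0), (1.0, 3.0, 10.0)))
```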

[00:02:41.230] Johan Hellqvist: So I'm Johan Hellqvist from Tobii. I'm VP of Product and Integration. So what we do here at GDC, we show PCs with eye tracking and a lot of games. And we also show VR with eye tracking. We've integrated eye tracking into Vive headsets.

[00:02:57.883] Kent Bye: Maybe you could start by telling me a little bit about some of the features that are being enabled with eye tracking in VR.

[00:03:03.688] Johan Hellqvist: So I think it's basically five things. The first thing is social interaction, eye contact. Seeing a person is essential; if you don't have eye contact, you look like a zombie, both in a game environment but also in a leisure environment. So that's one of the key features that we see. The second one is interaction in general, kind of aim at gaze, where you throw at the point where you're looking. You teleport where you look, for example, and menu interactions: instead of moving around, you look at the button you want to select and then select it with a click, as an example. So we have, I think, nine or ten different kinds of design patterns that are really good for VR and eye tracking.
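
The gaze-plus-click menu pattern Hellqvist mentions reduces to a small per-frame check: whatever the eye ray hits is the hover target, and an ordinary button press confirms it. A minimal sketch, with hypothetical names standing in for the engine's own hit testing:

```python
def gaze_select(menu_items, gazed_item, button_pressed):
    """Gaze-plus-click selection: the eyes aim, a controller click confirms.

    menu_items:     ids of the selectable items
    gazed_item:     item id under the gaze ray this frame, or None
    button_pressed: whether the confirm button was clicked this frame
    """
    if gazed_item in menu_items and button_pressed:
        return gazed_item   # selected without any pointer movement
    return None

# Frame where the player looks at "options" and clicks:
assert gaze_select(["play", "options", "quit"], "options", True) == "options"
# Looking at nothing, or not clicking, selects nothing:
assert gaze_select(["play", "options", "quit"], None, True) is None
assert gaze_select(["play", "options", "quit"], "play", False) is None
```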

[00:03:47.547] Kent Bye: Yeah, sometimes it's difficult to test eye-tracking demos because there's different limitations of the eyes with, like, saccades, for example. When you are looking at yourself in the mirror, you can't necessarily even see your eyes move. But being in other social VR situations, I feel like the eyes tell you a lot more information as to where people are looking, and more contextual information. Maybe you could talk a bit about what eye tracking specifically gives you, and how it enriches, from your perspective, a social interaction.

[00:04:18.482] Johan Hellqvist: So, I mean, gaze and vision, as you just said, is maybe the most important sense that we have. And especially in a VR environment that is closer to the real world than a normal computer game, eye contact and getting a response from what you're looking at is super important. And I think eye tracking is maybe one of the most important additions to modalities that we can have in VR, because it makes so much sense. If I'm looking at you and then I look to the side, you'll probably turn your head and say, what's behind me? And if I'm having a conversation and I'm looking at my colleague here, but I'm talking to you, you may get confused. And that's extremely difficult to model without eye tracking in, for example, a game environment, but also in any environment where we have something that you should interact with.

[00:05:04.638] Kent Bye: Yeah, and I think that's both where you're looking as well as eye contact. It adds a punctuation sometimes to what you're saying. But also blinking, I think, is a subtle thing as well. I know that film editors, for example, look for blinks, and they say that blinks are indicators of a thought either beginning or ending. So I think having accurate blinking matters. There's a lot of these things that you can mimic and simulate, but actually having that level of body language is different. One of the things I've noticed in having social VR experiences with eye tracking here at GDC for the first time is that I started to be able to know the person in real life and then be able to correlate that same body language. So yeah, I'm just curious if you've been able to have these social VR experiences with other people you know, and what that is like, being able to translate how they act in VR and how they are in real life.

[00:05:53.330] Johan Hellqvist: Yeah, we've been working with Rec Room, with Against Gravity, and that's exactly what we've been doing there. To have a multi-user environment, and not only the eye, but also, as you said, the blink part is super important. And you can also take that another step and look at the pupils, how they dilate: if you're in love, they get bigger, and if you hate them, they get smaller. Very unscientific, but you could also illustrate that moving forward. So you can do more things, you can do small feature mapping around your eyes, so you can have the eyebrows and so on, getting it more alive. It gives a completely different feeling when you actually go, hey, it's looking back at me, and here I am, and then that person looks at you. We had a kind of virtual meeting, which was kind of fun. We were sitting in chairs and looking and chatting, and also using the whiteboard that is available in Rec Room, and it felt kind of, yeah, that's good. We had one example here of a guy who actually has 2D vision in reality, who got 3D vision for some reason in VR. So for him, it was actually a better experience in VR than in reality. So that was a really interesting experience. And I think someone said that eye tracking is maybe for VR what touch was for the smartphone. So that's a good statement from one of the visitors here.

[00:07:18.248] Kent Bye: Yeah, just one of the things that you mentioned there: being able to track the dilation of the eyes, and that that could be an indicator of how you feel about other people. And I feel like that could be a little bit of private information. I'm not sure if I necessarily want that broadcast to other people, sort of my more internal state. They say the eyes are the window to the soul. So to talk about the different privacy implications of eye tracking in VR, I'm curious to hear your thoughts on the types of things that you can even detect and determine with eye tracking. And then what are some of the privacy implications, and how do you architect for privacy, to ensure that that type of intimate information is made available for people to make a choice as to whether or not they want it to be broadcast, or to be recorded and stored?

[00:08:05.147] Johan Hellqvist: I think it's a very good question and a very important question. Just a quick comment on the pupil: I think scientifically there's no evidence that dilation really expresses your feelings. But we think, from Tobii's side, that we should be really, really cautious about using eye-tracking data to spread around. So I think we separate: if you use the eye-tracking data for interaction and consume the eye-tracking data in the device, it's important for the user to know that, yeah, it's being consumed in the device, it's not being sent. But if they want to send it, then there should be user acceptance. So I think that is really important, and that's something that Tobii will drive, for example, as a member of the Khronos OpenXR initiative. And that will be one thing that we will see to, because eye tracking is super powerful, and the gaze does tell you what you're looking at. And it's kind of semi-conscious: if you hear a bang, you will look there automatically, so you have semi-control of your gaze. So it's a really important question, and we will treat that with real, real... it's important.

[00:09:15.053] Kent Bye: Well, I think there's many different layers here. There's the operating system layer, the hardware layer, and then the application-specific layer, in terms of the type of data that you have access to. And so with this initiative with OpenXR, I know that they're in the process of trying to standardize the APIs in a general sense, so that you can have kind of a generalized device driver, so that you can write an eye tracker device driver once and anybody could swap it in and out across all the different headsets. And I think that's really important in order to lower the barriers to innovation for a company like Tobii, to be able to create what you're creating and to have it across all the headset devices. But that to me seemed so low-level that it doesn't necessarily take into account the actual data that's being captured, which might be up to the application developer as to whether or not they want to capture and store that data, based upon whether they want to use additional information like that for advertising, for example.

[00:10:11.935] Johan Hellqvist: Yeah, I mean, one big portion of what Tobii does in other segments is market research. But I think it's important that when you do consumer equipment and also consumer programs, the consumer knows that his or her gaze information is kept under control. So we really want, from Tobii's side: if you use the gaze for interaction, then you don't need the user's approval, but then it needs to be kept on the device, so it's not being sent away. But it should also, if the user wants, be possible to use the data for more things. And that's something that we are working on in parallel, but I think it is crucial because it's important data. But today, I mean, when you use the mouse on a webpage, that gets recorded. And that's not something that many people know. But I think we want to take that into account, that we do nurture privacy, because it is really important.

[00:11:05.773] Kent Bye: Yeah, and one of the things I really appreciated about this demo is that you don't have a reticle as you're moving your eyes around. I know a lot of eye-tracking demos like to show you where you're looking by putting a big red dot in the middle of where you're looking, which I find breaks immersion. But you made the stylistic decision to not do that. I'm just curious to hear some of your thoughts on the design process behind that.

[00:11:26.775] Johan Hellqvist: Yeah, I think it's just intrusive and disturbing. I mean, it makes sense if you want to really see where you're looking, but you already know where you're looking, so why should you point that out? Maybe if you want just to see how good the eye tracker is, then it makes sense, but from a user perspective, it makes absolutely no sense. In the throwing demo, for example, you just look at the ball and throw. You already know where you want to throw, so why show it? It would just be unnecessary information. I can't comment on what the others do, but I think from our perspective, we've gone from being kind of a high-value vertical company, serving specialty markets, to a consumer brand, a consumer company. And that transition is a huge transition, because you go from working for some, or for plenty, to working under all circumstances for everyone. That's what you need to be if you're doing consumer-grade eye tracking, and that's what we've been focusing on. And we still have our verticals with market research, science analysis, and assistive usage, helping disabled people to communicate, get a life, get a language. And then we also have the consumer side, where we work with the OEMs and the ecosystem.

[00:12:36.727] Kent Bye: And I'm wondering if you could tell me a bit more about the foveated rendering features that you're working on, and whether or not you're having to do some sort of initiative through OpenXR in order to standardize the interfaces that you might need from all the different GPU manufacturers to enable that.

[00:12:52.106] Johan Hellqvist: I think from an eye tracking perspective and an OpenXR perspective, what we can see to is that for the eye tracking signals, you have one stream for foveated rendering, because that stream has a certain purpose, which is very much different from the purpose when you're doing interaction. And also that you introduce quality levels of eye tracking, so all eye trackers can be used, but depending on the use case, you need a certain level of accuracy, for example. So those are things that we will drive in OpenXR, and also the privacy issues on the eye tracking part. So that's kind of what we bring, based on our experience: what you can do, what you should do, and what you should not do. And hopefully we can make some recommendations on how to design things as well. On the foveated rendering part, we work with the full ecosystem, all the graphics vendors, and many game studios, as you can see from the games that were developed. I can't talk too much, but it's, of course, very high priority for us, and we have many people working on that internally.
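
One way to picture the stream separation and quality levels Hellqvist describes: each gaze stream advertises its purpose and accuracy, and a consumer picks a stream that meets its requirements. This is purely illustrative; none of these names come from any published OpenXR specification:

```python
from dataclasses import dataclass
from enum import Enum

class GazePurpose(Enum):
    FOVEATED_RENDERING = "foveated"   # raw, low latency, consumed by the compositor
    INTERACTION = "interaction"       # filtered, stable, consumed by the app

@dataclass
class GazeStreamDescriptor:
    """Hypothetical descriptor for one of several gaze streams."""
    purpose: GazePurpose
    accuracy_deg: float   # worst-case angular error of this stream
    latency_ms: float

def pick_stream(available, purpose, max_error_deg):
    """Choose the lowest-latency stream that meets the accuracy requirement."""
    candidates = [s for s in available
                  if s.purpose == purpose and s.accuracy_deg <= max_error_deg]
    return min(candidates, key=lambda s: s.latency_ms) if candidates else None

streams = [
    GazeStreamDescriptor(GazePurpose.FOVEATED_RENDERING, accuracy_deg=1.5, latency_ms=4.0),
    GazeStreamDescriptor(GazePurpose.INTERACTION, accuracy_deg=0.8, latency_ms=12.0),
]
# Foveated rendering tolerates coarser accuracy but needs low latency;
# precise menu selection might demand under a degree of error.
print(pick_stream(streams, GazePurpose.FOVEATED_RENDERING, max_error_deg=2.0))
print(pick_stream(streams, GazePurpose.INTERACTION, max_error_deg=1.0))
```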

[00:13:54.567] Kent Bye: So I'm curious to hear some of the other use cases for eye tracking in VR that you see. Maybe some of the industry verticals that you see as being kind of an early win for having this integration.

[00:14:06.872] Johan Hellqvist: As I mentioned, market research is one obvious one. Also design: there have been many interested, actually even here at GDC, in CAD, zooming where you look, kind of turning around where you look, working with tools and palettes, and menu selection is much easier. So that's one. There are some medical applications that we see coming up. Movies and visualization is also something we've been asked about. Some automotive is also in the VR space. So basically, whatever your imagination thinks is good for VR. We've been approached about everything from live, super-realistic games with handheld equipment, running around, to fine-tuning X-ray and surgery powered by VR. So it's kind of the full range, but I think gaming has a tendency to be a fast mover, so that's why we are at GDC. And then there's a couple of high-value verticals that will definitely be using VR for sure.

[00:15:06.912] Kent Bye: Is there a way for people to get a kit to put a Tobii tracker into a VR headset today? Or is it something that you're working towards, like the second generation of VR headsets having a custom integration?

[00:15:19.276] Johan Hellqvist: So right now, we're working primarily with the partner ecosystem. And we hope to be able to announce something later.

[00:15:24.238] Kent Bye: OK, great. And finally, what do you think is the ultimate potential of virtual reality, and what it might be able to enable?

[00:15:32.920] Johan Hellqvist: Wow. I don't know. I mean, it's a very good question. I actually don't really have personal favorites. Maybe meetings. Maybe meetings may be good enough and real enough to make real sense. And for it to be really, really true enough, you have to have eye tracking. It's just one of those killer features in social interaction.

[00:16:01.922] Kent Bye: So you don't have to fly across the country to have a meeting face to face. You could do it in VR perhaps.

[00:16:05.925] Johan Hellqvist: Well, they said that when you got the telegraph as well, but people still fly. So I think it will always be better to meet, but it would be a good complement.

[00:16:17.461] Kent Bye: Awesome. Well, thank you so much. So that was Johan Hellqvist. He's the vice president of products and integration for Tobii, which is an eye tracking company that was showing off some demos at GDC. So I have a number of different takeaways about this interview. First of all, I really respected what Johan was saying about their ideas around privacy and VR. You know, it sounds like there are going to be a few different layers that are involved here. There's the application layer, there's the hardware layer, there's the operating system layer. And so I think what Johan is saying is that from the hardware perspective, they just have this kind of preference and bias to not just automatically share and store all this information about what you're looking at. They're really encouraging the application developers, at least if they're going to want to capture and record that data and information, to provide that as an option for the users to be able to opt into. I think that there are going to be lots of different benefits of being able to do that within the types of experiences that you're creating. However, it's just more of a question of, does that need to be stored on a server forever? And is that going to be connected to your personal identity and then data mined? There's just a lot of really deep philosophical questions about what type of information from our body and our biometric data we are going to want to both share and have these companies store and use for various purposes. So I'm super glad to hear that they are thinking about privacy, at least, and potentially even trying to build some of that into the API layer of the OpenXR initiative. When I was talking to some of the other members of OpenXR and asking them about privacy, they essentially were saying that's kind of at a higher level of the application layer stack than what OpenXR is really trying to solve. However, it does sound like Tobii would be one of the companies trying to potentially have something like an option for a flag to be set, for example, as to whether or not some of this data is being recorded and stored. So this is still early days, and a lot of these discussions about OpenXR are going to be happening mostly behind closed doors in a process that's going to be unfolding over the next year or so. But this is really kind of the first time that I've heard any company say, hey, we're going to try to perhaps architect some privacy settings within this OpenXR standard. So it'll be interesting to see how that actually unfolds and where it ends up when we start to move towards a 1.0 spec, probably sometime next year. Now, in terms of comparing the Tobii demos with the SMI demos that I saw at GDC, the Tobii demos weren't as fully fleshed out. Tobii wasn't specifically showing a foveated rendering demo, although I'm sure they're in the process of working on that. I saw the Tobii demo on the GDC floor, and it was like the end of the week. From what I saw, there was a lot more user interaction for how to use your eyes as an expression of agency, to be able to control different objects and to be able to, you know, throw a ball, look at it, and kind of bring it back. And there was just a brief moment of being able to be embodied with an avatar. But again, it's actually really difficult to judge and track what your eyes are doing within one of these demos.
What I would really want is just a camera recording me doing all sorts of crazy stuff, like rolling my eyes, looking up and down. You can't actually observe yourself doing that, because your perception just kind of clicks off when you have a saccade. But I'd love to be able to watch a video of that to see what it looks like to other people. And you know, you can wink at yourself and blink, and those pick up pretty well. And you can see the vestibulo-ocular reflex as well: if you're looking at a mirror and you're staring at yourself and you turn your head left and right, then you'll see your eyes just not move. And that is something that shows up pretty well in social VR, but you can also see that within yourself. So I also asked Tobii whether or not they had plans to get integrated into these headsets, and they're not really saying what the plans are. It doesn't sound like any decisions have actually been made yet. When it comes to having some of this eye-tracking hardware readily available for developers to start working on, it sounds like SMI has perhaps a little bit more streamlined way of going through them and getting kind of a custom headset that already has it integrated. With Tobii, it sounds like you may have to get into their partnership program in order to start to get some of this hardware. But, you know, Rec Room was one of the games that both Tobii and SMI were starting to do these integrations with. From what it sounds like from talking to both SMI and Tobii, they're not really focusing on creating a consumer-based version for people to add on; they're really working towards this kind of integrated solution in the next generation of headsets. At least that's my sense for right now, from what they've told me so far. So that's all that I have for today. I just wanted to thank you for joining me on the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and become a donor. Just a few dollars a month makes a huge difference. So go to patreon.com slash Voices of VR.
