Wendy Seltzer is the Lead Strategist and Counsel for the World Wide Web Consortium (W3C), and she says that privacy and security on the web have been taking up a lot of her time lately. I had a chance to talk with her at Mozilla's View Source Conference in Amsterdam, where she shared highlights from the W3C's latest technical plenary (TPAC) meeting, including threat modeling for privacy and security, blocking third-party trackers, differential privacy, curtailing active and passive fingerprinting, and the diversity of approaches that the different browser vendors are taking to privacy on the web. Seltzer is focused on making the web a trustworthy platform and on exploring the underlying economic business models by providing new web payments infrastructure. She also says that the immersive web will introduce even more privacy and security challenges, especially when it comes to ensuring that any payments go to the intended first-party origin, and that the level of privacy invasion with immersive tech could be enormous if they don't get it right.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to The Voices of VR Podcast. So continuing on in my series of looking at XR ethics and privacy, today's conversation is with Wendy Seltzer. She's the strategy lead and counsel to the World Wide Web Consortium. So this conversation happened at the View Source conference, where there's lots of people from the open web and the W3C. From that community, a big hot topic was looking at different issues of security and privacy. This is one of the biggest things that Wendy is looking at right now, being the strategy lead looking at all these hot topics within the community and what needs to be addressed, and security and privacy is right up there at the top. She talks a little bit about what is happening at the W3C around that and the different dynamics of the open web, different aspects of threat modeling, the different ecosystem dynamics, and working with the different companies. And yeah, looking at this recalibration process happening within the open web technologies, a lot of these best practices will need to be adapted, amplified, and reconfigured even further to address the whole new range of privacy concerns when it comes to WebXR, as well as the immersive web in general, wherever it ends up going. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Wendy happened on Tuesday, October 1st, 2019, at the View Source conference in Amsterdam, Netherlands. So with that, let's go ahead and dive right in.
[00:01:43.514] Wendy Seltzer: I'm Wendy Seltzer, I'm strategy lead and counsel to the World Wide Web Consortium. So as strategy lead, I'm looking across the platform for where do we need new standards work or standards focus. But I come to that from security and privacy, so I am particularly interested in how do we make the web a platform where users have an expectation of privacy and that expectation is met effectively.
[00:02:13.587] Kent Bye: Yeah, maybe you could give me a bit more context as to your background and your journey into doing what you're doing now.
[00:02:19.211] Wendy Seltzer: So I am a lawyer and technologist, and as a lawyer I've done work in intellectual property and information privacy, looking at how we use new technologies and how those intersect with laws that were often written with much earlier technology environments in mind. So I've been particularly interested in privacy as we face questions of privacy from government surveillance, from corporate surveillance, and from social surveillance. I think privacy is a necessary element of being human and formulating thoughts in society, that we have space offstage to compose ourselves, and if we don't secure privacy protections, we lose out on that basic element of humanity.
[00:03:19.905] Kent Bye: Yeah, we're here at the View Source conference in Amsterdam, and privacy comes up over and over again in a lot of the presentations. It seems like a bit of a hot topic in the community right now, and there seems to be a threshold that has been crossed in many different ways. So, I guess, what are some of the things that the W3C is trying to do to rein it back in a little bit?
[00:03:38.894] Wendy Seltzer: Well, I just came from W3C's TPAC meeting, our technical plenary meeting in Fukuoka, Japan, last week. There was an entire track of privacy sessions during our day full of breakout sessions, with a whole bunch of good thinking on how do we do better privacy reviews of existing specifications, how do we do threat modeling for privacy on the web, how do we set priorities for what to work on first, and then some very specific API questions around advertising and privacy. How do we respect user interests in not being tracked and user interests in getting access to advertising-supported content? Are there ways to provide advertising measurements without tracking users around the web? Some of the concepts of differential privacy, or statistical or aggregate measurements, can help do that if we standardize to enable the browser, as user agent, to abstract away some of the detail of the individual access.
[00:04:55.053] Kent Bye: Well, can you talk a bit about what you're doing in terms of trying to collaborate with these tracking technologies? Because it sounds like they're kind of seizing all this information, and you have the ad tracker blockers that are completely blocking it out. So that in some ways creates a cultural and market dynamic that is a signal to them that they may be crossing an ethical boundary. I was just talking to Selena from Mozilla, who was talking about whitelists of trackers that are a little bit more ethical. But is there a middle ground, I guess, from a W3C perspective, like, these are what your specifications are? Or does it become more of a cultural issue and a market dynamic of how things actually play out?
[00:05:35.135] Wendy Seltzer: Well, right now I think we're seeing an array of experimentation and each of the browser implementations is doing something a little bit different with tracking protection. We see Firefox doing some tracking protection. We see Apple doing its ITP. We see Brave doing other things. We see Google doing privacy sandboxing in Chrome. And one thing we hear from advertisers is that all of that makes it an uncertain landscape. They don't know quite what to expect. And so that offers a potential pathway toward cooperation. If we can give some more settled expectations for the advertisers and publishers, even if those expectations are you won't be able to track people individually, but maybe we can give an aggregate measurement that enables them to learn which ads resulted in user follow-up to the website, not on a user-by-user basis, but on a cohort basis. And there are several different, sometimes divergent proposals for what that would look like in specific, but I'm encouraged that a lot more people are concerned about privacy, interested in being user agent stewards of the user's interest in privacy and really interested in thinking together how can we protect those interests.
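To make the aggregate-measurement idea concrete, here is a minimal sketch of differentially private conversion reporting. This is not any vendor's actual proposal; the function name, epsilon value, and cohort data are all illustrative. The idea is that Laplace noise is added to a cohort total, so the report says which ads worked for a group without revealing any individual user's behavior:

```python
import random

def dp_conversion_count(conversions, epsilon=1.0, sensitivity=1.0):
    """Return a noisy aggregate conversion count.

    Adds Laplace noise with scale sensitivity/epsilon, so the reported
    total reveals little about whether any single user converted.
    """
    true_count = sum(conversions)
    scale = sensitivity / epsilon
    # A Laplace(0, scale) sample is the difference of two exponentials.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

# Each entry: 1 if that user followed the ad to the site, else 0.
cohort = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1]
print(dp_conversion_count(cohort, epsilon=0.5))
```

Smaller epsilon means more noise and stronger privacy; the browser, acting as user agent, would be the one computing and releasing only these noisy aggregates.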
[00:07:09.614] Kent Bye: One of the big challenges that I hear is that, for one, the economic business model of surveillance capitalism seems to be built on a foundation that is very ethically questionable. It's got lots of transgressions against our sovereignty and our right to privacy, but also potentially uses information that has been aggregated against us in certain ways. So how do you address that ethical violation and the cultural normative standards that have developed, so that we've created an economic dependence on this being how we do things? To try to create new economic models that are able to sustain things, but slowly wean off this other model, it seems like maybe some of these tracking-protection technologies are able to help catalyze that migration, that paradigm shift. But from your perspective, how do you take a holistic look at something like this, where there may not be a clear alternative economic model? And if that economic model is in complete contradiction to user privacy, then how do we move forward with sustaining an open web?
[00:08:11.627] Wendy Seltzer: Jump right to the difficult questions. I think it's also encouraging to see Coil here with their web monetization proposal, talking about whether we can use micropayments instead of advertising and tracking, even on an aggregate basis, and Brave doing other things with monetization in the browser. Maybe the answer is an entirely different model, and tracking-based advertising isn't the right way forward. Maybe we will finally make enough progress with web payments that we can make subscriptions and direct payments for news and other online content an alternative to ad-supported content. It's hard to predict where things will go, and it's hard to break out of the path that we're on right now. It's difficult from right here to see where we might go that is completely outside of the tracking-based and interest-based advertising model. But we had lots of news and creative material produced before that model came into effect, and so I am hopeful that we'll see other things supplant that model.
[00:09:38.063] Kent Bye: So coming from a background of being a lawyer, there are certain rules like GDPR, the law in Europe. In the United States, we have maybe a constellation of different rights, with the First Amendment, the Fourth Amendment, and the Fifth Amendment, some sort of implicit rights to privacy. Do you feel like there need to be policy-level changes to address some of these privacy issues? Because at the W3C, you're looking at the international community; you're talking about the policies for all these different countries. Do you feel like there's a role for good policy to help shift things as well?
[00:10:18.536] Wendy Seltzer: Sure, and speaking as an individual, without my W3C hat on, I do think that we need regulation and policy working hand in hand with technology. I say that because W3C isn't a policy organization, and so it's not in that advocacy role. But I think we've seen from GDPR implementation an increase in attention to privacy, and that has forced companies who want to do business in Europe or with European citizens to segment their data collections more carefully, to think harder about what they need to collect, about the consents that they are getting before they collect information, and how they can comply with obligations to delete data or be responsive to user data requests. And each of those gets some global reach, as companies then think, why would we do some things differently for different regions if we can implement one good data flow? So, sure, the thought of fines is also bracing to companies, but the GDPR sets up a regime in which privacy is an expectation of users, not something that they are expected to bargain away. The US framework has been more of a "privacy is something you give away" rather than "privacy is something you have." The model here is notice and consent: when you click through the enormous privacy policy and then continue to take actions on the site, you're often deemed to consent to everything that they say they'll do, and there's very little pushing them to say that they won't collect data. So with that kind of incentive, it's natural that we have seen business models that start with "slurp up all the data and figure out if it's useful afterwards." So I think it's useful to get regulatory pushback that there are liabilities for data collection, or reasons to minimize the data to what a user would actually voluntarily give in exchange for services, and to be more careful with that.
[00:12:53.183] Kent Bye: Yeah, this morning Selena from Mozilla was talking about how a lot of the original web APIs, when they were developed, had certain expectations about the market dynamics that were happening, but some of the things with the data collection have since been used for potentially psychological manipulation or sociological control. At the W3C, you're talking about threats and threat models. So maybe you could talk a bit about some of the larger context that has led the W3C to look at some of these potential threats.
[00:13:24.897] Wendy Seltzer: Sure. The threat model document comes out of a desire to understand more of the context in which we're doing privacy reviews. So as we look at a new API, or a new feature that's being built into something like CSS, we want to ask what its impact is on user privacy. Threat modeling helps us to get a background sense of what we are trying to defend user interests against. And what is the baseline we're working from? What is it expected that parties will be able to get as part of an ordinary web transaction, so we can understand what's out of the ordinary and what's invasive? Some of that is looking at the difference between first parties and third parties: the origin model of the web gives greater powers to the party who serves you a website than to those others who are included into the mix, and the data sharing between and among them is part of what enables cross-site tracking around the web. So it's setting up an understanding of what a normal web transaction is. Of course you have to reveal the request you're making in order to receive a webpage in response. You're revealing aspects of your behavior on the site by, you know, what click follows from where you got to. If we wanted to scale those back, it would really require a different design: use Tor, use VPNs, use lots of distinct browsers for each click. And it's understanding what we're starting from and what we can reasonably expect of new features: don't add fingerprinting surface, and possibly don't duplicate fingerprinting surface, in case we later want to remove some of the existing privacy-invasive features. Some of these are questions that doing threat modeling helps to point out, and sometimes there will be tensions between different aims for web technology. If you're doing a WebRTC conversation, you need access to the microphone and possibly to the video camera. Is that something that should require user permission? Should it require user acknowledgement every time, or one time?
Should different browsers make that choice differently depending on what their users anticipate when they use that particular browser or browsing environment? We'll see as we go forward. Another area is passive versus active fingerprinting. Is it better if a feature requires an API call, so that at least a web-wide census or user-agent monitoring can see when the API is being called and ask, why did that site need access to my canvas? Exposing that perhaps there's some fingerprinting going on might be more acceptable than simply returning information with each request.
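The "fingerprinting surface" Seltzer describes can be quantified: attributes that are individually innocuous combine multiplicatively. The back-of-the-envelope sketch below uses made-up cardinalities purely for illustration; real attribute distributions are skewed, so treat this as an upper bound on identifying bits under an independence assumption, not a measurement:

```python
import math

# Hypothetical counts of distinct observed values for a few passively
# visible attributes. The numbers are invented for illustration.
attribute_cardinalities = {
    "user_agent_string": 5000,
    "screen_resolution": 100,
    "timezone": 40,
    "installed_fonts_hash": 10000,
}

def max_identifying_bits(cardinalities):
    """Upper bound on identifying bits if attributes were independent
    and uniform: the sum of log2(distinct values) per attribute."""
    return sum(math.log2(n) for n in cardinalities.values())

bits = max_identifying_bits(attribute_cardinalities)
print(f"{bits:.1f} bits of fingerprinting surface")
```

Around 33 bits is already enough to single out one browser among billions, which is why the guidance is to avoid adding, or even duplicating, fingerprinting surface in new features.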
[00:16:56.172] Kent Bye: And so is this issue of security and privacy taking up most of your time at the W3C, or are there other hot topics that are also coming up that you're working on?
[00:17:04.853] Wendy Seltzer: Well, in the overall strategy, I think security and privacy is taking a bigger chunk, because we are seeing more interest from users and developers. How do we make the web a trustworthy platform, a place that users can come to and see that it's safe to go browse to a website? It won't do untoward things to you just because you've typed a URL or followed a search result, and we want to be able to make those assurances. Other pieces of what I'm looking at are: how do we continue to evolve the web as a capable platform for development? How do we make it a place where the developers of immersive realities can work, where there's a good payment infrastructure, where there are good capabilities for device interaction?
[00:18:05.737] Kent Bye: And how is what's happening in the immersive web and the future of virtual and augmented reality on your radar when you think about some of the potential privacy concerns and issues there?
[00:18:17.860] Wendy Seltzer: I think it's a huge area for exploration, because the degree of privacy invasion in an immersive environment could be enormous if we don't get it right. And there are lots of ways we could get it wrong. What would a permission prompt look like in an immersive environment? How do we convey security indicators there? We already know that it's difficult to convey those on the 2D web. How do we give users indications that help them to convey their intents of where to navigate, or what environments they're in? If somebody is asking for payment in an immersive environment, how does the user get assurance that it's a trustworthy party on the other side? I think it's also exciting, because it's an opportunity to develop new paradigms and to build on increased user research and understanding of how we could build this environment to work well. How can we, as we're introducing new design decisions, start by introducing things that are privacy protective?
[00:19:35.858] Kent Bye: So for you, what are some of the either biggest open questions you're trying to answer or open problems you're trying to solve?
[00:19:44.784] Wendy Seltzer: For me, I see the persistent question of how we govern an open platform. One of the great virtues of the web, and what's allowed it to develop as it has, is that anyone can do anything with it. And that's cool: anyone can extend it in directions that its inventors didn't anticipate. And yet, how then do we offer assurances to users that they'll be safe on that platform? Where's the right place to set limits, and what are the good practices that user agents and those building the web should follow? We've had a period of "do anything and let anything flourish," and we've seen that along with the really cool creative sites there's the misinformation and the abuse, and it's hard to find the right toggles to set limits on those, to enable people to form communities that resist being abused, that resist misinformation and misdirection. So I think we need, and are working on, things that can be foundations for trust on this open platform. I'm excited by technologies like web authentication that make it easy for people to authenticate without passwords, without phishing, and without data breach risks on that authentication pathway. That, and components like it, can help build toward these better user-controlled communities.
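The web authentication (WebAuthn) flow Seltzer mentions rests on signing a fresh server challenge rather than transmitting a password. The sketch below illustrates only that challenge-response shape; it substitutes a shared-secret HMAC for WebAuthn's public-key signatures to stay dependency-free, so unlike real WebAuthn, the verifier here still holds a secret worth breaching. Function names are invented for the example:

```python
import hashlib
import hmac
import secrets

# Illustration only: real WebAuthn uses public-key signatures, so the
# server stores only a public key and has no breachable shared secret.
def register(credential_store, user_id):
    """At registration, the authenticator creates a per-site credential."""
    credential_store[user_id] = secrets.token_bytes(32)
    return credential_store[user_id]

def sign_challenge(secret, challenge):
    """The authenticator signs the server's fresh challenge; no password
    or reusable token ever crosses the network."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(secret, challenge, response):
    return hmac.compare_digest(sign_challenge(secret, challenge), response)

store = {}
secret = register(store, "alice")
challenge = secrets.token_bytes(16)           # fresh per login, so replaying
response = sign_challenge(secret, challenge)  # an old response fails
assert verify(secret, challenge, response)
```

Because the challenge is fresh each time and scoped per site, phishing a response from one origin is useless on another, which is the anti-phishing property she highlights.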
[00:21:34.344] Kent Bye: Well, because you have a background in law and you've looked at privacy, I'm not sure if you're familiar with Dr. Anita Allen, one of the founders of the philosophy of privacy. She gave a talk to the American Philosophical Association in January, and she was laying out to the entire philosophy community, saying, hey, we don't have a comprehensive framework for privacy, and here are the things she thought we need to do to get there. It immediately hit me when I heard her say that: oh, because there's not a comprehensive framework for privacy, that's kind of left up to each of these individual companies. Whether it's Google or Facebook or Amazon or Apple, any company, they have to decide what that threshold is. So how do you take something that's so unbounded and so contextually dependent and then come up with an operational definition, or deal with all the different contextual differences? Like, how do you handle how difficult it is to pin down what privacy even is?
[00:22:28.185] Wendy Seltzer: We can't throw up our hands. We can't say that until we know everything about privacy, we can do nothing about privacy. We need to take incremental steps, and I think meeting user expectations is one of the cores. With GDPR, for all of the hand-wringing about that law, companies are doing work to comply with it, and have set up data segregation and data controllership agreements to manage data flows. Each of those helps to build up a better understanding and control of user data, and better meeting of user expectations. And we'll still find cases where, despite taking all of the best precautions, users are surprised. Those are opportunities for rethinking and designing better controls. And we'll find places, I'm sure, where some of those controls are too restrictive, and people want ways to gain user permission to do something more with data. But I think respect for the user as the individual making choices, and respect for user communities, is key to thinking through how we give them the privacy they expect.
[00:23:52.642] Kent Bye: Great. And finally, what do you think is the ultimate potential of the open web, with privacy architected in mind and immersive technologies in there as well? Where do you think this is all going?
[00:24:06.917] Wendy Seltzer: The reason that I got into web technology is because it enables human communication. And I think that that is a mix of encrypted one-to-one communication and encrypted group communication and open public communication. And the better we can help people to choose their communication modes, find their communities of interest, and take part in social and political and economic discourse, the better the web will fulfill that potential.
[00:24:44.593] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the web community?
[00:24:52.356] Wendy Seltzer: Keep inventing cool things and respecting users while you're at it.
[00:24:59.003] Kent Bye: Awesome. Great. Well, thank you so much for joining me today on the podcast. So thank you.
[00:25:02.586] Wendy Seltzer: Really good talking with you. Thank you.
[00:25:04.567] Kent Bye: So that was Wendy Seltzer. She's the strategy lead and counsel to the World Wide Web Consortium. So I have a number of different takeaways about this interview. First of all, privacy is a hot topic, not only in the larger culture right now, but especially when it comes to technology and all these different ecosystem partners; as strategy lead, that's the topic that's actually taking up most of Wendy's time. So they had just come back from a technical plenary meeting where there were a lot of privacy sessions and privacy reviews. And it seems like one of the things that they were doing there was coming up with threat modeling: looking at all the different risks and threats to users on the web, setting a list of priorities of what to tackle first, and looking at the APIs and which things are going to be threats to user privacy. Are there ways to take aggregate information, using things like differential privacy, so that maybe these tracking companies can get aggregate statistics without having to tie it down to individuals? She talked about the differences between the first party and the third party. So when you go to a website, whatever that URL domain is, that's the first party, but in order to do the advertising, you have all these third-party trackers that are tracking and tracing you across the web and trying to create this whole profile. And so how do you start to balance the relationship between these first parties and third parties? The vendors take different approaches: privacy sandboxing in Chrome, Apple has its own approach, and the Mozilla and Brave browsers are taking different approaches as well. So it seems like, because there are so many different strategies, it's creating a little bit of unsettledness within the ad tech community.
And so rather than trying to optimize for each of these different strategies, there may be a point of leverage to actually collaborate and to try to get things for everybody on a little bit more of an equal basis, without having a lot of this surveillance and tracking across the web. Wendy was talking a little bit about fingerprinting and the different risks associated with that. You can look at different aspects of how things are configured on your computer, your build and your canvas size, all these kinds of metadata aspects of the user agent that are usually used to deliver web content to you. But they can be used in aggregate to create a unique fingerprint, so that even though you're supposed to be anonymous as you go around the web, there are ways to do all this passive and active fingerprinting, to gather all this metadata on you and actually track you from site to site. So she's thinking about different ways of mitigating that: being a little bit more skeptical as to whether or not you give out information, and seeing whether it's for the function of fingerprinting or whether they actually contextually need it for some reason, rather than just automatically giving over that information. And so building that a little bit more into the architecture of the web, so that it's not automatically revealing people's identity based upon all the metadata from your specific machine and all the ways that you have things set up. Then, in terms of the policy perspective, she's a lawyer, but the W3C specifically doesn't take any policy positions.
But, you know, speaking on her own behalf, she has seen that there has been quite a lot of movement toward creating new, good privacy architectures that take into consideration the need to be able to delete user data, to manage different data flows, and to be able to hand over data if you need to. And so it's really forced a lot of these different companies, if they want to operate in Europe, to do this whole rearchitecting and rethinking of the practices that they used to have before. So it seems like that legislation has done quite a bit to shift a lot of the different dynamics of the technological architectures that are out there. But there's still a long way to go; GDPR isn't implemented in the United States, for example. So she sees that there's good potential to have both the regulation and the technology work hand in hand, and to have at least some guardrails coming down from the policy level. So even though there isn't a universal framework for privacy, it's not something where she thinks we should just throw up our hands. There are different ways to make incremental progress. She pointed to GDPR to see the different steps that are happening there, with data segregation and data controller agreements and managing data flows, and just understanding user data and user control. There's been quite a lot of movement there, and it has really catalyzed the design of a lot more systems that allow users to have more control over their data. But, you know, as we move forward, there are still a lot of questions around how we govern this open platform. How can we allow anybody to do anything with it, and balance that against the need to cultivate trust, since they're trying to make the web an open, safe place? So the question is how to manage all these risks and make the web a place where users will be safe and user agents can be safe, dealing with all these other aspects of misinformation, disinformation, abuse, and misdirection, and trying to find all sorts of other ways to build up foundations of trust on the open web, whether that's through web authentication or new payment systems. That's still a big open question: what are the new models going to be? With new web payments baked directly into the browser, maybe there'll be easier ways for you as an individual to support different aspects of content that you're seeing on the web. Coil was there at the View Source conference talking about some of the different partnerships that they have with Mozilla, trying to come up with new strategies for web payments and mechanisms to pay creators for the content that they're creating. So it's a fertile time in terms of this recalibration phase, but it's also about trying to lay down a lot of these ethical frameworks and a lot of the design as we move forward with what's happening on the 2D web, and also to look at all the specific threats that are coming from the immersive web, to see how you can start to build from a more solid foundation as this recalibration phase happens on the open web. And as we move into the immersive web with WebXR, we'll be adding in all sorts of other considerations. Like, when you're in an immersive environment and you want to have payments happen, how do you ensure that it's actually going to the people you want, without having other people spoofing information in there? So there are a lot of things like that that need to be figured out, and protocols that have been working for the 2D web will need similar types of translations.
And just in general, she's super inspired by what happens with communication online with end-to-end encryption, whether it's one-to-one or in group chats, or even open communication on the open web: having many different options and modalities for people to find the best way for them to communicate, and ensuring that they have a certain level of autonomy, sovereignty, and privacy in their communications. That allows you to find your different communities of interest and to take part in all sorts of different social, political, and economic discussions. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR Podcast. And if you enjoy the podcast and want to see more of this type of coverage, then please do consider becoming a member of the Patreon. Just $5 a month is a great amount to give, and I really do rely upon listeners like yourself to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.