Ellysse Dick is a policy analyst who has written 10 technology policy publications about VR & AR over the last year for the Information Technology & Innovation Foundation (ITIF). The ITIF is a non-partisan, non-profit tech policy think tank that says its mission is to “advance innovation,” believes “disruptive innovation almost always leads to economic and social progress,” has a “considered faith in markets and businesses of all sizes,” and believes in “deftly tailoring laws and regulations to achieve their intended purposes in a rapidly evolving economy.” In other words, it leans towards libertarian ideals of limited government in order to avoid reactionary technology policy that might stifle technological innovation.

While the ITIF is an independent organization, its tech policy positions strongly align with the types of arguments I’d expect to hear from Facebook itself. The ITIF lists Facebook as a financial supporter, and Facebook has listed the ITIF as an organization that’s part of Facebook’s Political Engagement. But Facebook also says, “we do not always agree with every policy or position that individual organizations or their leadership take. Therefore, our membership, work with organizations, or event support should not be viewed as an endorsement of any particular organization or policy.” And Dick says that she maintains editorial independence over the type of tech policy research she’s doing within VR and AR. All that said, there’s likely a lot of alignment between the ITIF’s published tech policy positions and the implicit and often undeclared policy positions of Facebook.

Ellysse Dick has written about XR privacy issues in these three publications:

One really interesting insight Dick had in her December 4th piece on Augmented Reality and bystander privacy is that there are already a lot of social norms or legal precedents when it comes to the different types of data collection. Here’s the taxonomy of data collection that she lays out:

  • Continuous data collection (non-stop & persistent recording)
  • Bystander data collection (relational dynamics of recording other people)
  • Portable data collection (the mobile & portable ease of recording anywhere)
  • Inconspicuous data collection (notification & consent norms around capturing video or spatial context)
  • Rich data collection (the geographic context & situational awareness)
  • Aggregate data collection (combining information from third-party sources)
  • Public data exposure (associating public data to individuals within a real-time context)

Dick says that the combination of the real-time, portable, aggregate, and persistent nature of data recording may create a new context requiring either new social norms or new laws.

I wanted to talk with Dick about her take on XR privacy, why she sees the need for a US Federal Privacy Law, some of the concerns around government surveillance and the Third-Party Doctrine, and how aspects of biometrically-inferred data should be a key part of the broader discussion about a comprehensive approach to privacy. She calls this data “computed data,” while Brittan Heller refers to it as biometric psychographic data.

Dick is not as concerned about the near-term risks of making inferences from physiological or biometric data from XR, and cautions against a “privacy panic” that catalyzes reactionary technology policy leading to technologies being banned. I guess I’m on the other side, having a reasonable amount of privacy panic, considering that technology policy analyst Adam Kovacevich has estimated the odds of a Federal Privacy Law passing at 0-10% for the more controversial sticking points, or around 60% if the Democrats compromise on the private right of action clause.

Dick says that the ITIF follows the innovation principle, which is to not overregulate in advance for harms that may or may not happen. Creating laws too early has the potential to stifle innovation, to not have the intended consequence, or to quickly go out of date. Dick recommends soft laws, self-regulation, and trade organizations as the first step until policy gaps can be more clearly identified. The end result is that the most likely and default position is to take no pre-emptive action regarding these privacy concerns around XR, which will likely result in us trying to rein things in once they’ve gone too far.

Dick seems to have a lot of faith that companies will not go too far with tracking our data and serving ads in ways that could lead to significant behavioral modification. For me, the more pragmatic view is that companies like Facebook will continue to aggregate as much data as possible in trying to track our attention and behaviors, creating an asymmetry of power when it comes to delivering targeted advertising.

Overall, the ITIF generally takes a pretty conservative approach to new technology policy, suggesting that we either wait and see or rely upon self-regulation and consensual approaches. Dick and I had a spirited debate on the topic of XR privacy, and in the end we agree on the need for a new U.S. Federal Privacy Law. I think we’d also agree that we need the right amount of urgency to make it a public policy priority without triggering a reactionary panic that leads to technology policy banning certain immersive technologies.

And I believe that it’s still up for debate how much privacy panic we should collectively have on this issue, especially considering that there is no way to verify the appropriate flows of information given the broad mandate that Terms of Service & Privacy Policy adhesion contracts give to Facebook for how it can use the data it captures.

In the next episode, I’ll be diving into how philosopher Helen Nissenbaum defines privacy as appropriate information flows within a given context in her Contextual Integrity theory of privacy. She also argues that the notice-and-consent model of privacy is broken, and her contextual integrity approach may provide some more viable and robust solutions for ensuring users have more transparency into how their data are being used.


This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality


Voices of VR Podcast © 2022