#1179: Discussion of XRSI’s Privacy Framework 1.0 from 2020

On September 8, 2020, the XR Safety Initiative launched its XR Privacy Framework 1.0, a set of self-directed privacy guidelines that aims to empower “individuals and organizations with a common language and a practical tool that is flexible enough to address diverse privacy needs and is understood by technical and non-technical audiences.” It is “inspired by the NIST privacy framework’s approach,” which has the five functions of “Identify, Govern, Control, Communicate, and Protect,” while the XRSI Privacy Framework modifies them to “Assess, Inform, Manage, Prevent.” XRSI is in the process of developing a 2.0 version of its Privacy and Safety Framework that is aiming to be released by the end of 2023.

Their September 2020 paper surveys the fragmented nature of US privacy law, with insights from the EU, and adds some specific considerations for XR privacy, including what they call “Biometrically-Inferred Data,” which they define as a “collection of datasets that are the result of information inferred from behavioral, physical, and psychological biometric identification techniques and other nonverbal communication methods.” Most of the examples of Biometrically-Inferred Data listed are the same types of physical or physiological biometric identification techniques that are tied back to identity, but they also included an adapted graphic from Kröger et al.’s 2020 paper “What Does Your Gaze Reveal About You? On the Privacy Implications of Eye Tracking” to further elaborate on some of the behavioral or psychographic information that could be inferred from eye tracking. How biometric inferences from XR data will be treated by the law remains a big open question, and it’s something that I first started covering back in March 2017 in an interview titled “Biometric Data Streams & the Unknown Ethical Threshold of Predicting & Controlling Behavior.”

Brittan Heller’s February 2021 Vanderbilt Journal of Entertainment and Technology Law article coined the phrase “biometric psychography,” which she defines as “a new concept for a novel type of bodily-centered information that can reveal intimate details about users’ likes, dislikes, preferences, and interests.” Heller emphasizes that it’s these psychographic inferences that differentiate XR data from existing legal definitions of biometric data, which are often explicitly tied to identity. For example, Heller states, “Under Illinois state law, a ‘biometric identifier’ is a bodily imprint or attribute that can be used to uniquely distinguish an individual.” The lack of explicit personally-identifiable information in the types of biometric and physiological data that come from XR means that such data lives within an undefined legal grey zone that is largely unprotected by most existing privacy laws.

One of the limitations of self-regulatory guidelines like XRSI’s XR Privacy Framework is getting major industry players like Meta, Google, Valve, or Apple to adopt a framework like this. And even if they did, there’s still the open question of enforcement. Ultimately, in order to secure consumer privacy protections from the biggest players, we’ll need either a comprehensive federal privacy law in the United States or stronger privacy protections at the state level, though not everyone lives in California, which has some of the strongest consumer privacy protections.

But this paper is explicit in targeting individuals and organizations to provide a “baseline” of “solution-based controls that have principles like ‘privacy by design’ and ‘privacy by default’ baked in, driven by trust, transparency, accountability, and human-centric design.” So its utility lies in helping organizations understand the existing legal landscape and in pointing out some specific considerations of XR data and XR privacy. Since there has yet to be a comprehensive federal privacy law, some of the XR-specific concerns first covered in this framework may still be relevant as a lens for informing potential federal privacy legislation. Companies often only follow the bare minimum of what’s legally required, and since we’re in an interim space with XR privacy, this framework lays out some foundational principles for companies to voluntarily adopt. XRSI is also collaborating with Friends of Europe to “explore innovative policy solutions for possible transatlantic regulation of metaverse.”

This interview with five contributors to the XR Privacy Framework 1.0 was recorded as a livestream during the XR Safety Awareness Week in 2020, and features:

  • Suchi Pahi – Data Privacy and Cybersecurity Lawyer
  • Kavya Pearlman – Founder & CEO, XR Safety Initiative
  • Noble Ackerson – leader of Data Governance initiatives & Product Manager for Ventera Corporation
  • Jeremy Nelson – Director, Extended Reality Initiative (XRI) – Center for Academic Innovation, University of Michigan
  • David Clarke – Cybersecurity and data protection specialist and EU-GDPR Strategy Advisor for XRSI

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality