#1091: IEEE XR Ethics: The Erosion of Privacy & Anonymity

The ethical questions around privacy in virtual and augmented reality are some of the most pervasive, unanswered questions in XR, with the top two questions being: What new types of intimate biometric & physiological data are available through XR? And how is this new XR data being used? The intractable problems of XR privacy have come up in every single one of the 8 white papers that are a part of the IEEE’s Global Initiative on the Ethics of Extended Reality. Human-computer interaction researcher Mark McGill is excited about the new accessibility features and perceptual superpowers that may come with “all-day, every day augmented reality,” but he’s afraid that the lack of adequate consumer privacy protections and the half-baked consent models for bystander privacy may lead to social rejection and backlash against wearing devices that may not only be undermining the privacy of the owner of the AR headset, but also potentially violating the privacy boundaries of everyone in their immediate surrounding area.

McGill works as a Lecturer at the University of Glasgow in Scotland in the Glasgow Interactive Systems Group, and he was the lead author of the IEEE XR White Paper on “The Erosion of Anonymity & Privacy.” I was a contributor to the paper along with Michael Middleton, Monique J. Morrow, & Samira Khodaei, but it’s a huge topic that McGill did a great job of tackling through the lens of what’s new about XR sensing, extrapolating from there on the potential accessibility benefits as well as the perils to privacy.

The chapters are broken up into XR Sensing and Computed Data, Identity and Anonymity of Self, Augmented Intelligence and Mental Privacy, Identity and Privacy of Bystanders, Worldscraping, “Live Maps” and Distributed Surveillance, Augmented Perception and Personal Surveillance, & finally a look at the existing Rights and Protections.

There’s a pretty bold conclusion that “the current system of digital privacy protection is no longer tenable in an extended reality world.” Also see my interview with Human Rights Lawyer Brittan Heller, who argues that a new class of data called “biometric psychography” needs to be legally defined in order to capture the intimate types of information that can be extrapolated from XR devices.

Here’s a talk I gave last year after I attended the Non-Invasive Neural Interfaces: Ethical Considerations Conference, which gave a sneak peek at what’s to come with neurotech like brain-computer interfaces, neural interfaces, and the sensors coming to XR technology.

Here’s a taxonomy of the types of biometric and physiological data that can be captured by XR technologies as categorized across different qualities of presence that I first showed at that “State of Privacy in XR & Neuro-Tech: Conceptual Frames” talk presented at the VRARA Global Summit on June 2, 2021.

McGill passed along a graphic from an unpublished pre-print tentatively titled “Privacy-Enhancing Technology and Everyday Augmented Reality: Understanding Bystanders’ Varying Needs for Awareness and Consent” that shows how some of the same types of intimate information could be extrapolated with depth-sensing AR headsets. McGill emphasized that it’s not just about taking pictures of people with hidden cameras. It’s about capturing fully spatialized information that can then be segmented and processed on many different layers, revealing a lot of biometric psychographic information, much of which is similar to the information laid out in my taxonomy above as you go further and further down the path of composite processing of data from XR devices:

McGill shares a number of different recommendations in the White Paper, but many of them will also require buy-in and self-regulated behavior from the same big tech companies who are pushing forward with innovation in XR technologies while emphasizing the experiential benefits and downplaying the existential privacy risks. I suspect that we’ll ultimately need more robust privacy legislation that either expands GDPR to more fully account for the types of biometric psychographic data that come from XR, or perhaps the US will pass a new comprehensive Federal Privacy law, although all indications so far are that XR data are not being accounted for in any of the early draft legislation.

McGill and I do a very comprehensive 2+ hour breakdown of his paper, exploring all of the exciting possibilities of extended perception while also digging into the many terrifying open questions about how to rein in the many concerns and put some guardrails on XR data. I’ll include some more links and references down below if you’d like to dig more into the discussions on this topic that I’ve been helping to facilitate for the past six years. And like I said, every single IEEE XR Ethics White Paper mentions the challenges around XR privacy, so be sure to also take a listen to the other podcast discussions to see how this topic shows up across a variety of different contextual domains.





This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality
