The ethical questions around privacy in virtual and augmented reality are some of the most pervasive, unanswered questions in XR, with the top two being: What new types of intimate biometric & physiological data are available through XR? And how are these new XR data being used? The intractable problems of XR privacy have come up in every single one of the 8 white papers that are a part of the IEEE’s Global Initiative on the Ethics of Extended Reality. Human computer interaction researcher Mark McGill is excited about the new accessibility features and perceptual superpowers that may come with “all-day, every day augmented reality,” but he’s afraid that the lack of adequate consumer privacy protections and the half-baked consent models for bystander privacy may lead to social rejection and backlash for wearing devices that are not only undermining the privacy of the owner of the AR headset, but also potentially violating the privacy boundaries of everyone in their immediate surroundings.
McGill works as a Lecturer at the University of Glasgow in Scotland at the Glasgow Interactive Systems Group, and he was the lead author of the IEEE XR White Paper on “The Erosion of Anonymity & Privacy.” I was a contributor to the paper along with Michael Middleton, Monique J. Morrow, & Samira Khodaei, but it’s a huge topic that McGill did a great job of tackling through the lens of what’s new about XR sensing, and then extrapolating on the potential accessibility benefits as well as the perils to privacy.
The chapters are broken up into XR Sensing and Computed Data, Identity and Anonymity of Self, Augmented Intelligence and Mental Privacy, Identity and Privacy of Bystanders, Worldscraping, “Live Maps” and Distributed Surveillance, Augmented Perception and Personal Surveillance, & finally a look at the existing Rights and Protections.
There’s a pretty bold conclusion that “the current system of digital privacy protection is no longer tenable in an extended reality world.” Also see my interview with Human Rights Lawyer Brittan Heller who also argues that a new class of data called “biometric psychography” needs to be legally defined in order to explain the intimate types of information that can be extrapolated from XR devices.
Here’s a talk I gave last year after I attended the Non-Invasive Neural Interfaces: Ethical Considerations Conference, which gave a sneak peek as to what’s to come with neurotech like brain-computer interfaces, neural interfaces, and sensors in XR technology.
Here’s a taxonomy of the types of biometric and physiological data that can be captured by XR technologies as categorized across different qualities of presence that I first showed at that “State of Privacy in XR & Neuro-Tech: Conceptual Frames” talk presented at the VRARA Global Summit on June 2, 2021.
McGill passed along a graphic from an unpublished pre-print tentatively titled: “Privacy-Enhancing Technology and Everyday Augmented Reality: Understanding Bystanders’ Varying Needs for Awareness and Consent” that shows how some of the same types of intimate information could be extrapolated with depth-sensing AR headsets. McGill emphasized that it’s not just about taking pictures of people with hidden cameras. It’s about capturing fully spatialized information that can then be segmented and processed on many different layers, revealing a lot of biometric psychographic information, much of it similar to the information laid out in my taxonomy up above as you go further and further down the path of composite processing of data from XR devices:
McGill shares a number of different recommendations in the White Paper, but many of them will also require buy-in and self-regulation from the same big tech companies who are pushing forward with innovation in XR technologies while emphasizing the experiential benefits but downplaying the existential privacy risks. I suspect that we’ll ultimately need more robust privacy legislation that either expands GDPR to more fully account for the types of biometric psychographic data that come from XR, or perhaps the US will pass a new comprehensive Federal Privacy law, although all indications so far are that XR data are not being accounted for at all in the early draft legislation.
McGill and I do a very comprehensive 2+ hour breakdown of his paper, exploring all of the exciting possibilities of extended perception as well as the many terrifying open questions about how to rein in these concerns and put some guardrails on XR data. I’ll include some more links and references down below if you’d like to dig more into the discussions on this topic that I’ve been helping to facilitate for the past six years. And like I said, every single IEEE XR Ethics White paper mentions the challenges around XR privacy, so be sure to also take a listen to the other podcast discussions to see how this topic shows up across a variety of different contextual domains.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
TALKS & KEY CONVERSATIONS ABOUT PRIVACY OVER THE YEARS
- VIDEO: State of Privacy in XR & Neuro-Tech: Conceptual Frames (2021)
- SLIDES: State of XR Privacy
- VIDEO: Towards a Framework for XR Ethics – Kent Bye, Augmented World Expo (2021)
- VIDEO: XR Ethics Manifesto (2019)
- #951: Privacy Primer: A History of U.S. Consumer Privacy, U.S. Federal Privacy Debates, & XR Privacy Implications with Joseph Jerome
- #998: Primer on the Contextual Integrity Theory of Privacy with Philosopher Helen Nissenbaum
- #988: Defining “Biometric Psychography” to Fill Gaps in Privacy Law to Cover XR Data: Brittan Heller’s Human Rights Perspectives
- #994: Neuro-Rights Initiative: A Human Rights Approach to Preserving Mental Privacy with Rafael Yuste
- #493: Is Virtual Reality the Most Powerful Surveillance Technology or Last Bastion of Privacy?
- #516: Privacy in VR is Complicated & It’ll Take the Entire VR Community to Figure it Out
- #517: Biometric Data Streams & the Unknown Ethical Threshold of Predicting & Controlling Behavior
- #892 Sundance: ‘Persuasion Machines’ is Architectural Storytelling Against Surveillance Capitalism
PRIVACY CONVERSATIONS WITH FACEBOOK EMPLOYEES:
- #958: A Candid Conversation with Facebook’s AR/VR Privacy Policy Manager: New Potentials for Community Feedback
- #987: The Neuroscience of Neuromotor Interfaces + Privacy Implications with Facebook Reality Labs’ Thomas Reardon
- #641: Oculus’ Privacy Architects on their Open-Ended Privacy Policy & Biometric Data
- #520: Oculus’ VR Privacy Policy Serves the Needs of Facebook, Not Users
OTHER CONVERSATIONS ABOUT PRIVACY OVER THE YEARS
- #999: The EFF on XR Privacy & How AR/VR Needs a Human Rights Framework + Timeline of UN Resolutions on Privacy
- #996: OpenBCI’s Project Galea collaboration with Valve & Neuro-Privacy Implications of Physiological Data
- #997: Debating XR Privacy Tech Policy with Ellysse Dick of Information Technology & Innovation Foundation
- #991: Critiquing Facebook’s Responsible Innovation Principles & Project Aria through the lens of Anthropology & Tech Ethics
- #839 XR Ethics: The Challenges of Privacy Engineering with Diane Hosfelt
- #717: VR Privacy Summit Organizer Highlights & Next Steps
- #716: VR Privacy Summit: Medical Insights into VR Privacy + Health Benefits of Biometric Data
- #939: Ethics & Privacy in Mixed Reality Panel for IEEE VR Academics & Researchers
- #838 XR Ethics: SIGGRAPH Panel on Privacy-First Architectures & Ethical Design
🎉THREAD of an 8-part Voices of VR podcast series on XR Ethics covering the white papers produced by @IEEESA's Global Initiative on the Ethics of Extended Reality.
1st ep is on VR Harassment & Trolling with @theextendedmind & @ellecortese https://t.co/6gcbH8xHtz
Video Overview pic.twitter.com/0WArX9jGFL
— Kent Bye (Voices of VR) (@kentbye) June 6, 2022
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality