#1090: IEEE XR Ethics: Diversity, Inclusion, & Accessibility

Continuing my series on XR Ethics in collaboration with the IEEE Global Initiative on the Ethics of Extended Reality, this episode dives into the White Paper on Diversity, Inclusion, & Accessibility featuring Isabel Guenette Thornton (Ph.D. candidate in Sociology at the University of Cambridge) and Dylan Fox (coordination and engagement team leader at XR Access, and researcher at UC Berkeley working on augmented reality for obstacle avoidance).

Accessibility is a huge open problem in XR, and there's lots of work that still needs to be done; Fox says that there hasn't yet been an XR experience that's totally accessible. There have been incremental innovations, with experiences tackling one accessibility feature at a time, and this White Paper starts to map out and define the features of XR accessibility. There's also an XR Accessibility Project GitHub page, curated as a collaboration between XR Access and the XR Association, that contains lots of references and code snippets to help make XR experiences more accessible.
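To make the idea of reusable accessibility snippets concrete, here is a minimal sketch in TypeScript of one common building block: looking up which caption cues are active at a given playback time, so a renderer can anchor them near the speaker or pin them to the user's view. The names (`CaptionCue`, `activeCues`) and data shapes are illustrative assumptions, not taken from the actual XR Accessibility Project repository.

```typescript
// Hypothetical caption-cue helper for an XR captioning system.
// Cue times are in seconds; end is exclusive so adjacent cues don't overlap.
interface CaptionCue {
  start: number;
  end: number;
  text: string;
  speaker?: string; // optional, lets a renderer place the caption near its source
}

// Return all cues active at playback time t.
function activeCues(cues: CaptionCue[], t: number): CaptionCue[] {
  return cues.filter((c) => t >= c.start && t < c.end);
}

const cues: CaptionCue[] = [
  { start: 0, end: 2.5, text: "Welcome aboard.", speaker: "Guide" },
  { start: 2.5, end: 5, text: "Follow the glowing path." },
];

console.log(activeCues(cues, 1.0)[0].text); // "Welcome aboard."
console.log(activeCues(cues, 3.0)[0].text); // "Follow the glowing path."
```

The hard parts in practice are spatial (where to place captions in a 3D scene, how to handle off-screen speakers), but a time-indexed cue model like this is the typical starting point.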

There’s a free XR Access Symposium happening on June 9-10 that will be digging into a lot of these issues, and videos will be made available if you can’t make it live.

Thornton brings up a lot of the latest sociological scholarship on the challenges of data extraction and predatory inclusion, including a quote from Tressie McMillan Cottom’s 2016 paper on “Black Cyberfeminism” that points out how “digital divides may not go far enough to capture the various intersections of privilege, access, and power that operate online and offline simultaneously and which can also be mutually constitutive.” This helps to set a broader context for how the realms of possibility opening up for able-bodied XR users are expanding, while simultaneously widening the gaps for people who don’t have the ability or means to gain access to emerging XR technologies.


This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality

Rough Transcript

[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So continuing on in my series of looking at XR ethics in collaboration with the IEEE Global Initiative on the Ethics of Extended Reality, the second paper that I'm covering is on XR ethics and diversity, inclusion, and accessibility. So accessibility is such a huge topic in terms of making virtual and augmented reality experiences accessible for people across a wide range of different abilities, both from sensory impairments that they may have, but also cognitive impairments as well. The XR Access and XR Association have been collaborating a lot on this issue. There's actually a conference that's coming up at the end of this week on Thursday and Friday, June 9th and 10th: the XR Access Symposium. So it's free, and definitely, if you're interested in this topic, go check it out. And if you miss it, or if you're listening to this after it happens, there should be recordings made available so you can go check out some of the different discussions, because this is probably one of the areas that has the most technical work to be done to be able to meet the baseline threshold for making immersive experiences more and more accessible. So this conversation is with Dylan Fox, who is working with XR Access, but also Isabel Guenette Thornton, who is a sociologist. And so the first half of this conversation is actually going to be digging into a lot of the different sociological lenses on diversity, inclusion, and accessibility. So that'll help set a broader context. And then we'll dive into more of the technical details in the second half of this conversation, which kind of mirrors how the paper is structured. 
One other thing of note is that this conversation happened at the beginning of February, and in mid-April there was news about a lawsuit that was filed back in November of 2020 by Dylan Panara, who filed a civil rights lawsuit against HTC Corporation because Viveport Infinity did not include captions. He's deaf and could not experience a lot of the different experiences that were featured. And so he filed a civil rights lawsuit, citing violations of the Americans with Disabilities Act. There was a preliminary ruling on April 15, 2022, where HTC tried to get the entire lawsuit dismissed, arguing, hey, this is not our content, we're just a distributor, and there are other people making this content. Well, that motion to dismiss was denied, and so it's still an ongoing case. So at some point something could come out of this case in terms of setting a precedent for requiring all immersive experiences to meet some baseline of whatever accessibility requirements are spelled out in the Americans with Disabilities Act. In the course of this conversation, we start to map out how there's actually not a refined set of what those guidelines should be yet. There's pulling from many different disciplines and domains, and there's still quite a lot of work yet to be done in order to more proactively define what it would mean for an immersive experience to be completely accessible. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Dylan and Isabel happened on Tuesday, February 1st, 2022. So with that, let's go ahead and dive right in.

[00:03:15.092] Dylan Fox: My name is Dylan Fox. I'm currently the coordination and engagement team leader for XR Access, which is a nonprofit focused on virtual and augmented reality accessibility. I'm also a researcher at UC Berkeley, where I work on augmented reality for obstacle avoidance for people with low vision, and I have a background in UX design, living here in Oakland, California.

[00:03:37.759] Isabel Guenette Thornton: Hi, I'm Isabel Guenette Thornton. I'm currently a PhD candidate at the University of Cambridge in sociology, where I research producers of XR technologies. And I have a background in product management. So I used to live in California, where you are, Dylan, and I'm now elsewhere.

[00:04:00.996] Kent Bye: Okay, great. Yeah. And today we're going to be talking about this paper that you both were co-authoring as a part of the IEEE Global Initiative on the Ethics of Extended Reality, talking specifically about XR ethics, diversity, inclusion, and accessibility. And I'm wondering before we start to dig into this, if you could give a bit more context as to your background and your journey into XR relative to this topic.

[00:04:22.507] Dylan Fox: Sure. So I started out originally at Berkeley in mechanical engineering, because, you know, I was good at math and science and figured, Hey, that's a meal ticket. Right. Found that thermodynamics wasn't quite as much fun as it was cracked up to be. And so I kind of switched to cognitive engineering, that idea of looking at how people interact with technology and how we can leverage that information to make it better. And I got first into XR just because it was such an interesting design space in terms of how are we going to let people know what they can do in this space? How are we going to communicate that to them? And in the process of getting my master's at Berkeley, focusing on that, I was presenting something around kind of XR for engineering and CAD and found a whole number of interesting applications of XR for accessibility. And it, you know, really got me into the mindset of, Hey, you know, I'm a UX designer. I like to focus on user needs. Here's a whole bunch of user needs that have been completely ignored by a lot of different people. And if we ever want XR to really live up to its potential as a technology and to please the little futurist that lives in my head and that says technology can be good for society, we definitely need to make sure that we're thinking about how to make it accessible to everybody. So it's not just the privileged and the able-bodied and so on and so forth who have access to it. And so that kind of led me into this niche that I've found very satisfying to work in.

[00:05:57.871] Isabel Guenette Thornton: My first job out of undergrad was at a company called Nest in Palo Alto. And so we were working on a smart thermostat. It was at the beginning of the internet of things. So I was the product manager for the smart thermostat. That was really an amazing experience. And that actually led me back into academia. I felt like it was an interesting opportunity to ask How are the ways that we build products leading to social change, instigating social change, affecting social change? And so one of the first things I did at the University of Cambridge is I joined a project called the Whistle Project. And that is a group that builds technology products for human rights organizations. So that was hugely interesting and really helped shape my thinking on the ethics of technology production. And my dissertation that I'm working on now is looking at how technology producers, specifically those in XR, kind of world make, both literally as they're producing virtual worlds and also figuratively as they're imagining what the next extension of digital life is going to be. So my interest in XR ethics specifically is twofold. It's both from an extension of my work at The Whistle, thinking through technology for human rights and ethical technology in that sense, and also is a big part of my doctoral work.

[00:07:17.961] Kent Bye: Okay. Yeah. That's all great context, because I think that this issue of accessibility is one of those things where the industry was first just trying to get XR technologies to work at all for people who are able-bodied, and that set the minimum baseline. Now the challenge is to see how to expand that and try to have it work for everybody. And so part of the background that I thought was interesting in this introduction was that there were quite a lot of sociological insights that I haven't seen be part of the discussion so far. I think most of the discussion I've seen around accessibility was much more about how to technically make experiences more accessible, but you start off with some of the more sociological insights. Isabel, I'm wondering if you can maybe help set a little bit more of this broader context for how accessibility fits into this larger sociological relationship between people and technology and the economy and everything else tied together.

[00:08:11.344] Isabel Guenette Thornton: Sure. Thanks, Kent. So I'd love to start off with a quotation from an amazing public scholar and sociologist, Dr. Tressie McMillan Cottom. She's a sociologist with expertise in digital technology and racial capitalism. And she suggests that digital divides may not go far enough to capture the various intersections of privilege, access, and power that operate online and offline simultaneously and which can also be mutually constitutive. So I find this really fascinating because it reminds us that making sure that something works for a group of people is really just the beginning of that story. And we always want to be thinking about, I should say, I always want to be thinking about the ways in which digital technologies are either aligning with or departing from other structures of power that we see very entrenched in institutions today.

[00:09:06.602] Kent Bye: You know, some of these things that you're talking about here in the introduction, there are a number of different references about the ways in which these platforms start to extract labor from people in a way that's very exploitative. You're starting to bring in more of the economic dimensions of equity. Maybe you could just help set a framework for how you understand how those economic aspects are impacting people who are not fully abled, and how that relates to this larger discussion around accessibility and equity.

[00:09:39.309] Isabel Guenette Thornton: Sure. So I think in terms of the potential extractiveness of labor, that's certainly an issue. I would also say that XR technologies are built on increased surveillance, particularly biological surveillance. You used an interesting word, which is extraction. I think extraction is a huge topic when we think about ethical technology. And when we think about making technology that works well for marginalized groups, it's something that we spoke a lot about at the Whistle in terms of making technology tools for human rights organizations, because a lot of these human rights organizations were looking to gather data from marginalized groups on human rights abuses and other issues. And this word came up again and again, you know, let's not be too extractive, because usually, for example, when a human rights organization is getting information from respondents on human rights abuses, that's a human connection. And so there's a lot of sitting with, in solidarity, a human relationship that helps that feel like a reciprocal interaction. And so that's very much a concern in human rights. But I think in general, when we imagine the word extraction in the context of technology, it's really about data. It's about all the data that is getting tracked and consumed. And so when we have that extractive data, there are questions around privacy, there are questions around who owns the data, and also transparency. Transparency is a particular concern for marginalized groups because, going back to McMillan Cottom, decision-making can be veiled from democratic inquiry when there is a lack of transparency, particularly around how data is used, how data is collected. And with XR, we have this huge amount of biological data all of a sudden, potentially with eye tracking, head tracking, you know, tracking how people are moving in space. 
And so that's something I hope that we can have be part of the conversation really in this early stage at the beginning, how given that we are working in a technology that does extract an enormous amount of very personal data, how do we make that not exploitative? How is that transparent? How is that equitable? And again, that probably means really thinking through the experiences of different groups of marginalized groups and not having a one size fits all user who may reflect certain dominant characteristics.

[00:12:01.404] Kent Bye: Yeah, I wanted to also ask one other question around a sociologist that I've come across in previous contexts of looking at media theory: Pierre Bourdieu's field theory. You cite Bourdieu in here in terms of going into these experiences and having a world that has a field of possibilities. And so maybe you could expand upon what Bourdieu is talking about there and how you are contextualizing that aspect into this larger discussion of accessibility and equity.

[00:12:29.277] Isabel Guenette Thornton: Sure. So I think there's some really interesting work that's been done, not just in field theory, but also in terms of Lacan and others around possibility and ideology and language. And essentially, it's the idea that to some extent, As human beings, as social beings, our possibilities are already narrowed simply by the fact that we use language to communicate and we use language to think and that the structures of language narrow the possibilities of what we can imagine to some extent. And to my mind, XR takes this to potentially a whole nother level. Because the actual world and the possibilities for engagement, the possibilities for physical movement, and the possibilities for all sorts of things, basically the affordances of the XR product may set the conditions of possibility for users. And those in some ways are artificial conditions of possibilities, or they're chosen conditions of possibilities, chosen constraints that producers have created in their piece and their product. And I just think that's worth thinking about, that step one, when you're going into a new virtual world, the conditions of possibility have already been set. And that's not entirely new. You know, there's an analogy in literature and analogy in other media and other entertainment, but because XR has the biological component, And a lot of producers I've interviewed for my research talk about this kind of lizard brain phenomena where things feel differently real, they feel biologically real. I think it's worth thinking about what it means to have constraints on the field of possibilities in these new worlds. And particularly, I'm concerned about that field of possibility that's artificially created by XR producers aligning with existing biases that we see today socially or otherwise restricting opportunity for marginalized groups.

[00:14:29.193] Dylan Fox: That was good. I would love to add to that just when it comes to, yeah, like you said, the conditions of possibility. We have to be really careful as designers in XR when we're thinking about what possibilities exist in these worlds around us. Because if you're a 2D designer, you could have a screen that just has two buttons on it, yes and no, right? And there's not a lot of freedom for creativity or expression there. It's going to be very straightforward. People have two options. But when you put somebody in a whole virtual environment, when they have objects and characters all around them, different people are going to have very different ways of interacting with that environment and of coping with the things in it. And it's worth looking at not only what is your first instinct as the creator of it, but also talk to people with different lived experiences, right? Talk to people of color, talk to people who are neurodiverse and see what their instincts are, because they might be very different from yours. And if your interface only accepts a very narrow range of possible interactions, you're really going to be making a lot of your users uncomfortable and unsatisfied with the experience.

[00:15:43.650] Isabel Guenette Thornton: I just think that, you know, economically, the way that a lot of XR startups work is that a lot of the user testing is done internally because they're in stealth mode, or it's done with a small group of early adopters or with developers who are also first customers. And so I wonder about that kind of testing apparatus and how much it's really supporting what you're suggesting, Dylan, which is understanding the lived experience of diverse groups.

[00:16:09.827] Kent Bye: Yeah, and as I read through this paper, as I start to make sense of all these different spheres, I draw upon Lawrence Lessig. He has a book called Code and Other Laws of Cyberspace, where he talks about the pathetic dot theory, where he has these collective spheres: the culture, aspects of the laws, the economy, and then the technological architectures. And I feel like the user is at the intersection of all of those. In some ways, I think of the largest context as the culture, which I think covers a lot of these sociological dimensions that are being brought into this discussion. Then there are the laws that set different policies for the relationship between the government and accessibility, like the Disabilities Act and a wide range of specific laws that are being referenced throughout the course of this paper. And then there's the economic dimension, which is happening within the context of the laws. But here there are also, in terms of equity, economic dimensions when it comes to data being extracted for economic purposes. And that gets to the last level, which is the technological architectures. There's the XR Association, which is a collection of groups within the XR industry, in collaboration with XR Access, able to produce code, in the sense of specific recommendations for what you can do technically to address this issue. So I feel like this paper does a really great job of covering those cultural aspects, covering some of the legal recommendations, some of the economic dimensions and what the companies can do, and then also, at the level of technological architecture, what can be done. 
So you Dylan, I'd be curious to hear any reflections on that, just in terms of how the XR access and how you as an academic, which is, I guess, more in the cultural level is in collaboration with these different groups to kind of help orient some of these different things, working at not only the lower level tech architectures, but also the policy levels and some of these other economic and legal dimensions.

[00:17:55.345] Dylan Fox: Absolutely. I think what helps to make a lot of this clear is that when I think about XR Access, I think of it as a boundary organization. It is a catalyst. It is a group that is intended to get all of the very many different stakeholders that need to be put in the same room together so that we can work out this really thorny challenge of XR accessibility. Because you're going to need participation from, obviously, designers and developers, from the platform owners, but also from governments. You need to have people with disabilities in the room to share their lived experience. You need to be working with people in healthcare and education who are going to often be using these XR solutions. You really need to have a lot of people talking to one another to solve this problem. And so, when we had this opportunity to work on the IEEE guide, we knew that we had to cover those multiple aspects. Because it's one thing to tell somebody how to make their application accessible. But if they don't have the resources to do it, if they don't have, you know, oftentimes the requirement to do it, because for better or for worse, accessibility is often driven by legal requirements. We need to make sure that we're handling all of those aspects and communicating with all of the different stakeholders in order to achieve any type of meaningful action on this front. So that's definitely been a major goal of ours: to get these people connected and get everybody into the room to solve these problems.

[00:19:33.183] Isabel Guenette Thornton: And that's so needed. And the work that your organization is doing is so needed. And I think we can see so many technologies that have been around a lot longer, including social media, still hasn't solved these problems. And it's exciting to see a group really committing to doing that work so early and so thoroughly.

[00:19:55.179] Kent Bye: Isabel, I wanted to ask you, because I think you may actually be the first sociologist I've had a chance to interview. There's Bourdieu, and other sociologists I think have similar ideas of how to differentiate between these different spheres of culture and the laws and the economy and tech architecture. Are there other theorists that you look to that help to break down some of these different spheres?

[00:20:18.610] Isabel Guenette Thornton: Hmm. That's such a good question. I think these days I find I'm looking to more contemporary scholars, I think particularly because they're really paying good attention to race and other topics that have been a little bit ignored in prior canon. There's a great effort called Decolonize the Curriculum that's going on at Cambridge and then a lot of other institutions that's seeking to rectify that. So, you know, there's folks like Benjamin Rua, Race After Technology, and a lot of these good sources that are creating a new theory, I think. I think it's really helpful to pull, as you were suggesting before, Kent, from some of this understanding of labor exploitation. Humphreys and Grayson and others have done good work on the concept of predatory inclusion, which I think is sort of the dark side of thinking about how we make technologies accessible very broadly, which is making sure that that inclusion is not exploitative, that we're not democratizing technologies, which is a word I use because it's a buzzword that often comes up, I find, in the industry when people are talking about why XR or other new technology will be helpful to the world. It's like, oh, well, we're democratizing people's ability to X, Y, Z. It's like, well, to what extent is that really about giving people helpful tools? And to what extent is that predatory inclusion where people are actually being invited into products in order to be extracted for their data or other reasons? And I think that kind of theory has been really helpful with my own work.

[00:21:56.858] Kent Bye: Okay, that's helpful. Yeah, there's some references in that paper to dig into more of those contemporary scholars. So definitely appreciate that being introduced into this larger conversation. The next section is talking about more of those legal aspects of how, Dylan, you said a lot of times the implementation of these technologies are forced by different legal regulations that are brought about, especially in the context of government doing anything with these types of immersive technologies, but also with education context. And so I'm just curious if you could start to dig into some of the major, either policy recommendations or organizations that are going to be involved. You have everything from the department of justice and the GSA, and there's a whole wide range of different governmental entities that may be a part of this specific discussion around accessibility and drawing upon the work of Elise Dick, who has been writing a lot about the policies recommendations. I think she did a really great job of the first cut of mapping out some of those dynamics. I know she was involved in this paper as well, but maybe you could start to paint out a larger picture for what people should know in terms of the different organizations, the laws, and what are the key areas that maybe need to be updated in terms of the existing stuff that's on the books that needs to be reevaluated and updated in the context of being in alignment with what's happening with XR.

[00:23:17.382] Dylan Fox: Yeah, absolutely. Well, first, I mean, I have to say thank you again to Ellysse Dick and the other folks at the Information Technology and Innovation Foundation, because everything we put into the report on policy came almost straight out of there. They did an amazing job on it. Yes. But I think the first thing that I would clarify is that oftentimes people think of regulation as being this annoying extra cost, a pain in the butt, or of the government as just being this kind of antiquated bureaucracy that takes forever to get anything done. And some of that is true, but I think it's important to think of the government as offering both carrot and stick when it comes to accessibility. One of the co-founders of XR Access is PEAT, the Partnership on Employment & Accessible Technology. And that is right out of the Department of Labor, because there are people in the government that see XR as a potential solution to some of the problems facing our nation right now, certainly in terms of being able to encourage the participation of people with disabilities in the workforce; VR and AR are huge tools for doing that. And so we have had this great support from the Department of Labor on exploring these. I think that's something that we shouldn't forget is part of this conversation, right? In the same way that DARPA, for example, made a ton of the technologies that we rely on now for everything in modern life, let's not forget the capability of the government to be a public funder of research for the common good. So I definitely wanted to say that. But then, yes, when it does come to updating things like the ADA and Section 508, there are important things to be considered. We're still at that phase of VR right now where it's optional in a lot of ways. I don't think anybody says you must use this VR experience in the same way that maybe you must access this website in order to use a service. 
But I think that day is getting closer, especially as we find things like training programs that are designed to be in VR. Those are very hard to recreate in 2D, especially with any level of efficacy. And so we're going to start running into places where, if you want to get the equivalent experience to other people using these technologies, it needs to be accessible. And I think we're going to see some lawsuits pave the way, as they usually do, but the government can do things besides just wait for the courts to work it out and then demand everybody retroactively upgrade their systems, which is the least efficient way you could possibly go about getting accessibility implemented. So policymakers can look, first, again, to bring people with lived experience with disabilities into conversations. They can look to make sure that they're getting input from those folks and from the many people in academia who study this exact subject when they are making decisions. They can think about clarifying proactively what it means to have accessible XR. Because right now the government kind of just points to the WCAG guidelines that the W3C has put out and says, if it meets these, it's probably accessible. We don't really have that yet for XR. The W3C is working on it. Several other groups are working on it. But I would love to see the government say, here is what it means to be accessible in XR. If you meet these guidelines, if you meet these standards, you don't have to worry about getting sued, or you don't have to worry about not qualifying for federal funding. Because so many of these use cases for XR revolve around healthcare and education, places that do receive a lot of federal funding, the government has a lot of leverage in these types of situations to say, hey, if your application isn't accessible, you're not going to receive our help paying for it. 
And if it exerts that pressure, we could have XR become accessible much faster than if we're waiting for private companies to put out inaccessible applications, then for somebody to get enough capital together to sue them, then for the courts to go through it all, and then, at the far end of that, come out with what it means to be accessible. So I would really love to see some proactive government action here. And in terms of exactly what that should be, I definitely recommend people read the report and read Ellysse's suggestions in it.

[00:28:16.360] Kent Bye: Yeah, just to clarify, because I understand that there's the Americans with Disabilities Act and Section 508, and I know that, say, buildings have certain accessibility requirements, like having ramps for people who are in wheelchairs. Some of that has already been applied to, say, websites and effective communication through the different guidelines that have come from 2D websites. And I'm wondering whether some of the rules and regulations that have applied to physical buildings are going to now be applicable to, say, what's happening in these virtual spaces, whereas before it was only maybe a 2D representation of communication, which may even have fallen under different laws. So I don't know if you have a good sense of some of these existing regulations for regulating physical space, and whether they're somehow now newly applicable to the digital realm with XR?

[00:29:05.890] Dylan Fox: It's a very good question because virtual spaces are this strange mishmash of digital and physical, right? I think it certainly is not as easy as saying you must be wheelchair accessible by having ramps of such and such descriptions, because being wheelchair accessible in VR means something completely different than being wheelchair accessible in real life, right? In real life, it's about what's the grade of the ramp, you know, does it have handles, et cetera, et cetera. But in VR, you might be able to just teleport up to the next level. So you don't necessarily need a ramp, but if somebody can't hold one button on the controller while they move their wheelchair with their other hand, that's not going to be accessible for them. And so it goes much more towards looking at those interactions and those control schemes as compared to the physical qualities of the space. But that said, there are parts that are applicable, right? You know, once you start getting into building codes and things, that's pretty well outside my familiarity, but there are aspects of physical accessibility that we can think about when we do XR, particularly when it comes to augmented reality. There are many ways we can enhance the safety and the accessibility of spaces using these technologies, right? Maybe we can retrofit some buildings using AR to make them more accessible. I've seen some great examples of ways to map buildings such that, you know, if you are blind, let's say you're blind and, you know, the fire alarm goes off, it may not be obvious at all where you're supposed to go, where the exit is, what the raft is to get there. if you're lucky, maybe there was some raised map or Braille somewhere along the way that told you how to do it. But consider the use of AR in that situation. There's some great work by Shiri Azenkot, for example, at Cornell Tech. 
Also, I should mention she's one of the co-founders of the XR Access Initiative and of its current home as a research initiative at Cornell Tech. You know, you'd have this experience of a blind person putting on the AR headset and having it guide them using audio, using haptics, using visual enhancements, guiding them to safety in a way that you could not do in just an unenhanced building. And so I think, to flip your question on its head, it's not only how do we apply these systems and these safety regulations to VR and AR, but how can VR and AR improve existing building safety and existing accessibility? That's a really rich area to look into.

[00:31:53.086] Kent Bye: Yeah, and as I look through the policy section, we've really been talking a lot about ensuring that the technology is accessible, but there are other aspects here in this policy section that are more around discrimination and protecting constitutional rights, and so working with the EEOC as well as the Department of Justice. And so, Isabel or Dylan, I'm just curious if you have any other thoughts on some of those other dimensions, because I know, Isabel, you've said you've been working with different human rights organizations, and there may be other aspects reflected here that go above and beyond just the accessibility dimension of this discussion.

[00:32:27.456] Isabel Guenette Thornton: Sure. One thing that comes to mind is that XR makes possible personalization at scale, kind of in an unprecedented way, and that there are both opportunities there and also some potential problems. So opportunities: how wonderful if we can really address the needs of particular communities by having personalized experiences. That's kind of the hook. That's what we get excited about. But also problems in that, again, is that entrenching biases? Is that going to mean that one person's personalized experience is qualitatively different in a way that is inherently unequal? Then there are existing issues with social media and misinformation, and the kind of pop culture idea of echo chambers is one that's been in the news a lot this year. And, you know, does XR entrench that phenomenon even more because it's potentially more totalizing? I think these are really good questions. One question that I have about it, going back to this idea of extraction, is whether there are ways to do work in XR that provide more solidarity and more relational support for people in need than existing information-gathering modalities, like a survey, for example, which is very cold. One of the projects that we worked on at the Whistle at the University of Cambridge, and then also in Nigeria with Global Rights Nigeria, asked how do we develop survey systems that aren't extractive, where we can ask people to report on really horrific things like sexual assault or experiences with racism, and do it in a way that provides more of a feeling of solidarity and is not re-traumatizing to the people who are speaking about their experiences. And that's very difficult to do with a survey, because a survey is quite cold. So we had some ways to work around it. But at the end of the day, when someone's giving testimony, it's really helpful if that's more of a personal experience.
So I'm curious if that's a potential use of XR one day, or if that continues to feel extractive. I'm curious to see the ways in which the immediacy of XR can be applied to these communication cases that are currently very delicate. At the same time, you know, I am also curious about the potential ethics of having, for example, an AI character who would take someone's human rights testimony and perhaps say certain things that might be in solidarity, but also wouldn't be a real person. And what does that mean? And it just gets very speculative at that point. But getting back to this question about personalization at scale: there are both opportunities for meeting the needs of very specific communities in very specific moments, like, for example, taking human rights testimony, and also a very intense potential for bias. And I really hope that producers working on these products are thinking this through.

[00:35:31.266] Kent Bye: Yeah. I know that as I've mapped out these different contexts of ethical dimensions, I've certainly seen that there are sometimes trade-offs between technologies that could increase accessibility and access for certain populations but also be harmful for other populations. I'm happy to see that Meta as a company has taken a step back from facial recognition, as an example. Facial recognition capabilities could certainly be helpful for people who can't recognize faces, but then there are all the other degrees to which that type of technology makes incursions into our privacy and bystander privacy, and the risks of having widespread facial recognition on augmented reality glasses that are basically all over the place, in both public and private spaces. So yeah, I've definitely noticed that there can be these various trade-offs, where a company will stand on the ground of saying, well, this is to increase accessibility and diversity, while there may be other secondary, or maybe even primary, uses for why they're doing it that are more about their business model. So that's definitely a thing that I've seen come across.

[00:36:38.864] Isabel Guenette Thornton: Yeah, facial recognition is a great example. I have a wonderful colleague at Amnesty International, Matt Mahmoudi, who does a lot of work on facial recognition and its harms. A lot of his work looks at how refugees use technology, or how refugees have technology used upon them, and the problematics there. And certainly, I think one of the assumptions that a lot of the XR industry can take for granted is that it's not dangerous for people to be tracked, because that's true of most of its early users. But there are big populations, including refugees and other groups, for whom it is dangerous to be tracked. It's dangerous if that information is made available to various government agencies. And this is something that's very well known in the human rights community: tracking is not benign, and the assumption should not be that data should always be collected. And I think sometimes, especially with nascent industries that have early use cases like gaming, there isn't necessarily the same amount of digging into these basic assumptions. You know, tracking enables the technology, and maybe that's fine, but maybe that's a little tricky once we start to move beyond gaming contexts.

[00:37:54.585] Dylan Fox: Yeah, and I would add to that, this conversation of accessibility and privacy can be very tricky, because you're right. There are certain things where, yeah, theoretically, if we recorded every 3D space and every person, that could lead to some great assistive technologies, but at what cost, right? I would note especially that when it comes to people with disabilities, privacy is a serious issue. I know a lot of folks who use screen readers who say, I do not want the website I'm on to know whether or not I'm using a screen reader. Because, A, that's a huge privacy issue: people finding out about a disability can have all kinds of consequences in terms of healthcare, in terms of social interaction, job finding, all kinds of things. And also because if they know I'm using a screen reader, they might try to shift me to some screen-reader-friendly version of the website that hasn't been updated in five years. So we don't want that. We don't want this idea of, yes, just give us all of your data so that we can help you. I think that's looked on with extreme skepticism. Ideally, you shouldn't need all of that private data in order to provide these services, as much as possible. I know some of these things rely on having lots of data, but we need to find ways to get both: to protect people's privacy and people's security while offering the best assistive technology we can. I know Microsoft does a pretty good job of this on the HoloLens. I believe, and somebody correct me on this if I'm wrong, they've architected it so that a lot of the data that is recorded on the HoloLens does not go back immediately to some server online. It stays on the device. And I think we can architect it such that, again, let's look at the case of somebody who's blind, somebody for whom recognizing people is really helpful.
Well, rather than having some massive online database where everyone they see gets added to it, maybe you teach it to recognize the most important people in your life, and that stays on the device. So it can recognize your friends and family when they enter a room, but it's not going to trigger anything in some FBI database saying, oh, so-and-so who you just passed on the street is on a watch list. We need to find ways to achieve accessibility without sacrificing privacy.
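The on-device design Fox describes can be sketched concretely: keep a small local gallery of face embeddings for the people the user has chosen, and match against only that list, with no network round trip. This is a hypothetical illustration; the embedding format, the threshold, and the function names are assumptions, not any headset's real API.

```javascript
// Hypothetical sketch: a local "people I know" gallery that never leaves
// the device. Embeddings would come from an on-device face model; here
// they are plain arrays for illustration.

function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the best-matching known person, or null if nobody clears the
// threshold. No network call: the gallery lives entirely in device storage.
function recognizeLocally(gallery, embedding, threshold = 0.8) {
  let best = null, bestScore = -1;
  for (const person of gallery) {
    const score = cosineSimilarity(person.embedding, embedding);
    if (score > bestScore) {
      bestScore = score;
      best = person;
    }
  }
  return bestScore >= threshold ? best.name : null;
}
```

An unfamiliar face simply returns null rather than being logged or uploaded, which is the privacy property Fox is after.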

[00:40:13.831] Isabel Guenette Thornton: I absolutely agree with that. And I would also add that sometimes particularly pro-social or safety-oriented use cases are used to justify these huge incursions into people's personal data. I think that's something to be particularly wary of. Dylan, you mentioned offhand, as an example, the watch list. This is something that's being played out in a lot of cities right now with police access to facial recognition, and it's being framed as a safety issue, which is unfortunately obscuring a lot of the problems with these sorts of technologies. So certainly I think it's helpful to keep an eye on these sorts of accessibility or ethical or pro-social framings, to make sure that there's not a bigger story there in terms of costs and trade-offs. And I think, Dylan, you and your group are really thoughtful about balancing those sorts of things and taking a wide and forward-thinking view, which is tricky to do, and it's worth it to do it early if we can.

[00:41:13.991] Dylan Fox: Yeah. And I think you'll find, oftentimes, if you ask the people this stuff is supposed to be protecting whether they want to give up their rights in order to be protected, you might find different answers from what somebody who's trying to justify these programs might tell you.

[00:41:29.614] Isabel Guenette Thornton: Absolutely. Something that the Whistle also really found when we were working with both technologists and human rights groups is that the technologists' assumptions of what human rights groups might need are not necessarily aligned with what they actually need, and the two are working from potentially very different worldviews. And so that's certainly also something to understand.

[00:41:50.059] Kent Bye: Yeah, well, the end result was that Meta, at least in the short term, has taken a step back from facial recognition. So I'm glad to see that they're maybe nipping some of these potentially really problematic technologies in the bud, but they haven't eliminated the possibility that they might use it in the future. So that's still there in the background as a potential. But I wanted to wrap up the policy section and get into some of the more technical recommendations. I just wanted to shout out that in this section, and also in my conversation with Elise, she was advocating a number of different efforts and initiatives from within the government to support funding for different organizations, whether that's the GSA, the Department of Health and Human Services, the Department of Education, lots of different branches of the government that may themselves be implementing and deploying aspects of these XR technologies. And there need to be investments from the government side to do additional research on, and implementation of, making these XR technologies more accessible. So there's a section here that recounts all those different groups and those specific recommendations. As XR technologies are adopted in the context of the government, there's a lot that can happen in this domain. So I don't know if there's anything else you want to mention on that before we move on to the next section about some of the technical recommendations.

[00:43:04.777] Dylan Fox: I would just reiterate what's in the report: there is an opportunity for government when it comes to investing in these technologies, both in terms of utilizing VR training, which any number of government agencies could benefit from, and in terms of helping to fund efforts in education and healthcare and other places. I would really encourage anybody who is a policymaker listening to this, anybody who is in government, to consider how you can use the public mandate to help make sure that this is successful, right? The government, for all that it can be unwieldy, is an institution that is fundamentally responsible for people and for citizens. And that makes it very different from any of the platforms that would otherwise control a lot of this XR technology. And so if you are in government, you have an opportunity to help make sure that XR is for the people, that it is accessible, and that it is not simply a tool for surveillance capitalism.

[00:44:13.443] Kent Bye: Yeah, so as it gets deployed across these different groups, there are a number of different organizations mentioned there, the GSA and other groups. But maybe let's move on to section 2.2, the recommendations to XR creators, because what I found interesting is that there's already been quite a lot of work happening in these different contexts, whether it's XR Access and their resources page or the W3C XR Accessibility User Requirements. I know that when WebXR was being developed as an open standard, one of the blockers raised by the larger W3C was these accessibility requirements, and that was one of the things they had to address in order to really move forward with WebXR as an open standard. So there's the W3C XR Accessibility User Requirements for the web context of trying to make XR technology overall more accessible. Oculus has their own set of guidelines, the VRCs, for making experiences accessible, a number of which you have to pass in order to get into the store. And then there are broader game industry accessibility guidelines. So I'd love to hear any comments or reflections on this landscape of existing resources that are out there and how this document points to them, how some of them may cover a subsection of things but not be comprehensive because other folks are paying attention to other things, and how these all fit together in a mosaic, and how you make sense of that.

[00:45:42.502] Dylan Fox: Yeah, it's a challenge for sure. I've done some accessibility audits for VR, and there wasn't one handy-dandy set of guidelines I could use. I had to go through the WCAG and see which of those were applicable, and then the game accessibility guidelines. Some of the newer ones that have come out, like the Oculus guidelines, are very good. One thing I would put right at the top of this conversation is something that's come out since the report was published, which is that XR Access and the XR Association have been working together on a GitHub page aimed at XR designers and developers that tries to bring all of this, yeah, as you said, this patchwork mosaic, into one place. You can find that at xra.org slash GitHub, capital G, capital H. We've listed out a number of resources that can help designers and developers try to make sense of all this, some of the guidelines that have made the rounds: Oculus, Magic Leap, the game accessibility guidelines. And then we've started to lay out, platform by platform, if you're on ARCore or ARKit, if you're on Unity, how do you actually go about making each of these platforms accessible. Now, this is open source. A lot of the time people ask us, how do you do this? And we say, well, nobody has managed to do it yet. But if you are somebody who is looking to create accessible XR, or if you're somebody who has created something accessible and you want to share how you did it with the world, this is open source, so we really encourage you to come check it out and help us create something that helps make sense of all this.
But with that out of the way, I will say, if you are familiar with traditional 2D accessibility, and I hope many of you are, I'm sure it might be something that's considered niche, but I really will say: if you are a designer or a developer, learning accessibility will make you better at your job. A lot of the 2D accessibility patterns do apply here, right? When we're talking about things like visual accessibility, for example: having text that is large and high contrast, avoiding colors that will be hard to tell apart for people who are colorblind, or making sure you have some way other than color of telling things apart. That's the same challenge in 2D as it is in VR and AR. There might be new aspects of it in VR because, for example, you're not necessarily sure what background somebody is going to be seeing your text against, since as they move, that background might shift behind it. But you can take those 2D principles and apply many of them in the same ways. So looking at a lot of classical guides to accessibility, that's going to be big, right? If you are an XR developer, just reading up on regular old 2D accessibility is going to give you a lot of the answers you need for VR accessibility. For the ones that it doesn't, that's where all of this new research is coming in.
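The contrast principle Fox mentions carries over directly from 2D accessibility: WCAG 2.x defines a measurable contrast ratio from relative luminance, and the same check can be run on colors chosen for VR text. A sketch of that standard formula:

```javascript
// WCAG 2.x relative luminance for an sRGB color with channels in 0-255.
function relativeLuminance([r, g, b]) {
  const channel = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(fg, bg) {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [hi, lo] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG AA asks for at least 4.5:1 for normal-size text.
function passesAA(fg, bg) {
  return contrastRatio(fg, bg) >= 4.5;
}
```

In VR the background behind text shifts as the user moves, which is one argument for the scrims and overlays Fox brings up later: a fixed backdrop makes the ratio checkable at design time.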

[00:48:52.548] Kent Bye: Yeah, and I know that there have been a number of 2D games, like The Last of Us Part II, that I've heard have a lot of great accessibility options. I know that Vacation Simulator and Job Simulator have been great in terms of making sure that there are different height accessibility options for folks, and they also have subtitles available. So it seems like this GitHub repository that's been done in collaboration with the XR Association and XR Access is going to be the start of a lot of these code snippets and best practices for how to technically implement some of these different things. I'm imagining that there could be a VR experience that implements all of this, that you can just go and experience, just like The Last of Us is an example of a game that includes a wide range of accessibility options. Are there any other experiences that you tend to point people to, to say this is implementing a critical mass of best practices, or does that experience not exist yet for people to go and have a direct embodied experience of some of these different accessibility options?

[00:49:53.629] Dylan Fox: Honestly, I don't think it exists yet. I would love to be proven wrong, but I think there are a lot of applications that do one thing and push the boundaries in one way. So, for example, AltspaceVR recently started to do captions in VR, which is awesome. They still have a long way to go before they're really a hundred percent usable, but just the fact that we're starting to see things like that is excellent. Yeah, I'll say Vacation Simulator is a great example of pushing the envelope on that. Their subtitles, again, I wouldn't call 100% solved, but they're a huge step forward in terms of bridging those gaps. Owlchemy has actually been a really great contributor in several ways to that and to this problem of 3D content descriptions. You know, screen readers in VR is something where we're really still exploring the basics. That is one of the projects that XR Access is actually currently doing. We have Thomas Logan of Equal Entry, who's a major contributor in this area, and we've been working on an instance of Mozilla Hubs that, for example, could be accessible for screen reader users. Right now, screen readers usually just look at VR and say "image," which isn't terribly helpful. And there are questions both on the technological side, of how do you get screen readers across various platforms to work, how do you even get a screen reader on an Oculus, for example, and fundamental design questions of what is it helpful to have a screen reader announce in VR. That's something that we are working with blind people who use screen readers to understand. Shout-out to Jesse Anderson, Illegally Sighted on YouTube, who is an absolutely spectacular content creator when it comes to the blind experience of using VR. So yeah, to return to your question, there are a lot of apps that will do one thing mostly right, but I have yet to see any VR experience that I would rate as 100% accessible.
That's what we're working towards.
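Caption systems like the AltspaceVR and Vacation Simulator ones Fox mentions typically "lazy follow" the viewer: the text stays put inside a dead zone and only eases toward the gaze once the head turns far enough, so the reader gets neither whiplash nor text glued rigidly to their face. A minimal yaw-only sketch; the dead zone and easing values are illustrative assumptions:

```javascript
// Smallest signed difference between two angles in degrees, in (-180, 180].
function angleDelta(targetDeg, currentDeg) {
  let d = (targetDeg - currentDeg) % 360;
  if (d > 180) d -= 360;
  if (d <= -180) d += 360;
  return d;
}

// Move the caption's yaw toward the head's yaw only once the head has
// turned past a dead zone, and then only a fraction per frame (easing),
// so the caption drifts smoothly instead of snapping.
function updateCaptionYaw(captionYaw, headYaw, deadZoneDeg = 25, easing = 0.1) {
  const delta = angleDelta(headYaw, captionYaw);
  if (Math.abs(delta) <= deadZoneDeg) return captionYaw; // stay put
  return captionYaw + delta * easing;
}
```

Called once per frame, this keeps captions readable during small head movements while still catching up when the viewer turns to face something else.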

[00:52:01.713] Kent Bye: Yeah, I imagine that some of this technology in the context of WebXR may have more of a clear pipeline, since screen readers are already working with Chrome browsers using the DOM. The way screen readers work is that there's a document object model that has a lot of the data being rendered out by web browsers. But once you get into something like Unity or Unreal Engine, or even WebGL with WebXR, that model becomes occluded, rendered out into something that is basically a 2D canvas with pixels being drawn to a screen, and it's like a black box at that point. There are open standards like USD, and glTF even has the ability to contain objects within an object, so there's the scene graph of a 3D scene. The questions are making that available, and then, once it is available, how it's made useful in interpreting all those things. Oculus has a Chrome browser, but then there's how to attach a peripheral. There might be Bluetooth, and there's OpenXR as an open standard that is supposedly allowing different peripherals to seamlessly integrate into these devices. But there hasn't even been a USB port available on some of these Quests, or always a Bluetooth option. So I don't know if there's, even in the short term, a tech stack that you feel is optimal in order to be accessible. Do people have to have PC VR, which is a whole other issue of not being available to folks who have standalone VR or AR headsets, if the only way to give access is through a computer or through something like the WebXR stack? So yeah, I don't know if there are larger discussions happening there to start to close the gap on even just getting this assistive technology working with, say, the Oculus Quest 2.
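The scene graph idea Kent raises can be made concrete: glTF files are JSON with a named node hierarchy, so a tool could walk that hierarchy and collect names or metadata as candidate descriptions for a screen reader. This is a hypothetical sketch; the `extras.description` convention is an illustrative assumption, not an existing assistive-technology standard:

```javascript
// Walk a parsed glTF JSON document and collect a flat list of
// (path, description) pairs a screen reader could announce.
function collectDescriptions(gltf) {
  const nodes = gltf.nodes || [];
  const results = [];
  const visit = (index, path) => {
    const node = nodes[index];
    const name = node.name || `node-${index}`;
    const fullPath = path ? `${path}/${name}` : name;
    // Prefer an explicit description stashed in extras; fall back to the name.
    const description = (node.extras && node.extras.description) || name;
    results.push({ path: fullPath, description });
    for (const child of node.children || []) visit(child, fullPath);
  };
  const scene = gltf.scenes[gltf.scene || 0];
  for (const root of scene.nodes) visit(root, "");
  return results;
}
```

Once an engine renders the scene to pixels this structure is gone, which is exactly Kent's black-box point; keeping a parallel semantic tree like this is one way to preserve it.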

[00:53:42.778] Dylan Fox: Yeah, no, it's a huge challenge, and one that quickly outstrips any development capability I have, but I know this is something that a lot of people are working on. Yahoo has been doing some work on accessible AR, though I can't speak to it too much, and yeah, it turns out there's a lot of thinking to do about how we make what's going on in a 3D scene accessible in augmented reality. And if you are at OpenXR and you're hearing this, call me. I think one of the challenges we've had is that we really want to get participation and conversation from OpenXR, from Unity, from other groups that are working on these deeper parts of the tech stack. When you're focused on that deep technology, oftentimes accessibility is pretty far from your thoughts, right? Accessibility is frequently seen as more of a UX and UI level, front-end type of concern. But for something like this, we really need to think about how these stacks can surface core information to screen readers and other assistive technologies. That needs buy-in from developers at every level, from groups like OpenXR, from folks making WebXR, from Oculus, or Meta rather. All of these platforms contribute to how we're accessing and running these applications, and there needs to be coordination around how we're exposing that to screen readers and assistive technologies. I'll give a shout-out as well to Mike Shebanek at Meta, who is working on the W3C ARIA interoperability challenge: right now, screen readers work differently on different browsers, even for 2D things. It's something where, even historically, we haven't had a lot of coordination. And it's something that we are asking all of these platform creators to make a priority and to reach out and talk to others about. And that's something that we'd love to help coordinate if we can.

[00:55:52.912] Kent Bye: But I was wondering if we can maybe just hit some of the other major points in the last 10 minutes before we start to wrap up: some of these different aspects of inclusive design, like removing background details and undo and redo. There are a number of high-level suggestions here, and I'm wondering if we could quickly go through those, as well as the sections for communities of people who are deaf or blind or have cognitive disabilities, just to wrap up in terms of what folks should be aware of when it comes to the technical implementation of some of these different accessibility guidelines.

[00:56:31.735] Dylan Fox: Right. So when it comes to the guidelines themselves, first there are some general best practices that are going to be helpful for everyone. I'll just run through these quickly, and you can tell me if anything merits further exploration. Starting with removing or reducing background details and audio: VR has so much going on, and people who are neurodiverse especially may have a hard time filtering things out, so try to have some form of the experience that's a little pared back to the essentials. Number two, undo and redo. These go back to core principles of usability, right? Make sure that people have a way of undoing mistakes and undoing actions that is easy, not fumbling around in a menu or something like that. Then things like reducing speed and setting up action sequences. Remember that for a lot of people, XR is still so new that your application may be the very first one they've used. They're not familiar with how to interact with it. They might not even be familiar with holding a controller, let alone a touch controller. So making sure that you have ways to slow things down, that you're not rushing people or failing them if they take too long to understand where they are in the space and how they can interact with it, is really helpful. Bypass functions: make sure that you give people a way to get past things that they may not be able or willing to do. Don't let your entire application be gated by one inaccessible experience. Save progress: let people save early, save often, save automatically. Don't make them redo things that they've already done. Moving to visual accessibility, we have the idea of altering the size of objects, elements, and text, making sure that people have the ability to change text to make it bigger, higher contrast, and legible.
This is something that's really hard right now, because a lot of text in VR is just 3D blocks that have been exported without any metadata on them. So make sure that your text is dynamic, that it can change, that it can adjust itself to suit your players' preferences. Then being able to have things like contrast or edge enhancements. I'd actually encourage everyone to look up the Seeing VR project by Yuhang Zhao for some great examples of visual accessibility. Make sure that you have audio augmentation and text-to-speech. Don't assume that sight is something everyone has; some people need alternative means of accessing your content. So making sure that everything has a description, that everything has an audio equivalent of the visual aspect, will really help to push accessibility. I already mentioned color filters and symbols: for people who are colorblind, don't communicate information solely through color, or let them add in filters or adjust the colors that you use to something that they can perceive. Scrims and overlays: make sure all of your text has a background that helps improve the contrast, so you don't have white text on some dappled background that makes half of it invisible. Moving to deaf and hard of hearing, captioning is a huge one. Please include captions in your application, and please do it in a way that doesn't give your user whiplash from having to turn their head back and forth to see those captions. This is something that the W3C immersive captions group has been working very hard on, so you can definitely look to some of their reports, which hopefully will have come out by the time you hear this recording. But if not, there's a lot of good guidance on VR captions. Adding in things like audio icons, in order to have a visual indicator of where sounds are coming from, could be really helpful. You'll find a lot of accessibility boils down to multimodal interactions, making sure that each item, each direction, has a visual element.
It has an audio element, maybe even a haptic or tactile element, because everyone is unique when it comes to their senses. The more different ways you give people to understand what's in the world and interact with it, the more likely it is that your application will be accessible. Sign language in VR could have a whole talk of its own, but try to consider sign language users and sign language interpreters when you design your application. Then things like mono audio: if people have hearing loss, they may only have hearing in one ear, so make sure they can put everything through one channel. Yes, you lose some of that spatial sound, but it's better than people only hearing half the sounds you intended anyway. Moving to mobility disabilities, there are a ton of settings and menu options that can help. Consider people who have trouble with small motions, people who have trouble with large motions, people who have trouble standing up or sitting down, people who might have to play reclining, people who might have a co-pilot helping them to do interactions. Shout-out to WalkinVR Driver, which is an absolutely amazing Steam plugin focused on helping people with mobility disabilities. There's also the idea of dynamic foveated rendering and eye tracking. Eye tracking can be very helpful for people who don't have use of their hands, so consider whether the platform you're on allows you to use eye tracking to select things, to enter text, and to make use of those kinds of capabilities. The same goes for controller-free hand tracking. Not everyone may be able to hold onto a controller for an entire game session, so if you can track their hands alone and interpret that as input, you'll have a big advantage.
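The mono audio option Fox mentions is cheap to implement in most audio pipelines; a framework-agnostic sketch over raw interleaved stereo samples:

```javascript
// Downmix interleaved stereo samples [L0, R0, L1, R1, ...] to mono by
// averaging each left/right pair, then duplicate the result to both
// output channels so a listener with hearing in one ear misses nothing.
function stereoToMono(interleaved) {
  const out = new Float32Array(interleaved.length);
  for (let i = 0; i < interleaved.length; i += 2) {
    const mono = (interleaved[i] + interleaved[i + 1]) / 2;
    out[i] = mono;
    out[i + 1] = mono;
  }
  return out;
}
```

As Fox notes, this trades away spatial cues, which is why it should be an opt-in setting rather than the default.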
And then lastly, when it comes to cognitive disabilities: designing for cognitive disabilities is in many ways just good UX design, the same way you would design for people who have never used XR before or who may have distractions going on in their environment. Provide things like on-demand functions to receive assistance; in-app prompts such as reminders or button and controller legends; training opportunities, letting people interact and experiment with the interface in a low-pressure environment; the ability to review objectives at any time so they don't get confused or lost; and ways for users to hide distracting or non-critical interface components. Things like autoplaying video, or any type of motion, can be really distracting for some people, so making that optional, making it toggleable, can be really helpful. Yeah, there are so many ways you can potentially make your app accessible. You don't have to use all of them, because that is a tall task. But the more you do include, the greater the audience that will be able to use your application, and the better it will be for everybody who uses it. So I really encourage everyone to check out the report, check out the XR Association Developers Guide, check out the XR Accessibility Project on GitHub, and think about how people with disabilities would be using your application.
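The undo and redo recommendation from earlier in this list is classically implemented with a pair of command stacks, where every user action records how to reverse itself. A minimal engine-agnostic sketch:

```javascript
// Minimal undo/redo history: every user action is recorded as a command
// with redo/undo functions, so one button press can reverse a mistake
// without any fumbling through menus.
class History {
  constructor() {
    this.undoStack = [];
    this.redoStack = [];
  }
  perform(command) {
    command.redo();
    this.undoStack.push(command);
    this.redoStack.length = 0; // a new action invalidates the redo history
  }
  undo() {
    const command = this.undoStack.pop();
    if (!command) return false;
    command.undo();
    this.redoStack.push(command);
    return true;
  }
  redo() {
    const command = this.redoStack.pop();
    if (!command) return false;
    command.redo();
    this.undoStack.push(command);
    return true;
  }
}
```

In an XR app the commands would capture things like an object's old and new position, so an accidental grab-and-drop is one trigger pull away from being reversed.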

[01:03:53.668] Kent Bye: Yeah. The ones that I just wanted to call out as well, ones that I personally use all the time, are whether I'm standing or sitting: if I want to stand, I can, and if I want to sit, the experience allows me to sit. That's in terms of mobility, but there's also the locomotion options, in terms of how you're locomoting through a space. There's a variety of different types of locomotion. Some of the locomotion makes folks sick, so there are teleportation options and stuff. And so having just a wide range of ways that you can even locomote within the context of virtual space. And yeah, just as you read through all that, it's sad that there's not an experience where you can just go see all of these things, and rather than hear you explain it, actually go experience it with the code snippets. So I feel like that, as a community, is maybe a design goal: to create open source projects where you can start to slot these things in, with different code snippets from different languages, so that maybe there's a Unity package at some point, or at least the open web or Unreal, you know, a lot of these major platforms that are out there. Just make it so that there's an experience where you can even experience what it's like for some of these different options. Because I feel like, the thing that I mentioned earlier is that accessibility, even though you may be specifically targeting folks who are not fully abled in some of these different capacities, in the end folks may realize that even just having dark mode or light mode on their phones is probably a good example of the ways that it can be more accessible, but also a better experience for everybody. So as you read through all these things, I feel like as they start to get implemented, then it's going to very likely increase the overall user experience for everybody who's using XR.
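The variety of locomotion options mentioned here (smooth movement that can cause motion sickness, teleportation as a comfort alternative, and so on) could be modeled as interchangeable modes the player selects. A hypothetical sketch, with mode names invented for illustration:

```python
from enum import Enum, auto

class Locomotion(Enum):
    SMOOTH = auto()     # continuous movement; can trigger motion sickness
    TELEPORT = auto()   # point-and-blink; a widely used comfort option
    SNAP_TURN = auto()  # rotate in fixed increments instead of smoothly
    SEATED = auto()     # everything reachable without standing up

def available_modes(prefers_comfort):
    """Offer every mode, but list teleportation first for comfort users."""
    modes = list(Locomotion)
    if prefers_comfort:
        # False sorts before True, so TELEPORT floats to the front
        modes.sort(key=lambda m: m is not Locomotion.TELEPORT)
    return modes

comfort_first = available_modes(prefers_comfort=True)
```

The design point is the one Kent makes: never hard-code a single locomotion scheme; expose the full set and let the user choose.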

[01:05:29.280] Dylan Fox: Absolutely. And I think a long-term goal for us is really to put together that repository, that code base, those examples, because we don't want everyone to reinvent the wheel when it comes to accessibility, right? The easier it is to implement accessibility, the more people will do it. You know, there's a very big difference between installing an extra Unity plugin and coding something up from scratch that is really not supported by the platform. And so the more we can get the platforms on board to make some of these things part of their platforms, the more we can have people contribute examples of accessible XR, and ideally code snippets, to the XR Accessibility Project I mentioned before. And the more we can set up things like the Locomotion Vault (a fantastic example, though it's only clips; it doesn't have any code or anything), the more we can show people that space of possible interactions and think about ones that are accessible that may not be the first thing that comes to mind because you haven't had to deal with those problems. That way we will be able to help everyone learn from one another and make this accessibility a reality instead of a hypothetical.

[01:06:43.638] Kent Bye: Nice. Well, just to start to wrap up here, I'm curious what each of you think the ultimate potential of immersive technologies and accessible technologies might be and what they might be able to enable.

[01:06:55.845] Dylan Fox: Isabel, I've been talking way too long. You take a turn.

[01:07:00.042] Isabel Guenette Thornton: I don't know that I can answer that question. I think one of the tenets of doing this kind of sociological work that's so different from journalism is that we're trying not to make any predictions. In some ways, I envy the freedom of journalism to actually speculate about what things will be. I would say that my hope for technology in general is that it can provide opportunities for people to live full, whole lives, and that XR, in its ability to support this world-making, allows people to create visions of the world that are more free and more equal than what we sometimes see entrenched in institutions, particularly worlds that feed human creativity and human problem-solving and provide a space for compassion and learning. That would be my hope.

[01:07:53.542] Dylan Fox: I hope that XR will be able to even the playing field for a lot of people and give folks with disabilities the chance to have equivalent experiences to people that are able-bodied. I think with things like augmented reality, the ability to have a conversation that you're a part of transcribed for you in real time, for example, or in VR, to be able to participate in experiences whose real-life equivalents would be impossible to access because of a disability. I see it as having the potential to be a huge equalizer and a huge chance to improve equity for anybody that has been excluded because of physical challenges.

[01:08:38.669] Kent Bye: Nice. Is there anything else that's left unsaid you'd like to say to the broader immersive community?

[01:08:44.273] Isabel Guenette Thornton: Yeah, something that I wanted to mention: Dylan, you said something really important, I think, which is that accessibility is often seen as a UX/UI-level issue and not as occurring at the deep tech level. And I think that's really key, and it speaks to some of the weakness of how tech policy operates today. Because tech policy often comes in once those foundational product decisions or industry decisions have already been made. And there are a lot of those decisions that have a huge impact on accessibility and on equity. For example, we've talked about local storage. How much local storage do hardware devices have? And what does that mean in terms of how much information needs to be on the cloud and how much information is shared? That's huge for privacy. Also, you know, local storage means that devices may or may not have the ability to run additional programs for accessibility. And even just the fit and comfort of hardware, and the assumptions it's making about what bodies are using what hardware, all of these are really, really key and important. And they get missed because, I think, by the time that policy usually starts to get going, those sorts of decisions have already been made. And so I just wanted to think about that a little bit, and what it means that we might need to think even further back to those foundational decisions to really make equitable tech.

[01:10:05.730] Dylan Fox: And I would add as well that there are so many parts to this problem. There are so many people that have experienced challenges because of it, or that are working on solutions to overcome it, or that have a key role in solving it and just don't even know it yet, that it's really vital that people talk to others outside of their immediate sphere. If you're a developer, or if you are a policymaker, or if you are an academic that is studying this problem, or if you are an advocate for people with disabilities, or if you are any of those things and also have a disability yourself, please reach out, please talk to one another. We are really trying to provide a space for those conversations to happen. So if you want to come to xraccess.org or the Slack at xraccess.slack.com and be part of those conversations, we need your help to solve this. Nobody can do it alone. And the more of us that work together on this, the better our solutions will be.

[01:11:13.790] Kent Bye: Awesome. Well, the paper is titled Extended Reality Ethics and Diversity, Inclusion and Accessibility. It's been a part of the IEEE Global Initiative on Ethics of Extended Reality. It's a series of different reports across different contexts. And Dylan, Isabel, thanks so much for joining us today and to be able to unpack it a little bit more.

[01:11:30.021] Dylan Fox: Thank you. Absolutely. Thanks Kent.

[01:11:32.413] Kent Bye: So that was Dylan Fox. He's the coordination and engagement team leader for XR Access, which is a nonprofit focused on XR accessibility, and he's also a researcher at UC Berkeley on augmented reality for obstacle avoidance for people with low vision. Also Isabel Guenette Thornton, who's a PhD candidate at the University of Cambridge in sociology, researching producers of XR technologies. She also has a background in product management and worked on the Nest smart thermostat. So I have a number of different takeaways about this interview. First of all, this is still an emerging area in terms of the work that needs to be done. Like I said, if you are listening to this before June 9-10, there is a whole XR Access Symposium that's happening completely virtually. You can go check it out and attend and see some of the latest discussions about some of these conversations that we're having here; I'm sure that there's going to be a number of different updates. This is something being worked on by the XR Association, which is an industry trade group with all these different companies within the XR industry, as well as XR Access, which is a collaboration with PEAT, a U.S. government organization whose name stands for Partnership on Employment and Accessible Technology. So like Dylan was saying, there's a lot of potential for how XR could actually increase access, but there's also a lot of ways in which it could be increasing the digital divide even more, because people who have impairments can't fully experience all the different dimensions of these technologies. And that's a lot of what Isabel was bringing up in the first part of this conversation from the scholar Tressie McMillan Cottom, who was talking about how digital divides may not go far enough to be able to capture all the various intersections of privilege, access, and power that operate online and offline.
In this case, with XR technologies, there's a field of possibilities that are made available. But if you aren't fully able-bodied, then with the human-computer interactions that are based upon these new embodied interactions, you may actually be closed off from some of these new possibilities, which in some sense is furthering the digital divide that already exists for people who are not able to fully experience these experiences because they're not fully accessible. I think it's pretty prescient what Dylan was saying. Usually what happens is that there are laws on the books that try to enforce a baseline of accessibility, and lawsuits have to be filed; once one of those lawsuits is won, then you go back and retrofit accessibility on top of all these different experiences. This has been an emerging area in terms of getting all these things defined, and like we discussed in this conversation, there's still a lot of work that needs to be done. One of the things that Dylan was pointing out was the work of Ellysse Dick, who did a lot of the policy angles when she was at ITIF, which is a nonprofit research organization; she's since gone to work on policy at Meta. A lot of the work she was pointing to was looking at how all these different organizations within the government, if they're going to be using immersive technologies, can start to mandate a minimum baseline of requirements for what it would mean to have an accessible XR experience. So there were lots of different suggestions there at the end that are fully explicated within the context of the paper, which goes into a lot more detail. And there's also a GitHub repository that you can look at.
And I'll include a lot of different links for the shout outs that Dylan made throughout the course of this conversation, so you can dig into a lot more of the other resources that he mentioned here. If you are an XR designer, like he said, then designing accessible experiences within XR is going to make you a better designer, because you have to solve all these really difficult and challenging problems. So there's no lack of problems to be solved. And there's a mosaic of different accessibility guidelines that are out there, but like Dylan said, there's not a master list of everything that would need to be included to be a 100% fully accessible XR experience. So there's a pulling in from many different resources that are out there, and a pointing to individual experiences and applications that are starting to implement various different things. But nothing has really tied it all together from an experience perspective, or been able to create these GitHub repositories and code snippets and establish what all the different dimensions of accessibility are going to look like. So definitely check out the white paper to get a lot more information; lots of different links are going to be included in the write-up as well. And yeah, if you have a chance to drop by the XR Access Symposium, go check it out live in real time, or if you miss it, go check out all the recordings and get involved. There are a number of different shout outs that Dylan made in terms of OpenXR and other organizations that can help move this topic forward. As a podcaster myself, there's a lot of ways in which my own website is not fully accessible, just in terms of transcripts, and, you know, I'm in the process of needing to redesign and relaunch my website to make it easier for everybody to find the information that's been published there. But this is an ongoing process for myself.
And so if there are other people that are interested in helping to get transcripts done, or helping with other aspects of a redesign for the website, there's a lot of work to be done there. So definitely reach out to me on Twitter or by email at kent@kentbye.com if you'd like to help out with some of those different efforts. So yeah, anyway, that's all that I have for today. And I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and I do rely upon donations from people like yourself in order to continue bringing you this coverage. So you can become a member and donate today at patreon.com/voicesofvr. Thanks for listening.
