Joel Ward is an Emerging Technology Manager at Booz Allen Hamilton who has been working in web accessibility since around the time Section 508 of the Rehabilitation Act was enacted on August 7, 1998, and the W3C’s Web Content Accessibility Guidelines came out (WCAG 1.0 was published on May 5, 1999). Ward spoke on an XR Access Symposium panel discussion about Empowering the Workforce through Accessible XR, and also brought a pair of XREAL Air glasses with live captioning to show other attendees the power of AR as an assistive technology.
I had a chance to speak with Ward about the work that still has to be done in order to specify XR accessibility experimentation within government contracts. Right now XR accessibility is in a bit of a Catch-22: if something is not specified within a contract, then it most likely won’t get worked on. However, XR accessibility features and functionality are still too nascent to be clearly specified and scoped out within these contracts.
As I covered in the unpacking of public policy and XR accessibility write-up of my interview with the XR Association’s Liz Hyman, the emerging nature of XR technologies means that they’re not clearly specified within legislation like Section 508 or the Americans with Disabilities Act to be enforced. We talk about what needs to be done at the federal government contracting level so that his group at Booz Allen Hamilton can have the freedom and resources to develop more XR accessibility features, feeding into the type of Darwinian experimentation that Neil Trevett says is a crucial phase of the standards development process.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Rough Transcript
[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com/voicesofvr. So this is episode number 10 out of 15 of my series looking at XR accessibility. Today, I'll be talking to Joel Ward of Booz Allen Hamilton. He's an emerging technology manager there at Booz Allen Hamilton, and he's looking at how to build and implement XR experiences for the federal government. So in this conversation, we start to talk about some of the nuances of what it means to do this type of contracting with the government. At this point, some of the different accessibility requirements are not being specified within these contracts. And so there's a need to create a contract structure that allows this type of open exploration, where they're given permission to develop and innovate some of these different tools, even if they're not being requested right now. Even though accessibility might be part of the spirit of the law of Section 508 and the ADA, because XR is an emerging technology, it's a little bit vague how to enforce some of these different accessibility requirements for XR, especially because even at the research level, we don't yet know what the ultimate best practices and guidelines for the best use case scenarios might be. So Joel has lots of different insights from working with the federal government and on the role that the government could play in being a leader in helping to fund and create different contexts for some of these XR applications to have more fully-fledged XR accessibility features. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Joel happened on Thursday, June 15th, 2023 at the XR Access Symposium in New York City, New York. So with that, let's go ahead and dive right in.
[00:01:56.650] Joel Ward: So my name is Joel Ward. I am an Emerging Technology Manager at Booz Allen Hamilton. I've been working with XR for upwards of six, seven years now. Before that I did web development, and I did accessibility in the web back in the day, when the web was new. Since I've gotten involved in XR, just like in the web, accessibility was not at the forefront when I started working with it. And obviously, we're here at XR Access trying to make sure that is addressed before it explodes like the web did. And so I work for Booz Allen, and we're a consulting company. We primarily work for the federal government and implement and build XR for our clients. But I also work internally in a research and development group. And what we are doing to address accessibility for emerging tech is, say, participate in things like XR Access, but also do investment and start to build some of these things, kind of like they do in academia, but do it on our side, right? To show off what's possible, and obviously show our clients what they should be doing or could be doing. And also then try some of the things that have not been done before, right? So a lot of our clients ask for something very specific. The way the government works with contracting traditionally is that they are very specific in what they're looking for, and if they don't ask for it, then you generally don't do it. But for something like emerging technology, you don't necessarily know what to ask for. So that's why we need to show what is possible. But also, even change the contract side, where it's more open for investigation. And there are some newer contract types that allow for that. So we can actually figure that out and find some of the answers we don't have yet, and come up with novel solutions like it's done in academia, so we can actually apply it to XR.
[00:03:47.007] Kent Bye: OK. And yeah, I'd love to hear a bit more context as to your background and your journey into XR.
[00:03:52.733] Joel Ward: So I was supposed to be an architect. Went to school for architecture because I built a Lego city when I was a kid and I thought that would translate into architecture. It's obviously different when you're working for an architecture firm versus building a Lego city. So then I went into the web, and the web was new. Back then (that's dating me a little bit) the web was new. So I got into building websites, and back then it was a lot simpler than it is now. So when I graduated from college, I went to work for a company that built websites. And the cool thing is, that's when the whole Section 508 of the Rehabilitation Act and the web accessibility guidelines came out, around that same time. And so I got involved in that, learned how to build websites, got into content management systems, obviously learned accessibility, and did that for a long time, a very long time, until we opened an innovation center at Booz Allen back in 2016, and I was lucky enough to get pulled in to help lead the technology of the space. And one of the technologies we showed off was virtual reality and augmented reality, because this is 2016. It's when the Vive came out. It's when the Oculus Rift came out. It's when the HoloLens 1 came out, all at the same time, right when we opened up. And that was one of the things we were showing. There was a bunch of other stuff, AI, security. I don't even remember the other stuff, because the VR and the AR was my favorite thing to show off. We'd have people come through and demonstrate it. VR, AR, people don't necessarily get it until they try it. That's one of the things that is super important, for someone to try it before they discount it. And seeing how people took to it when they got into it, I pivoted: that's what I wanted to focus on. And so since then I've moved now into a team. The team I'm on is actually a spatial computing team.
We actually changed our name from XR Team to Spatial Computing Team before Apple started using that term, for the research. And then we actually have people working on client projects building XR solutions. And it's been a small portion of what we do at the company. Booz Allen, we have 30,000 people. Most people are not doing XR work, but the XR and the spatial part is growing. And, you know, we're making sure we're investing in it so we can help it grow, and help it grow in the right way, right? And obviously accessibility needs to be part of that. And that's what we're trying to make sure is not forgotten: addressing it early on to make sure it's done right.
[00:06:26.023] Kent Bye: So yeah, we just finished up the day one of XR Access Symposium 2023, and lots of different presentations and talks and discussions. And you've had a chance to connect to a lot of the different folks here in the community. Love to hear some of your initial takeaways of stuff that you feel like is really striking and applicable to the type of work that you're doing at Booz Allen Hamilton.
[00:06:46.030] Joel Ward: I mean, everything, right? Honestly, from the first presentation today about the guidance for blind and low vision people in XR to the user research talk and all the breakouts, like I was at the avatar breakout and the policy breakout. I mean, everything we're talking about today are things that we could and should be doing. This is a small conference compared to many, but I think actually one of my people I'm with, they commented this is one of the best conferences they've been to because it's actually interactive. Like everybody's interacting with everybody and bouncing ideas off and it's kind of hard to get in any other context. My dream is that we can continue to do this, not just online, like we have a Slack for this, but can we start using things like XR or other spaces to be able to continue this kind of conversation more regularly, not just once a year. To answer your original question, overall I think this is like, just the first day has been great, like we're condensing all this into a day and a half, and we're getting a lot done in that short period of time.
[00:07:52.147] Kent Bye: Well, what are the specifics that I guess you're working on in terms of accessibility? Because I know that we saw some presentations from Owlchemy Labs with what they're doing with Cosmonious High, with adding lots of different accessibility features for folks who are low vision or blind. And so it's a consumer video game that is trying to build in some of these forward-looking accessibility features. And so I'd love to hear how, at Booz Allen Hamilton, you're similarly trying to chip away at making this incremental progress of trying to either implement some of these different accessibility features or take some of the best practices of what you're seeing in academia, and just understand the overall landscape and create a roadmap for what the next viable step is for what you're doing day to day.
[00:08:32.881] Joel Ward: Yes. I mean, really, the way it should work is all of those kinds of features should be built into any app, any VR, AR app you build, right? So, what is missing in the platforms is it's not a default feature, right? Like, for example, Unity doesn't necessarily have this stuff built in. That's why Owlchemy, you know, they built these features, and, you know, it's not just platform-based, you obviously have to build it into the interface and the interaction. What we like to do, or what we should be doing, is having sort of a, you know, a set of here's some features you should be implementing in any app, right, from the start. So not retrofitting, right, which is often what we do. That's what we did in the web back in the day, where we had websites and then we realized they were broken and we had to go back and fix them. You should be doing it at the beginning. I think we have a lot of good examples of the things that do work. There's still a gap in solving for as much as possible. So I think it's a mix of taking what has been done, like with Owlchemy, like what XRA and XR Access have done with some of their guidance on development, starting with that, but then also continuing to look at, like, where are the gaps? What can we help build and then give back to the community? And I think that's another thing to comment on: all the stuff that we're doing in this realm should be made available. It shouldn't be proprietary, right? It should be part of the platforms. And I'd like to think, for example, Apple is hopefully doing that with what they're coming out with, where it's just part of, when you're building an application, these things are there, much like in a website, and you just need to make sure you fill them in. But the base functionality is there, and then you can build on top of it versus having to build it all from scratch.
So, I think to answer your question, is taking the parts that are there, making sure we use them at the beginning of every project, and then also helping to come up with solutions to fill the gaps that are still out there.
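Ward's point about shipping a baseline feature set from the start, rather than retrofitting, could be sketched as a simple project-start checklist. This is purely illustrative: the feature names and the `XRProject` structure below are hypothetical, not any platform's or vendor's actual API.

```python
from dataclasses import dataclass, field

# Hypothetical baseline of accessibility features an XR app
# would commit to up front, rather than retrofit later.
BASELINE_FEATURES = {
    "captions",          # live or pre-authored captions
    "adjustable_text",   # user-scalable font sizes
    "audio_description", # narration of visual content
    "remappable_input",  # alternative controller bindings
}

@dataclass
class XRProject:
    name: str
    implemented_features: set = field(default_factory=set)

    def missing_baseline(self) -> set:
        """Return baseline features not yet implemented."""
        return BASELINE_FEATURES - self.implemented_features

# At project kickoff, the gap is visible immediately instead of
# being discovered after release.
project = XRProject("training-sim", {"captions", "adjustable_text"})
print(sorted(project.missing_baseline()))  # ['audio_description', 'remappable_input']
```

The point of the sketch is that the checklist lives in the project from day one, so the gaps are tracked alongside the rest of the backlog instead of surfacing as a retrofit.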
[00:10:22.216] Kent Bye: Yeah, and listening to what Liz Hyman was saying during your moderated session today, talking about different ways that folks like yourself in the industry are doing different accessibility features, and how Section 508 had come up as a way to set guidelines for a minimum amount of accessibility features for different things that are mass communication platforms. With the web, it's a pretty mature platform, so they can be very descriptive and set much clearer guidelines. And my understanding, at least from some of the discussion, was that because of the fluid and evolving nature of some of these emerging technologies, it's not as clear what those baseline standards would be to be enforced by something like Section 508. So I'd love to hear some of your reflections on being in this sort of liminal, transitional space with the technology, where you're working with the federal government, but yet the technology's growing and evolving. How do you ensure that you're living into the spirit of what Section 508 is aiming for, even if it can't necessarily be strictly enforced at this point because the guidelines are still fluid and evolving?
[00:11:26.417] Joel Ward: So yeah, that's a good question. I think, kind of like I was talking about before, making sure we add in what we know is helpful, right, and we do know some of those things, is the absolute minimum we need to do now. I'd say it's the minimum. I'm kind of thinking on the fly right now, but I've been thinking about this in the back of my mind for a while. There's sort of a set of checklist guidelines, and then there's actually the intent, right? And I think what happened with the original laws that we had, and obviously the sort of human need that we had, is that it turned into here's a checklist of things you need to do, and a lot of people just do the checklist and then they're done. And then, you know, the assumption is, well, it's accessible. But that's not true, even with the web, right? Like, you do that checklist, yes, it's better, but it doesn't necessarily mean that it's usable by everybody. So, teaching people the reason you're doing it, so as they're developing things, they think about it as they're building it, and not just checking things off a list as they're building. When I build stuff, that's always what I like to think: who am I building it for? How might they use it? Obviously testing with different people to get their feedback, and then making changes based on that, not just a list of things to do. So I don't know if this makes sense. Does this make sense? Like, you know, why are we doing it? As you're actually going through the motions, you're thinking about that, not just those checkboxes.
[00:12:53.861] Kent Bye: Yeah, that definitely makes sense. And what are some of the biggest open problems or challenges that you have that you're solving right now when it comes to accessibility?
[00:13:03.965] Joel Ward: Well, so I think the big challenge with the kind of stuff that we do is making sure that it's called out that we should do it and can do it within the context of the contract side. So I'm not even talking about, like, here's technically how you do it, but making sure that we have the space to do it and the time to do it is, I think, part of the problem. And that was the case with the web and other areas, where in consulting and contracting, if it's not in the contract, a lot of people say, well, it's not there, we're not doing it, right? And for this, we should be doing it no matter what. But if it's not called out, many people will just ignore it. And so I want to make sure that, again, even if that happens, it's done anyway, but also that we get the language into it and have people understand we need to be doing this, so it's not forgotten about. I think the other thing is, and actually one of the demos today during the demo session talked about this, working with our developers and our artists and the people that are part of the process and figuring out where in that process we can make sure that this stuff is introduced. So kind of back to what I was saying before, where it just becomes natural to be addressed in the process versus saying, oh, hey, we have to stop and do the accessibility here, right? They actually put it in the process where it makes sense, right? And figuring that out and then teaching everybody and educating people on how to do that properly. So again, it's not so much the technical. The process and the contract stuff is probably the biggest challenge. The technical things are solvable, and we've solved some of that, but it's making sure that we're actually implementing them. And again, either making sure it's required and called out, and also that we're in a process where it happens.
[00:14:46.761] Kent Bye: I was just at Augmented World Expo and saw a lot of the demos from companies like XREAL, which used to be called Nreal. So I got a chance to see Eyes On and a lot of their latest head-worn augmented reality glasses, which are, I guess, more like smart glasses than augmented reality. You're just kind of overlaying a static image on top of it. But it's a pretty good resolution. And I saw that you were holding up a pair of XREAL glasses and using them as a form of assistive technology, with live transcripts and captioning for folks who are deaf or hard of hearing in conversation. So I'd love to hear some of the different work that you've been doing with XREAL, or exploring these types of smart glasses like XREAL as a form of assistive technology.
[00:15:25.337] Joel Ward: Yeah, so I actually bought those. And they were Nreal when I bought them. Now they're XREAL. I bought those for myself. A lot of the other stuff I do with the company, I get to try a lot of different headsets. And I'd used the Nreal Lights, which are the predecessor to the Airs, and used those for a while. I went and bought the Airs for myself. One, because I was using the captioning software, and I wanted to have a really good pair of glasses to bring around and really test that out. And on different platforms, right? Because it works best on Android, on certain Android phones, but it also works on iOS now. You can also use it on your Mac to do, like, multi-screen. And so, with all this together, I just need to have this so I can use it. This is kind of me doing some real-life R&D, or really, just real-life testing. I mean, the captioning, I don't necessarily need, but I find it helpful. And actually, my wife, who likes turning captioning on while watching TV, I've had her try it. And then you can do captioning wherever. Just sort of investigating, like, where else could this be useful? Obviously, it's useful for people who might be deaf or hard of hearing, but it also could be useful for someone who has trouble following a discussion. Or, again, the translation is really interesting. But also, they are AR glasses. If you have the right software, you can actually do interactive stuff with these glasses. The problem is the software, right? The use cases that they have out there that are readily available are the captioning, a heads-up display for your computer or your phone, and watching videos. That's what I've been doing. I watch YouTube videos in bed without a TV. On the train yesterday, I had a heads-up display, so I didn't have to look at my laptop, and I could black out my laptop and have a private display. And then the captioning, which I think there's a lot more that can be done there.
The struggle is there's not necessarily enough apps available for them. And there's a bunch of different platforms, and Apple will probably help get this into the forefront of people thinking, well, what could we do? And then what could we do on the less expensive platforms like the XREAL, or there's a couple other ones out there that are kind of similar that are in the couple-hundred-dollar range versus $3,500. But we still need better apps. I'm okay with it, but not everybody's going to buy a $400 pair of glasses just to use it for one thing, right? So I think we need to find those other use cases to get the stuff out there, so people actually have things that are useful to do.
[00:17:46.977] Kent Bye: And I saw that you maybe had a chance to show them off to some other attendees who are either deaf or hard of hearing. I'd love to hear if you got any feedback for how effective it was for them.
[00:17:56.284] Joel Ward: So they really wanted to try it. And actually, that was one of the reasons I brought it, because I was curious if they had tried them before, and they hadn't. The version I had with me was with my iPhone today. And unfortunately, that version is just really a heads-up display of the iPhone display. So you can't really adjust where you place it. It's just sort of in the center of your view, showing you captions, which is useful. But the comment I got from them was, I really want to be able to move it so it's not blocking my view. Which is right. I mean, I totally agree. And unfortunately, the way it's implemented on the iPhone, you can't do that. But on Android, with the ideal setup and the ideal device, you actually can do what they were asking, where you can move it around and adjust it, and adjust the depth of it and the font size and all those things. So their feedback was great. And I'm like, I wish it would just work the same way on every platform. You know, this is kind of the problem where Apple, they obviously are controlling it, so it'll always work the same on the device all the time. But all these other companies are doing this, and they're really dependent on the support of the phone or the laptop. And this is one of those great examples. I wish I had my Android phone with me so I could show them really how it's supposed to work. So that was their biggest thing: it can't just be a heads-up display. It has to be customizable. And that's what was missing with this sample I was showing them. But they could see the utility in it if they could do something like that, right? Because instead of having to look at your phone, you can kind of move it. And I think the ideal situation is you could place it above the person who's talking, versus having it in the same place, so you actually know who's saying what versus just an ongoing scroll.
So I think that's kind of the gist. There were some other things they shared, but that was the gist of it is make it customizable, allow me to set what I need versus just being sort of a basic display. But it was great to get that feedback because I know the company that makes the software that I was showing off is working with folks to get that feedback, but I think, you know, it's getting the info out there that it exists, but also how to actually run it properly and the whole setup to do it. It's much more complicated than just saying, hey, this is available. Go get it. You have to answer a lot of questions and know what you're doing. And that's the struggle with this kind of tech.
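The feedback Ward relays (movable captions, adjustable depth and font size, anchoring the text above whoever is speaking) could be modeled with a minimal settings sketch like the one below. All of the names and values here are illustrative assumptions, not the captioning app's or XREAL's actual interface.

```python
from dataclasses import dataclass

@dataclass
class CaptionSettings:
    # Offsets within the wearer's field of view, in degrees.
    # (0, 0) is dead center, which is what blocked the view
    # in the iPhone-only heads-up-display mode.
    yaw_offset_deg: float = 0.0
    pitch_offset_deg: float = 0.0
    depth_m: float = 2.0     # apparent distance of the caption plane
    font_scale: float = 1.0  # user-adjustable text size

def anchor_above_speaker(speaker_yaw_deg: float,
                         speaker_pitch_deg: float,
                         settings: CaptionSettings) -> CaptionSettings:
    """Place the caption just above the current speaker's head,
    so the wearer can tell who is saying what without the text
    obstructing their view."""
    return CaptionSettings(
        yaw_offset_deg=speaker_yaw_deg,
        pitch_offset_deg=speaker_pitch_deg + 10.0,  # nudge upward
        depth_m=settings.depth_m,
        font_scale=settings.font_scale,
    )

# Speaker is 15 degrees to the right and slightly below eye level.
s = anchor_above_speaker(15.0, -5.0, CaptionSettings(font_scale=1.5))
print(s.yaw_offset_deg, s.pitch_offset_deg, s.font_scale)  # 15.0 5.0 1.5
```

The design point is simply that position, depth, and size are user-settable state rather than fixed screen coordinates, which is the difference Ward describes between the iPhone and Android experiences.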
[00:20:07.668] Kent Bye: Great. And finally, what do you think the ultimate potential of XR spatial computing with accessibility in mind might be, and what might it be able to enable?
[00:20:18.749] Joel Ward: So we're here in part because we want to make sure XR is accessible. But like with the example of the captioning, I think XR can make accessibility even better, both in virtual spatial computing as well as, obviously, augmented reality, which can help layer information over the real world to give you a lot more options for how you interact with the real world, and then vice versa in the virtual world, having the ability to customize how you interact with that. And I think that's what is not necessarily understood on a wide basis. People look at VR like, oh, I can play a game or I can do training, right? I can do a thing, versus seeing that it's a platform that actually could be a tool for accessibility. Right? Like, I'm looking at you, and again, the captioning thing is a simple one, but I'm looking at you, and if I have low vision or I'm blind, there are apps like this on your phone that can tell you what you're looking at. Or if you're deaf and you can't hear what's around you, it's doing sensing for you, and you're picking what you want it to tell you, in whatever way you want to be told. We didn't really have any other way to do that until now. We have the bits and pieces, but if you have this spatial understanding, either in the real world or in the virtual world, and then you have that ability to set that dial on what it's telling you and how it's telling you, I think that is what can be super powerful. If we can get it right, get the devices right, get the software right, and get all the proper translation, which is easier said than done, when that all can come together, then we can have something that really can allow people to get the support they need when they need it, wherever they need it.
[00:21:52.451] Kent Bye: Is there anything else that's left unsaid that you'd like to say to the broader Immersive community?
[00:21:57.852] Joel Ward: Keep going, keep trying stuff. Don't limit yourself to what we've done so far. I really feel like we have the ability to do this. It may be years out until we have the perfect solution, this ideal that we're all thinking about, but keep going. Yeah, that's what I'll say, yeah.
[00:22:15.265] Kent Bye: Awesome. Well, Joel, thanks so much for joining me today to share a little bit about what you're doing and fighting the good fight for accessibility at Booz Allen Hamilton. And yeah, let's get those federal contracts and invest in developing a lot of these core infrastructure technologies. And yeah, like you said, hopefully we'll have a future where a lot of these different innovations are more widely shared with each other and ultimately built directly into these platforms. But I think as we start to build it out, it'll come from folks like yourself and Owlchemy Labs and other independent creators and academics testing it out and implementing it across the industry. So yeah, thanks for taking the time to share your perspective and insights on what needs to happen as we move forward. So thanks again.
[00:22:51.217] Joel Ward: Yeah, you're welcome. Thank you.
[00:22:53.483] Kent Bye: So that was Joel Ward. He's an emerging technology manager at Booz Allen Hamilton. So I've had a number of takeaways from this interview. First of all, I really appreciated this deep insight into these types of government contracts and the need to, at the contract level, start to figure out how to support the development of some of these XR accessibility tools. Because right now there are no provisions for it to be done. And so if it's not in the contract, then it essentially doesn't get done. So Joel's trying to figure out, at that level, how to create a contract situation that could start to fund this more open-ended exploration, because there's not a checklist of different requirements. And he talks about this mindset of having a checklist, but just because you check all the different boxes off doesn't mean that it's usable or completely accessible. And so I think that's the mindset: we're in this liminal space of trying to explore these new possibilities for what XR accessibility looks like, and so it means going back to the roots of trying to understand the spirit of these different regulations and laws, and to see how to put all these different features into these applications to make them more accessible. We also talked about the pair of XREAL glasses that can do live captioning and transcripts. The company used to be called Nreal; they rebranded to XREAL. And so his XREAL Airs he was showing off to different folks there at the XR Access Symposium. And because it was running off an iPhone, it was occluding people's views, and they wanted to have the ability to move it around. It sounds like Android is the best for the XREAL. I didn't try it out myself, but I probably should have, just to check it out and see what it looked like. But yeah, he's just trying to get some feedback for what's possible.
Because that's one of the really compelling use cases for these immersive technologies: this kind of assistive technology where you could wear a pair of glasses and get live transcripts of whatever conversations were happening around you. So yeah, Joel's also coming from the realm of web accessibility and is in this open inquiry of trying to explore some of the different best practices for XR accessibility, especially in the context of Booz Allen Hamilton, where he's in this emerging technology group that has the ability to build these XR experiences. And he's very keen to start to experiment more and more with implementing some of these different accessibility features. Owlchemy Labs is in the industry, and they just took the initiative on their own to start doing that. And I hope to see other folks like Booz Allen Hamilton and other companies start to innovate, be in contact with these different user groups, have very specific contexts, be in conversation with these different groups, and see what works and what doesn't. So he's hoping to continue to do that. But I think one of the blockers, like I said, is getting things figured out on the contract side, so that once these contracts come in, they have the freedom to do this type of open-ended exploration. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.