Yvonne Felix has been using augmented reality as an assistive technology since 2012. They tell me, “I actually wear an augmented reality device that allows me to see. I’m legally blind, so that means that I have about 2% of my vision left. And I use something called eSight Eyewear that actually gives me access to 100% of my vision, depending on the situation.”
I recorded this interview with Felix at the VRTO conference back in 2018, and I wanted to include it in this series because I heard a lot of excitement from attendees at the XR Access Symposium 2023 about the assistive technology potential of XR. Felix told me, “Because we are actually the first people who are mobile with an AR device, using it for a specific application, I think we’re actually going to start the trend that it’s okay to wear AR in society and that it’s accepted because look at the impact it’s making.”
XR Access co-founder Shiri Azenkot told me, “There’s a very exciting opportunity to take these technologies, especially augmented reality, and to use them as accessibility tools. So to look at how we can use these new platforms to solve current unsolved problems that specifically people with disabilities experience… There’s so many other potential applications out there. It’s a very exciting opportunity.”
Some of the most compelling applications of augmented reality are likely going to be in the realm of assistive technologies like the ones Felix explores in this interview. The endgame of XR accessibility is not just to make VR and AR technologies more accessible, but to make physical reality itself more accessible. Again, Azenkot told me, “You need to think more about how we make the physical world accessible and think about how that can be incorporated into these experiences rather than trying to take a two-dimensional accessibility framework and trying to kind of fit it, squash it into this new paradigm.”
Making the physical world more accessible is going to be a long journey, as there are a lot of problems yet to be solved. But the consensus from the XR Access Symposium 2023 seemed to be that solving accessibility problems for low-vision users within virtual reality, like Owlchemy Labs has done with Cosmonious High, is likely going to provide design inspiration for solving some of the more intractable problems in making the physical world more accessible through AR technologies. As the pioneering work of Felix shows, there is certainly a huge amount of potential for XR to be used as an assistive technology.
This is a listener-supported podcast through the Voices of VR Patreon.
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com. So this is episode number 14 of 15 in my series on XR accessibility. So today's episode is with Yvonne Felix, and I actually did this interview way back, like five years ago, at VRTO, the Virtual Reality Toronto Conference, where Keram Malicki-Sanchez always does a great job of curating lots of really interesting folks from the XR industry. So back in 2018, I was there and had this conversation with Yvonne Felix, who is legally blind, and they were wearing an eSight Eyewear augmented reality device as an assistive technology. So they were walking around with this device on their head that was allowing them to actually see the world around them. So I had a chance to sit down with them and talk about being an early adopter of some of these assistive technologies for augmented reality. And this is quite an interesting discussion to explore some of the different social dynamics of these assistive technologies, as well as the impact that this device has had in their own life, opening up this whole new world of locomoting, navigating, communicating, seeing body language, and all these other things. So, yeah, this is their testimony of their experiences of using augmented reality as an assistive technology. So that's what we're covering on today's episode of the Voices of VR Podcast. So this interview with Yvonne happened on Sunday, June 17th, 2018 at VRTO in Toronto, Canada. So with that, let's go ahead and dive right in.
[00:01:44.967] Yvonne Felix: So my name is Yvonne Felix, and I actually wear an augmented reality device that allows me to see. I'm legally blind, so that means that I have about 2% of my vision left. And I use something called eSight Eyewear that actually gives me access to 100% of my vision, depending on the situation.
[00:02:03.163] Kent Bye: Wow. So maybe you could tell me a bit about the back story of this device. Where did it come from?
[00:02:09.118] Yvonne Felix: So this device was actually invented in Ottawa. The founder, Conrad Lewis, his sister had the same eye disease as me, which is Stargardt's. And he was an electrical engineer that wanted to make something that was wearable so that his sisters could see, essentially acting as a prosthesis. So the company started in 2008. And they had a commercial device available in 2012. And I was the very first person to use that device.
[00:02:33.883] Kent Bye: Wow. So have you been using it since 2012 then?
[00:02:37.043] Yvonne Felix: Yes, I've been using it since 2012, since before it was commercially available. And then I started actually working for the company about a year after I started using the device. I gave a lot of feedback. I knew a lot about the low vision community because I was very involved. And one of the things I knew was that affording a device like this would be an issue, because 70% of individuals living with a disability have a really hard time accessing employment, because of the gap between funding for technologies and their affordability in general.
[00:03:12.602] Kent Bye: Yeah, so maybe you could describe a bit of how does this even work? How do you go from 2% to nearly recovering 100% of your vision using this augmented reality technology?
[00:03:22.133] Yvonne Felix: So it's a head-mounted display. If you think about anything like a HoloLens, it's more like a HoloLens in that it gives me access to my peripheral vision. So if you think of Geordi from Star Trek, he had a visor that's very similar to it. And what it does is it takes a real-time image. The processing happens in the head-mounted display, but I do have a controller that I use to configure the algorithms. And it presents the image in a way that my eye needs by providing more than ambient lighting. So in most cases when you have low vision, your eye just isn't functioning as it should. Your pupil isn't dilating or contracting the way it needs to. Maybe your eye is changing shape from strain. And what this device does with the algorithms is provide the image in a way that relaxes the eye, allows the eye to take in light, and then you have control over how it looks. So that means that maybe if something is 300 feet away, like an exit sign in a big conference room, I can bring that exit sign to my eyes. And what's really interesting about the device is that it tricks the eye into thinking it's looking at a 40-inch monitor, but it's actually right in front of your eyes. So I can take the world around me, and I can adapt it in a way that visibly suits me.
[00:04:32.533] Kent Bye: Yeah, and as I am trying to imagine in my mind just like what that experience might be like to only be able to see 2%. Is there any metaphor that you have to describe to people what your vision feels like and then how it changes by using this device?
[00:04:48.080] Yvonne Felix: So imagine the most rudimentary cartoon with no details, no depth perception. Like, I kind of like to say, I don't know how big a fan people are of cartoons, but imagine everything looked like South Park, and mute the colors a bit, and then put a giant ball of Christmas lights in the center of that. And that's what macular degeneration looks like. So there's no distance, there's no detail, and then centrally you've got this big distraction that kind of starts out as a couple of little speckles and then over time it just gets bigger and bigger and bigger. So I guess another way to look at it is: imagine you have a thousand-piece puzzle, and losing your vision is like taking a small piece of this puzzle away every day, so you don't notice it until one day you look at the puzzle and the entire middle of the puzzle is missing. And that's what macular degeneration is like.
[00:05:41.440] Kent Bye: Oh, wow. So what was it like for you to start using this device? And what did that feel like?
[00:05:47.342] Yvonne Felix: Yeah, to be honest, it was really overwhelming. I started using it when I was 31, so I had spent all that time learning how to function in the world with the technology that I did have, which was very rudimentary technology, like a telescope. That's been around for how long? Magnifying glasses? How long have those been around? I used to use text-to-speech software, but when I tried this on, it meant that I could read anywhere I went. I didn't need to ask for help. I could apply to jobs that required real-time functionality. Usually what happens when you have low vision is it's this waiting game of waiting for someone to help you. Even if you're extremely independent, by 2 o'clock in the afternoon you're exhausted because of the ongoing physical effort of trying to always figure out how to navigate your environment. So this addresses the fact that I do have some functional sight, because I like using my remaining sight. There are situations where I could care less, and that's where I listen to an audiobook, but how many people listen to audiobooks? So this device has really leveled the playing field for me in terms of career. I'm a visual artist, and when I would put up installations, I used to have to ask like 12 different people to help me do all the things that I couldn't do and couldn't see. Now I have a choice whether or not I would like to have help with the aesthetic of my work. I'm aware of so many other details and things going on around me. And even as a parent, I have two kids, you know, there's things like, I never thought I wouldn't see my kids' faces at a school play. And then being able to make eye contact with someone a hundred feet away is like the most emotionally crushing moment where you're like, I love that interaction. Nonverbal communication, having access to that, has made me feel like I have access to humanity in a way that I didn't before.
[00:07:38.380] Kent Bye: Yeah, I noticed that as I just met you a few moments ago, as you're walking up, you had your visor up, and then you came down, and then you put the visor down. And so you're kind of doing this context switch, where I guess what you're saying is that you use your remaining eyesight to navigate the world, but yet if you want to have maybe a more high-fidelity interaction with talking to somebody, then you sort of flip down the visor. Is that kind of what happens?
[00:07:58.897] Yvonne Felix: Yeah, exactly. I was born like this, so I grew up learning how to adapt and process the world around me in an audio way and a tactile way and a partially sighted way. And so walking around, what I do is I tip the visor up, and I use the visor to see in the distance. So I'll turn up the magnification so that I can see what I'm going to anticipate, because without the visor, I can't see 20 feet in front of me. So I can see where I'm walking, I can see the stairs coming up, and then images move out of the visor and into my natural sight, which is actually below. So that people understand: the visor sits in front of my eyes, and it sits on a pair of glasses, and there are little magnets on either side of the glasses, so it allows me to tip the visor up, as if you were tilting sunglass frames up over your glasses. And so that gives me access, like a reverse bifocal, to my distance vision but also my natural vision.
[00:09:00.358] Kent Bye: Yeah, and I'm curious about the sociological response that you get from people as you're wearing this head-mounted display because, you know, there was the Google Glass that came out and there was a big backlash because there's this fear that people were using the cameras on these Google Glasses to potentially violate other people's privacy to record them and broadcast them onto the internet. And so, As you're walking around with this device that is somewhat ambiguous as to what it is, if they don't know what it is and why you have it, then what are the types of reactions that you get by being on the front lines of having an augmented reality head-mounted display out and about in public?
[00:09:35.925] Yvonne Felix: Yeah, so it's actually a game-changing question that I get asked, because before the question was, do you need help? Or there was stereotyping that would happen. And some of that stereotyping would be, oh, if I'm holding a cane, an identification cane, then I must need help with something. Now people walk up to me and ask me what I'm wearing. And it's a much more comfortable door-opening conversation, because I get to tell people what I use on a daily basis to get through life, different apps on my phone that read things to me. I can take pictures and access a cloud that if I want to know what that box over there is on the counter, I can take a picture of it and the application will say milk. Great. The conversation is that I'm not visually impaired. I'm just wearing an AR or VR device walking down the street. And to be completely honest with you, I think this community, because we are actually the first people who are mobile with an AR device, using it for a specific application, I think we're actually going to start the trend that it's okay to wear AR in society and that it's accepted because look at the impact it's making.
[00:10:46.904] Kent Bye: Great. So what's next for this company? What are the next big features or problems that you're trying to solve with this technology?
[00:10:53.803] Yvonne Felix: So I think there is a bigger conversation. Definitely looking at some developments in how this can level the playing field for people with low vision, but how can everybody have the ability to see 400 feet away? Like, that's really leveling the playing field. And it also goes back to this idea that universal design starts out for people with disabilities and trying to make accommodations, but in the end, just like texting that was made for the hearing-impaired community, it's going to benefit everybody.
[00:11:22.210] Kent Bye: Yeah, and is this something that can plug into like an Android ecosystem in some way?
[00:11:26.439] Yvonne Felix: So this device actually runs off of Android. And so some features that are really nice, I'll talk a little bit about the features of the device. So not only changing the color contrast or the brightness or being able to zoom 300 feet away, I can also stream to my phone. So my phone can actually show up on my head-mounted display, and I can do that with tablets, laptops, televisions. I can also store images. So if you imagine a classroom setting, for a student, they have to go to a special room to get an accommodation. With this, you can just have access to a test, and you don't have to wait for somebody else to read it to you. But you can actually, well, I don't want people to get ideas, but you can store information, like you can get PDFs that you can recall and look at your study notes. You can also screen share, so someone could see what I'm seeing remotely. I could be in San Francisco, and my kids could see me looking at the pile of sea lions that sit down on the wharf, which I have done before. It's really applications that everybody can use. Who doesn't want to see what their loved one is seeing when they're on a trip? But the way technology moves forward, just like cell phones, is faster and smaller, and that's always going to be the end goal, I think: just to have something like a pair of glasses like everybody else wears.
[00:12:39.902] Kent Bye: Yeah, I was in a talk yesterday with Philip Rosedale, and he said, you would be able to potentially put your email up on the wall. Wouldn't everybody, as you're sitting here listening to me talk, wouldn't you like to read your email on the wall? And I was like, no, I wouldn't. And part of it is like, and then I realized that sometimes I do look at my phone, but there's a sociological thing here where when I'm looking at my phone, my eyes are diverted from paying attention to someone. And I think there's a social contract such that the reason why that hit me so strongly is because there's this implication of using augmented reality technology so that you could be completely dissociated off into your own little world without really knowing what someone's paying attention to or looking at. And I think that there's a bit of a different social contract there, which is like, what are the boundaries and the rules of social etiquette and how does this change? So I'm just curious if you have thoughts on that.
[00:13:31.098] Yvonne Felix: Well, because this is new, I think my compass for boundaries is very wide and not very nailed down yet. Yeah, you brought up a good point. Nonverbal communication is 93, you know, here, there, it's almost together, but 93% of what we do is unspoken and it's reactive. And I think in the last five years, that is definitely one of the things that I noticed for myself is that I've had to learn how to engage nonverbally because I didn't have that option when I was younger. People not seeing my eyes, 100% that bothers them. They don't know where to look. So at one point, I had a friend of mine actually stick googly eyes onto the device. That was very successful. Otherwise, as you noticed before, I tilted the device up slightly. So usually what I do when I'm talking to someone is I'll tilt my head down when I want to look at them. And then when they're talking to me, so they know I'm paying attention to them, I will allow them to see my eyes.
[00:14:25.577] Kent Bye: Interesting. And so what are the applications and things that you want to do in either augmented or virtual reality? What do you want to experience?
[00:14:33.281] Yvonne Felix: Yeah, so this is sort of a sidetrack, but even being here at this event today, one of the things that augmented reality would be amazing for would be some sort of application, if anybody's listening, that they could create where you're in a virtual space and you teach your kids to pick up after themselves, and then eventually you move it into augmented reality, into the real world. But joking aside, the therapeutic outcomes of what augmented reality and virtual reality can provide, it's social engagement. There are people that are isolated and can't get out of their house. We're seeing it now through gaming. Everyone is on a headset talking over PlayStation Plus. I'll give you an example. I have a visual impairment. I have worked with people who have hearing impairments. And because of the two technologies that we use, we're actually able to communicate. And I like to use the telephone as an example. When you're on the phone, everyone is blind. So imagine if someone's using virtual reality. When you're in that world, the playing field is level because everyone will have the same ability to be strong, to walk, to talk, to see. And that's all from the user end. That's all going to be something that they have control over. If I look at the disability space, taking back control over your actual life through technology, especially through AR and VR, being able to have more outputs and inputs than you do on a regular basis, is going to greatly impact a marginalized part of society that, you know, is struggling to find the adaptation that will allow them to just be a part of humanity. And I believe that AR and VR is part of that path.
[00:16:10.556] Kent Bye: Yeah, I was listening to David Eagleman on the Neosensory vest, which is able to take audio input and translate it into different haptic signals on the body. And for people who are deaf, it's able to send the same signals into their brain that the cochlea would send, so that they're functionally able to turn their torso into an ear. And so in this disability space of not having access to all these different senses, the brain is actually plastic enough to be able to increase the capacity for the other senses. And as a trend in technology, I see that we're going to be able to do either sensory replacement or sensory addition, so that we're able to take in more high-fidelity information from the world. But what I hear you saying is that people who have less access to sight maybe have highly developed capacities for the other senses. And so what's it mean to now use these immersive technologies for everybody to increase the capacity of their sensory experience?
[00:17:03.433] Yvonne Felix: Yeah, it's interesting. It's something that I think about a lot because for me, my draw to the adoption of this technology is that I wanted to have the same access that everybody else did to sight. But at the same time, my lack of being able to see gave me not other superhuman qualities, but, you know, being able to read someone's behavioral patterns and their body language. You know, that is something that if you can see fully, you don't really pay attention to. Even, I joke around that, you know, I do have a prejudice and it's what comes out of somebody's mouth because that's the first impression. Doesn't matter what you're wearing, doesn't matter what you look like, doesn't matter how old you are. The first thing you say out of your mouth, your tone, the words that you use. So if you imagine that the first time you're meeting someone, you're both meeting over a device that doesn't give you sound or that tone yet, which AI is going to change that, of course, and it already is. But leveling that playing field means I don't know who I'm talking to really. And I'm getting the purest form of that individual as they choose to present it to me. Not to say that everybody's perfect, but there's something about the intimacy that technology, VR, and AR allows. You feel like you're almost like you're shedding this world and you're going into another world. And I see this as an evolution. TV was kind of the same thing. These boxes that we'd sit in front of and stare at, and all of the content was created to give us a sense of comfort, and then eventually discomfort. And VR and AR, I'm finding there is the same sort of draw. People want to be scared. You can go into a safe place to be scared. And what I find interesting, too, is there is something therapeutic about being in that safe place. There's something therapeutic knowing that whatever is going on around you, you can focus on this little world and whoever else wants to join it. 
And even with the eyewear that I'm using, the fact that I can have access to a television and see at the same time that my kids are or vice versa, we're playing video games together. So I'm in a space that they are comfortable in. And I have access to that. So I really see access being a draw. Whether it's positive or negative, it's the human experience that you get to have. And coming together in a way where judgment is removed, because you can create an avatar, is quite phenomenal.
[00:19:28.065] Kent Bye: Great. And finally, what do you think is the ultimate potential of virtual and augmented reality, and what it might be able to enable?
[00:19:37.886] Yvonne Felix: It's a big question. I'll try to make it a short answer. What I think VR and AR will enable is the ability for human beings to succeed at keeping humanity on the right track. You know, I've heard a lot of different conversations about the fear of technology and what it's going to do to humanity. Oh no, we're going to have robots, we're going to have AI, we're going to be stuck in these virtual reality worlds or augmented reality and we're not going to know what's real anymore. This real fear sometimes comes out. And if we're conscious of it, not ignoring it, but not focusing on it, the future of AR and VR to me means, again, that everybody gets to take part. And I think what it's going to do is create a global consciousness that I think we're striving for. And it's amazing, because now I look at the different platforms that are available for communication. AR and VR is another communication platform. It's another industry that's going to grow in a way that I don't even think people can understand yet. And I think I've been fortunate to be in sort of the beginning stages of where it's going to go. And I hope, with my experiences of having a disability, that... disability is an old word. One day we'll look back and go, remember when they used to put wheelchair signs on the bathrooms to say everybody can go here that has some kind of issue? We're going to look back and go, that was weird. So it's really going to help with the evolution of society and our connection in, again, ways I don't think we're even prepared for.
[00:21:18.456] Kent Bye: Awesome, great. Well, I just wanted to thank you for joining me today on the podcast. So thank you.
[00:21:22.358] Yvonne Felix: Yeah, thank you. It's great meeting you.
[00:21:24.400] Kent Bye: So that was Yvonne Felix. They're a visual artist who wears an augmented reality device, called the eSight Eyewear, that allows them to see, and that at the time of this recording they had been wearing for at least five years, since all the way back in 2012. So I have a number of different takeaways about this interview. First of all, it was just really fascinating to hear how this type of assistive technology has been able to change their life. They have only around 2% of their vision and are legally blind, but they're able to use this assistive technology, an augmented reality head-mounted display called the eSight Eyewear, to see at long distances, to see body language, and to flip it up and down depending on their context, whether they need to see far away as they're walking or see more in their near field. And yeah, they just felt like it's been a leveling of the playing field for them as a visual artist, and that it's given them more and more autonomy and changed the tone of their conversations. People used to come up to them asking if they needed help; now it's a conversation about what they're wearing on their head, and it's just a different tenor that's less exhausting for them and feels like they have more autonomy and sovereignty, opening up all these new possibilities. So yeah, just an early look at someone who's been an early adopter of some of these different assistive technologies. They said that they were using the eSight Eyewear even before it was commercially available, helping to test it out, and it's something that they'd been using pretty regularly for around five years at the point when I talked to them in 2018. Yeah, it was interesting to hear some of the different sociological dimensions, because I have noticed that, you know, when you are wearing something that's occluding your eyes, there is this social contract where it does bother people. And so they confirmed that people want to see their eyes, and so they actually flip up their visor so that people can see their eyes, even though that means that they can't see them. It just makes folks more comfortable when they can actually see your eyes. There is a social contract for eye contact and the boundaries around that. I think even with the Apple Vision Pro, when the father was playing with his son and kicking a ball around, I think it creeped out a lot of people, in terms of this uncanny valley type of experience, where people just feel like it's a little bit dystopic if you're wearing something that's occluding your eyes. But in the case of assistive technologies, this is a use case where it's actually improving the vision of folks like Yvonne, who's wearing this eSight Eyewear. So yeah, I think as we move forward, that's something certainly to look out for. We had this backlash with Google Glass, with the glasshole effect, where I think there were all sorts of other social dynamics happening there. When I was interviewing Robert Scoble, he's the one who pointed out that eye contact is this implicit social contract that we have when we're engaging with each other, and that Google Glass was interfering with that, and that was some of his own experience with it. So as we move forward, that's just something to pay attention to. There's a sociological dimension of this, and also the normalization of these technologies. I myself don't imagine ever wearing a virtual reality headset or something that's occluding my eyes, but who knows, maybe if it's normalized in the next five or ten years and, you know, there's some really amazing experiences to be had, then, you know, I could totally see it.
That's something you see within Ready Player One, where you see folks running down the streets with their eyes occluded with these virtual reality headsets, in more of a mixed reality context, or at least they're scanning the world around them so that they're not running into anything. But yeah, this kind of mixed reality, or the future of things like the Apple Vision Pro, or using these immersive technologies as an assistive technology, as Yvonne was saying, means that you can potentially expand your eyesight and have like super-vision eyesight, being able to see something like 400 feet away. So yeah, just the ability to use the magnification for some of these different tools as well. And what does it mean to start to augment your vision in this kind of sensory expansion type of context? So yeah, really quite interesting to see where that's all going to go in the future, and I really appreciate just hearing some of their own first-person testimony of what it's been like for them to be using augmented reality as an assistive technology. And yeah, I just wanted to include this unpublished interview from my archive in this series on accessibility, since using virtual or augmented reality as an assistive technology is a trend that I see, especially as we see again and again this idea of taking the design patterns of trying to figure out some of these different things within the context of VR and then porting them over into augmented reality. How can these different devices start to solve some of the problems in the context of the physical world as people are embedded in it? And yeah, I do think that some of the most compelling use cases for some of these technologies are going to be for folks who are using them in the context of assistive technology.
So assistive technology may actually be a technological innovator and driver for adoption of these different technologies, which I covered in the previous episode with Ohan Oda, talking about the AR features there of Live View in Lens and Maps. So that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR Podcast. And if you enjoyed the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. You can become a member and donate today at patreon.com. Thanks for listening.