Tom Furness has been working on virtual reality technologies since 1966, but most of his early work with the United States Air Force has remained fairly secret (see my previous interviews in episodes #245 and #347). Ivan Sutherland and Mort Heilig are often cited as early VR pioneers, but Furness was also working secretly at Wright-Patterson Air Force Base on the first helmet-mounted displays, visually-coupled systems, and eventually The Super Cockpit. It wasn’t until he was given permission to speak about The Super Cockpit project in the mid-80s that the world got to learn about the advances in VR technologies he’d been working on.
I had a chance to catch up with Furness at the AWE 2023 on June 1st, just ahead of Apple’s announcement of the Apple Vision Pro that happened a few days later on June 5th. We talk about the Virtual World Society, and the meeting of XR industry CEOs and leaders at AWE to have an off-the-record conversation about the impacts of AI and how to collaborate to help bring about a more exalted future of the XR industry.
Furness also shares a bit more context on the early history of VR, as he has been working continuously within the field for the past 57-58 years. I think it’s really important to look back upon where these immersive and spatial computing technologies have come from in order to get a better idea for where they might be going. Furness also recently won the inaugural members’ choice Ethical Values Award from the XR Guild for significant contributions to the XR industry, and continues to lead the Virtual World Society to promote the more pro-social uses of XR technologies.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR Podcast. It's a podcast that looks at the future of spatial computing. You can support the podcast at patreon.com slash voicesofvr. So I have been playing around on different experiences of Apple Vision Pro, and I'm going to be diving into much more coverage throughout the course of February and beyond. There's a lot of developers and other folks that I hope to be talking to to unpack more first impressions and reviews and capture the zeitgeist of the moment, since we're really at this inflection point for the XR industry. I'm seeing a lot of tech reviews and people getting excited about what the Apple Vision Pro represents. I just saw a tweet from CNET's Scott Stein who said, the past 10 years of VR reactions and takes are colliding into the last two weeks of Vision Pro. So certainly a lot of folks are talking about the implications of virtual reality and spatial computing. And one of the things that Apple has done that has irked the XR industry a little bit is that they seem to want to distance themselves from even associating directly with virtual reality, augmented reality, mixed reality, XR, any of these terms. There was actually guidance that was given from Apple to developers that instructs them to not refer to their applications in any official capacity using any of these terms. They say, refer to your app as a spatial computing app. Don't describe your app experience as augmented reality (AR), virtual reality (VR), extended reality (XR), or mixed reality (MR), the company wrote in these guidelines. And on January 19, 2024, Tim Cook wrote, Apple Vision Pro pre-orders begin today. We're so excited for you to experience spatial computing for the first time! And there were a lot of people within the XR industry who were like, look, you're not inventing virtual reality. This is not something that you're creating from scratch. 
There's been lots of different immersive experiences that people have had. There is an argument that the Apple Vision Pro has a level of compute that goes above and beyond what has come before. But fundamentally, there's a lot of dimensions of spatial computing that have been around for a long, long time. And this just reminded me that I had a conversation with Tom Furness, who is really one of the godfathers of virtual reality. Back in 1966, he was working in secret for the Air Force trying to integrate virtual reality as a functional tool to be used for pilots. And a lot of his work is gone on and be included into HMDs for pilots. And a lot of the stuff that he was doing in 1966 is like extremely advanced level of virtual reality. So I had a chance to talk to Tom back on June 1st, 2023, just to catch up with him again. I've done a couple of interviews with him before, but I wanted to ask him some questions because I feel like there's certain aspects of the history of VR that, you know, a lot of people point to Ivan Southern and the work that he's done in 1965 with the ultimate display paper that he wrote, and then the sort of Damocles that he was doing in 1968. But Tom was doing stuff in secret at Bright-Patterson Air Force Base and with the Air Force for a good 20 years from 1966 up until the 80s, when he was given an opportunity to actually give some press conferences and to talk about some of this secret research that he had been working on. And so there's a part of the history of virtual reality that I wanted to return to just because there does seem to be a little bit of this annoying memory hauling of that history from Apple where they present as if they've invented and created everything out of scratch, when in reality, they're leaning upon a long history and lineage of folks that have been working on this for many, many decades, including Tom Furness, who's been pretty much working on this non-stop for the last 57, 58 years. 
So I wanted to go back into some of the history, and as we start to look forward into where this may go, it's good to look back at where this technology originated and where it really came from. So that's what we're covering on today's episode of the Voices of VR podcast. So this interview with Tom happened on Thursday, June 1st at the Augmented World Expo in Santa Clara, California. So with that, let's go ahead and dive right in.
[00:04:11.168] Tom Furness: Well thanks Kent. My name is Tom Furness and I've been around in the VR space, XR space, whatever you want to call it, for some time. Starting in 1966 when I worked for the Department of Defense, the U.S. Air Force, and I was trying to solve some problems. I was involved in designing cockpits for fighter airplanes. And one of the big issues we had was the complexity of these cockpits and how do we get bandwidth to the brain from all of these instruments and control functions in the cockpit to the human so that they could actually make sense out of it. And the reason we had a human in this loop was because we needed to have this adaptive element that makes judgments and things like that that machines couldn't do at the time. So that is when I discovered that there wasn't enough space to put the instruments in the cockpit that would display all of this. And even if we could, the interpretation of it and the taking it in and the creating of what we call a gestalt of what's going on, a situation awareness, was almost impossible. And I started investigating alternative ways of how do we relay information visually. And it was clear that we can't use just discrete instruments to do that. And we were never going to change the cockpit dramatically, because you have to have standby instruments; yet you also have to have, you know, a big screen in order to convey spatial visual information, and there just wasn't enough space in the cockpit to put one of these displays. So I started looking at the technology of how do we create virtual images. Virtual images meaning something that appears in space but isn't really there. And that's what happens in head-up displays, for example. We're projecting on a combiner on the windscreen or something like that and you see this image out in space. But what if we could actually move that into the headset and then project it onto a visor? 
that could be a wide field of view and be equivalent to a display that would cover the whole cockpit or even more outside. And then, of course, on the helmet then, you could actually move it around as you move your head around. But then there became the problem, how do we know where the picture is located? So we needed to track where the head was located and what the optical axis of the display would be. So that's where we got into this whole idea of closing the loop. You had head tracking, and you provide this virtual display at the same time. This is really what we call virtual reality today. But we started working on this in 1966, 1967, and we were flying components of it in fighter airplanes, testing it, you know, pretty much around 1969 and '70 and so forth. And it was at that time I began to realize that we could go far beyond these limitations and actually create a virtual cockpit. The whole cockpit could be virtual and projected and stabilized in space, or stabilized relative to the Earth, or stabilized relative to the physical cockpit. And so this whole notion of a cockpit that you wear came out of that. And also, not only would you provide the visual spatial information, you could provide acoustic information, the spatial binaural sound, and even tactile information, because you could track the hands and even the fingers, and you could put tactors into the gloves where you'd have haptic feedback. And so basically even flipping switches using speech input and moving your hands and pointing at things or looking at things would be this intimate way of coupling the pilot to this incredibly powerful machine in the fighter cockpit, and could potentially replace, you know, the 300 switches and 75 displays you have in a typical cockpit at the time. So I worked on this for quite a while. And of course, working for the Air Force, I had a good budget. I had my own fighter airplane out there that I could test these things on. 
And I flew in the fighters as a flight test engineer. I wasn't a pilot, but I was a flight test engineer testing the equipment. And that's when this all built up to what we call the Super Cockpit. So I was simulating all of this in my lab, and we were testing various components of it in flight tests. And it was mainly for operating the airplane. And then we began to realize, well, wait a minute, we could actually practice flying with this, too. And so you can rehearse the mission ahead of time. So it could be used as a training function as well as the way to operate the actual airplane. And then we started thinking, well, maybe why does the pilot even have to be in the airplane to begin with? I mean, maybe we could remotely control these aircraft as if you were in the aircraft. And then you can make an aircraft a lot simpler than you do now. You could make an X-wing aircraft, which would be ideal because you don't have to roll the aircraft in order to redirect where the plane is flying. So this was all my background. But then an interesting thing happened that turned me into where I am now. And that point was when I was actually requested by a general officer to do a press conference about the work we were doing with this cockpit that you wear, this virtual cockpit, at Wright-Patterson Air Force Base. It was mainly to open up what was some classified work in that area, and it was just so the government could have a better, what can we say, representation of how they're spending the taxpayers' money, because there was concern about that at the time. So as the story goes on, I did. I held a press conference and did a press release, and the CBS Evening News comes in. This is Dan Rather and crowd, and David Martin, the Pentagon correspondent, comes in with a film crew to my laboratory, where I have this simulator. 
And they got a chance to tape what was going on with all the computers we had surrounding us and this big, huge virtual, what I call a Darth Vader display, that gave us a 120-degree field of view. Really, actually more field of view than we're getting in our headsets today. And it was tracked with a 16-bit electromagnetic tracking system. We had speech input, binaural sound, all these kinds of things built into it. And so I end up on the CBS thing explaining this virtual cockpit. And this was in like 1985, 1986. So what happened was after that newscast, then Pandora's box opened. Life changed after that. So, as it turns out, then all the other network television guys had to come, you know, then ABC had to come, and NBC, and CNN, and CBC, and BBC, and Australian television comes, and Nova comes in and tapes Top Gun and Beyond, and, you know, I'm in the show business now, and pretty much that's all I'm doing. Anybody that flies over Dayton, Ohio, ends up in my lab, I think, and I show them what this virtual cockpit can do. But then I started getting questions, and these came from people that would call me on the phone. They said, I watched this program on television. This one mother called me and said, I saw this program on television where you're working on this virtual cockpit thing, and I'm just wondering, my child has cerebral palsy, is there anything with this technology you can use to help my child? And then actually after that several surgeons called. One was a thoracic surgeon who said, I'm trying to do a graft of the aorta, and I'm up to my elbows inside this patient, but the problem is my navigation system, my map, is a CT scan on a light box on the wall. And I have to keep looking over at the wall, and I'm trying to figure out where I am with my hands. Is there any way you can put that scan inside the body using your technology? And other surgeons, anesthesiologists, firefighting companies, I was getting phone calls, many of them, every week. 
And I thought about that when I was talking to him. I said, well, yeah, actually, that'd be easy to do compared to what I'm trying to do. And that's when it dawned on me that we're on to something really big, a transformation in the way we interact, especially with computing machines. And that's when I decided, you know, we need to have a strategy of what we're going to do with taxpayers' money on how we develop this technology, not only for the military, but perhaps other applications that could be spun off in medicine and in education and things like that. So I asked them to give me a sabbatical. And they did. They gave me a year off to come up with a strategy. And I said, I need a travel budget. And so I got a travel budget, I got a year off, and I went everywhere. I went to toy companies, I went to kindergartens, I went to hospitals, aerospace companies, computer companies, because I was wearing a DOD badge, which let me get into these places to really see what was going on behind the scenes. And this was in '87. And it was clear to me at that time, oh my goodness, there's going to be a convergence of things coming together. There's the telecommunications stuff that's going on. There's the miniaturization of large-scale integrated circuits coming along. I mean, we did have the minicomputer, but we didn't have microcomputers at that time. And graphics signal processors, things like this were beginning to happen. And I realized that, gosh, when these things come together, that convergence is going to lead us to a whole new, affordable way, you know, like a fighter cockpit kind of thing, to be able to interact with a computational capability. And so I decided that when I got back from my sabbatical to leave the military and start a laboratory in the U.S. that would take all the investment made in my education, a lot of which was not written down, and start a laboratory that would concentrate on the technology of interfacing. 
Because all of these things were going to happen with this convergence of telecommunications and computing, but nobody was working on the interface. We're still looking at screens, and those screens just fill up a fraction of the field of view of our eyes, so it wasn't a very efficient means of transferring bandwidth. So, that's what led me to become an academic at the University of Washington. I started a laboratory whose intent was to work on this new technology, starting with what we call today virtual reality, and all the other components that go with it, the acoustic, the visual, the haptic, and all those kinds of things. And then we started spinning off companies. In the end, after about 20-some-odd years, we had about 27 companies that we'd spun off, and we had 50 members of our Virtual Worlds Consortium. These are companies that came in because they were really interested in what we were doing, and they just wanted to know, and the students that were doing it. Now, of course, what's happened after that is we built an industry. We went from these little companies that evolved to an industry. Now, meanwhile, we were way ahead of our time because really the infrastructure wasn't there. We didn't have content. You know, we had an idea how well this would work, and we knew the power of it in terms of unlocking intelligence and linking minds and things like that, but it really wasn't affordable for anybody, aside from the military probably. But we started developing simulations, simulators for sinus surgery simulation, for other kinds of surgical simulation, transurethral resection of the prostate. We worked on pain, and that's when we discovered that VR can be used for pain alleviation, for phobia treatment, for PTSD. We did that original work. And so it became clear that there was indeed a pull for this technology, but it still needed to be affordable and there needed to be an installed base. And so over the years, that industry has evolved. 
And really, the smartphone took us over the edge, because the development of the smartphone, especially when you talk about the iPhone, the combination of technologies there really did get us to the point where we could put this into a headset at a fairly low cost compared to what I'd been doing with the Super Cockpit kind of stuff. So that's really what got me into it. And since that time, I've seen this community grow. I mean, we've had several kinds of organizations that do expositions and things like that. But now with the AWE and where we are at, whatever year we are at AWE, I've lost count. Oh, 14. 14, OK. So over those years, this is sort of one of the big ones where people who are developing the technology and people who are using it get together. Yeah.
[00:16:54.600] Kent Bye: Yeah, so it sounds like all the stuff that you were working on from like 1966 up until like 1986, so for around 20 years or so, was mostly behind closed doors and likely classified and secret, and you probably weren't able to speak about it publicly. But once you were able to talk about it publicly, we now have a little bit more of that history. And one of the things I just wanted to ask you, because, you know, often it's cited, like, Ivan Sutherland and the Sword of Damocles in 1968. He wrote the Ultimate Display paper in 1965. And there was funding that he was getting from DARPA and the defense industry. And then you have Jaron Lanier, like, in the 80s, who started VPL and more of the enterprise applications with Silicon Graphics, with, like, more of an enterprise use case for VR. So you're kind of at the very, very beginning of this. But yet, I'd love to hear your reflections on whether you were aware of what Ivan Sutherland was doing, if there was any interface there, and if some of the early work that you had done at Wright-Patterson was an inspiration to start to fund some of the other stuff, like the Sword of Damocles, and if you were aware of each other and each other's work, or if you were kind of independently working on this without knowing about each other.
[00:18:00.713] Tom Furness: Well, it's really the latter. I didn't really know about Ivan Sutherland because we didn't talk much about what we were doing. And I have to say that his work was, you know, far-reaching, pioneering work. But I took an entirely different approach than he took. He was really sort of in a technology push mode. He was looking at how to interface better with computers, and that it had to be more than just this two-dimensional way that it was being done. And again, I didn't find out about him until later, in 1973, when we had a symposium about visually coupled systems, as we called it, and there was actually someone from the University of Utah who did the Sword of Damocles. And I guess I would have been probably naive that I didn't know that was happening. But then again, our technology was far beyond what was being used in the Sword of Damocles because we had more resources to work with, probably, even though he was funded by some of the government agencies. But his work was amazing, and we were parallel. I mean, he was '65, I was '66 when we started doing this. But I had a problem to solve. My job was, how in the world do we figure out, you know, back to how do we get the bandwidth to and from? Because we had a real problem. We had threats we were trying to deal with. We had to solve it. And that was where necessity became the mother of invention. So rather than probing and exploring, you know, something like what Ivan was doing, I was actually trying to solve a problem. And it was after we started getting into it that we realized what the impact could potentially be in terms of this whole new way of interfacing with virtual worlds, that that became a world in itself. And Ivan, you know, people talk about my being the grandfather of virtual reality. I'd say, no, that's not right. I'm a grandfather of virtual reality. And I have to say that Ivan was also a grandfather, you know, for the work he did, but he didn't stick with it. 
He did it for several years and then he went on to do computer graphics work, which we absolutely had to have to do virtual reality. Then there was another individual that was involved back in the early days, Mort Heilig. Now, Mort was a filmmaker, so Mort was just trying to come up with a way to have more than just a picture and some audio in traditional movies, and he created this arcade thing, this so-called Sensorama, where you not only got three-dimensional visual input, but you also had stereophonic sound, almost like binaural sound, and you had wind blowing in your face, you had smell, and you had a chair that you were sitting on that vibrated. And so you'd go into this, and it's like you're riding a motorcycle, and you feel like you're on a motorcycle doing that. But it was all film, it was all canned, it was passive, it wasn't interactive. But it did show us, you know, that really there's more going on for the experience than just the visual and acoustic. You know, we have to have these other things, you know, to make it complete, a real virtual world. So in my mind, it was really the three of us, you know, that would be considered the early pioneers. And then others came along, like Jaron. Now, Jaron's interesting, too, because what he was after is there's got to be a better way to program computers. And the idea that you could do this using gestures in order to move code around, and modules, code modules and things like that, that's where VPL Research came from, Visual Programming Language. He wanted to actually use this kit with headsets and his DataGlove and things like that to be a tool for actually programming computers. So it was, here again, necessity as the mother of invention, and that's where the EyePhones came from originally, which now, only years later, Palmer Luckey redid to start the revolution with the Oculus Rift and things like that. 
So it was these steps that came along, and I remember that Jaron, Fred Brooks from the University of North Carolina, and I gave testimony to the Senate Commerce Committee. At the time Al Gore was involved in that, and this was the National Information Infrastructure, and they really wanted to talk to us because they wanted to know what's going to happen with this virtual reality kind of stuff. And we told them we can suck up all the bandwidth that we'll ever have in the National Information Infrastructure with VR. But he was really keen and got it. He understood where this could go. And Jaron actually brought out one of his systems, you know, and we had it in the Senate. We were running senators through this gadget showing them what it's about. Yeah, that's what happened with the other guys.
[00:22:37.230] Kent Bye: So fast forward here, we're in 2023 at the Augmented World Expo, I think it's the 14th year now, and there are really large crowds here this year, a lot of excitement. Apple is on the cusp of potentially announcing whatever they've been working on. And you had a big gathering, a breakfast this morning, that I heard a lot of buzz about and a lot of excitement of getting a lot of different leaders from the industry together. So I'd love to hear a bit of what were some of your takeaways from the types of conversations reflecting on this moment in time of where we're at in the XR industry.
[00:23:07.110] Tom Furness: Okay, well this all goes back to the Virtual World Society. One of the things I was asking was, what can I be doing now that the industry is growing on its own? And I felt that what we really needed to have was someone that's really concentrated on the human side of this, because really it's a tool. Any of this technology is a tool to help us become better, including the AI side of things. You know, it may not end up doing that, but that's what we want it to do. And so really we intended the Virtual World Society to be a society of people who were thoughtful about how we want to apply technology to lift humanity and to look for these humanitarian applications. Well, it's clear that we want to answer questions to guide the industry and to be thought leaders in this, sort of a brain trust of the history of development, because it's been underway for, you know, in my case, 57 years. And a lot of it's been lost to the younger generation, but it is important to have that historical understanding of where it came from, and especially things we discovered early on about the human factors aspect of this. So we formed the Virtual World Society with this idea that we wanted to actually be these thought leaders, be the conscience of development of this whole field, and to do this in a way that's a guide, and to also help promote the good applications of it. Because we felt that, you know, the default is that you make games of violence out of it, it seems like. That's what happens with the gaming industry. But there's so much that can be done with it in terms of education and medicine and enterprise and promoting creativity. So what exists now is the last iteration of the Virtual World Society. It started originally in '93 when the movie Lawnmower Man came out. And I felt the best defense was an offense, because Lawnmower Man was a dystopian movie, but here was my baby, this beautiful, wonderful thing that we can do for humanity. 
And I felt people would get the wrong idea of what virtual reality is. So that's why I started it. But we were way ahead of our time. I mean, nobody had any VR in their homes, and it wasn't going to happen for quite a while. So I put that on hold for a while and then brought it up again after what happened with the Oculus Rift. It was clear the time had come again. There was a demand for this. So again, what we've been trying to do is look at areas of the development of this community that are going to be important, and especially what's going to happen in homes with families and with children, because this is an amazing technology for education, and we found in all of our research a profound acceleration of the ability to learn and upskilling and things like that. So what we did today: these thought leaders in the industry, the CEOs and CTOs of these companies that are making all this happen, rarely get a chance to talk to each other off the record. Because it's one of these things where everyone needs to win, a win for all of them. And it's not like you're trying to kill each other off from a competition standpoint. So we created this forum, this roundtable, that had a select number of these industry leaders just come in and, off the record, Chatham House rules, just interact with each other. We had questions that we asked them, one of which was about the AI impact, and also the impact on people. You know, the job displacement concerns and things like that, and how the growth of our industry, when AI becomes integrated into it, how we're going to compensate for this impact. So we talked about these issues, and tomorrow we're going to give a little report. Alvin Graylin and I will be giving a little report on that on the main stage. But we are issuing a report that people can have access to, to hear what was talked about. And so we feel that that's something that we can do for the community. 
And I think there was universal agreement among these industry leaders. They loved it. And they said, we've got to do this. We've got to keep this up. So that was, to us, a great outcome, because they said, we need to have this kind of forum where we can just bounce ideas off of each other and work together toward the things that are really going to be important. One of which is actually accessibility, and getting the installed base, and getting the notion communicated to the overall community, to society, that this is a good thing, and that it's growing, and it's not something to be feared. So that's sort of what came out of it.
[00:27:55.127] Kent Bye: Awesome. Great. Yeah. And there's an XR Access meeting that's coming up in a couple of weeks in New York City that I'll be at, covering some of those aspects as well. Yeah. And just to wrap up, I'd love to hear what you think the ultimate potential of virtual reality might be and what it might be able to enable.
[00:28:10.013] Tom Furness: Sure. Thanks. Well, you know, at the end of the meeting, the roundtable meeting, I made a statement. I said, you know, I think we've missed one big piece of this. We've been talking about the technology and applications and things like that. I said the big piece in this is making us more human. Because we have so much capability we're not using. And what this technology can do is help us become better as persons. And we don't have to worry then as much about these things that we think are going to affect us, because we're going to be better ourselves. Including, how do we use our senses better? And the capability we have and the capacity we have for empathy and for learning about the way things work with accelerated learning. And so it's the human side to me that is most important, of how we can become these superhumans. Because we're already super. We're just not using it. And so that's what I think the technology can do for us. And that's where I want to concentrate my attention. And I'm 80 years old now. And so I don't know how much time I have left. But I'm not dead yet. And I'm going to be working on some stuff. I want to have this forum of these industry folks together. But I also want to be working on these intimate things about how humans work and how we can use the technology to help us grow. Beautiful.
[00:29:36.165] Kent Bye: And is there anything else that's left unsaid that you'd like to say to the broader Immersive community?
[00:29:40.823] Tom Furness: Well, I love it. I mean, these are wonderful people. I mean, you go out on the floor and see the people that are working on this. Underserved communities. There's no such thing as a dumb kid. And what I see is starting with getting the technology in the hands of our young people so that they are truly virtual natives from day one. They're going to be the future of our civilization. Let's face it. Right now, they're confronting some really big problems, including mental health. They're in an uncertain world right now. And they're even having their own identity crisis. Especially when they look at what's going on and what can be manipulated in terms of truth. What is truth anymore? So these are things that we need to be working on, with underserved communities in particular. And the whole idea in my mind is that we need to give everybody a chance to become all they can be. And obviously they have their agency to do with that, but let's not pull them back because they can't get the technology or they can't get the opportunity.
[00:30:49.311] Kent Bye: Awesome. Well, Tom, thanks so much for your 57 years of continuous work in this field of virtual and augmented reality, and all that you've done in the academic sense, and the foundational work with the Defense Department, and yeah, all the stuff that you're doing now with the Virtual World Society. It's deeply inspiring to see all the work that you've done. And yeah, I just really appreciate you sitting down and helping provide this deeper historical context for where we're at now and where it all came from. So thanks again.
[00:31:16.097] Tom Furness: Thank you, Kent, and thanks for what you've done for the community. It's been huge, because you have captured a lot of this history, and with many, many people. How many episodes?
[00:31:26.321] Kent Bye: I've published around 1,220 or so, and then I've recorded another 800 that I haven't had a chance to publish yet, so over 2,000 oral history interviews, so yeah, that's a fraction of the 57 years that you've been in, but so yeah, thanks again.
[00:31:38.887] Tom Furness: Still, thanks for what you've done, too.
[00:31:41.248] Kent Bye: Awesome, yeah, thank you. So that was Tom Furness. He is one of the early innovators and pioneers of virtual reality who's done a lot of amazing work over the years. And he's been a key part of translating a lot of this work that had been happening in these secret labs of the U.S. Air Force and bringing it out into the wider world. So I have a number of different takeaways about this interview. First of all, if you just look up the history of virtual reality, Ivan Sutherland will be cited as, you know, one of the preeminent innovators of virtual reality. You know, he created Sketchpad in 1962, which is really the very beginnings of computer graphics. Really quite amazing, if you think about it: all the technology up to that point was coming out of punch cards, and he's doing this kind of graphical user interface for the first time. And then from those insights from Sketchpad, he writes The Ultimate Display paper. Fred Brooks was actually at the speech where Ivan Sutherland presented The Ultimate Display paper at the 1965 computer conference. And I did an interview with Fred Brooks back at IEEE in 2016, and Fred Brooks recounts this moment of listening to Ivan Sutherland give the speech. Sutherland said that you shouldn't think of a computer screen as a way to display information, but rather as a window into a virtual world that could eventually look real, sound real, move real, interact real, and feel real. So there's that aspiration of trying to create these immersive technologies that are creating all these qualities of presence: the embodied presence and environmental presence and emotional presence and mental and social presence and plausibility and agency and active presence. You know, all these things are the qualities of presence that virtual reality is able to engender. And so there's a lot of ways in which this technology was coming out of the military context.
But even before the military context, there was Morton Heilig, who as a filmmaker was creating stories and trying to immerse people much deeper into film stories. He wanted to create this full sensory experience, which I think in a lot of ways is where the future of virtual reality is going, with the integration of all these processes of immersive storytelling and interactive gaming. So yeah, I'm just really struck by the commitment that Tom Furness has had to this work for so long, since the mid-60s. And, you know, he had a chance to start to talk about some of his research in virtual reality for the first time, and was just getting calls from around the world, and taking that sabbatical, and just being able to hear about all the different use cases and the ultimate potentials that this technology could bring about. And so he started the HIT Lab at the University of Washington, the Human Interface Technology Lab, which has gone on to do a lot of really early frontier research on virtual reality, creating all these different companies, and really started to build up the foundation of what the industry is today, which has enabled a company like Apple to come out and launch a spatial computer. And so, yeah, I just wanted to recap some of this early history, where virtual reality has come from, and where it's going to be going. And I've got a lot of other interviews on the Voices of VR podcast that dig into a lot of these more historical aspects of the XR industry, but I wanted to release this interview that I did back at AWE and hadn't released yet, just because it's something I've been thinking a lot about as the Apple Vision Pro has come out. And I know that Tom has just been awarded the Members' Choice Award by the XR Guild for a lot of his achievements within the XR industry, and just a lot of the work that he's done.
That's been really renowned, and an incredible dedication to the XR industry for over 50 years now. So like I said at the top, I'll be diving into much more Apple Vision Pro coverage here over the next month and beyond, and so I look forward to digging into more experiences and talking to more developers and other folks about the first impressions and experiences that they're having within the Apple Vision Pro. So, that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do spread the word, tell your friends, and consider becoming a member of the Patreon. This is a listener-supported podcast, and so I do rely upon donations from people like yourself in order to continue to bring you this coverage. So you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.