I moderated a discussion with VR pioneers Fred Brooks and Henry Fuchs at the University of North Carolina School of the Arts' Future of Reality Summit (original video is here). We talk about the evolution of immersive technologies since the mid-1960s. Fred Brooks heard Ivan Sutherland's Ultimate Display speech in March of 1965, and Henry Fuchs heard about Sutherland's Sword of Damocles VR prototype from Stanford's Alan Kay.
They each share the milestones in the evolution of tracking technologies, display technologies, real-time graphics, and scanning and volumetric capture technologies from the late sixties until today, as well as some of the early applications at NASA and with flight simulators. They also talk about the overpromises of the hype cycle of the early 1990s, but also how the term "virtual reality" really helped to catalyze a community of practice, as well as tell the story to military funders who continued to support the type of research and work that was being done by Fuchs and Brooks at UNC Chapel Hill's department of computer science.
We also talk about whether VR and AR are well on their way to mass ubiquity or if the immersive industry should be bracing for a winter period. Fuchs was skeptical that we've crossed a tipping point, and he questioned whether XR moving into mass consumer products was going to be enough to justify the investments that companies have been making in spatial computing. He specifically cited Microsoft's HoloLens, and he made these statements before it was announced that Microsoft won a $480 million military contract to develop an Integrated Visual Augmentation System for the Army. But overall, Fuchs and Brooks fill in a lot of gaps in the history of VR and provide a lot of context and perspective for how we got to this point with all of these immersive technologies.
LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST
The video version of this discussion can also be found here on the University of North Carolina School of the Arts' Media + Emerging Technology Lab page.
This is a listener-supported podcast through the Voices of VR Patreon.
[00:00:05.412] Kent Bye: The Voices of VR Podcast. Hello, my name is Kent Bye, and welcome to the Voices of VR podcast. So on today's episode, I'm going to be exploring the history of virtual reality with two of the leading pioneers of VR, Fred Brooks and Henry Fuchs. Both Fred Brooks and Henry Fuchs are at the University of North Carolina at Chapel Hill, and at the University of North Carolina School of the Arts there's a brand-new immersive storytelling program. It's in Winston-Salem, North Carolina, and their lab is called the Media and Emerging Technology Lab, also referred to as METL. So they held a one-day summit called the Future of Reality Summit, and they had me come out and give the opening keynote as well as moderate a series of different panel discussions, I think around six of them across the course of the day. They had 17 different talks, including this one with Fred Brooks and Henry Fuchs. And so we dive into the major different phases of virtual reality: the initial creation of the Sword of Damocles and how Henry Fuchs was working with Ivan Sutherland at the University of Utah, and how Fred Brooks was inspired by Ivan Sutherland's speech back in 1965, had just recently started the computer science department at the University of North Carolina, and decided to really focus on computer graphics. And so I was really curious to learn more about what was happening between the initial creation of virtual reality and when it actually went to some of the first commercial products for the enterprise with VPL in 1989, on throughout the 90s, and with the third wave and the modern resurgence of consumer virtual reality with the Oculus Rift and everything that's happened since 2012. So that's what we're covering on today's episode of the Voices of VR Podcast.
So this interview with Fred and Henry happened on Tuesday, November 6, 2018 at the University of North Carolina School of the Arts' Future of Reality Summit taking place in Winston-Salem, North Carolina. So with that, let's go ahead and dive right in. So I'm really excited about this panel. Henry Fuchs and Fred Brooks have both been involved with virtual reality for a really long time, almost from the beginning. I know, Fred, you saw Ivan Sutherland's speech in 1965. And so maybe I think a good place to start would be your experience of that speech. And then, Henry, you can talk about your entry point into this whole universe. But yeah, maybe, Fred, take us back to 1965.
[00:02:24.137] Fred Brooks: OK, I'll take you back a year earlier. In '64, I started the Department of Computer Science at Chapel Hill. And we were the second freestanding, PhD-granting computer science department in the country, not part of EE or math. Purdue beat us by six months. And we were thinking hard: what areas of research do we want to emphasize? And we decided computer graphics was one of our peaks of excellence. So now, what do you do inside computer graphics? Well, I went to the conference that Ivan spoke at in the spring of '65, in March, I believe. And he gave a phenomenal talk, but he said one sentence that changed my life. He said, don't think of that thing as a screen. Think of that thing as a window. And through the window, one looks into a virtual world. The challenge to computer graphics is to make the picture in the window look real, sound real, move real, feel real. And we've been working on that ever since.
[00:03:35.140] Kent Bye: So Henry, what about yourself? Maybe you can talk a bit about, I know that you told me the story of going to Utah to work with Ivan, but what was your, I guess, entry point into what we now refer to as virtual reality? I mean, maybe you could talk a bit about your sort of entry point into this, and what made you decide to do that?
[00:03:51.445] Henry Fuchs: Sure. I was taking an independent study at Stanford from Alan Kay, who some of you may have heard of, the father of the personal computer. And one of the many things he showed me, this was in 1969, was a 1968 paper called something like "A Head-Mounted Three-Dimensional Display" by Ivan Sutherland. I had been involved in graphics at Caltech's JPL beforehand, but I had never heard of this idea that you could put on a headset and see objects in the room around you. You could walk around and you could move them. They share your space. Wow. So Alan, who had just finished a PhD at Utah, says, you might want to think about going to grad school at Utah. That's what changed my life. And so I got there. The thing that, of course, he didn't say is that Ivan Sutherland and Dave Evans had started a company called Evans & Sutherland Computer Corporation, so when I got there in 1970, there were a number of us interested in interactive 3D graphics, some of whom you may have heard of: Ed Catmull, who heads Disney Pictures, and Jim Clark, who started Silicon Graphics and a number of other companies. But there was no full-time graphics faculty. They were both off doing their company. And so it was a very interesting time. The difference between that time and now is we were all excited about this, but it was sort of like this very esoteric thing. It was sort of like butterfly collecting. No, no, no. It was like collecting a certain kind of butterfly, only the kind that, you know, has four wings. And so when I'd be telling people about what we do, first it was like, ugh, you're doing something with computers? And then, what is it you're doing with pictures? And what is it with this thing you mount on your head? And what is it that you're interested in some little subset of this? It was, you know, like some little part, a little part of a butterfly somewhere.
You know, I don't think any of us had any idea that it would become something that most people would know about.
[00:06:19.267] Kent Bye: Yeah, so we're now today with consumer virtual reality, and there's people like Simon Wardley who looks at technology diffusion curves. And what he says is there's like four different phases. The first phase is the academic idea, which I think we can point to The Ultimate Display in 1965 as an academic paper, and then Ivan Sutherland with the Sword of Damocles in 1968. Built one. Yeah. He built it, yeah.
[00:06:40.461] Fred Brooks: He first said his vision, and then he went and did it.
[00:06:44.544] Kent Bye: And so from the idea, then we go to the actual commercial enterprise bespoke applications. And then eventually there's the commercial applications, which we can see with Jaron Lanier in the 80s. But we have this big gap in my knowledge as to what was happening from the Sword of Damocles on until we see the first publicly commercial virtual reality. What was happening in the industry? Were there flight simulators? What was happening?
[00:07:11.133] Fred Brooks: Well, flight simulators, of course, go back to World War II and the Link Trainer. And today, the best virtual reality is still flight simulators. And the best I've ever seen was a $13 million 707 simulator at British Airways' simulator installation in London. I had a chance to fly it for two hours, and it was the virtual reality experience, but for $13 million. And so that technology has developed in parallel, and flight simulation is the ideal AR experience because everything you can reach is real and everything that's virtual is through the screen. And so it's the ideal application in terms of real haptics, real sound, the whole thing. But meanwhile, people were exploring in various places. My first experience with VR was, in fact, with Warren Robinett, who at NASA at the time had built a little system. I remember his system had some nice things in it, an escalator that you rode on, and there was a virtual wall superimposed on the real wall, so I remember reaching out to see how closely the corner aligned with the real wall corner. And so there was a lot happening at essentially the academic and exploratory level.
[00:08:39.418] Henry Fuchs: From my standpoint, it was the technology that was developing at the time. So when I got to Utah in '70, Ivan was already focused on what was happening in the company. And while he had ideas about what people might do for dissertations, it was not his full-time occupation. And so there were a few things that were being developed even then. For example, the 1968 system was totally a line drawing system. You had these white lines that were in the room with you on the black background. So they were like little glowing line segments. You could walk around them. You could move them. It was great. But it was not raster graphics. In Utah in 1970, one of the guys built the first, as far as I know, real-time raster graphics system that you poured polygons into and what you got was video coming out. And that was in fact the basis of the first general-purpose flight simulator, as far as I know. So if you want, we as part of the VR group, I think we could take some credit, however indirect, for the flight simulator industry. That is, the flight simulators in 1970, first of all, as Fred mentioned, they go back a long ways, but they were not general purpose. So the best ones at the time were by General Electric, and they had a processor per polygon. Can you imagine this? A processor per polygon. I mean, you know, now we're doing millions of polygons. They had a hardware box for each polygon, okay? And then there was the supervisory thing that had to do with, you know, when you pass the scan line, oh, now we can put a new polygon into that polygon box, okay? That was GE. What Ivan Sutherland's student Gary Watkins did is build the first general-purpose one: it doesn't matter where those polygons are, you pour them in, out comes video. If you overpowered it, that is, you put too many polygons in, whatever, that scan line, you know, would be glitching a little bit, there would be some error there, but then it would catch up.
It was generating video in real time. Then there was the question of the tracking system. By the way, if you want to know one thing that's wrong about VR terminology, the Sword of Damocles was only the backup, mechanical, old-fashioned tracker for Ivan's system. The premier tracker was six degree of freedom, ultrasonic, no contact. 1968, you understand that? If you look at the 1968 paper, there are two figures and it shows two different trackers. He had an ultrasonic tracker with transmitters on the head and receivers in the ceiling. So the Sword of Damocles was sort of the backup thing that you used when the ultrasonic one didn't work.
[00:11:35.036] Fred Brooks: But it was also zero latency.
[00:11:38.999] Henry Fuchs: Actually, there was hardly any latency in the ultrasonic one, either. And the reason is, the guy who designed this, Chuck Seitz, who's a member of the National Academy, put hardware in each one of the receivers to measure what the latency is for each one of the three transmitters, all in parallel. And there was a whole undergraduate thesis at Harvard just on this ultrasonic tracker, just on the algorithm of how you go from knowing the distance from each one of the transmitters to each one of the receivers, those 12 numbers, to figuring out the 3D position of each one of these transmitters, and then the six-DOF pose of the headgear. All 1968. It was just fabulous. One of the reasons I put on a 50th anniversary session like this at SIGGRAPH was just to tell people this one thing: there was an ultrasonic six-DOF tracker in 1968. So, to get back to your question, there was a tremendous amount of technological development. Basically, after '68, when Sutherland went off to do, you know, other things, as far as I could tell, it was a desert, and a few of us were trying to do something. For example, how do you get the definition of a Volkswagen that you want to define into your graphics system? Sutherland just assigned a class to do it with yardsticks. But what you really want to do is scan, like a laser scanner, from different points of view and then put it all together. So that was developed. You really wanted a tracking system that was better than this ultrasonic one, because the ultrasonic one had problems having to do with air movement and temperature and things like that. So there were tracking systems developed in the 80s, one of which, for example, called self-tracker, used optical sensors on the head to determine where you are in the room. You know, this is what the latest systems use now: outward-looking tracking. There were inward-looking trackers developed of all kinds.
Then there was real-time image generation. So in the 1970s, if you wanted to do real-time, you could buy it from Evans & Sutherland. It cost about a million dollars per channel. And as far as I know, only one person managed to raise the money to buy one of those, Fred Parke at Case Western or something. But you know, it wasn't good for VR because it was basically a hardwired thing for flight simulators. So all kinds of technologies were developed for all the different pieces that you needed. So by the late 80s, it was possible to sort of put together things to be able to sell to people, which VPL and General Linear did.
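The algorithm Fuchs describes from the Harvard thesis, recovering a 3D position from measured distances to known receiver positions, is a trilateration problem. As a rough illustration (not the 1968 implementation; the function name, receiver coordinates, and numbers here are invented for the example), one standard approach linearizes the sphere equations against a reference receiver and solves the result by least squares:

```python
import numpy as np

def trilaterate(receivers, distances):
    """Estimate a transmitter position p from distances d_i to known
    receiver positions r_i. Each measurement gives ||p - r_i||^2 = d_i^2;
    subtracting the first receiver's equation from the others cancels
    the quadratic term ||p||^2, leaving a linear system in p."""
    r = np.asarray(receivers, dtype=float)   # shape (n, 3), need n >= 4
    d = np.asarray(distances, dtype=float)   # shape (n,)
    A = 2.0 * (r[1:] - r[0])                 # rows: 2 (r_i - r_0)
    b = (d[0]**2 - d[1:]**2
         + np.sum(r[1:]**2, axis=1) - np.sum(r[0]**2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Four receivers (non-coplanar, purely illustrative positions) and a
# known transmitter position, from which we synthesize the distances:
receivers = [(0, 0, 3), (2, 0, 3), (0, 2, 3), (1, 1, 2)]
true_p = np.array([0.7, 1.1, 1.4])
dists = [np.linalg.norm(true_p - np.array(r)) for r in receivers]
print(trilaterate(receivers, dists))  # recovers approximately [0.7 1.1 1.4]
```

Doing this once per transmitter gives the 3D position of each of the three head-mounted transmitters, and from those three points the six-DOF pose of the headgear can be derived; the real tracker also had to cope with the air-movement and temperature error sources Fuchs mentions.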
[00:14:34.177] Kent Bye: Yeah, and Jaron Lanier found Sutherland's paper about the Sword of Damocles, and it inspired him to go and make it. And so in the 90s, we really see this explosion of the first commercial virtual reality. We have the story of what's possible, this huge hype cycle, but the technology ultimately wasn't ready to be able to do what we have today. But it was at least proving out the potential. And it really kickstarted an industry. And so from your perspective, your vantage point, what was your experience of the 90s, of that big explosion of VR?
[00:15:04.027] Fred Brooks: It's best epitomized by the name of the conference: Hip, Hype, and Hope. And it was more hype than hope. We were working in the field at that time and built some homemade head mounts and so forth, Gary Bishop's magnificent tracking system. So we were just chugging along, figuring we were paying a lot of money for the commercial display devices, but the rest of the system we had to put together ourselves. Now my most important experience in the 90s was in '92, when my wife and I started planning a kitchen remodeling. And so we built a model of the kitchen, and by that time we had really good tracking and really good space. So I brought her in to go through the model of the kitchen and see what needed changing. It turned out to be very valuable, because what we discovered, and I think we could not have discovered this from drawings, was that we had two hanging cabinets on either side of a central sink and it ruined the spatial feeling of the room. It just destroyed the feeling of the room, and we would not have discovered that had we not walked in there. On the other hand, we had been in there about 15 minutes when Nancy said, get me out of this thing. And the reason was partly that her blood sugar falls off pretty sharply about 10:30 in the morning, and this was 11:30, and we had some latency problems, and she was getting sick pretty fast. Our experience in the 90s was typical, but the experience of using that to actually make the design decisions about the kitchen was a milestone for me and for our laboratory. And the kitchen today is the way we modeled it, except for those two hanging cabinets.
[00:17:23.913] Henry Fuchs: My sense was that the 90s was an exciting time, because it was then possible to build real systems, and what the hype provided was finally some funding. In the 80s, there was no term for this, and when there's no term for this, there's no community, there's no understanding of what this means. And so I remember talking to people about the concepts that we all are familiar with today. But you'd have to start with each person from ground zero. So, for example, take the tracking system that one of our students, Gary Bishop, built in the 80s. A fabulous tracking system, okay? He understood already, I mean we understood, that if you're going to track something, you cannot track it at 30 frames per second. He says in his paper, in his dissertation, we need to track at about a thousand frames per second. Okay? We all know that now. But to explain this to somebody in 1985 was, you know, a task all in itself. Wait a minute, what is it that you want to do? Oh, you want to put something on your head? And why do you want to do that? Wait a minute, and then you're going to put some sensors on your head? You know, the whole thing just didn't make any sense. So from my recollection, the thing that Jaron Lanier did that was so important was to educate the world about the vision of virtual reality. And what that meant was that when we went to get money in 1990, 1991, '92, the funders were receptive. So we could go to DARPA. And we had some money previously from them for doing other kinds of things. We were doing things that we now call interactive graphics: Pixel-Planes, a real-time system. But you know, it was not VR. But in '91, '92, we could go to them with a proposal that says, you know, if you put something on your head and you see through it, you could see on the other side of the hill. They understood that. And they didn't understand that in the 80s. And so it was really great in the 90s.
My sense was that there were all kinds of strides being made, and not just by us, but by a larger group of people. It wasn't just one or two places. It all of a sudden became an international community, in every one of these areas: in real-time systems, in displays, in tracking and interaction and applications. So now when you name any one of these applications, physical therapy, surgery, phobia treatment, they all go back to one or two places actually doing them in the 90s.
[00:20:15.409] Kent Bye: Yeah, and I bought my Oculus Rift on January 1, 2014. And that March was the IEEE VR conference, which was just after Facebook had bought Oculus for $2 billion. And I was hungry to find all the information I could. I was looking on Twitter, and Sebastien Kuntz was tweeting out the keynote that you had done there, that you had changed your keynote the night before to completely rewrite it and to say, look, this is what's happening. Why didn't this come from the academic community? And there's this huge innovation. We have Mark Bolas and the FOV2GO, and this whole sort of USC ICT, all this sort of stuff that's happening. But there is this turning point with Facebook buying Oculus, which was, in some ways, the signifier for so many people in the wider industry to say, OK, this is legitimate. This is happening. And it really was a turning point. But maybe you can kind of catch us up now that we're in this third wave of virtual reality: your orientation of what you're seeing, what you're perceiving, as where we're at now.
[00:21:15.814] Fred Brooks: I think we've heard so much vision in the panels that we've just heard that I don't think I have anything to add in terms of vision. There's no shortage of vision. And the thing I found exciting about what we've already heard today is so much of it is real. People investing real money in real applications with real customers who are prepared to... oh, you're ready to disagree with all of this?
[00:21:44.098] Henry Fuchs: He's known me for 40 years. All right, what I want to tell you guys is keep the faith. Keep the faith. And you'll need that in another two or three years. Dean, students, you need to keep the faith too. Because in three years, it's not going to be this. The difficulty now that's different from the 1990s is that the companies in the 1990s just wanted to make some sales to, like, professionals. You know, they wanted to make sales to somebody that was exploring for oil, or they wanted to make sales to the designers of helicopters, or they wanted to make sales to the surgeon or the physical therapist. Now they want to make sales to everybody. What I worry about is that, even though I completely agree with you that Facebook purchasing Oculus changed the world of VR more than anything since Ivan Sutherland's system, the problem is that those billions of dollars need to somehow be monetized, you know, and you cannot do that on the applications that most of us talk about, because we need to have it be basically everybody in their home. So location-based entertainment, I think, is fabulous. I love this stuff, okay? But I just don't think there's enough of it to be able to pay for billions of dollars of investment by Facebook, you know, by Google, by the people at HTC Vive, by the people at Magic Leap. That's what I worry about. So that's why I say, keep the faith. It is going to get better, but what I worry is that in three to five years, it will be worse. And people will ask you, well, VR, that was just hype, wasn't it? And we need to be able to say, no. It may have been overpromised, but, you know, it is coming. I guess it's coming in 10 years.
[00:23:42.425] Fred Brooks: Well, and it's been overpromised before. Yes. I mean, we've been there, done that.
[00:23:47.147] Kent Bye: Well, the thing that I see from Simon Wardley's model is the innovation from academia, the enterprise applications, the mass consumer product, and then mass ubiquity. So I say we're either going to reach mass ubiquity in 2025 or 2045. Either it's going to get this huge exponential growth and it's just going to take off, and we're really at a tipping point, or it may be another 30 years until we actually get to that point of mass ubiquity. But I see that there are still a lot of enterprise applications; if nothing else, there are so many compelling applications that can be in the enterprise.
[00:24:20.048] Henry Fuchs: Absolutely. But think about one example. One of my favorite products is HoloLens. I think it is a fabulous system. But is Microsoft going to make its $2 billion back out of HoloLens? $2 billion is what I've heard good estimates of what they've put into it over the last decade. Are they going to make $2 billion of sales out of HoloLens? Well, some people say, oh, you know, it's not HoloLens, it's mixed reality and, you know, all the headgear. Really? Are they going to make $2 billion out of their mixed reality headsets? Microsoft, how long will they go with it to develop more and more versions before they sell it to somebody? You know, like Facebook, okay? Is Magic Leap going to earn $2 billion in order to pay back its investors, really? How many sales do they have to make at $1,500 each in order to earn back the investors? I'm worried that people are not going to stay with it till 2025. They'll stay with it till 2022.
[00:25:28.970] Fred Brooks: This is our congenital optimist.
[00:25:31.992] Henry Fuchs: No, but didn't I tell you, keep the faith? It's going to be great. And by the way, if you want to know where the next breakthrough could be, somebody mentioned AR. AR, I think, is the place where it could be the best. As much as I love VR and being in a different space, AR, you know, in our glasses. Half of us are wearing glasses. Why don't we have displays in our glasses? It's just a difficult problem, but people are working on it. Ten years from now, there'll be good displays in glasses. But not two years from now.
[00:26:09.494] Kent Bye: Well, let's end not on a note of pessimism, but a note of optimism. I'm curious to hear each of you maybe extrapolate out where you see the ultimate potential of where all this could go, of what both virtual and augmented reality technologies could enable.
[00:26:22.197] Fred Brooks: Well, I'm with Henry. I agree the augmented technologies have much more promise. And we've already seen that realized with cases like Boeing and the wiring. OK. We worked with submarine designers, and the story in the shipyard was: that's the way the engineers build it; our job down here, we call it cut to plan, bang to fit. And so we're already seeing major, major effects in the whole design, the engineering design area. Massive changes. Rapid prototyping is making so much difference. And I think in terms of the whole economy, the non-entertainment applications, which have been financed by the entertainment applications, will still keep on being financed by them. But I think there will be more changes in real life as a result of the non-entertainment applications: the rehabilitation, the Lowe's transformation, all right? I think we're going to see a lot of imperceptible, not even noticeable changes in life because of the technology, and it won't be nearly as visible as the entertainment applications. Now, this is the wrong audience to run down entertainment, but I still think that's where the major changes in the economy and in our daily life are going to be. Now Henry will have a different view.
[00:28:07.833] Henry Fuchs: So my favorite scenario is telepresence. I've been interested in that for as long as I can remember. And the reason is that there is something magical about being in the same place. Think about why is it that it's better for us to all be here in the same place than watching this on video. That's what I want to do. And I think that I hope to live long enough that we'll realize those. That is, that not these fake holography things on the stage, you know. Companies will do that for you, you know. This person will come to you by hologram and then, you know, they'll see this picture of somebody. No. What I mean is that we feel like we're in the same space. That we're sitting next to each other. That we can walk around, we can see each other.
[00:29:03.181] Fred Brooks: And it's going to have to be better than it is.
[00:29:06.124] Henry Fuchs: Exactly right.
[00:29:07.306] Fred Brooks: I ran a project that had a machine in Germany, a machine in France, a machine in Britain. A computer, he means, like an IBM 360. Yeah. And input-output gear in California and in Boulder, Colorado. And the farthest distance was between Poughkeepsie and Endicott. There was no substitute for actually going in person. There was no substitute. The next best substitute was placing an ambassador resident in place who could tell me what was really going on in the other laboratory. And we're a long way from there with any of the technology that's available today. Oh, exactly right. You don't have to go as often with the technology today, but you still have to go.
[00:29:50.485] Henry Fuchs: Yep. Exactly right. When we can have a feeling of being in the same place, I think that's going to be the one that will change the world. And I think that will integrate with entertainment, because then we could feel like we're in the same place with some performer, with our heroes. We'll be in the same room with them.
[00:30:14.579] Fred Brooks: By hero, he means dinosaurs in the field.
[00:30:18.882] Henry Fuchs: Well, you know, people that I've met here today. A person who started, you know, open source. Wow. How often have I read about new C++ compilers? Wow. This person here in person is just amazing to me. You know, you were saying about emotional impact. What is the emotional impact of meeting somebody that you've read about and admired face to face for, I don't know, 30 years or something? Wow. You know, it is an emotional impact, and it's being in the same place. I mean, Susan and I don't get a chance to be together very often, but it makes an impact when I'm sitting next to her at lunch. And that's what I think telepresence could do. It could bring us closer together.
[00:31:10.970] Kent Bye: Hmm. Awesome. Well, that feels like a good place to stop. So thank you. So that was Henry Fuchs and Fred Brooks. They're both at the University of North Carolina at Chapel Hill in the Department of Computer Science, and they've both been longtime pioneers within the field of virtual reality. So I have a number of different takeaways about this interview. First of all, I want to home in on Henry's response to keep the faith. I think that is informed by his very pragmatic temperament: he's been involved with virtual reality since really the 1970s, when he was getting involved with computer graphics, and he's been involved with all the different technologies since then, going through the different winters of VR, through the first phase of the hype cycle in the 90s and the aftermath of a lot of the overpromises that were made then, and then seeing this latest resurgence of virtual reality over the last six years and seeing a lot of the same overpromises being made. But I would also say that those promises are actually being borne out in a lot of different applications. And, you know, in the panel discussion, Henry is very skeptical that Microsoft would ever be able to make back the $2 billion in investments that it's been making in HoloLens. And within the last couple of weeks or so, it's been announced that Microsoft is working with the U.S. military on a $480 million contract. And so I still think that there are going to be these types of collaborations with both the enterprise as well as with the military applications that are going to really bootstrap the overall immersive industry. And to me, I think it's still a bit of an open question as to whether or not we're going to see mass adoption of these as consumer technologies over the next couple of years.
And I think by 2020 or 2021 we'll have a better sense as to whether or not we're going to hit this mass ubiquity target by 2025, or if we're going to slip into another winter and it's going to be until 2045. I suspect that there are so many things happening that it may actually just completely revolutionize and change a lot more quickly than we all imagined. I think if we just look at the consumer and gaming and storytelling applications that are out there, Beat Saber is a great example of a game that is really an embodied game. And what I mean by that is that it's really this visceral tapping into your unconscious. It's a game that you can play each and every day, like a daily practice that you can slowly get better at. But it's something that actually takes a lot of full-body coordination, with you making a choice and taking the action that you need to take, but also training your body to do the skills that it actually needs in order to have the experience. And I think that there are so many things in life where you can start to turn things into a video game, whether it's playing music or developing different levels of perception. There are so many applications of immersive technologies for the sports arena to be able to train elite athletes. So I do think that we're still going to see these highly specialized bespoke applications that are really going to bootstrap the industry. And I think the open question is legitimately: is this going to be a consumer-facing technology? So I think that's something we just have to kind of wait and see. And for me, I recently gave a talk at the VR Now conference in Berlin, where I was really talking about what I see as the ultimate potential of virtual reality after doing over a thousand interviews now, trying to really hone down to what I see as the essence of what is going to make these technologies sticky and really take root and make a difference in society.
Again, it's gonna start with the enterprise and all these training applications and medical applications, the applications that are making real differences in people's lives. And I do think that there is a bit of a wild card in what types of immersive storytelling and entertainment experiences we're gonna be able to have. I think Henry's right that telepresence is one of those killer applications: once we have enough emotional fidelity transmitted through the face, and we're able to have these virtual experiences where you can be anywhere, have this sense of telepresence, and feel like you're actually there, that is going to be a huge step forward in the trajectory of virtual reality as well. It's going to feel like you're actually hanging out with other people. And once it gets to that level, I think Neal Stephenson, in his book Snow Crash, really nailed it when he described that as a bit of a turning point for these immersive technologies. The other thing I want to unpack a little bit is this dynamic between marketing and the pragmatic engineering reality. What Henry said is that in the 80s, there were no terms, there was no community, and there was no larger understanding of what all these technologies added together could actually do. People were piecemealing together all these different technologies, whether it was the tracking technologies, the display technologies, or the real-time graphics technologies, each being slowly innovated for its individual use cases. But it wasn't until it all congealed into this term of virtual reality, which Jaron Lanier helped to coin and propagate, that the story of virtual reality could make sense of how, with these technologies all put together, you're able to solve these specific problems. 
And in the 90s, when there was all this story and all these promises about its potential, it actually made it easier for people like Henry Fuchs and Fred Brooks to go out and raise money from places like DARPA and the military, to continue to fund the type of research that needed to happen in order to develop and create these different technologies. So I think it's important that there are these marketing hype cycles, and I think they're to some extent unavoidable, because the story, the potential of something, is always going to be ahead of what's actually pragmatically possible with a technology. And whenever there are these new possibilities, you see this play out in the Gartner hype cycle curve: new emerging technologies come together, people imagine the ultimate potential of what they could do, it gets to the peak of the hype cycle, then it falls down into the trough of disillusionment, and then it slowly climbs up the slope of enlightenment until it gets to the plateau of productivity. And virtual reality technologies are well along that slope of enlightenment, heading towards that plateau of productivity. In some industries, it's already reaching that point of being so useful and pragmatically helpful: for anybody doing spatial design, 3D modeling, architecture visualization, construction, or engineering, there are specific use cases where VR is just a no-brainer and is already doing amazing stuff out there in the industry. And I would say that there's this whole phone-based AR ecosystem out there, with people who have the phones and are able to potentially do different things with the technology. 
Now, the question I have is whether or not looking at things through a 2D frame is still going to be interesting and compelling enough to give people enough of a spatial experience that they'll eventually want to take this technology, start to put it on their face, and have head-mounted augmented reality display technologies. There are certainly many use cases for that to be useful within the context of different enterprise applications when you're actually at work. But what are the use cases for walking around with this AR technology on your face all the time, mediating the world in different ways? And I think that's where the concepts of the AR cloud come in, what Magic Leap calls the Magicverse, and what I kind of refer to as overlaying the realms of platonic reality on top of the real world. Whatever the term ends up being for describing the AR cloud, there are going to be specific applications that are useful for, say, firefighters and the military. We already have a sense of augmentation in navigating the world with GPS, and it's good enough for us to look at on our phones. But what is the spatialization of all these technologies going to give us? And what kind of context and meaning are we going to be able to add to our world with these AR technologies? I tend to look at it through the lens of human experience. So there are different qualities of experience: how are you going to be able to overlay information on top of the experience? How are you able to gain more social interactions in any place? What type of activities are you doing? For exercising around your neighborhood, Pokemon Go is a pretty good use case of adding an activity where you're actually locomoting and moving your body through space in order to have some sort of gaming experience. 
There are also going to be different dimensions of embodiment and of paying attention to sensory experiences, so how can you start to add location-specific haptics and extend your sensory experiences in these different locations? And then finally, there's the story and the gaming and the entertainment and the pleasure, all these things that are going to overlay all types of alternate reality gaming on top of our world out there. So I think a lot of these things are going to be bootstrapped with phone-based AR, and eventually the tracking of our hands is going to be good enough for us to actually put this technology onto our faces. So I think there are a lot of open questions as to this technology roadmap, what's going to happen in the larger ecosystem, and whether the story of these use cases and applications will be ready to make the jump from the enterprise applications that are already widespread and compelling into crossing the chasm into the mainstream. I really enjoyed hearing some of the evolution of the technology, and hearing about the development of the tracking technologies as well as the scanning technologies. And I think about how Henry Fuchs previously told me a story back in episode number 139, when I talked to him in France in 2015. He told me the story of how Ivan Sutherland sent a lot of his grad students out to a VW Bug to measure it with rulers and turn that VW Bug into a series of polygons so it could be modeled within computer graphics. So we've come a long way from grad students doing that by hand to doing things like laser scanning, photogrammetry, depth-sensor capture, volumetric capture, and artificial intelligence being able to extrapolate 3D mesh data from a lot of these scans of the world. 
All these things have come such a far distance, and it was fascinating to hear how even back in the mid-60s they were doing things like ultrasonic tracking, inside-out tracking, and outside-in tracking. All the different tracking technologies being developed at the time have continued to go through iterations, and I think now we're seeing the full breadth of all those different tracking technologies, especially as we see the Oculus Quest that's coming out early next year. It has this inside-out tracking, which is not going to be as robust as the outside-in tracking you have with something like the Rift or the HTC Vive, but it may prove to be just good enough for most use cases, at a really good price point, giving you enough of an immersive experience with full embodiment to allow virtual reality to go to the next level. And I think we'll all be curious to see what happens with the launch of the Oculus Quest that's coming up here early next year, in 2019. So that's all that I have for today, and I just wanted to thank you for listening to the Voices of VR podcast, and also send a shout-out to the University of North Carolina School of the Arts for bringing me out to their Future of Reality Summit to host it and to do a series of different panel discussions that I hope to be airing here soon on the Voices of VR podcast. And also a shout-out to my Patreon supporters, because I wouldn't be able to do the work that I'm doing without the support that I get from Patreon. This is a listener-supported podcast, so I do rely upon donations from listeners like yourself to be able to continue to bring you this type of coverage. So if you enjoy that, then please do consider becoming a member of the Patreon. $5 a month makes a huge difference in ensuring that I can continue to bring you this type of coverage. 
So, you can become a member and donate today at patreon.com slash voicesofvr. Thanks for listening.