Dr. Jeff Norris leads Mission Operations Innovation at the NASA Jet Propulsion Laboratory, where he heads the Ops Lab. His team has been partnering with all of the major virtual reality and augmented reality companies to accelerate progress in space exploration, spacecraft design, and augmented expert assistance for astronauts. Jeff spoke at Unity's AR/VR Vision Summit, where he gave an inspiring keynote about how NASA is using the latest immersive technologies to explore and discover more about our Universe.
LISTEN TO THE VOICES OF VR PODCAST
What was striking to me was a photo in Jeff's presentation showing astronauts and scientists using everything from the Oculus Rift to the HTC Vive to the Sony PlayStation VR to the Microsoft HoloLens. Jeff is grateful that there's already a diversity of VR platforms and input devices, each with different strengths and weaknesses.
NASA uses virtual reality when it wants to completely immerse people in another world, whether through simulated environments or, in the future, through telepresence robots that explore places in space too harsh to send humans. Jeff calls these robots "telenauts," and they may end up having more capabilities than a person who is physically present. While there will always be advantages to actually being there, some destinations, like the vicinity of the sun or the atmosphere of Jupiter, may simply be impossible for a human to visit. With virtual reality technology, we'll be able to expand the places where scientists can explore using their natural processes of observation, research, and discovery.
NASA is also using augmented reality systems like the Microsoft HoloLens on the International Space Station to help astronauts perform hardware maintenance that may be too difficult to do on their own. For example, instead of reading an instruction manual filled with complicated descriptions, astronauts can use Skype-based remote assistance to have a subject-matter expert project holograms into their view and talk them through a maintenance procedure.
The applications Jeff described are some of the most exciting uses of virtual and augmented reality I've heard about. NASA is tapping into the innate human desire to explore, learn, and discover. NASA has already been using virtual reality for training applications for decades, and it has been an early adopter of the latest consumer-level HMDs so that it could give feedback to the hardware manufacturers and ensure the devices would be functional for how NASA wanted to use them.
Whether it’s applications for designing spacecraft, helping astronauts do their work on the International Space Station, or exploring Mars and beyond, NASA is using VR and AR to satisfy our innate desires for exploration and discovery.
Here’s Jeff’s awesome keynote that he gave at Unity’s AR/VR Vision Summit:
Become a Patron! Support The Voices of VR Podcast Patreon
Theme music: “Fatality” by Tigoolio
[00:00:05.452] Kent Bye: The Voices of VR Podcast.
[00:00:11.996] Jeff Norris: My name is Jeff Norris, and I lead Mission Operations Innovation at the NASA Jet Propulsion Laboratory. And I lead a lab there called the Ops Lab, where we're building new tools for controlling robots and spacecraft. And we have a special emphasis on applications of virtual and augmented reality to controlling robots and helping people to explore environments.
[00:00:32.488] Kent Bye: Great. And so we were just here at the Vision Summit from Unity and you gave a keynote where you were kind of talking about the future and all the things that we'd be able to do in the future with AR and VR. Maybe you could talk a little bit about some of those things.
[00:00:46.737] Jeff Norris: Sure. I mean, it's astonishing to me how quickly this whole field is moving, and it's really been a great thing for NASA because we're not attempting to push forward in this space all by ourselves. We've chosen a strategy of partnership with a lot of different companies that are working in this space, and that's really accelerated our progress. When I look back on the, you know, the few years that we've been working in this area, I'm surprised at how quickly things are moving. so it makes it even harder to try to guess where we're going to be in the future. What we have right now are three main applications that we're heavily working on in my lab and then a bunch of new ones that we're thinking about in the future. Those three main ones are an application for exploring Mars, an application for helping astronauts on the International Space Station in their work, and an application for designing spacecraft.
[00:01:33.070] Kent Bye: Yeah, and I think there was a bit of a punchline where you were talking about these things as if they were going to be happening in the future, but they're already happening. And so you had a slide that showed every head-mounted display from the Oculus Rift to the Sony PlayStation to the HTC Vive, as well as like augmented reality like HoloLens. So from your perspective, what do you see as kind of like the strengths of what you're using, which of these virtual reality systems for what?
[00:01:57.508] Jeff Norris: Actually, we're really grateful that all of these devices are not identical. And so this market, even though it's a young market, it's producing devices that are exploring different advantages of the medium. And not just virtual and augmented reality, even within the devices within those categories, we see special capabilities amongst them. So at NASA, what we're doing is doing our best to pair the unique qualities of each of these technologies with the right problem at NASA. And so far, that's been a strategy that's working out really well for us.
[00:02:26.205] Kent Bye: Great. So what are some of the strengths of, say, like a HoloLens, when you're able to put something that's in the middle of the room and have multiple people in the same shared physical space?
[00:02:34.189] Jeff Norris: Well, the differences between virtual and augmented reality are probably not news to your audience, but just to explain it from NASA's perspective, virtual reality is an appropriate technology for us when we are looking to immerse and explore in a distant environment. Whereas augmented reality is best for us when we need the user to be able to either make use of tools in their current environment or perhaps their current environment is a large part of why we're making that visualization. So we've chosen virtual reality technologies when we want to truly immerse our operators, perhaps in the body of a robot or in a distant environment. We've chosen augmented reality when we might need them to use tools that are surrounding them or because they're operating on the environment around them. So that's the main reason why we've made those distinctions between the platforms.
[00:03:20.553] Kent Bye: Yeah, and with the 6-degree-of-freedom controllers that you get with, say, the HTC Vive, which has very tight 360-degree laser tracking so you can turn around in any direction and still be tracked, as well as the Oculus Touch controllers and the Sony Move controllers, talk a bit about what these new 6-degree-of-freedom controllers enable on these new immersive computing platforms.
[00:03:45.960] Jeff Norris: So I think input is going to be a really exciting next phase of innovation within virtual and augmented reality because it's really only sometimes just a few instants after someone puts on a virtual reality or augmented reality headset that they want to get the rest of their body involved and they want to put their hands on the things and touch things and manipulate the objects as if they're real. So, yeah, you've just listed a bunch of people who are working on that and I think that there's even more who are coming into that space. For us, it's critical because we need to be able to manipulate objects or robots, machines, etc. with precision and we want that to be as natural as possible. And for us, connecting it to our hands, you know, without having to go through, you know, an abstract input device, connecting it as directly to our hands as we can is really what we're hoping for.
[00:04:33.160] Kent Bye: Now I guess that's the challenge, is that there's also Leap Motion, where you have optical tracking of your fingers and digits. Is that the level of fidelity that you need for some of this, or is a trigger or button on something with closer physical tracking going to have the quality of specifications that you need in order to do actual real work with NASA?
[00:04:53.325] Jeff Norris: And you probably won't be surprised when I say that it depends very much on the application. So there are applications where we don't need the individual digits, and others where it would be helpful to have that. So yeah, that's another example, because we've also done work with Leap, of finding ways to use each of these technologies for the problem where it really matters. So I think both is the answer.
[00:05:14.841] Kent Bye: So what would you need to use your fingers for in a telepresence application?
[00:05:18.790] Jeff Norris: Well, sure. So an application that we actually did, it's been about two years ago now, with Leap, we allowed people to control the legs of the ATHLETE rover, which is a six-foot-tall robot, a giant robot with wheels on the end of its legs. And it's a challenge to control all the degrees of freedom of that robot. And just as an experiment, we mapped the control of a hand onto the legs of the robot. And despite the fact that you have five fingers and ATHLETE has six legs, it was a natural mapping that people could just instantly grasp and understand. So that would be one example. When you're working with large pieces of a spacecraft design and moving things around, there you might not need digit-level precision, but then if you get to finer levels of detail on that spacecraft, it might again become really attractive to have that. So, again, I think it's very dependent on the application.
[00:06:07.438] Kent Bye: It seems like space technology has to meet a certain level of specifications and quality assurance against failure. You're putting these things out, and a lot of times the technology is many, many years old, but it's reliable. So when we talk about something like virtual reality, it's so cutting-edge. Set the context in terms of something that's so innovative but, at this stage, pretty early in terms of whether or not it would actually be deployed to a production environment.
[00:06:36.087] Jeff Norris: Well, we are adopting these technologies early and we've chosen that strategy because we see the potential of this technology for our work and we want to be a part of early stages of this technology because we want to help make sure that our applications and our needs are represented as the medium develops. But we are pushing these technologies into production environments now. You know, we went through the full certification process necessary to launch the Microsoft HoloLens to the space station, you know, including all the safety issues that you're discussing. And so, you know, we're careful and we have to be careful because some of the things we're doing do have critical consequences. But that's, I believe, what's necessary right now, which is that it's not that we shouldn't use these technologies, it's that we should take the care that we need to because of our application area.
[00:07:22.998] Kent Bye: Yeah, and I heard that, unfortunately, some of the original HoloLens units that were being sent up were on a rocket that exploded, but now they actually are being sent up. Talk a bit about what the Microsoft HoloLens is going to be used for by astronauts on the International Space Station.
[00:07:39.834] Jeff Norris: Sure, I'd be happy to. Yes, we did have our first two units lost on a rocket that we launched last year, but the next two units arrived successfully in December, so we have two Microsoft HoloLens devices onboard the International Space Station today. And we're going to be using them in a couple of ways. Initially, we're going to be using it in what we call remote expert mode, which leverages the Skype technology that Microsoft has developed. And there, remote experts in the mission control environment are going to be assisting astronauts on board the space station to help them accomplish tasks more quickly that may require complicated procedures or that they may not have been fully trained on before going to the space station. And we have aspirations beyond that to work on things that you might think of as sort of like a holographic construction manual, where they can come up to a piece of equipment that they need to work on and see overlays on top of that equipment showing them where a piece of equipment needs to be removed or installed, things that are safe to touch or not touch, that kind of thing, just by looking at the piece of hardware.
[00:08:38.794] Kent Bye: There was a section in your keynote where you started to talk about situations and contexts where it actually may make more sense to not be there physically in real reality, but to be remote through virtual reality, through what you call a telenaut. Maybe you could explain what a telenaut is and what type of things and capabilities a telenaut might be able to have.
[00:08:59.247] Jeff Norris: So I think it's important to keep in mind that it's not just NASA astronauts that are exploring. It's a culture of hundreds of thousands of people, both working for NASA and our contractors and the universities that support us, that are all contributing to this journey. And the technologies that we're discussing here at the Vision Summit are ones that can involve those people more effectively in exploration than we've been able to thus far. So, when I think about a telenaut, I think about a person who is able to be present in a distant environment without having to be physically present in that environment, and that does carry many advantages. And I'd be quick to say that there are many advantages associated with being physically present in an environment. What I wanted to explain in the keynote today, when I said that virtual reality can be better than actually being there, is that there are abilities that we have in a virtual environment that a human doesn't have in a physical environment, and those include, you know, ways that you can look at an environment, being able to see through the eyes of instruments, scientific instruments that can measure the environment in ways that our eyes cannot, being freed from the physical limitations of your body so that you can move into whatever perspective is necessary to answer a scientific question. And I think if we look further out beyond Mars, which I spoke about today, there are many, many places that as a species we want to go. And some of those places may never be a really great place to put a human being. You know, we may want to get, you know, right up close to the surface of the sun, or we might want to descend into the environment of Jupiter. And those are places where virtual presence may be the best way possible to be there.
And that's exciting to me because I think that there's a whole opportunity there for us to extend human presence into places without having to physically extend presence into those places.
[00:10:45.760] Kent Bye: Yeah, that's really interesting, because you're talking about virtual explorers in some sense, where you send some sort of robot that can withstand the environmental conditions, and yet you still get the sense of depth and scale that you see through VR. What advantage does that give you? For a scientist using this kind of virtual explorer robot that goes to all these places, what does virtual reality give them with that presence?
[00:11:13.924] Jeff Norris: I think it's important to remember that all human beings are natural at this act of exploration. Our brains and our senses have evolved to allow us to just apprehend and understand our surroundings just, you know, effortlessly. And when we send a robot out into a distant environment, we have to be careful that we still do our best to engage those natural abilities. So the example I would give is that, you know, consider a geologist working in the Grand Canyon. The geologist, just by standing in that environment and being able to move around it as a human does, can just absorb and understand the environment so rapidly. We want them to have access to all of those core human abilities, regardless of the environment that we're exploring. So if that environment, for whatever reason, isn't a place we can yet put a human, then we want to give them as human of an experience as we can there, because we have so much respect for the ability of the human to understand environments. So it's really just a question of how do we engage these amazing abilities that we all have, and then the specialized training that we build on top of those abilities as a scientist, even if they can't put their boots in the environment that we're looking at.
[00:12:24.741] Kent Bye: So what do you want to experience in virtual reality then?
[00:12:27.682] Jeff Norris: So many things. I mean, just speaking personally, I'm a gamer. I've always been a gamer. And I'm looking forward to experiencing what the new creators in this medium are going to do as they really discover what this medium is for. I think it was actually Marshall McLuhan who remarked that a new medium always imitates the previous medium at first. So the first television shows are really radio shows with a camera pointing at people talking. So we're in that young age of VR right now, and people are just figuring out right now how to imitate the old medium. And I'm really looking forward to seeing it come into its own as it actually becomes a medium in its own right. And as people discover what it really is for, that's going to be just a wonderful thing to experience.
[00:13:13.283] Kent Bye: And finally, what do you see as kind of the ultimate potential of virtual reality and what that might be able to enable?
[00:13:21.485] Jeff Norris: That's a big question. Well, if I come back to exploration again, one thing that I think is exciting about this medium is that it allows us to imagine exploring very, very distant places. So, you know, one challenge that we face all the time as we explore further and further out is, you know, the speed-of-light time delay. And, you know, for a long time, it has been discouraging to imagine trying to control a robot, perhaps out on the surface of Pluto, where it would take so long for a signal to travel there and travel back that the pace of exploration would be so slow. But as we see the domain of artificial intelligence advancing, and the domain of virtual and augmented reality advancing, if you put these things together, you can start to imagine robotic systems that we're sending out further and further from Earth. And the real goal is not for them to be controlled by us in real time, but to build models that we can then explore using an interface that is as adapted to our senses as possible. And I believe that that's the track that VR is on. So what that means is that no longer are we worried so much about that pesky speed of light. We're able to explore these models that have been built by these robotic avatars in a natural way. And I feel that that act of exploration, we will come to recognize, is no less meaningful than the act of exploration if you were physically present. So for me the ultimate promise of VR as it applies to exploration is a redefinition of what it means to be present, focusing on the experience of exploration rather than the physical travel that it has been associated with historically.
[00:14:58.744] Kent Bye: Okay, great. Well, thank you so much.
[00:15:00.165] Jeff Norris: It's my pleasure. Thank you.
[00:15:01.786] Kent Bye: And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voices of VR