At the time of this interview, Rob Swatton was the Research & Development Manager at Earth & Sky Ltd, a New Zealand-based astronomy organization. Earth & Sky does public outreach and education as well as scientific research into discovering exoplanets through a technique called microlensing.
In his R&D role, Rob was brainstorming ways that virtual reality could help visitors get a better understanding of our universe, and potentially also help with the process of discovering new exoplanets through distributed pattern recognition, something like SETI@home but with people's brains.
He sees that VR could help us visualize processes that happen on grand time scales spanning billions of years. There are visualizations of scale, distance, and time that transcend our metaphors and our ability to describe them to people. Most of these visualizations are either really esoteric or rely upon complex mathematical models that are difficult for the public to fully comprehend. Rob speculates that perhaps VR could show the process of a galaxy forming, how a nebula creates new stars, or what a black hole would look like. It'd be like a timelapse visualization that spans billions of years.
Rob also imagines that some of the VR experiences would benefit from having an interactive guide to help explain difficult concepts. A visitor would be able to see a concept within VR and also have an expert on hand to answer questions about it as the experience unfolds.
One of the more speculative ideas Rob had was how VR could make the process of exoplanet discovery more interactive with crowd-sourced pattern recognition tasks that people could do at home. He imagines something along the lines of what SETI@home does with distributed computing, but with people's brains instead.
They would likely have to do some type of filtering or symbolic translation of the raw data so that the public could understand the concepts and what they're looking for. So it'd be a balance between making the data conceptually accessible to more people and maintaining its integrity.
There are a lot of unanswered questions about how something like this would actually play out and be implemented in practice, but it's an interesting idea to crowd-source pattern recognition in order to help with different scientific research endeavors. One example of where this is already happening is the Foldit game, which has gamified the process of protein folding while at the same time allowing people to contribute to scientific research.
Rob says that you wouldn't necessarily need VR for the data analysis and number crunching of the raw data showing how gravity bends light in the microlensing process of exoplanet discovery, but VR could make the process more compelling to participate in.
Learn more about some of the ideas for how VR could be applied to education and scientific research by listening to this interview with Rob.
Become a Patron! Support the Voices of VR Podcast on Patreon.
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR podcast.
[00:00:12.016] Rob Swatton: So I'm Rob Swatton. I'm the R&D manager for Earth and Sky. We're a New Zealand-based astronomy organization based at Mount John, which is New Zealand's largest observatory.
[00:00:24.043] Kent Bye: I see. And so what are you doing with virtual reality then?
[00:00:26.925] Rob Swatton: So yeah, it may seem a bit of a leap to wonder what astronomy and virtual reality can do together, but effectively we interact with the public on a very large basis and it's very hard to convey some of these abstract concepts of scale and distance and time without using very euphemistic language that doesn't really represent anything in real life. You know, you talk about the galaxy being like a dinner plate or something like that. It doesn't really tie to anything that people can understand. So by being able to put people in a virtual reality, you can start to show people scale and distance and time in ways they can understand. You know, you can actually have comparative concepts that make it much easier to understand really those very in-depth physical concepts that we're actually dealing with.
[00:01:05.375] Kent Bye: Yeah, I think one of the paradigmatic examples of educational virtual reality experiences has been Titans of Space, where you're flying around the solar system to see the relative scale of all the planets relative to each other at one millionth the scale. That was kind of like on-a-rails experience flying around the solar system. What are you doing within astronomy and virtual reality?
[00:01:26.248] Rob Swatton: Titans definitely led the way in that, and it opened a lot of people's eyes to what could be done in those kind of environments. But where we're looking is a much more interactive experience where you'd have effectively a guide controlling your experience. So if you didn't quite understand something, it's not a case of trying to find some information in a menu or on screen; you can actually ask the guide in person and say, sorry, I didn't quite understand that, can you explain that from a different angle or in a different way that I might understand better? So we have the point of difference and also the fact that we're an end destination, so people are coming to us to look to the observatory or do the night tours or that kind of thing. So it adds to the experience rather than creating a new experience. We're allowing people to have a much shorter but more intensive educational experience than a longer kind of, you know, go up on the mountain and have a, you know, a two-hour thing where you look at the sky and look through telescopes. You know, we are constrained, unfortunately, at this time to a plane. We're stuck on a sphere, effectively. We look outwards, but we can't go anywhere. There are very few people who've gone into space, and it's an experience that a lot of people want to have, but the bar of entry is set pretty high of either being, you know, incredibly physically fit and intelligent and wanting to go and join NASA or the European Space Agency, or signing up with, you know, Virgin Galactic or SpaceX or whoever it is who wants to send you up there for a large sum of money. So the possibility of being able to go and see these things that we may not see for generations to come, you know, the formation of a galaxy, the way a nebula creates new stars, a black hole, all those kind of amazing concepts that we love talking about and discussing. And, you know, you've seen very popular television shows come out recently that are really trying to demonstrate those concepts, and it gets people very excited, but you're still viewing it in a two-dimensional manner. You're not understanding its depth and its scale and how it works and how it moves because, frankly, we don't live long enough to see how these things move. So it simply opens up a lot of the constraints that have otherwise been keeping us tied down so far.
[00:03:10.250] Kent Bye: Yeah, one of the things that I find interesting is that the stars, if you take a time lapse, you can see how they move throughout the course of the night. But yet, if you're just watching with your naked eye, you may not be able to actually perceive any motion because it's moving too slowly. And so we have to accelerate time in order to see that movement. And so I'm curious if that's some of the things that you're trying to do, is doing these types of acceleration of time to show how the sun moves across the ecliptic, across the sky, or how the stars move throughout the course of the year.
[00:03:39.200] Rob Swatton: Those are certainly experiences that I think you can generate, but they are still relatively short in the terms of space-time. When you think about the sun moving across the sky in one night, that's simply based on our rotational speed. A year in human terms is because we've gone around the sun once. So all of these things we know as the human time scale, you divide down a year into minutes, hours, seconds, nanoseconds, all that kind of thing. It's all based around our physical relationship with our home star. So they're fantastic things to discuss. There's something very interesting in that that I'd love to show people. But for me, the really exciting stuff is talking about the really grand scales of time. The formation of a galaxy may take billions upon billions of years. We know the Earth to be around about 4.6 billion years and the Sun to be around about that same age. So those are the kind of things, if you can show the formation of a solar system you can explain how earthly bodies form and how they actually coalesce into these solid forms and turn into a sphere. Then people start to get a much better grasp of what it is that builds a solar system and then we can start integrating the research arm of exosolar planet research where we're looking for new planets and explaining how we're finding those new planets and what is forming those planets.
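To give a sense of the compression involved in the kind of timelapse Rob describes, here is a minimal sketch of mapping the roughly 4.6-billion-year history of the solar system onto a couple of minutes of VR playback. The playback length and frame rate are assumptions chosen purely for illustration, not anything Earth & Sky has specified.

```python
# A minimal sketch (illustrative numbers only): compress the ~4.6-billion-year
# history of the solar system into a two-minute VR timelapse.

SIM_YEARS = 4.6e9          # total span to visualize (approximate age of the Sun/Earth)
PLAYBACK_SECONDS = 120     # assumed length of the VR experience
FPS = 90                   # assumed headset refresh rate

total_frames = PLAYBACK_SECONDS * FPS
years_per_frame = SIM_YEARS / total_frames

def sim_year_at(frame: int) -> float:
    """Simulation year shown at a given playback frame (simple linear mapping)."""
    return frame * years_per_frame

print(f"Each rendered frame advances ~{years_per_frame:,.0f} years")
print(f"Halfway through, the visitor is watching year {sim_year_at(total_frames // 2):,.0f}")
```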
[00:04:46.460] Kent Bye: Wow, so that's quite a grand scale, if you can think about the timescale of the universe and it coming into being. Are there actual models, astronomically, that are describing how these things are actually formed? And are there existing visualizations that are at least in a 2D plane of that?
[00:05:01.461] Rob Swatton: There certainly are existing visualizations, but the problem is they're esoteric. They're very scientifically minded. So you look at something like the background radiation chart, which effectively maps the radiation left over from the Big Bang. And from that you can start inferring age and distance and time based on how far the universe has expanded in the time it's been around. But it takes a fair amount of explanation to understand the image and what you're looking at. You know, the hot spots, the cool spots, what it is that's going on there. Even in the formation of a planet, you're looking at very, very complex simulations or explanations. You can't just demonstrate it in a very quick and easy manner to show, you know, one large chunk gathers another chunk and then they gain more gravitational mass and they keep gathering more and more. Those are things that in a simulation with a guide, the people we work with are very experienced in explaining these concepts just on a verbal scale. So if you give them the tools to be able to explain it visually as well, I think you'll end up with a very effective partnership between the two technologies.
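Rob's verbal picture of planet formation, where one chunk gathers another and the growing mass keeps gathering more, can be sketched as a toy accretion loop. This is only an illustrative sketch, not Earth & Sky's model or a physically accurate simulation: bodies that drift within a made-up capture radius simply merge, conserving mass and momentum, and gravity itself is hand-waved away.

```python
import random
from dataclasses import dataclass

@dataclass
class Body:
    x: float
    y: float
    vx: float
    vy: float
    mass: float

def accrete(bodies: list[Body], capture_radius: float = 0.05) -> list[Body]:
    """One toy accretion pass: any pair closer than capture_radius merges into
    a single body, conserving mass and momentum. Gravity itself is hand-waved."""
    merged: list[Body] = []
    used: set[int] = set()
    for i, a in enumerate(bodies):
        if i in used:
            continue
        for j in range(i + 1, len(bodies)):
            if j in used:
                continue
            b = bodies[j]
            if (a.x - b.x) ** 2 + (a.y - b.y) ** 2 < capture_radius ** 2:
                m = a.mass + b.mass
                a = Body((a.x * a.mass + b.x * b.mass) / m,
                         (a.y * a.mass + b.y * b.mass) / m,
                         (a.vx * a.mass + b.vx * b.mass) / m,
                         (a.vy * a.mass + b.vy * b.mass) / m,
                         m)
                used.add(j)
        merged.append(a)
    return merged

# A cloud of small bodies collapses into fewer, heavier ones over repeated passes.
cloud = [Body(random.random(), random.random(), 0.0, 0.0, 1.0) for _ in range(200)]
for _ in range(10):
    cloud = accrete(cloud)
print(f"{len(cloud)} bodies remain; largest has mass {max(b.mass for b in cloud):.0f}")
```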
[00:05:54.523] Kent Bye: So it sounds like that not only are you creating experiences that are going to be educational for the general public, but also potentially be able to create new visualizations for the advancement of science itself. And I think it's sort of speculative at this point as to what that is going to look like, but what do you expect how that's going to unfold for what you're doing?
[00:06:15.202] Rob Swatton: It's very early days for VR in general, and it's incredibly early days for the idea of scientific visualization on a mass scale. When you talk about shows like Cosmos and those kind of things, they're putting these notions and these concepts into terms people can understand and showing them in the best way they can, you know, the event horizon of a black hole. But to get into the serious research side of things takes a whole nother leap above that when you really start, you know, you need it to be at a level where a physicist would look at this and say, yes, that's right, that matches up, that ties to what we have observed. So there are certainly some challenges involved in that and I'd love to see it through to a point where we can do that and that will be very much when we start integrating the raw data from the research into the visualization side of things. So with, you know, looking for these exosolar planets, we're looking for, effectively, a gravitational effect and how it bends light. So how do you represent that in a visual sense? It's tricky. You either have to accelerate it so much that it's kind of blown out of scale and it's hard to actually really, you know, you're making it so exaggerated that it's not really the true effect that you're viewing. So there's a balancing act between satisfying the public's want for knowledge and also actually representing true research and what it is and what it does.
[00:07:23.881] Kent Bye: You know, being from an observatory, there is a sense that when you look up into the heavens, it's from a perspective of the Earth or a geocentric perspective. And, you know, when you're going beyond into space, you're kind of in the heliocentric or kind of just roaming around in space. But from the perspective of astronomy and the geocentric perspective, is there anything specifically within virtual reality that you're doing in terms of working with this observatory?
[00:07:48.585] Rob Swatton: No, it is very early days, so we're still working out those relationships, but my hope is that what we can do is allow the data they use, which is hugely intensive on a computing scale. It's why the research we're doing has only really been possible for the last 20, 30 years, because the computers simply weren't powerful enough to process the raw data. So we have a huge stack of processors that are effectively crunching the numbers on the previous night's observations, looking for these new planets. So if you can allow a sort of a crowdsourcing model where you can introduce people to this data, but you can't just hand them the raw data, it's so detailed and so intense and, you know, you need a physics degree to simply understand what you're reading. So you have to be able to represent that data in a visual fashion that people can understand, and then they can start getting involved in the research, helping to refine the computing processes. I mean, the computer effectively learns based on what the researcher tells it. So the computer might flag something and say, yes, we think this is a new planet or a black hole or something along those lines. And then the researcher has to go in and say, yes, I think that is, or no, I don't think that is. And therefore, the computer is learning from his or her analysis. So if you can effectively put that information out to thousands upon thousands of people, you're exponentially increasing the refinement of the computing model. So it may not be that individual people are making the discovery of a new planet, but they're assisting in the refinement of the method.
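The feedback loop Rob outlines here, where the computer flags candidate events, humans confirm or reject them, and the detector learns from those judgments, is essentially human-in-the-loop labeling. Below is a hedged sketch of how such a loop could be wired; the detector, the threshold-tuning rule, and the example scores are all invented for illustration and do not reflect the actual pipeline.

```python
# Hypothetical human-in-the-loop sketch: a detector flags candidate events,
# humans label them, and the labels are used to retune the detector.
# None of this reflects Earth & Sky's real system.

from statistics import mean

def detector(score: float, threshold: float) -> bool:
    """Flag a candidate event when its anomaly score clears the threshold."""
    return score >= threshold

def retune(threshold: float, labeled: list[tuple[float, bool]]) -> float:
    """Crude feedback rule: move the threshold to sit between the typical
    scores of human-confirmed events and human-rejected ones."""
    confirmed = [score for score, is_real in labeled if is_real]
    rejected = [score for score, is_real in labeled if not is_real]
    if confirmed and rejected:
        return (mean(confirmed) + mean(rejected)) / 2
    return threshold

# One round of the loop: flag candidates, collect human verdicts, retune.
threshold = 0.5
candidate_scores = [0.30, 0.55, 0.62, 0.90, 0.48]
flagged = [s for s in candidate_scores if detector(s, threshold)]   # 0.55, 0.62, 0.90
human_verdicts = [(0.55, False), (0.62, False), (0.90, True)]       # volunteer judgments
threshold = retune(threshold, human_verdicts)
print(f"Flagged {len(flagged)} candidates; threshold after feedback: {threshold:.2f}")
```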
[00:09:02.894] Kent Bye: Interesting, yeah. So there's also the SETI at home, which is like distributed computing, where the idea of just, you know, people making their computer processors available to do this grand processing for, I guess, the search for extraterrestrial intelligence. But for this, it sounds like, in some ways, it's a similar idea of distributing the pattern recognition of the human eye but having it in a way that's easy enough for the layman to start to jump into a virtual reality experience and then contribute back to the search for these exoplanets? Is that sort of the idea?
[00:09:35.972] Rob Swatton: I mean, that's pretty much the idea. Effectively, you don't necessarily need it to be virtual reality, but it's a very engaging medium. You know, people are drawn into it and they want to keep doing things and it captures people's imagination. In essence, you're doing the idea of the SETI cloud computer, but with people's brains. So you're allowing the human cognitive process to take over and assist in research that would otherwise be left to computers. So it's a funny inverse relationship where we're actually giving more of the power back to the human brain, but it's helping to refine the computing and therefore will lead to more accurate research in future.
[00:10:06.880] Kent Bye: Wow, that's really interesting. And so, what is the decision that the human is making in terms of when they're looking at this data, that they're sending information back, what are they actually looking back and then how is that, I guess how are you signaling from false positives and people making mistakes?
[00:10:20.633] Rob Swatton: I mean, I'll freely admit I'm not a physicist, so I wouldn't even want to go into the details of how the, you know, the true discovery is made. But in essence, they're looking for these very rare events. So they're looking at vast numbers of stars each night. And they're hoping to spot one of these microlensing events, which is effectively two stars passing by each other. And as one, the light from one goes past the other, it's bent and curved. And we can detect that curvature as a magnification of the light. If there's a planet around that star, it will also have its own effect gravitationally on the light, and we can then actually detect that as something as part of a light curve. So that's really the raw kind of science, and there's a huge amount more that goes into it, so I'm giving it a gross simplification, but you get the concept. So then we can effectively extrapolate that data into some kind of visual form that people can look at and understand. So I mean, representing it visually is a very tricky thing. We haven't really nailed that down as to how you go about doing that. But it needs to be something that's clear enough for people to understand. You can't just show them a massive star field. That's simply too much data. So what you would probably do is you'd allow the computer to make a rudimentary calculation of the events. And it would go in at a very low level and say, there's nothing here. There might be something here. That would then flag to the crowd or the community who wants to look into this. And they would then go in and say, based on their... I mean, you can have a Reddit-style rating where users get a rating based on their success in the past. So the more likely they are to get good results, the better. They would probably get priority on looking at the data first, because sending out these data packets does take time. It's not a small file. So over time you'd start to see trends developing and people who are very good at it would, you know, effectively move into a position of authority to tell others how to do it and how to go about that. So you're allowing a community to generate its own kind of autocracy and actually structure itself to look into the data. And so I think we can use a fairly light touch in how we go about visualizing the data and the community itself will start to come up with the best ways of doing it and we can then implement those solutions rather than trying to just kind of bludgeon them over the head and tell them this is how we're going to do it, this is how you must follow the system.
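For readers curious what the light curve Rob mentions actually looks like, the standard single-lens (Paczynski) model gives the magnification as A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4)), where u is the lens-source separation in Einstein-radius units and changes as the two stars pass each other; a planet around the lens star shows up as a brief extra bump on that smooth curve. The sketch below computes only the smooth single-lens curve, with event parameters invented purely for illustration.

```python
import math

def magnification(u: float) -> float:
    """Point-source, point-lens (Paczynski) magnification at separation u,
    where u is the lens-source separation in Einstein-radius units."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

def separation(t: float, t0: float, u0: float, tE: float) -> float:
    """Separation over time for a straight-line pass: closest approach u0
    at time t0, with Einstein-radius crossing time tE (all in days)."""
    return math.sqrt(u0 * u0 + ((t - t0) / tE) ** 2)

# Invented event parameters, purely for illustration.
t0, u0, tE = 50.0, 0.1, 20.0
for day in range(0, 101, 10):
    print(f"day {day:3d}: magnification {magnification(separation(day, t0, u0, tE)):5.2f}")
```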
[00:12:21.180] Kent Bye: And so, being in this specific project that you're working on, how are you measuring success in terms of whether or not what you're doing is getting the end result for what you're looking for?
[00:12:31.625] Rob Swatton: So, microlensing is a very young science. In that regard, it's hard to measure success, and you might look at, you know, the initial cost and say, well, what was the point? You know, it's an incredibly high cost to build this telescope and this facility to make it happen. But the knock-on effect is that we're refining the techniques that make microlensing work. So in future you could have space-based microlensing telescopes that are much better and faster at doing the computations and are able to beam the data back to Earth to massive data centers. So it's very early days for the science and we're therefore, not so much the pioneers, but we're early enough to influence the development of this technology and allow it to move forward in future rather than aim for it to be... We're never going to beat the kind of results that Kepler's getting. They're looking at thousands of new exoplanets each time they do some observation. They're up to huge numbers in the number of planets they've found. We're at relatively low numbers because it's a young technology and we haven't really refined the way we go about doing it yet, but we're getting there. It's simply just early days for the tech.
[00:13:27.054] Kent Bye: And how did virtual reality come into the picture of what you're doing here?
[00:13:31.118] Rob Swatton: That's an excellent question and one that probably wouldn't be answered over about three hours. I've always had an interest in technology. I've done a lot of varied things throughout my short career so far. And it just seemed a natural fit. I think it is the next evolution of our visual inputs; the way we actually interact with media and data will move into much more of a stereoscopic and a spatial manner. So it seemed a natural fit to start working on it now when there is a small enough community that you can start talking to all the right people rather than it being to the point where you've got your Apples and your Samsungs who are taking it over and these corporate monoliths are really hard to actually go and approach and talk to about this. It was important to do it at an early stage, and then it was important to make sure that it was integrated early enough and well enough that it was naturally adopted rather than it being sort of co-opted at a later date and tacked onto the side, rather than just being a natural part of the process. I see.
[00:14:21.935] Kent Bye: Are there any special considerations that you're giving feedback to the oculuses of the world in terms of what you need to be able to do this?
[00:14:29.423] Rob Swatton: From the public side of things, from the more public engagement of people coming to us, certainly the headsets need to be strong because the public using them does take a fairly heavy toll on them. So I'm hoping to hear some things in the near distant future about what they're doing in regards to a consumer version. And that I think will lead us to conclude whether or not Oculus is... Right now Oculus is the perfect choice because they have developer kits out. So we can start using their tech and actually playing with their solutions and seeing how we can work in VR. It may be in future that we have to move to a different solution, because their solution simply isn't rugged enough to keep up with what we're doing. From the crowdsourcing data point of view, it's really a non-issue, because that's people providing their own hardware. All we're doing is providing the data and allowing people to get back to us with it. So that becomes pretty much a non-issue. But it's certainly, from a community standpoint, I would definitely like to see some moves towards a much more solid and rugged solution for just usage on a day-to-day basis.
[00:15:21.296] Kent Bye: And finally, what do you see as the ultimate potential for what virtual reality could bring to your field?
[00:15:26.850] Rob Swatton: From an astronomy point of view, it's definitely an outreach thing. I mean, I very much doubt you're going to see researchers sitting down, plugging in their Oculus and going to work and looking through telescopes. It's not really that kind of application, you know. They're using very specialized optics and cameras to get the results they're getting. I think when it comes to the data analysis and the way we crunch the numbers, then the ideas of virtual reality and being able to visualize data in a much more spatial way will become important. But it's hard to say. It could go any way. It's so early on in the development of this technology and the development of our technology that really there are so many open avenues, and it's just a case of which one we end up choosing.
[00:16:03.344] Kent Bye: Great. Well, thank you so much. Thank you, Ken. And thank you for listening! If you'd like to support The Voices of VR Podcast, then please consider becoming a patron at patreon.com slash voicesofvr.