#218: Visual Similarity Analysis of Atmospheric Nucleation Data using Virtual Reality

At the IEEE VR conference this year, there was the first-ever Workshop on Virtual and Augmented Reality for Molecular Science (VARMS). Molecular science is one branch of science where virtual reality has a lot of applications, and Logan Herche from the University of the Pacific talks about his presentation, “Visual Similarity Analysis of Atmospheric Nucleation Data using Virtual Reality.”

Here’s the abstract from his presentation:

Nucleation processes are fundamental in many technological fields. However, the lack of effective, collaborative tools has slowed discovery in these fields. This paper outlines our work to visualize nucleation data in two ways to assist researchers seeking to learn more about nucleation processes. We created a similarity network to allow researchers to study similarities between nucleation structures easily. In addition, we developed tools to allow researchers to visualize aggregate molecular structures. To facilitate collaborative work over distance with low cost, these visualizations are web based, interactive, and can run on commodity machines. Though this paper focuses on similarity of molecular structures, this research can be applied to any network or graph visualization.

Logan mentions some of the other presentations that were interesting to him, including one that used sonification to help convey occluded information.



Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:12.093] Logan Herche: My name is Logan Herche. I'm a graduate student at the University of the Pacific in Stockton, California, and my research is looking at visual similarity analysis for nucleation structures. I looked at two main things. The first was similarity analysis with a node and link representation, to be able to look at the similarity between structures. The second was a number of different visualization techniques for users with varying levels of hardware, so that they could evaluate those structures, visualize them, and view similarities. And I think the one that is of the most interest is the Oculus and Leap Motion visualization, wherein we allow users to view two configurations simultaneously, with each hand controlling the manipulation of a given structure. That was one of the techniques we implemented, which were standard anaglyph, an NVIDIA 3D Vision stereoscopic option, and that Oculus Rift with Leap Motion Controller implementation, the final one offering the greatest level of technological immersion.
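
As a rough illustration of the two-handed interaction Logan describes, here is a minimal sketch that maps each tracked hand's orientation onto one of two molecular structures in a three.js scene. The `HandPose` type and `applyHandPoses` function are assumptions for illustration only; the actual Leap Motion integration from the paper is not shown.

```ts
// Sketch only: the left hand rotates one configuration, the right hand the other.
import * as THREE from 'three';

interface HandPose {
  hand: 'left' | 'right';
  pitch: number; // radians, taken from the hand-tracking frame (assumed input)
  yaw: number;
  roll: number;
}

function applyHandPoses(
  poses: HandPose[],
  leftStructure: THREE.Object3D,
  rightStructure: THREE.Object3D,
): void {
  for (const pose of poses) {
    const target = pose.hand === 'left' ? leftStructure : rightStructure;
    // Map the hand's orientation directly onto the structure's rotation.
    target.rotation.set(pose.pitch, pose.yaw, pose.roll);
  }
}
```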

[00:01:10.160] Kent Bye: I see. And so, you know, when you're looking at this, I'm curious how you're measuring or comparing or seeing which one is the most effective. Like, are you doing user surveys or how are you able to judge which of these various techniques are the most effective for visualizing these 3D objects?

[00:01:26.210] Logan Herche: So at this point, we look to do more evaluation and comparison between them in future work. At this point, the bulk of our comparisons have to do with performance, primarily in terms of what different machines we can have them run on. And the bulk of our analysis of performance relates to that node and link representation. As far as the various levels go, we have a very sort of high-level analysis of which ones are worse and better, but it's not super specific. We haven't really gotten in-depth with a user study as of yet, although we plan to in the future.

[00:01:56.515] Kent Bye: And so maybe you could tell me a little bit about why doing these visualizations in 3D is so much more effective than just 2D. What is it specifically that you're looking at, and what is the 3D component of that?

[00:02:10.842] Logan Herche: So we're trying to make it so researchers can view molecular structures and kind of piece them apart to figure out how nucleation processes work. And it's a lot better to be able to get some sense of depth. When looking at the molecular structure, it helps you kind of keep track of where you are in comparison to what you're working with. And it helps you perceive the structure a little bit better. So that's kind of the primary motivation for providing a VR solution to people as opposed to just a straight 2D implementation.

[00:02:38.548] Kent Bye: And so maybe you could tell me a bit about this workshop that we're at here and all the stuff that you're hearing. It seems like this is the first of this type of workshop here at the IEEE.

[00:02:47.903] Logan Herche: So yes, this is the VARMS workshop. It's a workshop on molecular visualization, or biological visualization, specifically with VR, and there have been a number of kind of fascinating talks. I really was intrigued by a sonification talk, which was a talk on using sound to help users sort of understand what's going on in a structure and also sort of understand their position. There was another talk which was discussing navigational aids when looking at very complex protein structures.

[00:03:16.745] Kent Bye: And so I know that there's also the thing of protein folding and is that something that is also playing a part in terms of what this may be able to do?

[00:03:25.981] Logan Herche: Yes, so the invited talk was on protein folding, and that talk was a very interesting application and work with the models that were being presented and how they sort of worked together, but it was not as much VR at this point in time. It was a description of sort of potential future research for VR. But the folding itself, the physical models that were created, were very intricate, very good to work with to sort of figure out how alpha and beta helices fold together.

[00:03:51.836] Kent Bye: And so what is your background? Did you have experience in VR before or is it coming mostly from the biological molecular structure angle?

[00:04:00.382] Logan Herche: So I'm coming primarily from a computer science background. So my biological background is relatively limited compared to the bulk of the other presenters. My primary interest is in virtual reality and in optimization of data visualization.

[00:04:14.032] Kent Bye: I see. And so what other kind of applications have you seen kind of come out with some of the framework that you're developing then?

[00:04:20.735] Logan Herche: You mean using the framework, the tool that I'm working with at this point? At this point, the tool is fairly self-contained. We're trying to make it available to people in association with the National Science Foundation, but it's not really being used by others. It's sort of a standalone tool, I guess you'd say.

[00:04:34.680] Kent Bye: And so what is the process for actually creating these molecular structures? Is there like a plugin for Unity or are you having to actually create these by hand or how do you get something actually up and running in this program then?

[00:04:46.629] Logan Herche: So we have a web-based visualization that we offer people. We want it to be able to run on commodity machines, and we want distributed teams to be able to use it, so that people out in the field, working remotely without access to a lot of expensive hardware, can still get access to this stuff. So we present the visualizations using WebGL and the three.js API to build and show them to users. We parse out the raw data in a sort of two-step process: the first step creates configuration files for the individual configurations that we can display to users, and the second produces a similarity JSON file, which we use to do our node and link visualization. But the configurations themselves we parse out and build in real time in the web browser with WebGL.
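
To make the pipeline he describes a bit more concrete, here is a minimal sketch of loading a similarity JSON file and drawing a node-and-link graph with three.js. The `renderGraph` function, the JSON schema, and its field names are illustrative assumptions, not the project's actual code.

```ts
// Sketch only: render a node-and-link similarity graph from an assumed
// similarity JSON file ({ nodes: [...], links: [...] }) using three.js.
import * as THREE from 'three';

interface SimilarityGraph {
  nodes: { id: string; x: number; y: number; z: number }[];
  links: { source: string; target: string; similarity: number }[];
}

async function renderGraph(url: string, scene: THREE.Scene): Promise<void> {
  const graph: SimilarityGraph = await (await fetch(url)).json();

  // One small sphere per nucleation configuration (node).
  const nodeGeometry = new THREE.SphereGeometry(0.1, 16, 16);
  const nodeMaterial = new THREE.MeshBasicMaterial({ color: 0x3399ff });
  const positions = new Map<string, THREE.Vector3>();
  for (const n of graph.nodes) {
    const mesh = new THREE.Mesh(nodeGeometry, nodeMaterial);
    mesh.position.set(n.x, n.y, n.z);
    positions.set(n.id, mesh.position);
    scene.add(mesh);
  }

  // One line per similarity edge; brighter lines mark more similar pairs.
  for (const link of graph.links) {
    const a = positions.get(link.source);
    const b = positions.get(link.target);
    if (!a || !b) continue;
    const geometry = new THREE.BufferGeometry().setFromPoints([a, b]);
    const material = new THREE.LineBasicMaterial({
      color: new THREE.Color().setScalar(link.similarity),
    });
    scene.add(new THREE.Line(geometry, material));
  }
}
```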

[00:05:26.795] Kent Bye: I see. And when a user is actually manipulating these two structures with each of their hands using a leap motion, what kind of judgments or evaluations are they trying to make, or what's their intent for being able to compare these two structures?

[00:05:39.628] Logan Herche: So the primary objective is to be able to find similarity between structures and try to identify patterns, because one of the things we want to be able to do is get a better understanding of how the nucleation process works, and being able to manipulate the models helps with that. We want to be able to provide tools for people to be able to identify trends, identify similar structures, perhaps. We don't know exactly what we're looking for, to be honest, but we want to get a better understanding of the process and have tools that will give us the ability to kind of delve deeper into how those structures work.

[00:06:08.695] Kent Bye: And so maybe you could tell me a bit more of the nucleation process. Is it looking at one point in time and then the same structure at a different point in time? Or are you looking at two different structures? Or maybe you could describe a little bit more as to what's actually happening.

[00:06:21.339] Logan Herche: I can do my best. It's not the easiest process to describe, and we don't fully understand it at this point. But it's the main process by which we get new particles in the atmosphere. And it's a gas-to-particle conversion. And so the actual data that we're looking at is generated from Monte Carlo sampling and accurate force fields. And that data, we're looking at individual instances of time, but not a single structure over time, if that makes sense.

[00:06:46.765] Kent Bye: Yeah, and so I guess I'm trying to get a sense of what they're actually comparing. Like, if it's not the same structure at the same point in time, then what's the source of those two things?

[00:06:57.856] Logan Herche: So we want to see if there are any common themes, really. So if there are common structures or if there's some commonality that we can find that can give us more information about how that process works. I think that's the big thing that we're looking for. For instance, with our test data we work with about 29,000 molecular structures, and if we can find a theme that's very common throughout all of them, that could potentially give us some valuable information.

[00:07:11.328] Kent Bye: Yeah, I'm curious about the reaction of professors or scientists who have been in this field for their entire lives but haven't been able to have an immersive experience with this data at any point. What's been their reaction to being able to see these in stereoscopic 3D?

[00:07:37.672] Logan Herche: So the biologists and chemists that we've brought in have been overwhelmingly positive in their response to being able to see it. We were surprised that the response to our anaglyph implementation was actually extremely positive, which we didn't necessarily expect given that it's much simpler compared to the others. But of the implementations, anaglyph and the Leap Motion and Rift setup tended to be preferred, and the stereoscopic option was less preferred, at least among the people we've managed to bring in. But overall the thought is that it's really cool to be able to see the structures in 3D and interact with them that way.

[00:08:12.633] Kent Bye: Cool. And finally, what do you see as sort of the future of using this type of technology for scientific visualizations?

[00:08:20.320] Logan Herche: I can't say for sure. I'd like to see it sort of expanded and better integrated, and I'd love to see it in use by people specifically doing pollution research, climate change research, nanotechnology, and pharmaceutical development, which are the main fields where we would expect to see researchers playing around with it. But I can't say for sure who would benefit most.

[00:08:39.700] Kent Bye: Cool. Well, great. Thank you. Thank you. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash Voices of VR.
