Florian Frieß is from the Visualization Research Center at the University of Stuttgart, Germany. He presented on using sonification to highlight events extracted from a molecular dynamics simulation. The talk was mentioned by a number of other participants in the first-ever Workshop on Virtual and Augmented Reality for Molecular Science at the IEEE VR conference.
Here’s the abstract from his presentation on sonification:
Scientific visualization is an application area for virtual reality environments like stereoscopic displays or CAVEs. Especially interactive molecular visualizations that show the complex three-dimensional structures found in structural biology are often investigated using such environments. In contrast to VR applications like simulators, molecular visualization typically lacks auditory output. Nevertheless, sonification can be used to convey information about the data. In our work, we use sound to highlight events extracted from a molecular dynamics simulation. This not only offloads information from the visual channel, but can also guide the attention of the analyst towards important phenomena even if they are occluded in the visualization. Sound also creates a higher level of immersion, which can be beneficial for educational purposes. In this paper, we detail our application that adds sonification to the visualization of molecular simulations.
Theme music: “Fatality” by Tigoolio
Subscribe to the Voices of VR podcast.
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast.
[00:00:12.074] Florian Friess: My name is Florian Frieß. I'm a student at the University of Stuttgart, and I talked about sonification in molecular simulations. So we have this problem where we have proteins surrounded by liquids, fluids, and we wanted to extract events from this simulation. And the user can't always see the events, because they're occluded or because he didn't look at the right place. And so we use sonification to guide the attention of the user to these events. So we have audio for direction, so the user can hear where the event is happening and then can look at that specific point.
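A directional audio cue like the one Frieß describes can be derived from the event's position relative to the camera. Here is a minimal sketch of one way to compute a stereo pan value; the function name and the simple left-right panning are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def stereo_pan_for_event(event_pos, cam_pos, cam_right):
    """Map an event's world position to a stereo pan value in [-1, 1].

    -1 = fully left, +1 = fully right, relative to the camera.
    All arguments are 3D numpy arrays; cam_right is the camera's
    unit right vector.
    """
    to_event = event_pos - cam_pos
    norm = np.linalg.norm(to_event)
    if norm == 0.0:
        return 0.0  # event at the listener: center the sound
    # Project the normalized direction onto the camera's right axis.
    return float(np.dot(cam_right, to_event / norm))
```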
[00:00:48.743] Kent Bye: I see. And so what is it that you're trying to draw their attention to in the context of this simulation? Like, what's the hope for what they're going to look at once they hear the sound?
[00:00:58.347] Florian Friess: Okay, so we have different events. First, the hydrogen bonds are important for the protein because they give stability to it. And when a hydrogen bond forms, we can play a sound, and then the user can look at the specific part of the protein where it's happening. The other thing we did was implement a kind of proximity sensor. It works like the park assist in a car. A protein has a binding site where a molecule can dock, and we measure the distance between specified molecules and this binding site, and if it gets close, it starts beeping like a park-assist system. These are important things in how a protein behaves within a liquid, and so we wanted the user to see these events and know what's going on.
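The park-assist analogy suggests a simple mapping from distance to beep rate. A minimal sketch of such a mapping; the function name, the range, and the interval values are illustrative assumptions, not taken from the paper:

```python
def beep_interval(distance, max_range=10.0, min_interval=0.05, max_interval=1.0):
    """Return seconds between beeps for a molecule approaching a binding site.

    Beyond max_range (assumed to be in Angstroms) there is no beep (None);
    closer distances shorten the interval linearly, like a car's park assist.
    """
    if distance >= max_range:
        return None  # too far away: stay silent
    t = distance / max_range  # 0 at the binding site, 1 at max range
    return min_interval + t * (max_interval - min_interval)
```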
[00:01:41.426] Kent Bye: Yeah, and you showed a few videos of these simulations with the sounds, and it was on a 2D screen, so we weren't immersed in the 3D environment. But I'm curious about where the data for the simulations comes from and what type of frame rate you would typically see in terms of seeing the simulation.
[00:01:58.057] Florian Friess: Okay, so we're working together with colleagues from the biology department, and they provide us with the simulation data. Frame rates are usually, depending on your computer, about 400 frames per second, so the computation of most of these events can be done in real time. So basically it works, and we have kind of a powerwall, a six-meter-by-two-meter screen. You can also project in 3D, put glasses on, and then you can see the scene in 3D and have stereo sound. So it's really quite immersive.
[00:02:28.642] Kent Bye: And so at 400 frames a second, that's a lot of data to go in a second. Are you speeding that up, or are you actually watching it in real time? Is it more helpful to watch this thing unfold that quickly?
[00:02:40.093] Florian Friess: The simulation, of course, is kind of discrete, so the data is discrete. And to get a smooth playback, we interpolate between the data points. So this is why we get more frames than there is actual data. So it depends on the interpolation, and you can speed the simulation up or slow it down. If you speed it up too much, you can't hear anything anymore because the events happen too quickly. And if you slow it down, it's just too slow to find anything. So usually we play back at a speed of one: one second in real time is one second in the simulation. So it's a one-to-one scale.
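The interpolation Frieß describes is what lets the playback render more frames than the simulation stores. A minimal sketch of linear interpolation between two stored frames, assuming atom positions are kept as (n_atoms, 3) arrays (the layout and the function names are assumptions):

```python
import numpy as np

def interpolate_positions(frame_a, frame_b, alpha):
    """Linearly blend atom positions between two consecutive stored frames.

    frame_a, frame_b: (n_atoms, 3) arrays of positions; alpha in [0, 1]
    is the fraction of the way from frame_a to frame_b at the current
    playback time.
    """
    return (1.0 - alpha) * frame_a + alpha * frame_b

def frame_blend(t, dt_sim):
    """Map playback time t to the surrounding stored-frame indices
    and the blend factor between them."""
    i = int(t // dt_sim)
    return i, i + 1, (t % dt_sim) / dt_sim
```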
[00:03:13.234] Kent Bye: Cool. And so, you know, how did you get involved within this project then?
[00:03:17.558] Florian Friess: Well, the University of Stuttgart offers a kind of job for students that you can take while you're studying. It's basically 10 hours per week, and you can take your time whenever you want to work on the project. I got into it because I wanted to work, needed a bit of money, and then I found they were offering a position for their framework called MegaMol. I got into it and have worked with them for half a year now. And then they offered this idea of sonification and asked me if I wanted to work on the paper, and so I said yes, and now I'm here.
[00:03:50.897] Kent Bye: And so how do you actually implement the spatialized elements of the sound then within this framework? Is it within Unity or MegaMol, or how are you actually determining the spatialization of these sounds and controlling that?
[00:04:03.961] Florian Friess: In our implementation we have the data, so we can look at the time data, and then we have the positions of the atoms, and we know which atom belongs to which molecule. And so we can then just look at the atom positions and derive what's happening in the data. So if the atoms of a hydrogen bond are within a certain distance, we can say, okay, this meets the criterion for the hydrogen bond, and then we can fire the event.
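A minimal sketch of that distance criterion, assuming atom positions are held in numpy arrays; the cutoff value and the callback are illustrative assumptions, and a real detector would typically also check donor-acceptor angles:

```python
import numpy as np

def detect_hbond_events(donors, acceptors, cutoff=3.5, on_event=print):
    """Fire an event for each donor/acceptor pair within the cutoff.

    donors, acceptors: (n, 3) and (m, 3) arrays of atom positions;
    cutoff: assumed distance criterion in Angstroms.
    """
    for i, d in enumerate(donors):
        # Distances from this donor to every acceptor atom.
        dists = np.linalg.norm(acceptors - d, axis=1)
        for j in np.flatnonzero(dists < cutoff):
            on_event(("hbond", i, int(j), float(dists[j])))
```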
[00:04:30.507] Kent Bye: Cool. And so what's next for this process of sonification? Moving forward, what are some of the things that have yet to be done?
[00:04:37.652] Florian Friess: So we want to implement more events, two or three more events that we want to sonify as well. And we also want to do a user study to test how the system works and whether the sound really offers anything, whether it's worth it.
[00:04:53.609] Kent Bye: Great. And finally, in the context of virtual reality, what do you see that the potential is for VR and what it might be able to enable, either scientific visualization or just experiences for people?
[00:05:06.917] Florian Friess: I think it's going to offer more experiences for people, because it can create more immersion, a higher immersion into the scene you're looking at. So I'm thinking it's more for getting people into it than offering more for science.
[00:05:23.737] Kent Bye: Great. Well, thank you. You're welcome. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash voices of VR.