#166: Yon Visell on Haptics research & real-time simulation of fracturing fiber composite materials

Yon Visell is an assistant professor at Drexel University focusing on haptic display engineering at the RE Touch Lab. His research aims to uncover the neuroscientific and physical basis of human tactile sensation and perception. In the process of trying to model physical sensations in the hands, he began looking at stochastic physics models, which have a random probability distribution that can be analyzed statistically but not predicted precisely.

At IEEE VR, Yon presented a long paper titled, “Fast physically accurate rendering of multimodal signatures of distributed fracture in heterogeneous materials.” In essence, he is able to model what it looks and sounds like when wood breaks at a sampling frequency of 48 kHz. There are thousands of fibers that could be modeled individually, but that would be too computationally intensive for a real-time physics simulation. There is, however, a fairly consistent probability distribution for how these fibers fail, and it can be approximated with remarkable accuracy.
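To make that claim concrete, here is a minimal sketch of why a statistical description works, using an equal-load-sharing fiber bundle model, a textbook toy from the fracture literature rather than Visell's actual algorithm. Each fiber breaks at a random strain threshold, and the aggregate stress-strain curve of thousands of fibers converges to a cheap closed-form mean-field prediction. All parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative equal-load-sharing fiber bundle (a textbook toy, not the
# paper's model). Each of N fibers fails once strain exceeds a random
# Weibull-distributed threshold; bundle stress is strain times the
# surviving fraction. All parameters are assumed for illustration.
rng = np.random.default_rng(0)
N = 10_000
thresholds = rng.weibull(2.0, size=N)      # per-fiber failure strains (shape=2, scale=1)

strains = np.linspace(0.0, 3.0, 500)
surviving = (thresholds[None, :] > strains[:, None]).mean(axis=1)
stress_mc = strains * surviving            # Monte Carlo over all 10,000 fibers

# Mean-field curve: the survival probability of a Weibull(shape=2, scale=1)
# threshold is exp(-strain**2), so sigma(e) = e * exp(-e**2).
stress_mf = strains * np.exp(-(strains**2))

print("max |per-fiber simulation - mean field| =", np.abs(stress_mc - stress_mf).max())
```

The per-fiber simulation and the closed-form curve agree to within Monte Carlo sampling noise, which is what makes a real-time approximation of thousands of fibers feasible.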

Here’s the abstract of his paper:

Abstract: This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
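The abstract's mention of the inverse transform method of random number sampling can be illustrated with a small, self-contained sketch. This is a generic toy, not the construction from the paper: uniform random numbers are mapped through an inverse CDF to produce the waiting times of a jump process, and each jump is rendered as a short impulse at 48 kHz. The constant event rate and the decay constant are assumptions for illustration.

```python
import numpy as np

# Generic illustration of the inverse transform method named in the
# abstract, not the paper's actual construction. Uniform random numbers
# are mapped through an inverse CDF to get the waiting times of a jump
# process; each jump is then rendered as a short impulse at 48 kHz.
fs = 48_000                   # audio sampling rate, as in the paper
duration = 1.0                # seconds of output (assumed)
rate = 200.0                  # mean failure events per second (assumed)
rng = np.random.default_rng(1)

# Inverse transform: for U ~ Uniform(0,1), dt = -ln(U)/rate satisfies the
# exponential CDF F(dt) = 1 - exp(-rate*dt), giving a Poisson jump process
# purely in the time domain.
events, now = [], 0.0
while True:
    now += -np.log(1.0 - rng.random()) / rate    # 1-U avoids log(0)
    if now >= duration:
        break
    events.append(now)

# Render each jump as a short exponentially decaying impulse.
signal = np.zeros(int(fs * duration))
kernel = np.exp(-np.arange(256) / 48.0)          # ~5 ms decay (assumed)
for e in events:
    k = int(e * fs)
    end = min(k + kernel.size, signal.size)
    signal[k:end] += kernel[: end - k]

print(f"rendered {len(events)} jump events into {signal.size} samples at {fs} Hz")
```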

Yon also talks about the most cutting-edge haptics research, including what the best research-grade machines are capable of, such as simulating force feedback at a sampling frequency of 5,000 to 10,000 samples per second with a desired latency of just one sample.
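For a feel of what a one-sample latency budget means at these rates, here is a rough sketch of a fixed-rate servo loop. Python is not a real-time language and a real controller would run on dedicated hardware or a real-time thread, so this only illustrates the arithmetic: at 10 kHz, each update, including force computation and device I/O, must finish within 100 microseconds. The stiffness value and placeholder position are assumptions.

```python
import time

RATE_HZ = 10_000                 # 10 kHz servo rate; 5 kHz would give a 200 us budget
PERIOD = 1.0 / RATE_HZ           # one-sample budget: 100 microseconds

def spring_force(position_m, k=500.0):
    """Toy 1-D virtual spring (k in N/m); a real loop would read the device encoder here."""
    return -k * position_m

misses = 0
next_tick = time.perf_counter() + PERIOD
for _ in range(RATE_HZ):         # simulate one second of updates
    force = spring_force(0.001)  # placeholder position (assumed); force would go to the motor
    if time.perf_counter() > next_tick:
        misses += 1              # this update blew the one-sample latency budget
    while time.perf_counter() < next_tick:
        pass                     # busy-wait; OS sleeps are far coarser than 100 us
    next_tick += PERIOD

print(f"missed {misses} of {RATE_HZ} deadlines at {RATE_HZ} Hz")
```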

He also discusses the frequency bandwidth requirements of other haptic subsystems, including the detection of temperature changes and pressure changes, and the faster channels of our sense of touch.

Here’s an excerpt from Yon’s RE Touch Lab describing the overall research focus:

What do we feel when we touch and manipulate objects in the world?

How do mechanical signatures of contact elicit conscious percepts of touch?

How is touch perception enabled through structural and functional specializations in the body?

In our lab, we study mechanisms underlying haptic perception and action, and the neuroscientific and biomechanical basis of touch. Our long-term goal is to uncover the biological (neural and mechanical) computations that enable haptic interaction, when movement-dependent sensory signals are concurrently available via multiple perceptual channels.

We conduct theoretical and behavioral studies of haptic perception to illuminate contributions of different mechanical cues and motor behaviors. We aim to develop novel hypotheses about how real haptic objects are perceived, and how they can be simulated with new technologies.

Yon says that the two biggest haptics conferences in the US are the IEEE World Haptics Conference, which just happened June 22-26 in Illinois, and the IEEE Haptics Symposium 2016, which is coming up next year.

After talking to Yon and other researchers at the IEEE VR conference, it's pretty clear to me that haptics is going to be very use-case specific. The idea of a generalized VR haptic solution that's affordable to consumers feels a long way out, perhaps 5-15 years, especially if you imagine trying to achieve a 5-10 kHz sampling rate combined with less than 100 microseconds of latency. In the meantime, we can get the most out of task-specific haptic devices that provide just enough haptic feedback to give that extra amount of presence and immersion.

Become a Patron! Support The Voices of VR Podcast on Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.998] Yon Visell: My name is Yon Visell. I work generally in haptics, but I'm here at IEEE VR this year. And I guess the theme of my talk this year is that there are a lot of really complex natural processes, interactions with natural materials, plant fibers, granular materials and things like this, that are really physically complex, and consequently challenging to simulate interactively in virtual environments, but that nonetheless evoke really simple perceptual qualities, really almost unidimensional properties. So the crunch of snow under our foot, or the breaking of a branch in our hands. So there's this notion that there is some smaller gestalt there, some simpler characteristic of these processes that we should be able to capture. And actually, my own background is in physics originally, so my first impulse is sometimes to think, well, what would be a minimal physical model for capturing these? And there's a practical motivation for asking that sort of question as well. If we try to simulate fracture or realistic friction in real time, we find that's really challenging to do: we either trade off realism or trade off real-time performance. So if we can develop physically accurate models that capture the physical phenomena involved efficiently, we can improve performance (I work a lot in haptics, and stability is a big issue there; sometimes that's an issue in visual rendering as well) and still preserve the perceptually salient properties of these processes. So the work I presented today is about fracture in fiber composite materials, things like wood, and how we can describe the breaking of a branch or something like that through a relatively simple stochastic process that can be very efficiently simulated.

Kent Bye: Yeah, I think the thing that was really striking to me is that there's been a lot of work in terms of creating these mathematical models to simulate, let's say, a piece of wood. And then when you break it, there are many thousands of different fibers that each have their own model. But you can still create a probability distribution of how they act over time and convert these very complicated equations into something that you can compute in real time. And I was struck by how accurately it's possible to do that. And, you know, that's being spurred by this desire to put these elements within a virtual reality environment, to have this real-time computing. And so I just had this impression that, oh, wow, there are going to be a lot of these types of well-established physical models being converted and approximated so that we can simulate them in real time. So I don't know if you see the same thing.

Yon Visell: Yeah, so the natural world's really complex. There's a lot of really interesting physics out there. And virtual environments have come along tremendously since I started interacting with the field, and nowadays what's possible to render visually, and sometimes acoustically or haptically, is incredible. And so there's a lot more interest, I think, in trying to reproduce those properties of the natural world that give it its really familiar quality. You know, take the simple act of sliding a finger on any surface. If you record it with a sensor, like your microphone here, what results? You have a very rich and almost quasi-random signal that the nervous system is really well adapted to extracting information from.
So you can really readily discriminate touch surfaces by texture. Fabrics: you walk into a fabric store and there's a real immediacy to discriminating these things via touch. These really subtle physical phenomena can carry a huge amount of meaning, and I think that's a really interesting avenue for further development in VR.

Kent Bye: And so maybe you can make the connection between the work that you're presenting here and some of the other work that you're doing in haptics, and how those two are connected.

Yon Visell: Yeah, so this is part of my overall research thrust. Generally, I'm trying to answer the question of what it is that we feel when we touch objects in the world, and how we can reproduce that feeling with new technologies. Part of "what do we feel" refers to the mechanical processes that give rise to touch interactions, and part of that is also related to the stimuli felt by the skin when we touch objects in the world. So we do some modeling, like the work that I presented here today, to try to capture the physical phenomena. We also develop instrumentation in my lab to measure, in completely new ways, what is actually felt by the hand when we touch an object. If you grasp an object like your microphone, or you manipulate it in your hand, you get really complicated patterns of deformation of the skin. And the brain manages to integrate this information with movements of your hand to build a really strong and stable percept: oh, this is a shape, this is the shape of a microphone. It's long, it's got a squishy thing at the end, et cetera. How that process unfolds is essentially unknown, and in fact what the basic inputs to that process are is unknown. So what is felt by the skin when we touch objects? We develop hand-worn sensors for measuring what people feel when they grasp objects, for measuring how mechanical information propagates throughout the hand, and for capturing movements of the hand as people interact with objects in the world. We're using these to first characterize what people feel when they touch objects in the world, and then, once we know that, we want to know what parts of those signals are really perceptually meaningful and how we can reproduce them. Eventually, we'd like to have displays for the sense of touch that are as perceptually immediate and vibrant and rich as the visual displays we have today. And that vision's far off for now.

Kent Bye: Yeah, it seems like what I'm sensing here at the IEEE VR conference is that there's this phenomenon of the uncanny valley, going from low fidelity to middle fidelity to high fidelity. It seems like the haptics that I've seen work really well are the really low fidelity ones, like a very simple rumble. But once you start to get into the uncanny valley, that middle fidelity, it starts to feel awkward and just not believable at all, almost like it's better to either do nothing or go down to low fidelity. But there seems to be this striving to get to that higher-fidelity type of haptic response. Or, alternatively, to do a very specific use case, using haptic devices that are for one specific thing rather than a generalized haptic device. So, from your perspective, I'm curious how you see that playing out in terms of this uncanny valley of haptics and where you see it going from here?
Yon Visell: I think part of that just relates to the challenge of the costs involved today in reproducing haptic stimuli with what are basically research devices. You know, we have devices in my lab that are able to produce really crisp-feeling, really plausibly realistic haptic sensations in interaction with a virtual environment, provided we're willing to mediate that interaction via a sort of virtual tool. So you're manipulating a tool; you can do that really well now. But the devices that can do that well are out of reach for consumers: they're in the tens of thousands of dollars range. And at the same time, you're looking at severe compromises in the mode of interaction. If I'm only allowed to touch an environment via a rigid tool, that's a huge constraint compared to the naturalness with which I can use my hands in everyday environments. So in order to achieve fidelity today, you're looking at making severe functional trade-offs. And I think we'll see improvements on both sides in the future. People will continue to improve very narrowly tailored haptic devices, but we're also seeing a lot of interest in a wider range of devices: really high-fidelity touch feedback from touchscreens, where you have a flat surface providing active touch feedback that doesn't feel like a buzz but feels like a surface texture, a natural surface texture, even a material or a softness. And other sorts of devices worn on the body are starting to come out. We're working toward that in my lab as well, as are many other people, and I'm optimistic that we'll get there. But the visual community, and to some extent the auditory community, has a big head start. So in haptics, we're doing our best to catch up.

Kent Bye: Do you imagine something like a haptic glove of sorts, with 6DoF interactions, to be able to get actual force feedback or the sensation of actually touching something? Or, you mentioned this screen, would you put your hand on this flat screen and then feel these touches? Maybe you could talk a bit about what you see as the optimal haptic interface moving forward.

Yon Visell: Oh, I think it's going to be application-specific. The question is analogous, to me, to asking whether we're going to use a head-mounted display like the Oculus Rift, or a mobile device, or a flat-panel display. It really depends on the context. But clearly there's a lot of opportunity in wearable haptics and wearable devices for the whole hand. One challenge is that the sensory modality in the hand isn't really six-DOF, it's almost infinite-DOF, so in a sense we have to reproduce touch images for the whole hand to reproduce touch experiences. And that has to be done at speeds that are actually much faster than what we would need for the eye, just because the sense of touch is a lot faster than vision. So there's a lot of work to be done, but I think the fact that you have these different models for haptic interaction these days is really promising. It means that people are enlarging their conception of what it means to interact haptically with objects, and from an application side or a commercial side, products will find very different niches moving forward.

Kent Bye: And can you speak a little bit in terms of what's happening cognitively when we're experiencing haptic feedback?

Yon Visell: Oh, it's hard to say. I reached out, I touched it, and it felt like this. So generally we have mechanical stimuli at the skin.
Those are transduced by specialized receptors in the skin; there are more types of those than we have in the eye. They're transduced into electrical signals that are propagated through the nervous system to the brain, where they receive more processing and are eventually integrated with information from the motor system. So typically, passively supplied touch is not so meaningful. The really rich touch sensations we have in the world come when we're actively reaching out and manipulating things, touching things, performing interactions with our hands. So it's really this integration of sensation and motor activity that's at the heart of haptic experiences.

[00:10:55.438] Kent Bye: And with the Oculus Rift head-mounted display and Valve's HTC Vive, it seems like they're aiming for somewhere between 90 and 120 Hz, let's say. For haptics, what type of frequency of an update rate would be sufficient to do haptic feedback for the hands?

[00:11:05.051] Yon Visell: It depends on which haptic subsystem we're talking about. There's a subsystem that's sensitive to temperature, or to heat exchange, and that one's relatively slow: something with a bandwidth of a few hertz, or a few frames per second, might be adequate. Then there are subsets of receptors that are sensitive to pressure in the skin, and there you're talking about maybe tens of hertz. And you can extend this hierarchy up to, oh, a kilohertz or so. That would be the physiological limit for touch sensation, roughly 1,000 frames per second. If we're rendering forces, though, and we want to do that really precisely, typically a frame rate about a factor of 10 higher is required in order to ensure artifact-free reproduction and stability. This issue of stability is really particular to the haptic modality, because if we're presenting forces, then typically we're accommodating displacements, and so there's a continuous exchange between, say, the hand and the haptic force-feedback device. In practice, in order to do that in a way that feels really high fidelity and crisp, you want a sampling rate that's at least 5,000 frames per second, 5 kilohertz, and preferably 10 kilohertz.

Kent Bye: Oh, wow. Are there actual haptic devices that can achieve that type of fidelity of force feedback?

Yon Visell: Yeah, I guess the Novint Falcon is probably in the few-kilohertz range. That's a consumer-available haptic interface; the quality of that device is really limited by the manufacturing quality, the materials and things like that. Another device we have in my lab that works in a similar way, a commercial device from Force Dimension, will operate in the 5 to 10 kilohertz range. So we're accustomed to that in haptics. And it's not even the full story, because in order to stably simulate haptic interactions at such rates, we also need really low latency. Latency will introduce instability, so typically we ask for no more than one sample of latency. That means perhaps some 100 microseconds of latency, really small. I also work in audio quite a bit, and in audio you have high sample rates, tens of kilohertz, perhaps 48 kilohertz, but latency is usually less of an issue.

Kent Bye: Yeah, I noticed that in your physics model you had a sampling frequency of 48 kHz. Was that because of the audio, or was the actual physical simulation going that fast as well?

Yon Visell: They were both running that fast, the physical simulation and the audio simulation. With this model I presented, we're able to solve the physics really, really fast, using methods that are similar to what's used in sound synthesis. So running at audio rates is no problem.

Kent Bye: Wow, awesome. And what are some of the best haptic devices that you've ever seen?

Yon Visell: Oh, I don't know. I really like the Force Dimension commercial devices. Like I said, they're constrained, tool-mediated interaction, but really, really good designs: low mass, high stiffness, so they have a very crisp feel to them. A lot of the devices have been around for a long time. There's the Butterfly, a small-workspace device that's mainly been used for dental applications, really high bandwidth. I like things with good temporal resolution myself, just so that everything feels crisp, but yeah, there's a lot out there.

Kent Bye: And, you know, we're at IEEE VR, and there's the Haptics Symposium. Is that where the latest and greatest haptics that you've seen is displayed, at these IEEE haptics conferences?

Yon Visell: Yeah, so IEEE World Haptics, which is this year in Chicago, is the sort of flagship forum. The IEEE Haptics Symposium runs in alternate years with that conference, and both are preeminent conferences for presenting haptics research. That's certainly where you'll see the latest haptics research. Here in Europe, the EuroHaptics conference plays a similar role.

Kent Bye: And finally, what do you see as kind of the ultimate potential for virtual reality in combination with haptics?

Yon Visell: Oh, well, everybody wants a holodeck, right? So you want to be able to not only enter an immersive virtual environment, but to interact with it, to touch things and move around. And if we had to live in a world in which we couldn't touch and feel anything, our lives would be pretty impoverished. So if we want to create virtual experiences that are anywhere close to as compelling as the real world, we're going to have to bridge that gap.

Kent Bye: OK, great. Thank you.

[00:15:45.369] Kent Bye: Thank you. And thank you for listening. If you'd like to support the Voices of VR podcast, then please consider becoming a patron at patreon.com slash Voices of VR.
