#166: Yon Visell on Haptics research & real-time simulation of fracturing fiber composite materials

Yon Visell is an assistant professor at Drexel University working on haptic display engineering at the RE Touch Lab. His research aims to uncover the neuroscientific and physical basis of human tactile sensation and perception. It was in the process of trying to model physical sensations in the hands that he started looking at stochastic physics models, which have random probability distributions that can be analyzed statistically but not predicted precisely.

At IEEE VR, Yon presented a long paper titled "Fast physically accurate rendering of multimodal signatures of distributed fracture in heterogeneous materials." In essence, he's able to model what it sounds and feels like when wood breaks, at a sampling frequency of 48 kHz. There are thousands of fibers that could be modeled individually, but that would be too computationally intensive for a real-time physics simulation. There is, however, a fairly consistent probability distribution for how these fibers fail, and it can be approximated with remarkable accuracy.
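To make that idea concrete, here's a minimal sketch of a classic equal-load-sharing fiber bundle model, the kind of statistical lattice model of fracture this line of work builds on. It's only an illustration of why thousands of random fiber failures collapse into a predictable distribution of avalanche sizes; it is not the algorithm from Yon's paper, and the Weibull threshold distribution and all parameters are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20_000
# Random failure threshold for each fiber (Weibull is a common modeling
# assumption here, not a value taken from the paper).
thresholds = np.sort(rng.weibull(2.0, size=N))

avalanche_sizes = []
k = 0              # number of fibers broken so far (the weakest k are gone)
peak_force = 0.0   # largest external force applied so far

while k < N:
    # Quasi-static loading: raise the external force just enough to break
    # the weakest surviving fiber. With equal load sharing, fiber k feels
    # stress F / (N - k), so it breaks when F = thresholds[k] * (N - k).
    peak_force = max(peak_force, thresholds[k] * (N - k))
    start = k
    k += 1
    # Redistributing the load onto fewer fibers can trigger a cascade.
    while k < N and thresholds[k] * (N - k) <= peak_force:
        k += 1
    avalanche_sizes.append(k - start)

avalanche_sizes = np.array(avalanche_sizes)
print(f"{len(avalanche_sizes)} avalanches; largest burst broke "
      f"{avalanche_sizes.max()} fibers at once")
```

Each avalanche is a burst of fibers failing together, and it's the statistics of those bursts, rather than the state of every individual fiber, that shape the audible and tactile crackle of a breaking material.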

Here’s the abstract of his paper:

Abstract: This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
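The abstract mentions the inverse transform method of random number sampling. As a rough illustration of that general technique (not the specific construction from the paper), the sketch below maps uniform random numbers through an assumed inverse CDF to produce a purely time-domain jump process at a 48 kHz sampling rate; the Pareto stress-drop distribution and event rate are made-up parameters for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

FS = 48_000         # output sampling rate (Hz), matching the audio rate above
DURATION = 0.5      # seconds of signal to synthesize
EVENT_RATE = 200.0  # assumed mean rate of fracture events (events/second)

def stress_drop(u, alpha=1.8, s_min=1e-3):
    """Inverse transform sampling: push a uniform number u in [0, 1) through
    the inverse CDF of an assumed Pareto (power-law) stress-drop distribution."""
    return s_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

n = int(FS * DURATION)
signal = np.zeros(n)
for i in range(n):
    # Per-sample Bernoulli trial approximating a Poisson process: most samples
    # are silent, and occasionally a fracture event injects a jump.
    if rng.random() < EVENT_RATE / FS:
        signal[i] = stress_drop(rng.random())

# `signal` is a sparse, purely time-domain stochastic jump process that could
# be filtered and routed to both audio and vibrotactile (haptic) outputs.
print("events:", int(np.count_nonzero(signal)), "largest jump:", float(signal.max()))
```

In the paper itself, the statistics of those jumps are derived from the statistical lattice model of fracture rather than hand-picked, but this sample-by-sample structure is what makes interactive synthesis at audio sampling rates feasible.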

Yon also talks about the latest cutting-edge haptics research, including what the best research-grade machines are capable of, such as rendering force feedback at a sampling rate of 5,000 to 10,000 samples per second with a desired latency of a single sample.
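To put those numbers in perspective: at 10,000 samples per second, a one-sample latency leaves a budget of only about 100 microseconds to sense, compute, and actuate. Below is a minimal sketch of that loop structure, assuming hypothetical read_position and write_force device functions and a toy virtual-wall force model; real devices run this on real-time hardware or an RTOS rather than in Python on a general-purpose OS.

```python
import time

SAMPLE_RATE = 10_000          # samples/second, the upper end of the range Yon mentions
PERIOD = 1.0 / SAMPLE_RATE    # 100 microseconds of budget per sample
print(f"per-sample budget: {PERIOD * 1e6:.0f} microseconds")

def read_position():
    """Hypothetical stand-in for reading the device's position sensor."""
    return 0.0

def write_force(force):
    """Hypothetical stand-in for commanding the device's actuator."""
    pass

def virtual_wall(x, stiffness=200.0, wall=0.0):
    # Toy force model: push back proportionally to penetration into a virtual wall.
    return -stiffness * (x - wall) if x > wall else 0.0

next_deadline = time.perf_counter()
for _ in range(1000):                 # a short burst of the control loop
    x = read_position()               # sense
    write_force(virtual_wall(x))      # compute and actuate within the same sample period
    next_deadline += PERIOD
    while time.perf_counter() < next_deadline:
        pass                          # busy-wait; real systems use an RTOS or dedicated hardware
```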

He also discusses the frequency bandwidth requirements of other haptic systems, including the detection of temperature changes, pressure changes, and our sense of touch.

Here’s an excerpt from Yon’s RE Touch Lab describing the overall research focus:

What do we feel when we touch and manipulate objects in the world?

How do mechanical signatures of contact elicit conscious percepts of touch?

How is touch perception enabled through structural and functional specializations in the body?

In our lab, we study mechanisms underlying haptic perception and action, and the neuroscientific and biomechanical basis of touch. Our long-term goal is to uncover the biological (neural and mechanical) computations that enable haptic interaction, when movement-dependent sensory signals are concurrently available via multiple perceptual channels.

We conduct theoretical and behavioral studies of haptic perception to illuminate contributions of different mechanical cues and motor behaviors. We aim to develop novel hypotheses about how real haptic objects are perceived, and how they can be simulated with new technologies.

Yon says that the two biggest haptics conferences in the US are the IEEE World Haptics Conference, which just happened June 22-26 in Illinois, and the IEEE Haptics Symposium 2016, which is coming up next year.

After talking to Yon and other researchers at the IEEE VR conference, it's pretty clear to me that haptics is going to be very use-case specific. A generalized VR haptic solution that's affordable to consumers feels a long way out, perhaps 5-15 years, especially if you imagine trying to achieve a 5-10 kHz sampling rate combined with less than 100 microseconds of latency. In the meantime, we can get the most out of task-specific haptic devices that provide just enough haptic feedback to give that extra amount of presence and immersion.

Become a Patron! Support The Voices of VR Podcast on Patreon

Theme music: “Fatality” by Tigoolio

Subscribe to the Voices of VR podcast.