SynTouch has created a system that can quantify the sense of touch across fifteen different dimensions, which they call the SynTouch Standard, and they’re one of the most impressive haptic start-ups that I’ve seen so far. SynTouch isn’t creating haptic displays per se, but they are capturing the data that will be vital for other VR haptic companies working towards a display that’s capable of simulating a wide variety of different textures. SynTouch lists Oculus as one of their partners, and they’re also providing their data to a number of other unannounced haptic companies.
LISTEN TO THE VOICES OF VR PODCAST
I had a chance to talk with Matt Borzage, head of development and one of the co-founders of SynTouch, at CES, where we talked about the 15 different dimensions of their SynTouch Standard across the five major areas of Texture, Compliance, Friction, Thermal, and Adhesive. This research was originally funded by DARPA in order to add the sense of touch to prosthetics, and the founders have backgrounds in biomedical engineering. But their mechanical process of objectively measuring the different dimensions of textures has a lot of applications in virtual reality, since it creates a baseline of input data for haptic displays.
Here’s a comparison of denim and a sponge across the 15 dimensions of the SynTouch Standard:
SynTouch has found a great niche in the haptics space: they can already provide a lot of insight and value to a number of different companies looking at the ergonomics of industrial design, and they’re a company to watch in the VR space as more and more haptics companies try to solve some of the hardest engineering problems around creating a generalized haptic device for VR.
https://www.youtube.com/watch?v=T2qrcoqFg-M
Support Voices of VR
- Subscribe on iTunes
- Donate to the Voices of VR Podcast Patreon
Music: Fatality & Summer Trip
Rough Transcript
[00:00:05.452] Kent Bye: The Voices of VR Podcast. My name is Kent Bye, and welcome to The Voices of VR Podcast. So when I was at the Consumer Electronics Show, I was roaming around the expo floor looking for different companies that were catching my attention for one reason or another. And when I came across Syntouch, the thing that really jumped out at me was that they had this machine that had this fingertip and it was connected to an arm and it was like moving it back and forth over these variety of different textures. And I had this picture of this spider plot that was translating these textures into a quantified number on a 15-dimensional scale. So I started just talking to one of the co-founders about what they were doing and why, and immediately just saw the application for how the input data that they were collecting is going to be sent into what is eventually going to be a virtual reality haptic device. So on today's episode, I talked to Matt Borzage. He's the head of development and one of the co-founders of Syntouch. So we talk about quantifying the sense of touch across the 15 different dimensions of the Syntouch standard on today's episode of the Voices of VR podcast. But first, a quick word from our sponsor. Today's episode is brought to you by the Silicon Valley Virtual Reality Conference and Expo. SVVR is the can't miss virtual reality event of the year. It brings together the full diversity of the virtual reality ecosystem. And I often tell people if they can only go to one VR conference, then be sure to make it SVVR. You'll just have a ton of networking opportunities and a huge expo floor that shows a wide range of all the different VR industries. SVVR 2017 is happening March 29th to 31st. So go to VRExpo.com to sign up today. So this interview with Matt happened at the Consumer Electronics Show that was happening in Las Vegas from January 5th to 8th, 2017. So with that, let's go ahead and dive right in.
[00:02:10.521] Matt Borzage: My name's Matt Borzage. I'm the head of development at Syntouch, also one of the co-founders. And what we do is essentially collect all of the haptic information that you might want to display to somebody who's building a VR haptic system.
[00:02:23.528] Kent Bye: Great. So why did you start to build this? What is the initial preliminary use case for this?
[00:02:28.710] Matt Borzage: That's a great question. The initial use case actually came from DARPA. About 15 years ago, DARPA wanted to create Luke Skywalker's prosthetic arm. They wanted it to have all the same capabilities as an arm, including to have the sense of touch just like a human fingertip can. So we looked at the human fingertip, were inspired by the different things that the human finger can sense, and then basically went to the drawing board to create a sensor that could do all the same measurements.
[00:02:54.025] Kent Bye: And so in talking to different people, I've heard different people say that essentially that if you're able to replicate the sense of touch on your hands and your fingertips especially, then you can get a lot of the haptic feeling, kind of trick the brain enough to feel like you're actually touching something. So maybe you could break it down a little bit in terms of like, how do you start to quantify what the fingertip can do?
[00:03:16.885] Matt Borzage: That's a great question. Quantifying what the fingertips feel is actually a very specific application for VR haptics. The fingertips and the lips are made of what's called glabrous skin. That's the skin that's basically hair-free. Hairy skin over the rest of your body and glabrous skin can sense very different things and they could be used different ways to cause different sensations in a VR experience. The glabrous skin on your fingertips is extremely sensitive and is what you would typically use in order to identify an object that you're touching. But it might also be very important to be able to stimulate the hairy skin. It's what you'd feel if somebody was giving you a hug.
[00:03:54.665] Kent Bye: Great. So you have a model here with 15 different dimensions. And maybe you could start to break down a little bit, maybe at a high level, and then maybe get into all the different dimensions of haptics and tactile touch.
[00:04:08.335] Matt Borzage: Sure, I'd be glad to break down the dimensions that we have. So, we call this the Syntouch Standard, and the Syntouch Standard has 15 different dimensions. They cover the entire range of what you can feel, ranging from textures to compliance, thermal, adhesive, and frictional properties. And within each of those areas, there's actually sub-dimensions. So if you take, for example, the textures, your brain perceives textures of objects that have small features, one millimeter and under, and large features, over one millimeter, very differently. And so when we process our data, we make sure that we capture those sorts of information separately, so that if somebody's able to build a very high-fidelity haptic rendering system, a haptic display, they would actually be able to capture and use that information to excite the finger in different ways to simulate both of those.
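To make the shape of that taxonomy concrete, here is a minimal sketch, in Python, of how a SynTouch-Standard-style profile could be organized: five areas, each with a handful of sub-dimensions, and a per-material score on each dimension. The grouping into areas follows what Matt describes above, but the individual dimension names and this particular split into sub-dimensions are illustrative assumptions, not SynTouch's actual labels.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical grouping of dimensions into the five areas described above.
# The dimension names are illustrative placeholders, not SynTouch's labels,
# and they do not cover all 15 dimensions of the actual standard.
AREAS: Dict[str, List[str]] = {
    "texture":    ["fine_roughness", "coarse_roughness"],  # features under vs. over ~1 mm
    "compliance": ["stiffness", "conformance"],            # spring-like resistance vs. wrapping around the finger
    "friction":   ["sliding_resistance", "stick_slip"],
    "thermal":    ["thermal_transient", "thermal_persistent"],
    "adhesive":   ["tack"],
}

@dataclass
class TouchProfile:
    """One material's scores, each on a 0-100 scale (0 = absence of the
    stimulus, 100 = the strongest presence of that experience)."""
    material: str
    scores: Dict[str, float] = field(default_factory=dict)  # dimension -> 0..100

    def area(self, name: str) -> Dict[str, float]:
        """Return just the sub-dimensions belonging to one of the five areas."""
        return {d: self.scores[d] for d in AREAS[name] if d in self.scores}
```

Comparing two materials is then essentially comparing these vectors dimension by dimension, which is what the denim-versus-sponge spider plot shown earlier in the article is doing visually.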
[00:04:56.140] Kent Bye: Great. And so maybe you could go through some of the other, I'm just really fascinated that you've created essentially a taxonomy of all the different ways right now you're teaching a machine to be able to touch and be able to quantify and what is essentially like this circular plot and a number that's on there. But maybe you can talk about some of those other subdivisions of each of those major.
[00:05:16.737] Matt Borzage: I'd be happy to. A great example of a nuanced feel that you'll immediately understand is if you think about cooling. So if they had a large piece of metal in front of you, say a table leg, and you were to touch it, it would immediately feel very cool, and over time it might start to feel a little bit warmer, and it would start to feel warmer because your hand's actually adding heat to that material. And so you need to both have a transient experience, the way it feels initially when you put your hand on it, and also a persistent experience. So saying something feels warm or cool will actually change depending on how long your hand's been in contact with it. So we've broken down each of the different very broad categories that people will generally describe, things like friction and compliance, into all the individual things that, if you concentrate really hard, they're able to perceive the difference of. It might not be necessary to actually replicate all of those, and I think identifying illusions of the brain and the ways that we can trick the brain into experiencing things that we would like it to are great avenues for being able to create virtual reality experiences.
[00:06:19.793] Kent Bye: Yeah, I think friction is a big one that I think is pretty clear and that's the resistance that you're getting as you're rubbing your finger over an object. Are there other ways that you break down friction?
[00:06:30.363] Matt Borzage: Yeah, actually friction is surprisingly complicated. And if you go with a classic engineering measurement of friction, you might identify something as a piece of glass as being slippery. And in fact, with the classic engineer's measurement of friction, it is slippery. But a human fingertip is not the same probe that an engineer would typically use when they were to explore that piece of glass. And the sensations that you get when you run your hand over a piece of glass will change depending on the hydration state, basically how sweaty your hand is. If your hand's very dry and you put it on a piece of glass and then you drag your hand across it, it's going to feel incredibly high friction, coarse, rough, unpleasant. If your hand is at all sweaty and you run your hand across it, it might feel very smooth. The temperature of your hand and the material will also change the way that you would perceive those sorts of things. As your hand becomes cold, it actually becomes slightly less compliant. And so what is compliance? So again, if you were to ask an engineer, and I am one, originally how they would define compliance, they'd think about things like springs, and they'd say, well, I'll deform it a certain amount and it will resist a certain amount, and therefore I can define a rigid or a stiff spring or a soft spring. But if you ask everybody else what they would consider to be soft, they might describe something more like a pillow. So there's actually aspects of compliance that people will perceive that involve how the material will wrap around their fingertip when they press into it. I see.
[00:07:48.399] Kent Bye: And so, what about adhesive? What is adhesive?
[00:07:51.721] Matt Borzage: Actually, that's one of the more straightforward ones. Adhesion as we define it, and as people perceive it, seems to be essentially how much force a material will attempt to hold your hand down with when you're trying to draw it straight up from that. So there again, it's a dimension which may or may not be critical to creating a realistic virtual reality experience, but it's one that we can currently capture with our system. And if it is necessary, then we're going to have to figure out a really clever way of recreating that for the VR experience.
[00:08:18.876] Kent Bye: Yeah, and you have a couple of the dimensions for thermal. Is that what you were talking about earlier in terms of the length of time that you have your hand on something changes your perception of it?
[00:08:28.387] Matt Borzage: Yeah, so you could look at a piece of metal, a very thin piece of foil over, say, the top of a piece of foam, and it would look identical to a very thick block of aluminum. But when you touch them, you'd be able to very quickly discern the thin foil because your hand would warm it up almost instantly. So that's the transient effect. Transiently, it would feel the same as the thick block, but the persistence of the thermal experience will change depending on whether or not you have a thin or thick piece of metal.
[00:08:55.907] Kent Bye: Right, so we have an image here that I'm looking at, and there seems to be like a number from 0 to 100% on each of these different dimensions. What am I looking at here, and how would you describe this texture here with these different numbers?
[00:09:09.074] Matt Borzage: So, without looking at it, basically it looks like a spider plot or a radar plot. So we have 15 radial dimensions. They range from 0 to 100. 0 represents the absence of an exciting stimulus, and 100 represents the maximum possible presence of that experience. So you can imagine, say, that if you had something that was essentially friction-free, you'd want it to be a zero, or if it was extremely rough as you slid your hand across it, that would correspond with one of our 100 values. These are absolute dimensions. It is somewhat difficult to compare across very widely different materials, but if you had similar ones, like different kinds of wood, they'll all live in sort of the same general space, and then the differences between them will actually be very important.
[00:09:50.261] Kent Bye: So how do you calibrate the 0 and 100% point? How do you determine what's 0 and what's 100?
[00:09:56.722] Matt Borzage: Determining the range of each of these dimensions is actually a really good question. About half of them are based on our actual experience. We've tested thousands and thousands of materials at this point. Everything from, say, a contact lens to a brick, wood, aluminum, people's skin and hair, touchscreens, sort of you name it. And so some of our dimensions are based on, in our experience, what the maximum and minimum values that we've ever encountered are. Other dimensions are actually based on the theoretical possible values of an algorithm that we're using to compute that dimension. So it really can't go below zero or above 100.
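As a rough illustration of that calibration, here is a minimal sketch in Python, with made-up numbers and function names, of the two approaches Matt describes for mapping a raw measurement onto the 0-100 scale: empirically, from the minimum and maximum values observed across the library of materials tested so far, or from the theoretical bounds of the algorithm that computes the dimension.

```python
def scale_empirical(raw: float, observed_min: float, observed_max: float) -> float:
    """Map a raw measurement onto 0-100 using the extreme values encountered
    across all materials tested so far (the experience-based calibration)."""
    span = observed_max - observed_min
    if span <= 0:
        return 0.0
    score = 100.0 * (raw - observed_min) / span
    return max(0.0, min(100.0, score))  # clamp in case a new sample exceeds the old extremes

def scale_theoretical(raw: float, algorithm_max: float) -> float:
    """Map the output of an algorithm that is already bounded between zero and
    a known maximum onto 0-100 (the theory-based calibration)."""
    return 100.0 * raw / algorithm_max

# Made-up example: a friction-like raw reading of 0.62 against observed
# extremes of 0.05 and 0.90 lands at about 67 on the 0-100 scale.
print(round(scale_empirical(0.62, 0.05, 0.90)))  # -> 67
```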
[00:10:32.097] Kent Bye: I see. And so we have a number of different materials here on the table, and as I run my finger over some of them, one's rough and one's kind of fuzzy here, and one's like carpet. And so as you're learning about this model and then having your own lived experience, do you start to calculate in your own mind, like your own subjective approximation of how you would quantify the texture?
[00:10:54.157] Matt Borzage: You do, actually. It's very interesting. I think, and you mentioned earlier, when you develop a taxonomy for something, I believe you become a little bit more attuned to those characteristics. So, if you really don't have the vocabulary for colors, then you aren't going to have a very nuanced appreciation for the different spectrum of things that you might see. Similarly, yeah, we've spent a lot of time putting samples of interesting materials and boring materials out on the table, feeling them, and then describing the ways that they feel different to us while we were developing the Syntouch standard.
[00:11:24.384] Kent Bye: So then you use this machine to then also quantify it. So it sounds like you're matching up your subjective experience with a machine that can kind of quantify that and then presumably give it any texture and then be able to come up with a similar profile.
[00:11:38.173] Matt Borzage: That's a very astute summary. I think that there's a variety of ways of making measurements. What really matters is if these are the same measurements that a human would perceive. You wouldn't want to use a classic measurement of friction to drive your VR display because it won't be what a human would feel when they were to rub their hand across the same material. If you were to only measure the temperature of an object briefly, then you might not appreciate how having a warm hand present on it might change the way it feels. So it really is about coming up with measurements that mirror what humans perceive more than anything else.
[00:12:10.303] Kent Bye: And you mentioned the DARPA funding. What were they trying to do with being able to replicate Luke Skywalker's prosthetic arm?
[00:12:17.212] Matt Borzage: Well, the DoD has been very interested in being able to restore full function to veterans who have lost the use of their limbs. The goal was to get beyond the very basic prosthetics that have been used for a long time into something that would really give them the full use of their limbs back. If you've ever had the experience of having a cold hand, you'll know just how clumsy your hand feels when it's cold. You haven't lost any of your excellent neuro-processing capabilities. Your hand is still actually a very dexterous manipulator. But when you lose the sense of touch, you can't operate in a closed-loop feedback system. You are now going to be very clumsy and drop objects when you're trying to manipulate them. So we knew this going into the project. DARPA knew this. And DARPA wanted the sense of touch and those reflexes that use tactile information to be present in hands so that they could operate and manipulate objects dexterously.
[00:13:08.053] Kent Bye: So is that somehow you're using this machine to get these set of numbers on a 15-dimensional model, are you then converting that into neural input that's then being directly put into the brain somehow so that people can actually feel again?
[00:13:21.766] Matt Borzage: That's a great question. So there's actually a couple of different ways that you use tactile information and a couple of different ways that people are taking the tactile information we provide and using it in prosthetics. One way is to try and consciously perceive what it is that you're touching. If you do a really good job with that, then yes, you could actually essentially beam it into somebody's head and let them feel what you are touching. There's a lot of very different strategies for doing that, which are quite difficult and we don't do at Syntouch. What we've actually been focusing more on, on the prosthetic side, is figuring out how that information is used by the spinal cord and reflexive control of the objects that you're touching. So there's perception for action, and then there's perception for information. And what we were able to sell in the Syntouch standard is that information, and what the prosthetics really need is perception for action.
[00:14:08.145] Kent Bye: Great, so what's next for Syntouch then?
[00:14:10.648] Matt Borzage: Well, we're happy to be here at CES and unveiling our commercially available instrument. We're providing this to the automotive industry, to the consumer electronics industry, personal care products, and also to people in the VR community that are interested in capturing the best data they possibly can to show it to a human.
[00:14:28.345] Kent Bye: Great. And finally, what do you see as kind of the ultimate potential of virtual reality and what it might be able to enable?
[00:14:35.728] Matt Borzage: That's a fantastic question and I think it depends on what your time frame is. Fortunately, I think I'll live to be able to see a level of fidelity in virtual reality that would surprise any of us today. I think that if we do manage to nail the haptic display problem that the advances we've already seen in vision and hearing will then be followed by what we will begin to see in the haptic side. The vision and hearing are really important for being present virtually, but if you want to interact virtually, then you're going to have to have your hand reach out and touch something. So I really hope that we get to be a part of that next step for the VR community.
[00:15:12.925] Kent Bye: And do you feel like a glove approach or, you know, I've seen some approaches where you put your hand onto something and you're getting stimulated, but you kind of took away your agency of being able to actually move around and engage in the environment. I'd imagine having some sort of glove that you're putting on that could potentially stimulate all these different dimensions. Do you foresee that as a possible outcome to be able to recreate all the different dimensions of heat and force feedback and touch and all those things that we need to really trick our brains? Do you feel like that's a practical problem that could be solved?
[00:15:45.161] Matt Borzage: It's a great question and if I had a 100% answer then I would have a company that was making haptic displays as well. My perception is that it's of course a very difficult problem that somebody is going to solve and make a lot of money doing. I think what's really exciting is the idea that we might be close enough to a plausible haptic display with some of the technologies we have right now that would allow sort of a black-and-white television equivalent of a haptic display that can show some of this information. Syntouch has already built the high-definition camera version of an acquisition system for textures, but it might be some time until we see the equivalent on the display side.
[00:16:19.985] Kent Bye: Awesome, well thank you so much.
[00:16:21.266] Matt Borzage: It was my pleasure.
[00:16:22.892] Kent Bye: So that was Matt Borzage. He's the head of development and one of the co-founders of Syntouch, which is a haptics and bioengineering company. So I have a number of different takeaways about this interview. First of all, the Syntouch standard is the first approach that I've seen that's actually trying to break down and quantify the sense of touch into these 15 different dimensions of what they call the Syntouch standard. And one of the most compelling and amazing things is that they've created this system and robot to be able to objectively measure all these different dimensions. And so they've been gathering data on thousands and thousands of different objects. And it's this data that they're going to be able to feed into what is eventually going to be a VR haptics device. And in talking to one of the co-founders and then looking at the website, they actually have listed Oculus as one of the partners that they're working with. And they also said that they've been collaborating with a lot of the leading haptic device creators, although they didn't mention any specific names. A lot of these are unannounced companies and they haven't really come out of stealth mode at all. And so there's a lot of companies that are out there that are taking the data from Syntouch, which is essentially this objective set of measurements that is trying to quantify what things feel like. And so there's a number of companies that are out there that are trying to create devices that are going to be able to actually simulate the haptic response of touch. Now, I think that there's a lot of really difficult engineering problems that are going to have to be overcome in order to actually feasibly pull this off. First of all, in talking to Eric Vezzoli of GoTouchVR, he was essentially telling me that in order to really simulate the sense of touch, you have to get to like a nanometer accuracy of being able to displace your skin on a fingertip. And each person's finger is different given their fingerprint. And so you're talking about something that's extremely complicated in terms of how you're able to do that mechanically. But it sounds like there's a number of different companies that are out there trying to solve that. And what Matt said is that if he had an idea of how to do that, he would actually be doing that already. And then if he had already figured it out, then he'd be in a position to be making a lot of money. I think right now, in the sense of the haptic ecosystem, Syntouch is generating a lot of the critical input that's eventually going to feed into creating the device and the output. Now, the interesting thing about Syntouch is that they've actually found a way to continue to work on this, but to have other outlets and applications. This line of research originally got funded by DARPA in order to create these prosthetic arms that were going to be able to replicate this sense of touch. Whether or not they're going to be able to actually translate that into neural signals remains to be seen, but for people who have lost one of their limbs in war, it might be possible at some point to create this type of synthetic arm and then have some sort of sensors on the fingertips of these arms that, when you touch something, are actually sending the proper signals to the brain to replicate what things feel like.
But it also sounds like they're collaborating with a lot of companies that are doing industrial design, and they're giving ergonomic feedback in terms of how things feel. So Syntouch is one of the discoveries that I had at CES that wasn't on my radar. I hadn't really heard of them before, but I think they're actually a critical part of the future of haptics and virtual reality. And so I think it's worth taking a look at their system of trying to quantify touch across these five major categories of texture and compliance and friction and thermal and adhesive. And within each of those, there's, you know, different subcategories, but those are the major categories that they're trying to quantify. And I think, to me, it was just great to see that somebody is actually doing the groundwork to be able to look at this. And then perhaps at some point, we'll have somebody that's able to create either a way to replicate this or to trick the mind into feeling these different types of textures. So that's all that I have for today. I just wanted to thank you for listening to the Voices of VR podcast. And if you enjoy the podcast, then please do tell your friends, spread the word, and become a donor to the Patreon. Just a couple of dollars a month makes a huge difference. So donate today at patreon.com slash Voices of VR. Thanks for listening.