#56: Ross Mead on designing social behaviors & body language for virtual human NPCs

Ross Mead studies human-robot interaction to make robots and virtual human non-player characters (NPCs) more realistic to engage with. There’s a lot of overlap between designing body language for physical robots and for NPCs, since both draw on the same principles.

Non-verbal communication is a fundamental building block of social interaction, and he talks about principles like spacing, socially-appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating the speaking voice to be louder/softer or faster/slower, head nodding, and taking turns when communicating.

He talks about how humans are always broadcasting information with everything that they do, whether speaking or not speaking, moving or not moving. Any reaction or lack of reaction communicates some meaning about whether or not you’re interested and engaged, or disinterested and not fully connecting.

Body language can tell you the nature of the relationship with someone, and being able to identify open and closed body language cues can add another layer of depth and realism to interactions with NPCs within virtual environments.

Ross says that there are a couple of ways to measure how believable your social interactions are, whether with robots or virtual avatars. There are physiological measures that come from looking at heart rate, galvanic skin response, respiration rate, and general activity like the speed and frequency of motion. But there are also traditional psychological surveys that can measure how believable or comfortable the interaction was subjectively perceived to be.
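As a rough illustration of the survey side of this, here is a small sketch of how Likert-scale responses might be averaged into a single score. The item names, the 1–7 scale, and the reverse-coding are illustrative assumptions for the example, not Ross’s actual instrument.

```python
# Hypothetical sketch of scoring a post-interaction Likert survey.
# Item names and the 1..7 scale are illustrative assumptions.

def score_survey(responses, reverse_items=(), scale_max=7):
    """Average 1..scale_max Likert items into one score.

    Items in reverse_items are reverse-coded (e.g. "violated my
    personal space") so higher always means a better-perceived
    interaction.
    """
    total = 0
    for item, value in responses.items():
        if not 1 <= value <= scale_max:
            raise ValueError(f"{item}: {value} outside 1..{scale_max}")
        total += (scale_max + 1 - value) if item in reverse_items else value
    return total / len(responses)

responses = {
    "seemed intelligent": 6,
    "felt natural to talk to": 5,
    "violated my personal space": 2,  # reverse-coded item
}
print(score_survey(responses, reverse_items={"violated my personal space"}))
```

Reverse-coding matters because a low answer to “violated my personal space” is a *good* sign, and mixing raw values would cancel it out.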

He says that the top two body language cues to implement with virtual humans would be adaptive positioning and automating co-verbal behaviors, gestures that are coordinated with speech, so that the character doesn’t feel like a robot or zombie.
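To give a flavor of what adaptive positioning might look like in practice, here is a minimal sketch of an NPC that gradually re-orients toward the player instead of talking straight ahead. The function name, the turn-rate limit, and the flat 2D setup are assumptions for the example, not anything from Ross’s system.

```python
import math

# Illustrative sketch of "adaptive positioning": an NPC that
# re-orients toward the player instead of staying oblivious.

def face_player(npc_heading, npc_pos, player_pos, max_turn=math.radians(10)):
    """Return a new heading, turning at most max_turn radians
    per update toward the player's position."""
    dx = player_pos[0] - npc_pos[0]
    dy = player_pos[1] - npc_pos[1]
    target = math.atan2(dy, dx)
    # Shortest signed angular difference, wrapped into (-pi, pi].
    diff = (target - npc_heading + math.pi) % (2 * math.pi) - math.pi
    turn = max(-max_turn, min(max_turn, diff))
    return npc_heading + turn

heading = 0.0  # NPC starts facing along +x
for _ in range(12):  # player standing off to the NPC's left
    heading = face_player(heading, (0, 0), (0, 5))
print(round(math.degrees(heading)))  # converges to 90
```

Capping the turn per update is what keeps the motion readable as a social cue rather than a snap, which is the difference between “the NPC noticed you” and a camera lock-on.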

Finally, Ross talks about the different cues for open vs. closed body language, the importance of mimicking for building rapport, and some of the ways that these techniques could be applied to provide a safe escape that’s fun and improves people’s lives. Stay tuned for more information about his company named Semio.

TOPICS

  • 0:00 – Intro – Ross studies human-robot interaction and is presenting work on getting robots to use body language and understand non-verbal communication, the building blocks of social interaction: spacing, socially-appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating the voice louder/softer and faster/slower, head nods, and taking turns when communicating.
  • 1:12 – Applies to both robots and avatars. A robot is a physically co-present NPC. The same technology could be applied in virtual worlds to make characters more engaging through body language.
  • 2:04 – Eye gaze can feel weird if it isn’t implemented in a way that feels natural. It broadcasts info and tells others what you can observe, which is connected to privacy and the nature of the relationship. Continued eye contact means “I want to see more.” Too much eye contact violates the amount of intimacy that people are comfortable with. We compensate by averting our gaze, increasing spacing, changing the frequency and duration of direct eye gazes, or perhaps crossing our arms or using a pacifier behavior of self-touching.
  • 3:31 – Measuring the psychological impact of implemented body language? Two ways. Use physiological measures like heart rate, galvanic skin response, respiration rate, general activity, and speed of motion. Or use psychological surveys with Likert scales: How intelligent did the NPC seem? Was it violating your personal space? Use these to figure out how people react to these behaviors.
  • 4:43 – Top behaviors to implement with NPCs. Positioning is the first thing to get correct, and characters will be more engaging if you adaptively use positioning. Second would be automating co-verbal behaviors, gestures coordinated with speech, so the character isn’t a robot or zombie. Then eye gaze and pointing, all of which make experiences immersive and engaging.
  • 6:26 – Pointing behaviors like pick “that” up or talk to “her,” a referencing behavior that’s fundamental to human communication.
  • 6:58 – Body language for engagement: a forward lean, increased eye contact, increased rate of speech. Disinterest shows the opposite: leaning back, attention focused elsewhere. Can’t look at these in isolation; look for combinations and clusters of behaviors, because there are other reasons a single cue might occur.
  • 8:12 – Open body language: arms not crossed, revealing the front of the body, open eyes, eyebrows up, and a smile. “Bitchy resting face” is when your idle pose scrunches up, and you have to consciously counter it. Humans are broadcasting 24/7 and need to be aware of what they’re putting out.
  • 9:34 – Mimicking body language is a fundamental component of building rapport. USC’s ICT is researching this with virtual humans.
  • 10:24 – These technologies will make our lives more fun, seen as an outlet and relief from the challenges we face during the day. A safe escape, but also a way for someone with a disability to improve their quality of life. Researchers are focusing on helping people with special needs and making the world a better place.

Theme music: “Fatality” by Tigoolio

Rough Transcript

[00:00:05.452] Kent Bye: The Voices of VR Podcast.

[00:00:11.974] Ross Mead: My name is Ross Mead. I'm a PhD student at the University of Southern California and I study human-robot interaction, particularly developing robots to help people with special needs. So we work with children with autism, people in post-traumatic rehabilitation, the elderly, people with Alzheimer's, to somehow improve quality of life. And what I was presenting here at the Immersion Conference was my work in getting robots to understand and use body language and nonverbal communication when interacting with people. Sort of the building blocks of social interaction, so things like spacing between the human and the robot, using socially appropriate eye gaze while also making sure the robot can work while it's shifting its eyes away, using gestures while the robot's speaking, or using or understanding pointing behaviors, pointing gestures, changing the way that it talks by modulating its voice, speaking louder or slower or faster, and doing things like back-channeling, which is like a head nod, or turn-taking, you know, saying, it's my turn to talk, and then your turn to talk, and getting the robot to understand and use those things when interacting with a person.

[00:01:12.010] Kent Bye: That's really interesting because it sounds like a lot of these things that you're talking about in terms of this natural interactions that are happening between humans and robots also could potentially apply to avatars within virtual worlds then.

[00:01:25.559] Ross Mead: Absolutely, yeah. None of these technologies are limited to robotics. You could think of a robot as a physically co-present NPC. Think of it as an NPC but in the real world. You could take the same kind of technology and apply it in a virtual world, in Second Life or in World of Warcraft or any game you want, really. Take that and apply it to make the characters more engaging. In fact, I will say that there is a company that's working on exactly that kind of thing, both for physical characters, but also for on-screen characters. It's a company called Semio, for which I am the CEO. So, we're working on that, trying to make it so that we can make characters more engaging, no matter where they are.

[00:02:03.936] Kent Bye: I see. So I guess one of the things that I find really interesting is that eye gaze are things that feel really awkward, that if robots or avatars that are NPCs within virtual reality worlds feels weird if they're not looking away. What is it about eye gaze and not just staring at people? Why is that a part of human body language?

[00:02:24.118] Ross Mead: Well, so eye gaze is something that, first off, it's not just us getting information, right? It's also communicating, it's broadcasting, it's sending information out to the world. It's not like eye beams or eye lasers or anything like that, but it's telling people what you can observe. And so this gets into some of the privacy discussions that we had today and just maybe the nature of our relationship. If you're making continued eye contact with me, then what you're saying is that I want to see more kind of thing. I want to continue seeing what you're putting out there, basically. And so, in those kinds of situations, if a person's making too much eye contact, it can make us feel uncomfortable. It's violating the level of intimacy in the relationship with the persons you have. And so, usually what we'll do is we'll compensate. So if someone's making increased eye contact, we'll compensate by averting our gaze more often, so we don't look at that person as often. We might change the distance between the two of us, so I might stand farther back. You can change your arm configurations, so you expect to see like arms crossing quite a bit when you see this, or what's called a pacifier behavior, so self-touching. Things to sort of calm yourself down because someone might be making you feel uncomfortable with intense eye gaze.

[00:03:30.434] Kent Bye: And so how do you measure the psychological impact of some of this nonverbal body language when you're interacting with, say, a robot or an NPC avatar, you know, how are you measuring the impact of how people are reacting to this body language?

[00:03:46.640] Ross Mead: So really, there are two really great ways to go about measuring these things. One is to use physiological sensors. So you can actually look at changes in heart rate, respiration rate, galvanic skin response, just general activity, speed of motion, those kinds of things. You could use physiological sensors to do that, but you can also use more of these, I guess, psychological metrics. surveys, questionnaires, those kinds of things, saying, hey, you just interacted with this robot. What do you think about it? And so we use like Likert scales. So on a scale from one to seven, how intelligent did you perceive the robot? Was the robot somehow violating your personal space? Or we can be very direct about the question about personal space, or we can ask other questions that are less direct, but really help us tease out what the issue was. You know, how were people actually reacting to the robot? And there are standard validated surveys that you can use, and we had to modify them maybe a little bit for the robot, but hopefully they are consistent with the validations that we've seen in human-human interactions.

[00:04:43.882] Kent Bye: And so if you were to have like a hierarchy of the top seven things that you're implementing in terms of body language, behaviors within an NPC or a robot, what would be the top seven things that you would do?

[00:04:54.968] Ross Mead: The order that I would focus on different behaviors. So it's interesting you said the top seven because there were exactly seven on the slide. I think the one that's the most important to me is actually just the positioning, getting the positioning right. So it's interesting, you think in a video game, NPCs are usually fixed in a location. What happens is your player character approaches them and puts itself directly in front of them, right? Like as the character is talking, you might start walking around, moving around, and the NPC is completely oblivious. They're talking straightforward. And so, you know, imagine two NPCs are talking to each other and you enter into the conversation. If those two NPCs actually reacted to you stepping into the conversation, I think that would be a much more engaging thing. They're showing, hey, we want you in the game. The second thing is automating the process of generating gestures with what's being said. So it's called co-verbal behavior. It's speech and gesture happening at the same time and automating that process so the character is not just a robot or a zombie standing there just not emoting at all. And then the eye gaze, I think is really important. The pointing behavior, I think is really important. I think all these things, at this point, I don't have really a ranking for them, but I think they're all important. And for us to have really, really immersive and engaging experiences with these characters, we need to tackle all those problems. What is pointing behavior? Pointing behavior is like, hey, go over there. Like, where is there? If I don't point, where is there, right? So I could say pick that up or go talk to her, right? So I'm now referring to a person. I could say in the future and point forward or in some cultures point backwards. So, pointing behaviors or something like referencing behavior is very, very important in our society.
It helps us learn, and it's something that our machines don't really understand very well. We don't think about it too much, but it's fundamental to human communication.

[00:06:55.591] Kent Bye: What are some of the body language elements that indicate engagement and interest, as well as disinterest and boredom?

[00:07:03.114] Ross Mead: Sure, so the body language you'd expect to see for some sort of engagement or interest might be more of like a forward lean kind of thing, maybe increased eye contact, even frequency of speech or rate of speech. These might be things that are saying, I'm interested in what's going on, I'm talking faster, I'm excited about what we're talking about. And so you kind of expect to see the opposite if I'm disengaged or disinterested in something. So if I'm leaning back in my chair or if I'm just doing something else, if my attention is focused elsewhere, if I'm looking at my watch or looking at my phone, looking really at anything but the target of interest here, which is the person with whom I'm interacting, that can be a sign of disinterest. Now one thing to emphasize with all of these things is that you can't look at anything in isolation. And there are other reasons why those behaviors might be happening, which is why you can't look at them in isolation. So if I lean back, right, it doesn't necessarily mean I'm disinterested. It might mean that my back is sore and I just want to sort of kick back and relax. So what I might need to look at is I lean back and I'm also looking around at other things. And I let out a big sigh or I'm yawning. So I'm looking at all these things together and as I see them all in a cluster, the likelihood of you being disinterested is going up and the likelihood of you being interested is going down.
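The cluster idea Ross describes here could be sketched as a simple weighted cue score, where no single cue is enough to flip the judgment on its own. The cue names and weights below are illustrative assumptions, not values from his research.

```python
# Sketch of "clusters, not isolated cues": each observed behavior
# only nudges a disinterest estimate, so a single lean-back (maybe
# just a sore back) never dominates by itself.
# Cue names and weights are illustrative assumptions.

CUE_WEIGHTS = {
    "leaning_back": 0.2,
    "gaze_averted": 0.3,
    "checking_phone": 0.35,
    "yawning": 0.25,
    "sighing": 0.15,
}

def disinterest_score(observed_cues):
    """Sum the weights of observed cues, capped at 1.0."""
    total = sum(CUE_WEIGHTS.get(cue, 0.0) for cue in observed_cues)
    return min(total, 1.0)

print(disinterest_score({"leaning_back"}))  # weak signal on its own
print(disinterest_score({"leaning_back", "gaze_averted", "yawning"}))
```

A real system would likely weight cues over time rather than summing a snapshot, but the point survives even in this toy form: only a cluster of cues pushes the estimate high.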

[00:08:13.058] Kent Bye: And what are some indicators of open body language versus closed body language?

[00:08:17.661] Ross Mead: Sure, so open body language would be things like I'm not crossing my arms. I've actually sort of, in some sense, revealed the front of my body to the person. So in the sense that I'm not covering up really in any way, this is communicating some level of openness. But there's also a lot in the face. open eyes, eyebrows up, those kinds of things, a smile, that kind of stuff. There's this notion of a bitchy resting face. I don't know if you've seen, there's a nice video on this kind of thing. And what it is, is this notion that when you're in that idle pose, when you're just like relaxing to yourself, some people's faces scrunch up, they look angry. And I'm one of those people, my face tends to look angry and so I actively kind of catch myself. I feel myself like frowning and my face kind of scowling and I actively pull my face up like, you know, thinking about it. I have to actively bring my eyebrows up, open my eyes a little bit more because it can be kind of off-putting to people. So I want to communicate that I'm open to things and Sometimes even when you're not doing anything, when you think you're not communicating, you're still broadcasting. A human is a transmitter broadcasting 24-7. You say something, you're broadcasting. You don't say something, you're broadcasting. You move, broadcasting. You don't move, broadcasting. So you have to be aware of what you're broadcasting, what you're putting out into the world, and sometimes you need to change it so that people don't get the wrong idea.

[00:09:34.297] Kent Bye: And what have you observed in terms of people mimicking other people's body language in terms of building up some sort of rapport?

[00:09:41.143] Ross Mead: Sure, yeah. So mimicry is one of the fundamental components for building rapport with someone, right? So there's a, I do a head nod, you do a head nod, those kinds of things. Or if I look at you or I do some sort of pacifier behavior, I touch some part of my body, usually I'd expect to see something like that in you if we're very much sort of in sync. So, there are people who are looking at this at USC's Institute for Creative Technologies, ICT. They're doing some really great work with virtual humans on looking at mimicry behavior. It hasn't been looked at too much with robots, but it's definitely something that I think that we will shift into. Pretty much everything that's going on in the virtual human community tends to also happen in the human-robot interaction community just a little bit after because we have a lot of other things to think about with a robot in the loop.

[00:10:24.456] Kent Bye: I see. And finally, where do you see all of this virtual reality, you know, what is the potential that you see that these types of technologies could bring to society?

[00:10:34.240] Ross Mead: Well, I think a lot of these technologies are going to, at the very least, make our lives more fun. I'm hoping that this is going to be an exciting thing that is seen as an outlet or a relief from any challenges that we have in a given day. And this could be you get off work and you want to escape somehow in a safe way. But this could also be someone who has some sort of disability or disorder, and they want to somehow improve their quality of life. And I think that these kinds of technologies, be it robot technology or on-screen or in-game technology, I think these things have the potential to really do that. And I think that the people who are working on them are really, really focusing in on those kinds of applications, really focusing in on helping people with special needs, because the need is there, and, you know, you want to do good, do good work, do good things, make the world a better place, and hopefully there are a lot of people working on that. I think there are. Great, well thank you. Alright, thank you, Kent.
