Ross Mead studies human-robot interaction to make robots and virtual human non-player characters (NPCs) more realistic to engage with. There's a lot of overlap between designing body language for physical robots and for NPCs, since both rely on the same principles.

Non-verbal communication is a fundamental building block of social interaction, and he talks about principles like spacing, socially appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating the speaking voice to be louder/softer or faster/slower, head nodding, and taking turns when communicating.

He talks about how humans are always broadcasting information with everything they do, whether speaking or not speaking, moving or not moving. Any reaction or lack of reaction communicates some meaning about whether you're interested and engaged or disinterested and not fully connecting.

Body language can tell you the nature of the relationship with someone, and being able to identify open and closed body language cues can add another layer of depth and realism to interactions with NPCs within virtual environments.

Ross says that there are a couple of ways to measure how believable a social interaction was, whether with robots or virtual avatars. There are physiological measures that come from looking at heart rate, galvanic skin response, respiration rate, and general activity like the speed and frequency of motion. But there are also traditional psychological surveys that can measure how believable or comfortable the interaction was subjectively perceived to be.
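To make the survey side concrete, here is a minimal sketch of scoring a post-interaction Likert questionnaire. The question names, the 1–5 scale, and the reverse-coding scheme are illustrative assumptions, not details from the episode.

```python
# Hypothetical post-interaction survey scoring: average 1-5 Likert items,
# flipping reverse-coded items so higher always means a better rating.

def score_likert(responses, reverse_coded=()):
    """Return the mean score of a dict of 1-5 Likert responses."""
    scores = []
    for item, value in responses.items():
        if not 1 <= value <= 5:
            raise ValueError(f"out-of-range response for {item}: {value}")
        # On a 1-5 scale, reverse-coding maps 1->5, 2->4, ..., 5->1.
        scores.append(6 - value if item in reverse_coded else value)
    return sum(scores) / len(scores)

# "space_violation" is reverse-coded: agreeing that the NPC violated
# your personal space should lower the overall believability score.
survey = {"seemed_intelligent": 4, "felt_natural": 5, "space_violation": 2}
print(score_likert(survey, reverse_coded={"space_violation"}))  # ≈ 4.33
```

Reverse-coded items like "it violated my personal space" keep respondents from answering on autopilot while still folding into one believability score.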

He sees the top two body language cues to implement with virtual humans as adaptive positioning and automated co-verbal gestures that are coordinated with speech, so that the character doesn't feel like a robot or zombie.

Finally, Ross talks about the different cues for open vs. closed body language, the importance of mimicking for building rapport, and some of the ways that these techniques could be applied to provide a safe escape that’s fun and improves people’s lives. Stay tuned for more information about his company named Semio.
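Adaptive positioning can be sketched with Hall's proxemic zones: the NPC nudges itself toward or away from the user to hold a comfortable "social" distance. The zone boundaries, the gain, and the function itself are illustrative assumptions, not Semio's actual implementation.

```python
# Sketch of proxemics-based NPC positioning (assumed, simplified 2D model).
import math

SOCIAL_MIN, SOCIAL_MAX = 1.2, 3.6  # meters; Hall's "social" zone, roughly

def step_toward_comfort(npc, user, gain=0.5):
    """Return a new (x, y) NPC position nudged into the social zone.

    gain is the fraction of the distance error closed per step, so the
    NPC eases into position instead of teleporting.
    """
    dx, dy = npc[0] - user[0], npc[1] - user[1]
    dist = math.hypot(dx, dy)
    if SOCIAL_MIN <= dist <= SOCIAL_MAX or dist == 0:
        return npc  # already comfortable (or degenerate overlap)
    target = SOCIAL_MIN if dist < SOCIAL_MIN else SOCIAL_MAX
    scale = 1 + gain * (target - dist) / dist
    return (user[0] + dx * scale, user[1] + dy * scale)

# An NPC 0.5 m away (too close) backs off along the same bearing.
print(step_toward_comfort((0.5, 0.0), (0.0, 0.0)))  # (0.85, 0.0)
```

Calling this once per frame makes the retreat or approach gradual, which matches the episode's point that spacing adjustments are themselves a social signal rather than an instant repositioning.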

TOPICS

  • 0:00 – Intro – Ross studies human-robot interaction and is presenting work on getting robots to use body language and understand the non-verbal cues that are the building blocks of social interaction, like spacing, socially appropriate eye gaze, gestures, using and understanding pointing behaviors, modulating the voice louder/softer and faster/slower, head nods, and taking turns when communicating.
  • 1:12 – Applies to both robots and avatars; a robot is a physically co-present NPC. Could be applied to virtual worlds to make characters more engaging through body language.
  • 2:04 – Eye gaze can feel weird if implemented in a way that doesn't feel natural. It broadcasts info and tells others what you can observe, and is connected to privacy and the nature of the relationship. Continued eye contact means “I want to see more.” Too much eye contact violates the amount of intimacy that people are comfortable with. We compensate by averting our gaze, getting more spacing, changing the frequency and duration of direct eye gazes, or perhaps crossing our arms or using pacifier behaviors like self-touching.
  • 3:31 – Measuring the psychological impact of implemented body language? Two ways. Use physiological measures like heart rate, galvanic skin response, respiration rate, and general activity and speed of motion. Can also use psychological surveys with Likert scales: How intelligent did the NPC seem? Was it violating your personal space? Use these to figure out how people react.
  • 4:43 – Top behaviors to implement with NPCs. Positioning is the first thing to get correct, and characters will be more engaging if you adaptively use positioning. Second would be automating co-verbal gestures that are coordinated with speech so that the character isn't a robot or zombie. Eye gaze. Pointing. Immersive and engaging.
  • 6:26 – Pointing behaviors like pick “that” up or talk to “her,” which are referencing behaviors fundamental to human communication
  • 6:58 – Body language for engagement like a forward lean, increased eye contact, increased rate of speech. The opposite cues signal disengagement, like leaning back or attention focused elsewhere. Can't look at these in isolation; look for combinations and clusters of behaviors, because any single cue can have other causes
  • 8:12 – Open body language: arms not crossed, revealing the front of the body, open eyes, eyebrows up, and a smile. An idle resting pose can scrunch up into “bitchy resting face,” and you have to consciously counter this. Humans are broadcasting 24/7 and need to be aware of what they're putting out.
  • 9:34 – Mimicking body language is a fundamental component of building rapport. USC's Institute for Creative Technologies (ICT) is looking at virtual humans.
  • 10:24 – These technologies will make our lives more fun. Seen as an outlet and relief from the challenges we face during the day. A safe escape, but also a way to improve the lives of people with disabilities. Focusing on helping people with special needs and making the world a better place.

Theme music: “Fatality” by Tigoolio


Voices of VR Podcast © 2017