The Effects of Humanlike and Robot-Specific Affective Nonverbal Behavior on Perception, Emotion, and Behavior

Research has demonstrated that humans are able to interpret humanlike (affective) nonverbal behavior (HNB) in artificial entities (e.g. Beck et al., in: Proceedings of the 19th IEEE international symposium on robot and human interactive communication, IEEE Press, Piscataway, 2010. https://doi.org/10.1109/ROMAN.2010.5598649; Bente et al. in J Nonverbal Behav 25: 151–166, 2001; Mumm and Mutlu, in: Proceedings of the 6th international conference on human–robot interaction, HRI. ACM Press, New York, 2011. https://doi.org/10.1145/1957656.1957786). However, some robots lack the ability to produce HNB. Using robot-specific nonverbal behavior (RNB), such as different eye colors, to convey emotional meaning might be a fruitful mechanism for enhancing HRI experiences, but it is unclear whether RNB is as effective as HNB. We present a review of affective nonverbal behaviors in robots and an experimental study. We experimentally tested the influence of HNB and RNB (colored LEDs) on users' perception of the robot (e.g. likeability, animacy), their emotional experience, and their self-disclosure. In a between-subjects design, users (n = 80) interacted with either (a) a robot displaying no nonverbal behavior, (b) a robot displaying affective RNB, (c) a robot displaying affective HNB, or (d) a robot displaying affective HNB and RNB. Results show that HNB, but not RNB, has a significant effect on the perceived animacy of the robot, participants' emotional state, and self-disclosure. However, RNB still slightly influenced participants' perception, emotion, and behavior: planned contrasts revealed that having any type of nonverbal behavior significantly increased perceived animacy, positive affect, and self-disclosure. Moreover, observed linear trends indicate that the effects increased with the addition of nonverbal behaviors (control < RNB < HNB). In combination, our results suggest that HNB is more effective than RNB in conveying the robot's communicative message.

[1] Kai Oliver Arras, et al. Robot-specific social cues in emotional body language. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[2] Atsuo Takanishi, et al. Bipedal humanoid robot that makes humans laugh with use of the method of comedy and affects their psychological state actively. 2014 IEEE International Conference on Robotics and Automation (ICRA).

[3] Jeffrey W. Bertrand, et al. Effects of Virtual Human Animation on Emotion Contagion in Simulated Inter-Personal Experiences. IEEE Transactions on Visualization and Computer Graphics, 2014.

[4] P. Ekman. Facial expression and emotion. The American Psychologist, 1993.

[5] Akira Ito, et al. Artificial emotion expression for a robot by dynamic color change. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.

[6] Elisabeth André, et al. Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. 2011 RO-MAN.

[7] Hiroshi Ishiguro, et al. Evaluating facial displays of emotion for the android robot Geminoid F. 2011 IEEE Workshop on Affective Computational Intelligence (WACI).

[8] Vladimir M. Zatsiorsky, et al. Biomechanics of Skeletal Muscles. Human Kinetics, 2012.

[9] K. Scherer, et al. Acoustic profiles in vocal emotion expression. Journal of Personality and Social Psychology, 1996.

[10] B. Manav. Color-emotion associations and color preferences: A case study for residences. 2007.

[11] Matthias Scheutz, et al. The utility of affect expression in natural language interactions in joint human-robot tasks. HRI '06, 2006.

[12] Koen V. Hindriks, et al. Robot mood is contagious: Effects of robot body language in the imitation game. AAMAS, 2014.

[13] Mark H. Chignell, et al. Communication of Emotion in Social Robots through Simple Head and Arm Movements. International Journal of Social Robotics, 2011.

[14] Stefan Kopp, et al. Effects of Gesture on the Perception of Psychological Anthropomorphism: A Case Study with a Humanoid Robot. ICSR, 2011.

[15] Harald G. Wallbott. In and out of context: Influences of facial expression and context information on emotion attributions. 1988.

[16] Jonathan Gratch, et al. Virtual humans elicit socially anxious interactants' verbal self-disclosure. 2010.

[17] Ana Paiva, et al. Using Empathy to Improve Human-Robot Relationships. HRPR, 2010.

[18] Fabio Tesser, et al. Interpretation of Emotional Body Language Displayed by a Humanoid Robot: A Case Study with Children. International Journal of Social Robotics, 2013.

[19] Vladimir M. Zatsiorsky, et al. Biomechanics of Skeletal Muscles. Human Kinetics, 2012.

[20] Judee K. Burgoon, et al. Nonverbal Communication Skills. 2003.

[21] Nicole C. Krämer, et al. "It doesn't matter what you are!" Explaining social effects of agents and avatars. Computers in Human Behavior, 2010.

[22] Bilge Mutlu, et al. Human-robot proxemics: Physical and psychological distancing in human-robot interaction. 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[23] D. Cicchetti. Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. 1994.

[24] P. Valdez, et al. Effects of color on emotions. Journal of Experimental Psychology: General, 1994.

[25] Nicole C. Krämer, et al. An Experimental Study on Emotional Reactions Towards a Robot. International Journal of Social Robotics, 2012.

[26] Ana Paiva, et al. "Why Can't We Be Friends?" An Empathic Game Companion for Long-Term Interaction. IVA, 2010.

[27] Aryel Beck, et al. Towards an Affect Space for robots to display emotional body language. 19th International Symposium in Robot and Human Interactive Communication, 2010.

[28] Raymond H. Cuijpers, et al. Imitating Human Emotions with Artificial Facial Expressions. International Journal of Social Robotics, 2013.

[29] Clare Press. Action observation and robotic agents: Learning and anthropomorphism. Neuroscience and Biobehavioral Reviews, 2011.

[30] Nicole C. Krämer, et al. Computer Animated Movement and Person Perception: Methodological Advances in Nonverbal Behavior Research. Journal of Nonverbal Behavior, 2001.

[31] K. McGraw, et al. Forming inferences about some intraclass correlation coefficients. 1996.

[32] Dana Kulic, et al. Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. International Journal of Social Robotics, 2009.

[33] A. Manstead, et al. The social and emotional functions of facial displays. 1999.

[34] K. Scherer, et al. Emotion expression in body action and posture. Emotion, 2012.

[35] Michiteru Kitazaki, et al. Measuring empathy for human and robot hand pain using electroencephalography. Scientific Reports, 2015.

[36] Stefan Kopp, et al. Smile and the world will smile with you - The effects of a virtual agent's smile on users' evaluation and behavior. International Journal of Human-Computer Studies, 2013.

[37] Tatsuya Nomura, et al. Measurement of Anxiety toward Robots. ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication.

[38] Takayuki Kanda, et al. Nonverbal leakage in robots: Communication of intentions through seemingly unintentional behavior. 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[39] Andrea Lockerd Thomaz, et al. Effects of nonverbal communication on efficiency and robustness in human-robot teamwork. 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[40] Atsuo Takanishi, et al. Design of the humanoid robot KOBIAN - preliminary analysis of facial and whole body emotion expression capabilities. Humanoids 2008 - 8th IEEE-RAS International Conference on Humanoid Robots.

[41] T. Kanda, et al. Measurement of negative attitudes toward robots. 2006.

[42] Ana Paiva, et al. iCat: an affective game buddy based on anticipatory mechanisms. AAMAS, 2008.

[43] C. Nass, et al. Machines and Mindlessness. 2000.

[44] Emily C. Collins, et al. Saying It with Light: A Pilot Study of Affective Communication Using the MIRO Robot. Living Machines, 2015.

[45] A. Hurlbert, et al. Biological components of sex differences in color preference. Current Biology, 2007.

[46] Milind Tambe, et al. A Study of Emotional Contagion with Virtual Characters. IVA, 2012.

[47] Nicole C. Krämer, et al. Investigations on empathy towards humans and robots using fMRI. Computers in Human Behavior, 2014.

[48] Takayuki Kanda, et al. Footing in human-robot conversations: How robots might shape participant roles using gaze cues. 2009 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[49] Nicole C. Krämer, et al. Quid Pro Quo? Reciprocal Self-disclosure and Communicative Accommodation towards a Virtual Interviewer. IVA, 2011.

[50] D. Watson, et al. Development and validation of brief measures of positive and negative affect: The PANAS scales. Journal of Personality and Social Psychology, 1988.