Emotional or Social?: How to Enhance Human-Robot Social Bonding

A social robot's behavior is typically designed to reflect either its emotional state or its social situation. In this research, we define these two kinds of behavior as affective and social expressions, respectively, and propose a new expression method that integrates them. Humans have involuntary facial muscles around the eyes and voluntary facial muscles around the mouth; accordingly, we assume that the region around the eyes conveys affective expressions while the region around the mouth conveys social expressions. In an experiment involving human-robot conversation, the proposed method elicited a sense of both human likeness and sociality. Furthermore, the intimacy evaluation showed that social expressions work effectively for forms of intimacy that do not require strong social bonding, such as a desire for friendship, whereas affective expressions are necessary for forms of intimacy that require stronger social bonding, such as wanting to live with someone.