Robotic facial expression of anger in collaborative human–robot interaction

The facial expression of anger can help direct interaction between agents, especially in unclear and cluttered environments. When an angry face is present, a process of analysis and diagnosis is activated in the observer, which can affect their behavior toward the one expressing the emotion. To study this effect in human–robot interaction, an expressive robotic face was designed and constructed, and its influence on human action and attention was analyzed in two collaborative tasks. Results from a digital survey, experimental interactions, and a questionnaire indicated that anger is the most readily recognized universal facial expression, that it has a regulatory effect on human action, and that it draws human attention when an unclear condition arises during the task. An additional finding was that the prolonged presence of an angry face reduces its impact compared with positive expressions.
