Evaluating Emotion Expressing Robots in Affective Space

Research on human emotions has attracted increasing interest in the field of human-robot interaction over the last decade. Subfields range from usability studies through emotionally enriched communication to the social integration of robots in human-robot groups. Prominent aims are to investigate the impact of emotional responses, the perception of emotions, and emotional decision making on the efficiency and robustness of the interaction process. Intuitive communication and easy familiarization are further factors of major interest.

In order to facilitate emotionally enriched communication, means of expressing the "emotional state" of a robot are necessary, i.e. expressive features that can be used to induce emotions in the human or simply to provide additional cues on the progression of the communication or interaction process. A common approach is to integrate facial expression elements into the robot artefact, since well-elaborated frameworks on human facial expressions exist that can be utilized, e.g. (Blow et al., 2006; Breazeal, 2002a; Grammer & Oberzaucher, 2006; Hara & Kobayashi, 1996; Sosnowski et al., 2006a; Zecca et al., 2004). The design and control of such expressive elements have a significant impact on how the represented emotional state of the robot is perceived by the human counterpart. In particular, the controlled posture is an important aspect and a well-investigated issue in human nonverbal communication with respect to facial expressions. Common frameworks build on the Facial Action Coding System (FACS) (Ekman & Friesen, 1977) and variants that establish the link between muscular activations and facial expressions, i.e. the quantitative contribution of muscular group poses to perceived emotions, e.g. (Grammer & Oberzaucher, 2006). Such a design approach is dimensional (continuous) in nature: a continuous representation of the emotional state space, composed of the dimensions valence/pleasure, arousal, and dominance/stance, is used, and the contribution of muscular group poses to these components is specified.

The choice of concept for the evaluation of displayed facial expressions is of equal importance. A comprehensive evaluation is essential because the actuating elements of the robot (motors, joints, transmission elements, etc.) differ significantly from human musculature. Thus, even if an elaborate framework such as FACS is used in the design process, a significant deviation between the intended and the perceived expression must be expected. Common evaluation procedures use a categorical approach in which test participants choose the best fit from a given set of emotion labels.
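To make the dimensional design approach more concrete, the following sketch shows how a robot's emotional state, given as a point in the continuous pleasure/arousal/dominance (PAD) space, could be mapped to action-unit intensities and from there to joint set points. All contribution weights, the choice of action units, and the joint gains are invented placeholders for illustration; they are not values taken from FACS or from any of the cited works.

```python
import numpy as np

# Rows: expressive action units; columns: assumed contributions of
# pleasure, arousal, dominance in [-1, 1]. Placeholder values only.
CONTRIBUTION = np.array([
    [-0.2,  0.7,  0.1],   # brow raiser (AU1/2-like element)
    [ 0.8,  0.1,  0.0],   # lip corner puller (AU12-like, "smile")
    [-0.1,  0.6, -0.3],   # jaw drop (AU26-like element)
])

def action_unit_activations(pleasure: float, arousal: float,
                            dominance: float) -> np.ndarray:
    """Map a PAD state to action-unit intensities, clipped to [0, 1]."""
    pad = np.array([pleasure, arousal, dominance])
    return np.clip(CONTRIBUTION @ pad, 0.0, 1.0)

def servo_targets(activations: np.ndarray, neutral: np.ndarray,
                  gain: np.ndarray) -> np.ndarray:
    """Translate intensities into joint set points around a neutral pose."""
    return neutral + gain * activations

# Example: a "happy" state (high pleasure, moderate arousal).
act = action_unit_activations(pleasure=0.8, arousal=0.4, dominance=0.2)
print(servo_targets(act, neutral=np.zeros(3), gain=np.full(3, 30.0)))
```

The categorical evaluation procedure mentioned above can be sketched in the same spirit: participants pick the best-fitting label for each displayed expression, and tallying the choices per intended expression yields a confusion matrix that exposes deviations between intended and perceived emotion. The label set and trial data below are toy values, not results from any study.

```python
from collections import Counter

LABELS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

def confusion_matrix(trials):
    """trials: iterable of (intended_label, chosen_label) pairs."""
    counts = {intended: Counter() for intended in LABELS}
    for intended, chosen in trials:
        counts[intended][chosen] += 1
    return counts

# Toy forced-choice responses for two displayed expressions.
trials = [("happiness", "happiness"), ("happiness", "surprise"),
          ("fear", "surprise")]
for intended, chosen in confusion_matrix(trials).items():
    if chosen:
        print(intended, dict(chosen))
```

A dimensional alternative, suggested by the "affective space" of the title, would instead have participants rate each displayed expression directly on the valence, arousal, and dominance scales, so that intended and perceived expressions can be compared as points in the same continuous space.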

[1] J. M. Kittross. The measurement of meaning, 1959.

[2] J. Russell et al. An approach to environmental psychology, 1974.

[3] D. Watson et al. Development and validation of brief measures of positive and negative affect: the PANAS scales. Journal of Personality and Social Psychology, 1988.

[4] A. Hamm et al. Emotionsinduktion durch visuelle Reize: Validierung einer Stimulationsmethode auf drei Reaktionsebenen [Emotion induction through visual stimuli: validation of a stimulation method on three response levels], 1993.

[5] P. Valdez et al. Effects of color on emotions. Journal of Experimental Psychology: General, 1994.

[6] Fumio Hara et al. A face robot able to recognize and produce facial expression. Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '96), 1996.

[7] A. Mehrabian. Comparison of the PAD and PANAS as models for describing emotions and for differentiating anxiety from depression, 1997.

[8] A. Mehrabian et al. Emotional correlates of preferences for situation-activity combinations in everyday life. Genetic, Social, and General Psychology Monographs, 1997.

[9] J. Russell. The psychology of facial expression: Reading emotions from and into faces: Resurrecting a dimensional-contextual perspective, 1997.

[10] Cynthia Breazeal et al. Designing sociable robots, 2002.

[11] Cynthia Breazeal et al. Emotion and sociable humanoid robots. International Journal of Human-Computer Studies, 2003.

[12] Paolo Dario et al. On the development of the emotion expression humanoid robot WE-4RII with RCH-1. 4th IEEE/RAS International Conference on Humanoid Robots, 2004.

[13] Takashi Minato et al. Development of an Android Robot for Studying Human-Robot Interaction. IEA/AIE, 2004.

[14] Martin Buss et al. Design and Evaluation of Emotion-Display EDDIE. 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006.

[15] K. Grammer et al. The Reconstruction of Facial Expressions in Embodied Systems: New Approaches to an Old Problem, 2006.

[16] Martin Buss et al. EDDIE - An Emotion-Display with Dynamic Intuitive Expressions. ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication, 2006.

[17] David Lee et al. Perception of Robot Smiles and Dimensions for Human-Robot Interaction Design. ROMAN 2006 - The 15th IEEE International Symposium on Robot and Human Interactive Communication, 2006.

[18] Martin Buss et al. On the Evaluation of Emotion Expressing Robots. Proceedings of the 2007 IEEE International Conference on Robotics and Automation, 2007.