Emotion-sensitive robots - a new paradigm for human-robot interaction

This paper presents an emotion-sensitive human-robot cooperation framework in which a robot perceives the emotions of the human working with it and adapts its behavior based on this perception. Peripheral physiological responses of the human are measured through wearable biofeedback sensors to detect and identify his/her underlying level of anxiety. A control architecture inspired by Riley's original information-flow model is designed. Within this human-robot interaction framework, the robot is responsive to the psychological states of the human and uses both implicit and explicit communication from the human to determine its own behavior. Human-robot cooperation experiments using a mobile robot as a test bed are performed, in which the robot senses the anxiety level of the human and responds appropriately. The results presented here validate the proposed framework and demonstrate a new way of achieving emotion-based interaction between a human and a robot.

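To make the interaction pathway concrete, the following Python sketch (not taken from the paper; the class and method names such as AnxietyEstimator, control_step, and the numeric thresholds are hypothetical) illustrates the kind of decision loop the abstract describes: explicit commands from the human take priority, while the implicitly sensed anxiety level, estimated from wearable physiological measurements, modulates the robot's autonomous behavior.

    from dataclasses import dataclass

    @dataclass
    class PhysiologySample:
        heart_rate: float        # beats per minute, from a wearable sensor
        skin_conductance: float  # electrodermal activity, in microsiemens
        emg_level: float         # normalized muscle-tension (EMG) reading

    class AnxietyEstimator:
        """Maps peripheral physiological features to a coarse anxiety level in [0, 1].
        A placeholder linear fusion stands in for the paper's actual trained detector."""
        def estimate(self, s: PhysiologySample) -> float:
            hr = min(max((s.heart_rate - 60.0) / 60.0, 0.0), 1.0)
            sc = min(max(s.skin_conductance / 20.0, 0.0), 1.0)
            emg = min(max(s.emg_level, 0.0), 1.0)
            return (hr + sc + emg) / 3.0

    def control_step(robot, estimator, sample, explicit_command=None):
        """One control cycle: explicit communication overrides; otherwise the
        implicitly sensed anxiety level modulates the robot's behavior."""
        if explicit_command is not None:
            robot.execute(explicit_command)            # explicit channel
            return
        anxiety = estimator.estimate(sample)           # implicit channel
        if anxiety > 0.7:
            robot.slow_down_and_seek_confirmation()    # high anxiety: be cautious
        elif anxiety > 0.4:
            robot.reduce_speed()                       # moderate anxiety: back off
        else:
            robot.continue_task()                      # low anxiety: proceed
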
[1] M. Helander, Applicability of drivers' electrodermal response to the design of the traffic environment, 1978, The Journal of Applied Psychology.

[2] H. Saito, et al., Evaluation of the relation between emotional concepts and emotional parameters in speech, 2001.

[3] Hideo Saito, et al., Evaluation of the relationship between emotional concepts and emotional parameters on speech, 1997, 1997 IEEE International Conference on Acoustics, Speech, and Signal Processing.

[4] J. Breese, et al., Modeling Emotional State and Personality for Conversational Agents, 1998.

[5] Carla H. Lagorio, et al., Psychology, 1929, Nature.

[7] J. Stainer, et al., The Emotions, 1922, Nature.

[8] Nadia Bianchi-Berthouze, et al., Modeling Multimodal Expression of User's Affective Subjective Experience, 2002.

[9] Frank H. Wilhelm, et al., Vagal rebound during resolution of tearful crying among depressed and nondepressed individuals, 2003, Psychophysiology.

[10] Nilanjan Sarkar, et al., Online stress detection using psychophysiological signals for implicit human-robot cooperation, 2002, Robotica.

[11] Christine L. Lisetti, et al., Modeling Multimodal Expression of User's Affective Subjective Experience, 2002, User Modeling and User-Adapted Interaction.

[12] Nicu Sebe, et al., Emotion recognition using a Cauchy Naive Bayes classifier, 2002, Object recognition supported by user interaction for service robots.

[13] R. Lazarus, Emotion and Adaptation, 1991.

[14] C. A. Smith, Dimensions of appraisal and physiological response in emotion, 1989, Journal of Personality and Social Psychology.

[15] J. Aasman, et al., Operator Effort and the Measurement of Heart-Rate Variability, 1987, Human Factors.

[16] R. A. Brooks, et al., Intelligence without Representation, 1991, Artif. Intell.

[17] Stefanos D. Kollias, et al., On emotion recognition of faces and of speech using neural networks, fuzzy logic and the ASSESS system, 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000).

[18] Cristina Conati, et al., Modeling Students' Emotions from Cognitive Appraisal in Educational Games, 2002, Intelligent Tutoring Systems.

[19] Cristina Conati, et al., Probabilistic assessment of user's emotions in educational games, 2002, Appl. Artif. Intell.

[20] V. Petrushin, Emotion Recognition Agents in Real World, 2000.

[21] Nilanjan Sarkar, et al., Affect-sensitive human-robot cooperation - theory and experiments, 2003, 2003 IEEE International Conference on Robotics and Automation.

[22] David Heckerman, et al., Causal independence for probability assessment and inference using Bayesian networks, 1996, IEEE Trans. Syst. Man Cybern. Part A.

[23] Victor A. Riley, et al., A General Model of Mixed-Initiative Human-Machine Systems, 1989.

[24] Alex Pentland, et al., A Bayesian Computer Vision System for Modeling Human Interactions, 1999, IEEE Trans. Pattern Anal. Mach. Intell.

[25] Christian A. Müller, et al., Recognizing Time Pressure and Cognitive Load on the Basis of Speech: An Experimental Study, 2001, User Modeling.

[26] Kostas Karpouzis, et al., A fuzzy system for emotion classification based on the MPEG-4 facial definition parameter set, 2000, 2000 10th European Signal Processing Conference.

[27] Louis D. Silverstein, et al., Changes in Electromyographic Activity Associated with Occupational Stress and Poor Performance in the Workplace, 1987, Human Factors.

[28] C. Lebiere, et al., The Atomic Components of Thought, 1998.

[29] B. C. Lacey, et al., Verification and extension of the principle of autonomic response-stereotypy, 1958, The American Journal of Psychology.

[30] E. Vesterinen, et al., Affective Computing, 2009, Encyclopedia of Biometrics.

[31] Jennifer Healey, et al., Toward Machine Emotional Intelligence: Analysis of Affective Physiological State, 2001, IEEE Trans. Pattern Anal. Mach. Intell.

[32] D. Cicchetti, Emotion and Adaptation, 1993.

[33] Glenn F. Wilson, et al., An Analysis of Mental Workload in Pilots During Flight Using Multiple Psychophysiological Measures, 2002.

[34] Robert Malone, et al., The Robot Book, 1978.

[35] Jennifer Healey, et al., Digital processing of affective signals, 1998, Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP '98).

[36] B. Kushner, Descartes' error, 1998, Journal of AAPOS: the official publication of the American Association for Pediatric Ophthalmology and Strabismus.

[37] K. J. Vicente, et al., Spectral Analysis of Sinus Arrhythmia: A Measure of Mental Effort, 1987, Human Factors.

[38] Yuan Qi, et al., Context-sensitive Bayesian classifiers and application to mouse pressure pattern classification, 2002, Object recognition supported by user interaction for service robots.

[39] E. Sirevaag, et al., A Psychophysiological Assessment of Operator Workload During Simulated Flight Missions, 1987, Human Factors.

[40] Valery A. Petrushin, et al., Emotion recognition in speech signal: experimental study, development, and application, 2000, INTERSPEECH.

[41] Dominic W. Massaro, Multimodal Emotion Perception: Analogous to Speech Processes, 2000.

[42] Kenneth Hugdahl, Cognition and the autonomic nervous system: Orienting, anticipation, and conditioning, 2000.

[43] Hiroshi Yokoi, et al., Adaptive learning interface used physiological signals, 2000, SMC 2000 Conference Proceedings: 2000 IEEE International Conference on Systems, Man and Cybernetics.

[44] W. Roth, et al., Embarrassment and social phobia: the role of parasympathetic activation, 2003, Journal of Anxiety Disorders.

[45] Wendy S. Ark, et al., The Emotion Mouse, 1999, HCI.

[46] J. Breese, et al., Emotion and personality in a conversational agent, 2001.

[47] Thomas S. Huang, et al., Emotion Recognition from Facial Expressions using Multilevel HMM, 2000.

[48] Yuan Qi, et al., The Bayes Point Machine for computer-user frustration detection via pressuremouse, 2001, PUI '01.

[49] Michael D. McNeese, et al., Assessment of User Affective and Belief States for Interface Adaptation: Application to an Air Force Pilot Task, 2002, User Modeling and User-Adapted Interaction.

[50] B. C. Lacey, et al., Pupillary and cardiac activity during visual attention, 1973, Psychophysiology.

[51] Anthony Jameson, Numerical uncertainty management in user and student modeling: An overview of systems and issues, 2005, User Modeling and User-Adapted Interaction.

[52] Nilanjan Sarkar, et al., Anxiety detecting robotic system – towards implicit human-robot collaboration, 2004, Robotica.

[53] Eszter Láng, et al., Validating a New Method for Ergonomic Evaluation of Human Computer Interfaces, 1999.

[54] Cristina Conati, et al., Using Bayesian Networks to Manage Uncertainty in Student Modeling, 2002, User Modeling and User-Adapted Interaction.

[55] J. Cassell, et al., Embodied conversational agents, 2000.