A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI

In human-robot interaction (HRI), robots should be socially intelligent: they should be able to respond appropriately to human affective and social cues in order to engage effectively in bi-directional communication. Social intelligence would allow a robot to relate to, understand, and interact and share information with people in real-world, human-centered environments. This survey presents an encompassing review of existing automated affect recognition and classification systems for social robots engaged in various HRI settings. Human affect detection from facial expressions, body language, voice, and physiological signals is investigated, as well as detection from combinations of these modes. The automated systems are described in terms of their robotic and HRI applications, the sensors they employ, and the feature detection techniques and affect classification strategies they utilize. The paper also discusses pertinent future research directions for promoting the development of socially intelligent robots capable of recognizing, classifying, and responding to human affective states during real-time HRI.
