Robot Emotion and Performance Regulation Based on HMM

This paper discusses the transference of emotional states driven by psychological energy in an active-field state space and builds a robot expression model based on the Hidden Markov Model (HMM). Facial expressions and behaviours are two important channels for human-robot interaction, but a performance based on a static emotional state cannot vividly display dynamic and complex emotional transference; building a real-time emotional interaction model is therefore a critical part of robot expression. First, the attenuating emotional state space is defined and the state-transfer probabilities are acquired. Second, an HMM-based emotional expression model is proposed and the performance-transference probabilities are calculated. Finally, the model is verified on a 15-degree-of-freedom robot platform, and the interaction effects are analysed statistically. The experimental results demonstrate that, compared with traditional algorithms, the emotional expression model achieves more expressive performances and removes the mechanized interaction mode.
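The core idea above can be illustrated with a minimal HMM sketch: hidden emotional states evolve under a state-transfer matrix, and each state emits an observable expression "performance" with some probability. The state names, matrices, and function names below are hypothetical placeholders, not the paper's actual parameters or implementation.

```python
# Illustrative HMM sketch (assumed values, not the paper's model):
# hidden emotional states drive observable expression performances.

STATES = ["happy", "neutral", "sad"]        # hidden emotional states
PERFORMANCES = ["smile", "blink", "frown"]  # observable performances

# A[i][j]: state-transfer probability from emotional state i to state j
A = [[0.6, 0.3, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]

# B[i][k]: probability that hidden state i emits performance k
B = [[0.7, 0.2, 0.1],
     [0.2, 0.6, 0.2],
     [0.1, 0.2, 0.7]]

def step(dist, A):
    """Propagate a belief over emotional states one transfer step."""
    n = len(A)
    return [sum(dist[i] * A[i][j] for i in range(n)) for j in range(n)]

def performance_dist(dist, B):
    """Probability of each observable performance under a state belief."""
    m = len(B[0])
    return [sum(dist[i] * B[i][k] for i in range(len(B))) for k in range(m)]

belief = [1.0, 0.0, 0.0]   # start fully in the "happy" state
for _ in range(3):         # three steps of emotional transference
    belief = step(belief, A)

result = performance_dist(belief, B)
print(result)  # a probability distribution over the three performances
```

Because the hidden state is a distribution rather than a fixed label, the emitted performance varies smoothly as the emotional state attenuates and transfers, which is the behaviour the abstract contrasts with a static, mechanized interaction mode.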
