Real-time estimation of facial expression intensity

Changing facial expressions is a natural and powerful way of conveying personal intention, expressing emotion, and regulating interpersonal communication. Automatic estimation of facial expression intensity is therefore an important step toward more capable human-robot interfaces. In this research, we developed a system that automatically estimates the intensity of facial expression in real time. Based on isometric feature mapping, expression intensity is extracted from training sequences of facial transitions. Learning models, including cascade neural networks and support vector machines, are then applied to model the relationship between the trajectories of tracked facial feature points and expression intensity. We implemented a vision system that estimates the expression intensity of happiness, anger, and sadness in real time.
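The sketch below illustrates one plausible reading of this pipeline, not the authors' implementation: a neutral-to-apex training sequence of tracked feature points is embedded on a one-dimensional manifold with Isomap to recover intensity labels, and a support vector machine regressor then maps feature-point vectors to intensity for new frames. The synthetic data, the choice of scikit-learn's `Isomap` and `SVR`, and all parameter values are assumptions for illustration.

```python
# A minimal sketch of the described pipeline, assuming facial feature points
# have already been tracked per frame (faked here with synthetic data).
# scikit-learn's Isomap and SVR stand in for the paper's isometric feature
# mapping and support vector machine stages.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# --- Training: one synthetic neutral-to-apex transition sequence ----------
# Each frame is a flattened vector of feature-point coordinates
# (e.g. 20 points -> 40 values). Real input would come from a face tracker.
n_frames, n_features = 60, 40
apex_displacement = rng.normal(size=n_features)
progress = np.linspace(0.0, 1.0, n_frames)            # neutral -> apex
frames = (progress[:, None] * apex_displacement
          + 0.01 * rng.normal(size=(n_frames, n_features)))

# 1. Recover expression intensity from the transition sequence with Isomap:
#    embed the frames on a 1-D manifold and rescale to [0, 1].
embedding = Isomap(n_neighbors=8, n_components=1).fit_transform(frames).ravel()
intensity = (embedding - embedding.min()) / (embedding.max() - embedding.min())
if intensity[0] > intensity[-1]:                       # orient: neutral = 0, apex = 1
    intensity = 1.0 - intensity

# 2. Model the mapping from feature-point vectors to intensity with an SVM regressor.
svr = SVR(kernel="rbf", C=10.0, gamma="scale")
svr.fit(frames, intensity)

# --- Runtime: estimate intensity for a new, unseen frame ------------------
new_frame = 0.5 * apex_displacement + 0.01 * rng.normal(size=n_features)
print(f"estimated intensity: {float(svr.predict(new_frame[None, :])[0]):.2f}")
```

In a real-time setting, the runtime step would run once per video frame on the tracker output, with one trained regressor per expression category (happiness, anger, sadness).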
