Human-Computer Interaction Using Emotion Recognition from Facial Expression

This paper describes an emotion recognition system based on facial expression. A fully automatic facial expression recognition system comprises three steps: face detection, facial feature extraction, and facial expression classification. We have developed an anthropometric model, combined with the Shi & Tomasi method, to detect facial feature points. The variations of 21 distances, which describe the deformation of the facial features relative to the neutral face, are used to code the facial expression. The classification step is based on the Support Vector Machine (SVM) method. Experimental results demonstrate that the proposed approach is an effective method for recognizing emotions from facial expressions, with a recognition rate above 90% in real time. The approach is used to control a music player according to variations in the emotional state of the computer user.
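To make the three-step pipeline concrete, the following is a minimal Python sketch using OpenCV's Viola-Jones face detector, the Shi & Tomasi corner detector, pyramidal Lucas-Kanade tracking, and a scikit-learn SVM. The point-pair list for the 21 distances, the function names, and the training data are illustrative assumptions made for this sketch and are not the authors' implementation (which relies on an anthropometric model and LIBSVM).

```python
# Illustrative sketch of the pipeline: face detection, feature-point
# localization/tracking, 21-distance coding, and SVM classification.
# The point pairs and helper names below are assumptions, not the paper's code.
import cv2
import numpy as np
from sklearn.svm import SVC

EMOTIONS = ["neutral", "joy", "surprise", "anger", "disgust", "sadness", "fear"]

# Viola-Jones face detector shipped with OpenCV.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(gray):
    """Return the first detected face region (x, y, w, h), or None."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None

def initial_points(gray, face_rect):
    """Locate candidate feature points inside the face region with the
    Shi & Tomasi corner detector (the paper restricts these candidates
    using an anthropometric model, omitted here)."""
    x, y, w, h = face_rect
    roi = gray[y:y + h, x:x + w]
    corners = cv2.goodFeaturesToTrack(roi, maxCorners=30,
                                      qualityLevel=0.01, minDistance=5)
    if corners is None:
        return None
    return corners + np.array([x, y], dtype=np.float32)

def track_points(prev_gray, gray, points):
    """Track feature points between frames with pyramidal Lucas-Kanade.
    `points` must be a float32 array of shape (N, 1, 2)."""
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    return new_points, status

# Placeholder point-index pairs; the anthropometric model defines 21 of them.
PAIRS = [(0, 1), (2, 3)]

def distance_features(points, neutral_points):
    """Code the expression as deviations of feature distances from the
    neutral face (21 values in the paper, len(PAIRS) here)."""
    feats = []
    for i, j in PAIRS:
        d = np.linalg.norm(points[i] - points[j])
        d0 = np.linalg.norm(neutral_points[i] - neutral_points[j])
        feats.append(d - d0)
    return np.array(feats)

def train_classifier(X, y):
    """X: (n_samples, n_distances) deviations; y: emotion labels."""
    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X, y)
    return clf
```

In use, the classifier's prediction for each frame would drive the music player's behavior as the user's emotional state changes.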
