From facial expression to level of interest: a spatio-temporal approach

This paper presents a novel approach to recognizing the six universal facial expressions from visual data and using them to derive the level of interest, drawing on psychological evidence. The proposed approach relies on a two-step classification built on top of refined optical flow computed from image sequences. First, a bank of linear classifiers was applied at the frame level, and the outputs of this stage were coalesced to produce a temporal signature for each observation. Second, the temporal signatures computed from the training data set were used to train discrete hidden Markov models (HMMs) that learn the underlying model for each universal facial expression. The average recognition rate of the proposed facial expression classifier is 90.9% without classifier fusion and 91.2% with fusion, using a five-fold cross-validation scheme on a database of 488 video sequences spanning 97 subjects. The recognized facial expressions were combined with the intensity of activity (motion) around the apex frame to measure the level of interest. To further illustrate the efficacy of the proposed approach, two sets of experiments were conducted: analysis of television (TV) broadcast data (108 facial expression sequences with severe lighting conditions and diverse subjects and expressions) and emotion elicitation with 21 subjects.
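
The two-stage pipeline described above, frame-level linear classifiers producing a discrete temporal signature followed by per-expression discrete HMMs, can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the names flow_features, weight_bank, and hmms are hypothetical placeholders, the refined optical-flow features and the training of the HMM parameters (e.g., via Baum-Welch) are assumed to be computed elsewhere, and classifier fusion and the level-of-interest computation are omitted.

```python
import numpy as np
from scipy.special import logsumexp

# The six universal facial expressions recognized in the paper.
EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def frame_signature(flow_features, weight_bank):
    """Stage 1 (sketch): score each frame's optical-flow feature vector with a
    bank of linear classifiers and keep the index of the winning class, giving
    one discrete symbol per frame, i.e., the clip's temporal signature.
    flow_features: (n_frames, n_dims); weight_bank: (n_dims, n_classes)."""
    scores = flow_features @ weight_bank
    return scores.argmax(axis=1)

def hmm_log_likelihood(symbols, log_pi, log_A, log_B):
    """Forward-algorithm log-likelihood of a discrete symbol sequence under an
    HMM with initial probabilities pi, transition matrix A, and emission
    matrix B, all given in log space."""
    alpha = log_pi + log_B[:, symbols[0]]
    for s in symbols[1:]:
        alpha = log_B[:, s] + logsumexp(alpha[:, None] + log_A, axis=0)
    return logsumexp(alpha)

def classify_sequence(flow_features, weight_bank, hmms):
    """Stage 2 (sketch): evaluate the clip's temporal signature against each
    per-expression HMM and return the most likely expression label.
    hmms: dict mapping expression name -> (log_pi, log_A, log_B)."""
    signature = frame_signature(flow_features, weight_bank)
    log_likes = [hmm_log_likelihood(signature, *hmms[name]) for name in EXPRESSIONS]
    return EXPRESSIONS[int(np.argmax(log_likes))]
```

Classification here picks the expression whose HMM assigns the highest forward-algorithm likelihood to the observed signature, which mirrors the maximum-likelihood decision commonly used with per-class HMMs.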
