Distributing expressional faces in 2-D emotional space

Facial expressions are often classified into one of several basic emotion categories. This categorical approach handles faces with blended emotions poorly and makes it difficult to measure the intensity of an emotion. In this paper, facial expressions are instead evaluated with the dimensional approach to affect, originally introduced in psychophysiological studies. An expressional face can be represented as a point in a two-dimensional (2-D) emotional space characterized by arousal and valence factors. To link low-level face features with these emotional factors, we propose a simple method that builds an emotional mapping through coarse labeling of the Cohn-Kanade database followed by linear fitting on the labeled data. Our preliminary experimental results show that the proposed emotional mapping can be used to visualize the distribution of affective content in a large face set, and further to retrieve expressional face images or relevant video shots by specifying a region in the 2-D emotional space.
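The pipeline the abstract describes can be sketched in a few lines: fit a linear map from low-level face features to (valence, arousal) coordinates by least squares, then retrieve faces whose projections fall inside a query region. The feature dimensions, labels, and region bounds below are placeholders, not the paper's actual data; this is a minimal sketch of the general technique, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for low-level face features (e.g. shape or texture
# descriptors) of N coarsely labeled faces; D is hypothetical.
N, D = 200, 16
X = rng.normal(size=(N, D))

# Coarse (valence, arousal) labels in [-1, 1], as a labeler might assign.
Y = rng.uniform(-1.0, 1.0, size=(N, 2))

# Linear fitting: find W (with bias) minimizing ||[X, 1] @ W - Y||^2.
Xb = np.hstack([X, np.ones((N, 1))])          # append bias column
W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)    # W has shape (D + 1, 2)

def to_emotion_space(features):
    """Project one feature vector to a point in the 2-D emotional space."""
    return np.append(features, 1.0) @ W

# Retrieval: select faces whose projection lands in a query rectangle,
# e.g. the high-valence, high-arousal quadrant.
points = Xb @ W
v_lo, v_hi, a_lo, a_hi = 0.0, 1.0, 0.0, 1.0
hits = np.where((points[:, 0] >= v_lo) & (points[:, 0] <= v_hi) &
                (points[:, 1] >= a_lo) & (points[:, 1] <= a_hi))[0]
```

The same projection supports the visualization use case: scattering `points` over the valence-arousal plane shows how affective content is distributed across a face set.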
