One-class classification for spontaneous facial expression analysis

In this paper, we explore the application of one-class classification to recognizing emotional and non-emotional facial expressions occurring in a realistic human conversation setting, the adult attachment interview (AAI). Although emotional facial expressions are defined in terms of facial action units in the psychological literature, non-emotional facial expressions have no distinct description, and modeling them is difficult and expensive. We therefore treat this facial expression recognition task as a one-class classification problem: describe the target objects (i.e., emotional facial expressions) and distinguish them from outliers (i.e., non-emotional ones). We first apply kernel whitening to map the emotional data into a kernel subspace with unit variance in all directions. We then use support vector data description (SVDD) for the classification, which directly fits a minimal-volume boundary around the target data. We present preliminary experiments on the AAI data and compare kernel whitening with SVDD against PCA+SVDD and PCA+Gaussian methods.
