Recognizing Action Units for Facial Expression Analysis

Most automatic expression analysis systems attempt to recognize a small set of prototypic expressions, such as happiness, anger, surprise, and fear. Such prototypic expressions, however, occur rather infrequently. Human emotions and intentions are more often communicated by changes in one or a few discrete facial features. In this paper, we develop an Automatic Face Analysis (AFA) system to analyze facial expressions based on both permanent facial features (brows, eyes, mouth) and transient facial features (deepening of facial furrows) in a nearly frontal-view face image sequence. The AFA system recognizes fine-grained changes in facial expression as action units (AUs) of the Facial Action Coding System (FACS), rather than a few prototypic expressions. Multistate face and facial component models are proposed for tracking and modeling the various facial features, including lips, eyes, brows, cheeks, and furrows. During tracking, detailed parametric descriptions of the facial features are extracted. With these parameters as inputs, a group of action units (neutral expression, six upper-face AUs, and ten lower-face AUs) is recognized, whether the AUs occur alone or in combination. The system has achieved average recognition rates of 96.4 percent (95.4 percent if neutral expressions are excluded) for upper-face AUs and 96.7 percent (95.6 percent with neutral expressions excluded) for lower-face AUs. The generalizability of the system has been tested using independent image databases collected and FACS-coded for ground truth by different research teams.
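The pipeline described above (track facial features, extract parametric descriptions, then map those parameters to AUs) can be illustrated with a minimal sketch. The parameter names, thresholds, and rules below are hypothetical assumptions for illustration only, not the AFA system's actual recognition stage:

```python
# Hypothetical sketch of the final recognition step: extracted feature
# parameters -> upper-face action units. All names and thresholds here
# are illustrative assumptions, not the paper's actual classifier.

def recognize_upper_face_aus(params):
    """Return upper-face AU labels for one frame.

    `params` holds feature measurements normalized against the neutral
    frame: a ratio above 1.0 means the feature grew relative to neutral.
    """
    aus = []
    if params.get("brow_raise_ratio", 1.0) > 1.15:
        aus.append("AU1+2")   # inner + outer brow raiser
    if params.get("brow_distance_ratio", 1.0) < 0.9:
        aus.append("AU4")     # brow lowerer
    if params.get("eye_opening_ratio", 1.0) > 1.2:
        aus.append("AU5")     # upper-lid raiser
    return aus or ["neutral"]

frame = {"brow_raise_ratio": 1.3, "eye_opening_ratio": 1.25}
print(recognize_upper_face_aus(frame))  # ['AU1+2', 'AU5']
```

Because AUs can co-occur, the sketch returns a list rather than a single label, mirroring the paper's point that AUs are recognized whether they appear alone or in combination.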
