Real-time classification of evoked emotions using facial feature tracking and physiological responses

We present automated, real-time machine learning models that use video of subjects' faces together with physiological measurements to predict rated emotion (trained coders' second-by-second assessments of sadness or amusement). Input consisted of video of 41 subjects watching emotionally evocative films, along with measures of their cardiovascular activity, somatic activity, and electrodermal responding. We built algorithms based on points extracted from the subjects' faces as well as on their physiological responses. Strengths of the current approach are that (1) we assess the real behavior of subjects watching emotional videos rather than actors making facial poses, (2) the training data allow us to predict both emotion type (amusement versus sadness) and the intensity of each emotion, and (3) we provide a direct comparison between person-specific, gender-specific, and general models. Results demonstrated good fits for the models overall, with better performance for emotion categories than for emotion intensity, for amusement ratings than for sadness ratings, for a full model combining physiological measures and facial tracking than for either cue alone, and for person-specific models than for gender-specific or general models.
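The modeling setup described above can be illustrated with a minimal sketch: second-by-second feature vectors that concatenate tracked facial points with physiological channels, fed to a binary classifier (amusement versus sadness) trained per subject. Everything here is hypothetical and simplified, the data are synthetic, and the classifier is a plain logistic regression rather than the paper's actual algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_subject_data(n_seconds=400, n_features=10):
    # Hypothetical per-second features: a few facial-point displacements
    # concatenated with physiological channels (e.g. heart rate, somatic
    # activity, skin conductance). Labels: 0 = sadness, 1 = amusement.
    X = rng.normal(size=(n_seconds, n_features))
    w_true = rng.normal(size=n_features)
    y = (X @ w_true + rng.normal(scale=0.5, size=n_seconds) > 0).astype(int)
    return X, y

def train_logistic(X, y, lr=0.1, epochs=200):
    # Plain batch gradient descent on the logistic loss.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    pred = (X @ w + b > 0).astype(int)
    return np.mean(pred == y)

# "Person-specific" model: train and test on one subject's own seconds.
X, y = make_subject_data()
split = 300
w, b = train_logistic(X[:split], y[:split])
acc = accuracy(w, b, X[split:], y[split:])
print(f"held-out accuracy: {acc:.2f}")
```

A general model, by contrast, would pool the training seconds of all subjects before fitting; the paper's comparison between the two amounts to which training pool the classifier sees.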
