Sensing Affective States Using Facial Expression Analysis

An important factor for the next generation of Human-Computer Interaction is an interaction model that automatically reasons about the user's goals, attitudes, affective characteristics, and capabilities, and adapts the system accordingly. Although various techniques have been proposed for automatically detecting affective states from facial expressions, classification accuracy remains a research challenge. This paper investigates an extensible approach to automatic affective state detection through the analysis of facial expressions in digital photographs. The contribution of this study is twofold. First, representing facial expressions with vectors of distances between facial points is shown to be more accurate and robust than using standard Cartesian coordinates. Second, a two-stage Support Vector Machine-based classification model, termed Hierarchical Parallelised Binary Support Vector Machines (HPBSVM), is shown to outperform other machine learning techniques. The resulting classification model has been evaluated on two facial expression datasets (CK+ and KDEF), yielding accuracy rates of 96.9% and 96.2% respectively.
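The intuition behind distance-based features can be illustrated with a minimal sketch. The function below (a hypothetical helper, not the paper's implementation) turns a set of 2-D facial landmark coordinates into a vector of pairwise Euclidean distances; unlike raw Cartesian coordinates, such distances are unaffected by where the face sits in the image, and normalising by a reference distance also removes dependence on face size:

```python
import math
from itertools import combinations

def distance_feature_vector(landmarks):
    """Convert a list of (x, y) landmark coordinates into a normalised
    vector of pairwise Euclidean distances.

    The distances are invariant to translation of the face within the
    image; dividing by the largest distance additionally makes the
    vector invariant to uniform scaling (face size)."""
    # One distance per unordered pair of landmarks: N*(N-1)/2 values.
    dists = [math.dist(p, q) for p, q in combinations(landmarks, 2)]
    ref = max(dists)  # reference distance used for scale normalisation
    return [d / ref for d in dists]
```

In practice the reference distance might instead be a fixed anatomical measure such as the inter-ocular distance, but the translation- and scale-invariance argument is the same.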
