Morphset: Augmenting Categorical Emotion Datasets With Dimensional Affect Labels Using Face Morphing

Emotion recognition and understanding are vital components of human-machine interaction. Dimensional models of affect, such as those based on valence and arousal, capture the complexity of human emotional states better than traditional categorical ones. However, dimensional emotion annotations are difficult and expensive to collect, so they remain scarce in the affective computing community. To address this, we propose a method that generates synthetic images from existing categorical emotion datasets using face morphing, with full control over the resulting sample distribution and the dimensional labels in the circumplex space, while achieving augmentation factors of 20x or more.

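To make the augmentation idea concrete, below is a minimal sketch of how categorical expression images could be morphed into new samples with interpolated valence-arousal labels. It is not the paper's implementation: the file names, the prototype (valence, arousal) values, and the simple pixel cross-dissolve are all assumptions for illustration. A full face morph would additionally warp the geometry using facial landmarks (e.g. Delaunay triangulation of dlib landmarks); this sketch only blends pre-aligned images and linearly interpolates the labels in the circumplex space.

```python
import numpy as np
import cv2


def interpolate_affect(label_a, label_b, alpha):
    """Linearly interpolate (valence, arousal) labels in the circumplex space.

    label_a, label_b: (valence, arousal) tuples in [-1, 1].
    alpha: morphing weight in [0, 1]; 0 returns label_a, 1 returns label_b.
    """
    va = np.asarray(label_a, dtype=float)
    vb = np.asarray(label_b, dtype=float)
    return tuple((1.0 - alpha) * va + alpha * vb)


def morph_pair(img_a, img_b, alpha):
    """Cross-dissolve two pre-aligned face images with weight alpha.

    Simplified stand-in for a full landmark-based morph: only pixel
    intensities are blended, and the faces are assumed to be aligned.
    """
    img_b = cv2.resize(img_b, (img_a.shape[1], img_a.shape[0]))
    return cv2.addWeighted(img_a, 1.0 - alpha, img_b, alpha, 0.0)


if __name__ == "__main__":
    # Hypothetical usage: morph a neutral face toward a happy face of the
    # same subject at several intensities, producing new samples with
    # interpolated valence-arousal labels.
    neutral = cv2.imread("subject01_neutral.png")   # assumed file names
    happy = cv2.imread("subject01_happy.png")
    neutral_va, happy_va = (0.0, 0.0), (0.8, 0.5)   # assumed prototype labels

    for alpha in np.linspace(0.2, 1.0, 5):
        sample = morph_pair(neutral, happy, float(alpha))
        valence, arousal = interpolate_affect(neutral_va, happy_va, float(alpha))
        cv2.imwrite(f"subject01_happy_{alpha:.1f}.png", sample)
        print(f"alpha={alpha:.1f} -> valence={valence:.2f}, arousal={arousal:.2f}")
```

Sweeping the morphing weight (and, in the full method, morphing between different expression pairs and subjects) is what yields the large augmentation factors: each original image pair can produce many intermediate samples, each with its own point in the valence-arousal plane.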