Identifying Real and Posed Smiles from Observers' Galvanic Skin Response and Blood Volume Pulse

This study addresses the question of whether the galvanic skin response (GSR) and blood volume pulse (BVP) of untrained and unaided observers can be used to identify real and posed smiles across different sets of smile videos and images. Observers were shown smiling faces as videos or images, either singly or in pairs, and asked to recognise each viewed smile as real or posed. We created four experimental situations: single images (SI), single videos (SV), paired images (PI), and paired videos (PV). The GSR and BVP signals were recorded and processed. Our machine learning classifiers reached highest accuracies of 93.3% (PV), 87.6% (PI), 92.0% (SV), and 91.7% (SI). Finally, PV and SI were found to be the easiest and hardest conditions, respectively, for identifying real and posed smiles. Overall, we demonstrated that observers' subconscious physiological signals (GSR and BVP) can be used to identify real and posed smiles with good accuracy.
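
The abstract does not specify which features or classifiers were used; as an illustrative sketch only, the snippet below shows one plausible way to turn an observer's GSR and BVP recordings from a trial into a feature vector and evaluate a classifier on real-versus-posed labels. The sampling rate, feature set, SVM classifier, and the `trials` variable are assumptions for illustration, not the authors' method.

```python
# Minimal sketch (not the authors' pipeline): simple GSR/BVP feature
# extraction followed by cross-validated classification.
import numpy as np
from scipy.signal import find_peaks
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def extract_features(gsr, bvp, fs=128):
    """Compute basic statistical features for one observer trial.

    gsr, bvp : 1-D arrays of raw signal samples (hypothetical inputs).
    fs       : sampling rate in Hz (assumed value).
    """
    feats = []
    for sig in (gsr, bvp):
        feats += [sig.mean(), sig.std(), sig.min(), sig.max(),
                  np.mean(np.abs(np.diff(sig)))]  # mean absolute first difference
    # Rough heart-rate estimate from BVP peak count over the trial duration.
    peaks, _ = find_peaks(bvp, distance=fs // 3)
    feats.append(len(peaks) / (len(bvp) / fs) * 60.0)  # beats per minute
    return np.array(feats)

# Hypothetical usage: `trials` is a list of (gsr, bvp, label) tuples,
# where label is 1 for a real smile and 0 for a posed smile.
# X = np.vstack([extract_features(g, b) for g, b, _ in trials])
# y = np.array([lab for _, _, lab in trials])
# print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())
```

In practice, the choice of window, normalisation, and classifier would need to be tuned per experimental condition (SI, SV, PI, PV); the sketch only illustrates the general signal-to-label workflow described in the abstract.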
