Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample

With advances in both hardware and software, estimating human affective states has become feasible. Movie clips are widely used for this purpose because they elicit emotions in a replicable way. However, cultural differences may reduce the effectiveness of some video clips in eliciting the target emotions. In this paper, we describe several sensors and techniques to measure and validate emotion elicitation, and to investigate the relationship between cultural acceptance and the elicitation of universal expressions of affect using movie clips. For emotion elicitation, a standardised list of English-language clips is compared with an initial set of Arabic video clips. For validation, bio-signal devices are used to measure the physiological and behavioural responses associated with the emotional stimuli. These responses were recorded from 29 subjects of Arabic background while they watched the selected clips. For six-class emotion classification, a multiclass SVM using the physiological and behavioural measures as input achieved a higher recognition rate for emotions elicited by the Arabic video clips (avg. 60%) than by the English video clips (avg. 52%). These results suggest that video clips drawn from the subjects' own culture are more likely to elicit the target emotions. In addition to the physiological and behavioural measurements, an online survey, with an average of 220 respondents per clip, was conducted to evaluate the effectiveness of the selected video clips in eliciting the target emotions; its results supported the findings.
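The classification step described above can be illustrated with a minimal sketch. This is not the authors' code: the feature set, emotion labels, and data here are placeholders, and only the overall shape (per-clip physiological/behavioural feature vectors fed to a six-class SVM, evaluated by cross-validation) follows the abstract.

```python
# Hedged sketch of six-class emotion classification with a multiclass SVM.
# Features and labels are synthetic stand-ins; real inputs would be
# physiological/behavioural measures (e.g. EEG bands, skin conductance,
# pupil size) extracted per subject per clip.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

EMOTIONS = ["amusement", "sadness", "anger", "fear", "disgust", "surprise"]

rng = np.random.default_rng(0)
n_subjects, n_features = 29, 12  # 29 subjects, hypothetical feature count
X = rng.normal(size=(n_subjects * len(EMOTIONS), n_features))
y = np.repeat(np.arange(len(EMOTIONS)), n_subjects)

# SVC handles the six classes via one-vs-one decomposition (as in LIBSVM);
# standardising features first is the usual practice for RBF kernels.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

With real features, per-clip accuracies in this setup could then be averaged separately for the Arabic and English clip sets to produce the kind of comparison reported above.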
