Investigating the Influence of Personal Memories on Video-Induced Emotions

This paper contributes to the automatic estimation of the subjective emotional experience that audiovisual media content induces in individual viewers, e.g., to support affect-based recommendations. Accurately predicting these responses is challenging because of their highly person-dependent and situation-specific nature. Findings from psychology indicate that an important driver of the emotional impact of media is the triggering of personal memories in observers. However, existing research on automated prediction focuses on the isolated analysis of audiovisual content, ignoring such contextual influences. In a series of empirical investigations, we (1) quantify the impact of associated personal memories on viewers' emotional responses to music videos in the wild and (2) assess the potential value of information about triggered memories for personalizing automatic predictions in this setting. Our findings indicate that the occurrence of memories intensifies emotional responses to videos. Moreover, information about viewers' memory responses explains more variation in video-induced emotions than either the identity of the videos or relevant viewer characteristics (e.g., personality or mood). We discuss the implications of these results for existing approaches to automated prediction and outline directions for developing memory-sensitive alternatives.
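A minimal sketch of how such differences in explained variation could be quantified, assuming per-rating data with a binary memory indicator, a video identifier, and viewer-trait scores (all column names below are hypothetical): fit linear mixed-effects models with a random intercept per viewer and compare their marginal R² values, i.e., the share of variance attributable to the fixed effects. This illustrates the general approach only, not the paper's actual analysis pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def marginal_r2(result):
    # Marginal R^2 (Nakagawa-style): variance of the fixed-effect predictions
    # divided by the total of fixed, random-intercept, and residual variance.
    fixed_pred = result.model.exog @ result.fe_params.values
    var_fixed = np.var(fixed_pred)
    var_random = float(result.cov_re.iloc[0, 0])  # random-intercept variance
    var_resid = result.scale                      # residual variance
    return var_fixed / (var_fixed + var_random + var_resid)

def fit_and_score(df, predictors):
    # One linear mixed-effects model per predictor set, random intercept per viewer.
    model = smf.mixedlm(f"valence ~ {predictors}", df, groups=df["participant"])
    return marginal_r2(model.fit())

# Hypothetical usage: one row per (participant, video) emotion rating.
# df = pd.read_csv("responses.csv")
# r2_memory = fit_and_score(df, "memory_triggered")             # memory occurrence
# r2_video  = fit_and_score(df, "C(video_id)")                  # video identity
# r2_traits = fit_and_score(df, "extraversion + mood_valence")  # viewer characteristics
```

Comparing the resulting marginal R² values indicates which source of information accounts for more of the variation in reported emotion, while the random intercept absorbs stable between-viewer differences.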
