Exploring Domain Knowledge for Affective Video Content Analyses

Well-established film grammar is often used to manipulate the visual and audio elements of videos in order to evoke audiences' emotional experiences. Such film grammar, referred to here as domain knowledge, is crucial for affective video content analysis, but has not yet been thoroughly explored. In this paper, we propose a novel method for analyzing the affective content of videos by exploiting this domain knowledge. Specifically, taking visual elements as an example, we first infer probabilistic dependencies between visual elements and emotions from summarized film grammar. We then express the domain knowledge as constraints and formulate affective video content analysis as a constrained optimization problem. Experiments on the LIRIS-ACCEDE and DEAP databases demonstrate that the proposed method can successfully leverage well-established film grammar for better emotion classification from video content.
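The core idea of casting domain knowledge as constraints on a learned emotion classifier can be illustrated with a minimal sketch. The example below is an assumption-laden toy, not the paper's actual formulation: it trains a logistic-regression valence classifier on two hypothetical visual features (brightness and saturation) while a soft penalty encodes one illustrative film-grammar rule, namely that brightness should not contribute negatively to valence (i.e., its weight is kept non-negative).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data: each clip is described by two visual features
# (brightness, saturation); label 1 = positive valence.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w, lam=5.0):
    """Cross-entropy plus a soft penalty encoding the (assumed)
    film-grammar constraint that the brightness weight w[0]
    must stay non-negative."""
    p = sigmoid(X @ w[:2] + w[2])
    eps = 1e-9
    ce = -np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    penalty = lam * max(0.0, -w[0]) ** 2  # active only if w[0] < 0
    return ce + penalty

res = minimize(loss, np.zeros(3), method="L-BFGS-B")
print(res.x)  # learned weights; w[0] respects the constraint
```

In practice the paper's constraints are probabilistic dependencies inferred from film grammar rather than a single hand-coded sign constraint, but the penalty-augmented objective conveys how such knowledge can steer an otherwise data-driven classifier.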
