Emotional event detection using relevance feedback

Affective content analysis is necessary to represent a user's preferences in applications such as video retrieval and video abstraction. In this paper, we propose a new method to detect emotional events such as fear, sadness, and joy in video data using a relevance feedback scheme. We ask the user to provide feedback on the emotional relevance of each video shot. The system then learns from this training data to improve its performance in detecting emotional events. Even though only simple low-level features are used, the experimental results are encouraging.
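The abstract does not specify the exact feedback update used; a minimal sketch of one classic relevance-feedback scheme (a Rocchio-style query refinement over low-level shot features) might look like the following. The feature dimensions, weights, and `rocchio_update` helper are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def rocchio_update(query, relevant, irrelevant, alpha=1.0, beta=0.75, gamma=0.25):
    """Refine a query feature vector from user feedback (classic Rocchio rule):
    move toward the mean of shots the user marked relevant, away from the
    mean of shots marked irrelevant."""
    q = alpha * query
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(irrelevant):
        q = q - gamma * np.mean(irrelevant, axis=0)
    return q

# Toy 3-D low-level shot features (e.g., motion, audio energy, color) -- hypothetical.
query = np.array([0.5, 0.5, 0.5])
relevant = np.array([[0.9, 0.8, 0.2], [0.8, 0.9, 0.3]])  # shots marked as the target emotion
irrelevant = np.array([[0.1, 0.2, 0.9]])                 # shots marked not relevant
refined = rocchio_update(query, relevant, irrelevant)
```

After refinement, candidate shots would be re-ranked by their distance to the refined query vector, and the loop repeats with the next round of user feedback.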
