Tracking liking state in brain activity while watching multiple movies

Emotion is valuable information in applications ranging from human-computer interaction to automated multimedia content delivery. Conventional emotion recognition methods have relied on speech prosody, facial expressions, and body language. However, these cues may not be expressed while people watch a movie. In recent years, some studies have used electroencephalogram (EEG) signals for emotion recognition, but the EEG data were analyzed over each movie scene as a whole for emotion classification, so detailed information about changes in emotional state could not be extracted. In this study, we use EEG to track affective state while subjects watch multiple movies. In our experiments, we measured the continuous liking state of subjects watching three types of movies and then constructed subject-dependent emotional state tracking models. We used a support vector machine (SVM) as the classifier and support vector regression (SVR) for regression. The best classification accuracy was 77.6%, and the best regression model achieved a correlation coefficient of 0.645 between the actual and predicted liking states. These results demonstrate that continuous emotional state can be predicted with our EEG-based method.
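To make the pipeline concrete, below is a minimal sketch of the subject-dependent model described in the abstract: an SVM for discrete liking classification and SVR for tracking the continuous liking state, evaluated by the correlation between actual and predicted ratings. The feature layout, windowing, and all variable names are assumptions for illustration; the abstract does not specify the feature extraction.

```python
# Sketch only: the 32 features per EEG window (e.g. band powers per channel)
# and the continuous liking ratings are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr
from sklearn.svm import SVC, SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 32))          # 600 EEG windows x 32 features (placeholder)
y_cont = rng.uniform(-1, 1, size=600)   # continuous liking rating per window
y_cls = (y_cont > 0).astype(int)        # binarized liking / not-liking label

X_tr, X_te, yc_tr, yc_te, yr_tr, yr_te = train_test_split(
    X, y_cls, y_cont, test_size=0.3, random_state=0)

# Classification: predict the discrete liking state of each window.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, yc_tr)
print("classification accuracy:", clf.score(X_te, yc_te))

# Regression: track the continuous liking state; report the correlation
# coefficient between actual and predicted ratings, as in the abstract.
reg = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
reg.fit(X_tr, yr_tr)
r, _ = pearsonr(yr_te, reg.predict(X_te))
print("correlation coefficient:", r)
```

Since the models are subject-dependent, this training procedure would be repeated per subject rather than pooled across subjects.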
