Single Trial Classification of EEG and Peripheral Physiological Signals for Recognition of Emotions Induced by Music Videos

Recently, the field of automatic recognition of users' affective states has gained a great deal of attention. Automatic, implicit recognition of affective states has many applications, ranging from personalized content recommendation to automatic tutoring systems. In this work, we present promising results from our research on the classification of emotions induced by watching music videos. We show robust correlations between users' self-assessments of arousal and valence and the frequency powers of their EEG activity. We present methods for single-trial classification using both EEG and peripheral physiological signals. For EEG, an average (maximum) classification rate of 55.7% (67.0%) for arousal and 58.8% (76.0%) for valence was obtained. For peripheral physiological signals, the results were 58.9% (85.5%) for arousal and 54.2% (78.5%) for valence.
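The abstract does not detail the feature-extraction pipeline, but single-trial EEG classification from "frequency powers" typically means computing spectral band power per trial and feeding it to a classifier. The sketch below is a minimal illustration of that idea, not the authors' method: it uses only NumPy, synthetic data, and hypothetical band limits (theta/alpha/beta) commonly seen in affect studies.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Power of `signal` in the [low, high] Hz band via a periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

def eeg_features(trial, fs=256):
    """Band-power feature vector for one trial (hypothetical band limits)."""
    bands = [(4, 8), (8, 12), (12, 30)]  # theta, alpha, beta
    return np.array([band_power(trial, fs, lo, hi) for lo, hi in bands])

# Synthetic demo: a trial dominated by a 10 Hz (alpha) oscillation
fs = 256
t = np.arange(fs * 2) / fs
trial = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))
feats = eeg_features(trial, fs)
print(feats.argmax())  # index of the dominant band (1 = alpha here)
```

In a full pipeline, such per-trial feature vectors would be passed to a standard classifier to predict high vs. low arousal or valence labels from the self-assessments.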
