An Emotion Model for Music Using Brain Waves

Every person reacts differently to music, so the task is to identify the specific set of musical features that have a significant effect on emotion for an individual. Previous research has used self-reported emotions or tags to annotate short music segments with discrete labels. Our approach instead uses an electroencephalograph (EEG) to record the subject's reaction to music. The emotion spectrum analysis method is applied to the recorded electric potentials to produce continuous-valued annotations of four emotional states for each segment of the music. Musical features are obtained by processing music information from MIDI files, which are divided into several segments using a windowing technique. The extracted features are used in two separate supervised classification algorithms to build the emotion models. The resulting classifiers predict the emotion labels with a minimum error rate of 5%.
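The pipeline the abstract describes can be sketched in miniature: segment a note sequence with a sliding window, extract features per segment, and fit a supervised model mapping features to continuous emotion values. The window sizes, feature definitions, and the 1-nearest-neighbour learner below are illustrative assumptions only; the paper itself uses MIDI-derived features (e.g. via a tool such as jSymbolic) and its own pair of classifiers.

```python
# Hypothetical sketch of a windowed music-emotion pipeline.
# All names, window parameters, and features are assumptions
# for illustration, not the authors' actual configuration.

def window_segments(notes, size, step):
    """Split a list of (onset, pitch) notes into overlapping windows."""
    segments = []
    start = 0
    while start + size <= len(notes):
        segments.append(notes[start:start + size])
        start += step
    return segments

def extract_features(segment):
    """Toy per-segment features: mean pitch and pitch range."""
    pitches = [pitch for _, pitch in segment]
    return (sum(pitches) / len(pitches), max(pitches) - min(pitches))

class OneNNRegressor:
    """Minimal 1-nearest-neighbour regressor standing in for the
    paper's supervised learners; predicts the continuous emotion
    value of the closest training segment."""

    def fit(self, features, labels):
        self.features, self.labels = features, labels
        return self

    def predict(self, x):
        dists = [sum((a - b) ** 2 for a, b in zip(x, f))
                 for f in self.features]
        return self.labels[dists.index(min(dists))]
```

In use, each windowed segment would be paired with the continuous-valued EEG-derived annotation recorded over the same time span, and the model trained on those pairs.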
