Identifying Emotion Segments in Music by Discovering Motifs in Physiological Data

Music can induce different emotions in listeners. We propose a system that identifies music segments which induce specific emotions in the listener. The work involves building a knowledge base that maps affective states (happiness, sadness, etc.) to musical features (rhythm, chord progression, etc.); constructing this knowledge base requires background knowledge from music and the psychology of emotion. Psychophysiological responses of a user, particularly the blood volume pulse, are recorded while the user listens to music. These signals are analyzed and mapped to the musical features of the songs heard. A motif discovery algorithm from data mining is adapted to analyze the physiological signals; the motifs it finds indicate points of interest in the music. The discovered motifs are stored in a library of patterns and used to identify other songs with similar musical content. Results show that the selected motifs correspond to similar chord progressions, some of which involve chords frequently used in Western pop music.
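The abstract describes adapting a time-series motif discovery algorithm to physiological (blood volume pulse) signals. As a rough illustration only, and not the authors' actual method, the sketch below implements the simplest form of motif discovery: a brute-force search for the closest pair of non-overlapping, z-normalized subsequences in a one-dimensional signal. The function name `find_motif_pair`, the window length, and the synthetic pulse-like trace are assumptions made for this example.

```python
import numpy as np

def znorm(x, eps=1e-8):
    """Z-normalize a subsequence so matching is offset- and amplitude-invariant."""
    return (x - x.mean()) / (x.std() + eps)

def find_motif_pair(signal, window, exclusion=None):
    """Brute-force 1-motif discovery: return (distance, i, j) for the pair of
    non-overlapping subsequences of length `window` with the smallest
    z-normalized Euclidean distance."""
    if exclusion is None:
        exclusion = window  # skip trivial matches that overlap each other
    n = len(signal) - window + 1
    subs = np.array([znorm(signal[k:k + window]) for k in range(n)])
    best = (np.inf, -1, -1)
    for i in range(n):
        for j in range(i + exclusion, n):
            d = np.linalg.norm(subs[i] - subs[j])
            if d < best[0]:
                best = (d, i, j)
    return best

if __name__ == "__main__":
    # Synthetic stand-in for a blood volume pulse trace: noise with a
    # pulse-like pattern planted twice; the motif pair should recover it.
    rng = np.random.default_rng(0)
    bvp = 0.3 * rng.standard_normal(1000)
    pattern = np.sin(np.linspace(0, 3 * np.pi, 80))
    bvp[120:200] += pattern   # first occurrence
    bvp[640:720] += pattern   # second occurrence
    dist, i, j = find_motif_pair(bvp, window=80)
    print(f"motif pair at samples {i} and {j}, distance {dist:.3f}")
```

Practical systems replace this quadratic pairwise search with approximate schemes such as symbolic discretization combined with random projections, which is closer in spirit to the probabilistic motif discovery methods the paper builds on.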
