Affective Jukebox: A Confirmatory Study of EEG Emotional Correlates in Response to Musical Stimuli

This paper presents a proof-of-concept pilot study investigating whether two-dimensional arousal-valence correlates derived from electroencephalogram (EEG) readings can be used to select music matching the affective state of a user. Self-reported emotional states are used to evaluate a system that estimates arousal and valence from EEG and selects tracks from a real-time jukebox, using stimuli with strong emotional connotations determined by a perceptual scaling analysis. Statistical analysis of participant responses suggests that this approach provides a feasible platform for further experimentation. Future work could use affective correlates of EEG measurements to control real-time musical systems for arrangement, re-composition, re-mixing, and generative composition via a neurofeedback mechanism that responds to listener affective states.
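The pipeline described above can be sketched in a few lines. The sketch below is illustrative only, not the authors' implementation: it assumes arousal is approximated by a beta/alpha band-power ratio and valence by frontal alpha asymmetry (a common heuristic in EEG emotion-recognition work), and that jukebox tracks carry hypothetical hand-annotated arousal-valence coordinates. The band-power inputs, channel names (F3/F4), and track library are all placeholder assumptions.

```python
import numpy as np

def arousal_valence(alpha_f3, beta_f3, alpha_f4, beta_f4):
    """Estimate an (arousal, valence) point from frontal band powers.

    Inputs are hypothetical alpha/beta band powers for channels F3 and F4
    (e.g. from a Welch PSD over a short EEG window).
    """
    # Arousal heuristic: more beta relative to alpha suggests higher arousal.
    arousal = (beta_f3 + beta_f4) / (alpha_f3 + alpha_f4)
    # Valence heuristic: frontal alpha asymmetry (relatively lower
    # left-hemisphere alpha is associated with positive valence).
    valence = alpha_f4 - alpha_f3
    return arousal, valence

def select_track(state, library):
    # Jukebox step: pick the track whose annotated (arousal, valence)
    # coordinates lie nearest the listener's estimated affective state.
    return min(library, key=lambda t: np.hypot(t["arousal"] - state[0],
                                               t["valence"] - state[1]))

# Placeholder track annotations; a real system would derive these from a
# perceptual scaling study of the stimuli.
library = [
    {"title": "calm_positive",  "arousal": 0.4, "valence": 0.6},
    {"title": "tense_negative", "arousal": 1.6, "valence": -0.5},
]

state = arousal_valence(alpha_f3=0.9, beta_f3=1.2, alpha_f4=1.1, beta_f4=1.3)
print(select_track(state, library)["title"])
```

In a real-time system the band powers would be recomputed over a sliding window and the selection re-evaluated continuously, closing the neurofeedback loop the abstract describes.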
