Classifying EEG Recordings of Rhythm Perception

Electroencephalography (EEG) recordings of rhythm perception might contain enough information to distinguish different rhythm types or genres, or even to identify the rhythms themselves. In this paper, we present initial classification results obtained with deep learning techniques on EEG data recorded during a rhythm perception study in Kigali, Rwanda. We tested 13 adults (mean age 21) who performed three behavioral tasks using rhythmic tone sequences derived from either East African or Western music. For the EEG testing, 24 rhythms (half East African and half Western, with identical tempo and based on a 2-bar 12/8 scheme) were each repeated for 32 seconds. During presentation, the participants' brain waves were recorded via 14 EEG channels. We applied stacked denoising autoencoders and convolutional neural networks to the collected data to distinguish East African from Western rhythms at the group level and for individual participants. Furthermore, we investigated the extent to which these techniques can recognize the individual rhythms.
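To make the approach concrete, the core idea of a stacked denoising autoencoder is to learn features by reconstructing inputs from deliberately corrupted versions of themselves, layer by layer. The sketch below is a minimal single-layer denoising autoencoder in NumPy with tied weights and masking noise; it is an illustrative assumption, not the paper's implementation (which used the Pylearn2 library), and the toy input matrix stands in for preprocessed EEG feature vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_denoising_autoencoder(X, n_hidden=32, noise=0.3, lr=0.1, epochs=200):
    """Train one denoising autoencoder layer on rows of X (values in [0, 1])."""
    n, d = X.shape
    W = rng.normal(0.0, 0.1, size=(d, n_hidden))  # tied encoder/decoder weights
    b_h = np.zeros(n_hidden)
    b_o = np.zeros(d)

    for _ in range(epochs):
        # Corrupt the input by zeroing a random subset of entries (masking noise).
        mask = rng.random(X.shape) > noise
        X_tilde = X * mask
        # Forward pass: encode the corrupted input, decode back to input space.
        H = sigmoid(X_tilde @ W + b_h)
        X_hat = sigmoid(H @ W.T + b_o)
        # Backpropagate the mean squared reconstruction error (tied weights,
        # so the encoder and decoder gradients are summed into one update).
        err = (X_hat - X) / n
        d_o = err * X_hat * (1.0 - X_hat)
        d_h = (d_o @ W) * H * (1.0 - H)
        W -= lr * (X_tilde.T @ d_h + d_o.T @ H)
        b_h -= lr * d_h.sum(axis=0)
        b_o -= lr * d_o.sum(axis=0)

    return W, b_h, b_o

# Toy stand-in for preprocessed EEG feature vectors (64 windows, 20 features).
X = rng.random((64, 20))
W, b_h, b_o = train_denoising_autoencoder(X)
H = sigmoid(X @ W + b_h)  # learned hidden features, fed to the next layer or a classifier
```

Stacking repeats this procedure: the hidden activations `H` of one trained layer become the input of the next, and a supervised classifier (e.g. softmax regression) is trained on the topmost features.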
