Using Convolutional Neural Networks to Recognize Rhythm Stimuli from Electroencephalography Recordings

Electroencephalography (EEG) recordings of rhythm perception might contain enough information to distinguish different rhythm types/genres or even to identify the rhythms themselves. We apply convolutional neural networks (CNNs) to analyze and classify EEG data recorded within a rhythm perception study in Kigali, Rwanda, which comprised 12 East African and 12 Western rhythmic stimuli, each presented in a loop for 32 seconds to 13 participants. We investigate the impact of the data representation and the pre-processing steps on this classification task and compare different network structures. Using CNNs, we are able to recognize individual rhythms from the EEG with a mean classification accuracy of 24.4% (chance level 4.17%) over all subjects, looking at less than three seconds of data from a single channel. By aggregating predictions across multiple channels, a mean accuracy of up to 50% can be achieved for individual subjects.
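The abstract does not specify how the per-channel predictions are aggregated. A minimal sketch, assuming a soft-voting scheme in which the per-channel class probabilities produced by the CNN are averaged before taking the argmax (the function name and example values are hypothetical, not taken from the paper):

```python
import numpy as np

def aggregate_channel_predictions(channel_probs):
    """Soft voting: average class probabilities over channels, return argmax class.

    channel_probs: array-like of shape (n_channels, n_classes), one
    probability distribution per EEG channel for the same stimulus segment.
    """
    probs = np.asarray(channel_probs, dtype=float)
    mean_probs = probs.mean(axis=0)  # average distribution over channels
    return int(mean_probs.argmax())

# Hypothetical example: 3 channels voting over 4 rhythm classes.
p = [[0.1, 0.6, 0.2, 0.1],
     [0.2, 0.5, 0.2, 0.1],
     [0.4, 0.3, 0.2, 0.1]]
print(aggregate_channel_predictions(p))  # -> 1
```

Hard (majority) voting over per-channel argmax labels would be an alternative; soft voting retains each channel's confidence and is the more common default when calibrated probabilities are available.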
