Decoding auditory attention to instruments in polyphonic music using single-trial EEG classification

OBJECTIVE Polyphonic music (music in which several instruments play in parallel) is an intuitive way of embedding multiple information streams. The different instruments in a musical piece form concurrent information streams that seamlessly integrate into a coherent and hedonically appealing entity. Here, we explore polyphonic music as a novel stimulation approach for use in a brain-computer interface. APPROACH In a multi-streamed oddball experiment, participants shifted selective attention to one of three instruments in music audio clips. Each instrument formed an oddball stream with its own standard stimuli (a repetitive musical pattern) and deviants (an infrequent variation of that pattern). MAIN RESULTS Contrasting attended with unattended instruments, ERP analysis shows subject- and instrument-specific responses, including the P300 and early auditory components. The attended instrument could be classified offline with a mean accuracy of 91% across 11 participants. SIGNIFICANCE This is a proof of concept that attention paid to a particular instrument in polyphonic music can be inferred from ongoing EEG, a finding that is potentially relevant for both brain-computer interface and music research.
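The offline decoding step can be illustrated with a minimal sketch: EEG epochs time-locked to each musical pattern are reduced to windowed mean amplitudes per channel and fed to a shrinkage-regularized linear classifier, in the spirit of standard single-trial ERP classification pipelines. The synthetic data, window layout, and scikit-learn classifier below are illustrative assumptions, not the authors' actual analysis.

```python
# Minimal sketch of single-trial ERP classification (attended vs. unattended stream).
# Assumes epoched EEG of shape (epochs x channels x samples); uses scikit-learn's
# Ledoit-Wolf shrinkage LDA as a stand-in classifier. Not the authors' pipeline.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic placeholder data: 200 epochs, 32 channels, 100 samples per epoch.
n_epochs, n_channels, n_samples = 200, 32, 100
epochs = rng.standard_normal((n_epochs, n_channels, n_samples))
labels = rng.integers(0, 2, n_epochs)  # 1 = attended stream, 0 = unattended

# Inject a crude P300-like deflection into attended epochs so the sketch
# yields above-chance accuracy on this synthetic data.
epochs[labels == 1, :, 50:70] += 0.3

# Feature extraction: mean amplitude in consecutive time windows per channel,
# a common feature choice for ERP classification.
n_windows = 10
window_means = epochs.reshape(n_epochs, n_channels, n_windows, -1).mean(axis=3)
features = window_means.reshape(n_epochs, -1)  # epochs x (channels * windows)

# Shrinkage-regularized LDA (automatic Ledoit-Wolf shrinkage of the covariance).
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

In a real application, the binary scores per pattern would be aggregated over the three instrument streams of a clip to decide which instrument the listener attended to.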
