Multimodal target detection using single trial evoked EEG responses in single and dual-tasks

The detection of event-related potentials in the electroencephalogram (EEG) signal is a common approach to creating a brain-computer interface (BCI). Detection of evoked responses can be enhanced by having the user selectively attend to specific stimuli presented in the BCI task. Because BCI users need a system that performs well in a variety of contexts, including ones that may impair selective attention, it is critical to understand how single-trial detection is affected by attention. We tested 16 participants using a rapid serial visual/auditory presentation paradigm under three conditions: detecting a visual target, detecting an auditory target, and detecting both visual and auditory targets. Behavioral performance indicated that the visual task was more difficult than the auditory task. Consistent with the higher behavioral difficulty of the visual task, single-trial classification performance, measured as the area under the ROC curve, showed no difference between the single-task and dual-task conditions for visual target detection (mean = 0.76). For auditory target detection, however, the area under the curve was significantly lower in the dual-task than in the single-task condition (mean = 0.81 for the single task, 0.75 for the dual task). These results support the conclusion that single-trial target detection is impaired when attention is divided between multiple tasks.
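To make the single-trial evaluation concrete, the sketch below shows one common way to estimate single-trial target-detection AUC from epoched EEG using a shrinkage-regularized linear classifier with cross-validation. The synthetic data, array dimensions, and choice of classifier are illustrative assumptions for this example, not the authors' exact pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical epoched EEG: n_trials x n_channels x n_samples,
# with binary labels (1 = target, 0 = non-target).
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 200, 32, 128
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)
# Inject a weak simulated evoked deflection into target trials so the
# example yields above-chance AUC.
X[y == 1, :, 50:80] += 0.3

# Flatten each epoch into a spatiotemporal feature vector and score a
# shrinkage-regularized LDA with cross-validated ROC AUC.
X_feat = X.reshape(n_trials, -1)
clf = make_pipeline(
    StandardScaler(),
    LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"),
)
auc = cross_val_score(clf, X_feat, y, cv=5, scoring="roc_auc")
print(f"Single-trial AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```

In practice the AUC reported per condition (single vs. dual task) would be computed on that condition's epochs separately, so the values can be compared across attentional loads as in the abstract.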
