CES-531: Collaborative Brain-Computer Interfaces for Target Detection and Localisation in Rapid Serial Visual Presentation

The rapid serial visual presentation protocol can be used to show images sequentially at the same spatial location at high presentation rates. We used this technique to present aerial images to participants looking for predefined targets (airplanes) at rates ranging from 5 to 12 Hz. We used linear support vector machines for the single-trial classification of event-related potentials from both individual users and pairs of users (in which case we averaged either their individual classifiers' analogue outputs before thresholding or their electroencephalographic signals associated with the same stimuli), with and without the selection of compatible pairs. We considered two tasks - the detection of targets and the identification of the visual hemifield in which targets appeared. While single users did well in both tasks, we found that pairs of participants with similar individual performance provided significant improvements. In particular, in the target-detection task we obtained median improvements in the area under the receiver operating characteristic curve (AUC) of up to 8.3% with respect to single-user BCIs, while in the hemifield-classification task we obtained AUCs up to 7.7% higher than for single users. Furthermore, we found that this second system allows us not only to say whether a target is on the left or the right of an image, but also to recover the target's approximate horizontal position.
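
As a rough illustration of the two fusion strategies described above (averaging the two users' classifiers' analogue outputs before thresholding versus averaging their EEG signals for the same stimuli, with AUC as the performance measure), the following Python sketch uses scikit-learn's LinearSVC and roc_auc_score. The array names, feature layout, and the specific library calls are assumptions made for illustration, not the authors' actual pipeline.

```python
# Minimal sketch of score-level vs. signal-level fusion for a two-user
# collaborative BCI, assuming each user's epochs are already extracted
# as (n_trials, n_features) arrays aligned to the same stimuli.
# Names (X_a, X_b, y, ...) and the use of scikit-learn are illustrative.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_auc_score


def fit_linear_svm(X, y):
    """Train a linear SVM on single-trial ERP feature vectors."""
    clf = LinearSVC()
    clf.fit(X, y)
    return clf


def score_level_fusion(clf_a, clf_b, X_a, X_b):
    """Average the two classifiers' analogue outputs (signed distances
    from the separating hyperplane) before any thresholding."""
    return (clf_a.decision_function(X_a) + clf_b.decision_function(X_b)) / 2.0


def signal_level_fusion(X_a, X_b):
    """Average the two users' EEG feature vectors for each stimulus;
    the averaged epochs are then classified by a single SVM."""
    return (X_a + X_b) / 2.0


# Hypothetical usage (train/test split omitted for brevity):
#   auc_scores = roc_auc_score(
#       y_test, score_level_fusion(clf_a, clf_b, Xa_test, Xb_test))
#   clf_avg = fit_linear_svm(signal_level_fusion(Xa_train, Xb_train), y_train)
#   auc_signal = roc_auc_score(
#       y_test, clf_avg.decision_function(signal_level_fusion(Xa_test, Xb_test)))
```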
