Comparison of tactile, auditory, and visual modality for brain-computer interface use: a case study with a patient in the locked-in state

This paper describes a case study with a patient in the classic locked-in state who currently has no means of independent communication. Following a user-centered approach, we investigated event-related potentials (ERPs) elicited in different modalities for use in brain-computer interface (BCI) systems. Such systems could provide her with an alternative communication channel. To identify the most viable modality for BCI-based communication, classic oddball paradigms (one rare and one frequent stimulus, ratio 1:5) were conducted in the visual, auditory, and tactile modalities (two runs per modality). Classifiers were built on one run and tested offline on the other run (and vice versa). In these paradigms, the tactile modality was clearly superior to the other modalities, yielding high offline accuracy even when classification was performed on single trials only. Consequently, we tested the tactile paradigm online, and the patient successfully selected targets without any error. Furthermore, we investigated the visual and tactile modalities for BCI systems with more than two selection options. In the visual modality, several BCI paradigms were tested offline; neither matrix-based nor so-called gaze-independent paradigms provided a means of control. These results may thus question the gaze independence of current gaze-independent approaches to BCI. A tactile four-choice BCI resulted in high offline classification accuracies, yet online use raised various issues. Although performance was clearly above chance, practical daily-life use appeared unlikely when compared with other communication approaches (e.g., partner scanning). Our results emphasize the need for user-centered design in BCI development, including identification of the best stimulus modality for a particular user. Finally, the paper discusses the feasibility of EEG-based BCI systems for patients in the classic locked-in state and compares BCI with other assistive technology (AT) solutions that we also tested during the study.
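
As a rough illustration of the offline evaluation scheme described above (classifiers built on one run and tested on the other, at the single-trial level), the following Python sketch shows how such a cross-run analysis could be set up with a shrinkage-regularized linear discriminant, a common choice for ERP classification. This is a minimal sketch under assumptions, not the study's actual analysis code: it presumes that EEG epochs have already been extracted, filtered, and flattened into feature vectors, and the names (evaluate_cross_run, run_a, run_b) are hypothetical.

    # Minimal sketch: cross-run offline evaluation of a binary oddball ERP classifier.
    # Assumes preprocessed, epoched EEG data; names and data layout are hypothetical.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def evaluate_cross_run(run_a, run_b):
        """run_a, run_b: tuples (X, y).
        X: (n_epochs, n_features) array, e.g. channels x time points, flattened.
        y: labels in {0, 1}; 0 = frequent (standard) stimulus, 1 = rare (target) stimulus.
        Returns mean single-trial accuracy across both train/test directions."""
        accuracies = []
        for (X_train, y_train), (X_test, y_test) in [(run_a, run_b), (run_b, run_a)]:
            # Shrinkage LDA is robust when epochs are few and features are many.
            clf = LinearDiscriminantAnalysis(solver='lsqr', shrinkage='auto')
            clf.fit(X_train, y_train)
            accuracies.append(clf.score(X_test, y_test))  # single-trial accuracy
        return float(np.mean(accuracies))

In an actual P300-style BCI, single-trial classifier outputs would additionally be aggregated over stimulus repetitions per selection option, with the highest-scoring option selected; note also that with a 1:5 ratio of rare to frequent stimuli, balanced accuracy or AUC can be more informative than raw accuracy.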
