Introducing the tactile speller: an ERP-based brain–computer interface for communication

In this study, a tactile speller was developed and compared with existing visual speller paradigms in terms of classification performance and elicited event-related potentials (ERPs). The fingertips of healthy participants were stimulated with short mechanical taps while electroencephalographic activity was measured. The letters of the alphabet were allocated to different fingers, and subjects could select a finger by silently counting the number of taps delivered to it. The offline and online performance of the tactile speller was compared to the overt and covert attention visual matrix spellers and the covert attention Hex-o-Spell speller. For the tactile speller, binary target versus non-target classification accuracy was 67% on average. Classification and decoding accuracies of the tactile speller were lower than those of the overt matrix speller, higher than those of the covert matrix speller, and similar to those of Hex-o-Spell. The average maximum information transfer rate of the tactile speller was 7.8 bits min⁻¹ (1.51 char min⁻¹), with the best subject reaching a bit-rate of 27 bits min⁻¹ (5.22 char min⁻¹). An increased amplitude of the P300 ERP component was found in response to attended versus unattended stimuli in all speller types. In addition, the tactile and overt matrix spellers also used the N2 component for discriminating between targets and non-targets. Overall, this study shows that it is possible to use a tactile speller for communication. The tactile speller provides a useful alternative to the visual speller, especially for people whose eye gaze is impaired.
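The information transfer rates reported above are conventionally derived from the selection accuracy, the number of selectable classes, and the selection speed via the standard Wolpaw bit-rate formula. The sketch below shows that calculation; the class count and accuracy used in the example are illustrative assumptions, not figures from this study.

```python
import math

def wolpaw_itr_bits(n_classes: int, accuracy: float) -> float:
    """Bits conveyed per selection, using the standard Wolpaw formula:
    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)).
    Returns 0 at or below chance level, where the formula is not meaningful."""
    if accuracy <= 1.0 / n_classes:
        return 0.0
    bits = math.log2(n_classes) + accuracy * math.log2(accuracy)
    if accuracy < 1.0:  # the (1-P) term vanishes at perfect accuracy
        bits += (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_classes - 1))
    return bits

# Illustrative example (hypothetical numbers): selecting one of 6 fingers
# at 85% accuracy, 4 selections per minute.
bits_per_selection = wolpaw_itr_bits(6, 0.85)
print(f"{bits_per_selection * 4:.2f} bits/min")
```

At perfect accuracy the formula reduces to log2(N) bits per selection, which is why spellers with more simultaneous choices (such as a visual matrix) can, in principle, reach higher bit-rates at the same selection speed.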
