Classification of Selective Attention based on Steady-State Somatosensory Evoked Potentials using High-Frequency Vibration Stimuli

In recent years, steady-state somatosensory evoked potential (SSSEP)-based brain-computer interfaces (BCIs) have been developed to assist people with physical disabilities. However, because no well-designed standard tactile vibration stimulator exists for eliciting the SSSEP, SSSEP-based BCIs have not been widely applied in real-world settings. In this study, we aim to develop an SSSEP-based BCI using high-frequency vibration stimuli. In our experiments, the stimuli were generated by coin vibration motors controlled by an Arduino. The feasibility of high-frequency vibration stimuli for eliciting the SSSEP was validated with SSSEP data from two healthy participants. For data collection, the vibration stimulators were attached to the index fingers of the left and right hands, and each participant concentrated on one of the vibration stimuli following visual commands. For data analysis, spatial features were extracted from the acquired EEG signals using common spatial pattern (CSP) filtering, and regularized linear discriminant analysis (RLDA) was used as the classifier. The experimental results showed that the user's selective attention to one of the high-frequency vibration stimuli was classified with an average accuracy of 61.7±9.4%. Based on these results, we conclude that the SSSEP can be elicited by high-frequency vibration stimuli and can be used for BCI control.
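The analysis pipeline described above (CSP spatial filtering to extract band-power features, followed by a regularized LDA classifier) can be sketched as follows. This is a minimal illustration on synthetic, variance-modulated noise standing in for band-passed SSSEP epochs, not the authors' code; scikit-learn's shrinkage LDA is used here as one common realization of RLDA.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def csp_filters(X1, X2, n_pairs=2):
    """Compute CSP spatial filters from two classes of EEG epochs.

    X1, X2: arrays of shape (trials, channels, samples).
    Returns (2 * n_pairs, channels) filters maximizing the variance
    ratio between the two classes.
    """
    mean_cov = lambda X: np.mean([np.cov(trial) for trial in X], axis=0)
    C1, C2 = mean_cov(X1), mean_cov(X2)
    # Generalized eigenproblem: C1 w = lambda (C1 + C2) w
    _, vecs = eigh(C1, C1 + C2)
    # Keep filters at both ends of the eigenvalue spectrum (most discriminative)
    idx = np.concatenate([np.arange(n_pairs), np.arange(-n_pairs, 0)])
    return vecs[:, idx].T

def log_var_features(X, W):
    """Project epochs through CSP filters and take normalized log-variance."""
    Z = np.matmul(W, X)                     # (trials, filters, samples)
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Synthetic stand-in: attention to left vs. right stimulus is simulated
# as extra signal power on different channels (hypothetical data).
n_trials, n_ch, n_samp = 40, 8, 200
X1 = rng.standard_normal((n_trials, n_ch, n_samp))
X2 = rng.standard_normal((n_trials, n_ch, n_samp))
X1[:, 0] *= 3.0   # class 1: elevated power on channel 0
X2[:, 3] *= 3.0   # class 2: elevated power on channel 3

W = csp_filters(X1, X2)
X = np.vstack([log_var_features(X1, W), log_var_features(X2, W)])
y = np.array([0] * n_trials + [1] * n_trials)

# Regularized LDA: Ledoit-Wolf shrinkage of the covariance estimate
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(X, y)
acc = clf.score(X, y)
```

On this strongly separable synthetic data the classifier fits almost perfectly; on real two-participant SSSEP data the paper reports a much lower average accuracy (61.7%), reflecting the weaker signal-to-noise ratio of attention-modulated somatosensory responses.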