A Single-Channel EOG-Based Speller

Electrooculography (EOG) signals, which can be used to infer a user's intentions from eye movements, are widely used in human–computer interface (HCI) systems. Most existing EOG-based HCI systems offer only a limited number of commands because they generally map each command to one of a few types of eye movement, such as looking up, down, left, or right. This paper presents a novel single-channel EOG-based HCI that allows users to spell asynchronously by blinking alone. Forty buttons corresponding to 40 characters are displayed on a graphical user interface and intensified in random order. To select a button, the user blinks in synchrony with the flashes of the target button. Two data processing procedures, support vector machine (SVM) classification and waveform detection, are combined to detect eye blinks: feature vectors extracted from the ongoing EOG signal are fed simultaneously into the SVM classification and waveform detection modules, and the final decision is made by combining their outputs. Three online experiments were conducted with eight healthy subjects. We achieved an average accuracy of 94.4% and a response time of 4.14 s for selecting a character in synchronous mode, as well as an average accuracy of 93.43% and an idle-state false positive rate of 0.03/min in asynchronous mode. These results demonstrate the effectiveness of the proposed single-channel EOG-based speller.
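
To make the two-path detection idea concrete, the sketch below shows one way such a detector could be organized for a single windowed segment of the EOG signal: an SVM decision on simple time-domain features is computed in parallel with a rule-based waveform check, and a blink is reported only when both agree. This is a minimal illustration, not the authors' exact pipeline; the feature set, sampling rate, amplitude and width thresholds, and the agreement rule are all assumptions for the sake of the example.

    # Minimal sketch (assumed parameters, not the paper's exact pipeline):
    # combine an SVM classifier with a simple waveform check to decide
    # whether a single-channel EOG window contains an eye blink.
    import numpy as np
    from scipy.signal import find_peaks
    from sklearn.svm import SVC

    FS = 250  # assumed sampling rate in Hz

    def extract_features(window):
        """Simple time-domain features for one EOG window (assumed feature set)."""
        return np.array([
            window.max() - window.min(),    # peak-to-peak amplitude
            window.std(),                   # signal spread
            np.abs(np.diff(window)).mean()  # mean absolute first difference
        ])

    def waveform_check(window, min_amp=150.0, min_width=0.05, max_width=0.5):
        """Accept the window only if it contains exactly one blink-like peak
        whose amplitude (in assumed microvolt units) and width fall inside
        plausible, illustrative bounds."""
        peaks, props = find_peaks(window, height=min_amp, width=1)
        if len(peaks) != 1:
            return False
        width_s = props["widths"][0] / FS
        return min_width <= width_s <= max_width

    def train_svm(train_windows, labels):
        """Fit an RBF-kernel SVM on labelled blink / non-blink training windows."""
        X = np.vstack([extract_features(w) for w in train_windows])
        return SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, labels)

    def detect_blink(svm, window):
        """Report a blink only when the SVM and the waveform check agree."""
        svm_says_blink = svm.predict(extract_features(window).reshape(1, -1))[0] == 1
        return svm_says_blink and waveform_check(window)

In an online speller, a detector of this kind would be run on the ongoing signal around each button flash, so that a detected blink can be attributed to the button that was being intensified at that moment.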
