EOG/ERP hybrid human-machine interface for robot control

Electrooculogram (EOG) signals are potential responses generated by eye movements, and an event-related potential (ERP) is a special electroencephalogram (EEG) pattern evoked by external stimuli. Both EOG and ERP have been used separately to implement human-machine interfaces that can assist disabled patients in performing daily tasks. In this paper, we present a novel EOG/ERP hybrid human-machine interface that integrates the traditional EOG and ERP interfaces. Eye movements such as blinks, winks, gazes, and frowns are detected from the EOG signals using a double-threshold algorithm. Multiple ERP components, i.e., N170, VPP, and P300, are evoked by inverted face stimuli and classified by linear discriminant analysis (LDA). Based on this hybrid interface, we also design a control scheme for the humanoid robot NAO (Aldebaran Robotics, Inc.). On-line experimental results show that the proposed hybrid interface can effectively control the robot's basic movements and command it to perform various behaviors. While operating the robot manually takes 49.1 s to complete the experimental sessions, the subject is able to finish them in 54.1 s using the proposed EOG/ERP interface.
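
To make the two processing steps named in the abstract concrete, the sketch below shows one plausible interpretation: a double-threshold rule applied to a windowed EOG channel, and LDA applied to ERP feature vectors. The threshold values, window handling, and feature construction (random placeholders here) are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def detect_eog_event(eog_window, low_thr, high_thr):
    """Double-threshold detection on a single EOG window (illustrative).

    An event is flagged only if the signal first crosses the high threshold
    and then remains above the low threshold for the rest of the window,
    which helps reject brief noise spikes. Thresholds are assumed to be
    calibrated per subject; the paper's exact rule may differ.
    """
    above_high = np.abs(eog_window) > high_thr
    if not above_high.any():
        return False
    onset = int(np.argmax(above_high))              # first high-threshold crossing
    return bool(np.all(np.abs(eog_window[onset:]) > low_thr))

# Hypothetical ERP classification: each epoch is a feature vector built from
# N170/VPP/P300 time windows; labels mark target vs. non-target stimuli.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(40, 60))                  # placeholder feature matrix
labels = rng.integers(0, 2, size=40)                # placeholder target labels

lda = LinearDiscriminantAnalysis()
lda.fit(epochs, labels)
print(lda.predict(epochs[:5]))                      # predicted classes for a few epochs
```

In practice the detected EOG events and the LDA decision on the ERP epoch would be mapped to robot commands; that mapping is specific to the paper's control scheme and is not reproduced here.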
