Human–machine interface based on muscular and brain signals applied to a robotic wheelchair

This paper presents a Human-Machine Interface (HMI) based on signals generated by eye blinks or brain activity. The system structure and the signal acquisition and processing stages are described. The signals used in this work are either the myoelectric signal corresponding to the muscular movement of an eye blink or the brain signal corresponding to visual information processing. The variance is the feature extracted from these signals to detect the user's intention. Classification is performed by comparing the variance against a threshold determined experimentally for each user during a training stage. The command options to be sent to the commanded device are presented to the user on the screen of a PDA (Personal Digital Assistant). In the experiments reported here, a robotic wheelchair is used as the commanded device.
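The variance-threshold classification described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the sample windows, and the threshold value are all assumptions made for the example, since the paper determines the threshold per user during training.

```python
import statistics

def window_variance(samples):
    """Variance of one acquisition window of the EMG/EEG signal
    (the feature the paper extracts to detect user intention)."""
    return statistics.pvariance(samples)

def detect_intention(samples, threshold):
    """Classify a window as an intentional event (e.g. an eye blink)
    when its variance exceeds the user's trained threshold."""
    return window_variance(samples) > threshold

# Hypothetical windows: low-amplitude "rest" vs. high-amplitude "blink".
rest  = [0.1, -0.1, 0.05, -0.05, 0.0, 0.1]
blink = [0.1, 2.5, -2.0, 3.0, -2.8, 0.2]

threshold = 1.0  # assumed value; obtained per user in the training stage

print(detect_intention(rest, threshold))   # low variance -> no command
print(detect_intention(blink, threshold))  # high variance -> command issued
```

In this sketch, a detected event would trigger selection of the command option currently highlighted on the PDA screen.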
