Multi-modal interface for communication operated by eye blinks, eye movements, head movements, blowing/sucking and brain waves

This work presents a multi-modal interface for communication by people with disabilities. Installed onboard a robotic wheelchair, the interface lets users with different levels of disability choose among several input modalities: eye blinks, eye movements, head movements, blowing or sucking on a straw, and brain signals. The interface is easy to use and offers a flexible graphical user interface running on a personal digital assistant or tablet. Experiments carried out with both healthy participants and participants with disabilities validate the interface as an assistive communication tool for people with distinct levels of disability.
