Brain-Operated Assistive Devices: the ASPICE Project

The ASPICE project aims to develop a system that allows persons with neuromotor disabilities to improve or recover their mobility (directly or by emulation) and their communication with the surrounding environment. The system pivots around a software controller running on a personal computer, which offers the user an interface for issuing commands through input devices matched to the individual's residual abilities. The controller uses the user's input to operate domotic devices - such as remotely controlled lights and TV sets - and a Sony AIBO robot. The system is currently under clinical validation, which will provide assessment through patients' feedback as well as guidelines for customized system installation.
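The abstract does not give implementation details, but the architecture it describes (one controller core, interchangeable input interfaces, several output devices) can be illustrated with a minimal sketch. The sketch below is only an illustration under assumed names (Controller, OutputDevice, Light, RobotProxy, on_user_selection are all hypothetical, not the project's actual API): whichever input front end matches the user's residual abilities delivers a discrete selection, and the controller dispatches it to the bound device.

```python
# Minimal sketch (hypothetical names) of a controller that maps abstract user
# selections to commands for domotic devices or a robot. Not the ASPICE code.

from abc import ABC, abstractmethod


class OutputDevice(ABC):
    """Anything the controller can drive: a light, a TV set, a robot."""

    @abstractmethod
    def execute(self, command: str) -> None: ...


class Light(OutputDevice):
    def __init__(self, name: str) -> None:
        self.name = name
        self.on = False

    def execute(self, command: str) -> None:
        self.on = command == "on"
        print(f"{self.name}: turned {'on' if self.on else 'off'}")


class RobotProxy(OutputDevice):
    """Stand-in for a mobile robot; a real driver would send motion commands."""

    def execute(self, command: str) -> None:
        print(f"robot: executing motion primitive '{command}'")


class Controller:
    """Maps user selections to (device, command) pairs."""

    def __init__(self) -> None:
        self.bindings: dict[str, tuple[OutputDevice, str]] = {}

    def bind(self, selection: str, device: OutputDevice, command: str) -> None:
        self.bindings[selection] = (device, command)

    def on_user_selection(self, selection: str) -> None:
        # Called by whatever input front end the user operates
        # (scanning keyboard, BCI classifier output, joystick, ...).
        if selection not in self.bindings:
            print(f"unknown selection: {selection}")
            return
        device, command = self.bindings[selection]
        device.execute(command)


if __name__ == "__main__":
    controller = Controller()
    lamp = Light("bedroom lamp")
    controller.bind("lamp_on", lamp, "on")
    controller.bind("lamp_off", lamp, "off")
    controller.bind("robot_forward", RobotProxy(), "forward")

    # Simulate selections arriving from an input interface.
    for sel in ["lamp_on", "robot_forward", "lamp_off"]:
        controller.on_user_selection(sel)
```

The point of the pattern is that input interfaces and output devices stay decoupled: swapping a switch scanner for an EEG-based classifier only changes who calls on_user_selection, not the device bindings.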
