An Intention-Driven Semi-autonomous Intelligent Robotic System for Drinking

In this study, an intention-driven semi-autonomous intelligent robotic (ID-SIR) system is designed and developed to help severely disabled patients live independently. The system mainly consists of a non-invasive brain–machine interface (BMI) subsystem, a robot manipulator, and a visual detection and localization subsystem. Unlike most existing systems, which are remotely controlled via a joystick, head tracking, or eye tracking, the proposed ID-SIR system acquires the user's intention directly from brain signals. Compared with state-of-the-art systems that work only with a specific object in a fixed location, the ID-SIR system can grasp any desired object at an arbitrary location chosen by the user and deliver it to his/her mouth automatically. As one of its main advantages, the patient only needs to issue a single intention command per drinking task; the autonomous robot then completes the remaining control steps, which greatly eases the burden on patients. Eight healthy subjects participated in our experiment, each performing 10 tasks. In each task, the proposed ID-SIR system delivered the desired beverage container to the subject's mouth and then returned it to its original position. The mean accuracy across the eight subjects was 97.5%, demonstrating the effectiveness of the ID-SIR system.
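The one-command workflow described above can be read as a simple control pipeline: the BMI subsystem produces a single target selection, after which the vision and manipulation subsystems run autonomously. The following Python sketch illustrates that flow only as an assumption about how the pieces fit together; the class and method names (DrinkingAssistant, wait_for_intention, localize, and so on) are hypothetical placeholders, not the authors' actual implementation.

```python
# Minimal sketch of the single-intention drinking workflow, assuming three
# subsystems as described in the abstract. All names below are illustrative
# placeholders, not the authors' API.

class DrinkingAssistant:
    def __init__(self, bmi, vision, arm):
        self.bmi = bmi        # non-invasive BMI subsystem (intention decoding)
        self.vision = vision  # visual detection and localization subsystem
        self.arm = arm        # robot manipulator

    def run_task(self):
        # 1. Acquire a single intention command: which container the user wants.
        target = self.bmi.wait_for_intention()

        # 2. Detect and localize the chosen container and the user's mouth.
        container_pose = self.vision.localize(target)
        mouth_pose = self.vision.localize("mouth")

        # 3. Autonomous manipulation: grasp, deliver, wait, and put back.
        self.arm.grasp(container_pose)
        self.arm.move_to(mouth_pose)
        self.arm.wait_for_drinking()
        self.arm.place(container_pose)
```

The design choice this sketch highlights is that the user's involvement ends once the single intention command is decoded; every downstream step is handled by the robot without further input.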
