Demonstration of a Semi-Autonomous Hybrid Brain–Machine Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the Modular Prosthetic Limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system uses hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s to complete the task after system improvements were implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, combining hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.
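The control flow implied by the abstract can be illustrated with a short sketch. This is an assumption-laden illustration, not the authors' implementation: the object and method names (gaze.current_fixation, vision.segment_object, detector.movement_detected, limb.reach/grasp/drop) are hypothetical placeholders, and balanced accuracy is computed here as the mean of sensitivity and specificity, the standard definition behind the reported 91.1% and 92.9% figures.

```python
# A minimal sketch of a HARMONIE-style pipeline as described above: eye tracking
# and computer vision identify a target object, an intracranial (ECoG/depth)
# movement detector gates initiation, and the robotic limb then completes the
# reach-grasp-and-drop sequence autonomously. All interfaces here are
# hypothetical placeholders, not the authors' actual software.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Target:
    """3-D location of a gaze-selected, vision-segmented object."""
    x: float
    y: float
    z: float


def balanced_accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Balanced accuracy = mean of sensitivity and specificity,
    the movement-detection metric reported in the abstract."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return 0.5 * (sensitivity + specificity)


def run_trial(gaze, vision, detector, limb) -> bool:
    """One supervisory-control trial: select a target, gate on neural intent,
    then delegate the low-level motion to the intelligent robot."""
    fixation = gaze.current_fixation()                            # hybrid input 1: eye tracking
    target: Optional[Target] = vision.segment_object(fixation)    # hybrid input 2: computer vision
    if target is None:
        return False                                              # no object under gaze
    if not detector.movement_detected():                          # ECoG-based "go" signal
        return False                                              # suppresses false initiations
    limb.reach(target)                                            # from here the MPL acts
    limb.grasp()                                                  # autonomously, completing the
    limb.drop()                                                   # reach-grasp-and-drop sequence
    return True
```

Gating initiation on a detected neural movement signal, rather than driving the limb continuously, is the design choice that allows baseline false positives to be rejected while trajectory details are delegated to the robot, consistent with the behavior reported in the abstract.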
