Development and Performance Evaluation of a Neural Signal Based Computer Interface

The use of personal computers has increased dramatically since the 1990s, and computers have enabled tremendous achievements in information searching (Internet browsing) and communication (e-mail) around the world. People commonly use standard computer interfaces such as the keyboard and mouse, which are operated through physical contact and movement. These physical interactions inherently involve delicate, coordinated movements of the upper limb, wrist, palm, and fingers.

However, some people cannot use these interfaces because of physical disabilities such as spinal cord injury (SCI), paralysis, or limb amputation. In 2005, the Ministry of Health and Welfare in South Korea estimated that approximately one million people in the country were living with motor disabilities, a number that has risen steadily since 1995. It has also been reported that more than 500,000 individuals are living with SCIs in North America and Europe (Guertin, 2005). If people with disabilities could access computers for tasks such as reading and writing documents, communicating with others, and browsing the Internet, they could carry out a far wider range of activities independently.

One alternative method for giving individuals with disabilities access to computing environments is direct contact with a physical keyboard, as shown in Fig. 1 (a), through the use of mouth sticks and head sticks. These devices, however, are inaccurate and inconvenient to use. Another notable computer interface is the eye-movement tracking system shown in Fig. 1 (b). This interface can perform as fast as, or even faster than, a mouse (Sibert & Jacob, 2000), because eye gaze precedes and supports hand-movement planning (Johansson et al., 2001); signals derived from eye movement are therefore available sooner than those derived from hand movement. Eye movements, however, like other passive, non-command inputs (e.g., gestures and conversational speech), are often neither intentional nor conscious. Whenever a user looks at a point on the computer monitor, a command is activated (Jacob, 1993); consequently, the user cannot look at any point on the monitor without issuing a command, and the eye-movement tracking system therefore produces unintended results (illustrated in the sketch below).

Currently, biomedical scientists are advancing computer interface technology by developing a neural-signal-based computer interface that directly bridges the gap between the human nervous system and the computer. This neural-signal-based interface is the subject of the present work.
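
A small sketch can make the "Midas touch" problem above concrete. The following Python fragment is purely illustrative and is not part of the interface developed in this work; the fixation data, the DWELL_TIME threshold, and both policy functions are assumptions introduced here. It contrasts a naive mapping, in which every fixation issues a command, with dwell-time gating, a common mitigation in gaze-based interfaces.

    DWELL_TIME = 0.5  # hypothetical threshold: seconds a gaze must rest on a target

    def naive_policy(fixations):
        # Every fixation issues a command: the "Midas touch" problem.
        return [target for target, _ in fixations]

    def dwell_policy(fixations):
        # Only fixations held longer than DWELL_TIME issue commands,
        # so the user can glance around the screen without selecting anything.
        return [target for target, duration in fixations if duration >= DWELL_TIME]

    # Hypothetical gaze data: (target looked at, fixation duration in seconds).
    fixations = [("File menu", 0.15), ("clock", 0.20), ("OK button", 0.80)]

    print(naive_policy(fixations))  # ['File menu', 'clock', 'OK button']: every glance clicks
    print(dwell_policy(fixations))  # ['OK button']: only the deliberate look clicks

Under the naive policy every glance, including the incidental one at the clock, produces a command; the dwell policy acts only on the deliberate fixation. The cost is that every selection is delayed by the dwell threshold, which erodes the speed advantage that makes gaze input attractive in the first place.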

[1] X. Gao et al., "Design and implementation of a brain-computer interface with high transfer rates," IEEE Transactions on Biomedical Engineering, 2002.

[2] S. Haykin, Neural Networks: A Comprehensive Foundation, 1998.

[3] P. Dario et al., "Control of multifunctional prosthetic hands by processing the electromyographic signal," Critical Reviews in Biomedical Engineering, 2002.

[4] D. J. Krusienski et al., "Emulation of computer mouse control with a noninvasive brain-computer interface," Journal of Neural Engineering, 2008.

[5] M. Aguilar et al., "Using EMG to anticipate head motion for virtual-environment applications," IEEE Transactions on Biomedical Engineering, 2005.

[6] S. Micera et al., "On the shared control of an EMG-controlled prosthetic hand: Analysis of user-prosthesis interaction," IEEE Transactions on Robotics, 2008.

[7] G. Rau et al., "From cell to movement: To what answers does EMG really contribute?," Journal of Electromyography and Kinesiology, 2004.

[8] S. W. Smith, The Scientist and Engineer's Guide to Digital Signal Processing, 1997.

[9] D. M. Taylor et al., "Direct cortical control of 3D neuroprosthetic devices," Science, 2002.

[10] J. A. Mukand et al., "Neuronal ensemble control of prosthetic devices by a human with tetraplegia," Nature, 2006.

[11] R. J. K. Jacob, "Hot topics - eye-gaze computer interfaces: What you look at is what you get," Computer, 1993.

[12] E. Bizzi et al., "Combinations of muscle synergies in the construction of a natural motor behavior," Nature Neuroscience, 2003.

[13] K. B. Englehart et al., "A robust, real-time control scheme for multifunction myoelectric control," IEEE Transactions on Biomedical Engineering, 2003.

[14] L. E. Sibert and R. J. K. Jacob, "Evaluation of eye gaze interaction," Proceedings of CHI, 2000.

[15] P. M. Fitts, "The information capacity of the human motor system in controlling the amplitude of movement," Journal of Experimental Psychology, 1954.

[16] S. Zhai et al., "Human on-line response to target expansion," Proceedings of CHI '03, 2003.

[17] P. R. Kennedy et al., "Direct control of a computer from the human central nervous system," IEEE Transactions on Rehabilitation Engineering, 2000.

[18] C. Cinel et al., "P300-based BCI mouse with genetically-optimized analogue control," IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2008.

[19] H. van der Kooij et al., "Design and evaluation of the LOPES exoskeleton robot for interactive gait rehabilitation," IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2007.

[20] A. Pino et al., "Brain computer interface cursor measures for motion-impaired and able-bodied users."

[21] P. M. Fitts, "The information capacity of the human motor system in controlling the amplitude of movement" (reprint of the 1954 article), Journal of Experimental Psychology: General, 1992.

[22] R. F. Weir et al., "The optimal controller delay for myoelectric prostheses," IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2007.

[23] P. Cavanagh et al., "Electromechanical delay in human skeletal muscle under concentric and eccentric contractions," European Journal of Applied Physiology and Occupational Physiology, 1979.

[24] I. S. MacKenzie, "Fitts' law as a research and design tool in human-computer interaction," Human-Computer Interaction, 1992.

[25] R. Merletti et al., "Modeling of surface myoelectric signals, Part II: Model-based signal interpretation," IEEE Transactions on Biomedical Engineering, 1999.

[26] K. Shimohara et al., "EEG topography recognition by neural networks," IEEE Engineering in Medicine and Biology Magazine, 1990.

[27] T. Tsuji et al., "A human-assisting manipulator teleoperated by EMG signals and arm motions," IEEE Transactions on Robotics and Automation, 2003.

[28] J. del R. Millán et al., "Noninvasive brain-actuated control of a mobile robot by human EEG," IEEE Transactions on Biomedical Engineering, 2004.

[29] J. D. Kralik et al., "Real-time prediction of hand trajectory by ensembles of cortical neurons in primates," Nature, 2000.

[30] P. Guertin, "Paraplegic mice are leading to new advances in spinal cord injury research," Spinal Cord, 2005.

[31] J. Donoghue et al., "Plasticity and primary motor cortex," Annual Review of Neuroscience, 2000.

[32] R. Johansson et al., "Eye-hand coordination in object manipulation," The Journal of Neuroscience, 2001.

[33] R. W. Mann et al., "Myoelectric signal processing: Optimal estimation applied to electromyography, Part I: Derivation of the optimal myoprocessor," IEEE Transactions on Biomedical Engineering, 1980.

[34] S.-K. Kim et al., "A supervised feature-projection-based real-time EMG pattern recognition for multifunction myoelectric hand control," IEEE/ASME Transactions on Mechatronics, 2007.

[35] N. Hogan et al., "Customized interactive robotic treatment for stroke: EMG-triggered therapy," IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2005.