Classification of Desired Motion Force Based On Cerebral Hemoglobin Information

Strength training at the force level a patient intends to exert helps improve training effectiveness and promotes rehabilitation. Force levels are commonly recognized from EMG or biomechanical information, but such methods are unsuitable for patients who have lost important muscle groups or whose muscle function is weakened. This paper proposes a method for identifying the desired force level from cerebral hemoglobin information rather than from limb-dependent signals. Ten subjects performed a pedaling movement at three force levels. Features were extracted in both the time domain and the frequency domain, using deoxygenated hemoglobin (deoxy) and the difference between oxygenated hemoglobin (oxy) and deoxy as parameters. The relevant frequency bands (0.01-0.03 Hz, 0.03-0.06 Hz, 0.06-0.09 Hz, 0.09-0.12 Hz) were confirmed by power spectral density analysis, and significant measurement channels were selected by one-way analysis of variance over three time periods around movement onset. Force level was then classified with an extreme learning machine (ELM), giving a recognition rate of up to 78.7%. The proposed method does not depend on the presence of limbs or on the strength of limb signals: it relies on brain activity recorded in a real movement environment, which helps infer the force level a subject intends and can provide a control command for rehabilitation training equipment.
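To make the described pipeline concrete, the sketch below illustrates one possible Python implementation of its main steps: band-power features computed from deoxy and (oxy minus deoxy) signals via power spectral density, ANOVA-based channel selection across force levels, and a minimal single-hidden-layer ELM trained in closed form. The sampling rate, array shapes, hidden-layer size, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of the pipeline described in the abstract.
# Assumptions: fNIRS data sampled at FS Hz, one trial stored as
# (n_channels, n_samples) arrays for deoxy and oxy; all names are hypothetical.
import numpy as np
from scipy.signal import welch
from scipy.stats import f_oneway

FS = 10.0                                            # assumed sampling rate (Hz)
BANDS = [(0.01, 0.03), (0.03, 0.06), (0.06, 0.09), (0.09, 0.12)]  # bands from the paper

def band_power_features(deoxy, oxy):
    """Band-power features from deoxy and (oxy - deoxy) signals of one trial."""
    signals = np.vstack([deoxy, oxy - deoxy])
    feats = []
    for sig in signals:
        f, pxx = welch(sig, fs=FS, nperseg=min(1024, sig.size))
        for lo, hi in BANDS:
            mask = (f >= lo) & (f < hi)
            feats.append(np.trapz(pxx[mask], f[mask]))   # integrated band power
    return np.asarray(feats)

def select_channels(trials, labels, alpha=0.05):
    """One-way ANOVA on each channel's mean amplitude across force levels.

    trials: (n_trials, n_channels, n_samples), labels: (n_trials,) force levels.
    """
    keep = []
    for ch in range(trials.shape[1]):
        groups = [trials[labels == k, ch].mean(axis=-1) for k in np.unique(labels)]
        if f_oneway(*groups).pvalue < alpha:
            keep.append(ch)
    return keep

class ELM:
    """Minimal extreme learning machine: random hidden layer, least-squares output."""
    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_classes = int(y.max()) + 1
        self.W = self.rng.standard_normal((X.shape[1], self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                 # random feature mapping
        T = np.eye(n_classes)[y]                         # one-hot targets
        self.beta = np.linalg.pinv(H) @ T                # closed-form output weights
        return self

    def predict(self, X):
        return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)
```

The closed-form pseudo-inverse solution is what makes ELM training fast compared with iteratively trained networks, which is presumably why it suits a recognition task of this size; in practice the feature matrix would be standardized and the hidden-layer size chosen by cross-validation.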
