Optimizing Activity Recognition in Stroke Survivors for Wearable Exoskeletons

Stroke affects mobility and, consequently, the quality of life of people affected by this cerebrovascular disease. Part of the research effort has focused on the development of exoskeletons that support the user's joints to improve gait and help regain independence in daily life. One example is XoSoft, a soft modular exoskeleton currently being developed within the European project of the same name. In addition to its assistive properties, the soft exoskeleton will provide therapeutic feedback through the analysis of kinematic data from inertial sensors mounted on the exoskeleton. Before these analyses can take place, however, the activities performed by the user must be known in order to provide sufficient behavioral context for interpreting the data. Four activity recognition chains, based on machine learning algorithms, were implemented to automatically identify the nature of the activities performed by the user. To be consistent with the target application (a wearable exoskeleton), the focus was on reducing energy consumption by minimizing the sensor configuration and on making these algorithms robust. In this study, movement sensor data were collected from eleven stroke survivors while they performed daily-life activities. From these data, we evaluated the influence of sensor reduction and sensor position on the performance of the four algorithms, as well as their resilience to sensor failures. Results show that, in all four activity recognition chains and for each patient, the number of sensors can be reduced up to a certain limit, beyond which the positions on the body must be chosen carefully to maintain the same performance. In particular, the study shows the benefit of avoiding the lower-leg and foot locations as well as sensors positioned on the affected side of the stroke patient. It also shows that robustness can be added to the activity recognition chain when the data from the different sensors are fused at the very end of the classification process, as illustrated in the sketch below.
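To make the late-fusion idea concrete, the sketch below shows one possible way to build a per-sensor activity recognition chain whose outputs are combined only at the final, decision level. It is a minimal illustration under assumptions of my own (fixed-length inertial windows, hand-crafted time-domain features, random-forest classifiers, mean-of-probabilities fusion); the names LateFusionHAR, extract_features and sensor_ids are hypothetical and do not describe the pipeline actually used in the study.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    # Simple time-domain features per channel: mean, std, min, max.
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

class LateFusionHAR:
    """One classifier per sensor; per-sensor predictions are fused only at the
    very end, so a failed or removed sensor can simply be skipped at runtime."""

    def __init__(self, sensor_ids):
        self.models = {s: RandomForestClassifier(n_estimators=100) for s in sensor_ids}

    def fit(self, windows_per_sensor, labels):
        # windows_per_sensor: {sensor_id: list of (n_samples, n_channels) arrays},
        # one window per labelled activity instance.
        for sensor, model in self.models.items():
            X = np.vstack([extract_features(w) for w in windows_per_sensor[sensor]])
            model.fit(X, labels)  # same label set for every per-sensor model
        return self

    def predict(self, windows_per_sensor):
        # Average class probabilities over whichever sensors actually delivered data.
        probas = [model.predict_proba(
                      np.vstack([extract_features(w) for w in windows_per_sensor[sensor]]))
                  for sensor, model in self.models.items()
                  if sensor in windows_per_sensor]  # skip failed / removed sensors
        fused = np.mean(probas, axis=0)
        classes = next(iter(self.models.values())).classes_
        return classes[fused.argmax(axis=1)]

Because the fusion step only averages the final class probabilities, dropping a sensor, whether to minimize the configuration and save energy or because it has failed, requires no retraining of the remaining per-sensor models.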
