An online system for tracking the performance of Parkinson's patients

An objective performance measure for movement tasks is widely regarded as having utmost relevance for the therapy of movement disorders. Existing systems typically rely on ratings by human experts, which are known to suffer from substantial inter- and intra-rater variability. Present solutions are based either on simple features or on invasive motion capture techniques. They typically work only on a specific motion task and fail to generalize to other tasks. In addition, they often require manual offline pre- and post-processing. In this paper we present a novel approach to compute a continuous and objective performance measure online during a patient session, without tedious and time-consuming pre- or post-processing steps. Our approach generalizes across different motion capture devices and different motion tasks. It runs on live motion data extracted with a non-invasive, marker-less, off-the-shelf vision-based tracking system as well as on data from an inertial measurement unit (IMU) suit. In our experiments we show that our approach is competitive with an offline approach as well as with the Unified Parkinson's Disease Rating Scale. Our approach is robust with respect to motion execution speed and outperforms the offline approach in terms of generalization across movement tasks. We show promising results for tracking the current state of a Parkinson's patient online during a therapy session.
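To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch (not the authors' implementation) of an online scoring loop: pose frames stream in from either a vision-based tracker or an IMU suit, each sliding window is reduced to a device-agnostic feature vector, and a scoring function emits a continuous performance value per window. All names (extract_features, score_window, window_size) and the specific features are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of an online performance-scoring loop on streaming pose data.
# The feature set and scoring function are placeholders; in practice the score
# would come from a model trained against expert ratings.

from collections import deque
from typing import Dict, Iterable, Iterator, List

import numpy as np


def extract_features(window: List[Dict[str, np.ndarray]]) -> np.ndarray:
    """Reduce a window of 3D joint positions to simple device-agnostic features:
    mean joint speed and its variability (placeholder for richer descriptors)."""
    joints = sorted(window[0].keys())
    positions = np.stack([[frame[j] for j in joints] for frame in window])  # (T, J, 3)
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=2)             # (T-1, J)
    return np.array([speeds.mean(), speeds.std()])


def score_window(features: np.ndarray) -> float:
    """Stand-in for a learned regressor mapping features to a score in (0, 1];
    here, smoother motion (lower speed variability) yields a higher score."""
    return float(1.0 / (1.0 + features[1]))


def online_scores(frames: Iterable[Dict[str, np.ndarray]],
                  window_size: int = 30) -> Iterator[float]:
    """Yield one performance score per incoming frame once the window is full."""
    window: deque = deque(maxlen=window_size)
    for frame in frames:
        window.append(frame)
        if len(window) == window_size:
            yield score_window(extract_features(list(window)))


if __name__ == "__main__":
    # Dummy stream of joint positions standing in for live tracker or IMU output.
    rng = np.random.default_rng(0)
    stream = ({"hip": rng.normal(size=3), "knee": rng.normal(size=3)}
              for _ in range(100))
    for score in online_scores(stream, window_size=30):
        print(f"current performance score: {score:.3f}")
```

Because the loop only ever touches the most recent window, it can run continuously during a therapy session without any offline pre- or post-processing, which is the property the abstract emphasizes.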
