Kinect as a Tool for Gait Analysis: Validation of a Real-Time Joint Extraction Algorithm Working in Side View

The Microsoft Kinect sensor has attracted attention as a tool for gait analysis for several years. Despite the many advantages the sensor offers, the lack of a native capability to extract joints from the side view of a human body still limits the adoption of the device in a number of relevant applications. This paper presents an algorithm to locate and estimate the trajectories of up to six joints extracted from the side depth view of a human body captured by the Kinect device. The algorithm is then applied to extract data that can be exploited to provide an objective score for the “Get Up and Go Test”, which is typically adopted for gait analysis in the rehabilitation field. Starting from the depth-data stream provided by the Microsoft Kinect sensor, the proposed algorithm relies on anthropometric models alone to locate and identify the positions of the joints. Unlike machine-learning approaches, this solution avoids complex computations, which usually require significant resources. The reliability of the joint positions output by the algorithm is evaluated by comparison against a marker-based system. Tests show that the trajectories extracted by the proposed algorithm adhere to the reference curves better than those obtained from the skeleton generated by the native applications provided within the Microsoft Kinect (Microsoft Corporation, Redmond, WA, USA, 2013) and OpenNI (OpenNI organization, Tel Aviv, Israel, 2013) Software Development Kits.
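The anthropometric-model idea can be illustrated with a minimal sketch: given a binary side-view silhouette obtained by thresholding a depth frame, each joint's vertical position is placed at a fixed fraction of the subject's pixel height (Drillis and Contini style body-segment proportions), and its horizontal position at the silhouette centroid of that row. The fraction values, joint set, and function names below are illustrative assumptions for exposition, not the paper's actual model.

```python
import numpy as np

# Illustrative height fractions, measured from the ground up, in the
# spirit of Drillis & Contini body-segment proportions. These are an
# assumption for this sketch, not the parameters used in the paper.
JOINT_FRACTIONS = {
    "ankle": 0.039,
    "knee": 0.285,
    "hip": 0.530,
    "shoulder": 0.818,
    "head": 0.936,
}

def estimate_side_joints(silhouette):
    """Locate joints on a side-view body silhouette (2-D boolean mask).

    Each joint's row is a fixed fraction of the subject's pixel height;
    its column is the centroid of silhouette pixels on that row.
    Returns a dict mapping joint name to (row, column).
    """
    rows = np.flatnonzero(silhouette.any(axis=1))
    if rows.size == 0:
        return {}                              # empty frame, no subject
    top, bottom = rows[0], rows[-1]            # head-top and ground rows
    height = bottom - top                      # body height in pixels
    joints = {}
    for name, frac in JOINT_FRACTIONS.items():
        r = int(round(bottom - frac * height)) # fraction from the ground up
        cols = np.flatnonzero(silhouette[r])
        if cols.size:
            joints[name] = (r, float(cols.mean()))
    return joints
```

In a real pipeline the silhouette would come from background subtraction on the Kinect depth stream, and the per-row centroid would be refined (e.g., during the swing phase of gait the knee row contains two leg cross-sections), but the sketch captures why this approach needs no training data or per-pixel classification.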
