Anthropometric and human gait identification using skeleton data from Kinect sensor

This work investigates the use of 3D skeleton data retrieved from the Microsoft Kinect sensor to obtain gait kinematic parameters from human walking for biometric identification. Several subjects were recorded walking in front of the Kinect sensor, yielding a data set of 3D skeleton joint positions for each person. These points were used to extract anthropometric information and to compute the angles described by the lower-limb joints (hips, knees, and ankles) during walking, from which gait kinematic parameters were derived. Statistical descriptors computed over the resulting angle curves were used as attributes to train machine learning models (K-Nearest Neighbors and Multilayer Perceptron), and the models were tested for their efficacy in identifying individuals from these attributes. The results were validated using 10-fold cross-validation and showed an overall high accuracy.
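The pipeline described above (joint angles from 3D skeleton points, statistical descriptors as attributes, classification with 10-fold cross-validation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exact descriptor set, the number of neighbors `k`, and the placeholder data shapes are assumptions introduced here for clarity.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by 3D points a-b-c, e.g. hip-knee-ankle."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def gait_features(hip, knee, ankle):
    """Statistical descriptors of the knee-angle curve over one walking capture.
    hip/knee/ankle: arrays of shape (n_frames, 3) with Kinect joint positions.
    The descriptor set below is an assumption; the paper does not list the exact statistics."""
    angles = np.array([joint_angle(h, k, a) for h, k, a in zip(hip, knee, ankle)])
    return np.array([angles.mean(), angles.std(), angles.min(),
                     angles.max(), np.median(angles)])

# Placeholder feature matrix X (one descriptor vector per capture) and subject labels y.
# 10-fold cross-validation follows the paper's protocol; k=3 is an assumed parameter.
X = np.random.rand(60, 5)           # stand-in for real descriptor vectors
y = np.repeat(np.arange(6), 10)     # stand-in for subject identities
knn = KNeighborsClassifier(n_neighbors=3)
scores = cross_val_score(knn, X, y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.2f}")
```

An analogous step with scikit-learn's `MLPClassifier` in place of `KNeighborsClassifier` would correspond to the Multilayer Perceptron model mentioned in the abstract.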