The Research and Application of an Algorithm about Human Joint Points Tracking based on a Video

This paper presents an algorithm for tracking human joint points in video. The method requires manually locating the joint positions in the first frame; from the second frame onward, all joints are tracked automatically. First, the human lower limb is segmented and the feature points are marked using two-frame differencing and thresholding. Then the feature points are tracked with a block-matching method. Finally, the location of each joint and its corresponding angle are obtained. To ensure tracking accuracy, the search scope is constrained using the segmented lower limb and a length model of each limb segment. The method is fast and accurate: compared with manual positioning, the tracking accuracy reaches 96%, and the actual error is less than 0.4 cm.

Keywords: feature tracking; joint position location; motion capture; image segmentation; image analysis
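The two core steps the abstract names — a motion mask from two-frame differencing with a threshold, and feature tracking by block matching over a bounded search window — can be sketched as follows. This is a minimal illustration under assumed details (grayscale frames as NumPy arrays, sum-of-absolute-differences as the matching cost, and the function names are hypothetical), not the authors' implementation:

```python
import numpy as np

def frame_difference_mask(prev, curr, thresh=25):
    """Binary motion mask from two consecutive grayscale frames
    (two-frame differencing followed by thresholding)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return (diff > thresh).astype(np.uint8)

def block_match(prev, curr, point, block=9, search=5):
    """Track one feature point from `prev` to `curr` by minimising the
    sum of absolute differences (SAD) over a limited search window.
    `search` plays the role of the constrained tracking scope."""
    y, x = point
    h = block // 2
    template = prev[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_cost, best_pt = None, point
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            cand = curr[ny - h:ny + h + 1, nx - h:nx + h + 1].astype(np.int32)
            if cand.shape != template.shape:
                continue  # candidate block falls outside the frame
            cost = np.abs(cand - template).sum()
            if best_cost is None or cost < best_cost:
                best_cost, best_pt = cost, (ny, nx)
    return best_pt

# Toy usage: a bright patch shifted by (+2, +3) between frames.
prev = np.zeros((40, 40), dtype=np.uint8)
prev[10:18, 10:18] = 200
curr = np.zeros((40, 40), dtype=np.uint8)
curr[12:20, 13:21] = 200
mask = frame_difference_mask(prev, curr, thresh=50)
tracked = block_match(prev, curr, (13, 13))  # (15, 16): the (+2, +3) shift
```

In the paper's full pipeline, the search window would additionally be clipped to the segmented lower-limb region and checked against the per-segment length model before a match is accepted.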
