This paper proposes a novel technique for analyzing athletic movement, specifically the vertical jump, by processing video frame by frame. The most common methods for analyzing athletic movements such as jumps are either observation by a human expert or coach, or measurement by sensors embedded in a suit or wearables attached to the athlete's body. The former requires access to a human expert; the latter requires specialized hardware capable of extracting body-movement statistics with respect to time and space. Both methods are accurate, but they depend on a third-party system or person, carry a cost, and are often inaccessible when an athlete is simply practicing at home, in a backyard, or at a personal gym. Our goal was to remove these dependencies and design heuristics and algorithms that allow an individual athlete to assess feats such as the jump, run, and leap without any third-party system, approximating the measurements and comparing them with existing ones using only the cellphone in their pocket. This paper focuses on the jump. The system processes the video frame by frame, applies the Histogram of Oriented Gradients (HOG) technique to detect the human in each frame, and tracks the human from the first frame to the last, which makes it possible to calculate the pixel distance covered during the jump. Values such as the athlete's height, the frames per second (FPS) of the video, and on-screen markers placed while recording are used to convert this pixel distance into a physical distance. To validate the algorithm, a number of experiments were performed; their results were compared with the actual vertical jump heights, and a statistical relation was derived between the proposed methodology and the traditional techniques.
The proposed technique can also be used to calculate other statistics for athletes.
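The pixel-to-physical conversion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the athlete's known standing height provides the scale reference (metres per pixel from the detected bounding-box height), and that take-off and landing frame indices together with the video's FPS give the flight time. All function and parameter names are hypothetical.

```python
def jump_height_from_pixels(bbox_height_px: float,
                            athlete_height_m: float,
                            displacement_px: float) -> float:
    """Convert a vertical pixel displacement into metres.

    The athlete's known standing height (athlete_height_m) and the
    height of the detected bounding box (bbox_height_px) give the
    metres-per-pixel scale for the scene (assumed calibration).
    """
    metres_per_pixel = athlete_height_m / bbox_height_px
    return displacement_px * metres_per_pixel


def flight_time_s(takeoff_frame: int, landing_frame: int, fps: float) -> float:
    """Flight time in seconds from the frame indices and the video's FPS."""
    return (landing_frame - takeoff_frame) / fps
```

For example, if the detected athlete is 400 px tall, stands 1.80 m, and the tracked bounding box rises 100 px at the jump's apex, the estimated jump height is 0.45 m; a take-off at frame 30 and landing at frame 45 in a 30 FPS video gives a flight time of 0.5 s.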