Throwing Recognition Based on Magnetic Tracking and Trajectory Computation in an Augmented Reality System

In our augmented reality system, throwing is the primary interaction method, so the system must recognize the action and respond quickly. This work analyzes the characteristics of a throw, identifies the pattern of its velocity change, and applies convolution filtering to smooth the dynamic tracking data. On this basis, a fuzzy synthetic judgment algorithm recognizes the throwing action. In addition, the thrown object's trajectory depends on factors such as its shape, posture, and material; their relationships are derived from the principles of dynamics. Experiments show that the system recognizes various throwing actions and computes trajectories accurately.
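
A minimal sketch of the pipeline described above, assuming NumPy: a moving-average convolution stands in for the paper's convolution filtering of noisy tracker velocities, the release is approximated by the smoothed speed peak (the fuzzy synthetic judgment step is not reproduced here), and the trajectory is integrated under gravity only, ignoring the shape and material effects the paper models. Function names, thresholds, and parameters are illustrative, not taken from the paper.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def smooth(samples, kernel_size=5):
    """Moving-average convolution, a stand-in for the paper's
    convolution filtering of noisy magnetic-tracker data."""
    kernel = np.ones(kernel_size) / kernel_size
    return np.convolve(samples, kernel, mode="same")

def trajectory(p0, v0, dt=0.01, steps=200):
    """Ballistic trajectory under gravity only; drag and the shape,
    posture, and material effects discussed in the paper are omitted."""
    p, v = np.asarray(p0, float), np.asarray(v0, float)
    points = []
    for _ in range(steps):
        points.append(p.copy())
        v = v + np.array([0.0, 0.0, -G]) * dt   # update velocity
        p = p + v * dt                          # update position
        if p[2] < 0.0:                          # stop at ground level
            break
    return np.array(points)

# Example: smooth a synthetic noisy speed profile, take the peak as the
# (simplified) release sample, then compute the flight path.
speeds = smooth(np.random.rand(50) + np.linspace(0.0, 3.0, 50))
release = int(np.argmax(speeds))
path = trajectory(p0=[0.0, 0.0, 1.5], v0=[2.0, 0.0, 3.0])
print(f"release sample {release}, landing near {path[-1][:2]}")
```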
