Trajectory tracing for a pitching robot based on human recognition

In this study, we discuss a human-recognition-based method for trajectory tracking. Visual perception is essential for extracting features from a time series of images, but object tracking itself is difficult to perform. We classify the errors that occur in throwing into two types: errors caused by internal factors and errors caused by external factors. First, we propose a theoretical method for calculating the trajectory of a ball from the robot's set values. Second, we propose a method for calculating the trajectory from the positions of the robot's parts, recognized in an image captured by a CCD camera; pattern recognition is used to locate these parts, namely the release position of the ball and the position of the fulcrum. In this way we can estimate, for example, the release-position error and the ball-speed error caused by internal factors. Parabola equations are used in these trajectory calculations. Third, we propose a method for extracting the trajectory from the time series of images captured by the CCD camera. The camera specifications are 480 × 360 pixels, RGB color, and 29 frames per second. The method recognizes the position of the flying ball directly from the frames of the movie, and the robot plots the trajectory of the flying ball. Using this method, we can identify errors caused by external factors; for example, we can estimate the influence of air resistance acting on the ball. Finally, we performed an experiment on trajectory tracing for a pitching robot based on human recognition.
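As a minimal sketch of the parabola-equation approach described above (not the authors' implementation; the function and parameter names are illustrative assumptions), the theoretical trajectory from set values such as release position, speed, and launch angle could be sampled at the camera's frame rate like this, ignoring air resistance:

```python
import math

G = 9.80665  # standard gravity, m/s^2

def parabolic_trajectory(release_x, release_y, speed, angle_deg,
                         dt=1.0 / 29, t_max=2.0):
    """Sample the parabolic trajectory of a ball released at
    (release_x, release_y) [m] with the given speed [m/s] and launch
    angle [deg], neglecting air resistance (internal-factor model).
    dt defaults to one frame of the 29 fps camera.
    Returns a list of (t, x, y) samples until the ball reaches y = 0."""
    angle = math.radians(angle_deg)
    vx = speed * math.cos(angle)
    vy = speed * math.sin(angle)
    points = []
    t = 0.0
    while t <= t_max:
        x = release_x + vx * t
        y = release_y + vy * t - 0.5 * G * t * t
        if y < 0:  # stop once the ball would be below the ground
            break
        points.append((t, x, y))
        t += dt
    return points
```

Comparing samples like these against ball positions measured from the captured frames would expose the deviation attributable to external factors such as air resistance.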
