Motion Prediction Based on Eigen-Gestures

Abstract Motion prediction is important for realizing "proactive" gesture-based man-machine interaction systems, which can react in various ways before the end of the user's action. In this paper, two motion prediction methods are proposed. The first is a naive extrapolation based on a reference gesture. The second is based on eigen-gestures, which represent typical spatial and temporal variations from the reference gesture. Experimental results show that the second method outperforms the first because the eigen-gestures compensate for various changes in the input gestures.
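The eigen-gesture idea can be illustrated with a short sketch. The paper does not give implementation details, so everything below is an assumption: eigen-gestures are taken as the principal components (via PCA) of the deviations of time-normalized training trajectories from their mean (the reference gesture), and prediction fits the eigen-gesture coefficients to the observed prefix of a new gesture by least squares, then reconstructs the unseen remainder. All function names (`learn_eigen_gestures`, `predict_motion`) are hypothetical.

```python
import numpy as np

def learn_eigen_gestures(gestures, n_components=3):
    """Learn a reference gesture and eigen-gestures (assumed PCA model).

    gestures: array of shape (n, T, D) -- n time-normalized training
    trajectories, each with T frames of D coordinates.
    """
    n, T, D = gestures.shape
    flat = gestures.reshape(n, T * D)
    reference = flat.mean(axis=0)              # mean = reference gesture
    centered = flat - reference
    # PCA via SVD: rows of vt are the eigen-gestures
    # (principal spatio-temporal deviations from the reference).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return reference, vt[:n_components]

def predict_motion(observed, reference, eigen, T, D):
    """Fit eigen-gesture coefficients to an observed prefix and
    reconstruct the full trajectory, including the unseen tail."""
    t = observed.shape[0]                      # frames observed so far
    idx = np.arange(t * D)                     # observed entries
    A = eigen[:, idx].T                        # partial eigen-gesture basis
    b = observed.reshape(-1) - reference[idx]  # observed deviation
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    full = reference + coef @ eigen            # reconstructed trajectory
    return full.reshape(T, D)
```

Under these assumptions, a gesture that differs from the reference along the learned variation modes can be completed from its first few frames, which is what distinguishes this approach from plain extrapolation.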
