Hand motion tracking based on a constraint of three-dimensional continuity

We propose setting a 3-D search volume to track 3-D palm motion efficiently using two cameras. If template matching is performed on the right and left images independently, the two matched points do not always correspond to each other, so the correct 3-D position cannot always be recovered. Instead of finding a corresponding point in each image separately, we define the search volume in 3-D space rather than in the 2-D image planes, so that only geometrically valid 2-D pairs are considered during the search. The tracking process is as follows. First, we set the search volume. Each candidate 3-D point in the volume is projected onto both image planes, and template matching is performed at the projected pixel in each image. The similarity of a 3-D position is computed from the two dissimilarities obtained in the two images, and the position with the maximum similarity in the search volume is taken as the correct correspondence. We incorporate this technique into our tracking system and compare it with a method that tracks palm motion without the epipolar constraint. Experimental results show that the proposed 3-D search volume makes tracking of the 3-D motion accurate and efficient.
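The search procedure described above maps naturally to a brute-force loop over 3-D candidates. The following Python sketch is not the authors' implementation; the camera projection matrices, SSD dissimilarity, cube-shaped volume, and all function names are assumptions made for illustration of the idea: project each candidate 3-D point into both images, match the template at the two projected pixels, and keep the candidate with the lowest combined dissimilarity.

```python
import itertools
import numpy as np

def project(P, X):
    """Project a 3-D point X = (x, y, z) with a 3x4 projection matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

def ssd(image, center, template):
    """Sum of squared differences between the template and the image patch
    centered at `center` (u, v); returns +inf if the patch leaves the image."""
    h, w = template.shape
    u, v = int(round(center[0])), int(round(center[1]))
    top, left = v - h // 2, u - w // 2
    if top < 0 or left < 0 or top + h > image.shape[0] or left + w > image.shape[1]:
        return np.inf
    patch = image[top:top + h, left:left + w].astype(np.float64)
    return np.sum((patch - template.astype(np.float64)) ** 2)

def track_palm_3d(prev_X, left_img, right_img, template_l, template_r,
                  P_left, P_right, half_range=30.0, step=5.0):
    """Search a cube-shaped 3-D volume around the previous palm position and
    return the candidate whose combined left/right dissimilarity is smallest,
    i.e. whose similarity is largest."""
    offsets = np.arange(-half_range, half_range + step, step)
    best_X, best_cost = prev_X, np.inf
    for dx, dy, dz in itertools.product(offsets, repeat=3):
        X = prev_X + np.array([dx, dy, dz])
        # Projecting the same 3-D candidate into both images guarantees that
        # the two 2-D points form an epipolar-consistent pair.
        cost = (ssd(left_img, project(P_left, X), template_l) +
                ssd(right_img, project(P_right, X), template_r))
        if cost < best_cost:
            best_cost, best_X = cost, X
    return best_X
```

Because every evaluated 2-D pair comes from one shared 3-D point, matches that would be individually plausible in a single image but geometrically inconsistent across the stereo pair are never considered, which is the contrast drawn with independent per-image template matching.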
