Real-Time 3D Model-Based Gesture Tracking for Multimedia Control

This paper presents a new 3D model-based gesture tracking system for controlling a multimedia player in an intuitive way. The motivation is to make home appliances aware of the user's intentions. The system adopts a Bayesian framework to track the user's 3D hand position and to recognize the meaning of hand postures so that the multimedia player can be controlled interactively. To avoid the high dimensionality of a full 3D upper-body model, which would complicate the gesture tracking problem, the system applies a novel hierarchical tracking algorithm to improve performance. Moreover, multiple cues are combined to improve the accuracy of the tracking results. Based on these ideas, we have implemented a 3D hand gesture interface for controlling multimedia players. Experimental results show that the proposed system robustly tracks the 3D position of the hand and has high potential for controlling a multimedia player.
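To make the Bayesian tracking idea concrete, the sketch below shows a minimal CONDENSATION-style particle filter with a hierarchical pass (a coarse arm estimate followed by a tighter hand estimate) and a simple multi-cue weighting step. Every function name, state layout, cue, and parameter here is an illustrative assumption for exposition, not the paper's actual implementation.

```python
# Minimal sketch of a hierarchical, multi-cue particle filter (CONDENSATION-style).
# Assumed, simplified stand-ins are used for the motion model and the cue likelihoods.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, noise_std):
    """Diffuse particles with a simple random-walk motion model."""
    return particles + rng.normal(0.0, noise_std, particles.shape)

def cue_likelihood(particles, observation, cue_weights=(0.6, 0.4)):
    """Combine two stand-in cues (e.g. colour and edge scores) into one weight.

    Both cues are modelled here as Gaussian scores of the distance to the
    observed position, then blended with fixed (assumed) cue weights.
    """
    d = np.linalg.norm(particles - observation, axis=1)
    colour_cue = np.exp(-0.5 * (d / 5.0) ** 2)
    edge_cue = np.exp(-0.5 * (d / 10.0) ** 2)
    w = cue_weights[0] * colour_cue + cue_weights[1] * edge_cue
    return w / w.sum()

def resample(particles, weights):
    """Resample particles proportionally to their weights."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

def track_step(particles, observation, noise_std=2.0):
    """One predict-weight-resample cycle of the filter."""
    particles = predict(particles, noise_std)
    weights = cue_likelihood(particles, observation)
    estimate = np.average(particles, axis=0, weights=weights)
    return resample(particles, weights), estimate

if __name__ == "__main__":
    # Hierarchical idea: estimate the coarse arm position first, then run a
    # second, tighter filter around it for the hand, shrinking the search space.
    arm_particles = rng.uniform(0, 100, (300, 3))      # 3D arm state samples
    observed_arm = np.array([60.0, 40.0, 20.0])        # synthetic measurement
    arm_particles, arm_est = track_step(arm_particles, observed_arm)

    hand_particles = arm_est + rng.normal(0, 5.0, (300, 3))
    observed_hand = observed_arm + np.array([5.0, -3.0, 2.0])
    hand_particles, hand_est = track_step(hand_particles, observed_hand, noise_std=1.0)
    print("arm estimate:", arm_est.round(2), "hand estimate:", hand_est.round(2))
```

The hierarchical split mirrors the paper's stated motivation: by fixing the coarse body state before sampling the hand, each filter works in a low-dimensional space instead of the full upper-body model.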
