Detecting, Tracking, and Interpretation of a Pointing Gesture by an Overhead View Camera

In this work we describe a set of visual routines that support a novel sensor-free interface between a human and virtual objects. The visual routines detect, track, and interpret a pointing gesture in real time. The problem is addressed in the context of a scenario in which a user activates virtual objects displayed on a projection screen. By changing the direction of pointing with an arm extended towards the screen, the user controls the motion of the virtual objects. The vision system consists of a single overhead-view camera and exploits a priori knowledge of human body appearance, the interactive context, and the environment. The system operates in real time on a standard Pentium PC platform.
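The abstract does not detail the routines themselves, so the following minimal Python sketch only illustrates one plausible way to estimate a 2D pointing direction from an overhead view: it assumes background subtraction has already produced a binary silhouette and treats the silhouette point farthest from the body centroid as the tip of the extended arm. Both the `pointing_direction` helper and the farthest-point heuristic are assumptions for illustration, not the paper's method.

```python
# Hypothetical sketch: 2D pointing-direction estimation from an overhead
# binary silhouette. Assumes a foreground mask is already available and
# that the hand of an extended arm is the foreground pixel farthest from
# the body centroid -- illustrative assumptions, not the paper's algorithm.

import numpy as np

def pointing_direction(mask: np.ndarray):
    """Return (centroid, unit direction) estimated from a boolean mask.

    mask -- HxW boolean array, True where the person is seen from above.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("empty silhouette: no foreground pixels")

    points = np.stack([xs, ys], axis=1).astype(float)  # (N, 2) image coordinates
    centroid = points.mean(axis=0)                     # body centre (head/torso blob)

    # Assumed heuristic: the arm tip is the foreground point farthest
    # from the centroid in the overhead view.
    dists = np.linalg.norm(points - centroid, axis=1)
    tip = points[np.argmax(dists)]

    direction = tip - centroid
    return centroid, direction / np.linalg.norm(direction)

# Usage with a toy mask: a body blob plus an arm extending to the right.
mask = np.zeros((120, 160), dtype=bool)
mask[50:70, 40:60] = True     # torso/head blob
mask[58:62, 60:110] = True    # extended arm
centre, unit = pointing_direction(mask)
print("centroid:", centre, "pointing direction:", unit)
```

In an interactive setting, the resulting image-plane direction would still have to be mapped onto the projection screen, for example through a calibrated homography between the camera and the display surface.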
