Fast and accurate 3D gesture recognition interface

A video-based gesture recognition system can serve as a natural and accurate 3D user input device. We describe a two-camera system that recognizes three gesture classes: two static and one dynamic. For one of these gestures (pointing), the system estimates five parameters of 3D pose: position and pointing direction. Recognition is robust, user-independent, and fast (60 Hz), and the estimated pose is very stable. We describe interface applications that demonstrate the benefits of the system: controlling a video game, piloting a virtual-reality fly-through, and interacting with a 3D scene editor.
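The abstract does not spell out how "five parameters" decompose, but a common parameterization of a pointing pose is a 3D position plus two direction angles (azimuth and elevation). As a minimal sketch, assuming the system has already triangulated two 3D points on the hand (a base point and a fingertip; the function name and point choice are hypothetical, not from the paper):

```python
import math

def pointing_pose(base, tip):
    """Reduce two triangulated 3D points (hand base and fingertip) to a
    five-parameter pointing pose: position (x, y, z) plus azimuth and
    elevation of the pointing direction, in radians."""
    dx, dy, dz = (t - b for t, b in zip(tip, base))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    if norm == 0.0:
        raise ValueError("base and tip coincide; direction is undefined")
    azimuth = math.atan2(dy, dx)       # rotation about the vertical (z) axis
    elevation = math.asin(dz / norm)   # angle above the horizontal plane
    x, y, z = base
    return (x, y, z, azimuth, elevation)

# Pointing along +x from the origin: zero azimuth, zero elevation.
print(pointing_pose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0)))
```

Only two angles are needed because a pointing ray, unlike a full rigid-body pose, has no roll component; this is consistent with the five-parameter count in the abstract.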
