Computer vision-based gesture recognition for an augmented reality interface

Wearable computing and Augmented Reality applications call for less obtrusive and more intuitive human-computer interfaces than keyboards and mice. One way to realise such interfaces is through gestures, e.g., pointing gestures that replace the mouse. The least obtrusive way to recognise gestures is to use computer vision-based methods. This paper presents a computer vision-based gesture interface that is part of an Augmented Reality system. It can recognise a 3D pointing gesture, a click gesture, and five static gestures. A lookup-table based colour segmentation and a fast gesture recognition method are presented that together enable 25 Hz performance on a standard PC.
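The abstract only names the lookup-table based colour segmentation; as an illustration, a minimal sketch of such a scheme might look like the following. The choice of RGB colour space, the quantisation to 32 bins per channel, and the function names are assumptions for the sketch and are not specified in the text above.

```python
import numpy as np

# Sketch of lookup-table (LUT) based skin-colour segmentation.
# Assumptions (not fixed by the abstract): RGB input, 32 bins per channel.

BINS = 32                       # colour bins per channel
SHIFT = 8 - int(np.log2(BINS))  # maps an 8-bit channel value to a bin index

def build_lut(skin_samples, threshold=1):
    """Mark every colour bin containing at least `threshold` skin pixels.

    skin_samples: (N, 3) uint8 array of pixels labelled as skin.
    Returns a boolean LUT of shape (BINS, BINS, BINS).
    """
    counts = np.zeros((BINS, BINS, BINS), dtype=np.uint32)
    idx = (skin_samples >> SHIFT).astype(np.intp)
    np.add.at(counts, (idx[:, 0], idx[:, 1], idx[:, 2]), 1)
    return counts >= threshold

def segment(image, lut):
    """Classify each pixel of an (H, W, 3) uint8 image with a single LUT lookup."""
    idx = (image >> SHIFT).astype(np.intp)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]  # (H, W) boolean skin mask
```

Because classification reduces to one table lookup per pixel, this kind of segmentation is cheap enough to leave most of the frame budget to the gesture recognition stage, which is consistent with the reported 25 Hz rate on a standard PC.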
