A robust hand tracking for gesture-based interaction of wearable computers

A hand gesture-based interface is one of the most promising modes of natural and fluid human-computer interaction, substituting for the mouse and keyboard in wearable computing systems. In wearable computing scenarios, hand localization and tracking are particularly difficult due to complex backgrounds, lighting variation, and image jitter caused by head movement. This paper proposes a robust hand tracking method for gesture-based interaction on a wearable computer with a visual helmet. The method extends the basic CONDENSATION algorithm and is able to track the hand against dynamic and complex backgrounds. Furthermore, the algorithm can recognize the current hand gesture and automatically switch between multiple well-defined gesture templates within the tracking loop. Experimental results show that the proposed algorithm works well in real time against dynamic and complex backgrounds.
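The tracker described above combines CONDENSATION-style particle filtering with a discrete gesture-template state that each particle can switch between, in the spirit of Isard and Blake's mixed-state tracker. The sketch below is a minimal, illustrative reconstruction of that idea, not the paper's actual implementation: the template set, the random-walk dynamics, the Gaussian observation model, and the switching probability are all assumed placeholders.

```python
import math
import random

# Assumed gesture template set and per-step model-switching probability;
# the paper's actual templates and transition priors are not specified here.
GESTURE_TEMPLATES = ["open_palm", "point", "fist"]
SWITCH_PROB = 0.05

def predict(particle):
    """Propagate one particle: diffuse the hand position with a simple
    random walk, and occasionally switch the discrete gesture model."""
    x, y, g = particle
    x += random.gauss(0.0, 2.0)
    y += random.gauss(0.0, 2.0)
    if random.random() < SWITCH_PROB:
        g = random.randrange(len(GESTURE_TEMPLATES))
    return (x, y, g)

def likelihood(particle, observation):
    """Toy observation model: Gaussian falloff around the observed hand
    position, with extra weight when the particle's gesture template
    matches the observed gesture. A real system would score an image
    measurement (e.g. skin color and contour match) instead."""
    x, y, g = particle
    ox, oy, og = observation
    d2 = (x - ox) ** 2 + (y - oy) ** 2
    w = math.exp(-d2 / (2.0 * 25.0))
    return w * (2.0 if g == og else 1.0)

def condensation_step(particles, weights, observation):
    """One CONDENSATION iteration: resample by weight, predict, reweight."""
    n = len(particles)
    resampled = random.choices(particles, weights=weights, k=n)
    propagated = [predict(p) for p in resampled]
    new_weights = [likelihood(p, observation) for p in propagated]
    total = sum(new_weights) or 1.0
    return propagated, [w / total for w in new_weights]

def estimate(particles, weights):
    """Weighted-mean hand position and the most probable gesture template."""
    ex = sum(w * p[0] for p, w in zip(particles, weights))
    ey = sum(w * p[1] for p, w in zip(particles, weights))
    votes = [0.0] * len(GESTURE_TEMPLATES)
    for p, w in zip(particles, weights):
        votes[p[2]] += w
    return ex, ey, GESTURE_TEMPLATES[votes.index(max(votes))]
```

Because the gesture label rides along inside each particle, recognition and switching fall out of the same resample-predict-reweight loop that tracks position; no separate classifier pass is needed per frame.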
