A Functionally-Distributed Hand Tracking Method for Wearable Visual Interfaces and Its Applications

This paper describes a Functionally-Distributed (FD) hand tracking method for hand-gesture-based wearable visual interfaces. The method extends the Distributed Monte Carlo (DMC) tracking method that we have developed. It provides coarse but rapid hand tracking with the lowest possible number of samples on the wearable side, reducing the latency that degrades the usability and performance of gesture-based interfaces. On the infrastructure side, the method provides an adaptive tracking mechanism by using a sufficient number of samples together with hand-color modeling. This paper also describes three promising applications of hand-gesture-based wearable visual interfaces implemented on our wearable systems.
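The core trade-off described above (few Monte Carlo samples for fast, coarse tracking on the wearable side; many samples for accurate, adaptive tracking on the infrastructure side) can be illustrated with a minimal bootstrap particle filter on a 1-D hand coordinate. This is only an illustrative sketch of the sample-count trade-off: the function names, the 1-D state, and the noise parameters are assumptions for this example, not the authors' implementation or API.

```python
import math
import random

def particle_filter_track(observations, n_particles, noise=1.0, seed=0):
    """Minimal bootstrap (CONDENSATION-style) particle filter for a
    1-D hand coordinate. n_particles trades accuracy for speed: a
    small count mimics the wearable side, a large count mimics the
    infrastructure side. Illustrative only."""
    rng = random.Random(seed)
    # Initialize particles around the first observation.
    particles = [observations[0] + rng.gauss(0, noise) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: diffuse each particle with motion noise.
        particles = [p + rng.gauss(0, noise) for p in particles]
        # Weight: Gaussian likelihood of the current observation.
        weights = [math.exp(-((p - z) ** 2) / (2 * noise ** 2)) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Estimate: weighted mean of the particle set.
        estimates.append(sum(p * w for p, w in zip(particles, weights)))
        # Resample (multinomial) to concentrate particles on likely states.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# A smooth synthetic hand trajectory with observation noise.
truth = [0.1 * t for t in range(50)]
obs = [x + random.Random(t).gauss(0, 0.3) for t, x in enumerate(truth)]
coarse = particle_filter_track(obs, n_particles=10)   # wearable side: fast, coarse
fine = particle_filter_track(obs, n_particles=500)    # infrastructure side: accurate
```

The same filter runs in both regimes; only the sample budget changes, which is what lets the wearable side stay responsive while the infrastructure side refines the result.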
