Hand gestures are an important modality for human-computer interaction (HCI) [1]. Compared to many existing interfaces, hand gestures have the advantage of being easy to use, natural, and intuitive. Successful applications of hand gesture recognition include computer game control [2], human-robot interaction [3], and sign language recognition [4], to name a few. Vision-based recognition systems can give computers the capability to understand and respond to hand gestures. The aim of this work is to propose a real-time vision system for visual interaction environments based on hand gesture recognition, using general-purpose hardware and low-cost sensors, such as a simple personal computer and a USB webcam, so that any user can run it at home or in the office. The basis of our approach is a fast segmentation process that extracts the moving hand from the whole image and copes with a large variety of hand shapes against different backgrounds and lighting conditions, followed by a recognition process that identifies the hand posture from the temporal sequence of segmented hands. A visual memory (a stored database) allows the system to handle variations within a gesture and to speed up recognition by storing the variables associated with each gesture. A hierarchical gesture recognition algorithm is introduced to recognize a large number of gestures. The three stages of the proposed algorithm are based on a new hand-tracking technique that detects the actual beginning of a gesture using a Kalman filtering process, hidden Markov models, and graph matching. Processing time is important when working with large databases, so special care is taken to handle the large number of gestures.
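As a rough illustration of the kind of pipeline described above (skin-based hand segmentation followed by Kalman-filter tracking of the hand centroid from a webcam), the following Python/OpenCV sketch shows one possible arrangement. The HSV thresholds, filter parameters, and overall structure are assumptions made for demonstration only; they are not the authors' implementation, and the HMM and graph-matching recognition stages are omitted.

    # Illustrative sketch only: skin-colour segmentation plus a constant-velocity
    # Kalman filter tracking the hand centroid. All thresholds and noise settings
    # are assumed values, not taken from the paper.
    import cv2
    import numpy as np

    # State: (x, y, dx, dy); measurement: (x, y)
    kf = cv2.KalmanFilter(4, 2)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2

    def segment_hand(frame_bgr):
        """Return a binary mask of skin-coloured pixels (rough HSV thresholds)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, (0, 30, 60), (25, 180, 255))
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    cap = cv2.VideoCapture(0)          # USB webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = segment_hand(frame)
        prediction = kf.predict()       # predicted hand position for this frame
        moments = cv2.moments(mask)
        if moments["m00"] > 0:          # hand found: correct with measured centroid
            cx = moments["m10"] / moments["m00"]
            cy = moments["m01"] / moments["m00"]
            kf.correct(np.array([[np.float32(cx)], [np.float32(cy)]]))
        x, y = int(prediction[0, 0]), int(prediction[1, 0])
        cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)
        cv2.imshow("hand tracking sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

In a full system of the kind described in the abstract, the tracked hand trajectory and segmented hand shapes would then be fed to the recognition stages (e.g., hidden Markov models over the temporal sequence), which are beyond the scope of this sketch.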
[1] Rangachar Kasturi, et al. Machine Vision, 1995.
[2] William T. Freeman, et al. Television Control by Hand Gestures, 1994.
[3] James Wc. American Association of Mental Deficiency Presents Panel on Training the Mentally Retarded Deaf, 1967.
[4] Vladimir Pavlovic, et al. Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review, IEEE Trans. Pattern Anal. Mach. Intell., 1997.
[5] M. C. Jones. Cued Speech, ASHA, 1992.
[6] Jochen Triesch, et al. Robotic Gesture Recognition, Gesture Workshop, 1997.
[7] Biing-Hwang Juang, et al. Fundamentals of Speech Recognition, Prentice Hall Signal Processing Series, 1993.
[8] A. P. Pentland, et al. Computer Vision for Human–Machine Interaction: Smart Rooms: Machine Understanding of Human Behavior, 1998.
[9] Alex Pentland, et al. Real-Time American Sign Language Recognition from Video Using Hidden Markov Models, 1995.
[10] William T. Freeman. Computer Vision for Television and Games, Proceedings of the International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems (in conjunction with ICCV'99), 1999.
[11] Aaron F. Bobick, et al. A State-Based Technique for the Summarization and Recognition of Gesture, Proceedings of the IEEE International Conference on Computer Vision, 1995.
[12] James M. Rehg, et al. Statistical Color Models with Application to Skin Detection, Proceedings of the 1999 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 1999.