Recognition of hand gesture to human-computer interaction

In this paper, a robust hand gesture recognition system is designed and implemented to explore communication between human and machine. In the proposed approach, hand gestures are used to command a computer to perform actions with a high degree of freedom. The user does not need to wear any cumbersome devices such as cyber-gloves, and no assumption is made about whether the user wears wrist ornaments or gestures with the left or right hand. Image segmentation based on skin color is combined with shape analysis based on invariant moments, and the extracted features form the input vector to a radial basis function network (RBFN). Our 'puppy' robot is employed as a testbed. The results show that the robot can act on the recognized gestures, giving the user the kind of feeling that arises when raising a pet.
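
To make the described pipeline concrete, the sketch below is a minimal illustration (not taken from the paper) of how skin-color segmentation, invariant-moment features, and an RBFN classifier could be chained; the library choice (OpenCV/NumPy), the YCrCb skin-color thresholds, and the helper names are assumptions.

```python
import cv2
import numpy as np

def extract_hand_features(image_bgr):
    """Segment skin-colored pixels and compute Hu invariant moments of the hand region."""
    # Convert to YCrCb and threshold an illustrative skin-color range (assumed bounds).
    ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    # Keep the largest connected contour as the hand silhouette.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    # Hu's seven moments are invariant to translation, scale, and rotation.
    hu = cv2.HuMoments(cv2.moments(hand)).flatten()
    # Log-scale the moments so they fall in a comparable numeric range.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def rbf_classify(features, centers, widths, weights):
    """Minimal RBF network: Gaussian hidden layer followed by a linear output layer."""
    # centers: (H, 7) hidden-unit prototypes, widths: (H,), weights: (classes, H).
    activations = np.exp(-np.sum((centers - features) ** 2, axis=1) / (2.0 * widths ** 2))
    scores = weights @ activations  # one score per gesture class
    return int(np.argmax(scores))
```

In such a sketch the RBFN parameters (centers, widths, output weights) would be learned offline from labeled gesture images, and the predicted class index would then be mapped to one of the robot's actions.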
