Real Time Based Bare Hand Gesture Recognition

Gestures are a powerful means of communication among humans. Gesture recognition is the process of recognizing and interpreting a continuous, sequential stream of gestures from a given set of input data. The primary appeal of hand gesture recognition is that it lets users communicate with computerized equipment without external control devices; the user can interact from a distance and needs no physical contact with the computer. According to one survey, almost 40% of all gestures are made with one hand, about 20% with both hands, and the rest are distributed among other body parts. In this method, bare-hand gestures are recognized using a dynamic vision sensor (DVS) camera. Unlike conventional cameras, a DVS responds only at pixels where the temporal luminance changes, which greatly reduces the computational cost of comparing consecutive frames to track a moving object. The method classifies three hand gestures made by the user: Rock, Paper, and Scissors. The proposed approach detects the moment at which the user delivers a throw, extracts the hand region, and extracts useful features for classification.
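To make the DVS idea concrete, the following is a minimal sketch (not the paper's implementation) that emulates event-based sensing with numpy: events are emitted only at pixels whose luminance change between frames exceeds a threshold, and the moving hand region is then localized as the bounding box of those event pixels. The threshold value and the bounding-box heuristic are illustrative assumptions.

```python
import numpy as np

def dvs_events(prev_frame, curr_frame, threshold=15):
    """Emulate a DVS: emit events only where luminance changed enough.

    Returns boolean masks of ON (brightness increase) and OFF
    (brightness decrease) events; unchanged pixels produce nothing.
    """
    diff = curr_frame.astype(np.int16) - prev_frame.astype(np.int16)
    on = diff > threshold
    off = diff < -threshold
    return on, off

def hand_region(on, off):
    """Bounding box (x_min, y_min, x_max, y_max) of all event pixels,
    used here as a crude moving-object (hand) localizer."""
    ys, xs = np.nonzero(on | off)
    if ys.size == 0:
        return None  # no motion, no events
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy example: a bright patch appears between two 8x8 frames.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = np.zeros((8, 8), dtype=np.uint8)
curr[2:5, 3:6] = 200  # "hand" occupies rows 2-4, cols 3-5
print(hand_region(*dvs_events(prev, curr)))  # (3, 2, 5, 4)
```

Because static background pixels generate no events at all, the per-frame work scales with the amount of motion rather than the image size, which is the computational saving the abstract attributes to DVS cameras.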
