Recognition of gestures in Arabic sign language using neuro-fuzzy systems

Hand gestures play an important role in everyday human communication, but their most extensive use as a means of communication is found in sign languages. Sign language is the primary communication method among deaf people, and a translator is usually needed when a hearing person wants to communicate with a deaf one. The work presented in this paper aims at developing a system for the automatic translation of the gestures of the manual alphabet in Arabic sign language. To this end, we have designed a collection of ANFIS networks, each of which is trained to recognize one gesture. Our system does not rely on gloves or visual markings to accomplish the recognition task. Instead, it deals with images of bare hands, which allows the user to interact with the system in a natural way. An image of the hand gesture is processed and converted into a set of features comprising the lengths of vectors selected to span the fingertip region. The extracted features are rotation, scale, and translation invariant, which makes the system more flexible. The subtractive clustering algorithm and the least-squares estimator are used to identify the fuzzy inference system, and training is achieved with the hybrid learning algorithm. Experiments revealed that our system was able to recognize the 30 Arabic manual alphabets with an accuracy of 93.55%.
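To illustrate the kind of feature vector described above, the sketch below builds a rotation-, scale-, and translation-invariant distance profile from an ordered hand contour: distances are measured from the centroid (translation invariance), rolled to start at the farthest point (rotation invariance), and normalized by their maximum (scale invariance). This is a hypothetical reconstruction for illustration only; the function name, parameters, and the use of a full contour profile are assumptions, and the paper's actual fingertip-region vectors may be selected differently.

```python
import numpy as np

def invariant_profile(contour, n_features=32):
    """Sketch of an RST-invariant feature vector from an ordered contour.
    Illustrative only -- not the paper's exact feature extraction."""
    pts = np.asarray(contour, float)
    # distances from the centroid -> translation invariance
    d = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
    # start the profile at the farthest point -> rotation invariance
    d = np.roll(d, -int(np.argmax(d)))
    # normalize by the maximum distance -> scale invariance
    d = d / d.max()
    # resample to a fixed-length feature vector
    idx = np.linspace(0.0, len(d) - 1.0, n_features)
    return np.interp(idx, np.arange(len(d)), d)
```

Under these assumptions, rotating, scaling, or shifting the hand (or starting the contour trace at a different boundary point) leaves the feature vector unchanged, which is what makes a single trained classifier reusable across hand poses and camera distances.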
