Analysis of clustering techniques to detect hand signs

The term multimedia means different things to different communities. The computer industry uses it to refer to systems that can display audio and video clips. More generally, a multimedia system supports multiple presentation modes to convey information. Humans have five senses: sight, hearing, touch, smell and taste. In principle, a system meeting this generalized definition should be able to convey information to all of them, a step toward virtual environments that enable total recall of an experience. This study builds on our previous work with audio and video servers and explores haptic data in support of touch and motor skills. It investigates the use of clustering techniques to recognize hand signs from haptic data. One application of these results is communication devices for the hearing impaired.
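To make the idea concrete, the sketch below clusters haptic feature vectors and uses the clusters to label new hand signs. This is an illustrative assumption, not the paper's exact pipeline: the sensor count, the simulated joint-angle data, and the choice of k-means (via scikit-learn) are all hypothetical stand-ins for the haptic glove features and clustering techniques studied here.

```python
# Illustrative sketch: clustering haptic glove features to recognize hand signs.
# Assumption: each sign sample is a fixed-length vector of glove sensor readings
# (e.g., finger-joint bend angles). K-means and the simulated data below are
# hypothetical choices for illustration only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulated training data: 3 signs, 30 samples each, 18 sensor readings per sample.
n_signs, n_samples, n_sensors = 3, 30, 18
prototypes = rng.uniform(0.0, 90.0, size=(n_signs, n_sensors))  # per-sign joint-angle prototypes
X = np.vstack([p + rng.normal(0.0, 3.0, size=(n_samples, n_sensors)) for p in prototypes])
y = np.repeat(np.arange(n_signs), n_samples)  # known sign labels for the training samples

# Cluster the haptic feature vectors, then map each cluster to its majority sign label.
km = KMeans(n_clusters=n_signs, n_init=10, random_state=0).fit(X)
cluster_to_sign = {c: int(np.bincount(y[km.labels_ == c]).argmax()) for c in range(n_signs)}

# Recognize a new glove reading by assigning it to the nearest cluster centroid.
new_reading = prototypes[1] + rng.normal(0.0, 3.0, size=n_sensors)
predicted = cluster_to_sign[int(km.predict(new_reading.reshape(1, -1))[0])]
print("predicted sign id:", predicted)
```

In this setting, clustering groups repeated performances of the same sign; a new reading is classified by nearest-centroid assignment, which keeps recognition cheap once the clusters are built.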
