Gesture recognition through angle space

As the notion of ubiquitous computing becomes a reality, the keyboard-and-mouse paradigm becomes less satisfactory as an input modality. The ability to interpret gestures can open another dimension in user interface technology. In this paper, we present a novel approach to dynamic hand gesture modeling using neural networks. The results show high accuracy in detecting both single and multiple gestures, making this a promising approach for gesture recognition from continuous input with undetermined boundaries. The method is independent of the input device and can serve as a general back-end processor for gesture recognition systems.
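To illustrate the idea of an angle-space representation, here is a minimal sketch (not the paper's actual pipeline, whose details are not given here) of how a 2D gesture trajectory might be converted into a sequence of segment angles. Encoding a gesture by the angles between consecutive sample points, rather than by raw coordinates, makes the representation invariant to where the gesture was drawn, which is one way such a feature could remain independent of the input device. The function name and sampling format are illustrative assumptions.

```python
import math

def trajectory_to_angles(points):
    """Convert a 2D trajectory, given as (x, y) sample points, into the
    sequence of angles (in radians) of its consecutive segments.

    This is a hypothetical angle-space encoding: the output depends only
    on the gesture's shape, not on its absolute screen position, so it
    could be fed to a classifier regardless of the capture device.
    """
    angles = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angles.append(math.atan2(y1 - y0, x1 - x0))
    return angles

# A straight rightward stroke maps to a constant angle of 0 radians.
stroke = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(trajectory_to_angles(stroke))  # [0.0, 0.0, 0.0]
```

Note that such a sequence could then be fed, for example, to a sequence classifier; the angle encoding itself carries no device-specific information.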
