Towards a Hand-Based Gestural Language for Smart-Home Control Using Hand Shapes and Dynamic Hand Movements
Applications of gestural interaction vary widely, from simple pointing mechanisms to visual sign languages. However, the use of gestural interaction as a sign-language-like system, with a proper formalism for interacting with technology, has received little study. In this work, we present our Kinect-based gesture recognition system, which recognizes personalized hand shapes and dynamic hand movements. We then describe how we created a “language” for smart-home control based on a specific syntax over sequences of hand shapes and hand movements.
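To make the idea of a “syntax over sequences of hand shapes and hand movements” concrete, here is a minimal sketch in Python. The two-token grammar (a static hand shape selecting a device, followed by a dynamic movement selecting an action), the token names, and the device/action vocabularies are illustrative assumptions for this sketch, not the grammar defined in the paper.

```python
from dataclasses import dataclass
from typing import List, Optional

# Assumed vocabularies (illustrative only): static hand shapes select a
# device, dynamic hand movements select an action.
DEVICE_SHAPES = {"fist": "lamp", "open_palm": "tv", "two_fingers": "blinds"}
ACTION_MOVEMENTS = {"swipe_up": "on", "swipe_down": "off", "circle": "toggle"}


@dataclass
class Command:
    device: str
    action: str


def parse_command(tokens: List[str]) -> Optional[Command]:
    """Parse a sequence of recognized gesture tokens into a smart-home command.

    Assumed syntax for this sketch: <device hand shape> <action movement>.
    Returns None when the sequence does not match the grammar.
    """
    if len(tokens) != 2:
        return None
    shape, movement = tokens
    if shape in DEVICE_SHAPES and movement in ACTION_MOVEMENTS:
        return Command(device=DEVICE_SHAPES[shape],
                       action=ACTION_MOVEMENTS[movement])
    return None


if __name__ == "__main__":
    # e.g. a fist (select the lamp) followed by an upward swipe (turn it on)
    print(parse_command(["fist", "swipe_up"]))  # Command(device='lamp', action='on')
    print(parse_command(["circle"]))            # None: sequence does not match the grammar
```

The point of the sketch is only that a gesture command is a sequence of discrete, recognized tokens that must conform to a small grammar before it is mapped to a device action; the paper's actual syntax may differ.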