Human hand modeling, analysis and animation in the context of HCI

The use of the human hand as a natural interface device is a motivating force for research on the visual analysis of highly articulated hand movement. Because hand motion spans a very large domain, the scope of this paper is limited to developments in 3D model-based approaches. We survey the numerous 3D hand models that have been used to analyze hand motion, discuss various approaches to articulated motion analysis, and also cover methods for realistic hand synthesis. We conclude with some thoughts on future research directions.
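
The 3D model-based approaches surveyed here all rest on an articulated kinematic hand model whose joint angles are estimated from images. The sketch below illustrates such a model for a single finger as forward kinematics over a three-segment chain; the segment lengths, joint conventions, and function names are illustrative assumptions, not taken from any of the surveyed systems.

```python
# Minimal sketch (illustrative, not from the paper): one finger modeled as a
# kinematic chain, the basic building block of 3D model-based hand analysis.
import numpy as np

def rot_x(a):
    """Homogeneous rotation about the local x axis (flexion)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0, 0],
                     [0, c, -s, 0],
                     [0, s,  c, 0],
                     [0, 0,  0, 1]])

def rot_z(a):
    """Homogeneous rotation about the local z axis (abduction)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def trans_y(d):
    """Translation along the local y axis (the bone direction)."""
    T = np.eye(4)
    T[1, 3] = d
    return T

def finger_joint_positions(theta, lengths=(0.045, 0.025, 0.018)):
    """Forward kinematics of one finger.

    theta   -- (mcp_abduction, mcp_flexion, pip_flexion, dip_flexion) in radians
    lengths -- proximal, middle, distal phalanx lengths in metres (assumed values)
    Returns the 3D positions of the MCP, PIP, DIP joints and the fingertip.
    """
    abd, mcp, pip, dip = theta
    T = np.eye(4)
    positions = [T[:3, 3].copy()]                      # MCP at the origin
    T = T @ rot_z(abd) @ rot_x(mcp) @ trans_y(lengths[0])   # 2-DOF MCP joint
    positions.append(T[:3, 3].copy())                  # PIP
    T = T @ rot_x(pip) @ trans_y(lengths[1])            # 1-DOF PIP joint
    positions.append(T[:3, 3].copy())                  # DIP
    T = T @ rot_x(dip) @ trans_y(lengths[2])            # 1-DOF DIP joint
    positions.append(T[:3, 3].copy())                  # fingertip
    return np.array(positions)

if __name__ == "__main__":
    # A slightly flexed index finger pose.
    pts = finger_joint_positions(np.radians([5, 30, 45, 20]))
    print(np.round(pts, 4))
```

In a typical model-based analysis loop, an estimator searches over joint angles such as these so that the projected model best explains the observed hand silhouette or image features.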
