Vision-based hand interaction and its application in pervasive games

Pervasive games have become a popular field of investigation in recent years, and natural human–computer interaction (HCI) plays a key role in them. In this paper, a vision-based approach to hand motion gesture recognition is proposed for natural HCI in pervasive games. An LED light pen indicates the user's hand position, while a web camera captures the hand motion. A rule-based approach is used to design a set of hand gestures, which fall into two categories: linear gestures and arc-shaped gestures. A deterministic finite state automaton is developed to segment the captured hand motion trajectories. The proposed interaction method has been applied to the traditional game Tetris on a PC, with the hand-held LED light pen driving the game instead of conventional keystrokes. Experimental results show that the vision-based interactions are natural and effective.
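The abstract describes segmenting tracked light-pen positions with a deterministic finite state automaton and then classifying each segment as linear or arc-shaped by rule. The paper's actual rules and thresholds are not given here, so the following is a minimal hypothetical sketch: a two-state DFA (IDLE/MOVING) driven by per-frame speed, with a simple path-length-to-chord-length ratio standing in for the linear-versus-arc rule. All threshold values are illustrative assumptions.

```python
import math

# Hypothetical sketch (not the paper's implementation): a deterministic
# finite state automaton that segments a stream of tracked light-pen
# positions into gestures, then a rule that labels each segment "linear"
# or "arc" by comparing path length with chord length.

SPEED_THRESHOLD = 2.0   # pixels/frame: below this the hand counts as at rest
MIN_SEGMENT_LEN = 3     # drop very short segments caused by jitter
ARC_RATIO = 1.15        # path/chord ratio above which we call the segment an arc

def segment_trajectory(points):
    """DFA with states IDLE and MOVING; returns completed motion segments."""
    state, segment, segments = "IDLE", [], []
    for prev, cur in zip(points, points[1:]):
        speed = math.dist(prev, cur)          # displacement per frame
        if state == "IDLE":
            if speed >= SPEED_THRESHOLD:      # transition IDLE -> MOVING
                state, segment = "MOVING", [prev, cur]
        else:                                 # state == "MOVING"
            if speed >= SPEED_THRESHOLD:
                segment.append(cur)
            else:                             # transition MOVING -> IDLE
                if len(segment) >= MIN_SEGMENT_LEN:
                    segments.append(segment)
                state, segment = "IDLE", []
    if state == "MOVING" and len(segment) >= MIN_SEGMENT_LEN:
        segments.append(segment)              # trajectory ended mid-motion
    return segments

def classify(segment):
    """Rule-based label: linear if the path hugs its chord, otherwise arc."""
    path = sum(math.dist(a, b) for a, b in zip(segment, segment[1:]))
    chord = math.dist(segment[0], segment[-1])
    if chord == 0:                            # closed curve: treat as arc
        return "arc"
    return "arc" if path / chord > ARC_RATIO else "linear"
```

A straight sweep produces a path/chord ratio near 1 and is labeled linear, while a curved sweep accumulates extra path length relative to its chord and is labeled arc; the DFA itself simply delimits gestures by rest periods between motions.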
