hMouse: Head Tracking Driven Virtual Computer Mouse

A novel head-tracking-driven camera mouse system, called "hMouse", is developed for hands-free perceptual user interfaces. The system consists of a robust real-time head tracker, a head pose/motion estimator, and a virtual mouse control module. For the hMouse tracker, we propose a 2D detection/tracking complementary switching strategy with an interactive loop. Based on the reliable tracking results, hMouse calculates the user's head roll, tilt, yaw, scaling, and horizontal and vertical motion for subsequent mouse control. The cursor position is coarsely navigated by the relative position of the tracking window in image space and fine-tuned by the user's head tilt and yaw rotation. Once the cursor reaches the desired location, a head roll rotation triggers a virtual mouse button click. Experimental results demonstrate that hMouse succeeds under user jumping, extreme movement, large-angle rotation, turning around, hand/object occlusion, partial exit of the face from the camera's field of view, and multi-user occlusion. It provides an alternative solution for convenient device control, encouraging applications in interactive computer games, machine guidance, robot control, and machine access for people with disabilities and the elderly.
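The control mapping described above (tracking-window position for coarse cursor navigation, tilt/yaw for fine adjustment, roll for button clicks) might be sketched as follows. This is a minimal illustration under assumed conventions: the gains, the roll threshold, the left/right roll-to-click assignment, and all names (`HeadPose`, `cursor_position`, `click_event`) are hypothetical, not the paper's actual parameters or API.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    cx: float    # normalized tracking-window center in image space, 0..1
    cy: float
    roll: float  # rotation angles in degrees, from the pose/motion estimator
    tilt: float
    yaw: float

def cursor_position(pose: HeadPose, screen_w: int, screen_h: int,
                    fine_gain: float = 5.0):
    """Coarse cursor position from the tracking-window center,
    fine-tuned by head yaw (horizontal) and tilt (vertical)."""
    x = pose.cx * screen_w + fine_gain * pose.yaw
    y = pose.cy * screen_h + fine_gain * pose.tilt
    # Clamp the cursor to the screen bounds.
    x = min(max(x, 0), screen_w - 1)
    y = min(max(y, 0), screen_h - 1)
    return int(x), int(y)

def click_event(pose: HeadPose, roll_threshold: float = 20.0):
    """Head roll beyond a threshold triggers a virtual button click
    (assumed convention: left roll -> left click, right roll -> right click)."""
    if pose.roll <= -roll_threshold:
        return "left_click"
    if pose.roll >= roll_threshold:
        return "right_click"
    return None
```

In a real system these functions would run once per tracked frame, with temporal smoothing and a refractory period after each click to avoid repeated triggers while the head returns to neutral.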
