Face as mouse through visual face tracking

This paper introduces a novel camera mouse driven by a 3D model-based visual face tracking technique. As cameras become standard equipment on personal computers (PCs) and computing speed continues to increase, human–machine interaction through visual face tracking has become a feasible route to hands-free control. Human facial movement can be decomposed into rigid movement, i.e., rotation and translation, and non-rigid movement, such as the opening and closing of the mouth and eyes and changes in facial expression. We introduce a visual face tracking system that robustly and accurately retrieves these motion parameters from video in real time. After calibration, the retrieved head orientation and translation are used to navigate the mouse cursor, and detected mouth movement triggers mouse events. Three mouse control modes are investigated and compared. Experiments in a Windows XP environment verify the convenience of navigation and operation with our face mouse. This technique can serve as an alternative input device for people with hand and speech disabilities, and for future vision-based games and interfaces.
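The core idea of cursor navigation described above, mapping calibrated head orientation to a screen position, can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the angle ranges, screen size, and function names are assumptions, and the linear mapping stands in for whatever calibration the system actually uses.

```python
# Illustrative sketch (assumed, not the paper's code): after a calibration
# step fixes the user's comfortable head-motion range, tracked head yaw and
# pitch (in degrees) are mapped linearly to a clamped on-screen cursor position.

SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen resolution

def calibrate(yaw_range=(-15.0, 15.0), pitch_range=(-10.0, 10.0)):
    """Return a function mapping (yaw, pitch) to pixel coordinates.

    The ranges are hypothetical values a calibration step might record.
    """
    y_min, y_max = yaw_range
    p_min, p_max = pitch_range

    def pose_to_cursor(yaw, pitch):
        # Normalize each angle into [0, 1] over its calibrated range.
        nx = (yaw - y_min) / (y_max - y_min)
        ny = (pitch - p_min) / (p_max - p_min)
        # Clamp so the cursor never leaves the screen.
        nx = min(max(nx, 0.0), 1.0)
        ny = min(max(ny, 0.0), 1.0)
        return int(nx * (SCREEN_W - 1)), int(ny * (SCREEN_H - 1))

    return pose_to_cursor
```

A mouth-open detection would then be polled each frame alongside this mapping to trigger click events, per the control modes the paper compares.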
