Controlling a computer via facial aspect

Control of a computer workstation via face position and facial gesturing would be an important advance for people with hand or body disabilities, as well as a useful option for all users. Steps toward the realization of such a system are reported here. A computer system has been developed to track the eyes and the nose of a subject and to compute the direction of the face. Face direction and movement are then used to control the cursor. Test results show that the resulting system is usable, although several improvements are needed.
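The pipeline described in the abstract (track the eyes and nose, estimate face direction, map it to cursor motion) can be illustrated with a minimal sketch. This is not the paper's implementation: the coordinate conventions, neutral-pose calibration, gain, and dead-zone values below are assumptions chosen only to make the idea concrete.

```python
# Illustrative sketch: map tracked eye/nose positions to a cursor displacement.
# Feature coordinates, gain, and dead zone are hypothetical values.

from typing import Tuple

Point = Tuple[float, float]  # (x, y) in image pixels


def face_direction(left_eye: Point, right_eye: Point, nose: Point) -> Tuple[float, float]:
    """Coarse (yaw, pitch) proxy from the nose offset relative to the
    midpoint of the eyes, normalized by the eye spacing."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    mid_y = (left_eye[1] + right_eye[1]) / 2.0
    eye_dist = max(abs(right_eye[0] - left_eye[0]), 1e-6)
    yaw = (nose[0] - mid_x) / eye_dist    # left/right head turn
    pitch = (nose[1] - mid_y) / eye_dist  # up/down head tilt
    return yaw, pitch


def cursor_delta(d_yaw: float, d_pitch: float,
                 gain: float = 40.0, dead_zone: float = 0.05) -> Tuple[int, int]:
    """Convert a change in the direction proxy (relative to a calibrated
    neutral pose) into a cursor displacement in pixels, ignoring small
    movements inside a dead zone to suppress jitter."""
    dx = 0 if abs(d_yaw) < dead_zone else int(gain * d_yaw)
    dy = 0 if abs(d_pitch) < dead_zone else int(gain * d_pitch)
    return dx, dy


if __name__ == "__main__":
    # Calibrate on a neutral pose, then process a frame with the head turned right.
    neutral = face_direction((100, 120), (180, 120), (140, 170))
    moved = face_direction((100, 120), (180, 120), (152, 168))
    d_yaw, d_pitch = moved[0] - neutral[0], moved[1] - neutral[1]
    print(cursor_delta(d_yaw, d_pitch))  # -> (6, 0)
```

Normalizing by the eye spacing makes the direction proxy roughly independent of the subject's distance from the camera, and the dead zone is one simple way to keep small involuntary head movements from moving the cursor.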
