Facial expression based computer cursor control system for assisting physically disabled person

In recent years, several researchers have developed various assistive devices for physically disabled people. In this work, the movement of luminous markers driven by facial expressions is used to control cursor movement in computer applications. A set of five facial expressions, namely left cheek movement, right cheek movement, eyebrow raise, eyebrow lowering, and mouth opening, is used to move the cursor left, right, up, and down, and to click, respectively. Four very small luminous stickers are fixed on the subject's face, and the subject is instructed to perform the facial expressions listed above. A conventional web camera captures the facial expressions, and the data are sent to a BASIC Stamp microcontroller over a serial-port interface. Marker movements are detected from changes in their x-y coordinates in the video image, and each facial expression is uniquely represented by a binary number. Based on the x-y coordinate changes, the BASIC Stamp microcontroller sends the corresponding binary code to the computer to control the mouse actions.
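The core mapping described above, from marker x-y coordinate changes to a per-expression binary code, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the marker names, pixel thresholds, and code assignments are all assumptions made for the example.

```python
# Hypothetical sketch: classify marker displacement from a neutral baseline
# into one of the five expression classes, each encoded as a binary number.
# Thresholds and code values are illustrative assumptions.

DX_THRESH = 8   # pixels of horizontal displacement counted as a cheek movement
DY_THRESH = 6   # pixels of vertical displacement for eyebrow / mouth markers

# assumed one-bit-per-expression encoding
CODES = {
    "left":  0b00001,  # left cheek moves  -> cursor left
    "right": 0b00010,  # right cheek moves -> cursor right
    "up":    0b00100,  # eyebrow raised    -> cursor up
    "down":  0b01000,  # eyebrow lowered   -> cursor down
    "click": 0b10000,  # mouth opened      -> mouse click
    "idle":  0b00000,
}

def classify(baseline, current):
    """Compare marker (x, y) positions against a neutral baseline.

    `baseline` and `current` are dicts keyed by marker name:
    'lcheek', 'rcheek', 'brow', 'mouth'. Returns a binary expression code.
    """
    dx = {k: current[k][0] - baseline[k][0] for k in baseline}
    dy = {k: current[k][1] - baseline[k][1] for k in baseline}

    if dx["lcheek"] < -DX_THRESH:   # left cheek pulled outward
        return CODES["left"]
    if dx["rcheek"] > DX_THRESH:    # right cheek pulled outward
        return CODES["right"]
    if dy["brow"] < -DY_THRESH:     # eyebrow raised (image y grows downward)
        return CODES["up"]
    if dy["brow"] > DY_THRESH:      # eyebrow lowered
        return CODES["down"]
    if dy["mouth"] > DY_THRESH:     # jaw marker drops when mouth opens
        return CODES["click"]
    return CODES["idle"]
```

In the described system, this classification step would run on each video frame, and the resulting code would be forwarded to the microcontroller over the serial link.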
