Human Computer Interface for Quadriplegic People Based on Face Position/Gesture Detection

This paper proposes a human-computer interface for quadriplegic people that uses a single depth camera. The nose position is used to control the cursor, while the status of the mouth (open or closed) provides the click commands. Detection of the nose position and mouth status is based on a randomized decision tree algorithm. The experimental results show that the proposed interface is comfortable, easy to use, and robust, and that it outperforms existing assistive technology.
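
To illustrate the kind of control loop described in the abstract, the following Python sketch maps a detected nose position in a depth frame to screen coordinates and issues a click on the transition from "mouth closed" to "mouth open". This is not the authors' implementation: the frame size, cursor gain, and the callbacks `get_depth_frame`, `detect_nose`, `detect_mouth_open`, `move_cursor`, and `click` are hypothetical placeholders standing in for the depth-camera driver, the randomized-decision-tree detectors, and the OS-level cursor actions.

```python
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080   # target screen resolution (assumed)
FRAME_W, FRAME_H = 512, 424       # depth frame resolution (assumed, e.g. Kinect v2)
GAIN = 3.0                        # cursor gain: head motion is amplified (assumed)


def nose_to_cursor(nose_xy, neutral_xy):
    """Map the nose position in the depth frame to absolute screen coordinates.

    nose_xy, neutral_xy: (x, y) pixel coordinates in the depth frame.
    The neutral (resting) head pose maps to the screen centre; displacements
    from it are normalized by the frame size and scaled by GAIN.
    """
    dx = (nose_xy[0] - neutral_xy[0]) / FRAME_W
    dy = (nose_xy[1] - neutral_xy[1]) / FRAME_H
    x = SCREEN_W / 2 + GAIN * dx * SCREEN_W
    y = SCREEN_H / 2 + GAIN * dy * SCREEN_H
    return int(np.clip(x, 0, SCREEN_W - 1)), int(np.clip(y, 0, SCREEN_H - 1))


def control_loop(get_depth_frame, detect_nose, detect_mouth_open,
                 move_cursor, click, neutral_xy):
    """One possible cursor-control loop (hypothetical callbacks).

    get_depth_frame() returns a depth image or None when the stream ends;
    detect_nose() / detect_mouth_open() stand in for the per-frame
    decision-tree-based detectors; move_cursor() and click() stand in for
    the operating-system cursor actions.
    """
    mouth_was_open = False
    while True:
        frame = get_depth_frame()
        if frame is None:
            break
        nose_xy = detect_nose(frame)
        if nose_xy is not None:
            move_cursor(*nose_to_cursor(nose_xy, neutral_xy))
        mouth_open = detect_mouth_open(frame)
        if mouth_open and not mouth_was_open:
            click()               # rising edge of "mouth open" acts as a click
        mouth_was_open = mouth_open
```

An absolute mapping with a gain factor is only one design choice; a relative (joystick-like) mapping or dwell-based clicking could be substituted in the same loop without changing its structure.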
