Automatically Analyzing Facial-Feature Movements to Identify Human Errors

Every day, countless human errors occur around the globe. Although many of these errors are harmless, disasters such as Bhopal, Chernobyl, and Three Mile Island demonstrate that developing ways to improve human performance is not only desirable but crucial. Considerable research exists in human-error identification (HEI), a field devoted to developing systems that predict human errors. However, these systems typically predict only instantaneous errors, not overall human performance. Furthermore, they often rely on predefined error hierarchies and manual minute-by-minute analyses of users by trained analysts, making them costly and time-consuming to implement. Our work instead applies a bottom-up approach: it predicts human performance from facial-feature points automatically extracted from short video segments of participants' faces recorded during laboratory experiments.
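
The abstract describes the bottom-up approach only at a high level. The sketch below illustrates one plausible, simplified version of such a pipeline: per-frame facial-feature coordinates from each short video segment are collapsed into window-level statistics and fed to a standard classifier that predicts a binary performance label. The data shapes, summary statistics, and choice of boosted classifier are illustrative assumptions, not a reproduction of the paper's actual feature set or models.

```python
# Hypothetical sketch of a bottom-up performance-prediction pipeline:
# tracked facial-feature points -> per-segment statistics -> classifier.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

def summarize_segment(landmarks):
    """Collapse a (frames x points x 2) array of tracked facial-feature
    coordinates into a fixed-length vector of simple statistics."""
    flat = landmarks.reshape(landmarks.shape[0], -1)      # frames x (points*2)
    return np.concatenate([
        flat.mean(axis=0),                                # average position
        flat.std(axis=0),                                 # positional variability
        np.abs(np.diff(flat, axis=0)).mean(axis=0),       # mean frame-to-frame motion
    ])

# Hypothetical data: 200 short segments, 90 frames each, 22 tracked points.
rng = np.random.default_rng(0)
segments = rng.normal(size=(200, 90, 22, 2))
labels = rng.integers(0, 2, size=200)      # 1 = error observed in the segment

X = np.stack([summarize_segment(s) for s in segments])
clf = GradientBoostingClassifier()         # boosting is one common choice here
scores = cross_val_score(clf, X, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

Summarizing each segment into a fixed-length vector is what makes the approach bottom-up: no predefined error taxonomy or manual coding is required, and any off-the-shelf classifier can be trained directly on the extracted facial-feature statistics.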
