Automated Tracking of Facial Features in Patients with Facial Neuromuscular Dysfunction

Facial neuromuscular dysfunction severely impacts adaptive and expressive behavior and emotional health. Appropriate treatment is aided by quantitative and efficient assessment of facial motion impairment. We validated a newly developed method of quantifying facial motion, automated face analysis (AFA), by comparing it with an established manual marking method, the Maximal Static Response Assay (MSRA). In the AFA, motion of facial features is tracked automatically by computer vision, without the need to place physical markers or to restrict rigid head motion. Nine patients (seven women and two men) with a mean age of 39.3 years and various facial nerve disorders (five with Bell's palsy, three with trauma, and one with tumor resection) participated. The patients were videotaped while performing voluntary facial action tasks (brow raise, eye closure, and smile). For comparison with the MSRA, physical markers were placed on facial landmarks. Image sequences were digitized into 640 × 480 × 24‐bit pixel arrays at 30 frames per second (1 pixel ≅ 0.3 mm). As defined for the MSRA, the coordinates of the center of each marker were manually recorded in the initial and final digitized frames, which correspond to repose and maximal response; for the AFA, these points were tracked automatically throughout the image sequence. Pearson correlation coefficients were used to evaluate the consistency of measurement between the manual (MSRA) and automated (AFA) tracking methods, and paired t tests were used to assess the mean difference between methods for feature tracking. Feature measures were highly consistent between methods (Pearson's r = 0.96 or higher, p < 0.001 for each action task). The mean differences between methods were small; the mean error between methods was comparable to the error within the manual method (less than 1 pixel). The AFA demonstrated strong concurrent validity with the MSRA for pixel‐wise displacement. Tracking was fully automated and provided motion vectors that may be useful in guiding surgical and rehabilitative approaches to restoring facial function in patients with facial neuromuscular disorders. (Plast. Reconstr. Surg. 107: 1124, 2001.)
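
To make the automated tracking step concrete, the following is a minimal sketch, assuming a pyramidal Lucas-Kanade feature-point tracker (the iterative image registration technique on which this style of tracking is based) written with OpenCV as a modern stand-in. The video path, landmark coordinates, and tracker parameters are illustrative assumptions, not the authors' implementation; the 0.3 mm-per-pixel scale is the approximate value reported in the study.

```python
import cv2
import numpy as np

MM_PER_PIXEL = 0.3  # approximate spatial scale reported for the digitized video

def track_landmarks(video_path, initial_points):
    """Track facial feature points from repose (first frame) to maximal response
    (last frame) and return per-point displacement vectors in millimeters."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError(f"cannot read {video_path}")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = np.asarray(initial_points, dtype=np.float32).reshape(-1, 1, 2)
    start = pts.copy()

    while True:
        ok, frame = cap.read()
        if not ok:
            break  # last decoded frame taken as maximal response
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # pyramidal Lucas-Kanade optical flow; lost-point handling omitted for brevity
        pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None,
            winSize=(21, 21), maxLevel=3,
            criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
        prev_gray = gray
    cap.release()

    # displacement from repose to maximal response, converted to millimeters
    return (pts - start).reshape(-1, 2) * MM_PER_PIXEL

# hypothetical usage: brow, eye-corner, and mouth-corner landmarks given in pixels
# vectors = track_landmarks("smile_task.avi", [(212, 148), (305, 150), (260, 330)])
```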

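The between-method comparison reported in the abstract can be sketched in the same spirit. The snippet below assumes matched per-landmark displacement measures from the MSRA and the AFA and uses SciPy's Pearson correlation and paired t test; the placeholder values in the usage comment are hypothetical, not study data.

```python
import numpy as np
from scipy import stats

def compare_methods(msra_displacements, afa_displacements):
    """Compare manual (MSRA) and automated (AFA) displacement measures."""
    msra = np.asarray(msra_displacements, dtype=float)
    afa = np.asarray(afa_displacements, dtype=float)
    r, r_p = stats.pearsonr(msra, afa)            # consistency between methods
    t, t_p = stats.ttest_rel(msra, afa)           # mean difference between methods
    mean_abs_error = np.mean(np.abs(msra - afa))  # error between methods
    return {"pearson_r": r, "pearson_p": r_p,
            "paired_t": t, "paired_t_p": t_p,
            "mean_abs_error": mean_abs_error}

# hypothetical example with placeholder displacements (mm):
# print(compare_methods([4.1, 6.3, 2.8, 5.0], [4.0, 6.5, 2.7, 5.1]))
```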