Automatic Physiognomic Analysis by Classifying Facial Component Features

This paper presents a method for generating physiognomic information from facial images by analyzing the features of facial components. The physiognomic character of a face can be modeled as a combination of its facial feature components. The facial region is first detected in an input image so that the various facial components can be analyzed. The gender of the subject is then classified, and the facial components are extracted. The active appearance model (AAM) is used to extract facial feature points, and from these points 16 measures are computed to classify each facial component into predefined classes, such as large eye or small mouth. After each facial component has been classified according to its classification criterion and the subject's gender, physiognomic information is generated by combining the classification results across all criteria. The proposed method was tested on samples from 200 subjects and achieved a classification rate of 85.5% over all facial component features.
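To illustrate how a component measure of the kind described above could be computed from extracted feature points, the sketch below derives a single hypothetical eye-size ratio from a 68-point landmark layout and thresholds it into "large eye" / "small eye" classes. The landmark indices, the ratio, and the threshold are illustrative assumptions standing in for the paper's 16 measures and classification criteria, not the actual definitions used in the proposed method.

```python
import numpy as np

# Indices follow the common 68-point facial landmark convention
# (an assumption, not necessarily the AAM layout used in the paper):
# points 36-41 outline the right eye, points 0-16 trace the jaw line.
RIGHT_EYE = slice(36, 42)
JAW = slice(0, 17)


def eye_size_ratio(landmarks: np.ndarray) -> float:
    """Width of the right eye relative to the width of the face."""
    eye = landmarks[RIGHT_EYE]
    jaw = landmarks[JAW]
    eye_width = eye[:, 0].max() - eye[:, 0].min()
    face_width = jaw[:, 0].max() - jaw[:, 0].min()
    return eye_width / face_width


def classify_eye(landmarks: np.ndarray, threshold: float = 0.22) -> str:
    """Assign a coarse 'large eye' / 'small eye' label.

    The threshold here is a placeholder; the paper's criteria are defined
    per component and per gender from its 16 measures.
    """
    return "large eye" if eye_size_ratio(landmarks) >= threshold else "small eye"


if __name__ == "__main__":
    # Random 68 x 2 points stand in for real AAM output in this sketch.
    rng = np.random.default_rng(0)
    fake_landmarks = rng.uniform(0, 200, size=(68, 2))
    print(classify_eye(fake_landmarks))
```

In the full pipeline such per-component labels would be produced for every facial component and then combined, together with the gender result, to generate the final physiognomic description.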