Japanese Face Emotions Classification Using LIP Features

In this paper, lip features are used to classify human emotions through a set of irregular ellipse fitting equations optimized with a genetic algorithm (GA). Japanese subjects are considered in this study, and all six universally accepted emotions are included in the classification. The lips are commonly regarded as one of the key facial features for recognizing emotion. In this work, three feature extraction methods are proposed and their performances compared for locating the lip features; the fastest of the three is adopted. Observation of the subject's various emotions reveals a unique lip characteristic: the top portion of the lip follows part of one ellipse and the bottom portion part of a different ellipse. A GA is used to optimize these irregular ellipse parameters of the lip features for each emotion. Two ellipse-based fitness equations are proposed for the lip configuration, together with the parameters that define each emotion. The approach yields reasonably successful emotion classification for Japanese subjects.
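The abstract does not give the exact fitness equations, but the general idea of fitting two ellipses (one for the upper lip, one for the lower lip) with a GA can be illustrated with a minimal sketch. The following Python example assumes a simple real-coded GA (elitist selection, blend crossover, Gaussian mutation) and uses the algebraic ellipse residual as the fitness; the chromosome layout and GA operators are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch only: fit two half-ellipses (shared centre and width,
# separate top/bottom heights) to upper/lower lip contour points with a GA.
import numpy as np

def ellipse_residual(points, xc, yc, a, b):
    """Algebraic distance of each (x, y) point from the ellipse
    ((x - xc)/a)^2 + ((y - yc)/b)^2 = 1 (zero when the point lies on the curve)."""
    x, y = points[:, 0], points[:, 1]
    return ((x - xc) / a) ** 2 + ((y - yc) / b) ** 2 - 1.0

def fitness(chrom, upper_pts, lower_pts):
    """Chromosome = (xc, yc, a, b_top, b_bottom); lower value is a better fit."""
    xc, yc, a, b_top, b_bot = chrom
    if a <= 0 or b_top <= 0 or b_bot <= 0:          # reject degenerate ellipses
        return np.inf
    err_top = np.sum(ellipse_residual(upper_pts, xc, yc, a, b_top) ** 2)
    err_bot = np.sum(ellipse_residual(lower_pts, xc, yc, a, b_bot) ** 2)
    return err_top + err_bot

def fit_lip_ellipses(upper_pts, lower_pts, pop_size=60, generations=200, seed=None):
    rng = np.random.default_rng(seed)
    all_pts = np.vstack([upper_pts, lower_pts])
    # Rough parameter bounds derived from the lip point cloud.
    lo = np.array([all_pts[:, 0].min(), all_pts[:, 1].min(), 1.0, 1.0, 1.0])
    hi = np.array([all_pts[:, 0].max(), all_pts[:, 1].max(),
                   np.ptp(all_pts[:, 0]), np.ptp(all_pts[:, 1]), np.ptp(all_pts[:, 1])])
    pop = rng.uniform(lo, hi, size=(pop_size, 5))
    for _ in range(generations):
        scores = np.array([fitness(c, upper_pts, lower_pts) for c in pop])
        elite = pop[np.argsort(scores)[: pop_size // 2]]          # keep best half
        pairs = elite[rng.integers(0, len(elite), (pop_size - len(elite), 2))]
        children = pairs.mean(axis=1)                             # blend crossover
        children += rng.normal(0.0, 0.02, children.shape) * (hi - lo)  # mutation
        pop = np.vstack([elite, children])
    best = min(pop, key=lambda c: fitness(c, upper_pts, lower_pts))
    return dict(zip(["xc", "yc", "a", "b_top", "b_bottom"], best))
```

The fitted parameters (for example the ratio of `b_top` and `b_bottom` to the lip width `a`) could then serve as the emotion-discriminating features; how those parameters map to the six emotions is specific to the paper and not reproduced here.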
