The Active Shape Model (ASM) is one of the most popular local texture models for detecting a facial region of interest and locating facial features. This paper presents an efficient method for extracting facial feature points for use on the iPhone. We extend the original ASM algorithm with four modifications to improve its performance: (1) we apply the face detection API included in the iOS Core Image framework to detect the face area and initialize the shape model, (2) we construct a weighted local structure model for the landmarks to exploit the edge points of the face contour, (3) we build a modified model definition that fits more landmarks than the classical ASM, and (4) we extend the profile model to two dimensions for detecting faces within input images. The proposed method is evaluated on an experimental test set of over 500 face images and is found to extract facial feature points successfully, clearly outperforming the original ASM. This approach opens up new applications in the field of computer vision.
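Since the abstract only outlines the pipeline, the following Swift sketch illustrates modification (1): using the Core Image face detector (CIDetector) to find a face rectangle and then placing a mean shape inside it as the ASM's starting estimate. The ASMInitializer type, the normalized meanShape representation, and the largest-face heuristic are illustrative assumptions, not the paper's actual implementation.

```swift
import CoreImage
import CoreGraphics

/// Minimal sketch (assumed design, not the paper's code) of shape-model
/// initialization from the Core Image face detection API.
struct ASMInitializer {

    /// Mean shape landmarks, assumed here to be stored in a normalized
    /// [0, 1] x [0, 1] coordinate frame (hypothetical representation).
    let meanShape: [CGPoint]

    /// Detects the largest face in the image and returns the mean shape
    /// scaled and translated into the detected face rectangle.
    func initialShape(in image: CIImage) -> [CGPoint]? {
        // CIDetector with CIDetectorTypeFace is the face detection API
        // provided by the Core Image framework.
        let options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
        guard let detector = CIDetector(ofType: CIDetectorTypeFace,
                                        context: nil,
                                        options: options),
              let face = detector.features(in: image)
                                 .compactMap({ $0 as? CIFaceFeature })
                                 .max(by: { $0.bounds.width < $1.bounds.width })
        else { return nil }

        // Map each normalized mean-shape landmark into the face bounds.
        // (Core Image uses a lower-left origin; a real pipeline would
        // also handle the coordinate-system flip for UIKit views.)
        let box = face.bounds
        return meanShape.map { p in
            CGPoint(x: box.origin.x + p.x * box.width,
                    y: box.origin.y + p.y * box.height)
        }
    }
}
```

The resulting landmark positions would then serve as the initial estimate that the weighted local structure model and the two-dimensional profile search refine during ASM fitting.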