Modified Active Shape Model for Real-Time Facial Feature Tracking on iPhone

The Active Shape Model (ASM) is one of the most popular local texture models for detecting the face region of interest and locating facial features. This paper presents an efficient method for extracting facial feature points on the iPhone. We extend the original ASM algorithm with four modifications that improve its performance: (1) we apply the face detection API of the iOS CoreImage framework to detect the face area and initialize the shape model, (2) we construct a weighted local structure model for landmarks to exploit the edge points of the face contour, (3) we build a modified model definition that fits more landmarks than the classical ASM, and (4) we extend the profile model to two dimensions for detecting faces in input images. The proposed method is evaluated on an experimental test set of over 500 face images and successfully extracts facial feature points, clearly outperforming the original ASM. This approach opens up new applications in the field of computer vision.
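To make the local search step concrete, the following is a minimal, language-agnostic sketch of the classical ASM profile matching that modification (2) builds on: each landmark samples a 1-D grey-level profile along the boundary normal and picks the offset minimizing the Mahalanobis distance to the trained mean profile, optionally scaled by a per-landmark weight. This is an illustration under stated assumptions, not the authors' implementation; all names (`best_offset`, `weight`) are hypothetical.

```python
import numpy as np

def mahalanobis_cost(profile, mean_profile, inv_cov):
    # f(g) = (g - g_mean)^T S^{-1} (g - g_mean), the classical ASM profile cost
    d = profile - mean_profile
    return float(d @ inv_cov @ d)

def best_offset(sampled, mean_profile, inv_cov, weight=1.0):
    # Slide the k-sample model profile along a longer sampled profile and
    # return the offset with the lowest (weighted) cost. `weight` stands in
    # for the per-landmark confidence used in a weighted local structure model.
    k = len(mean_profile)
    costs = [weight * mahalanobis_cost(sampled[i:i + k], mean_profile, inv_cov)
             for i in range(len(sampled) - k + 1)]
    i_best = int(np.argmin(costs))
    return i_best, costs[i_best]
```

In a full tracker, this search runs once per landmark per iteration, and the resulting displacements are projected back onto the statistical shape model to keep the overall shape plausible.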