Facial Animation Based on Feature Points

This paper presents a hybrid method for synthesizing natural facial-expression animation from motion-capture data. To enable the reuse of motion data, the captured expression is transferred from the space of the source performance to that of a 3D target face through an accurate mapping process. The transferred animation is then applied to synthesize the target model's expression in a two-stage deformation framework: a local deformation technique first considers, for every vertex, a set of neighboring feature points and their influence on that vertex, and a global deformation is then applied to ensure the smoothness of the whole facial mesh. Experimental results show that the hybrid mesh deformation strategy is effective: it can animate different target faces without the complicated manual effort required by most facial animation approaches. DOI: http://dx.doi.org/10.11591/telkomnika.v11i3.2039
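The core idea of driving a dense mesh from sparse feature-point displacements can be sketched with radial basis function (RBF) interpolation, a standard choice for this kind of scattered-data deformation. The function below is an illustrative sketch only, assuming a Gaussian kernel and a single interpolation pass; the paper's actual local/global two-stage scheme and its kernel choice may differ.

```python
import numpy as np

def rbf_deform(vertices, feature_pts, feature_disp, eps=1.0):
    """Displace mesh vertices from sparse feature-point displacements
    using Gaussian RBF interpolation (hypothetical sketch, not the
    paper's exact deformation scheme).

    vertices:     (V, 3) mesh vertex positions
    feature_pts:  (F, 3) feature-point positions on the mesh
    feature_disp: (F, 3) captured displacement of each feature point
    """
    # Kernel matrix over pairwise feature-point distances
    d = np.linalg.norm(feature_pts[:, None, :] - feature_pts[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)
    # Solve for per-feature-point weights so the interpolant reproduces
    # the given displacements exactly at the feature points
    weights = np.linalg.solve(phi, feature_disp)
    # Evaluate the interpolant at every mesh vertex and displace it
    dv = np.linalg.norm(vertices[:, None, :] - feature_pts[None, :, :], axis=-1)
    return vertices + np.exp(-(eps * dv) ** 2) @ weights
```

Because the weights are solved to match the data exactly, feature points land precisely on their captured targets, while surrounding vertices are blended smoothly according to distance, which is the role the global smoothing stage plays in the abstract's description.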
