A Facial Repertoire for Avatars

Facial expressions are becoming increasingly important in computer systems with humanoid user interfaces. Avatars have become popular; however, their facial communication is usually limited. This is partly because many questions, especially concerning the dynamics of expressions, remain open. Moreover, the few commercial facial-animation tools offer limited facilities and are not aimed at lightweight Web applications. In this article we discuss the empirical basis of, and a software tool for, producing faces with emotional expressions and lip-sync. To elicit the characteristics of expressions on real human faces and to map them onto synthetic, non-realistic ones, we analysed expressions on both real and artist-drawn cartoon faces. We developed CharToon, a software tool for constructing faces and animating them, either from scratch or by re-using components of the facial-feature and expression repertoire. CharToon faces can be animated in real time. Applications include 3D faces of avatars in a VRML environment.
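
As a rough illustration of the component-based approach described above, the following Python sketch shows how a face might be assembled from reusable feature components whose animation parameters are keyframed and sampled per frame. This is a minimal sketch under assumed names (FeatureComponent, Face, corner_raise are all hypothetical); it is not CharToon's actual data model or API.

```python
# Hypothetical sketch of a component-based face model: reusable facial-feature
# components carry keyframed animation parameters that are sampled in real time.
# Not CharToon's actual API; all names here are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class FeatureComponent:
    """A reusable facial feature (e.g. an eyebrow or a mouth shape)."""
    name: str
    # Each animation parameter maps to a list of (time_seconds, value) keyframes.
    keyframes: Dict[str, List[Tuple[float, float]]] = field(default_factory=dict)

    def sample(self, param: str, t: float) -> float:
        """Linearly interpolate the parameter's value at time t."""
        keys = self.keyframes.get(param, [])
        if not keys:
            return 0.0
        if t <= keys[0][0]:
            return keys[0][1]
        for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
            if t0 <= t <= t1:
                return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
        return keys[-1][1]


@dataclass
class Face:
    """A face assembled from reusable components of a feature repertoire."""
    components: List[FeatureComponent] = field(default_factory=list)

    def pose_at(self, t: float) -> Dict[str, Dict[str, float]]:
        """Sample every component's parameters at time t (one animation frame)."""
        return {
            c.name: {p: c.sample(p, t) for p in c.keyframes}
            for c in self.components
        }


if __name__ == "__main__":
    # Hypothetical "smile" expression: mouth corners rise over 0.5 seconds.
    mouth = FeatureComponent(
        name="mouth",
        keyframes={"corner_raise": [(0.0, 0.0), (0.5, 1.0)]},
    )
    face = Face(components=[mouth])
    print(face.pose_at(0.25))  # {'mouth': {'corner_raise': 0.5}}
```

In such a scheme, a repertoire of expressions could be stored as sets of keyframe tracks and re-applied to any face that exposes the same parameters, which is the kind of re-use the abstract attributes to CharToon.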
