Eye state tracking for face cloning

This article presents an efficient approach to eye movement estimation that combines color-based and energy-based image analysis algorithms. The movement is first analyzed and then described in terms of action units. A temporal state diagram controls the behavior of the analysis over time, so that the eye movements can be synthesized from this description after it is translated into facial animation parameters.
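To make the pipeline concrete, the following is a minimal sketch, not the paper's implementation, of how a per-frame openness score built from a color cue and a gradient-energy cue could drive a small temporal state diagram and a normalized eyelid-closure value. The cues, thresholds, 50/50 weighting, and names (`openness_score`, `EyeStateTracker`) are illustrative assumptions; converting the closure value into actual MPEG-4 facial animation parameters is left out.

```python
import numpy as np
from enum import Enum, auto

class EyeState(Enum):
    OPEN = auto()
    CLOSING = auto()
    CLOSED = auto()
    OPENING = auto()

def openness_score(eye_patch_rgb):
    """Combine a color cue and an energy cue into one [0, 1] openness score.

    Both cues and their equal weighting are illustrative assumptions,
    not the paper's exact measurements.
    """
    patch = eye_patch_rgb.astype(np.float32) / 255.0
    brightness = patch.mean(axis=2)
    saturation = patch.max(axis=2) - patch.min(axis=2)
    # Color cue: bright, low-saturation pixels are treated as visible sclera.
    sclera_ratio = float(np.mean((brightness > 0.6) & (saturation < 0.25)))
    # Energy cue: vertical gradient energy is higher when the iris/eyelid
    # boundary is visible (eye open) than on a smooth, closed eyelid.
    grad_y = np.abs(np.diff(brightness, axis=0))
    energy = float(np.clip(grad_y.mean() * 10.0, 0.0, 1.0))
    return 0.5 * sclera_ratio + 0.5 * energy

class EyeStateTracker:
    """Temporal state diagram gating the per-frame analysis.

    The hysteresis thresholds (0.35 / 0.55) are placeholders.
    """
    def __init__(self, low=0.35, high=0.55):
        self.low, self.high = low, high
        self.state = EyeState.OPEN

    def update(self, score):
        # Hysteresis: a low score is needed to leave the open states,
        # a high score to leave the closed states.
        if self.state in (EyeState.OPEN, EyeState.OPENING):
            self.state = EyeState.CLOSING if score < self.low else EyeState.OPEN
        else:  # CLOSING or CLOSED
            self.state = EyeState.OPENING if score > self.high else EyeState.CLOSED
        return self.state

    @staticmethod
    def closure(score):
        """Normalized eyelid closure in [0, 1]; 1.0 means fully closed."""
        return 1.0 - float(np.clip(score, 0.0, 1.0))

# Usage on synthetic frames (random patches stand in for cropped eye regions).
if __name__ == "__main__":
    tracker = EyeStateTracker()
    rng = np.random.default_rng(0)
    for frame in range(5):
        patch = rng.integers(0, 256, size=(24, 32, 3), dtype=np.uint8)
        s = openness_score(patch)
        state = tracker.update(s)
        print(frame, state.name, round(tracker.closure(s), 2))
```

In this sketch the state diagram only smooths the noisy per-frame score; in the article it additionally controls which analysis steps run over time before the movements are re-synthesized on the clone.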
