HapFACS 3.0: FACS-Based Facial Expression Generator for 3D Speaking Virtual Characters

With the growing number of researchers interested in modeling the inner workings of affective social intelligence, the need for tools that make it easy to model the associated expressions has emerged. The goal of this article is twofold: 1) we describe HapFACS, a free software application and API that we developed to provide the affective computing community with a resource for producing static and dynamic facial expressions on three-dimensional speaking characters; and 2) we discuss the results of multiple experiments that we conducted to scientifically validate our facial expressions and head animations against the widely accepted Facial Action Coding System (FACS) standard and its Action Units (AUs). As a result, users without any 3D-modeling or computer-graphics expertise can animate speaking virtual characters with realistic, FACS-based facial expression animations and embed these expressive characters in their own applications. The HapFACS software and API can also be used to generate repertoires of realistic, FACS-validated facial expressions, which are useful for testing theories of emotion expression generation.
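
To make the described workflow concrete, the sketch below shows the general shape of driving a FACS-based expression generator: choosing Action Units, assigning them intensities, and playing a dynamic expression through onset, apex, and offset phases. This is a minimal, hypothetical Python illustration, not the actual HapFACS interface (HapFACS itself is distributed as Windows software with an API built on the Haptek character engine); the `ActionUnit`, `Character`, and `play_expression` names are assumptions introduced here for illustration. The AU6 + AU12 combination is the standard EMFACS-style prototype for happiness (a Duchenne smile).

```python
# Hypothetical sketch of a FACS-based expression API in the spirit of HapFACS.
# All names here are illustrative assumptions, not the real HapFACS interface.
import time
from dataclasses import dataclass

@dataclass
class ActionUnit:
    number: int        # FACS AU number (e.g., 6 = Cheek Raiser, 12 = Lip Corner Puller)
    intensity: float   # 0.0 (neutral) to 1.0 (maximum), analogous to FACS levels A-E

class Character:
    """Minimal stand-in for an animatable 3D speaking character."""
    def set_au(self, number: int, intensity: float) -> None:
        print(f"AU{number} -> {intensity:.2f}")

# EMFACS-style happiness prototype: AU6 (Cheek Raiser) + AU12 (Lip Corner Puller).
HAPPINESS = [ActionUnit(6, 0.6), ActionUnit(12, 0.8)]

def play_expression(character, aus, onset_s=0.3, apex_s=1.0, offset_s=0.5, steps=10):
    """Animate a dynamic expression as onset -> apex -> offset."""
    # Onset: ramp each AU linearly from neutral to its target intensity.
    for i in range(1, steps + 1):
        for au in aus:
            character.set_au(au.number, au.intensity * i / steps)
        time.sleep(onset_s / steps)
    # Apex: hold the full expression.
    time.sleep(apex_s)
    # Offset: ramp each AU back down to neutral.
    for i in range(steps - 1, -1, -1):
        for au in aus:
            character.set_au(au.number, au.intensity * i / steps)
        time.sleep(offset_s / steps)

play_expression(Character(), HAPPINESS)
```

In the same spirit, a static expression would simply set each AU to its apex intensity and hold it, which is how a validated repertoire of still expressions could be rendered and captured frame by frame.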
