Emotional virtual agents: how do young people decode synthetic facial expressions?

Given the need for remote learning and the growing presence of virtual agents in online learning environments, the present research investigates young people's ability to decode emotional expressions conveyed by virtual agents. The study involves 50 healthy participants aged between 22 and 35 years (mean age = 27.86; SD = ±2.75; 30 females), who were required to label pictures and video clips depicting female and male virtual agents of different ages (young, middle-aged, and old) displaying static and dynamic expressions of disgust, anger, sadness, fear, happiness, surprise, and neutrality. Depending on the emotional category, significant effects of the agents' age, gender, and type of stimulus administered (static vs. dynamic) were observed on participants' decoding accuracy for the virtual agents' emotional faces. Anger was decoded significantly more accurately in male than in female faces, whereas the opposite was observed for happy, fearful, surprised, and disgusted faces. Middle-aged faces were generally decoded more accurately than young and old emotional faces, except for sadness and disgust. Significantly greater accuracy was observed for dynamic than static faces of disgust, sadness, and fear, and conversely for static over dynamic neutral and surprised faces.
