Comparing Methods for Mapping Facial Expressions to Enhance Immersive Collaboration with Signs of Emotion

We present a user study comparing a pre-evaluated mapping approach with a state-of-the-art direct mapping method for facial expressions in emotion judgment within an immersive setting. At its heart, the pre-evaluated approach leverages semiotics, a theory from linguistics. We compare this pre-evaluation with an approach that directly maps real facial expressions onto their virtual counterparts. To evaluate both approaches, we conducted a controlled lab study with 22 participants. The results show that users judge virtual facial expressions significantly more accurately with the pre-evaluated mapping. Participants were also slightly more confident when deciding on a presented emotion. We found no differences regarding potential Uncanny Valley effects. However, the pre-evaluated mapping shows potential to be more convenient in a conversational scenario.
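To make the "direct mapping" baseline concrete, the sketch below shows one common way such a pipeline is realized: tracked FACS action unit (AU) intensities are converted frame by frame into avatar blendshape weights. The AU names and the 0-5 intensity scale follow OpenFace 2.0 conventions; the blendshape names and the one-to-one linear mapping are illustrative assumptions, not the specific method evaluated in the study. A pre-evaluated mapping, by contrast, would trigger whole, previously validated expression poses instead of streaming per-AU values.

```python
# Minimal sketch of a direct AU-to-blendshape mapping (assumed, not the
# paper's implementation). AU intensities in the 0-5 range, as produced by
# a tracker such as OpenFace 2.0, are normalized to blendshape weights in
# the 0-1 range and applied one-to-one.

from typing import Dict

# Hypothetical AU -> blendshape correspondence (one-to-one for brevity).
AU_TO_BLENDSHAPE = {
    "AU06_r": "cheekSquint",   # cheek raiser, part of a Duchenne smile
    "AU12_r": "mouthSmile",    # lip corner puller
    "AU04_r": "browDown",      # brow lowerer, typical of anger
    "AU01_r": "browInnerUp",   # inner brow raiser, typical of sadness/surprise
}

def direct_map(au_intensities: Dict[str, float]) -> Dict[str, float]:
    """Map AU intensities (0-5) linearly onto blendshape weights (0-1)."""
    weights = {}
    for au, shape in AU_TO_BLENDSHAPE.items():
        raw = au_intensities.get(au, 0.0)
        weights[shape] = min(max(raw / 5.0, 0.0), 1.0)  # normalize and clamp
    return weights

# Example frame: a clear smile (strong AU06 + AU12).
print(direct_map({"AU06_r": 3.5, "AU12_r": 4.2}))
# -> {'cheekSquint': 0.7, 'mouthSmile': 0.84, 'browDown': 0.0, 'browInnerUp': 0.0}
```

The appeal of this baseline is its simplicity and latency: every tracked frame drives the avatar directly, with no intermediate classification step. Its weakness, which motivates pre-evaluation, is that tracker noise and unvalidated AU combinations pass straight through to the virtual face.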
