Toward Affective XAI: Facial Affect Analysis for Understanding Explainable Human-AI Interactions

As machine learning systems are increasingly used to augment human decision-making, eXplainable Artificial Intelligence (XAI) research has explored methods for communicating system behavior to humans. However, these approaches often fail to account for humans' affective responses as they interact with explanations. Facial affect analysis, which examines human facial expressions of emotion, is one promising lens for understanding how users engage with explanations. In this work, we therefore aim to (1) identify which facial affect features are pronounced when people interact with XAI interfaces, and (2) develop a multitask feature embedding that links facial affect signals with participants' use of explanations. Our analyses show that the occurrence and intensity of facial action units AU1 (inner brow raiser) and AU4 (brow lowerer), together with arousal, are heightened when participants fail to use explanations effectively. These findings suggest that facial affect analysis should be incorporated into XAI both to personalize explanations to individuals' interaction styles and to adapt explanations to the difficulty of the task at hand.
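As a concrete illustration of the first aim, the sketch below shows one way AU occurrence and intensity features could be summarized from OpenFace 2.0 output. This is a minimal sketch, not the authors' published pipeline; the AU01_c/AU01_r column names follow OpenFace's documented per-frame CSV format, and the participant_01.csv file name is a hypothetical placeholder.

```python
# Minimal sketch: summarize AU1/AU4 occurrence and intensity from an
# OpenFace 2.0 per-frame CSV (assumed to already exist on disk).
import pandas as pd

def au_summary(csv_path: str) -> dict:
    df = pd.read_csv(csv_path)
    df.columns = df.columns.str.strip()  # OpenFace pads column names with spaces
    df = df[df["success"] == 1]          # keep only frames where tracking succeeded
    return {
        # *_c columns are binary AU occurrence; *_r columns are 0-5 intensity
        "au1_occurrence": df["AU01_c"].mean(),
        "au4_occurrence": df["AU04_c"].mean(),
        "au1_intensity": df["AU01_r"].mean(),
        "au4_intensity": df["AU04_r"].mean(),
    }

print(au_summary("participant_01.csv"))  # hypothetical file name
```

For the second aim, the following sketch shows one plausible shape for a multitask facial-affect embedding: a shared encoder whose representation feeds both an AU-detection head and a valence/arousal regression head. The architecture, layer sizes, input dimensionality, losses, and task weighting are illustrative assumptions, not the paper's reported design.

```python
# Minimal sketch: shared embedding trained jointly on AU detection and
# valence/arousal regression (illustrative, not the paper's architecture).
import torch
import torch.nn as nn

class MultitaskAffectNet(nn.Module):
    def __init__(self, in_dim: int = 136, emb_dim: int = 64, n_aus: int = 12):
        super().__init__()
        self.encoder = nn.Sequential(             # shared embedding for both tasks
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, emb_dim), nn.ReLU(),
        )
        self.au_head = nn.Linear(emb_dim, n_aus)  # binary AU occurrence logits
        self.va_head = nn.Linear(emb_dim, 2)      # valence and arousal in [-1, 1]

    def forward(self, x):
        z = self.encoder(x)
        return self.au_head(z), torch.tanh(self.va_head(z))

model = MultitaskAffectNet()
x = torch.randn(32, 136)                          # e.g. flattened facial landmarks
au_logits, va = model(x)
au_target = torch.randint(0, 2, (32, 12)).float() # dummy AU occurrence labels
va_target = torch.rand(32, 2) * 2 - 1             # dummy valence/arousal labels
loss = nn.BCEWithLogitsLoss()(au_logits, au_target) + 0.5 * nn.MSELoss()(va, va_target)
loss.backward()
```

Sharing the encoder forces a single representation to serve both tasks, which is what lets downstream analyses relate AU activity and arousal to explanation use within one embedding space.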
