Frustratingly Easy Personalization for Real-time Affect Interpretation of Facial Expression

In recent years, researchers have developed technology to analyze human facial expressions and other affective data at very high time resolution. This technology is enabling researchers to develop and study interactive robots that are increasingly sensitive to their human interaction partners' affective states. However, typical interaction planning models and algorithms operate on timescales that are frequently orders of magnitude larger than the timescales at which real-time affect data are sensed. To bridge this gap between the scales of sensor data collection and interaction modeling, affective data must be aggregated and interpreted over longer timescales. In this paper, we clarify and formalize the computational task of affect interpretation in the context of an interactive educational game played by a human and a robot, during which facial expression data are sensed, interpreted, and used to predict the interaction partner's gameplay behavior. We compare different affect interpretation techniques for generating sets of affective labels for an interactive modeling and inference task, and we evaluate how the labels generated by each technique impact model training and inference. We show that incorporating a simple form of personalization into the affect interpretation process, dynamically calculating and applying a person-specific threshold for determining affect feature labels over time, leads to a significant improvement in the quality of inference, comparable to the performance gains from other data pre-processing steps such as smoothing the data with a median filter. We discuss the implications of these findings for the future development of affect-aware interactive robots and propose guidelines for the use of affect interpretation methods in interactive scenarios.
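To make the personalized-threshold idea concrete, the following is a minimal sketch in Python. It is not the paper's implementation: the percentile-based update rule, the minimum-history fallback, and the specific window and kernel sizes are illustrative assumptions, and the median-filter step stands in for the smoothing baseline the abstract compares against.

import numpy as np
from scipy.signal import medfilt

class PersonalizedThresholder:
    """Binarize a continuous affect feature stream (e.g., a 0-100
    smile-intensity score) using a threshold recomputed from each
    person's own observation history.

    The percentile-based update rule is an illustrative assumption,
    not the authors' exact formula.
    """

    def __init__(self, percentile=75.0, min_history=30):
        self.percentile = percentile    # history percentile used as the threshold
        self.min_history = min_history  # frames to observe before personalizing
        self.history = []

    def update(self, value):
        """Ingest one frame's feature value; return a binary affect label."""
        self.history.append(value)
        if len(self.history) < self.min_history:
            threshold = 50.0            # generic fallback before enough data is seen
        else:
            # Recompute the person-specific threshold from this
            # individual's accumulated history.
            threshold = np.percentile(self.history, self.percentile)
        return 1 if value >= threshold else 0

def label_stream(values, kernel_size=5):
    """Label a recorded stream frame-by-frame, then smooth the labels
    with a median filter (kernel_size frames, must be odd)."""
    thresholder = PersonalizedThresholder()
    labels = np.array([thresholder.update(v) for v in values], dtype=float)
    return medfilt(labels, kernel_size=kernel_size).astype(int)

For a recorded session, label_stream(values) yields one binary affect label per frame; in a live interaction, PersonalizedThresholder.update can instead be called on each frame as features arrive, so the threshold adapts to the individual over the course of the session.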
