Fully Automatic Analysis of Engagement and Its Relationship to Personality in Human-Robot Interactions

Engagement is crucial to designing intelligent systems that can adapt to the characteristics of their users. This paper focuses on the automatic analysis and classification of engagement based on the humans' and the robot's personality profiles in a triadic human–human–robot interaction setting. More specifically, we present a study in which two participants interact with a humanoid robot, and investigate how the participants' personalities can be used together with the robot's personality to predict the engagement state of each participant. The fully automatic system is first trained to predict the Big Five personality traits of each participant by extracting individual and interpersonal features from their nonverbal behavioural cues. Second, the output of the personality prediction system is used as an input to the engagement classification system. Third, we focus on the concept of "group engagement", which we define as the collective engagement of the participants with the robot, and analyze the impact of similar and dissimilar personalities on engagement classification. Our experimental results show that: 1) using the automatically predicted personality labels for engagement classification yields an F-measure on par with using the manually annotated personality labels, demonstrating the effectiveness of the proposed automatic personality prediction module; 2) using the individual and interpersonal features alone, without personality information, is not sufficient for engagement classification, whereas incorporating the participants' and the robot's personalities with the individual/interpersonal features improves engagement classification performance; and 3) the best classification performance is achieved when the participants and the robot are extroverted, while the worst results are obtained when all are introverted.
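The two-stage pipeline described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the feature dimensions, the synthetic data, and the choice of random-forest models are all assumptions made purely to show how predicted personality traits can be appended to nonverbal features before engagement classification.

```python
# Hypothetical two-stage sketch: Stage 1 predicts Big Five traits from
# individual/interpersonal nonverbal features; Stage 2 appends those
# predictions to the features and classifies engagement.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for the extracted nonverbal feature vectors.
n_samples, n_features = 200, 16
X = rng.normal(size=(n_samples, n_features))

# Synthetic annotations: 5 continuous Big Five trait scores per sample
# and a binary engaged / not-engaged label.
big_five = rng.normal(size=(n_samples, 5))
engaged = rng.integers(0, 2, size=n_samples)

# Stage 1: personality prediction from nonverbal features.
trait_model = RandomForestRegressor(n_estimators=50, random_state=0)
trait_model.fit(X, big_five)
predicted_traits = trait_model.predict(X)

# Stage 2: engagement classification using the features augmented
# with the automatically predicted personality labels.
X_aug = np.hstack([X, predicted_traits])
engagement_model = RandomForestClassifier(n_estimators=50, random_state=0)
engagement_model.fit(X_aug, engaged)
labels = engagement_model.predict(X_aug)
```

In this setup, swapping `predicted_traits` for manually annotated trait scores changes only the Stage 2 input, which is what allows the automatic and manual personality labels to be compared on the same engagement classifier.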
