Computational analysis of human-robot interactions through first-person vision: Personality and interaction experience

In this paper, we analyse interactions with Nao, a small humanoid robot, from the viewpoint of human participants through an egocentric camera placed on their forehead. We focus on the personalities of the human participants and of the robot, and on their impact on the human-robot interactions. We automatically extract nonverbal cues (e.g., head movement) from the first-person perspective and explore the relationship between these cues and participants' self-reported personality and interaction experience. We generate two types of behaviour for the robot (i.e., extroverted vs. introverted) and examine how the robot's personality and behaviour affect the findings. Significant correlations are obtained between the extroversion and agreeableness traits of the participants and the perceived enjoyment with the extroverted robot. Plausible relationships are also found between the measures of interaction experience and personality and the first-person vision features. We then use computational models to automatically predict the participants' personality traits from these features. Promising results are achieved for the traits of agreeableness, conscientiousness and extroversion.
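The final prediction step described above can be illustrated with a minimal, hypothetical sketch: regressing a self-reported trait score (e.g., extroversion) on nonverbal first-person vision features. The feature names, synthetic data, and SVR model below are illustrative assumptions, not the paper's actual dataset or method.

```python
# Hypothetical sketch: predict a personality-trait score from
# first-person vision features with leave-one-out cross-validation.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per participant, with columns such as
# [head-motion magnitude, blur level, attention-shift rate, ...].
X = rng.normal(size=(20, 4))
y = X @ np.array([0.8, -0.3, 0.5, 0.1]) + rng.normal(scale=0.2, size=20)

# Standardise features, then fit a linear support-vector regressor.
model = make_pipeline(StandardScaler(), SVR(kernel="linear", C=1.0))

# Leave-one-out predictions: each participant's trait score is predicted
# by a model trained on all the other participants.
pred = cross_val_predict(model, X, y, cv=LeaveOneOut())

# Correlation between predicted and self-reported scores, a common
# evaluation measure for this kind of trait-prediction task.
r = np.corrcoef(pred, y)[0, 1]
print(pred.shape, round(float(r), 2))
```

With real data, the features would come from the egocentric video (e.g., head-movement statistics) and `y` from the participants' Big Five questionnaire scores.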
