How Do You Like Your Virtual Agent?: Human-Agent Interaction Experience through Nonverbal Features and Personality Traits

Recent studies suggest that the experience of interacting with virtual agents can be described, to a large degree, by people's personality traits. Moreover, a person's nonverbal behavior is known to signal several social constructs across different settings. In this study, we analyze human-agent interaction from the perspective of the human's personality and the nonverbal behaviors they display during the interaction. Building on existing work in psychology, we designed and recorded an experiment on human-agent interaction in which a human communicates with two different virtual agents. Human-agent interactions are described with three self-reported measures: quality, rapport, and likeness of the agent. We investigate the use of self-reported personality traits and extracted audio-visual nonverbal features as descriptors of these measures. A correlation analysis shows significant correlations between the interaction measures and several of the personality traits and nonverbal features, consistent with both the psychology and the human-agent interaction literature. We further use traits and nonverbal cues as features in regression models that predict the measures of interaction experience. The best performance is obtained when nonverbal cues and personality traits are used together.
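
The analysis outlined above can be illustrated with a minimal sketch, assuming per-session Big Five trait scores, a set of extracted audio-visual nonverbal cues, and one self-reported interaction measure such as quality. The data, feature dimensions, and variable names below are hypothetical placeholders rather than the paper's actual pipeline: the sketch runs a per-feature Pearson correlation test and then compares ridge-regression models trained on traits alone, cues alone, and their combination.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Hypothetical data: rows are human-agent interaction sessions.
rng = np.random.default_rng(0)
n = 40                                   # number of sessions (placeholder)
personality = rng.normal(size=(n, 5))    # self-reported Big Five trait scores
nonverbal = rng.normal(size=(n, 8))      # extracted audio-visual nonverbal cues
quality = rng.normal(size=n)             # self-reported interaction quality

# Correlation analysis: test each descriptor against the interaction measure.
features = np.hstack([personality, nonverbal])
for j in range(features.shape[1]):
    r, p = pearsonr(features[:, j], quality)
    if p < 0.05:
        print(f"feature {j}: r = {r:.2f}, p = {p:.3f}")

# Regression: compare traits alone, cues alone, and their combination.
for name, X in [("traits", personality),
                ("cues", nonverbal),
                ("traits + cues", features)]:
    scores = cross_val_score(Ridge(alpha=1.0), X, quality,
                             scoring="r2", cv=5)
    print(f"{name}: mean cross-validated R^2 = {scores.mean():.2f}")
```

With real session data in place of the random arrays, the combined feature set would be expected to yield the highest cross-validated score, mirroring the reported finding that traits and nonverbal cues are most informative when used together.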
