Evaluating the Effect of Gesture and Language on Personality Perception in Conversational Agents

A significant goal in multi-modal virtual agent research is to determine how to vary the expressive qualities of a character so that it is perceived in a desired way. The "Big Five" model of personality offers a potential framework for organizing these expressive variations. In this work, we focus on one parameter in this model - extraversion - and demonstrate how both verbal and non-verbal factors affect its perception. Relevant findings from the psychology literature are summarized. Based on these, an experiment was conducted with a virtual agent demonstrating how language generation, gesture rate, and a set of movement performance parameters can be varied to increase or decrease perceived extraversion. Each of these factors was shown to have a significant effect. These results offer guidance to agent designers on how best to create specific characters.
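The abstract describes driving perceived extraversion by varying gesture rate and a set of movement performance parameters. A minimal sketch of what such a parameter mapping could look like is shown below; the parameter names, the preset values, and the linear interpolation are all illustrative assumptions, not values taken from the paper.

```python
from dataclasses import dataclass


@dataclass
class GestureStyle:
    gesture_rate: float    # hypothetical: gestures per utterance
    spatial_extent: float  # hypothetical: normalized arm-movement amplitude
    speed_scale: float     # hypothetical: playback-speed multiplier


def style_for_extraversion(level: float) -> GestureStyle:
    """Interpolate between an introverted and an extraverted preset.

    `level` is a target extraversion in [0, 1]. The endpoint presets and
    the linear blend are illustrative only.
    """
    if not 0.0 <= level <= 1.0:
        raise ValueError("extraversion level must be in [0, 1]")
    intro = GestureStyle(gesture_rate=1.0, spatial_extent=0.4, speed_scale=0.8)
    extra = GestureStyle(gesture_rate=4.0, spatial_extent=1.0, speed_scale=1.2)

    def lerp(a: float, b: float) -> float:
        return a + level * (b - a)

    return GestureStyle(
        gesture_rate=lerp(intro.gesture_rate, extra.gesture_rate),
        spatial_extent=lerp(intro.spatial_extent, extra.spatial_extent),
        speed_scale=lerp(intro.speed_scale, extra.speed_scale),
    )
```

In practice, an animation engine would consume such a style record when instantiating gesture prototypes; the point here is only that a single scalar personality target can fan out to several low-level motion parameters.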
