A Verbal and Gestural Corpus of Story Retellings to an Expressive Embodied Virtual Character

We present a corpus of 44 human-agent verbal and gestural story retellings designed to explore whether humans gesturally entrain to an embodied intelligent virtual agent. We used a novel data collection method in which the agent presented story components in installments, each of which the human retold to the agent; after the final installment, the human retold the story as a whole to the embodied animated agent. This design allowed us to observe whether changes in the agent's gestural behavior produced corresponding changes in the human's gestures. Over the course of the story, the agent gradually shifted its gestural style, beginning the first installment with gestures designed to manifest extraversion and progressively moving toward gestures expressing introversion, or the reverse. The corpus contains the verbal and gestural transcripts of the human story retellings. The gestures were coded for type, handedness, temporal structure, spatial extent, and the degree to which the participants' gestures matched those produced by the agent. The corpus illustrates the variation in expressive behavior produced by users interacting with embodied virtual characters, and the degree to which their gestures were influenced by the agent's dynamic changes in personality-based expressive style.
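As an illustration of these coding dimensions, the sketch below shows one way a single coded gesture could be represented as a per-gesture record. It is a minimal, hypothetical example only: the field names, label sets, and scales are assumptions for illustration and are not the corpus's actual annotation schema or release format.

```python
from dataclasses import dataclass
from enum import Enum


class GestureType(Enum):
    # Broad co-speech gesture categories often used in gesture coding;
    # the label set used in the corpus itself may differ.
    ICONIC = "iconic"
    METAPHORIC = "metaphoric"
    DEICTIC = "deictic"
    BEAT = "beat"


class Handedness(Enum):
    LEFT = "left"
    RIGHT = "right"
    BOTH = "both"


@dataclass
class CodedGesture:
    """One coded gesture in a retelling (illustrative fields only)."""
    participant_id: str
    retelling: str            # e.g. which installment, or "final"
    start_ms: int             # temporal structure: onset of the gesture phrase
    end_ms: int               # temporal structure: offset of the gesture phrase
    gesture_type: GestureType
    handedness: Handedness
    spatial_extent: int       # e.g. an ordinal scale, 1 (small) .. 5 (large)
    matches_agent: bool       # whether the form matches the agent's gesture


# Example record: a two-handed iconic gesture with a large spatial extent
# whose form matches the gesture the agent produced for the same story event.
example = CodedGesture(
    participant_id="P07",
    retelling="final",
    start_ms=12_340,
    end_ms=13_120,
    gesture_type=GestureType.ICONIC,
    handedness=Handedness.BOTH,
    spatial_extent=4,
    matches_agent=True,
)
print(example)
```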
