Assessing the Impact of Hand Motion on Virtual Character Personality

Designing virtual characters capable of conveying a sense of personality is important for generating realistic experiences, and thus a key goal in computer animation research. Though the influence of gesture and body motion on personality perception has been studied, little is known about which attributes of hand pose and motion convey particular personality traits. Using the "Big Five" model as a framework for evaluating personality traits, this work examines how variations in hand pose and motion affect the perception of a character's personality. As has been done with facial motion, we first study hand motion in isolation. Isolating the hands is a requirement for running controlled experiments: it avoids the combinatorial explosion of multimodal communication (all combinations of facial expressions, arm movements, body movements, and hands) and allows us to understand the communicative content of the hands themselves. Based on research in psychology and previous work on human motion perception, we identified a set of features likely to reflect personality: shape, direction, amplitude, speed, and manipulation. We then captured realistic hand motion varying these attributes and conducted three perceptual experiments to determine the contribution of each attribute to the character's perceived personality. Both hand pose and the amplitude of hand motion affected the perception of all five personality traits. Speed affected all traits except openness. Direction affected extraversion and openness. Manipulation was perceived as an indicator of introversion, disagreeableness, neuroticism, and lower openness to experience. From these results, we derive guidelines for designing detailed hand motion that can add to the expressiveness and personality of characters. Finally, we performed an evaluation study that combined hand motion with gesture and body motion. Even in the presence of body motion, hand motion still significantly affected the perception of a character's personality and could even be the dominant factor in certain situations.