PERFORM: Perceptual Approach for Adding OCEAN Personality to Human Motion Using Laban Movement Analysis

© 2016 ACM. The research reported in this document was performed in connection with Contract Numbers W911NF-07-1-0216 and W911NF-10-2-0016 with the U.S. Army Research Laboratory. The views and conclusions contained in this document are those of the authors and should not be interpreted as presenting the official policies or positions, either expressed or implied, of the U.S. Army Research Laboratory or the U.S. Government unless so designated by other authorized documents. Citation of manufacturers or trade names does not constitute an official endorsement or approval of the use thereof. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.

The purpose and contribution of this work is to describe a formal, broadly applicable, procedural, and empirically grounded association between personality and body motion, and to apply this association to modify any given virtual human body animation that can be represented by these formal concepts. Because the body movement of virtual characters may involve different parameter sets depending on the context, situation, or application, formulating a link from personality to body motion requires an intermediate step to aid generalization. For this intermediate step, we use Laban Movement Analysis (LMA), a technique for systematically describing and evaluating human motion. We developed an expressive human motion generation system with the help of movement experts and conducted a user study to explore how the psychologically validated OCEAN personality factors are perceived in motions with various Laban parameters. We then applied our findings to procedurally animate expressive characters with personality, and validated the generalizability of our approach across different models and animations via another perception study.
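The personality-to-motion link described above can be illustrated as a mapping from OCEAN trait scores to Laban Effort parameters that then drive animation edits. The sketch below is a minimal illustration of that idea only; the trait weights, class names, and value ranges are invented for the example and are not the paper's empirically fitted mapping.

```python
from dataclasses import dataclass

@dataclass
class Ocean:
    """Big Five trait scores, each in [-1, 1] (negative = low pole)."""
    openness: float
    conscientiousness: float
    extroversion: float
    agreeableness: float
    neuroticism: float

# Hypothetical weights mapping OCEAN traits to the four Laban Effort
# factors. Each Effort value lies in [-1, 1], spanning the two poles:
# Space: indirect..direct, Weight: light..strong,
# Time: sustained..sudden, Flow: free..bound.
EFFORT_WEIGHTS = {
    "space":  {"openness": -0.4, "extroversion": 0.3},
    "weight": {"extroversion": 0.5, "neuroticism": -0.3},
    "time":   {"extroversion": 0.4, "conscientiousness": -0.2},
    "flow":   {"neuroticism": 0.5, "agreeableness": -0.3},
}

def ocean_to_effort(p: Ocean) -> dict:
    """Blend trait scores into Effort parameters, clamped to [-1, 1]."""
    traits = vars(p)  # dataclass fields as a name -> value dict
    return {
        factor: max(-1.0, min(1.0, sum(w * traits[t] for t, w in ws.items())))
        for factor, ws in EFFORT_WEIGHTS.items()
    }

# A highly extroverted, slightly neurotic-low character:
effort = ocean_to_effort(Ocean(0.2, 0.0, 0.8, 0.1, -0.5))
```

The resulting Effort values would then parameterize a motion-editing layer (e.g. timing and amplitude adjustments of an existing animation), which is the step the paper validates perceptually.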

[1] S. Gosling et al. A very brief measure of the Big-Five personality domains, 2003.

[2] Michael Neff et al. Evaluating the Effect of Gesture and Language on Personality Perception in Conversational Agents, 2010, IVA.

[3] A. Tellegen et al. An alternative "description of personality": the big-five factor structure, 1990, Journal of Personality and Social Psychology.

[4] M. Knapp et al. Nonverbal communication in human interaction, 1972.

[5] C. Karen Liu et al. Learning physics-based motion style with nonlinear inverse optimization, 2005, ACM Trans. Graph.

[6] Robert R. McCrae et al. The NEO-PI-3: A More Readable Revised NEO Personality Inventory, 2005, Journal of Personality Assessment.

[7] M. Mancini et al. Real-time analysis and synthesis of emotional gesture expressivity, 2007.

[8] Luca Chittaro et al. Behavioral programming of autonomous characters based on probabilistic automata and personality, 2004, Comput. Animat. Virtual Worlds.

[9] Elizabeth A. Crane et al. Motion Capture and Emotion: Affect Detection in Whole Body Movement, 2007, ACII.

[10] P. Ekman et al. Relative importance of face, body, and speech in judgments of personality and affect, 1980.

[11] Dinesh Manocha et al. Simulating heterogeneous crowd behaviors using personality trait theory, 2011, SCA '11.

[12] John Hart et al. ACM Transactions on Graphics, 2004, SIGGRAPH 2004.

[13] Nadia Magnenat-Thalmann et al. A Model for Personality and Emotion Simulation, 2003, KES.

[14] Maurizio Mancini et al. Implementing Expressive Gesture Synthesis for Embodied Conversational Agents, 2005, Gesture Workshop.

[15] Norman I. Badler et al. How the Ocean Personality Model Affects the Perception of Crowds, 2011, IEEE Computer Graphics and Applications.

[16] Carol O'Sullivan et al. Clone attack! Perception of crowd variety, 2008, ACM Trans. Graph.

[17] Kenji Amaya et al. Emotion from Motion, 1996, Graphics Interface.

[18] Norman I. Badler et al. Semantic Segmentation of Motion Capture Using Laban Movement Analysis, 2007, IVA.

[19] Sophie Jörg et al. Evaluating the emotional content of human motions on real and virtual characters, 2008, APGV '08.

[20] Mark C. Coulson. Attributing Emotion to Static Body Postures: Recognition Accuracy, Confusions, and Viewpoint Dependence, 2004.

[21] S. Gosling et al. Personality in its natural habitat: manifestations and implicit folk theories of personality in daily life, 2006, Journal of Personality and Social Psychology.

[22] Michael Neff et al. Two Techniques for Assessing Virtual Agent Personality, 2016, IEEE Transactions on Affective Computing.

[23] Michael Neff et al. Towards Natural Gesture Synthesis: Evaluating Gesture Units in a Data-Driven Approach to Gesture Synthesis, 2007, IVA.

[24] Michael Neff et al. Don't Scratch! Self-adaptors Reflect Emotional Stability, 2011, IVA.

[25] Michael Neff et al. AER: aesthetic exploration and refinement for expressive character animation, 2005, SCA '05.

[26] S. Gosling et al. A room with a cue: personality judgments based on offices and bedrooms, 2002, Journal of Personality and Social Psychology.

[27] I. Bartenieff et al. Body Movement: Coping with the Environment, 1980.

[28] N. Badler et al. Toward Representing Agent Behaviors Modified by Personality and Emotion, 2002.

[29] Aaron Hertzmann et al. Style machines, 2000, SIGGRAPH 2000.

[30] S. Levine et al. Gesture controllers, 2010, ACM Trans. Graph.

[31] Yong Cao et al. Style components, 2006, Graphics Interface.

[32] Mel Slater et al. Building Expression into Virtual Characters, 2006, Eurographics.

[33] Norman I. Badler et al. The EMOTE model for effort and shape, 2000, SIGGRAPH.

[34] Richard H. Bartels et al. Interpolating splines with local tension, continuity, and bias control, 1984, SIGGRAPH.

[35] Lorenzo Torresani et al. Learning Motion Style Synthesis from Perceptual Observations, 2006, NIPS.

[36] Jovan Popovic et al. Style translation for human motion, 2005, ACM Trans. Graph.

[37] Ken-ichi Anjyo et al. Fourier principles for emotion-based human figure animation, 1995, SIGGRAPH.

[38] Zhigang Deng et al. Context-Aware Motion Diversification for Crowd Simulation, 2011, IEEE Computer Graphics and Applications.

[39] Norman I. Badler et al. Interpreting movement manner, 2000, Proceedings Computer Animation 2000.

[40] Dana Kulic et al. Laban Effort and Shape Analysis of Affective Hand and Arm Movements, 2013, Humaine Association Conference on Affective Computing and Intelligent Interaction.

[41] Shih-Pin Chao et al. An LMA-Effort simulator with dynamics parameters for motion capture animation, 2006, Comput. Animat. Virtual Worlds.

[42] Norman I. Badler et al. The effect of posture and dynamics on the perception of emotion, 2013, SAP.

[43] Yuichi Kobayashi et al. EM-in-M: Analyze and Synthesize Emotion in Motion, 2006, IWICPAS.

[44] Norman I. Badler et al. Parametric keyframe interpolation incorporating kinetic adjustment and phrasing control, 1985, SIGGRAPH.

[46] Norman I. Badler et al. Authoring Multi-actor Behaviors in Crowds with Diverse Personalities, 2013, Modeling, Simulation and Visual Analysis of Crowds.

[47] M. North. Personality Assessment Through Movement, 1972.

[48] Norman I. Badler et al. Efficient motion retrieval in large motion databases, 2013, I3D '13.