Fast, automatic character animation pipelines

Humanoid three-dimensional (3D) models are readily available from many sources, including online marketplaces. However, using such a model in a game or simulation environment requires human intervention to associate it with a relevant set of motions and control mechanisms. In this paper, we demonstrate a pipeline that incorporates a humanoid 3D model into an animation system within seconds and infuses it with a wide range of capabilities, such as locomotion, object manipulation, gaze control, speech synthesis, and lip syncing. We offer a set of heuristics that associate arbitrary joint names with canonical ones and describe a fast retargeting algorithm that lets us instill a set of behaviors onto an arbitrary humanoid skeleton on the fly. We believe that such a system will vastly increase the use of interactive 3D characters because of the ease with which new models can be animated.
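The abstract does not spell out how the joint-naming heuristics work, but the general idea of mapping arbitrary skeleton joint names onto a canonical joint set can be sketched as below. This is a minimal, illustrative sketch only, not the authors' implementation: the alias table, function names, and substring-matching rule are assumptions made for the example.

```python
# Illustrative sketch of heuristic joint-name mapping (hypothetical, not from the paper):
# guess a canonical joint for each joint in an arbitrary humanoid skeleton by
# normalizing names and matching them against a small alias table.

CANONICAL_ALIASES = {
    "hips":       ["hip", "pelvis", "root"],
    "spine":      ["spine", "chest", "torso"],
    "head":       ["head"],
    "l_shoulder": ["leftshoulder", "lclavicle", "lshoulder", "shoulderl"],
    "l_elbow":    ["leftforearm", "lelbow", "forearml"],
    "l_wrist":    ["lefthand", "lwrist", "lhand", "handl"],
    "l_hip":      ["leftupleg", "lthigh", "lhip", "uplegl"],
    "l_knee":     ["leftleg", "lknee", "legl"],
    "l_ankle":    ["leftfoot", "lankle", "lfoot", "footl"],
    # ... right-side joints would follow the same pattern
}

def normalize(name: str) -> str:
    """Lower-case and strip separators so 'LeftUpLeg' and 'left_up_leg' compare equal."""
    return name.lower().replace("_", "").replace("-", "").replace(" ", "")

def map_joints(skeleton_joints):
    """Return a dict mapping each recognizable joint name to a canonical joint."""
    mapping = {}
    for joint in skeleton_joints:
        key = normalize(joint)
        for canonical, aliases in CANONICAL_ALIASES.items():
            if any(alias in key for alias in aliases):
                mapping[joint] = canonical
                break
    return mapping

if __name__ == "__main__":
    # Joint names as they might appear in a purchased model.
    print(map_joints(["Hips", "Spine1", "LeftUpLeg", "LeftLeg", "LeftFoot", "Head"]))
```

Once such a mapping exists, a retargeting step can transfer canonical-joint rotations onto the target skeleton; the fast retargeting algorithm referred to in the abstract is what makes that transfer possible on the fly.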
