Precision: precomputing environment semantics for contact-rich character animation

The widespread availability of high-quality motion capture data and the maturity of character animation techniques have paved the way for a new generation of interactive virtual worlds that exhibit intricate interactions between characters and the environments they inhabit. However, current motion synthesis techniques were not designed to scale to complex environments and contact-rich motions, forcing environment designers to manually embed motion semantics in the environment geometry to support online motion synthesis. This paper presents an automated approach for analyzing both motions and environments in order to represent the different ways in which an environment affords a character to move. We extract the salient features that characterize the contact-rich motion repertoire of a character and detect valid transitions in the environment where each of these motions may be possible, along with additional semantics indicating which surfaces of the environment the character may use for support during the motion. The precomputed motion semantics can be easily integrated into standard navigation and animation pipelines to greatly enhance the motion capabilities of virtual characters. The computational efficiency of our approach enables two additional applications. Environment designers can interactively author new environments and receive instant feedback on how characters may potentially interact with them, supporting iterative modeling and refinement. End users can dynamically edit virtual worlds, and characters will automatically accommodate the changed geometry in their movement strategies.
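The core precomputation described above can be illustrated with a minimal sketch. All names and data below are hypothetical simplifications, not the paper's actual representation: each motion clip is summarized by the support contacts it requires (here, height bands relative to the character's root over a 1-D height field), and the environment is scanned exhaustively for placements whose surfaces satisfy every contact requirement.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Contact:
    """A support the motion needs, relative to the start position: the
    supporting surface at a given forward offset must fall in a height band."""
    offset: float   # forward offset from the start cell
    min_h: float    # lowest acceptable relative surface height
    max_h: float    # highest acceptable relative surface height

@dataclass(frozen=True)
class MotionSemantics:
    name: str
    contacts: tuple  # Contact requirements extracted from the clip

def valid_transitions(height_field, motion):
    """Return the cells of a 1-D height field where the motion can start:
    every required contact must find a surface inside its height band."""
    placements = []
    for x in range(len(height_field)):
        base = height_field[x]
        ok = True
        for c in motion.contacts:
            ix = x + int(round(c.offset))
            if ix >= len(height_field):
                ok = False
                break
            h = height_field[ix] - base  # surface height relative to start
            if not (c.min_h <= h <= c.max_h):
                ok = False
                break
        if ok:
            placements.append(x)
    return placements

# A vault needs a roughly waist-high obstacle one cell ahead
# and level ground two cells ahead.
vault = MotionSemantics("vault",
                        (Contact(1.0, 0.8, 1.2), Contact(2.0, -0.1, 0.1)))
terrain = [0.0, 0.0, 1.0, 0.0, 0.0, 2.0, 0.0]
print(valid_transitions(terrain, vault))  # -> [1]
```

The returned placements would then be cached and attached to the navigation representation, so the runtime planner simply looks up where the vault is available instead of testing geometry online; editing the terrain only requires re-running the scan.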
