Motion patches: building blocks for virtual environments annotated with motion data

Real-time animation of human figures is an important problem in computer games and virtual environments. Recently, the use of large collections of captured motion data has increased realism in character animation. However, when the virtual environment is large and complex, the effort of capturing motion data in a physical setting and adapting it to the extended virtual environment becomes the bottleneck for achieving interactive character animation and control. We present a new technique that allows animated characters to navigate a large virtual environment constructed from a set of building blocks. These building blocks, called motion patches, can be arbitrarily assembled to create novel environments. Each patch is annotated with motion data that specifies the actions available to animated characters within the patch. The versatility and flexibility of our approach are demonstrated through examples in which multiple characters are animated and controlled at interactive rates in large, complex virtual environments.
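The core idea of the abstract can be illustrated with a small sketch: patches are reusable tiles, each annotated with the motions a character may perform inside it, and an environment is assembled by tiling patches. All names below are hypothetical illustrations, not the authors' actual data structures or API.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a "motion patch": a building block whose geometry
# is annotated with the motion clips usable inside it. Names are invented
# for this example and do not come from the paper.

@dataclass
class MotionPatch:
    name: str                                        # e.g. "walk", "stairs"
    motions: list                                    # annotated motion clips
    connectors: dict = field(default_factory=dict)   # side -> compatible patches

def assemble(grid_spec, library):
    """Tile a virtual environment from patch names laid out on a 2D grid."""
    return [[library[name] for name in row] for row in grid_spec]

def available_actions(env, row, col):
    """Actions a character standing on patch (row, col) may take."""
    return env[row][col].motions

# Usage: assemble a 2x2 environment from two patch types.
library = {
    "walk":   MotionPatch("walk",   ["stroll", "run"]),
    "stairs": MotionPatch("stairs", ["climb", "descend"]),
}
env = assemble([["walk", "stairs"], ["stairs", "walk"]], library)
print(available_actions(env, 0, 1))  # → ['climb', 'descend']
```

Because each patch carries its own motion annotations, a character's controller only needs to query the patch it currently occupies, which is what makes interactive rates plausible even in large assembled environments.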
