Dynamic response for motion capture animation

Human motion capture records rich detail and style that are difficult to generate with competing animation synthesis technologies. However, such recorded data requires principled means for creating responses in unpredicted situations, for example reactions immediately following impact. This paper introduces a novel technique for incorporating unexpected impacts into a motion capture-driven animation system by combining a physical simulation that responds to contact forces with a specialized search routine that determines the best plausible re-entry into motion library playback following the impact. Using an actuated dynamic model, our system generates a physics-based response while connecting motion capture segments. Our method allows characters to respond to unexpected changes in the environment based on the specific dynamic effects of a given contact, while also taking advantage of the realistic movement made available through motion capture. We show the results of our system under various conditions and with varying responses, using martial arts motion capture as a testbed.
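The re-entry step described above can be sketched as a nearest-neighbor search over the motion library: at the end of the simulated response, find the library frame whose pose and velocity best match the character's simulated state, then resume playback there. The following is a minimal illustration of that idea, not the paper's actual implementation; all function and parameter names (`reentry_search`, `joint_weights`, `vel_weight`) are hypothetical, and the distance metric (weighted squared joint-angle difference plus a finite-difference velocity term) is an assumed stand-in for whatever metric the system uses.

```python
import numpy as np

def reentry_search(sim_frames, library, joint_weights, vel_weight=0.1):
    """Find the best re-entry point into a motion library after a
    physics-based response (illustrative sketch, not the paper's method).

    sim_frames:    (T, J) array of simulated joint angles, the tail of
                   the dynamic response (T >= 2).
    library:       list of (F, J) arrays, one per motion-capture clip.
    joint_weights: (J,) array weighting each joint's contribution.

    Returns (clip_index, frame_index, cost) of the best-matching frame.
    """
    sim_pose = sim_frames[-1]                  # final simulated pose
    sim_vel = sim_frames[-1] - sim_frames[-2]  # finite-difference velocity
    best = (None, None, np.inf)
    for ci, clip in enumerate(library):
        poses = clip[1:]                 # skip frame 0: need a predecessor
        vels = clip[1:] - clip[:-1]      # per-frame velocities
        # weighted pose distance plus a velocity-matching term
        cost = (joint_weights * (poses - sim_pose) ** 2).sum(axis=1)
        cost += vel_weight * ((vels - sim_vel) ** 2).sum(axis=1)
        fi = int(np.argmin(cost))
        if cost[fi] < best[2]:
            best = (ci, fi + 1, float(cost[fi]))
    return best
```

In practice such a search would also weight velocity matching per joint and restrict candidates to frames from which a smooth blend is possible, but the core operation is this weighted pose-and-velocity comparison against every candidate library frame.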
