Force-based motion editing for locomotion tasks

This paper describes a fast technique for modifying motion sequences for complex articulated mechanisms in a way that preserves physical properties of the motion. This technique is relevant to the problem of teaching motion tasks by demonstration, because it allows a single example to be adapted to a range of situations. Motion may be obtained from any source, e.g., it may be captured from a human user. A model of applied forces is extracted from the motion data, and forces are scaled to achieve new goals. Each scaled force model is checked to ensure that frictional and kinematic constraints are maintained for a rigid body approximation of the character. Scale factors can be obtained in closed form, and constraints can be approximated analytically, making motion editing extremely fast. To demonstrate the effectiveness of this approach, we show that a variety of simulated jumps can be created by modifying a single key-framed jumping motion. We also scale a simulated running motion to a new character and to a range of new velocities.
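
To make the force-scaling idea concrete, here is a minimal sketch in Python under a point-mass approximation of the character: a sampled vertical ground-reaction-force profile for a jump is scaled by a closed-form factor so the net vertical impulse yields the takeoff velocity for a new apex height, and the friction-cone constraint is then checked per sample. The numbers, function names, and force profile are illustrative assumptions, not values or routines from the paper.

```python
import numpy as np

# Hypothetical stance-phase force data (illustrative only, not from the paper).
dt = 0.005                       # sample interval [s]
t = np.arange(0.0, 0.30, dt)     # stance duration of 0.30 s
m, g, mu = 60.0, 9.81, 0.8       # mass [kg], gravity [m/s^2], friction coefficient
f_z = m * g + 900.0 * np.sin(np.pi * t / t[-1])   # vertical ground-reaction force [N]
f_x = 120.0 * np.sin(2.0 * np.pi * t / t[-1])     # tangential ground-reaction force [N]

def vertical_scale_for_height(f_z, dt, m, g, h_new):
    """Closed-form scale on the vertical force so the net vertical impulse
    over stance produces the takeoff velocity for apex height h_new."""
    v_new = np.sqrt(2.0 * g * h_new)     # takeoff velocity needed for h_new
    T = dt * len(f_z)                    # stance duration
    impulse = np.sum(f_z) * dt           # unscaled vertical force impulse
    return (m * v_new + m * g * T) / impulse

def satisfies_friction(f_x, f_z, mu):
    """Check |f_x| <= mu * f_z at every sample and that the normal
    force never pulls on the ground (f_z >= 0)."""
    return bool(np.all(f_z >= 0.0) and np.all(np.abs(f_x) <= mu * f_z))

s = vertical_scale_for_height(f_z, dt, m, g, h_new=0.40)
f_z_scaled = s * f_z
print(f"scale factor: {s:.3f}, friction ok: {satisfies_friction(f_x, f_z_scaled, mu)}")
```

Because the scale factor comes from a single impulse balance and the friction check is a per-sample inequality, editing a motion in this simplified setting costs only a pass over the force samples, which mirrors the speed argument made in the abstract.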
