Simple Steps for Simply Stepping

We introduce a general method for animating controlled stepping motion for use in combining motion capture sequences. Our stepping algorithm is characterized by two simple models that idealize the movement of the stepping foot and of the projected center of mass, based on observations from a database of step motions. We draw a parallel between stepping and point-to-point reaching to motivate our foot model, and we employ an inverted pendulum model common in robotics for the center of mass. Our system computes path and speed profiles from each model and then adapts an interpolation to follow the synthesized trajectories in the final motion. We show that our animations can be enriched through the use of step examples, but also that we can synthesize stepping to create transitions between existing segments without the need for a motion example. We demonstrate that our system can generate precise, realistic stepping for a number of scenarios.
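
As a rough illustration of the two idealized models named above, the sketch below pairs a minimum-jerk profile (the standard model of point-to-point reaching) for the stepping foot with the linear inverted pendulum for the projected center of mass. This is a minimal sketch under those assumptions only; the function names, parameters, and example numbers are illustrative and are not taken from the paper.

```python
# Illustrative trajectory generators for the two idealized models described
# in the abstract. Parameter names (T, z_c, pivot, ...) are assumptions,
# not the paper's notation.
import math

def min_jerk(p0, p1, T, t):
    """Minimum-jerk interpolation from p0 to p1 over duration T
    (the bell-shaped speed profile of point-to-point reaching)."""
    s = min(max(t / T, 0.0), 1.0)                 # normalized time in [0, 1]
    blend = 10*s**3 - 15*s**4 + 6*s**5            # smooth position profile
    speed = (30*s**2 - 60*s**3 + 30*s**4) / T     # bell-shaped speed profile
    return p0 + (p1 - p0) * blend, (p1 - p0) * speed

def lipm_com(x0, v0, pivot, z_c, t, g=9.81):
    """Horizontal COM state under the linear inverted pendulum model:
    x'' = (g / z_c) * (x - pivot), with constant COM height z_c."""
    Tc = math.sqrt(z_c / g)                       # pendulum time constant
    c, s = math.cosh(t / Tc), math.sinh(t / Tc)
    x = (x0 - pivot) * c + Tc * v0 * s + pivot
    v = (x0 - pivot) / Tc * s + v0 * c
    return x, v

if __name__ == "__main__":
    # Foot: a 0.4 m step completed in 0.5 s; COM: pivoting over the stance foot.
    for t in (0.0, 0.25, 0.5):
        foot_x, foot_v = min_jerk(0.0, 0.4, 0.5, t)
        com_x, com_v = lipm_com(x0=-0.1, v0=0.3, pivot=0.0, z_c=0.9, t=t)
        print(f"t={t:.2f}s  foot x={foot_x:.3f} v={foot_v:.3f}  "
              f"COM x={com_x:.3f} v={com_v:.3f}")
```

In this simplified form, each model yields both a path and a speed profile, which is the information the abstract describes as driving the adapted interpolation in the final motion.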
