Fragment-based responsive character motion for interactive games

Fragment-based character animation has become popular in recent years. By stringing appropriate motion capture fragments together, such systems generate realistic character motions that respond to the user's control signals. In this paper, we propose a novel, straightforward, and fast method for building a control policy table that selects the next motion fragment to play based on the user's current input and the previously played fragment. To synthesize the table, we first cluster similar fragments into a set of fragment classes. Dynamic programming is then employed to generate training samples from the user's control signals, and a supervised learning routine converts these samples into the tabular control policy. We demonstrate the efficacy of our method by comparing the motions generated by our controller with those produced by an optimal controller and by previous controllers. The results indicate that although value iteration, a reinforcement learning algorithm, can also produce a tabular control policy, it is more complex and incurs a higher space-time cost when synthesizing the table. Our approach is simple yet efficient, and is practical for interactive character games.
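To make the idea of a tabular control policy concrete, the following is a minimal Python sketch, not the authors' implementation: the fragment-class labels, control signals, and training-sample format are hypothetical stand-ins, and the paper's clustering, dynamic programming, and supervised learning steps are only suggested here by a majority-vote table fill.

```python
# Sketch of a tabular control policy mapping (previous fragment class,
# user control signal) -> next fragment class. Assumes training samples
# were produced offline (e.g. by dynamic programming over the fragments).
from collections import Counter, defaultdict


def train_policy(samples):
    """samples: iterable of (prev_class, control_signal, next_class) tuples."""
    votes = defaultdict(Counter)
    for prev_class, signal, next_class in samples:
        votes[(prev_class, signal)][next_class] += 1
    # "Supervised learning" reduced to a majority vote per table cell.
    return {key: counter.most_common(1)[0][0] for key, counter in votes.items()}


def next_fragment(policy, prev_class, signal, fallback):
    """Runtime lookup: O(1) table access, with a default class for
    (state, signal) pairs never seen during training."""
    return policy.get((prev_class, signal), fallback)


# Toy usage with made-up fragment classes and control signals.
samples = [
    ("walk", "turn_left", "turn_left_step"),
    ("walk", "turn_left", "turn_left_step"),
    ("walk", "stop", "idle"),
]
policy = train_policy(samples)
print(next_fragment(policy, "walk", "turn_left", fallback="walk"))  # turn_left_step
```

At run time the controller only performs a table lookup per fragment transition, which is what makes this approach cheaper than running value iteration when the policy table is synthesized.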
