Simulating competitive interactions using singly captured motions

Creating scenes in which multiple avatars fight or compete with each other is difficult. Manually authoring the avatars' motions is time consuming because the movements of the avatars are tightly correlated, and capturing the motions of multiple avatars simultaneously requires a large amount of post-processing. In this paper, we propose a new method for generating realistic scenes of avatars densely interacting in a competitive environment. The motions of the avatars are assumed to be captured individually, which makes the data much easier to obtain. We propose a new algorithm, called the temporal expansion approach, that maps the continuous-time action plan to a discrete space so that turn-based evaluation methods can be used. As a result, mature game-tree algorithms such as min-max search and α-β pruning can be applied. Using our method, avatars plan their strategies while taking into account the expected reactions of the opponent. Fighting scenes with multiple avatars are generated to demonstrate the effectiveness of our algorithm. The proposed method can also be applied to other continuous activities that require strategic planning, such as sports.
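The abstract itself contains no code, but the core idea of the temporal expansion approach, discretizing continuous-time action choices into alternating turns so that standard game-tree search applies, can be illustrated with a minimal min-max search with α-β pruning. The sketch below is only an illustration under assumed simplifications: the action set, clip durations, and the scalar payoff are hypothetical placeholders, not the authors' motion data or evaluation function.

```python
# Minimal sketch: turn-based planning over individually captured action clips.
# Each "action" stands in for a short motion clip with a duration and a toy
# payoff; a real system would score motion quality, distance, hits, etc.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class State:
    time_a: float = 0.0   # time already committed by avatar A
    time_b: float = 0.0   # time already committed by avatar B
    score: float = 0.0    # accumulated advantage for A (placeholder metric)

# Hypothetical action set: (name, duration, payoff for the acting avatar).
ACTIONS = [("punch", 0.4, 2.0), ("kick", 0.7, 3.0), ("dodge", 0.3, 0.5)]

def expand(state, a_turn):
    """Temporal expansion: the avatar that lags behind in committed time
    chooses the next clip, turning the continuous timeline into turns."""
    for name, dur, gain in ACTIONS:
        if a_turn:
            yield name, replace(state, time_a=state.time_a + dur,
                                score=state.score + gain)
        else:
            yield name, replace(state, time_b=state.time_b + dur,
                                score=state.score - gain)

def alphabeta(state, depth, alpha, beta):
    if depth == 0:
        return state.score
    a_turn = state.time_a <= state.time_b   # whoever is "behind" acts next
    if a_turn:                              # maximizing avatar A
        best = float("-inf")
        for _, child in expand(state, True):
            best = max(best, alphabeta(child, depth - 1, alpha, beta))
            alpha = max(alpha, best)
            if alpha >= beta:
                break                       # β cut-off
        return best
    else:                                   # minimizing opponent B
        best = float("inf")
        for _, child in expand(state, False):
            best = min(best, alphabeta(child, depth - 1, alpha, beta))
            beta = min(beta, best)
            if alpha >= beta:
                break                       # α cut-off
        return best

def plan(state, depth=4):
    """Pick the next clip for whichever avatar's turn it is."""
    a_turn = state.time_a <= state.time_b
    best_name = None
    best_val = float("-inf") if a_turn else float("inf")
    for name, child in expand(state, a_turn):
        val = alphabeta(child, depth - 1, float("-inf"), float("inf"))
        if (a_turn and val > best_val) or (not a_turn and val < best_val):
            best_name, best_val = name, val
    return best_name

if __name__ == "__main__":
    print(plan(State()))   # prints the clip chosen for avatar A at time zero
```

In this toy version the turn order falls out of the clip durations: whichever avatar has committed less time acts next, which is what lets a continuous, overlapping timeline be searched as if it were a turn-based game.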
