Quick transitions with cached multi-way blends

We describe a discriminative method for distinguishing natural-looking from unnatural-looking motion, based on physical and data-driven features of motion to which humans seem sensitive. We demonstrate that this technique is significantly more accurate than current alternatives, and we use it as the testing component of a hypothesize-and-test motion synthesis procedure. The mechanism we build with this procedure can quickly provide an application with a transition of user-specified duration from any frame in a motion collection to any other frame in the collection. During pre-processing, we search all possible 2-, 3-, and 4-way blends between representative samples of motion obtained by clustering. The blends are evaluated automatically, and the recipe (i.e., the representatives and the set of weighting functions) that produced the best blend is cached. At run-time, we build a transition between motions by matching a future window of the source motion to one representative, matching the past of the target motion to another, and then applying the cached blend recipe to the source and target motions. Because people seem sensitive to poor contact with the environment, such as sliding foot plants, we determine appropriate temporal and positional constraints for each foot plant using a novel technique, then apply an off-the-shelf inverse kinematics method to enforce them. This synthesis procedure yields good-looking transitions between distinct motions with very low online cost.
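A minimal sketch of the run-time step may make the cache structure concrete. Everything below is illustrative rather than the authors' implementation: the names `Representative`, `Recipe`, `blend_cache`, `nearest_representative`, and `build_transition` are hypothetical, poses are treated as flat vectors blended linearly, and matching uses a plain mean squared pose distance (the actual system would handle joint rotations and temporal alignment more carefully).

```python
from dataclasses import dataclass, field
from typing import List

import numpy as np


@dataclass
class Representative:
    # Clip used for matching, stored as a (duration, dof) array of poses.
    window: np.ndarray


@dataclass
class Recipe:
    # Per-frame weighting functions found offline: weights[k] is a vector
    # over the blended clips at frame k and sums to one.
    weights: np.ndarray
    # Extra representative clips mixed in by 3- and 4-way blends.
    intermediates: List[np.ndarray] = field(default_factory=list)


def nearest_representative(window, representatives):
    # Match a short window of frames against each clustered representative
    # under a simple mean squared pose distance; return the closest index.
    costs = [np.mean((window - rep.window) ** 2) for rep in representatives]
    return int(np.argmin(costs))


def build_transition(source, target, t_src, t_tgt, duration,
                     representatives, blend_cache):
    # 1. Match a future window of the source motion to a representative.
    src_rep = nearest_representative(
        source[t_src:t_src + duration], representatives)
    # 2. Match the past of the target motion to a representative.
    tgt_rep = nearest_representative(
        target[t_tgt - duration:t_tgt], representatives)
    # 3. Recover the blend recipe cached for this pair of representatives
    #    during the pre-processing search.
    recipe = blend_cache[(src_rep, tgt_rep)]
    # 4. Apply the recipe's weighting functions to the actual source and
    #    target frames (first weight -> source, last -> target, middle
    #    weights -> the intermediate clips of a 3- or 4-way blend).
    frames = []
    for k in range(duration):
        w = recipe.weights[k]
        frame = (w[0] * source[t_src + k]
                 + w[-1] * target[t_tgt - duration + k])
        for j, clip in enumerate(recipe.intermediates):
            frame += w[j + 1] * clip[k]
        frames.append(frame)
    return np.stack(frames)
```

Even in this toy form, the design point shows through: all combinatorial search over 2-, 3-, and 4-way blends happens offline, so the online cost of a transition reduces to two nearest-representative matches plus a weighted sum per frame.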
