Quick Motion Transitions with Cached Multi-way Blends

We describe a method for responsive, high-quality synthesis of human motion. Our method can quickly provide a motion synthesizer with a one-second-long, high-quality transition from any frame in a motion collection to any other frame in the collection. We construct these transitions using 2-, 3-, and 4-way blends. During pre-processing, we search all possible blends between representative samples of motion obtained using clustering. The blends are evaluated automatically with a novel motion evaluation procedure, which we demonstrate is significantly more accurate than current alternatives. The best blending recipe for each pair of representatives is then cached. At run time, we build a transition between motions by matching a future window of the source motion to a representative, matching the past of the target motion to a representative, and then applying the blend recipe recovered from the cache to the source and target motion and whatever stubs are required. This method yields good-looking transitions between distinct motions with very low online cost.
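The precompute-then-lookup structure of the method can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the representatives, candidate-recipe enumeration, evaluation score, and nearest-representative matching are all toy stand-ins (real versions would operate on clustered motion-capture frames and the paper's motion evaluation procedure).

```python
from itertools import product

def precompute_recipes(representatives, candidate_blends, evaluate):
    """Offline: for each ordered pair of representatives, score every
    candidate 2-/3-/4-way blend recipe and cache the best one."""
    cache = {}
    for i, j in product(range(len(representatives)), repeat=2):
        best = max(
            candidate_blends(representatives[i], representatives[j]),
            key=lambda r: evaluate(representatives[i], representatives[j], r),
        )
        cache[(i, j)] = best
    return cache

def transition(source_window, target_window, cache, nearest_rep):
    """Run time: match the source's future window and the target's past
    window to representatives, then fetch the cached recipe --
    a single dictionary lookup, hence very low online cost."""
    return cache[(nearest_rep(source_window), nearest_rep(target_window))]

# Toy stand-ins so the sketch runs end to end (all hypothetical).
reps = [0.0, 1.0, 2.0]                       # cluster representatives (scalars here)
def candidates(a, b):                        # pretend 2-, 3-, and 4-way recipes
    return [("2-way", a, b), ("3-way", a, b), ("4-way", a, b)]
def evaluate(a, b, recipe):                  # pretend motion-quality score
    return {"2-way": 0.5, "3-way": 0.9, "4-way": 0.7}[recipe[0]]
def nearest(window):                         # nearest representative by distance
    return min(range(len(reps)), key=lambda k: abs(reps[k] - window))

cache = precompute_recipes(reps, candidates, evaluate)
best = transition(0.2, 1.9, cache, nearest)
print(best[0])  # prints "3-way": the 3-way recipe scores highest in this toy setup
```

The design point the sketch captures is that all expensive work (enumerating and evaluating blends) happens offline over a small set of representatives, so the run-time cost is just two nearest-representative matches and a cache lookup.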
