Quick transitions using multi-way blends

Many applications require character animation that is both natural-looking and controllable, and these two requirements are often in opposition. For example, motion graphs (e.g., [Kovar et al. 2002]) are a popular technique for creating realistic-looking transitions between frames of motion. However, motion graphs offer limited control: (1) finding the shortest transition between the character's current frame of motion and a desired frame may take a long time (proportional to the square of the number of frames for a dense graph using Dijkstra's algorithm); and (2) even the shortest transition may be too long. This sketch describes an alternative transition mechanism that can generate a transition of user-specified length between any two frames of motion.

A typical motion dataset contains many versions of the same motion, so we first find a set of motion clips that represents the dataset well. We split the motions in the dataset into overlapping clips whose length equals the user-specified transition length, cluster them, and use the cluster medoids as the representative set.

We then seek a natural-looking transition from every member of the representative set to every other member. A transition blends from one motion directly to another (a two-way blend) or passes through intermediate motions along the way (a multi-way blend). We search over all combinations of two-, three-, and four-way blends between the representatives, proposing each candidate to a classifier that decides whether the resulting motion looks realistic. The classifier uses a weighted combination of perceptually important motion features, which measure footskate and zero-moment-point error, and it outperforms other state-of-the-art classifiers (e.g., [Ren et al. 2005]) on our motion databases. For each pair of representatives, we store the blending schedule (i.e., the frames of the source motions used at each tick) of the blend the classifier rates most natural.
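The representative-set step can be sketched as follows. This is a minimal illustration, not the sketch's implementation: the function names (`split_into_clips`, `cluster_medoids`), the Euclidean distance on stacked poses, and the use of naive k-medoids are all assumptions, since the abstract does not specify the clustering algorithm or clip distance.

```python
import numpy as np

def split_into_clips(motion, clip_len):
    """Split one motion (a frames x DOFs array) into overlapping clips
    of the user-specified transition length."""
    return [motion[i:i + clip_len] for i in range(len(motion) - clip_len + 1)]

def cluster_medoids(clips, k, iters=50, seed=0):
    """Naive k-medoids over flattened clips; returns medoid clip indices.
    Euclidean distance between stacked poses is an assumed clip metric."""
    X = np.stack([c.ravel() for c in clips])
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(X), size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)  # assign each clip to nearest medoid
        new = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            # new medoid = cluster member minimizing total distance within the cluster
            new[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids
```

The medoids (rather than cluster means) are used as representatives because each medoid is an actual clip from the dataset, so the representative motions remain valid poses.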
These blending schedules form a transition matrix that lets us transition between any two frames within the user-specified duration. At run time, the application can demand a transition at any moment from the character's current frame to a desired frame. We find the representative closest to the clip of motion starting at the current frame (again of the user-specified duration) and the representative closest to the clip ending at the desired frame. We then look up the blending schedule for this pair of representatives and apply it, substituting the current and desired clips for the representatives.
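The run-time lookup can be sketched as below. This is a hedged illustration under several assumptions: the schedule representation (per-tick lists of (motion index, frame index, weight) triples), the linear blend ramp, the linear pose blending, and all function names are invented here for clarity; the sketch's actual schedules are the ones its classifier rated most natural, and a real system would blend joint rotations appropriately rather than interpolating raw pose vectors.

```python
import numpy as np

def two_way_schedule(n):
    """A placeholder two-way blending schedule over n ticks: each tick lists
    (motion index, frame index, weight) triples, with weights summing to one
    and ramping linearly from the first motion to the second."""
    w = np.linspace(0.0, 1.0, n)
    return [[(0, t, 1.0 - w[t]), (1, t, w[t])] for t in range(n)]

def apply_schedule(schedule, motions):
    """Play back a blending schedule on concrete clips (linear pose blending
    stands in for proper joint-rotation blending)."""
    return np.stack([sum(wt * motions[m][f] for m, f, wt in tick)
                     for tick in schedule])

def transition(current_clip, desired_clip, reps, schedules, dist):
    """Look up the schedule precomputed for the representatives nearest the
    current and desired clips, then run it on the actual clips."""
    i = min(range(len(reps)), key=lambda r: dist(current_clip, reps[r]))
    j = min(range(len(reps)), key=lambda r: dist(desired_clip, reps[r]))
    return apply_schedule(schedules[i][j], [current_clip, desired_clip])
```

Because the schedule is only a table of frame indices and weights, substituting the actual current and desired clips for the representatives is a constant-time lookup, which is what makes the transition available on demand.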

[1] Lucas Kovar, et al. Motion graphs, 2002, ACM Trans. Graph.

[2] Liu Ren, et al. A data-driven approach to quantifying natural human motion, 2005, ACM Trans. Graph.