Fat graphs: constructing an interactive character with continuous controls

This paper proposes a methodology that allows users to control a character's motion interactively and continuously. Inspired by the work of Gleicher et al. [GSKJ03], we propose a semi-automatic method for building fat graphs, in which a node corresponds to a pose and its incoming and outgoing edges represent the motion segments that end at and start from similar poses. Each such group of edges is built into a fat edge that parameterizes its similar motion segments into a blendable form. Employing existing motion transition and blending methods, our run-time system allows users to control a character interactively in continuous parameter spaces with conventional input devices such as joysticks and mice. The capability of the proposed methodology is demonstrated through several applications. Although our method has some limitations in motion repertoire and quality, it can be adapted to a number of real-world applications, including video games and virtual reality.
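
To make the structure concrete, the following is a minimal Python sketch of a fat graph's building blocks, under the assumption that a pose is a flat joint-parameter vector and that the segments grouped into a fat edge have already been time-aligned. The class and method names (FatNode, FatEdge, MotionSegment, blend_weights) and the inverse-distance blending are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of a fat graph: hub nodes (poses) connected by fat edges,
# where each fat edge groups similar motion segments into a blendable form.
# Names and the simple inverse-distance blending are assumptions for
# illustration, not the paper's actual parameterization.

from dataclasses import dataclass, field
from typing import List

import numpy as np


@dataclass
class MotionSegment:
    """One captured clip; its first/last frames match the hub poses of its fat edge."""
    frames: np.ndarray   # (num_frames, num_dofs), assumed time-aligned within a fat edge
    control: np.ndarray  # point in the continuous control space (e.g. speed, turning angle)


@dataclass
class FatNode:
    """A hub: the common pose shared by the segments that start from or end at it."""
    pose: np.ndarray     # (num_dofs,) representative pose


@dataclass
class FatEdge:
    """A group of similar segments between two hubs, blendable by a control value."""
    source: FatNode
    target: FatNode
    segments: List[MotionSegment] = field(default_factory=list)

    def blend_weights(self, control: np.ndarray) -> np.ndarray:
        # Inverse-distance weights in control space, normalized to sum to one.
        dist = np.array([np.linalg.norm(control - s.control) for s in self.segments])
        w = 1.0 / (dist + 1e-6)
        return w / w.sum()

    def synthesize(self, control: np.ndarray) -> np.ndarray:
        # Weighted blend of the time-aligned example segments for this control value.
        w = self.blend_weights(control)
        return sum(wi * s.frames for wi, s in zip(w, self.segments))
```

At run time, a controller would select an outgoing fat edge of the current hub node and call synthesize with the control value read from the input device, transitioning to the edge's target node when the blended segment finishes.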

[1] Lucas Kovar, et al. Flexible automatic motion blending with registration curves, 2003, SCA '03.

[2] Sung Yong Shin, et al. On-line motion blending for real-time locomotion generation, 2004, Comput. Animat. Virtual Worlds.

[3] Sung Yong Shin, et al. Rhythmic-motion synthesis based on motion-beat analysis, 2003, ACM Trans. Graph.

[4] Sung Yong Shin, et al. Planning biped locomotion using motion capture data and probabilistic roadmaps, 2003, ACM Trans. Graph.

[5] Hyun Joon Shin, et al. Snap-together motion: assembling run-time animations, 2003, I3D '03.

[6] Jessica K. Hodgins, et al. Interactive control of avatars animated with human motion data, 2002, SIGGRAPH '02.

[7] Sung Yong Shin, et al. On-line locomotion generation based on motion blending, 2002, SCA '02.

[8] Lucas Kovar, et al. Motion Graphs, 2002, ACM Trans. Graph.

[9] David A. Forsyth, et al. Motion synthesis from annotations, 2003, ACM Trans. Graph.

[10] Lucas Kovar, et al. Automated extraction and parameterization of motions in large data sets, 2004, ACM Trans. Graph.

[11] Jehee Lee, et al. Precomputing avatar behavior from human motion data, 2004, SCA '04.

[12] Michael F. Cohen, et al. Verbs and Adverbs: Multidimensional Motion Interpolation, 1998, IEEE Computer Graphics and Applications.

[13] Tomohiko Mukai, et al. Geostatistical motion interpolation, 2005, SIGGRAPH '05.

[14] Okan Arikan, et al. Interactive motion generation from examples, 2002, ACM Trans. Graph.