Performance-based control interface for character animation

Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different and much more expressive interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input and, in real time, translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual worlds, leading to two important subproblems: (1) recognizing the user's intention, and (2) simulating the appropriate action based on the intention and virtual context. We solve this issue by first enabling the virtual world's designer to specify possible activities in terms of prominent features of the world, along with associated motion clips depicting interactions. We then integrate the prerecorded motions with online performance and dynamic simulation to synthesize seamless interaction of the virtual character in a simulated virtual world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user's personal style.
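The two subproblems above can be sketched in code. The following is a minimal, hypothetical illustration (not the paper's actual implementation): intention recognition is reduced to a nearest-neighbour match between the live performance's pose features and the feature vectors of designer-registered activity clips, and action synthesis is reduced to a linear blend between the live pose and the prerecorded clip pose. The activity names, feature vectors, and blend scheme are all invented for illustration.

```python
import math

# Hypothetical activity registry: each prominent feature of the virtual
# world (a ladder, a door, ...) is paired with a prerecorded motion clip,
# summarized here as a single pose-feature vector per activity.
ACTIVITIES = {
    "climb_ladder": [0.9, 0.1, 0.8],
    "open_door":    [0.2, 0.9, 0.1],
    "sit_on_chair": [0.1, 0.2, 0.9],
}

def distance(a, b):
    """Euclidean distance between two pose-feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize_intention(performance_features, activities=ACTIVITIES):
    """Subproblem 1: pick the activity whose registered clip is the
    nearest neighbour to the user's live performance features."""
    return min(activities,
               key=lambda name: distance(performance_features, activities[name]))

def blend(live_pose, clip_pose, w):
    """Subproblem 2 (toy version): blend the live mocap pose with the
    prerecorded clip pose; w = 0 keeps the raw performance (preserving
    the user's style), w = 1 snaps fully to the clip."""
    return [(1.0 - w) * l + w * c for l, c in zip(live_pose, clip_pose)]

# Example: a performance close to the ladder-climbing clip is recognized
# as that activity, then half-blended toward the clip.
intent = recognize_intention([0.85, 0.15, 0.75])
pose = blend([0.85, 0.15, 0.75], ACTIVITIES[intent], 0.5)
```

In the actual system, the matching would run continuously over a windowed feature stream and the blend would be mediated by dynamic simulation to keep the result physically valid; the sketch only shows the shape of the decomposition.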
