Expressing animated performances through puppeteering

An essential form of communication between the director and the animators early in the animation pipeline is a rough cut of the motion (a blocked-in animation). This version of the character's performance allows the director and animators to discuss how the character will play its role in each scene. However, blocked-in animation is also quite time-consuming to construct, with short scenes requiring many hours of preparation between presentations. In this paper, we present a puppeteering interface for creating blocked-in motion for characters and various simulation effects more quickly than is possible with a keyframing interface. The animator manipulates one of a set of tracked objects in a motion capture system to control a few of the character's degrees of freedom on each take. We explore the design space of the 3D puppeteering interface with seven professional animators using a "think-aloud" protocol. We present a number of animations they created and compare the time required to create similar animations in our 3D user interface and in a commercial keyframing interface.
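
To make the take-based control scheme concrete, here is a minimal sketch of how a few degrees of freedom recorded on each take might be layered into one performance. It is illustrative only: the `Take` structure, `compose_pose` function, and the stand-in tracked-prop samples are our own assumptions, not an API from the paper.

```python
import numpy as np

class Take:
    """One recorded layer: the DOF indices it drives and their values per frame."""
    def __init__(self, dof_indices):
        self.dof_indices = list(dof_indices)
        self.frames = []  # each entry: np.ndarray of len(dof_indices) DOF values

    def record_frame(self, dof_values):
        self.frames.append(np.asarray(dof_values, dtype=float))

def compose_pose(rest_pose, takes, frame):
    """Layer takes over the rest pose; later takes override earlier ones."""
    pose = np.array(rest_pose, dtype=float)
    for take in takes:
        if frame < len(take.frames):
            pose[take.dof_indices] = take.frames[frame]
    return pose

# Example: a 5-DOF character. Take 1 blocks in root translation (DOFs 0-2);
# take 2 then layers a head nod (DOF 4) on top while take 1 plays back.
rest = np.zeros(5)
take1 = Take([0, 1, 2])
for f in range(3):
    take1.record_frame([0.1 * f, 0.0, 0.2 * f])  # stand-in for tracked-prop samples
take2 = Take([4])
for f in range(3):
    take2.record_frame([np.sin(0.5 * f)])
for f in range(3):
    print(compose_pose(rest, [take1, take2], f))
```

In the actual system, the per-frame values would come from the motion capture stream of the tracked prop rather than from synthetic samples.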
