A puppet interface for retrieval of motion capture data

Intuitive and efficient retrieval of motion capture data is essential for effective use of motion capture databases. In this paper, we describe a system that allows the user to retrieve a particular sequence by performing an approximation of the motion with an instrumented puppet. This interface is intuitive because both adults and children have experience playacting with puppets and toys to express particular behaviors or to tell stories with style and emotion. The puppet has 17 degrees of freedom and can therefore represent a wide variety of motions. We develop a novel similarity metric between puppet and human motion by computing the reconstruction errors of the puppet motion in the latent space of the human motion and those of the human motion in the latent space of the puppet motion. This metric remains effective even for relatively large databases. In a user study of the system, subjects were able to find the desired motion with reasonable accuracy in a database consisting of everyday, exercise, and acrobatic behaviors.
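The bidirectional metric described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes PCA as the latent model (the paper's latent space may differ), that `k=3` is a reasonable latent dimension, and that puppet and human poses have already been mapped to a shared pose representation of equal dimension.

```python
import numpy as np

def pca_fit(X, k):
    # Fit a k-dimensional PCA latent space to pose vectors X (frames x dims).
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]               # mean pose and top-k principal directions

def recon_error(X, mean, basis):
    # Mean squared error after projecting X into the latent space and back.
    Z = (X - mean) @ basis.T          # encode
    Xr = Z @ basis + mean             # decode
    return np.mean((X - Xr) ** 2)

def cross_similarity(puppet, human, k=3):
    # Dissimilarity = reconstruction error of the puppet motion in the
    # human motion's latent space, plus the error of the human motion in
    # the puppet motion's latent space. Lower values mean a better match.
    m_h, B_h = pca_fit(human, k)
    m_p, B_p = pca_fit(puppet, k)
    return recon_error(puppet, m_h, B_h) + recon_error(human, m_p, B_p)
```

For retrieval, `cross_similarity` would be evaluated between the puppet query and each database sequence, and the sequences returned in order of increasing dissimilarity. The symmetric, two-way formulation penalizes both a puppet motion that strays outside the human motion's latent space and a human motion that the puppet's latent space cannot express.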