Data-driven finger motion synthesis for gesturing characters

Capturing the body movements of actors to create animations for movies, games, and VR applications has become standard practice, but finger motions are usually added manually as a tedious post-processing step. In this paper, we present a surprisingly simple method to automate this step for gesturing and conversing characters. In a controlled environment, we carefully captured and post-processed finger and body motions from multiple actors. To augment the body motions of virtual characters with plausible and detailed finger movements, our method selects finger motion segments from the resulting database, taking into account the similarity of the arm motions and the smoothness of consecutive finger motions. We investigate which parts of the arm motion best discriminate gestures using leave-one-out cross-validation and use the result as a metric to select appropriate finger motions. Our approach provides good results for a number of examples with different gesture types and is validated in a perceptual experiment.
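The selection step described above, choosing database finger segments that both match the query arm motion and transition smoothly from one segment to the next, can be sketched as a dynamic-programming search over candidate segments. The sketch below is illustrative only: the function name, the Euclidean arm-feature distance, the single finger-pose summary per segment, and the `smooth_weight` trade-off parameter are all assumptions, not the paper's actual metric or implementation.

```python
import numpy as np

def select_finger_segments(query_arm, db_arm, db_finger, smooth_weight=1.0):
    """Illustrative sketch (not the paper's implementation): pick one database
    finger-motion segment per query segment, minimizing the sum of an
    arm-similarity cost and a transition-smoothness cost via dynamic programming.

    query_arm : (T, D) arm feature per query segment
    db_arm    : (N, D) arm feature per database segment
    db_finger : (N, F) finger-pose summary per database segment (assumed)
    Returns a list of T chosen database indices.
    """
    T, N = len(query_arm), len(db_arm)
    # Arm-matching cost: Euclidean distance between arm features (assumed metric).
    arm_cost = np.linalg.norm(query_arm[:, None, :] - db_arm[None, :, :], axis=2)
    # Smoothness cost: distance between finger poses of consecutive picks.
    trans_cost = np.linalg.norm(db_finger[:, None, :] - db_finger[None, :, :], axis=2)

    total = np.empty((T, N))          # best cumulative cost ending in segment n at step t
    back = np.zeros((T, N), dtype=int)
    total[0] = arm_cost[0]
    for t in range(1, T):
        # step[m, n]: cost of being at m previously, then transitioning to n.
        step = total[t - 1][:, None] + smooth_weight * trans_cost
        back[t] = np.argmin(step, axis=0)
        total[t] = arm_cost[t] + step[back[t], np.arange(N)]

    # Backtrack the minimum-cost sequence of database segments.
    path = [int(np.argmin(total[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]
```

In this framing, `smooth_weight` balances faithfulness to the arm motion against continuity of the synthesized finger motion; a greedy per-segment nearest-neighbor lookup would be cheaper but could produce visible pops between consecutive finger segments.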
