Motion Feature Extraction and Stylization for Character Animation using Hilbert-Huang Transform

This paper presents novel insights into feature extraction and stylization of character motion in the instantaneous frequency domain by proposing a method based on the Hilbert-Huang transform (HHT). HHT decomposes human motion capture data in the frequency domain into several pseudo-monochromatic signals, the so-called intrinsic mode functions (IMFs). We propose an algorithm that reconstructs these IMFs and extracts motion features automatically using the Fibonacci sequence in the link-dynamical structure of the human body. Our research revealed that the reconstructed motions can be divided into three main parts: a primary motion and a secondary motion, corresponding to the animation principles, and a basic motion consisting of posture and position. Our method helps animators edit target motions by extracting the primary or secondary motion from a source motion and blending it into the target. To demonstrate results, we applied the proposed method to general motions (jumping, punching, and walking) to achieve different stylizations.
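
The decomposition and grouping described above can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes the open-source PyEMD package for empirical mode decomposition and SciPy's Hilbert transform, decomposes a single synthetic joint-angle channel into IMFs, and groups them into hypothetical "secondary" (high-frequency), "primary" (mid-frequency), and "basic" (low-frequency/trend) components by a simple index split rather than the paper's Fibonacci-based reconstruction rule.

```python
# Minimal sketch (not the authors' implementation): decompose one joint-angle
# channel into IMFs, group them into secondary / primary / basic components by
# a naive index split, and estimate instantaneous frequency via the Hilbert
# transform. The paper's actual grouping uses a Fibonacci-sequence rule over
# the body's link-dynamical structure, which is not reproduced here.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # pip install EMD-signal

fs = 120.0                                      # assumed mocap frame rate (Hz)
t = np.arange(0.0, 4.0, 1.0 / fs)

# Synthetic joint-angle channel: slow posture drift + dominant periodic motion
# + a small high-frequency overlay + noise.
signal = (0.4 * t                               # basic: slow posture/position drift
          + np.sin(2 * np.pi * 1.0 * t)         # primary: dominant periodic motion
          + 0.15 * np.sin(2 * np.pi * 6.0 * t)  # secondary: fast overlaid detail
          + 0.02 * np.random.randn(t.size))

imfs = EMD()(signal)                            # rows ordered from high to low frequency
n = imfs.shape[0]

# Hypothetical grouping: first third of IMFs -> secondary, middle -> primary,
# remaining low-frequency IMFs/trend -> basic motion.
secondary = imfs[: max(1, n // 3)].sum(axis=0)
primary = imfs[max(1, n // 3): max(2, 2 * n // 3)].sum(axis=0)
basic = imfs[max(2, 2 * n // 3):].sum(axis=0)

# Instantaneous frequency of the primary component via the analytic signal.
phase = np.unwrap(np.angle(hilbert(primary)))
inst_freq = np.diff(phase) * fs / (2.0 * np.pi)
print(f"{n} IMFs; mean primary instantaneous frequency: {inst_freq.mean():.2f} Hz")

# Naive stylization sketch: blend the extracted secondary component from a
# source motion onto a target channel of the same length.
def blend_secondary(target_channel, secondary_component, weight=0.5):
    return target_channel + weight * secondary_component
```

In this sketch the blend is a simple weighted addition per channel; the paper's stylization operates on the reconstructed primary or secondary motions of the full body rather than on isolated channels.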