Music-driven motion editing: local motion transformations guided by music analysis

This paper presents a general framework for synchronising motion curves to music in computer animation. Motions are locally modified using perceptual cues extracted from the music. The key to this approach is applying standard music-analysis techniques to complementary MIDI and audio representations of the same soundtrack; the resulting musical features then guide the motion-editing process, letting users easily combine different aspects of the music with different aspects of the motion.
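The abstract does not spell out the editing operation itself, but the idea of locally modifying a motion curve around musical events can be sketched as follows. This is an illustrative example only, not the paper's actual method: it assumes beat times have already been extracted from the MIDI or audio analysis, and applies a hypothetical Gaussian emphasis envelope to a toy motion curve.

```python
import numpy as np

def beat_emphasis(motion, t, beat_times, gain=0.5, width=0.1):
    """Locally amplify a sampled motion curve around beat times.

    Hypothetical illustration: each beat contributes a Gaussian bump
    (std. dev. `width` seconds) to an emphasis envelope, and the motion
    is scaled by (1 + gain * envelope), so edits stay local to beats.
    """
    env = np.zeros_like(t)
    for b in beat_times:
        env += np.exp(-0.5 * ((t - b) / width) ** 2)
    return motion * (1.0 + gain * env)

# Toy motion curve and beat times (in a real system the beats would
# come from music analysis of the soundtrack).
t = np.linspace(0.0, 4.0, 401)
motion = np.cos(2.0 * np.pi * t)
beats = [0.0, 1.0, 2.0, 3.0]
edited = beat_emphasis(motion, t, beats)
```

Away from the beats the envelope is near zero, so the curve is left essentially unchanged; near each beat the local amplitude grows by roughly the `gain` factor.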
