Music similarity-based approach to generating dance motion sequence

In this paper, we propose a novel approach to generating a sequence of dance motions that uses music similarity as the criterion for finding appropriate motions for a new musical input. Based on the observation that dance motions used with similar musical pieces can serve as a good reference when choreographing a new dance, we first construct a music-motion database comprising a number of segment-wise music-motion pairs. When a new musical input is given, it is divided into short segments, and for each segment our system suggests dance motion candidates by retrieving from the database the music cluster most similar to that segment. After the user selects the best motion segment, we synchronize music and dance by cross-correlating the novelty functions of the two music segments. We evaluate the system's performance through a user study; the results show that the dance motion sequences generated by our system receive significantly higher ratings than randomly generated ones.
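As a rough illustration of the synchronization step, the sketch below is not taken from the paper: it assumes librosa's onset-strength envelope as the novelty function, SciPy's correlate for the cross-correlation, and illustrative hop-length and sample-rate values. It estimates the time offset that best aligns a retrieved database segment's music with the input segment, which is the quantity needed to shift the corresponding dance motion.

```python
# Minimal sketch of novelty-based music-dance synchronization (assumptions noted above).
import numpy as np
import librosa
from scipy.signal import correlate

HOP = 512    # hop length in samples (assumed, not from the paper)
SR = 22050   # sampling rate in Hz (assumed)

def novelty(path):
    """Novelty curve of an audio segment, here the onset strength envelope."""
    y, sr = librosa.load(path, sr=SR)
    return librosa.onset.onset_strength(y=y, sr=sr, hop_length=HOP)

def best_offset(input_novelty, db_novelty):
    """Time offset (seconds) that best aligns the database segment with the
    input segment, taken as the peak of their cross-correlation."""
    xc = correlate(input_novelty, db_novelty, mode="full")
    lag = np.argmax(xc) - (len(db_novelty) - 1)   # lag in frames
    return lag * HOP / SR                          # lag in seconds

# Hypothetical usage: shift the retrieved dance motion by this many seconds
# so its music's novelty peaks line up with those of the input segment.
offset_sec = best_offset(novelty("input_segment.wav"),
                         novelty("db_segment.wav"))
```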
