Autonomous robot dancing driven by beats and emotions of music

Many robot dances are preprogrammed by choreographers for a particular piece of music so that the motions can be executed smoothly and synchronized to the music. We are interested in automating robot dance choreography so that robots can dance to any music without detailed human planning, with movements that are synchronized to the beats and reflect the emotion of the music. Our work consists of two parts: (1) a planning algorithm that generates a sequence of dance movements driven by the beats and emotions detected by preprocessing the selected dance music, and (2) a real-time synchronization algorithm that minimizes the error between the executed motions and the plan. We build on previous research for extracting beats and emotions from music audio. We created a library of parameterized motion primitives, where each primitive is composed of a set of keyframes and durations, and we generate the sequence of dance movements from this library. We demonstrate the feasibility of our algorithms on the NAO humanoid robot, showing that it can use the defined mappings to dance autonomously to any music. Although we present our work on a humanoid robot, our algorithms are applicable to other robots.
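As a rough illustration of the ideas in the abstract, the following Python sketch shows one possible way to represent parameterized motion primitives (keyframes plus durations) and to sequence them against detected beats and an emotion label. The names MotionPrimitive and plan_dance, the emotion labels, and the joint-angle values are hypothetical; this is a minimal sketch of the general approach, not the authors' implementation, and it omits the real-time synchronization step that would adjust durations against actual execution times.

```python
import random
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class MotionPrimitive:
    """A parameterized motion primitive: joint-angle keyframes plus nominal durations."""
    name: str
    emotions: List[str]           # emotion labels this primitive expresses (hypothetical)
    keyframes: List[List[float]]  # each keyframe: target joint angles in radians
    durations: List[float]        # nominal time (seconds) allotted to reach each keyframe

    def scaled_to(self, interval: float) -> List[Tuple[List[float], float]]:
        """Rescale the durations so the whole primitive spans one beat interval."""
        scale = interval / sum(self.durations)
        return [(kf, d * scale) for kf, d in zip(self.keyframes, self.durations)]


def plan_dance(beat_times: List[float], emotion: str,
               library: List[MotionPrimitive]) -> List[Tuple[List[float], float]]:
    """Pick primitives matching the detected emotion and stretch each one
    to fill the interval between consecutive beats."""
    candidates = [p for p in library if emotion in p.emotions] or library
    plan: List[Tuple[List[float], float]] = []
    for t0, t1 in zip(beat_times, beat_times[1:]):
        primitive = random.choice(candidates)
        plan.extend(primitive.scaled_to(t1 - t0))
    return plan


if __name__ == "__main__":
    library = [
        MotionPrimitive("arm_wave", ["happy", "exuberant"],
                        keyframes=[[0.0, 0.5], [0.8, -0.2], [0.0, 0.5]],
                        durations=[0.4, 0.4, 0.4]),
        MotionPrimitive("slow_sway", ["calm", "sad"],
                        keyframes=[[0.1, 0.1], [-0.1, -0.1]],
                        durations=[1.0, 1.0]),
    ]
    # Beat times (seconds) as they might come from a beat tracker.
    beats = [0.0, 0.5, 1.0, 1.5, 2.0]
    for angles, dur in plan_dance(beats, "happy", library):
        print(f"move to {angles} over {dur:.2f} s")
```

The key design point the sketch tries to capture is that timing lives in the plan rather than in the primitives: each primitive stores only nominal durations, and the planner stretches or compresses them to match the beat intervals extracted from the music.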
