Real-Time Dance Generation to Music for a Legged Robot

The development of robots that can dance has received considerable attention. However, such robots are often either limited to a pre-defined set of movements and music, or show little variation when reacting to external stimuli such as microphone or camera input. In this paper, we contribute a novel approach that allows a legged robot to listen to live music and dance in synchronization with it in a diverse fashion. This is achieved by extracting the beat from an onboard microphone in real time and then creating a dance choreography by picking from a user-generated dance motion library at every new beat. Dance motions include various stepping and base motions. The choice from the library is governed by a probabilistic model, namely a Markov chain, conditioned on the previously picked dance motion and the current music tempo. Finally, delays are estimated online by time-shifting a measured signal against a reference signal and minimizing the least-squares error with the time shift as the parameter. The delays are then compensated for by a combined feedforward and feedback delay controller that shifts the whole-body controller's reference input in time. Results from experiments on a quadrupedal robot demonstrate fast convergence and synchrony with the perceived music.
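The beat-by-beat choice from the motion library can be sketched as a Markov chain whose transition distribution is conditioned on the previous motion and modulated by tempo. The motion names, transition probabilities, and tempo rule below are illustrative assumptions, not the paper's actual parameters:

```python
import random

# Hypothetical dance motion library; names are illustrative.
MOTIONS = ["step_in_place", "base_sway", "side_step", "base_bounce"]

# Transition probabilities P(next | previous); each row sums to 1.
# These values are made up for illustration.
TRANSITIONS = {
    "step_in_place": [0.1, 0.4, 0.3, 0.2],
    "base_sway":     [0.3, 0.1, 0.3, 0.3],
    "side_step":     [0.4, 0.3, 0.1, 0.2],
    "base_bounce":   [0.3, 0.3, 0.3, 0.1],
}

def pick_next_motion(previous, tempo_bpm, rng=random):
    """Sample the next dance motion from the Markov chain.

    As a simple stand-in for the tempo dependence described in the
    paper, fast tempos bias the distribution toward stepping motions
    and slow tempos toward base motions.
    """
    probs = list(TRANSITIONS[previous])
    for i, name in enumerate(MOTIONS):
        if tempo_bpm > 120 and name.startswith("step"):
            probs[i] *= 2.0
        elif tempo_bpm <= 120 and name.startswith("base"):
            probs[i] *= 2.0
    total = sum(probs)
    probs = [p / total for p in probs]
    return rng.choices(MOTIONS, weights=probs, k=1)[0]
```

Calling `pick_next_motion("base_sway", 140)` at each detected beat yields a varied but tempo-aware choreography, since the chain discourages repeating the previous motion (small diagonal entries).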
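The online delay estimation step can be illustrated as a brute-force search over integer time shifts, picking the shift that minimizes the sum of squared errors between the shifted measured signal and the reference. This is a minimal sketch of the stated least-squares formulation; the signal contents and search window are assumptions:

```python
import numpy as np

def estimate_delay(measured, reference, max_shift):
    """Estimate the delay (in samples) between a measured and a
    reference signal by time-shifting the measured signal and
    minimizing the least-squares error over the shift parameter.
    """
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    n = len(reference)
    best_shift, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        # Overlap the two signals under the candidate shift.
        if shift >= 0:
            m, r = measured[shift:n], reference[:n - shift]
        else:
            m, r = measured[:n + shift], reference[-shift:n]
        err = float(np.sum((m - r) ** 2))
        if err < best_err:
            best_err, best_shift = err, shift
    return best_shift

# Example: a sinusoid delayed by 7 samples.
t = np.arange(300)
reference = np.sin(2 * np.pi * t / 50)
measured = np.sin(2 * np.pi * (t - 7) / 50)
delay = estimate_delay(measured, reference, max_shift=20)  # → 7
```

The recovered shift could then feed a feedforward/feedback compensator that advances the whole-body controller's reference input by the estimated delay.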
