A robot uses its own microphone to synchronize its steps to musical beats while scatting and singing

Musical beat tracking is an effective technology for human-robot interaction such as musical sessions. Since such interaction should take place naturally in a variety of environments, beat tracking for a robot must cope, using the robot's own microphone, with noise sources such as environmental noise, its own motor noise, and its own voice. This paper addresses a musical beat tracking robot that can step, scat, and sing in time with musical beats by using its own microphone. To realize such a robot, we propose a robust beat tracking method based on two key techniques: spectro-temporal pattern matching and echo cancellation. The former provides robust tempo estimation with a shorter window length, and can therefore adapt quickly to tempo changes. The latter cancels self-generated noises such as stepping, scatting, and singing. We implemented the proposed beat tracking method on Honda ASIMO. Experimental results showed ten times faster adaptation to tempo changes and high robustness against stepping, scatting, and singing noises. We also demonstrated that the robot times its steps to musical beats while scatting or singing.
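The core idea behind spectro-temporal pattern matching for tempo estimation can be illustrated with a minimal sketch: compute an onset-strength pattern from the spectrogram, then match the pattern against time-shifted copies of itself over candidate beat intervals. This is an illustrative assumption-laden simplification (a normalized autocorrelation over onset strength), not the paper's actual algorithm; the function name, BPM range, and spectral-flux onset measure are all choices made here for the example.

```python
import numpy as np

def estimate_tempo(spectrogram, frame_rate, min_bpm=90, max_bpm=180):
    """Illustrative tempo estimator (not the paper's method).

    spectrogram: (n_frames, n_bins) magnitude array.
    frame_rate:  spectrogram frames per second.
    Returns the candidate BPM whose beat interval best matches the
    periodicity of the onset-strength pattern.
    """
    # Onset strength: half-wave rectified spectral flux summed over bins.
    flux = np.diff(spectrogram, axis=0)
    onset = np.maximum(flux, 0.0).sum(axis=1)
    onset = onset - onset.mean()

    best_bpm, best_score = min_bpm, -np.inf
    for bpm in range(min_bpm, max_bpm + 1):
        lag = int(round(frame_rate * 60.0 / bpm))  # frames per beat
        if lag <= 0 or lag >= len(onset):
            continue
        # Normalized correlation of the pattern with its lagged copy.
        a, b = onset[:-lag], onset[lag:]
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        score = float(a @ b) / denom if denom > 0.0 else 0.0
        if score > best_score:
            best_bpm, best_score = bpm, score
    return best_bpm
```

A short window of onset history suffices for this matching step, which is why pattern-matching approaches of this kind can track tempo changes more quickly than methods that require long analysis windows.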
