Gesture-based human-robot Jazz improvisation

We present Shimon, an interactive improvisational robotic marimba player, developed for research in Robotic Musicianship. The robot listens to a human musician and continuously adapts its improvisation and choreography while playing simultaneously with the human. We discuss the robot's mechanism and motion control, which uses physics simulation and animation principles to achieve both expressivity and safety. We then present a novel interactive improvisation system based on the notion of gestures for both musical and visual expression. The system also uses anticipatory beat-matched action to enable real-time synchronization with the human player. Our system was implemented in a full-length human-robot Jazz duet, displaying highly coordinated melodic and rhythmic human-robot joint improvisation. We have performed with the system in front of a live public audience.
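The anticipatory beat-matched action mentioned above can be illustrated with a minimal sketch: the robot estimates the beat period from recent note onsets and triggers its strike gesture *early*, by the mallet's travel time, so the mallet lands on the predicted beat rather than reacting after it. All function names and the fixed travel-time constant below are illustrative assumptions, not Shimon's actual implementation.

```python
def estimate_beat_period(onset_times):
    """Estimate the beat period (seconds) as the mean inter-onset interval."""
    intervals = [b - a for a, b in zip(onset_times, onset_times[1:])]
    return sum(intervals) / len(intervals)


def next_strike_start(onset_times, mallet_travel_time, now):
    """Return the time at which to trigger the mallet so it lands on the
    next predicted beat after `now`."""
    period = estimate_beat_period(onset_times)
    # Predict the first beat strictly after the current time.
    next_beat = onset_times[-1] + period
    while next_beat <= now:
        next_beat += period
    # Anticipation: start the gesture early by the mallet's travel time.
    return next_beat - mallet_travel_time


# Example: beats heard at 0.0, 0.5, 1.0, 1.5 s (120 BPM), mallet needs
# 0.1 s to reach the bar; at t = 1.6 s the strike should begin at 1.9 s
# so it lands on the predicted beat at 2.0 s.
start = next_strike_start([0.0, 0.5, 1.0, 1.5], 0.1, 1.6)
```

The key design point is that synchronization is achieved by prediction rather than reaction: a purely reactive system would always lag the human by its own actuation latency.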
