Anthropomorphic Musical Robots Designed to Produce Physically Embodied Expressive Performances of Music

Recent advances in robotics, music information retrieval, artificial intelligence, and related fields are enabling anthropomorphic robots to roughly emulate the physical dynamics and motor dexterity of human musicians. Research on musical robots also offers opportunities beyond robotics: understanding human motor control from an engineering point of view, studying how humans produce expressive music performances, and developing new methods for interactive musical expression. Research on computer systems for expressive music performance has grown over recent decades; such systems typically convert a musical score into an expressive performance by introducing deviations in timing, dynamics, and timbre from a deadpan realization of the score, and then reproduce the result on a MIDI-enabled instrument. However, the lack of a physical response (embodiment) limits the unique experience of a live performance found in human playing. Research on musical robots, which focuses on producing a live performance by mechanical means, opens new research paradigms, although several technical issues remain to be solved: enabling musical robots to analyze and synthesize musical sounds as musicians do, to understand and reason about music, and to adapt their behavior accordingly. This chapter gives an overview of current research trends in wind-instrument-playing musical robots through a series of examples. In particular, it presents the development of an anthropomorphic flutist robot, describing its mechanical design, the implementation of intelligent control strategies, and the analysis of the musical parameters that enable the robot to play its instrument expressively.
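The score-to-performance conversion described above can be illustrated with a minimal sketch: a "deadpan" note list is transformed by simple rule-based deviations in timing and loudness before being sent to a MIDI instrument. The specific rules and constants here (a phrase-final lengthening, a melodic-peak accent) are illustrative assumptions, not the rules of any particular published system.

```python
from dataclasses import dataclass

@dataclass
class Note:
    onset: float     # nominal onset time, in beats
    duration: float  # nominal duration, in beats
    pitch: int       # MIDI note number
    velocity: int    # MIDI velocity (loudness), 0-127

def apply_expression(notes, phrase_end_stretch=1.3, accent_boost=12):
    """Return a new note list with expressive timing/loudness deviations.

    Two toy rules (assumed for illustration):
    - the final note of the phrase is lengthened (phrase-final ritardando);
    - the highest pitch in the phrase is played louder (melodic accent).
    """
    if not notes:
        return []
    peak = max(n.pitch for n in notes)
    out, time = [], notes[0].onset
    for i, n in enumerate(notes):
        dur = n.duration * (phrase_end_stretch if i == len(notes) - 1 else 1.0)
        vel = min(127, n.velocity + (accent_boost if n.pitch == peak else 0))
        out.append(Note(time, dur, n.pitch, vel))
        time += dur  # later onsets shift as earlier durations stretch
    return out

# A three-note deadpan phrase: C4, G4, E4, all equal length and loudness.
deadpan = [Note(0.0, 1.0, 60, 64), Note(1.0, 1.0, 67, 64), Note(2.0, 1.0, 64, 64)]
expressive = apply_expression(deadpan)
```

Rule systems of this kind (e.g., the KTH quantitative rule system) compose many such small deviations; the point of the sketch is only that expressiveness enters as structured departures from the nominal score.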
