Interactive Musical System for Multimodal Musician-Humanoid Interaction

Research on humanoid robots designed to play musical instruments has a long tradition in the field of robotics. Over the past decades, several research groups have developed anthropomorphic, automated machines capable of live musical performance, both to better understand human motor and perceptual skills and to create novel means of musical expression. In particular, humanoid robots are being designed to approximate the dexterity of human players and to exhibit higher-level perceptual capabilities that enhance interaction with musical partners. This chapter details the concept and implementation of an interactive musical system for multimodal musician-humanoid interaction.
