Imitating Saxophone Playing by an Anthropomorphic Saxophonist Robot

Our research aims to develop an anthropomorphic saxophonist robot that mechanically reproduces the human organs involved in saxophone playing. In this paper, the Waseda Saxophone Robot No. 2 (WAS-2), which has 22 DOFs, is detailed. The lip mechanism of WAS-2 has been designed with 3 DOFs to control the motion of the lower, upper, and side portions of the lips. In addition, a human-like hand (16 DOFs) has been designed so that the robot can play all the keys of the instrument. Regarding the improvement of the control system, a feed-forward control system with dead-time compensation has been implemented to ensure accurate control of the air pressure. Furthermore, an auditory feedback control system has been proposed and implemented to adjust the positioning of the robot's physical components by providing pitch feedback and defining a recovery position (off-line). A set of experiments was carried out to verify the mechanical design improvements and the dynamic response of the air pressure. As a result, the range of sound pressure has been increased, and the proposed control system improved the dynamic response of the air pressure control.
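The abstract does not give implementation details of the air-pressure controller, so the following is only a minimal Python sketch of the general idea of feed-forward control with dead-time compensation. The plant model (gain K, time constant TAU, dead time L), the control period DT, and the reference profile are hypothetical values chosen for illustration and are not parameters of WAS-2.

```python
# Sketch (not the authors' implementation): feed-forward air-pressure control
# with dead-time compensation, assuming a first-order plant with a known
# transport delay.  All numeric values below are illustrative assumptions.

DT = 0.001          # control period [s] (assumed)
K = 1.0             # assumed static gain of the air-pressure plant
TAU = 0.05          # assumed plant time constant [s]
L = 0.020           # assumed dead time (valve/air-line transport delay) [s]
DELAY_STEPS = int(round(L / DT))

def reference(t):
    """Target air-pressure profile [kPa]: a 0 -> 4 kPa ramp over 0.1 s (illustrative)."""
    return 4.0 * min(max((t - 0.1) / 0.1, 0.0), 1.0)

def feedforward(r_now, r_next):
    """Inverse-model feed-forward for a first-order plant: u = (tau * dr/dt + r) / K."""
    dr = (r_next - r_now) / DT
    return (TAU * dr + r_now) / K

# Simulate the delayed first-order plant driven by the feed-forward command.
# Dead-time compensation here means previewing the reference by the known
# dead time, so the command arrives at the plant exactly when it is needed.
t_end = 0.5
steps = int(t_end / DT)
u_buffer = [0.0] * DELAY_STEPS   # models the transport delay in the air line
p = 0.0                          # plant state: air pressure at the mouthpiece

for k in range(steps):
    t = k * DT
    # Preview the reference by L so the output lines up with reference(t).
    r_now = reference(t + L)
    r_next = reference(t + L + DT)
    u = feedforward(r_now, r_next)

    # Push the command through the dead-time buffer, then the first-order lag.
    u_buffer.append(u)
    u_delayed = u_buffer.pop(0)
    p += DT * (K * u_delayed - p) / TAU

    if k % 100 == 0:
        print(f"t={t:5.3f} s  reference={reference(t):4.2f} kPa  pressure={p:4.2f} kPa")
```

Previewing the reference by the known dead time is one common way to compensate a transport delay in a pure feed-forward path; the compensation scheme actually used in WAS-2 may differ.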
