Development of a Visual Interface for Sound Parameter Calibration of the Waseda Flutist Robot WF-4RIV

The Waseda Flutist Robot WF-4RIV is a humanoid robot capable of imitating a human flute performance. In recent years the mechanical construction of the robot has been improved to the point that its playing capabilities have reached the level of an intermediate human player. To play at this level, the robot's musical parameters must be calibrated carefully, a procedure that is very complicated to perform through the robot's core control software alone. In this paper we present the implementation of a visual control interface that allows even non-technical users to calibrate the sound settings. The newly developed interface enables a musician to adjust selected parameters of the performance while the robot is playing. We present experiments that verify the functionality of this sound calibration system: we examine how the visual processing system perceives the instrument movements of a human performer and analyze their effect on the robot's performance.
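To illustrate the kind of on-line adjustment such an interface supports, the following minimal Python sketch shows slider controls that forward parameter updates to a running robot controller. The parameter names, value ranges, and the send_update() transport are illustrative assumptions only, not the actual WF-4RIV software interface.

```python
import tkinter as tk

# Hypothetical sketch of a calibration panel: sliders forward parameter
# changes to the robot controller while a performance is in progress.
# Parameter names and ranges below are assumed for illustration.
PARAMETERS = {
    "air_pressure": (0.0, 100.0),   # assumed range, arbitrary units
    "lip_aperture": (0.0, 10.0),
    "vibrato_depth": (0.0, 1.0),
}


def send_update(name: str, value: float) -> None:
    """Placeholder for transmitting a parameter change to the robot.

    In a real system this might be a network message or shared-memory
    write consumed by the low-level motor controllers.
    """
    print(f"update {name} -> {value:.2f}")


def main() -> None:
    root = tk.Tk()
    root.title("Sound parameter calibration (sketch)")
    for name, (lo, hi) in PARAMETERS.items():
        tk.Label(root, text=name).pack(anchor="w")
        tk.Scale(
            root,
            from_=lo,
            to=hi,
            resolution=(hi - lo) / 100.0,
            orient=tk.HORIZONTAL,
            length=300,
            # Tk passes the slider's current value to the callback as a string.
            command=lambda v, n=name: send_update(n, float(v)),
        ).pack(fill="x")
    root.mainloop()


if __name__ == "__main__":
    main()
```

Because each slider pushes its value through a callback as it moves, a musician could in principle hear the effect of a change immediately, which is the property the paper attributes to its calibration interface.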
