Toward Robotic Musicianship

We present the development of a robotic percussionist named Haile that is designed to demonstrate musicianship. We define robotic musicianship in this context as a combination of musical, perceptual, and interaction skills with the capacity to produce rich acoustic responses in a physical and visual manner. Haile listens to live human players, analyzes perceptual aspects of their playing in real time, and uses the product of this analysis to play along in a collaborative and improvisatory manner. It is designed to combine the benefits of computational power, perceptual modeling, and algorithmic music with the richness, visual interactivity, and expression of acoustic playing. We believe that combining machine listening, improvisational algorithms, and mechanical operation with human creativity and expression can lead to novel musical experiences and outcomes. Haile can therefore serve as a test bed for novel forms of musical human-machine interaction, bringing perceptual aspects of computer music into the physical world both visually and acoustically. This article presents our goals for the project and the approaches we took in design, mechanics, perception, and interaction to address them. After an overview of related work in musical robotics, machine musicianship, and music perception, we describe Haile's design; the development of two robotic arms that can strike different locations on a drum at controllable volume levels; applications developed for low- and high-level perceptual listening and improvisation; and two interactive compositions for humans and a robotic percussionist that use Haile's capabilities. We conclude with a description of a user study conducted to evaluate Haile's perceptual, mechanical, and interaction functionalities. The study showed a significant correlation between humans' and Haile's rhythmic perception, as well as strong user satisfaction with Haile's perceptual and mechanical capabilities. It also indicated areas for improvement, such as the need for better timbre and loudness control and for more advanced and responsive interaction schemes.
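The listen-analyze-respond loop described above can be illustrated with a minimal sketch. This is not the paper's implementation; all function names are hypothetical, and the similarity measure (tempo-normalized inter-onset intervals compared by L1 distance) is only one simple stand-in for the perceptual rhythmic analysis the abstract refers to:

```python
# Hypothetical sketch of a listen -> analyze -> respond loop for a robotic
# percussionist. Names and the similarity measure are illustrative only.

def inter_onset_intervals(onsets):
    """Convert a list of onset times (seconds) to inter-onset intervals."""
    return [b - a for a, b in zip(onsets, onsets[1:])]

def rhythmic_similarity(a, b):
    """Crude similarity in [0, 1] of two onset sequences, ignoring tempo."""
    ia, ib = inter_onset_intervals(a), inter_onset_intervals(b)
    n = min(len(ia), len(ib))
    if n == 0:
        return 0.0
    # Normalize each IOI pattern by its total duration so that the same
    # rhythm played at a different tempo still scores as identical.
    na = [x / sum(ia[:n]) for x in ia[:n]]
    nb = [x / sum(ib[:n]) for x in ib[:n]]
    # L1 distance between normalized patterns is at most 2, so this maps
    # identical rhythms to 1.0 and maximally different ones toward 0.0.
    return 1.0 - sum(abs(x - y) for x, y in zip(na, nb)) / 2.0

def respond(human_onsets, stored_patterns):
    """Pick the stored rhythmic pattern closest to what the human played."""
    return max(stored_patterns,
               key=lambda p: rhythmic_similarity(human_onsets, p))

# Example: even quarter notes match the even stored pattern (at a faster
# tempo) rather than the swung one.
human = [0.0, 0.5, 1.0, 1.5]
even = [0.0, 0.25, 0.5, 0.75]
swung = [0.0, 0.4, 0.5, 0.9]
choice = respond(human, [even, swung])
```

A real system would of course extract the onset times from audio (e.g., onset detection on the drum signal) and generate a transformed response rather than replaying a stored pattern, but the skeleton is the same.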

[1]  Andranik S. Tangiuane Artificial Perception and Music Recognition , 1993 .

[2]  Eric Singer,et al.  Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME-03), Montreal, Canada LEMUR GuitarBot: MIDI Robotic String Instrument , 2022 .

[3]  Anssi Klapuri,et al.  Measuring the similarity of Rhythmic Patterns , 2002, ISMIR.

[4]  A. Roselle RAND Corporation Web Site , 2000 .

[5]  Andranick Tanguiane Artificial Perception and Music Recognition , 1993, Lecture Notes in Computer Science.

[6]  Andrea Lockerd Thomaz,et al.  Robot's play: interactive games with sociable machines , 2004, CIE.

[7]  I. Shmulevich,et al.  A system for machine recognition of music patterns , 1998, Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP '98 (Cat. No.98CH36181).

[8]  Shingo Uchihashi,et al.  The beat spectrum: a new approach to rhythm analysis , 2001, IEEE International Conference on Multimedia and Expo, 2001. ICME 2001..

[9]  Sergi Jordà Afasia: The Ultimate Homeric One-Man-Multimedia-Band , 2002, NIME.

[10]  Stefan Schaal,et al.  Rapid synchronization and accurate phase-locking of rhythmic motor primitives , 2005, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[11]  E. Narmour The analysis and cognition of basic melodic structures , 1992 .

[12]  Roger B. Dannenberg,et al.  An On-Line Algorithm for Real-Time Accompaniment , 1984, ICMC.

[13]  Gil Weinberg,et al.  Interconnected Musical Networks: Toward a Theoretical Framework , 2005, Computer Music Journal.

[14]  David Cope Experiments in Music Intelligence (EMI) , 1987, ICMC.

[15]  S. Trehub,et al.  Infants' perception of melodies: the role of melodic contour. , 1984, Child development.

[16]  Philip N. Johnson-Laird,et al.  How Jazz Musicians Improvise , 2002 .

[17]  Todd Winkler Composing Interactive Music: Techniques and Ideas Using Max , 1998 .

[18]  Barry Vercoe,et al.  The Synthetic Performer in The Context of Live Performance , 1984, International Conference on Mathematics and Computing.

[19]  Peter Desain,et al.  Rhythmic stability as explanation of category size , 2002 .

[20]  Gil Weinberg,et al.  Musical interactions with a perceptual robotic percussionist , 2005, ROMAN 2005. IEEE International Workshop on Robot and Human Interactive Communication, 2005..

[21]  Eric D. Scheirer,et al.  Tempo and beat analysis of acoustic musical signals. , 1998, The Journal of the Acoustical Society of America.

[22]  Atsuo Takanishi,et al.  Development of an anthropomorphic flutist robot WF-3RII , 1996, Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. IROS '96.

[23]  E. Narmour The Analysis and Cognition of Melodic Complexity: The Implication-Realization Model , 1992 .

[24]  Robert Rowe,et al.  Machine Musicianship , 2001 .

[25]  Robert Rowe,et al.  Interactive Music Systems: Machine Listening and Composing , 1992 .

[26]  S. Coren,et al.  In Sensation and perception , 1979 .

[27]  Massimo Bergamasco,et al.  The anthropomorphic flutist robot WF-4 teaching flute playing to beginner students , 2004, IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA '04. 2004.

[28]  Laboratorio Nacional de Música Electroacústica Proceedings of the 2001 International Computer Music Conference, ICMC 2001, Havana, Cuba, September 17-22, 2001 , 2001, ICMC.

[29]  R. Jackendoff,et al.  A Generative Theory of Tonal Music , 1985 .

[30]  François Pachet,et al.  The Continuator: Musical Interaction With Style , 2003, ICMC.

[31]  M. Resnick,et al.  Programmable Bricks: Toys to Think With , 1996, IBM Syst. J..

[32]  Parag Chordia Representation and Automatic Transcription of Solo Tabla Music , 2006 .

[33]  Roger B. Dannenberg,et al.  McBlare: A Robotic Bagpipe Player , 2005, NIME.

[34]  Ajay Kapur,et al.  A History of robotic Musical Instruments , 2005, ICMC.

[35]  George Tzanetakis,et al.  Indirect Acquisition of Percussion Gestures Using Timbre Recognition , 2005 .

[36]  George E. Lewis Too Many Notes: Computers, Complexity and Culture in Voyager , 2000, Leonardo Music Journal.

[37]  Tamara M. Lackner,et al.  The Musical Fireflies - Learning About Mathematical Patterns in Music Through Expression and Play , 2000 .

[38]  N. Rashevsky,et al.  The mathematical basis of the arts , 1949 .

[39]  Eric Singer,et al.  LEMUR's Musical Robots , 2004, NIME.

[40]  С.,et al.  The Cell , 1997, Nature Medicine.

[41]  Eleanor Selfridge-Field,et al.  Melodic Similarity : concepts, procedures, and applications , 1998 .

[42]  Miller Puckette,et al.  Real-time audio analysis tools for Pd and MSP , 1998, ICMC.

[43]  Brian Scassellati,et al.  Robotic drumming : synchronization in social tasks , 2005 .

[44]  Nicola Orio,et al.  Score Following: State of the Art and New Developments , 2003, NIME.

[45]  F. Rauscher,et al.  Music and spatial task performance , 1993, Nature.