Music expression with a robot manipulator used as a bidirectional tangible interface

The availability of haptic interfaces in music content processing opens interesting possibilities for performer-instrument interaction in musical expression. These new musical instruments can precisely modulate haptic feedback and map it to a sonic output, offering new possibilities for artistic content creation. In this article, we investigate the use of a robotic arm as a bidirectional tangible interface for musical expression, actively modifying the compliant control strategy to create a binding between gestural input and musical output. The user can define recursive modulations of music parameters by grasping and gradually refining periodic movements on a gravity-compensated robot manipulator. The robot learns the new desired trajectory on-line, increasing its stiffness as the modulation refinement proceeds. This article reports early results of an artistic performance carried out in collaboration with a musician, who played with the robot as part of his live stage setup.
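The coupling described above — on-line refinement of a periodic trajectory with stiffness that grows as the user's movement becomes consistent — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the exponential-moving-average update, the variance-to-stiffness mapping, and all parameter names (`alpha`, `k_min`, `k_max`, `scale`) are assumptions chosen for clarity.

```python
import numpy as np

def refine_trajectory(mean, var, new_cycle, alpha=0.2):
    """Incrementally update the learned periodic trajectory.

    mean, var : running estimates of the desired trajectory (per phase sample)
    new_cycle : the latest period of user motion, resampled to the same length
    alpha     : on-line learning rate (hypothetical value)
    """
    mean = (1 - alpha) * mean + alpha * new_cycle
    var = (1 - alpha) * var + alpha * (new_cycle - mean) ** 2
    return mean, var

def stiffness_from_variance(var, k_min=5.0, k_max=150.0, scale=10.0):
    """Map trajectory variance to impedance stiffness: the robot stays
    compliant while the user's movement is still irregular, and stiffens
    as the periodic modulation is refined (variance shrinks)."""
    return k_min + (k_max - k_min) * np.exp(-scale * var)
```

With each repeated cycle of a consistent gesture, the variance estimate shrinks and the returned stiffness rises toward `k_max`, so the manipulator gradually takes over reproduction of the refined movement while remaining compliant early on.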
