Mapping by Observation: Building a User-Tailored Conducting System From Spontaneous Movements

Metaphors are commonly used in interface design within Human-Computer Interaction (HCI). Interface metaphors let users interact with the computer in a way that resembles a known activity, giving instantaneous knowledge or intuition about how the interaction works. A widely used metaphor in Digital Musical Instruments (DMIs) is the conductor-orchestra metaphor, in which the orchestra is treated as an instrument controlled by the movements of the conductor. We propose a DMI based on the conductor metaphor that allows users to control tempo and dynamics and that adapts its mapping to each user by observing spontaneous conducting movements (i.e., movements performed along to fixed music without any instructions). We refer to this as mapping by observation: although the system is trained specifically for each user, the training is not performed explicitly and consciously by the user. More specifically, the system adapts its mapping based on the user's tendency to anticipate or fall behind the beat, and on the motion-capture descriptors that best correlate with loudness during spontaneous conducting. We evaluate the proposed system in an experiment with twenty-four participants, comparing it against a baseline that does not perform this user-specific adaptation. The comparison is done in a context where users receive no instructions and are instead allowed to discover the system by playing. We collect objective and subjective measures from tasks in which participants must make the orchestra play at different loudness levels or in synchrony with a metronome. The results show that the system that automatically learns its mapping from spontaneous movements is more usable, providing both more intuitive control over loudness and more precise control over beat timing.
Interestingly, the results also show a strong correlation between measures taken from the training data and the improvement introduced by the adaptive system. This indicates that it is possible to estimate in advance how useful the observation of spontaneous movements will be for building user-specific adaptations, which opens interesting directions for creating more intuitive and expressive DMIs, particularly in public installations.
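The two adaptation steps described above can be sketched in code. This is a minimal illustrative sketch, not the authors' implementation: all function names, the descriptor dictionary, and the beat-offset convention are assumptions introduced here. It shows (a) selecting the motion-capture descriptor whose time series best correlates with loudness, and (b) estimating a user's mean tendency to anticipate or fall behind the beat from paired gesture and music beat times.

```python
import numpy as np

def select_loudness_descriptor(descriptors, loudness):
    """Pick the motion-capture descriptor best correlated with loudness.

    descriptors: dict mapping descriptor name -> 1-D array, each the same
                 length as `loudness` (frame-aligned time series).
    Returns the name of the descriptor with the largest absolute Pearson
    correlation, together with that correlation coefficient.
    """
    best_name, best_r = None, 0.0
    for name, series in descriptors.items():
        # Pearson correlation between this descriptor and loudness
        r = np.corrcoef(series, loudness)[0, 1]
        if abs(r) > abs(best_r):
            best_name, best_r = name, float(r)
    return best_name, best_r

def estimate_beat_offset(gesture_beats, music_beats):
    """Mean signed offset in seconds between the user's gesture beats and
    the corresponding music beats; negative means the user anticipates."""
    g = np.asarray(gesture_beats, dtype=float)
    m = np.asarray(music_beats, dtype=float)
    return float(np.mean(g - m))
```

In a mapping-by-observation setting, both functions would run on recordings of spontaneous conducting: the selected descriptor then drives dynamics, and the estimated offset shifts the system's beat-alignment for that user.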
