A Manual Actions Expressive System (MAES)

This article describes a manual actions expressive system (MAES) that aims to enable music creation and performance using natural hand actions (e.g. hitting or shaking virtual objects). Gestures are fully programmable and derived from tracking and analysing hand motion and finger bend, potentially allowing performers to concentrate on natural actions drawn from everyday use of the hands (e.g. the physical movements associated with hitting and shaking). The work focused on developing an approach for creating gestures based on intuitive metaphors, implementing them as software for composition and performance, and realising them within a musical composition through the choice of suitable mappings, sonic materials and processes.
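As a rough illustration of the kind of gesture analysis such a system performs, the sketch below classifies a window of hand-acceleration samples as a "hit" (a single sharp spike) or a "shake" (sustained back-and-forth oscillation). This is a minimal hypothetical example, not the paper's implementation; the thresholds, window handling, and function names are illustrative assumptions.

```python
# Hypothetical sketch: classify a window of hand-acceleration samples
# (one axis, gravity removed, in m/s^2) as a "hit" or a "shake".
# All thresholds are illustrative assumptions, not values from the paper.

def classify_gesture(accel, hit_threshold=8.0, shake_threshold=3.0,
                     shake_min_reversals=4):
    """Return "hit", "shake", or "none" for one window of samples."""
    peak = max(abs(a) for a in accel)
    # Shaking produces repeated direction reversals at non-trivial amplitude.
    reversals = sum(1 for prev, cur in zip(accel, accel[1:])
                    if prev * cur < 0 and abs(cur) > shake_threshold)
    if reversals >= shake_min_reversals:
        return "shake"
    if peak >= hit_threshold:
        return "hit"
    return "none"

# A single sharp spike reads as a hit...
print(classify_gesture([0.2, 0.5, 9.5, 1.0, 0.3]))          # hit
# ...while sustained oscillation reads as a shake.
print(classify_gesture([4.0, -4.5, 5.0, -4.2, 4.8, -5.1]))  # shake
```

In a full system, each recognised gesture would then be mapped to sound-synthesis parameters; the choice of mapping is itself a compositional decision, as the article emphasises.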
