HANDSKETCH: BI-MANUAL CONTROL OF VOICE QUALITY DIMENSIONS AND LONG TERM PRACTICE ISSUES

In this paper, we describe the development of a new musical instrument, called the HandSketch. This instrument is developed according to the Convergent Luthery Model (10): it has to be playable right from the beginning of prototyping, in order to allow the progressive embodiment of the object. This requirement led us to focus our control paradigm on writing. The HandSketch is a digital instrument made for the bi-manual control of voice quality dimensions: pitch, intensity, and glottal flow parameters (6). It is built from readily purchasable devices: a pen tablet and force sensing resistors (FSRs). More precisely, it is built around a Wacom™ graphic tablet (31), played vertically along the upper part of the body. The HandSketch uses a particular polar transformation of the control space in order to fit the requirements of the preferred hand. A sensing strategy inspired by woodwind and string instruments is adapted to FSRs for use by the non-preferred hand. It is important to highlight that the instrument has evolved through nine consecutive versions - now called HS1.8 - and has thus reached a more stable shape and behavior. The most recent playing situation is illustrated in Figure 1.
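The exact polar transformation is not specified in this abstract. As a minimal illustrative sketch, one common tablet-to-voice mapping assumes that the pen's angular position around an arc origin drives pitch while stylus pressure drives intensity; all function names, ranges, and the choice of a MIDI pitch scale below are assumptions for illustration, not the HandSketch's actual mapping:

```python
import math

def to_polar(x, y, origin=(0.0, 0.0)):
    """Convert tablet coordinates (x, y) to (radius, angle) about the arc origin."""
    dx, dy = x - origin[0], y - origin[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

def angle_to_pitch(angle, angle_min=0.0, angle_max=math.pi / 2,
                   midi_low=48, midi_high=72):
    """Map the pen angle linearly onto a MIDI pitch range (hypothetical scaling)."""
    t = (angle - angle_min) / (angle_max - angle_min)
    t = min(max(t, 0.0), 1.0)  # clamp to the playable arc
    return midi_low + t * (midi_high - midi_low)

def pressure_to_intensity(p, p_max=1023.0):
    """Normalize raw stylus pressure to a 0..1 intensity value."""
    return min(max(p / p_max, 0.0), 1.0)
```

A pen stroke at tablet position (1, 1) relative to the origin sits at 45 degrees along the arc, landing halfway through the assumed two-octave range; such a continuous angle-to-pitch mapping (rather than discrete keys) is what allows writing-like gestures to produce vibrato and glissandi.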

[1]  Thierry Dutoit, et al.  Advanced Techniques for Vertical Tablet Playing: An Overview of Two Years of Practicing the HandSketch 1.x , 2009, NIME.

[2]  Perry R. Cook,et al.  Principles for Designing Computer Music Controllers , 2001, NIME.

[3]  Loïc Kessous,et al.  Bimanuality in Alternate Musical Instruments , 2003, NIME.

[4]  Loïc Kessous Bi-manual mapping experimentation, with angular fundamental frequency control and sound color navigation , 2002, NIME.

[5]  Thierry Dutoit,et al.  RAMCESS/handsketch: a multi-representation framework for realtime and expressive singing synthesis , 2007, INTERSPEECH.

[6]  D. Lowry,et al.  Bokken: Art of the Japanese Sword , 1986 .

[7]  Matthew Wright,et al.  Preparation for Improvised Performance in Collaboration with a Khyal Singer , 1998, ICMC.

[8]  Loïc Kessous,et al.  Expressiveness and Digital Musical Instrument Design , 2005 .

[9]  Claude Cadoz,et al.  Instrumental Gestures and Musical Composition , 1988, ICMC.

[10]  Sidney S. Fels,et al.  GRASSP: Gesturally-Realized Audio, Speech and Song Performance , 2006, NIME.

[11]  Cornelius Pöpel,et al.  On interface expressivity: A player based study , 2005, NIME.

[12]  Abigail Sellen,et al.  Two-handed input in a compound task , 1994, CHI Conference Companion.

[13]  Christopher Dobrian,et al.  The 'E' in NIME: Musical Expression with New Computer Interfaces , 2006, NIME.

[14]  G. Marino,et al.  The UPIC System: Origins and Innovations , 1993 .

[15]  Claude Cadoz,et al.  Capture, Representation and "Composition" of the Instrumental Gesture , 1990, ICMC.

[16]  Ali Momeni,et al.  Ten years of tablet musical interfaces at CNMAT , 2007, NIME '07.

[17]  Marcelo M. Wanderley,et al.  Gestural control of sound synthesis , 2004, Proceedings of the IEEE.

[18]  Loïc Kessous,et al.  Strategies of mapping between gesture data and synthesis model parameters using perceptual spaces , 2002, Organised Sound.

[19]  A. Woodhull,et al.  Alignment of the human body in standing , 2004, European Journal of Applied Physiology and Occupational Physiology.

[20]  Christophe d'Alessandro,et al.  Real-time CALM Synthesizer: New Approaches in Hands-Controlled Voice Synthesis , 2006, NIME.

[21]  Thierry Dutoit,et al.  HandSketch bi-manual controller: investigation on expressive control issues of an augmented tablet , 2007, NIME '07.

[22]  P. Depalle,et al.  Perceptual Evaluation of Vibrato Models , 2005 .

[23]  John Kingston,et al.  Macro and micro F0 in the synthesis of intonation , 1990 .

[24]  Caroline Traube,et al.  On making and playing an electronically-augmented saxophone , 2006, NIME.

[25]  Marcelo M. Wanderley,et al.  The T-Stick: from musical interface to musical instrument , 2007, NIME '07.

[26]  Hans-Christoph Steiner,et al.  Towards a catalog and software library of mapping methods , 2006, NIME.

[27]  Lakhmi C. Jain,et al.  Radial Basis Function Networks 2: New Advances in Design , 2001 .

[28]  Nicola Orio,et al.  Evaluation of Input Devices for Musical Expression: Borrowing Tools from HCI , 2001, Computer Music Journal.

[29]  Marcelo M. Wanderley,et al.  Trends in Gestural Control of Music , 2000 .

[30]  Christophe d'Alessandro,et al.  Realtime and accurate musical control of expression in singing synthesis , 2008, Journal on Multimodal User Interfaces.