Performance has traditionally been a principal outlet for musical expression: it is the moment when music is communicated from musician to listener. This tradition has been challenged and augmented in this century. Recorded and broadcast media technologies have brought new modes of listening to music, while conceptual developments have given rise to extremes such as music not meant to be heard. Computer music is part of these developments in new methods of creating and receiving music. It is at its origin a studio-based art, one that gave composers the possibility of creating music not realizable with traditional performative means. Realizing music in the electronic and computer music studio freed the composer from the traditional channel of composition-interpretation-performance, allowing them to focus in a new way on sonic materials. Advances in computer processing speed have brought with them the capability to realize this music in real time, in effect making it possible to take it out of the studio and put it on stage. This has introduced the problem of how to articulate computer-generated music in a concert setting. Concurrently, developments in the fields of computer-human interface (CHI) and virtual reality (VR) have introduced tools to investigate new interaction modalities between user and computer (Laurel 1990). The intersection of these fields with computer music has given rise to the field of gestural computer music instrument design. The creation of gestural and sensor-based musical instruments makes it possible to articulate computer-generated sound live through performer intervention. While this may be a recent area of research, the musical issues of engaging performance were being addressed well before the arrival of the computer.
So while from a technical standpoint research in this field may represent advances in the musical and perceptual bases of gestural human-machine interaction, musically these efforts in computer music have precedents in instrumental music. It can be said that musically this brings us full circle, back to the concert as a forum for the communication of music. If concert performance is the medium of communication, then the instrument becomes the conduit between performer and listener. The listener's perception of the music is contingent on the instrument's efficiency at transmitting the performer's musical expression, and on the performer's ability to channel creativity through the instrument. We must integrate what we know about human-machine interaction with our musical sensibility. What follows is a personal reflection on a performance practice developed on sensor-based instruments, drawn from several years of concert performance with the BioMuse (Tanaka 1993), ensemble work with Sensorband (Bongers 1998) in the creation of the Soundnet, and the application of instrumental thinking to an installation- and network-based project, Global String. Observations have been drawn from personal reflection during practice and development, discussion with other artists, audience reactions, and interaction with students.
[1] Claude Cadoz. Le geste canal de communication homme/machine. Models for gestural interactions. 1992.
[2] R. B. Knapp, et al. Controlling computers with neural signals. Scientific American, 1996.
[3] Mark Goldstein. Gestural coherence and musical interaction design. Proceedings of the 1998 IEEE International Conference on Systems, Man, and Cybernetics (SMC'98), 1998.
[4] Roel Vertegaal, et al. Towards a musician's cockpit: Transducers, feedback and musical function. Quarterly Progress and Status Report, 2007.
[5] David Rosenboom, et al. Extended Musical Interface with the Human Nervous System: Assessment and Prospectus. Leonardo, 2017.
[6] David Zicarelli, et al. Max: Interactive Graphic Programming Environment. 1991.
[7] Allen Strange. Electronic Music: Systems, Techniques, and Controls. 1983.
[8] Fernando Iazzetta. Meaning in Musical Gesture. 2000.
[9] Chris Chafe. Tactile Audio Feedback. ICMC, 1993.
[10] David Zicarelli. Music Technology as a Form of Parasite. ICMC, 1992.
[11] S. Joy Mountford, et al. The Art of Human-Computer Interface Design. 1990.
[12] Volker Krefeld, et al. The Hand in the Web: An Interview with Michel Waisvisz. 1990.
[13] Jeff Pressing. Cybernetic Issues in Interactive Performance Systems. 1990.
[14] Atau Tanaka, et al. Musical Technical Issues in Using Interactive Instrument Technology with Application to the BioMuse. ICMC, 1993.
[15] Zack Settel, et al. Nonobvious roles for electronics in performance enhancement. ICMC, 1993.
[16] Yoichi Nagashima. Biosensorfusion: New Interfaces for Interactive Multimedia Art. ICMC, 1998.
[17] Marcelo M. Wanderley, et al. ESCHER: modeling and performing composed instruments in real-time. Proceedings of the 1998 IEEE International Conference on Systems, Man, and Cybernetics (SMC'98), 1998.
[18] Seiji Inokuchi, et al. Demonstration of Gesture Sensors for the Shakuhachi. ICMC, 1994.
[19] C. Cadoz, et al. Le geste canal de communication homme/machine: la communication "instrumentale". 1994.
[20] R. Benjamin Knapp, et al. A Bioelectric Controller for Computer Music Applications. 1990.
[21] R. Benjamin Knapp, et al. Controlling computers with neural signals. 1996.
[22] Insook Choi, et al. From motion to emotion: synthesis of interactivity with gestural primitives. 1998.
[23] Ian Bowler, et al. Applications of the phase vocoder in the control of real-time electronic musical instruments. 1993.
[24] David Wessel, et al. Control of Phrasing and Articulation in Synthesis. ICMC, 1987.
[25] Bert Bongers, et al. An Interview with Sensorband. 1998.