This is a report on research and experimental applications of human-computer interaction in multimedia performing arts, in which a human performer and computer systems generate computer graphics and computer music interactively in real time. In general, many sensors are used as interfaces for this interactive communication, and the performer receives the output of the system via graphics, sounds, and the physical reactions of instrument-like interfaces. I have produced many types of interfaces, not only with physical/electrical sensors but also with biological/physiological sensors. This paper investigates three special approaches: (1) sensing and reacting to "breathing" in performing arts, (2) a 16-channel electromyogram sensor and its application to "muscle performing music", and (3) an 8-channel electric feedback system and its experiments in "body-hearing sounds" and "body-listening to music".
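As a rough illustration of the sensor-to-sound mapping behind an interface such as the 16-channel electromyogram system, one frame of muscle-activation readings can be converted into note events. This is only a minimal sketch under assumed parameters: the threshold, the channel-to-note mapping, and the normalized 0.0-1.0 amplitude range are illustrative choices, not the author's actual design.

```python
# Hypothetical sketch: mapping one frame of multi-channel EMG amplitudes
# to MIDI-style (note, velocity) events, in the spirit of the 16-channel
# "muscle performing music" interface. All constants are assumptions.

EMG_CHANNELS = 16
THRESHOLD = 0.2    # assumed normalized activation level that triggers a note
BASE_NOTE = 48     # assumed MIDI note assigned to channel 0

def emg_to_midi_events(amplitudes):
    """Convert normalized EMG amplitudes (0.0-1.0, one per channel)
    into (note, velocity) pairs for channels above the threshold."""
    events = []
    for ch, amp in enumerate(amplitudes):
        if amp >= THRESHOLD:
            velocity = min(127, int(amp * 127))  # scale to MIDI velocity range
            events.append((BASE_NOTE + ch, velocity))
    return events

# Example frame: only channels 0 and 3 are strongly activated.
frame = [0.9, 0.05, 0.1, 0.5] + [0.0] * 12
print(emg_to_midi_events(frame))  # -> [(48, 114), (51, 63)]
```

In a real performance loop this function would run once per sensor frame, with the resulting events sent to a synthesizer; here it simply shows the thresholding and scaling step.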