SICIB: An Interactive Music Composition System Using Body Movements

Traditionally, music and dance have been complementary arts. However, their integration has not always been entirely satisfactory. In general, a dancer must conform movements to a predefined piece of music, leaving very little room for improvisational creativity. In this article, a system called SICIB—capable of music composition, improvisation, and performance using body movements—is described. SICIB uses data from sensors attached to dancers and "if-then" rules to couple choreographic gestures with music. The article describes the choreographic elements considered by the system (such as position, velocity, acceleration, curvature, and torsion of movements, jumps, etc.), as well as the musical elements that can be affected by them (e.g., intensity, tone, music sequences, etc.) through two different music composition systems: Escamol and Aura. The choreographic information obtained from the sensors, the musical capabilities of the music composition systems, and a simple rule-based coupling mechanism offer good opportunities for interaction between choreographers and composers. The architecture of SICIB, which allows real-time performance, is also described. SICIB has been used by three different composers and a choreographer with very encouraging results. In particular, the dancer has been involved in music dialogues with live performance musicians. Our experiences with the development of SICIB and our own insights into the relationship that new technologies offer to choreographers and dancers are also discussed.
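The "if-then" coupling of choreographic gestures to musical parameters described above can be illustrated with a minimal sketch. This is not SICIB's actual implementation (which uses Escamol and Aura); all names here — `Gesture`, `Rule`, the feature fields, and the threshold values — are hypothetical, chosen only to show the rule-based mapping idea:

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical snapshot of choreographic features derived from body
# sensors (field names are illustrative, not SICIB's API).
@dataclass
class Gesture:
    height: float      # vertical sensor position, in meters
    velocity: float    # speed of the tracked limb, in m/s
    curvature: float   # curvature of the movement path
    jumping: bool      # whether a jump was detected

# One "if-then" rule: a condition on the gesture and a musical action.
@dataclass
class Rule:
    condition: Callable[[Gesture], bool]
    action: Callable[[Gesture], dict]

def evaluate(rules: List[Rule], gesture: Gesture) -> List[dict]:
    """Fire every rule whose condition matches the current gesture."""
    return [r.action(gesture) for r in rules if r.condition(gesture)]

# Two toy rules: fast movement raises intensity (capped at a MIDI-like
# maximum of 127); a detected jump triggers a stored music sequence.
rules = [
    Rule(lambda g: g.velocity > 2.0,
         lambda g: {"param": "intensity", "value": min(127, int(g.velocity * 30))}),
    Rule(lambda g: g.jumping,
         lambda g: {"param": "sequence", "value": "jump_motif"}),
]

events = evaluate(rules, Gesture(height=1.2, velocity=3.0, curvature=0.4, jumping=True))
print(events)
# → [{'param': 'intensity', 'value': 90}, {'param': 'sequence', 'value': 'jump_motif'}]
```

Because each rule is an independent condition/action pair, a choreographer and a composer can negotiate mappings one rule at a time, which is the kind of interaction opportunity the abstract points to.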
