Contrary Motion: An Oppositional Interactive Music System

The hypothesis of this interaction research project is that confronting a system which ‘opposes’ their musical style can be stimulating for experimental musicians. ‘Contrary Motion’, the title's namesake, is a MIDI-based real-time musical software agent that uses machine listening to establish the musical context and then chooses responses that differentiate its position from that of its human interlocutor. Doing this requires both a deep consideration of the space of musical actions, so as to explicate what opposition should constitute, and machine listening technology (most prominently represented by new online beat- and stream-tracking algorithms) that measures the player's position accurately enough to consistently avoid it. An initial pilot evaluation was undertaken, feeding critical data back into the developing design.
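The oppositional strategy described above, in which machine-listening estimates of the player's position drive contrasting responses, can be sketched in miniature. The function, parameter names, and thresholds below are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of an "oppositional" response policy: given machine-listening
# estimates of the human player's state (tempo, mean pitch, note density), choose
# contrasting response parameters. All names and thresholds are illustrative.

def oppose(state):
    """Return response parameters that differentiate from the player's position."""
    # Play against the detected beat: offset onsets by half a beat period.
    beat_period_s = 60.0 / state["tempo_bpm"]
    onset_offset_s = beat_period_s / 2.0

    # Occupy the opposite pitch register (MIDI note numbers run 0-127).
    target_pitch = 96 if state["mean_pitch"] < 64 else 36

    # Invert note density: play sparsely against busy input, busily against sparse.
    notes_per_sec = max(0.5, 8.0 - state["notes_per_sec"])

    return {"onset_offset_s": onset_offset_s,
            "target_pitch": target_pitch,
            "notes_per_sec": notes_per_sec}

# A player detected at 120 BPM, playing mid-high and fairly densely:
player = {"tempo_bpm": 120.0, "mean_pitch": 72.0, "notes_per_sec": 6.0}
print(oppose(player))
# → {'onset_offset_s': 0.25, 'target_pitch': 36, 'notes_per_sec': 2.0}
```

The key design point this sketch illustrates is that opposition is only well-defined relative to an accurate measurement of the player's current position, which is why the online beat- and stream-tracking components are central to the system.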
