A similarity algorithm for interactive style imitation

We consider an interactive musical agent (IMA) which shares control of an ensemble of virtual instruments with a human performer, and which is required to interact in a particular musical style without a priori knowledge of the ensemble itself. We present an algorithm which allows such an agent to find the segment in a corpus of demonstration performances that best matches a given musical situation. This segment can then be used to choose values for the parameters under the agent's control. We demonstrate the algorithm's efficacy with an IMA which can (i) perform alongside a human musician and (ii) imitate the behaviour of another artificial musical agent which performs according to a set of rules.
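The retrieval step described above can be pictured as a sliding-window nearest-neighbour search over the demonstration corpus. The sketch below is purely illustrative: the feature representation (per-frame vectors of floats), the Euclidean frame distance, and the function names are assumptions for exposition, not the paper's actual algorithm.

```python
import math
from typing import List, Sequence, Tuple

def frame_distance(a: Sequence[float], b: Sequence[float]) -> float:
    # Illustrative choice: Euclidean distance between two feature frames.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_matching_segment(
    corpus: List[List[List[float]]],
    situation: List[List[float]],
) -> Tuple[int, int]:
    """Return (performance index, start offset) of the corpus segment
    whose cumulative frame-by-frame distance to `situation` is smallest.

    `corpus` is a list of demonstration performances, each a sequence
    of feature frames; `situation` is the current musical context,
    represented in the same feature space.
    """
    w = len(situation)
    best = (-1, -1)
    best_cost = math.inf
    for p_idx, perf in enumerate(corpus):
        # Slide a window of the situation's length over each performance.
        for start in range(len(perf) - w + 1):
            cost = sum(
                frame_distance(perf[start + i], situation[i])
                for i in range(w)
            )
            if cost < best_cost:
                best_cost = cost
                best = (p_idx, start)
    return best
```

The matched segment's index can then be used to look up the corresponding parameter values recorded in the demonstration, which the agent applies to the instruments under its control.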