New algorithms and technology for analyzing gestural data

We describe how we analyze gestural data as if it were an audio signal, and how we apply this technique to the radio drum, a novel three-dimensional controller that one of the authors uses regularly in concert performance. The radio drum uses capacitive sensing: a radio-frequency voltage source drives the performer's mallets or sticks, and the signal is received by the drum surface beneath them. The two sticks are distinguished by assigning a different frequency to each. The received signal from each stick encodes the performer's gestures. The signal processing involves two key steps. The first is to capture all of the subtle motion associated with these gestures, so that the instrument responds sensitively to the performer's expressive technique; we seek to capture the entire motion represented by the x, y, z signals versus time, not just the velocity of a strike. The second is to map the gesture signal onto musical events. This mapping can take many forms, depending on the creative ideas of the composer and/or performer.
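As a rough illustration of treating the gestural data as an audio signal, the following sketch separates the two sticks' contributions from a single received signal by synchronous demodulation at each stick's carrier frequency. The sample rate, carrier frequencies, and synthetic test signal are assumptions for the example only and do not describe the actual radio drum hardware.

import numpy as np
from scipy import signal

FS = 44_100                  # assumed sample rate of the digitized antenna signal (Hz)
F1, F2 = 5_000.0, 7_000.0    # assumed carrier frequencies for the two sticks (Hz)

def stick_envelope(raw, carrier_hz, fs=FS, cutoff_hz=200.0):
    """Recover one stick's slowly varying gesture signal by mixing the raw
    antenna signal down with that stick's carrier and low-pass filtering."""
    t = np.arange(len(raw)) / fs
    i = raw * np.cos(2 * np.pi * carrier_hz * t)          # in-phase mix
    q = raw * np.sin(2 * np.pi * carrier_hz * t)          # quadrature mix
    b, a = signal.butter(4, cutoff_hz / (fs / 2), btype="low")
    return 2.0 * np.hypot(signal.filtfilt(b, a, i), signal.filtfilt(b, a, q))

# Synthetic example: stick 1 approaches the surface (rising amplitude at F1)
# while stick 2 is held still (constant amplitude at F2).
t = np.arange(0, 1.0, 1 / FS)
raw = (0.2 + 0.8 * t) * np.sin(2 * np.pi * F1 * t) + 0.5 * np.sin(2 * np.pi * F2 * t)
env1 = stick_envelope(raw, F1)   # rises over time
env2 = stick_envelope(raw, F2)   # stays roughly constant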
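The mapping step can likewise be sketched in a minimal form: detect when the z (height) signal crosses down through the drum surface and emit a note-like event whose velocity follows the stick's downward speed just before impact. The threshold, debounce interval, and event format below are illustrative assumptions rather than the mapping actually used in performance, which, as noted above, can take many forms.

import numpy as np

def strikes_to_events(z, fs, surface=0.02, min_gap_s=0.05):
    """Scan a z-versus-time gesture signal for downward crossings of the
    assumed surface height and return (time_seconds, velocity) pairs, where
    velocity is the downward speed of the stick at the crossing sample."""
    dz = np.gradient(z) * fs          # vertical speed in height-units per second
    events = []
    last_time = -np.inf
    for n in range(1, len(z)):
        crossed_down = z[n - 1] > surface >= z[n]
        if crossed_down and (n / fs - last_time) > min_gap_s:
            events.append((n / fs, max(0.0, -dz[n])))
            last_time = n / fs
    return events

Because the full x, y, z trajectory is retained, a richer mapping could condition the resulting event on where the strike lands or on the shape of the approach, not only on its speed.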