In order to analyse timing in musical performance, it is necessary to develop reliable and efficient methods of deriving musical timing information (e.g. tempo, beat and rhythm) from the physical timing of audio signals or MIDI data. We report the results of an experiment in which subjects were asked to mark the positions of beats in musical excerpts, using a multimedia interface which provides various forms of audio and visual feedback. Six experimental conditions were tested, which involved disabling various parts of the system's feedback to the user. Even in extreme cases such as no audio feedback or no visual feedback, subjects were often able to find the regularities corresponding to the musical beat. In many cases, the subjects' placement of markers corresponded closely to the onsets of on-beat notes (according to the score), but the beat sequences were much more regular than the corresponding note onset times. The form of feedback provided by the system had a significant effect on the chosen beat times: visual feedback encouraged a closer alignment of beats with notes, whereas audio feedback led to a smoother beat sequence.
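The abstract does not specify how the regularity of the marked beat sequences was measured. As an illustrative sketch only (not the paper's actual analysis), one common way to quantify such a claim is to compare the coefficient of variation of inter-beat intervals against that of inter-onset intervals; the function name and sample data below are hypothetical.

```python
import numpy as np

def interval_cv(event_times):
    """Coefficient of variation of inter-event intervals.

    Lower values indicate a more regular (more nearly isochronous) sequence.
    """
    intervals = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    return intervals.std() / intervals.mean()

# Hypothetical data: beat times marked by a subject vs. on-beat note onsets (seconds)
marked_beats = [0.02, 0.51, 1.00, 1.49, 2.01, 2.50]
note_onsets  = [0.00, 0.55, 0.97, 1.46, 2.06, 2.48]

print(f"CV of inter-beat intervals:  {interval_cv(marked_beats):.3f}")
print(f"CV of inter-onset intervals: {interval_cv(note_onsets):.3f}")
```

Under this measure, a lower CV for the marked beats than for the note onsets would correspond to the reported finding that beat sequences were smoother than the expressively timed onsets they track.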