PINPOINTING THE BEAT: TAPPING TO EXPRESSIVE PERFORMANCES

ABSTRACT

In this study we report on an experiment in which listeners were asked to tap in time with expressively performed music, and compare the results with those of two other experiments that used the same stimuli to investigate beat and tempo perception through other modalities. Many computational models of beat tracking assume that beats coincide with the onsets of musical notes; we consider the hypothesis that the beat times are instead given by a curve that is "smoother" than the tempo curve of the note onset times, but which can nevertheless be derived from those onset times. The tapping results show a tendency to underestimate tempo changes, which supports the smoothing hypothesis and agrees with listening experiments and other tapping studies.

1. INTRODUCTION

Tempo and beat are well defined in the abstract setting of a musical score, but not in the context of the analysis of expressive musical performance. That is, the regular pulse which forms the basis of rhythmic notation in common music notation is anything but regular when the timing of performed notes is measured. These micro-deviations from mechanical timing are an important part of musical expression, although they remain, for the most part, poorly understood. In this study we report on an experiment in which listeners were asked to tap in time with expressively performed music, and compare the results to two other experiments using the same stimuli which investigated beat and tempo perception through other modalities.

In this paper, we define the beat to be a perceived pulse consisting of a set of beat times (or beats) which are approximately equally spaced throughout a musical performance. Each pulse corresponds to one of the metrical levels of the musical notation, usually the quarter note, eighth note, half note or dotted quarter note level. We refer to the time interval between two successive beats at a particular metrical level as the inter-beat interval (IBI), which is a measure of instantaneous tempo. A more general measure of tempo is given by averaging IBIs over some time period or number of beats. The IBI is expressed in units of time (per beat); tempo is more often expressed as the reciprocal, beats per unit time (e.g. beats per minute). To distinguish the timing of the participants' taps from the timing of the notes performed by the musician, we use the terms tapped IBI (t-IBI) and performed IBI (p-IBI).
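The definitions above can be made concrete with a short sketch: given a sequence of beat times, the IBIs are the successive differences, instantaneous tempo is the reciprocal of each IBI, and a more general tempo measure averages IBIs over a window. The beat times below are hypothetical values chosen only for illustration.

```python
# Sketch of the IBI and tempo definitions from the text.
# The beat times (in seconds) are hypothetical, for illustration only.
beat_times = [0.0, 0.52, 1.01, 1.55, 2.04, 2.60]

# Inter-beat intervals (IBIs): time between successive beats.
ibis = [b - a for a, b in zip(beat_times, beat_times[1:])]

# Instantaneous tempo: the reciprocal of each IBI, in beats per minute.
tempi_bpm = [60.0 / ibi for ibi in ibis]

# A more general tempo measure: the mean IBI over a span of beats.
mean_ibi = sum(ibis) / len(ibis)
mean_tempo_bpm = 60.0 / mean_ibi
```

The same computation applies to both the performer's note-derived beats (p-IBI) and the participants' taps (t-IBI); only the input time series differs.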
