The Schultz MIDI Benchmarking Toolbox for MIDI interfaces, percussion pads, and sound cards

The Musical Instrument Digital Interface (MIDI) was readily adopted for auditory sensorimotor synchronization experiments. These experiments typically use MIDI percussion pads to collect responses, a MIDI–USB converter (or MIDI–PCI interface) to record responses on a PC and manipulate feedback, and an external MIDI sound module to generate auditory feedback. Previous studies have suggested that these devices can introduce latencies in the auditory feedback. The Schultz MIDI Benchmarking Toolbox (SMIDIBT) is an open-source, Arduino-based package designed to measure the point-to-point latencies incurred by several devices used to generate response-triggered auditory feedback. Experiment 1 showed that MIDI messages are sent and received within 1 ms (on average) in the absence of any external MIDI device, and that latencies decreased when the baud rate was increased above the MIDI protocol default (31,250 bps). Experiment 2 benchmarked the latencies introduced by different MIDI–USB and MIDI–PCI interfaces. MIDI–PCI outperformed MIDI–USB, primarily because MIDI–USB is subject to USB polling. Experiment 3 tested three MIDI percussion pads. Both the audio and MIDI message latencies were significantly greater than 1 ms for all devices, and there were significant differences between percussion pads and between instrument patches. Experiment 4 benchmarked four MIDI sound modules. Audio latencies were significantly greater than 1 ms, and there were significant differences between sound modules and between instrument patches. Together, these experiments suggest that millisecond accuracy might not be achievable with MIDI devices. The SMIDIBT can be used to benchmark a range of MIDI devices, allowing researchers to make informed decisions when choosing testing materials and to arrive at an acceptable latency at their discretion.
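
As a concrete illustration of the kind of point-to-point measurement described above, the sketch below timestamps an outgoing MIDI note-on message and then busy-waits until the resulting audio signal crosses an amplitude threshold, reporting the elapsed time in microseconds. This is a minimal sketch rather than the SMIDIBT code itself: it assumes a board with a second hardware serial port (e.g., an Arduino Mega), and the audio input pin, detection threshold, and logging baud rate are placeholder values.

```cpp
// Minimal illustration of a point-to-point MIDI-to-audio latency measurement.
// Not the SMIDIBT code itself: pin assignments, threshold, and logging baud
// rate are assumptions and would need to be adapted to the actual wiring.

const unsigned long MIDI_BAUD = 31250;  // MIDI protocol default baud rate
const int AUDIO_PIN = A0;               // assumed analog pin monitoring the audio output
const int AUDIO_THRESHOLD = 50;         // assumed ADC threshold for detecting sound onset

void sendNoteOn(byte note, byte velocity) {
  // MIDI note-on message on channel 1: status byte, note number, velocity
  Serial1.write(0x90);
  Serial1.write(note);
  Serial1.write(velocity);
}

void setup() {
  Serial.begin(115200);      // USB serial link to the PC for logging results
  Serial1.begin(MIDI_BAUD);  // hardware serial port driving the MIDI OUT circuit
}

void loop() {
  unsigned long start = micros();  // timestamp immediately before the message is sent
  sendNoteOn(60, 100);             // middle C at moderate velocity

  // Busy-wait until the sound module's audio output crosses the detection
  // threshold; the elapsed time is the point-to-point latency for this trial.
  while (analogRead(AUDIO_PIN) < AUDIO_THRESHOLD) {
    // a timeout could be added here for robustness
  }
  unsigned long latencyMicros = micros() - start;

  Serial.println(latencyMicros);  // report latency in microseconds to the PC
  delay(500);                     // pause before the next trial
}
```

Raising the value passed to Serial1.begin() above 31,250 bps on both the sending and receiving devices corresponds, in essence, to the baud-rate manipulation examined in Experiment 1; the MIDI-standard value is retained here because most commercial MIDI hardware expects it.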
