In this paper we present current trends in real-time music tracking (also known as score following). Loosely speaking, these algorithms "listen" to a live performance of music, compare the audio signal to an abstract representation of the score, and "read" along in the sheet music. In this way, the exact position of the musician(s) in the sheet music is known at any given time. Here, we focus on the flexibility and usability of these algorithms. This comprises work on automatic identification and flexible tracking of the piece being played, as well as current approaches based on Deep Learning. The latter enable direct learning of correspondences between complex audio data and images of the sheet music, avoiding the complicated and time-consuming definition of a mid-level representation.
-----
This paper deals with current developments in automatic music tracking by computer. These are algorithms that "listen" to a musical performance, compare the recorded audio signal to an (abstract) representation of the score, and, so to speak, read along in it. The algorithm thus knows the position of the musicians in the score at every point in time. Besides giving a general overview, the focus of this paper lies on the aspects of flexibility and easier usability of these algorithms. It describes which steps have been taken (and are currently being taken) to make the process of automatic music tracking more easily accessible. This includes work on the automatic identification of the pieces being played and their flexible tracking, as well as current approaches based on Deep Learning, which make it possible to connect image and sound directly, without the detour via abstract intermediate representations that can only be created with great effort.
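To make the basic idea concrete, the following sketch illustrates a deliberately simplified score follower: incoming audio feature frames (e.g. chroma vectors) are greedily aligned against pre-computed score features. This is not the method discussed in the paper; real systems rely on online dynamic time warping, hidden Markov models, or particle filters. The class name, the feature choice, and the greedy local step are illustrative assumptions.

    import numpy as np

    # Illustrative sketch only: greedy online alignment of live audio feature
    # frames (e.g. chroma vectors) against score features rendered from a
    # symbolic score. Real score followers use online DTW or probabilistic models.

    def cosine_distance(a, b):
        """Distance between two feature vectors (lower = more similar)."""
        denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-9
        return 1.0 - float(np.dot(a, b)) / denom

    class GreedyScoreFollower:
        def __init__(self, score_features, max_jump=2):
            # score_features: (num_score_frames, feature_dim) array, i.e. the
            # "abstract representation" of the score in feature space.
            self.score = np.asarray(score_features, dtype=float)
            self.position = 0          # current estimated frame in the score
            self.max_jump = max_jump   # how far we may advance per audio frame

        def step(self, audio_frame):
            """Consume one live audio feature frame and update the score position."""
            lo = self.position
            hi = min(self.position + self.max_jump, len(self.score) - 1)
            dists = [cosine_distance(audio_frame, self.score[j]) for j in range(lo, hi + 1)]
            self.position = lo + int(np.argmin(dists))
            return self.position

    # Usage sketch with placeholder data; in practice the features would come
    # from an audio front-end and from the symbolic score.
    score_feats = np.random.rand(1000, 12)
    follower = GreedyScoreFollower(score_feats)
    for frame in np.random.rand(50, 12):
        pos = follower.step(frame)   # current position in the score

The Deep Learning part, i.e. learned correspondences between audio and sheet-music images, can likewise be illustrated as retrieval in a shared embedding space. The sketch below assumes that embeddings for sheet-music snippets and for an audio excerpt have already been produced by some learned model (not shown); it only demonstrates the nearest-neighbour lookup that could underlie score identification.

    import numpy as np

    # Hypothetical embeddings in a shared space; how they are learned (e.g. with
    # two convolutional networks trained on aligned audio/sheet pairs) is not shown.

    def normalize(x):
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-9)

    sheet_embeddings = normalize(np.random.rand(500, 32))  # one row per sheet snippet
    audio_embedding = normalize(np.random.rand(32))        # embedding of an audio excerpt

    similarities = sheet_embeddings @ audio_embedding      # cosine similarities
    best_match = int(np.argmax(similarities))              # index of the best-matching snippet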