An investigation into the linear and nonlinear correlation of two music walk sequences

Abstract In order to investigate the features of a multiple-part musical score that enhance its appeal to the listener’s ear, this study performs a robust analysis of the correlation between two musical sequences. In the proposed approach, a series of notes is extracted from seven well-known classical pieces of music and converted into one-variable “music walks”. The linear correlation between pairs of music walks is assessed using the conventional linear correlation coefficient, while the nonlinear correlation is examined using the mutual information concept. The results show that even though two music walks may exhibit virtually no linear correlation, they invariably have a certain degree of nonlinear correlation. In other words, to truly understand the correlation between two musical sequences, it is necessary to consider not only the linear correlation between them but also the nonlinear correlation. In addition, it is shown that the normalized mutual information coefficient between musical sequences has a relatively low value and varies significantly over the course of the musical score. Thus, it can be inferred that the appeal of a musical score stems at least in part from significant variations in both the melody and the rhythm of the constituent parts, such that the overall score has a rich and unpredictable character.
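As a rough illustration of the two dependence measures described in the abstract, the sketch below computes the linear correlation coefficient and a histogram-based mutual information estimate between two one-variable walks. It is a minimal sketch only: the function names, the 16-bin histogram estimator, and the normalization lambda = sqrt(1 - exp(-2I)) are assumptions introduced here for illustration, not the authors’ exact procedure, and the toy walks stand in for sequences actually extracted from a score.

```python
import numpy as np

def pearson_corr(x, y):
    """Conventional linear correlation coefficient between two walks."""
    return float(np.corrcoef(x, y)[0, 1])

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information I(X;Y) in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                     # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)           # marginal of X
    py = pxy.sum(axis=0, keepdims=True)           # marginal of Y
    nz = pxy > 0                                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def normalized_mi(x, y, bins=16):
    """Map I(X;Y) onto [0, 1] via one common normalization,
    lambda = sqrt(1 - exp(-2*I)); an assumption, not necessarily
    the coefficient used in the paper."""
    return float(np.sqrt(1.0 - np.exp(-2.0 * mutual_information(x, y, bins))))

# Toy example: two hypothetical music walks (cumulative pitch steps).
rng = np.random.default_rng(0)
walk_a = np.cumsum(rng.integers(-2, 3, size=500))
walk_b = np.cumsum(rng.integers(-2, 3, size=500))

print("linear correlation:", pearson_corr(walk_a, walk_b))
print("normalized mutual information:", normalized_mi(walk_a, walk_b))
```

With real score data, the walks would be built from the successive notes of two parts of the same piece, and the coefficients could be evaluated over sliding windows to track how the dependence varies along the score.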
