Entropy as a Measure of Style: The Influence of Sample Length

We can imagine the act of musical composition as the selection of elements from several musical parameters. For example, the composer may choose more tonic than dominant harmonies, more quarter notes than half notes, and a preponderance of conjunct rather than disjunct motions. These choices will bring about distributional characteristics that may belong to a "style." Once made, these choices are, at any rate, identifiable characteristics of the music itself. Elements in musical parameters are not unlike characters in common speech alphabets. Communicative structures of substantial size are the end result of a complex series of choices: selections from alphabetic pools in the case of written literature and, in the case of music, from the pools of elements in the several parameters that together constitute musical expression.

The study of the selection and distribution of alphabetic characters is the domain of information theory. More than twenty years ago, Youngblood proposed that the computation of information content, the entropy of information theory, could serve as a "method to identify musical style."¹ The entropy of information theory is a calculation of the freedom with which available alphabetic materials are used. Stated conversely, it
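The entropy computation invoked here is Shannon's: for an alphabet whose elements occur with relative frequencies p_i, the entropy is H = -Σ p_i log₂ p_i, measured in bits per symbol. The sketch below illustrates this on a hypothetical harmonic sequence; the sample data and the function name are the author's own illustration, not material from Youngblood's study.

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Entropy in bits per symbol of a sequence drawn from a finite alphabet."""
    counts = Counter(sequence)
    n = len(sequence)
    # H = -sum(p_i * log2(p_i)) over the observed relative frequencies
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical sample: a composer favoring tonic (I) over dominant (V) harmonies
harmonies = ["I", "I", "I", "V", "I", "V", "I", "I"]
print(round(shannon_entropy(harmonies), 3))  # → 0.811
```

A uniform use of the alphabet (equal counts of I and V) would yield the maximum of 1 bit per symbol; the skew toward the tonic lowers the entropy, which is the sense in which entropy measures the freedom with which the available materials are used.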