Evaluating Perceptual Separation in a Pilot System for Affective Composition

Research evaluating perceptual responses to music has identified many structural features as correlates that might be incorporated into computer music systems for affectively charged algorithmic composition and/or expressive music performance. To investigate the integration of isolated musical features into such a system, a discrete feature known to correlate with emotional responses – rhythmic density – was selected from a literature review and incorporated into a prototype system. The system produces variation in rhythmic density via a transformative process. A stimulus set created with this system was then subjected to perceptual evaluation: pairwise comparisons were used to scale differences between 48 stimuli, and listener responses were analysed with multidimensional scaling (MDS). The two-dimensional solution was then rotated to place the stimuli with the largest range of variation across the horizontal plane. Stimuli with variation in rhythmic density were placed further from the source material than stimuli generated by random permutation. This finding, combined with the striking similarity between the MDS configuration and the two-dimensional emotional model used by some affective algorithmic composition systems, suggests that isolated musical feature manipulation can be used to parametrically control affectively charged automated composition in a larger system.
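The scaling step described above – embedding pairwise dissimilarity judgements into a two-dimensional space and rotating so the largest spread lies along the horizontal axis – can be sketched as follows. This is a minimal illustration using classical (Torgerson) metric MDS in NumPy; the paper does not specify which MDS variant was applied, so the function names and the choice of metric MDS here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def classical_mds(dissim, n_dims=2):
    """Classical (Torgerson) MDS: embed a symmetric dissimilarity
    matrix into n_dims dimensions (illustrative sketch)."""
    n = dissim.shape[0]
    # Double-center the squared dissimilarities
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (dissim ** 2) @ J
    # Top eigenvectors (scaled by sqrt of eigenvalues) give coordinates
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:n_dims]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

def rotate_to_principal_axis(coords):
    """Rotate the solution so the direction of largest variance lies
    along axis 0, mirroring the horizontal-plane alignment above."""
    centred = coords - coords.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ vt.T
```

In use, a listener-derived 48 x 48 dissimilarity matrix would be passed to `classical_mds`, and the resulting configuration rotated with `rotate_to_principal_axis` before inspecting where each stimulus falls relative to the source material.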
