Conversations with Expert Users in Music Retrieval and Research Challenges for Creative MIR

Sample retrieval remains a central problem in the creative process of making electronic dance music. This paper describes the findings from a series of interview sessions involving users working creatively with electronic music. We conducted in-depth interviews with expert users on location at the Red Bull Music Academies in 2014 and 2015. When asked about their wishes and expectations for future technological developments in interfaces, most participants mentioned very practical requirements for storing and retrieving files. A central aspect of the desired systems is the need to provide increased flow and unbroken periods of concentration and creativity. From the interviews, it becomes clear that for Creative MIR, and in particular for music interfaces for creative expression, traditional requirements and paradigms for music and audio retrieval differ from those of consumer-centered MIR tasks such as playlist generation and recommendation, and that new paradigms need to be considered. Even though the experts themselves can control all technical aspects, searching for sounds to use in composition remains a largely semantic process. From the outcomes of the interviews, we outline a series of possible conclusions and areas for development, and pose two research challenges for future developments of sample retrieval interfaces in the creative domain.

© Kristina Andersen, Peter Knees. Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0). Attribution: Kristina Andersen, Peter Knees. "Conversations with Expert Users in Music Retrieval and Research Challenges for Creative MIR", 17th International Society for Music Information Retrieval Conference, 2016.

1. MOTIVATION AND CONTEXT

Considerable effort has been put into analysing user behaviour in the context of music retrieval over the past two decades [35]. This includes studies on music information seeking behaviour [14, 17], organisation strategies [15], usage of commercial listening services [36], the needs and motivations of particular user groups, such as kids [28], adolescents [34], or musicologists [29], and behaviour analysis for specific tasks, e.g., playlist and mix generation [13], or in specific settings, e.g., riding together in a car [16] or music lessons in secondary schools [49].

Figure 1. Live electronic music performance at the Red Bull Music Academy 2014

In this paper, we want to address music retrieval from the perspective of music producers, and thus to investigate the behaviour of a group that deals with audio retrieval professionally on a daily basis but has received comparatively little attention in MIR research so far, as have other questions from the area of Creative MIR [27]. The majority of today's electronic music is created from pre-recorded or live-generated sound material. This process often combines sound loops and samples with synthesized and processed elements using a so-called digital audio workstation (DAW), an electronic device or computer application for recording, editing, and producing audio files. In these systems, Music Information Retrieval (MIR) methods, e.g., for content analysis, gain importance. In essence, future tools and applications need to be aware of the nature and content of the music material in order to effectively support the musician in the creative process. However, user studies on retrieval for musicians and producers are scarce. Cartwright et al. [6] investigate potential alternatives to existing audio production user interfaces in a study with 24 participants. In another example, Bainbridge et al. [4] explore and test a personal digital library environment for musicians, in which, based on a spatial paradigm, musicians can capture, annotate, and retrieve their ideas, e.g., using query-by-humming. In this paper, our approach is not to test an existing system, but to gain an understanding of the processes involved for music producers, who are used to working with existing music software suites.
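The content awareness described above, i.e., systems that know something about the audio material they manage, can be illustrated with a minimal sketch of content-based sample retrieval: describe each sample by an audio feature and return the library items closest to a query sound. The feature choice (a crude spectral centroid) and all function names here are illustrative assumptions, not the method of any system discussed in the text.

```python
# A minimal sketch of content-based sample retrieval, assuming a toy
# spectral-centroid feature and a brute-force nearest-neighbour search.
# Real systems use richer descriptors (timbre, rhythm, pitch) and
# indexing structures; this only shows the basic retrieval loop.
import numpy as np

def spectral_centroid(signal, sr=22050):
    """Single crude descriptor: the 'centre of mass' of the spectrum in Hz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sr)
    return (freqs * spectrum).sum() / spectrum.sum()

def retrieve(query, library, k=2):
    """Return indices of the k library samples closest to the query."""
    q = spectral_centroid(query)
    dists = [abs(q - spectral_centroid(s)) for s in library]
    return [int(i) for i in np.argsort(dists)[:k]]

# Toy sample library: three sine 'tones' and one white-noise 'hit'.
sr = 22050
t = np.linspace(0, 0.5, int(sr * 0.5), endpoint=False)
library = [np.sin(2 * np.pi * f * t) for f in (110, 220, 880)]
library.append(np.random.default_rng(0).standard_normal(t.size))

# A 230 Hz query retrieves the 220 Hz sample (index 1) as its nearest match.
print(retrieve(np.sin(2 * np.pi * 230 * t), library, k=1))  # [1]
```

Even this toy loop makes the paper's point concrete: retrieval here is purely signal-based, whereas the interviewed producers describe their search for sounds in semantic terms, which is exactly the gap the posed research challenges address.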

REFERENCES

[1] Eric J. Humphrey, et al. A Brief Review of Creative MIR, 2014.
[2] Diemo Schwarz, et al. Current Research in Concatenative Sound Synthesis, 2005, ICMC.
[3] Frans Wiering, et al. In Their Own Words: Using Text Analysis to Identify Musicologists' Attitudes towards Technology, 2015, ISMIR.
[4] Gaël Richard, et al. Drum Loops Retrieval from Spoken Queries, 2005, Journal of Intelligent Information Systems.
[5] Antonella De Angeli, et al. A Hybrid Machine-Crowd Approach to Photo Retrieval Result Diversification, 2014, MMM.
[6] Dan Stowell, et al. MIR in School? Lessons from Ethnographic Observation of Secondary School Music Classes, 2011, ISMIR.
[7] Barry Smyth, et al. Similarity vs. Diversity, 2001, ICCBR.
[8] Norbert Schnell, et al. Playing the "MO" - Gestural Control and Re-Embodiment of Recorded Sound and Music, 2011, NIME.
[9] Balaji Padmanabhan, et al. SCENE: A Scalable Two-Stage Personalized News Recommendation System, 2011, SIGIR.
[10] Shlomo Dubnov, et al. OMax Brothers: A Dynamic Topology of Agents for Improvization Learning, 2006, AMCMM '06.
[11] Diemo Schwarz, et al. Sound Search by Content-Based Navigation in Large Databases, 2009.
[12] J. Stephen Downie, et al. Everyday Life Music Information-Seeking Behaviour of Young Adults, 2006, ISMIR.
[13] Peter Knees, et al. Searching for Audio by Sketching Mental Images of Sound: A Brave New Idea for Audio Retrieval in Creative Music Production, 2016, ICMR.
[14] Xavier Serra, et al. Expressive Concatenative Synthesis by Reusing Samples from Real Performance Recordings, 2009, Computer Music Journal.
[15] Joshua D. Reiss, et al. Mixploration: Rethinking the Audio Mixer Interface, 2014, IUI.
[16] Sally Jo Cunningham, et al. Toward an Understanding of the History and Impact of User Studies in Music Information Retrieval, 2013, Journal of Intelligent Information Systems.
[17] Sebastian Streich, et al. A Music Loop Explorer System, 2008, ICMC.
[18] Adam Finkelstein, et al. AudioQuilt: 2D Arrangements of Audio Samples Using Metric Learning and Kernelized Sorting, 2014, NIME.
[19] Peter Knees, et al. The Dial: Exploring Computational Strangeness, 2016, CHI Extended Abstracts.
[20] Sally Jo Cunningham, et al. 'More of an Art than a Science': Supporting the Creation of Playlists and Mixes, 2006, ISMIR.
[21] Arthur Flexer, et al. Visualization of Perceptual Qualities in Textural Sounds, 2012, ICMC.
[22] Paolo Tomeo, et al. An Analysis of Users' Propensity toward Diversity in Recommendations, 2014, RecSys '14.
[23] Juan Pablo Bello, et al. Random Access Remixing on the iPad, 2011, NIME.
[24] Sally Jo Cunningham, et al. A User-Centered Design of a Personal Digital Library for Music Exploration, 2010, JCDL '10.
[25] Daniele Quercia, et al. Auralist: Introducing Serendipity into Music Recommendation, 2012, WSDM '12.
[26] Peter C. Wright, et al. Empathy and Experience in HCI, 2008, CHI.
[27] Bryan Pardo, et al. SynthAssist: Querying an Audio Synthesizer by Vocal Imitation, 2014, NIME.
[28] Jin Ha Lee, et al. Understanding Users of Commercial Music Services through Personas: Design Implications, 2015, ISMIR.
[29] Ajay Kapur, et al. Query-by-Beat-Boxing: Music Retrieval for the DJ, 2004, ISMIR.
[30] Sahin Albayrak, et al. User-Centric Evaluation of a K-Furthest Neighbor Collaborative Filtering Recommender Algorithm, 2013, CSCW.
[31] David M. Nichols, et al. Social Music in Cars, 2014, ISMIR.
[32] J. Stephen Downie, et al. "The Pain, the Pain": Modelling Music Information Behavior and the Songs We Hate, 2005, ISMIR.
[33] Shlomo Dubnov, et al. Audio Oracle: A New Algorithm for Fast Learning of Audio Structures, 2007, ICMC.
[34] F. Pachet, et al. Musical Mosaicing, 2001.
[35] Sally Jo Cunningham, et al. Towards the Design of a Kids' Music Organizer, 2008, CHINZ.
[36] Nina Reeves, et al. An Ethnographic Study of Music Information Seeking: Implications for the Design of a Music Digital Library, 2003, Joint Conference on Digital Libraries (JCDL).
[37] Eoin Brazil, et al. Sonic Browsing: An Auditory Tool for Multimedia Asset Management, 2001.
[38] Kristina Andersen, et al. GiantSteps: Semi-Structured Conversations with Musicians, 2015, CHI Extended Abstracts.
[39] Perry R. Cook, et al. Real-Time Human Interaction with Supervised Learning Algorithms for Music Composition and Performance, 2011.
[40] Anne Treisman, et al. Natural Cross-Modal Mappings between Visual and Auditory Features, 2011, Journal of Vision.
[41] Matt Jones, et al. Organizing Digital Music for Use: An Examination of Personal Music Collections, 2004, ISMIR.
[42] Rebecca Fiebrink, et al. Using Interactive Machine Learning to Support Interface Development through Workshops with Disabled People, 2015, CHI.
[43] Graham Coleman. Mused: Navigating the Personal Sample Library, 2007, ICMC.
[44] Susanto Rahardja, et al. Rhythm Analysis for Personal and Social Music Applications Using Drum Loop Patterns, 2009, IEEE International Conference on Multimedia and Expo.
[45] Nick Collins. Contrary Motion: An Oppositional Interactive Music System, 2010, NIME.
[46] Arshia Cont, et al. Antescofo: Anticipatory Synchronization and Control of Interactive Parameters in Computer Music, 2008, ICMC.
[47] Òscar Celma, et al. A New Approach to Evaluating Novel Recommendations, 2008, RecSys '08.
[48] Elias Pampalk, et al. Hierarchical Organization and Visualization of Drum Sample Libraries, 2004.
[49] Sebastian Streich, et al. Music Loop Extraction from Digital Audio Signals, 2008, IEEE International Conference on Multimedia and Expo.
[50] Nick Collins, et al. Automatic Composition of Electroacoustic Art Music Utilizing Machine Listening, 2012, Computer Music Journal.