Semantic Web Technology for New Experiences Throughout the Music Production-Consumption Chain

The FAST project (Fusing Audio and Semantic Technology for Intelligent Music Production and Consumption), with five years of UK funding, has sought to create a new musical ecosystem that empowers all manner of people, from professional performers to casual listeners, to engage in new, more creative, immersive and dynamic musical experiences. Realising this requires a step-change in digital music technologies. Going beyond today's digital sound files, future experiences will demand far richer musical information, with music content packaged in a flexible, structured way that combines audio recordings with layered metadata to support interactive and adaptive musical experiences. This defines the overall ambition of FAST: to lay the foundations for a new generation of ‘semantic audio’ technologies that underpin diverse future music experiences. This paper describes the overall vision of the project, sets out the broad landscape in which it works, highlights some key results, and shows how these converge on a central notion of FAST, the Digital Music Object: a flexible construct coupling recorded music essence with rich, semantic, linked metadata.
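As a concrete illustration, a minimal sketch of a Digital Music Object is given below as an RDF graph built with Python's rdflib: an audio "essence" resource is linked to layered, semantic metadata. The Music Ontology namespace is real, but the dmo: namespace, the DigitalMusicObject class, and the hasEssence and tempo properties are hypothetical placeholders, not the project's published schema.

```python
# Minimal sketch of a Digital Music Object (DMO) as an RDF graph.
# Assumptions: the dmo: vocabulary below is invented for illustration;
# only the Music Ontology (mo:) namespace is an established vocabulary.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF, XSD

MO = Namespace("http://purl.org/ontology/mo/")   # Music Ontology (real)
DMO = Namespace("http://example.org/dmo#")       # hypothetical DMO vocabulary

g = Graph()
g.bind("mo", MO)
g.bind("dmo", DMO)
g.bind("dcterms", DCTERMS)

dmo = URIRef("http://example.org/dmo/take-42")
audio = URIRef("http://example.org/audio/take-42.wav")

# The recorded "essence": a plain audio file, typed via the Music Ontology.
g.add((audio, RDF.type, MO.AudioFile))

# The Digital Music Object couples that essence with layered metadata:
# editorial metadata (a title) and an analysis layer (an extracted tempo).
g.add((dmo, RDF.type, DMO.DigitalMusicObject))   # hypothetical class
g.add((dmo, DMO.hasEssence, audio))              # hypothetical property
g.add((dmo, DCTERMS.title, Literal("Take 42 (studio)")))
g.add((dmo, DMO.tempo, Literal(120.0, datatype=XSD.float)))

print(g.serialize(format="turtle"))
```

The point of the sketch is structural rather than vocabulary-specific: because the object is a graph, further metadata layers (performance provenance, audio features, score alignments) can be attached to the same essence without repackaging the audio itself.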
