SIMAC: semantic interaction with music audio contents

The SIMAC project addresses the study and development of innovative components for a music information retrieval system. Its key feature is the use of semantic descriptors of musical content that are automatically extracted from music audio files. These descriptors are generated in two ways: as derivations and combinations of lower-level descriptors, and as generalizations induced from manually annotated databases through the intensive application of machine learning. The project also aims to empower music consumption behaviours (i.e., to add value and improve effectiveness), especially those guided by the concept of similarity.
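
The two generation routes mentioned above can be illustrated with a minimal sketch. Everything below is hypothetical: the descriptor names (`tempo_bpm`, `beat_strength`), the "danceability" heuristic, and the tiny annotated database are invented for illustration and do not come from the SIMAC deliverables; the learning route is shown with a simple nearest-centroid classifier standing in for the project's actual machine-learning methods.

```python
import numpy as np

# Route (1): a semantic descriptor derived by combining lower-level
# descriptors. Toy rule: danceability peaks near 120 BPM and scales
# with beat strength. The formula is an illustrative assumption.
def danceability_heuristic(tempo_bpm, beat_strength):
    tempo_score = np.exp(-((tempo_bpm - 120.0) ** 2) / (2 * 30.0 ** 2))
    return float(tempo_score * beat_strength)

# Route (2): a semantic descriptor generalized from a manually
# annotated database, here via a nearest-centroid classifier.
def fit_centroids(features, labels):
    # One centroid (mean feature vector) per annotated class.
    return {lab: features[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def predict(centroids, x):
    # Assign the class whose centroid is closest to the new track.
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

# Tiny hypothetical annotated database: [tempo_bpm, beat_strength] per track.
X = np.array([[125.0, 0.9], [118.0, 0.8], [70.0, 0.2], [60.0, 0.3]])
y = np.array(["danceable", "not_danceable"])[[0, 0, 1, 1]]

centroids = fit_centroids(X, y)
print(predict(centroids, np.array([122.0, 0.85])))  # label for a new track
```

The sketch keeps both routes side by side: the heuristic needs no training data, while the classifier inherits its notion of "danceable" entirely from the annotations, which is why the quality of the manually annotated database is central to the second route.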
