Separation of impulsive acoustical events

Impulsive acoustical events, or impacts, constitute a large family of everyday sounds. Detecting and separating these sounds is an important task in computational auditory scene analysis. In this paper, we propose a method, together with an online architecture, for detecting and separating acoustical impacts that can locate every impulsive event in a continuous data stream. Each impact is separated from the background and from other overlapping impacts, and the result is given as an energy density function over a time-frequency region. Onsets are used to find events, and a prediction-based method is introduced to separate events that overlap in both time and frequency. The method does not rely on spectral peak tracking or harmonic properties and is therefore applicable to a broad class of sounds.
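As a rough illustration of the onset-based event finding mentioned above, the following sketch flags frames whose short-time energy rises sharply relative to the previous frame. All parameter names, the frame sizes, and the log-energy thresholding rule are illustrative assumptions, not the paper's actual detector.

```python
import numpy as np

def detect_onsets(signal, frame_len=256, hop=128, threshold=2.0):
    """Return approximate onset positions (in samples) where the
    short-time energy jumps sharply -- a simple proxy for impacts.
    The parameters and thresholding rule here are illustrative only."""
    # Short-time energy per frame
    n_frames = 1 + (len(signal) - frame_len) // hop
    energy = np.array([
        np.sum(signal[i * hop : i * hop + frame_len] ** 2)
        for i in range(n_frames)
    ])
    # Positive first difference of log-energy; large jumps mark onsets
    log_e = np.log(energy + 1e-12)
    diff = np.diff(log_e)
    onset_frames = np.where(diff > threshold)[0] + 1
    return onset_frames * hop  # convert frame index to sample index

# Synthetic example: quiet background noise with two sharp impacts
rng = np.random.default_rng(0)
x = 0.01 * rng.standard_normal(8000)
x[2000:2100] += 1.0   # first impact
x[5000:5100] += 1.0   # second impact
onsets = detect_onsets(x)
```

A detector like this handles well-separated impacts; the prediction-based separation described in the abstract is what resolves events overlapping in both time and frequency, which a simple energy threshold cannot.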
