Shift-Invariant Probabilistic Latent Component Analysis

In this paper we present a model that can decompose probability densities or count data into a set of shift-invariant components. We begin by introducing a regular latent variable model and subsequently extend it to handle shift invariance in order to model more complex inputs. We develop an expectation-maximization algorithm for estimating the components and present results on challenging real-world data. We show that this approach is a probabilistic generalization of well-known algorithms such as Non-Negative Matrix Factorization and multi-way decompositions, and discuss its advantages over those approaches.
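
As an illustration of the kind of latent variable model the abstract refers to, the sketch below implements EM updates for a basic (non-shift-invariant) two-dimensional decomposition of the form P(x1, x2) = sum_z P(z) P(x1|z) P(x2|z). The function name plca, the NumPy formulation, and the parameter choices are illustrative assumptions, not the paper's exact algorithm.

# Minimal sketch, assuming the standard two-dimensional latent component
# factorization P(x1, x2) = sum_z P(z) P(x1|z) P(x2|z); names and defaults
# are illustrative, not taken from the paper.
import numpy as np

def plca(V, n_components, n_iter=100, rng=None):
    """Decompose a non-negative matrix V (counts or a normalized histogram)
    into P(z), P(x1|z), P(x2|z) via expectation maximization."""
    rng = np.random.default_rng(rng)
    n1, n2 = V.shape

    # Random non-negative initialization, normalized to valid distributions.
    Pz = np.full(n_components, 1.0 / n_components)           # P(z)
    Px1 = rng.random((n1, n_components)); Px1 /= Px1.sum(0)  # P(x1|z)
    Px2 = rng.random((n2, n_components)); Px2 /= Px2.sum(0)  # P(x2|z)

    for _ in range(n_iter):
        # E-step: posterior P(z | x1, x2) for every (x1, x2) cell.
        joint = np.einsum('z,iz,jz->ijz', Pz, Px1, Px2)       # P(z) P(x1|z) P(x2|z)
        joint /= joint.sum(axis=2, keepdims=True) + 1e-12     # normalize over z

        # M-step: weight the posterior by the observed counts and renormalize.
        W = V[:, :, None] * joint                             # V(x1,x2) P(z|x1,x2)
        Pz = W.sum(axis=(0, 1)); Pz /= Pz.sum()
        Px1 = W.sum(axis=1); Px1 /= Px1.sum(axis=0, keepdims=True) + 1e-12
        Px2 = W.sum(axis=0); Px2 /= Px2.sum(axis=0, keepdims=True) + 1e-12

    return Pz, Px1, Px2

In this reading, the M-step is what makes the connection to Non-Negative Matrix Factorization concrete: P(x1|z) and P(x2|z) play the roles of the (column-normalized) basis and activation matrices.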
