Beyond LDA: A Unified Framework for Learning Latent Normalized Infinitely Divisible Topic Models through Spectral Methods

In this paper we propose guaranteed spectral methods for learning a broad range of topic models that generalize the popular Latent Dirichlet Allocation (LDA). We overcome LDA's inability to capture arbitrary topic correlations by assuming that the hidden topic proportions are drawn from a flexible class of Normalized Infinitely Divisible (NID) distributions. An NID distribution is generated by normalizing a family of independent Infinitely Divisible (ID) random variables; the Dirichlet distribution is the special case obtained by normalizing a set of Gamma random variables. We prove that this flexible class of topic models can be learnt via spectral methods using only moments up to the third order, with low-order polynomial sample and computational complexity. The proof rests on a key new technique, derived here, that diagonalizes the moments of the NID distribution through an efficient procedure requiring only univariate integral evaluations, even though the underlying moments are high-dimensional and multivariate.
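To make the NID construction concrete, the minimal Python sketch below (using NumPy; the function name and interface are illustrative assumptions, not the paper's implementation) draws topic proportions as theta_i = X_i / sum_j X_j, where the X_i are independent ID variables. Taking X_i ~ Gamma(alpha_i, 1) recovers the Dirichlet special case noted above.

```python
import numpy as np

def sample_nid_gamma(alpha, n_samples=1, rng=None):
    """Illustrative sketch: draw topic proportions by normalizing
    independent Gamma random variables -- the NID special case that
    coincides with the Dirichlet(alpha) distribution."""
    rng = np.random.default_rng() if rng is None else rng
    alpha = np.asarray(alpha, dtype=float)
    # Independent infinitely divisible (here, Gamma) variables X_1..X_k.
    x = rng.gamma(shape=alpha, size=(n_samples, len(alpha)))
    # Normalization maps them onto the probability simplex:
    # theta_i = X_i / sum_j X_j.
    return x / x.sum(axis=1, keepdims=True)

# Example: five draws of 3-topic proportions; distributionally
# equivalent to draws from Dirichlet([2.0, 1.0, 0.5]).
theta = sample_nid_gamma([2.0, 1.0, 0.5], n_samples=5)
print(theta)              # each row lies on the simplex
print(theta.sum(axis=1))  # rows sum to 1
```

Swapping the Gamma draws for another infinitely divisible family (for instance, inverse-Gaussian variables, which give the normalized inverse-Gaussian prior) produces NID topic proportions outside the Dirichlet family.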
