Spectral Methods for Correlated Topic Models

In this paper, we propose guaranteed spectral methods for learning a broad class of topic models that generalizes the popular Latent Dirichlet Allocation (LDA). We overcome LDA's inability to capture arbitrary topic correlations by assuming that the hidden topic proportions are drawn from a flexible class of Normalized Infinitely Divisible (NID) distributions. An NID distribution is obtained by normalizing a family of independent Infinitely Divisible (ID) random variables; the Dirichlet distribution is the special case obtained by normalizing independent Gamma random variables. We prove that this flexible class of topic models can be learned via spectral methods using only moments up to the third order, with (low-order) polynomial sample and computational complexity. The proof rests on a key new technique that diagonalizes the moments of the NID distribution through an efficient procedure requiring only univariate integrals, even though the moments themselves are high-dimensional and multivariate. To assess the performance of the proposed Latent NID topic model, we use two real datasets of articles collected from the New York Times and PubMed. On both datasets, our model achieves lower (better) perplexity than the baseline.
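To make the NID construction concrete, the following minimal sketch (Python/NumPy, not the authors' code) draws topic proportions on the simplex by normalizing independent ID random variables: the Gamma case recovers the Dirichlet, while another ID family, the inverse-Gaussian, yields a non-Dirichlet member of the NID class. Function names and parameterizations here are illustrative assumptions, not those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_dirichlet_via_nid(alpha, n_samples=1):
    # Draw independent Gamma(alpha_k, 1) variables and normalize by their sum:
    # this recovers the Dirichlet(alpha) distribution on the simplex.
    x = rng.gamma(shape=np.asarray(alpha), size=(n_samples, len(alpha)))
    return x / x.sum(axis=1, keepdims=True)

def sample_normalized_inverse_gaussian(mu, lam=1.0, n_samples=1):
    # A non-Dirichlet NID example: normalize independent inverse-Gaussian
    # draws (NumPy's 'wald' sampler). The parameterization is illustrative only.
    x = rng.wald(mean=np.asarray(mu), scale=lam, size=(n_samples, len(mu)))
    return x / x.sum(axis=1, keepdims=True)

if __name__ == "__main__":
    # Topic proportions for 5 documents over 3 topics.
    print(sample_dirichlet_via_nid([0.5, 1.0, 2.0], n_samples=5))
    print(sample_normalized_inverse_gaussian([0.5, 1.0, 2.0], n_samples=5))
```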
