The uncovering of hidden structures by Latent Semantic Analysis

Latent Semantic Analysis (LSA) is a well-known method for information retrieval. It has also been applied as a model of cognitive processing and word-meaning acquisition. This dual importance of LSA derives from its capacity to modulate the meaning of words by context, dealing successfully with polysemy and synonymy. The underlying reasons why the method works, however, are not fully understood. We propose that the method works because it detects an underlying block structure (the blocks corresponding to topics) in the term-by-document matrix; in real cases this block structure is hidden by perturbations. We further propose that the correct explanation for LSA must be sought in the structure of the singular vectors rather than in the profile of the singular values. Using Perron-Frobenius theory, we show that the presence of disjoint blocks of documents is marked by singular vectors with sign-homogeneous entries on the documents of one block and zeros elsewhere. For nearly disjoint blocks, perturbation theory shows that if the perturbations are small, the zeros in the leading vectors are replaced by small numbers (pseudo-zeros). Since the singular values of the blocks may differ greatly in magnitude, their order need not mirror the order of the blocks. When the norms of the blocks are similar, LSA works well; but when the topics have different sizes, we propose that the usual procedure of selecting the first k singular triplets (k being the number of blocks) be replaced by a method that selects the perturbed Perron vector of each block.
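The following is a minimal numerical sketch, not the authors' code, of the mechanism described above. It builds a small two-block term-by-document matrix (the block sizes, weight ranges, and perturbation level eps are illustrative assumptions) and inspects the leading right singular vectors: with disjoint blocks each vector is sign-homogeneous on the documents of one block and exactly zero elsewhere; after a small perturbation the zeros become pseudo-zeros.

```python
# Sketch (assumed data, not the paper's code): block structure in the
# singular vectors of a term-by-document matrix, before and after perturbation.
import numpy as np

rng = np.random.default_rng(0)

# Two disjoint topics: terms 0-3 appear only in documents 0-4 (block 1),
# terms 4-7 only in documents 5-9 (block 2). Block 1 is given heavier weights
# so the two leading singular values are well separated and each leading
# right singular vector is the Perron vector of one block.
A = np.zeros((8, 10))
A[:4, :5] = rng.uniform(2, 4, size=(4, 5))   # block 1 (heavier topic)
A[4:, 5:] = rng.uniform(1, 2, size=(4, 5))   # block 2 (lighter topic)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# Each vector is sign-homogeneous (up to an overall sign) on one block of
# documents and exactly zero on the other, as Perron-Frobenius theory predicts.
print("v1 (disjoint blocks):", np.round(Vt[0], 3))   # nonzero on docs 0-4 only
print("v2 (disjoint blocks):", np.round(Vt[1], 3))   # nonzero on docs 5-9 only

# Perturb the matrix so the blocks are only nearly disjoint: every term now
# co-occurs weakly with every document.
eps = 0.05                                    # assumed small perturbation level
Ap = A + eps * rng.uniform(0, 1, size=A.shape)
Up, sp, Vtp = np.linalg.svd(Ap, full_matrices=False)
print("v1 (perturbed):      ", np.round(Vtp[0], 3))  # zeros -> pseudo-zeros
print("v2 (perturbed):      ", np.round(Vtp[1], 3))
```

Note that the clear separation here depends on the blocks having different norms with well-separated leading singular values; if a heavy topic's second singular value exceeded a light topic's first, taking the first k triplets would miss a block, which is the situation where the abstract's proposal of selecting the perturbed Perron vector of each block matters.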
