We describe a probabilistic approach to the task of placing objects, described by high-dimensional vectors or by pairwise dissimilarities, in a low-dimensional space in a way that preserves neighbor identities. A Gaussian is centered on each object in the high-dimensional space and the densities under this Gaussian (or the given dissimilarities) are used to define a probability distribution over all the potential neighbors of the object. The aim of the embedding is to approximate this distribution as well as possible when the same operation is performed on the low-dimensional "images" of the objects. A natural cost function is a sum of Kullback-Leibler divergences, one per object, which leads to a simple gradient for adjusting the positions of the low-dimensional images. Unlike other dimensionality reduction methods, this probabilistic framework makes it easy to represent each object by a mixture of widely separated low-dimensional images. This allows ambiguous objects, like the document count vector for the word "bank", to have versions close to the images of both "river" and "finance" without forcing the images of outdoor concepts to be located close to those of corporate concepts.
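As a concrete illustration of the procedure the abstract describes, the sketch below implements the basic loop in NumPy: Gaussian-based conditional probabilities in the high-dimensional space, matching probabilities over the low-dimensional images, and gradient descent on the sum of per-object Kullback-Leibler divergences. It is a minimal sketch, not the authors' code: the function name `sne`, the single fixed Gaussian width `sigma` (the paper chooses a per-object variance), the learning rate, and the plain gradient step without momentum are assumptions made for brevity, and the mixture-of-images extension mentioned at the end of the abstract is omitted.

```python
import numpy as np

def sne(X, n_components=2, sigma=1.0, lr=0.1, n_iter=500, seed=0):
    """Minimal Stochastic Neighbor Embedding sketch with a fixed Gaussian width."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)

    # High-dimensional conditional probabilities p_{j|i}: densities under a
    # Gaussian centered on each object x_i, normalized over potential neighbors.
    d2 = np.square(X[:, None, :] - X[None, :, :]).sum(-1)
    P = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(P, 0.0)
    P /= P.sum(axis=1, keepdims=True)

    # Low-dimensional images, adjusted by gradient descent on the cost
    # C = sum_i KL(P_i || Q_i).
    Y = 1e-4 * rng.standard_normal((n, n_components))
    for _ in range(n_iter):
        # q_{j|i} uses a Gaussian of fixed variance 1/2 in the embedding space.
        d2y = np.square(Y[:, None, :] - Y[None, :, :]).sum(-1)
        Q = np.exp(-d2y)
        np.fill_diagonal(Q, 0.0)
        Q /= Q.sum(axis=1, keepdims=True)

        # Gradient: dC/dy_i = 2 * sum_j (p_{j|i} - q_{j|i} + p_{i|j} - q_{i|j}) (y_i - y_j)
        PQ = (P - Q) + (P - Q).T
        grad = 2.0 * (PQ[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
        Y -= lr * grad
    return Y
```

Calling `sne(X)` on an (n, d) data matrix returns an (n, 2) array of low-dimensional images; in practice the per-object variances would be set by a perplexity search and the optimization stabilized with momentum and jitter, as described in the paper.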