Fast Convergent Factorial Learning of the Low-Dimensional Independent Manifolds in Optical Imaging Data

In many functional-imaging scenarios, it is a challenge to separate the response to stimulation from the other, presumably independent, sources that contribute to the image formation. When the brain is optically imaged, the typical variabilities of some of these sources force the data to lie close to a low-dimensional, nonlinear manifold. When an initial probability model is derived by the Karhunen-Loève Transform (KLT) of the data, and some factors of this manifold happen to be accessibly embedded in suitably chosen KLT subspaces, vector quantization has been used to characterize this embedding as the locus of maximum likelihood of the data, and to derive an improved probability model in which the factors (the dynamics on this locus and away from it) are estimated independently. Here we show that such a description can serve as the starting point for a convergent procedure that alternately refines the estimates of the embedding of, and the dynamics on, the manifold. Further, we show that even a very crude initial estimate, from a heavily mixed subspace, is sufficient for convergence in a small number of steps. This opens the possibility of hierarchical semi-blind separation of the independent sources in optical imaging data, even when their contributions are nonlinear.
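To make the pipeline concrete, the sketch below illustrates the three ingredients named above on synthetic data: a KLT (PCA) projection of the frames, an LBG-style vector quantizer that traces the low-dimensional locus inside the retained subspace, and an alternating loop that refines the codebook (the embedding) together with the code assignments and residuals (the dynamics on and off the manifold). All names, parameter values, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "imaging" frames: points near a 1-D nonlinear curve embedded in a
# high-dimensional pixel space, plus independent sensor noise (an assumption
# standing in for real optical-imaging data).
n_frames, n_pixels = 500, 64
t = rng.uniform(0.0, 2.0 * np.pi, n_frames)
mixing = rng.standard_normal((3, n_pixels))
frames = np.column_stack([np.cos(t), np.sin(t), np.cos(2 * t)]) @ mixing
frames += 0.1 * rng.standard_normal((n_frames, n_pixels))

# Step 1: KLT (PCA) of the data gives the initial linear probability model;
# the first k eigenvectors define the retained subspace.
mean = frames.mean(axis=0)
centered = frames - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 3
coords = centered @ vt[:k].T          # coordinates within the KLT subspace

# Step 2: vector quantization (LBG / Lloyd iterations) characterizes the
# curved manifold inside that subspace as the locus of maximum likelihood.
def lloyd_refine(x, codebook, n_iter):
    """A few Lloyd iterations refining an existing codebook."""
    labels = np.zeros(len(x), dtype=int)
    for _ in range(n_iter):
        dists = np.linalg.norm(x[:, None, :] - codebook[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(codebook)):
            members = x[labels == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook, labels

n_codes = 16
codebook = coords[rng.choice(n_frames, n_codes, replace=False)].copy()
codebook, labels = lloyd_refine(coords, codebook, n_iter=20)

# Step 3: alternately refine the embedding (the codebook locus) and the
# dynamics (code assignments on the locus, residuals away from it) until the
# factorial description stops changing.
for sweep in range(10):
    on_manifold = codebook[labels]            # dynamics on the locus
    residual = coords - on_manifold           # dynamics away from the locus
    off_manifold_var = residual.var()         # independent off-manifold factor
    codebook, new_labels = lloyd_refine(coords, codebook, n_iter=3)
    if np.array_equal(new_labels, labels):    # crude convergence criterion
        break
    labels = new_labels

print(f"stabilized after {sweep + 1} sweep(s); "
      f"off-manifold variance ≈ {off_manifold_var:.4f}")
```

In this schematic, the Lloyd centroid step plays the role of re-estimating the embedding, while the assignment and residual steps re-estimate the on- and off-manifold dynamics; the paper carries out this alternation within an explicit probability model, which the sketch does not attempt to reproduce.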
