Distribution Approximation: An In-Place Developmental Algorithm for Sensory Cortices and a Hypothesis

Orientation-selective cells in V1 are well known, but the computational principles underlying their emergence (i.e., development) are still elusive. The results reported here are based on approximating high-dimensional probability distributions with a network structure. We introduce a concept called the lobe component. Each lobe component represents a high concentration of probability density of the high-dimensional sensory inputs. A developmental algorithm for sensory networks has been designed and tested based on the concept of lobe components. Our simulations of network development revealed that many of the lobe component cells developed by the algorithm from natural images are orientation-selective cells similar to those found in V1. We therefore hypothesize that orientation selectivity is merely a natural consequence of cortical development, while probability density estimation by neurons is a fundamental principle of cortical development. If this hypothesis is true, the principle of neuronal probability approximation may have deep implications for the development and self-organization of other sensory cortices (e.g., auditory and somatosensory cortices) and higher cortices. The proposed developmental algorithm has not been biologically verified, but it is biologically plausible: it is based on two well-known computational neural mechanisms, Hebbian learning and lateral inhibition. It is an in-place developmental algorithm in which every cell is both a signal processor and a developer of the cortex; no separate network is needed to handle development (and learning). The algorithm is purely incremental in the sense that it requires no storage for information beyond what can be stored and incrementally computed by the processing network itself.
To clarify the technical contribution, four types of algorithms are considered, with progressively more restrictive conditions: batch (Type-1), block-incremental (Type-2), incremental (Type-3), and covariance-free incremental (Type-4). The proposed Candid Covariance-free Incremental (CCI) Lobe Component Analysis (LCA) algorithm appears to be the first Type-4 algorithm for independent component analysis (ICA). Preliminary simulation studies showed that it outperformed several well-known, state-of-the-art Type-1 through Type-3 ICA algorithms in convergence rate and computation time on high-dimensional data streams. The proposed LCA algorithm also has a simple structure, with the lowest possible space and time complexities.
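The two mechanisms named above can be sketched concretely. The following is a minimal, hypothetical NumPy illustration (not the authors' verified implementation): lateral inhibition is modeled as winner-take-all competition among cells, and the Hebbian update is a covariance-free incremental average of response-weighted inputs, so each cell's weight vector is the only storage the "network" needs. The amnesic weighting used in the actual CCI algorithms is omitted here for brevity, leaving a plain running average.

```python
import numpy as np

def lca_update(v, x, n):
    """One incremental Hebbian update for the winning cell.

    v -- current lobe component (weight vector) of the winner
    x -- new sensory input sample
    n -- number of samples the winner has seen so far (including x)
    Plain running average of the response-weighted input y * x;
    no covariance matrix is ever formed (covariance-free, Type-4 style).
    """
    y = np.dot(v, x) / (np.linalg.norm(v) + 1e-12)  # Hebbian response of the cell
    return ((n - 1) / n) * v + (1 / n) * y * x      # incremental average of y * x

def lca_step(V, counts, x):
    """Winner-take-all step: lateral inhibition picks one cell to update."""
    responses = V @ x / (np.linalg.norm(V, axis=1) + 1e-12)
    j = int(np.argmax(np.abs(responses)))           # single winner suppresses the rest
    counts[j] += 1
    V[j] = lca_update(V[j], x, counts[j])
    return j

# Toy run: 4 cells developing from 200 random 8-dimensional "inputs".
rng = np.random.default_rng(0)
V = rng.standard_normal((4, 8))                     # initial lobe components
counts = np.ones(4)                                 # per-cell sample counters
for _ in range(200):
    lca_step(V, counts, rng.standard_normal(8))
```

Note the in-place character of the sketch: the rows of `V` are simultaneously the processing weights (used to compute responses) and the only state updated during development, which is the sense in which every cell is both a signal processor and a developer.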
