The Generalized Spike Process, Sparsity, and Statistical Independence

A basis under which a given set of realizations of a stochastic process can be represented most sparsely (the so-called best sparsifying basis (BSB)) and a basis under which such a set becomes as statistically independent as possible (the so-called least statistically-dependent basis (LSDB)) are important for data compression and have generated interest among computational neuroscientists as well as applied mathematicians. Here we consider these bases for a particularly simple stochastic process, the ``generalized spike process,'' which, for each realization, places a single spike--whose amplitude is sampled from the standard normal distribution--at a random location in the zero vector of length $n$. Unlike the ``simple spike process,'' which we dealt with in our previous paper and whose spike amplitude is constant, here we must consider the kurtosis-maximizing basis (KMB) instead of the LSDB, because the differential entropy and mutual information of the generalized spike process are difficult to evaluate. By computing the marginal densities and moments, we prove that: 1) both the BSB and the KMB select the standard basis if we restrict our basis search to all possible orthonormal bases in ${\mathbb R}^n$; 2) if we extend our basis search to all possible volume-preserving invertible linear transformations, then the BSB still exists and is again the standard basis, whereas the KMB does not exist. Thus, the KMB is rather sensitive to the orthonormality of the transformations under consideration, whereas the BSB is insensitive to it. Our results, like those of our previous work, once again support the preference for the BSB over the LSDB/KMB in data compression applications.
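For clarity, a minimal formalization of the generalized spike process described above might read as follows; the random variables $A$ and $J$ and the notation $\mathbf{e}_J$ are introduced here purely for illustration and do not appear in the abstract itself:
\[
\mathbf{X} \;=\; A\,\mathbf{e}_{J}, \qquad A \sim \mathcal{N}(0,1), \qquad J \sim \mathrm{Uniform}\{1,\dots,n\}, \qquad A \text{ and } J \text{ independent},
\]
where $\mathbf{e}_{j}$ denotes the $j$th standard basis vector of ${\mathbb R}^n$. Under this reading, the simple spike process of the previous paper corresponds to replacing the Gaussian amplitude $A$ by a constant.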
