Fast and Adaptive Low-Pass Whitening Filters for Natural Images

A fast and simple method is proposed to reduce inter-pixel correlations in natural images, whose power spectra roughly fall off with increasing spatial frequency f according to a power law of the form 1/f^α, where the exponent α varies from image to image. The essence of the method is to flatten the decaying power spectrum of each image with an adaptive combined low-pass and whitening filter. The low-pass component suppresses noise, which typically resides at the high spatial frequencies. The whitening component attenuates the low frequencies and boosts the high frequencies so that the resulting power spectrum is roughly flat across all spatial frequencies. The proposed method is computationally more economical than the covariance-matrix-based PCA method, while the performance loss that accompanies this computational saving is fairly insignificant.
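A minimal sketch of this idea in the Fourier domain is given below: the per-image exponent α is estimated by a log-log least-squares fit to the power spectrum, the amplitudes are multiplied by f^(α/2) to flatten the spectrum, and a smooth roll-off suppresses the noisiest high frequencies. The filter form f^(α/2)·exp(-(f/f0)^4), the cutoff fraction f0_frac, and the function name adaptive_whiten are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def adaptive_whiten(image, f0_frac=0.4, lp_order=4):
    """Flatten a roughly 1/f^alpha power spectrum with a combined
    low-pass / whitening filter applied in the Fourier domain.

    f0_frac  : low-pass cutoff as a fraction of the Nyquist frequency
               (illustrative value, not taken from the paper).
    lp_order : steepness of the low-pass roll-off (assumed).
    """
    h, w = image.shape
    F = np.fft.fft2(image - image.mean())

    # Radial spatial-frequency grid in cycles per image, laid out to
    # match fft2 (frequency 0 at the array origin).
    fy = np.fft.fftfreq(h) * h
    fx = np.fft.fftfreq(w) * w
    f = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)

    # Estimate the per-image exponent alpha of P(f) ~ 1/f^alpha from a
    # least-squares fit of log power against log frequency.
    power = np.abs(F) ** 2
    mask = (f > 0) & (f < min(h, w) / 2)       # ignore DC and corner bins
    alpha = -np.polyfit(np.log(f[mask]), np.log(power[mask]), 1)[0]

    # Adaptive low-pass whitening filter: f^(alpha/2) flattens the power
    # spectrum; exp(-(f/f0)^lp_order) rolls off the noisy high frequencies.
    f0 = f0_frac * min(h, w) / 2
    W = np.where(f > 0, f, 1.0) ** (alpha / 2.0) * np.exp(-(f / f0) ** lp_order)
    W[0, 0] = 0.0                              # remove the DC component

    return np.real(np.fft.ifft2(F * W)), alpha
```

On a 2-D grayscale array, a call such as `whitened, alpha = adaptive_whiten(img)` would return the filtered image together with the fitted per-image exponent.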
