Neural network model for standard PCA and its variants applied to remote sensing

The conventional approach to principal component analysis (PCA) and its variants in remote sensing involves computing the covariance/correlation matrix of the input data (and/or of the noise) and then applying diagonalization procedures to extract the eigenvalues and corresponding eigenvectors. As the data dimension grows, these matrix computations and manipulations become inefficient and inaccurate due to round-off errors. In addition, all the eigenvalues and their corresponding eigenvectors have to be evaluated, even when only a few are needed. These deficiencies make the conventional scheme poorly suited to remote sensing applications. We therefore propose a neural network model that performs PCA and its variants directly from the original data, without any additional non-neuronal computation or preliminary matrix estimation. Since the end user is usually not a neural network specialist, both the model and its execution are designed to be automated as far as possible. This covers the design of the network topology and the input/output representation, as well as the training algorithms. The global convergence of the model is studied, and the model is applied to Landsat Thematic Mapper (TM) multispectral data. The results obtained show that the model performs well.
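To make the idea concrete, the sketch below shows one classical neural PCA rule of the kind the abstract alludes to, Sanger's Generalized Hebbian Algorithm (a single-layer linear network trained sample by sample, so the covariance matrix is never formed). This is an illustrative assumption, not the paper's exact model; the function name and parameters are hypothetical.

```python
import numpy as np

def generalized_hebbian_pca(X, n_components, lr=1e-3, epochs=200, seed=0):
    """Estimate the leading principal components of X with Sanger's
    Generalized Hebbian Algorithm (GHA).

    The weights are updated one sample at a time, so the data
    covariance matrix is never explicitly computed or diagonalized.
    Rows of the returned matrix converge to the leading eigenvectors
    of the data covariance, ordered by decreasing eigenvalue.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, n_features))
    Xc = X - X.mean(axis=0)                  # center the data once
    for _ in range(epochs):
        for x in Xc:
            y = W @ x                        # component activations
            # Sanger's rule: Hebbian term minus a lower-triangular
            # decorrelation term that orders and orthogonalizes the rows
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W
```

For a multispectral image, each pixel's band vector would be fed as a training sample `x`; projecting the centered pixels onto the learned rows of `W` then yields the principal component images, without ever assembling the band covariance matrix.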
