Prediction Improvement via Smooth Component Analysis and Neural Network Mixing

In this paper we derive a novel smooth component analysis algorithm applied to prediction improvement. When many prediction models are tested, we can treat their results as a multivariate variable whose latent components have a constructive or destructive impact on the prediction results. Filtering out the destructive components and properly mixing the constructive ones should improve the final prediction. The filtration can be performed by neural networks whose initial weights are computed from smooth component analysis. The validity and high performance of our concept are demonstrated on the real-world problem of energy load prediction.
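The scheme described above — stack many models' predictions, decompose them into latent components, drop the destructive (rough) ones, and re-mix the rest — can be sketched as follows. The excerpt does not give the smooth component analysis update itself, so this illustration substitutes ordinary PCA for the decomposition and a first-difference smoothness score to flag destructive components; the signal, noise levels, and keep/drop threshold are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 4                          # n time samples, m prediction models

t = np.linspace(0, 4 * np.pi, n)
target = np.sin(t)                     # quantity every model tries to predict

# Each model's output = target + model-specific noise (a destructive component).
P = np.column_stack([target + 0.3 * rng.standard_normal(n) for _ in range(m)])

# Decomposition step: centre the ensemble and extract latent components.
# (Plain PCA via SVD stands in for the paper's smooth component analysis.)
mean = P.mean(axis=0)
X = P - mean
_, _, Vt = np.linalg.svd(X, full_matrices=False)
S = X @ Vt.T                           # latent components, one per column

# Smoothness criterion: smooth (constructive) components change slowly in time,
# so their normalised first-difference energy is small.
smoothness = np.mean(np.diff(S, axis=0) ** 2, axis=0) / np.var(S, axis=0)

# Filtration: keep the smoothest half, zero out the rough (destructive) half.
keep = smoothness < np.median(smoothness)
P_hat = (S * keep) @ Vt + mean         # mix the surviving components back

mse_before = float(np.mean((P - target[:, None]) ** 2))
mse_after = float(np.mean((P_hat - target[:, None]) ** 2))
print(f"per-model MSE before filtering: {mse_before:.4f}, after: {mse_after:.4f}")
```

The shared sine signal dominates the smoothest component, while each model's private noise lands in rough components, so filtering lowers every model's error; in the paper this fixed threshold is replaced by a neural network mixer initialised from the decomposition.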
