The Parzen kernel approach to learning in non-stationary environment

In this paper, a method for nonparametric regression estimation in a non-stationary environment is presented. Parzen kernels are used to design recursive general regression neural networks that track changes of a non-stationary system under non-stationary noise. The probabilistic properties of the proposed method are investigated, and experimental results are presented and discussed.
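The recursive construction can be illustrated with a Nadaraya-Watson-style kernel regression estimate whose numerator and denominator sums are updated as each observation (x_t, y_t) arrives from the stream. The sketch below is a minimal illustration of that idea only; the Gaussian (Parzen) kernel, the decaying bandwidth h_t = h0 * t^(-alpha), the simple 1/t averaging, and the fixed evaluation grid are assumptions made for demonstration, not the exact estimator or bandwidth schedule analyzed in the paper.

```python
import numpy as np

class RecursiveKernelRegressor:
    """Nadaraya-Watson-style regression estimate updated recursively
    from a stream of observations (x_t, y_t).

    Minimal sketch under stated assumptions: Gaussian Parzen kernel,
    bandwidth h_t = h0 * t**(-alpha), estimate maintained on a fixed grid.
    """

    def __init__(self, grid, h0=1.0, alpha=0.3):
        self.grid = np.asarray(grid, dtype=float)  # points where the estimate is tracked
        self.h0 = h0
        self.alpha = alpha
        self.t = 0
        self.num = np.zeros_like(self.grid)   # running kernel-weighted sum of y
        self.den = np.zeros_like(self.grid)   # running sum of kernel weights

    def _kernel(self, u):
        # Gaussian (Parzen) kernel
        return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

    def update(self, x, y):
        """Incorporate one new observation from the data stream."""
        self.t += 1
        h = self.h0 * self.t ** (-self.alpha)        # bandwidth shrinks with t
        w = self._kernel((self.grid - x) / h) / h     # kernel weights on the grid
        # recursive (running-average) update of numerator and denominator
        self.num += (y * w - self.num) / self.t
        self.den += (w - self.den) / self.t

    def predict(self):
        """Current regression estimate on the grid."""
        return np.where(self.den > 0, self.num / np.maximum(self.den, 1e-12), 0.0)


# usage: track a slowly drifting regression function under growing noise variance
rng = np.random.default_rng(0)
model = RecursiveKernelRegressor(grid=np.linspace(-3, 3, 61))
for t in range(1, 5001):
    x = rng.uniform(-3, 3)
    y = np.sin(x) + 0.001 * t + rng.normal(scale=0.2 + 0.0001 * t)  # non-stationary noise
    model.update(x, y)
print(model.predict()[::10])
```

With the 1/t averaging used here, all past observations receive equal weight; replacing it with a decaying learning sequence or forgetting factor would emphasize recent data and track faster drift, which is closer in spirit to estimators designed for time-varying regression.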
