Recurrent Broad Learning Systems for Time Series Prediction

The broad learning system (BLS) is an emerging approach for effective and efficient modeling of complex systems. Inputs are mapped into feature nodes and then fed into enhancement nodes for nonlinear transformation. The structure of a BLS can be expanded in width, and incremental learning algorithms allow such broad expansion to be trained quickly. Building on the standard BLS, this paper proposes a novel recurrent BLS (RBLS). The nodes in the enhancement units are recurrently connected in order to capture the dynamic characteristics of a time series, and a sparse autoencoder is used to extract features from the input in place of randomly initialized weights. In this way, the RBLS retains the BLS's merit of fast computation while being well suited to sequential data. Motivated by the idea of "fine-tuning" in deep learning, the weights of the RBLS can be updated by conjugate gradient methods when the prediction errors are large. We demonstrate the merits of the proposed model on several chaotic time series. Experimental results substantiate the effectiveness of the RBLS: it achieves very small errors on the chaotic benchmark datasets and satisfactory performance on the real-world dataset.
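
The sketch below illustrates the pipeline the abstract describes: inputs are mapped to feature nodes, passed through recurrently connected enhancement nodes, and the output weights are obtained in closed form by ridge regression. All dimensions and variable names are illustrative assumptions, and random weights stand in for the sparse-autoencoder features and for any conjugate gradient fine-tuning; this is not the authors' implementation.

```python
# Minimal RBLS-style forward pass and readout (illustrative sketch).
# Assumptions: W_f would come from a trained sparse autoencoder (random here),
# enhancement nodes are recurrently connected across time steps, and the
# readout is solved by ridge regression, as in the usual BLS.
import numpy as np

rng = np.random.default_rng(0)

T, d_in = 500, 8            # time steps, input dimension per step
n_feat, n_enh = 20, 30      # numbers of feature and enhancement nodes
lam = 1e-3                  # ridge regularization

X = rng.standard_normal((T, d_in))    # input sequence (e.g. delay-embedded series)
y = rng.standard_normal((T, 1))       # prediction targets (placeholder data)

W_f = rng.standard_normal((d_in, n_feat))          # stand-in for sparse-autoencoder weights
W_e = rng.standard_normal((n_feat, n_enh)) * 0.1   # feature -> enhancement weights
W_r = rng.standard_normal((n_enh, n_enh)) * 0.1    # recurrent enhancement -> enhancement weights

Z = X @ W_f                            # feature nodes (linear mapping of the inputs)

H = np.zeros((T, n_enh))               # enhancement nodes with recurrence over time
h_prev = np.zeros(n_enh)
for t in range(T):
    h_prev = np.tanh(Z[t] @ W_e + h_prev @ W_r)    # recurrence captures temporal dynamics
    H[t] = h_prev

A = np.hstack([Z, H])                  # concatenated feature and enhancement nodes
# Closed-form ridge regression for the output weights: fast, non-iterative training
W_out = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

y_hat = A @ W_out                      # one-step-ahead predictions
print("training RMSE:", float(np.sqrt(np.mean((y_hat - y) ** 2))))
```

In the paper's scheme, the weights would additionally be fine-tuned by a conjugate gradient method when the prediction errors remain large; the sketch omits that refinement step.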
