Long-Short Term Echo State Network for Time Series Prediction

The echo state network (ESN) is an efficient recurrent neural network consisting of a randomly generated reservoir (a large pool of neurons with sparse random recurrent connections) and a trainable linear readout. It has received widespread attention for its simplicity and effectiveness, especially in time series prediction tasks. However, ESNs have no explicit mechanism for capturing the inherent multi-scale characteristics of time series. To this end, we propose a multi-reservoir model named long-short term echo state networks (LS-ESNs) to capture the multi-scale temporal characteristics of time series. Specifically, LS-ESNs consists of three independent reservoirs, each with recurrent connections operating at a specific time scale to model the corresponding temporal dependencies of the time series. The multi-scale echo states are collected from the reservoirs and concatenated, and the concatenated representation is fed to a linear regression layer to produce the prediction. Experiments on two time series prediction benchmark data sets and a real-world power load data set demonstrate the effectiveness of the proposed LS-ESNs.
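To make the multi-reservoir idea concrete, the following is a minimal sketch, not the authors' implementation. It assumes one plausible way to realize different time scales: each reservoir's state at time t is driven by the state at time t - skip, with a different skip per reservoir, and the concatenated echo states are mapped to the target by a ridge (Tikhonov-regularized) readout. Names such as `MultiScaleESN`, `skips`, and `ridge` are illustrative.

```python
# Minimal sketch of a multi-reservoir ESN in the spirit of LS-ESNs (assumptions noted above).
import numpy as np


class Reservoir:
    def __init__(self, n_in, n_res, skip=1, spectral_radius=0.9,
                 sparsity=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.skip = skip
        self.W_in = rng.uniform(-0.1, 0.1, (n_res, n_in))
        W = rng.uniform(-1, 1, (n_res, n_res))
        W *= rng.random((n_res, n_res)) < sparsity            # sparse recurrent weights
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale for echo state property
        self.W = W

    def run(self, U):
        """Collect echo states for an input sequence U of shape (T, n_in)."""
        T = U.shape[0]
        X = np.zeros((T, self.W.shape[0]))
        for t in range(T):
            # Recurrence over the reservoir's own time scale (skip steps back).
            prev = X[t - self.skip] if t >= self.skip else np.zeros_like(X[0])
            X[t] = np.tanh(self.W_in @ U[t] + self.W @ prev)
        return X


class MultiScaleESN:
    """Three reservoirs with different recurrence time scales; their echo
    states are concatenated and mapped to the target by ridge regression."""

    def __init__(self, n_in, n_res, skips=(1, 5, 10), ridge=1e-6):
        self.reservoirs = [Reservoir(n_in, n_res, skip=s, seed=i)
                           for i, s in enumerate(skips)]
        self.ridge = ridge
        self.W_out = None

    def _states(self, U):
        # Concatenate the multi-scale echo states along the feature axis.
        return np.hstack([r.run(U) for r in self.reservoirs])

    def fit(self, U, y):
        X = self._states(U)
        # Tikhonov-regularized least squares for the linear readout.
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.W_out = np.linalg.solve(A, X.T @ y)
        return self

    def predict(self, U):
        return self._states(U) @ self.W_out


# Usage example: one-step-ahead prediction on a toy sine series.
t = np.linspace(0, 50, 1000)
series = np.sin(t).reshape(-1, 1)
model = MultiScaleESN(n_in=1, n_res=100).fit(series[:-1], series[1:])
pred = model.predict(series[:-1])
```

Only the readout weights `W_out` are trained; the reservoir weights stay fixed after random initialization, which is what keeps training as cheap as a single linear regression.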
