Model-size reduction for reservoir computing by concatenating internal states through time

Reservoir computing (RC) is a machine learning framework that can learn complex time series from data very rapidly by exploiting high-dimensional dynamical systems, such as random networks of neurons, called "reservoirs." To deploy RC in edge computing, it is essential to reduce the computational resources it requires. In this study, we propose methods that reduce the size of the reservoir by feeding past or drifting states of the reservoir into the output layer at the current time step. To elucidate the mechanism of this model-size reduction, we analyze the proposed methods using the information processing capacity introduced by Dambre et al. (Sci Rep 2:514, 2012). We also evaluate their effectiveness on two time-series prediction tasks: the generalized Hénon map and NARMA. On these tasks, the proposed methods reduced the reservoir size to as little as one tenth of the original without a substantial increase in regression error.
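The core idea of concatenating past reservoir states into the readout can be illustrated with a minimal echo state network. The sketch below is illustrative only and not the paper's implementation: all parameter values (reservoir size, number of delays, spectral radius, ridge coefficient) and the simplified NARMA-like target are assumptions chosen for a self-contained demo. A small reservoir is driven by an input sequence, the current state is concatenated with several past states, and a ridge-regression readout is trained on the concatenated features.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (not from the paper)
N = 20        # small reservoir size
DELAYS = 5    # number of reservoir states concatenated into the readout
T = 1000      # length of the time series

# Random reservoir rescaled to spectral radius 0.9
W = rng.normal(size=(N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()
W_in = rng.uniform(-0.5, 0.5, size=N)

# Input and a simple NARMA-like nonlinear target (illustrative)
u = rng.uniform(0.0, 0.5, size=T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.4 * y[t-1] + 0.4 * y[t-1] * y[t-2] + 0.6 * u[t-1] ** 3 + 0.1

# Drive the reservoir
X = np.zeros((T, N))
for t in range(1, T):
    X[t] = np.tanh(W @ X[t-1] + W_in * u[t])

# Concatenate the current and past reservoir states through time:
# the readout at time t sees x(t), x(t-1), ..., x(t-DELAYS+1)
feats = np.hstack([np.roll(X, d, axis=0) for d in range(DELAYS)])

washout = 50  # discard initial transient (also removes np.roll wrap-around)
A, b = feats[washout:], y[washout:]

# Ridge-regression readout: the only trained part of the network
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(A.shape[1]), A.T @ b)
nmse = np.mean((A @ W_out - b) ** 2) / np.var(b)
print(f"in-sample NMSE with {N} neurons x {DELAYS} delays: {nmse:.4f}")
```

With the delays, a reservoir of 20 neurons presents 100 features to the readout, mimicking the effective dimensionality of a much larger reservoir while keeping the recurrent computation small.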

[1] Narayan Srinivasa, et al., Learning to Recognize Actions From Limited Training Examples Using a Recurrent Spiking Neural Model, 2017, Front. Neurosci.

[2] Rik Van de Walle, et al., Real-Time Reservoir Computing Network-Based Systems for Detection Tasks on Visual Contents, 2015, 7th International Conference on Computational Intelligence, Communication Systems and Networks.

[3] Harald Haas, et al., Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication, 2004, Science.

[4] Takashi Morie, et al., A Chaotic Boltzmann Machine Working as a Reservoir and Its Analog VLSI Implementation, 2019, International Joint Conference on Neural Networks (IJCNN).

[5] Peter Tiño, et al., Minimum Complexity Echo State Network, 2011, IEEE Transactions on Neural Networks.

[6] Takashi Morie, et al., Reservoir Computing Based on Dynamics of Pseudo-Billiard System in Hypercube, 2019, International Joint Conference on Neural Networks (IJCNN).

[7] Minoru Asada, et al., Information processing in echo state networks at the edge of chaos, 2011, Theory in Biosciences.

[8] Michelle Girvan, et al., Hybrid Forecasting of Chaotic Processes: Using Machine Learning in Conjunction with a Knowledge-Based Model, 2018, Chaos.

[9] Peter I. Frazier, et al., A Tutorial on Bayesian Optimization, 2018, arXiv.

[10] Stefan J. Kiebel, et al., Re-visiting the echo state property, 2012, Neural Networks.

[11] Salim Mejaouri, et al., Reservoir computing with a single delay-coupled non-linear mechanical oscillator, 2018, Journal of Applied Physics.

[12] Jürgen Schmidhuber, et al., LSTM: A Search Space Odyssey, 2015, IEEE Transactions on Neural Networks and Learning Systems.

[13] Christiam F. Frasser, et al., Efficient parallel implementation of reservoir computing systems, 2018, Neural Computing and Applications.

[14] Roger Labahn, et al., Design Strategies for Weight Matrices of Echo State Networks, 2012, Neural Computation.

[15] P. Mahadevan, et al., An overview, 2007, Journal of Biosciences.

[16] Herbert Jaeger, et al., The "echo state" approach to analysing and training recurrent neural networks, 2001.

[17] Weisong Shi, et al., Edge Computing: Vision and Challenges, 2016, IEEE Internet of Things Journal.

[18] Toshiyuki Yamane, et al., Recent Advances in Physical Reservoir Computing: A Review, 2018, Neural Networks.

[19] Jaideep Pathak, et al., Model-Free Prediction of Large Spatiotemporally Chaotic Systems from Data: A Reservoir Computing Approach, 2018, Physical Review Letters.

[20] Cory Merkel, et al., Reservoir Computing in Embedded Systems: Three variants of the reservoir algorithm, 2017, IEEE Consumer Electronics Magazine.

[21] Hendrik Richter, et al., The Generalized Hénon Maps: Examples for Higher-Dimensional Chaos, 2002, Int. J. Bifurc. Chaos.

[22] Benjamin Schrauwen, et al., On Learning Navigation Behaviors for Small Mobile Robots With Reservoir Computing Architectures, 2015, IEEE Transactions on Neural Networks and Learning Systems.

[23] Timothy P. Lillicrap, et al., Backpropagation through time and the brain, 2019, Current Opinion in Neurobiology.

[24] Bhavin J. Shastri, et al., Takens-inspired neuromorphic processor: a downsizing tool for random recurrent neural networks via feature extraction, 2019, Physical Review Research.

[25] Haim Sompolinsky, et al., Short-term memory in orthogonal neural networks, 2004, Physical Review Letters.

[26] Giacomo Indiveri, et al., Real-Time Ultra-Low Power ECG Anomaly Detection Using an Event-Driven Neuromorphic Processor, 2019, IEEE Transactions on Biomedical Circuits and Systems.

[27] L. Appeltant, et al., Information processing using a single dynamical node as complex system, 2011, Nature Communications.

[28] Christopher K. Wikle, et al., An ensemble quadratic echo state network for non-linear spatio-temporal forecasting, 2017, arXiv:1708.05094.

[29] Jasper Snoek, et al., Practical Bayesian Optimization of Machine Learning Algorithms, 2012, NIPS.

[30] Mantas Lukosevicius, et al., A Practical Guide to Applying Echo State Networks, 2012, Neural Networks: Tricks of the Trade.

[31] Herbert Jaeger, et al., Reservoir computing approaches to recurrent neural network training, 2009, Comput. Sci. Rev.

[32] Surya Ganguli, et al., Memory traces in dynamical systems, 2008, Proceedings of the National Academy of Sciences.

[33] Benjamin Schrauwen, et al., Information Processing Capacity of Dynamical Systems, 2012, Scientific Reports.

[34] Zehong Yang, et al., Short-term stock price prediction based on echo state networks, 2009, Expert Syst. Appl.

[35] Igor Farkas, et al., Computational analysis of memory capacity in echo state networks, 2016, Neural Networks.

[36] R. C. Macridis, A review, 1963.

[37] Benjamin Schrauwen, et al., Real-time detection of epileptic seizures in animal models using reservoir computing, 2013, Epilepsy Research.

[38] Damien Querlioz, et al., Neuromorphic computing with nanoscale spintronic oscillators, 2017, Nature.

[39] Gang-Ding Peng, et al., Fiber Optofluidic Microlaser With Lateral Single Mode Emission, 2018, IEEE Journal of Selected Topics in Quantum Electronics.

[40] José Carlos Príncipe, et al., Analysis and Design of Echo State Networks, 2007, Neural Computation.

[41] F. Takens, Detecting strange attractors in turbulence, 1981.

[42] Daniel Brunner, et al., Efficient design of hardware-enabled reservoir computing in FPGAs, 2018, Journal of Applied Physics.

[43] Xu Chen, et al., Edge Intelligence: Paving the Last Mile of Artificial Intelligence With Edge Computing, 2019, Proceedings of the IEEE.

[44] Dianhui Wang, et al., Randomness in neural networks: an overview, 2017, WIREs Data Mining Knowl. Discov.

[45] Bhavin J. Shastri, et al., Neuromorphic Photonic Integrated Circuits, 2018, IEEE Journal of Selected Topics in Quantum Electronics.

[46] Paul J. Werbos, et al., Backpropagation Through Time: What It Does and How to Do It, 1990, Proc. IEEE.

[47] Chi-Yi Tsai, et al., Robust face tracking control of a mobile robot using self-tuning Kalman filter and echo state network, 2010.

[48] Henry Markram, et al., Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations, 2002, Neural Computation.

[49] Miguel C. Soriano, et al., Digital Implementation of a Single Dynamical Node Reservoir Computer, 2015, IEEE Transactions on Circuits and Systems II: Express Briefs.

[50] Paul R. Prucnal, et al., Progress in neuromorphic photonics, 2017.

[51] Yi Ren, et al., Short-term wind speed forecasting based on autoregressive moving average with echo state network compensation, 2020.

[52] Cory E. Merkel, et al., An FPGA Implementation of a Time Delay Reservoir Using Stochastic Logic, 2018, ACM J. Emerg. Technol. Comput. Syst.

[53] Laurent Larger, et al., Tutorial: Photonic Neural Networks in Delay Systems, 2018, Journal of Applied Physics.