(Thesis) Reservoir Computing With Dynamical Systems

A reservoir computer is a special type of neural network in which most of the weights are fixed at random and only a small subset are trained. In this thesis we prove results about reservoir computers trained on deterministic dynamical systems and on stochastic processes. We focus chiefly on a special type of reservoir computer called an Echo State Network (ESN).

In the deterministic case, we prove, under some assumptions, that if a reservoir computer has the Echo State Property (ESP), then there is a $C^1$ generalised synchronisation between the input dynamical system and the dynamics in the reservoir space. Furthermore, we prove that a reservoir computer with the local ESP on several disjoint subsets of the reservoir space admits several distinct generalised synchronisations. In the special case that the reservoir map is linear and has the ESP, we prove that the generalised synchronisation is generically an embedding; this result admits Takens' embedding theorem as a special case. We then go on to show that ESNs trained on scalar observations of an ergodic dynamical system can approximate an arbitrary target function, including the next-step map used in time series forecasting. This universal approximation property holds even though the training procedure is entirely linear.

We prove analogous results for ESNs trained on observations of a stochastic process, which need not be Markovian in general. We use these results to develop supervised learning and reinforcement learning algorithms supported by an ESN. In the penultimate chapter of this thesis, we use a reservoir computer to numerically solve linear PDEs. In the final chapter, we conclude and discuss directions for future work.
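To make the objects above concrete, a standard ESN formulation (our symbols, chosen for illustration; the thesis may differ in detail) drives a reservoir state $x_t \in \mathbb{R}^N$ with scalar observations $\omega(z_t)$ of a dynamical system $\phi$:
\[
  z_{t+1} = \phi(z_t), \qquad x_{t+1} = \sigma\bigl(A x_t + C\,\omega(z_t) + b\bigr),
\]
where $A$, $C$ and $b$ are drawn at random and then held fixed, and $\sigma$ is applied componentwise. The ESP asks that $x_t$ be asymptotically determined by the input history alone; a generalised synchronisation is then a map $f$ with $x_t = f(z_t)$ along trajectories. Only a linear readout $W$, producing outputs $W x_t$, is ever trained, which is why the training is entirely linear.

A minimal runnable sketch of this pipeline, with hypothetical names and parameter choices (the reservoir size, spectral radius, and ridge parameter below are our assumptions, not the thesis's), fits the readout by ridge regression to learn the next-step map of a toy scalar time series:

```python
import numpy as np

# Minimal ESN sketch (hypothetical names and defaults, for illustration only).
rng = np.random.default_rng(0)

N = 200                                  # reservoir dimension
A = rng.normal(size=(N, N))              # random reservoir weights, never trained
A *= 0.9 / np.max(np.abs(np.linalg.eigvals(A)))  # spectral radius < 1, a common ESP heuristic
C = rng.normal(size=N)                   # random input weights, never trained
b = 0.1 * rng.normal(size=N)             # random bias, never trained

def drive(u):
    """Run the reservoir over a scalar input sequence u, returning all states."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, ut in enumerate(u):
        x = np.tanh(A @ x + C * ut + b)  # update x_{t+1} = tanh(A x_t + C u_t + b)
        states[t] = x
    return states

# Scalar observations of an ergodic system; the logistic map is a toy stand-in.
T = 2000
u = np.empty(T)
u[0] = 0.3
for t in range(T - 1):
    u[t + 1] = 4.0 * u[t] * (1.0 - u[t])

X = drive(u[:-1])                        # states driven by u_0, ..., u_{T-2}
y = u[1:]                                # next-step targets u_1, ..., u_{T-1}

# The entire training step: one ridge regression for the linear readout W.
# (In practice an initial washout of states is discarded; omitted for brevity.)
lam = 1e-6
W = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

print("train RMSE:", np.sqrt(np.mean((X @ W - y) ** 2)))
```

Note that $A$, $C$ and $b$ are generated once and never updated; the only fitted object is $W$, obtained in closed form.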