dynoNet: A neural network architecture for learning dynamical systems

This paper introduces a network architecture, called dynoNet, that uses linear dynamical operators as elementary building blocks. Owing to the dynamical nature of these blocks, dynoNet networks are tailored for sequence modeling and system identification. The back-propagation behavior of the linear dynamical operator with respect to both its parameters and its input sequence is defined, which enables end-to-end training of structured networks containing linear dynamical operators and other differentiable units using existing deep learning software. Examples show the effectiveness of the proposed approach on well-known system identification benchmarks.
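As a rough illustration of the idea (not the authors' implementation), a linear dynamical operator G(q) = B(q)/A(q) can be written as a differentiable unit whose IIR recurrence is unrolled over time. Here the gradients with respect to both the coefficients and the input sequence are obtained through PyTorch's generic autograd rather than the specialized back-propagation rules derived in the paper; the class name and default orders are illustrative assumptions.

```python
import torch

class LinearDynamicalOperator(torch.nn.Module):
    """Discrete-time rational transfer function G(q) = B(q)/A(q), unrolled as
    y[t] = b0*u[t] + ... + b_nb*u[t-nb] - a1*y[t-1] - ... - a_na*y[t-na].
    Hypothetical sketch; the paper instead defines a custom backward pass."""

    def __init__(self, nb=2, na=2):
        super().__init__()
        # Small random initialization keeps the initial filter stable in practice.
        self.b = torch.nn.Parameter(0.01 * torch.randn(nb + 1))  # numerator coefficients
        self.a = torch.nn.Parameter(0.01 * torch.randn(na))      # denominator (monic, a0 = 1)

    def forward(self, u):
        nb, na = self.b.numel() - 1, self.a.numel()
        y = []
        for t in range(u.numel()):
            # Feed-forward part: B(q) applied to the input sequence.
            yt = sum(self.b[k] * u[t - k] for k in range(nb + 1) if t - k >= 0)
            # Feedback part: A(q) applied to past outputs.
            yt = yt - sum(self.a[k - 1] * y[t - k]
                          for k in range(1, na + 1) if t - k >= 0)
            y.append(yt)
        return torch.stack(y)

u = torch.randn(50)
op = LinearDynamicalOperator()
y = op(u)
loss = y.pow(2).sum()
loss.backward()  # gradients reach both b and a, so the block trains end-to-end
```

Because autograd differentiates through the unrolled recurrence, such a block can be freely composed with static nonlinearities (e.g. small feed-forward networks), which is the structural pattern the dynoNet architecture exploits.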
