Stability analysis of discrete-time recurrent neural networks

We address the problem of global Lyapunov stability of discrete-time recurrent neural networks (RNNs) in the unforced (unperturbed) setting. It is assumed that the network weights are fixed at some values, for example, those attained after training. Based on classical results from the theory of absolute stability, we propose a new approach to the stability analysis of RNNs with sector-type monotone nonlinearities and nonzero biases. We devise a simple state-space transformation that converts the original RNN equations into a form suitable for our stability analysis. We then present appropriate linear matrix inequalities (LMIs) that can be solved to determine whether the system under study is globally exponentially stable. Unlike previous treatments, our approach readily accounts for the nonzero biases that are usually present in RNNs to improve their approximation capabilities. We show how recent results of others on the stability analysis of RNNs can be interpreted as special cases of our approach. We illustrate the use of our approach with examples. Although illustrated on the stability analysis of recurrent multilayer perceptrons, the proposed approach can also be applied to other forms of time-lagged RNNs.
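As a rough illustration of the kind of setup treated here, consider a discrete-time RNN of Lur'e type; the specific model, notation, and LMI below are illustrative assumptions rather than the paper's exact formulation:

\[
x_{k+1} = W\,\sigma(x_k + b), \qquad \text{each } \sigma_i \text{ monotone and slope-restricted to } [0,1].
\]

If x* is an equilibrium, i.e. x* = W sigma(x* + b), the shifted state z_k = x_k - x* obeys

\[
z_{k+1} = W\,\psi(z_k), \qquad \psi(z) = \sigma(z + x^{*} + b) - \sigma(x^{*} + b),
\]

so the bias is absorbed into a new nonlinearity psi that vanishes at the origin and inherits the sector bound: psi_i(z_i) z_i >= 0 and |psi_i(z_i)| <= |z_i|. A typical absolute-stability-type certificate for such a system is the feasibility of a diagonal Lyapunov LMI,

\[
P = \operatorname{diag}(p_1,\ldots,p_n) \succ 0, \qquad W^{\top} P W - P \prec 0 .
\]

The sketch below, assuming NumPy and CVXPY are available, checks the feasibility of such an LMI numerically; the function name and tolerance are hypothetical.

    # Sketch: diagonal-Lyapunov LMI feasibility check for z_{k+1} = W psi(z_k),
    # with psi componentwise monotone and sector-bounded in [0, 1].
    # Illustrative assumptions only -- not the paper's exact conditions.
    import numpy as np
    import cvxpy as cp

    def diagonal_lyapunov_feasible(W, eps=1e-6):
        """Search for diagonal P > 0 satisfying W^T P W - P <= -eps I."""
        n = W.shape[0]
        p = cp.Variable(n)                        # diagonal entries of P
        P = cp.diag(p)
        constraints = [
            p >= eps,                             # P positive definite (diagonal)
            W.T @ P @ W - P << -eps * np.eye(n),  # Lyapunov decrease condition
        ]
        problem = cp.Problem(cp.Minimize(0), constraints)
        problem.solve()
        return problem.status == cp.OPTIMAL, p.value

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        W = 0.4 * rng.standard_normal((4, 4))     # small random recurrent weights
        feasible, p = diagonal_lyapunov_feasible(W)
        print("LMI feasible:", feasible)
        print("diag(P):", p)

Feasibility of an LMI of this general flavor, combined with the sector and monotonicity restrictions on the shifted nonlinearity, is the kind of stability certificate the approach summarized above produces; the paper's own LMIs are formulated for the transformed system and are used to establish global exponential stability.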