Analog Neural Nets with Gaussian or Other Common Noise Distributions Cannot Recognize Arbitrary Regular Languages

We consider recurrent analog neural nets where the output of each gate is subject to Gaussian noise, or to any other common noise distribution that is nonzero on a sufficiently large part of the state space. We show that many regular languages cannot be recognized by networks of this type, and we give a precise characterization of the languages that can be recognized. This result implies severe constraints on the possibilities for constructing recurrent analog neural nets that are robust against realistic types of analog noise. On the other hand, we present a method for constructing feedforward analog neural nets that are robust with regard to analog noise of this type.
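The negative result can be illustrated with a toy simulation (not from the paper; the update rule, weights, and noise level are all hypothetical choices for illustration). A single sigmoid recurrent unit tracks the parity of a bit string, a regular language whose recognition requires remembering the entire input history; adding Gaussian noise to the gate's output lets the remembered parity flip at each step, so accuracy on long inputs decays toward chance:

```python
import numpy as np

def run(bits, sigma, rng):
    """Feed a bit string through one noisy recurrent unit.

    sigma is the standard deviation of Gaussian noise added to the
    gate output at every step (hypothetical noise model).
    """
    s = 0.0  # state in [0, 1]; s < 0.5 encodes even parity, s > 0.5 odd
    for b in bits:
        # Smooth XOR of state and input, squashed by a steep sigmoid.
        pre = 8.0 * (s * (1 - b) + (1 - s) * b) - 4.0
        s = 1.0 / (1.0 + np.exp(-pre))
        # Gaussian noise on the gate output, clipped to the state space.
        s = float(np.clip(s + rng.normal(0.0, sigma), 0.0, 1.0))
    return s

rng = np.random.default_rng(0)

# Noiseless run: the unit classifies parity perfectly.
assert run([1, 1, 0, 1, 1], 0.0, rng) < 0.5  # even number of 1s
assert run([1, 0, 0, 1, 1], 0.0, rng) > 0.5  # odd number of 1s

# With noise, each step flips the remembered parity with some
# probability, and the flips compound over the input length.
trials, length = 200, 40
correct = 0
for _ in range(trials):
    bits = rng.integers(0, 2, size=length).tolist()
    out = run(bits, 0.4, rng)
    correct += int((out > 0.5) == (sum(bits) % 2 == 1))
noisy_accuracy = correct / trials
print(f"noisy accuracy on length-{length} strings: {noisy_accuracy:.2f}")
```

The sketch shows only the qualitative phenomenon; the paper's actual result is a characterization of which regular languages survive such noise, not a claim about any particular network.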
