Analog Neural Nets with Gaussian or Other Common Noise Distributions Cannot Recognize Arbitrary Regular Languages
[1] W. Doeblin. Sur les propriétés asymptotiques de mouvements régis par certains types de chaînes simples, 1938.
[2] Paulo J. L. Adeodato, et al. Sequential RAM-based Neural Networks: Learnability, Generalisation, Knowledge Extraction, and Grammatical Inference, 1999, Int. J. Neural Syst.
[3] Hava T. Siegelmann,et al. Nine switch-affine neurons suffice for Turing universality , 1999, Neural Networks.
[4] Pekka Orponen,et al. On the Effect of Analog Noise in Discrete-Time Analog Computations , 1996, Neural Computation.
[5] Jürgen Schmidhuber,et al. LSTM recurrent networks learn simple context-free and context-sensitive languages , 2001, IEEE Trans. Neural Networks.
[6] Nicholas Pippenger. Invariance of complexity measures for networks with unreliable gates , 1989, JACM.
[7] Mikel L. Forcada,et al. Stable Encoding of Finite-State Machines in Discrete-Time Recurrent Neural Nets with Sigmoid Units , 2000, Neural Computation.
[8] Nicholas Pippenger,et al. On networks of noisy gates , 1985, 26th Annual Symposium on Foundations of Computer Science (sfcs 1985).
[9] Mike Casey,et al. The Dynamics of Discrete-Time Computation, with Application to Recurrent Neural Networks and Finite State Machine Extraction , 1996, Neural Computation.
[10] C. Lee Giles,et al. Constructing deterministic finite-state automata in recurrent neural networks , 1996, JACM.
[11] Oscar H. Ibarra. Information and Control, 1957, Nature.
[12] Nicholas Pippenger, et al. Developments in "The synthesis of reliable organisms from unreliable components", 1990.
[13] Hava T. Siegelmann,et al. Stochastic Analog Networks and Computational Complexity , 1999, J. Complex..