Theory of neuromata

A finite automaton (the so-called neuromaton), realized by a finite discrete recurrent neural network working in parallel computation mode, is considered. Both the size of neuromata (i.e., the number of neurons) and their descriptional complexity (i.e., the number of bits in the neuromaton representation) are studied. It is proved that a constant time delay of the neuromaton output does not play a role within polynomial descriptional complexity. It is shown that any regular language given by a regular expression of length n is recognized by a neuromaton with Θ(n) neurons, and this network size is proved to be optimal in the worst case. On the other hand, there is in general no equivalent regular expression of polynomial length for a given neuromaton. Then two specialized constructions of neural acceptors of optimal descriptional complexity Θ(n) for the recognition of a single n-bit string are described. Both require O(√n) neurons, and either O(n) connections with constant weights or O(√n) edges with weights of size O(2^√n). Further, for Hopfield neuromata (networks with symmetric weights), a Hopfield condition stating when a regular language is a Hopfield language is formulated, and a construction of a Hopfield neuromaton is presented for any regular language satisfying this condition. The class of Hopfield languages is shown to be closed under union, intersection, concatenation, and complement, but it is not closed under iteration. Finally, the problem of whether a regular language given by a neuromaton (or by a Hopfield acceptor) is nonempty is proved to be PSPACE-complete; as a consequence, the same result holds for the neuromaton equivalence problem.
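To make the model concrete, the following is a minimal sketch, not one of the paper's constructions, of a neuromaton-style network of binary threshold units updated in parallel that simulates a two-state DFA for the regular language of binary strings containing an even number of 1s. The names (threshold, accepts_even_ones) and the choice of two parallel update phases per input bit are assumptions of this sketch, not notation from the paper.

```python
# A minimal sketch (illustrative, not the paper's construction): a network of
# binary threshold units, updated in parallel, simulating a 2-state DFA for
# binary strings with an even number of 1s. State neurons are one-hot; each
# (state, symbol) pair gets a transition neuron, so every input bit is
# processed in two parallel update phases.

def threshold(weighted_sum, thresh):
    """Binary threshold unit: fires iff its excitation reaches the threshold."""
    return 1 if weighted_sum >= thresh else 0

def accepts_even_ones(word):
    s = [1, 0]           # one-hot DFA state: s[0] = even (accepting), s[1] = odd
    for ch in word:
        x = int(ch)      # current input bit, supplied to the network externally
        # Phase 1 (parallel): transition neuron t[q][a] fires iff the network
        # is in state q and the current input symbol equals a.
        t = [[threshold(s[q] + (x if a == 1 else 1 - x), 2) for a in (0, 1)]
             for q in (0, 1)]
        # Phase 2 (parallel): new state neurons collect incoming transitions
        # (reading 0 keeps the state, reading 1 swaps even <-> odd).
        s = [threshold(t[0][0] + t[1][1], 1),   # even: was even & read 0, or odd & read 1
             threshold(t[1][0] + t[0][1], 1)]   # odd:  was odd & read 0, or even & read 1
    return s[0] == 1     # the output neuron is the accepting-state neuron

assert accepts_even_ones("0110") is True
assert accepts_even_ones("0100") is False
```

One-hot state neurons plus one transition neuron per (state, symbol) pair mirror the standard DFA-to-threshold-network encoding; spending two parallel phases per symbol is one simple way to realize the disjunction over incoming transitions with single threshold gates, at the cost of a constant time delay of the kind the abstract shows to be immaterial.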
