Probabilistic computation by neuromime networks.

In this paper we address the question: can biologically feasible neural nets compute more than deterministic polynomial-time algorithms? To maintain a claim of plausibility and reasonableness, we restrict ourselves to nets that are algorithmically easy to construct, and we rule out infinite precision in parameters and in any analog parts of the computation. Our approach is to consider recent advances in randomized algorithms and to ask whether such randomized computations can be carried out by neural nets. We start with a pair of neurons and show that if they are connected by reciprocal inhibition and given some tonic input, the steady state has one neuron ON and one neuron OFF, but which neuron is ON and which is OFF is chosen at random (perhaps it would be better to say that microscopic noise in the analog computation is amplified into a macroscopic random bit). We then show how to build a small network that uses this process to generate random bits repeatedly. This random-bit generator can then be connected to a neural net representing the deterministic part of a randomized algorithm. We thereby demonstrate that these neural nets can carry out probabilistic computation and are thus less limited than classical deterministic neural nets.
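The flip-flop mechanism described above can be sketched in simulation. The following is a minimal illustration, not the paper's model: two rate neurons with mutual inhibition and tonic drive, integrated by the Euler method with small Gaussian noise. The symmetric state is unstable, so the noise decides which neuron ends up ON; all parameter values (gain, inhibitory weight, noise amplitude) are assumptions chosen for the sketch.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def random_bit(seed=None, steps=4000, dt=0.01,
               tonic=1.0, inhibition=2.0, gain=4.0, noise=0.02):
    """Simulate two mutually inhibiting rate neurons with tonic input.

    Each neuron relaxes toward a sigmoid of (tonic drive minus the
    other neuron's inhibitory output). The symmetric fixed point
    x = y = 0.5 is unstable, so microscopic noise pushes the pair
    into one of two attractors: (ON, OFF) or (OFF, ON).
    Returns the final activity pair (x, y).
    """
    rng = random.Random(seed)
    x = y = 0.5  # start exactly on the unstable symmetric state
    for _ in range(steps):
        dx = -x + sigmoid(gain * (tonic - inhibition * y))
        dy = -y + sigmoid(gain * (tonic - inhibition * x))
        # Euler step plus small independent noise on each neuron
        x += dt * dx + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        y += dt * dy + noise * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return x, y
```

The macroscopic random bit is simply which neuron won, e.g. `bit = 1 if x > y else 0`; repeated trials with independent noise yield both outcomes.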
