On the complexity of learning for a spiking neuron (extended abstract)

Spiking neurons are models for the computational units in biological neural systems where information is considered to be encoded mainly in the temporal patterns of their activity. They provide a way of analyzing neural computation that is not captured by traditional neuron models such as sigmoidal and threshold gates (or "Perceptrons"). We introduce a simple model of a spiking neuron that, in addition to the weights that model the plasticity of synaptic strength, also has variable transmission delays between neurons as programmable parameters. For the coding of input and output values, two modes are taken into account: binary coding for the Boolean domain and analog coding for the real-valued domain. We investigate the complexity of learning for a single spiking neuron within the framework of PAC-learnability. With regard to sample complexity, we prove that the VC-dimension is Θ(n log n) and, hence, strictly larger than that of a threshold gate. In particular, the lower bound holds for binary coding and even if all weights are kept fixed, while the upper bound is valid for analog coding with programmable weights and delays. With regard to computational complexity, we show that there is no polynomial-time PAC-learning algorithm, unless RP = NP, for a quite restricted spiking neuron that is only slightly more powerful than a Boolean threshold gate: the consistency problem for a spiking neuron using binary coding and programmable delays from {0, 1} is NP-complete. This holds even if all weights are kept fixed. These results demonstrate that temporal coding has a surprisingly large impact on the complexity of learning for single neurons.

*Address: Institute for Theoretical Computer Science, Technische Universität Graz, Klosterwiesgasse 32/2, A-8010 Graz, Austria. Email: {maass,mschmitt}@igi.tu-graz.ac.at, http://www.cis.tu-graz.ac.at/igi/

COLT '97, Nashville, Tennessee, USA. Copyright 1997 ACM 0-89791-891-6/97.
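To make the role of programmable delays concrete, the following is a toy sketch of the kind of model the abstract describes; the exact response functions and firing rule are not specified in the abstract, so the triangular response shape, the time grid, and all parameter values here are illustrative assumptions, not the paper's definitions. With binary coding (bit 1 = spike, bit 0 = no spike), equal delays let two weighted potentials peak simultaneously and cross the threshold, while delays from {0, 1} can stagger the peaks so the threshold is never reached:

```python
def epsilon(s, rise=1.0, fall=1.0):
    """Assumed triangular post-synaptic response: 0 before s = 0,
    peak value 1 at s = rise, back to 0 at s = rise + fall."""
    if s <= 0 or s >= rise + fall:
        return 0.0
    return s / rise if s <= rise else (rise + fall - s) / fall

def spiking_neuron(spike_times, weights, delays, threshold, t_grid):
    """Output 1 iff the summed, weighted, delay-shifted potential
    reaches the threshold at some time on the grid.
    spike_times[i] is None when input i is silent (binary coding of 0)."""
    for t in t_grid:
        p = sum(w * epsilon(t - s - d)
                for s, w, d in zip(spike_times, weights, delays)
                if s is not None)
        if p >= threshold:
            return 1
    return 0

t_grid = [k * 0.1 for k in range(40)]
w = [1.0, 1.0]
# Both inputs fire at time 0; only the delays differ.
print(spiking_neuron([0.0, 0.0], w, [0, 0], 1.5, t_grid))  # aligned peaks  -> 1
print(spiking_neuron([0.0, 0.0], w, [0, 1], 1.5, t_grid))  # staggered peaks -> 0
```

The example illustrates why delays are a genuinely separate source of expressive power: with all weights fixed at 1, changing only the delay vector from (0, 0) to (0, 1) flips the neuron's output, which is exactly the degree of freedom the NP-completeness result for delays from {0, 1} exploits.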
