Dynamics of spiking map-based neural networks in problems of supervised learning
Mechislav M. Pugavko | Oleg V. Maslennikov | Vladimir I. Nekorkin