Active Learning in Recurrent Neural Networks Facilitated by a Hebb-like Learning Rule with Memory

We demonstrate in this article that a Hebb-like learning rule with memory enables active learning in recurrent neural networks. We compare active with passive learning, and a Hebb-like learning rule with and without memory, on a timing problem to be learned by the network. Moreover, we study the influence of the topology of the recurrent network. Our results from numerical simulations reveal that active learning significantly decreases the learning time only for the Hebb-like learning rule with memory, whereas the rule without memory remains unaffected. This effect is observed for all investigated network topologies, indicating its robustness.
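
The abstract does not reproduce the precise update rule, but the general idea of a Hebb-like rule with memory can be illustrated by attaching a slowly decaying trace of past co-activity to each synapse. The following minimal sketch is an assumption-laden illustration, not the paper's actual rule: the network size, the tanh dynamics, the decay factor lam, and the binary feedback signal are all hypothetical choices made for the example.

```python
import numpy as np

# Minimal sketch: Hebb-like weight update with a per-synapse memory trace
# in a small recurrent network. All names and parameter values are
# illustrative assumptions, not the rule used in the paper.

rng = np.random.default_rng(seed=1)

N = 50                                    # number of neurons (hypothetical)
eta = 0.01                                # learning rate (hypothetical)
lam = 0.9                                 # decay factor of the memory trace (hypothetical)

W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))  # recurrent weights
trace = np.zeros((N, N))                  # per-synapse memory of recent co-activity
x = rng.standard_normal(N)                # network state

for step in range(1000):
    x = np.tanh(W @ x)                                # one step of the recurrent dynamics
    hebb = np.outer(x, x)                             # instantaneous Hebbian co-activity
    trace = lam * trace + (1.0 - lam) * hebb          # memory: low-pass filter of co-activity
    feedback = 1.0 if rng.random() < 0.5 else -1.0    # placeholder success/failure signal
    W += eta * feedback * trace                       # update driven by the trace, gated by feedback
```

A memoryless Hebb-like rule would update W from the instantaneous term hebb alone; the trace allows feedback that arrives after a delay to still credit synapses that were recently co-active. Active learning, in this setting, means the learner selects which training examples to present based on its current performance, rather than receiving them in a fixed or random order (passive learning).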
