Hebbian Learning Rule
The Hebbian learning rule is one of the earliest and simplest learning rules for neural networks. It was proposed by Donald Hebb, who postulated that if two interconnected neurons are both "on" at the same time, the weight between them should be increased. A Hebbian network is a single-layer neural network consisting of one input layer with many input units and one output layer with a single output unit; this architecture is typically used for pattern classification. A bias with the fixed value 1 is included to increase the net input. This chapter presents the Hebbian learning algorithm, and MATLAB code is provided for several classification problems.
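The weight update in the classic Hebb rule is Δw_i = x_i·y for each input, with Δb = y for the bias (whose input is fixed at 1), applied once per training pair. The following is a minimal Python sketch of a single-output Hebbian network trained on the AND function with bipolar inputs and targets; the function and variable names are illustrative and do not correspond to the chapter's MATLAB code.

import numpy as np

def hebbian_train(X, t):
    """Train a single-output Hebbian network.

    X : (n_samples, n_inputs) array of bipolar inputs (+1 / -1)
    t : (n_samples,) array of bipolar targets (+1 / -1)
    Returns the learned weights and bias.
    """
    n_inputs = X.shape[1]
    w = np.zeros(n_inputs)   # weights start at zero
    b = 0.0                  # bias unit has fixed input 1
    for x, y in zip(X, t):
        w += x * y           # Hebb rule: delta_w_i = x_i * y
        b += y               # bias update: delta_b = 1 * y
    return w, b

# Illustrative example: AND function with bipolar inputs and targets
X = np.array([[ 1,  1],
              [ 1, -1],
              [-1,  1],
              [-1, -1]])
t = np.array([1, -1, -1, -1])

w, b = hebbian_train(X, t)
print("weights:", w, "bias:", b)        # learned: [2. 2.], -2.0
print("outputs:", np.sign(X @ w + b))   # matches the targets

For this linearly separable problem the learned weights classify all four patterns correctly; a single pass over the data suffices because the rule simply accumulates input-output correlations.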