Hebbian Learning Rule

The Hebbian learning rule is one of the earliest and simplest learning rules for neural networks. It was proposed by Donald Hebb, who postulated that if two interconnected neurons are both "on" at the same time, the weight between them should be increased. A Hebbian network is a single-layer neural network consisting of an input layer with many input units and an output layer with a single output unit; this architecture is commonly used for pattern classification. The bias, which increases the net input, has a fixed value of 1. This chapter presents the Hebbian learning algorithm, along with MATLAB programs for several classification problems.
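
To make the rule concrete, the following sketch applies the supervised Hebb update, w(new) = w(old) + x*t for each weight and b(new) = b(old) + t for the bias (whose input is fixed at 1), to a small classification problem. The choice of the AND function with bipolar inputs and targets, and the variable names used, are illustrative assumptions rather than the chapter's own example.

    % Hebbian learning for the AND function with bipolar inputs and targets
    % (illustrative problem; the chapter's own examples may differ)

    x = [ 1  1 -1 -1;      % input patterns, one column per training pair
          1 -1  1 -1];
    t = [ 1 -1 -1 -1];     % bipolar targets for the AND function

    w = zeros(2, 1);       % initial weights
    b = 0;                 % initial bias (bias input is fixed at 1)

    % One pass of the Hebb rule over all training pairs
    for p = 1:size(x, 2)
        w = w + x(:, p) * t(p);   % w(new) = w(old) + x*t
        b = b + t(p);             % bias input is 1, so its update is just t
    end

    % Recall: net input followed by a hard-limit (sign) activation
    net = w' * x + b;
    y = sign(net);

    disp('Final weights and bias:'), disp([w' b])
    disp('Network output vs. targets:'), disp([y; t])

For this bipolar AND problem a single pass through the four training pairs yields w = [2 2]' and b = -2, and the recall step reproduces all four targets, illustrating how the Hebb rule can solve a linearly separable classification problem.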