Linear interpolation with binary neurons

A two-layer network of binary neurons is considered. After learning a finite number of input-output combinations, the network performs linear interpolation between these combinations at the macroscopic level of correlations. No separation between a learning phase and a testing phase is necessary. The network can also be taught linear transformations. It is shown that, by introducing a special interpretation of the Hebb rule, the model can be constructed with neurons that are either strictly excitatory or strictly inhibitory.
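
The following Python sketch is one way to make the interpolation mechanism concrete; it is not the paper's exact model. It assumes +1/-1 binary neurons, Hebbian outer-product weights, and stochastic output units whose firing probability is linear in the local field, so that the average output correlation interpolates linearly between the stored pairs. All names and sizes (N, M, P, xi_in, xi_out) are illustrative assumptions.

    # A minimal sketch under the assumptions stated above, not the
    # paper's definitive implementation.
    import numpy as np

    rng = np.random.default_rng(0)
    N, M, P = 500, 500, 2          # input size, output size, stored pairs

    # A finite number of stored input-output combinations.
    xi_in = rng.choice([-1, 1], size=(P, N))
    xi_out = rng.choice([-1, 1], size=(P, M))

    # Hebb rule: sum of outer products of each output with its input.
    W = xi_out.T @ xi_in / N       # shape (M, N)

    def overlap(patterns, state):
        """Macroscopic correlation of a state with each stored pattern."""
        return patterns @ state / patterns.shape[1]

    # Test inputs that mix the two stored inputs: each bit is taken from
    # pattern 0 with probability a, from pattern 1 otherwise. No separate
    # testing phase is needed; the same weights are used throughout.
    for a in np.linspace(0, 1, 6):
        mask = rng.random(N) < a
        s = np.where(mask, xi_in[0], xi_in[1])
        h = W @ s                  # local fields of the output layer
        # Stochastic binary output: P(sigma = +1) is linear in the field,
        # so the mean output equals the (clipped) field itself.
        p = np.clip(0.5 * (1 + h), 0, 1)
        sigma_mean = 2 * p - 1     # expected value of the binary outputs
        print(f"a={a:.1f}  input overlaps={overlap(xi_in, s).round(2)}  "
              f"output overlaps={overlap(xi_out, sigma_mean).round(2)}")

As the mixing parameter a runs from 0 to 1, the output correlations move linearly between the two stored output patterns, roughly (1-a, a) up to crosstalk noise: interpolation holds at the level of correlations even though every individual neuron remains binary.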