A supervised learning algorithm for discrete-time cellular neural networks is introduced. The algorithm is based on the relaxation method and determines suitable template coefficients by postulating the desired next output state of every cell. The relaxation method can train the network to a prescribed degree of parameter insensitivity, and incorporating symmetry constraints leads to fast convergence. If a solution exists at all, the relaxation method terminates after a finite number of iteration steps. The algorithm can also be applied to perceptrons and to discrete Hopfield networks whose nonlinearity is a comparator characteristic.
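Since each postulated output state constrains the template coefficients through a linear inequality, the training task reduces to solving a system of linear inequalities. The following is a minimal sketch of the classical relaxation (successive-projection) method for such a system; the function name, the over-relaxation factor, and the stopping policy are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def relaxation_solve(A, b, lam=1.5, max_iters=10000):
    """Sketch of the relaxation method for the inequalities A @ w >= b.

    While some inequality is violated, w is moved along the normal of the
    most-violated hyperplane (over-relaxed for 1 < lam < 2). When a
    solution with positive margin exists, the method terminates after a
    finite number of correction steps.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    w = np.zeros(A.shape[1])            # initial coefficient vector
    norms = (A ** 2).sum(axis=1)        # squared row norms ||a_i||^2
    for _ in range(max_iters):
        residuals = A @ w - b
        i = int(np.argmin(residuals))   # index of most-violated inequality
        if residuals[i] >= 0.0:
            return w                    # all inequalities satisfied
        w = w + lam * (-residuals[i] / norms[i]) * A[i]
    raise RuntimeError("no solution found within the iteration budget")
```

In a CNN training setting, each row of `A` would encode one cell's neighborhood inputs (with the sign of the postulated output absorbed into the row), so that `A @ w >= b` expresses that every cell switches to its desired state with margin `b`.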