Learning an archetype of a symmetry class

The Boolean functions implemented by a feedforward network can be grouped into equivalence classes induced by the symmetries of the net. The symmetry operations are the permutation of any pair of input or output signals and the interchange of ones and zeros in the input. A supervised learning protocol for feedforward neural networks is proposed based on this grouping of functions. Learning is assumed to take place in two stages. In the first stage, the synaptic efficacies are adapted with a cost function that vanishes if the Boolean function being represented belongs to the same symmetry class as the target one. The function thus obtained is taken as an archetype of the whole symmetry class. The second stage consists of performing a symmetry transformation to obtain a synaptic matrix that represents the target function. Simulations on small networks are reported showing that this learning protocol increases the probability of generalization for most target functions. The protocol can also help to model dyslexic types of mistakes as arising from partial learning: when only the first stage of the protocol is accomplished, the network makes mistakes that closely resemble those made by dyslexic persons.
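
As an illustration of the symmetry operations involved, the following sketch (in Python, not taken from the paper; the function and variable names are hypothetical) enumerates the symmetry class of a small Boolean function and evaluates a simple 0/1 version of the stage-one cost, which vanishes whenever the represented function and the target belong to the same class.

```python
# Sketch, under the stated assumptions: a Boolean function f: {0,1}^n -> {0,1}^m
# is stored as a truth table mapping input tuples to output tuples. The symmetry
# operations are those named in the abstract: permutations of the input signals,
# permutations of the output signals, and interchange of ones and zeros
# (negation) of individual inputs.

from itertools import permutations, product


def symmetry_class(truth_table, n_in, n_out):
    """Set of truth tables equivalent to `truth_table` under input
    permutations, input negations, and output permutations."""
    variants = set()
    for in_perm in permutations(range(n_in)):            # permute input lines
        for flips in product((0, 1), repeat=n_in):       # negate a subset of inputs
            for out_perm in permutations(range(n_out)):  # permute output lines
                table = {}
                for x in product((0, 1), repeat=n_in):
                    # transform the input pattern, then read off the original output
                    x_t = tuple(x[in_perm[i]] ^ flips[i] for i in range(n_in))
                    y = truth_table[x_t]
                    table[x] = tuple(y[out_perm[j]] for j in range(n_out))
                variants.add(tuple(sorted(table.items())))
    return variants


def class_cost(network_table, target_table, n_in, n_out):
    """Toy stage-one cost: zero whenever the network's function belongs to
    the same symmetry class as the target."""
    net = tuple(sorted(network_table.items()))
    return 0 if net in symmetry_class(target_table, n_in, n_out) else 1


# Example: XOR and XNOR on two inputs fall in the same class, because
# negating one input maps XOR onto XNOR.
xor = {(0, 0): (0,), (0, 1): (1,), (1, 0): (1,), (1, 1): (0,)}
xnor = {x: (1 - y[0],) for x, y in xor.items()}
print(class_cost(xnor, xor, n_in=2, n_out=1))  # -> 0
```

In this toy version the cost is a plain class-membership indicator; the cost actually used for gradient-based adaptation of the synaptic efficacies would be a smooth function with the same zero set.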