Building symmetries into feedforward networks

One of the central tools developed by M. Minsky and S. Papert (1988) was the group invariance theorem. This theorem concerns choosing perceptron weights to recognise a predicate that is invariant under a group of permutations of the input. It states that the weights can be chosen to be constant on equivalence classes of predicates under the action of the group. This paper recasts the result in a graph-theoretic light and then extends consideration to multilayer perceptrons. It is shown that if a multilayer network is chosen so that the action of the group on the input nodes extends to the whole network, then the invariance of the output under the action of the group is guaranteed. This greatly reduces the number of degrees of freedom in the training of such a network. An example of using this technique to train a network to recognise isomorphism classes of graphs is given; the results compare favourably with previous experiments using standard back-propagation. The connections between the group of symmetries and the network structure are explored, and the relation to the problem of graph isomorphism is discussed.
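To make the weight-tying construction concrete, the following is a minimal sketch, not taken from the paper: the choice of group (cyclic rotations), the tanh nonlinearity, and all names (W, v, forward) are illustrative assumptions. Extending the rotation action from the input layer to a hidden layer of the same size forces the hidden weight matrix to be constant on orbits of index pairs, which here means circulant, reducing the free parameters from n² to n; the output node is fixed by every group element, so its incoming weights form a single orbit and must all be equal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6  # number of input nodes; the cyclic group C_n acts by rotation

# Extend the group action to a hidden layer of the same size: a rotation g
# shifts both input and hidden indices.  Invariance of the weights on orbits
# requires W[(j+g) % n, (i+g) % n] == W[j, i] for every shift g, so W is
# circulant and has only n free parameters instead of n*n.
free_params = rng.normal(size=n)                       # one weight per orbit
W = np.stack([np.roll(free_params, j) for j in range(n)])

# The output node is fixed by the group, so its incoming weights are a
# single orbit: one shared value across the whole hidden layer.
v = np.ones(n)

def forward(x):
    """Two-layer network with weights tied across group orbits."""
    hidden = np.tanh(W @ x)   # rotation of x only permutes these activations
    return v @ hidden         # constant output weights sum the permutation away

x = rng.normal(size=n)
outputs = [forward(np.roll(x, g)) for g in range(n)]
# Every rotation of the input yields the same output (up to float error),
# exhibiting the guaranteed invariance.
assert np.allclose(outputs, outputs[0])
print(outputs)
```

The final assertion checks the property the construction guarantees: because the group action commutes with the elementwise nonlinearity and the tied weights, rotating the input merely permutes the hidden activations, and the constant output weights make the result invariant. Training would then update only the n orbit parameters rather than all n² + n weights.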