Perceptrons above saturation

We study the storage of random patterns by a perceptron above its storage capacity $\alpha_c$, i.e. in the region where perfect storage becomes impossible. We determine the minimal fraction of learning errors and the distribution of stabilities for different learning rules within one-step replica symmetry breaking. We thereby not only extend the known replica-symmetric results to storage levels $\alpha$ beyond the AT line but also show that, depending on the learning rule, replica symmetry may be globally unstable already well below the AT line. As an example of possible implications, we compare the typical basins of attraction of an extremely diluted attractor neural network as given by replica symmetry and by one-step replica symmetry breaking.
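For orientation, a minimal sketch of the standard Gardner-type conventions presumably underlying these quantities (the symbols $\boldsymbol{J}$, $\boldsymbol{\xi}^{\mu}$, $\sigma^{\mu}$, and the margin $\kappa$ are illustrative notation, not taken from the text): the stability of pattern $\mu$ for an $N$-input perceptron with couplings $\boldsymbol{J}$ is
\[
  \Delta^{\mu} \;=\; \sigma^{\mu}\,
  \frac{\boldsymbol{J}\cdot\boldsymbol{\xi}^{\mu}}{\sqrt{N}\,\lVert\boldsymbol{J}\rVert},
  \qquad \mu = 1,\dots,p,\qquad \alpha = p/N,
\]
and pattern $\mu$ counts as correctly stored if $\Delta^{\mu} \ge \kappa$. The minimal fraction of learning errors is then
\[
  \varepsilon_{\min}(\alpha) \;=\; \min_{\boldsymbol{J}}\,
  \frac{1}{p}\sum_{\mu=1}^{p}\theta\!\left(\kappa - \Delta^{\mu}\right),
\]
which vanishes for $\alpha \le \alpha_c(\kappa)$ and becomes strictly positive above saturation, where the distribution of the $\Delta^{\mu}$ characterizes the solution reached by a given learning rule.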