Enlarging the attractor basins of neural networks with noisy external fields

A neural network model with optimal connections, trained with ensembles of external, discrete, noisy fields, is studied. When non-zero storage errors are allowed, novel behaviour emerges that is reflected in the model's retrieval map. The improvement in content addressability is quantified by comparing the maximum storage level at which a near-100% basin of attraction survives. Three cases are presented: the external field applied during training only, during retrieval only, and during both with statistically equal parameters. In all three cases, content addressability improves over the zero-external-field network, with the equal training and retrieval fields giving the largest improvement. However, the apparent dominance of the retrieval field over the training field suggests that this simple equality is perhaps not the optimal relationship.
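The retrieval-field setting described above can be illustrated with a minimal sketch. The snippet below uses plain Hebbian couplings as a simplified stand-in for the paper's optimal connections, stores random binary patterns, and runs zero-temperature dynamics from a corrupted initial state with a noisy external field aligned with the target pattern. The field strength, noise level, and network size are illustrative choices, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 200, 10  # neurons and stored patterns (illustrative sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings as a simplified stand-in for optimal connections
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0)

def retrieve(state, h_ext, steps=20):
    """Zero-temperature synchronous dynamics with external field h_ext."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(J @ s + h_ext)
        s[s == 0] = 1  # break ties deterministically
    return s

target = patterns[0]

# Corrupt the target: flip 20% of the bits to start inside the basin's edge
flip = rng.random(N) < 0.2
init = np.where(flip, -target, target)

# Noisy external retrieval field: mostly aligned with the target,
# with 10% of its components flipped (discrete field noise)
h = 0.3 * target * rng.choice([-1, 1], p=[0.1, 0.9], size=N)

overlap_before = init @ target / N
overlap_after = retrieve(init, h) @ target / N
print(f"overlap before: {overlap_before:.2f}, after: {overlap_after:.2f}")
```

At this low storage level the network recovers the target almost perfectly; the interesting regime studied in the paper is how far the storage level can be pushed before such a basin of attraction collapses, with and without the field.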