Learning algorithms for RAM-based neural networks

RAM-based neural networks are designed to be amenable to hardware implementation, which constrains the choice of learning algorithm. Reverse differentiation allows derivatives to be obtained efficiently on any architecture. The performance of four learning methods on three progressively more difficult problems is compared using a simulated RAM network. The learning algorithms are reward-penalty, batched gradient descent, steepest descent, and conjugate gradient; the applications are the 8-3-8 encoder, character recognition, and particle classification. The results indicate that reward-penalty can solve only the simplest problem. All the gradient-based methods solve the particle-classification task, but the simpler ones require more CPU time.
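To illustrate the reverse-differentiation idea mentioned above, the following is a minimal, self-contained sketch of reverse-mode differentiation on a scalar computation graph. It is a toy example, not the paper's RAM-network implementation: each node records its parents and the local partial derivatives, and a single backward sweep yields the derivative of the output with respect to every input.

```python
class Var:
    """A scalar value that records how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # sequence of (parent_var, local_derivative)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(output):
    """Propagate adjoints from the output back through the graph."""
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                visit(parent)
            order.append(v)
    visit(output)
    output.grad = 1.0
    for v in reversed(order):           # reverse topological order
        for parent, local in v.parents:
            parent.grad += v.grad * local

# Example: f(x, y) = x*y + x at x=3, y=4, so df/dx = y + 1 = 5, df/dy = x = 3.
x, y = Var(3.0), Var(4.0)
f = x * y + x
backward(f)
print(x.grad, y.grad)  # 5.0 3.0
```

One backward pass costs a small constant multiple of the forward pass, independent of the number of inputs, which is why gradient-based training remains practical on any architecture that can evaluate the network.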