Improving the Rprop Learning Algorithm

The Rprop algorithm proposed by Riedmiller and Braun is one of the best-performing first-order learning methods for neural networks. We introduce modifications to the algorithm that improve its learning speed. The resulting speedup is demonstrated experimentally on a set of neural network learning tasks as well as on artificial error surfaces.
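For context, the core of Rprop is a per-parameter step-size adaptation driven only by the sign of the partial derivative, not its magnitude. The sketch below illustrates that sign-based update rule in Python; the function name, array layout, and hyperparameter defaults are illustrative assumptions rather than code from the paper, and the handling of the sign-change case is precisely where the Rprop variants (including the modifications studied here) differ.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    """One sign-based Rprop update for a flat parameter vector w.

    Each parameter keeps its own step size; only the sign of the current
    partial derivative sets the update direction. The defaults (1.2, 0.5,
    1e-6, 50) are the values commonly quoted for Rprop, used here only
    as an assumption.
    """
    change = grad * prev_grad

    # Derivative kept its sign: still heading downhill, so grow the step size.
    step = np.where(change > 0.0, np.minimum(step * eta_plus, step_max), step)

    # Derivative changed sign: a minimum was overstepped, so shrink the step
    # size. Zeroing the stored derivative skips this parameter's update and
    # prevents a second shrink in the next iteration; the Rprop variants
    # differ in exactly how this case is handled (e.g. weight backtracking).
    step = np.where(change < 0.0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(change < 0.0, 0.0, grad)

    w = w - np.sign(grad) * step
    return w, grad, step
```

In batch learning, a training loop would call such a function once per epoch with the current error gradient, carrying `grad` and `step` over to the next call.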

[1] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.

[2] Robert A. Jacobs, et al. Increased rates of convergence through learning rate adaptation, 1987, Neural Networks.

[3] Luís B. Almeida, et al. Speeding up Backpropagation, 1990.

[4] Luís B. Almeida, et al. Acceleration Techniques for the Backpropagation Algorithm, 1990, EURASIP Workshop.

[5] Tom Tollenaere, et al. SuperSAB: Fast adaptive back propagation with good scaling properties, 1990, Neural Networks.

[6] Wolfram Schiffmann, et al. Comparison of optimized backpropagation algorithms, 1993, ESANN.

[7] Martin A. Riedmiller, et al. A direct adaptive method for faster backpropagation learning: the RPROP algorithm, 1993, IEEE International Conference on Neural Networks.

[8] Lutz Prechelt, et al. PROBEN 1 - a set of benchmarks and benchmarking rules for neural network training algorithms, 1994.

[9] Martin A. Riedmiller, et al. Advanced supervised learning in multi-layer perceptrons - From backpropagation to adaptive learning algorithms, 1994.

[10] Nikolaus Hansen, et al. On the Adaptation of Arbitrary Normal Mutation Distributions in Evolution Strategies: The Generating Set Adaptation, 1995, ICGA.

[11] N. Hansen, et al. Convergence Properties of Evolution Strategies with the Derandomized Covariance Matrix Adaptation: The (μ/μI, λ)-CMA-ES, 1997.

[12] Heinrich Braun, et al. Neuronale Netze - Optimierung durch Lernen und Evolution [Neural Networks: Optimization through Learning and Evolution], 1997.

[13] Dimitrios I. Fotiadis, et al. Artificial neural networks for solving ordinary and partial differential equations, 1997, IEEE Trans. Neural Networks.

[14] Dimitris G. Papageorgiou, et al. Neural Network Methods for Boundary Value Problems Defined in Arbitrarily Shaped Domains, 1998, ArXiv.

[15] Wolfram Schiffmann, et al. Speeding Up Backpropagation Algorithms by Using Cross-Entropy Combined with Pattern Normalization, 1998, Int. J. Uncertain. Fuzziness Knowl. Based Syst.

[16] Michael Hüsken, et al. Fast Adaptation of the Solution of Differential Equations to Changing Constraints, 2000.