Evaluation and improvement of two training algorithms

Two effective neural network training algorithms are output weight optimization - hidden weight optimization (OWO-HWO) and conjugate gradient. The former performs better on correlated data, while the latter performs better on random data. Based on these and other observations, we develop a procedure for testing general neural network training algorithms. Since a good training algorithm should perform well on all kinds of data, we develop alternation algorithms, which execute runs of the different algorithms in turn. The alternation algorithm performs well on both kinds of data.
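The alternation idea can be sketched in a few lines: run one algorithm for a fixed number of iterations, then the other, and repeat. The sketch below is illustrative only; full OWO-HWO and conjugate gradient implementations are out of scope here, so plain gradient descent and heavy-ball momentum stand in for the two algorithms, and the model is a one-parameter linear fit rather than a neural network.

```python
def mse(w, data):
    """Mean squared error of the 1-D linear model y = w * x."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def grad(w, data):
    """Gradient of the MSE with respect to w."""
    return sum(2.0 * (w * x - y) * x for x, y in data) / len(data)

def run_first(w, data, steps, lr=0.5):
    # One "run" of the first algorithm (plain gradient descent,
    # standing in for OWO-HWO).
    for _ in range(steps):
        w -= lr * grad(w, data)
    return w

def run_second(w, data, steps, lr=0.5, beta=0.9):
    # One "run" of the second algorithm (heavy-ball momentum,
    # standing in for conjugate gradient).
    v = 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(w, data)
        w += v
    return w

def alternation_train(data, cycles=5, steps=20):
    """Alternation: execute a run of each algorithm in turn."""
    w = 0.0
    for _ in range(cycles):
        w = run_first(w, data, steps)
        w = run_second(w, data, steps)
    return w

# Noise-free toy data from y = 3x, so the optimum is exactly w = 3.
data = [(x / 10.0, 3.0 * x / 10.0) for x in range(1, 11)]
w = alternation_train(data)
print(round(w, 2))  # converges close to 3.0
```

In a real test harness, each run would also monitor the training error, so that the schedule can favor whichever algorithm is making progress on the data at hand.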