ALTERNATIVES TO GRADIENT-BASED NEURAL TRAINING

Neural networks are usually trained with local, gradient-based procedures, and the best architectures are selected by experimentation. Gradient methods often find suboptimal solutions because they become trapped in local minima of the error function. Genetic algorithms are a common alternative, but they do not guarantee optimal solutions either and are computationally expensive. Several new global optimization methods suitable for architecture optimization and neural training are described here. Multistart initialization methods are also presented as an alternative to global minimization.
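
To make the multistart idea concrete, the following is a minimal illustrative sketch (not taken from this paper): a tiny one-hidden-layer network is trained on XOR by plain gradient descent from several independent random initializations, and the run reaching the lowest final error is kept. The network size, learning rate, number of restarts, and all function names are assumptions chosen for the example.

    # Multistart sketch: repeat local gradient training from random
    # initializations and keep the best local minimum found.
    # All hyperparameters below are illustrative assumptions.
    import numpy as np

    def train_once(rng, X, y, hidden=4, lr=0.5, epochs=2000):
        """One gradient-descent run from a random start; returns (loss, params)."""
        W1 = rng.normal(size=(X.shape[1], hidden))
        b1 = np.zeros(hidden)
        W2 = rng.normal(size=(hidden, 1))
        b2 = np.zeros(1)
        for _ in range(epochs):
            # Forward pass: tanh hidden layer, sigmoid output.
            h = np.tanh(X @ W1 + b1)
            p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
            err = p - y
            loss = float(np.mean(err ** 2))
            # Backpropagation of the mean squared error, by hand.
            dz = 2.0 * err * p * (1.0 - p) / len(X)
            dW2, db2 = h.T @ dz, dz.sum(axis=0)
            dh = (dz @ W2.T) * (1.0 - h ** 2)
            dW1, db1 = X.T @ dh, dh.sum(axis=0)
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
        return loss, (W1, b1, W2, b2)

    # XOR: the classic example of an error surface with poor local minima.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    rng = np.random.default_rng(0)
    # Multistart: 10 independent restarts, keep the lowest-error solution.
    runs = [train_once(rng, X, y) for _ in range(10)]
    best_loss, best_params = min(runs, key=lambda r: r[0])
    print(f"best loss over 10 restarts: {best_loss:.4f}")

Each restart is an independent local search, so the best-of-N result approximates a global search at N times the cost of a single run; this is the trade-off multistart methods make against true global minimization.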