Optimizing Weights of Neural Network Using an Adaptive Tabu Search Approach

Feedforward Neural Networks (FNNs) have been widely applied in many fields because of their ability to approximate an unknown function to any desired degree of accuracy. Gradient-based techniques, such as the Back-Propagation (BP) algorithm, are the most common learning algorithms. Because these techniques are essentially local optimization methods, they are prone to converging to locally optimal solutions and can therefore perform poorly, even on simple problems, when forecasting out of sample. We therefore present an adaptive Tabu Search (TS) approach as an alternative to the problematic BP algorithm; it incorporates a novel adaptive search strategy of intensification and diversification that improves the efficiency of standard TS. Taking the classical XOR problem and a function-approximation task as examples, we conducted a comparative study. The experimental results show that the TS algorithm achieves a clearly superior convergence rate and convergence precision compared with BP-based algorithms.
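To make the setup concrete, the sketch below applies a basic Tabu Search with an adaptive intensification/diversification step rule to the weights of a small feedforward network on the XOR problem. It is a minimal illustration, not the paper's implementation: the 2-2-1 network shape, the Gaussian neighborhood, the rounded-weight tabu encoding, and the step-adaptation constants are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): Tabu Search optimizing
# the 9 weights of a 2-2-1 feedforward network on XOR.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, x):
    # Unpack a flat weight vector into a 2-2-1 network with biases.
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = np.tanh(x @ W1 + b1)                       # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

def tabu_search(n_iter=2000, n_neighbors=20, tabu_len=15):
    w = rng.normal(0.0, 1.0, size=9)    # current solution
    best_w, best_e = w.copy(), mse(w)
    tabu = []                           # recently visited (rounded) solutions
    step = 1.0                          # adaptive neighborhood radius
    for _ in range(n_iter):
        # Generate candidate neighbors by perturbing the current weights,
        # skipping any candidate whose rounded form is on the tabu list.
        cands = w + rng.normal(0.0, step, size=(n_neighbors, 9))
        cands = [c for c in cands if tuple(np.round(c, 1)) not in tabu]
        if not cands:
            step *= 2.0                 # all neighbors tabu: diversify
            continue
        errs = [mse(c) for c in cands]
        i = int(np.argmin(errs))
        w = cands[i]                    # move to the best non-tabu neighbor
        tabu.append(tuple(np.round(w, 1)))
        if len(tabu) > tabu_len:
            tabu.pop(0)                 # fixed-length tabu list
        if errs[i] < best_e:
            best_w, best_e = w.copy(), errs[i]
            step = max(step * 0.9, 1e-3)  # intensify: shrink the radius
        else:
            step = min(step * 1.1, 4.0)   # diversify: enlarge the radius
    return best_w, best_e

if __name__ == "__main__":
    w, e = tabu_search()
    print("final MSE:", e)
    print("outputs:", np.round(forward(w, X), 3))
```

Unlike gradient descent, each move here depends only on candidate error values, so the search can escape the flat and deceptive regions of the XOR error surface that trap plain BP; the step-size rule is one simple way to realize the intensification/diversification balance the abstract describes.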