A novel line-search-type algorithm that avoids small local minima

In this paper, we propose a novel optimization method inspired by the line search algorithm and Glauber dynamics. It is a well-known problem that a network trained with a gradient-descent-type algorithm is easily trapped in local minima of the error surface, because the update direction is determined using only local information. To reduce the likelihood of this problem, the proposed method performs, at each learning step, a global minimization of the error surface along a single randomly selected direction, which is expected to tend to skip over local minima of small size. The efficacy of the method is investigated by computer simulation.
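The core idea described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the test function, the grid-based global 1D minimization, and all parameter values are assumptions chosen for demonstration. At each step a random direction is drawn, and the objective is globally minimized along that line by dense sampling, so the update can jump past small ripples that would trap plain gradient descent.

```python
import numpy as np

def f(x):
    # Assumed test surface (Rastrigin-like): a single global minimum
    # at the origin surrounded by many small local minima.
    return np.sum(x**2 - np.cos(8 * np.pi * x))

def random_direction_line_search(f, x0, n_steps=200, half_width=4.0,
                                 n_grid=2001, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        # Randomly select a single search direction (unit vector).
        d = rng.standard_normal(x.shape)
        d /= np.linalg.norm(d)
        # Globally minimize f along the line x + t*d by a dense grid
        # search over t; this is what lets the step skip small minima.
        ts = np.linspace(-half_width, half_width, n_grid)
        vals = [f(x + t * d) for t in ts]
        x = x + ts[int(np.argmin(vals))] * d
    return x

x_star = random_direction_line_search(f, x0=[3.0, -2.5])
print(x_star, f(x_star))
```

Because the grid of step sizes includes t = 0, the objective value never increases, and the global 1D search allows each update to leave a small basin whenever a deeper point exists along the sampled line.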