Improved Nelder-Mead Simplex Method and Applications

Derivative-free optimization algorithms are often used when function derivatives are difficult or time-consuming to obtain. The Nelder-Mead simplex method is one of the most popular derivative-free optimization algorithms in engineering, statistics, and the sciences, favored for its fast convergence and simplicity. The simplex method converges well on small-scale problems with only a few variables, but it has much less success on large-scale problems with many variables, and this shortcoming has significantly reduced its popularity in the optimization community. Two quasi-gradient-based modifications are introduced to improve its success rate and convergence speed. The key contribution of this paper is an improved algorithm that achieves a higher success rate and faster convergence while retaining the simplicity of the original method. The algorithm is compared on several benchmark functions against the original simplex method and other popular optimization algorithms such as the genetic algorithm, the differential evolution algorithm, and the particle swarm algorithm, and the comparison results are then reported and discussed.
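
To make the baseline concrete, the following is a minimal Python sketch of the classic Nelder-Mead iteration (reflection, expansion, contraction, shrink) that the paper builds on. It implements the standard method with the conventional coefficients, not the quasi-gradient variant proposed here; the initial-simplex construction, step sizes, and stopping test are illustrative assumptions, and the objective is re-evaluated freely for clarity rather than cached as a practical implementation would.

```python
import numpy as np

def nelder_mead(f, x0, max_iter=500, tol=1e-8,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimize f from x0 with the classic Nelder-Mead simplex method.

    alpha, gamma, rho, sigma are the standard reflection, expansion,
    contraction, and shrink coefficients.
    """
    n = len(x0)
    # Initial simplex: x0 plus n points, each perturbed along one axis.
    simplex = [np.asarray(x0, dtype=float)]
    for i in range(n):
        point = simplex[0].copy()
        point[i] += 0.05 if point[i] == 0 else 0.05 * point[i]
        simplex.append(point)

    for _ in range(max_iter):
        # Order vertices from best (lowest f) to worst.
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break

        # Centroid of every vertex except the worst.
        centroid = np.mean(simplex[:-1], axis=0)

        # Reflect the worst vertex through the centroid.
        xr = centroid + alpha * (centroid - worst)
        if f(best) <= f(xr) < f(simplex[-2]):
            simplex[-1] = xr
        elif f(xr) < f(best):
            # Expansion: push further along the reflection direction.
            xe = centroid + gamma * (xr - centroid)
            simplex[-1] = xe if f(xe) < f(xr) else xr
        else:
            # Contraction toward the better of the reflected/worst point.
            if f(xr) < f(worst):
                xc, ref = centroid + rho * (xr - centroid), xr
            else:
                xc, ref = centroid + rho * (worst - centroid), worst
            if f(xc) < f(ref):
                simplex[-1] = xc
            else:
                # Shrink every vertex toward the current best.
                simplex = [best + sigma * (v - best) for v in simplex]

    simplex.sort(key=f)
    return simplex[0], f(simplex[0])

# Example: minimize the 2-D Rosenbrock function, a common benchmark.
rosenbrock = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_min, f_min = nelder_mead(rosenbrock, [-1.2, 1.0], max_iter=2000)
```

Note that each update replaces only the worst vertex, which is what keeps the method cheap per iteration and derivative-free: a typical step costs only one or two function evaluations in a tuned implementation.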
