Modeling of Flow Shop Scheduling with Effective Training Algorithms-Based Neural Networks

This paper compares the performance of three of the most effective backpropagation training algorithms for neural networks: gradient descent, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) based quasi-Newton (Q-N) algorithm, and the Levenberg-Marquardt (L-M) algorithm. The neural network is trained on randomly generated datasets built from optimal job sequences of permutation flow shop problems. In the present investigation, learning is stopped when the mean squared error (MSE) reaches 0.001 or after 3000 epochs, whichever occurs first. Overfitting and overtraining are not permitted during model building, so as to avoid poor generalization ability. The performance of the different learning techniques is reported in terms of both solution quality and computational time. The computational results demonstrate that L-M performs best among the three algorithms with respect to both MSE and R², whereas gradient descent is the fastest.
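To make the experimental setup concrete, the following minimal sketch (not the authors' implementation) trains a small feedforward network on a synthetic dataset with the three algorithms compared in the paper: plain gradient descent, BFGS quasi-Newton, and Levenberg-Marquardt. The stopping criteria (MSE goal of 0.001 or 3000 epochs) and the reported metrics (MSE and R²) follow the abstract; the network size, learning rate, dataset, and helper names such as forward(), residuals(), and grad_mse() are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): comparing the three training
# algorithms on a small feedforward network. Dataset, network size, and
# learning rate are illustrative assumptions; the stopping criteria
# (MSE goal 0.001, 3000 epochs) and metrics (MSE, R^2) follow the abstract.
import numpy as np
from scipy.optimize import minimize, least_squares

rng = np.random.default_rng(0)
X = rng.random((200, 5))           # stand-in inputs, e.g. normalized processing times
y = X.sum(axis=1, keepdims=True)   # stand-in target, e.g. a scaled makespan

n_in, n_hid, n_out = 5, 8, 1       # single hidden layer, assumed sizes

def unflatten(w):
    """Split a flat parameter vector into layer weights and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unflatten(w)
    h = np.tanh(X @ W1 + b1)       # hidden layer, tanh activation
    return h @ W2 + b2             # linear output layer

def residuals(w):
    return (forward(w, X) - y).ravel()

def mse(w):
    return float(np.mean(residuals(w) ** 2))

def r2(w):
    ss_res = np.sum((forward(w, X) - y) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def grad_mse(w):
    """Backpropagation gradient of the MSE loss for the two-layer network."""
    W1, b1, W2, b2 = unflatten(w)
    h = np.tanh(X @ W1 + b1)
    dY = 2.0 * (h @ W2 + b2 - y) / y.size   # dMSE/d(output)
    dW2 = h.T @ dY
    db2 = dY.sum(axis=0)
    dZ = (dY @ W2.T) * (1.0 - h ** 2)       # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dZ
    db1 = dZ.sum(axis=0)
    return np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

n_params = n_in * n_hid + n_hid + n_hid * n_out + n_out
w0 = rng.normal(scale=0.1, size=n_params)

# 1) Plain gradient descent (3000-epoch cap, MSE goal 0.001; learning rate assumed).
w_gd, lr = w0.copy(), 0.1
for epoch in range(3000):
    if mse(w_gd) <= 0.001:
        break
    w_gd -= lr * grad_mse(w_gd)

# 2) BFGS quasi-Newton on the same MSE objective.
res_bfgs = minimize(mse, w0, jac=grad_mse, method="BFGS", options={"maxiter": 3000})

# 3) Levenberg-Marquardt, applied to the residual vector rather than the MSE.
res_lm = least_squares(residuals, w0, method="lm", max_nfev=3000)

for name, w in [("GD", w_gd), ("BFGS", res_bfgs.x), ("L-M", res_lm.x)]:
    print(f"{name:5s} MSE = {mse(w):.5f}  R^2 = {r2(w):.4f}")
```

In the paper the training data are derived from optimal job sequences of permutation flow shop problems, whereas the data above are purely synthetic and serve only to exercise the three optimizers under the same stopping rules.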
