Prediction of phase equilibrium properties for complicated macromolecular systems by HGALM neural networks

Traditional error back-propagation (EBP) is a widely used training algorithm for feed-forward neural networks (FFNNs). However, it generally suffers from two problems: a slow learning rate and relatively low accuracy. In this work, a hybrid genetic algorithm combined with a modified Levenberg-Marquardt algorithm (HGALM) is proposed for training FFNNs, improving accuracy and reducing training time compared with the traditional EBP algorithm. FFNNs trained with HGALM were used to predict the binodal curve of the water-DMAc-PSf system and the protein solubility in the lysozyme-NaCl-H2O system. The results can guide experimental research on the preparation of asymmetric polymer membranes and the optimization of protein crystallization processes. (c) 2005 Elsevier B.V. All rights reserved.
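The two-stage idea behind HGALM can be illustrated with a minimal sketch: a genetic algorithm performs a global search over the weight space to supply a good starting point, which a Levenberg-Marquardt loop then refines locally. The network size, GA operators, toy target function, and the finite-difference Jacobian below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny FFNN: 1 input -> H hidden (tanh) -> 1 output, weights packed flat.
H = 5
n_w = 3 * H + 1  # W1 (H), b1 (H), W2 (H), b2 (1)

def predict(w, x):
    W1, b1, W2, b2 = w[:H], w[H:2*H], w[2*H:3*H], w[3*H]
    return np.tanh(np.outer(x, W1) + b1) @ W2 + b2

x = np.linspace(-1.0, 1.0, 40)
y = np.sin(np.pi * x)  # toy target standing in for phase-equilibrium data

def residuals(w):
    return predict(w, x) - y

def mse(w):
    r = residuals(w)
    return float(r @ r) / len(r)

# --- Stage 1: genetic algorithm finds a good initial weight vector ---
pop = rng.normal(0.0, 1.0, (40, n_w))
for gen in range(60):
    fitness = np.array([mse(p) for p in pop])
    parents = pop[np.argsort(fitness)[:10]]          # truncation selection
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(n_w) < 0.5                 # uniform crossover
        children.append(np.where(mask, a, b)
                        + rng.normal(0.0, 0.1, n_w)) # Gaussian mutation
    pop = np.vstack([parents, children])

w = pop[np.argmin([mse(p) for p in pop])]

# --- Stage 2: Levenberg-Marquardt refines the GA result ---
mu = 1e-2
eps = 1e-6
for it in range(100):
    r = residuals(w)
    # numerical Jacobian of the residuals w.r.t. the weights
    J = np.empty((len(x), n_w))
    for j in range(n_w):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (residuals(wp) - r) / eps
    # damped Gauss-Newton step: (J^T J + mu I) step = -J^T r
    step = np.linalg.solve(J.T @ J + mu * np.eye(n_w), -J.T @ r)
    if mse(w + step) < mse(w):
        w = w + step
        mu *= 0.5   # accept: lean toward Gauss-Newton
    else:
        mu *= 2.0   # reject: lean toward gradient descent

print(f"final MSE: {mse(w):.2e}")
```

The damping factor `mu` plays the usual Levenberg-Marquardt role: it is shrunk after successful steps (fast near-quadratic convergence) and grown after failed ones (cautious gradient-like steps), which is what lets the local stage converge quickly once the GA has placed the weights in a good basin.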
