Binary and floating-point function optimization using messy genetic algorithms
Over the years, simple tripartite genetic algorithms (GAs)--search procedures based on the mechanics of natural genetics--have been found successful in a wide variety of problem domains. The working of a simple GA rests on the schema theorem, which implies that an optimal or near-optimal solution is formed by the combination of low-order building blocks over successive generations. There exists, however, a class of functions in which low-order building blocks may not combine to form higher-order building blocks; such functions are termed GA-deceptive. In addition, the underlying coding may place important allele combinations far apart along the string, making them difficult to preserve under the action of the genetic operators. When a problem combines a GA-deceptive function with such poor allele linkage, simple GAs digress from the globally optimal solution and converge to a false optimum. Although the difficulty can be partially addressed with inversion or other reordering operators, these remedies are not practical enough to be of much use. Messy GAs have been found to solve this class of problems successfully. In a messy GA, salient low-order building blocks are selected during the primordial phase; during the subsequent juxtapositional phase, these low-order building blocks are combined into higher-order building blocks, after which the messy GA behaves much like a simple GA, with tight building blocks combining to form the optimal solution. This dissertation examines the working of a messy GA, analyzes its operators, extends its use to problems with building blocks of nonuniform size and scale, and applies messy GAs to a real-world engineering problem that is difficult to solve with a simple GA. A messy floating-point code is also designed to enable floating-point function optimization by adaptively assigning precision to a decision parameter where it is needed. Theoretical and empirical studies demonstrate that messy GAs can tackle hard combinatorial optimization problems that are difficult to solve with other techniques.
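The two-phase mechanics sketched above can be illustrated with a minimal Python example. This is an illustrative sketch, not the dissertation's implementation: the 3-bit deceptive trap subfunction, the all-zero competitive template, and the population sizes and generation counts are assumptions chosen for brevity. It shows the messy representation as (position, allele) pairs expressed against a template with left-to-right precedence, a selection-only primordial phase, and a juxtapositional phase that combines blocks with cut-and-splice.

```python
import random

STRING_LEN = 12   # problem length in bits (assumed for illustration)
BLOCK = 3         # size of each deceptive building block (assumed)

def subfunction(bits):
    """Toy 3-bit deceptive trap: all ones is optimal, all zeros is the deceptive attractor."""
    ones = sum(bits)
    return 3 if ones == BLOCK else 2 - ones

def express(chromosome, template):
    """Overlay a messy chromosome (list of (position, allele) pairs) on a competitive
    template; the first occurrence of a position wins (left-to-right precedence)."""
    phenotype = list(template)
    seen = set()
    for pos, allele in chromosome:
        if pos not in seen:
            phenotype[pos] = allele
            seen.add(pos)
    return phenotype

def fitness(chromosome, template):
    bits = express(chromosome, template)
    return sum(subfunction(bits[i:i + BLOCK]) for i in range(0, STRING_LEN, BLOCK))

def cut_and_splice(a, b):
    """Juxtapositional-phase operator: cut each parent at a random point and splice."""
    ca = random.randint(1, max(1, len(a) - 1))
    cb = random.randint(1, max(1, len(b) - 1))
    return a[:ca] + b[cb:], b[:cb] + a[ca:]

def tournament(pop, template):
    a, b = random.sample(pop, 2)
    return a if fitness(a, template) >= fitness(b, template) else b

random.seed(0)
template = [0] * STRING_LEN  # deceptive competitive template (assumed)

# Primordial phase: enumerate all order-BLOCK partial strings, then let
# selection alone (no recombination) enrich the salient building blocks.
population = []
for start in range(0, STRING_LEN, BLOCK):
    positions = list(range(start, start + BLOCK))
    for value in range(2 ** BLOCK):
        alleles = [(value >> i) & 1 for i in range(BLOCK)]
        population.append(list(zip(positions, alleles)))
for _ in range(10):
    population = [tournament(population, template) for _ in population]

# Juxtapositional phase: cut-and-splice joins the enriched low-order blocks
# into longer, tightly linked strings, much as a simple GA would.
for _ in range(10):
    nxt = []
    while len(nxt) < len(population):
        p1 = tournament(population, template)
        p2 = tournament(population, template)
        nxt.extend(cut_and_splice(p1, p2))
    population = nxt

best = max(population, key=lambda c: fitness(c, template))
print("best fitness:", fitness(best, template), "phenotype:", express(best, template))
```

On this deceptive trap, a simple fixed-position GA with tight population limits tends to drift toward the all-zero attractor, whereas the enumerate-then-splice structure above preserves the all-ones blocks and assembles them into the global optimum; the messy floating-point coding described in the abstract extends the same idea by letting the representation allocate bits (precision) to a decision parameter only where it is needed.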