Evolutionary generation and training of recurrent artificial neural networks

An evolutionary methodology for the design and training of artificial neural networks is presented, aimed at obtaining optimum or quasi-optimum synchronous recurrent neural networks capable of processing sequential inputs. We show that, using this method with floating-point and integer-valued chromosomes, optimum results can be achieved even with very small populations and few generations. To implement the methodology, we have developed GENIAL, a genetic-algorithm development environment specifically designed for this class of problem; it provides facilities for testing candidate fitness functions and a range of tools for improving results. Finally, we discuss the sequential introduction of constraints into genetic algorithms, presenting a classical example in which several design requirements are satisfied simultaneously, demonstrating the power of the method.
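The core idea of the abstract can be illustrated with a minimal sketch of evolving floating-point chromosomes that encode the weights of a small synchronous recurrent network, evaluated on a sequential task. This is an illustrative assumption, not the authors' GENIAL environment: the network sizes, the delayed-echo task, and the truncation-selection GA below are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synchronous recurrent network (hypothetical sizes):
#   h_t = tanh(W_in x_t + W_rec h_{t-1}),  y_t = W_out h_t
N_IN, N_HID, N_OUT = 1, 4, 1
N_GENES = N_HID * N_IN + N_HID * N_HID + N_OUT * N_HID

def unpack(chrom):
    """Split a flat floating-point chromosome into the three weight matrices."""
    i = 0
    W_in = chrom[i:i + N_HID * N_IN].reshape(N_HID, N_IN); i += N_HID * N_IN
    W_rec = chrom[i:i + N_HID * N_HID].reshape(N_HID, N_HID); i += N_HID * N_HID
    W_out = chrom[i:].reshape(N_OUT, N_HID)
    return W_in, W_rec, W_out

def run(chrom, xs):
    """Run the recurrent network synchronously over the input sequence."""
    W_in, W_rec, W_out = unpack(chrom)
    h = np.zeros(N_HID)
    ys = []
    for x in xs:
        h = np.tanh(W_in @ np.array([x]) + W_rec @ h)
        ys.append(float(W_out @ h))
    return np.array(ys)

# Example sequential task: echo the input delayed by one step,
# which requires the network to store state across time steps.
xs = rng.choice([-1.0, 1.0], size=30)
targets = np.concatenate([[0.0], xs[:-1]])

def fitness(chrom):
    return -np.mean((run(chrom, xs) - targets) ** 2)  # higher is better

# Small population and few generations, as the abstract emphasizes.
POP, GENS, SIGMA = 24, 40, 0.15
pop = rng.normal(0.0, 0.5, size=(POP, N_GENES))
history = []
for gen in range(GENS):
    scores = np.array([fitness(c) for c in pop])
    order = np.argsort(scores)[::-1]
    history.append(-scores[order[0]])   # best MSE this generation
    parents = pop[order[: POP // 2]]    # truncation selection
    children = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        # arithmetic crossover plus Gaussian mutation on real-valued genes
        children.append(0.5 * (a + b) + rng.normal(0.0, SIGMA, N_GENES))
    pop = np.vstack([parents, children])  # elitism: parents survive unchanged

print(f"best MSE: gen 0 = {history[0]:.4f}, final = {history[-1]:.4f}")
```

Because the parents survive unchanged (elitism), the best error per generation is non-increasing; on this toy task the population typically converges in a few dozen generations, consistent with the abstract's claim that small populations can suffice.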