Block-based neural networks

This paper presents a novel block-based neural network (BBNN) model and the optimization of its structure and weights using a genetic algorithm. The BBNN architecture consists of a 2D array of fundamental blocks with four variable input/output nodes and connection weights. Each block can take one of four different internal configurations depending on the structure settings. The BBNN model imposes restrictions such as a 2D array structure and integer weights in order to allow easier implementation with reconfigurable hardware such as field programmable gate arrays (FPGAs). The structure and weights of the BBNN are encoded as bit strings that correspond to the configuration bits of an FPGA. The configuration bits are optimized globally using a genetic algorithm with 2D encoding and modified genetic operators. Simulations show that the optimized BBNN can solve engineering problems such as pattern classification and mobile robot control.
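As a rough illustration of the block structure described above, the following Python sketch models a single fundamental block with four nodes, one of four internal input/output configurations, and integer weights. The node indexing, the specific configuration patterns, and the saturating integer activation are assumptions chosen for illustration, not the paper's exact definitions:

```python
def activate(x, lo=-8, hi=7):
    """Saturate to a small integer range, mimicking fixed-point hardware
    (the range is an assumption for illustration)."""
    return max(lo, min(hi, x))

class Block:
    """One block of the 2D grid, with four nodes indexed 0..3.

    A configuration code selects which nodes act as inputs and which as
    outputs; the four cases below (1-in/3-out, 3-in/1-out, and two
    2-in/2-out variants) are hypothetical stand-ins for the paper's four
    internal configurations.
    """
    CONFIGS = {
        0: ({0},       {1, 2, 3}),   # 1 input / 3 outputs
        1: ({0, 1, 2}, {3}),         # 3 inputs / 1 output
        2: ({0, 1},    {2, 3}),      # 2 inputs / 2 outputs
        3: ({0, 2},    {1, 3}),      # 2 inputs / 2 outputs (alternate)
    }

    def __init__(self, config, weights, bias=0):
        self.inputs, self.outputs = self.CONFIGS[config]
        self.weights = weights  # integer weights keyed by (input, output)
        self.bias = bias

    def forward(self, node_values):
        """Compute each output node as a saturated integer-weighted sum
        of the block's input nodes."""
        out = dict(node_values)
        for o in self.outputs:
            s = self.bias
            for i in self.inputs:
                s += self.weights[(i, o)] * node_values[i]
            out[o] = activate(s)
        return out
```

In a full BBNN, blocks like this would be tiled in a 2D array with shared boundary nodes, and a genetic algorithm would search over the configuration codes and the integer weights; both map naturally to short bit strings, which is what makes the model attractive for FPGA implementation.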
