Meta-Learning Neural Networks

Although artificial neural networks can perform a wide variety of tasks, in practice they sometimes deliver only marginal performance, and inappropriate choices of topology and learning algorithm are frequently to blame. There is little reason to expect a uniformly best algorithm for selecting the weights of a feedforward artificial neural network. This is in accordance with the no-free-lunch theorem, which states that for any algorithm, elevated performance on one class of problems is exactly paid for by degraded performance on another class. As the complexity of the problem domain increases, manual design becomes increasingly difficult and unmanageable.

Evolutionary design of artificial neural networks has been widely explored. Evolutionary algorithms are used to adapt the connection weights, network architecture, and learning rules to the problem environment. A distinctive feature of evolutionary neural networks is their adaptability to a dynamic environment: such networks can adapt both to an environment and to changes in that environment. The two forms of adaptation in evolutionary artificial neural networks, evolution and learning, make their adaptation to a dynamic environment far more effective and efficient than the conventional learning approach. However, although evolutionary algorithms are well known as efficient global search algorithms, they often miss the best local solutions in a complex solution space. A hybrid meta-learning approach therefore combines evolutionary learning with local search methods (using first- and second-order error information) to improve on the learning quality and convergence speed obtained with a direct evolutionary approach.

In this tutorial, we will review the different neural network learning paradigms, followed by experimental results that demonstrate the difficulty of designing neural networks that are smaller, faster, and generalize better.
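As a minimal sketch of the hybrid idea, the following toy example (the XOR task, network size, population parameters, and function names are assumptions for illustration, not the setup used in the tutorial) evolves the weights of a fixed 2-2-1 network and fine-tunes the elite of each generation with a first-order local search, here approximated by numerical-gradient descent:

```python
import random
import math

random.seed(0)

# Toy task: XOR with a fixed 2-2-1 feedforward net (tanh units).
XOR = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
       ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]
N_WEIGHTS = 9  # 2x2 input weights + 2 hidden biases + 2 output weights + 1 bias

def forward(w, x):
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h0 + w[7] * h1 + w[8])

def mse(w):
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def local_search(w, steps=20, lr=0.3, eps=1e-5):
    """First-order local refinement: numerical-gradient descent on the error."""
    w = list(w)
    for _ in range(steps):
        base = mse(w)
        grad = []
        for i in range(len(w)):
            w[i] += eps
            grad.append((mse(w) - base) / eps)
            w[i] -= eps
        w = [wi - lr * g for wi, g in zip(w, grad)]
    return w

def hybrid_evolve(pop_size=20, generations=30):
    pop = [[random.uniform(-1.0, 1.0) for _ in range(N_WEIGHTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mse)
        # Hybrid step: fine-tune the elite with local (gradient) search.
        elite = [local_search(w) for w in pop[:pop_size // 4]]
        # Refill the population by Gaussian mutation of the refined elite.
        pop = elite + [[wi + random.gauss(0.0, 0.3)
                        for wi in random.choice(elite)]
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=mse)

best = hybrid_evolve()
```

The global evolutionary search keeps the population diverse, while the embedded local search pulls promising candidates into nearby error minima that mutation alone would rarely reach.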
Further, we introduce evolutionary algorithms and the state of the art in the design of evolutionary artificial neural networks, followed by the proposed meta-learning framework. In this framework, in addition to the evolutionary search for connection weights and architectures (connectivity and activation functions), local search techniques are used to fine-tune the weights (meta-learning). We will discuss some experimental results, followed by discussion and conclusions.
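A toy illustration of that framework is sketched below (the genome encoding, activation set, population sizes, and XOR task are assumptions for this sketch, not the framework's actual configuration): each genome encodes an architecture (hidden-layer size plus activation function), and its fitness is measured after a brief weight fine-tuning phase by numerical-gradient descent:

```python
import random
import math

random.seed(1)

# Genome = (number of hidden units, activation name); both are evolved.
ACTIVATIONS = {
    "tanh": math.tanh,
    "sigmoid": lambda z: 1.0 / (1.0 + math.exp(-z)),
    "relu": lambda z: max(0.0, z),
}
XOR = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
       ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

def make_weights(hidden):
    # 2 inputs -> hidden layer -> 1 linear output, all with biases.
    return [random.uniform(-1.0, 1.0) for _ in range(4 * hidden + 1)]

def forward(genome, w, x):
    hidden, act_name = genome
    act = ACTIVATIONS[act_name]
    h = [act(w[3 * j] * x[0] + w[3 * j + 1] * x[1] + w[3 * j + 2])
         for j in range(hidden)]
    return sum(w[3 * hidden + j] * h[j] for j in range(hidden)) + w[-1]

def loss(genome, w):
    return sum((forward(genome, w, x) - y) ** 2 for x, y in XOR)

def fitness(genome, trials=2, steps=150, lr=0.1, eps=1e-5):
    """Fine-tune each candidate architecture briefly (numerical-gradient
    descent) and score it by the best loss reached."""
    best = float("inf")
    for _ in range(trials):
        w = make_weights(genome[0])
        for _ in range(steps):
            base = loss(genome, w)
            grad = []
            for i in range(len(w)):
                w[i] += eps
                grad.append((loss(genome, w) - base) / eps)
                w[i] -= eps
            w = [wi - lr * g for wi, g in zip(w, grad)]
        best = min(best, loss(genome, w))
    return best

# Evolutionary loop over architectures: keep the fittest genomes and
# mutate them (perturb the hidden-layer size, resample the activation).
pop = [(random.randint(1, 4), random.choice(list(ACTIVATIONS)))
       for _ in range(6)]
for _ in range(4):
    pop.sort(key=fitness)
    parents = pop[:3]
    pop = parents + [(max(1, min(5, h + random.choice([-1, 0, 1]))),
                      random.choice(list(ACTIVATIONS)))
                     for h, _ in (random.choice(parents) for _ in range(3))]
best_genome = min(pop, key=fitness)
```

The point of the sketch is the division of labor: the outer evolutionary loop explores discrete architectural choices, while the inner local search evaluates each architecture at (approximately) its trained performance rather than with random weights.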
