Optimising Deep Belief Networks by a hyper-heuristic approach

Deep Belief Networks (DBNs) have been successful in classification, especially in image recognition tasks. However, the performance of a DBN is often highly dependent on its settings, in particular the combination of runtime parameter values. In this work, we propose a hyper-heuristic based framework which can optimise DBNs independently of the problem domain. This is the first time a hyper-heuristic has been applied in this domain. The framework iteratively selects suitable heuristics from a heuristic set and applies the selected heuristic to tune the DBN to better fit the current search space. Under this framework, the setting of DBN learning is adaptive. Three well-known image reconstruction benchmark sets were used to evaluate the performance of this new approach. Our experimental results show that this hyper-heuristic approach can achieve high accuracy under different scenarios on diverse image sets. In addition, state-of-the-art meta-heuristic methods for tuning DBNs were introduced for comparison. The results illustrate that our hyper-heuristic approach obtains better performance on almost all test cases.
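The abstract describes a selection hyper-heuristic loop: repeatedly pick a low-level heuristic from a set, apply it to perturb the DBN's runtime parameters, and accept the move if it improves fitness. The sketch below is an illustrative reconstruction of that general scheme, not the authors' actual algorithm; the parameter names, the credit-based roulette selection, and the surrogate fitness function are all assumptions (a real run would train a DBN and measure reconstruction error instead).

```python
import random

# Hypothetical DBN runtime parameters (names are illustrative only).
PARAMS = {"learning_rate": 0.1, "n_hidden": 64, "cd_steps": 1, "momentum": 0.5}

def evaluate(params):
    # Surrogate fitness standing in for DBN training + reconstruction error
    # (lower is better). This target is invented purely for demonstration.
    target = {"learning_rate": 0.05, "n_hidden": 128, "cd_steps": 3, "momentum": 0.9}
    return (abs(params["learning_rate"] - target["learning_rate"]) * 10
            + abs(params["n_hidden"] - target["n_hidden"]) / 128
            + abs(params["cd_steps"] - target["cd_steps"])
            + abs(params["momentum"] - target["momentum"]))

# Low-level heuristics: each perturbs one parameter in place.
def tweak_lr(p):
    p["learning_rate"] = max(1e-4, p["learning_rate"] * random.choice([0.5, 2.0]))

def tweak_hidden(p):
    p["n_hidden"] = max(16, p["n_hidden"] + random.choice([-32, 32]))

def tweak_cd(p):
    p["cd_steps"] = max(1, p["cd_steps"] + random.choice([-1, 1]))

def tweak_momentum(p):
    p["momentum"] = min(0.99, max(0.0, p["momentum"] + random.choice([-0.1, 0.1])))

HEURISTICS = [tweak_lr, tweak_hidden, tweak_cd, tweak_momentum]

def hyper_heuristic_search(iterations=200, seed=0):
    random.seed(seed)
    credit = [1.0] * len(HEURISTICS)   # running reward per heuristic
    best = dict(PARAMS)
    best_fit = evaluate(best)
    for _ in range(iterations):
        # Selection: roulette wheel weighted by accumulated credit.
        i = random.choices(range(len(HEURISTICS)), weights=credit)[0]
        candidate = dict(best)
        HEURISTICS[i](candidate)
        fit = evaluate(candidate)
        # Move acceptance: keep improving candidates; reward the heuristic used.
        if fit < best_fit:
            best, best_fit = candidate, fit
            credit[i] += 1.0
    return best, best_fit
```

The key property of the hyper-heuristic layer is that the selection mechanism operates on the heuristic set, not on the DBN directly, so the same loop can be reused across problem domains by swapping the fitness function.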
