A novel hybrid learning algorithm for full Bayesian approach of artificial neural networks

Highlights
- A novel Monte Carlo algorithm is proposed to train Bayesian neural networks.
- The algorithm is based on the full Bayesian approach to artificial neural networks.
- Markov chain Monte Carlo methods are integrated with genetic algorithms and fuzzy membership functions.
- The proposed algorithm is applied to time series and regression analysis in the context of BNNs.
- The proposed approach outperforms traditional training methods in estimation performance.

Abstract
Bayesian neural networks are useful tools for estimating the functional structure of nonlinear systems. However, they suffer from several difficulties, such as controlling model complexity, long training times, inefficient parameter estimation, random-walk behavior, and getting stuck in local optima when the parameter space is high-dimensional. To alleviate these problems, this paper proposes a novel hybrid Bayesian learning procedure. The approach is based on full Bayesian learning and integrates Markov chain Monte Carlo procedures with genetic algorithms and fuzzy membership functions. In the application sections, nonlinear time series and regression analyses are handled separately to examine the performance of the proposed approach, and it is compared with traditional training techniques in terms of estimation and prediction ability.
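The abstract gives only this high-level description, so the sketch below is an informal illustration of how a hybrid population sampler for network weights might look, not the authors' algorithm: it stands in for the genetic crossover/mutation operators with a differential-evolution-style population proposal (in the spirit of evolutionary Monte Carlo), omits the fuzzy membership functions and hyperparameter sampling, and every function name and tuning constant below is an assumption.

```python
# A minimal, illustrative sketch of the general idea only -- NOT the authors'
# exact procedure. It samples the weights of a one-hidden-layer regression
# network with a population of Metropolis chains, replacing genetic operators
# with a differential-evolution-style population proposal. Fuzzy membership
# functions and hyperparameter sampling are omitted; all constants are assumed.
import numpy as np

rng = np.random.default_rng(0)

def unpack(theta, n_in, n_hid):
    """Split a flat parameter vector into the network's weights and biases."""
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid]; i += n_hid
    b2 = theta[i]
    return W1, b1, W2, b2

def forward(theta, X, n_hid):
    """One-hidden-layer network with tanh units and a linear output."""
    W1, b1, W2, b2 = unpack(theta, X.shape[1], n_hid)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def log_post(theta, X, y, n_hid, sigma=0.1, tau=1.0):
    """Gaussian likelihood plus a Gaussian prior on all weights (log scale);
    a full Bayesian treatment would also place priors on sigma and tau."""
    resid = y - forward(theta, X, n_hid)
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(theta**2) / tau**2

def hybrid_sampler(X, y, n_hid=5, n_chains=8, n_iter=2000,
                   step=0.02, gamma=0.5, p_pop=0.5):
    """Population MCMC: random-walk Metropolis moves mixed with
    differential-evolution proposals built from other chains."""
    dim = X.shape[1] * n_hid + 2 * n_hid + 1
    pop = rng.normal(scale=0.5, size=(n_chains, dim))
    logp = np.array([log_post(t, X, y, n_hid) for t in pop])
    for _ in range(n_iter):
        for k in range(n_chains):
            if rng.random() < p_pop:
                # population move: jump along the difference of two other chains
                r1, r2 = rng.choice([j for j in range(n_chains) if j != k],
                                    size=2, replace=False)
                prop = pop[k] + gamma * (pop[r1] - pop[r2]) \
                       + rng.normal(scale=1e-3, size=dim)
            else:
                # plain random-walk move
                prop = pop[k] + rng.normal(scale=step, size=dim)
            lp = log_post(prop, X, y, n_hid)
            if np.log(rng.random()) < lp - logp[k]:   # Metropolis accept/reject
                pop[k], logp[k] = prop, lp
    return pop  # the final population approximates draws from the posterior

# Toy usage: posterior-mean prediction for a noisy sine curve.
X = np.linspace(-3, 3, 100).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=100)
samples = hybrid_sampler(X, y)
y_hat = np.mean([forward(t, X, 5) for t in samples], axis=0)
print("RMSE of posterior-mean fit:", np.sqrt(np.mean((y - y_hat)**2)))
```

The population move is one simple way to address the random-walk and local-optima issues the abstract mentions, since proposals built from the spread of other chains can take larger, better-oriented jumps than an isolated random walk; the paper's actual remedy combines genetic operators and fuzzy membership functions instead.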
