Resilient back-propagation approach in small-world feed-forward neural network topology based on Newman–Watts algorithm

Research on network topologies and training algorithms has intensified because both can reduce the overfitting problem in artificial neural networks. In this context, our previous work showed that Newman–Watts small-world feed-forward artificial neural networks achieve better classification and prediction performance than conventional feed-forward artificial neural networks. In this study, we investigate the effects of the Resilient back-propagation (RPROP) algorithm on the small-world network topology and propose a Resilient Newman–Watts small-world feed-forward artificial neural network model under fixed initial topological conditions. We find that the Resilient small-world network further reduces overfitting and further improves network performance compared with conventional feed-forward artificial neural networks. Furthermore, we show that the proposed network model does not increase algorithmic complexity relative to the other models. The obtained results imply that the proposed model can contribute to solving the overfitting problem encountered in both deep neural networks and conventional artificial neural networks.
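
To make the two ingredients concrete, the following is a minimal NumPy sketch, not the authors' implementation: it builds a Newman–Watts-style connectivity mask (regular layer-to-layer edges plus randomly added shortcut edges between non-adjacent layers, with nothing rewired or removed) and applies a sign-based RPROP-style weight update (iRprop- variant) with per-weight adaptive step sizes. The layer sizes, shortcut probability p, and all function names are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rprop_update(w, grad, prev_grad, step,
                 eta_plus=1.2, eta_minus=0.5,
                 step_min=1e-6, step_max=50.0):
    """One RPROP-style (iRprop- variant) update for a weight array.

    Step sizes grow when the gradient keeps its sign and shrink when it
    flips; the update uses only the sign of the gradient, not its magnitude.
    """
    sign_change = grad * prev_grad
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)   # iRprop-: suppress update after a sign flip
    w = w - np.sign(grad) * step                  # move each weight by its own step size
    return w, grad, step                          # returned grad becomes prev_grad next call

def newman_watts_mask(layer_sizes, p, rng):
    """Boolean connectivity mask: all regular feed-forward edges, plus
    shortcut edges between non-adjacent layers added with probability p
    (Newman-Watts: shortcuts are added, existing edges are kept)."""
    n = sum(layer_sizes)
    offsets = np.cumsum([0] + list(layer_sizes))
    mask = np.zeros((n, n), dtype=bool)
    for l in range(len(layer_sizes) - 1):         # regular layer-to-layer edges
        mask[offsets[l]:offsets[l + 1], offsets[l + 1]:offsets[l + 2]] = True
    for src in range(len(layer_sizes) - 2):       # shortcut sources
        for dst in range(src + 2, len(layer_sizes)):  # skip-layer targets only
            block = rng.random((layer_sizes[src], layer_sizes[dst])) < p
            mask[offsets[src]:offsets[src + 1], offsets[dst]:offsets[dst + 1]] |= block
    return mask

# Toy usage: a 4-6-6-1 network with shortcut probability 0.1; the gradient
# here is a random stand-in for a real back-propagated gradient.
sizes = [4, 6, 6, 1]
mask = newman_watts_mask(sizes, p=0.1, rng=rng)
w = rng.normal(0.0, 0.1, size=mask.shape) * mask
grad = rng.normal(size=mask.shape) * mask
prev_grad = np.zeros_like(grad)
step = np.full_like(w, 0.1)
w, prev_grad, step = rprop_update(w, grad, prev_grad, step)
```

In a full training loop one would recompute grad by back-propagation at each epoch, mask it so that absent connections never receive updates, and call rprop_update repeatedly; the mask itself is fixed once at initialization, matching the fixed initial topological conditions assumed above.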
