A neural network sensitivity analysis in the presence of random fluctuations

In this work we investigate the sensitivity of a neural network to noise applied to its weights. The goal of this study is to understand the network's behavior and the sensitivity of its outputs to perturbations of the weights during the training process. From an engineering point of view, noise is perceived as detrimental to a system and to the quality of its output; in biological neural systems, however, noise fluctuations can be observed that improve information processing in certain respects. By means of sensitivity analysis tools, we quantify the acceptable level of noise at which the random fluctuations yield an optimal solution without sacrificing the behavior of the network. The three different indicators utilized in this endeavor allow us to observe whether the noise variance is detrimental or beneficial, and whether it acts as a source of fluctuation.

Highlights

- We introduce a sensitivity analysis (using three indicators) of the solutions provided by a neural network in the presence of noise in the weights.
- We propose a mathematical tool, relatively uncommon in the neural network community, that allows us to draw quantitatively relevant conclusions about the performance of the system in the presence of noise in the weights.
- We show that a certain amount of perturbation in the set of weights can, under particular conditions, be an advantage.
- We provide two numerical experiments to showcase the method for calculating the noise in the network, applying three indicators in the study: Euclidean distance (L2), cosine similarity (Lcos), and the L∞ norm.
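The three indicators named above can be illustrated with a minimal sketch. The exact definitions used in the study are not reproduced here; this assumes the indicators are computed between a clean weight (or output) vector and its noise-perturbed counterpart, with Gaussian perturbations of a chosen variance:

```python
import numpy as np

def perturbation_indicators(w_clean, w_noisy):
    """Compare a nominal vector with its noise-perturbed counterpart
    using the three indicators: L2, cosine similarity, and L_inf."""
    w_clean = np.asarray(w_clean, dtype=float)
    w_noisy = np.asarray(w_noisy, dtype=float)
    diff = w_noisy - w_clean
    l2 = np.linalg.norm(diff)                       # Euclidean distance (L2)
    cos = np.dot(w_clean, w_noisy) / (
        np.linalg.norm(w_clean) * np.linalg.norm(w_noisy)
    )                                               # cosine similarity (Lcos)
    linf = np.max(np.abs(diff))                     # largest single deviation (L_inf)
    return l2, cos, linf

# Illustrative use: additive Gaussian noise on a hypothetical weight vector.
rng = np.random.default_rng(0)
w = rng.standard_normal(100)
w_noisy = w + rng.normal(scale=0.05, size=w.shape)  # noise std 0.05 (assumed)
l2, cos, linf = perturbation_indicators(w, w_noisy)
```

Sweeping the noise scale and tracking how the three indicators respond is one way to locate the level of perturbation at which the fluctuations stop being benign.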
