Noise-modulated neural networks as an application of stochastic resonance

Stochastic resonance (SR) is a phenomenon in which an input signal to a nonlinear system, too weak by itself to affect the output, becomes detectable once a non-zero level of noise is added to the system. SR is known to help biological organisms cope with noisy environments, enabling sophisticated information processing and adaptive behavior. The SR effect can be interpreted as a reduction of the input-output information loss of a nonlinear system, which the noise makes stochastically closer to a linear one. This work shows how SR can improve the performance of a system even when the desired input-output relationship is itself nonlinear, specifically for neural networks whose hidden layer consists of threshold activation functions. The universal approximation capability of neural networks exploiting SR is then discussed: although a network of threshold activation functions has been proven to be a universal approximator in the context of the extreme learning machine (ELM), once SR is taken into account the system can be regarded as a classic three-layer neural network, whose universality has already been established by simpler proofs. After proving the universal approximation capability for an infinite number of hidden units, the performance achieved with a finite number of hidden units is evaluated using two training algorithms, namely backpropagation and ELM. The results highlight the SR effect occurring in the proposed system and the relationship among the number of hidden units, the noise intensity, and the approximation performance.
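
The following is a minimal sketch (not taken from the paper; all names, parameter values, and the toy task are illustrative) of the mechanism described above: Gaussian noise added to the pre-activations of Heaviside hidden units, averaged over noise realizations, yields a smooth sigmoid-like effective activation, which an ELM-style least-squares readout can then exploit.

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold_layer(x, W, b, noise_std, n_noise=100):
    """Hidden layer of Heaviside units whose pre-activations are perturbed
    by additive Gaussian noise; outputs are averaged over noise realizations.
    With noise_std > 0 the mean activation approximates a smooth sigmoid
    (the Gaussian CDF), which is the SR effect exploited here."""
    pre = x @ W + b                                   # (N, H) pre-activations
    noise = rng.normal(0.0, noise_std, size=(n_noise,) + pre.shape)
    return np.heaviside(pre + noise, 0.5).mean(axis=0)

def elm_fit(x, y, n_hidden, noise_std):
    """ELM-style training: random hidden weights, least-squares output weights."""
    W = rng.normal(size=(x.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = threshold_layer(x, W, b, noise_std)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)      # output weights
    return W, b, beta

# Toy regression: approximate sin(x) on [-pi, pi].
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)
W, b, beta = elm_fit(x, y, n_hidden=50, noise_std=0.5)
y_hat = threshold_layer(x, W, b, noise_std=0.5) @ beta
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

Under these assumptions, sweeping noise_std upward from zero would expose the SR-like trade-off the paper evaluates: with too little noise the threshold units give the least-squares readout almost no graded information, while too much noise washes out the signal.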
