Analysis on the Boltzmann Machine with Random Input Drifts in Activation Function

The Boltzmann machine (BM) model is able to learn the probability distribution of input patterns. However, analog realizations suffer from thermal noise and random offset voltages in the amplifiers. These realization issues affect the behaviour of the neurons' activation function and can be modelled as random input drifts. This paper analyzes the activation function and state distribution of BMs under the random input drift model. Since the state of a neuron is determined by its activation function, random input drifts may change the behaviour of a BM. We show that the effect of random input drifts is equivalent to raising the temperature factor. Hence, from the Kullback–Leibler (KL) divergence perspective, we propose a compensation scheme to reduce the effect of random input drifts. In deriving the compensation scheme, we assume that the input drift follows a Gaussian distribution. Surprisingly, our simulations show that the proposed compensation scheme also works well for other drift distributions.
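To make the temperature-equivalence claim concrete, the following minimal Python sketch numerically compares the drift-averaged activation of a sigmoidal BM neuron with a noise-free sigmoid evaluated at a raised effective temperature. It relies on the standard probit-style approximation of the logistic function; the symbols h, T, sigma and the helper function names are illustrative assumptions, not notation or code taken from the paper.

```python
# Minimal sketch (not the paper's code): check that Gaussian input drift in a
# sigmoidal BM neuron behaves approximately like a raised temperature,
#   E_e[sigmoid((h + e)/T)] ~= sigmoid(h / T_eff),  e ~ N(0, sigma^2),
#   T_eff = sqrt(T^2 + pi * sigma^2 / 8)   (probit-style approximation).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def drifted_activation(h, T, sigma, n_samples=200_000, rng=None):
    """Monte Carlo estimate of the firing probability under random input drift."""
    rng = np.random.default_rng(0) if rng is None else rng
    e = rng.normal(0.0, sigma, size=n_samples)   # random input drift samples
    return sigmoid((h + e) / T).mean()

def effective_temperature(T, sigma):
    """Raised temperature that approximates the drift-averaged activation."""
    return np.sqrt(T**2 + np.pi * sigma**2 / 8.0)

if __name__ == "__main__":
    T, sigma = 1.0, 1.5
    T_eff = effective_temperature(T, sigma)
    for h in (-2.0, -0.5, 0.0, 1.0, 3.0):
        mc = drifted_activation(h, T, sigma)
        approx = sigmoid(h / T_eff)
        print(f"h={h:+.1f}  drift-averaged={mc:.4f}  sigmoid(h/T_eff)={approx:.4f}")
```

Under this illustrative approximation, compensation amounts to rescaling the temperature (equivalently, the weights and biases) so that the drifted machine reproduces the behaviour intended for the noise-free one.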
