Restricted Boltzmann Machine for Nonlinear System Modeling

In this paper, we apply a deep learning method, the restricted Boltzmann machine (RBM), to nonlinear system identification. The neural model has a deep architecture whose structure is selected by random search, and its initial weights are obtained from restricted Boltzmann machines. To identify nonlinear systems, we propose a special unsupervised learning method that pre-trains the model on the input data; standard supervised learning is then used to fine-tune the weights with the output data. The modified algorithm is validated by modeling two benchmark systems.
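The pre-training step described above can be illustrated with a minimal sketch of RBM training using one step of contrastive divergence (CD-1), the standard procedure for obtaining initial weights of a deep model before supervised fine-tuning. All names, sizes, and hyperparameters below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=50):
    """Train an RBM on `data` (n_samples x n_visible) with CD-1.

    Hypothetical sketch: returns weights and biases that could
    initialize one layer of a deep neural model.
    """
    n_visible = data.shape[1]
    W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
    b_v = np.zeros(n_visible)   # visible-unit biases
    b_h = np.zeros(n_hidden)    # hidden-unit biases
    for _ in range(epochs):
        # Positive phase: hidden activations driven by the data.
        p_h = sigmoid(data @ W + b_h)
        h = (rng.random(p_h.shape) < p_h).astype(float)
        # Negative phase: one reconstruction step (CD-1).
        p_v = sigmoid(h @ W.T + b_v)
        p_h2 = sigmoid(p_v @ W + b_h)
        # Approximate gradient of the log-likelihood and update.
        W += lr * (data.T @ p_h - p_v.T @ p_h2) / len(data)
        b_v += lr * (data - p_v).mean(axis=0)
        b_h += lr * (p_h - p_h2).mean(axis=0)
    return W, b_v, b_h

# Toy binary data standing in for normalized system input samples.
data = (rng.random((100, 6)) < 0.5).astype(float)
W, b_v, b_h = train_rbm(data, n_hidden=4)
```

The learned `W` would then serve as the initial weights of the first layer of the deep model, which is subsequently fine-tuned with supervised learning on the output data.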
