Fault analysis of High Speed Train with DBN hierarchical ensemble

A Deep Belief Network (DBN) learns features from raw data automatically, opening a new direction for fault analysis of High Speed Trains (HST). Combining deep learning with classifier ensemble techniques, this paper presents a novel DBN hierarchical ensemble model for HST fault analysis. First, Fast Fourier Transform (FFT) coefficients of the HST vibration signals are extracted as the states of the model's visible layer, and a DBN is then used to learn hierarchical features of the vibration signals automatically. The features learned at each DBN layer are used to train a Support Vector Machine (SVM), a K-Nearest Neighbor (KNN) classifier, and a Radial Basis Function (RBF) neural network, respectively. Finally, the Majority Voting (MV), Classification Entropy voting (CE), and Winner Takes All (WTA) ensemble strategies are applied to combine the classifiers' outputs into the final result. Experiments are conducted on laboratory data sets and simulation data sets. The results show that the fault recognition rate of the proposed model is substantially higher than that of traditional fault analysis methods. In addition, unlike a plain DBN model, the proposed model is only slightly affected by the number of network layers and the size of the hidden units.
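The combination step can be illustrated with a minimal sketch of two of the named strategies, Majority Voting (MV) and Winner Takes All (WTA), over the label predictions produced by the per-layer classifiers. This is an assumed, simplified implementation for illustration only (the function names, toy data, and tie-breaking rule are hypothetical, not taken from the paper):

```python
import numpy as np

def majority_vote(predictions):
    """MV: combine label predictions (shape: n_classifiers x n_samples)
    by taking, for each sample, the most frequent predicted label.
    Ties are broken by the smallest label (an assumed convention)."""
    predictions = np.asarray(predictions)
    n_samples = predictions.shape[1]
    combined = np.empty(n_samples, dtype=predictions.dtype)
    for j in range(n_samples):
        labels, counts = np.unique(predictions[:, j], return_counts=True)
        combined[j] = labels[np.argmax(counts)]
    return combined

def winner_takes_all(predictions, val_accuracies):
    """WTA: adopt the predictions of the single classifier with the
    highest validation accuracy."""
    best = int(np.argmax(val_accuracies))
    return np.asarray(predictions)[best]

# Toy example: three classifiers predicting labels for five samples.
preds = [[0, 1, 1, 2, 0],
         [0, 1, 2, 2, 0],
         [1, 1, 1, 2, 2]]
print(majority_vote(preds))                       # -> [0 1 1 2 0]
print(winner_takes_all(preds, [0.8, 0.9, 0.7]))   # -> [0 1 2 2 0]
```

The CE strategy described in the abstract would additionally weight each classifier's vote by an entropy-based measure of its output distribution; it is omitted here because its exact formulation is not given in this excerpt.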
