Kernel Local Linear Discriminant Method for Dimensionality Reduction and Its Application in Machinery Fault Diagnosis

Dimensionality reduction is a crucial task in machinery fault diagnosis. Manifold learning, a popular dimensionality reduction technique, has recently been applied successfully in many fields. However, most manifold learning methods are unsuitable for this task because they are unsupervised in nature and therefore fail to discover the discriminant structure in the data. To overcome these weaknesses, the kernel local linear discriminant (KLLD) algorithm is proposed. KLLD is a novel algorithm that combines the advantages of neighborhood preserving projections (NPP), Floyd's algorithm, the maximum margin criterion (MMC), and the kernel trick. KLLD has four advantages. First, it is a supervised dimensionality reduction method that overcomes the out-of-sample problem. Second, it avoids the short-circuit problem. Third, it makes more efficient use of the between-class and within-class scatter matrices. Finally, the kernel trick is incorporated to obtain a more precise solution. The main feature of the proposed method is that it attempts both to preserve the intrinsic neighborhood geometry of the input data and to extract the discriminant information. Experiments were performed to evaluate the new method, and the results show that KLLD outperforms traditional methods.
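One of the building blocks named above, the maximum margin criterion (MMC), projects data onto the top eigenvectors of the difference between the between-class and within-class scatter matrices. The sketch below is a minimal illustration of that scatter-matrix step only, not the full KLLD algorithm; the function name `mmc_projection` and its interface are assumptions for this example.

```python
import numpy as np

def mmc_projection(X, y, n_components):
    """Maximum Margin Criterion projection (illustrative sketch).

    Maximizes tr(W^T (S_b - S_w) W) by taking the top eigenvectors
    of S_b - S_w, where S_b is the between-class scatter matrix and
    S_w is the within-class scatter matrix.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))
    S_w = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        S_b += len(Xc) * diff @ diff.T          # between-class scatter
        S_w += (Xc - mc).T @ (Xc - mc)          # within-class scatter
    # S_b - S_w is symmetric, so eigh applies; it returns
    # eigenvalues in ascending order, so reverse for the top ones.
    vals, vecs = np.linalg.eigh(S_b - S_w)
    W = vecs[:, ::-1][:, :n_components]
    return X @ W, W
```

Unlike Fisher discriminant analysis, MMC avoids inverting `S_w`, which is why it remains well-defined when the within-class scatter matrix is singular (e.g., in small-sample settings).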
