New technique for feature selection: a combination of elastic net and Relief

One of the most advanced forms of industrial maintenance is predictive maintenance: analyzing the current behavior of equipment makes it possible to predict its future behavior, to anticipate the nature of a failure, and to minimize the delay in fault detection. Since fault diagnosis in rotating machines is essential for improving their productivity and reliability, the choice of features used for classification and diagnosis is a crucial point. Using all available features increases the computational cost and can even increase the classification error, owing to redundant and non-significant features. In this context, we review different feature selection methods and propose a new approach that selects the best features from the existing ones and then performs classification and identification using the selected features. The proposed technique is evaluated on real signals from a rotating machine.
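The abstract does not specify how the elastic net and Relief are combined, so the following is only a minimal sketch of one plausible combination: score features with Relief (Kira and Rendell) and with an elastic-net regression (Zou and Hastie), then keep the intersection of the two selections. Both component methods are implemented from scratch in NumPy; the function names, parameters, and the intersection rule are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def relief_scores(X, y, n_iter=100, rng=None):
    """Relief weights for a binary-labelled dataset: for random instances,
    reward features that differ on the nearest miss and agree on the nearest hit."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dist = np.abs(X - X[i]).sum(axis=1)   # L1 distance to all samples
        dist[i] = np.inf                      # exclude the instance itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dist, np.inf))   # nearest same-class sample
        miss = np.argmin(np.where(~same, dist, np.inf)) # nearest other-class sample
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n_iter

def elastic_net_cd(X, y, lam=0.1, alpha=0.5, n_iter=200):
    """Elastic-net regression by coordinate descent on the objective
    (1/2n)||y - Xb||^2 + lam*alpha*||b||_1 + lam*(1-alpha)/2*||b||^2."""
    n, d = X.shape
    beta = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(d):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual without feature j
            rho = X[:, j] @ r
            # soft-threshold for the l1 term, extra shrinkage for the l2 term
            denom = col_sq[j] + n * lam * (1 - alpha)
            beta[j] = np.sign(rho) * max(abs(rho) - n * lam * alpha, 0.0) / denom
    return beta

def combined_selection(X, y, lam=0.05, alpha=0.5, relief_thresh=0.0, rng=0):
    """Illustrative combination rule: keep features that the elastic net
    leaves nonzero AND that Relief scores above a threshold."""
    beta = elastic_net_cd(X, y.astype(float), lam=lam, alpha=alpha)
    w = relief_scores(X, y, n_iter=200, rng=rng)
    return np.flatnonzero((beta != 0) & (w > relief_thresh))
```

On synthetic data where only the first feature drives the class label, the intersection rule keeps that feature while each method prunes irrelevant ones the other may miss: the elastic net discards redundant correlated features, while Relief discards features that are individually uninformative for class separation.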
