Unsupervised feature selection by back-propagated weighting of the non-Gaussianity scores of independent components

Feature selection is one of the most commonly used techniques in the machine learning literature. It aims to remove irrelevant, redundant, or unneeded attributes from data that do not improve, or may even degrade, the performance of the analytical model. This paper proposes a new feature selection method that evaluates features by back-propagated weighting of the non-Gaussianity, measured by kurtosis, of the corresponding independent components. The non-Gaussianity scores are normalized using a suitable logistic function whose parameters are selected by an automatic curve-fitting technique. The proposed method is called the Logistic function of Kurtosis of Independent Component Analysis (KL-ICA). Results on various benchmarks show a significant improvement in analytical model performance over existing techniques.
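The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `kl_ica_scores`, the fixed logistic parameters `k` and `x0` (which the paper instead selects by automatic curve fitting), and the use of the absolute mixing matrix as the back-propagation weighting are all assumptions made for the sketch.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def kl_ica_scores(X, n_components=None, k=1.0, x0=0.0, random_state=0):
    """Score each original feature by logistic-normalized kurtosis of ICA
    components, back-propagated through the mixing matrix (illustrative)."""
    ica = FastICA(n_components=n_components, random_state=random_state)
    S = ica.fit_transform(X)                  # components: (n_samples, n_components)
    # Non-Gaussianity score: absolute excess kurtosis of each component
    ng = np.abs(kurtosis(S, axis=0))
    # Logistic normalization; k and x0 are assumed constants here,
    # whereas the paper fits them automatically to the score distribution
    norm_scores = 1.0 / (1.0 + np.exp(-k * (ng - x0)))
    # Back-propagate component scores onto the original features via the
    # (absolute) mixing matrix: (n_features, n_components) @ (n_components,)
    W = np.abs(ica.mixing_)
    return W @ norm_scores

# Usage: rank features and keep the highest-scoring ones
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 0] = rng.laplace(size=200)               # one heavy-tailed (non-Gaussian) feature
scores = kl_ica_scores(X, n_components=6)
ranking = np.argsort(scores)[::-1]            # feature indices, best first
```

Features mixed into strongly non-Gaussian components receive larger weights, so ranking by `scores` and keeping the top-m features implements the selection step.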