Novel Automatic Filter-Class Feature Selection for Machine Learning Regression

With the increased focus on applying Big Data across all sectors of society, the performance of machine learning becomes essential. Efficient machine learning depends on efficient feature selection algorithms. Filter feature selection algorithms are model-free and therefore very fast, but they require a threshold to operate. We have created a novel, fully automatic meta-filter feature selection algorithm, the Ranked Distinct Elitism Selection Filter (RDESF), which is composed of five common filters and a distinct selection process.
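The combination of several filter rankings with a distinct elitism selection can be sketched as follows. The five specific filters used by RDESF are detailed in the paper, so this minimal sketch substitutes two illustrative filters (absolute Pearson and Spearman correlation with the target) and a hypothetical `rdesf_sketch` combiner that keeps the distinct union of each filter's top-ranked features:

```python
import numpy as np

def pearson_scores(X, y):
    # |Pearson correlation| of each feature column with the target
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    num = Xc.T @ yc
    den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    return np.abs(num / den)

def spearman_scores(X, y):
    # Spearman = Pearson on rank-transformed data
    rx = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return pearson_scores(rx, ry)

def rdesf_sketch(X, y, filters, top_k=2):
    # Elitism: take the top-k features from each filter's ranking,
    # then keep the distinct union as the selected subset.
    selected = set()
    for score_fn in filters:
        ranking = np.argsort(score_fn(X, y))[::-1]
        selected.update(ranking[:top_k].tolist())
    return sorted(selected)

# Toy data: only features 0 and 3 drive the target
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] + 2 * X[:, 3] + rng.normal(scale=0.1, size=200)
print(rdesf_sketch(X, y, [pearson_scores, spearman_scores], top_k=2))
```

Because each filter contributes its own elite candidates, a feature only needs to rank highly under one criterion to survive, which is one way to avoid fixing a global score threshold in advance.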
